A few weeks ago, I was reading a few blog posts from two of the great minds at a16z about conversational interfaces being a fad and the prediction that no investor will look at AI startups in 2 years.
I thought I would add my three cents to the conversation.
See, from my vantage point, it's less about Alexa being the latest "it" fad put out by Amazon's geniuses to foster more ordering. It's more about another entry point, an additional way to interact with machines.
Let me explain. When I was growing up, the only method of interaction was the keyboard. Then came the mouse and the GUI. Then came the joystick. Then (giant leap forward) the multi-touch screen. Then another 10 years and we have consistently correct, real-time NLP, even when buried in harmonic noise. I went from LISP to Python, from 3 days to train a simple 3-layer perceptron to 3 minutes. I've seen walled-garden ecosystem plays evolve into API and platform businesses. From supply-side (Wal-Mart) to demand-side (Facebook) economies of scale. And everything in between.
All I see today is a proliferation of exciting ways to interact with machines that can only move us forward. Alexa and Echo are just the beginning. I've been absolutely mesmerized by Facebook's Oculus and the new Apple ARKit.
So here's my take: it's not that conversational interfaces are narrow in isolation; in aggregate, added to all the other ways we interact with machines, they add quite a lot. See, you can do something else with both your eyes and hands and still interact with Echo and Alexa. That's very powerful.
For example, say you're a doctor (I'm saying this because we have been working on it quite a bit lately). You've got a very simple problem while you work: your hands and eyes are busy, but your ears and mouth are not. So, from "Alexa, please take notes about patient Chen" to "Alexa, has my next patient arrived?", I see simple and useful ways today to help small business owners, particularly in healthcare, get more efficient during their work day.
It's a tiny bit, but it's always the sum of the little things.
Now about AI and the current hype: I did my first thesis on the recognition of spoken words using a multilayer perceptron trained with backpropagation in the early 90s. The math has been around since the 70s, so nothing new there. What's innovative is twofold: processing power becoming virtually free (which makes training dramatically faster), and the client-side approach (for example, Core ML in iOS 11), which makes it super easy to put machine learning in any little gizmo with a DSP, harvest data, make much-needed sense of that data in real time, and send it to my smart cloud. That's pretty cool.
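To make the "3 days to 3 minutes" point concrete, here's a toy sketch of exactly that kind of early-90s network: a 3-layer perceptron trained with backpropagation, learning XOR. The architecture and hyperparameters are arbitrary choices for illustration; on today's commodity hardware this runs in a fraction of a second.

```python
import numpy as np

rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)  # XOR targets

# one hidden layer of 8 units: a "3-layer" net in 90s terminology
W1 = rng.normal(0.0, 1.0, (2, 8)); b1 = np.zeros(8)
W2 = rng.normal(0.0, 1.0, (8, 1)); b2 = np.zeros(1)
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

lr = 0.5
for _ in range(10000):                        # full-batch gradient descent
    h = np.tanh(X @ W1 + b1)                  # hidden activations
    out = sigmoid(h @ W2 + b2)                # network output
    d_out = (out - y) * out * (1 - out)       # backprop through output layer
    d_h = (d_out @ W2.T) * (1 - h ** 2)       # backprop through hidden layer
    W2 -= lr * h.T @ d_out; b2 -= lr * d_out.sum(0)
    W1 -= lr * X.T @ d_h;   b1 -= lr * d_h.sum(0)

print((out > 0.5).astype(int).ravel().tolist())
```

The same loop that once justified leaving the lab machine running over a weekend is now an eyeblink; that, multiplied across every model you want to try, is the "virtually free" part.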
For what I'm doing right now, the combination of those two advances makes the idea of the "self-driving practice" totally doable today, something I could not do 3 years ago. A self-driving practice is basically a small healthcare business that is able to automatically 1. interact with clients (booking based on time inventory and client patterns), 2. manage supplies inventory (for example, medical supplies based on appointments and procedures), and 3. infer and optimize time and fees for profit.
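Capability 2 above is the easiest to picture in code. Here's a minimal, purely hypothetical sketch: project supply usage from the appointment book and flag shortfalls. The procedure names, per-procedure quantities and safety buffer are all invented for illustration, not any real system's data model.

```python
import math
from collections import Counter

# per-procedure supply usage (illustrative numbers only)
USAGE = {
    "cleaning": {"gloves": 2, "prophy_paste": 1},
    "filling":  {"gloves": 2, "composite": 1, "anesthetic": 1},
}

def reorder_list(appointments, stock, buffer=1.25):
    """Project supply needs from booked procedures; return shortfalls."""
    needed = Counter()
    for procedure in appointments:
        for item, qty in USAGE.get(procedure, {}).items():
            needed[item] += qty
    # order enough to cover projected use plus a 25% safety buffer
    return {
        item: math.ceil(qty * buffer) - stock.get(item, 0)
        for item, qty in needed.items()
        if qty * buffer > stock.get(item, 0)
    }

week = ["cleaning", "filling", "cleaning", "filling", "filling"]
stock = {"gloves": 6, "composite": 1, "anesthetic": 10, "prophy_paste": 5}
print(reorder_list(week, stock))  # {'gloves': 7, 'composite': 3}
```

The interesting part isn't the arithmetic, it's the inputs: once appointments and procedures live in one connected system, this kind of automation falls out almost for free.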
Today those two technologies, because they finally work well for real-world applications, are enabling me to introduce, little by little, leaps of efficiency into healthcare by automating the administrative work.
So in conclusion, yes, from the standpoint of the VC looking for the first trillion-dollar startup, what we see today is very much meh. It feels like innovation has stalled. I agree that given the excellence of iOS 11 and ARKit/CoreML, we might see some Apple Glass showing up on the horizon, and that a well-baked AR pair of shades will move us to another dimension of super cool stuffery. But the tiny little things add up, and conversational interfaces, augmented with client-side AI, help me solve real work-life problems for medical doctors today.
(PS: Here's an alternate view of the a16z posts discussed in the first paragraph. It's all good.)
My name's phil mora and I blog 2-3 times a month (it varies!) about the things I love: fitness, hacking work, tech and anything holistic. Since July 2015, I've been the Head of Product at Sikka. We're using big data, analytics, machine learning and AI to redefine medicine - our angle is the business of being a doctor. What we do is very, very cool. If you want to talk to me about that, you are more than welcome to contact me here or follow me on medium or twitter and chime in.
Join my newsletter:
I send my personal notes, usually for the week, with the links I found interesting and why they mattered. It's fun, I promise!
I am really excited to share a few things that we’ve been working on for the past 12 months at Sikka Software — and I think we’re truly changing the nature of the game for the benefit of our clients.
See, we’ve been thinking really hard about how to connect the dots of big data, cloud, mobile, artificial intelligence, machine learning and APIs to deliver real value for our clients in retail health. Over the past few weeks we’ve rolled out really cool stuff such as a brand new dev API, a marketplace, mobile auto-scheduling and two Alexa skills, so I thought I would share a few thoughts about this ongoing work because we’re really passionate about it!
A Medicine of Needs and Wants
In the “medicine of need” (that would be your general doctor), big tech focuses first on automating diagnosis, which will eventually evolve into AI+nurse, as the brilliant minds at a16z predicted last year. But for all the other doctors in your care team, such as your dentist, ophthalmologist, chiropractor, fitness and wellness coach and even your veterinarian (that’s the “medicine of want”), we believe the business of being a healthcare entrepreneur, in other words running a practice as effectively and efficiently as possible, is a problem that tech can solve and automate today.
So this week I am very proud to share that we are launching two major AI-based initiatives that are a bit revolutionary for our clients: first, Practice Insights, the first set of Alexa-based skills for dentists and veterinarians; and second, bot-based auto-scheduling in our Practice Mobilizer mobile apps (Apple and Android).
The vision behind all of this is summarized super well in Vijay Sikka’s Medium post yesterday: we think that practices can soon become self-driving thanks to our technology. In fact, we’re well on our way to making the practice almost self-aware, and to enabling the doctor to interact with it through natural voice conversation to drive the business: AI-based auto-scheduling, inventory management and auto-ordering, real-time patient appointment arrival notifications, and much more to come in the next few months.
Practice Insights and Alexa
So with this in mind, our AI and product teams worked hard for 8 months, and as a first step we’re launching our Practice Insights service (beta) today.
Practice Insights works with Amazon Echo and Alexa so that doctors can have hands-free interactions with their practice during the work day. Once the Alexa skill is enabled, doctors can ask business and day-to-day operations questions such as “When is my next patient arriving?” or “Read me my morning report” (a very popular Sikka feature that until now was only available via daily PDF emails).
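For the curious, the shape of an Alexa skill like this is simple: Alexa resolves the spoken question to a named intent and posts it to your endpoint, which answers with a JSON response envelope that Alexa reads aloud. The sketch below shows that flow in a Lambda-style handler; the intent names, the canned answers and the data lookup are all hypothetical, and a real skill would verify the request signature and query the practice management system.

```python
# Minimal Lambda-style handler for a hands-free practice skill.
# Intent names and responses are invented for illustration.
def lambda_handler(event, context=None):
    intent = event["request"]["intent"]["name"]
    if intent == "NextPatientIntent":        # hypothetical intent
        speech = "Your next patient, Ms. Chen, arrives at 10:30."
    elif intent == "MorningReportIntent":    # hypothetical intent
        speech = "Yesterday you saw 14 patients and collected 3,200 dollars."
    else:
        speech = "Sorry, I didn't catch that."
    # Alexa expects this response envelope and speaks the text back
    return {
        "version": "1.0",
        "response": {
            "outputSpeech": {"type": "PlainText", "text": speech},
            "shouldEndSession": True,
        },
    }

reply = lambda_handler({"request": {"intent": {"name": "NextPatientIntent"}}})
print(reply["response"]["outputSpeech"]["text"])
```

The hands-free part is the whole point: the doctor never touches a screen, and the skill only has to map a handful of intents onto data the practice already produces.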
Video Credit: Jason Folk (@jasonfolk)
Practice Mobilizer Bot Auto-Schedule
Last year we were so proud to launch our first two mobile apps, Practice Mobilizer (and its companion app, Patient Mobilizer).
With a rapidly growing organic user base, we’re now reaching more than 10,000 practices in the US with this complementary service (beta). In 12 months it has enabled more and more doctors to securely exchange voice, text and video messages with their patients and check their business vitals, while also helping patients automatically check in to their appointments while on their way to the practice, drastically reducing wait times.
Today we’re super happy to add auto-scheduling to Mobilizer: patients can interact naturally with our bot in the secure message panel enabled by their doctors, and our AI-based scheduling bot will automatically pick up the conversation in real, “human” natural English to suggest timeslots that best fit the patient’s schedule, based on the doctor’s appointment calendar.
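Conceptually, the bot's job splits in two: a language-understanding service extracts an intent and a coarse preference from the patient's message, then the scheduler intersects that preference with the doctor's free calendar. Here's a deliberately simplified sketch of the second half; the slot data, preference categories and function names are invented for illustration, not our production logic.

```python
# Hypothetical second half of the auto-scheduling flow: once the NLU
# step has produced something like ("book", "afternoon"), pick slots.
from datetime import time

# free slots for the day, already synced from the doctor's calendar
FREE_SLOTS = [time(9, 0), time(11, 30), time(14, 0), time(16, 30)]

def suggest_slots(preference, slots=FREE_SLOTS, limit=2):
    """Return up to `limit` free slots matching a coarse preference."""
    if preference == "morning":
        matches = [s for s in slots if s < time(12, 0)]
    elif preference == "afternoon":
        matches = [s for s in slots if s >= time(12, 0)]
    else:
        matches = list(slots)
    return [s.strftime("%H:%M") for s in matches[:limit]]

print(suggest_slots("afternoon"))  # ['14:00', '16:30']
```

In production the hard work is upstream (robust language understanding, multi-calendar sync, conflict handling); the suggestion step itself stays this boring on purpose, so the bot's answers are always explainable from the calendar.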
(Note: for language understanding we’ve been using Microsoft’s great LUIS service, along with Cronofy’s superb multi-calendar sync APIs.)
Here’s the bottom line: at Sikka we’ve been totally dedicated to adding real value for our doctors and practices, using the best that today’s tech can offer, while making sure the tech is as unobtrusive and easy to use as possible.
Practice Insights and bot-based auto-scheduling are the first offspring of the AI-first initiative we launched internally a year ago, called Ipiktok (Inuit for “keen, sharp”).
We’ve got a really great team here, and we’re super focused on bringing even more value to retail healthcare in the next year. Try our apps and skills, and please tweet + blog about your experiences using the “ipiktok” hashtag so that we can read and learn from your feedback!
My name's phil mora and I blog about the things I love: fitness, hacking work, tech and anything holistic.
Thinker, doer, designer, coder, leader.
Head of Product at Sikka Software.
Here's my contact info.
I am the Head of Product and Head of AI at Sikka Software.