The next step for machine learning and AI

David Karandish, co-founder and CEO of the company behind the Jane chatbot, discusses the next step for machine learning and AI through chatbots and natural language processing.

Tonya Hall interviews executives for our sister site ZDNet, and we’re running a selection of some of her most viewed videos. The following is an edited transcript of her conversation with David Karandish, co-founder and CEO of the company behind the Jane chatbot, which first published in December 2018 on ZDNet. To watch more of her videos, check out The Tonya Hall Show on ZDNet’s YouTube channel.

Tonya Hall: A chatbot that knows more than just Taylor Swift lyrics. It can find your TPS reports and more.

You’ve got quite the tech resume. Give us a summary of what brought you to found your current company.

David Karandish: So I’ve been in the tech space for about 20 years, all the way back from when designing a webpage put you at about the level of being a wizard or a warlock. Since then, I’ve done a lot in the online marketing space, most recently running another company. And then we started this one in January of last year.

Tonya Hall: What problem is Jane designed to solve?

David Karandish: Jane is here to make all of your company intelligence accessible in a simple chat interface. You can think of her as your own Siri or Alexa for the workplace, where she connects to your company’s apps, documents, and the knowledge of your team.

Tonya Hall: Explain the process that Jane follows to build the client dataset.

David Karandish: Jane will connect first to your company’s key applications—so things like Salesforce, Office 365, ServiceNow, Workday, you name it. If it has a cloud-based API, we can probably connect to it. From there, Jane can go retrieve information. She can also start to take actions on your behalf. The second area that she connects to are your documents. So she can read the contents of your documents and make that available for your whole team. And then lastly, we create a knowledge base where Jane can take all of the information you have that doesn’t live in an app or document and make that available as well.
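The retrieval flow Karandish describes, checking apps first, then documents, then a curated knowledge base, could be sketched roughly like this. Every class and method name here is invented for illustration; this is not Jane’s actual implementation.

```python
class AppConnector:
    """Stand-in for a cloud app reached over its API (e.g. a CRM)."""
    def __init__(self, records):
        self.records = records  # placeholder for live API responses

    def search(self, query):
        q = query.lower()
        return [r for r in self.records if q in r.lower()]


class DocumentStore:
    """Holds document text so its contents can be searched directly."""
    def __init__(self, docs):
        self.docs = docs  # {title: full text}

    def search(self, query):
        q = query.lower()
        return [title for title, text in self.docs.items() if q in text.lower()]


class KnowledgeBase:
    """Curated Q&A pairs for facts that live in no app or document."""
    def __init__(self):
        self.entries = {}

    def add(self, question, answer):
        self.entries[question.lower()] = answer

    def search(self, query):
        return self.entries.get(query.lower())


def answer(query, apps, docs, kb):
    """Try the apps first, then the documents, then the knowledge base."""
    hits = apps.search(query)
    if hits:
        return ("app", hits)
    titles = docs.search(query)
    if titles:
        return ("document", titles)
    entry = kb.search(query)
    if entry:
        return ("knowledge base", [entry])
    return ("none", [])
```

The point of the sketch is the ordering: structured app data is the most authoritative source, documents are the fallback, and the hand-curated knowledge base catches everything else.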

Tonya Hall: You still have a layer of human involvement in creating and cleaning the data though, correct?

David Karandish: We don’t believe it should be people against AI. We believe that they should be working together. One of the fundamental design flaws in a lot of AI systems is that if you ask them a question and they don’t know the answer, you’re out of luck. So for us, if you ask Jane a question and she doesn’t know the answer, she’s going to bring a person into the loop to help you get the right response.

Tonya Hall: Tell us about incorporating natural language processing and how Jane learns from new questions and requests.

David Karandish: So Jane is built off a series of neural networks and algorithms. Some of them are very high-tech, like a bunch of convolutional neural networks; some of them are pretty simple, like spell-check and your company’s acronym list. She combines all of this together, and then we ensemble a bunch of votes. So think of all the algorithms voting on whether or not we should match to a particular candidate. Based on those votes, she’ll either come back to you with an answer from an app, a document, or a person; or she’ll send it off to a co-pilot, where someone can come back and help you get that response.

Now we recognize that the world is not always that black or white. So, if she’s right on the line, she’ll come back to you with a clarifier, where she says “Did you mean A, B, or C?” If you then select B, she’ll remember that. That feeds back into the neural network, and she’ll remember it for next time.
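The voting-and-clarifying behavior described above can be illustrated with a toy router: each matcher casts a vote, a clear winner is answered directly, a near-tie triggers a “Did you mean A, B, or C?” clarifier, and the user’s choice is remembered. The thresholds and matcher names are made up for the example and are not Jane’s real parameters.

```python
from collections import Counter

clarifier_memory = {}  # query -> answer the user picked from a past clarifier


def route(query, votes, min_votes=2, margin=2):
    """votes: list of (matcher_name, candidate) pairs, one vote per matcher.

    Returns ("answer", candidate) for a clear winner,
    ("clarify", top_candidates) when the vote is too close to call, or
    ("copilot", None) when no candidate gets enough support.
    """
    if query in clarifier_memory:  # learned from earlier user feedback
        return ("answer", clarifier_memory[query])
    tally = Counter(candidate for _, candidate in votes)
    if not tally:
        return ("copilot", None)
    ranked = tally.most_common()
    best, best_count = ranked[0]
    if best_count < min_votes:
        return ("copilot", None)  # hand off to a human co-pilot
    if len(ranked) > 1 and best_count - ranked[1][1] < margin:
        # Right on the line: ask "Did you mean A, B, or C?"
        return ("clarify", [c for c, _ in ranked[:3]])
    return ("answer", best)


def choose(query, candidate):
    """The user picked an option from the clarifier; remember it for next time."""
    clarifier_memory[query] = candidate
```

In the real system the feedback would be folded back into the neural networks rather than a lookup table, but the loop is the same: an ambiguous match becomes a clarifier, and the user’s choice improves the next answer.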

Tonya Hall: A co-pilot is a human, I take it?

David Karandish: Co-pilot is one or multiple humans on your team that can help organize Jane’s knowledge base.

Tonya Hall: How have you integrated other apps to work within Jane?

David Karandish: Yeah, so what we do is we design the service first, and this is something different from a lot of other companies. So for us, services would be things like email, calendar, HRIS, CRM, ticketing, and cloud drives. And then all of the apps that we connect to that service share the same skills. The reason that’s beneficial is that if your IT team decides to switch from Jira to ServiceNow, or from NetSuite to Salesforce, or from ADP to Workday, you don’t have to learn a new way of interacting with Jane, because all of the apps behind a service work the same way. Once we design that service, we hook Jane up to whatever APIs are available for those apps, and then she’s able to retrieve information and take actions on your behalf.
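The “design the service first” idea maps naturally onto an abstract interface with interchangeable connectors behind it. A minimal sketch, with invented connector classes and in-memory ticket stores standing in for real API calls:

```python
from abc import ABC, abstractmethod


class TicketingService(ABC):
    """The shared 'service' contract every ticketing app must implement."""

    @abstractmethod
    def open_ticket(self, summary: str) -> str: ...

    @abstractmethod
    def ticket_status(self, ticket_id: str) -> str: ...


class JiraConnector(TicketingService):
    def __init__(self):
        self._tickets = {}

    def open_ticket(self, summary):
        ticket_id = f"JIRA-{len(self._tickets) + 1}"
        self._tickets[ticket_id] = "open"
        return ticket_id

    def ticket_status(self, ticket_id):
        return self._tickets.get(ticket_id, "unknown")


class ServiceNowConnector(TicketingService):
    def __init__(self):
        self._tickets = {}

    def open_ticket(self, summary):
        ticket_id = f"INC{len(self._tickets) + 1:07d}"
        self._tickets[ticket_id] = "open"
        return ticket_id

    def ticket_status(self, ticket_id):
        return self._tickets.get(ticket_id, "unknown")


def file_ticket_skill(service: TicketingService, summary: str) -> str:
    """One skill, written once against the service, works with any app."""
    ticket_id = service.open_ticket(summary)
    return f"Opened {ticket_id}: {summary}"
```

Because the skill targets the abstract `TicketingService`, swapping Jira for ServiceNow changes only which connector is plugged in, not how the user interacts with the bot.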

Tonya Hall: What separates Jane from the average chatbot?

David Karandish: What separates Jane from the average chatbot is, first of all, that Jane has learning at the forefront. We knew that if you ask Siri a question on your phone and she doesn’t have the answer, you’re out of luck. That’s okay if you’re my five-year-old daughter looking up Taylor Swift lyrics. But if you’re a team member or a customer, you’ve got to be able to get that answer, and the system has to be able to learn even when it doesn’t know the answer. So we put learning at the forefront.

The second big thing is that Jane speaks the lexicon of your organization. So we don’t want her to just speak US English, but we want her to speak US English with your own dialect, your own phrasing, your own ways that you work in the organization. She’ll know what the TPS reports stand for, if you will.
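The lexicon idea, expanding an organization’s own acronyms before trying to match a question, can be shown with a toy normalizer. The acronym list here is purely illustrative:

```python
import re

# Hypothetical per-company acronym list; a real deployment would load
# the organization's own glossary.
ACRONYMS = {
    "tps": "testing procedure specification",
    "hris": "human resources information system",
}


def expand_lexicon(query, acronyms=ACRONYMS):
    """Replace whole-word acronyms with their expansions, case-insensitively."""
    def repl(match):
        word = match.group(0)
        return acronyms.get(word.lower(), word)

    return re.sub(r"[A-Za-z]+", repl, query)
```

Running the expansion before matching means “Where are the TPS reports?” and “Where are the testing procedure specification reports?” resolve to the same intent.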

The third thing about Jane is that, in addition to the learning and the dialect, she can also pick up information out of your documents. So, a lot of times, people think about the apps… they’re easy to understand… how she could connect to Salesforce or Dropbox, but the documents themselves have a treasure trove of information. And she can go take information out of your company handbook or your sales-training material and make all of that accessible.

Tonya Hall: What technology breakthrough needs to happen to enable the next great leap forward with machine learning and artificial intelligence?

David Karandish: Machine learning and AI are continuing to make step-function progress. Each time a new major type of neural network comes out, a lot of people improve it, it hits a new baseline, and then we jump up from there. In terms of where you’re going to start to see major step-function improvements, it’s going to happen when you can start to chain various responses together. It’s one thing to ask a question and get an answer back; it’s another thing to have an ongoing dialogue in conversation. Most neural nets today are trained with individual inputs and outputs; I believe that long term you’ll have full conversations training the system, which will help it better understand natural language.

Tonya Hall: Thanks for shedding some insight and talking about the work that you’re doing. If somebody wants to connect with you, how can they go about doing that?

David Karandish: You can check us out online in your favorite browser.

