Neo4j open source

9/4/2023

Without full access to their own medical records and research results, let alone the software tools that make sense of the data, patients are completely dependent on doctors. In this case, only the doctors decide on prevention, diagnosis, and treatment. This information asymmetry generally puts patients at a disadvantage. And many times even the doctors suffer from limited access to data. On the one hand, without digitization, handwritten medical records and images are not readily available and transferable between doctors, which can lead to wasteful duplication of administrative work and medical procedures. On the other hand, the newest medical research spreads so slowly across the profession that many doctors can hardly benefit from it.

With this conviction and inspiration from Deep Medicine, four Neo4j engineers and I built Doctor.ai at the Singapore Healthcare AI Datathon and EXPO 2021. Doctor.ai is a voice virtual assistant powered by the graph database Neo4j and AWS. It can map a large number of medical records to a graph and use the Graph Data Science (GDS) library to recommend treatments.

AWS Lex is the eyes, mouth, and ears of Doctor.ai. It takes in voice or written input from the user and queries the Neo4j graph via Lambda. Finally, we put together a React frontend on AWS Amplify for easy interaction.

Figure 3: The AWS architecture of Doctor.ai.

The setup of EC2 and Amplify was quite straightforward; most of the coding took place in Lex and Lambda. Lex may give the wrong impression that it can understand any casual conversation with some AI magic. In reality, Lex is a specialized bot: it can only understand a set of predefined intents, and we the programmers need to implement each one of them.

Intent, utterance, and slot are three concepts that need some explanation. An intent is a wish that the user wants the chatbot to fulfil. For example, the user may want to count how many ICU visits a patient has had. To express it, the user dictates a language expression to the chatbot, and that language expression is an utterance. For example, the user can say "How many times has patient ABC visited the ICU?" or "Tell me the count of patient ABC's ICU visits." Both are going after the ICU visit count of patient ABC. In this case, the name of the patient is a slot. A slot takes its value from the current or a previous utterance; otherwise, Doctor.ai will raise a follow-up question about it. Once the name of the patient is known, Doctor.ai can calculate the ICU visit count.

In Lex, intents are defined on the "Intents" page of the chatbot (Figure 4). Each intent mainly consists of sample utterances, slots, and fulfilment actions. When a user dictates or types an utterance, Doctor.ai tries to classify the user's intent. If the classification is successful and all the slots are filled, Doctor.ai carries out the fulfilment actions. Finally, we need to define a fallback intent for when the user's intent is not clear.

Unlike the open-domain chatbot Meena from Google, Doctor.ai is a specialized chatbot: it focuses solely on healthcare informatics. So we the developers can concentrate our effort on enriching its healthcare-related functionality. Currently, Doctor.ai has some very basic functions. For example, it can sound a warning if lab results are outside the normal ranges. But it needs more. We should be able to improve its treatment recommendations by utilizing knowledge graphs such as Hetionet.

Figure 10: Treatment recommendation in the React frontend.

Finally, you can set up Doctor.ai yourself by following the instructions in "Doctor.ai, an AI-Powered Virtual Voice Assistant for Health Care."
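To make the intent-slot-fulfilment flow concrete, here is a minimal sketch of what the Lambda fulfilment step could look like for the ICU-visit-count example. It is written against the Amazon Lex V2 event and response format; the intent name, the `PatientName` slot, the Cypher query, and the graph schema are all illustrative assumptions, not taken from the actual Doctor.ai code. The database call is injected as a callable so the sketch stays runnable without a live Neo4j instance; a real handler would call the Neo4j driver directly.

```python
# Hypothetical Cypher that the handler might send to Neo4j.
# The Patient/ICUVisit schema is an assumption for illustration.
ICU_VISIT_COUNT_QUERY = (
    "MATCH (p:Patient {name: $name})-[:HAS_VISIT]->(v:ICUVisit) "
    "RETURN count(v) AS visits"
)

def count_icu_visits(name, run_query):
    """Run the Cypher query through an injected callable, so the
    handler can be exercised without a database connection."""
    return run_query(ICU_VISIT_COUNT_QUERY, {"name": name})

def lambda_handler(event, context, run_query):
    intent = event["sessionState"]["intent"]
    slots = intent["slots"]
    patient = slots.get("PatientName")

    # Slot not yet filled: ask Lex to elicit it -- this is the
    # "follow-up question" described above.
    if patient is None:
        return {
            "sessionState": {
                "dialogAction": {"type": "ElicitSlot",
                                 "slotToElicit": "PatientName"},
                "intent": intent,
            }
        }

    name = patient["value"]["interpretedValue"]
    visits = count_icu_visits(name, run_query)

    # All slots filled: close the intent with a fulfilment message.
    intent["state"] = "Fulfilled"
    return {
        "sessionState": {"dialogAction": {"type": "Close"},
                         "intent": intent},
        "messages": [
            {"contentType": "PlainText",
             "content": f"Patient {name} has visited the ICU {visits} times."}
        ],
    }
```

Injecting `run_query` also makes the branching logic easy to unit-test with a stub, which is useful because Lex fulfilment bugs are otherwise only visible through end-to-end conversations.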