Job automation has recently become a goal for many countries, including Egypt. This motivated our project, “Interaction-based robot for MIU admissions office,” which automates the tasks of the admissions office. We built a mobile application that runs on a phone or tablet screen. The system operates on the MIU campus, specifically in the admissions office. Its ability to perceive emotion relies on detecting and analyzing facial expressions and voice tone. As a result, the system can respond to admissions inquiries according to the speaker’s language, age, and emotions.
The application automates repetitive MIU admission tasks that consume the staff’s time, performing them faster and more efficiently. Our objectives are:
• Build an interaction engine based on face detection that recognizes age, gender, and emotion.
• The chatbot shall reply to applicants’ inquiries about their admission procedures.
• The chatbot shall answer students’ questions about the faculties, for example: test date and time, exam fee payment, the online placement test, the medical examination, available faculties, and how to contact MIU admissions.
• The chatbot shall be able to arrange an online meeting when needed.
• The chatbot shall support multiple languages, detect the user’s language, and reply in that language.
• The chatbot shall reply in a gentler tone when it detects an angry face.
• The chatbot shall adjust its speaking speed according to the user’s age.
• The chatbot shall communicate speech-to-speech, text-to-text, or speech-to-text in different languages.
The system interacts efficiently with MIU applicants as a chatbot implemented with natural language processing (NLP).
We began by compiling a list of scenarios related to the admissions procedures. The scenarios came from the MIU admissions staff, who helped us assemble our dataset. As stated above, the chatbot converses fluently in two languages (English and Egyptian Arabic); this is accomplished through a JSON file containing scenarios and responses, from which a response is selected at random in the language matching the speaker’s speech.
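The scenario file described above can be sketched as follows. This is a minimal illustration, not the project’s actual dataset: the `tag`, `patterns`, and `responses` keys, the file structure, and the sample text are all assumptions.

```python
import random

# Hypothetical scenario data mirroring the JSON structure described above:
# each scenario has patterns and a pool of responses per language,
# from which one reply is drawn at random.
INTENTS = {
    "intents": [
        {
            "tag": "exam_fees",
            "patterns": {"en": ["How much are the exam fees?"]},
            "responses": {"en": ["The placement test fee is posted on the MIU admissions page."]},
        }
    ]
}

def reply(tag: str, language: str) -> str:
    """Pick a random response for a matched scenario in the detected language."""
    for intent in INTENTS["intents"]:
        if intent["tag"] == tag:
            return random.choice(intent["responses"][language])
    return "Sorry, I did not understand that."
```

In the real system the matched `tag` would come from the NLP engine and the `language` from language detection; here both are passed in directly to keep the sketch self-contained.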
• The system detects faces and recognizes the user’s age, gender, and emotion, and identifies admissions staff members, who are granted access to the dataset and technical details.
• The system can interact speech-to-speech, text-to-text, or text-to-speech using Text-to-Speech (TTS) and Speech-to-Text (STT) packages that work offline, without an internet connection or noticeable delay.
• Finally, we built REST API services with Flask, creating an API that exposes our Python backend through HTTP requests (PUT, DELETE, POST, and GET).
Calling this API from the Flutter application achieved our target: Python as the backend and a Flutter/Android application as the UI/front end.
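The three interaction modes listed above can be sketched as a small dispatcher. The `stt` and `tts` callables stand in for any offline Speech-to-Text / Text-to-Speech engine; the function and parameter names here are assumptions, not the project’s actual API.

```python
from typing import Callable, Optional

def interact(mode: str, user_input: str,
             answer: Callable[[str], str],
             stt: Optional[Callable[[str], str]] = None,
             tts: Optional[Callable[[str], None]] = None) -> str:
    """Route one exchange through the requested interaction mode.

    mode: "speech-to-speech", "text-to-text", or "speech-to-text".
    """
    # Speech input is first transcribed by the offline STT engine.
    text = stt(user_input) if mode.startswith("speech") else user_input
    reply = answer(text)      # the chatbot's NLP engine produces the reply
    if mode.endswith("speech"):
        tts(reply)            # spoken output via the offline TTS engine
    return reply
```

Injecting `stt` and `tts` keeps the routing logic independent of the concrete engines, so the same dispatcher works whether the audio backends are swapped or mocked in tests.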
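A minimal sketch of the Flask layer described above, assuming a hypothetical `/chat` endpoint; the route name, request shape, and placeholder reply are illustrative, not the project’s real API.

```python
from flask import Flask, jsonify, request

app = Flask(__name__)

# The Flutter front end would POST the user's message here
# and receive the chatbot's reply as JSON.
@app.route("/chat", methods=["POST"])
def chat():
    message = request.get_json().get("message", "")
    reply = f"You said: {message}"  # placeholder for the real NLP engine
    return jsonify({"reply": reply})

if __name__ == "__main__":
    app.run()
```

The same pattern extends to the other HTTP verbs (PUT, DELETE, GET) by listing them in `methods` on additional routes.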
Documents and Presentations
You will find here the documents and presentation for our proposal.
You will find here the documents and presentation for our SRS.
You will find here the documents and presentation for our SDD.
You will find here the documents and presentation for our Thesis.
ISF – Change Makers 2022
We qualified for the second stage as one of the top 40 teams and then advanced to the top 12; in the final round only 4 teams remained, and we were not among them.