Authors

Ahmed Osama, Seif Nagi, Marian Kromel, Mafdy Magdy

Publishing Date

May 11, 2022

Abstract

This document outlines our goals for the project titled "Interaction-based robot for MIU admissions office." We will be using the Pepper robot, a semi-humanoid robot manufactured by SoftBank. Robots of this kind are used in schools, colleges, and other institutions to teach programming and to conduct research on human-robot interaction. The robot in our project will operate on campus, primarily in the admissions office.

Pepper perceives emotion by detecting and analyzing facial expressions and voice tones. Combined with its dialogue capabilities, this allows our robot to respond to MIU admissions inquiries.

1.1 Purpose of this document

The purpose of this Software Requirements Specification (SRS) document is to outline the requirements of an interaction-based robot for the MIU admissions office. Writing this document puts the idea down on paper so that all project details are covered. It provides its audience with an overview of the system's features so that they can make informed decisions, helps the committee see the full set of expectations and details for the project, and explains the project's objectives, overview, problem statement, and more. The document describes a Pepper robot placed in the MIU admissions office to help the admissions staff organize and manage their seasonal tasks, chief among them replying to applicant inquiries.

1.2 Scope of this document

The elicitation team: Ahmed Osama, Seif Nagi, Marian Kromel, and Mafdy Magdy.

The users are the parents and students applying to MIU, as well as members of the admissions staff.

The robot aims to help both groups finish their tasks faster: students and parents do not waste time waiting for their inquiries to be answered, and the staff do not waste time answering the same frequent inquiries for every applicant.

1.3 System Overview

The goal of this project is to help the admissions staff with their frequent tasks by placing a robot that reduces repetitive work, such as answering application inquiries: how to apply, what fees are required for the application, and other questions related to admission procedures. In addition, the robot will reply efficiently to spontaneous inquiries in different languages, depending on the speaker's language. The robot can also recognize the admissions staff, arrange an online meeting with them, and adapt its communication to the speaker's age, gender, and emotions. With a young adult, the robot talks faster and more casually; with an elderly person, it talks more formally and more slowly. The robot can communicate speech-to-speech, text-to-text, or speech-to-text according to the user's needs, and in all cases the conversation is displayed on the screen even if the user prefers to speak instead of typing. The robot also detects the user's emotions and adjusts the conversation to the user's mood. The system is implemented with a Python backend and a Flutter frontend connected through a REST API.
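To illustrate how the Flutter frontend might talk to the Python backend over that REST API, the following is a minimal sketch of one possible endpoint using Flask. The route name, payload fields, and placeholder reply are assumptions for illustration, not the project's final interface.

    # Minimal sketch of a backend REST endpoint (route and payload are assumed, not final).
    from flask import Flask, request, jsonify

    app = Flask(__name__)

    @app.route("/chat", methods=["POST"])      # hypothetical endpoint the Flutter app would call
    def chat():
        data = request.get_json(force=True)
        message = data.get("message", "")      # user's inquiry, typed or transcribed from speech
        language = data.get("language", "en")  # "en" or "ar", detected elsewhere in the pipeline

        # Placeholder reply; the real system would pass the message to the NLP chatbot.
        reply = f"Received ({language}): {message}"
        return jsonify({"reply": reply})

    if __name__ == "__main__":
        app.run(port=5000)

The Flutter client would POST a JSON body such as {"message": "How do I apply?", "language": "en"} and render the returned reply on the robot's screen.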

1.4 System Scope

The robot shall interact efficiently with students applying to MIU as a chatbot that has already been implemented with natural language processing (NLP). We started by storing scenarios related to admission procedures; these scenarios were provided by the MIU admissions staff and helped us build our dataset. As mentioned previously, the robot can speak different languages fluently (English or Egyptian Arabic): translation is implemented on top of the dataset, which detects the input language and responds based on it.
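The language handling in the project is dataset-based; purely as a simplified illustration of routing an inquiry by language before it reaches that dataset, a character-range heuristic in Python could look like the sketch below (the real system may instead rely on a trained detector).

    # Illustrative sketch only: route an inquiry by script before the NLP dataset handles it.
    # The actual project detects the language through its dataset; this heuristic is an assumption.
    def detect_language(text: str) -> str:
        """Return 'ar' if the text is mostly Arabic script, otherwise 'en'."""
        arabic_chars = sum(1 for ch in text if "\u0600" <= ch <= "\u06FF")
        letters = sum(1 for ch in text if ch.isalpha())
        if letters and arabic_chars / letters > 0.5:
            return "ar"
        return "en"

    print(detect_language("How do I apply to MIU?"))  # -> en
    print(detect_language("ازاي اقدم في الجامعة؟"))    # -> ar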

Identifying users' faces and recognizing their age, gender, and emotions are implemented using an AI model.

Face emotion detection classifies the expression as happy, sad, angry, neutral, or surprised. It runs offline using artificial intelligence (AI): we start by collecting datasets for the various emotions, apply deep learning to train a prediction model, and then use that model to predict the emotion.
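As a rough illustration of the prediction step only, the sketch below loads a pre-trained model and classifies a face image; the model file name, the 48x48 grayscale input shape, and the label order are assumptions rather than the project's actual artifacts.

    # Sketch of offline emotion prediction with a pre-trained deep-learning model.
    # "emotion_model.h5", the 48x48 grayscale input, and the label order are assumptions.
    import cv2
    import numpy as np
    import tensorflow as tf

    LABELS = ["happy", "sad", "angry", "neutral", "surprised"]
    model = tf.keras.models.load_model("emotion_model.h5")

    def predict_emotion(image_path: str) -> str:
        face = cv2.imread(image_path, cv2.IMREAD_GRAYSCALE)
        face = cv2.resize(face, (48, 48)).astype("float32") / 255.0
        face = face.reshape(1, 48, 48, 1)          # batch of one grayscale image
        probabilities = model.predict(face)[0]
        return LABELS[int(np.argmax(probabilities))]

    print(predict_emotion("visitor_face.jpg"))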

A Python entry script ties the components together and executes the whole project. Face recognition is required because the robot needs to know all of the admissions staff: for urgent or unknown cases, it can organize an online meeting at a time that suits both the staff member and the student.
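A minimal sketch of matching a camera frame against stored staff photos is shown below; it assumes the open-source face_recognition library, and the file names are placeholders, not the project's data.

    # Sketch: match a visitor's face against known admissions-staff photos.
    # The face_recognition library and the file names are assumptions for illustration.
    import face_recognition

    known_staff = {
        "staff_member_1": face_recognition.face_encodings(
            face_recognition.load_image_file("staff_member_1.jpg"))[0],
    }

    def identify_staff(image_path: str):
        encodings = face_recognition.face_encodings(face_recognition.load_image_file(image_path))
        if not encodings:
            return None                            # no face found in the frame
        for name, known_encoding in known_staff.items():
            if face_recognition.compare_faces([known_encoding], encodings[0])[0]:
                return name                        # matched a known staff member
        return None                                # face found, but not a known staff member

    print(identify_staff("camera_frame.jpg"))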

Finally, the robot can interact speech-to-speech, text-to-text, or text-to-speech using the Text-to-Speech (TTS) and Speech-to-Text (STT) packages for Flutter, which work offline without an internet connection or noticeable delay.
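The speech interface itself runs through the Flutter TTS/STT packages on the device; purely as an offline text-to-speech illustration in the same language as the backend sketches above (and not the project's actual Flutter code), a pyttsx3 call looks like this:

    # Illustration only: offline text-to-speech with pyttsx3.
    # The project itself uses Flutter TTS/STT packages on the device; this is not its code.
    import pyttsx3

    engine = pyttsx3.init()        # uses the local OS speech engine, no internet connection needed
    engine.say("Welcome to the MIU admissions office. How can I help you?")
    engine.runAndWait()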