
Authors

Ahmed Serag, Ahmed Yehia, John Emad, Karim Mohamed, Karim Khaled

Supervised by: Walaa Hassan, Hager Sobeah


Publishing Date

January 3, 2022


Abstract

Many self-taught guitarists watch online tutorials to learn new songs, but beginners often find it difficult to execute the right technique. Playing the guitar depends mostly on left-hand movement, as the left hand controls the notes being played. Our project aims to correct a new guitarist's left-hand technique by tracking the hand's movement and correcting finger positions. The proposed solution also captures the frequency of each played note and then produces feedback that helps players adjust their technique, ensuring they are playing everything correctly rather than simply imitating players online. Finger recognition is done through MediaPipe, a computer vision framework that detects body motion and classifies objects, and which can run on most devices. We will also use predictive and classification models to assess the guitarist's performance.
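As an illustration of the finger-recognition step, below is a minimal sketch of hand-landmark extraction using MediaPipe's Hands solution in Python; the webcam index and confidence thresholds are assumptions for the sketch, not project settings.

```python
# A minimal sketch of hand-landmark extraction with MediaPipe's Hands
# solution; camera index and confidence thresholds are assumed values.
import cv2
import mediapipe as mp

mp_hands = mp.solutions.hands

def extract_landmarks(frame_bgr, hands):
    """Return the 21 (x, y, z) landmarks of the first detected hand, or None."""
    rgb = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2RGB)  # MediaPipe expects RGB
    results = hands.process(rgb)
    if not results.multi_hand_landmarks:
        return None
    hand = results.multi_hand_landmarks[0]
    return [(lm.x, lm.y, lm.z) for lm in hand.landmark]

with mp_hands.Hands(max_num_hands=1,
                    min_detection_confidence=0.5,
                    min_tracking_confidence=0.5) as hands:
    cap = cv2.VideoCapture(0)  # assumed webcam index
    ok, frame = cap.read()
    if ok:
        print(extract_landmarks(frame, hands))
    cap.release()
```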

1.1 Purpose of this document

The purpose of this document is to analyze and break down the software requirements for building a real-time guitarist-guiding system that helps artists and guitar coaches enhance their precision in playing the guitar. It describes the functional requirements as well as the expected behavior in a given environment, and it outlines all the tools and methods used in this project. The targeted readers for this document are the stakeholders: self-taught beginner guitarists aiming to learn the basics of playing the guitar, guitar coaches, guitar students, developers, and any investor interested in participating in this project.

1.2 Scope of this document

The scope of this document is to lay out the foundation of our project so that both developers and stakeholders can understand the solution we are providing, through a clear definition of the functional and non-functional requirements. Furthermore, this document's inheritance relationships and database design are still subject to change.

1.3 System Overview

Our system uses a webcam that records live footage of the guitarist and sends it to a VPS (virtual private server), where the footage is pre-processed and saved into a storage bucket. Two pieces of information are then extracted from the video: the sound of the note being played and the finger positions while playing it. The extracted information is saved into a database for future model training. Finally, the result of what the user played on the guitar is displayed, and a report is generated once the footage ends.
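For the note-extraction step, the sketch below shows one common way to map a detected frequency to a note name, assuming standard A4 = 440 Hz tuning; how the frequency itself is estimated from the audio (e.g. via an FFT peak or autocorrelation) is not specified by this overview and is left out here.

```python
# A minimal sketch of mapping a detected frequency to the nearest note
# name, assuming A4 = 440 Hz; the frequency estimator itself is omitted.
import math

NOTE_NAMES = ["C", "C#", "D", "D#", "E", "F", "F#", "G", "G#", "A", "A#", "B"]

def frequency_to_note(freq_hz, a4_hz=440.0):
    """Return the nearest note name and octave for a frequency in Hz."""
    midi = round(69 + 12 * math.log2(freq_hz / a4_hz))  # nearest MIDI number
    return f"{NOTE_NAMES[midi % 12]}{midi // 12 - 1}"

print(frequency_to_note(82.41))   # low E string  -> "E2"
print(frequency_to_note(329.63))  # high E string -> "E4"
```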

1.4 System Scope

  • Extract the hand and finger positions from a given guitar video performance.
  • Extract the single notes played from the audio of the guitarist's performance.
  • Extract the chords played by the guitarist using an MLP classifier (see the sketch after this list).
  • Provide music exercises for the user to try out.
  • Give real-time feedback on hand-motion correction and on whether the note or chord in the exercise was played correctly.
  • Provide a feedback report on the total notes hit correctly, along with guidance on how to correct the guitarist's hand movements.
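As a hedged illustration of the MLP-based chord extraction above, the sketch below trains scikit-learn's MLPClassifier on synthetic 12-dimensional chroma-like vectors; the real input features, chord labels, and training data are project-specific and are stand-ins here.

```python
# A hedged sketch of an MLP chord classifier using scikit-learn; the
# chroma-like features and chord labels below are synthetic placeholders.
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)
X_train = rng.random((200, 12))                        # placeholder chroma vectors
y_train = rng.choice(["C", "G", "Am", "F"], size=200)  # placeholder chord labels

clf = MLPClassifier(hidden_layer_sizes=(64, 32), max_iter=500, random_state=0)
clf.fit(X_train, y_train)

sample = rng.random((1, 12))
print(clf.predict(sample))  # predicted chord label for one feature vector
```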