Thanks for your prompt responses. Based on your personal profiles, I have arranged the project teams according to your career/academic goals, STEM skills, and topic/grouping preferences. Most researchers received their top or second choice. The list below shows your project teams for next year. Some teams share the same topic but have different focuses. You are welcome to provide feedback for further optimization.
Team 1: Autonomous Drone Navigation System in Indoor Environment
Team members: Owen, Theo
Description: Processing video from the drone's front camera to recognize and locate guiding
marks, and navigating the drone autonomously from the starting point to the destination based
on this sensor information.
Core technology: drone navigation, drone video image processing, laptop/desktop application
Platforms/Tools: Linux, AR Drone 2.0, AR Drone 2.0 SDK, Nodecopter, OpenCV
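To give a feel for the kind of processing this project involves (this is an illustration only, not part of the project toolchain, and the mark-detection rule is a made-up simplification), here is a minimal sketch of locating a bright guiding mark in a grayscale camera frame and turning its position into a steering offset, using NumPy on a synthetic frame:

```python
import numpy as np

def locate_mark(frame, threshold=200):
    """Return the (row, col) centroid of pixels brighter than threshold, or None."""
    ys, xs = np.nonzero(frame > threshold)
    if xs.size == 0:
        return None
    return ys.mean(), xs.mean()

def steering_offset(frame, threshold=200):
    """Horizontal offset of the mark from the image center, scaled to [-1, 1]."""
    mark = locate_mark(frame, threshold)
    if mark is None:
        return None  # no mark visible; the real system would hover or search
    _, col = mark
    center = frame.shape[1] / 2.0
    return (col - center) / center

# Synthetic 120x160 frame with a bright 10x10 "guiding mark" right of center.
frame = np.zeros((120, 160), dtype=np.uint8)
frame[50:60, 120:130] = 255
offset = steering_offset(frame)  # positive: mark is right of center, steer right
```

In the actual project this role would be filled by OpenCV feature or marker detection on the live AR Drone video stream; the thresholding rule above is only a stand-in for that step.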
Team 2: Drone Defense System: Vision System
Team members: Adnan, Noah
Description: Developing and designing a proof-of-concept prototype of a projectile-based anti-drone system that
can be used to protect buildings/property from drone intrusion.
Core technology: Sensors, stereo vision, image processing, computer vision, iOS app programming
Platforms/Tools: Linux, OpenCV, Video4Linux, Beaglebone Black, FLIR ONE, FLIR ONE SDK
Team 3: Drone Defense System: Intercepting System
Team members: Henry, Eduardo
Description: Developing and designing a proof-of-concept prototype of a projectile-based anti-drone system that
can be used to protect buildings/property from drone intrusion.
Core technology: Projectile launching system, robot design, robotic programming,
embedded design
Platforms/Tools: RobotC, Arduino, Vex Robotics System
Team 4: Brainwave Controlled Games/Devices: OpenVIBE
Team members: Brian, Sarah
Description: Using signal processing techniques to analyze and classify the brainwaves associated with different
mental activities, and controlling games or devices accordingly.
Core technology: Electroencephalogram (EEG) signal processing, classification
Platforms/Tools: NeuroSky Mindwave, OpenVIBE
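In OpenVIBE this kind of pipeline is built graphically from signal-processing boxes, but the underlying idea can be sketched in a few lines. The following toy example (an illustration only; the sampling rate, band limits, and the two-class alpha-vs-beta rule are simplifying assumptions, not the project's actual classifier) computes band power with an FFT and labels a synthetic signal:

```python
import numpy as np

FS = 512  # sampling rate in Hz (assumption; the real rate depends on the headset)

def band_power(signal, fs, lo, hi):
    """Mean spectral power of `signal` within the [lo, hi] Hz band."""
    freqs = np.fft.rfftfreq(signal.size, d=1.0 / fs)
    power = np.abs(np.fft.rfft(signal)) ** 2
    mask = (freqs >= lo) & (freqs <= hi)
    return power[mask].mean()

def classify(signal, fs=FS):
    """Toy rule: 'relaxed' if alpha (8-12 Hz) power dominates beta (13-30 Hz)."""
    alpha = band_power(signal, fs, 8, 12)
    beta = band_power(signal, fs, 13, 30)
    return "relaxed" if alpha > beta else "focused"

# One second of a synthetic 10 Hz (alpha-band) wave, standing in for real EEG.
t = np.arange(FS) / FS
alpha_wave = np.sin(2 * np.pi * 10 * t)
state = classify(alpha_wave)
```

A real classifier would be trained on labeled recordings from the headset rather than hand-set thresholds, but the feature-extraction step (band power per frequency range) is the common starting point.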
Team 5: Brainwave Controlled Games/Devices: NeuroSky Mindwave
Team member: Ty
Description: Using signal processing techniques to analyze and classify the brainwaves associated with different
mental activities, and controlling games or devices accordingly.
Core technology: Electroencephalogram (EEG) signal processing, mobile programming
Platforms/Tools: NeuroSky Mindwave, NeuroSky SDK, iOS/Android app programming
Team 6: Brainwave Controlled Games/Devices: Emotiv Insight
Team members: Isabelle, Harris
Description: Using signal processing techniques to analyze and classify the brainwaves associated with different
mental activities, and controlling games or devices accordingly.
Core technology: Electroencephalogram (EEG) signal processing, classification, mobile programming
Platforms/Tools: Emotiv Insight, Emotiv SDK, iOS/Android app programming
Team 7: Hand-to-Hand Communication
Team members: Julian, Lithu
Description: Using the hand gestures and movements of one person to drive those of another. Detecting
the neural signals from the first person, processing and amplifying those signals, and stimulating the
muscles/nerves of the second person's hand.
Core technology: Electromyography (EMG) signal processing, classification, Transcutaneous Electrical Nerve
Stimulation (TENS), PossessedHand
Platforms/Tools: Backyard Brains Human-Human Interface
Team 8: Educational Mobile Apps for the Visually Impaired
Team members: Gabriela, Lola, Katherine
Description: Developing educational mobile apps using Senseg FeelScreen Technology. The goal is to add a second
dimension to the traditional 1D braille display or audio for the visually impaired.
Core technology: Mobile app programming, FeelScreen with haptic-feedback
Platforms/Tools: Java, Android Studio, Senseg SDK
Team 9: General Mobile Apps for the Visually Impaired
Team member: Leslie
Description: Developing general mobile apps using Senseg FeelScreen Technology. The goal is to add a second
dimension to the traditional 1D braille display or audio for the visually impaired.
Core technology: Mobile app programming, FeelScreen with haptic-feedback
Platforms/Tools: Java, Android Studio, Senseg SDK
Team 10: 2D Tactile Display Based on Electrovibration
Team members: Aaron, Vivian
Description: Developing and designing a low-cost 2D tactile display system based on electrovibration for the
visually impaired. The system utilizes an iPad tablet and off-the-shelf components, and includes the power,
electronic, software, and optical subsystems.
Core technology: Electrovibration, image processing, mobile app programming, low-power DC-to-DC converter,
signal amplification, tablet technology
Platforms/Tools: iPad, Xcode, Objective-C, analog circuit design tools
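On the software side, the core image-processing step is turning picture content into a friction/vibration strength under the fingertip, typically by emphasizing edges. As an illustration only (the project itself targets Objective-C on iPad; the gradient-based mapping and the function names here are assumptions for the sketch), this can be expressed as:

```python
import numpy as np

def edge_intensity(image):
    """Gradient magnitude of a grayscale image, normalized to [0, 1]."""
    gy, gx = np.gradient(image.astype(float))
    mag = np.hypot(gx, gy)
    peak = mag.max()
    return mag / peak if peak > 0 else mag

def drive_amplitude(image, v_min=0.0, v_max=1.0):
    """Map edge intensity to a normalized electrovibration drive amplitude."""
    return v_min + (v_max - v_min) * edge_intensity(image)

# Synthetic image: dark left half, bright right half, i.e. one vertical edge.
img = np.zeros((8, 8))
img[:, 4:] = 255
amp = drive_amplitude(img)  # strongest drive along the edge columns
```

In the actual system the normalized amplitude would modulate the high-voltage AC signal produced by the DC-to-DC converter and amplifier subsystems, sampled at the touch coordinates reported by the tablet.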