Thursday, June 9, 2016

STEM Project Showcase 2016

Hi All,

The STEM Project Showcase is finally done! You might feel a sense of relief and, also, a sense of achievement! YES. Congratulations! You all put together an AWESOME presentation! I believe you shared the same excitement as me! Like most real-world engineering or research projects, there is always a lot of hard work and there are numerous failures before the final publication or product release. At times, it can be a lonely and frustrating journey. However, the results are super sweet if you survive. I hope the STEM experience you had this year becomes an asset for the years to come! After the showcase yesterday, Gaby said, "This is our last STEM class!" Only then did I realize that we won't have STEM classes together anymore :( However, for the graduating seniors, it means that you will be starting to tackle much larger, harder, real-world problems! Good luck in all your "new projects"! Don't forget to come back and share your new problems, experiences, and solutions!

An introduction video for our STEM challenges, dreams, actions, difficulties, and teams.


Room 201 was packed with a curious audience. Students, parents, teachers, administrators, and a college professor all came to participate in and support our event.
Sarah and Brian showed how OpenVIBE can be used to capture, process, analyze, and classify brainwaves and translate them into mental commands to control games.
Owen and Theo described their journey through solving the autonomous drone navigation problem, and demonstrated a video of a line-following drone. 
Henry and Eduardo shared their game for autistic children. They introduced dialogue and relationship building into the game. The game is designed in a non-exclusive way so that all players can enjoy the fun.
Isabelle explained how EEG signals can reveal our mental state. She introduced the new EEG headset and the process of developing an app to acquire and process brainwaves.
Adnan integrated his understanding of compound eyes, object tracking, and machine learning to solve the drone tracking problem. He demonstrated a compound-eye frame of multiple cameras, and the results of his algorithms.
Ty and Noah discussed machine learning, the new trend in artificial intelligence. They explained how the perceptron and the multilayer perceptron can be used to solve classification problems.
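To give a flavor of the idea Ty and Noah presented, here is a minimal sketch (in Python, not their actual code) of a single-layer perceptron learning the logical AND function with the classic error-driven update rule:

```python
# A minimal perceptron: a hedged sketch of the classification idea,
# not the students' actual implementation.

def predict(weights, bias, x):
    """Step activation: output 1 if the weighted sum exceeds zero."""
    s = sum(w * xi for w, xi in zip(weights, x)) + bias
    return 1 if s > 0 else 0

def train(data, lr=1, epochs=20):
    """Classic perceptron update: nudge weights by lr * error * input."""
    weights, bias = [0, 0], 0
    for _ in range(epochs):
        for x, target in data:
            error = target - predict(weights, bias, x)
            weights = [w + lr * error * xi for w, xi in zip(weights, x)]
            bias += lr * error
    return weights, bias

# AND is linearly separable, so a single perceptron can learn it.
and_data = [([0, 0], 0), ([0, 1], 0), ([1, 0], 0), ([1, 1], 1)]
w, b = train(and_data)
print([predict(w, b, x) for x, _ in and_data])  # -> [0, 0, 0, 1]
```

A single perceptron can only draw one straight decision boundary, which is why problems like XOR need the multilayer version they also covered.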
Julian and Lithu investigated the possibility of hand-to-hand communication through EMG and TENS signals. They performed gesture recognition by analyzing and classifying EMG signals.
Kate and Leslie demonstrated how to use Xcode and the OpenCV library to create an iPad app that can recognize subway signs (numbers) and help visually impaired users navigate through subway stations.

Gaby and Lola shared their experiences developing apps with haptic feedback on a Senseg tablet that help visually impaired children comprehend graphical information in subjects like math and geography.
Aaron and Vivian illustrated the steps to design and develop a multipixel tactile display based on electrovibration. It's a unique hardware project that involves a wide range of equipment, components, materials, and chemicals.




