Friday, April 8, 2016

STEM Hackathon II

Hi All,

We are going to have our second STEM Hackathon from 9:00 am to 4:00 pm this Saturday (tomorrow, April 9). Though some of you will be attending college Admitted Students' Day or traveling to visit your candidate colleges, you can still work on your projects over the weekend to catch up on your progress.

In addition, I have invited the Machine Learning Research Group (an after-school activity) to join us the same day for a Machine Learning Workshop. Some of them may become Advanced STEM Research students next year. Please feel free to explain or demonstrate your projects to them when they approach you.

Since the ACT will be administered at our school tomorrow, please be mindful when you come to school, and please go directly to RM 201. Since people are expected to come in at various times due to other commitments, please start working on your own project as soon as you arrive. We will have a "whole group" meeting later in the day when more people arrive. If you will be late, please email Mr. Lin so that we can arrange lunch for you. I am looking forward to seeing you tomorrow!

Tuesday, March 29, 2016

STEM Seminar 3/29 Abstracts

STEM Team 8: Educational Apps
Our team’s goal for this presentation is to explain to the audience the process of getting haptic feedback on the Senseg tablet. Hopefully, after listening to us, you will understand the concept and know how to use the Senseg SDK, as well as integrate it into Android Studio. We will also quickly cover the methods provided with the SDK and the different superclasses and subclasses that allow us to use the feedback to our advantage.

STEM Team 4: OpenVIBE
In this seminar, we will go over some of the stages of running a successful BCI scenario:
  • Data acquisition. This involves connecting the headset to the OpenVibe Acquisition Server.
  • Preprocessing and feature extraction. This involves building an OpenVibe scenario to filter the acquired data, isolate noise, and extract the desired features. Our presentation will feature and explain some of the OpenVibe boxes we used in the process of building our scenario, such as Temporal Filter, Time Based Epoching, and Simple DSP.
  • Classification. This involves using classification algorithms to assign a specific class to each data point. Our presentation will further explain the distinction between this step and preprocessing. We will also explain how offline training is used to fine-tune an algorithm and acquaint it with the data to ensure more precise classification. The algorithm we will feature is the support vector machine (SVM).
Our presentation will conclude with a demonstration of an OpenVibe scenario we've built incorporating the above elements. Our scenario is a virtual-reality handball game, where the user thinks "right" or "left" to throw the ball and score.
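
For a concrete feel for the classification stage, here is a minimal Python sketch using scikit-learn's support vector machine on synthetic feature vectors (OpenVibe does this step graphically; the data, feature count, and class separation here are made up purely for illustration):

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
# Synthetic stand-in for band-power features extracted from epochs:
# class 0 = "left" motor imagery, class 1 = "right" motor imagery.
left = rng.normal(loc=0.0, scale=1.0, size=(100, 8))
right = rng.normal(loc=1.5, scale=1.0, size=(100, 8))
X = np.vstack([left, right])
y = np.array([0] * 100 + [1] * 100)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
clf = SVC(kernel="rbf")          # support vector machine classifier
clf.fit(X_train, y_train)        # the "offline training" phase
acc = clf.score(X_test, y_test)  # accuracy on held-out epochs
print(f"held-out accuracy: {acc:.2f}")
```

In a real scenario the features would come from the preprocessing boxes upstream, and the trained classifier would be applied online to each new epoch.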

STEM Team 6: Electroencephalography (EEG)
Electroencephalography provides a new method for insight into the way we, as humans, work. It is the study of brain waves and the different ways this data is received. EEG technology has existed for 80 years, yet only recently is it being fully taken advantage of for its commercial uses. This presentation will explore new, innovative applications of EEG in everyday life, recent developments in the technology, and what this means for the future.

STEM Team 10: Multipixel Display  
Innovative technology has been advancing since the Industrial Revolution, and one such field is haptics engineering, which benefits the general public by making technology more convenient. In our presentation, we will discuss how to create a display screen with printed circuit boards (PCBs). PCBs are a cheap and effective technology. A PCB is a copper-clad circuit board; the copper is etched away so that all that is left is a circuit. In our project we will be using PCBs to create the board that people will feel. We are using PCBs because of their various benefits. First, they are cost effective. Second, they allow us to easily produce many different circuit layouts displaying different shapes. In addition, the materials needed are easily accessible to us. Each tactile pixel will be 1 mm by 1 mm. An important part of the process is determining the trace width: if the lines are too thick, they will defeat the illusion, but if they are too thin, they will be unable to carry the correct charge. The most important part of this project will be testing, and fixing our project if it fails to perform correctly.

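
Because determining the trace width is such an important part of the process, here is a rough Python sketch based on the widely cited IPC-2221 current-capacity formula (the constants below are the published external-layer values; treat the result as a ballpark estimate, not a fabrication spec):

```python
# IPC-2221 estimate: I = k * dT^0.44 * A^0.725, solved for trace width.
def trace_width_mils(current_a, temp_rise_c=10.0, thickness_oz=1.0, external=True):
    k = 0.048 if external else 0.024      # IPC-2221 constants
    # cross-sectional area in mil^2 needed to carry the current
    area = (current_a / (k * temp_rise_c ** 0.44)) ** (1 / 0.725)
    # 1 oz/ft^2 copper is about 1.378 mil thick
    return area / (thickness_oz * 1.378)

w = trace_width_mils(0.5)  # width for a 0.5 A trace, 10 C rise, 1 oz copper
print(f"{w:.1f} mil")
```

For example, a 0.5 A trace on 1 oz copper with a 10 °C rise comes out to roughly 4-5 mil wide, which helps bound how thin the lines can safely be.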
STEM Team 3: Video Games for Autistic Children
Team 3 is focused on the creation of a video game that integrates the functions of a standard game with functions designed to teach autistic children basic social skills through AI-human interaction. The task here is to teach social skills fluently through “experiences”. Our thinking is that the user will be able to develop a relationship with their in-game “companion”. Once the game itself is completed, human testing will need to be done to evaluate effectiveness and to receive feedback from the users; we’ve outlined our proposed method for this in our presentation. What’s left to be added to the game is character name selection, character customization, companion selection, fixed companion animations, companion-user dialogue, and sound effects. We’ve only just added the companion and started working on the story/dialogue. However, by the end of the year we believe we will have a working game ready to be tested on users.
 
STEM Team 1: Autonomous Drone Navigation System in Indoor Environment 
After working with the front camera and realizing that it would not work, we decided to switch methods. Now, we are using the bottom camera to recognize markers and lines that we place on the floor. Since the drone will not be flying very high, the image-recognition program will be more accurate. We will explain why our old method eventually failed, how our new method works, and why we chose the strategy we did. We are still able to use most of the code from the old program, since we are still trying to recognize colors that the camera sees.
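
As a toy illustration of the color-based recognition idea described above (not the team's actual code; the frame and thresholds are invented), the centroid of a colored floor marker can be located with plain NumPy:

```python
import numpy as np

# Toy BGR frame: dark floor with a red marker patch at rows 40-60, cols 70-90.
frame = np.zeros((120, 160, 3), dtype=np.uint8)
frame[40:60, 70:90] = (0, 0, 255)  # pure red in BGR order

# Keep pixels that are "red enough" (a simple stand-in for cv2.inRange).
mask = (frame[:, :, 2] > 200) & (frame[:, :, 0] < 50) & (frame[:, :, 1] < 50)

ys, xs = np.nonzero(mask)
cy, cx = ys.mean(), xs.mean()  # marker centroid in image coordinates
print(cx, cy)  # prints: 79.5 49.5
```

The drone's controller could then steer toward the centroid to keep the marker under the bottom camera.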

Wednesday, March 23, 2016

STEM Seminar 3/24 Abstracts


STEM Team 9: What is Hough Circle Transform and How does it Work?
We will be talking about the OpenCV function "HoughCircles" and how it pertains to our project. We will inform our classmates about the essential function of HoughCircles and how the algorithm works to provide computer-vision applications with circle detection. Additionally, we will briefly talk about our experience with HoughCircles and the difficulties we have encountered so far. We will briefly show how we have included the algorithm in our code. Our focus will mainly be on how the function will help us develop our application, in addition to explaining the algorithm to those unfamiliar with it. A draft of our first slide is shown below.

STEM Team 2: Drone Defense System: Vision System
Team 2 is focused on the creation of a drone defense system. The task is to spot an unidentified drone, track its movements, and then take it down with an autonomous system. With the increased use of drones now and in the future, it is imperative to build such systems to protect property from irresponsible drone use. The drone defense system was divided into the vision system, which concentrates on detection and tracking, and the interception system, which concentrates on incapacitating the drone. As the project evolved, however, the focus shifted to the vision system alone, and the goal became to detect and track the drone successfully and efficiently. The main aspects of the detection system that differentiate my work from previous systems are my use of the compound eye and machine learning. A compound eye, such as the eye of a flying insect, is composed of many facets that each capture a different image. The compound eye does not need to move like single-aperture eyes such as ours, because each facet is angled so that the captured images together form a single spherical image. To capture images of the drone, my detection system’s camera will simulate this capturing method of a compound eye. Machine learning refers to artificial intelligence that is able to learn on its own without explicit programming. Specifically, I am using a multilayer perceptron that organizes and maps input data to appropriate outputs. The multilayer perceptron will be used with the regression method, enabling me to predict the drone’s location in three-dimensional space based on a pre-recorded data set of the drone’s position. Ultimately, with these implementations, I hope to create an innovative system that contributes to the overall STEM field of object detection.
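
A minimal sketch of the regression idea, using scikit-learn's MLPRegressor on synthetic data standing in for the pre-recorded drone positions (the feature count, network size, and data here are all illustrative):

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(1)
# Synthetic stand-in for the recorded data set: each sample maps
# image-space features (here 6 numbers) to a 3-D drone position.
X = rng.uniform(-1, 1, size=(500, 6))
true_W = rng.normal(size=(6, 3))
Y = X @ true_W + 0.01 * rng.normal(size=(500, 3))  # positions + small noise

mlp = MLPRegressor(hidden_layer_sizes=(32,), max_iter=2000, random_state=0)
mlp.fit(X[:400], Y[:400])      # offline training on recorded samples
pred = mlp.predict(X[400:])    # predict held-out 3-D positions
err = np.abs(pred - Y[400:]).mean()
print(f"mean absolute error: {err:.3f}")
```

The real system would feed features derived from the compound-eye images in place of the random inputs used here.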

STEM Team 7: Hand-to-Hand Communication
The purpose of our project, Hand-to-Hand Communication, is to use the EMG signals within our arms to mimic movements on another individual's or robotic arm. Today, we will demonstrate how we take raw data and process it. Using our code, we find the highest values, the peaks for each muscle movement, and then graph those values. This process is repeated with other muscles in an attempt to find a pattern.
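
A small Python sketch of this peak-finding step, using SciPy on a synthetic rectified-EMG trace (the real recordings, sampling rate, and thresholds will differ):

```python
import numpy as np
from scipy.signal import find_peaks

rng = np.random.default_rng(2)
t = np.arange(3000)
# Synthetic stand-in for rectified EMG: low noise plus three "muscle bursts".
emg = 0.05 * np.abs(rng.normal(size=t.size))
for center in (500, 1500, 2500):
    emg += np.exp(-((t - center) ** 2) / (2 * 40.0 ** 2))

# Keep only peaks that clear an amplitude threshold, and enforce a
# minimum spacing so each burst contributes exactly one peak.
peaks, _ = find_peaks(emg, height=0.5, distance=200)
print(len(peaks), peaks)
```

The peak indices and heights can then be graphed per muscle, as described above, to look for movement patterns.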

Sunday, March 20, 2016

STEM Seminar 3/24 & 3/29

We are going to have our next STEM Seminar this coming Thursday (3/24) and next Tuesday (3/29). Each group should prepare to present a key topic of its research field in depth, based on both your study and research experience. The presentation should be no longer than 12 minutes (including 2-3 minutes for questions). The abstract of your presentation should be submitted to Mr. Lin by the end of Tuesday (3/22). Presentation materials should be of professional quality: concise, accurate, logical, rich in content, and visually pleasant. Each team should rehearse its presentation to ensure proper coordination and time management. Each member of the team should participate in the oral presentation. All demonstrations need to be set up beforehand to eliminate prolonged transition time. Presenters should dress in business casual.

Once your team has finished your presentation, your presentation and multimedia files should be posted to your team blog. You can upload your files to Google Drive first and then share the link on your blog.

Friday, January 29, 2016

STEM Research Hackathon Highlights

Hi All,

Hope you enjoyed this new research experience! Congrats to the teams that made major breakthroughs today! I believe more progress will be coming soon from all the teams. Here are some highlights of our first hackathon:
  • Team 8 got first-time haptic feedback from their Senseg tablet and successfully displayed the US map on the screen! This breakthrough moves the team into the app-building phase!
  • Team 2 collected the first set of images from the "compound eyes" model using a cell phone camera, and patched them together in Photoshop. This exercise helps the team explore the strengths and limitations of the current model.
  • Team 9 successfully separated a red-colored object from its background and displayed it as a binary image! They tested the code on the number 3 subway sign and the result was astonishingly perfect!
  • Team 3 revealed the story flow of their first app for the autistic children. 
  • Team 7 started collecting and processing meaningful EMG data for the pinky through SpikerShield+Arduino+MatLab. The data collection process will continue for the rest of the fingers, and the team will then start the data analysis and classification phase.
  • Team 1 demonstrated their drone target-following algorithm. Though they were in the middle of modifying their algorithm, we could almost see that the drone will soon be following you everywhere!
Enjoy your long weekend and see you next semester!

Debugging the vision-based drone navigation code.
Why is the binary image conversion not working?
Focusing on research.
OpenCV scenarios are so....... hard!
EMG data processing: finding the peak values.
Special lunch time entertainment.
First time having tables for class!
Smile.... STEM class!

Thursday, January 28, 2016

STEM Research Hackathon

Hi All,

We are going to have our first whole-day STEM Research Hackathon tomorrow from 9:00 am to 2:45 pm. In those 5 hours and 45 minutes, we are going to work intensively on our STEM projects and make dramatic progress! Before you come to the hackathon tomorrow, every team should set a clear, specific, measurable, and ambitious goal, and send me an email with your goal in writing. We hope to have several breakthroughs tomorrow. Pizza and soda will be served for lunch at noon. Near the end of the day (2:00 - 2:45 pm), every team will give a brief demonstration of its achievements for the day. See you tomorrow!

Saturday, January 9, 2016

STEM Progress Meeting II

We are going to have our second STEM Progress Meeting next Tuesday (01/12/2016). Please prepare thoroughly and be ready for any questions. Each team will have 5 ~ 10 minutes to present, depending on the size of the team. Everyone on the team should present and participate in Q & A. The outline for the presentation is listed below.

STEM Team Progress Report Outlines
1. Topic
   a. Research topic
   b. Team member(s)
2. Research problem (if your problem stays the same, just briefly recap in one slide)
   a. Definition & scope
   b. Expected outcomes
3. Method (briefly recap the overall method, but focus on the detailed methods of current sub-problems.)
   a. Methods used to solve the problem(s)
   b. Evaluation and comparison of different methods
4. Progress
   a. Gantt chart : current status & next steps
   b. Project achievements
   c. Demonstrations
5. Problems & Risks
   a. Technical issue(s)
   b. Resource issue(s): tools, materials, etc. that need to be purchased (vendor, price, and time frame)
   c. Potential show stopper and backup plan
6. References (updated Project Resource page on ....)
   a. Books/magazines
   b. Video tutorials
   c. Websites
   d. Papers from professional journals
   e. Theses/dissertations

Every team member should have an equal opportunity to present/demo. Group and individual performance will be evaluated by peers and the teacher.

Good luck!