Tuesday, March 29, 2016

STEM Seminar 3/29 Abstracts

STEM Team 8: Educational Apps
Our team’s goal for this presentation is to explain the process of getting haptic feedback working on the Senseg tablet. Hopefully, after listening to us, you will understand the concept, know how to use the Senseg SDK, and be able to integrate it into Android Studio. We will also briefly cover the methods provided with the SDK and the superclasses and subclasses that allow us to use the feedback to our advantage.

STEM Team 4: OpenVIBE
In this seminar, we will go over some of the stages of running a successful BCI scenario:

1. Data acquisition. This involves connecting the headset to the OpenVibe Acquisition Server.
2. Preprocessing and feature extraction. This involves building an OpenVibe scenario to filter the acquired data, isolate noise, and extract the desired features. Our presentation will feature and explain some of the OpenVibe boxes we used while building our scenario, such as Temporal Filter, Time Based Epoching, and Simple DSP.
3. Classification. This involves using classification algorithms to assign a specific class to each data point. Our presentation will further explain the distinction between this step and preprocessing. We will also explain how offline training is used to fine-tune an algorithm and acquaint it with the data to ensure more precise classification. The algorithm we will feature is the support vector machine (SVM).

Our presentation will conclude with a demonstration of an OpenVibe scenario we've built incorporating the above elements: a virtual reality handball game in which the user thinks "right" or "left" to throw the ball and score.
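
For readers less familiar with what these boxes do, here is a minimal offline sketch of the same kind of pipeline in Python rather than in OpenVibe itself: a band-pass temporal filter, time-based epoching, a simple power feature, and an SVM classifier. The sampling rate, frequency band, epoch length, and data are all illustrative assumptions, not our actual settings.

```python
# Illustrative offline analogue of the pipeline described above. This is NOT
# the OpenVibe API; it mirrors the Temporal Filter, Time Based Epoching, and
# SVM steps using SciPy and scikit-learn. All parameters are assumptions.
import numpy as np
from scipy.signal import butter, filtfilt
from sklearn.svm import SVC

FS = 128             # assumed sampling rate (Hz)
EPOCH_SECONDS = 1.0  # assumed epoch length

def temporal_filter(eeg, low=8.0, high=30.0, fs=FS):
    """Band-pass filter each channel (rows = channels, cols = samples)."""
    b, a = butter(4, [low / (fs / 2), high / (fs / 2)], btype="band")
    return filtfilt(b, a, eeg, axis=1)

def epoch(eeg, fs=FS, length=EPOCH_SECONDS):
    """Cut the continuous signal into non-overlapping epochs."""
    step = int(fs * length)
    n = eeg.shape[1] // step
    return [eeg[:, i * step:(i + 1) * step] for i in range(n)]

def band_power_features(epochs):
    """One simple feature per channel: mean squared amplitude (signal power)."""
    return np.array([np.mean(e ** 2, axis=1) for e in epochs])

# Offline training on labelled data (0 = "left", 1 = "right"), then prediction.
rng = np.random.default_rng(0)
train_eeg = rng.standard_normal((8, FS * 60))   # 8 channels, 60 s of placeholder data
labels = rng.integers(0, 2, size=60)            # one label per 1 s epoch
X = band_power_features(epoch(temporal_filter(train_eeg)))
clf = SVC(kernel="rbf").fit(X, labels)
print(clf.predict(X[:5]))                       # classify the first few epochs
```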

STEM Team 6: Electroencephalography (EEG)
Electroencephalography provides a new window into the way we, as humans, work. It is the study of brain waves and the different ways this data can be recorded. EEG technology has existed for about 80 years, yet only recently has it been fully taken advantage of for its commercial uses. This presentation will explore new, innovative applications of EEG in everyday life, recent developments in the technology, and what this means for the future.

STEM Team 10: Multipixel Display  
Innovative technology has been advancing since the Industrial Revolution, and one such field is haptics engineering, which benefits the general public by making technology more convenient. In our presentation, we will discuss how to create a display screen with printed circuit boards (PCBs). PCBs are a cheap and effective technology: a PCB is a copper-clad board from which the copper is etched away so that only the desired circuit remains. In our project we will be using PCBs to create the board that people will feel. We are using PCBs because of their various benefits. First, they are cost effective. Second, they allow us to easily produce many different circuit layouts displaying different shapes. In addition, the materials we need are readily accessible. Each tactile pixel will be 1 mm by 1 mm. An important part of the process is determining the trace width: if the lines are too thick, they will defeat the illusion, but if they are too thin, they will be unable to carry the required charge. The most important part of this project will be testing our board and fixing it if it fails to perform correctly.
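
To give a sense of the trace-width trade-off, here is a rough sketch using the commonly cited IPC-2221 relation between current-carrying capacity and trace cross-section; the current, temperature rise, and copper weight below are illustrative assumptions, not our actual design values.

```python
# Rough trace-width estimate using the IPC-2221 formula for external layers:
#   I = k * dT^0.44 * A^0.725,  with A in square mils and k = 0.048.
# The current, temperature rise, and copper thickness below are illustrative
# assumptions, not the actual values of our display design.
def min_trace_width_mm(current_a, temp_rise_c=10.0, copper_oz=1.0, k=0.048):
    # Solve for the required cross-sectional area A (square mils).
    area_sq_mils = (current_a / (k * temp_rise_c ** 0.44)) ** (1 / 0.725)
    thickness_mils = copper_oz * 1.378          # 1 oz/ft^2 copper is about 1.378 mil thick
    width_mils = area_sq_mils / thickness_mils
    return width_mils * 0.0254                  # 1 mil = 0.0254 mm

# Example: a 0.1 A trace on 1 oz copper with a 10 degree C allowed temperature rise.
print(f"minimum width is roughly {min_trace_width_mm(0.1):.3f} mm")
```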

STEM Team 3: Video Games for Autistic Children
Team 3 is focused on creating a video game that integrates the functions of a standard game with functions designed to teach autistic children basic social skills through AI-human interaction. The task is to teach social skills naturally through in-game “experiences”. Our thinking is that the user will be able to develop a relationship with their in-game “companion”. Once the game itself is completed, human testing will be needed to evaluate its effectiveness and to gather feedback from users; we outline our proposed method for this in our presentation. What remains to be added to the game is character name selection, character customization, companion selection, fixed companion animations, companion-user dialogue, and sound effects. We have only just added the companion and started working on the story and dialogue. However, by the end of the year we believe we will have a working game ready to be tested with users.
 
STEM Team 1: Autonomous Drone Navigation System in Indoor Environment 
After working with the front camera and realizing that it would not work, we have decided to switch methods. We are now using the bottom camera to recognize markers and lines that we place on the floor. Since the drone will not be flying very high, the image-recognition program will be more accurate. We will explain why our old method eventually failed, how our new method works, and why we chose this strategy. We are still able to reuse most of the code from the old program, since both approaches rely on recognizing the colors the camera sees.
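
As a rough illustration of this kind of color-based marker detection (not our project's actual code), the sketch below thresholds a camera frame in HSV space and returns the centroid of the largest matching region; the HSV range and camera index are assumed placeholders.

```python
# Illustrative color-marker detection for a downward-facing camera.
# This is a generic OpenCV sketch, not our project's code; the HSV range
# below (roughly red) is an assumed placeholder for the marker color.
import cv2
import numpy as np

def find_marker_center(frame_bgr, lower=(0, 120, 70), upper=(10, 255, 255)):
    """Return the (x, y) pixel centroid of the largest blob in the HSV range."""
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, np.array(lower), np.array(upper))
    # [-2] keeps this working across OpenCV 3.x and 4.x return conventions.
    contours = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)[-2]
    if not contours:
        return None
    largest = max(contours, key=cv2.contourArea)
    m = cv2.moments(largest)
    if m["m00"] == 0:
        return None
    return int(m["m10"] / m["m00"]), int(m["m01"] / m["m00"])

# Example: read one frame from the bottom camera's video stream.
cap = cv2.VideoCapture(0)          # device index is an assumption
ok, frame = cap.read()
if ok:
    print(find_marker_center(frame))
cap.release()
```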

Wednesday, March 23, 2016

STEM Seminar 3/24 Abstracts

STEM Team 9: What Is the Hough Circle Transform and How Does It Work?
We will be talking about the OpenCV function "HoughCircles" and how it pertains to our project. We will inform our classmates about the essential function of HoughCircles and how the algorithm provides computer vision applications with circle detection. Additionally, we will briefly talk about our experience with HoughCircles and the difficulties we have encountered so far, and we will briefly show how we have included the algorithm in our code. Our focus will mainly be placed on how the function will help us develop our application, in addition to explaining the algorithm to those unfamiliar with it. A draft of our first slide is shown below.
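
For reference, a minimal sketch of how cv2.HoughCircles is typically called is shown here; the parameter values are illustrative assumptions rather than the ones tuned for our application.

```python
# Minimal HoughCircles sketch (parameter values are illustrative assumptions,
# not the ones we tuned for our application).
import cv2
import numpy as np

img = cv2.imread("coins.jpg")                      # placeholder input image
gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
gray = cv2.medianBlur(gray, 5)                     # smoothing reduces false circles

circles = cv2.HoughCircles(
    gray,
    cv2.HOUGH_GRADIENT,   # the gradient-based variant of the Hough transform
    dp=1,                 # accumulator resolution (same as image resolution)
    minDist=40,           # minimum distance between detected circle centers
    param1=100,           # upper Canny edge threshold
    param2=30,            # accumulator threshold: lower = more (possibly false) circles
    minRadius=10,
    maxRadius=100,
)

if circles is not None:
    for x, y, r in np.round(circles[0]).astype(int):
        cv2.circle(img, (x, y), r, (0, 255, 0), 2)  # outline each detected circle
    cv2.imwrite("circles.png", img)
```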

STEM Team 2: Drone Defense System: Vision System
Team 2 is focused on the creation of a drone defense system. The task is to spot an unidentified drone, track its movements, and then take it down with an autonomous system. With the increased use of drones now and in the future, it is imperative to build such systems to protect property from irresponsible drone use. The drone defense system was divided into the vision system, which concentrates on detecting and tracking the drone, and the interception system, which concentrates on incapacitating it. As the project evolved, however, the focus shifted to the vision system alone, and the goal became to detect and track the drone successfully and efficiently. The main aspects of the detection system that differentiate it from previous systems are my use of a compound eye and machine learning. A compound eye, such as the eye of a flying insect, is composed of many facets that each capture a different image. The compound eye does not need to move like single-aperture eyes, such as those of humans, because each facet is angled so that all the captured images together form a single spherical image. To capture images of the drone, my detection system’s camera will simulate the capturing method of a compound eye. Machine learning refers to artificial intelligence that is able to learn on its own without being explicitly programmed. Specifically, I am using a multilayer perceptron, which organizes and maps input data to appropriate outputs. The multilayer perceptron will be used with the regression method, enabling me to predict the drone’s location in three-dimensional space based on a pre-recorded data set of the drone’s position. Ultimately, with these implementations, I hope to create an innovative system that contributes to the overall STEM field of object detection.
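
As a rough sketch of the regression step (the features, network size, and data below are assumed placeholders, not the compound-eye features themselves), a multilayer perceptron can be trained on a pre-recorded position data set like this:

```python
# Illustrative sketch of multilayer-perceptron regression for 3-D position
# prediction. The feature vector, network size, and data are all assumed
# placeholders; the real features would come from the compound-eye images.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
features = rng.standard_normal((500, 16))      # 16 image-derived features per sample
positions = rng.uniform(-5, 5, size=(500, 3))  # pre-recorded (x, y, z) drone positions

X_train, X_test, y_train, y_test = train_test_split(features, positions, test_size=0.2)

mlp = MLPRegressor(hidden_layer_sizes=(64, 32), max_iter=2000, random_state=0)
mlp.fit(X_train, y_train)                      # offline training on recorded data

print("predicted (x, y, z):", mlp.predict(X_test[:1])[0])
print("R^2 on held-out data:", mlp.score(X_test, y_test))
```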

STEM Team 7: Hand-to-Hand Communication
The purpose of our project, Hand-to-Hand Communication, is to use the EMG signals within our arms to mimic movements on another individual's arm or a robotic arm. Today, we will demonstrate how we take raw data and process it: we find the highest values, the peaks for each muscle movement, using a script, and then graph those values. This process is repeated with other muscles in an attempt to find a pattern.
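
A minimal sketch of this peak-finding step is below, using synthetic data; the sampling rate, smoothing window, and threshold are placeholders rather than our calibrated values.

```python
# Illustrative EMG peak detection: rectify the raw signal, smooth it, and
# find the highest values for each muscle contraction. The sampling rate,
# threshold, and synthetic data are assumptions, not our recorded values.
import numpy as np
from scipy.signal import find_peaks
import matplotlib.pyplot as plt

FS = 1000                                         # assumed sampling rate (Hz)
t = np.arange(0, 5, 1 / FS)
raw = 0.05 * np.random.randn(t.size)              # baseline noise
raw[1000:1200] += np.sin(2 * np.pi * 80 * t[1000:1200])   # fake contraction bursts
raw[3000:3200] += np.sin(2 * np.pi * 80 * t[3000:3200])

rectified = np.abs(raw)                           # full-wave rectification
envelope = np.convolve(rectified, np.ones(100) / 100, mode="same")  # moving average

peaks, _ = find_peaks(envelope, height=0.3, distance=FS // 2)
print("contraction peaks at t =", t[peaks], "s, values:", envelope[peaks])

plt.plot(t, envelope)
plt.plot(t[peaks], envelope[peaks], "rx")         # mark the detected peaks
plt.xlabel("time (s)")
plt.ylabel("EMG envelope")
plt.show()
```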

Sunday, March 20, 2016

STEM Seminar 3/24 & 3/29

We are going to have our next STEM Seminar this coming Thursday (3/24) and next Tuesday (3/29). Each group should prepare to present a key topic of your research field in depth, based on both your study and research experience. The presentation should be no longer than 12 minutes (including 2-3 minutes for questions). The abstract of your presentation should be submitted to Mr. Lin by the end of Tuesday (3/22). Presentation materials should be of professional quality: concise, accurate, logical, rich in content, and visually appealing. Each team should rehearse your presentation to ensure proper coordination and time management. Each member of the team should participate in the oral presentation. All demonstrations need to be set up beforehand to eliminate prolonged transition time. Presenters should dress in business casual.

Once your team has finished your presentation, your presentation and multimedia files should be posted to your team blog. You can upload your files to Google Drive first and then share the link on your blog.