Our team’s goal for this presentation is to explain the process of getting haptic feedback on the Senseg tablet. Hopefully, after listening to us, you will understand the concept, know how to use the Senseg SDK, and be able to integrate it into Android Studio. We will also briefly cover the methods provided with the SDK and the different superclasses and subclasses that allow us to use the feedback to our advantage.
STEM
Team 4: OpenVIBE
In this seminar, we will go over the stages of running a successful BCI scenario:
Data acquisition. This involves connecting the headset to the OpenViBE Acquisition Server.
Preprocessing and feature extraction. This involves building an OpenViBE scenario to filter the acquired data, remove noise, and extract the desired features. Our presentation will feature and explain some of the OpenViBE boxes we used while building our scenario, such as Temporal Filter, Time based epoching, and Simple DSP.
Classification. This involves using classification algorithms to assign a specific class to each data point. Our presentation will explain the distinction between this step and preprocessing. We will also explain how offline training is used to fine-tune an algorithm and acquaint it with the data to ensure more precise classification. The algorithm we will feature is the SVM (support vector machine).
Our presentation will conclude with a demonstration of an OpenViBE scenario we've built incorporating the above elements: a virtual-reality handball game in which the user thinks "right" or "left" to throw the ball and score.
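To make these stages concrete, here is a minimal offline sketch of a similar pipeline outside OpenViBE, using SciPy for the temporal (band-pass) filter and scikit-learn for the SVM. The sampling rate, frequency band, epoch length, and random stand-in data are illustrative assumptions, not the values from our actual scenario.

```python
import numpy as np
from scipy.signal import butter, filtfilt
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

FS = 250            # sampling rate in Hz (assumed)
BAND = (8.0, 30.0)  # band often used for motor imagery (assumed)
EPOCH_S = 1.0       # epoch length in seconds (assumed)

def temporal_filter(eeg, fs=FS, band=BAND, order=4):
    """Band-pass each channel, like OpenViBE's Temporal Filter box."""
    b, a = butter(order, [band[0] / (fs / 2), band[1] / (fs / 2)], btype="band")
    return filtfilt(b, a, eeg, axis=-1)

def epoch(eeg, fs=FS, epoch_s=EPOCH_S):
    """Cut the continuous signal into fixed-length epochs (Time based epoching)."""
    n = int(fs * epoch_s)
    n_epochs = eeg.shape[-1] // n
    return eeg[..., : n_epochs * n].reshape(eeg.shape[0], n_epochs, n)

def band_power(epochs):
    """Log band power per epoch and channel, a Simple DSP-style feature."""
    return np.log(np.mean(epochs ** 2, axis=-1)).T  # (n_epochs, n_channels)

# Random data standing in for "left"/"right" motor-imagery recordings.
rng = np.random.default_rng(0)
eeg = rng.standard_normal((8, FS * 60))             # 8 channels, 60 s
feats = band_power(epoch(temporal_filter(eeg)))
labels = rng.integers(0, 2, size=feats.shape[0])    # 0 = left, 1 = right

# Offline training: fit the SVM and estimate accuracy before going online.
print(cross_val_score(SVC(kernel="linear"), feats, labels, cv=5).mean())
```

With real recordings, the cross-validation score from the offline step is what tells you whether the classifier is ready to drive an online scenario like the handball game.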
STEM
Team 6: Electroencephalography (EEG)
Electroencephalography provides a new method for insight into the way we, as humans, work. It is the study of brain waves and the different ways this data is recorded. EEG technology has existed for 80 years, yet only recently is it being fully taken advantage of for its commercial uses. This presentation will explore new, innovative applications of EEG in everyday life, recent developments in the technology, and what this means for the future.
STEM
Team 10: Multipixel Display
Innovative technology has been advancing since the Industrial Revolution, and one such field is haptics engineering, which benefits the general public by making technology more convenient. In our presentation, we will discuss how to create a display screen with printed circuit boards (PCBs). PCBs are a cheap and effective technology: a PCB is a copper-clad board on which the copper is etched away so that all that is left is a circuit. In our project, we will use PCBs to create the board that people will feel. We are using PCBs because of their various benefits. First, they are cost-effective. Second, they allow us to easily produce many different circuit layouts displaying different shapes. In addition, the materials we need are easily accessible. Each tactile pixel will be 1 mm by 1 mm. An important part of the process is determining the trace width: if the lines are too thick, they will defeat the illusion, but if they are too thin, they will be unable to carry the required current. The most important part of this project will be testing and fixing our design if it fails to perform correctly.
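As a rough illustration of the trace-width tradeoff, the sketch below inverts the commonly cited IPC-2221 external-trace approximation, I = k · ΔT^0.44 · A^0.725, to estimate the minimum width that can carry a given current. The current, temperature rise, and copper weight are assumed example values, not our board's actual specifications.

```python
# Minimum trace width from the IPC-2221 external-layer approximation:
#   I = k * dT**0.44 * A**0.725, with A the cross-section in square mils.
# All inputs below are illustrative assumptions, not measured values.

K_EXTERNAL = 0.048   # IPC-2221 constant for external traces
MIL_PER_MM = 39.37

def min_trace_width_mm(current_a, temp_rise_c=10.0, copper_oz=1.0):
    """Smallest external trace width (mm) that carries `current_a` amps."""
    thickness_mil = 1.378 * copper_oz  # 1 oz copper is about 1.378 mil thick
    area_sq_mil = (current_a / (K_EXTERNAL * temp_rise_c ** 0.44)) ** (1 / 0.725)
    return (area_sq_mil / thickness_mil) / MIL_PER_MM

# Example: a 0.5 A drive current on 1 oz copper with a 10 C temperature rise.
print(f"{min_trace_width_mm(0.5):.3f} mm")
```

A calculation like this gives a lower bound on trace width; the upper bound comes from the tactile illusion itself, which is why testing on the physical board matters so much.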
STEM
Team 3: Video Games for Autistic Children
Team 3 is focused on creating a video game that integrates the functions of a standard game with functions designed to teach autistic children basic social skills through AI-human interaction. The task here is to teach social skills fluidly through “experiences”. Our thinking is that the user will be able to develop a relationship with their in-game “companion”. Once the game itself is completed, human testing will need to be done to gauge its effectiveness and gather feedback from users; we’ve outlined our proposed method for this in our presentation. What remains to be added to the game is character name selection, character customization, companion selection, fixed companion animations, companion-user dialogue, and sound effects. We have only just added the companion and started working on the story and dialogue, but by the end of the year we believe we will have a working game ready to be tested on users.
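As one possible illustration of how companion-user dialogue could be structured (the node layout and lines here are hypothetical placeholders, not our game's actual script), a branching conversation can be stored as a small graph of nodes:

```python
from dataclasses import dataclass, field

@dataclass
class DialogueNode:
    """One companion line plus the player choices branching from it."""
    companion_line: str
    choices: dict[str, str] = field(default_factory=dict)  # choice text -> next node id

# Hypothetical mini-scene practicing greetings; not the game's real content.
DIALOGUE = {
    "greet": DialogueNode(
        "Hi! When someone waves at you, what could you do?",
        {"Wave and say hello": "praise", "Look away": "encourage"},
    ),
    "praise": DialogueNode("Nice! Waving back is a friendly way to respond."),
    "encourage": DialogueNode("That's okay. Next time, try a small wave back."),
}

def play(node_id: str = "greet") -> None:
    """Walk the dialogue graph, always taking the first listed choice."""
    node = DIALOGUE[node_id]
    print("Companion:", node.companion_line)
    if node.choices:
        choice, next_id = next(iter(node.choices.items()))
        print("You:", choice)
        play(next_id)

play()
```

Keeping the dialogue as data rather than hard-coded logic would also make it easier to revise lines after user testing.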
STEM
Team 1: Autonomous Drone Navigation System in Indoor Environment
After working with the front camera and realizing that it would not work, we have decided to switch methods. Now we are using the bottom camera to recognize markers and lines that we place on the floor. Since the drone will not be flying very high, the image recognition program will be more accurate. We will explain why our old method eventually failed, how our new method works, and why we chose the strategy we did. We are still able to reuse most of the code from the old program, since we are still trying to recognize the colors that the camera sees.
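To give an idea of the kind of color recognition involved (the HSV range here is an assumed placeholder, not our tuned values), a bottom-camera frame can be thresholded for a marker color and the marker's center located like this with OpenCV:

```python
import cv2
import numpy as np

# Assumed HSV range for a red floor marker; real values need tuning per lighting.
LOWER = np.array([0, 120, 80])
UPPER = np.array([10, 255, 255])

def find_marker_center(frame_bgr):
    """Return the (x, y) pixel center of the largest marker-colored blob, or None."""
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, LOWER, UPPER)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    largest = max(contours, key=cv2.contourArea)
    m = cv2.moments(largest)
    if m["m00"] == 0:
        return None
    return int(m["m10"] / m["m00"]), int(m["m01"] / m["m00"])
```

The offset of the detected center from the image midline can then be fed back to steer the drone along the markers and lines.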