Hi All,
Hope you enjoy this new research experience!
Congrats to the teams that made major breakthroughs today! I believe
more progress will be coming soon from all the teams. Here are some
highlights of our first hackathon:
- Team 8 got haptic feedback from their Senseg tablet for the first time and successfully displayed the US map on the screen! This breakthrough moves the team into the app-building phase!
- Team 2 collected the first set of images from the "compound eyes" model using a cell phone camera and stitched them together in Photoshop. This exercise helped the team explore the strengths and limitations of the current model.
- Team 9 successfully separated a red object from its background and displayed it as a binary image! They tested the code on the number 3 subway sign, and the result was astonishingly good!
- Team 3 revealed the story flow of their first app for autistic children.
- Team 7 started collecting and processing meaningful EMG data for the pinky using SpikerShield + Arduino + MATLAB. Data collection will continue for the rest of the fingers, and the team will then start the data analysis and classification phase.
- Team 1 demonstrated their drone target-following algorithm. Though they were in the middle of modifying the algorithm, we could almost see the drone following you everywhere soon!
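For the curious, Team 9's red-object separation can be sketched in a few lines. This is a minimal illustration of color thresholding (here in NumPy rather than the team's actual code), and the threshold values are made-up examples, not the ones Team 9 used:

```python
import numpy as np

def red_to_binary(img, red_min=150, other_max=100):
    """Binary mask: 1 where a pixel is strongly red, 0 elsewhere.

    img is an H x W x 3 uint8 RGB array. The thresholds are
    illustrative guesses, not the team's actual values.
    """
    r, g, b = img[..., 0], img[..., 1], img[..., 2]
    mask = (r >= red_min) & (g <= other_max) & (b <= other_max)
    return mask.astype(np.uint8)

# Tiny synthetic "sign": a red patch on a white background.
img = np.full((4, 4, 3), 255, dtype=np.uint8)
img[1:3, 1:3] = (200, 30, 30)  # red patch
print(red_to_binary(img))      # 1s mark the red patch
```

In OpenCV the same idea is usually done with `cv2.inRange`, often in HSV color space so lighting changes hurt less.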
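Team 7's next step, finding peaks in the EMG signal, can be sketched as a simple thresholded local-maximum search. This is a hypothetical stand-in for MATLAB's findpeaks, and the threshold and sample data are invented for illustration:

```python
import numpy as np

def find_emg_peaks(signal, threshold=0.5):
    """Indices of samples above threshold that are local maxima.

    A minimal stand-in for MATLAB's findpeaks; the threshold
    value is illustrative, not from Team 7's pipeline.
    """
    s = np.asarray(signal, dtype=float)
    peaks = []
    for i in range(1, len(s) - 1):
        if s[i] > threshold and s[i] > s[i - 1] and s[i] >= s[i + 1]:
            peaks.append(i)
    return peaks

# Made-up rectified EMG trace with two contraction bursts.
emg = [0.1, 0.2, 0.9, 0.3, 0.1, 0.7, 1.2, 0.4, 0.2]
print(find_emg_peaks(emg))  # → [2, 6]
```

Real EMG processing would first rectify and smooth the raw signal before peak-picking, but the idea is the same.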
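And the core of a target-following behavior like Team 1's can be sketched as a proportional controller: each step, move a fraction of the remaining offset toward the target. This is a generic sketch, not Team 1's actual algorithm, and the gain is an arbitrary example value:

```python
def follow_step(drone_xy, target_xy, gain=0.2):
    """One proportional-control step toward the target.

    Generic sketch of target following: close a fraction `gain`
    of the remaining offset each step. Not Team 1's real code.
    """
    dx = target_xy[0] - drone_xy[0]
    dy = target_xy[1] - drone_xy[1]
    return (drone_xy[0] + gain * dx, drone_xy[1] + gain * dy)

pos, target = (0.0, 0.0), (10.0, 5.0)
for _ in range(30):
    pos = follow_step(pos, target)
# pos has converged very close to the target
```

A real drone would get the target offset from a vision pipeline and feed the correction into velocity commands, but the feedback loop looks like this.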
Enjoy your long weekend and see you next semester!
Photo captions from the day:
- Debugging the vision-based drone navigation code.
- Why is the binary image conversion not working?
- Focusing on research.
- OpenCV scenarios are so....... hard!
- EMG data processing: finding the peak values.
- Special lunch time entertainment.
- First time having tables for class!
- Smile.... STEM class!