Lotus is a LeafLabs project supported by a Small Business Innovation Research grant from the National Institute of Mental Health of the National Institutes of Health (Award Number R43MH109332). Its goal is to use light-field microscopy to visualize neural activity in the zebrafish brain at single-neuron resolution. To perform the high-speed volumetric calcium imaging this requires, we have proposed a microscope design with a frame rate of 300 FPS and a resolution of 4096 by 3072 pixels, enough to capture the entire neonatal zebrafish brain.
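Those sensor numbers imply a daunting data rate, which is easy to see with a back-of-envelope calculation. The frame geometry and frame rate come from the design above; the 12-bit pixel depth is an assumption for illustration only:

```python
# Back-of-envelope data-rate estimate for the proposed sensor.
# Width, height, and FPS are from the design; bit depth is assumed.
width, height = 4096, 3072
fps = 300
bits_per_pixel = 12  # assumed ADC depth, not from the post

pixels_per_second = width * height * fps
bytes_per_second = pixels_per_second * bits_per_pixel / 8

print(f"{pixels_per_second / 1e9:.2f} Gpixel/s")  # ~3.77 Gpixel/s
print(f"{bytes_per_second / 1e9:.2f} GB/s")       # ~5.66 GB/s
```

At any plausible bit depth, that is several gigabytes per second of raw imagery, which is why acquisition and storage are as much a part of the engineering problem as the optics.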
For our most recent team activity, we made our triumphant return to Shiso Kitchen for another great cooking class! On the menu: eggplant, potato, tomato, and white bean plaki; Mediterranean couscous; shrimp and mushroom skewers; and cinnamon sugar phyllo cups filled with mixed berries. Needless to say, it was delicious. Not only that, but we all had a lot of fun together. Team activity success!
Every now and then, LeafLabs has the pleasure of participating in outreach activities, giving back and engaging with the broader community. In the past, we’ve sponsored robotics competitions and volunteered at hackathons. This year, we decided to take advantage of MIT’s Independent Activities Period, to teach a class on something we’re passionate about: computer music.
Of all the organs in the human body, perhaps the most difficult to study is the brain, owing to its dazzling complexity and its isolation from the rest of the body afforded by the protective skull and blood-brain barrier. So imagine if a miniature copy of a brain could be grown in a dish from stem cells in the body. With such a technology, the possibilities seem endless for understanding brain diseases and for developing therapeutics quickly.
Maple was mentioned on Hackaday! Check it out and see what they had to say in their "The $2 32-Bit Arduino (With Debugging)" post.
How does a computer know which parts it is composed of? What hardware and peripherals are connected? For a desktop computer, many things like storage drives and attached USB devices are discoverable on boot. But for embedded systems, a lot of the hardware is connected via non-discoverable protocols such as SPI, UART, and GPIO. The kernel, which controls the hardware, needs to be told what devices are attached and how to talk to them.
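On embedded Linux, the usual answer is a device tree: a data structure, compiled from device tree source, that describes the non-discoverable hardware to the kernel at boot. A minimal sketch is below; the node names, addresses, and pin numbers are illustrative rather than taken from any real board:

```dts
/* Illustrative fragment: an SPI temperature sensor and a GPIO LED
 * described to the kernel. Addresses and pin numbers are made up. */
&spi0 {
    status = "okay";

    tempsensor@0 {
        compatible = "ti,tmp121";       /* string the driver matches on */
        reg = <0>;                      /* chip-select line 0 */
        spi-max-frequency = <1000000>;  /* bus speed limit: 1 MHz */
    };
};

leds {
    compatible = "gpio-leds";

    status-led {
        gpios = <&gpio1 12 0>;  /* controller, pin, flags */
        label = "status";
    };
};
```

The `compatible` string is the key link: the kernel uses it to match each described node against a driver, so the driver knows the device exists even though the bus itself can't report it.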
The LeafLabs Neuro team is pleased to announce their return to the Society for Neuroscience 2016 conference. Next week in San Diego, from 11/13 to 11/16, they'll be showing off some exciting new projects, along with updates on their progress since last year. While their project scope has expanded, their focus remains the big data challenges in neuroscience.
Have you ever wondered how information is carried over a modern interface like USB-C, or how audio is routed when your smartphone's manufacturer decides not to provide a 3.5mm headset jack? This post addresses these questions at a high level; I'll embed links to additional information throughout, for those who care to explore further.
Hello, Soon-to-Be College Graduates,
I have advice for you, especially if you'll be graduating within the next year. I am a recent MIT graduate starting my career at LeafLabs, working on sensor interfaces and FPGA design. So many students are starting to think about life beyond school, and I wanted to share what I learned while trying to enter the workforce. I really enjoy what I do and what I am working on, but while I was still in school, I was very uncertain about how to find a job that would be right for me. By exploring options both in and outside of school, you can prepare yourself and feel more confident leaving school behind.
Written by Andrew J. Meyer, LeafLabs CEO
Among my favorite moments during the three years we worked with Google to build Project Ara, was reading the deluge of internet comments following the press coverage of Google I/O 2014. The responses were split roughly 50/50 between “shut up and take my money” and “I’m an engineer and this is impossible...and stupid.” We knew we were on to something. We also knew a lot more than any of the commenters about what the technical risks and challenges actually were.
It was past the time we should have gone home, earlier that spring, when we huddled around a lab bench, inserted a few fully assembled modules - display, processor, and battery - into a sleek aluminum frame, and saw the Android logo pop up for the first time on a Project Ara phone. I think the battery died pretty soon after that, though it lasted longer than we expected. Engineering had barely started six months before, and for the first time, I changed my technical opinion about Paul Eremenko's grand vision of a modular phone from "probably impossible" to "let's get it done."
Thanks, Paul, for a vision that was absolutely worth chasing. Thanks, Regina and Ken, for pushing us to err on the side of epic. Thanks, Ara and Seth, for inviting us to be a “part of it.” Thank you, Google, for having the courage to keep looking beyond what’s “obviously possible” for new products.
Most of all, thank you to the tremendously talented engineers, both inside and outside of LeafLabs, who reliably pulled rabbits from hats, who repeatedly solved problems the internet specifically deemed unsolvable, and who formed a team that I'm proud to have been a part of.
Some of the buzz surrounding the Project Ara demo at Google I/O:
"A developer edition running Android with a 5.3-inch screen is shipping this fall, while a consumer version of the phone will be available some time in 2017. To get your hands on a device this year, you have to head over to ATAP's dedicated Ara website and fill out the form indicating what type of module you'd like to develop for the phone."
"'Step one, plug in a module,' Camargo told the crowd. 'Step two, use it.' The system can eject the modules electronically with a simple voice command like, 'okay Google, eject the camera.' That one got arguably the biggest applause line of the demo."
Bravo to LeafLabs' Eli who presented at the Association for Computing Machinery 31st Symposium on Applied Computing (SAC) last week (April 4-8) in Pisa, Italy! His paper "Structured Gotos are (Slightly) Harmful" was co-authored with Yossi Gil of Technion (Israel).
LeafLabs is excited to announce our partnership with the Center for Sensorimotor Neural Engineering (CSNE), an NSF-funded Engineering Research Center made up of MIT, the University of Washington, and San Diego State University. The CSNE focuses on the development of Brain-Computer Interfaces, particularly for the treatment of stroke and spinal cord injury. As an Industry Partner, LeafLabs looks forward to the prospect of working with CSNE labs on exciting and impactful projects that can leverage the benefits of collaboration between industry and academia.
Murray Carpenter (Boston Globe correspondent) interviews Dr. Ed Boyden of MIT and LeafLabs' CEO Andrew Meyer about advances being made in neurophysiology, big data, and of course Willow. Follow the link to read the full article, "Glut of data from mice brains tests MIT's computing power," on the Boston Globe website.
Now that it is getting dark at 7pm and we have broken out the bulky sweaters, it is nice to look back to August and our end-of-summer team activity: canoeing on the Charles. With thunderstorms in the forecast, we almost had to cancel. In the end, though, we had glorious weather (despite a few showers) for a blissful afternoon of canoeing and sightseeing.
Photos courtesy of LeafLabs' Perry Hung