LeafLabs would like to congratulate Airbus, A³, and the Vahana team on the first successful flight of Alpha One!
The New Year has started off as busy as ever: new projects rolling in, people returning from travel, and everyone gearing up for conferences.
Happy New Year from LeafLabs!
Greetings! There has been so much going on in the last couple of months that we wanted to take the chance to give a LeafLabs update. In addition to being busy with our daily operations, employees have been coming and going, traveling to clients and conferences.
LeafLabs would like to announce that the National Institute of Mental Health has awarded us a National Institutes of Health (NIH) grant under Award Number R44MH114783. The award provides the first year of funding for a Phase II Small Business Innovation Research (SBIR) grant that amounts to $1.99M of funding over three years. The grant will fund a collaboration between LeafLabs, Ken Shepard at Columbia University, and Ed Boyden at MIT to build a 1,000-channel silicon probe for neural recording in freely moving animals, and to implement hardware and software solutions for scalable data analysis.
Lotus is a LeafLabs project that uses a light-field microscope to capture high-resolution 3D images of neuron activity in zebrafish brains. Before a 3D image can be reconstructed from the raw images taken by the microscope, three processing steps must happen. First, a matrix known as the point spread function must be calculated using physical parameters from the experiment. Second, the raw images must be cropped and aligned. Finally, a 3D image can be reconstructed by using the point spread function to perform a 3D deconvolution.
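The final deconvolution step can be sketched in miniature. The snippet below is a toy 2D version (the real pipeline is 3D) using Richardson-Lucy iteration, a standard deconvolution algorithm; the Gaussian PSF and all sizes here are illustrative assumptions, not parameters from the Lotus pipeline.

```python
import numpy as np

def fft_convolve(image, psf):
    """Circular convolution via FFT. The PSF must be the same shape as the
    image, with its peak shifted to the origin (see ifftshift below)."""
    return np.real(np.fft.ifft2(np.fft.fft2(image) * np.fft.fft2(psf)))

def fft_correlate(image, psf):
    """Circular correlation -- the adjoint of fft_convolve."""
    return np.real(np.fft.ifft2(np.fft.fft2(image) * np.conj(np.fft.fft2(psf))))

def richardson_lucy(observed, psf, iterations=50, eps=1e-12):
    """Iteratively sharpen `observed`, given the point spread function `psf`."""
    estimate = np.full_like(observed, observed.mean())
    for _ in range(iterations):
        blurred = fft_convolve(estimate, psf)
        estimate = estimate * fft_correlate(observed / (blurred + eps), psf)
    return estimate

# Toy demonstration: blur a single point source with a Gaussian PSF, recover it.
n = 64
y, x = np.mgrid[:n, :n]
psf = np.exp(-((x - n // 2) ** 2 + (y - n // 2) ** 2) / 8.0)
psf /= psf.sum()
psf = np.fft.ifftshift(psf)          # move the PSF peak to the origin

truth = np.zeros((n, n))
truth[32, 32] = 1.0                  # a lone "neuron"
observed = fft_convolve(truth, psf)  # what the camera would record
recovered = richardson_lucy(observed, psf)
```

After a few dozen iterations, `recovered` is noticeably more peaked than `observed`, which is the essence of step three: undoing the blur described by the point spread function.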
Recently LeafLabs hosted a pizza party and grant-writing jam session at the McGovern Institute for Brain Research at MIT. The purpose of the event was to have our engineers and scientists talk about SBIRs, STTRs, and the collaborative grant-writing process with PIs, researchers, and graduate students.
Lotus is a LeafLabs project supported by a Small Business Innovation Research grant from the National Institute of Mental Health of the National Institutes of Health, under Award Number R43MH109332, with the goal of using light-field microscopy to visualize neural activity in the zebrafish brain at single-neuron resolution. In order to perform the high-speed volumetric calcium imaging used to visualize neural activity, we have proposed a microscope design with a frame rate of 300 FPS and a resolution of 4096 by 3072 pixels, enough to capture the entire neonatal zebrafish brain.
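Those numbers hint at why scalable data handling matters here. A back-of-envelope calculation (the 16-bit pixel depth below is an assumption; the post only gives resolution and frame rate):

```python
# Back-of-envelope data rate for the proposed microscope sensor.
# 4096 x 3072 pixels at 300 FPS come from the post; the 16-bit
# pixel depth is an assumption for illustration.
width, height = 4096, 3072
frames_per_second = 300
bytes_per_pixel = 2  # assumed 16-bit samples

bytes_per_second = width * height * frames_per_second * bytes_per_pixel
print(f"{bytes_per_second / 1e9:.2f} GB/s")  # ~7.55 GB/s
```

Roughly 7.5 gigabytes per second of raw imagery, before any reconstruction, which is why the processing pipeline has to be designed for scale from the start.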
We made our triumphant return to Shiso Kitchen, for another great cooking class, for our most recent team activity! On the menu, we had eggplant, potato, tomato, and white bean plaki, Mediterranean couscous, shrimp and mushroom skewers, and cinnamon sugar phyllo cups filled with mixed berries. Needless to say, it was delicious. Not only that but we all had a lot of fun together. Team activity success!
Every now and then, LeafLabs has the pleasure of participating in outreach activities, giving back and engaging with the broader community. In the past, we’ve sponsored robotics competitions and volunteered at hackathons. This year, we decided to take advantage of MIT’s Independent Activities Period to teach a class on something we’re passionate about: computer music.
Of all the organs in the human body, perhaps the most difficult to study is the brain, owing to its dazzling complexity and its isolation from the rest of the body afforded by the protective skull and blood-brain barrier. So imagine if a miniature copy of a brain could be grown in a dish from stem cells in the body. With such a technology, the possibilities seem endless for understanding brain diseases and for developing therapeutics quickly.
Maple was mentioned on Hackaday! Check it out and see what they had to say in their "The $2 32-Bit Arduino (With Debugging)" post.
How does a computer know which parts it is composed of? What hardware and peripherals are connected? For a desktop computer, many things like storage drives and attached USB devices are discoverable on boot. But for embedded systems, a lot of the hardware is connected via non-discoverable protocols such as SPI, UART, and GPIO. The kernel, which controls the hardware, needs to be told what devices are attached and how to talk to them.
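On Linux, this is typically done with a device tree: a data structure handed to the kernel at boot describing the non-discoverable hardware. A hypothetical fragment might declare an ADC hanging off an SPI bus like so (the bus label and clock rate here are illustrative, not from any particular board):

```dts
/* Hypothetical device tree fragment: tell the kernel an
 * ADC/touch controller sits on SPI bus 0, chip select 0. */
&spi0 {
    status = "okay";
    touchscreen@0 {
        compatible = "ti,ads7846";     /* string the driver matches on */
        reg = <0>;                     /* chip select 0 */
        spi-max-frequency = <2000000>; /* illustrative bus clock */
    };
};
```

With this in place, the kernel binds the matching driver at boot even though nothing on the wire announces the chip's presence.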
The LeafLabs Neuro team is pleased to announce their return to the Society for Neuroscience 2016 conference. Next week in San Diego from 11/13 to 11/16, they'll be showing off some exciting new projects, along with updates on their progress since last year. While their project scope has expanded, their focus remains the big data challenges in neuroscience.
Have you ever wondered how information is carried over a modern interface like USB-C, or how audio is routed when the manufacturer of your smartphone decides not to provide a 3.5mm headset jack? This post is intended to address these questions at a high level: I’ll embed various links to additional information, for those who care to explore further.
Hello, Soon-to-Be College Graduates,
I have advice for you, especially if you’ll be graduating in the next year. I am a recent graduate from MIT starting my career at LeafLabs, working on sensor interfaces and FPGA design. So many students are starting to think about life beyond school that I wanted to share the knowledge I gained when trying to enter the workforce. I really enjoy what I do and what I am working on, but while I was still in school, I was very uncertain about how to find a job that would be right for me. By exploring options both in and outside of school, you can prepare yourself and feel more confident leaving school behind.
Written by Andrew J. Meyer, LeafLabs CEO
Among my favorite moments during the three years we worked with Google to build Project Ara was reading the deluge of internet comments following the press coverage of Google I/O 2014. The responses were split roughly 50/50 between “shut up and take my money” and “I’m an engineer and this is impossible...and stupid.” We knew we were on to something. We also knew a lot more than any of the commenters about what the technical risks and challenges actually were.
It was past the time that we should have gone home, earlier that spring, when we huddled around a lab bench, inserted a few fully assembled modules - display, processor, and battery - into a sleek aluminum frame, and saw the Android logo pop up for the first time on a Project Ara phone. I think the battery died pretty soon after that, which was longer than we expected. Engineering had barely started 6 months before, and for the first time, I changed my technical opinion about Paul Eremenko’s grand vision of a modular phone from “probably impossible” to “let’s get it done.”
Thanks, Paul, for a vision that was absolutely worth chasing. Thanks, Regina and Ken, for pushing us to err on the side of epic. Thanks, Ara and Seth, for inviting us to be a “part of it.” Thank you, Google, for having the courage to keep looking beyond what’s “obviously possible” for new products.
Most of all, thank you to the tremendously talented engineers, both inside and outside of LeafLabs, who reliably pulled rabbits from hats, repeatedly solved problems the internet specifically deemed unsolvable, and for being a team that I’m proud to have been a part of.
Some of the buzz surrounding the Project Ara demo at Google I/O:
"A developer edition running Android with a 5.3-inch screen is shipping this fall, while a consumer version of the phone will be available some time in 2017. To get your hands on a device this year, you have to head over to ATAP's dedicated Ara website and fill out the form indicating what type of module you'd like to develop for the phone."
"'Step one, plug in a module,' Camargo told the crowd. 'Step two, use it.' The system can eject the modules electronically with a simple voice command like, 'okay Google, eject the camera.' That one got arguably the biggest applause line of the demo."