LeafLabs Grant Collaborations

Recently LeafLabs hosted a pizza party / grant-writing jam session at the McGovern Institute for Brain Research at MIT. The purpose of the event was to have our engineers and scientists talk about SBIRs, STTRs, and the collaborative grant-writing process with PIs, researchers, and graduate students.

CMV12000 Bring-Up: Trials and Tribulations

Lotus is a LeafLabs project supported by a Small Business Innovation Research grant from the National Institute of Mental Health of the National Institutes of Health, under Award Number R43MH109332. Its goal is to use light-field microscopy to visualize neural activity in the zebrafish brain at single-neuron resolution. To perform the high-speed volumetric calcium imaging needed to visualize that activity across the entire neonatal zebrafish brain, we have proposed a microscope design with a frame rate of 300 FPS at a resolution of 4096 by 3072 pixels.
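To give a sense of why sensor bring-up at these specs is hard, here is a back-of-the-envelope sketch of the raw data rate those numbers imply. The 10-bit-per-pixel figure is an assumption for illustration (the CMV12000 supports multiple bit depths), not a statement of the final design:

```python
# Rough raw data rate for the proposed microscope:
# 4096 x 3072 pixels captured at 300 frames per second.
width, height, fps = 4096, 3072, 300

pixels_per_frame = width * height            # 12,582,912 pixels per frame
pixels_per_second = pixels_per_frame * fps   # ~3.77 billion pixels per second

# Bit depth is an assumed value for this sketch.
bits_per_pixel = 10
gbits_per_second = pixels_per_second * bits_per_pixel / 1e9

print(f"{gbits_per_second:.1f} Gbit/s of raw pixel data")
```

Under these assumptions the sensor produces on the order of 38 Gbit/s, which is why moving, buffering, and storing the pixel stream dominates the system design.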

Team Cooking Extravaganza pt. II

We made our triumphant return to Shiso Kitchen for another great cooking class as our most recent team activity! On the menu: eggplant, potato, tomato, and white bean plaki; Mediterranean couscous; shrimp and mushroom skewers; and cinnamon-sugar phyllo cups filled with mixed berries. Needless to say, it was delicious. Not only that, but we all had a lot of fun together. Team activity success!

Silicon Probes Record Neural Activity From Brain Organoid

Of all the organs in the human body, perhaps the most difficult to study is the brain, owing to its dazzling complexity and its isolation from the rest of the body afforded by the protective skull and blood-brain barrier. So imagine if a miniature copy of a brain could be grown in a dish from stem cells in the body. With such a technology, the possibilities seem endless for understanding brain diseases and for developing therapeutics quickly.

Using Linux Device Trees for Fun and Profit

How does a computer know which parts it is composed of? What hardware and peripherals are connected? For a desktop computer, many things like storage drives and attached USB devices are discoverable on boot. But for embedded systems, a lot of the hardware is connected via non-discoverable protocols such as SPI, UART, and GPIO. The kernel, which controls the hardware, needs to be told what devices are attached and how to talk to them.
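As a concrete illustration of how the kernel is told about such hardware, here is a minimal, hypothetical device tree fragment describing a sensor attached over SPI. The node names, addresses, and compatible strings are all made up for this sketch; a real board's device tree uses values matching its hardware and drivers:

```
/ {
    spi@40013000 {
        compatible = "vendor,example-spi";     /* illustrative controller */
        #address-cells = <1>;
        #size-cells = <0>;

        sensor@0 {
            compatible = "vendor,example-sensor"; /* matched against a driver */
            reg = <0>;                     /* chip-select 0 on this SPI bus */
            spi-max-frequency = <1000000>; /* bus speed limit: 1 MHz */
        };
    };
};
```

At boot, the kernel walks this tree and binds each node to a driver whose compatible string matches, which is how a non-discoverable SPI sensor ends up with a working driver attached.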

LeafLabs at SFN 2016

The LeafLabs Neuro team is pleased to announce their return to the Society for Neuroscience 2016 conference. Next week in San Diego, from 11/13 to 11/16, they'll be showing off some exciting new projects, along with updates on their progress since last year. While their project scope has expanded, their focus remains on the big data challenges in neuroscience.

What Happened to My Headset Jack?

Have you ever wondered how information is carried over a modern interface like USB-C, or how audio is routed if the manufacturer of your smartphone decides not to provide a 3.5mm headset jack? This post addresses these questions at a high level; I'll embed links to additional information throughout, for those who care to explore further.

Open Letter to Future Graduates

Hello Soon to be College Graduates,

I have some advice for you, especially if you'll be graduating within the next year. I am a recent MIT graduate starting my career at LeafLabs, working on sensor interfaces and FPGA design. So many students are starting to think about life beyond school, so I wanted to share what I learned while trying to enter the workforce. I really enjoy what I do and what I am working on, but while I was still in school, I was very uncertain about how to find a job that would be right for me. By exploring options both in and outside of school, you can prepare yourself and feel more confident leaving school behind.

Project Ara: Erring on the Side of Epic

Written by Andrew J. Meyer, LeafLabs CEO

An early Project Ara phone boots for the first time!

Among my favorite moments during the three years we worked with Google to build Project Ara was reading the deluge of internet comments following the press coverage of Google I/O 2014. The responses were split roughly 50/50 between “shut up and take my money” and “I’m an engineer and this is impossible...and stupid.” We knew we were on to something. We also knew a lot more than any of the commenters about what the technical risks and challenges actually were.

It was past the time we should have gone home, earlier that spring, when we huddled around a lab bench, inserted a few fully assembled modules - display, processor, and battery - into a sleek aluminum frame, and saw the Android logo pop up for the first time on a Project Ara phone. I think the battery died pretty soon after that, which was longer than we expected. Engineering had barely started six months before, and for the first time, I changed my technical opinion about Paul Eremenko’s grand vision of a modular phone from “probably impossible” to “let’s get it done.”

Thanks, Paul, for a vision that was absolutely worth chasing. Thanks, Regina and Ken, for pushing us to err on the side of epic. Thanks, Ara and Seth, for inviting us to be a “part of it.” Thank you, Google, for having the courage to keep looking beyond what’s “obviously possible” for new products.

Most of all, thank you to the tremendously talented engineers, both inside and outside of LeafLabs, who reliably pulled rabbits from hats, who repeatedly solved problems the internet specifically deemed unsolvable, and who formed a team that I’m proud to have been a part of.

Project Ara in the News

Some of the buzz surrounding the Project Ara demo at Google I/O:


"It's been years since we've had anything like true experimentation and newness in smartphones. "

The Verge

"A developer edition running Android with a 5.3-inch screen is shipping this fall, while a consumer version of the phone will be available some time in 2017. To get your hands on a device this year, you have to head over to ATAP's dedicated Ara website and fill out the form indicating what type of module you'd like to develop for the phone."


"'Step one, plug in a module,' Camargo told the crowd. 'Step two, use it.' The system can eject the modules electronically with a simple voice command like, 'okay Google, eject the camera.' That one got arguably the biggest applause line of the demo.'

CSNE Partnership

LeafLabs is excited to announce our partnership with the Center for Sensorimotor Neural Engineering (CSNE), an NSF-funded Engineering Research Center made up of MIT, the University of Washington, and San Diego State University. The CSNE focuses on the development of Brain-Computer Interfaces, particularly for the treatment of stroke and spinal cord injury. As an Industry Partner, LeafLabs looks forward to the prospect of working with CSNE labs on exciting and impactful projects that can leverage the benefits of collaboration between industry and academia.