I have traveled the world, hiking in the mountains of Patagonia, the Swiss Alps, the rugged terrain of New Zealand, and among the cascading waterfalls of Iceland. But not this summer, which I spent zooming across the world over the Internet instead.
Scientific meetings, large and small, have all gone virtual. They are a different experience, with both downsides and upsides: virtual meetings lack the personal exchanges of in-person gatherings, but I have come to appreciate their value for keeping up with research, somewhere between reading a paper and having a personal discussion.
At the beginning of June, I attended a virtual meeting of the Learning in Machines and Brains (LMB) program sponsored by the Canadian Institute for Advanced Research (CIFAR). This program was initiated seventeen years ago by Geoffrey Hinton, a good friend and collaborator, and was continued by Yoshua Bengio and Yann LeCun, all recipients of a recent Turing Award for pioneering research on deep learning. LMB is a small group of a few dozen researchers at the top of their game, creating future capabilities for AI based on neural networks. Since 2012, when deep learning went public, the computing power used by the largest AI applications has been doubling every 3.4 months, far faster than Moore's law for VLSI chips, which doubles roughly every eighteen months.
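To make that comparison concrete, here is a minimal back-of-the-envelope sketch in Python (not from the article; the 24-month horizon is an arbitrary illustration) that compounds the two doubling times over the same period:

# Back-of-the-envelope comparison of the two doubling times cited above.
# The 24-month horizon is an arbitrary choice for illustration.

def growth_factor(months, doubling_time_months):
    """Multiplicative growth after `months`, given a doubling time in months."""
    return 2 ** (months / doubling_time_months)

horizon = 24
ai_compute = growth_factor(horizon, 3.4)   # roughly 133x over two years
moores_law = growth_factor(horizon, 18.0)  # roughly 2.5x over two years
print(f"AI compute over {horizon} months: ~{ai_compute:.0f}x")
print(f"Moore's law over {horizon} months: ~{moores_law:.1f}x")

At those rates, the compute behind the largest AI applications grows by roughly two orders of magnitude in two years, while Moore's law yields only about a 2.5-fold increase over the same span.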
The Telluride Neuromorphic Cognition Workshop is the longest-running workshop series sponsored by the NSF. It has been meeting annually in the mountain village of Telluride since 1994, when we founded it, bringing together fifty students and fifty faculty for three weeks to create a new technology based on bioinspired low-power analog VLSI chips. The attendees work together intensively, building robots and testing new chips. In July, the workshop was held virtually, including virtual joint projects.
Another virtual meeting I attended this summer was sponsored by the NIH on the BRAIN Initiative. In 2014, I served on the advisory committee to NIH director Francis Collins that planned this $5 billion, ten-year grand scientific and engineering challenge. We were asked to create a set of goals and milestones for innovative neurotechnologies to accelerate brain research. One goal was to record simultaneously from one million neurons. This was achieved earlier this year, ahead of schedule. Another goal was to use machine learning to analyze the deluge of data. New discoveries have overturned old ideas about how brains work. Advances in our understanding of brain circuits are in turn advancing AI based on neural networks.
Neuromorphic chips that use spikes, as brains do, are thousands of times more energy efficient than digital chips.
La Jolla is an hour away from hiking trails in the local mountains. One trail, called Cactus to the Clouds, ascends from Palm Springs, going straight up a ridgeline that continues for sixteen miles to the top of San Jacinto Peak, a net elevation gain of 10,800 feet, the largest of any day hike in the United States. It was a long day, starting well before dawn to avoid the heat and ending at dusk. Mercifully, there is a tram going down. A nice way to round out a virtual summer.
TERRENCE J. SEJNOWSKI, a pioneer in computational neurobiology, is Francis Crick Professor and Laboratory Head at the Salk Institute; an investigator at the Howard Hughes Medical Institute; and professor of biology and neurosciences at the University of California, San Diego. He is co-author (with Patricia Churchland) of The Computational Brain and (with Steven Quartz) of Liars, Lovers, and Heroes: What the New Brain Science Reveals About How We Become Who We Are.