MACHINE LEARNING of ANIMAL BEHAVIOR
http://www.opennn.net/
https://playground.tensorflow.org/
https://sourceforge.net/projects/opennn/
http://www.mousemotorlab.org/deeplabcut
https://stackoverflow.com/questions/11477145/open-source-neural-network-library
https://nature.com/magazine-assets/d41586-019-02942-5/d41586-019-02942-5.pdf
https://www.nature.com/articles/d41586-019-02942-5
Deep learning powers a motion-tracking revolution in open-source tools for analysing animal behaviour & posture
by Roberta Kwok / 11.30.2019
“As a postdoc, physiologist Valentina Di Santo spent a lot of time scrutinizing high-resolution films of fish. Di Santo was investigating the motions involved when fish such as skates swim. She filmed individual fish in a tank and manually annotated their body parts frame by frame, an effort that required about a month of full-time work for 72 seconds of footage. Using an open-source application called DLTdv, developed in the computer language MATLAB, she then extracted the coordinates of body parts — the key information needed for her research. That analysis showed, among other things, that when little skates (Leucoraja erinacea) need to swim faster, they create an arch on their fin margin to stiffen its edge1.
https://youtu.be/Eh6oIGE4dwI
But as the focus of Di Santo’s research shifted from individual animals to schools of fish, it was clear a new approach would be required. “It would take me forever to analyse [those data] with the same detail,” says Di Santo, who is now at Stockholm University. So she turned instead to DeepLabCut, an open-source software package developed by Mackenzie Mathis, a neuroscientist at Harvard University in Cambridge, Massachusetts, and her colleagues. The package allows users to train a computational model called a neural network to track animal postures in videos.
#protip: as of 2.1 we added dynamic cropping to video analysis. If your animal is a small part of the frame (e.g. openfield assays, as shown below), this will help with analysis speed!!! Docs now added for this feature: https://t.co/mvtkpUJdhQ pic.twitter.com/zbtTmhGyII
— DeepLabCut 🦄 (@DeepLabCut) November 4, 2019
The publicly available version didn’t have an easy way to track multiple animals over time, but Mathis’ team agreed to run an updated version using the fish data, which Di Santo annotated using a graphical user interface (GUI). The preliminary output looks promising, Di Santo says, although she is waiting to see how the tool performs on the full data set. But without DeepLabCut, she says, the study “would not be possible”.
Researchers have long been interested in tracking animal motion, Mathis says, because motion is “a very good read-out of intention within the brain”. But conventionally, that has involved spending hours recording behaviours by hand. The previous generation of animal-tracking tools mainly determined centre of mass and sometimes orientation, and the few tools that captured finer details were highly specialized for specific animals or subject to other constraints, says Talmo Pereira, a neuroscientist at Princeton University in New Jersey. Over the past several years, deep learning — an artificial-intelligence method that uses neural networks to recognize subtle patterns in data — has empowered a new crop of tools. Open-source packages such as DeepLabCut, LEAP Estimates Animal Pose (LEAP) and DeepFly3D use deep learning to determine coordinates of animal body parts in videos.
Complementary tools perform tasks such as identifying specific animals. These packages have aided research on everything from the motions of hunting cheetahs to collective zebrafish behaviour. Each tool has limitations; some require specific experimental set-ups, or don’t work well when animals are constantly crowded together. But methods will improve alongside advances in image capture and machine learning, says Sandeep Robert Datta, a neuroscientist at Harvard Medical School in Boston, Massachusetts. “What you’re looking at now is just the very beginning of what is certain to be a long-term transformation in the way neuroscientists study behaviour,” he says.
DeepLabCut is based on software used to analyse human poses. Mathis’ team adapted its underlying neural network to work for other animals with relatively few training data. Between 50 and 200 manually annotated frames are generally sufficient for standard lab studies, although the amount needed depends on factors such as data quality and the consistency of the people doing the labelling, Mathis says. In addition to annotating body parts with a GUI, users can issue commands through a Jupyter Notebook, a computational document popular with data scientists. Scientists have used DeepLabCut to study both lab and wild animals, including mice, spiders, octopuses and cheetahs. Neuroscientist Wujie Zhang at the University of California, Berkeley, and his colleague used it to estimate the behavioural activity of Egyptian fruit bats (Rousettus aegyptiacus) in the lab2.
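By way of illustration, a minimal DeepLabCut session driven from a notebook might look like the sketch below. The function names follow the DeepLabCut 2.x documentation; the project name, experimenter and video path are hypothetical.

```python
import deeplabcut

# Create a project; returns the path to its config.yaml file.
config = deeplabcut.create_new_project(
    "skate-fins", "vdisanto", ["videos/skate01.avi"], copy_videos=True
)

deeplabcut.extract_frames(config)           # choose frames to annotate
deeplabcut.label_frames(config)             # opens the annotation GUI
deeplabcut.create_training_dataset(config)  # package the labels for training
deeplabcut.train_network(config)            # train the neural network
deeplabcut.analyze_videos(config, ["videos/skate01.avi"])  # extract body-part coordinates
```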
The deep-learning-based posture-tracking package LEAP, developed by Pereira and his colleagues, requires 50–100 annotated frames for lab animals, says Pereira. More training data would be needed for wildlife footage, although his team has not yet conducted enough experiments to determine how much. The researchers plan to release another package called Social LEAP (SLEAP) this year to better handle footage of multiple, closely interacting animals. Jake Graving, a behavioural scientist at the Max Planck Institute of Animal Behavior in Konstanz, Germany, and his colleagues compared the performance of a re-implementation of the DeepLabCut algorithm and LEAP on videos of Grevy’s zebras (Equus grevyi)3.
They report that LEAP processed images about 10% faster, but the DeepLabCut algorithm was about three times as accurate. Graving’s team has developed an alternative tool called DeepPoseKit, which it has used to study behaviours of desert locusts (Schistocerca gregaria), such as hitting and kicking. The researchers report that DeepPoseKit combines the accuracy of DeepLabCut with a batch-processing speed that surpasses LEAP. For instance, tracking one zebra in 1 hour of footage filmed at 60 frames per second takes about 3.6 minutes with DeepPoseKit, 6.4 minutes with LEAP and 7.1 minutes with his team’s implementation of the DeepLabCut algorithm, Graving says. DeepPoseKit offers “very good innovations”, Pereira says. Mathis disputes the validity of the performance comparisons, but Graving says that “our results offer the most objective and fair comparison we could provide”. Mathis’ team reported an accelerated version of DeepLabCut that can run on a mobile phone in an article posted in September on the arXiv preprint repository4.
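Converting those reported times into raw throughput is simple arithmetic; the sketch below uses only the figures quoted above.

```python
# 1 hour of footage at 60 frames per second:
frames = 60 * 60 * 60  # 216,000 frames

# Reported analysis times, in minutes, for one zebra (from the text above).
times = {"DeepPoseKit": 3.6, "LEAP": 6.4, "DeepLabCut (re-implementation)": 7.1}

for tool, minutes in times.items():
    print(f"{tool}: ~{frames / (minutes * 60):.0f} frames per second")
# -> roughly 1,000, 562 and 507 frames per second, respectively
```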
Biologists who want to test multiple software solutions can try Animal Part Tracker, developed by Kristin Branson, a computer scientist at the Howard Hughes Medical Institute’s Janelia Research Campus in Ashburn, Virginia, and her colleagues. Users can select any of several posture-tracking algorithms, including modified versions of those used in DeepLabCut and LEAP, as well as another algorithm from Branson’s lab. DeepPoseKit also offers the option to use alternative algorithms, as will SLEAP. Other tools are designed for more specialized experimental set-ups. DeepFly3D, for instance, tracks 3D postures of single tethered lab animals, such as mice with implanted electrodes or fruit flies walking on a tiny ball that acts as a treadmill.
Pavan Ramdya, a neuroengineer at the Swiss Federal Institute of Technology in Lausanne (EPFL), and his colleagues, who developed the software, are using DeepFly3D to help identify which neurons in fruit flies are active when they perform specific actions. And DeepBehavior, developed by neuroscientist Ahmet Arac at the University of California, Los Angeles, and his colleagues, allows users to track 3D movement trajectories and calculate parameters such as velocities and joint angles in mice and humans. Arac’s team is using this package to assess the recovery of people who have had a stroke and to study the links between brain-network activity and behaviour in mice.
Scientists who want to study multiple animals often need to track which animal is which. To address this challenge, Gonzalo de Polavieja, a neuroscientist at Champalimaud Research, the research arm of the private Champalimaud Foundation in Lisbon, and his colleagues developed idtracker.ai, a neural-network-based tool that identifies individual animals without manually annotated training data. The software can handle videos of up to about 100 fish and 80 flies, and its output can be fed into DeepLabCut or LEAP, de Polavieja says. His team has used idtracker.ai to probe, among other things, how zebrafish decide where to move in a group5. However, the tool is intended only for lab videos rather than wildlife footage and requires animals to separate from one another, at least briefly.
Other software packages can help biologists to make sense of animals’ motions. For instance, researchers might want to translate posture coordinates into behaviours such as grooming, Mathis says. If scientists know which behaviour they’re interested in, they can use the Janelia Automatic Animal Behavior Annotator (JAABA), a supervised machine-learning tool developed by Branson’s team, to annotate examples and automatically identify more instances in videos.
An alternative approach is unsupervised machine learning, which does not require behaviours to be defined beforehand. This strategy might suit researchers who want to capture the full repertoire of an animal’s actions, says Gordon Berman, a theoretical biophysicist at Emory University in Atlanta, Georgia. His team developed the MATLAB tool MotionMapper to identify often repeated movements. Motion Sequencing (MoSeq), a Python-based tool from Datta’s team, finds actions such as walking, turning or rearing. By mixing and matching these tools, researchers can extract new meaning from animal imagery. “It gives you the full kit of being able to do whatever you want,” Pereira says.”
1. Di Santo, V., Blevins, E. L. & Lauder, G. V. J. Exp. Biol. 220, 705–712 (2017).
2. Zhang, W. & Yartsev, M. M. Cell 178, 413–428 (2019).
3. Graving, J. M. et al. eLife https://doi.org/10.7554/eLife.47994 (2019).
4. Mathis, A., Yüksekgönül, M., Rogers, B., Bethge, M. & Mathis, M. W. Preprint at https://arxiv.org/abs/1909.11229 (2019).
5. Heras, F. J. H., Romero-Ferrero, F., Hinz, R. C. & de Polavieja, G. G. PLoS Comput. Biol. 15, e1007354 (2019).
SUPPORTING DOZENS of PROGRAMMING LANGUAGES
https://jupyter.org/about
https://blog.jupyter.org/jupyter-meets-the-earth
https://blog.jupyter.org/project-jupyter-computational-narratives-as-the-engine-of-collaborative-data-science
https://nature.com/magazine-assets/d41586-018-07196-1/d41586-018-07196-1.pdf
https://www.nature.com/articles/d41586-018-07196-1
Why Jupyter is data scientists’ computational notebook of choice
by Jeffrey M. Perkel / 10.30.2018
“Perched atop the Cerro Pachón ridge in the Chilean Andes is a building site that will eventually become the Large Synoptic Survey Telescope (LSST). When it comes online in 2022, the telescope will generate terabytes of data each night as it surveys the southern skies automatically. And to crunch those data, astronomers will use a familiar and increasingly popular tool: the Jupyter notebook. Jupyter is a free, open-source, interactive web tool known as a computational notebook, which researchers can use to combine software code, computational output, explanatory text and multimedia resources in a single document. Computational notebooks have been around for decades, but Jupyter in particular has exploded in popularity over the past couple of years. This rapid uptake has been aided by an enthusiastic community of user–developers and a redesigned architecture that allows the notebook to speak dozens of programming languages — a fact reflected in its name, which was inspired, according to co-founder Fernando Pérez, by the programming languages Julia (Ju), Python (Py) and R.
One analysis of the code-sharing site GitHub counted more than 2.5 million public Jupyter notebooks in September 2018, up from 200,000 or so in 2015. In part, says Pérez, that growth is due to improvements in the web software that drives applications such as Gmail and Google Docs; the maturation of scientific Python and data science; and, especially, the ease with which notebooks facilitate access to remote data that might otherwise be impractical to download — such as from the LSST. “In many cases, it’s much easier to move the computer to the data than the data to the computer,” says Pérez of Jupyter’s cloud-based capabilities. “What this architecture helps to do is to say, you tell me where your data is, and I’ll give you a computer right there.”
For data scientists, Jupyter has emerged as a de facto standard, says Lorena Barba, a mechanical and aeronautical engineer at George Washington University in Washington DC. Mario Jurić, an astronomer at the University of Washington in Seattle who coordinates the LSST’s data-management team, says: “I’ve never seen any migration this fast. It’s just amazing.” Computational notebooks are essentially laboratory notebooks for scientific computing. Instead of pasting, say, DNA gels alongside lab protocols, researchers embed code, data and text to document their computational methods. The result, says Jupyter co-creator Brian Granger at California Polytechnic State University in San Luis Obispo, is a “computational narrative” — a document that allows researchers to supplement their code and data with analysis, hypotheses and conjecture.
For data scientists, that format can drive exploration. Notebooks, Barba says, are a form of interactive computing, an environment in which users execute code, see what happens, modify and repeat in a kind of iterative conversation between researcher and data. They aren’t the only forum for such conversations — IPython, the interactive Python interpreter on which Jupyter’s predecessor, IPython Notebook, was built, is another. But notebooks allow users to document those conversations, building “more powerful connections between topics, theories, data and results”, Barba says.
Researchers can also use notebooks to create tutorials or interactive manuals for their software. This is what Mackenzie Mathis, a systems neuroscientist at Harvard University in Cambridge, Massachusetts, did for DeepLabCut, a programming library her team developed for behavioural-neuroscience research. And they can use notebooks to prepare manuscripts, or as teaching aids. Barba, who has implemented notebooks in every course she has taught since 2013, related at a keynote address in 2014 that notebooks allow her students to interactively engage with — and absorb material from — lessons in a way that lectures cannot match. “IPython notebooks are really a killer app for teaching computing in science and engineering,” she said.
The Jupyter notebook has two components. Users input programming code or text in rectangular cells in a front-end web page. The browser then passes that code to a back-end ‘kernel’, which runs the code and returns the results (see our example at go.nature.com/2yqq7ak). By Pérez’s count, more than 100 Jupyter kernels have been created, supporting dozens of programming languages. Normally, each notebook can run only one kernel and one language, but workarounds exist. One demo notebook, for instance, speaks Python, Julia, R and Fortran.
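To make that two-part architecture concrete: on disk, a notebook is just a JSON document listing its cells and the kernel expected to run them. The sketch below builds a minimal one with the nbformat library; the file name and cell contents are arbitrary examples.

```python
import nbformat

# A notebook is a JSON document: a list of cells plus metadata
# naming the kernel that should execute the code cells.
nb = nbformat.v4.new_notebook()
nb.cells = [
    nbformat.v4.new_markdown_cell("# A computational narrative"),
    nbformat.v4.new_code_cell("print(2 + 2)"),
]
nb.metadata["kernelspec"] = {"name": "python3", "display_name": "Python 3"}
nbformat.write(nb, "demo.ipynb")
# The browser front end renders the cells; the kernel named above runs
# the code cells and sends their output back to the page.
```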
Importantly, the kernels need not reside on the user’s computer. When future users of the LSST use Jupyter notebooks to analyse their data, the code will be running on a supercomputer in Illinois, providing computational muscle no desktop PC could match. Notebooks can also run in the cloud. Google’s Colaboratory project, for instance, provides a Google-themed front-end to the Jupyter notebook. It enables users to collaborate and run code that exploits Google’s cloud resources — such as graphical processing units — and to save their documents on Google Drive.
Jupyter’s newest variant is JupyterLab, which launched as a beta in January 2018 and is available (like the Jupyter notebook) either as a stand-alone package or as part of the free Anaconda scientific-computing environment. Jason Grout is a software engineer at the financial-services company Bloomberg in San Francisco, California, and a member of the JupyterLab team. He calls JupyterLab a “next-generation web interface” for the Jupyter notebook — one that extends the familiar notebook metaphor with drag-and-drop functionality, as well as file browsers, data viewers, text editors and a command console. Whereas the standard Jupyter notebook assigns each notebook its own kernel, JupyterLab creates a computing environment that allows these components to be shared. Thus, a user could view a notebook in one window, edit a required data file in another, and log all executed commands in a third — all within a single web-browser interface.
Users can also customize JupyterLab to fit their workflow. Built-in viewers exist for image, text and CSV files, for instance, but users can build custom components as well. These could display things such as genomic alignments or geospatial data. An attendee on a course taught by Pérez even created a component to display 3D brain-imaging data. “This is a completely [neuroscience] domain-specific tool, obviously — the Jupyter team has no business writing these things. But we provide the right standards, and then that community in 24 hours can come back and write one,” he says.
https://www.youtube.com/watch?v=CoGCuliGNos
Two additional tools have enhanced Jupyter’s usability. One is JupyterHub, a service that allows institutions to provide Jupyter notebooks to large pools of users. The IT team at the University of California, Berkeley, where Pérez is a faculty member, has deployed one such hub, which Pérez uses to ensure that all students on his data-science course have identical computing environments. “We cannot possibly manage IT support for 800 students, helping them debug why the installation on their laptop is not working; that’s simply infeasible,” he says.
The other development is Binder, an open-source service that lets users run Jupyter notebooks hosted on GitHub in a web browser, without having to install the software or any programming libraries. Users can also execute Jupyter notebooks on the Google cloud by inserting https://colab.research.google.com/github before the URL of a notebook on GitHub, or by using the commercial service Code Ocean. In September, Code Ocean rolled out a new user interface for its cloud-based code-sharing and code-execution service, also based on Jupyter.
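As a concrete illustration of that URL trick (the repository path here is made up; in practice the github.com host is replaced rather than kept):

```
# A notebook hosted on GitHub:
https://github.com/username/repo/blob/master/analysis.ipynb

# The same notebook, opened and executed in Colaboratory:
https://colab.research.google.com/github/username/repo/blob/master/analysis.ipynb
```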
https://www.youtube.com/watch?v=eVAZKZZb8bU
Such tools foster computational reproducibility by simplifying code reuse. But users still need to know how to use notebooks correctly. Joel Grus, a research engineer at the Allen Institute for Artificial Intelligence in Seattle, Washington, gave a presentation titled ‘I don’t like notebooks’ at the Jupyter developers’ conference earlier this year in New York City. He says he has seen programmers get frustrated when notebooks don’t behave as expected, usually because they inadvertently run code cells out of order. Jupyter notebooks also encourage poor coding practice, he says, by making it difficult to organize code logically, break it into reusable modules and develop tests to ensure the code is working properly.
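A minimal sketch of the out-of-order trap Grus describes (the cell contents are invented for illustration):

```python
# Cell 1:
rate = 60                 # frames per second

# Cell 2:
frames = rate * 3600      # run once -> 216000

# If Cell 1 is later edited to `rate = 30` and re-run, but Cell 2 is not,
# `frames` still holds 216000: the code on screen no longer matches the
# kernel's state. Hence the advice below to restart the kernel and re-run
# the notebook from top to bottom.
```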
Those aren’t insurmountable issues, Grus concedes, but notebooks do require discipline when it comes to executing code: for instance, by moving analysis code to external files that can be called from the notebook, by defining key variables at the top of the notebook and by restarting the kernel periodically and running the notebook from top to bottom. As one Twitter user quipped, “Restart and run all or it didn’t happen.” That’s a lesson Barba tries to instil in her students. “I explain to my students from day one that they can interact with a notebook in a nonlinear fashion, and that gives them great power for exploration,” she says. “But with great power comes great responsibility.”
https://twitter.com/digitalFlaneuse/status/996481061092806658
One tool that might help is Verdant, a plug-in that captures a history of a user’s actions in Jupyter. “The authors built an extension that allows a flexible user workflow while also capturing the specific code executed, in what order and on what specific data,” says Carol Willing, a member of the Jupyter team at California Polytechnic State University.
Jake VanderPlas, a software engineer at Google in Seattle, Washington, and a member of the Colaboratory team, says notebooks are like hammers: they can be misused, and aren’t appropriate for every application. But for data exploration and communication, notebooks excel.
The astronomy community seemingly agrees. “We went from Jupyter notebooks not existing some six years ago to in essence everybody using them today,” says Jurić. “And we’re a community that still has Fortran 77” — as in 1977 — “sticking around. It’s something.”
PREVIOUSLY
GUERRILLA OPEN ACCESS
https://spectrevision.net/2016/02/18/guerrilla-open-access/
SARCASM RECOGNITION
https://spectrevision.net/2016/02/11/sarcasm-recognition/
ARTIFICIAL SYNAPSES
https://spectrevision.net/2018/01/26/artificial-synapses/