=== What is hardware-based synchronization? ===
Micro-Manager's bread and butter is multi-dimensional acquisition, where (typically) monochrome images are acquired in different channels and at different focal positions and XY positions. In multi-dimensional acquisition experiments it's often desirable to acquire images as quickly and precisely as possible, particularly across multiple channels and slices in a given time point. Some illumination devices, such as AOTF-based laser controllers and LED sources, can switch between channels extremely quickly, with submillisecond switching times. Likewise, some piezo-driven focus drives can move very fast. To take advantage of these fast operations, one would like to minimize the delays between setting each new channel and Z slice, and acquiring an image from the camera.
  
Unfortunately, computers have limited ability to provide fast and precise control of connected microscope hardware. Many control protocols contain unpredictable delays. And since the commonly-used operating systems are not "real time," there is never a guarantee that a task will be executed within a certain time frame. To achieve the tightest synchronization and fastest acquisition possible, it's necessary to use external hardware to electronically synchronize the equipment.
  
Since 2011, Micro-Manager has included growing support for hardware triggering of certain devices to enable this kind of synchronization. We are pleased to note that Micro-Manager's user interface required the addition of no new buttons, menus or other doodads to make hardware triggering available to users. Once the microscope hardware has been set up, users don't have to worry about the details of how channels and slices are acquired -- they just request what images they want, and Micro-Manager takes care of figuring out how to acquire the images, making use of hardware triggering whenever possible.
  
=== How hardware triggering works in Micro-Manager ===
  
Many microscope hardware components have inputs and outputs for TTL signals (see our Micro-Manager [http://valelab.ucsf.edu/publications/2010EdelsteinCurrProt.pdf tutorial]) that can be used for synchronization. These TTLs are usually physically connected by [http://en.wikipedia.org/wiki/BNC_connector BNC connectors].
  
Micro-Manager specifically offers built-in support for hardware triggering in which a TTL signal from the camera drives events in other devices (i.e. the camera is the clock). The "trigger out" signal of the camera is connected to the "trigger input" of another device controlling the channel or Z position.
  
Certain Micro-Manager device adapters, when installed, indicate to the Micro-Manager application that hardware triggering is available. In principle, Micro-Manager's programming interface allows any property of any device to support triggering, as well as Z stages and XY-stages (although most XY stages will be too slow for this approach). Currently, the device adapters that have been written to support triggering are:
  
* [[AgilentLaserCombiner]] -- Laser channel switching
* [[Arduino]] -- TTL and DAC output states
* [[ASIStage]] -- ASI piezo focus drives
* [[ESIOImagingControllers]] -- AOTFs (laser combiners) and piezo focus drives
* [[MCL NanoDrive]] -- MadCityLabs piezo XY and Z drives
* [[Marzhauser|Märzhäuser]] (TANGO controller) -- Märzhäuser piezo focus drive
* [[National_Instruments]] -- TTL output states (1.4.22 and later)
 
Each of these devices has a property through which the user can turn triggering (aka "sequencing") on and off. Please read the individual device pages for specific instructions on how to use properties to activate hardware synchronization. (Examples showing device adapter developers how to write code for sequenceable devices can be found in the [https://valelab.ucsf.edu/trac/micromanager/browser/DeviceAdapters/Arduino Arduino device adapter source code].)
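
As a minimal illustration (a sketch, not taken from any particular device page), the following Beanshell snippet for the Script Panel shows how one might query sequenceability and switch it on. The device label "Arduino-Switch" and the property names "Sequence" and "State" are examples only; substitute the labels and properties from your own hardware configuration.

<pre>
// Beanshell sketch for the Script Panel. Device and property names are
// placeholders -- adjust them to match your hardware configuration.

String zStage = mmc.getFocusDevice();
print("Z drive '" + zStage + "' sequenceable: " + mmc.isStageSequenceable(zStage));

// Many adapters expose an on/off property (often named something like "Sequence")
// that enables hardware triggering:
// mmc.setProperty("Arduino-Switch", "Sequence", "On");

// Individual properties report whether they can be sequenced:
print("State sequenceable: " + mmc.isPropertySequenceable("Arduino-Switch", "State"));
</pre>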
 
When executing a multi-dimensional acquisition, Micro-Manager's acquisition engine opportunistically and transparently uses hardware triggering whenever possible. For instance, if the user requests a Z stack and Micro-Manager detects that a triggerable piezo-driven Z drive is connected, the application will automatically upload a sequence of the desired Z positions to the piezo Z drive controller and then order the camera to take a sequence of ''n'' images, where ''n'' is the number of slices in the stack. The triggered device will then receive TTL signals from the camera and automatically move the Z drive through the requested stack, so that the camera precisely images each Z slice in turn. Micro-Manager will then attach the correct "slice index" tag in the metadata for each image and display the images coming from the camera appropriately as a Z stack. If Time Points have also been selected, then Micro-Manager will run a triggered Z stack at every time point.
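
To make the mechanism concrete, here is a hedged Beanshell sketch of roughly what happens under the hood, written against the generic MMCore sequence API. The slice positions and exposure are made up for illustration, the device labels come from your configuration, and during a normal multi-dimensional acquisition the acquisition engine performs these steps for you.

<pre>
// Beanshell sketch (Script Panel): a hardware-triggered Z stack via the MMCore
// sequence API. Values are illustrative; the acquisition engine normally does this.
import mmcorej.DoubleVector;

String camera = mmc.getCameraDevice();
String zStage = mmc.getFocusDevice();

if (mmc.isStageSequenceable(zStage)) {
   // Upload the slice positions (in microns) to the stage controller.
   DoubleVector slices = new DoubleVector();
   for (int i = 0; i < 5; i++)
      slices.add(10.0 + 2.0 * i);          // 5 slices, 2 um apart
   mmc.loadStageSequence(zStage, slices);
   mmc.startStageSequence(zStage);         // stage now steps on each camera trigger

   // One camera burst of 5 frames; the camera's trigger-out line advances the
   // stage between exposures with no software in the loop.
   mmc.setExposure(50);
   mmc.startSequenceAcquisition(5, 0, true);
   while (mmc.isSequenceRunning(camera) || mmc.getRemainingImageCount() > 0) {
      if (mmc.getRemainingImageCount() > 0) {
         img = mmc.popNextImage();          // raw pixels; slice index = arrival order
      } else {
         mmc.sleep(5);
      }
   }
   mmc.stopStageSequence(zStage);
}
</pre>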
 
Likewise, if the user requests a multi-dimensional acquisition with a sequence of channels that require only hardware-triggerable properties (for instance, the selection of a laser line through an AOTF controller), then Micro-Manager will load the sequence of AOTF states to the controller and rely on the hardware triggering to switch channels. If multiple time points with zero interval between frames have been requested, then channel triggering will cycle for as long as necessary to produce a multi-channel movie.
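
The channel case looks very similar at the scripting level. In the hedged sketch below, the device label "SomeAOTF" and property "LaserLine" are hypothetical stand-ins for whatever device and property select the illumination channel on your system; as above, a normal multi-dimensional acquisition handles this automatically.

<pre>
// Beanshell sketch (Script Panel): channel switching via a sequenceable property.
// "SomeAOTF" and "LaserLine" are hypothetical names -- use your own device labels.
import mmcorej.StrVector;

String dev = "SomeAOTF";
String prop = "LaserLine";

if (mmc.isPropertySequenceable(dev, prop)) {
   StrVector states = new StrVector();
   states.add("488");                      // channel 1 state
   states.add("561");                      // channel 2 state
   mmc.loadPropertySequence(dev, prop, states);
   mmc.startPropertySequence(dev, prop);   // advance on each camera trigger pulse

   // ...run a camera burst exactly as in the Z-stack sketch above; the controller
   // cycles through the uploaded states for as long as trigger pulses arrive...

   mmc.stopPropertySequence(dev, prop);
}
</pre>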
 
Z-position and channel triggering can even be combined, allowing multichannel, volumetric image series to be acquired very quickly and precisely. For example, below is a movie of a dividing ''Drosophila'' S2 cell, expressing GFP-histone 2B and Cherry-tubulin, acquired by the Micro-Manager team as a demonstration of Micro-Manager's hardware triggering capabilities in the [https://micro-manager.org/wiki/Micro-Manager_at_the_ASCB_Meeting_2011 Micro-Manager booth in the Exhibition Hall of the 2011 ASCB Meeting]. A microcontroller kindly loaned by [http://www.esimaging.co.uk/products ESImaging] was used to trigger both channels and Z positions; two channels were acquired at each of 5 slices (with 2 micron intervals) in the Z stack. The camera exposure time was set to 50 ms; thus the total time to acquire a single time point (10 images, or 2 channels x 5 slices) was 500 ms. To produce a YouTube movie, a maximum intensity projection in each channel was carried out in post-processing:
 
{{#ev:youtubehd|O8zdsyNoOg8|500|center}}
 
=== Tutorials ===
* [https://github.com/vanNimwegenLab/MiM_NikonTi/blob/master/Docs/NikonTi_hardware_triggering.md Step-by-step guide to controlling multiple light sources and a piezo Z stage for sequenceable MDA] by Guillaume Witz & Thomas Julou.
  
 
{{Documentation_Sidebar}}
 