Introduction

The Haar Wavelet Kernel analysis (HAWK) plugin is a tool for pre-processing localisation microscopy data. It takes a localisation microscopy dataset as input and produces a processed image stack that is larger than the original dataset. This processing separates multiple overlapping fluorophores into individual frames, allowing standard localisation microscopy algorithms to process the data without producing artefacts. To make effective use of HAWK, you will need localisation microscopy analysis software installed, such as ThunderSTORM [1]. The rest of this guide assumes that you are familiar with such a system.
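The central idea is a Haar wavelet decomposition applied along the time axis of each pixel, which spreads temporally overlapping blinking events across separate output frames. The snippet below is only a minimal sketch of a single-level (unnormalised) Haar transform of one pixel's intensity trace, intended to illustrate the idea; the function name and trace values are made up for the example and this is not the plugin's actual implementation.

```python
import numpy as np

def haar_level(trace):
    """Single-level (unnormalised) Haar transform of a 1-D intensity trace.

    Returns the approximation (pairwise averages) and detail
    (pairwise half-differences) coefficients.
    """
    trace = np.asarray(trace, dtype=float)
    if len(trace) % 2:                        # pad to an even length
        trace = np.append(trace, trace[-1])
    pairs = trace.reshape(-1, 2)
    approx = pairs.mean(axis=1)               # slow (low-frequency) component
    detail = (pairs[:, 0] - pairs[:, 1]) / 2  # fast changes from blinking
    return approx, detail

# Hypothetical pixel time trace: two fluorophores whose on-times overlap.
trace = [10, 10, 60, 110, 110, 60, 10, 10]
approx, detail = haar_level(trace)
print("approximation:", approx)   # background plus slowly varying signal
print("detail:", detail)          # rapid intensity changes
```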

Analysing some new data

First load some data that you want to analyse. If you don't have any to hand, we have supplied some.
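If you would like to sanity-check a dataset outside ImageJ first, one quick way is to load the stack in Python. This is only a convenience sketch, assuming the data is a multi-frame TIFF; the file names are placeholders.

```python
import numpy as np
import tifffile  # pip install tifffile

# Placeholder path: point this at your own localisation microscopy stack.
stack = tifffile.imread("localisation_data.tif")  # shape: (frames, height, width)
print(stack.shape, stack.dtype)

# A pseudo-widefield image (the average over all frames) is a useful quick check.
widefield = stack.mean(axis=0)
tifffile.imwrite("widefield_average.tif", widefield.astype(np.float32))
```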

If you use the real data, it should look something like this (after zooming in):

Screengrab of some loaded data

To run HAWK, go to Plugins > Cox Group > HAWK > HAWK. This will bring up the following dialog:

HAWK dialog

The default parameters are fine, so just press OK. (If you want to perform drift correction later, change "Output order" to "Group temporally" first.) Pressing OK runs HAWK and brings up the processed data, which (after zooming in) looks like this:

HAWK processed data

And that's it for HAWK!

You now need to analyse the HAWK processed data with a localisation algorithm. If you don't have one installed, you can look at our analysis of the data performed with ThunderSTORM; a ThunderSTORM analysis of both the raw and the HAWK processed data can be loaded from the same menu. If you have ThunderSTORM installed, go to Plugins > ThunderSTORM > Run analysis, which will bring up the following dialog:

Thunderstorm dialog

For best results, check the "Multi-emitter fitting analysis" box, then press "OK". (Here there is a high degree of fluorophore overlap, which is why multi-emitter fitting is necessary; examples of HAWK combined with single-emitter fitting are given in the paper.) The analysis will take a few minutes to run. When it is done, you should see the following results:

Thunderstorm results Thunderstorm reconstruction

We recommend setting a filter to remove localisations with unphysically small widths (shown). Here are the results side by side. Left: multi-emitter ThunderSTORM. Middle: HAWK with multi-emitter ThunderSTORM. Right: widefield data (the average of the stack). In the podosome circled in the centre, the HAWK analysed data shows a circular structure that is not visible in the widefield image or in the ThunderSTORM-only analysis.

Results

Note that the results are shown using our reconstruction software. If you perform ThunderSTORM or QuickPALM analysis on your own data, you can load it into our reconstruction using Plugins > Cox Group > Localisation Reconstruction > Load ThunderSTORM or Plugins > Cox Group > Localisation Reconstruction > Load QuickPALM. Note that ThunderSTORM will often fit single noise pixels as fluorophores. This is more likely to happen with HAWK processed data than with standard data, so fitting a HAWK dataset with ThunderSTORM may lead to a large increase in the number of detected fluorophores. To remove these spurious identifications, filter the fitted fluorophore positions so that only those with physically realistic widths are accepted.
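For example, if you export the ThunderSTORM results table to CSV, a width filter of this kind can be applied outside ImageJ. The sketch below assumes the export contains a "sigma [nm]" column (ThunderSTORM's PSF-width column); the file names and cut-off values are placeholders that you should adapt to your pixel size and expected PSF width.

```python
import pandas as pd

# Placeholder file name: export the results table from ThunderSTORM as CSV first.
locs = pd.read_csv("thunderstorm_results.csv")

# Keep only localisations with a physically realistic fitted PSF width.
# The column name and cut-offs are assumptions; adjust them to match your
# export and your microscope's expected PSF size.
sigma_min, sigma_max = 80.0, 250.0  # nm
mask = locs["sigma [nm]"].between(sigma_min, sigma_max)
filtered = locs[mask]

print(f"kept {len(filtered)} of {len(locs)} localisations")
filtered.to_csv("thunderstorm_results_filtered.csv", index=False)
```

A filter with the same effect can of course also be set directly in ThunderSTORM's own post-processing panel, as recommended above.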

References

  1. ThunderSTORM: a comprehensive ImageJ plugin for PALM and STORM data analysis and super-resolution imaging. M. Ovesný, P. Křížek, J. Borkovec, Z. Švindrych & G. M. Hagen. Bioinformatics 2014 vol. 30(16):2389–2390.
  2. ImageJ plug-in for Bayesian analysis of blinking and bleaching. Rosten, Jones & Cox. Nat. Meth. 2013 vol. 10:97–98.