The NRAMM meeting last year was recorded and the presentations were made public. It is cryoEM-focused, but the last two talks, by Steve Ludtke and Ed Eng, discuss how to set up a computational infrastructure to support labs with growing data needs, which is very relevant to volume imaging people.
As many universities are now buying cryoTEMs and then need to build up the computational infrastructure for them too, why not tap into that for volume imaging?
It's worth checking out these two presentations at the end of the program:
That is really interesting – I remember talking a couple of years ago to a guy from a large company doing cryoEM and we both had a laugh about the strategies that were needed.
I’ll enjoy taking a look.
That is a major issue with large volume datasets – you often need access to 10-20 TB of data in a project, and finding it and keeping it is not trivial.
Light sheet microscopy, computed tomography, and next-gen sequencing techniques are also generating huge (>20 GB to multiple TB) datasets. If you have colleagues working with these data types, it would be worth collaborating on data storage solutions, IMO.