Last year's NRAMM meeting recorded its presentations and made them public. This is cryoEM, but the last two talks, from Steve Ludtke and Ed Eng, discuss how to set up a computational infrastructure to support these labs' growing data needs, which is very relevant to volume imaging people.
As many universities are now buying cryoTEMs and need to build up the computational infrastructure to match, why not tap into that for volume imaging?
It's worth checking out these two presentations at the end of the program:
That is really interesting – I remember talking a couple of years ago to a guy from a large company doing cryoEM and we both had a laugh about the strategies that were needed.
I’ll enjoy taking a look.
That is a major issue with large volume datasets – you often need access to 10–20 TB of data in a single project, and finding it and keeping it is not trivial.
Light sheet microscopy, computed tomography, and next-gen sequencing techniques are also generating huge datasets (tens of gigabytes to multiple terabytes). If you have colleagues working with these data types, it would be worth collaborating on data storage solutions, IMO.