IT Science Case Study: High Speed Data Transfer in Extreme Environments

eWEEK IT SCIENCE: The challenge was transferring data at distances of 2km to 6km from the central data center under extremely harsh conditions (temperatures ranging from 8F to -100F, with cables buried under a meter of snow) at the Askaryan Radio Array at the South Pole. The scientists needed a new network that could provide high bandwidth and synchronize the sensors to within nanoseconds.


Here is the latest article in an eWEEK feature series called IT Science, in which we look at what actually happens at the intersection of new-gen IT and legacy systems.

Unless it’s brand new and right off the assembly line, the servers, storage and networking inside every IT system can be considered “legacy.” That’s because the iteration of both hardware and software products keeps speeding up. It’s not unusual for an app-maker, for example, to update an application, or patch it for security purposes, a few times a month or even a week; some apps are updated daily. Hardware moves a little more slowly, but manufacturing cycles are speeding up as well.

These articles describe new-gen industry solutions. The idea is to look at real-world examples of how new-gen IT products and services are making a difference in production each day. Most of them are success stories, but there will also be others about projects that blew up. We’ll have IT integrators, system consultants, analysts and other experts helping us with these as needed.

Today’s Topic: High Speed Data Transfer in Extreme Environments

Describe the problem being faced: The challenge was transferring data at distances of 2km to 6km from the central data center under extremely harsh conditions (temperatures ranging from 8F to -100F, with cables buried under a meter of snow) at the Askaryan Radio Array at the South Pole. The scientists needed a new network that could provide high bandwidth and synchronize the sensors to within nanoseconds.

Describe the strategy that went into finding the solution: Fiber optics are the only feasible medium that provides the necessary data rates over long distances without regenerating the signal multiple times. Low-temperature performance was the primary criterion in selecting a fiber-optic jumper: traditional jumpers simply cannot handle temperatures below -40F, and at that temperature the sheathing was breaking off the prior jumpers. Durability and flexibility at extreme temperatures were the reasons the IceCube Neutrino Observatory at the University of Wisconsin-Madison selected Clearfield Ruggedized jumpers.

List the key components in the solution:

  • Analog antennas (both vertically and horizontally polarized) receive the radio impulses generated as neutrinos pass through the polar ice and pass the signal to the trigger processor.
  • The trigger processor accepts analog signals from the multiple antenna arrays and digitizes them; the digitized signal is then transported over the fiber-optic link to the data acquisition center.
  • The data acquisition center (DAQ) accepts signals from multiple antenna clusters and stores the data for analysis.
  • To prototype next-generation communication and timing networks, IceCube uses Clearfield fiber-optic jumpers both indoors and outdoors, and deploys outdoor fiber cables in optical junction and electronics boxes buried under shallow snow.
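The signal chain above (antennas feed a trigger processor, which digitizes the analog impulses for transport to the DAQ) can be sketched in a few lines of code. This is purely an illustrative model, not IceCube's actual software: the class names, the 12-bit resolution and the sample voltages are our assumptions.

```python
from dataclasses import dataclass, field

@dataclass
class TriggerProcessor:
    """Digitizes analog antenna impulses before fiber transport (illustrative)."""
    resolution_bits: int = 12  # assumed resolution, not from the article

    def digitize(self, voltage: float, full_scale: float = 1.0) -> int:
        # Map an analog voltage in [-full_scale, +full_scale] to an integer code.
        levels = 2 ** self.resolution_bits
        clipped = max(-full_scale, min(full_scale, voltage))
        return round((clipped + full_scale) / (2 * full_scale) * (levels - 1))

@dataclass
class DataAcquisitionCenter:
    """Collects digitized samples from multiple antenna clusters (illustrative)."""
    stored: list = field(default_factory=list)

    def record(self, cluster_id: str, sample: int) -> None:
        self.stored.append((cluster_id, sample))

# Hypothetical readings (volts) from vertically/horizontally polarized antennas.
trigger = TriggerProcessor()
daq = DataAcquisitionCenter()
for cluster, reading in [("VPol-1", 0.25), ("HPol-1", -0.6)]:
    daq.record(cluster, trigger.digitize(reading))
```

The point of the structure is the one the article describes: only digitized samples cross the long fiber link, so the analog front end can sit kilometers from the DAQ.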

Describe how the deployment went, perhaps how long it took, and if it came off as planned: Because fiber optics cover the full distance in a single span, no signal-regeneration sites had to be constructed and no additional power is required to support them, which helped speed the deployment. The observatory’s near real-time alert system is triggered when a very high-energy neutrino collides with an atomic nucleus in the Antarctic ice in or near the IceCube detector; it then broadcasts the coordinates of the neutrino alert to telescopes worldwide for follow-up observations.

Describe the result, new efficiencies gained, and what was learned from the project: Ruggedized fiber-optic cables rapidly deliver large amounts of data. Each array currently has a 1Gb/s data pipe available to it, and those links are not yet used to full capacity, which leaves room for future expansion. The IceCube project generates about 700 terabytes of data every year as it works to map the universe one neutrino at a time. It shares data in near real time with the IceCube Collaboration, which comprises more than 300 scientists at 49 institutions around the world.
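A quick back-of-envelope check makes the headroom concrete. The figures (1Gb/s per array, roughly 700 TB per year project-wide) come from the article; the arithmetic below is ours and assumes the link could in principle run flat-out year-round.

```python
# How much could a single 1 Gb/s link carry in a year, and how does
# that compare with IceCube's ~700 TB/year of generated data?
SECONDS_PER_YEAR = 365 * 24 * 3600           # 31,536,000 s

link_rate_bps = 1e9                          # 1 Gb/s per array (from the article)
yearly_capacity_tb = link_rate_bps * SECONDS_PER_YEAR / 8 / 1e12  # bits -> TB

data_generated_tb = 700                      # project-wide, from the article
utilization = data_generated_tb / yearly_capacity_tb

print(f"One link could carry ~{yearly_capacity_tb:,.0f} TB/year")
print(f"The project's entire annual output is ~{utilization:.0%} of one link")
```

A single link works out to roughly 3,900 TB per year of raw capacity, so even the whole project's annual output would fill less than a fifth of one array's pipe, consistent with the article's point that the links are nowhere near capacity.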

Describe ROI, carbon footprint savings, and staff time savings, if any: Everything at the South Pole runs on jet fuel, and not having to regenerate the data signal reduces power requirements, which in turn saves fuel and carbon emissions. Staff time savings follow as well: covering the distance in one span is a big time saver.

So far, scientists have used results from IceCube to pinpoint the location of a blazar, a galaxy powered by a supermassive black hole, just off the left shoulder of the constellation Orion about 4 billion light-years away. IceCube’s data has also allowed measurement of the Earth’s mass and could eventually provide new insights about our own planet, dark matter and the universe.

Other References:

 Using neutrinos detected by IceCube to measure mass of the Earth

 IceCube neutrinos point to long-sought cosmic ray accelerator

 https://icecube.wisc.edu/

 

If you have a suggestion for an eWEEK IT Science article, email cpreimesberger@eweek.com.

Chris J. Preimesberger

Chris J. Preimesberger is Editor-in-Chief of eWEEK and responsible for all the publication's coverage. In his 15 years and more than 4,000 articles at eWEEK, he has distinguished himself in reporting...