Will Photonic Computing Save the Datacenter?
Earlier this week I was reading an article in MIT's Technology Review describing the first complete silicon photonics link developed by chipmaker Intel's photonics lab. This development will lead to some tectonic changes in enterprise computing.
Instead of using copper or other electrical conductors to carry signals to the processor, Intel developed technology that encodes and decodes optical signals natively on the processor, eliminating the performance hit of converting optical signals into electrical impulses, or vice versa.
The immediate benefit is a serious performance boost. According to the article, the four-laser system the Intel lab staff demonstrated can carry data at a rate of 50 gigabits per second, compared with the 10 gigabits per second achievable over copper wiring. The performance also scales with the number of lasers of varying wavelengths used to transmit data; the article's author reports that such a system could eventually handle up to 1,000 gigabits per second.
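The scaling works because each laser wavelength carries an independent data channel, so aggregate bandwidth is simply the per-channel rate multiplied by the channel count. A minimal sketch, using the article's figures (four lasers, 50 gigabits per second total) and deriving the per-channel rate, which the article does not quote:

```python
# Back-of-envelope model of wavelength-division multiplexing (WDM) scaling.
# Total figures (4 lasers, 50 Gbit/s) come from the article; the per-channel
# rate of 12.5 Gbit/s is derived, not quoted.

def aggregate_gbps(num_lasers: int, per_channel_gbps: float) -> float:
    """Total link bandwidth when each laser carries its own wavelength."""
    return num_lasers * per_channel_gbps

per_channel = 50 / 4  # 12.5 Gbit/s per laser in the demonstrated system

print(aggregate_gbps(4, per_channel))   # the demonstrated 50 Gbit/s link
print(aggregate_gbps(80, per_channel))  # 1,000 Gbit/s, the reported ceiling
```

On these assumptions, reaching the reported 1,000 gigabits per second would take roughly 80 wavelengths at the demo's per-channel rate; faster modulation per channel would lower that count.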
The second benefit, which I think is a little more exciting, is that the computer is no longer tethered to the physics of copper wiring. The system memory could be housed separately from the processor, maybe a foot away, according to Intel officials. Doing so would not only reduce datacenter cooling costs but also likely be the last step in completely deconstructing the beige-box server we know and love.
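The physics here is favorable: signal flight time over an extra foot of optical link is tiny. A rough sketch of the added latency, assuming a signal speed of about two-thirds the speed of light (typical for light in silica fiber); these numbers are my own illustration, not from the article:

```python
# Hedged back-of-envelope: extra round-trip latency if memory sits one
# foot (~0.3 m) from the processor over an optical link.

SIGNAL_SPEED_M_PER_S = 2e8  # ~2/3 of c in silica fiber (assumption)
DISTANCE_M = 0.3            # "maybe a foot away"

one_way_ns = DISTANCE_M / SIGNAL_SPEED_M_PER_S * 1e9
round_trip_ns = 2 * one_way_ns
print(round_trip_ns)  # ~3 ns of added flight time, round trip
```

Three nanoseconds is small next to typical DRAM access latencies of tens of nanoseconds, which is why disaggregating memory over optics is plausible where copper, with its distance-dependent signal degradation, is not.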
For the past several years, the industry has gone from individual rack-mountable servers to blade servers housed in a common chassis. Now with this ability to separate memory from the processor, the datacenter becomes the chassis. Instead of having individual servers, the industry will see processing fabrics interfacing with memory fabrics and connecting with storage fabrics.
Of course, this is quite a ways down the road, since the enabling technology is still in its nascent stages. However, it will be interesting to watch how fast the financial services industry will embrace the new technology once it is available. Considering the stark performance difference between the current and new technologies, I'm not sure if the typical adoption curve will apply.
Copyright Infopro Digital Limited. All rights reserved.