What is 100ZR and Why Does it Matter?
In June 2022, transceiver developer II‐VI Incorporated (now Coherent Corp.) and optical networking solutions provider ADVA announced the launch of the industry’s first 100ZR pluggable coherent transceiver. Discussions in the telecom sector about a future beyond 400G coherent technology have usually focused on 800G products, but there is increasing excitement about “downscaling” to 100G coherent products for certain applications in the network edge and business services. This article will discuss the market and technology forces that drive this change in discourse.
The Need for 100G Transmission in Telecom Deployments
The 400ZR pluggables that have become mainstream in datacom applications are too expensive and power-hungry for the optical network edge. Therefore, operators are strongly interested in 100G pluggables that can house coherent optics in compact form factors, just like 400ZR pluggables do. The industry is labeling these pluggables as 100ZR.
A recently released Heavy Reading survey revealed that over 75% of operators surveyed believe that 100G coherent pluggable optics will be used extensively in their edge and access evolution strategy. However, this interest had not really materialized into a 100ZR market because no affordable or power-efficient products were available. The most the industry could offer was 400ZR pluggables that were “powered-down” for 100G capacity.
100ZR and its Enabling Technologies
With the recent II-VI Incorporated and ADVA announcement, the industry is showing its first attempts at a native 100ZR solution that can provide a true alternative to the powered-down 400ZR products. Some of the key specifications of this novel 100ZR solution include:
- A QSFP28 form factor, similar to but slightly smaller than a QSFP-DD
- 5 Watt power consumption
- C-temp and I-temp certifications to handle harsh environments
The 5-Watt power requirement is a major reduction compared to the 15-Watt specification of 400ZR transceivers in the QSFP-DD form factor. Achieving this spec requires a digital signal processor (DSP) specifically optimized for the 100G transceiver.
Transceiver developers often source their DSP, laser, and optical engine from different suppliers, so all these chips are designed separately from each other. This setup reduces the time to market, simplifies the research and design processes, but comes with performance and power consumption trade-offs.
In such cases, the DSP is like a Swiss army knife: a jack of all trades designed for different kinds of optical engines but a master of none. DSPs co-designed and optimized for their specific optical engine and laser can significantly improve power efficiency. You can read more about co-design approaches in one of our previous articles.
Achieving 100ZR Cost-Efficiency through Scale
Making 100ZR coherent optical transceivers more affordable is also a matter of volume production. As discussed in a previous article, if PIC production volumes can increase from a few thousand chips per year to a few million, the price per optical chip can decrease from thousands of Euros to mere tens of Euros. Such manufacturing scale demands a higher upfront investment, but the result is a more accessible product that more customers can purchase.
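The volume-to-price relationship described above can be sketched with a simple amortization model (a hedged back-of-envelope illustration; the fixed-cost and marginal-cost figures are our own hypothetical values, not actual fab economics):

```python
# Illustrative sketch (all numbers hypothetical): the per-chip price is
# dominated by fixed costs (masks, process development, equipment)
# amortized over production volume, plus a marginal cost per chip.
def unit_cost(volume, fixed_costs=5_000_000, marginal=15.0):
    """Euros per chip at a given annual production volume."""
    return fixed_costs / volume + marginal

for volume in (5_000, 50_000, 5_000_000):
    print(f"{volume:>9,} chips/year -> {unit_cost(volume):,.2f} EUR/chip")
```

With these assumed numbers, a few thousand chips per year land near a thousand Euros per chip, while millions per year approach the marginal cost of a few tens of Euros, mirroring the trend described above.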
Achieving this production goal requires photonics manufacturing chains to learn from electronics and leverage existing electronics manufacturing processes and ecosystems. Furthermore, transceiver developers must look for trusted large-scale manufacturing partners to guarantee a secure and high-volume supply of chips and packages.
If you want to know more about how photonics developers can leverage electronic ecosystems and methods, we recommend you read our in-depth piece on the subject.
Takeaways
As the Heavy Reading survey showed, the interest in 100G coherent pluggable optics for edge/access applications is strong, and operators have identified key use cases within their networks. In the past, there were no true 100ZR solutions that could address this interest, but the use of optimized DSPs and light sources, as well as high-volume manufacturing capabilities, can finally deliver a viable and affordable 100ZR product.
Tags: 100G coherent, 100ZR, DSP, DSPs, edge and access applications, EFFECT Photonics, Photonics
Fit for Platform DSPs
Over the last two decades, power ratings for pluggable modules have increased as we moved from direct detection to more power-hungry coherent transmission: from 2 W for SFP modules to 3.5 W for QSFP modules, and now to 14 W for QSFP-DD and 21.1 W for OSFP form factors. Rockley Photonics researchers estimate that a future electronic switch filled with 800G modules would draw around 1 kW of power just for the optical modules.
Around 50% of a coherent transceiver’s power consumption goes into the digital signal processing (DSP) chip that also performs the functions of clock data recovery (CDR), optical-electrical gear-boxing, and lane switching. Scaling to higher bandwidths leads to even more losses and energy consumption from the DSP chip and its radiofrequency (RF) interconnects with the optical engine.
One way to reduce transceiver power consumption requires designing DSPs that take advantage of the material platform of their optical engine. In this article, we will elaborate on what that means for the Indium Phosphide platform.
A Jack of All Trades but a Master of None
Transceiver developers often source their DSP, laser, and optical engine from different suppliers, so all these chips are designed separately from each other. This setup reduces the time to market and simplifies the research and design processes but comes with trade-offs in performance and power consumption.
In such cases, the DSP is like a Swiss army knife: a jack of all trades designed for different kinds of optical engines but a master of none. For example, current DSPs are designed to be agnostic to the material platform of the photonic integrated circuit (PIC) they are connected to, which can be Indium Phosphide (InP) or Silicon. Thus, they do not exploit the intrinsic advantages of these material platforms. Co-designing the DSP chip alongside the PIC can lead to a much better fit between these components.
Co-Designing with Indium Phosphide PICs for Power Efficiency
To illustrate the impact of co-designing PIC and DSP, let’s look at an example. A PIC and a standard platform-agnostic DSP typically operate with signals of differing intensities, so they need some RF analog electronic components to “talk” to each other. This signal power conversion overhead constitutes roughly 2-3 Watts or about 10-15% of transceiver power consumption.
However, the modulator of an InP PIC can run at a lower voltage than a silicon modulator. If this InP PIC and the DSP are designed and optimized together instead of using a standard DSP, the PIC could be designed to run at a voltage compatible with the DSP’s signal output. This way, the optimized DSP could drive the PIC directly without needing the RF analog driver, doing away with most of the power conversion overhead we discussed previously.
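As a rough sanity check on the figures above, the savings can be put into a tiny power-budget sketch (a hedged illustration; the 18 W base power and 2.5 W overhead are values we chose within the 2-3 W range quoted, not measurements of any specific module):

```python
# Rough power-budget sketch: a platform-agnostic DSP needs RF driver
# and conversion electronics (~2-3 W per the figures above), which a
# co-designed InP PIC + DSP could largely eliminate.
def transceiver_power(base_watts, rf_overhead_watts, codesigned=False):
    """Total module power; co-design removes the RF conversion overhead."""
    return base_watts if codesigned else base_watts + rf_overhead_watts

standard = transceiver_power(18.0, 2.5)                    # hypothetical module
codesigned = transceiver_power(18.0, 2.5, codesigned=True)  # same module, no RF driver
savings = (standard - codesigned) / standard
print(f"Saved {standard - codesigned:.1f} W ({savings:.0%})")
```

With these assumed numbers the overhead works out to roughly 12% of module power, consistent with the 10-15% range quoted above.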
Additionally, the optimized DSP could also be programmed to do some additional signal conditioning that minimizes the nonlinear optical effects of the InP material, which can reduce noise and improve performance.
Taking Advantage of Active Components in the InP Platform
Russell Fuerst, EFFECT Photonics’ Vice-President of Digital Signal Processing, gave us an interesting insight about designing for the InP platform in a previous interview:
When we started doing coherent DSP designs for optical communication over a decade ago, we pulled many solutions from the RF wireless and satellite communications space into our initial designs. Still, we couldn’t bring all those solutions to the optical markets.
However, when you get more of the InP active components involved, some of those solutions can finally be brought over and utilized. They were not used before in our designs for silicon photonics because silicon is not an active medium and lacked the performance to exploit these advanced techniques.
For example, the fact that the DSP could control laser and modulator components on the InP can lead to some interesting manipulations of light signals. A DSP that can control these components directly could generate proprietary waveforms or use non-standard constellation and modulation schemes that can boost the performance of a coherent transceiver and increase the capacity of the link.
Takeaways
The biggest problem for DSP designers is still improving performance while reducing power use. This problem can be solved by finding ways to integrate the DSP more deeply with the InP platform, such as letting the DSP control the laser and modulator directly to develop new waveform shaping and modulation schemes. Because the InP platforms have active components, DSP designers can also import more solutions from the RF wireless space.
Tags: analog electronics, building blocks, coherent, dispersion compensation, DSP, energy efficiency, Intra DCI, Photonics, PON, power consumption, reach, simplified
Leveraging Electronic Ecosystems in Photonics
Thanks to wafer-scale technology, electronics have driven down the cost per transistor for many decades. This allowed the world to enjoy chips that became smaller with every generation yet provided exponentially more computing power for the same amount of money. This scale-up process is how everyone now has a computer processor in their pocket that is millions of times more powerful than the most advanced computers of the 1960s that landed men on the moon.
This progress in electronics integration is a key factor that brought down the size and cost of coherent transceivers, packing more bits than ever into smaller areas. However, photonics has struggled to keep up with electronics, with the photonic components dominating the cost of transceivers. If the transceiver cost curve does not continue to decrease, it will be challenging to achieve the goal of making them more accessible across the entire optical network.
To trigger a revolution in the use of photonics worldwide, photonics needs to be as easy to use as electronics. In the words of our Chief Technology Officer:
We need to buy photonics from a catalog as we do with electronics, have datasheets that work consistently, be able to solder it to a board and integrate it easily with the rest of the product design flow.
Tim Koene – Chief Technology Officer
This goal requires photonics manufacturing to leverage existing electronics manufacturing processes and ecosystems. Photonics must embrace fabless models, chips that can survive soldering steps, and electronic packaging and assembly methods.
The Advantages of Moving to a Fabless Model
Increasing the volume of photonics manufacturing is a big challenge. Some photonic chip developers manufacture their chips in-house within their fabrication facilities. This approach has some substantial advantages, giving component manufacturers complete control over their production process.
However, this approach has its trade-offs when scaling up. If a vertically-integrated chip developer wants to scale up in volume, they must make a hefty capital expenditure (CAPEX) in more equipment and personnel, develop new fabrication processes, and train staff. Fabs are expensive not only to build but also to operate: unless they can be kept at nearly full utilization, operating expenses (OPEX) drain the facility owner's finances.
Especially in the case of an optical transceiver market that is not as big as that of consumer electronics, it’s hard not to wonder whether that initial investment is cost-effective. For example, LightCounting estimates that 55 million optical transceivers were sold in 2021, while the International Data Corporation estimates that 1.4 billion smartphones were sold in 2021. The latter figure is 25 times larger than that of the transceiver market.
Electronics manufacturing experienced a similar problem during its boom in the 1970s and 80s, with smaller chip start-ups facing almost insurmountable barriers to market entry because of the massive CAPEX required. Meanwhile, the large-scale electronics manufacturing foundries had excess production capacity that drained their OPEX. The foundries ended up selling that excess capacity to the smaller chip developers, who became fabless. In this scenario, everyone won: the foundries serviced multiple companies and could run their facilities at full capacity, while the fabless companies outsourced manufacturing and reduced their expenditures.
This fabless model, with companies designing and selling the chips but outsourcing the manufacturing, should also be the way to go for photonics. Instead of going through a more costly, time-consuming process, the troubles of scaling up for photonics developers are outsourced and (from the perspective of the fabless company) become as simple as putting a purchase order in place. Furthermore, the fabless model allows photonics developers to concentrate their R&D resources on the end market. This is the simplest way forward if photonics moves into million-scale volumes.
Adopting Electronics-Style Packaging
While packaging, assembly, and testing are only a small part of the cost of electronic systems, the reverse happens with photonic integrated circuits (PICs). Researchers at the Technical University of Eindhoven (TU/e) estimate that for most Indium Phosphide (InP) photonics devices, the cost of packaging, assembly, and testing can reach around 80% of the total module cost.
To become more accessible and affordable, the photonics manufacturing chain must become more automated and standardized. The lack of automation makes manufacturing slower and prevents data collection that can be used for process control, optimization, and standardization.
One of the best ways to reach these automation and standardization goals is to learn from electronics packaging, assembly, and testing methods that are already well-known and standardized. After all, building a special production line is much more expensive than modifying an existing production flow.
There are several ways in which photonics packaging, assembly, and testing can be made more affordable and accessible. Below are a few examples:
- Passive alignments: Connecting optical fiber to PICs is one of the most complicated packaging and assembly problems for optical devices. The best alignments are usually achieved via active alignment processes, in which feedback from the PIC is used to align the fiber better. Passive alignment processes do not use such feedback; they cannot achieve the best possible alignment but are much more affordable.
- BGA-style packaging: Ball-grid array packaging has grown popular among electronics manufacturers. It places the chip connections under the chip package, allowing more efficient use of space in circuit boards, a smaller package size, and better soldering.
- Flip-chip bonding: A process where solder bumps are deposited on the chip in the final fabrication step. The chip is flipped over and aligned with a circuit board for easier soldering.
These might be novel technologies for photonics developers who have started implementing them in the last five or ten years. However, the electronics industry embraced these technologies 20 or 30 years ago. Making these techniques more widespread will make a massive difference in photonics’ ability to scale up and become as available as electronics.
Making Photonics Chips That Can Survive Soldering
Soldering remains another tricky step for photonics assembly and packaging. Photonics device developers usually custom order a PIC, then wire and die bond to the electronics. However, some elements in the PIC cannot handle soldering temperatures, making it difficult to solder into an electronics board. Developers often must glue the chip onto the board with a non-standard process that needs additional verification for reliability.
This goes back to the issue of process standardization. Current PICs often use different materials and processes from electronics, such as optical fiber connections and metals for chip interconnects, that cannot survive a standard soldering process.
Adopting BGA-style packaging and flip-chip bonding techniques will make it easier for PICs to survive this soldering process. There is ongoing research and development worldwide, including at EFFECT Photonics, to make fiber coupling and other PIC aspects compatible with these electronic packaging methods.
PICs that can handle being soldered to circuit boards will allow the industry to build optical subassemblies that can be made more readily available in the open market and can go into trains, cars, or airplanes.
Takeaways
Photonics must leverage existing electronics ecosystems and processes to scale up and have a greater global impact. Tim Koene explains what this means:
Photonics technology needs to integrate more electronic functionalities into the same package. It needs to build photonic integration and packaging support that plays by the rules of existing electronic manufacturing ecosystems. It needs to be built on a semiconductor manufacturing process that can produce millions of chips in a month.
As soon as photonics can achieve these larger production volumes, it can reach price points and improvements in quality and yield closer to those of electronics. When we show the market that photonics can be as easy to use as electronics, that will trigger a revolution in its worldwide use.
Our Chief Technology Officer, Tim Koene
This vision is one of our guiding lights at EFFECT Photonics, where we aim to develop optical systems that can have an impact all over the world in many different applications.
Tags: automotive sector, BGA style packaging, compatible, computing power, cost per mm, efficient, electronic, electronic board, electronics, fabless, Photonics, risk, scale, soldering, transistor, wafer scale
Day of Photonics 2022, Photonics FAQ
On October 21st, 1983, the General Conference of Weights and Measures adopted the current value of the speed of light at 299,792.458 km/s. To commemorate this milestone, hundreds of optics and photonics companies, organizations, and institutes all over the world organize activities every year on this date to celebrate the Day of Photonics and how this technology is impacting our daily lives.
At EFFECT Photonics, we want to celebrate the Day of Photonics by answering some commonly asked questions about photonics and its impact on the world.
What is Photonics?
Photonics is the study and application of photon (light) generation, manipulation, and detection, often aiming to create, control, and sense light signals.
The term photonics emerged in the 1960s and 70s with the development of the first semiconductor lasers and optical fibers. Its goals, and even the name “photonics,” stem from the analogy with electronics: photonics aims to generate, control, and sense photons (the particles of light) in similar ways to how electronics does with electrons (the particles of electricity).
What is Photonics Used for?
Photonics can be applied in many ways. For the Day of Photonics, we will explore two categories:
Communications
Light is the fastest information carrier in the universe and can transmit this information while dissipating less heat and energy than electrical signals. Thus, photonics can dramatically increase the speed, reach, and flexibility of communication networks and cope with the ever-growing demand for more data. And it will do so at a lower energy cost, decreasing the Internet’s carbon footprint.
A classic example is optical fiber communications. The webpage you are reading was originally a stream of 0s and 1s that traveled through an optical fiber to reach you.
Outside of optical fibers, photonics can also deliver solutions beyond what traditional radio communications can offer. For example, optical transmission over the air could handle links between different sites of a mobile network, links between cars, or to a satellite out in space. At some point, we may even see the use of Li-Fi, a technology that replaces indoor Wi-Fi links with infrared light.
Sensing
There are multiple sensing application markets, but their core technology is the same. They need a small device that sends out a known pulse of light, accurately detects how the light comes back, and calculates the properties of the environment from that information. It’s a simple but quite powerful concept.
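The pulse-and-return idea above can be captured in a few lines (a minimal time-of-flight sketch; the function name and the 200 ns round-trip time are our own illustrative choices, not from any specific LIDAR product):

```python
# Minimal time-of-flight sketch: a sensor emits a light pulse and
# measures the round-trip time. The distance is c * t / 2 because the
# light travels to the reflecting object and back.
SPEED_OF_LIGHT_M_S = 299_792_458  # metres per second, in vacuum

def distance_from_round_trip(round_trip_seconds):
    """Distance to the reflecting object, in metres."""
    return SPEED_OF_LIGHT_M_S * round_trip_seconds / 2

# A reflection arriving 200 nanoseconds after the pulse left
# corresponds to an object roughly 30 metres away:
print(f"{distance_from_round_trip(200e-9):.2f} m")
```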
This concept is already being used to implement LIDAR systems that help self-driving cars determine the location and distance of people and objects. However, there is also potential to use this concept in medical and agri-food applications, such as looking for undesired growths in the human eye or knowing how ripe an apple is.
Will Photonics Replace Electronics?
No, each technology has its strengths.
When transmitting information from point A to B, photonics can do it faster and more efficiently than electronics. For example, optical fiber can transmit information at the speed of light and dissipate less heat than electric wires.
On the other hand, since electricity can be manipulated at the nanometer level more easily than light, electronics are usually better for building computers. There are some specific areas where photonic computers could outperform traditional electronic ones, especially given the rise of quantum computers that can be made with photonic components. However, most computer products will remain electronic for the foreseeable future.
Thus, photonics is not expected to replace electronics but to collaborate and integrate strongly with it. Most future applications will involve photonic systems transmitting or sensing information that is then processed by electronic computers.
Tags: DayofPhotonics2022, Integrated Photonics, Photonics
Coherent Free Space Optics for Ground and Space Applications
In a previous article, we described how free-space optics (FSO) could impact mobile fronthaul and enterprise links. FSO links can deliver a wireless access solution that can be deployed quickly, with more bandwidth capacity, stronger security features, and lower power consumption than traditional point-to-point microwave links.
However, there’s potential to do even more. There are network applications on the ground that require very high bandwidths in the range of 100 Gbps and space applications that need powerful transceivers to deliver messages across vast distances. Microwaves are struggling to deliver all the requirements for these use cases.
By merging the coherent technology of fiber-optic communications with FSO systems, these links can achieve greater reach and capacity than before, enabling new applications in space and terrestrial links.
Reaching 100G on the Ground for Access Networks
Thanks to advances in adaptive optics, fast steering mirrors, and digital signal processors, FSO links can now handle Gbps-capacity links over several kilometers. For example, a collaboration between Dutch FSO startup Aircision and research organization TNO demonstrated in 2021 that their FSO systems could reliably transmit 10 Gbps over 2.5 km.
However, new communication technologies emerge daily, and our digital society keeps evolving and demanding more data. This need for progress has motivated more research and development into increasing the capacity of FSO links to 100 Gbps, providing a new short-reach solution for access networks.
One such initiative came from the collaboration of Norwegian optical network solutions provider Smartoptics, Swedish research institute RISE Acreo, and Norwegian optical wireless link provider Polewall. In a trial set-up at Acreo’s research facilities, Smartoptics’ 100G transponder was used with CFP transceivers to create a 100 Gbps DWDM signal transmitted through the air using Polewall’s optical wireless technology. Their system is estimated to reach 250 meters in the worst possible weather conditions.
Fredrik Larsson, the Optical Transmission Specialist at Smartoptics, explains the importance of this trial:
“Smartoptics is generally recognized as offering a very flexible platform for optical networking, with applications for all types of scenarios. 100Gbps connectivity through the air has not been demonstrated before this trial, at least not with commercially available products. We are proud to be part of that milestone together with Acreo and Polewall.”
Meanwhile, Aircision aims to develop a 100 Gbps coherent FSO system capable of transmitting up to 10km. To achieve this, they have partnered up with EFFECT Photonics, who will take charge of developing coherent modules that can go into Aircision’s future 100G system.
In many ways, the basic technologies to build these coherent FSO systems have been available for some time. However, they included high-power 100G lasers and transceivers originally intended for premium long-reach applications. The high price, footprint, and power consumption of these devices prevented the development of more affordable and lighter FSO systems for the larger access network market.
However, the advances in integration and miniaturization of coherent technology have opened up new possibilities for FSO links. For example, 100ZR transceiver standards enable a new generation of low-cost, low-power coherent pluggables that can be easily integrated into FSO systems. Meanwhile, companies like Aircision are working hard to use technologies such as adaptive optics and fast-steering mirrors to extend the reach of these 100G FSO systems into the kilometer range.
Coherent Optical Technology in Space
Currently, most space missions use radio frequency communications to send data to and from spacecraft. While radio waves have a proven track record of success in space missions, generating and collecting more mission data requires enhanced communications capabilities.
Coherent optical communications can increase link capacities to spacecraft and satellites by 10 to 100 times that of radio frequency systems. Additionally, optical transceivers can lower the size, weight, and power (SWAP) specifications of satellite communication systems. Less weight and size means a less expensive launch or perhaps room for more scientific instruments. Less power consumption means less drain on the spacecraft’s energy sources.
For example, the Laser Communications Relay Demonstration (LCRD) from NASA, launched in December 2021, aims to showcase the unique capabilities of optical communications. Future missions in space will send data to the LCRD, which then relays the data down to ground stations on Earth. The LCRD will forward this data at rates of 1.2 Gigabits per second over optical links, allowing more high-resolution experiment data to be transmitted back to Earth. LCRD is a technology demonstration expected to pave the way for more widespread use of optical communications in space.
Making Coherent Technology Live in Space
Integrated photonics can boost space communications by lowering the payload. Still, it must overcome the obstacles of a harsh space environment, with radiation hardness, an extreme operational temperature range, and vacuum conditions.
Table 1: Unmanaged environmental temperatures for different mission types.

| Mission Type | Conditions |
| --- | --- |
| Pressurized Module | +18.3 °C to +26.7 °C |
| Low-Earth Orbit (LEO) | -65 °C to +125 °C |
| Geosynchronous Equatorial Orbit (GEO) | -196 °C to +128 °C |
| Trans-Atmospheric Vehicle | -200 °C to +260 °C |
| Lunar Surface | -171 °C to +111 °C |
| Martian Surface | -143 °C to +27 °C |
The values in Table 1 are unmanaged environmental temperatures and would decrease significantly for electronics and optics systems in a temperature-managed area, perhaps by as much as half.
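To make the table concrete, here is a small sketch that checks whether a transceiver rated for the common industrial temperature range (I-temp, roughly -40 °C to +85 °C) covers a mission's unmanaged temperature range (mission figures taken from Table 1; the helper names are our own):

```python
# Sketch: does a device's rated operating range fully contain a
# mission's unmanaged temperature range? Mission values are the
# unmanaged figures from Table 1, in degrees Celsius.
MISSION_TEMPS_C = {
    "Pressurized Module": (18.3, 26.7),
    "Low-Earth Orbit (LEO)": (-65, 125),
    "Geosynchronous Equatorial Orbit (GEO)": (-196, 128),
}

def covers(device_range, mission_range):
    """True if the device range fully contains the mission range."""
    return device_range[0] <= mission_range[0] and device_range[1] >= mission_range[1]

I_TEMP = (-40, 85)  # common industrial rating, in degrees Celsius
for mission, temps in MISSION_TEMPS_C.items():
    verdict = "OK" if covers(I_TEMP, temps) else "needs thermal management"
    print(f"{mission}: {verdict}")
```

As the article notes, temperature management can shrink the mission ranges considerably, which is what makes terrestrial-grade ratings plausible inside a managed enclosure.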
A substantial body of knowledge exists to make integrated photonics compatible with space environments. After all, photonic integrated circuits (PICs) use similar materials to their electronic counterparts, which have already been space qualified in many implementations.
Much research has gone into overcoming the challenges of packaging PICs with electronics and optical fibers for these space environments, which must include hermetic seals and avoid epoxies. Commercial solutions, such as those offered by PHIX Photonics Assembly, Technobis IPS, and the PIXAPP Photonic Packaging Pilot Line, are now available.
Takeaways
Whenever you want to send data from point A to B, photonics is usually the most efficient way of doing it, be it over a fiber or free space.
This is why EFFECT Photonics sees future opportunities in the free-space optical (FSO) communications sectors. In mobile access networks or satellite link applications, FSO can provide solutions with more bandwidth capacity, stronger security features, and lower power consumption than traditional point-to-point microwave links.
These FSO systems can be further boosted by using coherent optical transmission similar to the one used in fiber optics. Offering these systems in a small package that can resist the required environmental conditions will significantly benefit the access network and space sectors.
Tags: 100G, access capacity, access network, capacity, certification, coherent, free space optics, FSO, GEO, ground, LEO, lunar, Photonics, reach, satellite, space, SWAP, temperature, Transceivers
The Growing Photonics Cluster of the Boston Area
As they lit the candles in their ship, the Pilgrim families traveling on the Mayflower had no idea they would help build a nation that would become a major pioneer in light technology and many other fields.
The United States features many areas with a strong photonics background, including the many companies in California’s Silicon Valley and the regions close to the country’s leading optics universities, such as Colorado, New York, Arizona, and Florida.
However, the Greater Boston area, and Massachusetts in general, is becoming an increasingly important photonics hub, with world-class universities and many successful optics and photonics initiatives and companies. Let’s talk a bit more about the region’s legacy with light-based technology and the town of Maynard’s history with the high-tech industry.
From World-Class Labs to the Real World
The Boston area features many world-class universities collaborating with the government and industry to develop new photonics technology. Harvard, the Massachusetts Institute of Technology (MIT), Boston University, Tufts University, and Northeastern University are major research institutions in the area that lead many photonics-related initiatives.
The state of Massachusetts, in general, has also been home to several prosperous photonics businesses, and initiatives are being made to capitalize on Boston’s extensive medical industry knowledge to boost biomedical optics and photonics. Raytheon, Polaroid, and IPG Photonics are examples of Massachusetts-based businesses that promoted optical technology.
The US federal government and Massachusetts state are committing resources to get these academic and industry partners to collaborate as much as possible. In 2015, the Lab for Education and Application Prototypes (LEAP) network was established as part of a federal drive to revive American manufacturing. The Massachusetts Manufacturing Innovation Initiative, a state grant program, and AIM Photonics, the national manufacturing institution, each contributed $11.3 million to constructing labs around Massachusetts universities and colleges.
The LEAP Network’s objectives are to teach integrated photonics manufacturing practice, offer companies technician training and certification, encourage company engagement in tool, process, and application upgrades, and support AIM Photonics in its manufacturing and testing.
These partnerships form a statewide ecosystem to educate the manufacturing workforce throughout the photonics supply chain. The facilities’ strategic placement next to both universities and community colleges allows them to attract students from all areas and stages of their careers, from technicians to engineers to fundamental researchers.
From the Mill to High Tech: The Story of Maynard
A trip down Route 2 into Middlesex County, 25 miles northwest of Boston, will take one past apple orchards, vineyards, and some of Massachusetts’ most stunning nature preserves before arriving at a historic mill on the Assabet River. The community around this mill, Maynard, is a charming and surprisingly historical hub of economic innovation that houses an emerging tech ecosystem.
The renowned Assabet Woolen Mill was established for textile manufacturing in 1847 by Amory Maynard, who by the age of 16 was managing his own sawmill company. Initially a carpet manufacturing plant, Maynard’s enterprise produced blankets and uniforms for the Union Army during the Civil War. The company employed immigrants from Ireland, Finland, Poland, Russia, and Italy, many of them coming to the mill for jobs as soon as they arrived in the country. By the 1930s, the town of Maynard was recognized as one of the most multi-ethnic places in the state.
The Assabet Woolen Mill continued to create textiles until 1950. The 11-acre former mill complex, currently named Mill and Main, is the contemporary expression of the town’s evolution and relationship with innovative industry.
The Digital Equipment Corporation (DEC) moved into the facility before the end of the 1950s with just $70,000 in cash and three engineers. From the 1960s onward, DEC became a major global supplier of computer systems and enjoyed tremendous growth. It’s hard to overstate the company’s impact on Maynard, which became the “Mini Computer Capital of the World” in barely twenty years.
Following DEC’s departure, the mill complex was sold and rented out to a fresh group of young and ambitious computer startups, many of which are still operating today. Since then, more people and companies have joined, drawn by the affordable real estate, the pleasant commute and environs, and the obvious cluster of IT enterprises. For example, when Acacia Communications, Inc. was established in 2009 and needed a home, Maynard’s mill space was a natural fit.
Similarly, EFFECT Photonics is proud to make a home in Maynard’s historic mill space and be a part of this community’s innovative heritage. We hope our work can serve as a positive example and inspiration for the neighborhood and help more innovators and inventors come to Maynard.
The Future of Passive Optical Networks
Like every other telecom network, cable networks had to change to meet the growing demand for data. These demands led to the development of hybrid fiber-coaxial (HFC) networks in the 1990s and 2000s. In these networks, optical fibers travel from the cable company hub and terminate in optical nodes, while coaxial cable connects the last few hundred meters from the optical node to nearby houses. Most of these connections were asymmetrical, giving customers more capacity to download data than to upload it.
That being said, the way we use the Internet has evolved over the last ten years. Users now require more upstream bandwidth thanks to the growth of social media, online gaming, video calls, and independent content creation such as video blogging. The DOCSIS standards that govern data transmission over coaxial cables have advanced quickly because of these additional needs. For instance, full-duplex transmission with symmetrical upstream and downstream channels is permitted under the most current DOCSIS 4.0 specifications.
Fiber-to-the-home (FTTH) systems, which bring fiber right to the customer’s door, are also proliferating and enabling Gigabit connections quicker than HFC networks. Overall, extending optical fiber deeper into communities (see Figure 1 for a graphic example) is a critical economic driver, increasing connectivity for the rural and underserved. These investments also lead to more robust competition among cable companies and a denser, higher-performance wireless network.
Passive optical networks (PONs) are a vital technology to cost-effectively expand the use of optical fiber within access networks and make FTTH systems more viable. By creating networks using passive optical splitters, PONs avoid the power consumption and cost of active components in optical networks such as electronics and amplifiers. PONs can be deployed in mobile fronthaul and mid-haul for macro sites, metro networks, and enterprise scenarios.
Despite some success from PONs, the cost of laying more fiber and the optical modems for the end users continue to deter carriers from using FTTH more broadly across their networks. This cost problem will only grow as the industry moves into higher bandwidths, such as 50G and 100G, requiring coherent technology in the modems.
Therefore, new technology and manufacturing methods are required to make PON technology more affordable and accessible. For example, wavelength division multiplexing (WDM)-PON allows providers to make the most of their existing fiber infrastructure. Meanwhile, simplified designs for coherent digital signal processors (DSPs) manufactured at large volumes can help lower the cost of coherent PON technology for access networks.
The Advantages of WDM PONs
Previous PON solutions, such as Gigabit PON (GPON) and Ethernet PON (EPON), used time-division multiplexing (TDM) solutions. In these cases, the fiber was shared sequentially by multiple channels. These technologies were initially meant for the residential services market, but they scale poorly for the higher capacity of business or carrier services. PON standardization for 25G and 50G capacities is ready, but sharing a limited bitrate among multiple users with TDM technology is an insufficient approach for future-proof access networks.
WDM-PON uses WDM multiplexing/demultiplexing technology to ensure that data signals can be divided into individual outgoing signals connected to buildings or homes. This hardware-based traffic separation gives customers the benefits of a secure and scalable point-to-point wavelength link. Since many wavelength channels fit inside a single fiber, the carrier can retain very low fiber counts, yielding lower operating costs.
WDM-PON has the potential to become the unified access and backhaul technology of the future, carrying data from residential, business, and carrier wholesale services on a single platform. We discussed this converged access solution in one of our previous articles. Its long-reach capability and bandwidth scalability enable carriers to serve more customers from fewer active sites without compromising security and availability.
Migration to the WDM-PON access network does require a carrier to reassess how it views its network topology. It is not only a move away from operating parallel purpose-built platforms for different user groups to one converged access and backhaul infrastructure. It is also a change from today’s power-hungry and labor-intensive switch and router systems to a simplified, energy-efficient, and transport-centric environment with more passive optical components.
The Possibility of Coherent Access
As data demands continue to grow, direct detect optical technology used in prior PON standards will not be enough. The roadmap for this update remains a bit blurry, with different carriers taking different paths. For example, future expansions might require using 25G or 50G transceivers in the cable network, but the required number of channels in the fiber might not be enough for the conventional optical band (the C-band). Such a capacity expansion would therefore require using other bands (such as the O-band), which comes with additional challenges. An expansion to other optical bands would require changes in other optical networking equipment, such as multiplexers and filters, which increases the cost of the upgrade.
An alternative solution could be upgrading instead to coherent 100G technology. An upgrade to 100G could provide the necessary capacity in cable networks while remaining in the C-band and avoiding using other optical bands. This path has also been facilitated by the decreasing costs of coherent transceivers, which are becoming more integrated, sustainable, and affordable. You can read more about this subject in one of our previous articles.
For example, the renowned non-profit R&D center CableLabs announced a project to develop a symmetric 100G Coherent PON (C-PON). According to CableLabs, the scenarios for a C-PON are many: aggregation of 10G PON and DOCSIS 4.0 links, transport for macro-cell sites in some 5G network configurations, fiber-to-the-building (FTTB), long-reach rural scenarios, and high-density urban networks.
CableLabs anticipates that C-PON and its 100G capabilities will play a significant role in the future of access networks, starting with data aggregation on networks that implement a distributed access architecture (DAA) like Remote PHY. You can learn more about these networks here.
Combining Affordable Designs with Affordable Manufacturing
The main challenge of C-PON is the higher cost of coherent modulation and detection. Coherent technology requires more complex and expensive optics and digital signal processors (DSPs). Plenty of research is happening on simplifying these coherent designs for access networks. However, a first step towards making these optics more accessible is the 100ZR standard.
100ZR is currently a term for a short-reach (~80 km) coherent 100Gbps transceiver in a QSFP pluggable size. Targeted at the metro edge and enterprise applications that do not require 400ZR solutions, 100ZR provides a lower-cost, lower-power pluggable that also benefits from compatibility with the large installed base of 50 GHz and legacy 100 GHz multiplexer systems.
Another way to reduce the cost of PON technology is through economies of scale: manufacturing pluggable transceiver devices at high volume to drive down the cost per device. And with greater photonic integration, even more devices can be produced on a single wafer. This economy-of-scale principle is the same one behind electronics manufacturing, and it must now be applied to photonics.
Researchers at the Technical University of Eindhoven and the JePPIX consortium have modeled how this economy of scale principle would apply to photonics. If production volumes can increase from a few thousand chips per year to a few million, the price per optical chip can decrease from thousands of Euros to tens of Euros. This must be the goal of the optical transceiver industry.
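The volume scaling described above can be illustrated with a toy amortization model. The fixed-cost and per-unit figures below are hypothetical assumptions chosen only to show the shape of the curve; they are not numbers from the TU/e or JePPIX study:

```python
# Toy economy-of-scale model: fixed costs (masks, line setup, NRE) are
# amortized over yearly volume, on top of a per-unit production cost.
# Both figures are illustrative assumptions, not industry data.

def cost_per_chip(volume, nre_cost=5_000_000, unit_cost=20.0):
    """Estimated cost per optical chip at a given yearly volume (EUR)."""
    return nre_cost / volume + unit_cost

for volume in (5_000, 50_000, 500_000, 5_000_000):
    print(f"{volume:>9,} chips/year -> EUR {cost_per_chip(volume):,.0f} per chip")
```

Even this crude model reproduces the qualitative claim: moving from thousands to millions of chips per year drives the per-chip price from four figures down toward the per-unit floor of tens of euros.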
Takeaways
Integrated photonics and volume manufacturing will be vital for developing future passive optical networks. PONs will use more WDM-PON solutions for increased capacity, secure channels, and easier management through self-tuning algorithms.
Meanwhile, PONs are also moving into incorporating coherent technology. These coherent transceivers have been traditionally too expensive for end-user modems. Fortunately, more affordable coherent transceiver designs and standards manufactured at larger volumes can change this situation and decrease the cost per device.
How Many DWDM Channels Do You Really Need?
Optical fiber and dense wavelength division multiplex (DWDM) technology are moving towards the edges of networks. In the case of new 5G networks, operators will need more fiber capacity to interconnect the increased density of cell sites, often requiring replacing legacy time-division multiplexing transmission with higher-capacity DWDM links. In the case of cable and other fixed access networks, new distributed access architectures like Remote PHY free up ports in cable operator headends to serve more bandwidth to more customers.
A report by Deloitte summarizes the reasons to expand the reach and capacity of optical access networks: “Extending fiber deeper into communities is a critical economic driver, promoting competition, increasing connectivity for the rural and underserved, and supporting densification for wireless.”
To achieve such a deep fiber deployment, operators look to DWDM solutions to expand their fiber capacity without the expensive laying of new fiber. DWDM technology has become more affordable than ever due to the availability of low-cost filters and SFP transceiver modules with greater photonic integration and manufacturing volumes. Furthermore, self-tuning technology has made the installation and maintenance of transceivers easier and more affordable.
Despite the advantages of DWDM solutions, their price still causes operators to second-guess whether the upgrade is worth it. For example, mobile fronthaul applications don’t require all 40, 80, or 100 channels of many existing tunable modules. Fortunately, operators can now choose between narrow- or full-band tunable solutions that offer a greater variety of wavelength channels to fit different budgets and network requirements.
Example: Fullband Tunables in Cable Networks
Let’s look at what happens when a fixed access network needs to migrate to a distributed access architecture like Remote PHY.
A provider has a legacy access network with eight optical nodes, and each node services 500 customers. To give higher bandwidth capacity to these 500 customers, the provider wants to split each node into ten new nodes for fifty customers. Thus, the provider goes from having eight to eighty nodes. Each node requires the provider to assign a new DWDM channel, occupying more and more of the optical C-band. This network upgrade is an example that requires a fullband tunable module with coverage across the entire C-band to provide many DWDM channels with narrow (50 GHz) grid spacing.
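The arithmetic of this node split can be checked against the channel count of a 50 GHz grid. The C-band width used below is an approximate figure for illustration, not a vendor specification:

```python
# Back-of-the-envelope check that the Remote PHY node split in the text
# still fits in the C-band on a 50 GHz DWDM grid.

C_BAND_WIDTH_GHZ = 4500   # approximate usable C-band width (~191.6-196.1 THz)
GRID_SPACING_GHZ = 50     # narrow grid spacing from the example

channels_available = C_BAND_WIDTH_GHZ // GRID_SPACING_GHZ

legacy_nodes = 8          # original optical nodes
split_factor = 10         # each node splits into ten new nodes
channels_needed = legacy_nodes * split_factor  # one wavelength per new node

print(f"need {channels_needed} channels, 50 GHz grid offers ~{channels_available}")
```

The upgraded network consumes most of the grid, which is why this scenario calls for a fullband tunable module rather than a narrowband one.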
Furthermore, using a fullband tunable module means that a single part number can handle all the necessary wavelengths for the network. In the past, network operators used fixed wavelength DWDM modules that must go into specific ports. For example, an SFP+ module with a C16 wavelength could only work with the C16 wavelength port of a DWDM multiplexer. However, tunable SFP+ modules can connect to any port of a DWDM multiplexer. This advantage means technicians no longer have to navigate a confusing sea of fixed modules with specific wavelengths; a single tunable module and part number will do the job.
Overall, fullband tunable modules will fit applications that need a large number of wavelength channels to maximize the capacity of fiber infrastructure. Metro transport or data center interconnects (DCIs) are good examples of applications with such requirements.
Example: Narrowband Tunables in Mobile Fronthaul
The transition to 5G and beyond will require a significant restructuring of mobile network architecture. 5G networks will use higher frequency bands, which require more cell sites and antennas to cover the same geographical areas as 4G. Existing antennas must upgrade to denser antenna arrays. These requirements will put more pressure on the existing fiber infrastructure, and mobile network operators are expected to deliver their 5G promises with relatively little expansion in their fiber infrastructure.
DWDM solutions will be vital for mobile network operators to scale capacity without laying new fiber. However, operators often regard traditional fullband tunable modules as expensive for this application. Mobile fronthaul links don’t need anything close to the 40 or 80 DWDM channels of a fullband transceiver. It’s like having a cable subscription where you only watch 10 out of the 80 TV channels.
This issue led EFFECT Photonics to develop narrowband tunable modules with just nine channels. They offer a more affordable and moderate capacity expansion that better fits the needs of mobile fronthaul networks. These networks often feature nodes that aggregate two or three different cell sites, each with three antenna arrays (each antenna provides 120° coverage at the tower) with their unique wavelength channel. Therefore, these aggregation points often need six or nine different wavelength channels, but not the entire 80-100 channels of a typical fullband module.
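The channel count at such an aggregation point follows directly from the sites-times-sectors arithmetic described above:

```python
# Wavelengths needed at a fronthaul aggregation node: each cell site has
# three 120-degree antenna sectors, each on its own wavelength channel.

SECTORS_PER_SITE = 3

def channels_needed(sites_aggregated):
    """Wavelength channels required when aggregating several cell sites."""
    return sites_aggregated * SECTORS_PER_SITE

for sites in (2, 3):
    print(f"{sites} aggregated sites -> {channels_needed(sites)} channels")
```

Two or three aggregated sites land at six or nine channels, a fraction of the 80 to 100 channels a fullband module pays for.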
With the narrowband tunable option, operators can reduce their part number inventory compared to grey transceivers while avoiding the cost of a fullband transceiver.
Synergizing with Self-Tuning Algorithms
The number of channels in a tunable module (up to 100 in the case of EFFECT Photonics fullband modules) can quickly become overwhelming for technicians in the field. There will be more records to examine, more programming for tuning equipment, more trucks to load with tuning equipment, and more verifications to do in the field. These tasks can take a couple of hours just for a single node. If there are hundreds of nodes to install or repair, the required hours of labor will quickly rack up into the thousands and the associated costs into hundreds of thousands.
Self-tuning allows technicians to treat DWDM tunable modules the same way they treat grey transceivers. There is no need for additional training for technicians to install the tunable module. There is no need to program tuning equipment or obsessively check the wavelength records and tables to avoid deployment errors on the field. Technicians only need to follow the typical cleaning and handling procedures, plug the transceiver, and the device will automatically scan and find the correct wavelength once plugged. This feature can save providers thousands of person-hours in their network installation and maintenance and reduce the probability of human errors, effectively reducing capital and operational expenditures.
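Actual self-tuning implementations are vendor-specific, but the scan-until-locked idea can be sketched in a few lines. The `link_established` callback below is a hypothetical stand-in for the real handshake between the transceiver and the far end of the mux port:

```python
# Minimal sketch of a self-tuning scan: step through the module's channel
# plan until the far end acknowledges the link. `link_established` is a
# hypothetical probe function, not a real transceiver API.

def self_tune(channel_plan, link_established):
    for channel in channel_plan:
        # tune the laser to this channel, then probe the link
        if link_established(channel):
            return channel  # lock here; no technician input needed
    raise RuntimeError("no channel produced a valid link")

# Usage: in this simulated plant, the mux port only passes channel 37.
plan = range(1, 101)  # up to 100 channels on a fullband module
locked = self_tune(plan, lambda ch: ch == 37)
print(f"locked on channel {locked}")
```

From the technician's point of view, the whole loop is invisible: plug the module in, and it finds its own wavelength.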
Self-tuning algorithms make installing and maintaining narrowband and fullband tunable modules more straightforward and affordable for network deployment.
Takeaways
Fullband self-tuning modules will allow providers to deploy extensive fiber capacity upgrades more quickly than ever. However, in use cases such as mobile access networks where operators don’t need a wide array of DWDM channels, they can opt for narrowband solutions that are more affordable than their fullband alternatives. By combining fullband and narrowband solutions with self-tuning algorithms, operators can expand their networks in the most affordable and accessible ways for their budget and network requirements.
Scaling Up Live Event Coverage with 5G Technology
Everyone who has attended a major venue or event, such as a football stadium or concert, knows the pains of getting good Internet access in such a packed venue. There are too many people and not enough bandwidth. Many initial use cases for 5G have focused on achieving much higher speeds, allowing users to enjoy seamless connectivity for live gaming, video conferencing, and live broadcasting. Within a few years, consumers will demand more immersive experiences to enjoy sporting events, concerts, and movies. These experiences will include virtual and augmented reality and improved payment methods.
These experiences will hopefully lead to a win-win scenario: the end user can improve their experiences at the venue, while the telecom service provider can increase their average income per user. Delivering this higher capacity and immersive experiences for live events is a challenge for telecom providers, as they struggle to scale up their networks cost-effectively for these one-off events or huge venues. Fortunately, 5G technology makes scaling up for these events easier thanks to the greater density of cell sites and the increased capacity of optical transport networks.
A Higher Bandwidth Experience
One of the biggest athletic events in the world, the Super Bowl, draws 60 to 100 thousand spectators to an American stadium once a year. Furthermore, hundreds of thousands, if not millions, of out-of-towners visit the Super Bowl host city to support their teams. The amount of data transported inside the Atlanta stadium for the 2019 Super Bowl alone reached a record 24 terabytes. The half-time show caused a 13.06 Gbps surge in data traffic on the network from more than 30,000 mobile devices. This massive traffic surge in mobile networks can even hamper the ability of security officers and first responders (i.e., law enforcement and medical workers) to react swiftly to crises.
Fortunately, 5G networks are designed to handle more connections than previous generations. They use higher frequency bands for increased bandwidth and a higher density of antennas and cell sites to provide more coverage. This infrastructure enables reliable data speeds of up to 10 Gbps per device and more channels that enable a stable and prioritized connection for critical medical and security services. Carriers are investing heavily in 5G infrastructure around sports stadiums and other public events to improve the safety and security of visitors.
For example, Sprint updated its cell towers with Massive multiple-input, multiple-output (MIMO) technology ahead of the 2019 Super Bowl in Atlanta. Meanwhile, AT&T implemented the first standards-based mobile 5G network in Atlanta with a $43 million network upgrade. In addition to providing first responders with a quick, dependable communication network for other large events, this helps handle enormous traffic during live events like the Super Bowl.
New Ways of Interaction
5G technology and its increased bandwidth capacity will promote new ways for live audiences to interact with these events. Joris Evers, Chief Communication Officer of La Liga, Spain’s top men’s football league, explains a potential application: “Inside a stadium, you could foresee 5G giving fans more capacities on a portable device to check game stats and replays in near real-time.” The gigabit speeds of 5G can replace the traditional jumbotrons and screens and allow spectators to replay games instantly from their cellphones. Venues are also investigating how 5G and AI might lessen lengthy queues at kiosks for events with tens of thousands of visitors. At all Major League Baseball stadiums, American food service company Aramark deployed AI-driven self-service checkout kiosks. Aramark reports that these kiosks have resulted in a 40% increase in transaction speed and a 25% increase in revenue.
Almost all live events have limited seats available, and ticket prices reflect demand, seating preferences, and supply. However, 5G might allow an endless number of virtual seats. Regarding the potential of 5G for sports, Evers notes that “away from the stadium, 5G may enable VR experiences to happen in a more live fashion.”
Strengthening the Transport Network
This increased bandwidth and new forms of interaction in live events will put more pressure on the existing fiber transport infrastructure. Mobile network operators are expected to deliver their 5G promises while avoiding costly expansions of their fiber infrastructure. The initial rollout of 5G has already happened in most developed countries, with operators upgrading their optical transceivers to 10G SFP+ and wavelength division multiplexing (WDM). Mobile networks must now move to the next phase of 5G deployments, exponentially increasing the number of devices connected to the network.
The roadmap for this update remains a bit blurry, with different carriers taking different paths. For example, South Korea’s service providers decided to future-proof their fiber infrastructure and invested in 10G and 25G WDM technology since the early stages of their 5G rollout. Carriers in Europe and the Americas have followed a different path, upgrading only to 10G in the early phase while thinking about the next step.
Some carriers might do a straightforward upgrade to 25G, but others are already thinking about future-proofing their networks ahead of a 6G standard. For example, future expansions might require using 25G or 50G transceivers in their networks, but the required number of channels in the fiber might not be enough for the conventional optical band (the C-band). Such a capacity expansion would therefore require using other bands (such as the O-band), which comes with additional challenges.
An expansion to other optical bands would require changes in other optical networking equipment, such as multiplexers and filters, which increases the cost of the upgrade. An alternative solution could be upgrading instead from 10G to coherent 100G technology. An upgrade to 100G could provide the necessary capacity in transport networks while remaining in the C-band and avoiding using other optical bands. This path has also been facilitated by the decreasing costs of coherent transceivers, which are becoming more integrated, sustainable, and affordable. You can read more about this subject in one of our previous articles. By deploying these higher-capacity links and DWDM solutions, providers will scale up their transport networks to enable these new services for live events.
Takeaways
Thanks to 5G technology, network providers can offer more than just higher bandwidth for live events and venues; they will also enable new possibilities in how audiences take part. For example, audiences can instantly replay what is happening in a football match or use VR to attend a match or concert virtually. This progress in how end users interact with live events must also be backed up by the transport network. Discussions on how to upgrade the transport network are still ongoing, and they suggest that coherent technology could play a significant role in this upgrade.