Open Radio Access Networks Can Save Energy, Say Panelists at Industry Summit


Broadband players in both the fiber marketplace and 5G space are pushing for energy-efficient networks.

September 8, 2022 – Open radio access networks allow administrators to use software that saves energy, said Paul Challoner, vice president of Network Product Solutions for Ericsson, at Fierce Wireless’ Open RAN Summit on Wednesday.

Radio access networks connect personal devices – such as phones and computers – to core networks through radio waves. Open RANs’ components, unlike those of traditional RANs, are “interoperable” – a single network can function with pieces made by different manufacturers.

A crucial part of an Open RAN is the RAN Intelligent Controller, a software component that enables operators to obtain and run both native and third-party applications.

For instance, to eliminate energy waste during periods of light traffic, these controllers employ artificial intelligence software to direct signals only to designated base stations, allowing the others to hibernate. Traditional RANs lack this software capability and therefore must run all base stations at all times.
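To make that idea concrete, here is a minimal sketch, in Python, of the kind of sleep-scheduling rule a RIC application might apply during light traffic. The thresholds, the single “anchor” cell that absorbs redirected load, and the Cell data structure are all illustrative assumptions, not Ericsson’s or the O-RAN Alliance’s actual software.

```python
# Illustrative sketch only: a simplified rule a RIC application might use to
# hibernate lightly loaded cells. The thresholds, data structures, and the idea
# of a single "anchor" cell absorbing traffic are assumptions for illustration.

from dataclasses import dataclass

@dataclass
class Cell:
    cell_id: str
    load: float        # fraction of capacity in use, 0.0 - 1.0
    sleeping: bool = False

def plan_sleep_cycle(cells: list[Cell], low_traffic: float = 0.15) -> list[str]:
    """Return IDs of cells that can hibernate while a neighbor absorbs their load."""
    awake = [c for c in cells if not c.sleeping]
    # Keep the busiest cell awake as the anchor that serves redirected traffic.
    anchor = max(awake, key=lambda c: c.load)
    spare_capacity = 1.0 - anchor.load
    to_sleep = []
    for cell in awake:
        if cell is anchor:
            continue
        # Hibernate a cell only if its traffic is light and the anchor can take it.
        if cell.load <= low_traffic and cell.load <= spare_capacity:
            spare_capacity -= cell.load
            cell.sleeping = True
            to_sleep.append(cell.cell_id)
    return to_sleep

if __name__ == "__main__":
    cells = [Cell("A", 0.62), Cell("B", 0.08), Cell("C", 0.05), Cell("D", 0.40)]
    print(plan_sleep_cycle(cells))  # e.g. ['B', 'C']
```

In this toy model, only the lightly loaded cells B and C are put to sleep; their traffic is assumed to fit within the anchor cell’s spare capacity.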

Elsewhere in the telecom industry, broadband players are pushing for energy-efficient networks as well. Federal Communications Commission Commissioner Geoffrey Starks regularly calls for sustainable networks and net-zero carbon emissions. “We must continue to find ways to do more while using less,” he said in June.

Additionally, the Fiber Broadband Association recently released a report that found that fiber-to-the-home technology creates less CO2 than either DSL or cable broadband. What’s more, because FTTH users are disproportionately likely to work from home, the report found that FTTH also results in fewer commute-driven CO2 emissions.

Many believe that open RAN technology will play a substantial role in creating an alternative to proprietary telecommunications equipment. As concerns mount about Chinese telecom manufacturer Huawei – including its connections to the Chinese Communist Party – O-RAN has emerged as a viable alternative to provide 5G coverage.

To safeguard O-RAN networks, operators must employ a bottom-to-top approach to cybersecurity, said Douglas Gardner, chief technologist for Analog Devices’ CTO Office Security Center of Excellence, at Wednesday’s Open RAN Summit. Gardner argued that every part of the network – down to the chip level – must be constructed with potential cyberthreats in mind.

Ultimately, Gardner believes that expensive up-front investments in cybersecurity prevent far more expensive breaches down the road: “Good security costs vendors money. Get over it.”

David Flower: 5G and Hyper-Personalization: Too Much of a Good Thing?



5G, IoT and edge computing are giving companies the opportunity to make hyper-personalization even more ‘hyper’.

It’s very easy for personalization to backfire and subtract value instead of add it.

Consider the troubling fact that we may be arriving at a moment in hyper-personalization’s journey where the most hyper-personalized offer is no offer at all. Nobody likes to be constantly bombarded by content, personalized or not.

And that’s the paradox of hyper-personalization: if everyone’s doing it, then, in a sense, nobody is.

5G and related technologies such as IoT and edge computing are giving companies the opportunity to make hyper-personalization even more “hyper” via broader bandwidths and the faster processing of higher volumes of data.

This means we’re at a very interesting inflection point: where do we stop? If the promise of 5G is more data, better data, and faster data, and the result is knowing our customers even better so we can bug them even more, albeit in a “personal” way, when, where, and why do we say, “Hold on, maybe this is going too far”?

How do you do hyper-personalization well in a world where everyone else is doing it and where customers are becoming increasingly jaded about it and worried about how companies are using their data?

Let’s first look at what’s going wrong.

Hyper-personalization is very easy to mess up, and when you do mess it up, it has the exact opposite of its intended effect: it drives customers away instead of keeping them around.

Consider an online ad that pops up for you on a website a couple of days after you already bought the item being advertised. This is what I call “noise”. It’s simply a nuisance: the company placing that ad – or rather, the data platform generating its ad algorithms – should know that the person has already bought this item and present not a “repeat offer” but an upsell or cross-sell offer.

This sounds rudimentary in the year 2022 but it’s still all too common, and you’re probably nodding your head right now because you’ve experienced this issue.

Noise usually comes from what’s known as bad data, or dirty data. Whatever you want to call it—it pretty much ruins the customer experience.
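A minimal sketch of the fix this implies: check a purchase record before serving the ad, and swap a repeat offer for a cross-sell, or for no offer at all. The purchase store, user IDs, and cross-sell map below are hypothetical placeholders, not any real ad platform’s API.

```python
# Illustrative sketch: the kind of check an ad-decisioning layer could run so a
# buyer never sees a "repeat offer" for something they already own. The purchase
# store and the cross-sell map are hypothetical placeholders.

PURCHASES = {"user-42": {"noise-cancelling-headphones"}}
CROSS_SELL = {"noise-cancelling-headphones": "headphone-travel-case"}

def choose_offer(user_id: str, candidate_product: str) -> str | None:
    owned = PURCHASES.get(user_id, set())
    if candidate_product not in owned:
        return candidate_product              # normal personalized offer
    # Already bought: swap the noisy repeat ad for a cross-sell, or show nothing.
    return CROSS_SELL.get(candidate_product)  # may be None: "no offer" is valid

print(choose_offer("user-42", "noise-cancelling-headphones"))  # headphone-travel-case
```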

The second major issue is slow data: data that is used too slowly to be valuable, usually because it has to make the trip to the data warehouse before it can be incorporated into any decisions.

Slow data is one of the main reasons edge computing was invented: to process data as close to where it’s ingested as possible, in order to use it before it loses its value.

Slow data produces not-so-fun customer experiences such as walking half a mile to your departure gate at the airport, only to find that the gate has been changed, and then, after you’ve walked the half mile back to where you came from, getting a text message on your phone from the airline saying your gate has been changed.

Again, whatever you want to call it—latency, slow data, annoying—the end result is a bad customer experience.
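To put rough numbers on the difference, here is a toy simulation of the gate-change example: the same event handled at the edge, where it was ingested, versus after a trip through a warehouse batch load. The event fields and the 30-minute batch window are assumptions made purely for illustration.

```python
# Toy illustration of edge versus warehouse timing for the same event.
# The event fields and the 30-minute batch window are assumptions.

import time

BATCH_WINDOW_SECONDS = 30 * 60  # hypothetical warehouse batch interval

def notify(event: dict) -> None:
    print(f"Tell passenger {event['passenger_id']}: gate is now {event['new_gate']}")

def edge_latency(event: dict) -> float:
    """Act on the event where it was ingested; the only delay is processing time."""
    notify(event)
    return time.time() - event["ingested_at"]

def warehouse_latency(event: dict) -> float:
    """Act only after the event has waited for the next warehouse batch load."""
    notify(event)
    return (time.time() - event["ingested_at"]) + BATCH_WINDOW_SECONDS

event = {"ingested_at": time.time(), "passenger_id": "p-17", "new_gate": "B22"}
print(f"edge:      {edge_latency(event):.3f} seconds")
print(f"warehouse: {warehouse_latency(event):.0f} seconds")
```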

I have no doubt that the people who invented hyper-personalization had great intentions: make things as personal as possible so that your customers pay attention, stay happy, and stay loyal.

And for a lot of companies, for a long time, it worked. Then came the data deluge. And the regulations. And the jaded customers. We’re now at a stage where we need to rethink how we do personalization because the old ways are no longer effective.

It’s easy—and correct—to blame legacy technology for all of this. But the solution goes deeper than just ripping and replacing. Companies need to think holistically about all sides of their tech stacks to figure out the simplest way to get as much data as possible from A to B.

The faster you can process your data, the better. But it’s not all just about speed. You also need to be able to apply quick contextual intelligence to your data so that every packet is informed by all of the packets that came before it. In this sense, your tech stack should be a little like a great storyteller: one that knows what the customer needs and is feeling at any given moment, because it knows what has happened up to this point and how that will affect the customer’s decisions moving forward.
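Here is a bare-bones sketch of that storyteller-style context, where the next action for a customer depends on the running history of what has already happened. The event types, rules, and in-memory state store are invented for illustration and do not refer to any particular product.

```python
# Bare-bones sketch of a "storyteller" context: the next action for a customer
# depends on the running history of what has already happened. The event types
# and rules are invented for illustration, not any product's decision engine.

from collections import defaultdict

class StoryContext:
    def __init__(self):
        self.history = defaultdict(list)   # customer_id -> ordered prior events

    def next_action(self, customer_id: str, event: dict) -> str:
        prior = list(self.history[customer_id])   # snapshot of the story so far
        self.history[customer_id].append(event)
        # Rule 1: never pitch anything right after a complaint.
        if prior and prior[-1]["type"] == "complaint":
            return "hold all offers"
        # Rule 2: repeated interest in the same product earns a nudge.
        views = [e for e in prior
                 if e["type"] == "viewed" and e.get("product") == event.get("product")]
        if event["type"] == "viewed" and views:
            return f"offer discount on {event['product']}"
        return "no action"

ctx = StoryContext()
print(ctx.next_action("c-1", {"type": "viewed", "product": "router"}))  # no action
print(ctx.next_action("c-1", {"type": "viewed", "product": "router"}))  # offer discount on router
print(ctx.next_action("c-1", {"type": "complaint"}))                    # no action
print(ctx.next_action("c-1", {"type": "viewed", "product": "modem"}))   # hold all offers
```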

Let’s start thinking of our customer experiences as stories and our tech stacks as the storytellers—or maybe, story generators. Maybe then our personalization efforts will become truly ‘hyper-personal’— i.e., relevant, in-the-moment experiences that are a source of delight instead of annoyance.

David Flower brings more than 28 years of experience within the IT industry to the role of CEO of Volt Active Data. Flower has a track record of building significant shareholder value across multiple software sectors on a global scale through the development and execution of focused strategic plans, organizational development and product leadership. This piece is exclusive to Broadband Breakfast.

Broadband Breakfast accepts commentary from informed observers of the broadband scene. Please send pieces to commentary@breakfast.media. The views expressed in Expert Opinion pieces do not necessarily reflect the views of Broadband Breakfast and Breakfast Media LLC.

The next-generation wireless technology is being touted as the most secure yet.

WASHINGTON, July 28, 2022 – 5G technology can still present security concerns despite being touted as the most secure of the cellular generations, said Dan Elmore of the Idaho National Laboratory at a 5G Future event Thursday.

In response to the emerging challenge of validating 5G security protocols and data protection technologies, the Idaho National Laboratory established its Wireless Security Institute in 2019 to coordinate government, academic, and private industry research efforts to foster more secure and reliable 5G technology.

While the 5G standards offer a “rich suite” of security features, most of them are optional for manufacturers and developers to implement in their systems or devices, said Elmore, who is the director for critical infrastructure security at the INL. This poses a significant challenge for 5G, particularly for critical infrastructure applications, because consumers may not know how the standards have been implemented, Elmore said.

Elmore urged consumers, especially federal agencies, to ask the hard questions and consider “what vulnerabilities might be present in how they [manufacturers and developers] employ those standards that could be exploited.”

5G is designed to allow cellular devices to connect at higher speeds with lower latency, the delay in loading requests, than previous generations. Already, wireless carriers are incorporating it into devices and working on national 5G networks.

Because of its facilitation of real-time monitoring, 5G technology is expected to help tackle critical issues like climate change and environmental sustainability.

The technology has already been used by companies to monitor systems and make them more efficient in order to reduce emissions.

WASHINGTON, June 28, 2022 – Because of its facilitation of real-time monitoring and more efficient use of systems, 5G technology will help tackle climate change and beef up environmental sustainability, an Information Technology and Innovation Foundation event heard Tuesday.

5G technology’s ubiquitous connectivity and lower latency enable climate technologies that decarbonize manufacturing plants, support rainforest monitoring, and limit greenhouse gas emissions from transportation.

5G also enables real-time traffic control and monitoring that can help minimize carbon footprint, said John Hunter from T-Mobile, which has a large 5G network thanks in part to its merger with Sprint.

Finnish 5G equipment supplier Nokia has invested in smart manufacturing relying on the speed of 5G in its plants, which it said has resulted in a 10 to 20 percent reduction in carbon dioxide emissions, a 30 percent improvement in productivity, and a 50 percent reduction in product defects.

Non-profit tech startup Rainforest Connection has used 5G technology to install sensitive microphones in endangered rainforests in more than 22 countries around the world. These microphones pick up sounds in the forest and transmit them in real time to personnel on the ground.

These highly sensitive devices are camouflaged in trees and can pick up the sounds of gunfire from poaching and chainsaws from illegal logging from miles away. The technology has proven significant in rainforest conservation and will enable researchers and scientists to find innovative solutions to help endangered species as they study the audio.
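As a rough sketch of how such a monitoring loop might be wired together, the code below screens incoming audio clips for threat sounds and raises a real-time alert. The classify() stub stands in for a trained audio model; the article does not describe Rainforest Connection’s actual detection pipeline.

```python
# Illustrative sketch only: screening streamed forest audio for threat sounds
# and raising real-time alerts. classify() is a stub standing in for a trained
# audio model; the real pipeline is not described in this article.

def classify(clip: dict) -> str:
    """Placeholder for an audio model; here we just read a prelabeled field."""
    return clip.get("label", "ambient")

THREAT_SOUNDS = {"chainsaw", "gunshot"}

def monitor(audio_stream, alert) -> None:
    for clip in audio_stream:
        if classify(clip) in THREAT_SOUNDS:
            alert(clip)   # notify rangers on the ground in real time

# Usage with simulated clips standing in for live microphone audio.
stream = [{"label": "ambient"}, {"label": "chainsaw", "sensor": "tree-07"}]
monitor(stream, alert=lambda clip: print("ALERT:", clip))
```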

“By being able to integrate technologies such as 5G, we can accelerate that process… to achieve the mission [of mitigating climate change effects] sooner than we expected,” said Rainforest Connection CEO Bourhan Yassin.

