
Securing Gateways in the Internet of Anything/Everything (IoX)

A knowledge base article By Ernest Worthman

The Internet of Everything (IoX) will envelop everything from home automation to intelligent vehicles, wearables, industrial applications, the military and infrastructure – the list is almost endless. And there is a lot of discussion about securing these “things” on any number of levels.

One of the more visible and critical levels is the gateway. It is also one of the more frequently targeted. Why? Because if one can get into the gateway, one generally has access to devices beyond the gateway that are not secured.

A prime example of that is the PC. When a PC (or laptop, tablet, etc.) is up and running and being used, it is relatively open to access. While it certainly can be secured, the number of such devices that are not secured and easily compromised is large. And, generally, low-cost sensors, which will be ubiquitous in the IoX, are rarely, if ever, secured. So, gateway security should be at, or near, the top of the security pyramid.

However, the gateway is not the end-all or be-all. Doing a high-level flyover of the IoX, one begins to realize that all of these “things” are really part of an almost infinite number of networks – networks that have their own security issues, some local, some global, some in the infrastructure and some in the devices themselves.

And within these networks, infrastructures and devices exist any number of unique security challenges. While next-generation gateways promise the potential of some very sophisticated hardware that can address many of the challenges of complex and often convoluted networks, they will only be a cog in the intricate web of global interconnect.

Take wearables, for example. Wearables can be something as simple as a pace counter that tracks your vitals as you run. Perhaps it will simply talk to the IoX to download comparison data or upload the data to your personal network. On the other end, with advances in telemedicine, there might be something like a wireless version of a Holter monitor that not only continuously monitors cardiac activity and uploads the data via the IoX, but may be connected to any number of emergency responder networks or hospitals, as well.

What both have in common, as will almost all the devices within the IoX, is that there will be gateways within the path. And one cannot guarantee that all such devices are secure (although it is getting better).

However, all that aside, let us focus on the gateway and why securing it is so important.

The Gateway Function

As its name implies, the gateway is the “gate keeper” of a network – the point where all traffic funnels through. Depending on the type of network, the gateway can have a variety of functions and various levels of sophistication. Gateways can also be integrated with other components, generally a router or switch, or both. In such a capacity the gateway will partition the network into two separate components – one that is trusted and secure, and one that is untrusted and unsecure.

The future gateways of the IoX will, in their most complex form, be sophisticated computing and networking devices. Their capabilities will include aggregating data from a multitude of devices, acting as the fabric switch that routes device data, and providing security.

That segregation, and an understanding of the gateway’s role in the networks of the future, becomes critically important going forward. With the IoX vision of expanding connectivity throughout the world, components such as gateways will not only be connected to each other but, potentially, to every other device on the IoX. That means they will no longer be exposed to just localized threats, but to threats from any network, anywhere, including the global network. Therefore, it will be vital to ensure the trustworthiness of gateways, not only across networks, but also globally.

Next Generation IoX Gateways

IoX gateways will be a different breed. There are new technical requirements that these gateways will have to employ. Among them are:

  • Mesh and edge computing techniques. A lot of the data will be coming from the edge or the fog and will need to be handled close to the edge to conserve bandwidth and the time required to process it in the cloud.
  • Advanced designs that offer flexible platforms to accommodate a large variety of interfaces and network protocols, as well as complex software and exposed deployments. These designs must also protect the connectivity, so the gateway does not permit malicious attacks.
  • High levels of interoperability, as well as support for standards, including legacy network protocols for a time. This is necessary to provide the most flexible connectivity among the many types of components and devices from a plethora of different vendors.
  • Be capable of certification to a number of standards, both wireless and wireline, as well as other industry standards.
  • Be platform agnostic, so they can offer services to applications across the board, from structured data subsets to raw physical data from a broad set of devices.
  • Above all, they will need to have a high degree of autonomy. They will have to be ultra-reliable, self-configuring and remotely reconfigurable.

The Technology

To accomplish this, especially in the disparate environment that will exist as the IoX evolves, requires a number of interconnect options. In general, interconnect solutions integrated into gateway chipsets encompass the following. Of course, depending upon the application, these will vary from design to design. But in most cases, next-generation gateway solutions will have to support wired interfaces, including Ethernet, PCI Express, USB 2.0/3.0, SD/SDIO/eMMC, SPI, UART, and I2C/GPIO. They will also contain a variety of wireless interfaces, including Wi-Fi, Bluetooth, ZigBee, Z-Wave, Thread and their low-power brethren, and 3G/4G/5G radio protocols.

Fog computing will entail a number of functions, mainly data analysis, event management and routing. For example, the gateway can analyze sensor data from edge devices and make deterministic decisions about whether the data is authentic, is meaningful or requires further action. It can also aggregate this data, then package it and store or forward it based upon a set of criteria from the application. Figure 1 is an example of what one of these IoX gateways is capable of.
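To make the idea concrete, here is a minimal, hypothetical Python sketch of the kind of filter-aggregate-forward logic a fog-capable gateway might run. The names (SensorReading, forward_to_cloud, the range criterion) are invented for illustration; this is a conceptual example, not any vendor's implementation.

```python
# A minimal, hypothetical sketch of gateway-side fog processing: validate
# readings from edge sensors, discard unauthenticated or irrelevant data
# locally, and forward only the useful remainder upstream.
from dataclasses import dataclass
from typing import List


@dataclass
class SensorReading:
    device_id: str
    value: float
    signature_ok: bool  # result of whatever authenticity check the gateway runs


def is_meaningful(reading: SensorReading, low: float = 0.0, high: float = 100.0) -> bool:
    """Application-supplied criterion: keep only in-range readings."""
    return low <= reading.value <= high


def process_batch(readings: List[SensorReading]) -> List[SensorReading]:
    """Filter out unauthenticated or meaningless data before it leaves the edge."""
    return [r for r in readings if r.signature_ok and is_meaningful(r)]


def forward_to_cloud(batch: List[SensorReading]) -> None:
    # Placeholder for the real uplink (MQTT, HTTPS or similar).
    print(f"forwarding {len(batch)} readings upstream")


readings = [
    SensorReading("temp-01", 22.5, True),
    SensorReading("temp-02", 999.0, True),   # out of range, dropped at the edge
    SensorReading("temp-03", 21.0, False),   # failed the authenticity check
]
forward_to_cloud(process_batch(readings))
```

The point of the sketch is simply that the filtering and aggregation criteria come from the application, and only the data that survives them consumes uplink bandwidth.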

At the Edge

The edge and the fog will be pervasive in the IoX. In fact, edge and fog networks will be one of the major components of the IoX, and IoX gateways must be able to closely integrate with edge and fog networks.

The purpose of intelligence at the edge is to allow data to flow seamlessly between the cloud and the devices at the edge. Furthermore, for a time, there will be both legacy and new systems that will require integration, which is more efficiently handled close to the source. This is generally referred to as fog computing, and much of it can be handled by intelligent gateways.

 

Gateway Intelligence

Really, intelligence simply means a menu of technologies coupled with code that analyzes conditions and applies the correct solution. It can be integrated into the hardware as IP or as a software stack. A simple example of that is the radio interface. Assume that the gateway has an integrated multi-band, multi-frequency RF modem capable of working on all the flavors of wireless. It is a simple matter to add code to analyze the signal and process it.
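As a rough illustration of that kind of code, the hypothetical Python sketch below dispatches demodulated frames to per-protocol handlers. The protocol names and handlers are invented for the example; real demodulation and drivers live in the modem hardware and firmware.

```python
# Illustrative sketch only: dispatching demodulated frames from a multi-protocol
# radio front end to per-protocol handlers.
def handle_wifi(frame: bytes) -> None:
    print("Wi-Fi frame:", len(frame), "bytes")


def handle_zigbee(frame: bytes) -> None:
    print("ZigBee frame:", len(frame), "bytes")


def handle_ble(frame: bytes) -> None:
    print("Bluetooth LE frame:", len(frame), "bytes")


HANDLERS = {
    "wifi": handle_wifi,
    "zigbee": handle_zigbee,
    "ble": handle_ble,
}


def process_frame(protocol: str, frame: bytes) -> None:
    """Route a received frame to the matching protocol stack, or drop it."""
    handler = HANDLERS.get(protocol)
    if handler is None:
        print("unknown protocol, frame dropped:", protocol)
        return
    handler(frame)


process_frame("zigbee", b"\x01\x02\x03")
process_frame("lora", b"\xff")  # not supported in this sketch, so it is dropped
```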

Other intelligence, such as determining the relevancy of data from an edge sensor, functions similarly. A series of conditions is coded into the application that analyzes the data; if the data meets those conditions, it is valid; if not, it can be deemed an error or irrelevant, or even routed for further analysis. AI and fuzzy logic can be used to “teach” the application, or the gateway, to make better decisions and reduce the margin of error.

This will become more and more important as the IoX unfolds, simply because of the massive amounts and diversity of data that will be part of the IoX. To keep up with that, intelligent gateways will require advanced processors and specialized chips to handle the load.

Gateway Security

The importance of gateway security cannot be overemphasized. The reason is that many of the IoX devices will be of the low-cost, low-tech variety – simple sensors that will be challenged to have anything other than the most basic security, if that. One can argue that this may change as technology advances, but the reality is that many common low-cost sensors will have razor-thin margins, and OEMs are reluctant to add the cost of security at these points.

And there are no real indicators that this is changing. Therefore, the security burden falls on the other devices in the loop, and gateways are a reasonably good solution from a system perspective. At the higher level, IoX devices will have encryption by default so that issue becomes less of a problem for gateways.

But that does not mean gateways do not have vulnerabilities. When one looks at the ways an access point can be compromised, there are many levels at which the devices on it can be attacked.

One of the issues relevant to that harkens back to legacy devices. Gateways must be able to pass a variety of data, and legacy devices generally use very simple protocols that can easily be used as a vehicle to “trick” gateways and other devices into allowing malicious software to get by. So, gateways must understand the simple structures of legacy equipment yet be aware of the vulnerabilities.

Another consideration for securing gateways is complexity. Gateways run more complex software with raised levels of communications. They communicate not only with the IoX devices but also have command and control capabilities. This makes them a higher-risk, and higher-value, target.

In that vein, the gateway is responsible for proving to the systems it communicates with on the back side that it is running authentic boot software and the right application stack, and that the data feeds passing through are verified. With gateways, that is less of an issue than with many of the IoX devices, since they are relatively high-end and can bear the cost of high levels of security. Gateways can integrate high-end systems on chip (SoCs) that have dedicated microcontroller units (MCUs) that do things like separate execution routines and identity verification.
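As a simplified illustration of the attestation idea, the sketch below measures a boot image and compares it with a provisioned reference digest before reporting “trusted” to upstream systems. Real gateways would use signed measurements anchored in a secure element; the image bytes and reference value here are placeholders, so treat this purely as a conceptual example.

```python
# Conceptual sketch: measure a boot image and compare it with a provisioned
# reference digest before reporting "trusted" to upstream systems.
import hashlib
import hmac


def measure_image(image: bytes) -> str:
    """Produce the SHA-256 measurement of a boot image."""
    return hashlib.sha256(image).hexdigest()


def image_is_trusted(image: bytes, reference_digest: str) -> bool:
    # compare_digest avoids leaking information through comparison timing.
    return hmac.compare_digest(measure_image(image), reference_digest)


firmware = b"example boot image contents"          # placeholder image
reference = hashlib.sha256(firmware).hexdigest()   # provisioned at manufacture
print("trusted:", image_is_trusted(firmware, reference))           # True
print("trusted:", image_is_trusted(b"tampered image", reference))  # False
```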

Another platform that works well with gateways is the firewall. For example, one area of advancement is focusing on allowed and rejected relationships and transactions inside the firewall, at very sophisticated levels. The advantage is that when there are highly defined criteria of what should, and can, happen, behavior outside of what is defined is quickly, and more precisely, identified. That means the firewall itself can provide a significant layer of security within gateways.
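Conceptually, such an allow-list firewall reduces to checking each observed relationship against the set of defined ones and flagging everything else. The hypothetical sketch below, with invented device names and destinations, shows the idea.

```python
# Sketch of the allow-list idea: the firewall knows exactly which
# device-to-destination relationships are expected, and anything outside that
# set is flagged immediately.
ALLOWED_FLOWS = {
    ("meter-17", "billing.example.internal", 443),
    ("cam-03", "nvr.example.internal", 554),
}


def check_flow(src: str, dst: str, port: int) -> bool:
    """Permit a flow only if it matches a defined relationship; alert otherwise."""
    if (src, dst, port) in ALLOWED_FLOWS:
        return True
    print(f"ALERT: unexpected flow {src} -> {dst}:{port}")
    return False


check_flow("meter-17", "billing.example.internal", 443)  # allowed
check_flow("meter-17", "198.51.100.9", 22)               # flagged
```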

Missive

Secure gateways will become a major player in the IoX infrastructure. The IoX will have so many devices, networks, and systems that perimeter security will become a principal element and be implemented on a massive scale. Perimeter defense is emerging as an essential element in the overall security platform.

Securing devices is a given. But many of the generic, lower end devices will not have adequate security to prevent compromise. Next-generation gateways will be sophisticated devices that will integrate a myriad of technologies. To protect those technologies, they will also integrate a number of advanced security platforms.

Gateways are coming of age. They are taking on new roles and reaching new heights of sophistication. As a tool in the security wheelhouse, they will play an integral part in the protection of the new IoX.

 

What Happened in 2021, and What’s Going to Happen in 2022

Contributed article by John Strand, Strand Consulting

2021 started with the Consumer Technology Association (CTA) turning its physical event, the Consumer Electronics Show (CES) in Las Vegas, into a digital event.  In February, GSMA made its Barcelona Mobile World Congress (MWC) into a combination digital and physical event. Usually, MWC attracts about 100,000 guests. In 2021, there were about 20,000 participants, of which only 5,000 came from outside of Spain.

GSMA expects to hold MWC 2022. The question is whether the physical part will be much bigger than it was in 2021. Although there are fewer hospitalizations with this COVID-19 variant wave, more people become sick and need to stay in bed for at least a week. This variant is more likely to affect business negatively, particularly labor-heavy companies. That, in turn, is quite likely to affect the physical portion of MWC negatively, because unnerved potential convention-goers probably will remain at home.

2022 will continue to showcase that the mobile telecommunications industry plays a critical role in enabling modern society to function. Telecom companies should receive more goodwill and should become better at exploiting it.

Strand Consult’s study of the broadband middle mile showed that rural broadband providers face many challenges in their effort to deliver broadband to disparate customers over large geographic areas. The study creates transparency for policymakers about the cost, level and source of internet traffic. It demonstrates that five so-called Big Streamers account for a disproportionate share of downstream traffic. For every $1 of revenue earned from the five Big Streamers (Netflix, YouTube, Amazon, Disney+ and Microsoft), rural broadband providers incurred 48 cents in middle mile costs (equipment, electricity and labor), which they could not recover from the Big Streamers, end-users or government reimbursement programs.

The Big Streamers have tremendous global market power and ignore broadband providers’ requests to negotiate on cost recovery. Moreover, the free caching solutions proffered by the Big Streamers add costs to networks and only serve proprietary content. Strand Consult observes that there is a pervasive problem of unrecovered cost at the local, national and international levels that threatens sustainability and undermines policy to close the digital divide. In 2022, this problem will have an effect on the forward motion of the broadband segment.

Many Talk About OpenRAN, but Mobile Operators Still Buy Classic RAN

In our opinion, O-RAN is one of the most overhyped technical solutions since the launch of 3G wireless communications in the year 2000. Although the claim that O-RAN will cut RAN capital expense (capex) by as much as half does not rise to the level of the 3G-era hyperbole that radio spectrum would turn into gold, it comes close.

In August 2021, Nokia paused its work in the group for fear of violating U.S. restrictions on the many Chinese members. Strand Consult has yet to find an O-RAN proponent who can explain how the prevalence of 44 Chinese companies in the O-RAN Alliance does not compromise O-RAN.

O-RAN is being promoted by industry and governments from the United States, Japan, Germany, the United Kingdom and even Russia as trade policy and enterprise enhancements, though the O-RAN market itself appears to be growing minimally. The U.S. executive branch stopped referring to the O-RAN Alliance in its policy communications and now refers to the O-RAN Policy Coalition as if it were a technical standards development organization. Yet the coalition’s own website clearly shows that it is an advocacy organization whose purpose is to influence governments on behalf of its member companies.

It is important to understand that O-RAN is built on top of 3GPP 4G and 5G technologies. It is not a solution that can replace existing networks on a 1:1 basis. Nor do O-RAN technologies support 2G and 3G, which most of the world still uses for machine-to-machine (M2M) communications and telephony. If a legacy operator wants O-RAN, it probably would have to maintain two sets of parallel base stations, one set for 2G and 3G and the other 4G and 5G. Running two parallel networks increases rental and energy costs, compared with running one network.

If O-RAN reaches the level of success its proponents predict, it will account for less than 1 percent of the 5G mobile sites in 2025 and not more than 3 percent in 2030. It looks as though O-RAN is too little, too late, to make a difference in a world in which operators are deploying 10,000 classic 5G sites every month.

At the end of the day, mobile operators’ job is to deliver a great network experience to their customers. O-RAN technologies offer only limited features compared with the 200 3GPP 5G networks launched globally by the end of 2021. In practical terms, one cannot compare the functionality of Rakuten’s 4G and 5G network in Japan with the functionality of an American 4G and 5G network.

The United States will evolve further as it upgrades from 3GPP Release 15 to Releases 16 and 17, and Rakuten probably will fall farther behind. The O-RAN claims are even further distorted when proponents say the O-RAN is a way for Europe to catch up with the United States, China and Korea in 5G technology. Note that the United States and South Korea achieved 5G leadership without the use of Huawei and ZTE equipment or the use of O-RAN.

2022 will see continued O-RAN advocacy, although it will be more difficult for its proponents to evade the tough questions about the hard reality.

China and Huawei Probably Will Have Another Difficult Year

When Joe Biden became president in January 2021, many wondered how U.S. policy would change concerning China and Huawei. Strand Consult maintained that the policy was bipartisan and was unlikely to change, and, if anything, it might be toughened, particularly as reforms adopted in 2018 gave the new administration additional tools to prosecute human rights violations.

Huawei still faces significant financial pressure, and public opinion about Huawei has not changed. Many countries see it as unsafe and unsustainable to use Huawei equipment in telecommunications networks. Many operators have experienced increased reputational and regulatory risk by using Huawei, and corporate customers do not want their sensitive and valuable data to be vulnerable to the Chinese government.

In any event, the good news is that it need not be expensive to rip and replace Chinese equipment. As operators evolve to 5G, they have planned for upgrade costs already, and fortunately, there are many competitively priced alternatives to Huawei.

Huawei has pivoted to the cloud market and attempts to bill itself as a trustworthy IT supplier for the public and private sectors and as an alternative to the large IT software companies that supply a combination of services and a cloud. Huawei probably will succeed with its strategy in China and in some countries sympathetic to the Chinese regime. However, it will be a hard sell for Huawei to convince public sector buyers in the United States and Europe to buy its solution of putting data into Chinese IT systems and the Chinese cloud.

Cybersecurity Is Getting Even Bigger

In 2021, it was telling how gatherings of leaders from across the political spectrum, from developed and emerging countries alike, viewed cybersecurity. All nations are concerned about addressing serious global problems such as illicit finance, human trafficking and ransomware driven by rogue nations and crime cartels. This concern means that secure networks and the practices to defend them will become even more important in 2022.

Both the United States and the European Union have rolled out new policies and regulations to improve network security, including for 5G. These include the European Union’s 5G Toolbox and the U.S. Secure Equipment Act, which directs the FCC to deny equipment authorizations to firms posing an unacceptable national security risk. These companies include Huawei, ZTE, Hytera, Hangzhou Hikvision and Dahua. Drone maker DJI most likely will be added, and many national security experts say restrictions should be increased for Lenovo, TikTok and chipmaker YMTC.

Strand Consult believes the push for greater security is incompatible with O-RAN technologies, which are increasingly influenced by Chinese players.

Amazon, Google, Facebook, Microsoft – Big Tech Mutates Faster Than Corona

There is good news and bad news about big tech. Just when health authorities believe they have the virus under control, a new variant emerges. Similarly, governments are trying to regulate big tech. Yet, just when it seems that big tech could be pinned down, big tech adapts to the new reality – with a new name, a new practice or a new public-private partnership.

The conversation about big tech and its role in society will continue in 2022. Policymakers must realize that big tech is adapting faster than the efforts to regulate it. If anything, the regulations adopted to date, such as the General Data Protection Regulation (GDPR), have made big tech even stronger. Today, these companies’ revenue, market share and earnings have increased, compared with the time before regulation. Additionally, such regulation has made it harder for small and medium-sized companies to compete.

The bottom line is that efforts to regulate big tech have failed.  Governments should instead make big tech pay for its use of resources. Current policy allows big tech a free ride on telecom networks and the public’s airwaves. These giveaways only increase big tech companies’ market share and profitability.

These are important lessons as policymakers look at the cloud market.

The Cloud Explodes in 2022

Policymakers will turn their attention to public clouds, which hold an increasing amount of citizen and enterprise data. Big tech probably has more knowledge and data about people and firms than the government itself. In 2022, cloud services from Amazon, Microsoft and Google will emerge in the public consciousness. It is hard to see how a Chinese alternative could gain traction in this market, but it still raises questions about existing cloud practices.

Mobile operators put parts of their networks in Amazon, Microsoft and Google clouds. As mobile networks are increasingly integrated with clouds, this means that individuals and firms are even more embedded with big tech. There is no turning off big tech and no choosing not to use it.

This situation adds to the complexity and difficulty of data portability from one cloud to another. In practice, companies may find it impossible to migrate from one cloud to another.  Although this sets off alarms in the antitrust world, it does not diminish the technical reality that cloud services from Amazon, Microsoft and Google are not comparable 1:1. In practice, Amazon, Microsoft and Google will not achieve the same result if you use the three platforms’ AI solutions to analyze your data.  One big question in 2022 is which has the most intelligent AI solution: Amazon, Microsoft, or Google?

One thing is for sure: It is far easier to switch the vendor of 5G infrastructure equipment than to switch cloud providers.

The Markets for Mobile Phones and Services Are Boring

Strand Consult has chronicled the development of the mobile phone market and has published popular reports on the iPhone. It has grown banal to watch Apple launch subsequent new versions of the iPhone that look nearly identical to the one before. With few technical improvements in each subsequent phone, the main difference is the model number. In 2021, Apple released iPhone 13, and in 2022, there most likely will be an iPhone 14, and so on. It is a testament to the company’s marketing that it has been able to navigate inevitable device fatigue.

Mobile apps also lumber on with subsequent versions. The key development in 2021 has been the use of mobile apps to manage COVID-19, and that trend will continue in 2022. Additionally, governments have entered the mobile app market in a big way with vaccine passports, which for many countries have become or will become de rigueur.

Tower Companies Spread in the Value Chain

Tower companies are an important part of the efforts to find profitability in an increasingly difficult telecom market. Many mobile carriers have discovered that they can sell their towers and turn previously unrealized assets into cash. In Europe alone, selling towers has contributed some 36 billion euros to the mobile industry.

Around the world, we see tower companies starting to spread in the value chain. In Brazil, they invest in fiber, while others consider whether to enter the spectrum market. In 2022, we will see much more of this activity.

A case study is Denmark’s TDC. Three Danish pension funds (PFA, PKA and ATP), together with Macquarie Infrastructure and Real Assets, have chosen to split the telecom operator into an infrastructure company and a service company. The two new entities will be TDC Net for infrastructure and Nuuday for service. We believe this to be simply financial acrobatics. The trend of breaking up telecommunications companies into infrastructure and service entities will be seen increasingly in 2022.

The Market for Private 5G Networks Is Hot and Crowded

In 2021, much was written about private 5G networks, such as who will build them and who will run them. It’s a market that many want to enter, everyone from mobile operators to IT companies to systems integrators to infrastructure suppliers. O-RAN players also want in, though it remains to be seen whether they can meet the heavy demands of a classic mobile network. Expect fierce competition, very low margins and an inevitable shakeout.

The C-band Cha-cha

The United States notched an unparalleled success with the C-band spectrum auction, a U.S. spectrum record at more than $90 billion. Mobile operators were set to launch 5G in this band on Dec. 5, 2021, but the launch was hijacked by the Federal Aviation Administration (FAA), which posted a dubious advisory about 5G transmissions and altimeters.

U.S. planes fly to more than 50 countries where some 200 5G networks operate, and there have been no reports of interference between 5G transmissions and altimeters. The FAA, which has known of 5G for years, has done nothing to modernize altimeters. The question is which aviation lobby the FAA is protecting, which most likely is small aircraft operators and possibly helicopter operators that don’t want to upgrade their safety equipment.

Commercial aircraft makers such as Boeing produce planes with three modern altimeters each. Their requested mitigation was a guard band of 110 MHz; the FCC doubled it. U.S. operators also volunteered to reduce power levels around U.S. airports for six months to prove compatibility and will roll out 5G in the band on Jan. 5.  Thus, the United States has the most generous, though excessive, protections for altimeters in the world.

All in all, we’re going to see that the big markets are going to set agendas for other markets. Much of what is needed requires political goodwill in a world in which the political system rarely understands the importance of what is happening.

2022 Will Show Rising Prices in the Wireless Space

After mobile and broadband prices have fallen over time, 2022 should be the year when prices rise around the world. Look no further than little Denmark, where, in 2021, the telecom regulator was found colluding with energy companies to fix the wholesale price of fiber access at a level above what the market offers. As such, prices are guaranteed to rise in Denmark because of regulators’ efforts. Given that the regulated price of fiber will increase, broadband prices on private networks will follow.

We also expect that many of the operators that have difficulty creating value for their shareholders through organic growth will raise prices in 2022. It follows that a highly valuable service such as broadband telecommunications should increase in price. This is the law of demand, and without price increases, it will be difficult to invest in network upgrades.

_________________

John Strand is CEO of Strand Consulting.

 

Opinion: Why We Never Seem to Get Ahead in Cybersecurity

By Ernest Worthman

Cybersecurity is one of my favorite topics to discuss. It is also one I frequently beat up on. I lay most of the blame for poor cybersecurity on the supply side. But the demand side deserves its fair share of the blame, as well.

Much of my finger-pointing at the supply side has to do with economics. In most cases, especially in the consumer segment, security represents a cost with no tangible benefit (read: value to the manufacturer). Given the highly competitive situation in the consumer segment, manufacturers usually provide only the minimum, cheapest solution.

Business, government, industry and such are a bit more knowledgeable about good cybersecurity. That is because they are more educated about it and have more to lose. Additionally, many players already have been victims of cyberattacks and have suffered economically. Most are willing to eat the cost of keeping a good security screen up because it is less than the cost of an attack.

To be fair, keeping ahead of malicious code and bad actors is not easy. Corporations spend millions of dollars and countless person-hours securing their perimeters. Sometimes, in spite of their best efforts, they still get hacked. Malware and other nefarious code is in a constant state of flux. New threats emerge constantly, and keeping up with them is challenging and costly.

Unfortunately, not all are so vigilant. Some organizations get lax for a number of reasons. Perhaps they are in a bit of a financial crunch. Perhaps they have not been exposed to the havoc an attack can render. Or perhaps they may not have the proper intellectual resources or the attitude to understand the gravity of what a breach can do and how much money is at stake.

It seems as though not having proper intellectual resources or the proper attitude never would be the case, considering all the noise and visibility of poor cybersecurity. Yet it is the case. Just look back a few months to the FireEye debacle. That breach was due to poor oversight and some sloppy housekeeping. The truth is that most breaches stem from poor due diligence on the part of an organization.

Whether it is consumer or commercial, there are many more reasons than fit the space in this column as to why security is such a nagging problem. The issues cut across both consumer and commercial lines. The issues are similar, in many cases. The difference is in the complexity of hardware and the size of the particular environment.

Perhaps the biggest challenge comes from antiquated equipment running on aging software. The challenge is the cost to replace or upgrade it. It is one thing to replace an aging router in a home. However, for a large organization upgrading hardware to plug security holes, it is not only expensive, but it also has additional soft costs in downtime, lost revenue and outside costs, for example, for experts, analysts and added outside manpower. Organizations, both public and private, often have a hard case to make for spending money to replace something that is currently working, even if it is a security risk.

On the uptime side, take, for example, a municipal water system. Today, hundreds of municipal water systems manage billions of gallons of water with old hardware. Most of them are running years-old, if not decades-old, operational technology (OT). The downtime required to change out hardware and software to improve security, if the systems are managing the water well (no pun intended), can cause interruptions in water service to customers – including critical systems such as connections for fire protection. Similar situations exist in other utilities, too.

In many cases, system operators are simply afraid to patch or update systems, often because patch or update history has been a nightmare. This results in systems, with hundreds of exploitable vulnerabilities, running on operating systems that have long since aged out.

In fact, this is exactly what occurred earlier this year. A couple of water systems were specifically targeted for these very reasons. The particular systems had the front, side and back doors, and all the windows, wide open, allowing a bad actor to simply walk in, because the priority was keeping the system from going down.

This is also the case with hospitals and particularly medical equipment. Today, everything is connected to a computer: X-ray machines, MRIs, CTs, vital sign monitors, infusion port devices, EKGs, EEGs … the list goes on and on. Part, or all of it, is usually outward-facing to the internet at some point.

Medical equipment has an additional roadblock. Medical hardware and software are under the thumb of the Food and Drug Administration, and any changes to such systems have to be blessed by the agency. That is never fast, easy or cheap. Even if organizations do patch or update, if the change is not properly sanctioned and all the hoops are not jumped through, they face the possibility of a lawsuit or decertification should something happen.

There are similar issues across nearly all segments of business and industry. Many of the challenges are unique to the specific segment.

On the consumer side, smart may make life easier, but this smart new world of devices is creating a new batch of headaches in the cybersecurity space. The lack of diligence by consumers coupled with the lack of security by manufacturers is creating nightmares. Smart TVs, smart light bulbs, smart power strips, smart home security, smart toasters, smart refrigerators, smart HVACs and smart pretty much everything else are placing a plethora of devices, generally all facing inward to a network and many outward to the internet. Everything is connected to everything else, as well as to personal data.

Exploits abound. Aside from the well-worn baby cam hacks, many well-known exploits target some brands of smart TVs, as well as the Roku smart TV platform. New ones emerge daily. Additionally, smart TVs are getting smarter, adding more vectors for ingress, such as built-in cameras and microphones. These components are easy to capture, allowing bad actors to get a peek into your world.

This will not stop with TVs. Count on just about everything getting a video and audio interface at some point. Washers, dryers, refrigerators, stoves, maybe even your electric toothbrush as it videos your teeth and tells you how bad a job you are doing at brushing.

A bit farfetched? Today, perhaps, but I have to wonder what devices will be like in 20 years.

Sadly, most consumers simply take whatever device they get out of the box, download the app and read just enough of the instructions to connect it with their Wi-Fi, if it is not auto-sensing and connecting on its own. There are, generally, ample notices to update the software, if needed, and change the password once the device is connected, but these usually go unheeded. The fact is that, in spite of all of this, people rarely update the firmware or change the default settings.

Is there a way to make the world security-conscious? The enterprise has a better chance at it than the consumer segment. However, many consumer devices are beginning to interface with manufacturers or other businesses, like my smart thermostat manufacturer, which tracks my usage and gives me reports. Another example is medical devices, many of which are capable of interfacing with your doctor, the hospital and first responders, and can even be monitored by the manufacturer for battery life, calibration and more.

Getting everyone on the same page will be impossible. There are just too many complex variables in the equation, including cost, competition, data value and dependence on other systems for functionality. The list goes on and on.

No one single solution will be the magic cure-all to cybersecurity woes. It is unlikely the consumer will ever become as security conscious as necessary. That means consumer devices will have to become self-securing. What that looks like is up for grabs, but that is the only way the consumer security space will become sufficiently secure.

On the enterprise and industrial side, there simply needs to be an unwavering priority placed on security. Everyone and everything needs to be up to date. Security teams need to be at the ready and vigilant, always looking for potential security leaks, holes and access points.

Although we will never be able to achieve a 100 percent secure network, being on top of cybersecurity will both prevent as much as possible and lessen the damage of the breaches that do occur. Simply put: Pay attention to what is going on in this space, and throw enough resources at it to cover all bases as best as possible.

____________________

Ernest Worthman is an executive editor with AGL Media Group.

Mission Critical Partners Offers Advanced Training

Mission Critical Partners has launched an advanced training program aimed at preparing public safety and justice agencies to be cognizant of cyberattack threats, why and how they work, and how to strengthen their defenses against them, according to a statement from the company.

“Cybersecurity continues to be a persistent challenge for government agencies, including those operating in the public safety, justice, and other mission-critical sectors,” the statement reads. “These entities must be constantly vigilant in their efforts to prevent breaches, a task made incredibly difficult given the ingenuity of cyberattackers, the increasing quantity of attacks targeting the public sector, and the reality that attack vectors evolve by the hour.”

In 2021, ransomware attacks increased more than 300 times over the same period in 2020, according to the company’s president of lifecycle management services, David S. Jones. According to Jones, a massive number of new records landed in dark web data markets, giving cybercriminals added fuel to execute phishing attacks, typically via emails that appear at first glance to be legitimate. He said the goal is to entice the recipient to unwittingly unleash malware by opening the email or clicking on an attachment.

Mission Critical Partners’ training program is designed specifically for public-sector agencies and is available as two separate training courses, according to the company statement.

The first course, “Advanced Cybersecurity for Leadership,” is designed for an organization’s leadership and would educate them regarding the importance of cybersecurity and, on a high level, how to achieve it. The company said the goal of the course is to ensure that those in the leadership program can develop a solid foundational strategy for defending against cyberattacks.

The second course, “Advanced Cybersecurity for the Front-Line Employee,” would educate front-line staff, including telecommunicators and supervisors, regarding the importance of good cyber-hygiene practices, the latest emerging threats, and how to identify and take ownership of their role in improving the cybersecurity posture of their organization.

Mission Critical Partners said that each course consists of two two-hour classes that will be available virtually or on-site.

Shades Of FireEye, Again

By Ernest Worthman


By now, everybody and their brother has shouted out that T-Mo got hacked. With all the noise about security of late, and the highly visible and embarrassing breaches of companies such as FireEye, one has to wonder how T-Mo could possibly have a vulnerability that would allow a breach of an eight- or nine-figure number of records. But, to be fair, T-Mo is in good company. Another recent attack, a ransomware campaign against tech-management software from a company called Kaseya, was said to have hit as many as 1,500 organizations, of which about 50 were what are called managed services providers (MSPs). Kaseya has about 40,000 customers using the tool that was the target of the attack.

This is a bit of a different approach to ransomware. Normally, ransomware attacks take advantage of security loopholes, such as common passwords without two-factor authentication. This one was much more sophisticated: it attacked Kaseya’s software, a unified remote monitoring and management tool for handling networks and endpoints, through an authentication bypass vulnerability in the Kaseya VSA web interface. It just so happens that the web interface contained two gaping flaws in the software.

These flaws allowed the attackers to circumvent authentication controls, gain an authenticated session, upload a malicious payload and execute commands via SQL injection, achieving code execution in the process. And how was it done? By creating a fake, malicious software update delivered through Kaseya VSA, dubbed “Kaseya VSA Agent Hot-fix.”
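For readers unfamiliar with SQL injection, the generic sketch below, which uses Python’s built-in sqlite3 module and has nothing to do with Kaseya’s actual code, contrasts a query built by string concatenation, which an attacker can subvert, with a parameterized query that treats input strictly as data.

```python
# Generic illustration of the class of flaw described above (not Kaseya's
# actual code): a query built by string concatenation versus a parameterized
# query. sqlite3 is used only because it ships with Python.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, role TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 'admin')")

user_input = "x' OR '1'='1"  # attacker-controlled value

# Vulnerable pattern: the input becomes part of the SQL statement itself.
unsafe = f"SELECT role FROM users WHERE name = '{user_input}'"
print("string-built query returns:", conn.execute(unsafe).fetchall())  # leaks a row

# Parameterized query: the input is treated strictly as data, so nothing leaks.
safe = "SELECT role FROM users WHERE name = ?"
print("parameterized query returns:", conn.execute(safe, (user_input,)).fetchall())
```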

That leaves everyone using the VSA tool vulnerable; ergo, any company using the tool is vulnerable to having its files locked. And the process is stealthy. It was infiltrating before Kaseya even knew what was happening.
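One generic defense against fake updates of this kind is cryptographic signature verification before anything is installed. The sketch below uses the third-party Python cryptography library, with a key pair generated on the fly purely for illustration; it shows the general pattern, not Kaseya’s actual update mechanism.

```python
# One generic mitigation against fake "updates" (a sketch, not Kaseya's actual
# mechanism): refuse to install any package whose signature does not verify
# against the vendor's public key. Requires the third-party "cryptography"
# package; a real agent would ship with a pinned vendor key.
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

vendor_key = Ed25519PrivateKey.generate()   # stands in for the vendor's signing key
vendor_pub = vendor_key.public_key()

update = b"agent-hotfix-1.2.3 payload"
signature = vendor_key.sign(update)


def install_if_signed(payload: bytes, sig: bytes) -> bool:
    """Install only payloads that carry a valid vendor signature."""
    try:
        vendor_pub.verify(sig, payload)
    except InvalidSignature:
        print("rejected: signature check failed")
        return False
    print("signature valid, installing update")
    return True


install_if_signed(update, signature)                 # accepted
install_if_signed(b"malicious hot-fix", signature)   # rejected
```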

The perpetrators are believed to be an affiliate of a top Russian-speaking ransomware gang known as REvil. They are also believed to be the same ones who attacked meatpacking company JBS last June.

Early rumors had it that the T-Mobile breach compromised the data of more than 100 million people. T-Mo claims the actual figure was closer to 40 million. The leaked data includes names, physical addresses, phone numbers, Social Security numbers, unique IMEI numbers and driver’s license information – plenty to create an identity theft crisis. The validity of the data was confirmed by Vice Media’s Motherboard channel, which claims to have seen samples of the data and confirmed they contained accurate information on T-Mobile customers.

The hacker is asking for six Bitcoin tokens, which are worth roughly $276,000 at Bitcoin’s current exchange rate. However, that is only for most of the data – about 30 million people’s worth. The rest of the data is apparently being sold privately, rather than being made publicly available.

So those are the ugly details. However, what has me surprised (and I am a T-Mo customer) is that this is its fifth known breach in less than three years. The company previously disclosed breaches in 2018, 2019 and 2020, as well as in January of this year.

What else has me seeing red is that T-Mo is not practicing “safe security.” One would think that after one or two successful attacks, the company would make it a priority to implement the highest level of cybersecurity. But five successful attacks? I am finding myself a new carrier.

One thing this does reiterate is the poor state of cybersecurity in many organizations. As I mentioned earlier, the vulnerabilities were not some deeply embedded, buried code or back door. They were, as I noted, “gaping flaws,” as one report described them, that should have been unearthed by any reasonably astute coder.

Uncovering vulnerabilities is often easier said than done, but many are easy to spot. That is one of the big issues with complex, million-line code bases. While it can be a time-consuming and costly undertaking, every piece of code needs a periodic review. And there are plenty of organizations specializing in scrubbing code to uncover “gaping flaws” or rogue code.

However, this breach had nothing to do with T-Mo software. They were only running software from a vendor. A nagging question is whether the security at T-Mo was sufficient to catch this malware before it hit their database.

This is an interesting conundrum. As the trend is to move to cloud and “as a service” software, rather than developing and running on-premises software, the issue of security becomes a bit nebulous. Does the owner of the purchased software own the damages if it is hacked? Or is the liability with the developer? As well, there is the question of whether the end-user practiced due diligence in securing its own data.

The argument has been made that reasonable measures must be taken to secure data – at all levels. In the end, everybody along the supply chain has some responsibility to ensure data is secure – even the end-user. Of course, there is no way every possible avenue for intrusion can be locked down. However, there is much more that can be done than is being done – at all levels.

One good way to move toward such a goal is to hold breached vendors accountable, financially, for real damages. From time to time, there are fines levied against leakers, such as Equifax’s $700 million. However, as with fines against carriers over the years, the amount is usually paltry, and penalties are inconsistent.

Meanwhile, nothing is done to make the injured parties whole. Perhaps a model that passes costs up the channel would be more effective – T-Mo compensates its hacked customers (and not just with a year of free credit protection), and Kaseya compensates T-Mo for its losses. Besides, I am pretty sure organizations can obtain insurance to cover that, anyway.

Some organizations are more aware than others and are doing a superb job of scrubbing their code, requiring security audits on vendor software, and making sure cloud suppliers have top-shelf security and keep everything up to date. Had this been the case with companies such as FireEye and Kaseya, it is more likely that these attacks would have failed.

In any event, again I trumpet the importance of strong security. What more is there to say?

_________________

Ernest Worthman is an executive editor with AGL Media Group.