Inspiring Success

A blog from Creating IT Futures

July 23, 2018

Columnists’ Corner: Training Cybersecurity Technologists

By Eric Larson

Opinion: Tomorrow’s Cybersecurity Talent Doesn’t Require a 4-Year Degree Today

Last year, CompTIA’s “Assessing the IT Skills Gap” study found that cybersecurity was among the top five areas in which organizations reported moderate to significant skills gaps; more than half (55 percent) of the IT and business executives polled expressed concern.

In fact, data loss prevention and data security best practices are the top skill sets seen as lacking in today’s technology workforce. That’s a troublesome gap in an era of escalating cybersecurity risk.

As Information Management reported earlier this month, research shows U.S. businesses experienced a 100 percent increase in the volume of cyber attacks over the last two years. And while companies increasingly turn to data management techniques to cope with cyber threats, the supply of analytics talent isn’t rising fast enough to meet demand.

Currently, there are more than 285,000 unfilled cybersecurity openings nationwide, according to Cyberseek, a free workforce and career resource developed jointly by CompTIA and labor market analytics firm Burning Glass Technologies. Data-driven jobs such as “cybersecurity analyst” and “vulnerability analyst/penetration tester” are among the open positions.

How will U.S. companies find the analytics talent to fill these roles?

There are no fast or easy answers. But today the focus seems to be on short-run tactics for narrowing the skills gap. Some companies are hiring or partnering to meet overall cybersecurity needs, using training and certification programs to sharpen the technical skills of their existing workforce. Many also extend this skills development to their non-IT workforce.

However, if we aim to close the gap for good, we’ll need long-run solutions for feeding the analytics talent pipeline. And my organization believes middle and high schools are the best places to focus those long-term efforts. Why? Because tweens and teens already make up a quarter of the U.S. population and will account for more than 20 percent of the workforce in the next five years.

Furthermore, my team’s research suggests many in this cohort have the disposition to become more than technicians; they will be technologists: people working with technology of all types, in companies of all shapes and sizes, across a broad spectrum of industries, not just software and hardware firms. We expect that workers with a technologist’s mentality, an optimal mix of hard technical skills and relationship-building “soft skills,” will be well suited to the fast-paced, continually evolving cybersecurity environment.

But as noted in my last article for Information Management, seven myths about technology careers can block the creation of the next generation of technologists. We’ve already busted the biggest misconception of them all: that technology is all about coding, math and science.

Now, let’s explode the second-biggest myth: “Working in technology requires a four-year college degree.”

The truth is that many people land a job in tech with just some basic training and a certification. According to the U.S. Census Bureau’s 2014 American Community Survey, 59 percent of computer support specialists employed that year didn’t have a bachelor’s degree.

Motivated students can learn the underpinnings of technology and start troubleshooting security problems by applying analytics after a single introductory class, no matter what age they begin studying. Sure, many people learn about managing data in college. But plenty of others get their start through self-study, online programs or, at most, an associate degree.

The development of the intangible skills needed to deal with ever-evolving cyber threats can begin in middle and high school and extend through internships. On-the-job training programs can familiarize students with soft skills and help contextualize technical knowledge. College remains an excellent option for those with the time and money for a degree. But young people need not wait to join the front lines of the cybersecurity war and make a difference.

Eric Larson, senior director of Creating IT Futures’ signature initiative, IT Futures Labs, contributes to Information Management on a regular basis. Follow him on Twitter at @ITFuturesEric.

Please also follow @infomgmt and @FoundationEaton for updates on tech myths, careers and other stories about young technologists.