SHREVEPORT, La. — Attendees at the 2019 Community College Cyber Summit (3CS) will have the opportunity to take the skills they learn with them long after they leave Bossier Parish Community College.

Digital badges, or microcredentials, are a graphical representation of a person’s abilities and competencies, combined with a verifiable description of the knowledge and activities it took to earn them. They are highly visual and optimized for sharing on social media channels and professional networks. 3CS is partnering with SynED, a California-based non-profit that helps colleges and universities utilize technology and experiential learning, to provide digital badges to its attendees, presenters and workshop participants based on professional or personal achievements.

“The digital badges give community college faculty the opportunity to document their continuing professional development in a way that is visible to the entire cybersecurity community,” said conference chair Robert Spear.

Attendee badges are available to anyone who attends at least three conference sessions, completes the session feedback forms, fills out the overall conference survey and submits a “quotable quote.”

Anyone who presents at the conference, submits presentation materials and completes the feedback form is eligible to earn a presenter badge.

Individuals who complete Cybersecurity Skills Development Workshops are eligible to earn badges that correspond to each workshop:

  • Hands-On Cryptography
  • Secure Scripting
  • Intro to IBM’s QRadar
  • Cybersecurity Skills Journal

The workshop badges at 3CS align with the National Initiative for Cybersecurity Education (NICE) framework, which means they are directly applicable to industry standards. Anyone can click on the badge to see exactly what went into earning it, which is important for complex fields like cybersecurity.

“The badges provide engagement for the 3CS community. In addition to links to external resources and workshop materials, the badges include metadata of the knowledge and skills participants demonstrated in the Cybersecurity Skills Development Workshops,” said Casey O’Brien, executive director and principal investigator at the National CyberWatch Center, which organizes the conference.

Lee Yarborough, the leader of the California Digital Badge Initiative at SynED, thinks of digital badges as an extension of a resume, which shows what jobs a person has held but nothing about their specific skills or how they acquired them.

Digital badges provide that much-needed context for the employer to determine which specific skills a candidate has. They are also machine readable, which is critical as the hiring process adapts to AI.

“Hiring is now more skills and competency based, and employers want to verify that candidates know the skills they list on their resume,” Yarborough said. “An employer can click on a badge and, in an instant, it tells a better story than a resume or transcript ever could.”

3CS will be held July 30-August 1 at Bossier Parish Community College and Horseshoe Bossier City Hotel and Casino in Shreveport, Louisiana. For more information about digital badges at the conference, visit https://www.my3cs.org/digital-badge.

About 3CS

3CS is organized and produced by the National CyberWatch Center, National Resource Center for Systems Security and Information Assurance (CSSIA), CyberWatch West (CWW), and Broadening Advanced Technological Education Connections (BATEC), which are all funded by the National Science Foundation (NSF). The outcomes of 3CS strengthen community college cybersecurity programs across the nation by introducing the latest technologies, best practices, curricula, products and more.

About SynED

SynED is a non-profit organization dedicated to promoting educational excellence by providing higher education professional services to facilitate the development of new models of curriculum, industry alliance, service, and delivery.

California Cyberhub moves under Cyber-Guild Umbrella

Thousand Oaks, CA – 15 July 2019 – In response to requests from other states across the nation and from international audiences, synED, a non-profit 501(c)(3) organization, is pleased to announce the reorganization of its pillar cybersecurity awareness program. California Cyberhub will become a component of a parent program named Cyber-Guild™ to better serve national and international opportunities.

“We were experiencing rapidly increasing interest from across the US and internationally, with communities wishing to adopt the highly successful model developed in California,” said Executive Director and Chairman of the Board Scott Young. “The model for community-engaged cyber awareness and education is proving to be highly desirable; therefore, we needed to establish a parent structure with a universally appealing name. We also found that other organizations had trademark rights to the Cyberhub name. To address the numerous requests and expand across the nation and around the globe, our program is now known as Cyber-Guild.”

Cyber-Guild Director Liz Fraumann shared, “We will uphold the high standards set by the California Cyberhub program and continue to support Californians under that name. Cyber-Guild will look to bring new initiatives, projects and activities to everyone under the Cyber-Guild umbrella. We will continue to focus on K-12 youth, but we look forward to expanding our engagement to include higher education, business and all community members with high-value initiatives.”

Organizations and representatives from other states or nations around the globe interested in utilizing the Cyber-Guild program should visit the website cyber-guild.org and share what they are most interested in achieving. A Cyber-Guild Alliance is being formed to enable sharing of leading practices and ideas and to facilitate collaboration on cyber education for K-12, cyber competitions, and other activities that help build a cyber-ready workforce. These resources will be available to all Alliance participants.

Additional synED programs that may be of interest to organizations include the Digital Badge Design program, Rapid Customer Centric Design, and services such as Environmental Scans and Needs Analysis from our research and reportOUT group.

ABOUT synED

SynED is a non-profit organization dedicated to promoting educational excellence by promoting synergies between traditional, non-traditional and experiential learning to realize the best possible outcomes for students, faculty, business and society.

For more information, visit synED.org

ABOUT the Cyber-Guild Program

Cyber-Guild™ is the leading integrated community engagement program of synED, focused on raising cybersecurity awareness and learning at all levels across the United States of America and globally. For more information, visit cyber-guild.org

SynED Explores How Microcredentials can Benefit Students, Job Seekers, and Employers

THOUSAND OAKS, Calif., June 17, 2019 /PRNewswire/ — Thanks to services like ZipRecruiter and Indeed, the job search process is harnessing the power of AI to screen candidates and match them with available jobs. To keep pace with the change, resumes are becoming skills-based and moving away from employment chronology.

No matter how detailed a resume is, though, it’s still difficult to tell what exactly a candidate did to earn those skills. Everyone writes their resume a different way, and resume padding is not likely to go away, no matter how much the industry changes.

Digital badges provide an opportunity for employees to compile metadata on specific skills they’ve acquired, and for employers to take a deeper dive into their candidate pools. Badges are even more successful when they align with industry standards, as one recent effort in the cybersecurity field shows.

Digital Badge Basics

Digital badges, or microcredentials, are a graphical representation of a person’s abilities and competencies, combined with a verifiable description of the knowledge and activities it took to earn them. They are highly visual and optimized for sharing on social media channels and professional networks.

Credly is a leading provider of digital badges and helps companies and other organizations create credentials for internal and external use. Brenda Perea, Credly’s Director of Education and Workforce Strategies, said badges work best when organizations have a specific goal in mind while creating them, such as mapping to industry standards.

“When an employer clicks a badge in a profile, they can immediately see all of the activities that went into earning that skill,” Perea said.

Lee Yarborough is the leader of the California Digital Badge Initiative at SynED, a California-based non-profit that helps colleges and universities around the world use technology and experiential learning to meet their goals and serve their students. Before her current role, she spent 15 years as a career counselor in California and saw firsthand the deficiencies with how skills are currently captured.

Yarborough often thinks about digital badges as an extension of a resume or transcript. A transcript shows what courses a student took and how well they performed in those classes, but it does not say anything about what skills they learned in those classes. The same is true of a resume — it shows what jobs a person held, but nothing about their specific skills or how they acquired them.

In either case, digital badges provide that much-needed context for the employer to determine which specific skills a candidate has.

“It’s a tool for people to communicate what they did in detail,” Yarborough said. “An employer can click on a badge and, in an instant, it tells a better story than a resume or transcript ever could.”

Mapping to Industry Standards

Technical fields are often governed by industry standards, such as the NICE Cybersecurity Workforce Framework. These protocols are designed to ensure that employees across all sectors and industries have the skills necessary to perform specific tasks and functions that are common across the discipline.

Digital badges provide an opportunity for employers to see that a candidate meets those standards. A resume can say it in theory, but a badge allows the opportunity to embed examples and related work projects that are accessible in one click.

“It’s great if someone displays a badge on LinkedIn, but the metadata in the badge makes it so much more meaningful,” Yarborough said.

CompTIA, a leading provider of IT and cybersecurity certifications, created badges that map to those certifications and the larger NICE framework. Each badge contains an overview of the certification, a listing of the skills covered as part of the certification, and a listing of careers people with the certification are eligible to pursue.
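For readers curious what “machine readable” means in practice, the following is a minimal sketch of the kind of structured metadata a badge might carry, loosely modeled on open badge assertions. The field names, values, and NICE alignment shown are illustrative assumptions, not the actual schema used by Credly, CompTIA, or 3CS.

    # Illustrative sketch only: the kind of machine-readable metadata a
    # digital badge might carry. Field names and values are hypothetical,
    # loosely modeled on open badge assertions; they are not the actual
    # schema used by Credly, CompTIA, or 3CS.
    example_badge = {
        "name": "Secure Scripting Workshop",
        "issuer": "Example Conference Organizer",
        "issued_on": "2019-07-31",
        "criteria": "Attended the hands-on workshop and completed the feedback form",
        "alignment": [
            {
                "framework": "NICE Cybersecurity Workforce Framework",
                "target": "Securely Provision (example category)",
            }
        ],
        "evidence": ["https://example.org/evidence/workshop-materials"],
    }

    # Because the metadata is structured, an employer's screening software can
    # read the skills, standards alignment, and evidence directly instead of
    # parsing free-form resume text.
    for alignment in example_badge["alignment"]:
        print(alignment["framework"], "->", alignment["target"])

Viewed this way, the badge functions as a small, portable record: the visual icon is what people share, while the underlying data is what hiring systems actually consume.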

Badges were also used to connect cyber skills to activities completed as part of the California Mayors Cyber Cup, a statewide cybersecurity competition for high school students. Members of the top three teams from each of the 12 competitions received a digital badge that demonstrated their understanding of key cybersecurity concepts, including cyber ethics, cyber governance, threat intelligence, and data loss prevention.

“Badges provided a vehicle for competitors to display their accomplishments – and encouraged their aspiration to achieve more,” Yarborough said.

Benefit to Employers

Sentek Global, an IT and cybersecurity firm based in San Diego, is already starting to see the benefits that digital credentials can bring to the hiring process. The company is growing quickly and needs to review candidates in the most efficient way possible.

“Because the majority of the roles we fill at Sentek Global are technical, reviewing and verifying the skills and credentials of our candidates is time-intensive,” said Joey Tompkins, a talent acquisition specialist at Sentek Global. “From a hiring standpoint, holding digital badges for desired skills and certifications will absolutely help a candidate stand out.”

Badges also reinforce organizational credibility by showing that an organization values continuing education and lifelong learning. A badge displayed on a social profile builds a positive organizational reputation, which can be helpful in attracting future clients and customers.

“Digital credentials are a powerful way to engage employees in the workplace and reward their achievements with verifiable and shareable digital badges,” Perea said.

Conference organizers are also embracing badges as a way for attendees to denote specific skills learned from sessions and workshops. A badge stays with an attendee long after the conference ends and provides verification to a company that its professional development funds were well-spent, Yarborough said.

What Does the Future Hold?

Digital badges are in their infancy, but the future is bright as organizations and individuals continue to embrace them and see the value they can bring.

Russ Novak, business unit director of mission assurance and advanced solutions at Sentek Global, said he expects to see candidates presenting digital badges from CompTIA and the Project Management Institute in the near future.

“Now is the time for both industry veterans and new or aspiring professionals to start thinking strategically about the certifications and skills they’ll need to obtain, as well as the accompanying digital badges: those badges will paint a vivid picture of their experience,” Novak said.

As Generation Z graduates college and enters the workforce, the focus on personal branding will become greater than ever. Credly predicts that the generation raised on video tutorials and on-demand learning will embrace badges as a way to demonstrate skills that transcend a specific job or industry.

Yarborough also sees the potential for colleges and universities to offer badges that show employers what skills students learn in classes.

Santa Barbara City College successfully implemented badges as part of its Career Strategist Certificate. The certificate is designed to help students understand their strengths and use that information to find meaningful and rewarding jobs.

The badge denotes that students have completed activities in personalized career planning, strategic job searching, and using LinkedIn for Business.

For more information on SynED and the Digital Badge Initiative, visit https://syned.org/digital-badge-initiative/

About SynED

SynED is a non-profit organization dedicated to promoting educational excellence by providing higher education professional services to facilitate the development of new models of curriculum, industry alliance, service, and delivery.

About Credly

Credly empowers organizations to officially recognize individuals for demonstrated competencies and skills through the use of digital credentials. The company is leading the digital credential movement, making talent more visible and opportunity more accessible.

About Sentek Global

Sentek Global has a cadre of cybersecurity experts who are highly trained and highly efficient. Cybersecurity services include: Information Security Audits, Cybersecurity Inspections and Command Cyber Readiness Inspections, Certification and Accreditation (DIACAP and RMF), Validator Support Services, Penetration Testing, Vulnerability Assessments, Network Design and Architecture, Computer Network Defense, and Security Operations Center Management.

Originally Posted On: singularityhub.com

In the past few years, artificial intelligence has advanced so quickly that it now seems hardly a month goes by without a newsworthy AI breakthrough. In areas as wide-ranging as speech translation, medical diagnosis, and gameplay, we have seen computers outperform humans in startling ways.

This has sparked a discussion about how AI will impact employment. Some fear that as AI improves, it will supplant workers, creating an ever-growing pool of unemployable humans who cannot compete economically with machines.

This concern, while understandable, is unfounded. In fact, AI will be the greatest job engine the world has ever seen.

New Technology Isn’t a New Phenomenon

On the one hand, those who predict massive job loss from AI can be excused. It is easier to see existing jobs disrupted by new technology than to envision what new jobs the technology will enable.

But on the other hand, radical technological advances aren’t a new phenomenon. Technology has progressed nonstop for 250 years, and in the US unemployment has stayed between 5 and 10 percent for almost all that time, even when radical new technologies like steam power and electricity came on the scene.

But you don’t have to look back to steam, or even electricity. Just look at the internet. Go back 25 years, well within the memory of today’s pessimistic prognosticators, to 1993. The web browser Mosaic had just been released, and the phrase “surfing the web,” that most mixed of metaphors, was just a few months old.

If someone had asked you what would be the result of connecting a couple billion computers into a giant network with common protocols, you might have predicted that email would cause us to mail fewer letters, and the web might cause us to read fewer newspapers and perhaps even do our shopping online. If you were particularly farsighted, you might have speculated that travel agents and stockbrokers would be adversely affected by this technology. And based on those surmises, you might have thought the internet would destroy jobs.

But now we know what really happened. The obvious changes did occur. But a slew of unexpected changes happened as well. We got thousands of new companies worth trillions of dollars. We bettered the lot of virtually everyone on the planet touched by the technology. Dozens of new careers emerged, from web designer to data scientist to online marketer. The cost of starting a business with worldwide reach plummeted, and the cost of communicating with customers and leads went to nearly zero. Vast storehouses of information were made freely available and used by entrepreneurs around the globe to build new kinds of businesses.

But yes, we mail fewer letters and buy fewer newspapers.

The Rise of Artificial Intelligence

Then along came a new, even bigger technology: artificial intelligence. You hear the same refrain: “It will destroy jobs.”

Consider the ATM. If you had to point to a technology that looked as though it would replace people, the ATM might look like a good bet; it is, after all, an automated teller machine. And yet, there are more tellers now than when ATMs were widely released. How can this be? Simple: ATMs lowered the cost of opening bank branches, and banks responded by opening more, which required hiring more tellers.

In this manner, AI will create millions of jobs that are far beyond our ability to imagine. For instance, AI is becoming adept at language translation—and according to the US Bureau of Labor Statistics, demand for human translators is skyrocketing. Why? If the cost of basic translation drops to nearly zero, the cost of doing business with those who speak other languages falls. Thus, it emboldens companies to do more business overseas, creating more work for human translators. AI may do the simple translations, but humans are needed for the nuanced kind.

In fact, the BLS forecasts faster-than-average job growth in many occupations that AI is expected to impact: accountants, forensic scientists, geological technicians, technical writers, MRI operators, dietitians, financial specialists, web developers, loan officers, medical secretaries, and customer service representatives, to name a very few. These fields will not experience job growth in spite of AI, but through it.

But just as with the internet, the real gains in jobs will come from places where our imaginations cannot yet take us.

Parsing Pessimism

You may recall waking up one morning to the news that “47 percent of jobs will be lost to technology.”

That report by Carl Frey and Michael Osborne is a fine piece of work, but readers and the media distorted their 47 percent number. What the authors actually said is that some functions within 47 percent of jobs will be automated, not that 47 percent of jobs will disappear.

Frey and Osborne go on to rank occupations by “probability of computerization” and give the following jobs a 65 percent or higher probability: social science research assistants, atmospheric and space scientists, and pharmacy aides. So what does this mean? Social science professors will no longer have research assistants? Of course they will. They will just do different things because much of what they do today will be automated.

The intergovernmental Organization for Economic Co-operation and Development released a report of its own in 2016. This report, titled “The Risk of Automation for Jobs in OECD Countries,” applies a different “whole occupations” methodology and puts the share of jobs potentially lost to computerization at nine percent. That is normal churn for the economy.

But what of the skills gap? Will AI eliminate low-skilled workers and create high-skilled job opportunities? The relevant question is whether most people can do a job that’s just a little more complicated than the one they currently have. This is exactly what happened with the industrial revolution; farmers became factory workers, factory workers became factory managers, and so on.

Embracing AI in the Workplace

A January 2018 Accenture report titled “Reworking the Revolution” estimates that new applications of AI combined with human collaboration could boost employment worldwide by as much as 10 percent by 2020.

Electricity changed the world, as did mechanical power, as did the assembly line. No one can reasonably claim that we would be better off without those technologies. Each of them bettered our lives, created jobs, and raised wages. AI will be bigger than electricity, bigger than mechanization, bigger than anything that has come before it.

This is how free economies work, and why we have never run out of jobs due to automation. There are not a fixed number of jobs that automation steals one by one, resulting in progressively more unemployment. There are as many jobs in the world as there are buyers and sellers of labor.

Image Credit: enzozo / Shutterstock.com

Originally Posted On: singularityhub.com

You’re driving along the highway when, suddenly, a person darts out across the busy road. There’s speeding traffic all around you, and you have a split second to make the decision: do you swerve to avoid the person and risk causing an accident?

Do you carry on and hope to miss them? Do you brake? How does your calculus change if, for example, there’s a baby strapped in the back seat?

In many ways, this is the classic “moral dilemma,” often called the trolley problem. It has a million perplexing variants, designed to expose human bias, but they all share the basics in common. You’re in a situation with life-or-death stakes, and no easy options, where the decision you make effectively prioritizes who lives and who dies.

A new paper from MIT published last week in Nature attempts to come up with a working solution to the trolley problem, crowdsourcing it from millions of volunteers. The experiment, launched in 2014, defied all expectations, receiving over 40 million responses from 233 countries and territories, making it one of the largest moral surveys ever conducted.

A human might not consciously make these decisions. It’s hard to weigh up relevant ethical systems as your car veers off the road. But, in our world, decisions are increasingly made by algorithms, and computers just might be able to react faster than we can.

Hypothetical situations with self-driving cars are not the only moral decisions algorithms will have to make. Healthcare algorithms will choose who gets which treatment with limited resources. Automated drones will choose how much “collateral damage” to accept in military strikes.

Not All Morals Are Created Equal

Yet “solutions” to trolley problems are as varied as the problems themselves. How can machines make moral decisions when problems of morality are not universally agreed upon, and may have no solution? Who gets to choose right and wrong for the algorithm?

The crowd-sourcing approach adopted by the Moral Machine researchers is a pragmatic one. After all, for the public to accept self-driving cars, they must accept the moral framework behind their decisions. It’s no good if the ethicists or lawyers agree on a solution that’s unacceptable or inexplicable to ordinary drivers.

The results have the intriguing implication that moral priorities (and hence the types of algorithmic decisions that might be acceptable to people) vary depending on where you are in the world.

The researchers first acknowledge that it’s impossible to know the frequency or character of these situations in real life. Those involved in accidents often can’t tell us exactly what happened, and the range of possible situations defies easy classification. So, to make the problem tractable, they break it down into simplified scenarios, looking for universal moral rules.

As you take the survey, you’re presented with thirteen questions that each ask for a simple either-or choice, designed to narrow responses down to nine factors.

Should the car swerve into the other lane, or should it keep going? Should you preserve the young people versus the old people? Women over men? Pets over humans? Should you try to spare the most lives possible, or is one baby “worth” two elderly people? Spare the passengers in the car versus the pedestrians? Those who are crossing the road legally versus illegally? Should you spare people who are more physically fit? What about those with higher social status, like doctors or businessmen?

In this harsh, hypothetical world, somebody’s got to die, and you’ll find yourself answering each of these questions—with varying degrees of enthusiasm. Yet making these decisions exposes deeply-ingrained cultural norms and biases.

Crunching through the vast dataset the researchers obtained from the survey yields universal rules as well as fascinating exceptions. The three most dominant factors, averaged across the entire population, were that everyone preferred to spare more lives than fewer, humans over pets, and the young over the elderly.

Regional Differences

You might agree with these broad strokes, but looking further yields some pretty disturbing moral conclusions. More respondents chose to save a criminal than a cat, but fractionally preferred to save a dog over a criminal. As a global average, being old is judged more harshly than being homeless—yet homeless people were spared less often than the obese.

These rules didn’t apply universally: respondents from France, the United Kingdom, and the US had the greatest preference for youth, while respondents from China and Taiwan were more willing to spare the elderly. Respondents from Japan displayed a strong preference for saving pedestrians over passengers in the car, while respondents from China tended to choose to save passengers over pedestrians.

The researchers found that they could cluster responses by country into three groups: “Western,” predominantly North America and Europe, where they argued morality was predominantly influenced by Christianity; “Eastern,” consisting of Japan, Taiwan, and Middle Eastern countries influenced by Confucianism and Islam, respectively; and “Southern” countries including Central and South America, alongside those with a strong French cultural influence. In the Southern cluster there were stronger preferences for sparing women and the fit than anywhere else. In the Eastern cluster, the bias towards saving young people was least powerful.

Filtering by the various attributes of the respondent yields endless interesting tidbits. “Very religious” respondents are fractionally more likely to save humans over animals, but both religious and irreligious respondents display roughly equal preference for saving those of high social status vs. those of low social status, even though (one might argue) it contradicts some religious doctrines. Both men and women prefer to save women, on average—but men are ever-so-slightly less inclined to do so.

Questions With No Answer

No one is arguing that this study somehow “resolves” these weighty moral questions. The authors of the study note that crowdsourcing the data online introduces a sample bias. The respondents skewed young, skewed male, and skewed well-educated; in other words, they looked like the kind of people who might spend 20 minutes online filling out a survey about morality for self-driving cars from MIT.

Even with a vast sample size, the number of questions the researchers posed was limited. Getting nine different variables into the mix was hard enough—it required making the decisions simple and clear-cut. What happens if, as you might expect in reality, the risks were different depending on the decision you took? What if the algorithm were able to calculate, for example, that you had only a 50 percent chance of killing pedestrians given the speed you’re going?

Edmond Awad, one of the authors of the study, expressed caution about over-interpreting the results. “It seems concerning that people found it okay to a significant degree to spare higher status over lower status,” he told MIT Technology Review. “It’s important to say, ‘Hey, we could quantify that’ instead of saying, ‘Oh, maybe we should use that.’ The discussion should move to risk analysis—about who is at more risk or less risk—instead of saying who’s going to die or not, and also about how bias is happening.”

Perhaps the most important result of the study is the discussion it has generated. As algorithms start to make more and more important decisions, affecting people’s lives, it’s crucial that we have a robust discussion of AI ethics. Designing an “artificial conscience” should be a process with input from everybody. While there may not always be easy answers, it’s surely better to understand, discuss, and attempt to agree on the moral framework for these algorithms, rather than allowing the algorithms to shape the world with no human oversight.

Image Credit: Scharfsinn / Shutterstock.com

About Author: Thomas Hornigold is a physics student at the University of Oxford. When he’s not geeking out about the Universe, he hosts a podcast, Physical Attraction, which explains physics – one chat-up line at a time.

Originally Posted On: techrepublic.com

The explosion of data in consumer and business spaces can place our productivity at risk. There are ways you can resist drowning in data.

The pace of data creation steadily increases as technology becomes more and more ingrained in people’s lives and continues to evolve.

According to Forbes.com last May, “there are 2.5 quintillion bytes of data created each day at our current pace, but that pace is only accelerating with the growth of the Internet of Things (IoT). Over the last two years alone 90 percent of the data in the world was generated.”

While technology should make our lives easier, the information it provides can negatively impact our mental function by overwhelming us with too much input.

However, don’t confuse cognitive overload with work overload. Whereas work overload is simply having too much to do and not enough time to complete it, cognitive overload refers to having too much information to process at once.

SEE: Leadership spotlight: How to make meetings worthwhile (Tech Pro Research)

Fouad ElNaggar, co-founder and CEO of Sapho, an employee experience software provider based in San Bruno, Calif., is passionate about cognitive overload. Together we developed some tips for workers on how to fix the problem.

1. Close/shut off distracting applications

The irony of productivity applications is that they can actually make you less productive. Microsoft Office includes Outlook, an email application, which can “helpfully” notify you when new email arrives.

Sadly, this can also contribute to your information overload if you’re in the middle of a task, and you switch to Outlook to read an email. You might even forget about the current task you’re working on. Instant messaging apps, or frankly, anything that dings or pops up an alert are just as distracting. When trying to stay focused on a task, close or shut off any applications which could serve as potential distractions. Oh, and silence your phone, too.

2. Switch off push notifications

If you can’t close a potentially distracting application because you need it available, you can still quiet it down. Between Slack, Gchat, calendar, email and text messages, it probably seems like those tiny dialog boxes pop up on your screen all day long. Take a few minutes to evaluate which push notifications actually help you get work done, and turn off the rest.

SEE: Project prioritization tool: An automated workbook (Tech Pro Research)

3. Bucket your email correspondence

Constantly checking and responding to email is a major time drain. Set aside two times a day to answer emails, and do not check it any other time. Put your phone on “Do Not Disturb,” and make it a point to not let notifications interrupt you during that time.

4. Stay off personal social media/news sites/other temptations

It’s easy and tempting to check social media or your favorite news outlet while working, especially if you’re waiting for a task to finish before you proceed (such as rebooting a server or uploading a file). However, this just puts more data into your current memory banks, so to speak, so that instead of thinking only about that server patching project, you’re now also thinking about the NFL draft or how many people “like” your funny Facebook meme. Save social media for lunch time or after work. It’ll be more meaningful, and you can keep your work and recreation separate, as it should be.

5. Utilize minimalism

I keep a very minimalistic workspace: a family picture, a Rick Grimes (from “The Walking Dead,” which contains many parallels to IT life) keychain figure, and a calendar. No fancy furniture, no posters, no inspiring slogans, and no clutter. This helps me stay oriented to what I need to do without the sensory overload.

I also apply the same principles to my computer: I keep running only the programs I need, and even close unnecessary browser tabs, SSH sessions, and Windows Explorer windows so that I can concentrate on the task at hand.

SEE: IT jobs 2018: Hiring priorities, growth areas, and strategies to fill open roles (Tech Pro Research)

6. Avoid multitasking

You may not have a choice, but avoiding multitasking is one of the best things you can do to keep your brain from being overwhelmed. Dividing your attention among four or five parallel tasks is a sure-fire way to ensure that those tasks take longer or end up being completed less efficiently than if you accomplished them one at a time. Worse, it’s all too easy to drop tasks entirely as your attention shifts, resulting in uncompleted work.

7. Utilize documentation

Document your to-do lists, operational processes, and daily procedures you need to follow (building a new server, for instance) so that you don’t rely on memory and can quickly handle tasks—or better yet—refer them to someone else. Anytime I discover how something works or what I can improve upon, I update the related electronic documentation so I don’t have to comb through old emails, leaf through handwritten notes, or, worse, ask coworkers to fill in missing details that I should have recorded.

8. Take notes as you go

In addition to relying upon established documentation to make your efforts more productive, take notes during difficult operations such as a server recovery effort or a network troubleshooting endeavor. These notes serve as a “brain dump” of your activities so that you can purge them from memory and refer to the information later, if needed.

Believe me, there’s nothing more challenging than sorting through a complex series of tasks during an outage post-mortem to recall what you did to fix the problem. A written record can save your brain.

SEE: Comparison chart: Enterprise collaboration tools (Tech Pro Research)

9. Take routine breaks

This should be a no-brainer, yet too many people consider themselves too busy to take a break, even though doing so allows them to step away from work and hit the “pause” button. It’s not just about relaxing your brain so that you return to work with a more productive mindset; a quick walk around the building can also give you the space to think and come up with new ideas or solutions to problems you’re facing, eliminating one more area of information overload.

10. Avoid open space seating areas

I’ve written about some of the problems of the infamous (and unfortunately common) open-seating plan in companies. In a nutshell, having no privacy and sitting in close physical and auditory proximity to others, even individuals considered close friends, strains working relationships and breeds frustration.

Avoiding cognitive overload isn’t just about not taking on or dealing with too much at once, but it’s also about not letting other people’s activities intrude upon your own productivity. Whether it’s an annoying personal phone call, playing music or even just chewing loudly, other people’s nearby activity can be a source of unwanted details, which reduces your capacity to do your job. You may not have a choice about sitting in an assigned open space seat, but take advantage of opportunities such as working from home, using an available conference room, or moving to an empty part of the office when you really need to focus.

11. Break projects down into chunks

Facing the entirety of a complex project is a daunting mission. It’s better and more effective to break a project down into subcomponents, and then focus on these separately, one at a time.

For instance, say you want to migrate users, computers, and services from one Active Directory domain to another. This would be overwhelming to focus on at once, so the best way to proceed is to divide the project into tasks. One task could be migrating user accounts and permissions. The next task could be migrating computer accounts, and the task after that could be addressing DNS changes, and so on. Plan it out in advance, and then tackle it piece-by-piece.

12. Control your calendar

Don’t let colleagues fill in your day with meaningless meetings. Have a conversation with your coworkers about which meetings are absolutely necessary for you to participate in and skip the rest. If you are a manager or leader, encourage your employees to schedule in-person meetings only when they are absolutely necessary.

13. Don’t take your phone into your bedroom

You spend enough time on screens during the day. The simple act of charging your phone in another room gives you time to really disconnect. It also gives you a chance to wake up refreshed, and think about the day ahead before reactively reaching for your device and checking social media or email.

SEE: Research: The evolution of enterprise software UX (Tech Pro Research)

Reducing team cognitive overload

ElNaggar and I also thought of a couple of tips for business leaders on ways to reduce cognitive overload for their team. These tips include:

14. Invest in the right technology

Take the time to learn what processes or tools are pain points for your employees’ productivity. Research which solutions can automate certain tasks or limit daily distractions and implement them across your workforce.

15. Embrace employee-centric workflows

ElNaggar says that leaders should “embrace the idea that employee experience matters, which will have a ripple effect in their organization.” He recommends that leaders start to develop more employee-centric workflows that reduce interruptions for their employees to help them focus on priorities and accomplish more work.

An example of an employee-centric workflow would be a business application or web portal, which gives employees a single, actionable view into all of their systems and breaks down complex processes into single-purpose, streamlined workflows, allowing employees to be more productive.

“Without leadership teams championing an employee-centric mindset, nothing will really change in the mid and lower levels of a company. Business leaders must start thinking about the impact their employees’ digital experience has on their work performance and overall satisfaction, and support the idea that investing in employee experience will drive employee engagement and productivity,” ElNaggar concluded.

Originally Posted On: informationweek.com

There might be a better, knowledge management-based, way to conduct the US Census, according to a group of university researchers.

Consider for a minute whether the best way to collect important data is to mail 125 million (or so) paper forms, often to “Current Occupant,” and to then follow up with humans carrying clipboards and ringing doorbells. You probably would conclude that it’s a lot of work and a process likely to result in the collection of incomplete or inaccurate data.

Then, you’ll update that data only every 10 years: Lots can change in 10 years. Yet, you will use the collected data to determine things like how your congressional representatives will be elected, how federal funds are allocated to local schools, even where new roads will be built and public transportation offered.

Is there a better way to do the US Census than how it has been done for 228 years?

A group of university researchers believes that the data gathered and analyzed by the US Census Bureau can be found in existing sources without sending any forms or people out into the field. Actually, the researchers argue that the government can collect much more data and more timely data using sources like tax returns, state websites, even Google search data.

“The costs of a census are pretty large, $17.5 billion. That’s based on these paper forms. That’s really the driver behind our research,” says Murray Jennex, a professor focused on knowledge management at San Diego State University. “The Census Bureau has spent a lot of money for technology to analyze data, but very little on collecting data,” he added during a recent interview.

Jennex was part of the team that included San Diego State professors James Kelly (lead author), Kaveh Abhari and Eric Frost, along with Alexandra Durcikova of the University of Oklahoma. Together, they authored a research paper titled, “Data in the Wild: A KM Approach to doing a Census Without Asking Anyone and the Issue of Privacy.” That paper will be presented in January at the Hawaii International Conference on System Sciences.

While the cost of paper census surveys — including the one scheduled for 2020 — is a key consideration in the team’s research, there are several other major factors.

One such consideration is the growing abundance of data in the public sphere, such as that collected by many federal agencies (the Internal Revenue Service, the Department of Education, and the Department of Labor, for example), state and municipal agencies, and academic research organizations. Add in the trend data that can be gleaned from search engines such as Google, public utility records, and commercial data services such as the major consumer credit bureaus. Together they represent a wealth of data, highlighting how many people live where, areas where poverty is most challenging, ethnic trends, and the need for elderly, healthcare, and educational support.

In addition, that data can be updated and analyzed in what Jennex calls “not quite real time.” “The data we would be using could be refreshed every year, and could be used to guide public policies,” he said.

Murray Jennex, San Diego State

The limiting factor, however, is that of privacy, how the Census Bureau could protect personally identifiable information (PII). Jennex notes that data can be anonymized by stripping off PII, which would be effective protection when the data analysis covers large areas, even five-digit ZIP codes. But it might not take a lot of work for someone to identify unique individuals or families at a neighborhood level, particularly those who stand out in the neighborhood by income, size of household, or ethnic background.

So, protections would have to be put in place.

Another hurdle that the researchers acknowledge is that “government is actually very bad at sharing data.” For decades, government agencies have tended to keep their data siloed, despite attempts by some government leaders to move to an open data approach. Jennex cited the IRS as a particularly rich data source, not only for basic financial data but also for insight into household size, health issues, employment trends, and even transportation planning as more Americans work out of home offices.

Existing data, such as that from the IRS, actually can be more accurate than that currently collected through census forms — known as the American Community Survey. In their paper the researchers cited how “household income” can be misleading, depending on whether household members are married or unrelated. Also, the income questions focus on what someone made in a single year, not factoring in whether that year’s earnings were significantly higher or lower than what the individual earns in a more typical year.

However, don’t expect the paper questionnaire to go away in the year and a half before you expect to find one in your mail. The changes that the researchers suggest are much further down the road.

Jim Connolly is a versatile and experienced technology journalist who has reported on IT trends for more than two decades. As Executive Managing Editor of InformationWeek, he oversees the day-to-day planning and editing on the site.

Originally Posted On: informationweek.com

When hiring gets tough, IT leaders get strategic. Here’s how successful organizations seize the experts their competitors only wish they could land.

The technology industry’s unemployment rate is well below the national average, forcing companies to compete aggressively for top talent. When presented with a range of recruitment strategies by a recent Robert Half Technology questionnaire — including using recruiters, providing job flexibility and offering more pay — most IT decision makers said they are likely to try all approaches in order to land the best job candidates for their teams.

“We’re currently in a very competitive hiring market,” noted Ryan Sutton, district president for Robert Half Technology. “Employers want to hire the best talent to help keep their organization’s information safe, but so do a lot of other companies.”

Robert Half’s research finds that software development and data analytics experts are the most challenging to hire. Many other talents are scarce, too. “Some of the most in-demand skills right now include cloud security, security engineering, software engineering, DevOps, business intelligence and big data, as well as expertise in Java full-stack, ReactJS and AngularJS,” Sutton said.

What works

Finding qualified job candidates typically requires using a combination of strategies. But it’s also important to be able to move quickly. “At the core of the labor market now is a demand for speed and efficiency in the hiring process, but don’t confuse an expeditious process with a hastily made decision,” Sutton warned. “Some smart options would be to work with a specialized recruiter who knows your local market well; increasing the pay and benefits package to better attract a top candidate; and losing some of the skills requirements on your job description that aren’t must-haves to widen your talent pool.” He also reminded hiring managers to not underestimate the power of networking. “Let your contacts know you’re looking to hire for a certain position.”

Look beyond the typical sources, suggested Art Langer, a professor and director of the Center for Technology Management at Columbia University and founder and chairman of Workforce Opportunity Services (WOS), a nonprofit organization that connects underserved and veteran populations with IT jobs. “There is a large pool of untapped talent from underserved communities that companies overlook,” he explained. Businesses are now competing in a global market. “New technology allows us to connect with colleagues and potential partners around the world as easily as with our neighbors,” Langer said. “Companies hoping to expand overseas can benefit from employees who speak multiple languages.”

Companies need to explore different models of employment if they want access to the best and the brightest job candidates, observed Nick Hamm, CEO of 10K Advisors, a Salesforce consulting firm. “Some of the most talented professionals are choosing to leave full-time employment to pursue freelancing careers or start their own small consulting companies as a way to gain more balance or reduce commute times,” he advised. “If companies want access to these individuals, they’ll need the right processes and mindset in place to incorporate contract employees into core teams.” Using a talent broker to find the right experts, vet them and apply them inside an organization to solve business problems can alleviate many of the challenges people may now have tapping into the gig economy, Hamm added.

John Samuel, CIO of Computer Generated Solutions, a business applications, enterprise learning and outsourcing services company, advised building some flexibility into job descriptions and requirements. “In this tight job market, a good way is to find candidates with the right attitude and a solid foundation and then train them in areas where they lack experience,” he said. Like Sutton, Samuel believes that many job descriptions are unrealistic, listing many requirements that aren’t core to the job’s role. “Rather than limiting your potential pool of candidates, simplify the job description to include your core requirements to entice applicants to fill open roles,” Samuel recommended.

Mike Weast, regional IT vice president at staffing firm Addison Group, urged hiring managers not to rely on software searches, no matter how intuitive they may claim to be, to uncover qualified job candidates. “There’s a lot of talk about using AI to find qualified candidates, but recruiters are needed to bridge the AI gap,” he claimed. “AI doesn’t qualify a candidate for showing up on time, having a strong handshake or making eye contact when communicating.”

Training current employees to meet the requirements of a vacant position is an often-overlooked method of acquiring experts. “It always makes sense to give existing employees the opportunity to expand their knowledge base and transition into vacant positions,” explained Lori Brock, head of innovation, Americas, for OSRAM, a multinational lighting manufacturer headquartered in Munich. “The roles within IT are merging with the traditional R&D functions as well as with roles in manufacturing, procurement, sales, marketing and more,” she added. “We can no longer consider jobs in IT fields as belonging to an IT silo within any organization.”

Last thought

It’s important to pounce quickly when finding a skilled, qualified job candidate. “Now is certainly not the time to be slow to hire,” Sutton said. “It’s a candidate’s market and they are well aware of the opportunities available to them.”

For more on IT hiring and management check out these recent articles.

John Edwards is a veteran business technology journalist. His work has appeared in The New York Times, The Washington Post, and numerous business and technology publications, including Computerworld, CFO Magazine, IBM Data Management Magazine, and RFID Journal.

Originally Posted On: techrepublic.com

Three jobs completely new to the IT industry will be data trash engineer, virtual identity defender, and voice UX designer, according to Cognizant.

With technology flooding the enterprise, many people fear that emerging tech will take over their jobs. However, tech like artificial intelligence (AI) and machine learning will actually create more jobs for humans, according to a recent Cognizant report. The report outlines 21 “plausible and futuristic” jobs that will surface in the next decade.

The 21 jobs follow three major underlying themes: Ethical behaviors, security and safety, and dreams, said the report. These themes come from humans’ deeper aspirations for the future of the enterprise and daily life. Humans want machines to be ethical; humans want to feel safe in a technologically fueled future; and humans have always dreamt of a futuristic world, which is now coming to fruition, according to the report.

SEE: Artificial intelligence: Trends, obstacles, and potential wins (Tech Pro Research)

Some of the jobs on Cognizant’s list could spark life-long careers, and some positions might be more fleeting, said the report. Here are the 21 jobs of the future:

  1. Cyber attack agent
  2. Voice UX designer
  3. Smart home design manager
  4. Algorithm bias auditor
  5. Virtual identity defender
  6. Cyber calamity forecaster
  7. Head of machine personality design
  8. Data trash engineer
  9. Uni4Life coordinator
  10. Head of business behavior
  11. Joy adjutant
  12. Juvenile cybercrime rehabilitation counselor
  13. Tidewater architect
  14. Esports arena builder
  15. VR arcade manager
  16. Vertical farm consultant
  17. Machine risk officer
  18. Flying car developer
  19. Haptic interface programmer
  20. Subscription management specialist
  21. Chief purpose planner

Three of the positions would be completely new in the IT world: Data trash engineer, virtual identity defender, and voice UX designer. A data trash engineer would be responsible for using unused data in an organization to find hidden insights, said the report; a virtual identity defender would lead a team to make a company’s business goal a reality; and a voice UX designer will use diagnostic tools, algorithms, and more to create the perfect voice assistant, said the report.

Click here for descriptions of all 21 positions.

The big takeaways for tech leaders:

  • Emerging tech will actually create a whole new set of jobs for humans in the next 10 years, with some having more staying power than others. — Cognizant, 2018
  • The tech jobs of the future all follow three underlying themes that humans share: Ethical behaviors, security and safety, and dreams. — Cognizant, 2018

Originally Posted On: news.fiu.edu

Some skills are considered too “small” or specific to become a degree program and aren’t often listed on a student’s academic transcript. Yet, it’s a collection of these very skills that employers know are a big deal in the rapidly-changing 21st century workforce.

This is where badges come in. These digital icons represent achievements or skills in a certain area or subject matter. A form of ‘micro-credentialing,’ badges allow students to break down their educational experience – competency by competency – and tell the complete story of their educational journey to potential employers.

Today, badges are a rising trend in the rapidly changing world of higher education. In fact, according to a 2016 survey by the University Professional and Continuing Education Association, one in five colleges has issued a digital badge.

Randy Pestana and Brian Fonseca – from the Jack D. Gordon Institute for Public Policy in the Steven J. Green School of International and Public Affairs – understand the urgency behind bringing this new form of credentialing to FIU. The skills gap – or the mismatch between what employers are looking for and job candidates have to offer – dominates their conversations with industry partners.

“They continue to tell us that job candidates don’t have the skills they need,” Pestana said. “Employers are looking for people who not only have a deep knowledge of a specific subject matter, but also a wide array of other skills that allow them to work across a variety of other subject areas.”

In an attempt to begin to close this gap and give students from all majors and disciplines the opportunity to build the skills that matter most in the 21st century – and still graduate in four years – Pestana and Fonseca began working on building a badge program at FIU.

They started with a subject area that has major implications for all industries and sectors: cybersecurity.

“Hospitality, healthcare, government, law, business – there isn’t an industry that isn’t susceptible to cyberattacks,” Pestana said. “These badges give the basic knowledge everyone needs to know, because anyone can be targeted by a cyberattack and have their personal information compromised.”

Collaborating across the university, Pestana and Fonseca brought in expertise from FIU’s Division of Information Technology, College of Business, College of Engineering & Computing, College of Law and StartUp FIU to create six badges. They are focused on different areas related to cybersecurity, including the Internet of Things, blockchain, cryptocurrencies and cybersecurity policy and law.

To earn a badge, students attend a Saturday workshop, which includes a lecture and active learning exercise. If students earn all six badges, they will also earn a certificate in cybersecurity fundamentals.

Cybersecurity was a natural place to begin offering badges.

FIU is a nationally recognized hub for interdisciplinary cybersecurity study and research and is focused on helping grow a future pipeline of cybersecurity professionals. In fact, earlier this year, FIU was selected to be the educational partner and host of the 2018 National Initiative for Cybersecurity Education (NICE) Conference and Expo, which aims to bring together higher education and industry to address growing cybersecurity workforce shortages.

The cybersecurity badges are just the beginning of a broader initiative to bring more 21st century workforce competencies to FIU.

A special interdisciplinary committee led by Senior Vice President for Academic and Student Affairs Elizabeth Bejar – and which includes members from academic and student-services units across the institution – will be working closely with local industry partners to explore bringing new badge programs to the university.

“FIU is always looking toward the future – that’s who we are,” Bejar said. “We’re here to educate lifelong learners and ensure they have the relevant, just-in-time skills that put them at a competitive advantage in our 21st century workforce.”

Recognizing Everyone As A Student For Life