Originally Posted On: news.fiu.edu

Some skills are considered too “small” or specific to become a degree program and aren’t often listed on a student’s academic transcript. Yet it is precisely these skills that employers know are a big deal in the rapidly changing 21st century workforce.

This is where badges come in. These digital icons represent achievements or skills in a certain area or subject matter. A form of ‘micro-credentialing,’ badges allow students to break down their educational experience – competency by competency – and tell the complete story of their educational journey to potential employers.

Today, badges are a rising trend in the rapidly changing world of higher education. In fact, according to a 2016 survey by the University Professional and Continuing Education Association, one in five colleges has issued a digital badge.

Randy Pestana and Brian Fonseca – from the Jack D. Gordon Institute for Public Policy in the Steven J. Green School of International and Public Affairs – understand the urgency behind bringing this new form of credentialing to FIU. The skills gap – or the mismatch between what employers are looking for and what job candidates have to offer – dominates their conversations with industry partners.

“They continue to tell us that job candidates don’t have the skills they need,” Pestana said. “Employers are looking for people who not only have a deep knowledge of a specific subject matter, but also a wide array of other skills that allow them to work across a variety of other subject areas.”

In an attempt to begin closing this gap and give students from all majors and disciplines the opportunity to build the skills that matter most in the 21st century – and still graduate in four years – Pestana and Fonseca began building a badge program at FIU.

They started with a subject area that has major implications for all industries and sectors: cybersecurity.

“Hospitality, healthcare, government, law, business – there isn’t an industry that isn’t susceptible to cyberattacks,” Pestana said. “These badges give the basic knowledge everyone needs to know, because anyone can be targeted by a cyberattack and have their personal information compromised.”

Collaborating across the university, Pestana and Fonseca brought in expertise from FIU’s Division of Information Technology, College of Business, College of Engineering & Computing, College of Law and StartUp FIU to create six badges. They are focused on different areas related to cybersecurity, including the Internet of Things, blockchain, cryptocurrencies and cybersecurity policy and law.

To earn a badge, students attend a Saturday workshop, which includes a lecture and active learning exercise. If students earn all six badges, they will also earn a certificate in cybersecurity fundamentals.

Cybersecurity was a natural place to begin offering badges.

FIU is a nationally recognized hub for interdisciplinary cybersecurity study and research and is focused on helping grow a future pipeline of cybersecurity professionals. In fact, earlier this year, FIU was selected to be the educational partner and host of the 2018 National Initiative for Cybersecurity Education (NICE) Conference and Expo, which aims to bring together higher education and industry to address growing cybersecurity workforce shortages.

The cybersecurity badges are just the beginning of a broader initiative to bring more 21st century workforce competencies to FIU.

A special interdisciplinary committee led by Senior Vice President for Academic and Student Affairs Elizabeth Bejar – and which includes members from academic and student-services units across the institution – will be working closely with local industry partners to explore bringing new badge programs to the university.

“FIU is always looking toward the future – that’s who we are,” Bejar said. “We’re here to educate lifelong learners and ensure they have the relevant, just-in-time skills that put them at a competitive advantage in our 21st century workforce.”

Originally Posted On: edsource.org

No degree means diminished opportunities, study finds

Millions of Californians who began their college education but never finished deserve special support and policy changes to help get them across the finish line later in life, a new report urges.

The study from the non-partisan California Competes organization estimates that 4 million Californians, ages 25 to 64, earned some college credits at various times but no associate or bachelor’s degrees and are not in school now. As a result, their employment and financial prospects have suffered and they face “diminishing opportunities in labor markets that increasingly rely on workers with degrees,” said the report entitled “Back to College: California’s Imperative to Re-Engage Adults.”

The report found that those adults with some college but no degree are significantly less likely to earn more than $75,000 a year compared to those who have at least an associate degree from a community college. Only 14 percent of those who didn’t finish their degrees earn in that upper income bracket, compared to 36 percent of those who have degrees (and 5 percent of those with just high school or less).

Not surprisingly, fewer of those adults who have some college credits but no diploma own homes and have full health insurance compared to graduates. And other research shows that non-completers have higher default rates on college loans, with unhappy consequences.

“We’ve already invested in folks who haven’t crossed the finish line. So our argument is that it makes sense to help them get across the finish line to benefit the broader California economy and to boost their individual prosperity,” Lande Ajose, executive director of California Competes, said in an interview. That Oakland-based organization analyzes ways to improve higher education in the state and how such reforms can aid the economy. Ajose is also chairwoman of the California Student Aid Commission, which administers Cal Grants.

The study showed ethnic disparities for college completion among California adults between ages 25 and 64, with higher rates for whites and Asians than for Latinos and blacks. Sixty-two percent of Asians of that age had earned a degree, compared to 53 percent of whites, 34 percent of blacks and 18 percent of Latinos.

However, black adults showed the highest rate (28 percent) of their ethnic group who started but did not finish a degree, followed by whites (23 percent), Latinos (17 percent) and Asians (13 percent).

Among the roadblocks facing adults who want to return to college are limitations on financial aid that don’t affect most traditional-age students, noted the report.

For example, federal Pell Grants are available for only 12 semesters over a person’s life and many of these adults are likely to have already used that allotment up years ago. Because of qualification rules and limits on expenditures, state-funded Cal Grants are very difficult to obtain for people who are older than 28 and several years out of high school. State officials are looking at ways to improve Cal Grants, including making them more available to people who attend community college years after high school.

“The inadequate financial aid options available to returning adults exacerbate the economic trends” that hurt the earning potential of people without degrees, the report said. In addition those people face personal and scheduling problems juggling work and family issues with their studies if they want to complete their degrees.

In addition, the report described poor coordination among California’s higher education systems and resulting “structural barriers that impede adults’ abilities to return to school.” Those include difficult access to academic transcripts and older data among different colleges and universities if an adult started at one or two campuses and seeks to finish at another, it said.

While describing problems, the report does not offer specific suggestions for improvements. California Competes officials said they expect a second report to do so by year’s end.

Adults without college degrees or certificates are at the center of a much-discussed effort in California. State leaders hope that the opening of a new online community college late next year will offer training and extra education for skilled jobs in fast-growing industries. Those credentials are intended mainly to be completed in a year or less.

However, most adults who want to finish the more traditional associate or bachelor’s degrees still must attend the state’s other 114 community colleges or a four-year university. Adult students currently can take some online courses offered at those schools.

However, Ajose said college campuses should make their class schedules and other services more flexible to serve older students.

Meanwhile, a separate new report shows that students who took out federal student loans for college but never finished degrees default at high rates and face many problems as a result. Twenty-three percent of borrowers who started college in 2003-04 defaulted within 12 years, compared to 11 percent of those who completed, according to a policy brief by The Institute for College Access and Success (TICAS).

Defaulters face “stark and immediate consequences” that could include fines, wage garnishment, lost job opportunities and suspended driver’s and professional licenses, said the report entitled “The Self-Defeating Consequences of Student Loan Default.” TICAS, a non-partisan research and policy group with offices in Oakland and Washington, D.C., called for reforms that would lift some of the most burdensome penalties and make it easier to enroll in income-based repayment plans.

Originally Posted On: technologyreview.com

Online versions of college courses are attracting hundreds of thousands of students, millions of dollars in funding, and accolades from university administrators. Is this a fad, or is higher education about to get the overhaul it needs?

Written By: Nicholas Carr

A hundred years ago, higher education seemed on the verge of a technological revolution. The spread of a powerful new communication network—the modern postal system—had made it possible for universities to distribute their lessons beyond the bounds of their campuses. Anyone with a mailbox could enroll in a class. Frederick Jackson Turner, the famed University of Wisconsin historian, wrote that the “machinery” of distance learning would carry “irrigating streams of education into the arid regions” of the country. Sensing a historic opportunity to reach new students and garner new revenues, schools rushed to set up correspondence divisions. By the 1920s, postal courses had become a full-blown mania. Four times as many people were taking them as were enrolled in all the nation’s colleges and universities combined.

The hopes for this early form of distance learning went well beyond broader access. Many educators believed that correspondence courses would be better than traditional on-campus instruction because assignments and assessments could be tailored specifically to each student. The University of Chicago’s Home-Study Department, one of the nation’s largest, told prospective enrollees that they would “receive individual personal attention,” delivered “according to any personal schedule and in any place where postal service is available.” The department’s director claimed that correspondence study offered students an intimate “tutorial relationship” that “takes into account individual differences in learning.” The education, he said, would prove superior to that delivered in “the crowded classroom of the ordinary American University.”

We’ve been hearing strikingly similar claims today. Another powerful communication network—the Internet—is again raising hopes of a revolution in higher education. This fall, many of the country’s leading universities, including MIT, Harvard, Stanford, and Princeton, are offering free classes over the Net, and more than a million people around the world have signed up to take them. These “massive open online courses,” or MOOCs, are earning praise for bringing outstanding college teaching to multitudes of students who otherwise wouldn’t have access to it, including those in remote places and those in the middle of their careers. The online classes are also being promoted as a way to bolster the quality and productivity of teaching in general—for students on campus as well as off. Former U.S. secretary of education William Bennett has written that he senses “an Athens-like renaissance” in the making. Stanford president John Hennessy told the New Yorker he sees “a tsunami coming.”

The excitement over MOOCs comes at a time of growing dissatisfaction with the state of college education. The average price tag for a bachelor’s degree has shot up to more than $100,000. Spending four years on campus often leaves young people or their parents weighed down with big debts, a burden not only on their personal finances but on the overall economy. And many people worry that even as the cost of higher education has risen, its quality has fallen. Dropout rates are often high, particularly at public colleges, and many graduates display little evidence that college improved their critical-thinking skills. Close to 60 percent of Americans believe that the country’s colleges and universities are failing to provide students with “good value for the money they and their families spend,” according to a 2011 survey by the Pew Research Center. Proponents of MOOCs say the efficiency and flexibility of online instruction will offer a timely remedy.

But not everyone is enthusiastic. The online classes, some educators fear, will at best prove a distraction to college administrators; at worst, they will end up diminishing the quality of on-campus education. Critics point to the earlier correspondence-course mania as a cautionary tale. Even as universities rushed to expand their home-study programs in the 1920s, investigations revealed that the quality of the instruction fell short of the levels promised and that only a tiny fraction of enrollees actually completed the courses. In a lecture at Oxford in 1928, the eminent American educator Abraham Flexner delivered a withering indictment of correspondence study, claiming that it promoted “participation” at the expense of educational rigor. By the 1930s, once-eager faculty and administrators had lost interest in teaching by mail. The craze fizzled.

Is it different this time? Has technology at last advanced to the point where the revolutionary promise of distance learning can be fulfilled? We don’t yet know; the fervor surrounding MOOCs makes it easy to forget that they’re still in their infancy. But even at this early juncture, the strengths and weaknesses of this radically new form of education are coming into focus.

Rise of the MOOCs

“I had no clue what I was doing,” Sebastian Thrun says with a chuckle, as he recalls his decision last year to offer Stanford’s Introduction to Artificial Intelligence course free online. The 45-year-old robotics expert had a hunch that the class, which typically enrolls a couple of hundred undergraduates, would prove a draw on the Net. After all, he and his co-professor, Peter Norvig, were both Silicon Valley stars, holding top research posts at Google in addition to teaching at Stanford. But while Thrun imagined that enrollment might reach 10,000 students, the actual number turned out to be more than an order of magnitude higher. When the class began, in October 2011, some 160,000 people had signed up.

The experience changed Thrun’s life. Declaring “I can’t teach at Stanford again,” he announced in January that he was joining two other roboticists to launch an ambitious educational startup called Udacity. The venture, which bills itself as a “21st-century university,” is paying professors from such schools as Rutgers and the University of Virginia to give open courses on the Net, using the technology originally developed for the AI class. Most of the 14 classes Udacity offers fall into the domains of computer science and mathematics, and Thrun says it will concentrate on such fields for now. But his ambitions are hardly narrow: he sees the traditional university degree as an outdated artifact and believes Udacity will provide a new form of lifelong education better suited to the modern labor market.

Udacity is just one of several companies looking to capitalize on the burgeoning enthusiasm for MOOCs. In April, two of Thrun’s colleagues in Stanford’s computer science department, Daphne Koller and Andrew Ng, rolled out a similar startup called Coursera. Like Udacity, Coursera is a for-profit business backed with millions of dollars in venture capital. Unlike Udacity, Coursera is working in concert with big universities. Where Thrun wants to develop an alternative to a traditional university, Koller and Ng are looking to build a system that established schools can use to deliver their own classes over the Net. Coursera’s original partners included not only Stanford but Princeton, Penn, and the University of Michigan, and this summer the company announced affiliations with 29 more schools. It already has about 200 classes on offer, in fields ranging from statistics to sociology.

On the other side of the country, MIT and Harvard joined forces in May to form edX, a nonprofit that is also offering tuition-free online classes to all comers. Bankrolled with $30 million from each school, edX is using an open-source teaching platform developed at MIT. It includes video lessons and discussion forums similar to those offered by its for-profit rivals, but it also incorporates virtual laboratories where students can carry out simulated experiments. This past summer, the University of California at Berkeley joined edX, and in September the program debuted its first seven classes, mainly in math and engineering. Overseeing the launch of edX is Anant Agarwal, the former director of MIT’s Computer Science and Artificial Intelligence Laboratory.

The leaders of Udacity, Coursera, and edX have not limited their aspirations to enhancing distance learning. They believe that online instruction will become a cornerstone of the college experience for on-campus students as well. The merging of virtual classrooms with real classrooms, they say, will propel academia forward. “We are reinventing education,” declares Agarwal. “This will change the world.”

Professor Robot

Online courses aren’t new; big commercial outfits like the University of Phoenix and DeVry University offer thousands of them, and many public colleges allow students to take classes on the Net for credit. So what makes MOOCs different? As Thrun sees it, the secret lies in “student engagement.” Up to now, most Internet classes have consisted largely of videotaped lectures, a format that Thrun sees as deeply flawed. Classroom lectures are in general “boring,” he says, and taped lectures are even less engaging: “You get the worst part without getting the best part.” While MOOCs include videos of professors explaining concepts and scribbling on whiteboards, the talks are typically broken up into brief segments, punctuated by on-screen exercises and quizzes. Peppering students with questions keeps them involved with the lesson, Thrun argues, while providing the kind of reinforcement that has been shown to strengthen comprehension and retention.

Norvig, who earlier this year taught a Udacity class on computer programming, points to another difference between MOOCs and their predecessors. The economics of online education, he says, have improved dramatically. Cloud computing facilities allow vast amounts of data to be stored and transmitted at very low cost. Lessons and quizzes can be streamed free over YouTube and other popular media delivery services. And social networks like Facebook provide models for digital campuses where students can form study groups and answer each other’s questions. In just the last few years, the cost of delivering interactive multimedia classes online has dropped precipitously. That’s made it possible to teach huge numbers of students without charging them tuition.

It’s hardly a coincidence that Udacity, Coursera, and edX are all led by computer scientists. To fulfill their grand promise—making college at once cheaper and better—MOOCs will need to exploit the latest breakthroughs in large-scale data processing and machine learning, which enable computers to adjust to the tasks at hand. Delivering a complex class to thousands of people simultaneously demands a high degree of automation. Many of the labor-intensive tasks traditionally performed by professors and teaching assistants—grading tests, tutoring, moderating discussions—have to be done by computers. Advanced analytical software is also required to parse the enormous amounts of information about student behavior collected during the classes. By using algorithms to spot patterns in the data, programmers hope to gain insights into learning styles and teaching strategies, which can then be used to refine the technology further. Such artificial-intelligence techniques will, the MOOC pioneers believe, bring higher education out of the industrial era and into the digital age.

While their ambitions are vast, Thrun, Koller, and Agarwal all stress that their fledgling organizations are just starting to amass information from their courses and analyze it. “We haven’t yet used the data in a systematic way,” says Thrun. It will be some time before the companies are able to turn the information they’re collecting into valuable new features for professors and students. To see the cutting edge in computerized teaching today, you have to look elsewhere—in particular, to a small group of academic testing and tutoring outfits that are hard at work translating pedagogical theories into software code.

One of the foremost thinkers in this field is a soft-spoken New Yorker named David Kuntz. In 1994, after earning his master’s degree in philosophy and working as an epistemologist, or knowledge theorist, for the Law School Admission Council (the organization that administers the LSAT examinations), Kuntz joined the Educational Testing Service, which runs the SAT college-admission tests. ETS was eager to use the burgeoning power of computers to design more precise exams and grade them more efficiently. It set Kuntz and other philosophers to work on a very big question: how do you use software to measure meaning, promote learning, and evaluate understanding? The question became even more pressing when the World Wide Web opened the Internet to the masses. Interest in “e-learning” surged, and the effort to develop sophisticated teaching and testing software combined with the effort to design compelling educational websites.

Three years ago, Kuntz joined a small Manhattan startup called Knewton as its head of research. The company specializes in the budding discipline of adaptive learning. Like other trailblazers in instructional software, including the University of California-Irvine spinoff ALEKS, Carnegie Mellon’s Open Learning Initiative, and the much celebrated Khan Academy, it is developing online tutoring systems that can adapt to the needs and learning styles of individual students as they proceed through a course of instruction. Such programs, says Kuntz, “get better as more data is collected.” Software for, say, teaching algebra can be written to reflect alternative theories of learning, and then, as many students proceed through the program, the theories can be tested and refined and the software improved. The bigger the data sets, the more adept the systems become at providing each student with the right information in the right form at the right moment.

Knewton has introduced a remedial math course for incoming college students, and its technology is being incorporated into tutoring programs offered by the textbook giant Pearson. But Kuntz believes that we’re only just beginning to see the potential of educational software. Through the intensive use of data analysis and machine learning techniques, he predicts, the programs will advance through several “tiers of adaptivity,” each offering greater personalization through more advanced automation. In the initial tier, which is already largely in place, the sequence of steps a student takes through a course depends on that student’s choices and responses. Answers to a set of questions may, for example, trigger further instruction in a concept that has yet to be mastered—or propel the student forward by introducing material on a new topic. “Each student,” explains Kuntz, “takes a different path.” In the next tier, which Knewton plans to reach soon, the mode in which material is presented adapts automatically to each student. Although the link between media and learning remains controversial, many educators believe that different students learn in different ways. Some learn best by reading text, others by watching a demonstration, others by playing a game, and still others by engaging in a dialogue. A student’s ideal mode may change, moreover, at each stage in a course—or even at different times during the day. A video lecture may be best for one lesson, while a written exercise may be best for the next. By monitoring how students interact with the teaching system itself—when they speed up, when they slow down, where they click—a computer can learn to anticipate their needs and deliver material in whatever medium promises to maximize their comprehension and retention.
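
The “first tier” Kuntz describes can be pictured as a simple branching rule: which unit a student sees next depends on whether recent responses show mastery. Here is a minimal sketch of that idea in Python; the topics, threshold, and scoring are hypothetical illustrations, not Knewton’s actual model.

```python
# Minimal sketch of first-tier adaptivity: the next step depends on the
# student's responses. Topics, threshold, and scores are illustrative only.

COURSE = ["fractions", "linear_equations", "quadratics"]   # hypothetical syllabus
REMEDIAL = {"linear_equations": "fractions_review"}        # fallback material

def next_step(topic_index, recent_scores, mastery_threshold=0.8):
    """Pick the next unit: move forward on mastery, otherwise remediate."""
    current = COURSE[topic_index]
    mastery = sum(recent_scores) / len(recent_scores)
    if mastery >= mastery_threshold:
        # Concept mastered: introduce material on a new topic.
        return COURSE[min(topic_index + 1, len(COURSE) - 1)]
    # Not yet mastered: trigger further instruction in the same concept.
    return REMEDIAL.get(current, current)

print(next_step(1, [1, 1, 1, 0]))  # 75% correct -> "fractions_review"
print(next_step(1, [1, 1, 1, 1]))  # mastered    -> "quadratics"
```

Two students with different answer patterns take different paths through the same material, which is all the first tier amounts to; the later tiers layer choices about presentation mode on top of rules like this one.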

Looking toward the future, Kuntz says that computers will ultimately be able to tailor an entire “learning environment” to fit each student. Elements of the program’s interface, for example, will change as the computer senses the student’s optimum style of learning.

Big Data on Campus

The advances in tutoring programs promise to help many college, high-school, and even elementary students master basic concepts. One-on-one instruction has long been known to provide substantial educational benefits, but its high cost has constrained its use, particularly in public schools. It’s likely that if computers are used in place of teachers, many more students will be able to enjoy the benefits of tutoring. According to one recent study of undergraduates taking statistics courses at public universities, the latest of the online tutoring systems seem to produce roughly the same results as face-to-face instruction.

While MOOCs are incorporating adaptive learning routines into their software, their ambitions for data mining go well beyond tutoring. Thrun says that we’ve only seen “the tip of the iceberg.” What particularly excites him and other computer scientists about free online classes is that thanks to their unprecedented scale, they can generate the immense quantities of data required for effective machine learning. Koller says that Coursera has set up its system with intensive data collection and analysis in mind. Every variable in a course is tracked. When a student pauses a video or increases its playback speed, that choice is captured in the Coursera database. The same thing happens when a student answers a quiz question, revises an assignment, or comments in a forum. Every action, no matter how inconsequential it may seem, becomes grist for the statistical mill.
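
In engineering terms, what Koller describes is fine-grained event logging: every pause, playback-speed change, quiz answer, or forum comment becomes a timestamped record that can be analyzed later. A rough sketch of what one such record might look like follows; the field names and event types are assumptions for illustration, not Coursera’s actual schema.

```python
# Illustrative clickstream records of the kind described above; the field
# names and event types are assumptions, not Coursera's actual schema.
from dataclasses import dataclass, asdict
from datetime import datetime, timezone

@dataclass
class StudentEvent:
    student_id: str
    course_id: str
    event_type: str   # e.g. "video_pause", "playback_speed", "quiz_answer"
    detail: dict      # event-specific payload
    timestamp: str

def log_event(student_id, course_id, event_type, **detail):
    """Capture one interaction as a record ready for later analysis."""
    return StudentEvent(
        student_id=student_id,
        course_id=course_id,
        event_type=event_type,
        detail=detail,
        timestamp=datetime.now(timezone.utc).isoformat(),
    )

events = [
    log_event("s42", "ml-101", "video_pause", position_sec=312),
    log_event("s42", "ml-101", "playback_speed", speed=1.5),
    log_event("s42", "ml-101", "quiz_answer", question=3, correct=False),
]
print([asdict(e)["event_type"] for e in events])
```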

Assembling information on student behavior at such a minute level of detail, says Koller, “opens new avenues for understanding learning.” Previously hidden patterns in the way students navigate and master complex subject matter can be brought to light.

The number-crunching also promises to benefit teachers and students directly, she adds. Professors will receive regular reports on what’s working in their classes and what’s not. And by pinpointing “the most predictive factors for success,” MOOC software will eventually be able to guide each student onto “the right trajectory.” Koller says she hopes that Lake Wobegon, the mythical town in which “all students are above average,” will “come to life.”

MIT and Harvard are designing edX to be as much a tool for educational research as a digital teaching platform, Anant Agarwal says. Scholars are already beginning to use data from the system to test hypotheses about how people learn, and as the portfolio of courses grows, the opportunities for research will proliferate. Beyond generating pedagogical insights, Agarwal foresees many other practical applications for the edX data bank. Machine learning may, for instance, pave the way for an automated system to detect cheating in online classes, a challenge that is becoming more pressing as universities consider granting certificates or even credits to students who complete MOOCs.

With a data explosion seemingly imminent, it’s hard not to get caught up in the enthusiasm of the MOOC architects. Even though their work centers on computers, their goals are deeply humanistic. They’re looking to use machine learning to foster student learning, to deploy artificial intelligence in the service of human intelligence. But the enthusiasm should be tempered by skepticism. The benefits of machine learning in education remain largely theoretical. And even if AI techniques generate genuine advances in pedagogy, those breakthroughs may have limited application. It’s one thing for programmers to automate courses of instruction when a body of knowledge can be defined explicitly and a student’s progress measured precisely. It’s a very different thing to try to replicate on a computer screen the intricate and sometimes ineffable experiences of teaching and learning that take place on a college campus.

The promoters of MOOCs have a “fairly naïve perception of what the analysis of large data sets allows,” says Timothy Burke, a history professor at Swarthmore College. He contends that distance education has historically fallen short of expectations not for technical reasons but, rather, because of “deep philosophical problems” with the model. He grants that online education may provide efficient training in computer programming and other fields characterized by well-established procedures that can be codified in software. But he argues that the essence of a college education lies in the subtle interplay between students and teachers that cannot be simulated by machines, no matter how sophisticated the programming.

Alan Jacobs, a professor of English at Wheaton College in Illinois, raises similar concerns. In an e-mail to me, he observed that the work of college students “can be affected in dramatic ways by their reflection on the rhetorical situations they encounter in the classroom, in real-time synchronous encounters with other people.” The full richness of such conversations can’t be replicated in Internet forums, he argued, “unless the people writing online have a skilled novelist’s ability to represent complex modes of thought and experience in prose.” A computer screen will never be more than a shadow of a good college classroom. Like Burke, Jacobs worries that the view of education reflected in MOOCs has been skewed toward that of the computer scientists developing the platforms.

Flipping the Classroom

The designers and promoters of MOOCs don’t suggest that computers will make classrooms obsolete. But they do argue that online instruction will change the nature of teaching on campus, making it more engaging and efficient. The traditional model of instruction, where students go to class to listen to lectures and then head off on their own to complete assignments, will be inverted. Students will listen to lectures and review other explanatory material alone on their computers (as some middle-school and high-school students already do with Khan Academy videos), and then they’ll gather in classrooms to explore the subject matter more deeply—through discussions with professors, say, or through lab exercises. In theory, this “flipped classroom” will allocate teaching time more rationally, enriching the experience of both professor and student.

Here, too, there are doubts. One cause for concern is the high dropout rate that has plagued the early MOOCs. Of the 160,000 people who enrolled in Norvig and Thrun’s AI class, only about 14 percent ended up completing it. Of the 155,000 students who signed up for an MIT course on electronic circuits earlier this year, only 23,000 bothered to finish the first problem set. About 7,000, or 5 percent, passed the course. Shepherding thousands of students through a college class is a remarkable achievement by any measure—typically only about 175 MIT students finish the circuits course each year—but the dropout rate highlights the difficulty of keeping online students attentive and motivated. Norvig acknowledges that the initial enrollees in MOOCs have been an especially self-motivated group. The real test, particularly for on-campus use of online instruction, will come when a broader and more typical cohort takes the classes. MOOCs will have to inspire a wide variety of students and retain their interest as they sit in front of their computers through weeks of study.

The greatest fear among the critics of MOOCs is that colleges will rush to incorporate online instruction into traditional classes without carefully evaluating the possible drawbacks. Last fall, shortly before he cofounded Coursera, Andrew Ng adapted his Stanford course on machine learning so that online students could participate, and thousands enrolled. But at least one on-campus student found the class wanting. Writing on his blog, computer science major Ben Rudolph complained that the “academic rigor” fell short of Stanford’s standards. He felt that the computerized assignments, by providing automated, immediate hints and guidance, failed to encourage “critical thinking.” He also reported a sense of isolation. He “met barely anyone in [the] class,” he said, because “everything was done alone in my room.” Ng has staunchly defended the format of the class, but the fact is that no one really knows how an increasing stress on computerized instruction will alter the dynamics of college life.

The leaders of the MOOC movement acknowledge the challenges they face. Perfecting the model, says Agarwal, will require “sophisticated inventions” in many areas, from grading essays to granting credentials. This will only get harder as the online courses expand further into the open-ended, exploratory realms of the liberal arts, where knowledge is rarely easy to codify and the success of a class can hinge on a professor’s ability to guide students toward unexpected insights. The outcome of this year’s crop of MOOCs should tell us a lot more about the value of the classes and the role they’ll ultimately play in the educational system.

At least as daunting as the technical challenges will be the existential questions that online instruction raises for universities. Whether massive open courses live up to their hype or not, they will force college administrators and professors to reconsider many of their assumptions about the form and meaning of teaching. For better or worse, the Net’s disruptive forces have arrived at the gates of academia.


Nicholas Carr is the author of The Shallows: What the Internet Is Doing to Our Brains. His last article for MIT Technology Review was “The Library of Utopia.”

Originally Posted On: informationweek.com

Amazon recently proved it isn’t infallible when it shut down a human resources system that was systematically biased against women. However, there’s more to the story that today’s enterprise leaders should know.

When people talk about machine learning masters, Amazon is always top-of-mind. For more than two decades, the company’s recommendation capabilities have been coveted by others hoping to imitate them. However, even Amazon hasn’t mastered machine learning completely, as evidenced by the biased HR system it shut down. What may be surprising to some is the reality of the underlying situation: biased data isn’t just a technical problem; it’s a business problem.

Specifically, Reuters and others recently reported that since 2014 Amazon had been using a recruiting engine that was systematically biased against women seeking technical positions. It doesn’t necessarily follow that Amazon is biased against tech-savvy women, but the situation does seem to indicate that the historical data used to train the system included more males than females.

Historically, more men than women have held technical positions, and not just at Amazon. The world’s population is split roughly evenly between men and women, with one sex somewhat more prevalent in some cultures than others. Yet women hold only about 26% of “professional computing occupations.” If the training dataset shows that roughly three out of four workers in technical positions are men, it follows that an AI trained on that data will reflect the same skew.
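
The mechanism is easy to demonstrate. A toy scoring model that simply mirrors historically skewed hiring records will reproduce the skew; the sketch below uses synthetic data and a deliberately naive scorer, not anything resembling Amazon’s actual system.

```python
# Toy demonstration with synthetic data: a naive model trained on skewed
# historical hiring outcomes reproduces the skew it was shown.
from collections import Counter

# Historical "successful hire" records: roughly three of four are men.
historical_hires = ["M"] * 75 + ["F"] * 25

def score_candidate(gender, history):
    """Rank a candidate by how often similar candidates were hired before."""
    counts = Counter(history)
    return counts[gender] / len(history)

print(score_candidate("M", historical_hires))  # 0.75 -> systematically favored
print(score_candidate("F", historical_hires))  # 0.25 -> systematically penalized
```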

Amazon is now faced with a public relations fiasco even though it abandoned the system. According to a spokesperson, it “was never used by Amazon recruiters to evaluate candidates.” It was used in a trial phase, never independently and never rolled out to a larger group. The project was abandoned a couple of years ago for many reasons, including that it never returned strong candidates for a role. Interestingly, the company claims that bias wasn’t the issue.

If bias isn’t the issue, then what is?

There’s no doubt that the outcome of Amazon’s HR system was biased. Biased data produces biased outcomes. However, there is another important issue that neither Amazon nor much of the media coverage has identified: data quality.

For years, organizations have been hearing about the need for good-quality data. For one thing, good-quality data is more reliable than bad-quality data. Just about every business wants to use analytics to make better business decisions, but not everyone is thinking about the quality of the data that is being relied upon to make such decisions. Data is also used to train AI systems, so the quality of that data should be top-of-mind. Sadly, in an HR context, bad data is the norm.

“If they’d asked us, I would have said starting with resumes is a bad idea,” said Kevin Parker, CEO of hiring intelligence company HireVue. “It will never work, particularly when you’re looking at resumes for training data.”

As if the poor quality of resume data wasn’t enough to derail Amazon’s project, add job descriptions. Job descriptions are often poorly written, so the likely result is a system that attempts to match attributes from one pool of poor-quality data with another pool of poor-quality data.

Bias is a huge issue, regardless

Humans tend to be naturally biased creatures. Since humans have created and are still behind the creation of data, it only stands to reason that their biases will be reflected in the data. While there are ways of correcting for bias, it isn’t as simple as pressing a button. One must be able to identify the bias in the first place and should also understand the context of that bias.

“We think of resumes as a representation of the person, but let’s go to the person and get to the root of what we’re trying to do, and try to figure out if the person is a great match for this particular job. Are they empathetic? Are they great problem solvers? Are they great analytical thinkers? All of the things that define success in a job or role,” said HireVue’s Parker.

HireVue is building its own AI models that are correlated to performance in customer organizations.

“[The models are] validated. We do a lot of work to eliminate bias in the training data and we can prove it arithmetically,” said Parker. “The underlying flaw is don’t start with resumes because it won’t end well.”

HireVue looks at the data collected during a 20- to 30-minute video interview, during which it is able to collect tens of thousands of data points. Its system is purportedly capable of showing an arithmetic before and after: if all the successful people in a particular role have been middle-aged white men but the same level of success is desired from a more diverse workforce, what underlying competencies and work-related skills should the company be seeking?

“By understanding the attributes of the best, middle and poor performers in an organization, an AI model can be built [that looks] for those attributes in a video interview so you can know almost in real-time if a candidate is a good candidate or not and respond to each in a different way,” said Parker.

Recruitment software and marketplace ScoutExchange analyzes the track record of individual recruiters to identify the types of biases they’ve exhibited over time, such as whether they hired more men than women or whether they tend to prefer candidates from certain colleges or universities over others.
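
One common way to surface the kind of pattern described above, such as a recruiter who selects men at a much higher rate than women, is to compare selection rates across groups, for example with the widely used four-fifths rule of thumb. The sketch below uses hypothetical numbers and is not ScoutExchange’s actual method.

```python
# Simplified bias check on one recruiter's track record (hypothetical numbers,
# not ScoutExchange's actual method): compare selection rates across groups.

def selection_rate(hired, applicants):
    return hired / applicants if applicants else 0.0

def impact_ratio(group_a, group_b):
    """Ratio of the lower selection rate to the higher; < 0.8 is a common red flag."""
    rate_a, rate_b = selection_rate(*group_a), selection_rate(*group_b)
    low, high = sorted([rate_a, rate_b])
    return low / high if high else 0.0

# (hired, applicants) in one recruiter's history
women = (6, 60)     # 10% selected
men = (24, 120)     # 20% selected

ratio = impact_ratio(women, men)
print(f"impact ratio: {ratio:.2f}")  # 0.50
print("flag for review" if ratio < 0.8 else "within guideline")
```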

“There’s bias in all data and you need a strategy to deal with it or you’re going to end up with results you don’t like and you won’t use [the system],” said Ken Lazarus, CEO of ScoutExchange. “The people at Amazon are pretty smart and pretty good at machine learning and recommendations, but it points out the real difficulty of trying to match humans without any track record. We look at a recruiter’s track record so we can remove bias. Everyone needs a strategy to do that or you’re not going to get anywhere.”

The three things to take away from Amazon’s situation are these:

1 – Despite all the hype about machine learning, it isn’t perfect. Even Amazon doesn’t get everything right all the time. No organization or individual does.

2 – Bias isn’t the sole domain of statisticians and data scientists. Business and IT leaders need to be concerned about it because bias can have very real business impacts as Amazon’s gaffe demonstrates.

3 – Data quality matters. Data quality is not considered as hot a topic as AI, but the two go hand-in-hand. Data is AI brain food.

Lisa Morgan is a freelance writer who covers big data and BI for InformationWeek. She has contributed articles, reports, and other types of content to various publications and sites ranging from SD Times to the Economist Intelligence Unit.

Originally Posted On: vice.com

High schoolers are weighing the benefits of blue-collar trades at a time when well-paying jobs—and no debt—are hard to pass up.

This story appears in VICE Magazine’s Power and Privilege Issue.

On a recent Wednesday morning, about 20 students at Queens Technical High School marched into a supply closet and retrieved what looked, to an outsider, like silver suitcases. They sat back down at the classroom’s U-shaped table arrangement and opened what were in fact “advanced cable trainers,” kits containing the cables and wire cutters they’d be working with throughout their senior year. Meanwhile, their teacher, David Abreu, began to lecture them about what it’s like out “in industry”—the vocational school term for the proverbial “real world.”

“When you go out there, there’s no reason why anyone should be sitting on mommy’s couch, eating cereal, and watching cartoons or a telenovela,” he told the teens, who were mostly male. “There’s tons of construction, and there’s not enough people. So they’re hiring from outside of New York City. They’re getting people from the Midwest. I love the accents, but they don’t have enough of you.”

He asked the class if anyone could name the most common delay announcement on New York City’s notoriously beleaguered subway system. A hand with pink fingernails promptly shot into the air.

“We’re sorrrrrry,” a girl with curly hair and ripped jeans mimicked, before sticking out her tongue. The room erupted in laughter.

“Signal problems,” corrected the teacher, himself a graduate of Queens Tech. “They don’t have enough technicians to keep things working properly. What they need is you.”

Abreu is a Queens Tech graduate, and his son is now a student at the school as well.

Abreu was onto something. As the Brookings Institution noted in 2017, participation in career and technical education (CTE) has declined for several decades. That was in part because of a lack of funding and the fact that many states implemented more stringent academic requirements. However, the growing belief that everyone should obtain a college education also surely played a part. The National Center for Education Statistics found that the number of CTE credits earned by American high school students declined by 14 percent between 1990 and 2009.

But the jobs are still there. NPR reported in April that the pressure to attend a four-year college remained so strong in American society that many high-paying jobs in the trades were currently sitting empty. Melissa Burg, the principal of Queens Tech, insisted that New York City’s Department of Education and some savvy parents had taken note of this dynamic, increasingly regarding a bachelor’s degree as the new high school diploma.

“I think those [trade] jobs go unfilled because skilled labor is looked down upon, even though those skilled labor people make more money than I do,” she explained. “I don’t know if people don’t want to work as physically hard as they used to, or if they see their families who’ve worked hard physically, or if those families are saying, ‘Don’t do what I did.’”

Meanwhile, in-state tuition and fees at public four-year schools have increased at an average rate of more than 3 percent above inflation each year in the past decade, according to data from the College Board. Experts say that’s partly a result of a sort of amenities arms race, in which schools use expensive construction projects to lure applicants. That cost—in combination with factors like increased demand and lack of state funding—is then passed down to the customer, in this case the student, and the situation has resulted in the average graduate walking away with almost $40,000 in debt.

For the students at Queens Tech, as for many young people across America graduating into what every newspaper and expert tells them is a wonderful economy, adult life is no longer about how much money they stand to make out of the gate. It’s about what size hole they might have to crawl out from just to break even. Perhaps that’s why there’s a renewed sense of energy surrounding CTE: Burg said that in the almost ten years she’s been principal—a timeline that roughly correlates with the last financial crisis—she consistently saw an uptick in applications. Brookings noted renewed interest nationwide as well.

Clive Belfield, a professor of economics at Queens College, said some young people might be wary of entering certain blue-collar industries because of the Uberization and outsourcing they’ve observed in their lifetimes. However, he noted that the trades may be less imperiled in New York than elsewhere in America, because unions are still a somewhat protective force, and suggested working for the MTA might be among the safest choices of all. Definitionally, those jobs can’t be sent to China.

Overall, he continued, a renewed interest in the trades might represent a natural market correction given how unaffordable college is, and how increasingly useless a diploma may seem to be. But in a world that pays a comically disproportionate amount of attention to Ivy League students and what they’re up to, even the most pragmatic 17-year-old may not pass up the chance at a four-year degree.

“It’s hard to think, when you’re that young and living in a world that’s obsessed with Harvard University, ‘This job is not very glamorous, but at least I’ll get to keep it,’” Belfield told me.

The advanced cable trainers (kits containing cables and wire cutters) let students practice on miniature versions of cable lines.

The decision to take on either stigma or debt—which carries its own form of stigma—is not an ideal one, and Mauricio Bustamante wrestled with it in his senior year. Now 20, he was in the enviable position of being offered both a union gig with the MTA and full tuition to a school upstate back in 2015. Although high school graduates had median weekly earnings of $718 in 2017, according to the US Bureau of Labor Statistics, the apprentice job paid $22 an hour to start, or $880 a week. That seemed like an enormous amount to the then-17-year-old kid who grew up in a single-parent home in Woodside, Queens. Then again, the prospect of being the first person in his family to attend college was undeniably appealing. There’s also the fact that the average college graduate took home a starting salary of $50,516 a year in 2017, or $971 a week, according to the National Association of Colleges and Employers (NACE). That number was even higher—$1,271—for engineers.

Both Bustamante and his mother changed their opinions on what to do several times.

“And then just as I was about to decline my offer at the school, she changed her mind and said, ‘You have to go,’” he told me. “I ended up doing it for her, really.”

Bustamante, a former student of Abreu’s who’s now a junior at St. Lawrence University, where he’s double-majoring in math and economics, ultimately figured it was worth at least trying to land a job crunching numbers for a nonprofit, or calculating risk for an insurance company. But when I canvassed students currently finishing up the electrical installation track at his alma mater, they were less focused on the college-versus-job question than whether they wanted to work below ground or in a more traditional office setting.

Haw Wunna Zaw, a student at Queens Tech, says that even if a construction worker makes as much money as a doctor, they don’t earn as much respect in American society.

Haw Wunna Zaw, a 16-year-old born to immigrant parents, applied to roughly the same number of vocational and traditional high schools. His mom was a PhD candidate and his father went to military school in Burma, and, when we met, Haw said he hoped to work for the MTA after graduation while taking college classes at night. He added he’d received no pressure from his parents to choose university over going straight to work, something he attributed to the fact that, where they come from, there’s an immense pressure to go to college after high school. They just want him to be happy.

Still, he seemed to understand jobs in the trades had lost clout in American society. As union membership has declined, old-school patronage has broken down, and tech companies have disrupted industry after industry, fixing escalators for a living is less sexy than ever.

“If you’re a doctor, people admire you and you have the glory,” he told me. “If you’re a construction worker, you may get paid the same as a doctor, but you don’t look as good.”

Meanwhile, Brigitte Barcos, the 17-year-old who stuck out her tongue in class, originally applied with her best friend to Queens Tech, where the two planned on studying cosmetology together. The friend didn’t get in, and Barcos didn’t end up liking the track she was on. Then she found herself unexpectedly passionate about tinkering with circuits, planning to pursue electrical engineering any way she could. For her, that didn’t necessarily mean college and a degree that might help her become a supervisor for other people getting their hands dirty, rather than dirtying her own. Besides, she hadn’t bought into the once ironclad notion that college was a path to financial solvency.

“I feel like everyone has the expectation that you have to go to college to get more money, and that’s a lie,” she told me. “You waste more money to go to college than you get out of it.”

The only problem, Barcos explained, was that her parents didn’t think trades like electrical installation were appropriate for women. That mind-set is one Abreu said he’d had to contend with over the years with relative frequency, though he’d also seen immigrant parents cheer on daughters with 95 percent averages who decided they wanted to help fix the crumbling transit system.

Seventeen-year-old Brigitte Barcos originally attended Queens Tech for cosmetology before switching to the electrical installation track.

More prevalent, Abreu told me, was the unshakable conviction that college was the only answer, something he disagreed with his mother about decades ago. At one point, he had enrolled in a traditional four-year college, only to back out early after getting an offer to come back and apprentice at Queens Tech as part of another vocational program the school offered. His mother thought he had made a huge mistake—until she saw his first paycheck.

So when coaching students who are committed to taking jobs “in industry” but facing off with reluctant parents, money often amounts to Abreu’s best bargaining chip. In fact, he said, his kids stop at nothing to get those high-paying jobs. Part of that seemed to stem from the fact that Queens Tech has tended to be surrounded by trade unions, and students can see the incredibly long lines of people just waiting for a chance to apply. With the certifications they obtain as part of their high school curriculum, they can acquire what amounts to an express pass to jobs that thousands of people are visibly desperate for. That changes your thinking.

After class finished up for the day, Abreu told me about a group of former students who were dead-set on becoming bridge painters—a profession that pays around $95 an hour, and therefore remains highly competitive, even for trade-school VIPs like them.

“I said, ‘You know why they pay that much money, right?’” he recalled. “It’s a dangerous job. But there they were, out there in line the midnight before, standing out there together, huddled in the cold, just waiting for an application.”

Follow Allie Conti on Twitter.

Originally Posted On: singularityhub.com

It’s common to hear phrases like ‘machine learning’ and ‘artificial intelligence’ and believe that somehow, someone has managed to replicate a human mind inside a computer. This, of course, is untrue—but part of the reason this idea is so pervasive is because the metaphor of human learning and intelligence has been quite useful in explaining machine learning and artificial intelligence.

Indeed, some AI researchers maintain a close link with the neuroscience community, and inspiration runs in both directions. But the metaphor can be a hindrance to people trying to explain machine learning to those less familiar with it. One of the biggest risks of conflating human and machine intelligence is that we start to hand over too much agency to machines. For those of us working with software, it’s essential that we remember the agency is human—it’s humans who build these systems, after all.

It’s worth unpacking the key differences between machine and human intelligence. While there are certainly similarities, it’s by looking at what makes them different that we can better grasp how artificial intelligence works, and how we can build and use it effectively.

Neural Networks

Central to the metaphor that links human and machine learning is the concept of a neural network. The biggest difference between a human brain and an artificial neural net is the sheer scale of the brain’s neural network. What’s crucial is that it’s not simply the number of neurons in the brain (which reach into the billions), but more precisely, the mind-boggling number of connections between them.

But the issue runs deeper than questions of scale. The human brain is qualitatively different from an artificial neural network for two other important reasons: the connections that power it are analog, not digital, and the neurons themselves aren’t uniform (as they are in an artificial neural network).

This is why the brain is such a complex thing. Even the most complex artificial neural network, while often difficult to interpret and unpack, has an underlying architecture and principles guiding it (this is what we’re trying to do, so let’s construct the network like this…).

Intricate as they may be, neural networks in AIs are engineered with a specific outcome in mind. The human mind, however, doesn’t have the same degree of intentionality in its engineering. Yes, it should help us do all the things we need to do to stay alive, but it also allows us to think critically and creatively in a way that doesn’t need to be programmed.

The Beautiful Simplicity of AI

The fact that artificial intelligence systems are so much simpler than the human brain is, ironically, what enables AIs to deal with far greater computational complexity than we can.

Artificial neural networks can hold much more information and data than the human brain, largely due to the type of data that is stored and processed in a neural network. It is discrete and specific, like an entry in an Excel spreadsheet.

In the human brain, data doesn’t have this same discrete quality. So while an artificial neural network can process very specific data at an incredible scale, it isn’t able to process information in the rich and multidimensional manner a human brain can. This is the key difference between an engineered system and the human mind.
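
As a purely hypothetical illustration of that discrete, spreadsheet-like quality (the tiny vocabulary and one-hot encoding below are an invented example, not drawn from the article), here is how a single word might look once it has been turned into data a network can store:

```python
import numpy as np

# Inside a network, a word is just a discrete row of numbers, much like a
# spreadsheet entry, with none of the rich context a human attaches to it.
vocabulary = ["hospital", "bank", "river"]

def one_hot(word: str) -> np.ndarray:
    vec = np.zeros(len(vocabulary))
    vec[vocabulary.index(word)] = 1.0
    return vec

print(one_hot("bank"))   # [0. 1. 0.]  (no hint of whether this means money or a riverbank)
```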

Despite years of research, the human mind still remains somewhat opaque. This is because the brain’s analog synaptic connections are far harder to map and decipher than the clean, digital connections within an artificial neural network.

Speed and Scale

Consider what this means in practice. The relative simplicity of an AI allows it to do a very complex task very well, and very quickly. A human brain simply can’t process data at scale and speed in the way AIs need to if they’re, say, translating speech to text, or processing a huge set of oncology reports.

Essential to the way AI works in both these contexts is that it breaks data and information down into tiny constituent parts. For example, it could break sounds down into phonetic text, which could then be translated into full sentences, or break images into pieces to understand the rules of how a huge set of them is composed.
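
A toy text pipeline, offered only as a rough sketch and not as how any particular speech-to-text or vision system actually works, shows this decomposition in miniature: the input is chopped into small, discrete units before any further processing, and the rule for what counts as a unit is written by a person.

```python
import re

def break_down(sentence: str) -> list[str]:
    # The rule for what counts as a "unit" (here, runs of lowercase word
    # characters) is decided by the engineer, not discovered by the system.
    return re.findall(r"[a-z']+", sentence.lower())

print(break_down("Machine learning breaks information into tiny constituent parts."))
# ['machine', 'learning', 'breaks', 'information', 'into', 'tiny', 'constituent', 'parts']
```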

Humans often do a similar thing, and this is the point at which machine learning is most like human learning; like algorithms, humans break data or information into smaller chunks in order to process it.

But there’s a reason for this similarity. This breakdown process is engineered into every neural network by a human engineer. What’s more, the way this process is designed depends on the problem at hand. How an artificial intelligence system breaks down a data set is its own way of ‘understanding’ it.

Even while running a highly complex algorithm unsupervised, the parameters of how an AI learns—how it breaks data down in order to process it—are always set from the start.
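
K-means clustering, a standard unsupervised algorithm used here only as an illustration, makes the point well: it finds structure on its own, yet the number of clusters, the distance measure, and the number of iterations are all fixed by a human before it ever sees the data. A rough sketch:

```python
import numpy as np

# Hypothetical illustration: even an "unsupervised" algorithm only learns
# within parameters a human fixed before it started.
N_CLUSTERS = 3       # decided up front by the engineer
N_ITERATIONS = 10    # decided up front by the engineer
rng = np.random.default_rng(1)

points = rng.standard_normal((200, 2))                        # toy 2-D data
centers = points[rng.choice(len(points), N_CLUSTERS, replace=False)]

for _ in range(N_ITERATIONS):
    # The distance measure (Euclidean) is another human choice.
    dists = np.linalg.norm(points[:, None, :] - centers[None, :, :], axis=2)
    labels = dists.argmin(axis=1)
    # Move each center to the mean of its assigned points (keep it if empty).
    centers = np.array([points[labels == k].mean(axis=0) if np.any(labels == k)
                        else centers[k] for k in range(N_CLUSTERS)])

print(centers)   # three cluster centers, because three is what we asked for
```

Change N_CLUSTERS and the result changes completely; the algorithm cannot decide for itself that three groups was the wrong question to ask.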

Human Intelligence: Defining Problems

Human intelligence doesn’t have this set of limitations, which is what makes us so much more effective at problem-solving. It’s the human ability to ‘create’ problems that makes us so good at solving them. There’s an element of contextual understanding and decision-making in the way humans approach problems.

AIs might be able to unpack problems or find new ways into them, but they can’t define the problem they’re trying to solve.

Algorithmic insensitivity has come into focus in recent years, with an increasing number of scandals around bias in AI systems. Of course, this is caused by the biases of those making the algorithms, but it underlines the point that algorithmic biases can only be identified by human intelligence.

Human and Artificial Intelligence Should Complement Each Other

We must remember that artificial intelligence and machine learning aren’t simply things that ‘exist’ and that we can no longer control. They are built, engineered, and designed by us. This mindset puts us in control of the future, and it makes the algorithms themselves seem all the more elegant and remarkable.

Originally Posted On: chronicle.com

Written By: Sara Goldrick-Rab and Michelle Miller-Adams

Separate reports released this week by respected Washington-based think tanks miss the mark on the free-college question. The Education Trust’s “A Promise Fulfilled” and the Institute for Higher Education Policy’s analyses of two statewide college Promise programs reach the same conclusion: Free college is failing low-income students.

We disagree.

These reports draw attention to one feature of today’s free-college programs — almost all of them award scholarships on a “last dollar” basis, meaning that need-based grant aid is applied to tuition before students can receive money from “free college” funding. In practice, this means most students who are eligible for Pell Grants and who are already enrolled in college do not receive any additional funds. These reports focus too narrowly on this single issue and in doing so overlook not only the many other ways that free-college programs benefit low-income students, but also their value to the larger project of making college affordable.

Both reports employ a narrow definition of equity, reflecting a traditional approach to assessing the impact of financial aid and a belief that money is well spent only if it goes solely to the lowest-income people. We share the commitment to ensuring that low-income students are supported but argue that this approach misses ways in which free college supports students far better than typical need-based aid does.

First, even when free college does not bring additional tuition dollars to low-income students, it helps many of them decide to go to college in the first place. Encouraged to enroll thanks to the clear messaging of free-college programs, they receive financial aid they would have missed out on by not enrolling.

Second, free college is partly about leaving behind that small American bureaucratic tragedy known as the Free Application for Federal Student Aid. Efforts to simplify the Fafsa form miss the core problem: Means testing, by its very nature, requires application and verification processes. Insisting that people prove they are poor in order to afford an education undermines the entire enterprise. Moreover, the effort to assess financial need will always be flawed.

Plenty of people have far more need than is revealed by the Fafsa analysis used to determine financial aid. Many students above the Pell Grant cutoff struggle with college costs; a family making $50,000 is too poor to afford college but is often too “rich” to qualify for a Pell Grant. The lack of college affordability for middle-class students helps explain downward mobility for the middle class, the looming student-loan debt problem, and a growing crisis of food and housing insecurity on college campuses. A college degree can be a ticket to a more secure future, but only if students can complete degrees free from crushing debt. Making middle-class people poorer hardly increases equity.

Free-college programs are wise to take a new approach to college affordability. They seek to support low-income students but also to actually transform the systems where those students are educated. By amplifying the message that going to college is possible and valuable, and by simplifying the financial-aid journey by clearly communicating that tuition costs will be covered (not just “affordable”), they inject elements of a college-going culture at the secondary- and even primary-school level, elicit new student-support resources from schools and community members, and create incentives for colleges and universities to better serve their students. They also emphasize the importance and value of public higher education, which is crucial at a time when it is under attack.

Stakeholders in free-college programs care tremendously about low-income students — so much so that many of them are willing to upend a system that has long failed those students. They acknowledge that college affordability for the working class and the middle class and the provision of an educated work force to meet the competitive challenges of the 21st century are key aspects of equity.

Supporters of free-college initiatives also have an advanced understanding of the political economy of social programs, including financial aid, and they recognize that narrowly targeted programs often lack political support. The politics of resentment are strong, especially these days, and free-college programs need middle-class beneficiaries and middle-class voters to sustain them. The consistently declining value of the Pell Grant over decades is strong evidence of the need to broaden the base of support for making college affordable.

All of these factors suggest that it is unwise to use a narrow metric to gauge whether low-income students benefit from free college, as both the IHEP and Ed Trust reports unfortunately do.

Of course, proponents of free college want to go beyond the status quo and shift to a first-dollar model — without an application process — that includes supplements for living expenses (via either grants or supportive programs). The reason this has not yet happened is simple: It’s expensive. The statewide free-college movement is not even a decade old. Even the most well-intentioned states cannot afford to take this approach without the help of the federal government, which provides most current financial aid. Without the ability to repurpose the dollars now flowing into Pell Grants, or major increases in taxes, or uncommon generosity by major private funders, free-college programs will remain last-dollar propositions. So the federal government must help.

But in the meantime, free-college programs are helping low-income students. It is unproductive and unhelpful to those students to stunt the progress of this movement. It is also dangerous to argue against the very real needs of a middle class dominated by asset-limited, income-constrained families. We do not live in a perfect world, so we should welcome the paradigm shift and the very real incremental gains that free-college programs represent. This is not the end of policy innovation but rather the beginning of significant progress. It would be a shame for equity-minded people to take the wind out of the free-college movement’s sails.

Sara Goldrick-Rab is a professor of higher-education policy and sociology at Temple University and the author of Paying the Price: College Costs, Financial Aid, and the Betrayal of the American Dream (University of Chicago Press, 2016). Michelle Miller-Adams is a professor of political science at Grand Valley State University and author of Promise Nation: Transforming Communities Through Place-Based Scholarships (W.E. Upjohn Institute for Employment Research, 2015).

FOR IMMEDIATE RELEASE

Contact: Steve Wright
Information Communication Technologies-Digital Media Sector Navigator
California Community Colleges
[email protected]

ROCKLIN, Calif. — There’s never been a better time to enter the IT workforce, as thousands of high-paying jobs remain unfilled across California. A new initiative at the California Community Colleges is making it easier than ever for people with little or no technical experience to find a pathway toward one of those jobs in just a few months.

The IT Technician Pathway, offered at 22 California community colleges, is a series of four sets of courses designed to take students from computer sales to help desk support to more specialized fields like networking and cybersecurity. Each group of courses in the pathway corresponds to industry certifications that are essential for employment in any IT job.

The pathway is divided into the following segments:

  • Phase one: Computer retail sales
  • Phase two: Help desk/user support
  • Phase three: IT technician
  • Phase four: Cybersecurity or networking specialization

The pathway is offered as part of the Information Communication Technologies and Digital Media (ICT-DM) sector in the Doing What Matters for Jobs and the Economy — Strong Workforce Program.

Shawn Monsen, a faculty member at Sierra College and ICT-DM product development lead, is working to align the pathway’s recommended courses with four-year colleges so that students can land a good job right away and build the foundation to earn a bachelor’s degree and increase their earning potential even more.

“This program provides students with a path to gain industry certifications to get better paying skilled jobs,” Monsen said.

Articulation pathways have been established between the California Community Colleges and National University, allowing courses taken in the pathway to be used toward NU’s Cybersecurity and IT Management bachelor’s degrees. In the end, the more universities that offer these degrees and provide articulation pathways that lead to them, the better positioned California will be to meet its current and future IT needs.

“The industry has a desperate need for these skilled workers,” Monsen said. “The pathway provides a means for students to get those skills, earn those industry certifications and move into those jobs.”

The IT Technician Pathway also aligns with efforts to increase cybersecurity education at the K-12 level through CyberPatriot and other cyber competitions. These events bring students from all walks of life together to learn how to keep networks safe against cyber threats.

Middle and high school students participating in cyber competitions already have many of the foundational skills needed for the IT Technician Pathway and can advance through it to a high-paying job even faster.

The California Cyberhub coordinates cybersecurity education efforts across the state and is a key partner in the IT Technician Pathway, particularly the cybersecurity specialization.

While the impact on students is immense, it’s not the only benefit to utilizing the pathway model for IT education. By forging partnerships between community colleges and four-year universities, California is positioning itself as a leader in technology education and creating a model that can be implemented nationwide.

“Over 27,000 students take one or more IT courses at the California Community Colleges per year. With 64 Cisco Academies, 24/7 online computer labs, and over 330 IT Faculty — 70 percent with master’s degrees — they are the best kept secret in the cybersecurity solution,” said Information Communication Technologies-Digital Media Sector Navigator Steve Wright. “The IT Technician Pathway is a uniform statewide guided pathway for entry level and advanced upskilling workers. Articulation to a four year degree completes the journey to a professional education and better wages.”

For more information on the IT Technician Pathway, visit ict-dm.net.

About Doing What Matters for Jobs and the Economy – Strong Workforce Program

Doing What MATTERS for jobs and the economy is a four-pronged framework to respond to the call of our nation, state, and regions to close the skills gap. The four prongs are: Give Priority for Jobs and the Economy » Make Room for Jobs and the Economy » Promote Student Success » Innovate for Jobs and the Economy.

The goals of Doing What Matters for Jobs and the Economy are to supply in-demand skills for employers, create relevant career pathways and stackable credentials, promote student success, and get Californians into open jobs.

In a recent Chronicle Review essay with the clickbait headline (which the authors did not write) “Why the University’s Insatiable Appetite Will Be Its Undoing,” Adam Daniel and Chad Wellmon, respectively an administrator and a professor at the University of Virginia, argue that the university should be more focused on what it does best — teaching and research — and less responsive to broad social pressures: “To save itself and to better serve its democratic purpose, the university needs to be not more but less reactive to public demands.”

There are serious problems with arguments like this, much in the air right now, that blame universities for everything: overbuilding, high tuition, teaching too many subjects, incurring too much debt. Universities, according to Daniel and Wellmon, are simply doing too much all around.

Maybe that’s true for UVa, though I suspect not. It is certainly not true for the majority of the universities in the United States facing serious economic problems, problems which are not of their own making.

Assaults like Daniel and Wellmon’s are worryingly short on specifics, and therefore leave us with few means for finding a constructive solution. Instead, they all too readily echo the drumbeat — most common in conservative circles but not only there — that higher education costs too much and doesn’t do its job. I agree that tuition is too high at many of our universities, and I am an ardent champion for higher-education redesign that better supports our students in a complex world. However, if we do not take seriously the reasons we are in the state we are in right now, we will come up with more spurious and wrong-headed “solutions” that exacerbate rather than remedy the problems in higher education today.

“Always historicize!” isn’t a bad idea if you are looking to find a solution to a problem, rather than a scapegoat. What follows are some key assumptions made by policy makers and the public over the course of the last several decades, beginning with the reversal of the post-World War II investment in U.S. higher education during the governorship and then the presidency of Ronald Reagan. Each of these arguments for educational reform and retrenchment has contributed to the current crisis.

No. 1: “Higher ed should be run like businesses.” Colleges and universities, the thinking goes, need to be entrepreneurial. They need to hire CEOs as presidents, and their boards should be composed of business people. As a consequence, universities end up pursuing big grants and big donors. We know this favors science. We also know, from Christopher Newfield’s work, that it incurs long-term costs — buildings, labs, staff — that persist after the initial massive investment and after the granting organizations or the private donors have moved on to other interests. And it leads to the escalation of administrative salaries, with universities competing with corporations for college presidencies. The move to external funding also requires increased administrative staff (not bloat) to manage the complexities of budget, intellectual property and copyright agreements, income and profit sharing, and many other contingencies.

No. 2: “The public should not need to fund higher education. Higher education should fund itself.” In recent years, we have witnessed massive state cutbacks to higher ed, resulting in a roughly 20-percent-to-50-percent per capita reduction in public subsidy in some states. So tuition rises. Some states, such as Colorado, now subsidize under 5 percent of university costs. The rest comes from private or public funding sources (such as Pell Grants or grants from government research agencies) or, tragically, higher tuition.

No. 3: “Higher ed doesn’t really train students for the future. It’s out of date.” Increasing numbers of Americans think higher education is no longer worth it (although, given the growth in Kaplan-style SAT cram schools and the escalation of applications to elite colleges and universities, it is clear that the affluent are still working to ensure that their own kids go to college). However, many of the attempts to bring college “up to date” are badly misinformed wastes — for instance, MOOCs, which certainly won’t do the trick. They enrich technology entrepreneurs without improving the quality of learning. Lots of bad policy is justified by this one, perhaps most notoriously the California State University system’s hasty 2013 implementation (for an undisclosed sum) of the for-profit Udacity online courses in remedial math, algebra, and statistics at San Jose State. Supposedly these online courses were going to outperform actual classroom teachers. The retreat from that program in the face of poor results was as rapid as its adoption, yet the clarion call for “technology” to solve educational woes remains.

No. 4: “Higher ed costs too much.” It absolutely does. You now need to be rich to afford many universities. But there is huge variation. Community college is still relatively inexpensive — but also lacks the resources to expend on those students facing the biggest challenges. Belt-tightening is hardly necessary in university and community-college systems where costs are already low and resources very scarce, where faculty with full-time jobs teach heavy loads, and where well over half of courses are taught by underpaid adjunct professors with no benefits or security. Belt-tightening? At many public universities (and private too), students are facing food insecurity. And so are adjunct faculty. Institutions are impoverished. They have been robbed.

No. 5: “Make international student visas more difficult to attain.” The recent rise in xenophobia and difficulties in obtaining student visas have led to a diminishing number of students from all around the world coming to the U.S. American higher ed is valued everywhere, and we used to have the international student body to prove it. After a decade of inviting international students (for cultural, social, intellectual, and, one must acknowledge, financial reasons), now such students are going to … Canada. Universities are feeling the effects everywhere, and so will our labor force.

Higher ed needs to change. But accusing it of insatiability will only justify more damaging cutbacks. Where will those be made? Who will make them? And will students and faculty, knowledge and teaching and research, be the winners? Or will this end up being another blame-the-victim assault on higher ed? If we aren’t sufficiently explicit about the pressures that have brought us to this juncture, we undermine any chance for sane, reasoned, innovative reform.

Cathy N. Davidson is a professor of English at the Graduate Center of the City University of New York and the author of The New Education: How To Revolutionize the University To Prepare Students for a World in Flux (Basic Books, 2017).

Originally Posted On: chronicle.com

Last spring The Chronicle published an advice column on “5 Big-Picture Mistakes New Ph.D.s Make on the Job Market.” Allow me to add No. 6: They don’t consider applying to community colleges.

In that column, the academic-career consultant Karen Kelsky did mention community colleges, noting that different types of institutions “each have their own niche and their own functions.” My goal here, as a faculty member who has worked in the two-year sector for 31 years, is to clarify the niche and function of two-year institutions and offer some reasons why new or newish Ph.D.s should include us in their job search.

After all — let’s be honest — there are compelling reasons not to consider community colleges, from the perspective of a typical doctoral student.

Take the prestige factor, for instance. Most graduate students naturally want to work at the most prestigious institution that will hire them — and their adviser cheers them on in that pursuit. Many advisers go so far as to actively discourage their Ph.D.s from even thinking about applying to a community college, since our institutions are perceived as the red-headed stepchildren of higher education.

Where those graduate students (and their advisers) often err, however, is in their unrealistic expectations about the hiring market.

For about 12 years now, I have been brought in by research universities all over the country to meet with their graduate students and discuss career options at two-year colleges. Interestingly, I’ve noticed a clear correlation between the prestige of the institution and the number of students who attend my talk — but it’s not what you might think. Paradoxically, perhaps, the higher the university is ranked, the larger my crowd is likely to be. At Near Ivy U., I might draw 60 or 70 students (I’ve had as many as 120). It’s as if those students understand that the job market is tight, and they (wisely) want to cover all their bases.

Meanwhile, at Directional State U., I might have only five or six graduate students show up. That’s what I mean by unrealistic expectations. Where do those students think they are going to get teaching jobs? With a doctorate from a regional public university, the very best they can hope for is a teaching job at a similar institution — and even that will be challenging for them, as the professors at many regional universities earned Ph.D.s from large public or private R1s. That leaves small teaching-oriented institutions and community colleges as the places most likely to hire Ph.D.s from Directional State.

A more likely scenario, these days, is for a new Ph.D. to strike out on the full-time market and instead spend a few years working as an adjunct — probably at a community college. Refusing to entertain the possibility of applying to community colleges only increases that likelihood. Having eliminated from consideration over 30 percent of the U.S. higher-education sector, such candidates have decreased their chances of finding a full-time faculty position by roughly the same margin.

Another reason many graduate students don’t seriously consider community colleges is that they don’t know anything about us. Nor, typically, do their advisers. That’s why I write these career columns, and it’s also why I’m invited to speak, usually by graduate advisers who want their students to know all of their options. Occasionally, the invitation comes from a career center or a graduate-student association, essentially going over the heads of faculty advisers. (Although you might be surprised at the number of career-center directors and graduate-school deans I talk with whose universities provide absolutely zero career programming for graduate students. Then again, maybe you wouldn’t.)

So those are the main reasons why graduate students don’t apply to two-year colleges. Here’s why advisers should be urging Ph.D.s to include us in the mix:

There are a lot of us. Everyone has a dream job. (Mine is writing headlines for The Onion.) But somewhere between that pleasant daydream and the unemployment line is this thing known as “a job.” Graduate students would do well to focus more on the latter than the former.

It isn’t necessarily easier to get a full-time job at a two-year college than at a four-year institution — it’s just different. But arbitrarily ruling out more than 1,200 potential employers does not strike me as a winning job-search strategy.

Adjuncting counts at our campuses. One of the five “mistakes” that Kelsky’s column cited was believing “adjuncting is a way to get your foot in the door.” No doubt that is true at research institutions and other four-year colleges, but it is definitely false at community colleges. As I’ve noted before, adjuncting is not only a good way to get hired full-time at a community college, it may be the best way, if you don’t get a full-time offer right out of the gate.

Just as research universities are looking for certain qualities in a candidate, so are we. The one we particularly look for: teaching experience. It’s very difficult to get hired full-time at a two-year college without at least a couple years of classroom experience.

Adjuncting is one way to get that experience. And in community-college circles, it’s considered perfectly legit. There’s no stigma. We hire many of our own adjuncts for full-time positions, and many of our full-time faculty members started out as adjuncts. So if you apply to community colleges and don’t get hired on the first go-round, take heart. There’s a decent chance you can adjunct your way into a full-time position, eventually.

You don’t need a Ph.D. to teach here. Accrediting guidelines for faculty members at two-year colleges require only a master’s degree with 18 graduate semester hours in the discipline. While community colleges are hiring more Ph.D.s than we used to, we still hire plenty of people whose terminal graduate degree is a master’s.

Does having a Ph.D. give someone an advantage in our hiring process? Probably. All things being equal between two candidates, the one with the Ph.D. would usually win out. But all things are rarely equal. Most community-college search committees are looking for the best classroom teacher they can find, and that may or may not be the person with the Ph.D.

The pay and benefits are decent. In most states, full-time faculty members at two-year colleges make a little less than our counterparts at regional universities and a little more than high-school teachers — although salaries can vary widely from place to place. Still, our starting salaries aren’t bad — usually in the range of $45,000 to $55,000, depending on the state. Other factors that may affect that number include terminal degree and years of teaching experience.

Our benefits, however — including health insurance and retirement plans — are typically the same as what university professors receive. And in every state I’ve ever worked in (five, so far), those benefits have been excellent.

The work here is less stressful. If you love teaching, that is. A career at a two-year college certainly seems far less anxiety-producing than what I hear and read about faculty life at research universities.

Not all two-year colleges offer tenure (or as it’s called on some campuses, “continuing contract”), but most do. And of those, most require only three to five years of acceptable performance in teaching, service, and professional development in order to earn tenure. At all but a handful of community colleges nationwide, you don’t have to publish a thing to get tenure. Just do a good job in the classroom, serve on a few committees (well, maybe more than a few, at first), and attend a conference or workshop when you can, and you’ll be fine.

Sure, you can stress if you want to — by expanding your workload to include pursuing a research agenda, trying to write books or articles, getting into administration. But at a community college, you can also choose a quiet, relatively stress-free life of teaching the subject you love. That’s a luxury most of our four-year colleagues don’t have.

It’s a rewarding career. Remember what I said above about us being the red-headed stepchildren of higher education? Well, that reputation doesn’t bother most of us because, by and large, we love our jobs. They’re not perfect. No job is (except maybe writing headlines for The Onion). But teaching at a community college is both professionally rewarding and psychologically fulfilling. What more can anyone ask of a career?

Yes, we teach a lot — usually five classes a semester. But that’s OK, because we like teaching. No, we don’t get a lot of time or money to write or pursue our research or attend conferences — but we can usually find the resources to do those things if we really want to.

Above all, we love our students, who as a rule are bright, eager, and the opposite of entitled. We have our share of slackers, to be sure, but most of our students are happy to be there. They feel as though they’ve been given a valuable opportunity, which of course they have. But so have we — the opportunity to teach young (and not-so-young) people who, out of all the postsecondary students in the country, probably need us most.

In the end, most of us who teach at community colleges would not trade places with our colleagues at research universities for anything, prestige be damned. Call it the Revenge of the Red-Headed Stepchildren.

Rob Jenkins is an associate professor of English at Georgia State University’s Perimeter College. He writes regularly for The Chronicle’s community-college column. The opinions expressed here are his own and not necessarily those of his employer. You can follow Rob on Twitter @HigherEdSpeak.

Recognizing Everyone As A Student For Life