Why Higher Ed Fails at Job Preparation--and What We Can Do About It
(This is reblogged from nowyouseeit.net: with the usual apologies for loss of apostrophes and dashes that come from reposting from Wordpress . . . )
Every industry study I have read about new college graduates begins with a rant: they have no skills that prepare them for the competitive, global job market of today. Everyone acknowledges that the issue isn't a college degree itself. A college grad today earns 65% more than someone with a high school diploma, and a master's degree increases that to 105%. But whether it is a study conducted by the Industry Education Council of the UK or a commercial for-profit educational provider, everyone says students today lack people skills, communication skills, collaborative skills, analytical skills, networking skills, and even the most elementary skills of how to write a great resume and cover letter and how to interview well.
My office at Duke happens to be right under the Career Center. All day long I see a parade of attractive college seniors in beautifully tailored business suits march in and out of that office for career counseling. Do you mean to tell me that we are sending them out into the business world with nothing but an ability to dress for success--and maybe $100K+ in college loans? How did it come to this? And what can we do about it?
First, let's back up and understand the problem. We all think we know what work is. We all think we know what school is. What we really know is how work and school have been defined for the last 150 years. Before industrialization, before the assembly line, before Taylorization, there was no school bell sending everyone into the classroom at the same time, no dividing up of the day into set subjects, no putting every student in a row, no classes arranged by age rather than by maturity or preparation. If you are going to train a labor force for mass production, the key words are efficiency, uniformity, timeliness, and standardization. That is true whether you are a worker on the line, the foreman running the line, the manager supervising the plant, the designers creating the blueprint for the object being produced, the industrial designer figuring out how to improve the product for the next round, the sales force selling the product, or everyone over in that white-collar office building whose job it is to measure outputs, supervise sales, distribute products, manage the whole operation, or report on the bottom line.
Whether over in the factory or in the corner office on the top floor of the skyscraper, industrial-age business was arranged hierarchically, with someone in charge, and then managers passing on those instructions according to an organizational chart that made clear who, exactly, was reporting to whom: everyone in 20th-century business knows who is "the boss of me." Training a worker for a place in a complex operation for the last one hundred years has meant knowing one's place and contributing one's productivity within a hierarchical, vertical management system that depends upon specialization, expertise, and coming up with the right metrics for determining success, with Human Resources (HR) departments systematizing all the variant human outcomes within complex organizations.
Everything about higher education, from the late 19th century forward, has been reorganized to meet that need. That is why we have (this is a condensed list): departments, disciplines, different professional degrees (such as the MBA, etc.), majors, minors, certification, professional schools, business schools, graduate school, degree requirements, required courses, electives, distribution requirements, statistics, standard deviation, spreadsheets, blueprints, I.Q. tests, multiple choice tests, learning disabilities, and just about anything else that divides and measures this kind of knowledge from that kind of knowledge, this kind of learner from that kind of learner. If you go back to Harvard in 1874, you find chemistry professor Josiah Cooke still asserting that "all truth is one and inseparable" and advocating for the "unity of truth" as the motivating principle of higher education. Let's cut to the chase: he lost that battle. Big time.
The twentieth century might well be defined as the century of standardization, efficiency, specialization, and certification. School and work, as we have inherited the terms, are all about standardization and about division of knowledge and certification of attainment of expertise in one area of knowledge. But that is not the world our children are inheriting in the 21st century. By one estimate, 65% of the jobs that will be available upon college graduation for students now entering high school (that's eight years from now) do not yet exist. So we are preparing them as experts and as specialists for jobs we can't even yet imagine? Are we making them experts in obsolescence?
Doing well on standardized tests of existing knowledge is not a job skill; it is a checklist of facts, many of which are fast becoming outmoded (or rendered irrelevant by Google). Standardized testing is not the apparatus for assessing logical abilities, inference, imagination, creativity, problem solving, project management, collaborative or communication skills, or the ability to retool oneself in the face of enormous changes in one's chosen profession or field. Facts. That is what standardized tests test. Or answers. Calculations. Those things may or may not be relevant to the future ways of working successfully together. Yet that skill at item-response testing is what the 20th century calls standards. Success at that form of standardized knowledge is what is necessary to get into college today. But doing well on standardized tests is not the skill you need to succeed in the workplace today. We have a mismatch. We are doing a good job training students for the twentieth century.
What changed everything for the students of today? Two things: (1) the Internet. (2) the World Wide Web. The reason Thomas Friedman's The World Is Flat made such a huge impact is that he does a brilliant job of explaining, in clear, succinct, and concrete ways, how the end-to-end principle of the Internet and the WWW has reorganized life in the 21st century not as a vertical hierarchy but as a flat horizontal plane. That doesn't mean everything and everyone is equal, but it does mean that the assembly line and standardization and all those metrics of the early 20th century now describe a model of communication and productivity that increasingly does not exist. With the Internet and the World Wide Web, no one is at the controls saying which information will go where. All information is bundled at the end point (my computer or, at most, my server) and then capable of being captured by any other end point (your computer), without a broadcaster, a publisher, an editor, a manager, a company, a foreman, or a CEO. This end-to-end principle requires collaborative skills, judgment and logical skills, synthesizing and analytical abilities, critical and creative skills, qualitative and quantitative skills, all together, with few lines between them. This end-to-end principle of the Internet and the World Wide Web has an impact on how we work, on how we communicate, on how we interact, on how we gather as citizens, on how we gather as global observers, on how we organize and how we disrupt organizations, on levels small or large.
Just as the assembly line rearranged everything about work and school in the 20th century, the Internet and the World Wide Web are rearranging our lives in the 21st century. We are about fifteen years into the commercialization of the Web. It was about fifteen years after Taylor that educators began to reshape the university into disciplines, departments, and so forth. So we are right on time for a major reorganization of the contemporary university.
Because of the financial crisis, a lot of idiotic things are happening now, such as cutting back the humanities (in the UK), or trying to eliminate tenure, or charging more and more for tuition, or, equally inane, thinking that online courses can substitute for higher education. (Where, exactly, will students learn those invaluable interpersonal job skills from online courses, especially if they are conceived as a revenue-producer, not as interactive, challenging, collaborative enhancements?) The college dropout rate is soaring, and so is the debt of those who do graduate from college. For-profits are doing (for the most part) an excellent job of training their best students--but it is well known that for-profits could not be profitable if it weren't for the fact that taxpayer-supported government college loans can be applied to for-profit tuition (what?!) and that the very high dropout rate from the grueling pace at most for-profit universities (a four-year degree often crammed into two years) means that a lot of the profit margin comes from dropouts paying large fees for un-rendered services. That spillage is built into the business model. And that is corrupt. Yet our early 20th-century, over-specialized colleges and research universities are arranged, de facto, to train students either for graduate school or for a workplace that no longer exists.
We have a problem on our hands. What can we do about it?
There is a lot we can do, and my colleagues around the globe and I have been working for the last decade toward new forms of digital learning, participatory learning, and collaborative learning that are better suited to 21st-century needs. I'm not pretending to have solved all of the problems--hardly! It is easier to define a problem than to fix one. But that's a major reason why we created the HASTAC network of networks back in 2002--to begin exploring some solutions together, many of which are being tried all over, in different contexts.
I'm also working on a far more local level. In December of 2009, a dean of the graduate school at Duke approached me and my HASTAC colleagues about proposing a Master's degree that would address these issues in a small, local way. It was basically a "put your energy where your mouth is / stop griping and actually do something" challenge. Run a program? I was the R&D person at Duke (Vice Provost for Interdisciplinary Studies) for eight years and quite some time ago decided that becoming a university provost or president was not a career path I wanted to try. I had the best administrative job in the world, I like to say--and all others seem dull or restrictive by comparison. So it was counter-intuitive that I would jump at the chance to develop and direct a new Master's program. But no one had ever said to me before, "If you could come up with a program that, in a very local and modest fashion, would address these enormous issues you and HASTAC keep pointing to, what might that look like?" Before I knew it, I was working with a lot of other people to think about ... what that might look like! It's not a "career path" I would ever have thought I'd take but, lo and behold, fifteen months later, I'm going through the various committees at Duke to see if we can't try something that doesn't look very much like any other Master's that's out there.
What we have come up with is hardly perfect. In the real world of real programs, nothing ever is. But it's pretty interesting and we hope it might be a prototype for other programs at other institutions. To our knowledge, it's the first Master's program at a research university to move across human, social, and natural sciences; to combine qualitative and quantitative learning; to merge the research Master's with the professional Master's degree; to require both deep theoretical and historical thinking and practical, business, applied, management experience and training.
Our tentative title is the Masters in Knowledge and Networks (MKN). You can read the abstract for it here: http://www.hastac.org/blogs/cathy-davidson/updated-masters-knowledge-and... or http://tinyurl.com/49h7ko3 And that blog sends you to a link for the entire proposal, hundreds of pages, market surveys, and so forth.
This prototype program has a few basic principles that could be extended to other institutions, other settings, other programs:
(1) If the human and interpretive social sciences are where students learn the skills that contemporary employers say they need and that are lacking in contemporary college graduates (reading and writing skills, interpersonal skills, communication skills), why are these fields marginalized in most higher education and considered to produce students who are unemployable? If the human and social sciences are where students learn to think about a global community, consequences of social actions, and the ramifications of injustice and inequality in interconnected, interactive, global terms, how can these not be essential job skills (not to mention essential human skills) for a global, distributed, interdependent world?
(2) If more and more of the giants of the information industry--such as Google Books and Google and Facebook and Twitter and everything else about social media--profit by data mining information that we contribute, voluntarily and without remuneration, to them, why aren't information-era analytics required of all applicants for jobs? Data mining and assessment skills are essential tools in any job, but the field is too new. In statistics and decision sciences, the form of data mining tends to be pre-professional and not coupled to the information analytics that companies use daily. In the human sciences, analytical skills are, currently, considered other than the skills you need to read, write, and communicate. What? Talk about old school! That division of communication skills and analytical skills is so twentieth century.
(3) If change is the byword of our era, shouldn't we be studying how change happens, so we know which changes are significant and which are merely a different flavor? Philosopher George Santayana famously said that those who cannot remember the past are condemned to repeat it. Those who do not understand how historical change and historical process work are condemned to feel that every change is a crisis, not a process. Understanding the process can turn a crisis into an opportunity.
(4) If all of the major modes of communication and interaction are going through a paradigmatic shift, shouldn't we understand a lot about the history of reading, writing, and multimedia communication in order to have a better sense of the relationship of tools and pipes (infrastructure and content) and be able to make more than a wild, uninformed guess about what might be shaping our future?
(5) And once we have all those good, deep, thoughtful, and useful tools, shouldn't we have some mentored, careful experience in applying them in a real-world situation before we throw ourselves into the job market? In the MKN, teams of students go into corporations headquartered here, local businesses, NGOs, nonprofits, learning organizations, and arts and civic institutions; learn the culture; apply the management principles they have been learning for a year in a proseminar; and then, with the mentor from the business or organization, single out a communications, networking, and information-era problem that the employer/mentor knows is necessary for success but has not yet had time to focus on. (I have not yet met a single employer, large or small, who doesn't have a list of these.) The MKN outsources their risk, making development of the technology and networking system the responsibility of these Master's students, who keep an accounting of all their time and a paper trail of their workflow so they can present not only a solution but also a sustainable workplan for the organization to follow. Each week, they continue the proseminar (all the second-year students in their residencies, meeting together, with mentors and experts, as well as with the new first-year students still in their coursework), share their frustrations and successes, and work towards a win. If they fail, the company does not suffer; it simply doesn't gain. Nothing to lose. And win or lose, the students gain the invaluable work experience, carefully mentored, that they can bring to their first paying job. Outsourced risk, incredible opportunity. And all firmly based in theoretically, historically, and analytically (qualitative and quantitative) grounded knowledge. (Do I hear Josiah Cooke? All truth is one and inseparable--and distributed, collaborative, networked, and interactive.)
(6) And, one more: once we can think deeply, once we have polished up our technology skills, once we have good analytic skills to decipher the numbers Google and others provide for us (and use for or against us), once we have collaborative and project-management skills that have been tested in a sustained and thorough way in a real organization, shouldn't we be able to translate all of that into an original, urgent, compelling job letter as well as into an interview, since we will know more about what a future employer needs than that employer might even know? The employer will recognize the crisis. Our MKN graduates should know what things to think about and what to do so they can help prevent the next one.
Read about this Master's in Knowledge and Networks (proposed name--it yet may change). And let me know what you think. Higher education today fails at job preparation. This Master's degree we're proposing doesn't solve all the problems. No program can. But this one has a hidden advantage: instead of being created by some specialized committee, it evolved online, on an open CommentPress site that was up and public for a year, receiving comments. It was presented at dozens of public forums as well as on the HASTAC website. About 15,000 people visited the various sites, and there were about 300 substantive comments--on the CommentPress site, on index cards at town halls, on blogs, at conferences--and we shaped this Master's to those good ideas, supplied mostly by HASTACers (there are about 5,500 of us now) who are convinced that we need to rethink school for work in the 21st century.
This is only one small Master's program, but it's a start, thanks to the hundreds of people who have contributed ideas along the way. We have a long, long way to go. But if the industrial age redesigned education for the twentieth century, in 2011 we are right on time to begin redesigning higher education for the 21st.