I’ve been meaning to start a blog for years. It was going to be a gift to myself (so to speak) when I was tenured and promoted (not because I was afraid of speaking out; for those who know me, that is a completely moot and almost ridiculous point), but one thing led to another and another, and I just never got around to putting all the pieces together. But lately, I’ve decided I couldn’t keep putting off taking care of my digital identity, and with that comes finally doing this whole blog thing.
Recently, I’ve written some blog posts for a couple of groups that I am part of: #womeninTC and the rhetoricians of health and medicine. I’ve found that this sort of writing helps me clear space in my head for the heavy scholarly writing that I need to get done. So I’m happy to be starting to write something here.
I recently wrote a post about how different parts of our lives often intersect. This happens a lot with social media when you have your personal friends and colleagues all following you. In my case, the different dimensions of my work often intersect in interesting ways. For example, the work I do around mentoring as part of #womeninTC definitely applies to other work that I do in the #medrhet un-organization or in working as President of the Council of Programs in Technical and Scientific Communication or in sitting on a University committee on professional development opportunities for faculty around research.
But I kind of digress…the intersections I want to talk about here are those between the practitioner world of technical communication and the academic world of technical communication, specifically around issues of research. Tom Johnson (I’d Rather Be Writing) recently wrote a post on the divide between practitioners and academics. Some of what he says is spot on. Some of it is not, due in large part to a lack of understanding about what it is that we do in academia. Over on the Facebook group women in tech comm (which is different from the #womeninTC mentioned above), there was a discussion a couple of weeks back (started, I think, by a question about a conference) that also seemed to indicate a lack of understanding about the types and kinds of research we do in academia. I’m going to address some of these issues in my ongoing efforts to try and bridge this gap.
Types of Institutions
It’s important to have a general sense of institution types because they are a big driver of the expectations and job requirements for university professors. I work at the University of Cincinnati, whose Carnegie Classification (the standard for categorizing and comparing institutions within higher education) is research university with very high research activity. This means my institution’s mission is primarily a research and knowledge-making enterprise. (That’s a kind of broad brush, but for our purposes here let’s roll with it.)
As that mission trickles down to the faculty level, it means that the bulk of my job is research. I teach 2 courses a term, and then I am required to do service to the department, the university, and the profession. (These are the research, teaching, and service categories you may have heard about.) Right down the road from me is Northern Kentucky University, which is designated as a Master’s large school, meaning it is larger than other institutions of its kind and awards degrees up through the master’s level. Folks who teach there teach 4 courses a term. In higher education lingo, we would call that a “teaching school” because the bulk of their job is teaching, with research most likely coming behind service as well. Like many of you in the workplace, we have contracts, and we also have institutional documents that guide our “workload” and our “reappointment, tenure and promotion.” All of these documents lay out the expectations of our work. If something isn’t specifically listed, it’s likely not going to be highly valued.
One of the points to walk away with is that you can’t just lump all academics together, because we have very different jobs with very different expectations. It’s no different than two practicing technical communicators who have very different jobs and very different specialties.
Understanding Research and the Field
Non-academics rarely understand the competing pressures that impact research study design, data collection, and ultimately the dissemination of findings. What this means is that an academic’s research is usually driven by their own interests (which may not align with a specific workplace problem), the time they have, the requirements of their job, their own initiative (because yes, we do bad research), resources, and access (e.g., to specific populations or specialized software/hardware). Those are just the big things.
And for work to count in the field (that is, to be given credit at annual review time, which works much like it does in the workplace), it has to be published in specific places that are highly regarded by others in the field. That, too, adds limitations or conditions on what the final research project might look like. For example, I know that some of my research with a more applied angle needs to be sent to Technical Communication (and the cover on that link is from my award-winning article!). If I have some empirical data, then IEEE PCS would be a good place to send it. All of it has to be peer reviewed, which means two other people will read it and provide comments on it. For it to be published, those two people and the editor have to agree the idea is worthy. Often, authors are asked to “revise and resubmit,” and the revisions are based on the reviewer comments. This whole process of submission and review can take as little as 6 months and often much longer.
Understanding in big general terms how things work will make bridging the divide a little easier.
So, for example, again going back to a comment on Tom’s blog, Mark Baker threw out a hypothesis (and rightly claims it’s based on nothing but his own view) that practitioners with college tech comm degrees leave the field within five years because of a disconnect between what they were taught and what actually happens in the workplace. I’ve never met Mark, but if I did, I would extend my hand, introduce myself, and politely say that in one way his hypothesis does not hold water. Because guess what? I do have data. I know, at the master’s degree level, that it is not true.
But Mark’s hypothesis does open up a number of important points we need to talk about. First, when you talk about degree programs, you need to be specific because there are lots of different kinds (and going through all of that here would bog us down; see the Intercom issue from the summer of 2012, which highlights some of these differences, and look for a piece sometime next year in Technical Communication, provided peer review goes all right :-)). My data on what folks are doing with their degrees is at the master’s level. To my knowledge, we don’t have this kind of data (unless it’s locally housed) at the undergraduate level. It would be a great project for an academic, as Mark points out, but I’m not certain that Mark or any practitioner fully understands what that project would entail.
Just like most practitioners, academics are busy people. We see students come through our classes; they eventually graduate (hopefully), and they go on to lead successful lives. While they are still in their programs, we mostly know how to reach them through their university email addresses, but after they graduate, most stop using those addresses (and many institutions disable them). So off they go, and we get a whole new set of faces.
It’s only been in the last few years that departments were expected to try and keep up with their alumni. Before the bottom fell out of higher education in 2008, alumni relations was mostly handled at the institutional level. But since then, departments are now courting alums and hoping those alums give to the institution AND mark any donation for the department. But to the point about research, you can’t find out about alumni unless you know how to reach them. At my institution, the University of Cincinnati, we initiated an undergraduate degree in 2008. Those first few years we were just trying to figure things out and I can assure you we have NO data on those students unless they’ve contacted us since.
So if I wanted to find out whether they (1) got a job in the field and (2) are still working in the field (which would answer Mark’s question), I would have to go to my “institutional research” division and request data on graduates from the start of our program until, let’s say, 2013 (5 years’ worth of data). Institutional research may (or may not) have a complete set, because that depends on folks in the registrar’s office having entered information correctly and on the numerous system upgrades having kept all the data compatible. I can tell you that my data set would not be complete and might even include students who never took a professional writing course. And this request only goes smoothly IF “institutional research” has an open access policy for faculty. Otherwise, I’m asking my department head (think immediate supervisor) to approve it, and then he’ll ask the Dean (think middle manager), who will then have someone in his office find out whether his signature is sufficient or whether it has to go up one more rung.
Once we have some data, I have to clean it up and start gathering contact information (which will not be in any system released to faculty). I would probably start by asking my colleagues if they had heard from any of the students on the list. I would simultaneously check our LinkedIn group and do some searches there. By the time I cleaned up the data, tried to make it complete, and attempted to find contact information, I would probably have invested around 45-50 hours. Yup. That is not a typo. And I have not even contacted a student. Not a single one.
During that stage, I would have concurrently submitted a request to my Institutional Review Board (IRB), the university body that oversees research and that all university faculty are required to contact before doing any research that may involve human subjects. (I’ll leave it there, but you won’t meet an academic who doesn’t have a horror story about their IRB.) The IRB process can take from 1 to 50 hours: 1 hour if my study is deemed “not human subjects research,” and the upper end if I have to provide more information, answer questions, follow up with research sites, etc. But in our example here, we’re just doing a simple survey, so it’ll probably be a 12-15 hour submission process. Yup. Again, not a typo. And I have not even contacted a student. Not a single one. Time in project = 57 hours.
Now I’m ready to contact students. So I sit down and send out requests. I can use a mail merge, so that isn’t too hard or laborious, but then I have to account for following up (to try and maximize responses), making notations about bounced messages (and trying to find new contact information), answering queries, etc. So add in another 10 hours for data collection. Time in project = 67 hours.
When I finally close my data collection, I have to analyze it. I’m going to assume that I got a pretty good response. Since 2008, we’ve graduated around 150 students. Student surveys like this generate pretty good returns, sometimes as high as 70%. So now I’m putting in some statistical analysis time, which is not something I use every day, so I’m going to have to spend 5-10 hours brushing up on my skills on top of the 30 hours (a conservative estimate) that I’ll put into data analysis and the creation of multiple charts and graphs that present the data in different ways, etc. And I haven’t even started to do “research” into what has already been published so that I’ll be prepared to write up this information for publication. Time in project = 102 hours.
Because here’s the money phrase: few academics would take on this type of time-intensive project without writing it up for peer-reviewed publication, because that’s how we get paid. That’s how we show that all this time resulted in something. And the research and writing up of this project would entail (gosh, depending on whether my writing mojo was intact) another 40 hours.
Conservatively, I just put in 142 hours on a project to get it into peer review. Since I am doing other parts of my job (including other research projects), the time for this project is divided up into small parts so that it keeps making forward progress. But you can easily see how this small, focused project takes a lot of time. Now just consider a workplace project that is complex, with multiple parts, multiple methods, and then lots and lots of data that has to be analyzed.
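For readers who like to check the arithmetic, here is a minimal sketch that tallies the running totals from the hypothetical project above, using the low end of each estimate. (The phase labels are my own shorthand for illustration, not formal categories.)

```python
# Low-end hour estimates for each phase of the hypothetical alumni study.
phases = [
    ("clean data and find contact info", 45),  # "45-50 hours"
    ("IRB submission process", 12),            # "12-15 hour submission process"
    ("survey distribution and follow-up", 10),
    ("brush up on statistics", 5),             # "5-10 hours"
    ("analysis, charts, and graphs", 30),
    ("lit review and write-up", 40),
]

total = 0
for name, hours in phases:
    total += hours
    print(f"{name}: +{hours}h (running total: {total}h)")

# The running total passes through 57, 67, and 102 hours
# before landing on the conservative grand total of 142 hours.
```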
If I really wanted to do justice to Mark’s question, I would need more students than just my own, so I would have to contact folks at other institutions so they could go through this same process (though, of course, the time would then be shared, sort of).
Here’s your takeaway: Good research is not quick, and often a simple and good question like Mark’s takes a lot of time and energy to answer in a way that is valid and reliable.
Open Access Journals and Research Dissemination
While STC members can read Technical Communication, the other academic journals that publish much of the technical communication scholarship are indeed behind paywalls. Tom Johnson mentioned this, and I’ve seen it pop up in conversations on social media recently. What that means is you can read the abstract, but without a subscription (through a library) or forking over payment, you’re not getting the full article. I know this sucks. It does. And this system was in place long before I started working in the academy and will be in place, probably, long after I leave.
There are no easy answers here but the reasons for it are based on something practitioners understand: profit margins and work. Open Access does not mean free. It just means there is a shift in who is paying for what. In the humanities and social sciences, where money and resources are limited, it is difficult to create an open access journal that has the same high quality and sustainability.
Would academics like more open access options for their work? Yes. Is that likely to happen in the near future? No. Is that kind of sad? Yes. But that’s the world we live in right now, and until institutions start to change what they may or may not pay for as far as subscriptions, the tide isn’t going to turn very quickly. (Also, as noted above, credit is awarded for publishing in long standing and well regarded publications.)
Yvonne Sanchez writes in a comment on Tom’s post, “I would love to see a partnership between practitioners and academics to produce something similar to what JSTOR (http://daily.jstor.org/) is doing, or at a minimum, I would love to see a partnership that produced techcomm-related contributions to JSTOR.” I laughed out loud. Please don’t get me wrong. This is a good idea, but the amount of time and money it would take to make this happen is almost unfathomable to me. I would think you’d need around $1M, and you’d have to find at least a handful of academics who had the time and inclination (and this would be an almost full-time job for each of them). There’s just no way it’s going to happen soon.
JSTOR is a private company that negotiates deals with other publishers and entities to archive content. It’s not that there isn’t techcomm-related content in JSTOR. There is. The reason you may not know it, or find it hard to locate, has to do with the way it’s archived and tagged.
I’m not trying to be a negative Nellie here. And there are some small movements trying to turn the tide. But practitioners–and even academics–have to understand that there are HUGE entrenched issues that impede this sort of progress, and I cannot underscore enough the amount of money these things take. Yet, with all that said, if someone has a genius solution, by all means I am willing to listen and try to move it forward. We also need to remember our own field’s efforts. For example, I’m sort of hopeful that the STC BoK may at least help show where the research is, and let’s not forget the Tech Comm Eserver library.
The bottom line here is that it will be much easier, and likely more doable, to work toward an open information exchange where research is made more public, rather than starting new, large-scale projects that have so many barriers to overcome. I can easily imagine asking (okay, nagging) an author (whom I probably know and in many cases would consider a friend) of an article with great applied value about doing an interview or writing up a “practitioner summary” of the work. With some help from practitioners with a wide social network reach, that could help move the field forward by getting research results into the hands of those who want and need them. (And if it’s then used, or we find out it doesn’t work, that creates an iterative cycle of knowledge making and remaking :-))
Academic Style
I know that academic writing can be painful. I do. I am sorry if you have to slog through some of it. This is a long-standing, entrenched practice that dates back to the 16th century. Reviewers expect a certain style, so it’s a sort of self-fulfilling problem.
Little by little, things are changing, and I would just encourage you to please keep reading. If nothing else, read the introduction and conclusion, because those are normally more specific.
And if you have a question, send an email to the author. I’ve gotten a few of these through the years and loved having that interaction. I’m betting my colleagues would love it too.
Literature Reviews
Recently, someone wrote about the need for literature reviews to help practitioners understand what’s out there on a subject. I agree. A good literature review is invaluable. When I talk about literature reviews, I mean that someone goes to all the academic databases and journals, gathers together the literature on a topic, and then reviews and synthesizes that information. While this is research, and valuable research at that, it’s not a common type of research for tech comm academics to do. (In the sciences, it is much more common.) Not having asked my colleagues why they don’t do them, I would guess it comes down to the time it takes (because it does take a lot of time) and the fact that it’s not sexy research; academics often feel their research has to be novel or innovative (which is not necessarily the case, but that’s a story for another day). So I totally agree we need to do more of them. I’m open to suggestions on ways to get this done.
Just one more note on our jobs
I do not have summers off. Let’s dispel that myth right here. True, I do not teach in the summer, but the other areas of my job are still going on, and as an active researcher and an active member of several national communities, I am working all summer long. The typical faculty member works 61 hours a week (and I think this is a low number for many of the academics whose work I am familiar with). We have multiple competing demands on our time at any one moment. An example: for every three hours in a week that I am in front of students, I spend another 10-20 hours a week preparing for class and responding to students. For years, I’ve been meaning to keep detailed notes of everything I do in a week.
My point is we are busy, just like most practitioners. Also, the majority of academics are conscientious about their jobs: they want to do right by students, produce good research, and be good colleagues who keep their institutions and professional organizations running.
Let me say this as kindly as I can. I am amazed at the number of practitioners who feel as though they have all the answers about what folks in higher ed need to be doing simply because they went to college. I can assure you that I’m not going to start telling Jeff Bezos how to do his job because I shopped on Amazon (though I do have some ideas about that user experience!). In other words, unless you’ve held a full-time faculty appointment or done some research about work life in higher education, you are in no position to pass judgment on how I and my colleagues are doing it. Yes, there is room for improvement in all sorts of ways. Yes, we welcome your constructive feedback on how we can make our programs better. But the consistent “academic bashing” that paints us with a broad brush as theory-driven (with no connection to practice), disconnected from the world, technophobic, and just not getting what “really happens in the workplace” is not useful, helpful, or wholly accurate.
Bridging the Divide
So Tom lays out some ways to bridge this divide. All of them are good. As for #2, academics blogging about their work: I am thinking through (with some other smart people) ways to get the word out about important academic work and specifically talk about the practitioner’s takeaway. So stay tuned for more information on that.
I cannot stress enough how important #3 is. Personally, I’ll be launching a couple of large, practitioner-focused projects that have to do with ALL of us understanding what skills students and soon-to-be new hires need to have. This is a thorny kind of issue that will never have a definitive answer, but we, ALL of us, need to cooperate to get some data that can potentially impact academic programs. Tom’s point about tools is actually a programmatic one.
I would revise or add an addendum to #4 and that is let academics know what big questions you need answered. In some cases, the answers may be buried in the scholarship (see lit reviews above) or it may be something that someone may want to research. It’s also possible for some types of research questions to be made into a class project.
I would add three more to Tom’s list:
— we have to encourage more exchanges at our conferences. TechCommGeekMom has a good point, but I would counter that many of the presentations I’ve attended at professional conferences seem like half-baked ideas with little research behind them (and I LOVE research and methods; this was one of the reasons I was so successful as a consultant) and are so site-specific that I could never take them back to the classroom as something I could call a best practice. I would love to see conference sessions that were like conversations, where practitioners could talk about what issues they could use some research on and academics could ask questions (of a panel of different types of practitioners) about what’s going on in the workplace. I’m not certain why we haven’t really done this before. Though we did do something on current topics from the world of work at the STC Academic SIG-sponsored pre-conference of CPTSC. There has to be a greater commitment from professional organizations to work toward these kinds of dialogue sessions.
— we need more practitioners to reach out and connect with local programs, or to make it known on social media that you’re willing to do things like guest lecture in a class, hold a one-hour workshop, help students network, offer to spend time just talking to faculty, etc. We definitely don’t want to overwork you, but I truly am tired of running across comments all over the Internet about how disconnected and behind the times academics are. I’m not saying it isn’t true in some cases. But what drives me batty are those folks who will complain and point fingers and never offer to share their time or expertise.
— we need academics, when it makes sense for their research agenda, to work harder at showing how their research can be applied or used. I’m not saying all research has to be applied. There is always a place for good theory. But we do need to do a better job of connecting our (over)use of the single case study to larger issues. This would improve the research enterprise for practitioners and academics alike.
I truly didn’t mean to write a mini-treatise here. I hope the sheer number of words shows how passionate and dedicated I am about these issues. One last thing: I would prefer that we stop using the phrase “bridging the divide.” We need a new metaphor that signals that we, academics and practitioners alike, share a common field. I would offer the idea of a house, because all of us reside within the big field of technical and professional communication together. And like any family living in a house, we will have disagreements; we will have our own personalities and opinions; and we will all look a little different. However, at the end of the day, we’re still all living in the same house. It’s a good house. My goal is to try and make it stronger.