
C2C Digital Magazine (Fall 2020 / Winter 2021)


 


ITS: Providing professional trainings to colleagues at K-State

By Shalin Hai-Jew, Kansas State University 


At various institutions of higher education, professional development has come in a variety of forms over the years.  There may be in-person, blended, or online trainings.  These may be provided by external vendors, internal trainers, or invited members of other campuses.  There are usually annual conferences for updating various technological and other skills.  Faculty, staff, administrators, and students may take college courses together.  Some in-depth studies may culminate in additional earned degrees.  

This work intensified with the emergence of SARS-CoV-2 and COVID-19.  With disruptive public health challenges and budgetary pressures, people are taking on a broader range of work tasks.  They have to upskill to stay competitive.  This means that professional development on a shoestring has become more important than ever.  Trainings are offered all over campus by various units, and the core of the formal trainings is housed in human resources.  

ITS Trainings




Figure 1:  Technology (by geralt on Pixabay) 


Information Technology Services (ITS) has long provided trainings to campus on the critical-systems technologies used across the university.  They have worked to provide cybersecurity training.  They have also provided more boutique trainings, such as on data analysis software for campus researchers.  In Fall 2018, at K-State, a small group from the larger ITS community (dubbed One IT) started meeting to review their training offerings to the university and the larger community.  They set up systems to assess those offerings and to capture accurate metrics.  They worked to understand the training needs at the university and to expand training and supports more broadly.  They formed partnerships with other units across campus.  They worked to create a train-the-trainer short course to bring additional trainers online.  They compiled resources for themselves around issues of accessibility.  They started setting up a feedback loop to better understand training participants' experiences in order to improve the various trainings and related services.
 

Assessing Training Needs


Prior to this group, awareness of technology training needs came from the following sources (in descending order):  word of mouth (electronic and in-person), ServiceNow ticket requests, and an open web form related to ITS trainings.  Sometimes, a faculty member would mention something during a Faculty Senate meeting, and word would get back to the staff.  

Once a request came through, staff would conduct an “environmental scan”; they would explore available trainings on the open Web, on YouTube, on SlideShare, and on other sites.  If they could point the individual or group to a credible third-party source, they would.  

More often than not, there would be a preference for a locally created training to suit the particular needs of the requestor: a one-hour training for undergraduate researchers on using the Qualtrics survey platform for research and data analysis; a two-hour training on the qualitative data analytics tool NVivo for master’s and doctoral students in a particular discipline; and so on.  Once the parameters were known, additional on- and off-campus contacts would be made, a training would be set up, handouts would be created, and the training would be delivered.  Follow-on videos would be created as needed.  

If it seemed that the training had generalizability, it would be repackaged as a campus-wide offering.  More often than not, though, university-level offerings are created from staff knowledge, skills, and abilities (and attitudes), or “KSAs,” and those interested are drawn in through public announcements.   

A User Base


The campus has to have a sufficient population of individuals who might be interested or who are already interested.  The first group has to be cultivated; the latter already exists.  

The training participants are a cross-section of campus.  They have intense expertise in their areas of focus, and they often have much less interest in learning technology tools outside of those areas.  Many want to achieve a particular outcome (data analysis, data visualization, or something else), and then they will move back to their own areas of focus.  

The participants, too, can already be semi-experts on the training topic because of regular usage, but they want to learn new ways of doing things.  They want tips, tricks, and keyboard shortcuts.  They expect something dramatic in their learning, to make the session worth their time.  Sometimes, they are attending because they want to catch the presenter in some gaffe; others are there to lend a hand and offer advice when the presenter gets stumped.  

Sometimes, it is possible to impress by ensuring that the members of the group get to meet each other and make professional and social connections.  Sometimes, it is possible to impress by pointing to a site with free templates, cool training videos, or other downloadable resources.  The basic assumptions of andragogy apply: adult learners are pragmatic and want to acquire skills that they can use, and they dislike make-work.  They want to be met where they are.  They want a lifeline to a person (an email address or a phone number) who can solve their problems if something does not function as they assume it would (the infamous gap between the user's mental model and the technology itself).  While people are understanding, they generally dislike being stumped, and that state of affairs causes a lot of discomfort, annoyance, and even anger.  

On campus, people want free resources.  This is understandable given the thinness of budgets.  There is a fair amount of self-provisioning of equipment and software, just to get work done.  If one is offering a technology training, there are often questions about whether the software is free, or at least discounted.  


Go?  No Go?


Committing to offering a training on a particular topic requires some consideration.  The inputs to creating a training are not trivial.  These include instructional design, creating various digital contents (online surveys, an e-book, slideshows, digital leave-behinds or handouts), setting up publicity (newsletter articles, calendar announcements), setting up courses on a human resources system, and other efforts.  Then, too, the trainer has to know the software or topic sufficiently well to field questions live and in real time…and to troubleshoot issues effectively.  (One can always offer a raincheck and take the issue offline to an individual consultation.)  These latter capabilities require a much more in-depth knowledge and skill set than an “intro to” sort of training might suggest.  

There are political considerations, too.  Cui bono, or “who benefits”?  With limited time, resources, and attention, some strategic and tactical thinking has to inform what is pursued.  Perhaps a training may lighten the burden for a particular academic unit.  Perhaps a training would be a bridge to a more formal class for some learners.  Political considerations include knowing whether such a training, or a similar one, already exists on campus, because professionals have turf, and it would not be good to step on another’s territory.  [It is not common to attract public criticism, but people will grumble privately.]  Administrators go in for rounds of credit-taking for any work achieved by staff, which is an unfortunate byproduct of bureaucracies.  

This is not to say that there aren’t internal benefits for ITS.  For example, university-wide trainings on “ETDRs” (electronic theses, dissertations, and reports) were offered starting in Fall Semester 2020.  These were offered to raise awareness of public online resources and to drive traffic to them.  The training was set up to empower graduate students and to head off help tickets, one-on-one consultations, general emails and calls, and other pressures on the support system.  

Designing Trainings




 
Figure 2:  Sketchbook / Drawing Board (by fancycrave1 on Pixabay)



A typical approach to designing a training is the “ADDIE” method, which includes the following five steps:  

1. Analysis
2. Design
3. Development
4. Implementation
5. Evaluation

Each context has different “authorizing” standards and conditions.  If one is designing learning around a regulation, that regulation has to be understood thoroughly.  A subject matter expert (SME) or content expert may have to be brought in to vet the training contents.  

There are standards that apply to every instructional design:  intellectual property and copyright, privacy protections, FERPA, accessibility (Section 508), and others.  It is critical to stay legal on every front.  

Developing Trainings


My method for developing trainings involves some upfront analysis and design.  I elicit ideas from possible attendees to understand their practical “use cases.”  

I find it easier to first create a digital leave-behind or handout that I can point training participants to as a reference.  The handout tends to be fairly comprehensive.  It includes screenshots, hyperlinks, and stepped sequences that are tested and effective for particular ends.  

Then, I design a sequence of learning for the hour-long or two-hour session.  I like trainings to be interactive, with a “conversation” going on from the first moment and throughout.  This means that the training has to be semi-structured and modular, so that base contents are covered…but there are optional additional sequences that may be covered at the participants' discretion.  

Given the competition for people’s time and attention, I have to make a case for the value of the learning.  I also try to add something sparkly or eye-catching to the title or the session (like a guest speaker)…in order to attract and hold attention. 

Pilot-Testing or Trialing Trainings


Ideally, such trainings will be pilot-tested or trialed with live learners.  However, given how busy people are, this is not often possible. What usually happens is that the initial groups of learners go through a semi-rough training in the first few iterations, and their feedback is elicited and used to improve the training.  

In the Live Training Session 


The general sequence of a training goes as follows:  

  • Any housekeeping issues 
  • Self-intros by the participants about their work and their interest in the target topic; any questions that the group wants addressed during the session 
  • An overview of the handout or digital leave-behinds (which I use to ensure that people don’t try to take notes like crazy or feel like their questions are not being fully answered) 
  • A plan for the time (the agenda) 
  • Coverage of the general technology capabilities and graphical user interface (GUI) [These include verbal descriptions and visuals at every step.]  
  • Walk-throughs of various sequences in the technologies for particular outcomes 
  • Participant-directed tasks for walk-throughs 
  • Hands-on work (for some trainings) 
  • Transferable methods for using the technologies to constructive effect 
  • Ways to avoid common errors 
  • Dedicated time for questions and discussion near the end of the session 

Throughout, there are check-ins to make sure that the learners are doing okay and that the pacing is right.  There are interactive conversations.  Participants make decisions about the course of the training.  (There is a kind of live “participatory design” in real time.)  



 
Figure 3:  Chat (by Memed_Nurrohmad on Pixabay) 


I always warn participants that they should not limit their thinking to what is demonstrated in the session.  I also let them know that what is covered in the session is only 10–20% of the capabilities of the software tool.  And I try to offer as broad a tactical approach as possible.  

Participants share their own experiences.  In some cases, participants may be invited to the podium, or invited to share their screens and show sequences to the audience.  The liveness of the event keeps people engaged.  This also shows respect for all participants, which is important.  

Also, during the live session, it makes sense to point out other entities on campus that support the effort:  for research, the statistics lab, the regulatory compliance office, and librarians; friendly and professional individuals in particular areas of specialization; and so on.  

The period for a training is often brief, one to two hours.  People attend for a short refresher; they attend to acquire a sense of a software tool; they come for answers to questions.  While the training is set up to provide for a range of needs, the time limits mean that it is important to provide learning resources in other ways.  Digital leave-behinds are therefore important, and I encourage their use pre- and post-training.  (Some trainers will create folders of digital artifacts to train on for particular targeted learning outcomes.  These require additional instructional design and development.  I will use some light assignments, such as querying social media platforms using NodeXL or NCapture for NVivo…or creating a diagram in MS Visio Professional.)  

My main ambition for those who participate in trainings is that they follow through with trying new technologies and methods, especially in relation to their own research and work.  To that end, I try to be available to support their efforts in a professional and collegial way.  


Setting Clear Boundaries




Figure 4:  Target 


Some participants will misuse the “relationship” from the training.  They will demand that the presenter help them scrape data from social media platforms even if they do not have the appropriate technologies.  They think that opening a ticket on ServiceNow, or setting an appointment on the Acuity app, will be sufficiently coercive.  They will ask the presenter to conduct shared research and co-write papers for their own academic advancement because of “friendship” (in lieu of any funding for my office).  They will send multiple emails over weekends or at night with requests for one-on-one sessions.  They will drop into the presenter’s Zoom room at unspecified times to try to ask questions (perhaps not realizing that I receive an email with their logged identity and time of access).  They will send partially finished projects, built with the target technology, for refinement and finishing.  Some will build a complex application which then requires testing and backstopping to ensure full functionality.  A lot of work with IT tools involves particular outputs and deliverables.  

Some will send their graduate students for research advisement, which I invariably decline (I am not on their advisory committees).  Many will extend invitations to serve on graduate advisory committees, but these are unfunded and do not support the objectives of my office.  

It is important to protect against project creep from a basic training.  At the same time, the door has to be open to support and help where possible…but also to graciously decline where one cannot provide that help.  (If a person is too aggressive in trying to mount a campaign to get one to do their work, then being a little less gracious is always possible.  But this comes only after multiple rounds of explaining boundaries and what is included in the training vs. what is extraneous.  Some individuals, however, try hard not to take no for an answer, to the point of being unreasonable.  It is important to remember that if one takes on a task, one also has to face all the consequences that follow.  If one does not trust the asker of the favor, then it is wise just to decline the request.)  

Sometimes, trainings are about methods, about how to approach a technology with the proper patience.  


Post-Training Assessment (at Micro and Macro Levels)




Figure 5:  Far Enough Out 


For those who offer trainings, it helps to have feedback.  It helps to revise resources often.  It helps to swap out example files, so that one is not used again and again.  (I am guilty of this last point especially.  In my NVivo trainings, I use a project file that was based on a real project of mine…and which has accrued masses of videos, image files, datasets, and so on.  I need to start from a clean file again and create resources that demonstrate particular features of the software, instead of using one file for all needs out of laziness induced by a busy schedule.)  

Some efforts have been made by this training group to systematize assessment of the trainings for improvement.  An online survey was created by one of the members, but it has not been widely adopted, even though the results go only to the trainer himself/herself (with a request that the findings be used to improve the training).  There is some effort to centralize the data gathering, too, but this is not a politically popular approach as yet.  

Organizationally, training programs may be evaluated using tools created from (Donald) Kirkpatrick's Four-Level Training Evaluation Model (1959, 1975, 1993).  The four levels are reaction, learning, behavior, and results.  Reaction (1) involves understanding how engaged the participants were and their sense of the training's strengths and weaknesses.  Learning (2) refers to an objective measure of what was learned.  Behavior (3) refers to understanding how the training knowledge and skills are applied by the participants in their respective contexts, post-training.  And Results (4) refers to organizational evaluations of the return on investment in training (at a more macro level).    
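As a purely illustrative sketch (and not the survey or tooling the One IT group actually uses), the first two Kirkpatrick levels lend themselves to simple, structured per-session records.  The field names and rating scales below are hypothetical; Behavior (3) and Results (4) would require longer-horizon follow-up that a single post-session form cannot capture.  

    # Hypothetical sketch only: tallying Kirkpatrick Level 1 (reaction) and
    # Level 2 (learning) data for one training session.  Field names and scales
    # are invented for illustration; they are not the group's actual survey.
    from dataclasses import dataclass
    from statistics import mean

    @dataclass
    class ParticipantFeedback:
        reaction_rating: int    # Level 1: self-reported engagement, 1 (low) to 5 (high)
        pre_quiz_score: float   # Level 2: objective measure before the session
        post_quiz_score: float  # Level 2: objective measure after the session
        comments: str = ""      # open-ended remarks on strengths and weaknesses

    def summarize(session_name: str, responses: list) -> dict:
        """Return a small summary a trainer could use to revise the next offering."""
        return {
            "session": session_name,
            "n_responses": len(responses),
            "avg_reaction": round(mean(r.reaction_rating for r in responses), 2),
            "avg_learning_gain": round(
                mean(r.post_quiz_score - r.pre_quiz_score for r in responses), 2
            ),
        }

    if __name__ == "__main__":
        demo = [
            ParticipantFeedback(5, 40.0, 85.0, "Liked the hands-on walk-throughs"),
            ParticipantFeedback(4, 55.0, 80.0),
            ParticipantFeedback(3, 60.0, 75.0, "Wanted more time for questions"),
        ]
        print(summarize("Intro to NVivo (hypothetical)", demo))

Even a lightweight summary of this kind could feed the feedback loop the group has been trying to set up, without centralizing anyone's raw responses.  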


Coordinating as a Group 


In the two years that this One IT training group has existed, there has been progress on some fronts. 



 
Figure 6:  Teamwork (by geralt on Pixabay)



The folders of shared contents on SharePoint may suggest something of what the group has been involved in.  

Bureaucracy:  
  • some related to bureaucracy (like a living charter, authorizing documents, lists of available trainings) and institutional memory (like recorded Zoom videos, meeting notes with decisions and action items)
  • one for process modeling visuals (BPMNs, flowcharts, cross-functional diagrams)

Training Guidelines:  
  • one for guidelines for quality trainings
  • one for guidelines for building accessibility into trainings and contents 

Marketing, Communications:  
  • one about the marketing sequence (the university calendar, the personal calendar, room scheduling, the session entry in the human resources software, the article in the IT newsletter and the university-wide daily newsletter, and others)

Training Metrics:  
  • one to collect training metrics (recordings of training sessions provided, the modality, the numbers of participants, and notes)  

Projects:  
  • folders for various projects requested by administrators 

Endeavors:  
  • one about digital badging 

In reviewing these, only a few folders are active at any one time.  Many involve partial or complete work and then a retirement to stasis.  For such a group, the various members have limited time and focus to pursue group ambitions, and other work objectives take precedence.  

Of the team, only a minority are on the tip of the spear and offer live trainings.  While there is always interest in building complex skillsets, many trainings have only one person who can deliver the work.  This means that there is a single point of failure: if that individual leaves or is too busy, there is no way to recoup the training, and the various departments are on their own.  

A few technologies may be supported by a larger team, but this availability really depends on whether the respective individuals have regular cause to use the particular technologies.  Just having documentation of how a tool works is insufficient for training because it is in applied usage that the trainings are valuable.  Also, it helps to have broader knowledge of technologies, so that a trainer can be a bridge to other technologies.  

A few members of the team create content for automated trainings.  Others are administrators who ensure that leadership's needs for documentation are met, to justify funding and leadership support.  

Conclusion


Those in ITS who provide trainings offer important bridges to various technologies and methods across campus.  Formally reviewing practices and shoring them up stands to benefit all stakeholders.  


Caveat:  This work shows the author’s point-of-view.  On a committee with a dozen and a half members, there are likely a variety of viewpoints.  






About the Author


Shalin Hai-Jew works as an instructional designer and researcher at Kansas State University.  Her email is shalin@ksu.edu.  

