Week 6 Discussion Response
Journal of Athletic Training 2020;55(9):902–910. doi:10.4085/1062-6050-0540.19. © by the National Athletic Trainers' Association, Inc. www.natajournals.org
Current Concepts
"To Tech or Not to Tech?" A Critical Decision-Making Framework for Implementing Technology in Sport
Johann Windt, PhD, CSCS*§; Kerry MacDonald, PhD†; David Taylor, MSc‡; Bruno D. Zumbo, PhD§; Ben C. Sporer, PhD*§; David T. Martin, PhD||
*Vancouver Whitecaps FC, Vancouver, BC, Canada; †Volleyball Canada, Ottawa, ON, Canada; ‡United States Olympic and Paralympic Committee, Colorado Springs, CO; §University of British Columbia, Vancouver, BC, Canada; ||Australian Catholic University, Melbourne, Australia
The current technological age has created exponential growth in the availability of technology and data in every industry, including sport. It is tempting to get caught up in the excitement of purchasing and implementing technology, but technology has a potential dark side that warrants consideration. Before investing in technology, it is imperative to consider the potential roadblocks, including its limitations and the contextual challenges that compromise implementation in a specific environment. A thoughtful approach is therefore necessary when deciding whether to implement any given technology into practice. In this article, we review the vision and pitfalls behind technology's potential in sport science and medicine applications and then present a critical decision-making framework of 4 simple questions to help practitioners decide whether to purchase and implement a given technology.
Key Words: analytics, measurement, wearable devices, global positioning systems
Technology is here to stay—not just in sport but in virtually every discipline. This special issue focuses on training load, recovery monitoring, and management, and in nearly every article, readers will find examples of how technology can be used in these areas. External loads can be monitored through global positioning systems (GPS), inertial measurement units (IMUs), optical tracking systems, and so on.1 Internal loads may be captured with heart-rate monitors, lactate measurements, and more.2 Recovery states may be measured with devices ranging from low-tech wellness surveys3–5 to more high-tech solutions, such as heart-rate variability6,7 or force-plate testing.8,9 Currently, we are seeing new technological solutions with potential sporting applications, such as implantable devices,10 markerless motion capture,11,12 breath analysis,13 smart garments, biomechanical insoles, and skin sensors.14 In this technological age, sports science practitioners must critically appraise the plethora of options available and make informed decisions about evaluating and adopting technology in their specific contexts. These context-specific questions demand a critical evaluation of the case for the intended use and the available evidence that supports (or does not support) technological implementation. Our aim in this article is to provide a simple, foundational framework to aid practitioners in that critical decision-making process.
This article is divided into 3 parts: (1) a vision for what technology can provide and why we should be excited about its potential, (2) a warning about the potential dark side of technology and the pitfalls that can derail its successful implementation, and (3) a critical decision-making framework consisting of 4 key questions to ask before purchasing a new technology.
PART 1: THE BRIGHT SIDE—A VISION FOR TECHNOLOGY
Excitement is the most appropriate response when taking an optimist’s view of technology in sport. In this section, we outline just a handful of benefits practitioners can expect from successful technological implementation.
Benefit 1: Improving and Off-Loading Data Collection—An Example From Pro Football
Technology can improve measurement precision and automate the process so that practitioners do not have to manually record data as they did in the past. For example, understanding the match demands to which athletes are exposed is foundational in the sport sciences.15–17 Football is no exception, and time-motion analyses to understand football players' physical outputs (eg, total distance run in a match, time spent at different speeds) and physiological responses have been performed for decades.18–20 Before the technological advances that are commonplace today, these time-motion analyses were performed using tape-recorded commentaries, video recordings, and film analyses. All of these notational analysis processes were extremely time consuming, often limiting researchers' ability to examine more than a small number of players in a defined number of matches.21 Technological advances, both through wearable devices22 and optical tracking systems,21 now provide these physical output measures to researchers and practitioners in near real time.23,24 Although these systems are not without error and vary among technological providers and systems,25 many provide more accurate physical output data than estimates derived from manual notational analyses based on video and can supply these data on all players simultaneously and in near real time. In this way, technology and its efficiency have off-loaded weeks and months of practitioner and researcher time. If used wisely, this regained time allows a deeper dive into the information and may inform practice more thoroughly. For example, advances in optical tracking and wearable technology have now allowed a better understanding of how physical exertion in football players relates to such contextual variables as player position, stage of play, and teams or players being in or out of possession.26–28
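To make these "physical outputs" concrete, the sketch below shows how total distance and time at high speed might be derived from a device's speed trace. The 10-Hz sampling rate and the 5.5-m/s high-speed threshold are illustrative assumptions (sampling rates and speed-zone cutoffs vary by provider and sport), not a published standard.

```python
# Minimal sketch: deriving match physical outputs from a GPS speed trace.
# Assumed inputs: a pandas Series of speeds in m/s, one value per sample.
import pandas as pd

SAMPLE_RATE_HZ = 10          # assumed device sampling rate
HIGH_SPEED_THRESHOLD = 5.5   # m/s; an assumption, cutoffs vary by provider

def physical_outputs(speed_mps: pd.Series) -> dict:
    """Summarize a speed trace into common physical-output metrics."""
    dt = 1.0 / SAMPLE_RATE_HZ                  # seconds per sample
    high_speed = speed_mps >= HIGH_SPEED_THRESHOLD
    return {
        "total_distance_m": round((speed_mps * dt).sum(), 1),       # distance = speed x time
        "high_speed_distance_m": round((speed_mps[high_speed] * dt).sum(), 1),
        "time_at_high_speed_s": round(high_speed.sum() * dt, 1),
    }

# Example with a fabricated 3-sample trace:
print(physical_outputs(pd.Series([2.0, 6.1, 7.3])))
```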
Benefit 2: Sport-Specific Load Measures—A Volleyball Journey
The prevalence of chronic injuries in volleyball players is known to be high.29,30 Of these injuries, jumper’s knee (ie, patellar tendinopathy) is the most common. The injury mechanism is fundamentally an overload of the extensor mechanisms of the knee joint.31 With the introduction of IMUs that measure athlete jump counts, the sport-specific load requirements of the knee extensors could be quantified for the first time without manual video annotation of jump counts for all players.32
One author (K.M.) used his dual roles as a researcher and volleyball coach to scientifically evaluate such a technology and implement simple heuristics to inform decision making in coaching. As a researcher, he performed validation work on a wearable IMU for measuring jump count32 to ensure that he could rely on the jump-count data being provided by the IMU software. Specifically, the jump counts from the accelerometers related very closely to the jump counts extracted from manual video notation, so he was comfortable that the errors were few enough to reasonably inform practice. As a coach, measuring jump loads for all players in training and matches facilitated a better understanding of the position-specific training and match demands and individualized athlete load management throughout the season. After a retrospective assessment of training and match demands, he prospectively planned and prescribed individual jump loads. This prospective prescription may have improved the capacity of the players' tendons to withstand the sport's demands and prevented the development or flare-ups of such chronic injuries as jumper's knee. In his case, this informed decision-making process helped to mitigate the prevalence of overuse injuries (zero practice sessions or games missed due to overuse injuries) in the team's volleyball players and culminated in a national championship.
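The agreement check at the heart of that validation can be illustrated with a toy example. This is a minimal sketch with fabricated session counts, not the data or statistics from the published validation (reference 32):

```python
# Minimal sketch: comparing device jump counts against manual video notation.
# All numbers are fabricated for illustration.
imu_counts = [52, 61, 47, 70, 58]    # hypothetical IMU counts per session
video_counts = [50, 62, 46, 71, 57]  # hypothetical video-notation counts

errors = [i - v for i, v in zip(imu_counts, video_counts)]
mean_bias = sum(errors) / len(errors)  # systematic over/undercount
mape = sum(abs(e) / v for e, v in zip(errors, video_counts)) / len(errors) * 100

print(f"Mean bias: {mean_bias:+.1f} jumps")
print(f"Mean absolute percentage error: {mape:.1f}%")  # typical per-session miss
```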
Although not all stories end in a championship, technological implementation allows for sport-specific and movement-specific load quantification that can inform practitioners’ workload, recovery, and return-to-sport decisions. A multitude of factors affect the onset of any injury, yet an informed approach to load is certainly a beneficial addition to any injury-mitigation strategy.
Benefit 3: A 360° View of the Athlete
Performance is multifactorial and requires adequate physical, mental, technical, and tactical expertise to
compete at the elite level. The contribution of each element depends on the sport's demands and the characteristics of the individual athlete. Furthermore, sport is dynamic and ever changing as athletes move through multiple phases over the course of a single season, including training, competition, and recovery. Sleep, recovery, nutrition, social factors, and lifestyle can all affect athletes' responses to and outcomes in training and performance.33,34
Technology allows for the rapid collection and analysis of data from many of these areas. The ability to integrate data streams enables practitioners to better understand how one factor affects another by providing a holistic perspective of the athlete. It also permits information to be shared across disciplines, blending injuries with training load, medicine with physiology, and physical with technical and tactical performance. Where limitations once existed in storing and processing vast amounts of information safely and in a time-efficient manner, technological advancements have reduced many of the challenges involving costs, computing speed, and intelligence tools. Although pitfalls still exist (see the next section), technology allows practitioners and teams to provide holistic perspectives on athlete performance when using a strategic approach.
PART 2: THE POTENTIAL DARK SIDE—A WARNING TO THE WISE
The data life cycle may be summarized broadly as plan → collect → analyze → communicate. Each subsequent step relates closely to the intended use of the information, as determined by a thoughtful plan underpinning the technology's implementation. A problem at any stage of this life cycle can be fatal for the successful use of any technology. Failed technological implementations can have lasting ramifications, so considering the following potential pitfalls is important.
Pitfall 1: Not All Promises of Technology Are Kept
Underlying all stages of the data life cycle is a belief that the data are trustworthy enough to collect and interpret. However, some of the bold promises made by technology companies may not actually be true. In these instances, failed promises may result in poor data quality (eg, measurement error is too large) that challenges a practitioner's ability to interpret any signal amid the noise. The failures could also stem from black-box algorithms that summarize the data and produce unactionable, uninterpretable outputs.
Scientists have explored technological devices in an attempt to better understand the validity and reliability of the data from these emergent technologies. As such, different technologies are known to have inherent limitations: for example, the ability of GPS technology to accurately measure high-speed running velocities,22,35,36 the sensitivity of heart-rate variability measurement and its consequent requirement for rigid, standardized testing procedures,37 and the effectiveness of wrist-based sleep monitoring compared with the criterion standard of polysomnography testing.38 Although these examples include some technological devices that have published validity-related evidence, it is important to note 2 items. First, none of these devices and the data they provide are perfect; all come with inherent measurement error. Second, most consumer devices have little scientific evidence for their accuracy, validity, and reliability,14 so a prudent practitioner should approach any new technological device with a healthy dose of skepticism.
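One way to operationalize the signal-amid-the-noise concern is to ask whether an observed change exceeds a multiple of the device's typical error. The following is a minimal sketch of that idea, not a method prescribed by this article; the jump-height metric, the 0.8-cm typical error, and the 1.5 multiplier are assumptions a practitioner would replace with their own evidence:

```python
# Minimal sketch: flag a change only when it clears the device's noise floor.
def change_is_meaningful(baseline: float, observed: float,
                         typical_error: float, k: float = 1.5) -> bool:
    """Return True when the change exceeds k x the device's typical error."""
    return abs(observed - baseline) > k * typical_error

# Example: countermovement-jump height (cm) with an assumed 0.8-cm typical error.
print(change_is_meaningful(baseline=38.0, observed=39.0, typical_error=0.8))  # False: within noise
print(change_is_meaningful(baseline=38.0, observed=41.0, typical_error=0.8))  # True: likely a real change
```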
Pitfall 2: Technology Transforms Into a Dust Collector If It Cannot Be Implemented
The best technologies are useless if they are not implemented in a way that informs decision making or changes practice. Given that technological implementation may require sizable investments of financial and human resources, understanding the burdens on time and staff resources that implementation will require is crucial. If the burden on staff is too great, practitioners may be stretched beyond their skill sets, be forced into uncomfortable situations, or have unrealistic time constraints placed on them, negatively affecting the feasibility and quality of data collection. If the staff are not educated about the potential benefits, lack the desire to collect the data appropriately, or do not believe the technology will provide useful information, the investment is flawed before it begins.
Pitfall 3: Technology Does Not Necessarily Provide the Right Data or the Raw Data
A high-tech solution and precise data collection do not inherently mean that the right data are being collected. What constitutes the right data in a particular setting depends on several contextual and organizational factors, such as the questions being asked by different practitioners within the organization and how the organization’s decision-making structure allows data to inform decisions.
Even if the right data are being collected, it is vital to understand what type of data the technology will provide. Many technologies come with a software package that delivers a dashboard or printable report of the data collected. It is important to consider whether these standard reports or dashboards analyze the data in a way that reflects the user’s needs and corresponds to the original plan. The analyze portion of the data life cycle implies that the technology analyzes the data in the way you need or that you can access and analyze the data yourself in accordance with those needs if the technology does not provide the answers you are seeking. In these instances, it is important to understand whether a technology provides access to the raw data so that you can perform the appropriate analyses in accordance with your original plan. If the technology does not provide such access and only reports summary findings based on proprietary algorithms, the ability of researchers and practitioners to analyze data in the ways they need may be compromised.
Pitfall 4: Technology Does Not Inherently Communicate a Message
Even when technology is introduced and data are collected consistently in an applied environment, the data have just made it through part of the data life cycle—they must still be analyzed and disseminated in accordance with the plan. Technology itself does not inherently communicate to decision makers. Although some technological devices are accompanied by software tools that provide reports or dashboards that summarize the underlying data, the message delivered to decision makers must be readily interpretable by the end user, which can include high-performance team members, coaching staff, or management, and answer the specific questions that were planned. What appropriate communication and dissemination look like, therefore, depends on the intended use of the technology and the context in which it is implemented. Understanding end users' requirements, interests, and necessary decisions is crucial so that information can be tailored into a clear, concise message. Crucially, these steps must be taken in each environment and are not accomplished simply by having purchased a given technology.
PART 3: A CRITICAL DECISION-MAKING FRAMEWORK
Asking the right questions before jumping into a new technological investment can help guide practitioners and researchers toward the vision of the technology while avoiding some of the common pitfalls. Unsurprisingly, several frameworks for integrating technology into sport have been proposed in the literature,39,40 and we strongly encourage readers to explore and critically think through other frameworks in addition to the one presented here. In our critical decision-making framework, we pose 4 questions, all of which should be answered affirmatively before arriving at a decision to purchase a given technology (Figure 1). Each of these 4 questions, important follow-up questions, sources of evidence for finding appropriate answers, and key take-home messages are discussed in the following sections, detailed in Figure 1, and summarized in the Table.
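The framework's gating logic can be restated in a few lines of code: the purchase decision is considered only if every question is answered affirmatively. The question wording below is from the article; the function and answer format are merely an illustration, not part of the original presentation:

```python
# Minimal sketch: the all-questions-must-be-yes gate described in the text.
FRAMEWORK_QUESTIONS = [
    "Would the promised information be helpful?",
    "Can you trust the information you will be getting?",
    "Can you integrate, manage, and analyze the data effectively?",
    "Can you implement the technology in your practice?",
]

def proceed_to_purchase_decision(answers: dict) -> bool:
    """Return True only if every framework question was answered 'yes'."""
    return all(answers.get(question, False) for question in FRAMEWORK_QUESTIONS)

# Example: all four answered affirmatively -> weigh cost vs benefit next.
print(proceed_to_purchase_decision({q: True for q in FRAMEWORK_QUESTIONS}))
```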
Question 1: Would the Promised Information Be Helpful?
New technologies arrive on the market daily, many of which may pique the interest of curious and intelligent individuals. These technologies often come with bold claims, savvy marketing, and grand promises. In the sport sciences, these claims may include accurate injury prediction or "1-stop shops" for understanding an athlete's fatigue and recovery status. We propose that the first question practitioners should ask when encountering a technology and engaging with such claims is would the promised information be helpful? The ability to extract new information can be exciting, yet this does not mean the information will help to inform decision making for practitioners in their specific contexts.
To answer this question, individuals should consider what specific question will be answered or which decision will be informed. This is an extremely important point, as technology must inform practice to be useful. When a new technological opportunity is considered, at least 1 key decision should be informed by the available information or 1 question should be answered. This is the premise of the first (and most important) stage of the data life cycle: planning. To plan effectively, practitioners must understand their specific contexts so they address relevant and pertinent questions that end users need to answer. Ideally, a need may already have been identified within the practice for more information to be collected, in which case investing in a technology that provides those specific data makes more sense. Considering 2 contexts from earlier in this article, wearable technology may provide insight for team-sport practitioners, but this looks different in soccer, where GPS data may be most valuable for understanding athlete distances and speeds, than in volleyball, where an accelerometer that provides jump counts is more informative and distances may be deemed less important (and are not measurable with GPS indoors). Note that in each of these instances, a need is a question that must be answered or a decision that must be informed, rather than a specific technology that the organization believes is in itself necessary.
Take-Home Suggestions.
• Start with the end in mind. Understand the decision you want to inform and which data from the technology you will need to extract and communicate in order to inform that decision.
• Explore existing paths. Is there an existing data stream you can use instead or a better alternative?
Question 2: Can You Trust the Information You Will Be Getting?
When researchers speak about technology, the discussion typically includes aspects of measurement error, reliability, responsiveness, and validity. Beyond all these technical terms, practitioners need to know whether they can take information from a given technology and be confident in making a decision based on the evidence it provides. We believe the principles of unified validity theory can help guide researchers and practitioners in trying to answer this question of trustworthiness.

Table. Unpacking Each Question Within the Critical Framework Through Follow-Up Questions, Sources of Evidence, and Take-Home Messages

Question 1: Would the promised information be helpful?
• Follow-up questions: What question will you answer or what decision will you inform? Has a need already been identified for the promised information?
• Sources of evidence: Understand the challenges of your own context; consult with other researchers and practitioners who have faced similar questions and challenges.
• Take-home messages: Start with the end in mind. Evaluate the existing environment and infrastructure to see whether you need new technology to get the information.

Question 2: Can you trust the information you will be getting?
• Follow-up questions: How much validity-related evidence is available regarding the new technology? Are you confident enough with the limitations of the technology to inform practice?
• Sources of evidence: Scientific literature surrounding the validity-related evidence for the technology; internal validation and reliability; professional network.
• Take-home messages: Evaluate continually. Consider the consequences. Pilot where possible. Partner where appropriate.

Question 3: Can you integrate, manage, and analyze the data effectively?
• Follow-up questions: In what format and by what means is information from the technology delivered, and how much cleaning needs to be done to integrate it with other measurements? Do you have the analytical resources to handle and analyze the data?
• Sources of evidence: Data samples from the company, short-term trials; internal discussions or methodologic and statistical consultancy.
• Take-home messages: Plan ahead. Educate practitioners involved in collection on proper formatting and process. Automate processes where possible. Audit data and proactively set up quality controls.

Question 4: Can you implement the technology in your practice?
• Follow-up questions: What burden is placed on athletes and practitioners to collect the data? Does your culture allow for technology to be implemented and data to be collected, and will the technology affect the culture? Does your context allow for data to inform and alter practice?
• Sources of evidence: Professional network; qualitative scientific evidence; internal communication (formal and informal) and education.
• Take-home messages: Understand the implementation context. Look for invisible monitoring opportunities. Build technological implementation into existing routines.

Figure 1. A critical decision-making framework for integrating technology in sport.
A Brief Overview of Unified Validity Theory. Grounded in the work of Loevinger41 and Cronbach and Meehl,42 Samuel Messick posited and promoted a unified view of validity theory.43–45 In place of validity types, the following definition of unified validity theory was proposed by Messick and adopted in the Standards for Educational and Psychological Testing: "Validity is an integrated evaluative judgment of the degree to which empirical evidence and theoretical rationales support the adequacy and appropriateness of interpretations and actions based on test scores or other modes of assessment."46
Unpacking this definition reveals 3 primary ways in which unified validity theory differs from other common and more traditional views of validity:
1. Validity is about claims and inferences that can be made, not about measures.
2. Validity evidence has multiple sources, and the aim is an integrated evaluative judgment (Figure 2).
3. Validity invites consideration of the consequences associated with technology and the data it provides.
How Does Validity Theory Translate to Adopting a New Technology? Unified validity theory provides prac- titioners and scientists with a lens to look through as they consider different measures. When we look through this lens, we see that thinking about measurements from technology is similar to thinking about science. The answers to most of our questions are more nuanced than yes versus no or valid versus invalid. Instead, we use terms such as it depends, to a degree, or in this specific context. This sets the stage for a practitioner considering a new technology to evaluate the technology on a continuum in terms of his or her specific context.
Because validation is an integrative, evaluative judgment, practitioners should examine all the available sources of evidence regarding a given technology and the specific metric it provides. Some answers may be found in the peer-reviewed literature, or the practitioner may have to pilot data internally. Reaching out to colleagues who have already implemented these technologies may offer opportunities to discuss their internal validity-related evidence.
The consequences of testing are a final consideration that unified validity theory emphasizes and that practitioners should carefully consider. Implicit and explicit consequences are inherent when measuring and testing something by implementing a technology. Athletes and practitioners will consider the quality being measured as important, athletes may train to improve that given quality, and decisions may be based more heavily on the provided data than on other pieces of information. These intended and unintended consequences can be positive or negative but should be considered carefully.
This process and evaluation must be performed on each of the different metrics that a practitioner hopes to use to inform decision making. Returning to our volleyball example, the IMU provides a more accurate measure of jump count than does video notation. However, this same IMU also provides measures of jump height and ground reaction forces. Although these measures may theoretically be linked to overuse injuries and performance, they should each be further investigated. In this case, video notation would not be the appropriate comparison measure, and more advanced biomechanical analyses and equipment would be preferable.
Ultimately, no technology, or the data it provides, is perfectly trustworthy. The practitioner faces this question: Given all the information at his or her disposal, are the limitations of the technology minimal enough that it can still inform decision making?
Take-Home Suggestions.
• Evaluate continually. View trust in one's data as an ongoing endeavor to judge how trustworthy the technology and data are on a spectrum using all the available sources of validity-related evidence.
• Consider the consequences of testing. What potential consequences, intended and unintended, could introducing the technology have for athletes and practitioners?
• Pilot test where possible. If practitioners can gain early access to the technology, they can conduct preliminary analyses of the data before purchasing (a minimal example follows below).
• Partner where appropriate. When in-house expertise is not sufficient for examining certain aspects of the data, collaborate with a research laboratory, university, or third-party company to facilitate analyses of the trustworthiness of the technology.

Figure 2. Some sources of validity-related evidence.
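As one concrete way to act on the pilot-test suggestion above, a practitioner with trial data could run a simple agreement analysis before purchase. The sketch below computes a Bland-Altman-style mean bias and 95% limits of agreement on fabricated pilot values; it is one common approach, not the only defensible one and not one this article prescribes:

```python
# Minimal sketch: bias and 95% limits of agreement on hypothetical pilot data.
import statistics

device = [31.2, 28.4, 35.0, 30.1, 29.8, 33.5]     # hypothetical device values
criterion = [30.5, 28.9, 34.1, 30.6, 29.1, 33.0]  # hypothetical criterion values

diffs = [d - c for d, c in zip(device, criterion)]
bias = statistics.mean(diffs)         # systematic difference
sd = statistics.stdev(diffs)          # spread of the differences

print(f"Bias: {bias:+.2f}")
print(f"95% limits of agreement: {bias - 1.96 * sd:+.2f} to {bias + 1.96 * sd:+.2f}")
```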
Question 3: Can You Integrate, Manage, and Analyze the Data Effectively?
A 360° view of an athlete's training, recovery, lifestyle, and so on is a major potential benefit of technological implementation. If a technology and the data it provides are deemed trustworthy enough to be added to this holistic athlete view, the next step is to understand how the data will be extracted, integrated with other data sources, and analyzed in a meaningful way.47 Different technologies provide different levels of granularity to the data, and the means of data extraction can vary from manual downloading of files (eg, spreadsheets) to application programming interfaces that allow automated data extraction. The extracted data may also need additional cleaning before analysis and modification to be integrated with other data sources. It is vital to understand how much time will be needed to extract and clean the data; although these processes (eg, manual downloads and spreadsheet data management) may be common, they limit scalability and may preclude successful and sustainable implementation of a given technology.
A well-known truism in the data-science realm is that data scientists spend most of their time cleaning and preparing data for analysis. Given the complexity and challenges inherent in combining data from disparate technologies, practitioners must consider whether they have the expertise on their team or at their disposal to create a system that brings their data together. This may be accomplished through in-house data-science personnel, third-party athlete-management systems, or external consulting agencies. Without this expertise, it is very challenging to combine data sources to create a holistic athlete profile.
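In practice, the integration step often reduces to joining exports on shared keys such as athlete and date. A minimal sketch, assuming hypothetical file names and columns:

```python
# Minimal sketch: joining a wearable export with a wellness-survey export
# into one athlete-day table. File and column names are assumptions.
import pandas as pd

gps = pd.read_csv("gps_export.csv", parse_dates=["date"])            # athlete, date, total_distance_m
wellness = pd.read_csv("wellness_survey.csv", parse_dates=["date"])  # athlete, date, soreness, sleep_h

# An outer join keeps athlete-days present in either source, which makes
# collection gaps visible instead of silently dropping them.
athlete_day = gps.merge(wellness, on=["athlete", "date"], how="outer")
print(athlete_day.isna().sum())  # quick view of missingness per column
```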
Once the data are collected and combined, analysis presents its own challenges. Over the last few decades, intensive longitudinal data have become increasingly common in elite sport settings. These data present a specific set of challenges and assumptions inherent to repeated-measures data, and in many instances, more sophisticated analyses are recommended to deal with these challenges.48,49 In the workload-injury field, for example, the authors of a methodologic review50 found that statistical approaches capable of adequately addressing these challenges had not been applied to many intensive longitudinal data sets examining how workload relates to injury risk. Statistical expertise, in-house or outsourced, can help ensure that the statistical approaches applied are appropriate for the data complexity.
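One analysis type commonly recommended for such repeated-measures athlete data is a mixed-effects model with a random effect per athlete. A minimal sketch, with a hypothetical data file and a deliberately simple specification (a real analysis would be tailored to the actual question and data structure):

```python
# Minimal sketch: mixed-effects model for intensive longitudinal athlete data.
# File and column names are assumptions for illustration.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("athlete_days.csv")  # columns: athlete, daily_load, wellness

# A random intercept per athlete accounts for repeated measures within players.
model = smf.mixedlm("wellness ~ daily_load", data=df, groups=df["athlete"])
result = model.fit()
print(result.summary())
```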
To understand the demands of accessing, cleaning, integrating, and analyzing the data that a new technology will provide, it is highly prudent to ask the company for a free trial and access to their data streams. Discussions with other practitioners in the industry who already use the technology may also be fruitful to paint a realistic picture of the data-management demands.
Take-Home Suggestions.
• Plan ahead: Obtaining data samples from prospective companies ahead of time helps to ensure that the processes and systems can be tested and evaluated before a technology is introduced.
• Educate: By training practitioners in basic principles of data collection,51 many of the data-cleaning challenges can be proactively prevented.
• Automate: Software solutions (eg, Alteryx [Irvine, CA], Fivetran [Oakland, CA], Matillion [Manchester, UK]) and open-source coding platforms (eg, R [Vienna, Austria], Python [Wilmington, DE]) can enable data scientists to streamline and automate many processes, thereby reducing the amount of time required to manually input, download, and edit data.
• Audit: Set up data-audit or quality-control checks to ensure the data are clean and appropriately combined, and then respond appropriately when you find mistakes and outliers (a minimal sketch follows this list).
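As a concrete example of the audit suggestion above, a small quality-control routine can flag duplicated athlete-days, missing values, and implausible readings before rows enter the combined data set. Column names and plausibility ranges are assumptions:

```python
# Minimal sketch: basic quality-control checks on an athlete-day table.
import pandas as pd

PLAUSIBLE = {"total_distance_m": (0, 20000), "jump_count": (0, 300)}  # assumed ranges

def audit(df: pd.DataFrame) -> pd.DataFrame:
    """Return the rows failing duplicate, missingness, or range checks."""
    problems = df.duplicated(subset=["athlete", "date"], keep=False)
    for col, (lo, hi) in PLAUSIBLE.items():
        problems |= df[col].isna() | ~df[col].between(lo, hi)
    return df[problems]

# Usage: review audit(athlete_day) before the data inform any decision.
```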
Question 4: Can You Implement the Technology in Your Practice?
The fourth and final question to consider is whether a technology can realistically be implemented in your specific sporting context. In the research arena, we know that injury-prevention protocols that are effective in a controlled trial setting may fail to deliver the same results in a real-world environment because the effects are largely dependent on the adoption and implementation of the program.43–45 In the same way, even a near-perfect technology may fail in an environment where it encounters an implementation problem.
Implementation failures may occur anywhere along the data-science pipeline. The challenges in data analysis were largely addressed in question 3, but implementation challenges are especially pivotal to consider at the data-collection and data-dissemination steps.
Implementation challenges in data collection stem from increased practitioner or athlete burden. The demand on staff and athletes alike to collect data can be significant. It is not uncommon for the rollout of a new technology to be one more item in a long list of responsibilities for staff. Athletes are also not always burden free when it comes to the implementation process. It is important to consider what the athletes will be asked to do, the collective burden of technology and data collection on athletes as a whole, and how the athletes will perceive the new technologies. One must consider the possible ramifications of this increased staff and athlete burden and whether the technology’s potential benefits outweigh the cost of implementation.
The flexibility, mentality, and willingness of the people in an environment to adopt new practices can determine whether implementation challenges in data-informed decision making arise. Certain sports have already embraced technology, whereas others may be deemed resistant to technological innovations. It is therefore essential to consider whether members of your sporting culture will accept a specific technology in their environment. For numerous reasons, an organization may not want the data or may not be keen to have the answers that the data could provide. Also, deep-rooted doubt in the reliability or validity of the data may prevail, despite the best available evidence. Many sporting cultures resist changing the way things have always been done, so technological implementation may be seen as changing or modifying their sport.
The successful implementation of any technology requires careful consideration of the time and resources required from practitioners and athletes, the processes and procedures that need to be in place to minimize the burden from the technology, and the communication and decision-making channels whereby the analyzed data will be delivered and used to inform practice.
Take-Home Suggestions.
• Understand and improve the implementation context. Consider the burden and challenges on practitioners and athletes at all points across the data pipeline. Educating practitioners on the rationale and benefits associated with technology and empowering them in their roles may facilitate "buy-in" and more successful implementation.
• Look for invisible monitoring opportunities, which impose virtually no burden on athletes. The data collection is automated, and nothing else is needed from the athletes. These invisible opportunities, whatever they may be, will still require data aggregation and integration from staff, as well as an environment for informed decision making, but they may enhance implementation success when athlete buy-in is the primary barrier.
• Build technological implementation into existing and new routines and tasks to make it obvious that technology is being integrated and something new is being introduced.
The Final Question: Is the Technology Worth It?
The final step, which should be considered seriously only once all 4 previous questions have been answered affirmatively, is whether the technology is worth the investment. This is essentially a cost-benefit analysis comparing the expected net performance effect with the financial burden that the technology carries, weighed in the overall context of the 4 questions in the framework.
On the Flip Side: Delivering With Technology
Although our focus in this article was predominantly on guiding researchers and practitioners to critically evaluate whether they should purchase a new technology, we believe it would be a mistake not to briefly address several key considerations for when that technology is introduced.
The following concrete recommendations may facilitate buy-in and increase the probability that technological implementation succeeds: (1) undersell and overdeliver; (2) allow athletes to provide the equivalent of informed consent before data collection; (3) try to include multiple key stakeholders in data-evaluation sessions so that the technology does not isolate support staff or create a "secret society"; (4) continue to evaluate signal-to-noise ratios and provide clarity on accuracy and reliability; (5) allow complex analyses to take place behind closed doors but present simplified, clean data that support important messages to coaches and athletes; and (6) do not use data in ways that contribute to political agendas or undermine the integrity of colleagues.
Integrating advanced technology into high-performance sport can be challenging. Emotions, egos, time constraints, and unrealistic expectations can make it difficult to gain approval as well as purchase and implement new technology. Many young practitioners may purchase
technology in an attempt to win favor from upper management as they work to encourage enthusiasm, hope, and belief in players and stakeholders. Without being fully aware of the power of the placebo (that is, the excitement of buying something new), they may convince sports administrators to allocate significant funds for a speculative purchase. We hope that the critical framework provided in this article will help to prevent these types of poor decisions, yet we believe the placebo, or belief, effect is an important aspect of technological implementation to leverage.
Interestingly, it has been reported that some of the earliest placebo researchers examined the influence of fake advanced medical technology for treating pain. For those who believed new technology could take away pain, a sophisticated device that supposedly harnessed the power of special metals worked equally well when it merely looked like metal but was actually made of wood.52 Introducing advanced technology into a high-performance program may have similarly positive effects if it is sold and implemented the right way, with a collective belief among practitioners and athletes that the technology can have a positive effect. In contrast, organizational differences of opinion can dilute the power of the belief effect. When the support team is divided on the efficacy of using new technology, their conversations and attitudes can ultimately undermine buy-in from athletes. Technological implementation should ultimately be a collective team effort whereby stakeholders engage throughout the stages of the decision-making framework and as the technology is implemented.
CONCLUSIONS
Technology may help organizations reach their grandiose vision or drag them down because of its pitfalls. We hope this critical framework will empower practitioners and organizations to make informed, wise decisions about whether a technology should be implemented. However, we acknowledge that this stepwise approach is an oversimplification and over-regimentation of the technological-evaluation process. At times, a technology may simply cost too much, in which case the 4 questions are irrelevant. Several questions may be investigated at once, such as when a company provides a free trial. Although the framework may be applied differently than described in this stepwise presentation, we caution that all 4 questions are critical to answer affirmatively before an investment is made.
Three fundamental principles underpin this type of framework: (1) proactivity allows practitioners to start with the end in mind and to plan ahead in considering how to solve implementation challenges across the data pipeline; (2) critical thinking informs how practitioners evaluate the trustworthiness of technology and its data, as well as the intended and unintended consequences of introducing the technology; and (3) collaboration, specifically internal collaboration, underpins the success of communication and data-informed decisions, and external collaboration can be essential for piloting technologies where appropriate (eg, outsourcing validity-related, data-management, or analytical work that is beyond an organization's current capabilities). Each of these principles serves practitioners well in thinking critically about technology and, even more broadly, in pursuing practitioner and organizational excellence.
The decision "to tech or not to tech" is critical but complex. It should be made through a careful evaluation of evidence related to the technology and the environment in which it will be deployed. Ultimately, it is a question that every organization and practitioner will face in today's technologically driven age. Much like the technologies in question, no person or company will make perfect decisions every time, but a thoughtful framework and critical approach can help them hit the target more often than not.
REFERENCES
1. Bourdon PC, Cardinale M, Murray A, et al. Monitoring athlete training loads: consensus statement. Int J Sports Physiol Perform. 2017;12(suppl 2):S2161–S2170. doi:10.1123/IJSPP.2017-0208
2. Borresen J, Lambert MI. The quantification of training load, the training response and the effect on performance. Sports Med. 2009;39(9):779–795. doi:10.2165/11317780-000000000-00000
3. Saw AE, Main LC, Gastin PB. Monitoring the athlete training response: subjective self-reported measures trump commonly used objective measures: a systematic review. Br J Sports Med. 2016;50(5):281–291. doi:10.1136/bjsports-2015-094758
4. Neupert EC, Cotterill ST, Jobson SA. Training-monitoring engagement: an evidence-based approach in elite sport. Int J Sports Physiol Perform. 2018;14(1):99–104. doi:10.1123/ijspp.2018-0098
5. Nässi A, Ferrauti A, Meyer T, Pfeiffer M, Kellmann M. Psychological tools used for monitoring training responses of athletes. Perform Enhanc Health. 2017;5(4):125–133. doi:10.1016/j.peh.2017.05.001
6. Flatt AA, Esco MR. Smartphone-derived heart-rate variability and training load in a women's soccer team. Int J Sports Physiol Perform. 2015;10(8):994–1000. doi:10.1123/ijspp.2014-0556
7. Buchheit M. Monitoring training status with HR measures: do all roads lead to Rome? Front Physiol. 2014;5:73. doi:10.3389/fphys.2014.00073
8. Gathercole R, Sporer B, Stellingwerff T. Countermovement jump performance with increased training loads in elite female rugby athletes. Int J Sports Med. 2015;36(9):722–728. doi:10.1055/s-0035-1547262
9. Wu PP-Y, Sterkenburg N, Everett K, Chapman DW, White N, Mengersen K. Predicting fatigue using countermovement jump force-time signatures: PCA can distinguish neuromuscular versus metabolic fatigue. PLoS One. 2019;14(7):e0219295. doi:10.1371/journal.pone.0219295
10. Meng E, Sheybani R. Insight: implantable medical devices. Lab Chip. 2014;14(17):3233–3240. doi:10.1039/C4LC00127C
11. van der Kruk E, Reijne MM. Accuracy of human motion capture systems for sport applications; state-of-the-art review. Eur J Sport Sci. 2018;18(6):806–819. doi:10.1080/17461391.2018.1463397
12. Grigg J, Haakonssen E, Rathbone E, Orr R, Keogh JWL. The validity and intra-tester reliability of markerless motion capture to analyse kinematics of the BMX Supercross gate start. Sports Biomech. 2018;17(3):383–401. doi:10.1080/14763141.2017.1353129
13. Bruderer T, Gaisl T, Gaugg MT, et al. On-line analysis of exhaled breath. Chem Rev. 2019;119(19):10803–10828. doi:10.1021/acs.chemrev.9b00005
14. Peake JM, Kerr G, Sullivan JP. A critical review of consumer wearables, mobile applications, and equipment for providing biofeedback, monitoring stress, and sleep in physically active populations. Front Physiol. 2018;9:743. doi:10.3389/fphys.2018.00743
15. Petersen CJ, Pyne DB, Portus MR, Dawson BT. Comparison of player movement patterns between 1-day and test cricket. J Strength Cond Res. 2011;25(5):1368–1373.
16. Gray AJ, Jenkins DG. Match analysis and the physiological demands of Australian football. Sports Med. 2010;40(4):347–360. doi:10.2165/11531400-000000000-00000
17. Michalsik LB, Madsen K, Aagaard P. Match performance and physiological capacity of female elite team handball players. Int J Sports Med. 2014;35(7):595–607.
18. Bangsbo J, Nørregaard L, Thorsø F. Activity profile of competition soccer. Can J Sport Sci. 1991;16(2):110–116.
19. Reilly T. Energetics of high-intensity exercise (soccer) with particular reference to fatigue. J Sports Sci. 1997;15(3):257–263. doi:10.1080/026404197367263
20. Reilly T, Thomas V. A motion analysis of work-rate in different positional roles in professional football match-play. J Hum Mov Stud. 1976;2:87–97.
21. Barris S, Button C. A review of vision-based motion analysis in sport. Sports Med. 2008;38(12):1025–1043. doi:10.2165/00007256-200838120-00006
22. Cummins C, Orr R, O'Connor H, West C. Global positioning systems (GPS) and microtechnology sensors in team sports: a systematic review. Sports Med. 2013;43(10):1025–1042. doi:10.1007/s40279-013-0069-2
23. Torreño N, Munguía-Izquierdo D, Coutts A, de Villarreal ES, Asian-Clemente J, Suarez-Arrones L. Relationship between external and internal loads of professional soccer players during full-matches in official games using global positioning systems and heart-rate technology. Int J Sports Physiol Perform. 2016;11(7):940–946. doi:10.1123/ijspp.2015-0252
24. Li RT, Kling SR, Salata MJ, Cupp SA, Sheehan J, Voos JE. Wearable performance devices in sports medicine. Sports Health. 2016;8(1):74–78. doi:10.1177/1941738115616917
25. FIFA Quality Performance Reports for EPTS. Football Technology, FIFA Web site. https://football-technology.fifa.com/en/media-tiles/fifa-quality-performance-reports-for-epts/. Accessed February 4, 2020.
26. Bradley PS, Lago-Peñas C, Rey E, Gomez Diaz A. The effect of high and low percentage ball possession on physical and technical profiles in English FA Premier League soccer matches. J Sports Sci. 2013;31(12):1261–1270. doi:10.1080/02640414.2013.786185
27. Bush MD, Archer DT, Hogg R, Bradley PS. Factors influencing physical and technical variability in the English Premier League. Int J Sports Physiol Perform. 2015;10(7):865–872. doi:10.1123/ijspp.2014-0484
28. Gregson W, Drust B, Atkinson G, Salvo VD. Match-to-match variability of high-speed activities in Premier League soccer. Int J Sports Med. 2010;31(4):237–242. doi:10.1055/s-0030-1247546
29. MacDonald KJ, Palacios-Derflingher LM, Emery CA, Meeuwisse WH. The effect of injury definition and surveillance methodology on measures of injury occurrence and burden in elite volleyball. Int J Sports Med. 2018;39(11):860–866. doi:10.1055/a-0577-4639
30. Bere T, Kruczynski J, Veintimilla N, Hamu Y, Bahr R. Injury risk is low among world-class volleyball players: 4-year data from the FIVB Injury Surveillance System. Br J Sports Med. 2015;49(17):1132–1137. doi:10.1136/bjsports-2015-094959
31. Helland C, Bojsen-Møller J, Raastad T, et al. Mechanical properties of the patellar tendon in elite volleyball players with and without patellar tendinopathy. Br J Sports Med. 2013;47(13):862–868. doi:10.1136/bjsports-2013-092275
32. MacDonald K, Bahr R, Baltich J, Whittaker JL, Meeuwisse WH. Validation of an inertial measurement unit for the measurement of jump count and height. Phys Ther Sport. 2017;25:15–19. doi:10.1016/j.ptsp.2016.12.001
33. Mujika I, Halson S, Burke LM, Balagué G, Farrow D. An integrated, multifactorial approach to periodization for optimal performance in individual and team sports. Int J Sports Physiol Perform. 2018;13(5):538–561. doi:10.1123/ijspp.2018-0093
34. Kiely J. Periodization theory: confronting an inconvenient truth. Sports Med. 2018;48(4):753–764. doi:10.1007/s40279-017-0823-y
35. Haugen T, Buchheit M. Sprint running performance monitoring: methodological and practical considerations. Sports Med. 2016;46(5):641–656. doi:10.1007/s40279-015-0446-0
36. Coutts AJ, Duffield R. Validity and reliability of GPS devices for measuring movement demands of team sports. J Sci Med Sport. 2010;13(1):133–135. doi:10.1016/j.jsams.2008.09.015
37. Bellenger CR, Fuller JT, Thomson RL, Davison K, Robertson EY, Buckley JD. Monitoring athletic training status through autonomic heart rate regulation: a systematic review and meta-analysis. Sports Med. 2016;46(10):1461–1486. doi:10.1007/s40279-016-0484-2
38. Kelly JM, Strecker RE, Bianchi MT. Recent developments in home sleep-monitoring devices. ISRN Neurol. 2012;2012:768794. doi:10.5402/2012/768794
39. Torres-Ronda L, Schelling X. Critical process for the implementation of technology in sport organizations. Strength Cond J. 2017;39(6):54–59. doi:10.1519/SSC.0000000000000339
40. Liebermann DG, Katz L, Hughes MD, Bartlett RM, McClements J, Franks IM. Advances in the application of information technology to sport performance. J Sports Sci. 2002;20(10):755–769. doi:10.1080/026404102320675611
41. Loevinger J. Objective tests as instruments of psychological theory. Psychol Rep. 1957;3(3):635–694. doi:10.2466/pr0.1957.3.3.635
42. Cronbach LJ, Meehl PE. Construct validity in psychological tests. Psychol Bull. 1955;52(4):281–302.
43. Messick S. Foundations of validity: meaning and consequences in psychological assessment. ETS Res Rep Series. 1993;1993(2):i–18. doi:10.1002/j.2333-8504.1993.tb01562.x
44. Messick S. Validity of test interpretation and use. ETS Res Rep Series. 1990;1990(1):1487–1495. doi:10.1002/j.2333-8504.1990.tb01343.x
45. Messick S. The standard problem: meaning and values in measurement and evaluation. Am Psychol. 1975;30(10):955–966. doi:10.1037/0003-066X.30.10.955
46. American Educational Research Association, American Psychological Association, National Council on Measurement in Education. Standards for Educational and Psychological Testing. Washington, DC: American Psychological Association; 1999.
47. Dayal U, Castellanos M, Simitsis A, Wilkinson K. Data integration flows for business intelligence. In: Kersten M, Novikov B, Teubner J, eds. EDBT '09: Proceedings of the 12th International Conference on Extending Database Technology: Advances in Database Technology. Saint Petersburg, Russia; March 22, 2009:1–11. https://dl.acm.org/doi/abs/10.1145/1516360.1516362. Accessed May 20, 2020.
48. Bolger N, Laurenceau J-P. Intensive Longitudinal Methods: An Introduction to Diary and Experience Sampling Research. New York, NY: Guilford Press; 2013.
49. Walls TA, Schafer JL, eds. Models for Intensive Longitudinal Data. New York, NY: Oxford University Press; 2006.
50. Windt J, Ardern CL, Gabbett TJ, et al. Getting the most out of intensive longitudinal data: a methodological review of workload–injury studies. BMJ Open. 2018;8(10):e022626. doi:10.1136/bmjopen-2018-022626
51. Broman KW, Woo KH. Data organization in spreadsheets. Am Stat. 2018;72(1):2–10. doi:10.1080/00031305.2017.1375989
52. Finniss DG. Placebo effects: historical and modern evaluation. Int Rev Neurobiol. 2018;139:1–27.
Address correspondence to Johann Windt, PhD, CSCS, Vancouver Whitecaps FC, 3065 Wesbrook Mall, Vancouver, BC V6T 1Z3, Canada. Address email to [email protected].