CAN ENDOSCOPIC LUMBAR DISCECTOMY VIDEOS SHARED ON YOUTUBE BE USED AS PATIENT EDUCATION TOOLS? A QUALITY CONTROL STUDY
Original Article
Pages: 239-244
October 2020


J Turk Spinal Surg 2020;31(4):239-244
1. Çanakkale Onsekiz Mart University Faculty of Medicine, Department of Orthopaedics and Traumatology, Çanakkale, Turkey
2. Balıkesir University Faculty of Medicine, Department of Neurosurgery, Balıkesir, Turkey
Received Date: 18.07.2020
Accepted Date: 25.10.2020
Publish Date: 18.11.2020

ABSTRACT

Objective:

Today, the internet is often the first resource for health information for people who are worried about their health condition. For this reason, it is crucial to clarify the reliability and correctness of the content of online medical videos. The present study therefore aimed to investigate the reliability and correctness of videos associated with endoscopic lumbar discectomy on YouTube®.

Materials and Methods:

We conducted a search on YouTube® using the keywords “endoscopic lumbar discectomy”. The titles of the first 50 videos associated with endoscopic lumbar discectomy were obtained and evaluated simultaneously by two spine surgeons. We excluded from our analysis videos with advertisements and videos in a language other than English. We evaluated the videos using the DISCERN and JAMA scores and the video power index.

Results:

The average number of views per video was 95,954. Most of the videos presented surgical techniques or general information. The average video length was 7.67 minutes. The average DISCERN and JAMA scores were 30.2 and 1.94, respectively. According to the average DISCERN scores, 38% of the videos were rated as very poor, 44% as poor, 16% as average and 2% as good in terms of reliability.

Conclusion:

Generally, the reliability of the videos uploaded on YouTube® associated with endoscopic lumbar discectomy was “poor” or “very poor”. Therefore, we recommend that YouTube® videos should not be used as patient education tools for endoscopic lumbar discectomy.

Keywords:
Lumbar disc herniation, endoscopic lumbar discectomy, reliability, YouTube

INTRODUCTION

YouTube® is currently the leading video-sharing internet site, used by more than 30 million people daily(1). For this reason, it is crucial to clarify the reliability and correctness of medical videos on YouTube®. Recently, many studies have evaluated the contents of medical videos on YouTube®, and most reported low reliability(1-4).

Spine surgery is a medical topic that is commonly searched on the internet(5). Many patients who are recommended surgical treatment for lumbar disc herniation search the internet, particularly YouTube®, for additional information. The present study is the first in the literature to evaluate the contents of videos associated with endoscopic lumbar discectomy, a relatively new technique that has recently gained popularity. The main aim of the present study was to investigate the reliability and correctness of videos associated with endoscopic lumbar discectomy on YouTube®.

MATERIALS AND METHODS

We searched “endoscopic lumbar discectomy” on YouTube® on 8 October 2019 and sorted the results by view count. The titles of the first 50 YouTube® videos associated with endoscopic lumbar discectomy were obtained and evaluated simultaneously by two spine surgeons. We screened the results and excluded the following from our analysis: videos with advertisements, duplicate or repetitive videos, videos shorter than 30 seconds and videos in a language other than English. We divided the videos into subgroups as “real” and “animation” according to the type of display; as “physician”, “medical facility”, “manufacturing company”, “TV channel” and “medical illustrator” according to the uploader; and as “patient info”, “surgical technique”, “patient experience” and “lecture” according to the content. Additionally, the numbers of views, comments, likes and dislikes, the upload date, the video length and whether or not the video had audio were recorded.

We calculated the like ratio, a measure of each video's reputation, as follows: like ratio = [like count / (like count + dislike count)] × 100. To evaluate the view and like ratios jointly, we used the video power index (VPI) described by Erdem and Karaca(6): VPI = (like ratio × view ratio) / 100. The view ratio (average daily view count) was calculated as: view ratio = total view count / number of days the video had been online on YouTube®.
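For illustration only, the following minimal Python sketch computes these three metrics for a single video; it is not part of the study's methodology, and all values in the example are hypothetical:

    from dataclasses import dataclass

    @dataclass
    class VideoStats:
        likes: int
        dislikes: int
        views: int
        days_online: int  # days since upload

    def like_ratio(v: VideoStats) -> float:
        # like ratio = like count / (like count + dislike count) x 100
        return 100.0 * v.likes / (v.likes + v.dislikes)

    def view_ratio(v: VideoStats) -> float:
        # view ratio = total view count / days the video has been online
        return v.views / v.days_online

    def video_power_index(v: VideoStats) -> float:
        # VPI = (like ratio x view ratio) / 100
        return like_ratio(v) * view_ratio(v) / 100.0

    # Hypothetical video: 2,200 likes, 150 dislikes, 500,000 views, 400 days online
    v = VideoStats(likes=2200, dislikes=150, views=500000, days_online=400)
    print(round(video_power_index(v), 1))  # like ratio 93.6 x view ratio 1250 / 100 = 1170.2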

Evaluation of the Reliability

Each video was evaluated by the two spine surgeons simultaneously using the DISCERN and JAMA scales. Total scores were recorded individually by the two viewers to remain impartial. We used the means of both viewers' DISCERN and JAMA scores in the analysis.

DISCERN Scale: The DISCERN scale evaluates the reliability of videos. DISCERN scores of 63-75 points are categorised as “excellent”, 51-62 as “good”, 39-50 as “average”, 28-38 as “poor” and <28 as “very poor”. Based on this method, higher DISCERN scores indicate a higher quality of information(7) (Table 1).

Table 1
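As a minimal sketch (a hypothetical helper, not part of the study's methodology), the DISCERN score bands above translate directly into a lookup function:

    def discern_category(score: float) -> str:
        # DISCERN bands: 63-75 excellent, 51-62 good, 39-50 average,
        # 28-38 poor, <28 very poor
        if score >= 63:
            return "excellent"
        if score >= 51:
            return "good"
        if score >= 39:
            return "average"
        if score >= 28:
            return "poor"
        return "very poor"

    print(discern_category(30.2))  # the study's mean DISCERN score falls in the "poor" band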

JAMA Scale: The JAMA scale is a tool that is used to evaluate information obtained from medical websites. Based on this method, higher scores indicate an increased quality of the assessed information(8) (Table 2).

Table 2
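The JAMA benchmarks described by Silberg et al.(8) are commonly scored as one point for each of four criteria (authorship, attribution, disclosure and currency), giving a total of 0-4, which is consistent with the mean scores reported below. A minimal sketch, assuming a simple boolean checklist per video (not the study's own scoring sheet):

    def jama_score(authorship: bool, attribution: bool,
                   disclosure: bool, currency: bool) -> int:
        # One point per JAMA benchmark met (0-4 total)
        return sum([authorship, attribution, disclosure, currency])

    # Hypothetical video that names its authors and displays an upload date only
    print(jama_score(authorship=True, attribution=False,
                     disclosure=False, currency=True))  # 2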

Statistical Analysis

We used the IBM Statistical Package for the Social Sciences (SPSS) Statistics 22 software for statistical analysis. The Kruskal-Wallis test was used for intergroup evaluations and the Mann-Whitney U test to identify the group responsible for the difference. Spearman's analysis was used to evaluate the correlation between the data. We calculated Krippendorff's α to evaluate the inter-rater consistency between the viewers. Krippendorff's α<0.67 was classified as weak, 0.67≤α<0.80 as moderate and α≥0.80 as excellent. A p value of less than 0.05 was considered significant.
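For reproducibility, these analyses map onto standard statistical library calls. A minimal Python sketch with hypothetical data follows (it assumes SciPy and the third-party krippendorff package are available; none of the values below are from the study):

    import numpy as np
    from scipy.stats import kruskal, mannwhitneyu, spearmanr
    import krippendorff  # third-party package for inter-rater agreement

    # Hypothetical DISCERN scores grouped by uploader type
    physician = [28, 31, 25, 34, 29]
    facility = [30, 40, 36, 33]
    other = [22, 27, 29, 25]

    h_stat, p_kw = kruskal(physician, facility, other)   # intergroup comparison
    u_stat, p_mw = mannwhitneyu(facility, physician)     # pairwise follow-up

    # Hypothetical scores given by the two viewers to the same four videos
    rater1 = [30.0, 25.0, 41.0, 28.0]
    rater2 = [32.0, 24.0, 39.0, 27.0]
    rho, p_rho = spearmanr(rater1, rater2)               # inter-rater correlation

    # Krippendorff's alpha: one row per rater, one column per video
    alpha = krippendorff.alpha(reliability_data=np.array([rater1, rater2]),
                               level_of_measurement="interval")
    print(f"alpha={alpha:.2f}  (>=0.80 excellent, 0.67-0.80 moderate, <0.67 weak)")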

RESULTS

We analysed the top 50 most watched videos. Forty-two videos contained real images, while eight were animations. The content of the videos comprised surgical techniques in 70% (n=35), general introduction (patient info) in 24% (n=12), patient experiences in 4% (n=2) and lectures in 2% (n=1). In addition, 64% of the videos were shared by physicians, 22% by medical facilities, 10% by manufacturing companies, 2% by TV channels and 2% by medical illustrators.

Thirty videos (60%) mentioned using the transforaminal technique, nine (18%) the interlaminar technique, nine (18%) the microendoscopic technique and one (2%) the unilateral biportal endoscopic technique. One video (2%) did not mention any specific endoscopic technique. Twenty-seven videos (54%) had audio while 23 (46%) did not. The general features of the videos used in this study are shown in Table 3.

Table 3

The mean view count per video was 95,954 (range: 2,413-2,827,927). The total number of views of all the videos was 4,527,724. Video lengths, numbers of views, durations since upload, numbers of comments, numbers of likes, view ratios (daily view counts), like ratios and VPI assessments are shown in Table 4. The distribution of the videos according to uploader is shown in Table 5.

Table 4
Table 5

The average DISCERN scores assigned by the two viewers were 30.22±8.4 and 30.18±9.2, respectively. The average JAMA scores were 1.85±0.35 and 1.92±0.3, respectively. Hence, the overall average DISCERN score was 30.2±8.5 and the average JAMA score was 1.89±0.3. When the DISCERN scores of the two viewers were analysed using the Spearman test, we found a strong correlation, with a moderate agreement between the viewers in the Krippendorff's alpha test (r=0.776, p<0.001, Krippendorff's α=0.77). The JAMA scores of the two viewers showed a very strong correlation in the Spearman test, also with a moderate agreement in the Krippendorff's alpha test (r=0.758, p<0.001, Krippendorff's α=0.731).

After analysing the average DISCERN scores of the two viewers, we found that the quality of the videos was very poor in 38%, poor in 44%, average in 16% and good in 2% of the videos used in our study.

We compared the DISCERN, JAMA and VPI values of the videos between the physician, medical facility and other groups. We found no significant differences in DISCERN and JAMA scores between these groups (p=0.083 and p=0.466, respectively). Conversely, the VPI values of the videos uploaded by medical facilities were significantly higher than those of the videos uploaded by physicians and others (p=0.031) (Figure 1).

Figure 1

Since “surgical technique” was the largest content subgroup, we compared DISCERN and JAMA scores and VPI assessments between the surgical technique videos and the others. The average DISCERN score of the surgical technique videos was significantly lower than that of the others (28.1 vs 35, p=0.019). However, the average JAMA scores and VPI values did not differ significantly between the surgical technique videos and the others (p=0.528 and p=0.646, respectively). Although there was a considerable difference in mean VPI values between the surgical technique videos and the others (10.8 vs 137.9), it did not reach statistical significance. This difference was driven by the substantially higher view and like counts of the first and second most viewed videos, which were a patient experience video and a general introduction video (view counts: 2,830,340 and 1,099,638; like counts: 2,200 and 6,600, respectively) (Figure 2).

Figure 2

One of the parameters used in comparing the videos was the presence or absence of audio. The average DISCERN score of the videos with audio was 34.6, while that of the videos without audio was 25. The higher average DISCERN score of the videos with audio was statistically significant (p=0.0001). However, the differences in VPI and JAMA scores between the two groups were not statistically significant (p=0.693 and p=0.387, respectively). Similar to the results above, although there was a marked difference between the VPI values of the videos with and without audio (80.7 vs 11.6), it did not reach statistical significance, most probably because the first and second most viewed videos both had audio.

Another parameter used in comparing the videos was whether they were real or animated. In this comparison, the differences in JAMA and DISCERN scores were statistically insignificant (p=0.403 and p=0.710, respectively). Conversely, the VPI values of the animated videos were significantly higher than those of the real videos (95.1 vs 40.1, p=0.030).

We also evaluated the correlations between the DISCERN and JAMA scores, between VPI and the DISCERN scores, between VPI and the JAMA scores, between view count and the DISCERN scores, and between view count and the JAMA scores. We found only a moderate negative correlation between VPI values and DISCERN scores (r=−0.29) and no correlation among the other parameters.

DISCUSSION

Reports have shown that the reliability of health-based information delivered by physicians is higher than that of information delivered by others(9-14). However, the present study showed no significant differences between the DISCERN and JAMA scores of the videos uploaded by physicians and those uploaded by medical facilities or others. Erdem and Karaca(6) assessed kyphosis videos on YouTube® and found that the videos uploaded by physicians had the highest VPI values. In contrast, our data showed that the mean VPI value of the videos uploaded by medical facilities was significantly higher than that of the others. We attribute this result to the advertisements that medical facilities generate to make their videos more visible and accessible.

In Erdem and Karaca's(6) study, academic videos uploaded by authors affiliated with a university or research group had significantly lower VPI values than the other groups' videos, although they had the highest quality scores, and the correlation between VPI and quality scores was insignificant. Neither their study nor ours found any correlation between the number of views and quality scores. In our study, we found only a moderate negative correlation between VPI values and DISCERN scores, which is also similar to their results.

In the literature, many reports have shown that internet videos on a wide range of health care topics are unreliable. Berland et al.(15) showed that patients may face challenges in obtaining accurate and correct information from the internet, and that the absence of reliable internet-based medical knowledge might deleteriously influence patients' decision making on treatment options. Previous reports on spinal surgery showed that YouTube® videos on lumbar discectomy(1,16), anterior cervical discectomy and fusion(11), scoliosis(17) and kyphosis(6) were of low quality. Our study showed that videos on endoscopic discectomy on YouTube® are not educational, which is consistent with the results of previous studies(1-4,6,9-18). Most of the videos in the present study were rated as very poor or poor. From these data, we can conclude that such videos present a risk of misinforming patients and negatively affecting communication between physician and patient(6).

In the present study, the DISCERN scores of the surgical technique videos were significantly lower than those of the other videos. Since surgical technique videos provide information about one particular surgical technique, this difference may be related to the lower points assigned to questions 9-15 of the DISCERN instrument, which evaluate the quality of information about other treatment choices.

A former systematic review showed that a large number of health-based videos on YouTube® contain subjective knowledge and patient experiences(19). However, we found that most of the videos about endoscopic lumbar discectomy were uploaded by physicians, and the percentage of patient experience videos was considerably lower than what is reported in the literature. One of the 50 videos in our study consisted of a patient experience. The DISCERN score of this video was lower than the average DISCERN score (22 vs 30.5), as might be expected. However, its view count was far higher than that of any other video in our study, as well as the mean view count of all the videos (2,830,103 vs 230,568). This video's view count was even higher than the total view count of the other 49 videos combined (2,830,340 vs 1,967,384). This is understandable, as there is evidence in the literature suggesting that regular viewers have difficulty understanding videos uploaded by physicians(14). Watching a patient who has had a related experience might relieve patients' concerns in a more relatable way that medical professionals may not have considered(11). For these reasons, viewers might have been more interested in patient experience videos than in surgical technique and general information videos.

In the present study, we analysed only the most viewed videos on this subject on YouTube®, not all of them. Therefore, our findings might not reflect the data of all videos on the subject. Even though this might seem to be the main limitation of this study, the total view count of the videos included in this study is 4,797,724, which covers most of the total views of all the videos on YouTube® concerning endoscopic lumbar discectomy. The 50th most viewed video had only 2,413 views, which means that even if we had added 100 more videos to our study, the total view count would have changed by only about 240,000 views at most. Additionally, our study only included videos in English; endoscopic lumbar discectomy videos published in any other language were not assessed.

CONCLUSION

The reliability of videos concerning endoscopic lumbar discectomy uploaded on YouTube® was low. Our results suggest that patients cannot differentiate between correct and incorrect medical information on YouTube® and often favour personal patient experience videos over more factual, educational and technique-based videos. Videos on YouTube® can therefore be misleading and inaccurate and should not be used as patient education tools for endoscopic lumbar discectomy.

References

1
Gokcen HB, Gumussuyu G. A quality analysis of disc herniation videos on YouTube®. World Neurosurg. 2019;S1878-8750:30246-3.
2
Fischer J, Geurts J, Valderrabano V, Hügle T. Educational quality of YouTube videos on knee arthrocentesis. J Clin Rheumatol. 2013;19:373-6.
3
Ho M, Stothers L, Lazare D, Tsang B, Macnab A. Evaluation of educational content of YouTube videos relating to neurogenic bladder and intermittent catheterization. Can Urol Assoc J. 2015;9:320-54.
4
Mukewar S, Mani P, Wu X, Lopez R, Shen B. YouTube and inflammatory bowel disease. J Crohns Colitis. 2013;7:392-402.
5
Drozd B, Couvillon E, Suarez A. Medical YouTube videos and methods of evaluation: literature review. JMIR Med Educ. 2018;4:e3.
6
Erdem MN, Karaca S. Evaluating the accuracy and quality of the information in kyphosis videos shared on YouTube. Spine. 2018;43:E1334-9.
7
Charnock D, Shepperd S, Needham G, Gann R. DISCERN: an instrument for judging the quality of written consumer health information on treatment choices. J Epidemiol Community Health. 1999;53:105-11.
8
Silberg WM, Lundberg GD, Musacchio RA. Assessing, controlling, and assuring the quality of medical information on the Internet: Caveant lector et viewor--Let the reader and viewer beware. JAMA. 1997;277:1244-5.
9
Ferhatoglu MF, Kartal A, Ekici U, Gurkan A. Evaluation of the reliability, utility, and quality of the information in sleeve gastrectomy videos shared on open access video sharing platform YouTube. Obes Surg. 2019;29:1477-84.
10
Fat MJ, Doja A, Barrowman N, Sell E. YouTube videos as a teaching tool and patient resource for infantile spasms. J Child Neurol. 2011;26:804-9.
11
Ovenden CD, Brooks FM. Anterior cervical discectomy and fusion YouTube videos as a source of patient education. Asian Spine J. 2018;12:987-91.
12
Haymes AT, Harries V. ‘How to stop a nosebleed’: an assessment of the quality of epistaxis treatment advice on YouTube. J Laryngol Otol. 2016;130:749-54.
13
Tartaglione JP, Rosenbaum AJ, Abousayed M, Hushmendy SF, DiPreta JA. Evaluating the quality, accuracy and readability of online resources pertaining to hallux valgus. Foot Ankle Spec. 2016;9:17-23.
14
Desai T, Shariff A, Dhingra V, Minhas D, Eure M, Kats M. Is content really king? An objective analysis of the public's response to medical videos on YouTube. PLoS One. 2013;8:e82469.
15
Berland GK, Elliott MN, Morales LS, Algazy JI, Kravitz RL, Broder MS, et al. Health information on the internet: accessibility, quality, and readability in English and Spanish. JAMA. 2001;285:2612-21.
16
Brooks FM, Lawrence H, Jones A, McCarthy MJ. YouTube as a source of patient information for lumbar discectomy. Ann R Coll Surg Engl. 2014;96:144-6.
17
Staunton PF, Baker JF, Green J, Devitt A. Online curves: a quality analysis of scoliosis videos on YouTube. Spine. 2015;40:1857-61.
18
Kuru T, Erken HY. Evaluation of the quality and reliability of YouTube videos on rotator cuff tears. Cureus. 2020;12:e6852.
19
Madathil KC, Rivera-Rodriguez AJ, Greenstein JS, Gramopadhye AK. Healthcare information on YouTube®: a systematic review. Health Informatics J. 2015;21:173-94.