ASSESSMENT OF THE MEDICAL ACCURACY AND QUALITY OF KYPHOSIS VIDEOS SHARED ON SOCIAL MEDIA PLATFORMS
ORIGINAL ARTICLE
VOLUME: 35 ISSUE: 3
P: 99 - 105
July 2024


J Turk Spinal Surg 2024;35(3):99-105
1. Amasya Sabuncuoğlu Şerefeddin Training and Research Hospital Clinic of Orthopedics and Traumatology, Amasya, Turkey
2. Acıbadem Bakırköy Hospital Clinic of Orthopedics and Traumatology, İstanbul, Turkey
3. Ondokuz Mayıs University Faculty of Medicine Department of Orthopedics and Traumatology, Samsun, Turkey
4. Süleyman Demirel University Faculty of Medicine Department of Orthopedics and Traumatology, Isparta, Turkey
5. University of Health Sciences Turkey Kanuni Sultan Süleyman Training and Research Hospital, Clinic of Orthopedics and Traumatology, İstanbul, Turkey
6. EMOT Hospital Clinic of Physical Therapy and Rehabilitation, İzmir, Turkey
Received Date: 10.07.2024
Accepted Date: 25.07.2024

ABSTRACT

Objective

This study aims to evaluate the accuracy and quality of kyphosis-related videos on social media by analyzing posts about the disease with established scoring systems.

Materials and Methods

We searched for the term “kyphosis” in the search engines of the relevant social media platforms. The Global Quality Score (GQS), the Journal of American Medical Association (JAMA) score, the Kyphosis-Specific Score, DISCERN, and the Video Power Index (VPI) were used to analyze the quality and accuracy of the medical posts.

Results

YouTube was the most common platform for video posts and had the highest JAMA, GQS, and DISCERN scores (1.87, 2.18, and 41.2, respectively). YouTube videos showed significant correlations among JAMA, GQS, and DISCERN (p<0.01). Facebook videos showed a moderate correlation between the JAMA criteria and both GQS (r=0.724, p<0.001) and DISCERN (r=0.568, p<0.01), and a high correlation was observed between GQS and DISCERN (r=0.713, p<0.01). The platform with the lowest scores was Instagram, with JAMA 1.4 (±0.93), DISCERN 27.4 (±15.7), GQS 2.52 (±1.15), and VPI 264.2 (±180.9).

Conclusion

Videos on YouTube and Facebook were found to have better medical quality. Strategies for integrating social media into future patient education are clearly needed to keep pace with the contemporary era of information exchange.

Keywords:
Kyphosis, social media, accuracy of medical posts

INTRODUCTION

The use of social media in daily life is increasing(1). The rapid proliferation and transfer of information on social media offers new opportunities for patients and their relatives to learn about their medical conditions before visiting a specialist and to connect with others who have had the same experience(2-5).

With the spread of the internet, people have far greater access to medical information than in the past(4). Orthopedic surgery, which serves patients ranging from the neonatal to the geriatric period, has been affected by these developments. Prior research has documented the frequency of internet and social media use among orthopedic patients(6-8). Among the visual materials used to create content on social media, videos are particularly engaging in terms of reaching communities with relevant information and supporting interactivity. However, despite the richness of these sources, the timeliness and accuracy of the information they contain can be questionable. Content creators who post on social media may provide misleading information in their videos, and their posts rarely undergo any editorial process, raising an important problem of credibility.

Kyphosis is a deformity of the thoracic spine that can be caused by various factors such as trauma, degeneration, inflammatory conditions, or infections. While there are studies on the quality of social media videos addressing various medical problems, there are very few reports on the quality of videos related to the spine, especially kyphosis, which affects different age groups(9-12). Moreover, the only previous study on kyphosis evaluated videos on the YouTube platform alone(9).

In our study, we aimed to evaluate the medical accuracy and quality of kyphosis-related videos uploaded to YouTube, Facebook, and Instagram, platforms with large user bases and high volumes of sharing on the internet.

MATERIALS AND METHODS

In order to obtain data independent of search algorithms, a previously unused e-mail account was created and new accounts were opened on Facebook, YouTube, and Instagram. On October 7, 2023, the term “kyphosis” was queried in the search engine of each platform. Videos in languages other than English and republished videos were excluded. The first 50 videos among the search results were included in the study, and characteristics such as video duration, number and rate of views (number of views/day), number and rate of likes [likes/(likes + dislikes) × 100], and the person or organization that uploaded the video were recorded. The present study also investigated the metric of daily views (total views divided by total days online), a variable that has not been operationalized or utilized in prior research. This parameter allowed popularity and usability among internet users to be assessed independently of the platform.
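As a minimal illustration of how the daily-views metric described above can be computed from recorded metadata, the Python sketch below derives views per online day from a video's total view count and upload date; the function and field names are assumptions for illustration, not part of the study's data-collection protocol.

```python
from datetime import date

def daily_views(total_views: int, upload_date: date,
                query_date: date = date(2023, 10, 7)) -> float:
    """Daily views: total views divided by the number of days the video has been online."""
    days_online = max((query_date - upload_date).days, 1)  # guard against division by zero
    return total_views / days_online

# Example: a video uploaded one year before the search date
print(round(daily_views(712_352, date(2022, 10, 7)), 1))  # ~1951.6 views per day
```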

The Journal of American Medical Association (JAMA) score(8), Global Quality Score (GQS)(13), Quality Criteria for Consumer Health Information (DISCERN), and Video Power Index (VPI) scales were used(14, 15). Few publications have employed all four scales concurrently(16-19), and no previous study has compared kyphosis-related videos across the three platforms analyzed here.

In addition, although the original VPI formula was [number of likes/(number of likes + number of dislikes)] × 100, the formula was changed to (number of likes/number of views) × 100 because dislike counts have been hidden since a policy change on YouTube. For the sake of objectivity, the same formula was applied on all social media platforms(12).
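A minimal sketch of the two VPI variants is given below; it simply mirrors the formulas as stated in the text, and the function names and example counts are illustrative assumptions.

```python
def vpi_original(likes: int, dislikes: int) -> float:
    """Original VPI: likes / (likes + dislikes) x 100 (requires visible dislike counts)."""
    total = likes + dislikes
    return 100 * likes / total if total else 0.0

def vpi_modified(likes: int, views: int) -> float:
    """Modified VPI used in this study: likes / views x 100."""
    return 100 * likes / views if views else 0.0

# With dislikes hidden on YouTube, only the modified formula can be computed
print(round(vpi_modified(5_000, 712_352), 2))  # 0.7 likes per 100 views
```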

Previous research has often excluded factors such as view counts and the duration of related video content because of their low occurrence, or because the video sources and content groups were highly fragmented. Such differences could distort the data obtained in a comprehensive comparison of video algorithms. To maintain homogeneity and statistical robustness in our results, we therefore applied exclusion criteria such as very short video duration and low like and view counts, and we simplified the categorization of source and content groups.

Videos were divided into five categories based on their content and source. Source-based categories were 1) academic (the uploader was affiliated with an institute/research group), 2) physician (the individual or group responsible for uploading the content lacked affiliation with any academic institution or research organization), 3) non-physician (physiotherapists, massage therapists, non-health professional trainers and alternative medicine providers), 4) patient, and 5) commercial. Content-based categories were 1) information about the disease, 2) exercise education, 3) treatment of the disease, 4) patient experiences, and 5) advertising. This study did not require ethics committee approval because it was conducted as an internet-based research and did not involve the collection of personal or sensitive data from participants.

Statistical Analysis

The data files were processed and analyzed using SPSS v.25 (IBM Corp., Armonk, NY). Correlations among the data were investigated separately for each social media platform (YouTube, Facebook, and Instagram). The study aimed to compare values, specifically popularity and medical knowledge, across different publication sources and social media platforms. The Mann-Whitney U test was used for non-normally distributed data. Spearman's correlation test was used to examine the associations between the parameters. The level of significance was set at a p-value of less than 0.05.
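For readers without SPSS, an equivalent open-source sketch of the two tests named above is shown using SciPy; the arrays are placeholder data for illustration only, not the study's measurements.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Placeholder vectors standing in for two rating scales (e.g., JAMA and GQS) of the same 50 videos
jama = rng.integers(0, 5, size=50).astype(float)
gqs = jama + rng.normal(0.0, 1.0, size=50)  # loosely related, purely for illustration

# Spearman's rank correlation between the two scales
rho, p_corr = stats.spearmanr(jama, gqs)
print(f"Spearman rho={rho:.3f}, p={p_corr:.3f}")

# Mann-Whitney U test comparing a score between two independent groups (non-normal data)
group_a, group_b = gqs[:25], gqs[25:]
u_stat, p_mwu = stats.mannwhitneyu(group_a, group_b, alternative="two-sided")
print(f"Mann-Whitney U={u_stat:.1f}, p={p_mwu:.3f}")  # significance threshold: p < 0.05
```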

RESULTS

The first 150 (3 × 50) videos that met the criteria on the three social media platforms were included. All videos on these platforms were analyzed separately by source and content type (Tables 1-3). Cross-correlations among JAMA, GQS, VPI, and DISCERN were then performed (Table 4).

A) YouTube

When the 50 YouTube videos were analyzed, the mean duration was 843 seconds (±1203), the number of views per day was 1995 (±631), the number of views was 712,352 (±516,024), and the scores were JAMA 1.87 (±0.98), GQS 2.1 (±0.66), VPI 154.2 (±159.2), and DISCERN 41.2 (±20.7). The distribution by source was academic 6%, physician 31%, trainer 48%, patient 8%, and commercial 7%. In terms of content, the percentages were information 33%, exercise training 37%, treatment 23%, patient experience 5%, and advertising 2%.

YouTube videos showed a high correlation between the JAMA criteria and both GQS (r=0.812, p<0.001) and DISCERN (r=0.605, p<0.001). A high correlation was also observed between GQS and DISCERN on this platform (r=0.753, p<0.001). The number of daily views showed a high correlation with JAMA (r=0.691, p<0.001) and a moderate correlation with the VPI score (r=0.372, p<0.001).

When compared by source, a significant difference was found in JAMA (χ²=5.84, p=0.046), GQS (χ²=6.52, p=0.049), and duration (χ²=9.57, p=0.023). The academic and trainer groups had higher JAMA scores than the others (w=-3.725, p=0.043; w=-4.04, p=0.029). The academic group had the highest GQS and DISCERN values (w=3.212, p=0.044) and was rated considerably higher than the other groups (w=-3.5134, p=0.052). This group also shared longer videos than the others (w=3.69, p=0.013).

When content was considered, JAMA (χ²=13.47, p=0.012), GQS (χ²=8.15, p=0.016), and DISCERN (χ²=12.28, p=0.039) scores differed significantly. Informational videos scored higher on all scales than the other content groups [JAMA (w=-4.154, p=0.017), DISCERN (w=-3.856, p=0.029), and GQS (w=-3.988, p=0.025)]. In contrast, patient experience videos had significantly lower JAMA (w=-4.771, p=0.004), DISCERN (w=-4.126, p=0.017), and GQS (w=-4.656, p=0.011) scores.

B) Facebook

When the 50 Facebook videos were analyzed, the average duration was 325 seconds (±391), daily views were 751 (±1350), the number of views was 152,213 (±616,243), and the scores were JAMA 1.53 (±0.71), GQS 1.91 (±0.65), VPI 127.7 (±207.2), and DISCERN 31.9 (±13.9). The distribution by source was academic 4%, physician 34%, trainer 44%, patient 12%, and commercial 6%. In terms of content, information was 29%, exercise training 33%, treatment 14%, patient experience 23%, and advertising 1%.

Facebook videos showed a moderate correlation between the JAMA criteria and both GQS (r=0.724, p<0.001) and DISCERN (r=0.568, p<0.001). A strong positive association was identified between GQS and DISCERN (r=0.713, p<0.001). In addition, a high correlation was observed between daily views and VPI (r=0.693, p<0.001).

When analyzed by video source, a significant difference was observed in JAMA (χ²=7.90, p=0.042) and GQS (χ²=6.67, p=0.044). Posts in the trainer group had higher JAMA and GQS scores than the others (w=-3.886, p=0.046; w=-3.99, p=0.039). Likewise, this group's posts received significantly more likes than the others (w=-3.7659, p=0.039).

When content was considered, the informational video group differed significantly in JAMA (w=-4.154, p=0.017) and GQS (χ²=9.24, p=0.026) and was the highest-scoring group. Treatment videos had higher daily views and VPI values than the others (w=2.8818, p<0.05). Patient experience videos had lower DISCERN values (w=-2.813, p=0.019). Surgery videos received more likes than the others (w=3.522, p=0.05).

C) Instagram

When the 50 Instagram videos were analyzed, the mean duration was 41.2 seconds (±24.9), daily views were 6371 (±12,388), and the number of views was 68,123 (±78,045). The scores were JAMA 1.4 (±0.93), DISCERN 27.4 (±15.7), GQS 2.52 (±1.15), and VPI 264.2 (±180.9). The distribution by source was academic 1%, physician 27%, trainer 35%, patient 31%, and commercial 3%. In terms of content, information was 26%, exercise training 39%, treatment 6%, patient experience 28%, and advertising 1%.

A moderate correlation was seen between DISCERN and GQS in videos on this platform (r=0.652, p=0.043), and a weaker correlation between the JAMA criteria and GQS (r=0.176, p=0.050). There was no association between the number of daily views and any of the video quality assessment scores.

Depending on the source, there was a significant difference in GQS (χ²=10.26, p=0.038) and DISCERN (χ²=8.47, p=0.045). Trainer group posts received more daily views than the others (w=4.452, p=0.027). The academic and physician groups had higher GQS values than the others (w=3.235, p=0.05). Posts made by the physician group garnered more likes than those made by the other groups (w=5.354, p=0.044).

When content groups were evaluated, there was a significant difference in DISCERN (χ²=9.653, p=0.037) and GQS (χ²=10.102, p=0.033). The information group received more daily views and had higher GQS values than the others (w=-2.145, p<0.001; w=-3.897, p=0.017). Exercise videos received more likes than the others (w=-4.332, p<0.001) and had higher DISCERN scores (w=6.835, p=0.021).

Overall, the study found moderate to strong significant correlations among the JAMA, GQS, and DISCERN scores, whereas there was no significant correlation between the VPI and the other scales. These findings are summarized in the cross-correlation table of the scoring systems (Table 4).

DISCUSSION

A simple Google search for the term “kyphosis” yields about 212,000 video links. Such a large volume of content raises questions about its accuracy and quality. For this reason, many studies have examined the accuracy of information, the diversity of content, and the reliability of sources on social media platforms(6-8).

However, it can be considered that the content uploader or the content itself is as important as the search algorithms that form the ranking order of how the posts are displayed to the user. Social media algorithms are constantly being developed and updated to enhance the user experience and encourage interaction between content producers and users. Each platform has its own specific algorithm structure and priorities, so the ranking order of content may differ between platforms.

Our study examined the content quality and diversity of videos about kyphosis on different platforms. We also compared the available algorithms and identified different patterns in how evaluation scores such as JAMA, GQS, VPI and DISCERN differ between these platforms. Ensuring that high-quality, accurate medical content is available and easily accessible on these platforms can enhance patient education, improve disease management, and support better clinical outcomes. Therefore, healthcare professionals and organizations should consider focusing their efforts on platforms with higher engagement and better information quality to disseminate reliable health information effectively.

It has been noted that videos on kyphosis on the YouTube platform tend to have the longest duration and attract a significant audience. In a study, the average JAMA score of YouTube videos was determined as 1.36 and the GQS score was 1.68(9). Similar scores were calculated in our study (JAMA 1.87, GQS 2.1). The high correlation of YouTube videos with JAMA, GQS and DISCERN criteria suggests that this platform has an important role in sharing health information in terms of both reliability and content quality.

When we looked at the content providers, it was observed that the trainer group uploaded the most content with 48%, and at the same time, 37% of the content consisted of exercise videos. This finding reflects that the trainer group plays an important role by providing practical guidance on kyphosis and that the community is interested in exercise-based approaches. Similarly, Erdem and Karaca(9) reported that the highest content uploading group was trainers with 36% and exercise videos with 46%, and that exercise videos attracted more attention in the community.

However, exercise videos scored lower than informational videos on all scales except VPI (JAMA 1.34, DISCERN 26.3, GQS 1.52, VPI 97.1). Therefore, the need to improve quality standards should not be ignored.

A noteworthy point regarding content providers is that the academic group, which constituted 6% of uploaders, had the highest scores on all scales (JAMA: 3.5±0.8, DISCERN: 71.2±39.4, GQS: 5.2±0.5) except VPI (14.21±5.35). This may indicate that the academic group prioritizes quality over quantity to ensure information accuracy and reliability. Moreover, YouTube's algorithm may give greater priority to science-based content, information from official health organizations, and expert opinions when ranking videos, by evaluating factors such as users' viewing history, interactions, viewing time, and keywords.

Facebook videos show moderate correlations between JAMA, GQS and DISCERN scores when compared to other platforms. In addition, there are high correlations between the number of daily views and VPI and DISCERN scores.

Ng et al.(20) found an average scoliosis-specific content score of 5.7 (range 0-20) and a DISCERN score of 22.5 (range 16-45) in their study of scoliosis content quality and reported that the quality of the information provided was generally poor. Although a higher DISCERN score (31.9±13.9) was found in our study, our findings point in the same direction. When evaluated by source, the trainer and physician groups constituted 44% and 34%, respectively; Truumees et al.(21) found 42% and 28%, respectively, which is consistent with our study. However, the JAMA and DISCERN scores of the trainer group videos were higher than those of the other groups.

The content groups with the highest rates were exercise and informational videos, at 33% and 29%, respectively. Similarly, Erdem and Karaca(9) reported in 2018 that training videos represented a significant proportion at 46%, followed by informational videos at 24%. Informational videos had higher DISCERN and GQS scores than the other groups.

The proportion of patient experience videos was 23%, which, together with Instagram (28%), was the highest among the platforms. Patient experience videos had the second-highest evaluation scores among all content groups. This may suggest that Facebook is a platform where more personal content is shared.

These results suggest that videos shared on the Facebook platform differ from other platforms in terms of all scores, and that certain content and resource groups are differentially dominant on the platform. Therefore, it can be assumed that there are doubts about the reliability of content on this platform.

Instagram videos had the highest number of daily views: Instagram Reels received 8.4 times more daily views than Facebook videos and 2.2 times more than YouTube videos. Trainers were the most frequent content uploaders on this platform (35%), and their content was usually short and most often exercise-oriented. Physician-generated videos had higher JAMA, GQS, and DISCERN scores than the other groups. This may suggest that Instagram is a platform oriented toward visual content on which healthcare professionals can produce content effectively.

It is also noteworthy that the proportion of videos sharing patient experiences (28%) was higher than on the other platforms. This is related both to the use of Instagram as a platform for sharing the experiences and opinions of individual users and to the platform's algorithm, which operates by analyzing users' interactions with others, their preferences, and the relevance of content. On the other hand, the fact that the algorithm does not surface advertising posts may affect the results; because accounts are treated differently on Instagram, advertising content rarely appears among Instagram Reels search results. Accordingly, the proportion of commercial content among Instagram videos in our study (3%) was lower than on the other platforms. Nevertheless, given the high number of trainer accounts on Instagram, such accounts can be considered a form of hidden advertising.

The 90-second time limit for Instagram videos sets the platform apart from other social media platforms and shapes its visual character. Although this limitation can be considered a disadvantage, a different picture emerges when VPI scores are analyzed: VPI scores were 1.8 times higher for Instagram videos than for YouTube videos and 2 times higher than for Facebook videos. In particular, although YouTube videos have high JAMA, GQS, and DISCERN values, their lower average VPI compared with Instagram indicates that VPI is inconsistent with the content-based scores and does not fully reflect the medical quality of the videos.

An important limitation of our study is that the videos were rated by a single reviewer using the scales. Examining videos from three different social media platforms presents both benefits and challenges: while the diversity of content complicates comparisons, this variability also underscores the uniqueness of the study. Despite difficulties in standardizing the groups, efforts to ensure relative scientific similarity among the compared groups add validity to the findings.

CONCLUSION

The kyphosis-related videos analyzed on the different social media platforms differed in content and quality, but overall their medical quality could not be considered good, and patients often lack access to accurate information.

However, the content shared on the different platforms varied with the audience, preferences, and formats. Therefore, given the increasing need for optimal medical videos on kyphosis on social media, it is important for content producers, especially healthcare professionals, to take into account the unique features of each platform and the tendencies of its users in order to reach their target audience effectively.

References

1
Sechrest RC. The internet and the physician-patient relationship. Clin Orthop Relat Res. 2010;468:2566-71.
2
Goto Y, Nagase T. Oncology information on the Internet. Jpn J Clin Oncol. 2012;42:368-74.
3
Baker JF, Devitt BM, Kiely PD, Green J, Mulhall KJ, Synnott KA, Poynton AR. Prevalence of Internet use amongst an elective spinal surgery outpatient population. Eur Spine J. 2010;19:1776-9.
4
Desai T, Shariff A, Dhingra V, Minhas D, Eure M, Kats M. Is content really king? An objective analysis of the public’s response to medical videos on YouTube. PLoS One. 2013;8:e82469.
5
Brooks F, Lawrence H, Jones A, McCarthy M. YouTube™ as a source of patient information for lumbar discectomy. Ann R Coll Surg Engl. 2014;96:144-6.
6
Madathil KC, Rivera-Rodriguez AJ, Greenstein JS, Gramopadhye AK. Healthcare information on YouTube: a systematic review. Health Informatics J. 2015;21:173-94.
7
Morahan-Martin JM. How internet users find, evaluate, and use online health information: a cross-cultural review. Cyberpsychol Behav. 2004;7:497-510.
8
Silberg WM, Lundberg GD, Musacchio RA. Assessing, controlling, and assuring the quality of medical information on the Internet: Caveant lector et viewor-let the reader and viewer beware. JAMA. 1997;277:1244-5.
9
Erdem MN, Karaca S. Evaluating the accuracy and quality of the information in kyphosis videos shared on YouTube. Spine (Phila Pa 1976). 2018;43:E1334-E9.
10
Rudisill SS, Saleh NZ, Hornung AL, Zbeidi S, Ali RM, Siyaji ZK, et al. YouTube as a source of information on pediatric scoliosis: a reliability and educational quality analysis. Spine Deformity. 2023;11:3-9.
11
Yaradılmış YU, Evren AT, Okkaoğlu MC, Öztürk Ö, Haberal B, Özdemir M. Evaluation of quality and reliability of YouTube videos on spondylolisthesis. Interdiscip Neurosurg. 2020;22:100827.
12
Richardson MA, Park W, Bernstein DN, Mesfin A. Analysis of the quality, reliability, and educational content of YouTube videos concerning spine tumors. Int J Spine Surg. 2022;16:278-82.
13
Li M, Yan S, Yang D, Li B, Cui W. YouTube™ as a source of information on food poisoning. BMC Public Health. 2019;19:1-6.
14
Yurdaisik I. Analysis of the most viewed first 50 videos on YouTube about breast cancer. Biomed Res Int. 2020;2020:2750148.
15
Charnock D. The DISCERN Handbook. Quality criteria for consumer health information on treatment choices Radcliffe: University of Oxford and The British Library. 1998:7-51.
16
Gurler D, Buyukceran I. Assessment of the medical reliability of videos on social media: detailed analysis of the quality and usability of four social media platforms (Facebook, Instagram, Twitter, and YouTube). Healthcare (Basel). 2022;10:1836.
17
Cetinavci D, Yasar V, Yucel A, Elbe H. Evaluation of the usage of YouTube videos about Histology and Embryology as an educational material. Anat Histol Embryol. 2022;51:810-7.
18
Yildiz MB, Yildiz E, Balci S, Özçelik Köse A. Evaluation of the quality, reliability, and educational content of YouTube videos as an information source for soft contact lenses. Eye Contact Lens. 2021;47:617-21.
19
Kartal A, Kebudi A. Evaluation of the reliability, utility, and quality of information used in total extraperitoneal procedure for inguinal hernia repair videos shared on WebSurg. Cureus. 2019;11:e5566.
20
Ng JP, Tarazi N, Byrne DP, Baker JF, McCabe JP. Scoliosis and the social media: Facebook as a means of information exchange. Spine Deform. 2017;5:102-8.
21
Truumees D, Duncan A, Mayer EK, Geck M, Singh D, Truumees E. Cross sectional analysis of scoliosis-specific information on the internet: potential for patient confusion and misinformation. Spine Deform. 2020;8:1159-67.