Measures and Scales

On this page, you can find scales that my colleagues and I have developed. The scales are copyrighted, but you are free to use them without permission as long as you give credit by citing the respective articles. If you have any questions, please feel free to reach out.

Comprehensively Developed and Validated Scales

    • Youth Social Media Literacy Inventory – YSMLI (objective knowledge test) [Link to paper, Items]

With this work, we developed a comprehensive inventory to assess youth social media literacy. It stands out in the limited pool of media literacy measures as an objective, reliable, valid, and flexible inventory designed specifically to assess social media literacy among youth in a variety of contexts. We validated versions of different lengths (90, 48, and 18 items) to make the inventory applicable to various research areas and projects. We hope the inventory will be used to advance research on the effectiveness of social media literacy education and to evaluate efforts to maximize the benefits and minimize the risks of social media use for youth.

    • Online Privacy Literacy Scale – OPLIS (objective knowledge test) [Website]

OPLIS is a 20-item instrument designed to measure people’s online privacy literacy. We understand online privacy literacy as a combination of factual or declarative (“knowing that”) and procedural (“knowing how”) knowledge about online privacy and data protection (Trepte et al., 2015). Accordingly, the scale is based on a multidimensional concept of privacy literacy that includes the following aspects: (1) knowledge about institutional practices, (2) knowledge about technical aspects of data protection, (3) knowledge about data protection law, and (4) knowledge about data protection strategies. In the final scale, each dimension is measured with five items.
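To illustrate how such an objective knowledge test is structured, the sketch below scores responses by counting correct answers per dimension (four dimensions, five dichotomously scored items each). This is a hypothetical illustration, not the official OPLIS scoring procedure; the item names and answer key are invented.

```python
# Hypothetical scoring sketch for a four-dimension objective knowledge test
# with five dichotomously scored items per dimension (as in OPLIS).
# Item identifiers and the answer key are invented for illustration.

DIMENSIONS = {
    "institutional_practices":   ["ip1", "ip2", "ip3", "ip4", "ip5"],
    "technical_data_protection": ["tech1", "tech2", "tech3", "tech4", "tech5"],
    "data_protection_law":       ["law1", "law2", "law3", "law4", "law5"],
    "data_protection_strategies": ["str1", "str2", "str3", "str4", "str5"],
}

def score_responses(responses, answer_key):
    """Count correct answers per dimension and overall (0-5 per dimension, 0-20 total)."""
    scores = {}
    for dim, items in DIMENSIONS.items():
        scores[dim] = sum(responses.get(item) == answer_key[item] for item in items)
    scores["total"] = sum(scores[dim] for dim in DIMENSIONS)
    return scores
```

Because each dimension contributes the same number of items, subscale scores are directly comparable and the total score simply aggregates across dimensions.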

    • Algorithm Literacy Scale (objective knowledge test) [Link to paper (Items in online supplement)]

We developed and validated an algorithm literacy scale that consists of two interrelated dimensions: (1) awareness of algorithm use and (2) knowledge about algorithms. The final scale comprises 11 items for each dimension. Both subscales correlated positively with participants’ subjective coding skills and predicted participants’ handling of algorithmic curation in three test scenarios.

Ad-hoc Scales (good factorial validity and reliability) 

    • Critical Media Literacy Scale (self-reported) [Items]
    • Online Privacy Concerns Scale [Items]
    • Need for Privacy Questionnaire [Items]
    • Disclosure Management Assessment [Items and information]


I strongly believe in the value of openness and transparency in science. In light of recent meta-scientific discoveries and developments (e.g., the replication crisis, the identification of questionable research practices, meta-analytical evidence for publication bias), I strongly advocate for making all materials (e.g., items, stimuli, coding procedures), data (if possible), and analysis scripts available to the public. In addition to this website, you can find most of the data that I have collected over the years on my OSF page.

I believe sharing data, materials, and scripts is important not only for evaluating a study’s contribution and methodological soundness, but also for allowing other researchers to reproduce the study computationally, replicate it independently, or include it in meta-analyses. In recent years, I have published several datasets that can be used for scientific purposes. If you have any questions about the data or are interested in collaborating on an analysis, please feel free to reach out.

  • Masur, P. K. & Ranzini, G. (2024). A survey study to replicate three key studies in communication privacy research. Open Science Framework:
  • Masur, P. K., Bazarova, N. N. & DiFranzo, D. J. (2023). A survey study on the relationship between social norms and self-disclosure on Facebook and Instagram. Open Science Framework:
  • Masur, P. K., DiFranzo, D. J., & Bazarova, N. N. (2021). An experimental study on behavioral adaption to existing collective norms on social media. Open Science Framework:
  • Masur, P. K., DiFranzo, D. J., & Bazarova, N. N. (2021). An experimental study on collective norm perceptions on social media. Open Science Framework:
  • Trepte, S., Masur, P. K. & Dienlin, T. (2019). A longitudinal survey on privacy concerns, literacy, disclosure, and support (5 waves, representative for the German population). GESIS Datorium:
  • Masur, P. K. & Scharkow, M. (2016). A cross-sectional survey on privacy and self-disclosure on social media. Open Science Framework:
  • Bauer, A., Loy, L. S., Masur, P. K. & Schneider, F. M. (2018). A diary study on instant messaging and mindfulness. Open Science Framework:


You can find most of my software and code on GitHub. I primarily work with R and have developed some procedures to run various statistical models and to facilitate computationally reproducible reporting. Below, you can find some of the software and R packages that I have recently developed:

  • DiFranzo, D., Bazarova, N. N., Yang, Q., Hui, W., Masur, P.K., Ozanne, M., Beichen, M., Sankaran, A., Zhao, P., Bae, I., & Han, E. (2024). The Truman Platform: A complete, open-source social media simulation. [Website]
  • Masur, P. K. & Scharkow, M. (2020). specr: Conducting and Visualizing Specification Curve Analyses (R-package, version 1.0.0) [Website | Reference manual]
  • Masur, P. K. (2022). ggmirt: Plotting functions to extend the package “mirt” for IRT analyses (R-package, version