Electronic data

  • Accepted Draft

    Rights statement: The final publication is available at Springer via https://link.springer.com/article/10.1007/s12559-019-09695-3

    Accepted author manuscript, 29.7 MB, PDF document

    Available under license: CC BY-NC: Creative Commons Attribution-NonCommercial 4.0 International License

Analyzing Connections Between User Attributes, Images, and Text

Research output: Contribution to Journal/Magazine › Journal article › peer-review

Published

Standard

Analyzing Connections Between User Attributes, Images, and Text. / Burdick, Laura; Mihalcea, Rada; Boyd, Ryan; Pennebaker, James W.

In: Cognitive Computation, Vol. 13, 23.03.2021, p. 241-260.

Research output: Contribution to Journal/Magazine › Journal article › peer-review

Harvard

Burdick, L, Mihalcea, R, Boyd, R & Pennebaker, JW 2021, 'Analyzing Connections Between User Attributes, Images, and Text', Cognitive Computation, vol. 13, pp. 241-260. https://doi.org/10.1007/s12559-019-09695-3

APA

Burdick, L., Mihalcea, R., Boyd, R., & Pennebaker, J. W. (2021). Analyzing Connections Between User Attributes, Images, and Text. Cognitive Computation, 13, 241-260. https://doi.org/10.1007/s12559-019-09695-3

Vancouver

Burdick L, Mihalcea R, Boyd R, Pennebaker JW. Analyzing Connections Between User Attributes, Images, and Text. Cognitive Computation. 2021 Mar 23;13:241-260. https://doi.org/10.1007/s12559-019-09695-3

Author

Burdick, Laura ; Mihalcea, Rada ; Boyd, Ryan ; Pennebaker, James W. / Analyzing Connections Between User Attributes, Images, and Text. In: Cognitive Computation. 2021 ; Vol. 13. pp. 241-260.

Bibtex

@article{3a6d9cfd7a194d8e9f4af9db5c56c5b3,
title = "Analyzing Connections Between User Attributes, Images, and Text",
abstract = "This work explores the relationship between a person{\textquoteright}s demographic/psychological traits (e.g., gender, personality) and self-identity images and captions. We use a dataset of images and captions provided by N = 1,350 individuals, and we automatically extract features from both the images and captions. We identify several visual and textual properties that show reliable relationships with individual differences between participants. The automated techniques presented here allow us to draw interesting conclusions from our data that would be difficult to identify manually, and these techniques are extensible to other large datasets. We believe that our work on the relationship between user characteristics and user data has relevance in online settings, where users upload billions of images each day (Meeker M, 2014. Internet trends 2014{\textendash}Code conference. Retrieved May 28, 2014).",
keywords = "personality, gender, natural language processing, computer vision, computational social science",
author = "Laura Burdick and Rada Mihalcea and Ryan Boyd and Pennebaker, {James W.}",
note = "The final publication is available at Springer via https://link.springer.com/article/10.1007/s12559-019-09695-3",
year = "2021",
month = mar,
day = "23",
doi = "10.1007/s12559-019-09695-3",
language = "English",
volume = "13",
pages = "241--260",
journal = "Cognitive Computation",
issn = "1866-9956",
publisher = "Springer",
}

RIS

TY - JOUR

T1 - Analyzing Connections Between User Attributes, Images, and Text

AU - Burdick, Laura

AU - Mihalcea, Rada

AU - Boyd, Ryan

AU - Pennebaker, James W.

N1 - The final publication is available at Springer via https://link.springer.com/article/10.1007/s12559-019-09695-3

PY - 2021/3/23

Y1 - 2021/3/23

N2 - This work explores the relationship between a person’s demographic/psychological traits (e.g., gender, personality) and self-identity images and captions. We use a dataset of images and captions provided by N = 1,350 individuals, and we automatically extract features from both the images and captions. We identify several visual and textual properties that show reliable relationships with individual differences between participants. The automated techniques presented here allow us to draw interesting conclusions from our data that would be difficult to identify manually, and these techniques are extensible to other large datasets. We believe that our work on the relationship between user characteristics and user data has relevance in online settings, where users upload billions of images each day (Meeker M, 2014. Internet trends 2014–Code conference. Retrieved May 28, 2014).

AB - This work explores the relationship between a person’s demographic/psychological traits (e.g., gender, personality) and self-identity images and captions. We use a dataset of images and captions provided by N = 1,350 individuals, and we automatically extract features from both the images and captions. We identify several visual and textual properties that show reliable relationships with individual differences between participants. The automated techniques presented here allow us to draw interesting conclusions from our data that would be difficult to identify manually, and these techniques are extensible to other large datasets. We believe that our work on the relationship between user characteristics and user data has relevance in online settings, where users upload billions of images each day (Meeker M, 2014. Internet trends 2014–Code conference. Retrieved May 28, 2014).

KW - personality

KW - gender

KW - natural language processing

KW - computer vision

KW - computational social science

U2 - 10.1007/s12559-019-09695-3

DO - 10.1007/s12559-019-09695-3

M3 - Journal article

VL - 13

SP - 241

EP - 260

JO - Cognitive Computation

JF - Cognitive Computation

SN - 1866-9956

ER -