18.10.2022; Vortragsreihe
Distinguished Research Fellows 2022 – Award Ceremony for Richard McFarland and Jeffrey Parsons
Richard McFarland (ESSEC Business School Paris) and Jeffrey Parsons (Memorial University of Newfoundland)
Place: Festsaal and Online (Zoom link for online participation)
At this award ceremony, Professor Richard McFarland (ESSEC Business School Paris) and Professor Jeffrey Parsons (Memorial University of Newfoundland) are honored as two of the four Distinguished Research Fellows 2022. They also give insights into their current research projects:
Richard McFarland: “It isn’t Personal,” or is it? The Use of Social Exchange Tactics in B2B Selling
Abstract:
Relationships between salespeople and professional buyers are both economic, with a focus on achieving instrumental firm goals, and interpersonal, with a focus on achieving individual social goals. Utilizing this dichotomy, the authors extend the sales influence tactics (SITs) literature by classifying SITs as either instrumental or social exchange tactics. They identify personal appeals and ingratiation as social exchange tactics, given that these motivate compliance based on buyers’ desires to achieve social goals. The authors then build theory centered on social exchange tactics to resolve several conflicts in the literature: disagreement over whether personal appeals and ingratiation are coercive, and mixed empirical findings regarding their effects on performance and trust outcomes. The authors argue that personal appeals are coercive and ingratiation is non-coercive. They next identify contact frequency as a missing boundary condition that resolves past mixed empirical findings: personal appeals were predicted to have negative effects on manifest influence and trust at low contact frequency levels and positive effects at high contact frequency levels, with the reverse pattern holding for ingratiation. A study using multiple sources of data was conducted, including data from 89 salespeople and 303 buyers within matched B2B dyads.
Jeffrey Parsons: Learned Inattention in Crowdsourcing: How Training Affects Data Diversity
Abstract:
Crowdsourcing is a popular way to collect data outside traditional organizational boundaries. To ensure that crowdsourced data are of sufficient quality for the intended use, organizations prefer to engage contributors who have the skill to perform specified tasks, such as identifying or classifying observed entities of interest. A popular strategy to induce task skill in prospective contributors is to train them. As people learn the characteristics of an entity needed to identify it, selective attention theory suggests they focus on only those characteristics and ignore other observable characteristics, which can reduce data diversity. Diverse data represent observed phenomena more fully, thereby facilitating the use of data for purposes beyond the original use. We examine how training affects the diversity of crowdsourced data in an experimental setting. Contributors – divided into explicitly trained, implicitly trained, and untrained groups – observed entities and reported their observations in a simulated crowdsourcing task. We find that trained contributors reported less diverse data than untrained contributors, and explicit (rule-based) training resulted in less diverse data than implicit (exemplar-based) training. In a follow-up study, we validate the diversity findings of the first experiment and show the potential usefulness of data from untrained contributors. We conclude by discussing implications for designing crowdsourcing platforms to promote data diversity.