Evaluation of transfer
In addition to the concrete implementation of knowledge and technology transfer, CIMTT attaches great importance to evaluating its project and transfer work. Evaluation creates transparency about the effects of transfer, the conditions under which it succeeds and where adjustments are needed. A structured approach is crucial precisely because transfer effects are often hard to verify: baseline data is sometimes missing, effects occur with a time delay, and external influences mask the results. At the same time, evaluation pays off because it makes transfer successes more visible and provides concrete starting points for adjusting activities in a targeted way and for continuously adapting transfer concepts to changing framework conditions.
Our approach combines results monitoring with process monitoring. At the results level, quantitative indicators are used where available (e.g. reach, participation, outputs, progress towards project goals). Where such metrics are of limited informative value or only available with a delay, they are supplemented by qualitative assessments in order to better understand the dynamics in the transfer field. This includes continuous feedback loops (e.g. brief feedback after events, reflection rounds with partners, structured stakeholder assessments) as well as in-depth qualitative surveys that make motives, barriers and success factors visible. A further component is representative surveys, e.g. on trends and the suitability of particular transfer formats, which go beyond individual events or projects and systematically capture the assessments of different target groups. This reveals not only individual voices but also patterns and tensions in the cooperation process.
Evaluation options based on examples
Questionnaire-based evaluation of events using the example of PAL
One example of the continuous evaluation and adaptation of transfer activities is the questionnaire-based evaluation of events in the PerspektiveArbeit Lausitz (PAL) project. Participant and visitor feedback is part of a project-wide indicator system and provides important information on the quality and relevance of the formats. An online questionnaire, developed jointly with ATB Arbeit, Technik und Bildung gGmbH and Fachkräftenetzwerk OL gGmbH, is completed anonymously after each event. Among other things, it records satisfaction, the relevance of topics, willingness to take part in future offerings, desired topics and the information channels used. Over the course of the project, the evaluation supports the targeted, needs-oriented further development of formats and topics.
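To illustrate how anonymous event feedback of this kind can be condensed into the indicators of a project-wide monitoring system, the following sketch aggregates a few hypothetical responses. The field names, the 1-5 rating scale and the example topics are assumptions for illustration, not the actual PAL questionnaire.

```python
from collections import Counter
from statistics import mean

# Hypothetical anonymized responses; field names, the 1-5 scale and the
# topics are illustrative assumptions, not the real PAL questionnaire.
responses = [
    {"satisfaction": 5, "relevance": 4, "would_attend_again": True,
     "desired_topics": ["digitalisation", "skilled labour"]},
    {"satisfaction": 4, "relevance": 5, "would_attend_again": True,
     "desired_topics": ["skilled labour"]},
    {"satisfaction": 3, "relevance": 3, "would_attend_again": False,
     "desired_topics": ["work design"]},
]

def event_indicators(responses):
    """Condense per-event feedback into a few key indicators."""
    n = len(responses)
    return {
        "n": n,
        "mean_satisfaction": round(mean(r["satisfaction"] for r in responses), 2),
        "mean_relevance": round(mean(r["relevance"] for r in responses), 2),
        # Share of participants willing to attend future offerings
        "return_rate": sum(r["would_attend_again"] for r in responses) / n,
        # Most frequently requested topics for future formats
        "top_topics": Counter(t for r in responses
                              for t in r["desired_topics"]).most_common(2),
    }

print(event_indicators(responses))
```

Per-event summaries of this shape can then be compared across events to track how satisfaction and topic demand develop over the course of the project.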
Design and implementation of surveys using the example of TR@NSFER-ING
One example of this overarching evaluation perspective is the TR@NSFER-ING study, a quantitative online survey in the School of Engineering Sciences. It systematically records how knowledge and technology transfer is currently practised, which forms of transfer activities and collaborations exist, and which attitudes, motivations and framework conditions shape transfer. To obtain a realistic overall picture, the survey covers both people with extensive transfer experience and those who have pursued few transfer activities so far. Its structure and topics are based on the nationwide Transfer1000 study by Fraunhofer IAO, which facilitates comparability and contextualisation of the results. The results not only help to identify hurdles and development potential, but also provide an empirically sound basis for the needs-based further development of support offerings, incentive structures and transfer formats, and for strengthening the visibility and effectiveness of transfer activities in engineering.
Qualitative evaluation of a format based on interviews, observations and reflection discussions using the example of the Lusatia 2050 Future Workshop
One example of process-oriented evaluation is the accompanying research for the Lusatia 2050 Future Workshop. The aim was to understand the dynamics of cooperation between scientific and non-scientific stakeholders and to make both supportive and obstructive framework conditions visible. To this end, qualitative methods were combined so as to capture both snapshots and developments over the course of the format. Data collection comprised focus group discussions, moderated reflection rounds and participant observation in order to record interactions, communication patterns and cooperation practices in the transfer process. In addition, guided interviews were conducted before, during and after the workshops to trace expectations, experiences and perceived effects, as well as how these developed over time. The evaluation provides concrete starting points for developing formats further in terms of interaction, communication and knowledge integration. The procedure follows established methodological standards for focus groups, observations and interviews, and all participants were informed transparently about the purpose, procedure and data protection arrangements of the evaluation.