Archive for February, 2013

February 26th, 2013

Conference report #2: „from analog to digital“
February 2013, Munich

A snowy weekend in early February in Munich: journalism researchers, media practitioners and others came together for two days in the auditorium maximum of the Institute for Communication Science and Media Research to hear a number of presentations on a currently hotly debated topic: „Journalism & Technology“.
On February 8 and 9, 2013, the division „Journalism/Journalism research“ of the DGPuK (German Association of Media and Communication Studies) invited the audience to discuss this topic under the motto „From analog to digital“. Thanks to the hard-working tweeps, Nele was able to put together a conference Storify (in German) – including meta communication ^^ Further information about the conference, some visual impressions and long abstracts of all presentations can be found on the conference website.

In the first panel, research on various aspects of changing newsroom(s) (practices) was presented:

  • Peter Schumacher started his talk on the new role of online news site chief editors with the statement that „Blattmachen online“ is a very diverse and somewhat fuzzy task which comprises the selection, the „mixing“, distribution and presentation of news items throughout the day. He presented the results of eleven semi-standardized interviews with chief editors of online news sites as well as a content analysis of the lead stories (top five slots) of six German news websites. He found that the selection routines of chief editors in online newsrooms are not only structured by a specific workplace setting – several monitors help the person in charge to simultaneously observe the news stories of news agencies, other news websites, user statistics etc. The „Drehgeschwindigkeit“ (rotation speed) of the top stories is also heavily influenced by temporal structures, especially by the news production rhythms of the offline counterpart; in the afternoon, for example, more „shovelware“ from the print newsroom is integrated into the website. The chief editors also described a certain „topicality pressure“, i.e. they change the top stories very often during the day – Schumacher identified patterns of „afternoon nervousness“ on taz.de, for example. Regarding the mixture of lead stories, Schumacher found that every online newsroom has its own (quite flexible) rules which lead to a specific thematic website profile. To create that mixture, most chief editors rely on their long-time experience and assumptions about relevance rather than on live statistics and numbers. Instead, clicks and usage data serve as strategic components used to place news stories at the right moment and to gain as much attention as possible.
  • Afterwards, Sonja Kretzschmar presented a standardized survey among 90 editors responsible for the crossmedia activities of local newspaper editions. The findings indicate that while social media are nowadays an important tool for journalistic inquiry, distribution and interaction, other crossmedia activities are not carried out extensively (e.g. very little time is spent on mobile editions). Moreover, the systematic coordination and integration of crossmedia activities is very limited due to rudimentary organizational routines – according to Kretzschmar, the analyzed German local newspapers did not adjust their work routines to meet the challenges of crossmedia; principles of change management are mostly not taken into account. Instead, the implementation of innovation appears to be a „top-down“ process initiated by publishers. Hence, a lack of transparency and internal communication leads to a certain resistance among the newsroom staff (also depending on age and skills). Kretzschmar concluded that there is room for optimization, and a need for further training of journalists within local newspaper departments.

The second panel was dedicated to the role of technology as a supportive structure for participatory practices:

  • First, Thomas Roessing presented his research on Wikipedia as a gateway for breaking news. According to him, the role of Wikipedia (not Wikinews!) at the very moment of breaking news events is highly contested among the members of the Wikipedia community due to the website’s self-conception as an encyclopedia. He illustrated the function of Wikipedia as a „second-level gatekeeper“ with case studies (e.g. the 2004 tsunami, the 2005 London bombings, the mass panic at the Loveparade in 2010), combining a quantitative analysis of the article version histories with a qualitative analysis of the community discussions.
  • Christian Nuernbergk presented his dissertation project, a complex network analysis based on the links between over 300 weblogs referring to the coverage of the G8 summit in Heiligendamm (2007) by the leading German online news site Spiegel Online and by Indymedia. His project focused on participatory performances of information mediation and distribution and their resonance in networks of media-related follow-up communication. He came to the conclusion that, at that time, the bloggers and the blogosphere in general to some extent failed as information facilitators due to deficiencies in their network structure.
  • Finally, Timo Spieß (together with Annika Sehl) presented his bachelor thesis (!) in which he analyzed the potentials and risks of Social TV and second screens for TV journalism. His research object was the „Rundshow“, the first Social TV experiment in German television, initiated by the public service broadcaster Bayerischer Rundfunk. The daily late-evening show aired in summer 2012 and integrated several features such as Google hangouts, interactive tools (e.g. the smartphone app and voting tool „Die Macht“; open editorial conferences) and various social media channels. Spieß quantitatively analyzed the usage data of these participatory features (>16,000 text fragments and >32,000 voting results – the data came from the BR). His findings indicate that: a) most social interaction regarding the show took place on Twitter (76.6%) – the second-screen app „Die Macht“ was not used extensively for interaction (7.9%), but there was a high interest in the voting function; b) the tools were mainly used by (a smaller group of) users who had already been very active beforehand; and c) the users/viewers mainly discussed the show (concept) itself rather than the current issues – they also showed little interest in the preparation of the show (e.g. by proposing a topic or submitting material/UGC). All in all, the Social TV experiment reached a small(er), active group of people with a high social media affinity – a continuation of the program is not planned.

The third panel was entitled „technology as a journalistic tool“:

  • Ralf Spiller & Stefan Weinacht presented the first explorative survey among data journalists in Germany. First, it came as a surprise that only 28 people define themselves as „data journalists“. Second, data journalists have a quite different self-image than „normal“ journalists: they see themselves as investigative „detectives“ and „team workers“ and emphasize functions such as „controlling politics, the economy and society“. Nevertheless, according to Spiller/Weinacht it seems questionable whether every form of data journalism really counts as journalism – sometimes it seems more appropriate to speak of a „service“, of what was formerly called „computer-assisted reporting“.
  • Cornelia Wolf gave a presentation on the technical potential of mobile apps and its implementation by German news media. In her dissertation project, she carried out a content analysis of 457 journalistic smartphone and tablet apps with regard to ten dimensions of their technical potential (e.g. actuality, additivity, connectivity, intuitivity or playfulness). Her findings indicate that journalistic apps in general do not make extensive use of interactive functions; interestingly, radio apps integrate features for content production to a greater extent than other media types. Furthermore, the use of technical potentials seems to be highly dependent on the „mother medium“. All in all, print magazines appear to offer the most innovative apps, although mobile specifics, such as context sensitivity, are not embedded (yet).

Two members of our project team also participated in the conference – Wiebke as the speaker of the DGPuK division and host of panel 4, „technological intermediation“, which included a talk by Nele. In her presentation on technical artifacts as intermediaries (the slides are available on our „Output“ page), Nele discussed the conference theme from a macro perspective by taking into account the „other“ side of journalism: the audience. In her talk she proposed a systematization of technical objects (divided into infrastructure, hardware and software) and offered some explanations why they often appear complicated to media practitioners, users and researchers alike (which stems from their seamlessness, their dynamics and multiple layers etc.). Nele also proposed a systematization of the intermediating functions of technical objects as well as a model that integrates journalists, users and technical objects (their design, functionality and purpose), and how they mutually shape and are shaped by processes of appropriation, routines/practices of usage and social representations [>> Note: this is work in progress ^^].

Two keynotes & one panel discussion

The first keynote by John Pavlik on day one was surely a highlight of the event. Via live video stream from Qatar, Pavlik gave an introduction to the implementation of Augmented Reality (AR) in the field of journalism. One example was his project on situated documentaries, which he called a form of „first person journalism“ (non-linear, interactive, dynamic, contextual and immersive) where the user becomes an ethnographer of his or her environment; another example was a special issue of the „Süddeutsche Zeitung Magazin“ in 2010. According to Pavlik, among the benefits of using AR in journalism are the effective addressing and (re-)engagement of younger user segments (in times of multi-screen usage and the widespread use of mobile devices). Nevertheless, journalists must think innovatively and creatively to harness these storytelling potentials, which could also enable a „fluid discourse“. Pavlik ended his talk with the (rather critical) remark that „technology intermediaries now control the future of the news“, an issue which has to be discussed.

In the late afternoon, Christoph Neuberger moderated a panel discussion that brought together four experts from the R&D area: Prof. Dr. Berchtold pointed out that the printing industry is declining and that investments in innovation in this field are rare; Dipl.-Ing. Christoph Dosch introduced the „Contentus Project“, a very interesting attempt to make media content more accessible and archivable via metadata and semantic search; Hannspeter Richter (workflow management at Bayerischer Rundfunk) talked about the broadcaster’s new trimedia strategy, i.e. in the future, the editorial departments of TV, radio and online will work together on specific topics and produce content for all three media; and Prof. Dr. Hußmann talked about his work at the department for media informatics (University of Munich) and future developments, e.g. the ubiquity of video displays, moving images and interactivity as a standard requirement of media content, new workplaces (like BendDesk) and new challenges for journalists (such as the authentication of sources with the help of implicit biometric information). The following discussion raised the issue that journalism research is always more or less behind technological innovation because academics only learn about new technological tools once they are already implemented in newsrooms. Another issue was the complementing and substitution of journalistic work by technology. Good news for journalists: manual tasks might be substituted in many areas, but intellectual performances will not. Or, as Mr. Dosch put it: „Journalists are the soul of democracy“.

In the second keynote (on day two), Jürgen Wilke from the University of Mainz talked about technological change and journalism from a historical perspective. He drew a historical line – determined by the underlying physical technologies – from mechanization via electrification to computerization and digitization. According to Wilke, the electrification of journalism was a rather late phenomenon which was mainly driven by economic considerations and led to several thrusts of acceleration. Ultimately, such phases of technological change influence journalism on four levels: information gathering, information processing, information content and formal presentation, as well as information dissemination.

Room for Discussion

Overall, the (well organized) event was dominated by empirical research and showed a variety of methodological approaches towards (technologically driven) innovation and technology-related changes in the field of journalism. In his final remarks, the local conference organizer, Prof. Dr. Christoph Neuberger, pointed out that the timely analyses of current developments in the journalism field were very impressive (which they were). At the same time, he encouraged journalism researchers to push forward theory building (as an important addition to descriptive work) and methodological innovation – I absolutely second that. Two important aspects were added by the audience: a) that journalism research should rethink the „newness“ of recent developments and, in this context, should consider older studies and theories which were elaborated in decades that saw massive structural changes (e.g. the computerization of newsrooms in the 70s/80s); and b) that we should reflect very carefully on tendencies to „fetishize“ innovation as a merely positively connoted, „natural“ development – not only with regard to our research objects but also to our own research and interpretations.

[nh]

February 25th, 2013

Contribution to the BPB dossier „Lokaljournalismus“

The Bundeszentrale für politische Bildung (Federal Agency for Civic Education) has published an online dossier on local journalism („Lokaljournalismus“) which also includes a text from the jpub20 project: „Vom Gatekeeping zum Gatewatching. Verändern Soziale Medien den Journalismus?“ (From gatekeeping to gatewatching: are social media changing journalism?). [js]

 

February 13th, 2013

Conference report: „Wandel und Messbarkeit des öffentlichen Vertrauens im Zeitalter des Web 2.0“ (Change and measurability of public trust in the age of Web 2.0)

On January 25, the conference „Wandel und Messbarkeit des öffentlichen Vertrauens im Zeitalter des Web 2.0“ (Change and measurability of public trust in the age of Web 2.0) of the Institut für Praktische Journalismus- und Kommunikationsforschung (IPJ) took place at the Mediencampus in Leipzig. The program was divided into four consecutive panels, each reflecting the perspective and findings of one of the institute’s four pillars – communication management, journalism research, media informatics, brand communication – and thus offered an interesting interdisciplinary approach to trust research:

First, Günter Bentele explained the foundations of his theory of trust and options for measuring it, and pointed to the growing importance of trust for and in persons, organizations and systems in the media society. Subsequently, under the motto „trust in the health care system“, Patricia Grünberg presented a content analysis of print coverage during periods of health policy reform, which showed, among other things, that media reports on the health care system are predominantly negative. The final third of the communication management part was covered by Jens Seiffert with a talk on the alleged crisis of trust in the German financial system: according to his analyses of the Corporate Trust Index – a measurement of the presence and evaluation of seven trust factors in print reports about specific companies – journalists still express trust in the economy in their coverage, despite the financial crisis.

When public trust is discussed, journalism as a major producer of the public sphere (or at least of public communication offerings) cannot be left out. Accordingly, the second part of the conference dealt with journalism research. In the first contribution, Michael Haller gave an overview of research on public trust and the role of the media. I attended on behalf of the jpub 2.0 team and spoke about a somewhat different connection between trust and journalism, namely whether and how transparency in journalism influences trust in journalism. Building on the article „Transparenz im Journalismus“, which Klaus Meier published together with me, my talk first addressed the question of what transparency in journalism actually is: How can such diverse transparency instruments as linking to sources, ombudsmen, web videos of editorial conferences, notes on corrected errors and so on be captured by a single definition? And how can a theoretical model still do justice to the diversity of the phenomenon? Starting from the frequently voiced claim of a positive relationship between trust and the opening up (or at least disclosure) of journalistic processes and resources, the second part of the talk presented, among other things, four of the few empirical studies on this thesis. The conclusion: a clear, definite, typically social-scientific „it depends“. (The slides of the talk can be viewed on my Slideshare account.) Afterwards, Christian Bollert reported how he and his three fellow founders create(d) trust in a new media brand – the web radio station detektor.fm, which they founded – among recipients and advertising clients, among other things through transparency in the form of an editorial code of conduct.

The third part of the conference dealt with the possibilities of measuring public trust through text mining, a method from media informatics in which large amounts of text – for example tens of thousands of press reports – can be analyzed with computer support. As an example, Gerhard Heyer discussed an analysis of the relationship between stock prices and company-related media reports, according to which positive reports about a company have no effect on its market value, while negative ones have a negative effect. However, he also pointed to a problem of purely quantitative text analysis methods: important things may not be said or written very often. In addition, each distribution medium places special demands on the analysis; sentiment analyses of blog posts, for instance, are (still) considerably less meaningful than those of other text types because of the irony they frequently contain. Gregor Wiedemann and Andreas Niekler then explained how text mining is to be applied in the current project „Postdemokratie und Neoliberalismus“ and showed first results. Particularly interesting, however, was the subsequent discussion: a listener asked to what extent the method could also take into account that the meanings of words change over time. As an example, he mentioned the term „Kollateralschaden“ (collateral damage), which was initially used only in the context of armed conflicts but has increasingly been detached from the military context, so that today it is also used, for instance, with regard to layoffs. The speakers reassured him: since the dictionaries underlying the analyses are co-designed by an expert for the field under investigation, the problem can be avoided. To demonstrate the speed of the software, they had the program generate an ad-hoc visualization of the occurrence of the word „Kollateralschaden“ in print articles since 1998 – and thereby immediately confirmed the suspected change in the term’s meaning: while at the beginning of the period under investigation the term was used in the press almost exclusively during phases of wars and conflicts, it appeared much more dispersed towards the end of the analysis period, which indicates that it was no longer used only in a military context.
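
To give a rough idea of the kind of ad-hoc frequency visualization described above, here is a minimal sketch – explicitly not the presenters’ software, whose internals were not disclosed: it simply counts, per year, how many articles in a (hypothetical) corpus of dated press texts contain a given term and plots the trend. Corpus format, field layout and threshold-free matching are assumptions made purely for illustration.

```python
# Minimal sketch of a word-frequency-over-time visualization,
# loosely inspired by the "Kollateralschaden" demo described above.
# NOT the presenters' software; corpus format and contents are hypothetical.
from collections import Counter
import matplotlib.pyplot as plt

# Hypothetical corpus: list of (year, article_text) tuples, e.g. loaded from an archive.
corpus = [
    (1999, "Bei dem Luftangriff wurde von einem Kollateralschaden gesprochen ..."),
    (2004, "Die Entlassungen gelten manchen als Kollateralschaden der Fusion ..."),
    # ... tens of thousands of articles in a real analysis
]

def yearly_frequency(corpus, term):
    """Count how many articles per year contain the term (case-insensitive substring match)."""
    counts = Counter()
    for year, text in corpus:
        if term.lower() in text.lower():
            counts[year] += 1
    return counts

counts = yearly_frequency(corpus, "Kollateralschaden")
years = sorted(counts)
plt.plot(years, [counts[y] for y in years], marker="o")
plt.xlabel("Year")
plt.ylabel("Articles containing the term")
plt.title("Occurrence of 'Kollateralschaden' over time (illustrative)")
plt.show()
```

A real text-mining pipeline would of course add lemmatization, dictionaries curated by domain experts (as the speakers emphasized) and context features to trace *how* a term is used, not just how often.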

The final part of the conference dealt with trust in brands in the age of the social web. Since I had to catch my train, I can unfortunately only reproduce the information from the program booklet here. According to it, Manfred Kirchgeorg and Martin Wiedmann first reflected on the construct of trust in marketing research and presented empirical examples. Steffen Hermann then reported on the interdependence of social media, editorial media and corporate reputation/trust among stakeholders.

Overall, the conference theme „change and measurability of public trust“ provided a good frame for the very different research branches and ensured that each part of the conference offered numerous points of connection for the three other disciplines represented at the institute – and also for the psychologists and sociologists in the audience. The only shortcoming was the lack of Wi-Fi and a hashtag. Perhaps that is why the part „in the age of Web 2.0“ was printed in much smaller type on the conference flyer than the rest of the conference title.

(A further report on the conference can be found on the website of the IPJ.) [jr]

February 3rd, 2013

New(s) stuff

After four months we think it’s time for a new roundup of recent developments and innovations in journalism and user participation.

One tool journalists are currently experimenting with is Google Hangouts. Rob O’Regan describes six ways in which newsrooms can use Google Hangouts, namely for

  • interviews (obvious)
  • discussing breaking news
  • how-to’s, demonstrations or educational programming
  • collaboration within the newsroom (e.g. to jointly discuss story development)
  • chats of writers and editors with paying users
  • focus groups to get feedback on issues, articles, websites etc.

Btw: O’Regan’s assessment that “Google+ is not yet a Facebook killer” is certainly true for the majority of people. But for a small number of recipients of online journalism, Google+ has already killed Facebook: some of the users we interviewed for jpub20 case studies rated Google+ much higher than Facebook in terms of discussion quality and culture.

Similarly, these 91 (!) slides by Mykl Novak not only offer an overview of the content and functions of tumblr, the sociodemographics of its users, and comparisons with Facebook, Google+ and Twitter in terms of unique visitors and visit duration, but also present some examples of how newsrooms use tumblr. Among Novak’s tips: strike a balance between

  • creation
  • curation
  • transparency
  • new, visual formats and
  • user participation/crowdsourcing.

More interested in using Pinterest for journalism? No problem: Mallary Jean Tenore tells you how other journalists use Pinterest to

  • highlight feature content
  • resurface old content
  • respond to news events
  • showcase local attractions and events and
  • reach new audiences.

In this older German post, we already told you about Truth Teller, an application that spots false claims made by politicians in speeches, interviews and so on – in real time! An algorithm transcribes the speaker’s words into text and checks them against the Washington Post’s database of checked facts. Now the WP has launched the prototype of Truth Teller and explains how it works in a video. In a recent post, David Holmes explains the advantage of using robots to do the fact-checking: no one would think they are biased. But Holmes also points to the problem that the WP database against which politicians’ claims are checked consists of facts that have been verified by real human journalists. And if that data leans one way or the other, the unbiased algorithm produces biased evaluations nonetheless. Furthermore, the robot checks keywords, figures and so on, but it cannot understand what it transcribes, so it cannot check whether correct facts are used in a misleading context. However, “For right now at least, the program seems to hit a sweet spot between human reporting and algorithmic data collection.”, Holmes writes, and suggests using Twitter as a data source for robot fact-checking during breaking news events.
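
To make the kind of keyword matching Holmes describes a bit more concrete, here is a minimal, purely illustrative sketch – explicitly not the Washington Post’s actual Truth Teller implementation: a transcribed sentence is compared against a small, hand-curated database of already checked claims via simple keyword overlap. All data structures, thresholds and example claims below are assumptions made for illustration only.

```python
# Minimal, purely illustrative sketch of keyword-based fact matching.
# NOT the Washington Post's Truth Teller code; database and matching logic are hypothetical.

# Hypothetical database of previously checked claims: characteristic keywords -> verdict.
FACT_DATABASE = [
    {"keywords": {"unemployment", "rate", "7", "percent"}, "verdict": "true"},
    {"keywords": {"budget", "doubled", "2009"}, "verdict": "false"},
]

def tokenize(text: str) -> set:
    """Lowercase the text and split it into a set of word tokens, stripping basic punctuation."""
    return {token.strip(".,!?").lower() for token in text.split()}

def check_claim(transcribed_sentence: str, min_overlap: int = 3) -> str:
    """Return the verdict of the best-matching checked fact, if the keyword overlap is large enough."""
    tokens = tokenize(transcribed_sentence)
    best = max(FACT_DATABASE, key=lambda fact: len(fact["keywords"] & tokens))
    if len(best["keywords"] & tokens) >= min_overlap:
        return best["verdict"]
    return "unchecked"  # no sufficiently similar fact in the database

print(check_claim("The unemployment rate fell to 7 percent last month."))  # -> "true"
```

Even this toy example shows the limitation Holmes points to: the matching knows nothing about context, so a correct number quoted in a misleading frame would still come back as “true”.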

In any case, “data” is a word you hear and read more and more often in the same sentence as “journalism”. No wonder that newsrooms are thinking about how to organize and utilize it: Sarah Marshall reports on library software for collecting data and on how journalists extract stories from it. And Luuk Sengers writes about a research database that stores documents, questions, contacts, calendars and so on in one file.

If a newsroom used one of these tools publicly, it could showcase its research processes – thus creating transparency – and invite users to participate: by saying which questions should be answered first because they are most important to them, by adding questions to be answered, by pointing to sources who could answer the journalists’ questions and so on. The German daily Frankfurter Allgemeine Zeitung has just completed a three-day experiment with this kind of user participation, using an interactive mindmap to gather research questions and answers concerning textile production, as well as to discuss the topic.

From live research to live coverage: reporting current events in real time using Twitter offers recipients unique opportunities to comment on the event itself as well as on the quality of the coverage. David Higgerson offers some advice for journalists who want to give it a try. tl;dr? Just take a look at this infographic about live-blogging by Elisabeth Ashton. Interested in more infographics? Cool Infographics is a site dedicated solely to them. And if you are searching for a tool to use for live-blogging, you should read Sarah Marshall‘s article on Liveblog Pro, a platform built by two students and a journalist who seem not to have much confidence in journalists’ skills when it comes to adopting new technology: “Liveblog Pro was built with journalists in mind, making it as simple as possible.”

The fine thing about Twitter is that you can use it not only to disseminate information but also to collect it. However, using tweets as a source is very risky unless the information gathered is verified. Fortunately, Steve Buttry knows how to evaluate the validity of tweets. It might also be useful to watch this video in which Malachy Browne explains how Storyful separated news from noise by verifying user-generated videos and images during Hurricane Sandy. In case that, for once, you haven’t checked the info carefully enough: Rachel McAthy talked to Steve Buttry, Craig Silverman and others about how to correct mistakes online. [jr]