Updates from Eric Ulken

  • Eric Ulken 3:47 am on February 21, 2009  

    What I’ve been up to 

    OK, I’m feeling really guilty about not updating the blog, so here’s a bullet-point summary of what’s been going on since my last post, ages ago:

    • I got sick.
    • I started feeling better, so…
    • I went to BeeBCamp 2 at the BBC on Wednesday and heard lots of interesting talk about the future and the Beeb’s place in it.
    • I remained sick, but thought I was feeling better, so…
    • I went to the Guardian on Friday. Got a nice tour from Kevin Anderson and chatted with some really smart tech folks, including Django co-creator Simon Willison. I even ended up giving a little LAT Data Desk show-and-tell when the scheduled Friday afternoon guest speaker flaked.
    • I’ve been following the unfolding “effing-bloggers-vs.-real-journalists” kerfuffle here.
    • I’m still semi-sick. (I think I’ve been sick more than I’ve been well so far on this trip.)

    Health permitting, I’ve got return visits to the BBC and Guardian lined up for next week, and a trip to Oxford to take in a Reuters Institute talk on news business models.

I still intend to write full posts on the BBC and Guardian visits. But I’ll spare you the sickness post.

  • Eric Ulken 5:35 pm on February 11, 2009
Tags: Charles Arthur, copyright, mysociety.org, The Guardian

    Government data wants to be free 

Buckingham Palace

    Attended a fascinating debate last night on the topic of copyright and government agencies. (No, really. It only sounds tedious.)

    Turns out government data in the U.K. is protected by something called crown copyright, which limits people’s ability to legally redistribute it.

    It’s hard for me to understand why data collected in the public interest isn’t, in fact, freely usable by the public, as it is where I come from. (The U.K. didn’t have a Freedom of Information law until 2000, and even now data released under FOI is subject to restrictions on reproduction.)

    What this means is that many of the mashups based on government data in the U.S. (I’m thinking of stuff like EveryBlock and, yes, much of the output of the L.A. Times’ Data Desk) would be impossible here under the law.

    There are some encouraging signs, though:

    • Guardian technology editor Charles Arthur, who was on the panel last night, has helped lead the charge for opening up government information by co-founding the Guardian’s Free Our Data campaign. He says a broad, cross-party consensus seems to be forming around the need to open up government data. Unfortunately, the government — which, to be fair, has its hands full with things like war and financial upheaval — hasn’t picked up the gauntlet yet. (Random thought: It’s kind of too bad that news organizations in the U.S. are so skittish about advocating for good causes.)
    • Meanwhile, some people aren’t waiting for the rules to change. For example, mysociety.org runs a site called WhatDoTheyKnow, a sort of clearinghouse for FOI requests and the responses from government agencies to those requests. It would appear that the responses are published without regard for any copyright restrictions, but it’s hard to imagine government lawyers going after a non-profit for reproducing information released under FOI. In other words: When the law doesn’t make sense, maybe it just needs to be bent until it can be changed.

    +++

    Oops: Got a little sidetracked from my “What I’ve learned in England” posts. They’ll resume soon.

    Photo of Buckingham Palace by René Ehrhardt via Flickr

    • Ben Welsh 6:05 pm on February 12, 2009

While we’re relatively blessed here stateside, the most insidious trend I’ve seen recently is “public-private partnerships” that present themselves as models of openness but do little for actual public disclosure, serving instead as cover that allows governments to pose as transparent. For an illustrative example in the tech sector, look at how the FCC and state governments are handling broadband data.

      http://bits.blogs.nytimes.com/2009/02/11/why-spend-350-million-to-map-broadband/#more-2511

      And take a look at the “openness” of their actual product.

      http://www.connectmn.org/mapping/

      Ah, a PDF and an expensive, clunky, locked down ESRI mapping app. Whoopie.

      In full disclosure, I used to work with Drew and we were both part of a lawsuit seeking to force the FCC to disclose its broadband deployment data.

And, in case I look like a lone nut, have a quick read of the comments on this recent article about a new “openness” initiative for Boston data.

      http://www.boston.com/news/local/articles/2008/11/19/new_map_technology_to_aid_city_snowplowing/

Or the shameful failure of the Senate Office of Public Records to account for something as basic as amendments in a database that’s intended for no other purpose than public disclosure, rendering it deeply crippled for data analysis.

      http://realtime.sunlightprojects.org/2008/08/14/mark-warner-biggest-recipient-of-lobbyist-dough-new-disclosures-show-so-far/

    • Andrew D. Nystrom / @latimesnystrom 9:30 am on February 19, 2009

      Interesting Eric, thanks for sharing. Thought you and your readers might like to check out the freshly launched Mapping L.A.’s Neighborhoods project, by the @latimesdatadesk – http://latimes.com/mappingla

      It just went live this morning. We welcome everyone’s input and look forward to a rollicking debate on LA’s ever-shifting geographic, cultural and psychographic boundaries.

Coolest feature, IMHO? Select one of the 87 draft neighborhoods (based on U.S. Census 2000 data), then remix it and leave your own visual comments.

      Kudos to Ben and the full LAT Data Desk team.

    • Mike Pirner 12:08 pm on February 19, 2009

      Hey Eric,

      Am enjoying following your travels. Hope all is well.

      Mike Pirner

  • Eric Ulken 7:17 pm on February 9, 2009
Tags: Alison Gow, Birmingham Post, Joanna Geary, Jonathan Ross, Kevin Anderson, Marc Reeves, Peter Horrocks, Phillip Schofield, Sarah Hartley, Suw Charman-Anderson

    England notes, part 1: Twitter is huge 

    Jonathan Ross (aka @wossy)

    People (and particularly media people) here are crazy about Twitter. Simple observation suggests the microblogging phenomenon is even bigger here than it is in the U.S., and the stats seem to bear that out.

    But why is Twitter so big here? One possible explanation, offered by social media consultant Suw Charman-Anderson (aka @suw), is the enthusiastic use of the tool by some big-name Brits. To wit:

    In the newspaper industry here, lots of people are twittering, and not just casually. Just ask @foodiesarah (Sarah Hartley, online editor for the Manchester Evening News), @alisongow (Alison Gow, deputy editor of the Liverpool Daily Post), @kevglobal (Kevin Anderson, blogs editor for The Guardian and spouse of the aforementioned @suw) and @joannageary (Joanna Geary, development editor at the Birmingham Post). Joanna’s boss, @marcreeves (Marc Reeves, editor of the Birmingham Post), even has his Twitter URL on his business card. How many American newspaper editors could say the same?

    As development editor — a role that includes overseeing the newspaper’s efforts in social media — Joanna managed to get the Post to devote occasional space in the paper to explaining Twitter. Tapping her Twitter network, she organized a group of reader experts to act as unpaid bloggers on a variety of topics (see the authors of the Lifestyle blog for a sampling). And, job seekers take note: Her avid Twittering is no doubt partly responsible for her new gig at The Times of London, which starts next month.

    The Birmingham Post isn’t the only U.K. newspaper to spill ink about Twitter: The Daily Telegraph went so far as to publish a full Twitter guide, including step-by-step instructions on how to tweet and a piece on “why the world is Twitter-crazy.” (That may be overreaching a little: It’s worth pointing out that Twitter is by one measure only the 23rd most visited social network in the U.K., but apparently all social networks are not created equal.)

    I should also note that, while the rate of Twitter adoption here is high, usage doesn’t necessarily correlate with understanding. For a particularly embarrassing illustration of this, here’s a cautionary tale from the BBC: Multimedia newsroom boss Peter Horrocks last week sent what he thought was a direct message on Twitter to a colleague, Richard Sambrook, discussing some high-level appointments. Except he sent it as an @-reply, visible to the candidates being discussed, along with the unsuccessful candidates and everybody else in the world for that matter. Ouch.

    +++

    Also: London is the birthplace of the Twestival, a social gathering of Twitter users that has turned into a global event. (The next Twestival is this Thursday, Feb. 12, in 175 cities around the world. Unfortunately, the London Twestival is sold out, so if I’m going I guess I’ll have to find another city.) And… There’s even an online Twitter newspaper here, the All Tweet Journal. Points for the name, at the very least.

    +++

    Next: Tough times for some U.K. papers

  • Eric Ulken 4:05 pm on February 8, 2009
Tags: Robert Peston, University of Central Lancashire

    What I’ve learned in England (so far) 

    "Look right"

    I’ve been in the U.K. for about a week now — long enough to feel guilty for neglecting my blogging duties, but not long enough to really get my head around what’s going on over here.

    I was in Preston last week for the Journalism Leaders Programme at the University of Central Lancashire, where I met journalists from Europe and Africa and heard some familiar stories about change-averse newsroom culture. I also visited newsrooms in Liverpool and Birmingham and listened as editors described the very real changes taking place there.

    Some trends I’ve observed in the process:

    In posts over the next few days, I’ll try to elaborate on each of those points. Meanwhile, here are some happenings in the U.K. media world that have spawned dinner-table conversation in the past week:

    Next: England’s Twitter explosion

    Photo by Charles Collier via Flickr

  • Eric Ulken 6:37 pm on January 30, 2009

    ‘Killing local news’? I doubt it 

    I rarely opine about the print side of the newspaper industry, because it’s not my area of expertise and I don’t usually read printed newspapers. But humor me:

    As part of its continued downsizing, the Los Angeles Times has announced that it’s getting rid of the standalone California section and folding local news into the front section. Reactions are predictably negative, with one Twitterer deducing that the Times is “killing local news.” I don’t think it’s that cut-and-dried. The Times has obviously done a poor job of explaining this move, but to me it is defensible.

    Combining all the paper’s general-interest news into a single section could be a good thing for several reasons:

    • A unified A section means the paper is finally putting local news where it belongs: front and center. Nothing is more important to the paper’s long-term success than local news, so relegating most California stories to an inside section always seemed a bit unfair to me.
    • Since there are already a lot of local stories in the A section, it makes sense to put the rest there too. Honestly, hardly anybody outside the newspaper industry understands why some local stories go in the A section and the rest appear in the local news section. It’s needlessly confusing.
    • People are going to have to get used to smaller newspapers; economic realities dictate it. So rather than print a bunch of anemic 4-page sections, why not do fewer, beefier ones and save some money in the process? (This of course presumes that the total number of pages doesn’t decrease substantially. I’ve seen no official word yet on how much news space is likely to be lost. If the local news hole shrinks as a result of this move, then I’ll retract this defense and join the chorus of outrage.)
    • One alternative that’s been mentioned, merging business and local news into a single section, doesn’t feel right to me because the two are thematically different. Local news is general; business is, well, specific. Meshing the two on a single section front could confuse readers. On the other hand, readers are already quite used to seeing local news merged with national and foreign on A1.
    • This solution, even if it puts some people off, feels more palatable journalistically than other alternatives, which might include even more editorial staff cuts or the closing of additional bureaus.

    It’s too bad this news comes at the same time as the announcement that 70 more Times journalists will walk out the door. That will hurt. Juggling pages is comparatively painless.

  • Eric Ulken 9:49 am on January 28, 2009
Tags: DocumentCloud, Eric Umansky, Scott Klein

    Filling in the blanks on DocumentCloud 

My OJR post on DocumentCloud, the much-buzzed-about $1-million Knight News Challenge grant proposal, is up. I did an e-mail interview with three of the proposal’s authors, Aron Pilhofer of The New York Times and Scott Klein and Eric Umansky of ProPublica. Here’s an excerpt:

    Aron Pilhofer: The grant would be used to create an independent, non-profit organization called DocumentCloud, which would manage the grant, build and maintain the software and so forth. Given the intensely competitive nature of the news business, we reckoned that this project had to be in the hands of an independent, impartial broker in order for a consortium like this to work.

    More here.

  • Eric Ulken 11:14 am on January 26, 2009
    Tags: Brussels, Clo Willaerts, European Journalism Centre   

    Distinctions that no longer matter 

    Today in Brussels, I sneaked into the kickoff seminar for the European Blogging Competition, at which about 90 bloggers and would-be bloggers, representing every European Union nation, are getting a crash course in E.U. politics and blogging techniques.

    Good panelists, good Q&A. But oddly, one of the liveliest discussions revolved around this old question: Is blogging journalism? (It’s a question that, in my view, misses the point. Blogs are simply a platform, much like newsprint, on which journalism can be produced.)

    What was really being discussed, I think, was the difference between independent and affiliated journalists, or between amateurs and professionals, or between traditional and non-traditional news sources. And there, the distinctions are increasingly hard to make.

    To use U.S. analogies: Is a reporter for TechCrunch independent or affiliated? Is The Huffington Post amateur or professional? Should we trust a scoop on TPM Muckraker less or more than a scoop in the New York Post?

    Some in the audience seemed set on drawing a line between the journalism produced by paid journalists working for traditional news organizations and that produced by “bloggers.”

    People wrongly conflate “traditional” with “credible.” (Of course, a strong brand will always bring cachet, but there are new strong brands emerging all the time.)

    In a few years, nobody will care whether a website has (or had) a legacy print or broadcast product attached. What matters in the long run is the quality of your work, as judged by your audience, and the credibility that quality brings you.

    It took me a while to understand that.

    +++
    Also today: Nice talk on standing out in the blogosphere from Clo Willaerts, who crowdsourced her presentation in advance on Twitter.
    +++

    +++
    The actual blogging in the European Blogging Competition (sponsored by the European Journalism Centre, where I interned 6 years ago) begins Feb. 1.
    +++

  • Eric Ulken 6:36 pm on January 24, 2009
Tags: Dopplr, Tumblr

    The when-and-where 

    As the travel plans take shape, I’ll be keeping an updated version of my itinerary here.

    I wanted to use Dopplr to make a pretty map for the itinerary, but I find its interface a little too constraining. At some point I’ll figure out how to make it do what I want, but in the meantime a boring outline will have to do.

    Incidentally, you can follow me on Twitter, or you can see a combination of my Twitter posts, blog entries and Delicious bookmarks on my Tumblelog.

  • Eric Ulken 10:25 am on January 23, 2009

    Separating signal from noise on Twitter 

    Results of a Twitter search on earthquake near:"Los Angeles" within:15mi

    By now a lot of people in the media have discovered how to use Twitter as a promotional tool, judging from the growing number of auto-generated messages populating (polluting?) the Tweetstream.

    But I think relatively few journalists are actually listening to what the community is saying. Which is a shame, because this is our audience talking. And the conversation is often more transparent, more sincere and more insightful than what you see on news sites’ forums and comment boards.

    (I should note that I’m a relative Twitter novice, and I welcome the opportunity to get schooled if I’m totally off base in what I’m about to say.)

Steve Yelvington (who, incidentally, was helming startribune.com when I was an intern there more than a decade ago) describes Twitter this way:

    It’s like a big caffeine party. Everybody’s talking at once. Really fast.

    But you have magic ears.

    You only hear the people you want to listen to, and the people who are saying something directly to you.

    That is Twitter’s great promise, but it’s also where I think the microblogging behemoth comes up short. Because two things happen when you’re listening only to the people you want to hear:

    • They say a lot of things you don’t care about.
    • You miss all the good stuff they’re not talking about.

    So, how can journalists separate the useful stuff from the chatter on Twitter? There are some technological answers to this question. Here are a few I’ve found:

• Twitter advanced search: Sure, everybody knows about Twitter search, but the advanced search options can be a pretty effective way to cut through the noise. For a simple example, try this: earthquake near:"Los Angeles" within:15mi (The geographic search makes use of the place name that users set in their profiles as a geotag for each of their tweets. Imperfect, but it’s a start. A scripted version of this query appears in the sketch after this list.)
    • TwitScoop: See what’s trending in real time. In the future, I imagine a local version of this (limited to tweets in a particular geographic area) on a big display on the wall of every newsroom.
    • TweetDeck: A beautiful app (built with Adobe Air, for good compatibility karma) that lets you follow multiple subsets of the tweetstream (your replies, custom searches, etc.) in real time. If you cover a beat, why not set up a few custom searches on the topics you follow and see what people are saying?
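If you’d rather script that first example than type it into the search box, here’s a rough Python sketch. It leans on the public JSON search endpoint (search.twitter.com/search.json) and the q/geocode parameters as I understand them, so treat the endpoint, parameter names and response fields as illustrative rather than gospel:

    import json
    import urllib.parse
    import urllib.request

    # Sketch only: the endpoint, parameters and response fields are assumptions
    # based on the public search API of this era, and may well change.
    SEARCH_URL = "http://search.twitter.com/search.json"

    def geo_search(query, lat, lng, radius="15mi", per_page=50):
        """Keyword search restricted to a radius around a latitude/longitude."""
        params = urllib.parse.urlencode({
            "q": query,                          # e.g. "earthquake"
            "geocode": f"{lat},{lng},{radius}",  # rough stand-in for near:/within:
            "rpp": per_page,                     # results per page
        })
        with urllib.request.urlopen(f"{SEARCH_URL}?{params}") as resp:
            return json.load(resp).get("results", [])

    # Roughly equivalent to: earthquake near:"Los Angeles" within:15mi
    for tweet in geo_search("earthquake", 34.05, -118.24):
        print(tweet.get("from_user"), "-", tweet.get("text"))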

    And then there are some things I wish I could do with Twitter that, as far as I know, aren’t possible yet. Here’s this journalist’s wish list for Twitter and third-party developers:

    • Better geographic tools, so it’s easier for tweeters to update their location and for searchers to filter geographically. Community news sites could benefit from this.
• The ability to create running searches across a subset of Twitter users. Let’s say you cover technology, you’re following a few key sources, and you want to know whenever one of your sources posts about Yahoo. There might be a tool that enables this, and if so, maybe somebody can enlighten me. (A rough way to fake it with search operators is sketched after this list.)
    • The ability to find conversations that mention a particular URL. Seems like it would be useful (and not just for ego purposes) to know what people are saying about the content you create. Twitturly does a good job of showing which URLs are most popular overall, but as far as I can tell it doesn’t let you specify a URL to examine.
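In the absence of a real tool for that second wish, one crude workaround is to compose a search query that combines your topic with from: operators for each source and drop it into a saved search or a TweetDeck column. A quick Python sketch (the handles are made up, and I’m assuming the from: and OR operators behave the way the search help describes):

    import urllib.parse

    # Hypothetical source list and topic; substitute the handles you actually follow.
    SOURCES = ["jack", "ev", "biz"]
    TOPIC = "yahoo"

    def build_query(topic, sources):
        """Compose a query like: yahoo (from:jack OR from:ev OR from:biz)"""
        froms = " OR ".join(f"from:{handle}" for handle in sources)
        return f"{topic} ({froms})"

    query = build_query(TOPIC, SOURCES)
    print(query)
    # Paste the query into Twitter search, or save it as a custom column:
    print("http://search.twitter.com/search?q=" + urllib.parse.quote(query))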

    So, what are your techniques for separating signal from noise on Twitter? And while you’re at it, what would you add to the wish list?

    Incidentally, if you’re new to the Twitter thing, here are some good posts to get you going. And for a contrarian point of view on the whole signal-to-noise thing, check out Scoble. (Or, maybe he’s mainstream and I’m the contrarian? [Shudder])

    • Andrew D. Nystrom / @latimesnystrom 12:46 am on January 24, 2009

Eric, a few folks at the @LATimes actually had a call this morning with one of Twitter’s lead product managers.

      Besides parsing the numbers related to inauguration-related spikes in traffic – and the pleasantly surprising platform stability – we spent most of the call talking about how to mine Twitter’s APIs and search feeds.

      Re Twitter’s ’09 development plans, integrating their powerful search experience for users is a top priority.

      As widely reported, Twitter confirmed they are also working on a pro/paid version of the service, and they are actively soliciting feedback on what dashboard-type of features would make a paid service appeal to heavy users.

As always, I’d love to hear more feedback [via @latimesnystrom] on what folks would like to see the 60 feeds @latimestweets follows do. So far, more and quicker breaking news is a popular request, along with more unique/original content and more reporter-run streams, like @LAjurno + @latimesfood + @latimesJerry.

      Happy travels, look forward to hearing more about your adventures,
      ~ Andrew, social media guy embedded in the LA Times / latimes.com newsroom

  • Eric Ulken 2:41 pm on January 8, 2009
Tags: Amazon.com, Center for Public Integrity, Derek Willis, Marc Frons, Matthew Ericson, Ruby on Rails

    Making sense of data at The New York Times 

    (After a long holiday hiatus, I’m finally getting around to posting this write-up of my visit with Aron Pilhofer at the NYT.)

    "Movable Type" at The New York Times building

Tuesday, Dec. 23, 2008: The digital art installation in the lobby of the new New York Times building says more, I think, about the future of news and of the Times Company than its creators may have intended. Yes, we know that the future is digital and real-time and kinetic, like the work by Mark Hansen and Ben Rubin. But, more than that, the journalism of the future will be defined by its capacity to extract meaning from countless bits of data. The work, titled Movable Type, elegantly illustrates the bits. Making sense of them is Aron Pilhofer’s domain.

    It is my first visit to the new building, directly across 8th Avenue from the Port Authority Bus Terminal. I am meeting Pilhofer, who leads the paper’s interactive news technology team, for a quick tour and chat. His group of 10 developers, assembled over the last year or so, works on editorial projects (such as the Times’ live election results) but doesn’t report to the newsroom. Their boss is Marc Frons, the website’s CTO.

    Over cups of caffeinated liquid in the Times’ airy 14th-floor cafeteria, Pilhofer tells me about Represent, a newly launched project from his team that, as the name suggests, lets you “keep track of what the people who represent you are doing.” Though still in soft launch, it’s already generating some nice buzz. (A bit on the tech specs from co-creator Derek Willis: They’re using GeoDjango to drive the mapping features.)
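I haven’t seen Represent’s code, but the general GeoDjango pattern behind that kind of lookup is straightforward: store each district boundary as a polygon and ask which one contains the reader’s point. A minimal sketch, with model and field names I’ve invented for the example:

    # Not Represent's actual code -- just the GeoDjango point-in-polygon pattern,
    # with hypothetical model and field names.
    from django.contrib.gis.db import models
    from django.contrib.gis.geos import Point

    class District(models.Model):
        name = models.CharField(max_length=100)
        member = models.CharField(max_length=100)        # who represents it
        boundary = models.MultiPolygonField(srid=4326)   # district shape
        # (older GeoDjango versions also wanted: objects = models.GeoManager())

    def district_for(lat, lng):
        """Return the district whose boundary contains the given point."""
        point = Point(lng, lat, srid=4326)  # Point takes x (longitude) first
        return District.objects.get(boundary__contains=point)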

    Pilhofer is an archetypal journo-techie, raised in the computer-assisted reporting school and fluent both in the cadences of the newsroom and in the technical lingo used by his fellow geeks. Before joining the Times’ computer-assisted reporting team, he honed his skills at the Center for Public Integrity, a D.C. nonprofit that seems to have been a sort of proving ground for smart, webby CAR folks (Willis and my former LAT colleague, Ben Welsh, are also alums).

    This is the breed of journalist that web-oriented newsrooms would like to find more of. The problem is, “they just don’t exist,” Pilhofer says of his ilk. When I throw out the old question about whether it’s easier to teach a journalist programming skills or to teach a techie the principles of journalism, he tells me it’s not so much a question of trainability. Rather, he says, “there are more programmers out there that will find journalism interesting to learn” than vice-versa. He tells me that, with a couple of exceptions, the people on his team have either “very limited journalism experience or none whatsoever.”

    Given that most of Pilhofer’s group comes from a hardcore tech background, I wonder whether they’ve acceded to rigid product development conventions like wireframes and detailed requirements documents. His response: “Hell no.” (Actually, he uses a more colorful four-letter word, but you get the point.)

    He does throw out a lot of prod-dev terms like agile development, scrums, Extreme Programming and pair programming, but he uses newsroom analogies to describe them. Agile development methodology, for example, which stresses frequent deadlines and shuns long meetings, has a lot in common with the rhythm of a newsroom. And pair programming, an unconventional workflow in which two coders work in tandem on the same problem and test each other’s work as they go, is analogous to team reporting.

    Some other highlights from our chat:

    +++
    Software: Pilhofer says his team relies heavily on open-source solutions. Ruby on Rails is the workhorse in this shop, but it’s been adapted to produce flat files when necessary (as opposed to rendering pages on the fly), a performance tweak that enabled the Times to keep up with unprecedented traffic to its election results data.
    +++
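Their implementation is Rails and I haven’t seen it, but the flat-file idea itself is easy to sketch: render each results page once when new data arrives, write it to disk, and let the web server hand out static files instead of hitting the app and database on every request. A toy Python version, with the template, file layout and data all invented for the example:

    import os
    from string import Template

    # Toy illustration of pre-rendering pages to flat files; the real system is
    # Ruby on Rails, and everything here is a stand-in.
    PAGE = Template("<html><body><h1>$race</h1><p>$winner leads, $pct% reporting</p></body></html>")

    def publish(results, out_dir="public/elections"):
        """Write one static HTML file per race; the web server serves these directly."""
        os.makedirs(out_dir, exist_ok=True)
        for race in results:
            path = os.path.join(out_dir, race["slug"] + ".html")
            with open(path, "w") as f:
                f.write(PAGE.substitute(race))

    # Re-run whenever a new batch of results arrives (cron job, results feed, etc.):
    publish([
        {"slug": "us-president", "race": "U.S. President", "winner": "Obama", "pct": 98},
    ])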

    +++
    Hosting: Amazon’s EC2 service, used for most of the team’s data projects, has enabled them to scale with demand. “Amazon has been the savior of this group,” Pilhofer says.
    +++

    +++
    Newsroom geography: The interactive news technology group sits on the Times building’s second floor, in close proximity to the graphics and CAR teams, the two groups Pilhofer says his team works most closely with. (The paper’s business desk takes up much of the rest of the floor.) The graphics desk, in particular, has been a close collaborator, bringing sophisticated visual interpretations to many of the team’s projects. Pilhofer calls deputy graphics director Matthew Ericson the “de facto co-manager” of the interactive news technology team.
    +++

    +++
    Roles and hierarchy: Responsibility for the Times’ interactive projects is shared among Pilhofer’s team, the graphics department and other groups in the newsroom (a highly collaborative, loosely organized structure that reminds me of how interactive projects got done at the L.A. Times, but on a much larger scale). “I kind of like the way it’s working right now, where there isn’t some big, centralized, one-person-in-charge-of-everything,” Pilhofer says. “I think it’s healthier.” Each group brings certain strengths. For instance, the graphics folks want to do really intense, deep immersive online interactives, but they can’t do that without back-end help from Pilhofer’s team, so the two groups work together. Organizationally, Pilhofer says his team benefits from a direct connection to the website’s software and infrastructure folks while other teams are more closely tied to the newsroom. The downside to this setup, of course, is that it’s sometimes hard to know who owns what.
    +++

    More Pilhofer: Old Media, New Tricks recently published an interview with Pilhofer.

    Update 2009.01.13: Emily Nussbaum has a feature on Pilhofer, Ericson and other NYT geeks in New York Magazine.

    Coming next week: A look at DocumentCloud, a promising Knight News Challenge proposal from Pilhofer and ProPublica’s Eric Umansky and Scott Klein.

    Photo by Eric Ulken.

    • Daniel 4:30 pm on January 9, 2009

      Yeah, I bet it’s tough to tell some of the reporters apart from folks who hang out at the Port Authority. Kidding, of course.

      Loved this post of yours, Eric. I have to make it over there the next time I’m in NYC. Keep up the good work, and thanks for the link!

    • Suzanne 12:21 pm on January 11, 2009

      Where has this blog been all my life, and why did it take until now to stumble on it?

      Brilliant stuff.
