
Reading Ahead: A Research Study by Portigal Consulting August 29, 2009

Posted by psychobserver in Customer Experience, Ethnographic Research, Innovation, Research, Trends.
2 comments

Investigating the “reading experience” and “physical versus digital books” has to be one of the most interesting research topics out there. Reading is one of the deepest experiences we have with “things” / “content”, and it is currently facing (and still resisting) new technologies and other tensions that could entirely change that experience.

Steve Portigal and Dan Soltzberg from Portigal Consulting just released the findings of a really interesting research study on the topic, called Reading Ahead (a study they ran out of their own interest). I strongly recommend checking out their different blog posts about this and especially listening to their findings presentation.

Basically it is so good it got me to post again after such a long time away from my blog…

Note: I won’t summarize the study here, as my post is already pretty long. So, to fully put my reflections below into perspective, it’s better to look at their slides and listen to their presentation first (the presentation lasts 1 hour and 20 minutes).

There are several thoughts that came to me as I was listening to the presentation. Let me try to structure a few below.

Device Integration: Loss of Important Emotional Stimuli
Steve and Dan go through many impacts of moving towards a digital reading experience. One impact they mention is the loss of a direct link between the object (the book) and a set of memories or emotions. A book is a physical object and it ages. When you see it on your shelves or take it in your hands, it will automatically trigger specific memories (when you bought it, where you read it), specific feelings or moods. Taking an ebook reader in hand will stimulate an emotional response as well, but using a single device for reading will make this response much shallower and less rich, as the response will mostly be linked to the device instead of the individual book.

As I listened to the talk, I thought about a similar revolution that happened in the past when email appeared to replace physical mail. Letters are single objects in themselves and are directly attached to memories, moods and other people. In the past, we were free to store these letters in different places according to the emotions they would trigger. Now, with emails (and this is reinforced by the huge quantity of emails we receive), this link has totally disappeared. Going back to our archive in our email client and digging up an old email will not have nearly the same effect as reopening a box where we stored all our love letters. And this leads to a different type of emotional attachment, where users are attached to the device (the email client or the eBook reader) rather than the actual content (emails or books).

A Changing World: Speed Versus Reflection
Right after listening to the presentation, I read this article from the Los Angeles Times called “The Lost Art of Reading” by David L. Ulin. This article strongly reinforces the insights extracted from Steve and Dan’s research and shows how deep the reading experience is. It also highlights a very important point: books are not only fighting a war against technology. They are also fighting a war against our changing way of life.

As the article discusses, we live in a world where speed is of the essence – where we need to react to the information we gather within the next second for fear of losing our edge. Books are all about a different way of life. Books can take years to write and for the most part could not care less about current events happening on the day you read them. Books are about unplugging (a term used in Steve and Dan’s study) from the world. Books are about reflecting on things. And these are less and less tolerated in the world today… but I am wandering off topic here (as I usually do), as this concerns the future of reading books in general and not the tension between physical and digital books.

Design Opportunities: Giving Life to Notes
At the end of their report, Steve and Dan go through some opportunities that digital book designs could integrate to make the digital reading experience a stronger contender against the traditional one. One that they do not discuss (unless I missed it) is notes. I would split notes into two categories in the reading experience: notes from the author and notes from the reader. Both of these categories of notes could be revolutionized by the advent of digital reading.

What would added interactivity do to an author’s notes, for example? What if you could click on a referenced article the author mentions and read that article the next second? Or even get a review/summary and ultimately purchase another book the author also happens to cite? This would add a new depth to books and provide an exceptional experience to the reader (not to mention a great marketing opportunity for publishers). As Steve and Dan discuss in their presentation, this kind of feature would have to rely on a great ecosystem to provide this integrated experience. That’s something not in place at all today (especially outside of the US).

The other category of notes is reader notes, and this is where, to me, there is an even greater opportunity. Right now, you can highlight things in a book or put sticky notes on pages you think are interesting. What about searching for a term you saw in the book after you finished reading, or looking up all the pages where another author is referenced? What about building yourself a set of quotes and comments about the book that you typed as you were reading, and that are organized and retrievable anywhere after you’re done reading? What about co-reading even? Seeing what people felt or commented as they read the book you are reading. What if the digital book allowed you to share your experience with other readers, thus giving the digital book a life of its own, almost turning each book into a vertical social network…
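Just to make this concrete, here is a minimal sketch in Python of how reader notes could be stored, searched after reading, and shared with co-readers. It is entirely hypothetical – nothing from Steve and Dan’s study – and all the names and fields are my own invention:

# Hypothetical sketch: reader notes on a digital book, searchable and shareable.
from dataclasses import dataclass, field

@dataclass
class Note:
    book_id: str                 # which book the note belongs to
    reader: str                  # who wrote the note
    location: int                # page or position in the book
    quote: str                   # the highlighted passage
    comment: str = ""            # the reader's own remark
    tags: list = field(default_factory=list)

class NoteStore:
    """In-memory collection of notes, retrievable once you're done reading."""
    def __init__(self):
        self.notes = []

    def add(self, note):
        self.notes.append(note)

    def search(self, book_id, term):
        """Find every note in a book whose quote or comment mentions a term."""
        term = term.lower()
        return [n for n in self.notes if n.book_id == book_id
                and (term in n.quote.lower() or term in n.comment.lower())]

    def co_reading(self, book_id, me):
        """See what other readers highlighted or commented in the same book."""
        return [n for n in self.notes if n.book_id == book_id and n.reader != me]

store = NoteStore()
store.add(Note("reading-ahead", "me", 42, "Books are about unplugging.", "Compare with Ulin's LA Times piece", ["reading"]))
store.add(Note("reading-ahead", "alice", 42, "Books are about unplugging.", "So true on the train home"))
print(store.search("reading-ahead", "unplugging"))   # my quotes/comments, retrievable anywhere
print(store.co_reading("reading-ahead", "me"))       # what others felt at the same passage

A real digital book would of course need this to live in an ecosystem (synced across devices, shared between readers), but even this toy version shows how much more a note could do once it is digital.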

As usual, I am just following a spur of the moment to write this post and I ought to have put more reflection into it (sorry… I am a product of the new changing world where speed is everything), but I do hope that it can, even so, modestly contribute a tiny bit to the discussion surrounding books and the reading experience. I am personally a big fan of physical books, but if the digital reading experience and a good ecosystem were in place, I would definitely juggle between both physical and digital reading (similarly to the way I do with CDs and digital music).

Tim Berners-Lee: The next Web of open, linked data March 15, 2009

Posted by psychobserver in Social Networking, TED Talks, Trends.
add a comment

I got back to watching some TED talks and the talk by Tim Berners-Lee felt like a good one to start with. In his talk he reveals his vision for the evolution of the Web, moving from sharing documents to sharing raw data. The presentation is very visionary and somewhat scary, I would say.

Watch it below:

If I understand the idea correctly, Linked Data means that when browsing the Web “tomorrow”, instead of looking through documents (web pages), the search engine (or whatever is used to browse) will look through raw data. So, I guess one could type a question like “How many friends does Nicolas Lassus have?” and get results from many different places/databases that would answer that question.
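To make that a bit more concrete, here is a tiny Python sketch of what “looking through raw data” could mean: each source publishes facts as subject–predicate–object statements (triples), and a query aggregates them across sources. The sources and names here are invented for illustration; this is just the general Linked Data idea, not Tim Berners-Lee’s actual implementation:

# Hypothetical sketch of the Linked Data idea: each source publishes raw facts
# as (subject, predicate, object) triples, and a query aggregates across sources.
social_site = [
    ("Nicolas Lassus", "knows", "Alice"),
    ("Nicolas Lassus", "knows", "Bob"),
]
address_book = [
    ("Nicolas Lassus", "knows", "Bob"),       # the same fact can appear in several sources
    ("Nicolas Lassus", "knows", "Charlie"),
]

def friend_count(person, *sources):
    """Answer 'How many friends does X have?' from raw data, not documents."""
    friends = set()
    for triples in sources:
        for subject, predicate, obj in triples:
            if subject == person and predicate == "knows":
                friends.add(obj)
    return len(friends)

print(friend_count("Nicolas Lassus", social_site, address_book))   # -> 3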

Personally, I see quite a number of huge hurdles to overcome before something like that could work. The first is something faced by Wikipedia every day: how do you ensure that the data is accurate and legitimate? Even if people are not trying to manipulate others, there are so many ways to calculate something that raw data on its own can be totally unusable. For this to work, everything around the world would have to be entirely standardized. For example, the unemployment rate is calculated differently in different countries because of each country’s specificities (or political agenda). Theoretically speaking, standardization would be great, but is it realistic?

Another problem is that data is actually a huge business. The open source concept is great, but gathering good data is actually a very time-consuming and tough job. How do we “reward” the people who bring the data to the masses? Personally, I am not a big supporter of the free economy, and I believe things that are free today may no longer be free tomorrow. Somebody at some point in the supply chain has to pay for things.

The last problem is privacy. If people are able to post data about other people on the web, how do we control that? Facebook and other social networks are testing the limits of this on their side and it will be interesting to see where things go… (I’m reading some articles on the topic and will try to write a post about that soon).

This said, Linked Data can definitely have great applications in some fields – all the fields where data standardization has already been happening really fast, like financial reporting, corporate social responsibility (CSR), etc. Linked Data there would be a great advance. Having worked a bit on CSR, I really feel that all these reports corporations work so hard to produce should be replaced by Linked Data: something that allows people to easily compare and analyze what companies are doing, to make better-informed decisions about their purchases or which brands they support. This standardization has already started and only a very small step remains to actually make Linked Data a reality there… whether something like that could spread to the whole Web will be interesting to witness.

Follow-up on user-generated content… March 20, 2008

Posted by psychobserver in Advertising, Newsweek, Strategy, Trends.
add a comment

Here is an article from Wharton “The Experts vs. the Amateurs” that is related to the article I posted about from Newsweek: “Revenge of the Experts” (my post here).

Despite its title, I think the most important issue raised in this latest article is the one about business models. With the “everything is free” idea going around (thank you Wired), everybody is running around trying to come up with a “hybrid business model”. Indeed, the current free models offered by formerly paid publications are not sustainable as they don’t generate enough to cover the loss of subscription revenue. So, is there a hybrid free business model? … I tend to think there is none. If you want quality, reviewed, edited content, … well, it makes sense you have to pay for it somehow. And there is only so much advertising money can cover.

The big problem, as the article states, is that the line between expert and amateur content is very blurry, which makes matters worse. What makes it even worse is that as human beings we are really bad at reacting to a situation until there is a big crisis. So, as long as old-fashioned professional publications survive, we won’t realize that we actually want to pay for quality content. Personally, I do a lot of consulting work, and I am happy people are actually willing to pay money for my work, instead of having to watch a 5-minute advertising video before every meeting they have with me. I can imagine it is the same for a reporter or an editor. My feeling is that paid publications will come back in the end… I can’t help but think that the whole “free” thing will quite quickly disappear… not that I am an expert on the issue in any way, just a thought.   :o)

The end of User-Generated Content? March 8, 2008

Posted by psychobserver in Innovation, Newsweek, Social Networking, Trends, User Experience, Web 2.0.
2 comments

The end of user-generated content? Really?! With social networks, blogs, wikis and more new similar applications appearing every day, who would defend such an idea? It is, at first glance, what Newsweek seems to be doing with their article “Revenge of the Experts” (found through the Putting People First blog). But is it really what they are saying?

The debate is not really about whether user-generated content will disappear or not. People will continue to generate content. And with the increasing power of the applications and tools we have within our grasp, we will continue to generate more and more content. But it is the role of the content we generate that will be changing. With all the excitement brought by “Web 2.0” (for lack of a better word) about common users doing the job of experts, and companies using them to build a business model, we forgot that experts did not appear out of nowhere. Experts are here because, well, they are experts! They are much better at doing something than other people, and they should be rewarded for that. The tools we now have available helped close the gap between real expertise and perceived expertise, but the difference remains nevertheless.

The fact that blogs exist, for example, does not mean we can all be good reporters or journalists. It only means that we can all publish stuff. The fact that we can now comment on articles in most of the major magazines and newspapers does not make us more expert than the person writing the article. And actually, if we go beyond the facade of user-generated content, we discover that most content, as highlighted in the article, is generated by a very small group of people. In the end, to create quality content that all can refer to, you need experts. Wikipedia showed that an amazing tool could be created by offering a place where experts from a wide range of fields could aggregate all their knowledge, but it omitted to include a clear accountability review of the quality of each contributor.

In every such discussion I have these days, everything boils down to the word “good”. In the recent discussion on the use of personas, the conclusion basically is that if the person is “good”, then personas are great. In this case it is the same: if a person is good, or an expert, then we can trust his or her judgment. This means that we need expertise, and we need ways to identify who has that expertise. After all the excitement, we could very well see old-fashioned business models that we thought were dead make a comeback.

Open your mind… and dream: Nokia Morph February 29, 2008

Posted by psychobserver in Concept, Innovation, Mobile, Trends, User Experience.
2 comments

This is a concept video from Nokia. You can download it from Nokia’s website or watch it on YouTube (embedded below).

Concepts have been used a lot in the car industry to spur design ideas and creativity. I think this video does an amazing job of setting a vision for the future of mobile. It is crazy… and a very long-term vision. But it is also based on actual technology and the actual constraints we have today with mobile devices (like feature integration and screen size). Just as we don’t see concept cars in the street, there is very little chance we will see this concept released at all. Still, just like in the car industry, some of the features in these concepts can make it into mainstream products. I personally can’t wait to see how screen size limitations are addressed with new technologies… See Philips’ work… or Modu Mobile.

Agile User Research February 20, 2008

Posted by psychobserver in Customer Experience, Ethnographic Research, Hong Kong, Research, Trends, User Experience.
1 comment so far

On February 5th, Hong Kong had the chance of having Martin Fowler speak about Agile methodology. I have to admit that I was actually dragged to the talk by my developer colleagues… I did not really feel like going to a very technical presentation where I would be totally lost. I was really wrong not to want to go. After I actually understood that XP did not mean Windows XP, but eXtreme Programming (he he… embarrassed smile), I really enjoyed the talk.

There are two main points that made me think about how Agile methodology could have an impact on my work: user experience / user research.

1. How can we make research more agile?

The Agile concept is to break down every project into small, fully functional modules that can be delivered in a very short period of time (perhaps 2 weeks for a development project). This helps focus on the core features of the project while leaving the rest for later. It also helps start the design even without knowing all the business requirements, and actually supports better defining the business requirements along the way, as the client sees the system being built from scratch.

In this post, I will just focus on the research part of user experience. Indeed, the interaction design part can be incorporated quite easily into an Agile methodology, but the preliminary research appears trickier to me. When we start a project, we first want to know what the customers or users want. We have an array of tools to address this, from quantitative ones like surveys to qualitative ones like usability testing or ethnographic research. Studies like these can actually last for quite a long time, and from the client’s point of view it is hard to visualize what they will get out of it. What if we could break down any research into small items that would each last maybe under a week and deliver clear conclusions at the end of each week? Being new to Agile, I still need to think about that some more… the first problem I see is how to perform a relevant study (in terms of sample size, for example) in such a short time… But with this in mind, making research more iterative helps design a better study in the end, by fine-tuning the study objectives bit by bit.

2. Should I work towards not having a job?

From Martin’s point of view and following Agile concepts, the developer and the client should be in direct contact. This makes the role of the Business Analyst on such projects redundant. Of course, he mentioned that on most projects Business Analysts are actually key in creating a bond between the different parties, but that made me think… Are researchers like business analysts?… In an ideal case, if my client (I mean, the operational teams) could talk directly to their customers, everything should be better. What if, instead of designing one-off studies, researchers all strove to design systems that allow their clients to stay in touch with their customers on a continuous basis, making our role as researchers redundant?

That’s pretty much the concept of customer experience. Stay in touch with your customers on a continuous basis so that you can better design your products and services depending on their changing needs. Still maybe more could be done to integrate advanced qualitative methods into the operations of a company. We see more and more ethnographic research within companies… but my feeling is that more is possible.

Just some unfinished thoughts…

The web auditory experience January 22, 2008

Posted by psychobserver in Innovation, Trends, User Experience.
add a comment

For a long time, the Web has been about having a visual experience. But now, there are signs that this is changing. With more and more videos on the Web, with Skype causing (or maybe the causes are elsewhere) people to wear headsets in the office, and with music being added to websites, the Web now has the opportunity to deliver a richer experience using both visual and auditory cues.

It looks like the first focus of this evolution is to tackle one of the most critical problems with the Web: the fact that people don’t read. I was reading (… so people really don’t read?) an article today and next to the title was a button saying: “Click-2-Listen”.

Out of curiosity, I clicked on it and listened to the article being read to me. In the end, I missed most of it as my colleagues were talking to me and I could not be bothered to press Pause/Play all the time. The quality of the speech, although impressive, is nothing like a real human talking. In the end, I think I by far prefer reading.

I guess the nice thing with this text-to-speech service is that you can download the media file and play it later, maybe on your mp3 player on the way home. It seems to me that if we are in front of our screen, we’d rather read through an article, which is a much more flexible task, than listen to it. The service could then be much more targeted towards people who download the file for later.

Two companies offering the service are:

  • News Worthy Audio
  • Odiogo (I really like the tagline: “Gives your text content voice … and legs!”, which plays much more on the listen later feature of the service)

Steve Portigal’s Interactions column: Persona Non Grata January 16, 2008

Posted by psychobserver in Ethnographic Research, Strategy, Tools, Trends.
162 comments

Request a copy of Portigal Consulting’s new column: Persona Non Grata (link to ACM Interactions Publication). Check out their blog to see how.

A really good article on personas… It is well written, entertaining and to the point: personas are misused in most cases, and a tool meant to “help companies actually get closer to their REAL customers” has been transformed into a tool used to “be perceived as close to real customers”.

Should personas be dumped altogether, or should researchers educate marketing and design people about how to use them to actually make a difference in the product lifecycle (instead of just being used as a marketing tool)? I am for education… but how long will that process be?

Online World vs. Real World – An Increasingly Blurry Line January 14, 2008

Posted by psychobserver in Customer Experience, Innovation, Strategy, Trends, User Experience.
add a comment

A great article featured on Experientia’s Putting People First: “Technology and the World of Consumption” from the blog apophenia. The article itself is interesting, and the discussion below it is even more so.

“her daughter moved seamlessly between the digital and physical worlds to consume”

The whole idea can be summarized by the quote above. Up until now, the real and the online worlds have been considered entirely separate, selling different products and services, and addressing the needs of different customers. But more and more, these differences are disappearing. With the new generation growing accustomed to the online world, the distinction is less and less relevant. Consumers are learning how to adapt their shopping behavior to optimize their experience, regardless of how retailers are thinking and planning their offering.

Thus behaviors like searching online and buying in the real world, or the reverse – searching in the real world and buying online – are becoming commonplace. This transition is far from an easy one. If we look at the services industry, for example, banks have been struggling for a long time to move their customers from branches to the ATM and then online. Only now are they seeing younger customers using cheaper channels. In the case of the banks, cost has been driving the transition and helped companies make the necessary changes proactively (even before customers actually wanted those changes).

But what about retail? The cost component and the complementarity of the two worlds are not self-apparent. That could explain why companies are slower to react. But react they will have to. Both the real and online worlds have their place. They each address different kinds of needs, but surely both will have to adapt to the changing habits of consumers. Personally, I see this as one of the areas with the most interesting potential for innovation and change in customer experience.

Social Networking: The growth dilemma… November 25, 2007

Posted by psychobserver in Customer Experience, Hong Kong, Social Networking, Strategy, Trends, Web 2.0.
add a comment

I recently did some work for a mobile social networking start-up here in Hong Kong, helping them with their interface and their user experience in general. Although my focus was on the interface (and the project was very short, as usual), we happened to talk quite a bit about strategy. In fact, I always find it hard to focus on user experience and user interface without poking my nose into the overall strategy behind them (I’d like to spend more time on that, but it is not the point of this post… maybe for a later one).

Anyway, it seems to me that there is a tough dilemma when building a social network. The problem is that the number of users and the amount of interaction going on in the network are everything. Indeed, most social networks (excluding a few business-related or referral-based ones) do not have a clear revenue model. As everything is advertising-based, the indicators to get funding and be recognized in the industry are basically the number of users and page views. Unfortunately, these indicators in some cases go against user goals and against the long-term survival of the network.

We are now all used to this. We join a social network and at first all goes well. We have our close friends there and we are having fun exchanging news, pictures, videos, etc. However, after a while, our network grows: old friends, former colleagues, and people we actually met via the social network get added, and it all goes out of control. Indeed, the whole concept is based on encouraging users to “make” more friends. Games are put in place to push them to add people to their network, and, quite simply, when somebody asks someone to be his or her friend, it is really hard to say no. It feels like a point-of-no-return kind of decision. Even though we might never meet the person in real life, we just cannot say no to people that easily. And once they are added to our network, we have little control over what they do with their access to our information. Thus the beginning of the end, and often users drop out of the network.

Although it means taking more time to grow the network, it seems to me that social networks should learn more from real life. Instead of just facing a “friend” or “not friend” situation, users should be able to grow relationships more slowly and keep them under their control, without feeling like they are making others feel bad (or perceiving that they could make the other person feel bad). That is how it works in the real world. When we meet somebody for the first time, we do not usually invite them home to look at our family pictures or read our private diary. It is just normal that relationships take time to build. It should be the same online. It is very convenient to be able to exchange information with friends online, but users should not pay for this convenience by losing their right to privacy… and if social networks do not realize this, they will all sooner or later face the fact that when the hype is gone, people will choose privacy over convenience.
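As a purely illustrative sketch in Python (my own invention, not any existing network’s model), one way to picture this is relationships that carry a level which can grow over time, with each piece of content visible only above a given level, instead of a single friend/not-friend flag:

# Hypothetical sketch: relationship levels instead of a binary "friend" flag.
# What a viewer can see depends on how far the relationship has grown.
from enum import IntEnum

class Tier(IntEnum):
    STRANGER = 0
    ACQUAINTANCE = 1      # e.g. somebody just met online
    FRIEND = 2
    CLOSE_FRIEND = 3      # family pictures, private diary...

class Profile:
    def __init__(self, owner):
        self.owner = owner
        self.relationships = {}    # other user -> current Tier
        self.content = []          # (minimum tier required, item)

    def set_relationship(self, user, tier):
        """Grow (or shrink) a relationship gradually, under the owner's control."""
        self.relationships[user] = tier

    def post(self, item, min_tier):
        self.content.append((min_tier, item))

    def visible_to(self, user):
        """Show only the content this viewer's relationship level allows."""
        tier = self.relationships.get(user, Tier.STRANGER)
        return [item for required, item in self.content if tier >= required]

me = Profile("me")
me.post("status: back in Hong Kong", Tier.ACQUAINTANCE)
me.post("family pictures", Tier.CLOSE_FRIEND)
me.set_relationship("new_contact", Tier.ACQUAINTANCE)
print(me.visible_to("new_contact"))    # -> ['status: back in Hong Kong']

Saying “yes” to a new contact then no longer means handing over everything at once, which is much closer to how relationships work offline.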

With all this said, the problem still remains. In a short-term-focused world like the one we live in, we demand quick results, and a slow-growing network just does not make sense… or does it?