Tuesday, August 31, 2010
Do Links Really Rot Our Brains? A Second Look
Links have become an essential part of how I write, and also part of how I read.
Given a choice between reading something on paper and reading it online, I much prefer reading online: I can follow up on an article's links to explore source material, gain a deeper understanding of a complex point, or just look up some term of art with which I'm unfamiliar.
There is, I think, nothing unusual about this today. So I was flummoxed earlier this year when Nicholas Carr started a campaign against the humble link, and found at least partial support from some other estimable writers (among them Laura Miller, Marshall Kirkpatrick, Jason Fry and Ryan Chittum). Carr's "delinkification" critique is part of a larger argument contained in his book The Shallows. I read the book this summer and plan to write about it more. But for now let's zero in on Carr's case against links, on pages 126-129 of his book as well as in his "delinkification" post.
The nub of Carr's argument is that every link in a text imposes "a little cognitive load" that makes reading less efficient. Each link forces us to ask, "Should I click?" As a result, Carr wrote in the "delinkification" post, "People who read hypertext comprehend and learn less, studies show, than those who read the same material in printed form."
This appearance of the word "hypertext" is a tipoff to one of the big problems with Carr's argument: it mixes up two quite different visions of linking.
"Hypertext" is the term invented by Ted Nelson in 1965 to describe text that, unlike traditional linear writing, spreads out in a network of nodes and links. Nelson's idea hearkened back to Vannevar Bush's celebrated "As We May Think,"paralleled Douglas Engelbart's pioneering work on networked knowledge systems, and looked forward to today's Web.
This original conception of hypertext fathered two lines of descent. One adopted hypertext as a practical tool for organizing and cross-associating information; the other embraced it as an experimental art form, which might transform the essentially linear nature of our reading into a branching game, puzzle or poem, in which the reader collaborates with the author. The pragmatists use links to try to enhance comprehension or add context, to say "here's where I got this" or "here's where you can learn more"; the hypertext artists deploy them as part of a larger experiment in expanding (or blowing up) the structure of traditional narrative.
These are fundamentally different endeavors. The pragmatic linkers have thrived in the Web era; the literary linkers have so far largely failed to reach anyone outside the academy. The Web has given us a hypertext world in which links providing useful pointers outnumber links with artistic intent a million to one. If we are going to study the impact of hypertext on our brains and our culture, surely we should look at the reality of the Web, not the dream of the hypertext artists and theorists.
The other big problem with Carr's case against links lies in that ever-suspect phrase, "studies show." Any time you hear those words your brain-alarm should sound: What studies? By whom? What do they show? What were they actually studying? How'd they design the study? Who paid for it?
To my surprise, as far as I can tell, not one of the many other writers who weighed in on delinkification earlier this year took the time to do so. I did, and here's what I found.
You recall Carr's statement that "people who read hypertext comprehend and learn less, studies show, than those who read the same material in printed form." Yet the studies he cites show nothing of the sort. Carr's critique of links employs a bait-and-switch dodge: He sets out to persuade us that Web links -- practical, informational links -- are brain-sucking attention scourges robbing us of the clarity of print. But he does so by citing a bunch of studies that actually examined the other kind of link, the "hypertext will change how we read" kind. Also, the studies almost completely exclude print.
If you're still with me, come a little deeper into these linky weeds. In The Shallows, here is how Carr describes the study that is the linchpin of his argument:
In a 2001 study, two Canadian scholars asked seventy people to read "The Demon Lover," a short story by the modernist writer Elizabeth Bowen. One group read the story in a traditional linear-text format; a second group read a version with links, as you'd find on a Web page. The hypertext readers took longer to read the story, yet in subsequent interviews they also reported more confusion and uncertainty about what they had read. Three-quarters of them said that they had difficulty following the text, while only one in ten of the linear-text readers reported such problems. One hypertext reader complained, "The story was very jumpy..."
Sounds reasonable. Then you look at the study, and realize how misleadingly Carr has summarized it -- and how little it actually proves.
The researchers Carr cites divided a group of readers into two groups. Both were provided with the text of Bowen's story split into paragraph-sized chunks on a computer screen. (There's no paper, no print, anywhere.) For the first group, each chunk concluded with a single link reading "next" that took them to the next paragraph. For the other group, the researchers took each of Bowen's paragraphs and embedded three different links in each section -- which seemed to branch in some meaningful way but actually all led the reader on to the same next paragraph. (The researchers didn't provide readers with a "back" button, so they had no opportunity to explore the hypertext space -- or discover that their links all pointed to the same destination.) Look at this illustration from the study for a better sense of the design.
Bowen's story was written as reasonably traditional linear fiction, so the idea of rewriting it as literary hypertext is dubious to begin with. But that's not what the researchers did. They didn't turn the story into a genuine literary hypertext fiction, a maze of story chunks that demands you assemble your own meaning. Nor did they transform it into something resembling a piece of contemporary Web writing, with an occasional link thrown in to provide context or offer depth.
No, what the researchers did was to muck up a perfectly good story with meaningless links. Of course the readers of this version had a rougher time than the control group, who got to read a much more sensibly organized version. All this study proved was something we already knew: that badly executed hypertext can indeed ruin the process of reading. So, of course, can badly executed narrative structure, or grammar, or punctuation.
In both The Shallows and his blog post, Carr also makes reference to a meta-analysis (or "study of studies") on hypertext reading studies, a paper that examined 40 other studies and concluded that "the increased demands of decision-making and visual processing in hypertext impaired reading performance." But a closer look at this paper reveals another apples-and-oranges problem.
Carr is saying that Web links slow down our brains. But none of the studies the meta-analysis compiles looked at Web-style links. They all drew comparisons between linear hypertexts (screens with "next" links, not printed articles) on one side, and on the other, literary-style hypertexts broken up into multiple nodes where "participants had many choices in sequencing their reading."
Every other study that I've looked into in this area shares these same problems; I'll spare you the detail. These studies may help explain why there's never been a literary-hypertext bestseller, but they don't do much to illuminate reading on the Web. Carr talks about links having "propulsive force," but does anyone really experience them that way today? Maybe in the early days of the Web, when they were newfangled, people felt compelled to click -- like primitives suddenly encountering TV and jabbing their fingers at the channel selector, wondering what will magically appear next.
I think we all passed through that phase quickly. If your experience matches mine, then today, your eyes pass over a link. Most often you ignore it. Sometimes, you hover your mouse pointer to see where it goes. Every now and then, you click the link open in a new tab to read when you're done. And very rarely, you might actually stop what you're reading and read the linked text. If you do, it's usually a sign that you've lost interest in the original article anyway. Which can happen just as easily in a magazine or newspaper -- where, instead of clicking a link, we just turn the page.
Yes, a paragraph larded up with too many links can be distracting. Links, like words, need to be used judiciously. This is a long post and I have included only a modest number of links -- all that I needed to point you to my sources and references, and most of which most of you won't ever click. Overuse of links is usually a sign that the writer does not know how to link, which on the Web means he does not know how to write. But such abuse hardly discredits linking itself. Many writers still don't understand that comma-splicing is bad grammar, but does that get us talking about the "de-comma-fication" of our prose?
For Carr and his sympathizers, links impede understanding; I believe that they deepen it. Back in 1997 Steven Johnson (in his book Interface Culture) made the case for links as a tool for synthesis -- "a way of drawing connections between things," a device that creates "threads of association," a means to bring coherence to our overflowing cornucopia of information. The Web's links don't make it a vast wasteland or a murky shallows; they organize and enrich it.
"Channel surfing," Johnson wrote, "is all about the thrill of surfaces. Web surfing is about depth, about wanting to know more." As the Web has grown vast, that desire has grown with it. To swear off links is to abandon curiosity. To be tired of links is to be tired of life.
Tomorrow, in the next post in this series, I'll examine some of the ways links are being misused on the Web today -- driven not by some abstract belief in the virtues of hypertext but rather by crude business imperatives. Then, in the final installment, I'll make the case for good linking practices as a source of badly needed context and a foundation for trust.
This post originally appeared at Wordyard and is republished here with permission.
Use Past Experiences To Achieve Future Success
The more clearly you understand your strengths and how they enabled you to get to this point, the more effectively you can leverage them into a bright and fulfilling future.
It's important not to regret the past. Instead, you must appreciate all of your positive experiences and find a way to learn from the negative ones.
-- Karen Newman, Author, Experience Mapping
Read more advice from Newman in an excerpt from her book at WomenEntrepreneur.com >
Tuesday, August 24, 2010
Hitting New 52-Week Lows?
Shares of Wells Fargo have hit a 52-week low today (as MarketFolly alerted us to earlier), and Bank of America hit a new low on Friday.
So, how come? Solvency risk or merely profitability risk?
That would seem to be a crucial question.
(We could have sworn that this question came up during the recent Treasury-meets-with-financial-bloggers roundtable, and that we saw Felix Salmon write about it, but can't find the account any longer on his site.)
Image: StockCharts.com
Image: StockCharts.com
Monday, August 23, 2010
Here's The Kyle Bass Portfolio
See this week's charts as a slideshow →
Or select individually:
- A Really Long View Of Inflation Shows That It's All The Fed's Fault
- Here's The Kyle Bass Portfolio
- Goldman Sachs Thinks The End Of The Stimulus Is Going To Crush GDP Growth
- Today's Unemployment Claims Ruined Any Hope Of A Decent Jobs Situation This Year
- How To Blow A Bond Bubble
Read more: http://www.businessinsider.com/charts-of-the-week-heres-the-kyle-bass-portfolio-2010-8#ixzz0xNuYUMZG
Friday, August 20, 2010
Created Equal
There are many ways to classify commodity ETFs. One is by the type of commodity they track, such as gold or oil. Another is by the type of investment product they use. By investment type, there are three common kinds: physically backed, futures-based, and equity-based. Understanding the nuances of these different structures is important when considering commodity ETFs.
Physically backed commodity ETFs invest directly in the physical commodity, and issue shares backed by the physical inventory. The largest physically backed ETF is the SPDR Gold Shares with $50 billion in assets. This is more gold than the official gold reserves of Switzerland or China. Important considerations for physically backed ETFs are the cost, safety and permanence of storing the commodity. Storage and insurance costs can be prohibitive for a commodity such as oil. Storing perishable agricultural commodities is not only expensive but also impractical. As a result, most physically backed ETFs invest in precious metals, such as gold, silver or platinum, which can be stored for long periods of time without losing their value and where the cost of storage is low relative to the value of the commodity.
Futures-based commodity ETFs invest in futures contracts for the commodities they track. An example is the U.S. Natural Gas Fund, which has $2.5 billion in assets. There has been a lot of negative press about futures-based ETFs, most recently from Bloomberg Businessweek (“Amber Waves of Pain,” July 22, 2010). The article chronicled two problems that caused the returns of futures-based commodity ETFs to significantly lag the return of the commodity itself. The first is well documented: many commodity markets have been in contango (when the futures price of a commodity is higher than the current, or spot, price) for quite some time. The article suggests a second, more sinister problem with futures-based ETFs. Because the ETFs roll their futures contracts at set times, “professional futures traders exploit the ETFs’ monthly rolls at the little guy’s expense.” U.S. Commodity Funds, the firm behind the U.S. Natural Gas Fund, is launching a futures-based commodity ETF that seeks to avoid the contango issue by concentrating in commodities that are in backwardation, the opposite condition to contango.
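To make the roll problem concrete, here is a minimal Python sketch -- my own illustration with made-up prices, not anything from the article -- of how repeatedly rolling a futures position in a contangoed market erodes returns even when the spot price goes nowhere:

```python
# A toy model (illustrative prices, not real quotes) of the "roll drag" a
# futures-based ETF suffers in contango: each month it sells the expiring
# front contract and buys the next month's contract at a higher price, so
# the same dollars control less and less of the commodity.

def rolled_fund_return(spot_start, spot_end, rolls):
    """Return (spot return, fund return) after a series of monthly rolls.
    Each roll is a (front_price, next_price) pair at the time of the roll."""
    exposure = 1.0  # units of the commodity the fund effectively holds
    for front, nxt in rolls:
        exposure *= front / nxt  # in contango, nxt > front, so exposure shrinks
    spot_return = spot_end / spot_start - 1
    fund_return = exposure * spot_end / spot_start - 1
    return spot_return, fund_return

# Spot price goes nowhere for six months, but each roll costs 3%:
rolls = [(4.00, 4.12)] * 6
spot_r, fund_r = rolled_fund_return(4.00, 4.00, rolls)
print(f"spot: {spot_r:+.1%}  rolled fund: {fund_r:+.1%}")
# spot: +0.0%  rolled fund: -16.3%
```

The flat spot price is the point: the fund's loss comes entirely from the rolls, which is why a fund concentrating in backwardated markets would expect the effect to run in its favor.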
Some futures based funds are structured as Exchange Traded Notes (ETNs), which have the additional counterparty credit risk exposure to the bank selling the notes. An example of a futures-based commodity ETN is the iPath Dow Jones-UBS Platinum ETN, which is exposed to the credit risk of the issuer, Barclays Bank PLC.
Equity-based commodity ETFs do not invest in the physical commodity but rather buy shares of companies involved in the commodity. An example is the Market Vectors Gold Miners ETF, which invests in shares of gold mining companies. The returns of these funds, while correlated with the price of the commodity, do not track it. These funds tend to be more volatile than the commodity itself, typically rising faster when commodity prices increase but also falling faster when they decline.
The following chart, from the Journal of Indexes article “Rethinking Investing in Commodities” (May/June 2010), shows the performance of three types of commodity-based indices since 2003 (equities, futures, and spot) and compares them to the return of the S&P 500. Equity-based commodity indices performed best during this period, followed by the spot index. Futures-based indices performed worst among the three but still managed to beat the return of the S&P 500 during that period by a small margin.
Source: Journal of Indexes, “Rethinking Investing in Commodities,” May/June 2010
In summary, not all commodity ETFs are created equal: they have different structures, risks, and returns. The following table shows a sample of commodity-based ETFs and ETNs across the three types of investment: equity-based, physically backed, and futures-based.
Read more: http://www.businessinsider.com/not-all-commodity-etfs-are-created-equal-2010-8#ixzz0x6Ogo7LG
The 56 Year Benner Cycle
The 56 year cycle mentioned yesterday (“Periods When to Make Money,” © 1883) was picked up by FT Alphaville; we hear it caused some “consternation” in certain circles where the marinating of ice cubes takes place.
I find these approaches quite fascinating, if for no other reason than I consider myself a student of market history. (Whether it is an actionable thesis is an entirely different question). For those of you who are also interested in such things, let’s explore this periodicity, better known as the Benner Cycle.
Samuel Benner was a prosperous farmer who was wiped out financially by the 1873 panic. When he tried to discern the causes of fluctuations in markets, he came across a large degree of cyclicality.
Benner eventually published his findings in an 1875 book -- Benner's Prophecies: Future Ups and Downs in Prices -- making business and commodity price forecasts for 1876-1904. Many (but not all) of these forecasts were fairly accurate.
The Benner Cycle includes:
- an 11-year cycle in corn and pig prices, with peaks alternating every 5 and 6 years (a short sketch generating these peak years follows the list);
- cotton prices, which moved in a cycle with peaks every 11 years;
- a 27-year cycle in pig iron prices, with lows every 11, 9, and 7 years and peaks in the order 8, 9, and 10 years.
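As promised, a toy Python sketch of that first periodicity -- my own reconstruction, with an illustrative starting year rather than one taken from Benner's actual chart -- shows how alternating 5- and 6-year gaps produce the 11-year repeat:

```python
# A toy reconstruction of the 11-year corn/pig cycle: peaks alternate
# 5 and 6 years apart, so the pattern repeats every 11 years. The starting
# year below is illustrative, not taken from Benner's published table.

def benner_peaks(start_year, count):
    """Return `count` peak years, with gaps alternating 5, 6, 5, 6, ..."""
    peaks, gaps = [start_year], (5, 6)
    for i in range(count - 1):
        peaks.append(peaks[-1] + gaps[i % 2])
    return peaks

print(benner_peaks(1877, 8))
# [1877, 1882, 1888, 1893, 1899, 1904, 1910, 1915]
```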
It makes some degree of intuitive sense that a farmer would recognize longer-term cycles. A farmer's entire year is based on the annual sowing/growing/reaping cycle; the 11-year solar cycle would certainly impact crop yields, revenue, etc. So looking at how variations in crop yields and prices impact the overall economy and markets makes lots of sense.
There are two caveats to all of these cyclical variants -- Gann, Elliott Wave, Fibonacci, Benner. First, there is insufficient data: we would really need 500 years of market history to have a data set robust enough to draw conclusions from. Second, there is the unfortunate tendency to form-fit after the fact (I see people doing this with Fibs especially). Many of the peaks and valleys are off by a year or two, even though the fit looks close; some of it might be explained by randomness.
Regardless, I think it is worth thinking about as a general long term framework -- and a reminder that the so-called 100 year floods come along much more frequently than the name implies . . .
Via Google Books
For those who wish to explore this further, you should check out David McMinn's THE BENNER CYCLE, FIBONACCI NUMBERS & THE NUMBER 56.
I am not a huge Prechter fan, but I found his book Prechter’s Perspectives very intriguing — it covers the long term political-socio-economic cycles of recession, war, recovery, expansion, bubble, etc. It is intellectually stimulating, but not exactly actionable . . .
Read more: http://www.businessinsider.com/benner-cycle-2010-8#ixzz0x6ONUVyY
Tuesday, August 17, 2010
Why Dell Paid $1.15 Billion For 3PAR
Dell plans to spend $1.15 billion in cash on data storage company 3PAR. Why? Dell must diversify its business.
As you can see in the chart below, Dell still gets 72% of its revenue from desktop PCs, laptops, and printers and monitors. Revenue from those divisions has been flat for the last five years, as HP and Acer have grown in popularity, taking Dell's sales.
To hedge against the softening PC market, Dell is diversifying by getting into data and services. Each of those parts of its business has seen modest growth in the last five years.
Despite the growth, Dell's services and storage business was $4.2 billion last quarter, less than 30% of its total revenue.
For comparison, HP generated $13.3 billion in sales from services and storage, or approximately 42% of its revenue, in its most recent quarter. Dell's services business accounts for 13% of its sales, versus 27% for HP.
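A quick back-of-the-envelope check in Python -- a sketch using only the figures quoted above -- shows the total quarterly revenue those shares imply:

```python
# Back-of-the-envelope check of the quarterly figures quoted above
# (all in billions of dollars; shares as cited in the post).
dell_services_storage, dell_share = 4.2, 0.30   # "less than 30%"
hp_services_storage, hp_share = 13.3, 0.42      # "approximately 42%"

print(f"implied Dell total revenue: > ${dell_services_storage / dell_share:.1f}B")
print(f"implied HP total revenue: ~ ${hp_services_storage / hp_share:.1f}B")
# implied Dell total revenue: > $14.0B
# implied HP total revenue: ~ $31.7B
```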
Read more: http://www.businessinsider.com/chart-of-the-day-dell-revenue-2010-8#ixzz0wotfN8Xp
Friday, August 13, 2010
What's Happening With GDP
Note from dshort: The index data is updated through August 10th. I've shortened the timeframe for the first chart so the daily volatility in the underlying Weighted Composite Index is easier to see.
For the past several months, the Consumer Metrics Institute's Daily Growth Index has been one of the most interesting data series I follow, and I recommend bookmarking the Institute's website. Their page of frequently asked questions is an excellent introduction to the service.
The charts below focus on the 'Trailing Quarter' Growth Index, which is computed as a 91-day moving average for the year-over-year growth/contraction of the Weighted Composite Index, an index that tracks near real-time consumer behavior in a wide range of consumption categories. The Growth Index is a calculated metric that smooths the volatility and gives a better sense of expansions and contractions in consumption.
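As a rough illustration of the mechanics, here is a minimal Python/pandas sketch on randomly generated stand-in data -- my own construction, not the Institute's methodology or code:

```python
# A minimal pandas sketch of the calculation described above: year-over-year
# growth of a daily index, smoothed with a 91-day trailing average. The data
# here is a random stand-in for the Weighted Composite Index; this shows the
# mechanics only, not the Institute's actual series or code.
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
days = pd.date_range("2005-01-01", "2010-08-10", freq="D")
composite = pd.Series(
    100 * np.exp(np.cumsum(rng.normal(0, 0.002, len(days)))),
    index=days, name="weighted_composite")

yoy_growth = composite.pct_change(365) * 100   # % change vs. a year earlier
growth_index = yoy_growth.rolling(91).mean()   # 'Trailing Quarter' Growth Index
print(growth_index.dropna().tail())
```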
The 91-day period is useful for comparison with key quarterly metrics such as GDP. Since the consumer accounts for over two-thirds of the US economy, one would expect that a well-crafted index of consumer behavior would serve as a leading indicator. As the chart suggests, during the five-year history of the index, it has generally lived up to that expectation. Actually, the chart understates the degree to which the Growth Index leads GDP. Why? Because the advance estimates for GDP are released a month after the end of the quarter in question, so the Growth Index lead time has been substantial.
Has the Growth Index also served as a leading indicator of the stock market? The next chart is an overlay of the index and the S&P 500. The Growth Index clearly peaked before the market in 2007 and bottomed in late August of 2008, over six months before the market low in March 2009.
The most recent peak in the Growth Index was around the first of September, 2009, almost eight months before the interim high in the S&P 500 on April 23rd. Since its peak, the Growth Index has declined dramatically and is now well into contraction territory.
It's important to remember that the Growth Index is a moving average of year-over-year expansion/contraction whereas the market is a continuous record of value. Even so, the pattern is remarkable. The question is whether the latest dip in the Growth Index is signaling a substantial market decline like in 2008-2009 or a buying opportunity like in June 2006.
The next chart is a three-way overlay — the 91-day Growth Index, GDP and the S&P 500. I've also highlighted the recession that officially began in December 2007 and unofficially ended last summer. As a leading indicator for GDP, the Growth Index also offers an early warning for possible recessions.
Preliminary Conclusion
The Consumer Metrics Institute's Growth Index hasn't been in operation very long, but thus far it has been an effective leading indicator of GDP. As such, the prospect of a double-dip recession, something that's happened only once since the Great Depression, remains a possibility.
Read more: http://www.businessinsider.com/a-quick-introduction-to-the-consumer-metrics-institute-growth-index-2010-8#ixzz0wRU1UZWL
Wednesday, August 11, 2010
Facebook's Location Product
CNET doesn't have much detail on the new product, but says Facebook's take on location will be centered around an API allowing existing location-aware apps to plug into the social network.
That's consistent with Facebook's strategy elsewhere (and is what we predicted back in May).
CNET calls the new API 'Facebook's Foursquare competitor', but we doubt Facebook will offer much that directly competes with check-in apps. Nothing Facebook has done to date gives it any experience in game mechanics or sourcing local deals. Making it essential for Foursquare to run its check-ins through the new API, so that check-in data can inform other apps running on the platform, is probably all Facebook is after.
Read more: http://www.businessinsider.com/facebooks-location-product-is-finally-on-the-way-2010-8#ixzz0wFkDLoUG
10 Things We Learned Today
- Amazon is looking to expand beyond e-Readers to new hardware and even once considered launching a mobile phone, according to the New York Times.
- The real reason behind HP CEO Mark Hurd's resignation is called into question, given differences between HP's statements and those of Hurd's own publicist.
- OKCupid releases a study saying that iPhone users have more sex than BlackBerry and Android users.
- Is your job still viable post-recession? Read about 23 jobs that show no signs of recovering after this recession.
- RIM works out a deal with Saudi authorities, granting the country access to BlackBerry Messenger.
- Newsweek loses another notable name, continuing the slow exodus of major Newsweek players.
- Twitter rolls out Fast Follow, allowing non-users to begin following Twitter users and receive their tweets through text message.
- Wired continues its criticism of Monday's joint statement between Google and Verizon on net neutrality.
- Facebook tidies up those annoying birthday wall greetings that clog up your news feed.
- Plastic Logic's e-Reader, Que ProReader, is discontinued -- the company is instead focusing on the next generation of the Que.
Read more: http://www.businessinsider.com/10-things-we-learned-today-2010-8-10#ixzz0wFjlOR5E
Friday, August 6, 2010
The Dollar Is About To Come Back
Worried about the soft patch in the US economy, and the attendant decline in the dollar?
Morgan Stanley's Stephen Hull would advise you not to worry:
Our core views for the remainder of 2010 remain unchanged, namely that after a period of weakness we think that the dollar is close to forming a bottom against the major currencies.
Following a string of weaker than expected economic data, expectations about the outlook for US activity have meant the dollar has been in the sweet spot for bears. Going forward, we do not expect the US economy to decouple from the major economies. We expect the dollar to recover via either US growth rebounding in the second half of 2010 or data weakening elsewhere, or both!
We forecast the US economy to grow by 3.4% and 3.3% on an annualised basis in the third and fourth quarter, higher than consensus estimates as Exhibit 1 shows. While that is our core view, it is also possible that the US is just leading a broader decline in global activity, and if that is the case then we should soon start to see weakness in other economies, which presumably might be associated with a period of risk aversion. If we are right with either of these outcomes, we would expect the dollar to recover from its recent selloff.
Image: Morgan Stanley
Read more: http://www.businessinsider.com/morgan-stanley-the-period-of-weakness-for-the-us-is-over-and-the-dollar-is-about-to-come-back-2010-8#ixzz0vmR64mBn