Tuesday, May 08, 2007

India trying to save the world from yoga patents

Western governments are granting patents, trademarks, and copyrights over yoga to con-artists who claim to have invented the millennia-old practice. The Indian government is retaliating by publishing a giant, multi-lingual database of yoga-stuff so that patent examiners can see that "yoga didn't originate in a San Francisco commune."
The U.S. Patent and Trademark Office has issued 150 yoga-related copyrights, 134 patents on yoga accessories, and 2,315 yoga trademarks. There's big money in those pretzel twists and contortions - $3 billion a year in America alone. It's a mystery to most Indians that anybody can make that much money teaching knowledge that is not supposed to be bought or sold like sausages.

The Indian government is not laughing. It has set up a task force that is cataloging traditional knowledge, including ayurvedic remedies and hundreds of yoga poses, to protect them from being pirated and copyrighted by foreign hucksters. The data will be translated from ancient Sanskrit and Tamil texts, stored digitally, and available in five international languages, so that patent offices in other countries can see that yoga didn't originate in a San Francisco commune.

It is worth noting that the people in the forefront of the patenting of traditional Indian wisdom are Indians, mostly overseas. We know a business opportunity when we see one and have exported generations of gurus skilled in peddling enlightenment for a buck. But as Indians, they ought to know that the very idea of patenting knowledge is a gross violation of the tradition of yoga.

Read More...

Supreme Court Issues Two Important Patent Decisions

May 1, 2007 3:18 PM by Jason Mendelson

Clearly, the Supreme Court read my patent rant. Okay, maybe not, but I'd like to claim that they did. As many of you know, I have a real issue with the entire patent litigation system. As many of you also know, Brad and I are huge proponents of invalidating software patents, in general. We feel that they stifle innovation and are used mostly by unsavory folks trolling for dollars.

Today, the Supreme Court issued two important rulings. The first opinion deals with the concept of what is "obvious" under patent law. In a rare, rare situation, the court was unanimous. I haven't read the opinion (yet), but the news is reporting that they slapped down a federal appeals court that went too far in providing patent protection. Clearly the court is sending a message to the PTO that it believes too many patents are being granted.

In the second case, the Supremes endorsed US law that says US patents are not infringed upon if the products at issue are made and sold in other countries. In other words, foreign law pertains to goods sold in foreign countries.

It will be some time until we know how / if this actually affects our patent system as it stands today. For now, it's a step in the right direction.

Read More...

User Generated Objects: 3D Printing In The NYT

We've mentioned stereolithography, or 3D printing, on these pages several times in the last few years, and now the mainstream media seems to have got the bug too: the NYT has an article that reviews the current state of 3D printing and the prospect that maybe one day we'll be printing objects at home. They say:

3D Systems, a pioneer in the field, plans to introduce a three-dimensional printer later this year that will sell for $9,900. “We think we can deliver systems for under $2,000 in three to five years,” said Abe Reichental, the company’s chief executive. “That will open a market of people who are not just engineers — collectors, hobbyists, interior decorators.”
Even at today’s prices, uses for 3-D printers are multiplying. Colleges and high schools are buying them for design classes. Dental labs are using them to shape crowns and bridges. Doctors print models from CT scans to help plan complex surgery. Architects are printing three-dimensional models of their designs. And the Army Corps of Engineers used the technology to build a topographical map of New Orleans to help plan reconstruction...
“You could go to Mattel.com, download Barbie, scan your Mom’s head, slap the head on Barbie and print it out,” suggests Joe Shenberger, the director of sales for Desktop Factory. “You could have a true custom one-off toy.”

Beam It Down From the Web, Scotty - New York Times | PSFK Articles On 3D Printing

Read More...

Google Analytics Is Re-Launched: Do These Five Things First In V2

Complexity Simplified

Google announced the launch of Version 2 of Google Analytics today. Over the next few weeks Google will upgrade current GA users to the new version. Many of the most frequent GA users (big or small) already have access to V2 as of this morning (please log into your account and check).

Version 2 is so radically different and provides such a compelling value proposition to users of web analytics that I am excited to write a blog post about a product (the first time I have done this in the 11 months this blog has existed).

I am also the Analytics Evangelist for Google, but you’ll see that I am excited about GA V2 not because I consult for Google but because I believe that V2 is a leap forward for all of its current users and a new standard for the industry when it comes to interacting with complex web analytics data. Please share what you think in the comments at the end of this post.

Also, while this post is about GA V2, it contains examples of the best practices I have talked about frequently on this blog; I have just tried to do them with GA here. So if you use Omniture or WebTrends or WebSideStory or HBX or Visual Sciences or ClickTracks or indexTools or NetInsight or any other piece of web analytics software (and care only a little about GA), you’ll still find tangible examples of analysis you can do to find actionable insights. You can follow along and replicate these with your own web analytics tool.

Here are the five things you should do the first time you log into GA V2:

Summary:

  1. Notice the awesome new data interaction model.
  2. Take the enhanced “data discoverability” for a spin.
  3. Context is king! Find your context quickly.
  4. Ahh…. Segmentation is just a step away.
  5. Upgraded goodies: schedule and email any report or dashboard, better site overlay, much nicer page-level reporting and more.

Details:

# 1: Notice the awesome new data interaction model.

One of the most dramatic changes in Google Analytics V2 is the new immersive data interaction model. It sets a new benchmark for how users interact with data. It shifts the model from a few people digging long and hard to find insights to the many finding first-blush insights without much digging, with the power to quickly and easily dig deeper if they want to.

The V2 UI is completely new and is the centerpiece of this launch. It touches everything, from totally customizable dashboards to the overview reports to the presentation of the data and the “stories” that now go with the reports. The tool is easier to use, key metrics jump out at you and it is easier than ever to understand what is going on (if only all sexiness in the world were so productive! : )).

Here’s the new dashboard (notice the use of colors, fonts, content groupings etc. in service of quickly communicating with you):

Google Analytics Dashboard

All images in this post are linked to slightly higher resolution versions; go ahead and click on them. The screenshots represent real data for this blog.

Here is the new presentation of the split between New and Returning visitors (notice the small “story” under the graph, the use of colors and the layout of the table, and a quick, eye-catching view that communicates not just what happened last month but also “performance” vs. the site average):

Google Analytics: New vs. Returning Visitors

As you use the tool you’ll see more and more examples of effective communication of data via a very well-thought-out UI that is perhaps the best among all web analytics tools today.

People underestimate the value of being pretty. Our world of web analytics is already too complex; data is hard to parse and insights are harder still to come by. Effective (ok, pretty!) presentation of data goes a long way toward helping us understand trends and insights, and it significantly increases ease of use for both the lay person and the super analyst.

# 2: Take the enhanced “data discoverability” for a spin.

It is both a blessing and a curse that we have access to so much data. We can track and report a lot, but it is a non-trivial challenge to find all the metrics/pieces of data you need to unearth the nuggets of actionable insight.

The new version of Google Analytics does a great job of addressing this challenge by immensely improving what I call data discoverability. The key data you need is not hidden sixteen clicks away. Most of it has been surfaced so that it is staring at you already, or you can find it in two clicks.

For example, look at the Visitors Overview (click Visitors in the left navigation in GA):

google analytics v2 discover data

Notice that not only do you get a trend of Visitors to the site, but you also get all your key metrics in one “page view”. Furthermore, the next action is within easy reach: click on one of the many metrics you see under the graph, or use one of the Visitor Segmentation options suggested to you. You quickly get the whole story, and you also learn what else is there.

Did you notice the lovely sparklines? Somewhere Edward Tufte is smiling. :)

Here’s another example. I am deeper into my reports and want to see where my traffic is coming from. Easy report from any tool (and a report you should constantly look at).

google analytics v2 discover data

On one page you have three interesting ways to discover data (and find insights):

1) You can easily switch the “master metric trend” to one of the other metrics (and get a quick glimpse of the performance of your referring sites).

2) You can easily switch from “standard metrics” to “bottom-line metrics” (Conversion). Compare the image below to the image above: one-click access to clickstream (behavior) and outcomes.

google analytics v2 goal conversion

3) Your standard metrics are always there, even though you were looking for Visits, to prompt you to dig deeper (notice that Visits are doing ok but something happened in May that caused an increase in content consumption - pages/visit and time on site - and lowered the bounce rate).

As you use the tool you’ll find many little and big ways in which the new UI makes it easier for you to drill down, drill up and drill around.

# 3 Context is king! Find your context quickly.

On this blog we have highlighted the importance of relevant context in helping you make optimal decisions. The recent “how should analysts spend their day” post suggested that 20% of the time should be dedicated to staying plugged into that context.

The new version of Google Analytics provides several features that give you relevant context for the performance of your website metrics. I think these make it significantly easier for novices and experts alike to understand their data (which might lead to more insights). Let me share a couple of examples……

In my emetrics presentations I have talked about how key metrics are often “lonely” and need friends to highlight important opportunities and occurrences. No matter where you go in the new Google Analytics your metrics won’t be “lonely”; you won’t find many reports where you look at just one metric by itself. Lots of thought has been put into showing key metrics in context.

Here is an example. I am looking for the conversion rate for the last month, and sure enough it is easy to find (click on Traffic Sources, then Keywords, then switch the graph from Visits to Goal Conversion Rate):

google analytics v2 search keyword performance

Now notice something cool: not only do you have a trend for conversion rate on your website, but in the Site Usage area of the report you can see your key metrics for search traffic (numbers in bold black) and, this is fun, also a comparison (in smaller grey font) of your search metrics with your site-wide metrics. You easily get important context such as “the % of new visitors is higher for search but their page views per visit are lower”. Often we buy into the hype of search engines because we only look at one metric or the other; now you can get the whole picture, quickly.

And you don’t lose that valuable context as you drill down. In this case I drill down to the Conversion Rate for the top keyword from search engines:

google analytics v2 search keyword drill down

Even at a quick glance I know exactly how this keyword is performing, not just on Conversion Rate but on all the other important metrics (this is the top-performing keyword, yet it contributes only 1.41% of site visits!).

Data discoverability continues to be enhanced here as well: notice that right under the keyword are options to see performance for Total, Paid (PPC / SEM) and Non-Paid (Organic / SEO) traffic. You never have to leave the “page” to do any of this.

But perhaps the easiest way for you to get context about your performance is to simply compare it to a, well, comparable time period. With Version 2 this is easier than ever. You still have the boring calendar to choose your time periods from, but what I like better is the new Timeline feature, where I can drag two sliders to choose my date range. Very efficient…..

google analytics v2 context from time

As soon as I hit Apply Range I can see at a glance the trend of the main metric I was looking at (Visits) for the two time periods, but notice the changes for all the other metrics. My sparkline trends now show the two time periods. I also get the raw numbers for my key metrics, with helpful red and green indicators showing how each metric performed across those two time periods.

Again, in this case you can understand your performance better, and even at this high level the questions you should now ask of your data will bubble up.

The nice thing is that once you choose your timeline for comparison in any report, that comparison permeates all your reports, so you can start at a high level, drill down, and still have the valuable context. Here, for example, is a drill-down to the sources of traffic to the site, where I find the same timeline comparison…….

google analytics v2 time context for traffic sources

You can hover your mouse over the timeline to get daily performance, or you can easily look at the deltas for the key metrics (click on the image above to see how the context continues for your top sources and keywords, all on the same page - remember, the goal is for you not to have to dig around to get actionable insights).

# 4: Ahh…. Segmentation is just a step away.

Most of the reports you’ll see in Version 2 of Google Analytics provide easy access to segmentation options. For example, in this report, while looking at Direct Traffic, you can simply click the arrow next to Segment to see lots of segmentation options (including by a Value that you can define and pass to GA yourself - see the sketch at the end of this section):

Google Analytics v2: Segmentation

And here is another example: when you are deep in the bowels of long-term analysis, you can quickly see how these options (shown here in a composite image) would be very helpful:

Google Analytics v2: Segmentation
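
As an aside for the technically inclined: the user-defined Value mentioned above was set on the page by the tracking code. Here is a minimal sketch in the style of the urchin.js tracker of this era; the account ID and label are illustrative placeholders, so treat the details as assumptions rather than official guidance.

```javascript
// Minimal sketch, assuming the standard urchin.js GA tracker is already
// included on the page. __utmSetVar tags the visitor with a user-defined
// label that later shows up as a segmentation "Value" in reports.
_uacct = "UA-XXXXX-X";                  // illustrative placeholder account ID
urchinTracker();                        // standard page-tracking call
__utmSetVar("newsletter-subscriber");   // illustrative user-defined segment label
```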

# 5: Upgraded goodies: schedule and email reports/dashboards, better site overlay, much nicer page-level reporting and more.

You now have a very convenient way to share your analysis / insights with a wider group of people in your organization. Just select any report (even ones you have segmented, timeline-compared, etc.) and click the Email button…..

Google Analytics v2: Email and schedule reports

As you can see above, you can write a custom message, choose a convenient format (including extremely high-resolution PDFs) and set the schedule.

Site overlay gets a V2 upgrade as well; notice something new…..

Google Analytics v2 - Site Overlay

The site overlay report not only opens in a new window (where you can simply “surf your site”), but at the top you see a new “navigation bar” that lets you switch what you want the overlay to display. In the screenshot above the choices are Clicks, Goal Value (how much goal revenue each link drives) and Goals 1 and 2 (click density for the conversion goals you have set for your website). You can now get a great visual picture of how each page is performing.

Site overlay is one of the most underutilized reports in any tool; with V2 it gets better in Google Analytics and builds a foundation for future enhancements.

Page-level analysis has also gotten easier and much better in V2. As you’ll see in the screenshot below, you can choose the page you want and then look at a detailed summary, a navigation summary, entrance paths to the page, the external sources that referred traffic to it, or the keywords that drove search traffic to it.

Google Analytics v2: Page level analysis

The Entrance Paths report is particularly interesting. For example: how many people came to the product page, where did they go next, how many of those ended up in the shopping cart, and if not there, where did they end up? Good to understand and actionable (even though I am not a huge fan of site-level path analysis; that is not a good use of time).

Novice users (or experienced users!) will find it very convenient to locate help and definitions throughout the application. Just click on the question mark icon next to any metric you see or the Conversion University link next to any report.

Google Analytics v2: Help

Let us all resolve never to get confused about Hits, Visits, Visitors and Unique Visitors!! : )

So what should you do now? Can’t let you get away without action items:

  1. If you have used Google Analytics thus far, try the analysis above and give the new reports a spin; I guarantee that you’ll find the tool significantly easier to use and that you’ll discover your own little and big trends faster.
  2. If you have never used Google Analytics before, now is a good time to try. It is still free (sign up here) and you’ll see what a free tool can do that your current web analytics tool can’t. You may or may not decide to cancel your current tool subscription - that is a very personal choice - but you’ll make that decision from an informed position.

Closing thought: having been such a fan of Measure Map, I am super impressed with what Jeff and his team have delivered: a revolutionary step forward in how we interact with complex web data in our quest to find insights. But I am a greedy person and want a lot more! :) I am excited about the future possibilities of innovation and invention on top of this new platform.

What do you all think? Have you tried V2? Are you excited about what you have read above (if you made it this far)? Was this post by an Analytics Evangelist or a super-excited web analytics geek/blogger? Please share your thoughts and feedback via comments; I would love to hear what you all have to say. Thanks.

Read More...

Hot or Not Tears Itself Apart, Reinvents

When James Hong and Jim Young founded HotorNot in October 2000, they had no real plans for the service to be anything other than a fun site for a few friends. They turned a free low-end computer they received for setting up an etrade account into a web server, launched the site from their house in Mountain View, California, and emailed 40 friends. By the end of the day, 40,000 people had visited the site, which by then had 30-second load times.

It wasn’t too long before the service was hosted at RackSpace and the users were flooding in to rate user-uploaded pictures of themselves on a scale of 1-10. In January 2001 they added a dead simple dating site. Instead of reading endless profiles and trying to find a connection, users just say yes or no to a given picture. If it’s a yes, the other person is shown your picture the next time they look through profiles. If they like you as well, a connection is made.
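
The double opt-in mechanic is simple enough to sketch in a few lines of JavaScript; the data structure and names below are illustrative, not HotorNot's actual code.

```javascript
// Toy sketch of double opt-in matching: a connection is made only when
// both sides have independently said "yes" to each other.
var yesVotes = {}; // userId -> { userId: true } for everyone they said yes to

function sayYes(from, to) {
  (yesVotes[from] = yesVotes[from] || {})[to] = true;
  if (yesVotes[to] && yesVotes[to][from]) {
    console.log("Match: " + from + " and " + to + " are connected");
  }
}

sayYes("alice", "bob"); // no output yet; bob hasn't voted
sayYes("bob", "alice"); // -> Match: bob and alice are connected
```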

The Money Rolls In

Until last month, HotorNot was free up to the last crucial stage, when two people wanted to meet each other. At that point, one of the members (usually the man, Hong tells me) had to be a paid subscriber, at $6/month. Hong says their conversion rate was extremely high - 15% of active users eventually upgraded to premium accounts.

The premium revenue, plus advertising and fees for virtual flowers, soon topped $600,000 per month. Nearly all of that was profit for the two founders, who reportedly pocketed $20 million or so between them over the years. The company has never raised any outside funding.

Hong says they receive 2-3 emails per day telling them about marriages that resulted from an initial meeting on HotorNot.

In the last year, though, a few competitors popped up (see yesnomayb, a copy of the business model) and a number of free dating sites also started to eat away at traffic. Traffic began to drift sideways, and the developers were getting bored doing little more than site maintenance.

Going To A Free Model

That’s when Hong and Young decided to rip apart their business model and remove the requirement for members to have premium accounts to talk to each other. A month ago, the requirement was turned off, and about $500k/month in revenue disappeared overnight. The founders also turned the company into a proper “C” corporation and issued stock options for the first time to all employees.

(I can’t help thinking that if HotorNot took venture financing somewhere along the way, they would not have been able to get their board of directors to agree to this.)

Hong says this lit a fire under the company, which is now running on reserve cash of a few million dollars. So far things look good. Traffic jumped over 60% - 10 million people visited the site in the last month, up from 6 million the month before. Advertising and virtual gift revenue spiked, and the site now breaks even even though they killed their largest revenue stream.

Hong and Young aren’t stopping there. They have plans to expand the site greatly and say they will launch new products in the coming weeks.

Whether this works in the long run is yet to be seen. But the company wanted to try something new, and the founders took enough money off the table to be comfortable for life. Entrepreneurs tend to have a screwed up way of measuring risk - the more the better - and these guys are no exception.

Read More...

Deal Note: Scribd

from BGSL by umair

So Scribd is hot - really hot. If we accept the rumour, at the A-round Scribd achieved a valuation of >$20m. Not bad, given the fact that it hasn't exactly seen exponential growth in terms of attention share.

The hypothesis behind investing in Scribd is easy: a global repository of documents will garner an incredible attention share at any reasonable scale. Even at a measly Digg-scale, the revenue potential of a Scribd begins to be significant. And then there's the fact that Scribd ads can be hypertargeted...

Now, that's all well and good. In fact, what's really interesting about Scribd isn't the yawner of an investment thesis - but the fact that it's one of the few startups around that really pushes the definition of what media is. Can other people's documents really be a medium? What are the economics of that medium? Very interesting and thought-provoking questions.

But back to the IRR. I'm just not so sure of the key assumption behind the investment: that Scribd solves a problem that actually exists. Is there a supply of prosumers with "documents" leaping at the chance to share them? Initial attention share tells us very clearly - not yet. And even if there are, why wouldn't they just start a blog? YouTube had a clear monopoly on online video (at least usable online video). Scribd doesn't have the same clarity of market power.

Would I have taken a bet on Scribd anyways? Probably. Good ideas are (very) few and far between these days. And the potential upside of a Scribd is well worth the risk.

Let's discuss the sideline of a Scribd as host (essentially) for ripped-off books, magazines, etc. YouTube was in a legal grey area (ie, microchunks). Scribd isn't (which it acknowledges). Can Scribd exert pressure on publishers? Not unless it's in the grey area. But the larger point is that there are lots of other positionings to be explored - Scribd as Office meets community (which is what a lot of the buzz is about), Scribd as Digg-feeder, etc - which is what offsets the risk of the key and somewhat shaky assumption, and makes Scribd a fairly cool play that will be a lot of fun, if not an obvious game-changer.

Read More...

All Steamed Up

The blue lagoon near Reykjavik, Iceland.
Paco Cruz / Digital Press Photos / Newscom

Xianyang, China, was once a great place to live--during the Qin dynasty, anyway, more than 2,000 years ago. Since then, it has gone pretty much downhill. Today Xianyang is one of the most polluted cities in a very polluted country, partly as a result of the air-fouling coal that's burned to generate much of its power. The air in Reykjavík, by contrast, is crystal clear, because nothing is burned there. Iceland's capital gets 100% of its heat and 40% of its electricity from geothermal power. (The rest comes from hydropower.) The same forces that have scattered no fewer than 130 volcanoes across the tiny country bring molten rock relatively close to the surface everywhere. When this encounters underground water, it generates steam, which is tapped to produce clean, renewable electricity.

All of which explains why a group of engineers from the Icelandic power company Enex have left the pure air of Reykjavík behind to work in smoggy Xianyang. The ancient Chinese city might just have the geothermal resources to become the Reykjavík of the East. In December engineers from both countries completed the first stage of a joint venture that could eventually provide geothermal-powered heating to millions of people in Xianyang. If the project is successful, the city will eventually have the biggest such system in the world.

That would be good for everyone. Last year alone, China added 102 gigawatts to its electrical grid--roughly twice the total capacity of California's--and about 90% of that came from carbon-belching coal plants. Geothermal energy can at least make a start on cleaning up this mess. The China Energy Research Society expects 110 gigawatt hours (GWh) to be produced through geothermal power nationally by 2010, out of 2.7 million GWh in total. That's a tiny slice, but energy experts believe China has the potential to do much more. "There are geothermal resources in almost every province in China," says Ingvar Fridleifsson, director of the United Nations University Geothermal Training Program in Reykjavík. Geothermal pumps will even be used to heat and cool some of the venues at the 2008 Olympic Games in Beijing.

It's the Chinese government that has committed the country to tapping its geothermal potential. But as is often the case, it's newly entrepreneurial citizens who are making things happen. One Chinese student who studied geothermal technology in Reykjavík went home to transform what had been a peasant village into a model geothermal development, with housing, pools and a recreation park all heated geothermally. "People can say a lot of things about the Chinese government," says Hans Bragi Bernhardsson, head of China operations for Enex. "But if they decide to do something, they achieve it." In this case, let's hope so.

with reporting by Krista Mahr/Reykjavik

Read More...

Participation on Web 2.0 sites remains weak

Tue Apr 17, 2007 10:55PM EDT, By Eric Auchard

SAN FRANCISCO (Reuters) - Web 2.0, a catchphrase for the latest generation of Web sites where users contribute their own text, pictures and video content, is far less participatory than commonly assumed, a study showed on Tuesday.

A tiny 0.16 percent of visits to Google's top video-sharing site, YouTube, are by users seeking to upload video for others to watch, according to a study of online surfing data by Bill Tancer, an analyst with Web audience measurement firm Hitwise. Similarly, only two-tenths of one percent of visits to Flickr, a popular photo-editing site owned by Yahoo Inc., are to upload new photos, the Hitwise study found.

The vast majority of visitors are the Internet equivalent of the television generation's couch potatoes -- voyeurs who like to watch rather than create, Tancer's statistics show. Wikipedia, the anyone-can-edit online encyclopedia, is the one exception cited in the Hitwise study: 4.6 percent of all visits to Wikipedia pages are to edit entries on the site.

But despite relatively low user involvement, visits to Web 2.0-style sites have spiked 668 percent in two years, Tancer said. "Web 2.0 and participatory sites (are) really gaining traction," he told an audience of roughly 3,000 Internet entrepreneurs, developers and financiers attending the Web 2.0 Expo industry conference in San Francisco this week.

Web 2.0, a phrase popularized by conference organizer Tim O'Reilly, refers to the current generation of Web sites that seek to turn viewers into contributors by giving them tools to write, post, comment and upload their own creative work. Besides Wikipedia, other well-known Web 2.0 destinations are social network sites like News Corp.'s MySpace and Facebook and photo-sharing site Photobucket.

Visits by Web users to the category of participatory Web 2.0 sites account for 12 percent of U.S. Web activity, up from only 2 percent two years ago, the study showed. Web 2.0 photo-sharing sites now account for 56 percent of visits to all online photo sites. Of that, Photobucket alone accounts for 41 percent of the traffic, Hitwise data shows. An older, first generation of sites, now in the minority, are photo-finishing sites that give users the ability to store, share and print photos.

Read More...

Ad Industry Still Virtually Dumb

dumb ad wankers

Ok, ok - we all took the mickey out of all those brands that rushed into Second Life to an audience of no-one, but at least we could see the rationale behind what they did there. Those brands built experiences for consumers to interact with. Pretty sensible thinking, no?

So, what does the ad industry do next? It takes a step backwards and introduces spam to SL in the form of video billboards. Adverlab shows us some demo shots of brands like Dove appearing in the virtual world (yeah, because SL is full of fat birds).

And here is another point the developers AMPP might want to consider: up until now, for residents to watch video, they have had to click the Play button that appears at the bottom of the screen whenever video content is in the immediate vicinity.

Read More...

Art Hijacks Internet Ads

Artist Steve Lambert and the non-profit “R&D For The Public Domain” Eyebeam OpenLab coded a Firefox add-on that replaces ads on a website with contemporary art. While a prototype is up and running, they’re working to build a fully curated art database.

The project will be supported by a small website providing information on the current artists and curator, along with a schedule of past and upcoming AddArt shows. Each two-week show will include 5-8 artists selected by emerging and established curators. Images will have to be cropped to standard banner sizes or can be custom-made for the project. Artists can target specific sites (such as every ad on FoxNews.com) and/or default to any page on the internet with ads. One artist will be shown per page. The curatorial duty will be passed among curators through recommendations, word of mouth, and solicitations to the AddArt site.
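
The core trick is easy to picture in page-level JavaScript: find elements with standard ad-banner dimensions and swap in art sized to match. What follows is only an illustrative sketch of the idea, not AddArt's actual source; the sizes and the art-server URL are assumptions.

```javascript
// Illustrative sketch: replace images that have IAB-standard banner
// dimensions with same-sized artworks from a (hypothetical) art server.
var AD_SIZES = { "728x90": true, "468x60": true, "300x250": true, "160x600": true };
var imgs = document.getElementsByTagName("img");
for (var i = 0; i < imgs.length; i++) {
  var size = imgs[i].width + "x" + imgs[i].height;
  if (AD_SIZES[size]) {
    imgs[i].src = "http://example.org/addart/current-show/" + size + ".jpg";
  }
}
```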

Read More...

Monday, May 07, 2007

Apartments in Dubai - What you get for $10 Million ?

signature damac

Now that’s called luxury, and it is a perfect post for our Dream Homes section. Dubai’s Damac Properties is targeting the world’s super-rich with a collection of apartments priced at up to $10 million. The exclusive residences will each occupy an entire floor in two of Damac’s landmark skyscrapers: Ocean Heights in Dubai Marina and Lotus Heights in Business Bay. The apartments, known as Signature Residences, will have uninterrupted 360-degree views thanks to their lofty locations on the 67th to 75th floors of the towers. Each will be reached by its own private, voice-activated elevator. Every apartment comes with a maid’s room, four parking spaces, and its very own swimming pool, personal gymnasium and steam room.

Read More...

Top 5 javascript frameworks

By Justin Silverton

5) Yahoo! User Interface Library

The Yahoo! User Interface (YUI) Library is a set of utilities and controls, written in JavaScript, for building richly interactive web applications using techniques such as DOM scripting, DHTML and AJAX. The YUI Library also includes several core CSS resources. All components in the YUI Library have been released as open source under a BSD license and are free for all uses.

Features

Two different types of components are available: utilities and controls. The YUI utilities simplify in-browser development that relies on cross-browser DOM scripting, as all web applications with DHTML and AJAX characteristics do. The YUI Library controls provide highly interactive visual design elements for your web pages. These elements are created and managed entirely on the client side and never require a page refresh.

Utilities available:

  • Animation: Create “cinematic effects” on your pages by animating the position, size, opacity or other characteristics of page elements. These effects can be used to reinforce the user’s understanding of changes happening on the page.
  • Browser History Manager: Developers of rich internet applications want bookmarks to target not just pages but page states and they want the browser’s back button to operate meaningfully within their application’s screens. Browser History Manager provides bookmarking and back button control in rich internet applications.
  • Connection Manager: This utility library helps manage XMLHttpRequest (commonly referred to as AJAX) transactions in a cross-browser fashion, including integrated support for form posts, error handling and callbacks. Connection Manager also supports file uploading.
  • DataSource Utility: DataSource provides an interface for retrieving data from arrays, XHR services, and custom functions with integrated caching and Connection Manager support.
  • Dom Collection: The DOM Utility is an umbrella object comprising a variety of convenience methods for common DOM-scripting tasks, including element positioning and CSS style management.
  • Drag & Drop: Create draggable objects that can be picked up and dropped elsewhere on the page. You write code for the “interesting moments” that are triggered at each stage of the interaction (such as when a dragged object crosses over a target); the utility handles all the housekeeping and keeps things working smoothly in all supported browsers.

Controls available:

  • AutoComplete: The AutoComplete Control allows you to streamline user interactions involving text-entry; the control provides suggestion lists and type-ahead functionality based on a variety of data-source formats and supports server-side data-sources via XMLHttpRequest.
  • Button Control: The Button Control provides checkbox, radio button, submit and menu-button UI elements that are more impactful visually and more powerful programmatically than the browser’s built-in form widgets.
  • Calendar: The Calendar Control is a graphical, dynamic control used for date selection.
  • Container: The Container family of controls supports a variety of DHTML windowing patterns including Tooltip, Panel, Dialog and SimpleDialog. The Module and Overlay controls provide a platform for implementing additional, customized DHTML windowing patterns.
  • DataTable Control: DataTable leverages the semantic markup of the HTML table and enhances it with sorting, column-resizing, inline editing of data fields, and more.
  • Logger: The YUI Logger provides a quick and easy way to write log messages to an on-screen console, the FireBug extension for Firefox, or the Safari JavaScript console. Debug builds of YUI Library components are integrated with Logger to output messages for debugging implementations.
  • Menu: Application-style fly-out menus require just a few lines of code with the Menu Control. Menus can be generated entirely in JavaScript or can be layered on top of semantic unordered lists.

Download and more information: here

4) Prototype

Prototype is a JavaScript Framework that aims to ease development of dynamic web applications.

Featuring a unique, easy-to-use toolkit for class-driven development and the nicest Ajax library around, Prototype is quickly becoming the codebase of choice for web application developers everywhere.

Features

  • Easily deploy ajax applications: Besides simple requests, this module also deals in a smart way with JavaScript code returned from a server and provides helper classes for polling.
  • DOM extending: adds many convenience methods to elements returned by the $() function: for instance, you can write $('comments').addClassName('active').show() to get the element with the ID 'comments', add a class name to it and show it (if it was previously hidden).
  • Utilizes JSON (JavaScript Object Notation): JSON is a light-weight and fast alternative to XML in Ajax requests
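
As a quick taste of the features above, here is a minimal sketch combining Prototype's Ajax and DOM helpers; the 'comments' element and /comments URL are illustrative stand-ins.

```javascript
// Minimal Prototype sketch: fetch HTML into a container, then use the
// chainable helpers $() adds to highlight and reveal it.
new Ajax.Updater("comments", "/comments", {
  method: "get",
  onComplete: function () {
    $("comments").addClassName("active").show();
  }
});
```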

Download and more information here

3) Rico

Designed for building rich Internet applications.

Features

  • Animation Effects: provides responsive animation for smooth effects and transitions that can communicate change in richer ways than traditional web applications have explored before. Unlike most effects, Rico 2.0 animation can be interrupted, paused, resumed, or have other effects applied to it, enabling responsive interaction that the user does not have to wait on.
  • Styling: Rico provides several cinematic effects as well as some simple visual style effects in a very simple interface.
  • Drag And Drop: Desktop applications have long used drag and drop in their interfaces to simplify user interaction. Rico provides one of the simplest interfaces for enabling your web application to support drag and drop. Just register any HTML element or JavaScript object as a draggable and any other HTML element or JavaScript object as a drop zone and Rico handles the rest.
  • AJAX Support: Rico provides a very simple interface for registering Ajax request handlers as well as HTML elements or JavaScript objects as Ajax response objects. Multiple elements and/or objects may be updated as the result of one Ajax request.
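
To show how little code the drag-and-drop registration takes, here is a minimal sketch in the style of the Rico demos; it assumes prototype.js and rico.js are loaded and that 'card' and 'discardPile' are existing element IDs, so treat the exact API names as assumptions.

```javascript
// Minimal Rico-style sketch: register one element as draggable and
// another as a drop zone; Rico's dndMgr handles the rest.
dndMgr.registerDraggable(new Rico.Draggable("card", $("card")));
dndMgr.registerDropZone(new Rico.Dropzone($("discardPile")));
```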

Download and more information here

2) Qooxdoo

qooxdoo is one of the most comprehensive and innovative Open Source multipurpose AJAX frameworks, dual-licensed under LGPL/EPL. It includes support for professional JavaScript development, a state-of-the-art GUI toolkit and high-level client-server communication.

Features

  • Client detection: qooxdoo knows what browser is being used and makes this information available to you.
  • Browser abstraction: qooxdoo includes a browser abstraction layer which tries to abstract all browser specifics to one common “standard”. This simplifies the real coding of countless objects by allowing you to focus on what you want and not “how to want it”. The browser abstraction layer comes with some basic functions often needed when creating real GUIs. For example, runtime styles or positions (in multiple relations: page, client and screen) of each element in your document.
  • Advanced property implementation: qooxdoo supports “real” properties for objects. This means any class can define properties which the created instances should have. The addProperty handler also adds getter and setter functions. The only thing one needs to add - should you need it - is a modifier function.
  • Event Management: qooxdoo comes with its own event interface. This includes event registration and deregistration functions.

    Furthermore there is the possibility to call the target function in any object context. (The default is the object which defines the event listener.) The event system normalizes differences between the browsers, includes support for mousewheel, doubleclick and other fancy stuff. qooxdoo also comes with an advanced capture feature which allows you to capture all events when a user drags something around for example.
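
To make this concrete, here is a minimal sketch in the style of the 0.6.x-era qooxdoo API that was current when this was written; the exact class names changed between releases, so treat them as assumptions rather than gospel.

```javascript
// Minimal qooxdoo-style sketch (0.6.x-era API, assumed): create a button,
// attach a listener through qooxdoo's own event interface, and add the
// widget to the client document. Runs inside the framework's main() hook.
var button = new qx.ui.form.Button("Click me");
button.set({ left: 20, top: 20 });
button.addEventListener("execute", function (e) {
  alert("Button pressed");
});
qx.ui.core.ClientDocument.getInstance().add(button);
```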

Download and more information here

1) Dojo

Dojo allows you to easily build dynamic capabilities into web pages and any other environment that supports JavaScript sanely. You can use the components that Dojo provides to make your web sites more usable, responsive, and functional. With Dojo you can build degradable user interfaces more easily, prototype interactive widgets quickly, and animate transitions. You can use the lower-level APIs and compatibility layers from Dojo to write portable JavaScript and simplify complex scripts. Dojo’s event system, I/O APIs, and generic language enhancements form the basis of a powerful programming environment. You can use the Dojo build tools to write command-line unit tests for your JavaScript code. The Dojo build process helps you optimize your JavaScript for deployment by grouping sets of files together and reusing those groups through “profiles”.

Features

  • Multiple Points Of Entry: A fundamental concept in the design of Dojo is “multiple points of entry”. This term means that Dojo works very hard to make sure that users are able to start using Dojo at the level they are most comfortable with.
  • Interpreter Independence: Dojo tries very hard to ensure that it’s possible to support at least the very core of the system on as many JavaScript enabled platforms as possible. This will allow Dojo to serve as a “standard library” for JavaScript programmers as they move between client-side, server-side, and desktop programming environments.
  • Unifies several codebases: builds on several contributed code bases (nWidgets, Burstlib, and f(m)).
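
To ground this in code, here is a minimal sketch in the Dojo 0.4-era style this writeup describes, using dojo.io.bind, the I/O entry point of that generation; the URL and element ID are illustrative stand-ins.

```javascript
// Minimal Dojo 0.4-style sketch: load the I/O package, fetch a text
// resource and drop it into the page.
dojo.require("dojo.io.*");
dojo.io.bind({
  url: "/greeting.txt",
  mimetype: "text/plain",
  load: function (type, data, evt) {
    dojo.byId("out").innerHTML = data; // data is the response body
  },
  error: function (type, error) {
    alert("Request failed");
  }
});
```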

Download and more information here

Read More...

Whistle-Blower on Student Aid Is Vindicated

Published: May 7, 2007

WASHINGTON — When Jon Oberg, a Department of Education researcher, warned in 2003 that student lending companies were improperly collecting hundreds of millions in federal subsidies and suggested how to correct the problem, his supervisor told him to work on something else.

Jon Oberg, a former Department of Education researcher, warned that student loan companies were abusing a subsidy program and collecting millions in federal payments to which they were not entitled.

The department “does not have an intramural program of research on postsecondary education finance,” the supervisor, Grover Whitehurst, a political appointee, wrote in a November 2003 e-mail message to Mr. Oberg, a civil servant who was soon to retire. “In the 18 months you have remaining, I will expect your time and talents to be directed primarily to our business of conceptualizing, competing and monitoring research grants.”

For three more years, the vast overpayments continued. Education Secretary Rod Paige and his successor, Margaret Spellings, argued repeatedly that under existing law they were powerless to stop the payments and that it was Congress that needed to act. Then this past January, the department largely shut off the subsidies by sending a simple letter to lenders — the very measure Mr. Oberg had urged in 2003.

The story of Mr. Oberg’s effort to stop this hemorrhage of taxpayers’ money opens a window, lawmakers say, onto how the Bush administration repeatedly resisted calls to improve oversight of the $85 billion student loan industry. The department failed to halt the payments to lenders who had exploited loopholes to inflate their eligibility for subsidies on the student loans they issued.

Recent investigations by state attorneys general and Congress have highlighted how the department failed to clamp down on gifts and incentives that lenders offered to universities and their financial aid officers to get more student loans. Under this pressure, the department is now seeking to set new rules.

The subsidy payments that Mr. Oberg uncovered are another corner of the lending system on which the department long failed to act, critics say, letting millions of dollars flow from the public treasury to about a dozen lenders.

The department now says it did not fully understand the extent of the maneuvers the loan companies were making to get the subsidies until last September, when its inspector general investigated and issued a report detailing manipulations carried out by a Nebraska lender, Nelnet. The audit recommended that the department recover $278 million from the lender, but education officials instead reached a settlement allowing Nelnet to keep the money but cutting it off from further subsidies that it claimed it was eligible to receive.

Senator Edward M. Kennedy, Democrat of Massachusetts and chairman of the Senate education committee, has asked Ms. Spellings to turn over documents related to the settlement decision. She is likely to come under questioning about the Nelnet settlement on May 10, at a hearing of the House education committee.

Mr. Oberg, now retired, has a master’s degree from the University of Nebraska and a doctorate in political science from the Free University of Berlin. He is a former Navy officer, university professor, and aide to Senator J. James Exon, a Nebraska Democrat, from 1979 to 1984. He was an Education Department liaison to Congress under the Clinton administration.

The subsidy payment issue that came to preoccupy Mr. Oberg grew out of decisions Congress made in the 1980s to ensure that low-cost student loans were available at a time when the economy was souring. Lawmakers guaranteed nonprofit lenders a rate of return of 9.5 percent on student loans that were financed by tax-exempt bonds to protect the companies from spiraling costs.

Congress eliminated much of the subsidy program in 1993 because interest rates had dropped, but at that time retained the 9.5 percent return for existing loans. By 2002, lenders had devised ways to inflate the volume of loans for which they received the 9.5 percent subsidies. Congress closed one loophole in 2004, but lenders found others. Congress further restricted the subsidies in 2006.

In 1997, the Clinton administration proposed legislation to eliminate all references to the subsidies from the Higher Education Act in an effort to rein them in. Mr. Oberg took the legislation to Sally Stroup, who was then serving as senior aide to the Republican chairman of the House education committee.

“Sally told me there was no way that language was coming out,” Mr. Oberg recalled. “She didn’t give a reason — just forget it.” Ms. Stroup, who went on to become an assistant secretary of education in the Bush administration, and who is now back as an aide on Capitol Hill, did not return several phone calls and messages left for comment.

In 2000, Mr. Oberg transferred to the department’s research operation, and two years into the Bush administration, began to review the government filings of Nelnet and other lenders. He found that not only were payments to lenders rising rapidly, but also that the base amounts of the loans lenders were claiming as eligible for the 9.5 percent subsidies were exploding.

“Several big lending agencies were gaming the system,” Mr. Oberg said in a recent interview at his home in Rockville, Md.

He notified the Education Department’s inspector general’s office. He also told his superiors but felt they were brushing him off. So in November 2003, he wrote a memorandum for general distribution throughout the department warning that lender manipulations could cost the government billions unless stopped, and he recommended that the secretary could end the abuse with a letter to lenders clarifying government rules.

That is when his supervisor, Mr. Whitehurst, director of the department’s Institute for Education Sciences, stepped in. Mr. Whitehurst said that he had forwarded Mr. Oberg’s memorandum to appropriate senior officials, whom he declined to identify, but acknowledged that he “wasn’t real happy” because he considered Mr. Oberg’s research to be outside his job description.

“Plus, I didn’t understand the issues,” Mr. Whitehurst said recently. “In retrospect, it looks like he identified an important issue and came up with a reasonable solution. But it was Greek to me at the time — preferential interest rates on bonds? I didn’t know what he was doing, except that he wasn’t supposed to be doing it.”

He told Mr. Oberg to stop because he wanted him to be monitoring grants, not lending practices. Officials also rewrote Mr. Oberg’s job description, documents show, barring him from further research into the subsidies. Although Mr. Oberg was a civil servant, the Bush administration may have seen him as a holdover from the Clinton administration.

Mr. Oberg said he decided to continue his research in his free time because, “If you tell some people they can’t do something, they want to do it all the more.”

But when he requested from his own department data on payments to lenders, known in the bureaucracy as the 9.5 percent Special Allowance Payments, Donald Conner, an analyst in the department’s postsecondary division, e-mailed Mr. Oberg saying, "I’m not permitted to give any 9.5 percent SAP information."

Mr. Whitehurst, in an interview, suggested that Mr. Oberg was viewed by some senior officials as an annoyance. “I was told he was like a dog on a bone, agitating on this issue,” Mr. Whitehurst said. Ms. Spellings did not reply to a memorandum Mr. Oberg sent her about waste in the loan program just before his 2005 retirement, Mr. Oberg said.

But Mr. Oberg’s warnings prompted a clamor in Congress and a string of reports by government investigators calling for a stop to the giveaways. Senior department officials disputed or declined to follow the recommendations of all of them.

A 2004 report by the Government Accountability Office urged the department to rewrite its regulations to save billions of dollars in future loan subsidy payments. But Ms. Stroup, who had once worked for one of the lending companies that is now under investigation for the subsidies, argued in response that it would be simpler for Congress to clamp down with new legislation. Mr. Paige repeated that argument in a letter to Mr. Kennedy, who was pressing the department to curb the subsidies.

Then, in 2005, the Education Department’s inspector general recommended that $36 million be recovered from a New Mexico lender. Ms. Spellings overruled the finding that the payments were improper and declined to recover the payments. And in January 2007, after the inspector general recommended that $278 million in overpayments be recovered from Nelnet, the department instead reached a settlement under which Nelnet could keep the money — if it dropped plans to bill the department for another $800 million in subsidies.

Nelnet was the nation’s most generous corporate donor to the National Republican Congressional Committee in 2006, and its top three executives were the largest individual donors to the committee as well, according to the nonprofit Center for Responsive Politics.

Nelnet was also well connected at the department. Don Bouc, Nelnet’s president through 2004 and president emeritus thereafter, sat on the department’s Advisory Committee on Student Financial Assistance from 2001 through Feb. 1 of this year, even while the department was auditing the company’s subsidies and negotiating the settlement. Mr. Bouc resigned from the committee 11 days after the department announced that it would not seek to recover the $278 million.

Ben Kiser, a Nelnet spokesman, said Mr. Bouc’s service for the committee was unrelated to the audit.

Robert Shireman, a researcher in Berkeley, Calif., who co-authored a private nonprofit group’s 2004 report on the subsidies called “Money for Nothing,” said, “There has been an outrageous lack of interest at the Education Department in doing anything to stop the bleeding.”

Then this January, turning to a measure Mr. Oberg had recommended in 2003, the department issued a “subregulatory guidance” letter cutting off subsidy payments to all lenders except those who prove their eligibility with an audit.

Kristin D. Conklin, a senior adviser at the department, said the department had been unaware, until its inspector general issued its Nelnet audit last September, that lenders were collecting subsidy payments on loans that were clearly ineligible.

That audit documented how Nelnet had transferred loans repeatedly into and out of tax-exempt bonds issued before 1993 to expand the volume of loans eligible for the subsidies. The audit identified so-called first-generation loans, financed from the pre-1993 bonds, and second-generation loans, financed from the proceeds of the first-generation loans, as eligible for the government subsidies. It said later-generation loans were ineligible.

“It’s not like we were sitting on this big problem and didn’t address it,” Ms. Conklin said. “We didn’t know the extent to which these third- and fourth-generation loans were being used. The full scope of this problem first became known to us in September, and we moved seriously to address it in the following months.”

Ms. Conklin also said the department had previously lacked the power to cut off overpayments using a simple letter. Only intervening legislation passed by Congress made that possible, she said. Today, with Mr. Oberg’s predictions proven accurate, he has become a bit of a celebrity. Mr. Kennedy arranged his testimony before the Senate in February.

“Taxpayers owe a tip of the hat to former Nebraskan Jon Oberg, who blew the whistle on the scheme that allowed companies to grab hundreds of millions in subsidies,” the Lincoln Journal Star wrote in October.

Read More...

HOWTO own a 128-bit number!

Would you like to be the exclusive owner of a number, with the right to sue other people for knowing your number or telling other people what it is? Now you can.

Last week, the AACS consortium made history by issuing legal threats against the 1.8 million web-pages (and counting) that mentioned its secret code for preventing HD-DVD discs from being copied.

In effect, AACS-LA (the AACS Licensing Authority) claimed that it owned a randomly chosen 128-bit number, and that anyone who possessed or transmitted that number was breaking the law. Moreover, it claimed to own millions more random numbers -- claimed that the US Digital Millennium Copyright Act, which criminalises telling people how to break anti-copying software, gave it exclusive dominion over its many keys.

Why should the AACS get all the fun? Princeton prof Ed Felten has come up with a great way of giving out legally protected 128-bit numbers to anyone who wants them. If he gives out 2^128 of these, then all 128-bit numbers will be owned and no one will ever be able to use a 128-bit key without breaking the law. Good times.

Here’s how we do it. First, we generate a fresh pseudorandom integer, just for you. Then we use your integer to encrypt a copyrighted haiku, thereby transforming your integer into a circumvention device capable of decrypting the haiku without your permission. We then give you all of our rights to decrypt the haiku using your integer. The DMCA does the rest.

The haiku is copyright 2007 by Edward W. Felten:

We own integers,
Says AACS LA.
You can own one too.
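
For the curious, the whole ritual fits in a few lines. Here is a toy re-creation in Node-style JavaScript (not Felten's actual code): mint a fresh 128-bit key and use it to encrypt the haiku, turning the key into a "circumvention device".

```javascript
// Toy re-creation: generate a random 128-bit number and use it as an
// AES-128 key to encrypt the copyrighted haiku.
var crypto = require("crypto");

var key = crypto.randomBytes(16); // your very own 128-bit number
var iv = crypto.randomBytes(16);  // initialization vector for CBC mode
var haiku = "We own integers, / Says AACS LA. / You can own one too.";

var cipher = crypto.createCipheriv("aes-128-cbc", key, iv);
var ciphertext = cipher.update(haiku, "utf8", "hex") + cipher.final("hex");

console.log("Your number:     " + key.toString("hex").toUpperCase());
console.log("Encrypted haiku: " + ciphertext);
```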

My number is AF BC 9C 5D DA 6B 7A A8 7C 33 A1 2B E7 D3 EA 11. You aren't allowed to know this number. I also reloaded the page and generated a few more numbers. I'm not telling you what they are, but I'll be setting up a Google alert for them and if I catch you using them, I'm gonna take your house away. Link

See also:

  • AACS vows to fight people who publish the key
  • AACS DRM body censors Cory's class blog
  • Digg users revolt over AACS key
  • Secret AACS numbers, the photoshopped edition
  • Side effect of AACS turmoil: MSM turns on Web 2.0? UPDATED
  • Blu-Ray AND HD-DVD broken - processing keys extracted
  • EFF explains the law on AACS keys
  • More AACS spoofs: WOW protest, and PSA vid: Think Before You Post
  • HD-DVD/Blu-Ray cracker muslix64 interviewed
  • Web-page aggregates links to "forbidden numbers" used to break HD-DVD

Read More...

Photobucket Was A Steal v. Google/YouTube

By almost any measure, MySpace got Photobucket for an absolute steal when compared to the Google YouTube deal. The companies are somewhat comparable - both have very large libraries of user-created videos, and both built their business on the back of MySpace. Photobucket also has a huge library of shared photos, a business YouTube never entered.

Google paid $1.65 billion in stock for YouTube. By the time the deal closed, the Google stock was worth nearly $1.8 billion. Photobucket is being acquired for just less than 1/5 of that: $250 million plus an earnout of up to $50 million.

At the time of the announcement of their acquisition in October 2006 YouTube had very little revenue. Photobucket, however, is on track to blow through their projection of $25 million this year.

Also, the relative sizes of the two companies aren’t that far off. At the time of the acquisition, Comscore suggested that YouTube had approximately 25 million U.S. monthly visitors. Today, Photobucket has around 20 million U.S. monthly visitors, or 80% of what YouTube had when it was acquired.

Photobucket has 40 million registered users and is gaining another 85,000 or so per day. Their users are highly active, and upload a lot of content to the network. YouTube’s registered users were far below Photobucket’s 40 million at the time of their acquisition. YouTube had (and still has) a lot of traffic coming to the site to view videos, but far fewer users actually creating and posting content.

Leaving revenue aside, the traffic numbers indicate a comparable price of $1.3 billion for Photobucket, 4x the price they actually received from MySpace. To look at this another way, YouTube was paid about $67 per unique visitor. Photobucket got just $13.
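
For anyone who wants to check the math, here is the back-of-the-envelope arithmetic behind those per-visitor figures, using the prices and visitor counts quoted above (rounding explains the small differences):

```javascript
// Price-per-unique-visitor comparison using the figures quoted above.
var youtubePrice = 1.65e9, youtubeVisitors = 25e6; // at acquisition
var pbPrice = 250e6, pbVisitors = 20e6;            // excluding the earnout

console.log(youtubePrice / youtubeVisitors); // 66   -> roughly $67 per visitor
console.log(pbPrice / pbVisitors);           // 12.5 -> roughly $13 per visitor

// What Photobucket would fetch at YouTube's per-visitor rate:
console.log(pbVisitors * (youtubePrice / youtubeVisitors)); // 1.32e9, ~ $1.3 billion
```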

Did Google overpay for YouTube? Did MySpace get Photobucket for a steal? Perhaps both. But in the end, being no. 1 in a category means you get a premium on acquisition. In the case of YouTube, that premium seems to be about 4x.

Another factor: Photobucket just didn’t generate the bidding hype that YouTube saw. It looks like the final bidders were IAC and MySpace, with a number of other bidders falling off in the last few weeks (perhaps spooked by the MySpace blockage of Photobucket videos).

In a year or so this deal is likely to look as brilliant for NewsCorp (which owns MySpace) as the MySpace acquisition was. Some would argue that they play dirty poker, but blocking Photobucket at a crucial point in the acquisition negotiations was a brilliant move, and may have shaved hundreds of millions of dollars off of the purchase price.

Read More...