Phil Sim

Web, media, PR and… footy

The problem with mash-ups

I’ve always been very bullish on the idea of modular web services. It sounds logical, doesn’t it? Why reinvent the wheel when you can leverage what somebody else has already built? To a certain degree, it’s working. Widgets are getting increasingly popular and there are some resounding success stories for mash-ups like Google Maps.

However, plugging in someone else’s technology can also turn around and bite you in the butt.

In the latest version of our platform we have a module for managing editorial workflows. I’ve been consistently disappointed with rich text editors like TinyMCE: nothing ever seemed to render the way we wanted as soon as the content got even a little complicated. So, having been impressed with Zoho Writer, I jumped at the chance to embed it into our platform.

However, we hit a bug pretty quickly. If you defocus from the embedded Zoho Writer you pretty much kill it. We use custom AJAX tabs a lot, so this is a major bug for us. We redesigned the interface around it – though it has never quite been implemented how I wanted – but Zoho support let me know they were aware of the bug and that it would be fixed in the next API release.
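Our redesign essentially boiled down to snapshotting the editor’s content just before a tab is hidden and restoring it once the tab (and editor) comes back. A minimal sketch of that pattern – note the function names and callbacks here are illustrative placeholders, not part of Zoho’s actual embed API:

```javascript
// Cache of editor content, keyed by tab id, so switching away from an
// AJAX tab doesn't lose whatever was typed into the embedded editor.
const editorStateCache = {};

// Call just before a tab is hidden: snapshot the editor's current content.
// `getContent` is a hypothetical callback that reads from the embedded editor.
function saveEditorState(tabId, getContent) {
  editorStateCache[tabId] = getContent();
}

// Call after the tab is shown and the editor re-embedded: push the cached
// content back in via a hypothetical `setContent` callback.
function restoreEditorState(tabId, setContent) {
  if (tabId in editorStateCache) {
    setContent(editorStateCache[tabId]);
  }
}
```

It’s a kludge, but it at least keeps the user’s work intact until the underlying defocus bug is fixed upstream.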

That was three months ago, and I’ve just been told that the API release was de-prioritised because Zoho are working on a new release of the Writer product.

Now, I guess I can’t be upset with Zoho for doing this. I’m sure the embeddable Writer isn’t a great priority for them and not too many people would be crying out about this bug. But it has certainly raised a lot of questions in my mind about making that leap of faith to embed somebody else’s product into yours because you cede control over that element to someone who may not necessarily care about it.

I really don’t have the resources or the inclination to try and reproduce a product with all the sophistication of Zoho Writer, but maybe it’s worth sacrificing features if it means you retain that element of control.

I’ve run into similar problems with Drupal. You bet on deploying a module, put in a heap of work to get it up and running, and then you run into a showstopper. At least with Drupal, though, it’s open source, so you can do something about it and go and code your own fix.

So I’ll give it a lot of thought before I do this again. I had been looking forward to Zoho producing an API for their Chat product, which I was told was in the works, but maybe building your own is really the only option if it’s a key element of your platform.

Filed under: AJAX Challenge

Google credits users for outage

I just received in my inbox this mail from Google, admitting they had not met their SLA in August and would be crediting our account with a 15-day service extension. They will also be creating a dashboard so users can get better status information on the health of their systems.

We’re committed to making Google Apps Premier Edition a service on which your organization can depend. During the first half of August, we didn’t do this as well as we should have. We had three outages – on August 6, August 11, and August 15. The August 11 outage was experienced by nearly all Google Apps Premier users while the August 6 and 15 outages were minor and affected a very small number of Google Apps Premier users. As is typical of things associated with Google, these outages were the subject of much public commentary.

Through this note, we want to assure you that system reliability is a top priority at Google. When outages occur, Google engineers around the world are immediately mobilized to resolve the issue. We made mistakes in August, and we’re sorry. While we’re passionate about excellence, we can’t promise you a future that’s completely free of system interruptions. Instead, we promise you rapid resolution of any production problem; and more importantly, we promise you focused discipline on preventing recurrence of the same problem.

Given the production incidents that occurred in August, we’ll be extending the full SLA credit to all Google Apps Premier customers for the month of August, which represents a 15-day extension of your service. SLA credits will be applied to the new service term for accounts with a renewal order pending. This credit will be applied to your account automatically so there’s no action needed on your part.

We’ve also heard your guidance around the need for better communication when outages occur. Here are three things that we’re doing to make things better:

  1. We’re building a dashboard to provide you with system status information. This dashboard, which we aim to make available in a few months, will enable us to share the following information during an outage:
    1. A description of the problem, with emphasis on user impact. Our belief is during the course of an outage, we should be singularly focused on solving the problem. Solving production problems involves an investigative process that’s iterative. Until the problem is solved, we don’t have accurate information around root cause, much less corrective action, that will be particularly useful to you. Given this practical reality, we believe that informing you that a problem exists and assuring you that we’re working on resolving it is the useful thing to do.
    2. A continuously updated estimated time-to-resolution. Many of you have told us that it’s important to let you know when the problem will be solved. Once again, the answer is not always immediately known. In this case, we’ll provide regular updates to you as we progress through the troubleshooting process.
  2. In cases where your business requires more detailed information, we’ll provide a formal incident report within 48 hours of problem resolution. This incident report will contain the following information:

    a. business description of the problem, with emphasis on user impact;
    b. technical description of the problem, with emphasis on root cause;
    c. actions taken to solve the problem;
    d. actions taken or to be taken to prevent recurrence of the problem; and
    e. time line of the outage.

  3. In cases where your business requires an in-depth dialogue about the outage, we’ll support your internal communication process through participation in post-mortem calls with you and your management team.

Once again, thanks for your continued support and understanding.

The Google Apps Team

Filed under: AJAX Challenge

No network can be all things to all people

Earlier this week I spoke on a panel session held by AIMIA on the topic of online PR and one of the questions asked related to Open Social as a social platform.

Coincidentally, I’d earlier been discussing a similar topic with co-panelist Michael Henderson and our moderator Steven Noble regarding where social platforms were headed.

My response regarding Open Social was that I didn’t think it had legs. Like many, I was initially excited by the idea, but if these kinds of things haven’t got some level of traction early in the piece they just won’t fly, and Google simply doesn’t have the power in the social networking space to impose its will on anyone.

My conversation with Michael and Steven related to Facebook. I commented that I was visiting Facebook a lot less these days, and so were many of my friends, so the utility of the site was quickly diminishing for me. Michael disagreed, saying it was still very much his primary platform. Today, I noticed a tweet from Mark Jones, who noted that right now he checks Twitter first, email second and RSS third, whereas a year ago it was the other way around. And then you have A-list folk like Duncan Riley and Robert Scoble who have been sucked into the FriendFeed realm.

Another observation from the online PR panel was in relation to blogging. Hendo stated the belief in his presentation that blogging was on the decline. I tend to think that as the likes of WordPress and Typepad enable more social networking functionality, blogs are going to have a renaissance as a central aggregation point for many people’s online existence. And while I’m here, I’ll point to a post on the Inquisitr arguing that LinkedIn is really the undervalued social networking property. Oh, and there was this piece on TechCrunch noting that MySpace was still the dominant platform around music.

What all this seems to add up to is that there is never going to be a single, one-site-to-rule-them-all winner in the social networking space. I think Facebook has enough critical mass that it will be the fallback option for a large number of people: the place you go if you want to find somebody, and probably a destination that most people will visit every once in a while. But with the fizz dying down in relation to Facebook as an application platform, I don’t think a generic platform can deliver enough utility to be all things to all people.

It seems to me you can compare social networking sites to telephone directories. You’ve got your main directory (in Australia it’s the Yellow/White Pages), but then you have a whole host of directories about the place offering greater levels of depth and information (i.e. greater specialist utility).

Of course, there is the problem of data portability and interoperability – who wants to update six different site statuses, for instance? But at this stage it looks like all of these different networks are going to be held together by a kludge of hacks, APIs and web services.

At MediaConnect over the last 24 hours we’ve just started turning on integration with Twitter, Facebook, blogs and email. We recognise that while it would be nice for our site to be the be-all and end-all for all our users, that’s never going to be the case, so it’s critical that you play with as many different web services as makes sense. I think the only web model going forward is to offer a platform which delivers maximum utility for your core users, and then to integrate and make your service/information available on as many different platforms as you can. Right now, I’d like us to have presences on our rich application platform, on a lighter HTML version of the platform, on a mobile platform, as an email-driven application, and with almost full functionality available from sites like iGoogle and Netvibes.

We’ve had plenty of internal discussions about the merits of trying to keep people locked into our primary platform as much as possible, but I just don’t think that’s the smart way forward. I think you need to accept that no two people work the same way, plus we all have a legacy of various applications we’re in some way tied to. So you just need to play with as much as possible, as well as possible. I think this has been a major factor in the success of Rememberthemilk. It plays with almost everything – so there’s a good chance that no matter what combination of platforms you’re likely to use, you can somehow get at your to-do list.

In the end, you just need to remember that your core value proposition is the utility you offer your audience/community, not your interface. As all of these sites start to talk to each other, users are going to make choices, and those choices will likely be based on interface and familiarity as much as on features or services. Users will have to invest quite a bit of work to make their chosen interface work with the various services they would like to integrate with, so expecting them to give all that up is a big, big ask and will be too much for many of them.

I think most people will have three to five social networks they use regularly – Facebook, plus a social network or two based around their hobbies, plus one or two work-based networks. You’ll have conversations on all of these sites, but you’ll probably only have a maximum of two that are your main expression points and which you come to consider as your online identity. There will be enough interaction between them that you’ll easily slide between them, but not enough integration that you can give them all up and pull them into a single aggregation point (and who really wants that, right?).

Filed under: AJAX Challenge

Form Builder is key to online suite

Google Docs unveiled a much nicer version of its form builder and semi-detached it from spreadsheets.

I believe a Form Builder to be one of the most important applications in any online productivity suite. On a desktop you’re adding your own data, but when you go online you’re going to want to allow others to add data to your workflows. It once again goes to show that you can’t just transplant desktop apps onto the web – you need to reconsider what makes up an online productivity suite, because they should be very different beasts.

Anyway, I’m glad that the Google Docs team appears to understand the importance of their Form Builder. From their blog post on the new form builder and additional feature enhancements, titled ‘Forms move out of their parents basement’, engineer Andrew Bonventre notes: “Just wait until Forms is old enough to drive… Oh, the places we’ll go!”

I’d love to be able to create multi-page forms with different paths based on the user’s answers.

I’ll also point out, for heavy Docs users, an absolutely critical new feature that has been added to spreadsheets. The ‘importrange’ function allows you to import data from different spreadsheets. For one, this means you can have a spreadsheet that makes use of more than one form, but we’re also going to use it to build a master scoresheeting document that pulls data from the various sheets used by each department.
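For example, pulling a block of cells from another spreadsheet into the master document looks something like this (the spreadsheet key and range here are placeholders, and the exact argument format may differ – check the Docs help):

```
=IMPORTRANGE("abcd1234-spreadsheet-key", "Sheet1!A1:C10")
```

Drop a formula like that into a cell of the master sheet and the referenced range shows up there, staying in sync with the source spreadsheet.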

Filed under: Content Aggregation

Blogging again

I think I might have just taken my longest sabbatical from blogging for Squash.

Like many bloggers who also run businesses, you inevitably run out of time and energy, especially when you’re in the development/creation stage and you really need to give all your concentration and creativity to your project. I’ve been heads-down in that stage with MediaConnect for what seems like an age, but I’m just starting to come up for air again now and looking to turn our attention to biz dev and marketing.

Which will give me an excuse to start blogging with a bit of regularity again. 🙂

I do miss the thought process that blogging prompts me to engage in – to think broadly and analytically. But I tend to lose focus when I blog, as my thoughts and attention wander all over the blogosphere. It’s probably a good time for me to get my head out of my work and see what else is around before I dive into a new round of development. Not exactly sure how long I’ll keep it up, but with the League season coming to an end and my side, the Parramatta Eels, on the verge of bowing out, I’m likely to have a bit more time again and a need to channel my writing somewhere.

Mind you, I doubt I have any RSS subscribers left, so I’m not exactly sure who I’m talking to. If anyone is still out there, say hi!

Filed under: Blogs

What Google can do to stop plagiarism and save journalism

When people lament the decay of journalism in the online age, you’ll almost certainly hear Google thrown in as a primary contributor. Critics cite the echo-chamber effect of Google News and the need for journalists to spit out copy at a million miles an hour to keep up in the Google age as factors driving the craft of journalism into the ground.

Plagiarism will also often rear its ugly head, and the problem is getting worse and worse. Yesterday, on our MediaConnect/ITJourno site, we wrote about how journalist Asher Moses complained about a piece of his being pretty much pilfered by the Daily Mail. This piece about a local newspaper in the US made up almost entirely of plagiarised content ran prominently in online magazine Slate, along with this post on the Publish2 blog. Now, plagiarism pre-dates Google by centuries, but it’s probably a fair argument to suggest that the rise of Google-driven online media models has intensified the problem by an order of magnitude.

Google recently made a move to lower the ranking given to duplicated content, so if you use wire copy it’s less likely to show up prominently in Google News and on search engines. Surely that would act as a disincentive to plagiarise, you might say? No, it just forces unethical outlets to go to a little more effort to rework the plagiarised content so it’s not identical to the original. The Daily Mail story was a perfect example of a story which had been reworked, but the fact of the matter remains that it took slabs of content without any attribution.

Google has become such a powerful force that it heavily influences, some might say dictates, the directions and strategies of most online media companies. If Google changes its ranking algorithms, SEO gurus all over the world scramble to react.

So what would happen if Google used its power to remedy some of the issues its rise has created, especially in regard to plagiarism?

What if an outlet caught plagiarising content was immediately removed from the Google News index? And/or had its PageRank trashed?

What if Google set a benchmark for what it considered fair use and asked all those outlets who wished to be included in Google News to abide by those guidelines?

You’d solve the plagiarism problem almost overnight.

For many sites, being thrown out of Google News would instantly cripple them. No one would risk running plagiarised copy. In fact, editors might even get vigilant about checking that contributors’ pieces haven’t been overtly ‘inspired’ by the work of others. If getting busted for plagiarism potentially threatens your financial viability, what outlet wouldn’t attempt to put checks and balances in place? Who knows, sub-editors might even become employable again!

Google should demand that any re-used copy or quotes come with attribution to the original source, and that re-users adhere to any re-use guidelines that are published clearly and obviously on an article.

Some will say Google is but a search engine and it’s not its charter to decide what is and what isn’t plagiarism. But Google News already has rules laid around it. To be indexed you need to be a multi-authored content site, with easily attainable contact information and so forth. It’s a human-managed process that would only need to be expanded to take copyright guidelines into account.

Google also has to make policies and decisions relating to black-hat linking and SEO. If sites are caught buying links they can have their PageRank reset, so again, it’s not a big leap to do the same regarding what is essentially black-hat publishing.

And even if Google didn’t want to give itself that responsibility, it could instead work with an industry body established by large media companies and publishers, which could define guidelines and make decisions on when those rules have been broken. I would suggest those guidelines should put the emphasis on publishers to state their accepted re-use policy clearly, with those who choose to re-use copy simply needing to comply. If found to have not done so, the body would inform Google that the site has breached the guidelines, and then Google and other news aggregators that opted in to the system would remove said site from their indexes and/or downgrade its overall site ranking.

Legal solutions to this problem won’t work – the economics don’t add up. But that shouldn’t matter: we operate in a link economy and Google is already the overlord of the link, so this really would be just a natural extension of its responsibility to the Internet community and its media partners. And I guarantee it would put an end to commercial plagiarism almost immediately.

If you think this idea could float, I encourage you to spread the word. There are a lot of movements, like Data Portability, that the web community has managed to champion, but the media community seldom seems able to come together to do anything. Yet plagiarism is one massive problem that threatens everyone’s livelihood and, as described above, I believe it is relatively easily solved. Even if the media industry doesn’t go as far as forming an official industry group, a group could at least be formed to start engaging with companies like Google to work towards a solution.

Filed under: Plagiarism

@philipsim on Twitter
