You can’t do anything “over REST”

Sometimes you can let things slide, but there are other times when a term is used so incorrectly that it has to be called out. One thing that always gets me is the gross misuse of the term REST. If you know what Representational State Transfer (REST) means, then you also know that, although the REST architectural style is commonly used with HTTP, it is not bound to any specific protocol.

One of the things that sets my head spinning is how often “REST” is used in place of what people really mean, just to toss out a buzzword. When I’m involved in technical discussions or reading articles on the web, I start to feel like Inigo Montoya whenever I hear the term “REST”. More often than not, the speaker is actually referring to HTTP, or even HTTPS, but you can never be too sure. Here are a few of my favorite statements:

We’ll send it over REST

Oh no you won’t! Given that REST is not a protocol, I find this kind of statement simply mind-boggling. One can assume that the speaker would like to return data over HTTP. However, it is entirely possible to create a RESTful application over other protocols, such as XMPP or RMI. It helps to be specific when you’re involved in a technical discussion.

We’ll make a REST request

Are you sure? What exactly does a REST request look like? If you can’t request data from a URI like rest://example.com/foo, then you’re not making a “REST request.” As stated above, be specific as to what protocol you’re using.

We’ll return it as a REST object

This one pains me more than the other two. Seriously, what kind of “object” is RESTful? Is it XML, JSON, binary, what? Again, there is no such thing. There are only resources and representations, and it’s the representations of those resources that you need to be specific about. What, exactly, are you sending over the wire?
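
To make the point concrete, here’s a minimal sketch in Java of what’s actually happening on the wire. The resource URI is made up, but the mechanics are plain HTTP: the Accept header asks for a particular representation, and the Content-Type of the response tells you which representation you actually got. No “REST object” in sight.

    import java.io.BufferedReader;
    import java.io.InputStreamReader;
    import java.net.HttpURLConnection;
    import java.net.URL;

    public class RepresentationExample {
        public static void main(String[] args) throws Exception {
            // A hypothetical resource; what comes back is a representation of it.
            URL url = new URL("http://example.com/orders/42");
            HttpURLConnection conn = (HttpURLConnection) url.openConnection();
            conn.setRequestMethod("GET");

            // Ask the server for a JSON representation...
            conn.setRequestProperty("Accept", "application/json");

            // ...and let the Content-Type header tell us what we really received.
            System.out.println("Status: " + conn.getResponseCode());
            System.out.println("Representation: " + conn.getContentType());

            BufferedReader in = new BufferedReader(
                    new InputStreamReader(conn.getInputStream()));
            for (String line; (line = in.readLine()) != null; ) {
                System.out.println(line);
            }
            in.close();
        }
    }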

We’ll just add some methods to our REST server

OMFG! For real, a REST server? Even though Facebook claims to have one of those, that doesn’t make improper use of the term valid. You can’t serve “REST,” plain and simple.

Just so that I can continue beating the horse: you can’t send jack over REST. REST is not HTTP, and HTTP is not REST. If you have a web API that you’re exposing over HTTP in a RESTful fashion, why can’t it just be called an HTTP service or an API server? Correct use of the term REST is just as important as implementing a RESTful application correctly. Sadly, the same folks who use the term REST incorrectly are usually not creating applications that can claim to be RESTful either.

Why free software shouldn’t depend on Richard M. Stallman’s advice

There’s been a long-running rant about how using Mono is, um, bad. But I just don’t get it. Now we have Richard M. Stallman coming out against Mono and C# with an argument that sounds kinda like “we shouldn’t use it just because we shouldn’t.” Hmm, OK. [OK, that is way too much of an oversimplification and takes some things out of context. However, I’m still not sure what’s bad: C#, Mono, or both?]

The odd thing about the post is that it focuses on C# but none of the other languages that the Mono CLR supports. Second, he goes on to state that “If we lose the use of C#, we will lose them [the applications] too.” Given that C# is an ECMA standard (as is the CLI itself), I think the concerns about not being able to use C# are unwarranted. If we have to worry that ECMA would allow Microsoft to pull rank on C#, then web developers should be rethinking their use of JavaScript.

But the weird thing is that Stallman doesn’t make the same point about any of the other languages that the Mono CLR supports. For example, if Tomboy were written in the Boo programming language but remained on the Mono CLR, would everything be OK? Why is there such a profound hatred of C# and not of the other languages supported by the CLR? Why not come out against the use of the CIL? Or is Stallman just not making his point clear enough?

As someone who uses Ubuntu 9.04 on a daily basis, I can appreciate what Mono has to offer from an end-user perspective. I’m a HUGE fan of GNOME Do, which has turned out to be a better implementation of Quicksilver than Quicksilver. Then of course there’s Banshee, which is blossoming into an excellent media player. And there’s also F-Spot for photo management. I could go on, but the point here is that there are a lot of really great GNOME applications that happen to be built on Mono.

Overall, I find that the post is weak on sound technical and legal arguments and high as a kite on FUD. Where’s the meat? Specifically, what could Microsoft go after, other than GNOME itself, if people started rewriting Mono applications in C++? Jo Shields has a lengthy, but excellent, post called Why Mono Doesn’t Suck. It makes a lot of really good points about Mono, provided you don’t have a short attention span.

In the end, I think that Mono is ultimately a good thing for Linux on the desktop. Anything that gives developers better productivity and more choice is a good thing. Part of being free is being able to make a choice: we should be free to choose whether or not we actually want to use applications developed with Mono.

MAPI Support in Evolution is Far From Stable

For the past two weeks, I’ve been enjoying 64-bit Ubuntu 9.04 on my HP 8530w. It’s very fast, and ext4 helps considerably with boot times. Overall, I find this release to be pretty good. But there’s one item that irks me to no end, and that is the much-touted “MAPI” support for Evolution.

For starters, the MAPI support is not installed by default; it’s a post-install add-on. No big deal really, but it is misleading to say that it’s included in the release when it’s actually just in the repos. Next up is the fact that it simply doesn’t work. The Evolution-MAPI plugin is alpha quality at best. Just setting up an account is busted: each time I’d go to authenticate, Evolution would simply crash. Then I found a post which suggested using the IP address instead of the host name, which actually worked.

When Evolution connects to the Exchange server, the initial load is PAINFULLY slow. The folder structure is weird as well; in my case, my inbox was buried under four other sub-folders. The good news is that I can see email, with caveats. For example, replies and forwards are not prefixed with “re:” or “fw:”, even if they were in Outlook. Likewise, meeting acceptances and declines don’t get prefixed with “accepted:” or “declined:”; you just see the subject of the original meeting request. Which brings me to calendars and contacts. While I can see my calendars, they don’t get translated into my local time zone, which is kind of a problem. Contacts kinda work: I can see some of them, but for the most part, Evolution crashes before I can successfully select a contact.

Yeah, I know I should probably file some bugs. But be advised that if you’re upgrading to 9.04 to gain Exchange 2007 support, this isn’t the release you want. Here’s hoping that things improve significantly in Karmic Koala.

Eclipse 3.5RC3 Gives New Life to SWT on OS X

I’ve been a long-time user of Eclipse, but also a critic of SWT, the UI toolkit that Eclipse uses. While Eclipse has always been a very productive tool on OS X, SWT on that platform has always lagged a bit behind the others. But the great thing about Eclipse 3.5, and the SWT release that comes with it, is that it now uses Cocoa instead of Carbon. Additionally, the SWT folks have paid attention to a lot of little details, such as sheets and Mac-style drag-and-drop indicators, to name a few. There’s a lot that’s gone into this SWT release that makes me rethink my position on Swing.
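
As a rough illustration of the kind of detail I mean, here’s a minimal sketch (based on my reading of the 3.5 API, so treat it as a sketch rather than gospel) of how a dialog can request the new sheet behavior via the SWT.SHEET style hint; on platforms that don’t have sheets, the hint should simply fall back to an ordinary modal dialog.

    import org.eclipse.swt.SWT;
    import org.eclipse.swt.widgets.Display;
    import org.eclipse.swt.widgets.MessageBox;
    import org.eclipse.swt.widgets.Shell;

    public class SheetExample {
        public static void main(String[] args) {
            Display display = new Display();
            Shell shell = new Shell(display);
            shell.setText("Sheet demo");
            shell.setSize(400, 300);
            shell.open();

            // SWT.SHEET asks the Cocoa port to present the dialog as a sheet
            // attached to the parent shell's title bar.
            MessageBox box = new MessageBox(shell,
                    SWT.ICON_QUESTION | SWT.YES | SWT.NO | SWT.SHEET);
            box.setMessage("Save changes before closing?");
            int answer = box.open();
            System.out.println(answer == SWT.YES ? "Saving..." : "Discarding.");

            while (!shell.isDisposed()) {
                if (!display.readAndDispatch()) {
                    display.sleep();
                }
            }
            display.dispose();
        }
    }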

One other thing to point out is that the Eclipse 3.5 release candidates are friggin’ snappy as hell. Startup times on all three platforms are very good and responsiveness is simply better overall. This is a release I’m really looking forward to.

NBC Cancels Life, Shoots Self in Foot

Well, it’s official: Life has been canceled. It was such a great show, and now it’s gone. Considering that NBC put it on at a time when two ratings juggernauts (Lost and American Idol) were already dominating Wednesday night, it’s no wonder Life did lousy ratings-wise. Yep, I watched Lost on Wednesdays, but religiously DVR’d Life. But whatever.

Well, there’s not much else worth watching on NBC nowadays other than 30 Rock and Southland. I’d throw Heroes in there too, but Heroes is a mere shadow of its former self. You’re dropping the ball, NBC.

Semantic Web research publications need to be more “webbish”

Over the past few weeks, I’ve been taking a deep dive into the Semantic Web. As some will tell you, a number of scalability and performance issues with Semantic Web frameworks have not been fully addressed. While that’s true to some extent, there is a large amount of quality research out there; it’s just somewhat difficult to find. Part of the reason is that much of this research is published on the web as PDF. To add insult to injury, these are PDFs with neither associated metadata nor hyperlinks to related research (which is not to say that prior research isn’t properly cited).

Does this seem slightly ironic? The Semantic Web is built on the hypertext-driven nature of the web. Even though PDF is a readable and relatively open format, it’s still a rather sucktastic hypertext and on-screen format (especially when it’s published in a two-column layout). PDFs are an agnostic means of publishing a document that was authored in something like Microsoft Word or LaTeX. They are often generated as an afterthought, and most authors don’t take the time to properly hyperlink them to external sources. Why is this bad? On today’s document web (or rather, the “PageRanked” web), it makes this kind of information that much harder to locate. In a sense, this is akin to not eating your own dog food. If knowledge isn’t properly linked into the web as it works today, it effectively doesn’t exist. Guys like Roy T. Fielding get this, and it’s most likely why his dissertation is available as HTML as well as in PDF variants.

As a friendly suggestion to the Semantic Web research community: please consider using XHTML to publish your research, or even better, XHTML with RDFa. Additionally, leverage hyperlinks to cite related or relevant works. It’s no longer enough to rely on a purely textual bibliography or footnotes. The document web needs hyperlinks.
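
To sketch what that could look like (all of the URIs below are placeholders, and the Dublin Core property is just one reasonable choice of vocabulary), a citation in an XHTML+RDFa paper can be an ordinary hyperlink that also yields a machine-readable statement:

    <div xmlns:dcterms="http://purl.org/dc/terms/"
         about="http://example.org/papers/scalable-rdf-stores">
      <p>
        Our approach builds on the storage model described in
        <a rel="dcterms:references"
           href="http://example.org/papers/earlier-storage-model">this earlier work</a>.
      </p>
    </div>

A human reader just sees a link; an RDFa-aware crawler also learns that one paper references the other, which is exactly the kind of linking today’s document web can index and follow.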

There’s a lot of cool and important stuff being worked on, but this knowledge is not being properly disseminated. No doubt, in some cases publishing to two different formats can be a lot of work. But in the long term, the payoff is that the information is widely available and the Semantic Web is being leveraged as it was meant to be.

Blu-ray at 1080p is better than your local theater

This weekend, we decided to take our daughter to her first movie in a real theater. It was the first time my wife and I had set foot in a movie theater in about three years (yeah, we hadn’t been out to see a movie since she was born). Sadly, we quickly realized that we hadn’t missed much at all.

If memory serves me correctly, the Loews at the Loop was actually a relatively decent theater, at least three years ago anyway. First of all, all you can smell is the stench of popcorn. Second, the volume in the theater is simply way too loud; for a movie like Madagascar, there’s no need for it to be cranked that high. That child of mine simply did not approve. Lastly, the picture quality sucked! It didn’t help that they had technical issues with the projector, which delayed the showing by 15 minutes. Once the lights dimmed and the movie started, the color was just dull and washed out. Add to that a screen smeared with candy and whatever other crap people throw at it. It was just a very unpleasant experience that ruined a decent movie, and we had to leave halfway through because it was just too damn loud.

The funny thing is that I think we have a much more enjoyable experience at home. I only have a 37-inch Sony Bravia XBR-6 coupled with a Sony BDP-S350 Blu-ray player, but with just the TV and Blu-ray player alone, the viewing experience is already better than what we had at the Loews. The family consensus is that the picture quality on the “teeny” 37-inch screen is far better than what we saw in the theater, and there’s no funk on the screen. And the best part is that we have it all to ourselves, without people on cell phones and the like! While it’s true that a Blu-ray disc is more expensive than a DVD, prices are coming down, and when you consider that two adult tickets cost $20, the Blu-ray price looks pretty attractive. Anyway, my wife and I are pretty sure that movies will be viewed at home from this point forward, because yes, Blu-ray is that good.

Please NBC, don’t cancel Life!

You know, a lot of network TV kinda sucks, so when a decent show comes along and then lands on the chopping block, it bums me out a bit. I started watching Life on NBC just because I happened to be watching NBC at 10pm on a Wednesday and it happened to be on. I didn’t plan on watching it, but I was taken in quite quickly. Season 1 was amusing but cut short by the writers’ strike. It was a quirky enough show that I decided to give it another look this year.

The fact that Life started out on a Friday night schedule had me concerned that NBC didn’t have high hopes for the show. Thankfully, NBC picked up a full season and moved the show back to Wednesday. So far this season, the writing and acting are so much better and the story lines are even quirkier. The show also got an additional humor boost with the addition of Donal Logue (a.k.a. Jimmy the Cab Driver). It has even gotten my wife hooked, and it’s now DVR’d every week. In fact, I find myself watching Life more than Heroes at the moment.

But now I hear that NBC is giving Jay Leno the 10pm time slot every night of the week. Unfortunately, Life is one of the shows at risk in 2009, as are several others. Balls! Anyway, there are certainly better things I can do than whine about a TV show, given the awesome weather we’ve been having here in the northeast.

Eclipse on Mac Java 6 Reveals More SWT Shortcomings

Two years ago, I raised a few points about some of the shortcomings of SWT. Because of its native bindings, SWT makes the Java mantra of “write once, run anywhere” quite a bit more daunting. For the most part, SWT’s cross-platform support is actually quite good, and its performance is decent. And if it weren’t for SWT’s existence, we probably wouldn’t have seen Sun address Swing’s performance issues the way they did in Java 6. Unfortunately, when a minority platform like OS X makes some steep architectural changes, SWT-based applications end up with more work on their hands.

As most folks know, Java 6 on Mac OS X 10.5 was a long time coming. It took Apple over a full year after the initial release of Java 6 to get it running on Mac OS X. Now that it’s here and working pretty much “OK”, I decided it was time to start running Java 6 as my default JVM. Then came the surprise: Eclipse won’t run under Java 6 on the Mac. Why? Because Java 6 under Leopard is 64-bit only. The current version of SWT on OS X relies on Carbon, which is 32-bit, and we won’t be seeing 64-bit Carbon anytime soon. Support for 32-bit Cocoa is planned for later next year, but I haven’t seen word on when 64-bit Cocoa, or even just Java 6 support, might arrive.

Eclipse is still a great IDE, even if I have to continue to run it under Java 5. However, this sort of thing is annoying every time a platform makes significant changes. This time, though, you can’t put all of the blame on the Eclipse crew. Apple did an absolutely terrible job of keeping the Java community abreast of its plans for Java 6; in fact, it almost seemed that Java 6 would never appear on Leopard at all. Couple that with the fact that Java 6 was going to be 64-bit only and that Carbon was never going to see 64-bit support. Long story short, SWT, and therefore Eclipse, is always going to be hindered by OS changes to a greater degree than, say, NetBeans or IDEA.

WordPress Update Gone Bad

It’s always annoying to upgrade WordPress and then find out two days later that all of your links now return a 404 error, despite the fact that everything was working just fine right after the update. It turns out there was some kind of global change to my hosting provider’s mod_rewrite rules, which in turn caused the permalinks to fail. Thankfully, I was able to correct it by adding a revised .htaccess file. It’s a trivial fix, but it’s also disturbing, seeing as how I didn’t need to maintain a separate .htaccess until now.
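
For what it’s worth, the fix was nothing exotic. Something along the lines of WordPress’s standard permalink rewrite rules (this assumes WordPress is installed at the site root; adjust RewriteBase otherwise) did the trick:

    # BEGIN WordPress
    <IfModule mod_rewrite.c>
    RewriteEngine On
    RewriteBase /
    RewriteRule ^index\.php$ - [L]
    # Anything that isn't an existing file or directory gets handed to WordPress
    RewriteCond %{REQUEST_FILENAME} !-f
    RewriteCond %{REQUEST_FILENAME} !-d
    RewriteRule . /index.php [L]
    </IfModule>
    # END WordPress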