Want a Slave Trade tour? Don't miss Arochukwu.

Excuse me, please, but I need a moment of pidgin.

Na wah oh! E be like say ndi-Arochukwu don vex well well. Them say "all the Akata dey go Ghana, dey take their dollar go Ghana, say na slavery history tour. We, nko? We no get slavery? We get am plenty. I beg bring your dollar come make slavery tour". Ah beg. Dis one don pass man.

I read it in Naija Blog:

The Nigerian Tourist Development Commission's website has a page on a hypothetical slave tour for Nigeria. They write that "Arochukwu has a distinguished reputation as a source for the supply of slaves." I wonder if the good people of this town would like to be considered in this way. I'm not sure it's quite something to be that proud of.

OK, to be sure, we don't treat the history of the slave trade as gingerly in Nigeria as we do in the U.S. An old girlfriend of mine was from Arochukwu, and when I wanted to tease her (which was often) I called her "slave trader". She'd call me "bushman". It's all good. Of course I didn't dwell on the fact that my Mom is from near Calabar, where the Aros would typically sell all the slaves they'd captured in their raids on the Igbo interior (where my Dad is from).

But even for those of us who can be that relaxed about it all (easy enough when your forebear was not the one shuffled off in a coffle to Calabar for a ghastly journey and a ghastlier existence abroad) the idea of building a tourism industry around all that sounds potty. Then again, I remember once traveling to New Orleans with a bunch of my Norwegian friends. They were dead set on going to see a plantation museum (I remember the flyer laid it on thick about "witnessing the slave's experience"). I recoiled from the idea and excused myself from the expedition, preferring to sleep in the car, but they came back all a-twitter. I guess there might be some logic to the whole thing. The same logic that keeps Mme Tussaud's Chamber of Horrors and the Torture Museum in Amsterdam going. I also hear that many Black Americans visit Goree in Senegal and Ghana's coastal slaving fortresses, and that such tourism is supposedly Ghana's largest source of hard currency.

It's all about the Benjamins. Especially when Benjamin used to be named "Baneji".

And oh by the way... Whoever designed that Nigerian Tourism site? And whoever paid for it? I got something fo' dat ass. I don't remember the last time I saw anything that garish on the Web. It needs to be in a bad design competition.

[Uche Ogbuji]

via Copia

“Mix and match Web components with Python WSGI”


Subtitle: Learn about the Python standard for building Web applications with maximum flexibility
Synopsis: Learn to create and reuse components in your Web server using Python. The Python community created the Web Server Gateway Interface (WSGI), a standard for creating Python Web components that work across servers and frameworks. It provides a way to develop Web applications that take advantage of the many strengths of different Web tools. This article introduces WSGI and shows how to develop components that contribute to well-designed Web applications.

Despite the ripples in the Python community over Guido's endorsement of Django (more on this in a later posting), I'm not the least bit interested in any one Python Web framework any more. WSGI has set me free. WSGI is brilliant. It's certainly flawed, largely because of legacy requirements, but the fact that it's so good despite those flaws is amazing.

I wrote this article because I think too many introductions to WSGI, and especially middleware, are either too simple or too complicated. In line with my usual article-writing philosophy (what could I have read when I started out that would have made me understand this topic more clearly?), I've tried to provide a sharp illustration of the WSGI model, and a few clear and practical examples. The articles I read that were too simple glossed over nuances that I think should really be grasped from the beginning (and are not that intimidating). In the too-complicated corner is primarily PEP 333 itself, which is fairly well written, but too rigorous for an intro. In addition, I think the example of WSGI middleware in the PEP is very poor. I'm quite proud of the example I crafted for this article, and I hope it helps encourage more people to create middleware.
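To give a flavor of the model right here (this is a minimal sketch of my own, not the example from the article): a WSGI application is just a callable that takes an environ dictionary and a start_response callable, and middleware is a function that wraps one such callable in another.

```python
# A minimal WSGI application: a callable taking the environ dict and a
# start_response callable, and returning an iterable of byte strings.
def hello_app(environ, start_response):
    start_response('200 OK', [('Content-Type', 'text/plain')])
    return [b'Hello, WSGI']

# A minimal middleware: wraps any WSGI app and adds one response header.
# It intercepts start_response, appends the header, and delegates.
def add_header(app, name, value):
    def middleware(environ, start_response):
        def patched_start_response(status, headers, exc_info=None):
            return start_response(status, headers + [(name, value)], exc_info)
        return app(environ, patched_start_response)
    return middleware

# Stacking is just composition: the wrapped app is itself a WSGI app.
wrapped_app = add_header(hello_app, 'X-Sketch', 'wsgi-demo')
```

Any WSGI server, or another layer of middleware, can host wrapped_app exactly as it would hello_app; that interchangeability is the whole point.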

I do want to put in a good word for Ian Bicking and Paste. He has put in tireless effort to evangelize WSGI (it was his patient discussion that won me over to WSGI). In his Paste toolkit, he's turned WSGI's theoretical strengths into readily-available code. On the first project I undertook using a Paste-based framework (Pylons), I was amazed at my productivity, even considering that I'm used to productive programming in Python. The experience certainly left me wondering why, BDFL or no BDFL, I would choose a huge mega-framework over a loosely-coupled system of rich components.

[Uche Ogbuji]

via Copia

“Dynamic SVG features for browsers”


Subtitle: Build on SVG basics to create attractive, dynamic effects in your Web projects
Synopsis: Learn how to use dynamic features of Scalable Vector Graphics (SVG) to provide useful and attractive effects in your Web applications. SVG 1.1, an XML language for describing two-dimensional vector graphics, provides a practical and flexible graphics format in XML. Many SVG features provide for dynamic effects, including features for integration into a Web browser. Build on basic SVG techniques introduced in a previous tutorial.
Lead-in: SVG is a technology positioned for many uses in the Web space. You can use it to present simple graphics (as with JPEG) or complex applications (as with Macromedia Flash). An earlier tutorial from June 2006 introduced the basic features of the format. This tutorial continues to focus on SVG for Web development, as it demonstrates dynamic effects that open up new means of enhancing Web pages. The lessons are built around examples that you can view and experiment with in your favorite browser.
Developed by the W3C, SVG has the remarkable ambition of providing a practical and flexible graphics format in XML, despite the notorious verbosity of XML. It can be developed, processed, and deployed in many different environments -- from mobile systems such as phones and PDAs to print environments. SVG's feature set includes nested transformations, clipping paths, alpha masks, raster filter effects, template objects, and, of course, extensibility. SVG also supports animation, zooming and panning views, a wide variety of graphic primitives, grouping, scripting, hyperlinks, structured metadata, CSS, a specialized DOM superset, and easy embedding in other XML documents. Many of these features allow for dynamic effects in images. Overall, SVG is one of the most widely and warmly embraced XML applications.
Dynamic SVG is a hot topic, and several tutorials and articles are available that include fairly complicated examples of dynamic SVG techniques. This tutorial is different because it focuses on a breadth of very simple examples. You will be able to put together the many techniques you learn in this tutorial to create effects of whatever sophistication you like, but each example in this tutorial is simple, clear, and reasonably self-contained. The tutorial rarely deals with any SVG objects more complex than the circle shape, and it keeps embellishments in scripting and XML to a minimum. The combination of simple, step-by-step development and a focus on real-world browser environment makes this tutorial unique.

Pay attention to that last paragraph. There are many SVG script/animation tutorials out there, including several by IBM developerWorks, but I found most of them don't really suit my learning style, and I set out to write a tutorial that would have been ideal for me when I was first learning dynamic SVG techniques. The tutorial covers CSS animation, other scripting techniques, and SMIL declarative animations. It builds on the earlier tutorial “Create vector graphics in the browser with SVG”.
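As a taste of the declarative side (a minimal sketch of my own, not an example from the tutorial), a SMIL animation needs no script at all:

```xml
<svg xmlns="http://www.w3.org/2000/svg" width="200" height="100">
  <circle cx="50" cy="50" r="20" fill="steelblue">
    <!-- Declarative SMIL animation: slide the circle across and repeat -->
    <animate attributeName="cx" from="50" to="150"
             dur="2s" repeatCount="indefinite"/>
  </circle>
</svg>
```

Paste that into a file and open it in an SVG-capable browser; the circle glides back and forth with no JavaScript in sight.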


[Uche Ogbuji]

via Copia

GoDaddy reconsidered

Last week I told a story of how GoDaddy's customer service told me I'd forfeited a certificate I'd purchased when I didn't use it in 60 days. On Friday I got a voice mail from someone in "the office of the president of GoDaddy.com". She told me they had restored the certificate to my account. Sure enough, the credit is there.

Things that make you go "hmmmmm". I'm not sure I like the possibility that I get by Weblogging a complaint what I could not get by making the same complaint to customer service. Of course, that's just one possibility. As I hinted in my earlier entry, even at the time of the annoying incident I suspected I was dealing more with a clueless customer service rep (as well as his clueless supervisor) than a true corporate policy of seizing my certificate. I might have called the number right back and reached a rep who sorted out the problem right away. Or perhaps if I'd sent my complaint to the company, rather than posting it, they'd have been as attentive. This sort of thing happens all the time.

My friend David Courtney related the following story:

I read about your GoDaddy experience. I have to say, I'm rather surprised you were treated this way. I shall have to re-evaluate my opinion of GoDaddy. I registered the ultradesic.com domain with them last year and bought a Turbo SSL certificate. Once my data center issued my Certificate Signing Request, I went to the GoDaddy website to get the SSL certificate. I took that SSL certificate back to my data center only to find out they had used an invalid method of generating the original CSR. They gave me a new CSR. I went back to GoDaddy only to find out that once the key was issued, you couldn't get a new key. I was extremely aggravated at my data center for messing up the process. But in this case, the GoDaddy person I communicated with via [username omitted]@godaddy.com was extremely helpful. He immediately issued me a credit for my certificate so I was able to generate a new certificate from the new CSR.

I had another somewhat similar problem this year. I was sick of dealing with my data center's total disregard for security. (i.e. no ssh access to my domain. I had to use plain text FTP to get anything done.) So I moved to a new data center. Well, because I moved my domain to a new location, I had to generate my certificate all over again. When I went to my account at godaddy.com, I again ran into the problem of "The key has been issued, you can't re-key it." But I again e-mailed [username omitted]@godaddy.com and the problem was resolved very quickly. He credited me a certificate for both my domains, no questions asked.

This is in line with everything I'd heard about the company before last week, so I'll assume I just caught a bit of bad luck in the customer service lottery and accept their olive branch in good grace, putting last week's incident aside for now.

[Uche Ogbuji]

via Copia

Is USPTO abandoning XML in its electronic filing system?

I wrote an article a while back, "Thinking XML: Patent filings meet XML", in which I covered, among other things, the various patent agencies' efforts to support electronic filing. Many of these efforts are XML-based. Except now perhaps the USPTO's (EFS-Web) isn't. There were a lot of gnarly aspects of the EFS-Web process, and I had heard from some users who ended up abandoning the system. It looks as if the USPTO is trying to address these problems by chucking the whole approach and just having people upload PDFs (via XML.org Daily Newslink--yeah, I'm way behind). I wonder whether they also considered supporting ODF, at least as an alternative to PDF. It seems to me that what they needed was broader, not narrower, format and tool support.

[Uche Ogbuji]

via Copia

XML Universal names (namespaces): To fuse or not to fuse

I ran into Ken MacLeod on the Atom IRC channel today. Actually I think I've chatted with him before, but I didn't know the nick I was responding to was Ken. Certainly a fortuitous discovery, but more importantly Ken drew my attention to an old Weblog posting I'd somehow missed. In it he makes two separate points that I think run together in perhaps a confusing way. Firstly, he advocates that XML APIs strictly treat a node's universal (i.e. namespace-qualified) name as a tightly bound unit. I (arbitrarily) call this fusing the universal name. An example is APIs such as ElementTree that use James Clark notation for namespaces.

The second point is that some XML APIs have really ungainly syntax for handling namespaces, with SAX and DOM being the worst offenders (we both agree that the Java/IDL heritage of these APIs is the worst problem).

To take the first point first, I disagree that APIs based on fused names are superior. Yes an XML universal name should conceptually be a unit, but in practice it is not, and people often have a real need to work severally with either piece of the tuple. I used the analogy of complex numbers in the IRC discussion. The complex number 3 + 2j is a single number, but there is nothing wrong with an API's making it easy for a user to manipulate its real and imaginary parts. It's up to the developer not to somehow abuse this flexibility.
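To make the point concrete, here is a sketch of my own, in the style of ElementTree's Clark notation, showing how trivially a fused name can be split into, and rebuilt from, its parts:

```python
# Clark notation fuses a universal name into one string: "{namespace}local".
# Splitting it back out, when the developer needs the pieces, is trivial.
def split_clark(name):
    """Split '{ns}local' into (ns, local); unqualified names get ns of None."""
    if name.startswith('{'):
        ns, _, local = name[1:].partition('}')
        return ns, local
    return None, name

def fuse_clark(ns, local):
    """Rebuild a fused Clark-notation name from its two parts."""
    return '{%s}%s' % (ns, local) if ns else local
```

Nothing about offering both operations "abuses" the unity of the name, any more than cmath abuses complex numbers by exposing their real and imaginary parts.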

The second point is well taken, but I strongly believe that it is not the lack of fused names that makes an API poor. SAX (SAX 2, to be precise) is poor because of the redundancy and odd structural conventions in reporting information such as prefixes. DOM (DOM Level 2, to be precise) is poor because of the bewildering decision to maintain namespace declarations as separate attribute objects in addition to the redundant information offered as node object properties. Both owe much of their weakness to concerns for backward compatibility with non-namespace-aware APIs.

For the most part APIs that were born and bred in the era of namespaces are much less tortured, regardless of how tightly or loosely they expose the parts of a universal name. If anything, I believe that it's better for the API to make it easy for the developer to separate namespace name, local name and prefix. Yes, even prefix, because the golden world in which prefixes are irrelevant does not exist. It was ruined by the advent of QNames in content, or "hidden namespaces". Yes, this happens to be one negative side-effect of the success of XSLT and XPath, which were great specs for the most part, but also represented the first real triumph of the hidden namespaces idea that has left such a mess in its wake.

Ken ended his Weblog post with a proposed notation for fused names, which is just like one I had mulled and discarded for Amara. He also showed me a derived convention for his Orchard software, which I didn't know was still in development (it looks very interesting). This convention looks just like the mapping accessors I somewhat grudgingly added to Amara in February. I'm still open to changing this until Amara 1.2, so I'll give the whole matter some thought (including the Orchard approach). Feedback is welcome. The fact that Ken and I independently came up with such a series of similar ideas makes me think we're on the right track.

I do want to make sure it's clear that giving the user a better API for namespaces is not bound to insistence on fused names.

[Uche Ogbuji]

via Copia

The GoDaddy certificate rip-off

In March we purchased a package from GoDaddy. The purchase package looked in part like the following:

QTY ITEM                                            PRICE
1   .COM Domain Name Transfer - 1 Year              $2.24
    FOURTHOUGHT.COM
24  Premium Hosting w/ PHP / PERL- v2               $287.04
1   Turbo SSL (2 Years)                             $0.00

Getting the included certificate was a large part of the incentive for choosing this package and vendor (GoDaddy), but we didn't get around to using it right away. Today, after a bunch of much-needed server maintenance we were ready to set up and use the cert. I went to our account info to find that GoDaddy claimed we had no credit for an SSL certificate. Figuring it was a simple error I cheerfully called customer support.

I was surprised to be told that since we had not used the certificate for 60 days, we could not have it. I asked why, and the gentleman on the phone went on about how we didn't pay for the certificate, anyway. I scoured our purchase receipt and did a few likely text searches on the huge GoDaddy customer agreements, and I found no notification of the 60-day forfeiture. I pointed this out to him, at which point he became defensive, saying it had to be in the documentation somewhere and at any rate there was nothing he could do to help me. He kept telling me that we had not paid for the certificate anyway. I told him that even though the invoice shows a $0 line-item for it, it was part of a package deal, and so in paying for the whole package we had paid for the certificate. He kept repeating, as if a mantra to make me go away, that we hadn't paid for the certificate. I explained that I could understand if we were now given a certificate that expired March 2008, and thus forfeited the unused portion of the two-year duration, but he insisted there was nothing that could be done. I asked to speak to someone who might be a bit better authorized to deal with the situation, and he was very reluctant until he finally passed me to his "floor supervisor".

This gentleman told me that since I had not used it for 60 days, the certificate counted as an "unused product". He said that he couldn't restore it to our account because it was "free" and so there was no money to refund and re-purchase. I asked him whether, from a customer service point of view, he was willing to restore the cert but was facing a system limitation because of the $0 line-item. In short, I was kinda giving him a way out. I would have been at least a little mollified if they were technically hamstrung rather than obstinate about playing "GOTCHA! 60-day forfeiture", along the lines of a particularly rough game of Calvinball. Strangely, he refused to really admit it as such. He kept insisting that the problem was that we had "never paid for a certificate". I imagine they're trained to never admit any sort of fault. I only have so much time in the day, so I left off the matter at that point.

I guess my main point is to be careful when dealing with GoDaddy.com about undisclosed limitations on their offerings. I think the 60 day "unused product" limitation is a poor policy in the first place, but I'd understand better if it had at least been disclosed. As far as I can tell, it had not been.

I'll go ahead and purchase a certificate from another vendor. I shall not do any business in future with GoDaddy.com. I'm sure other vendors will be more costly, but honestly, a few extra bucks per domain-year is well worth the principle.

I hope this note saves anyone else such a surprise.

[Uche Ogbuji]

via Copia

Some thoughts on QNames in content (including proposal for a better, ahem, name)

I'll cut your ass in half and leave you with a semi-colon

—Mr. Man

QNames in content have been on my brain today. See a follow up posting for more on why.

First of all, I think we should find a new name for this phenomenon, because I don't think QNames qua QNames are key to the problem. For one thing, you have the problem even if you only use a prefix in content, as XSLT does in, say, the extension-element-prefixes attribute, or XPath in html:*/html:span (the second step is a QName, but not the first). I think a better name for this problem is "hidden namespaces" because that's exactly the problem: the document depends on a construct that is hiding a namespace in a separate layer where generic processing cannot see it.

Whatever the name, I re-read today a couple of important documents regarding the issue. First of all there is the TAG finding on QNames, which is unfortunately not much more than an agglomeration of existing wisdom. Norm Walsh, the editor of that document, wrote of a more radical direction as part of his "XML 2.0" article. I like his ideas (though I'm partial to Jeffrey Yasskin's ampersand variation), and I hope conversation soon drives towards something along those lines. XML is almost ten years old, and I see nothing wrong with a bit of a shake-up.

Until then, I think we can deploy two safeguards to protect ourselves from the subtle problems of namespaces. I call them: "sanity within the document, and registries without". The two components are very different in character.

Firstly, we need to discard the idea of in-document scoping of namespaces. It seemed a great idea at the time, even to me, but in practice it's a mess, and Joe English was the first one to illuminate the mess in the light of a brilliant metaphor (Google cache of original since XML-DEV is down now). (See my article "Principles of XML design: Use XML namespaces with care" for more on this). If we can rely on sanity in XML documents we can at least simplify state processing a good deal. Ideally all the XML sources in an XML processing pipeline would emit sane XML.

Secondly, I think the time has come for namespace registries. It would definitely be nice to build on the unfortunately stalled RDDL, but whereas the goal of RDDL is to provide human readable information, what I think we really need in a namespace registry is a little nugget of machine-readable data. Drumroll please...

A list of preferred prefixes for a namespace (supporting lookup of namespace name to well-known prefix, and vice versa). I know this will be controversial. Prefixes are supposed to be insignificant. Users should have flexibility to use whatever prefix, blah blah blah. I'm sorry: that's all theoretically nice, but we have practical problems to solve. The fact that the most powerful constructs in XPath depend for their semantics on the whimsy of prefix choices should bother you a bit. The fact that Canonical XML had to abandon the idea of normalizing prefixes should bother you even more. It's time to just say that xsl means "The XSLT namespace" (yeah, yeah: "what version?" etc.—hard problems would still remain) and that if you choose to use it for a different namespace, you're technically compliant to namespaces, but you're asking for a heaping helping of trouble, buddy.
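As a sketch of the machine-readable nugget such a registry might serve up (the registry itself is hypothetical; the namespace URIs and prefixes below are just the familiar well-known ones):

```python
# Hypothetical namespace registry: bidirectional lookup between
# well-known prefixes and the namespace names they are reserved for.
PREFERRED_PREFIX = {
    'http://www.w3.org/1999/XSL/Transform': 'xsl',
    'http://www.w3.org/1999/xhtml': 'html',
    'http://www.w3.org/2000/svg': 'svg',
}

# Reverse index, derived once: preferred prefix -> namespace name
PREFERRED_NAMESPACE = {p: ns for ns, p in PREFERRED_PREFIX.items()}

def prefix_for(namespace):
    """Well-known prefix for a namespace, or None if unregistered."""
    return PREFERRED_PREFIX.get(namespace)

def namespace_for(prefix):
    """Namespace reserved for a well-known prefix, or None."""
    return PREFERRED_NAMESPACE.get(prefix)
```

A namespace-aware tool consulting such a registry could resolve hidden namespaces in content without ever touching in-document declarations.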

For now I'm just throwing out ideas to help organize my thoughts, and for discussion. It seems to me that if we could rely on authors and tools, supported by a registry, to produce sane documents that (wherever possible) used essentially reserved prefixes, including, of course, for hidden namespaces, we could simplify namespace-aware processing a great deal. I can think of some practical hurdles for the registry idea, but I can't think of any reason why it's not even worth a try.

[Uche Ogbuji]

via Copia

“XML in Firefox 1.5, Part 3: JavaScript meets XML in Firefox”


Subtitle: Learn how to manipulate XML in the Firefox browser using JavaScript features
Synopsis: In this third article of the XML in Firefox 1.5 series, you learn to manipulate XML with the JavaScript implementation in Mozilla Firefox. In the first two articles, XML in Firefox 1.5, Part 1: Overview of XML features and XML in Firefox 1.5, Part 2: Basic XML processing, you learned about the different XML-related facilities in Mozilla Firefox, and the basics of XML parsing, Cascading Style Sheets (CSS), and XSLT stylesheet invocation.

Continuing with the series, this article provides examples for loading an XML file into Firefox using script, for applying XSLT to XML, and for loading XML with references to scripts. In particular, the latter trick is used to display XML files with rendered hyperlinks, which is unfortunately still a bit of a tricky corner of the XML/Web story. I elaborate more on this trick in my tutorial “Use Cascading Stylesheets to display XML, Part 2”.


[Uche Ogbuji]

via Copia

I Wish XForms was Recursively Declarative

I've become a big fan of declarative problem solving lately, which is one of the reasons I really enjoy composing web-based user interfaces with XSLT and XForms. However, I was thinking about how I would build an XForm to edit a very recursive structure, an EBNF instance as an XML document. I thought it would be nice to define a widget (an xf:group) for each of the more major components of a grammar and (in XSLT push fashion) recursively render a form for editing an instance of the grammar.

After all, XSLT was the main reason I really like the idea of Schematron for document validation. The XML infoset is a perfect match for capturing an EBNF, since it is purely syntactic and very recursive. So, it's a shame I couldn't take advantage of an XML-based user interface's processing mechanism (like XForms) to render an edit form in the same way an xsl:apply-template with a mode would.

Imagine:

Grammar Instance

SELECT * WHERE { OPTIONAL { GRAPH ?provenance { ?person a foaf:Person } } }

Grammar Instance (as an XML Document)

<SelectQuery>
  <AllVariables/>
  <Where>
    <GroupGraphPattern>
      <GraphPattern>
        <OPTIONAL/>
        <GroupGraphPattern>
          <GraphPattern>
            <GRAPH graphName="?provenance">
              <GroupGraphPattern>
                <BasicGraphPattern>?person a foaf:Person</BasicGraphPattern>
              </GroupGraphPattern>
            </GRAPH>
          </GraphPattern>
        </GroupGraphPattern>
      </GraphPattern>
    </GroupGraphPattern>
  </Where>
</SelectQuery>

XForm snippet

<xf:group ref="SelectQuery/Where">
    <xf:group ref="GroupGraphPattern" mode="push">
      <fieldset>
        <legend>A SPARQL GroupGraphPattern</legend>
        <xf:apply-templates-equivalent mode="push"/>
      </fieldset>
    </xf:group>
</xf:group>

This would render a radial set of fieldsets, one for each GroupGraphPattern in the recursive structure. Somewhat related: Quadtrees in Javascript and CSS

I guess I can see how having to maintain the dependencies in this scenario would be something similar to having an XSLT processor bound to a 'live' XML instance - very expensive.

Chimezie Ogbuji

via Copia