

Blogger – Can I get in please?

By Ian Lloyd | April 3rd, 2006 | Filed in Accessibility, Web Standards (general)

How Blogger blocks users from getting past the front door when the browser is JavaScript-capable but is sitting behind a script-blocking firewall – and how it’s not alone in making this mistake.


I use a number of different systems to run a number of different blogs that I either own/run or contribute to. For a long time I’ve liked Blogger (even if it is deemed a bit uncool by some of us hack-savvy power users), and I still do. But something’s bugging me, and I figure that if I sound off here, something might be done to improve it.

Blogger – please can you let me in the front door?

Where I work, the company has a fairly aggressive (although entirely sensible) security policy that sometimes stops web pages in their tracks, increasingly so with anything that could be branded Web 2.0 or built with AJAX. What is it that’s breaking? I’ve done a bit of investigation and it seems to be any script that could be misconstrued as attempting a bit of cross-site scripting or calling some Java function. In the case of Blogger, the all-important submit button is not appearing because the script that writes it into the page never gets the chance to run:

[Screenshot: the Blogger login form, missing its submit button.]
Where is the submit button? In script-blocking limbo, that’s where.
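To illustrate the pattern that falls over (this is only a sketch – not Blogger’s actual markup or script; the form action, IDs and file name are all invented for the example): the form itself arrives fine, but the only submit control is written in by an external file, so if that file is blocked there is no button at all.

<form method="post" action="/login.do">
<label for="username">Username</label> <input type="text" id="username" name="username">
<label for="password">Password</label> <input type="password" id="password" name="password">
<!-- the submit button only exists if this external file gets through the firewall -->
<script type="text/javascript" src="write-submit.js"></script>
</form>

And write-submit.js would contain something along the lines of:

// write-submit.js (invented for the example): if this file is blocked,
// document.write never runs and the form has no submit button at all
document.write('<input type="submit" value="Sign in">');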

Now, I’m not posting here to try to get Google/Blogger to make a change just for little old me sat behind my company’s security policies. I’m trying to point out that they’ve effectively shut out me and anyone else who finds themselves working under such restrictions (and there are probably quite a large number of us).

There is a way around the problem – it is possible to disable JavaScript – but bear in mind that this setting is often locked down by system administrators on browsers used within large companies. If I were able to disable JavaScript, I would be given the noscript alternative submit button that Blogger has provided and I could get in.
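For the record, the noscript fallback looks something like this (again a sketch, not Blogger’s actual markup), and it only ever renders when the browser itself has scripting switched off:

<noscript>
<!-- only shown when the browser reports JavaScript as disabled -->
<input type="submit" value="Sign in">
</noscript>

Because my browser reports JavaScript as enabled, that content stays hidden – even though the external script that was supposed to write the real button never arrived.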

Wider Implications

There is a more serious point to this than just Blogger not letting me in the front door. I am continually running into brick walls with my standards-compliant, fully (well, almost fully) CSS- and JS-capable browser because the people building the applications and web sites did not consider what might happen in this scenario – that is, while the browser might be up to the job, the script that does the crucial piece of work is getting blocked at the outset. Perhaps these people should familiarise themselves with what Jeremy Keith has termed Hijax – with a bit of checking in the JavaScript it should be possible to ascertain whether the browser is actually capable of doing what you ask of it. I keep dropping in on web sites that use an external file to write content into the page and that, because the file is actively blocked from doing its job, end up just sitting there looking stupid and acting unresponsive.
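Something along these lines (a rough sketch of that kind of checking – the loginForm ID and the fancyLogin helper are invented for illustration, not anyone’s real code): the form works as a normal full-page submit, and the script only takes over once it has confirmed that everything it depends on actually made it to the browser.

window.onload = function () {
// Feature-test before doing anything clever
if (!document.getElementById) return;
var form = document.getElementById('loginForm');
if (!form) return;
// Only take over the submit if the helper from the other script file actually loaded
if (typeof fancyLogin !== 'function') return;
form.onsubmit = function () {
fancyLogin(this);
return false; // suppress the normal submit only once we know the enhancement works
};
};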

Please, authors of fancy-pants Web 2.0 sites (and Blogger), don’t assume that your script is always going to run. It may not – and you need to be prepared for this. Please check what else is going on in your script files and make sure there’s nothing in there that can derail the good stuff.
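One way of keeping the optional extras from derailing the essentials (trackVisitor here is just a made-up stand-in for any nice-to-have such as analytics or screen-size sniffing):

if (typeof trackVisitor === 'function') {
try {
trackVisitor();
} catch (e) {
// the optional stuff failed – carry on, the form still has to work
}
}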

[Of course, you could simply say "Well, what are you doing posting to a blog from work? Earn a living, you skiving git!" And that's one way of dealing with things. Ho hum.]

Your Replies

#1 On April 3rd, 2006 8:48 am Ian Muir replied:

This is a perfect example of what happens every time a new JavaScript/Flash trick becomes popular. When Flash was used for more than just ads, people would create enormous, complex Flash sites with no HTML/non-Flash version. The same is happening now with AJAX.

Many designers/developers have no idea how to make unobtrusive code that degrades nicely. They’re so worried about making it cool for the people using Firefox with JavaScript turned on, Flash 8, and a 1440×900 widescreen that they don’t bother with the 10% of people who don’t have JavaScript capabilities, or the people who don’t update their Flash player every 3 days.

I would be a lot more impressed by more of these Web 2.0 apps if they relied more on the substance and less on the neat factor.

#2 On April 3rd, 2006 9:48 am J. R. San Juan replied:

Are you using Firefox? This is the kind of thing you can get with this so-called compliant browser. I rarely run into problems using Explorer, but they are frequent with the Mozilla browser. No matter if it’s on a PC or on a Mac, nor which OS it is. Firefox deals very badly with Java code.

Bad code? No, bad browser.

#3 On April 3rd, 2006 10:04 am WaSP Member lloydi replied:

Yes, I am using Firefox. No, it’s not the browser. Blogger’s home page works fine when I try it using the same browser in other locations – it’s the security policy of this company that is actively blocking content from being written into the page, and it will block it for *all* browsers (yep, including IE). So, I say to you first – please read the article properly before posting a comment. And secondly I say:

Bad security policy at my place of work? No, bad contingency planning by Blogger (and others who do the same with dynamically written-in content) = bad code.

By the way, it’s JavaScript I refer to in this posting, not Java code. There’s a *big* difference.

#4 On April 3rd, 2006 11:18 am Sebastian Redl replied:

document.write is bad; Blogger’s particular case is as simple as that.

However, the larger problem is peculiar indeed. The problem here is that the behaviour of your particular client environment is not rational:
1) The browser is capable of executing JavaScript. The noscript catcher or simple progressive enhancement checks are unreliable here. Sure, Blogger’s system of writing the submit button with JavaScript is a bad idea. It should be in plain HTML, no noscript needed, and then have unobtrusive JS manipulate it. But is that enough for this case?
2) The browser is capable of downloading external JavaScript. If ALL external JS were blocked, then a proper (not Blogger’s) progressive enhancement would simply never run. Unless there’s JS in the HTML, which is bad.
3) The client simply refuses to download some files. What does that mean? You have a home-baked custom solution that decides that every file containing “document.write” is potentially dangerous. Fine. Another company’s filter has a different way of defining the “bad” files, and thus blocks other files. Obviously you cannot blame the web author for not predicting which files won’t be downloaded.

Let’s now assume that the web author has divided his JS stuff into several files. There’s a file containing all the site-specific stuff: it intercepts the events with its event handlers, which then do something. These event handlers make use of a common library of functions, residing in a different JS file.
Except that the proxy blocked this other file.
As a result, the perfectly non-intrusive JS gets errors resulting from missing functions. These errors cause the event interceptors to return errors themselves, and I believe some browsers block the default action in this case. Or they might, who knows?
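For instance (a sketch only – libraryHelper and the element ID are invented), even a careful handler only survives this situation if it explicitly checks that the library actually arrived before relying on it:

var someLink = document.getElementById('someLink');
if (someLink) {
someLink.onclick = function () {
if (typeof libraryHelper !== 'function') {
return true;  // the library file never arrived: fall back to the normal link behaviour
}
libraryHelper(this.href);
return false;   // the enhancement worked, so suppress the default action
};
}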

This effectively is like a program on your computer failing to work because the sysadmin deleted a DLL that the program installed in its own private location. Do you blame the program developer for the failure?

Your plea to Blogger is unfair. They didn’t make the assumption that the script is always going to run. They are aware of browsers without JS and browsers with JS disabled and deal with them.
They aren’t prepared for someone pulling the rug from under their feet.

(Note: My real stance is more in the middle. Yes, the web is less reliable than a program on your PC, and yes, it is possible to plan for very weird failures. But web developers must do cost-benefit calculations too. Some failures can be predicted and worked around. I think this particular case is just not worth the effort.)

#5 On April 3rd, 2006 1:37 pm Elaine replied:

Of course, you could simply say “Well, what are you doing posting to a blog from work? Earn a living, you skiving git!”

Ah, but you could actually be trying to use Blogger for work, which I have done myself….

#6 On April 3rd, 2006 1:49 pm Martijn ten Napel replied:

Should the lesson be that submit buttons should never be hidden? I’m not a JavaScript guru, but if you used JavaScript to hide the submit button only when that code is actually being executed, would that help?

I mean, if you rely on select boxes or radio buttons and want to give your users the benefit of not having to click a button after making a choice, I guess the hiding of the submit button should be done in the same .js file, if and only if that code is executed and your actions on changing a form element will actually work?
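Roughly something like this (the element IDs are invented for the sake of the example) – the submit button lives in the HTML, and the very same script that adds the auto-submit behaviour is the one that hides it, so if the script is blocked the button simply stays put:

var choice = document.getElementById('choice');       // a select box
var goButton = document.getElementById('goButton');   // the plain HTML submit button
if (choice && goButton && choice.form) {
choice.onchange = function () { this.form.submit(); };
goButton.style.display = 'none';
}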

I’m not sure about web applications though; something that is meant to be used as an application and that needs JavaScript to function properly. I don’t know if Blogger is a web application – I guess not – but I guess the standard procedure must be that if you want your content/website to be accessible on a mobile phone with just (X)HTML capabilities, you should rely on pure HTML and separate behaviour radically from the content, like Peter-Paul Koch has proposed several times.

This is just an illustration of why that would be wise.

#7 On April 3rd, 2006 2:01 pm Nick Fitzsimons replied:

Just a thought: if you hit Enter while the Username or Password text field has focus, the form will be submitted. (Well, it should be, unless they’ve done something weird to stop it happening.)

#8 On April 3rd, 2006 4:56 pm WaSP Member lloydi replied:

@ Sebastian

“Your plea to Blogger is unfair. They didn’t make the assumption that the script is always going to run. They are aware of browsers without JS and browsers with JS disabled and deal with them. They aren’t prepared for someone pulling the rug from under their feet.”

To some extent this is true – they have provided a noscript alternative. My reason for picking on them (sorry, Blogger) was to highlight a problem that I have seen on many occasions, whereby certain JS techniques fail utterly because of security measures. There are many sites that work perfectly well, and have done for a long time, using external JavaScript files. It really is only recently that I’ve started to notice simple things such as this failing because the techniques used fall foul of such script-blocking procedures.

Essentially, I wanted to highlight that there is a place between the world of JavaScript off and JavaScript on and people should be aware of it.

Having just read Jeremy Keith’s book on DOM Scripting, it really brought home to me that while he stresses repeatedly the importance of not making assumptions about what a browser can or can’t do, and of testing conditions all the way, there are many sites out there that do not. Hence Blogger’s submit button disappears and breaks the most important part of the page.

As for the script-blocking policy that the company I work for uses: generally speaking, scripts are allowed in and JavaScript is enabled. It is only scripts that appear to be doing a document.write that get blocked – a response to the risks of cross-site scripting (and for that reason I do not believe this is a rare case).

They (Blogger) might not even be aware that this is happening, but it just goes to show that sometimes a submit button is best left as a submit button.

But thanks for your well considered response, Sebastian – very informative and thought out :-)

#9 On April 3rd, 2006 5:51 pm Steve Clay replied:

I have to agree with Sebastian on this one. You can’t expect an application to work when arbitrary code is yanked out of it. Today it’s JS files with document.write, tomorrow it will be anti-virus apps rewriting code and markup.

“Clever” security like this doesn’t protect as much as it just causes endless end-user support hassles. Oh wait, Flash has cookies and can call remote files, too…Looks like we gotta start filtering bytecode!

Maybe it’s blocking Blogger just to keep you productive ;)

#10 On April 3rd, 2006 11:52 pm Alexey Feldgendler replied:

I have to agree with the other commenters who think that it’s a bad security policy. It’s bad in two ways:

1. As Steve says, you can’t expect an application to work when arbitrary pieces of code are taken out. A good security policy would just “switch modes”: if the filter detects a “dangerous” script, it should completely strip out all scripts and remove the NOSCRIPT tags but leave the content between them, so that a usable fallback is displayed (roughly as sketched below).

2. It’s even more stupid to deprive users of the ability to disable scripts. How can this be dangerous?
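A very rough sketch of that “switch modes” rewrite (hypothetical filter logic only – shown as simple string rewriting once the filter has already decided a page’s scripts are dangerous; a real proxy would rewrite the HTTP response):

function switchToNoscriptMode(html) {
return html
.replace(/<script\b[\s\S]*?<\/script>/gi, '')  // strip every script block, inline or external
.replace(/<\/?noscript[^>]*>/gi, '');          // unwrap noscript so its fallback content is shown
}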

#11 On April 4th, 2006 7:00 am WaSP Member lloydi replied:

Darnit. I posted a reply which got lost :-( Second version is smaller ‘cos I’m grumpy now!

Alex, I like the idea that if certain pieces of code cause the script to be blocked, the associated noscript content is used instead – but that might be tricky to tally up.

As for the second point, many large companies restrict what can and can’t be done on the desktop, and often it’s simpler to lock down the settings completely rather than lock some down and free up others (it makes diagnosing faults on internal apps more straightforward if you know the environment for a fact). The firewall contains the logic about whether a script is a good or bad one, and the browser is locked into JS-capable mode. Most of the time, scripts get through and work just fine.

Any company that is serious about security will have policies that are different from the next company’s – I understand that. Having said that, the product that has caused the problem I was experiencing is an off-the-shelf product, and the methods used in the Blogger example (and others) are triggering the security software to block that apparently trivial piece of code from working. My point, really, was to highlight that there is an issue here where things can sometimes be overcomplicated and fail. I still can’t see why it’s necessary for Blogger to use external JS to write in a submit button.

#12 On April 4th, 2006 7:24 am andrew replied:

Interesting point of view, Ian.

Interesting timing too, considering that document.write is one of the options Microsoft are presenting as an Eolas-related “workaround” to externally load ActiveX controls in IE. I suspect though that if your company’s security policy is blocking specific (some might say arbitrary ;-) JS tags then displaying AX wouldn’t fly regardless of loading technique…

A.

#13 On April 4th, 2006 7:28 am andrew replied:

Looks like the a href was stripped…

That link to the Microsoft article was:
http://msdn.microsoft.com/library/default.asp?url=/workshop/author/dhtml/overview/activating_activex.asp

#14 On April 4th, 2006 12:57 pm Britney replied:

Would be cool if Blogger fixed the bug so we all could surf from work. Thanks for the link, Andrew – always fun reading what Microsoft is up to!

#15 On April 6th, 2006 11:40 am Kynn Bartlett replied:

And of course, CAPTCHA means that other people have problems getting into Blogger as well…

#16 On April 6th, 2006 11:58 am fydo replied:

@ J. R. San Juan

“Are you using Firefox? This is the kind of thing you can get with this so-called compliant browser. I rarely run into problems using Explorer, but they are frequent with the Mozilla browser. No matter if it’s on a PC or on a Mac, nor which OS it is. Firefox deals very badly with Java code.
Bad code? No, bad browser.”

I’m not saying Firefox is 100% compliant with W3C standards, but I think the problem you’re having here isn’t the browser’s fault. So many websites these days only care about getting their site running in IE. They don’t pay much attention to being HTML/XHTML or CSS compliant, so of course their websites will look like garbage or not work in browsers that render pages correctly.

And as lloydi mentioned already, it’s not java, it’s javascript.

#17 On April 7th, 2006 9:04 am WaSP Member lloydi replied:

OK, the problem is basically one to do with Java. Not JavaScript, as it turns out (sorry fydo!). Or rather, it’s a combination of the two – on Blogger’s home page there are calls in the linked JavaScript to kick-start some Java (snippet):

function _uBInfo(page) {
var sr="-",sc="-",ul="-",fl="-",je=1;
var n=navigator;
if (self.screen) {
sr=screen.width+"x"+screen.height;
sc=screen.colorDepth+"-bit";
} else if (self.java) {
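// Fallback: ask the Java runtime (java.awt.Toolkit) for the screen size –
// it is this Java call that the security software flags as a "JVM and browser violation".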
var j=java.awt.Toolkit.getDefaultToolkit();
var s=j.getScreenSize();
sr=s.width+"x"+s.height;
}
if (n.language) { ul=n.language.toLowerCase(); }
else if (n.browserLanguage) { ul=n.browserLanguage.toLowerCase(); }
je=n.javaEnabled()?1:0;
if (_uflash) fl=_uFlash();
return "&utmsr="+sr+"&utmsc="+sc+"&utmul="+ul+"&utmje="+je+"&utmfl="+fl;
}

The security firewalls are breaking the page because there’s “a JVM and browser violation” taking place. When an incoming Java applet attempts to access or start a local application it gets blocked (apparently a classic sign of worm activity). Hence, that script fails to work, and the noscript fallback does not kick in because the browser does understand JavaScript – it just didn’t get all of it through the firewall. And this is ‘off-the-shelf’ behaviour for the security software, so it will affect any browser that tries to access that page via that firewall.

It makes me wonder why it’s necessary to use Java to get a screen size, though.

#18 On April 7th, 2006 2:11 pm Bruno Girin replied:

@ian – or rather, why does it need the screen size and bit depth to display a submit button? When I see code like this, alarm bells start ringing: “KISS principle broken, bad case of over-engineering!” Not that the code is inherently bad, but it is unlikely to be the simplest solution that could have worked for whatever they want to do.

I reckon someone needs to have a hard look at this and ask “what is it we were trying to do here? and is there a simple solution that would only require semantic markup and CSS?”

#19 On April 8th, 2006 8:10 am kebap replied:

J R San Juan you are a total idiot! stupid fuck.

#20 On April 8th, 2006 1:09 pm Isofarro replied:

Ian, you bring up a good point regarding JavaScript development. Just because a function doesn’t get called doesn’t mean JavaScript is disabled. It’s not just a case of JavaScript being enabled or disabled; there are three scenarios:

1.) JavaScript is enabled and the JS gets executed
2.) JavaScript is enabled but the JavaScript file doesn’t reach the browser
3.) JavaScript is disabled.

Blogger can easily solve circumstance 2 by always having a submit button (not stuck inside a noscript element).

The company security policy where you work is also part of the problem. I understand why the policy is like that. To be honest, that’s not Blogger’s problem – but scenario 2.) can happen in ways other than a corporate security policy, so it makes sense for Blogger to cater for it.

Reminds me of a time when some bright spark decided to filter out all banner ads as part of a corporate security measure. Had the marketing department hitting the roof since there was no way of testing whether the banner ads they’d just paid for were working. The “security measure” lasted for oh, about half-a-day before it was quietly pulled.

#21 On April 11th, 2006 3:01 pm keith replied:

I need a browser

#22 On April 19th, 2006 11:28 am brittany replied:

hey

#23 On April 29th, 2006 11:45 am hwright.net » Ouch - careful where you put that Ajax replied:

[...] Ian Lloyd posted about a problem with blogger use from behind a firewall. Seems company websites with strict surfing rules have problem with Ajax that looks like cross-site scripting. I immediately posted the blog to the developers I’m working with. [...]

#24 On April 29th, 2006 11:53 am lillious vaughn replied:

I can’t access hardly any sites, or I get kicked out because my browser is not compatible. How do I upgrade my browser? What do I do? Where do I go, and how do I get to a browser window when I don’t know where it is?

#25 On September 29th, 2006 3:05 am Blog Standard Stuff » Bloglines Is Broken (for me, at least) replied:

[...] My browser (Firefox 1.5) has JavaScript enabled and most of the time it works just fine. However, there are circumstances where it will not work. It’s the same kind of scenario that broke access to Blogger for me a while back (please do read through the comments there). But to summarise: [...]

#26 On September 29th, 2006 3:09 am WaSP Member lloydi replied:

Ah, here we go again … Bloglines have just updated their service and introduced some firewall-unfriendly JavaScript, rendering that service useless at my place of work now. Sigh …

#27 On October 25th, 2006 4:01 am Using a sledgehammer to crack a nut - The Web Standards Project replied:

[...] Hopefully you’re all familiar with the phrase in the heading and it’s not some strange quirky British turn of phrase that’s got many of you scratching your heads. It is, though, the perfect phrase for describing what I feel about Ma.gnolia’s sign-in screen. It’s not the first time I’ve had issues with over-complicated log-in pages (complicated in terms of how they’re built) and I dare say it won’t be the last. And just like last time I posted on this topic, I’m expecting that I’ll get comments that will range from “You’re right, this is way over-complicating the process – have they not heard of web standards?”, through “It’s actually very hard to build sites that use progressive enhancement properly without massive amounts of code forking”, to “stop blaming frameworks”. But please read on before heading straight to the comments. [...]

#28 On January 12th, 2007 7:57 am No Relation» Blog Archive » Can I always use a 100% AJAX Solution? replied:

[...] – Using a sledgehammer to crack a nut: Ma.gnolia’s case. – Blogger: Can I get in please? – Bloglines is broken! (at least for me) [...]


