
Thread: How to use a Gzip file as a link

  1. #1
    Join Date
    Mar 2012
    Posts
    53
    Thanks
    7
    Thanked 0 Times in 0 Posts

    Default How to use a Gzip file as a link

    I just gzipped my CSS file, and now I can't work out how to use it as an external stylesheet, since it ends in .gz instead of .css. Here is the link: files.cryoffalcon.com/bloghuts/Compressed/BlogHuts%20COMPRESSED%20CSS.css.gz

  2. #2
    Join Date
    Mar 2006
    Location
    Illinois, USA
    Posts
    12,164
    Thanks
    265
    Thanked 690 Times in 678 Posts

    Default

    You can't. That's no longer a CSS file.

    You can use gzip compression when sending files from the server, but not on a per-file basis like that, with an altered file extension. Here's an introduction to how it works:
    http://betterexplained.com/articles/...p-compression/

    I'm not sure whether CSS can work that way, but I know that HTML can. Regardless, you'll need the proper headers sent with the file to tell the browser how to interpret it, rather than having it see a bunch of nonsense (which is what gzip data looks like if the browser is expecting CSS).
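
    To make that concrete, here's a rough sketch of what "proper headers" means in practice, using Python's built-in http.server. The file names and port are placeholders, and a real server would also check the request's Accept-Encoding header before doing this:

    Code:
    # Sketch: serve a pre-gzipped stylesheet with the headers a browser
    # needs in order to treat the body as CSS. Names are placeholders.
    from http.server import BaseHTTPRequestHandler, HTTPServer

    class GzipCSSHandler(BaseHTTPRequestHandler):
        def do_GET(self):
            if self.path == "/style.css":  # the page still links to .css
                with open("style.css.gz", "rb") as f:  # compressed on disk
                    body = f.read()
                self.send_response(200)
                self.send_header("Content-Type", "text/css")    # it's CSS...
                self.send_header("Content-Encoding", "gzip")    # ...but gzipped
                self.send_header("Content-Length", str(len(body)))
                self.end_headers()
                self.wfile.write(body)
            else:
                self.send_error(404)

    HTTPServer(("", 8000), GzipCSSHandler).serve_forever()

    The point is that the page keeps linking to style.css as normal; the Content-Encoding header is what tells the browser to decompress the body before reading it as CSS. In practice you'd normally let the server (e.g. Apache's mod_deflate) handle all of this automatically rather than hand-rolling it.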

  3. The Following User Says Thank You to djr33 For This Useful Post:

    cryoffalcon (05-21-2012)

  4. #3
    Join Date
    Apr 2012
    Location
    Chester, Cheshire
    Posts
    329
    Thanks
    7
    Thanked 35 Times in 35 Posts

    Default

    If you're worried about page load times, then split your CSS file into a few smaller ones. This is also good for accessibility features such as colour schemes and the like: keep all your static layout in one CSS file and your colour scheme in another. Navigation menus, individual page styles, and specialised styles can each go in separate CSS files and then be linked into the page.

    The resulting smaller file sizes will decrease page load time. It's also easier for debugging, and once loaded, each file will be retrieved from cache on subsequent page refreshes, further increasing efficiency.

  5. #4
    Join Date
    Mar 2006
    Location
    Illinois, USA
    Posts
    12,164
    Thanks
    265
    Thanked 690 Times in 678 Posts

    Default

    That's not necessarily the best idea: one common strategy is to place all content into as few files as possible so that you make the minimum number of requests. In reality, for small files, the loading time of a page has more to do with request time than with the actual file size. It's not significantly faster to load 5kb than 10kb, because the 5kb transfer still carries the same significant request-time delay.

    I'm not necessarily advocating one strategy or the other in this case, but there are good reasons to use a single file rather than many. Regardless, the CSS takes up the same amount of space either way, whether in one file or many.
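
    One common compromise is to keep the CSS split up while developing and concatenate it into a single file for deployment, so you get debuggable chunks and a single request. A rough sketch of such a build step, with made-up file names:

    Code:
    # Sketch: concatenate split stylesheets into one file for production.
    # The file names here are made up.
    parts = ["layout.css", "colours.css", "navigation.css"]

    with open("site.css", "w") as out:
        for name in parts:
            out.write(f"/* --- {name} --- */\n")  # keep a marker for debugging
            with open(name) as f:
                out.write(f.read() + "\n")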

  6. #5
    Join Date
    Apr 2012
    Location
    Chester, Cheshire
    Posts
    329
    Thanks
    7
    Thanked 35 Times in 35 Posts

    Default

    That's true. The amount of delay you're talking about is negligible at best. If a user is on a slow connection or has other issues with their hardware or browser, or even a hypochondriac AV/firewall, then the total network delay will be outweighed by whatever client-side delay they encounter.

    I agree that reducing requests is something to consider, but from a purely practical point of view, I've always found it best to split the CSS into debuggable chunks. This is especially true of large sites with lots of dynamic code, where it's not always clear which styles go where and how the page fits together until it's in a full sandbox, or even live. I usually try to create a colour-blind-safe scheme for my sites that covers all three forms of colour blindness. If all of the colour information is in one CSS file, then adding new schemes doesn't mean duplicating an entire site's stylesheet just to change the colours.

    I suppose it's just how we've been taught web dev, really. Horses for courses, and so on.

  7. #6
    Join Date
    Mar 2006
    Location
    Illinois, USA
    Posts
    12,164
    Thanks
    265
    Thanked 690 Times in 678 Posts

    Default

    Quote:
    The amount of delay you're talking about is negligible at best.
    It's really not. This is a well-known strategy, and there is lots of information about it out there.
    For example, the entire concept of image sprites is based on exactly this, specifically to reduce the number of requests: use complicated CSS to store all icon-type images in a single large image file and use only one request to get it. Why would that technique exist if the delay were "negligible at best"?

    The 20 million results on Google here should back that up:
    https://www.google.com/search?q=web+design+requests

    I think your advice is generally helpful, but you need to keep in mind that there are other things that may be relevant too. Here, your advice simply misses one potentially important aspect.

    I agree with you that there are reasons to split CSS files. But there are also reasons not to.
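
    If you want to see it for yourself, here's a rough timing sketch. The URL is a placeholder and the numbers will vary wildly by connection; urllib happens to open a fresh connection on every call, which is exactly the per-request cost in question:

    Code:
    # Rough sketch: the cost of repeating requests for one small file.
    # The URL is a placeholder; results depend entirely on the connection.
    import time
    import urllib.request

    URL = "http://example.com/style.css"  # placeholder

    def fetch(url):
        start = time.perf_counter()
        urllib.request.urlopen(url).read()  # fresh connection every call
        return time.perf_counter() - start

    one = fetch(URL)
    five = sum(fetch(URL) for _ in range(5))
    print(f"1 request:  {one:.3f}s")
    print(f"5 requests: {five:.3f}s")  # roughly 5x the latency, not the bytes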

  8. #7
    Join Date
    Apr 2012
    Location
    Chester, Cheshire
    Posts
    329
    Thanks
    7
    Thanked 35 Times in 35 Posts

    Default

    Quote Originally Posted by djr33 View Post
    It's really not. This is a well-known strategy, and there is lots of information about it out there. [...]
    I take it you're trying to minimise the processing delay; I assume that in both cases the propagation, transmission, and queuing delays would remain the same. In extreme cases, this may shave 500 milliseconds from the total network delay. But when you consider that client-side delay may be 5,000ms or more, that extra 0.5 of a second is negligible.

    In most cases, minimising the processing delay will save between 50 and 100ms when benchmarking one CSS file versus three (I've tried this using Wireshark and OPNET IT Guru 14.5, with both synchronous and asynchronous transfers). This is also relatively negligible once you take client-side delay into account as well, including human reaction time. (You can't even get to "LOAD, YOU BAST..." before it's status 200.) I don't think anyone realistically expects a full page refresh in under a hundredth of a second.

    For image files this is slightly different, as the header defines the image size, number of colours, resolution, and other information needed to display the image. A text file like a CSS stylesheet contains less specific metadata in its header.

    I'd be interested in finding out how much gzipping a stylesheet affects its processing delay; changing its header to an archive-file header should, in theory, increase its processing delay, as there is more header data to process.
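
    A rough way to measure at least the decompression side of that is something like the sketch below ("style.css" is a placeholder, and the times depend entirely on the machine). For what it's worth, a gzip header is only about ten bytes, so most of any extra cost is the decompression itself rather than header parsing:

    Code:
    # Rough sketch: size saved by gzipping a stylesheet, and the time
    # it takes to undo. "style.css" is a placeholder file name.
    import gzip
    import time

    with open("style.css", "rb") as f:
        raw = f.read()

    packed = gzip.compress(raw)
    print(f"raw: {len(raw)} bytes, gzipped: {len(packed)} bytes")

    start = time.perf_counter()
    for _ in range(1000):
        gzip.decompress(packed)
    elapsed = (time.perf_counter() - start) / 1000
    print(f"average decompress time: {elapsed * 1000:.3f} ms")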

  9. #8
    Join Date
    Mar 2006
    Location
    Illinois, USA
    Posts
    12,164
    Thanks
    265
    Thanked 690 Times in 678 Posts

    Default

    One of the main reasons for the request-time concern isn't the server; it's the client. Surely you've experienced half of a page's images (or the stylesheet) missing for a few moments? Sometimes it simply takes longer to make the requests (and receive responses) because there are many of them. It's not necessarily because of the server; it's also about the end user's internet connection, which is often less stable than the server's, of course.
    Take the example of a mobile phone: each request can take a long time to connect, while once connected, the delay for 5-10kb is unnoticeable.

  10. #9
    Join Date
    Apr 2012
    Location
    Chester, Cheshire
    Posts
    329
    Thanks
    7
    Thanked 35 Times in 35 Posts

    Default

    That's what I was getting at.

    I agree with reducing delay to a minimum, but only up to a compromise. I wouldn't want to hurt the maintainability of the site just so that someone on a dial-up connection, or TalkTalk broadband, can view my site like normal users instead of waiting an extra second at most for the CSS to kick in fully.

    I know it may sound slightly cynical. I understand that minimising delay is important, but isn't maximising the usability and accessibility of your site just as important, if not more so?

  11. #10
    Join Date
    Mar 2006
    Location
    Illinois, USA
    Posts
    12,164
    Thanks
    265
    Thanked 690 Times in 678 Posts

    Default

    I don't see these concerns as significant for most instances of web design. The only one that matters is the user's experience. Our hosting accounts have reasonable limits we won't ever hit, and the difference between 5kb and 10kb for both storage and bandwidth really isn't a problem. Certainly it's relevant if we're also serving 100MB videos from the site, but that's another issue entirely. That 5kb difference isn't an important one.

    For a major site that has a lot of traffic and is facing server issues on a daily basis (like Google or Facebook), it's worth dealing with the server-optimising things. But in general, I think that's overkill. Improving the visitor's experience, however, is never overkill.
