View Full Version : Images take a long time to load?

Rich Walker
10-31-2006, 04:46 PM
Hi there,

I wonder if anyone could help me please. I am currently designing a website as a favour for a friend. It's my first time using HTML, Dreamweaver etc. So please bear with me...

The site is: www.afrotv.co.uk (http://www.afrotv.co.uk)

My problem is that the images take a long time to load (even on my broadband connection).

The size of the image folder is 716kb for the whole site.

Is there any way I could reduce the time it takes to load these images, or have I just made a complete mess of the design phase and made my images too big?

Hope someone can help.

Thanks a lot.

Best regards,

Rich Walker.

11-01-2006, 12:41 AM
You can make the images smaller (by compressing them more).
Your server might also be slow; moving to a faster server would help, but that's obviously a hassle.
It might be your computer if your cache is close to full or your system runs slowly. Try clearing your cache and running fewer things at once.
Aside from that, no, there is no way to transfer the data faster. Sorry.

11-01-2006, 08:46 AM
If you are dealing with a lot of images, it is inevitable that they will take some time to load.

Since you are using JPG images, I don't think they can be compressed much further; they are already compressed.

If you can, avoid unnecessary images on your site (you've used the spacer image extensively).

Why don't you try a table-less layout rather than a table-based one?

11-01-2006, 09:40 AM
JPGs can be recompressed. The format allows for 1-100% quality (0 might be allowed; I'm not sure). Anything above 90 is almost never distinguishable from 100, and usually 30 looks just fine.
Saving the JPGs at a lower quality can help.
Converting them might help as well in some cases; for example, if you have an image with just a few colors, a GIF would do well.
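The quality trade-off above is easy to see in a few lines of Python, assuming the Pillow library is installed (a rough sketch; the generated test image stands in for a real photo, which you would load with Image.open):

```python
from io import BytesIO

from PIL import Image

# Build a sample image in memory (stand-in for a real photo;
# in practice you would use Image.open("photo.jpg")).
img = Image.new("RGB", (400, 300))
for x in range(400):
    for y in range(300):
        img.putpixel((x, y), (x % 256, y % 256, (x * y) % 256))

# Save the same image at two JPEG quality settings and compare sizes.
high, low = BytesIO(), BytesIO()
img.save(high, format="JPEG", quality=90)
img.save(low, format="JPEG", quality=30)

print(f"quality 90: {len(high.getvalue())} bytes")
print(f"quality 30: {len(low.getvalue())} bytes")
```

The quality-30 file comes out far smaller than the quality-90 one; how much smaller depends entirely on the image content.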

Tables can be ok, but be creative in the use of images.... you can use background colors and other means to create a similar effect while not slowing down the page loading.

11-01-2006, 09:43 AM
hi djr33

thanks for the information

11-01-2006, 09:46 AM

Also note that compressing something twice will end up with lower quality than doing so once. Each time you compress something (anything), it loses some information; doing so twice means losing information twice.
For example, an image compressed straight to 30% will look better than the same image compressed to 70% and then to 30%.
The effect won't be awful, though, if you must recompress.
If possible, however, create the smaller version from the original full-quality image.
This is a huge reason to save the original full-quality version of any images you create.
The same principle applies to modifying a JPG and resaving it: you get generation loss there too. (Generation loss is the term for the information lost with each save.)
If you can, change the original and create a new JPG from that, rather than recompressing the already-compressed copy.
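Generation loss can be measured directly, again assuming Pillow is installed (a sketch; the generated gradient image is a made-up stand-in, and on most images the two-step version ends up at least as far from the original as the one-step version):

```python
from io import BytesIO

from PIL import Image, ImageChops, ImageStat


def jpeg_roundtrip(img, quality):
    """Save to JPEG at the given quality in memory and reload it."""
    buf = BytesIO()
    img.save(buf, format="JPEG", quality=quality)
    buf.seek(0)
    return Image.open(buf).convert("RGB")


def rms_error(a, b):
    """Root-mean-square pixel difference between two images."""
    stat = ImageStat.Stat(ImageChops.difference(a, b))
    return sum(stat.rms) / len(stat.rms)


# Stand-in image (use Image.open("original.png") on a real file).
original = Image.new("RGB", (256, 256))
for x in range(256):
    for y in range(256):
        original.putpixel((x, y), (x, y, (x + y) % 256))

once = jpeg_roundtrip(original, 30)                       # original -> 30
twice = jpeg_roundtrip(jpeg_roundtrip(original, 70), 30)  # original -> 70 -> 30

print(f"compressed once:  RMS error {rms_error(original, once):.2f}")
print(f"compressed twice: RMS error {rms_error(original, twice):.2f}")
```

Both resaves lose information relative to the original; each extra compress/decompress cycle is another chance for error to accumulate.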

11-01-2006, 10:26 AM
Thanks a lot for the image compression information. It was really helpful :)

Rich Walker
11-01-2006, 11:19 AM
Hi guys,

Thanks SO much for all your helpful replies! :)

Is there any way I can compress my JPEGs without having to re-slice everything in Fireworks and link it all up again?

What do you think of the general aesthetics of the site?

The server that the site uses is based in Africa; would this make a difference to the loading time?

Thanks again for all your help, it is greatly appreciated.



11-01-2006, 12:08 PM
While searching the net I found some tools for compressing JPG images. Please check out this (http://jpeg-compression.qarchive.org/).

Hope you can find something useful there.

11-02-2006, 04:27 AM
From my VERY basic understanding of servers and such.... this is basically how it works:
1. Your computer makes a request for data from a server.
2. That request is sent up through many levels:
--ISP (internet service provider)
--local central station
--general station
3. Eventually it hits the top, which, in this case, would likely be a US national server of some sort.
4. The request is transferred to Africa.
5. The response goes back down to your computer through the same sort of path the request went up.
I *think* this is right, just based on casual observations I've made.

I did a test of the data rate for downloading from a company that tests from different locations in the US.
The San Francisco server was more than twice as fast as the server in New York (I'm very close to SF).

So... yes, location does matter.

Have the server as close to [the majority of] the page's viewers as possible. (This is a big thing to change, so might not work out, which is understandable.)
This may be part or all of the reason the images are slow.... I'm not sure.
I'd suggest testing on different computers from different places to get a general idea of how it works before making big decisions/conclusions.

As for slicing an image....
1. Save the original.
2. You need only replace the specific parts that have changed... just those chunks of the image.
3. ImageReady's integration with Photoshop is nice, so you CAN save the slices and keep working from the original.
4. Not sure about Fireworks. Hopefully there is a similar answer.

Remember, hit Save, not Save As or Save for Web... Save keeps the ORIGINAL, which you can edit later. Save As and Save for Web output [only] a copy for the web. Don't delete the original, and always edit the original.
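One way to recompress sliced JPGs without touching the slices at all is to re-save the image files themselves, keeping the file names (and therefore all the HTML links) unchanged. A minimal sketch with Python and the Pillow library, assuming it is installed; the folder names are made up:

```python
import os

from PIL import Image


def recompress_folder(src_dir, dst_dir, quality=60):
    """Re-save every JPG in src_dir at the given quality into dst_dir,
    keeping file names (and therefore all the slice links) unchanged."""
    os.makedirs(dst_dir, exist_ok=True)
    for name in os.listdir(src_dir):
        if name.lower().endswith((".jpg", ".jpeg")):
            img = Image.open(os.path.join(src_dir, name)).convert("RGB")
            img.save(os.path.join(dst_dir, name), format="JPEG",
                     quality=quality)
```

Usage would be something like recompress_folder("images", "images_small", quality=60), then uploading the new folder in place of the old one. Note the generation-loss caveat from earlier in the thread: re-saving from the full-quality originals is better than recompressing already-compressed copies.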

11-02-2006, 05:08 PM
From my VERY basic understanding of servers and such....

Search for information on routing. It's not a particularly difficult topic to understand, but the details are quite involved. It's much easier to understand with diagrams, and based on knowledge of network devices and topographies. A good book or site on the subject should provide all of that.

2. that request is sent up through many levels:
--ISP (internet service provider)
--local central station
--general station
3. eventually it hits the top, which, in this case, would likely be a US national server of some sort.

The Internet couldn't be that strictly hierarchical. If a single country-wide network were responsible for linking to all other nations, it would be overworked, and the effect of a failure would be too dramatic.

5. the request goes back down to the server through the same sort of path it went up.

Packets are routed individually. A router might send them in many directions if the most direct routes become congested - should it care about things such as quality of service (QoS). This is one reason why large networks should have redundant links: if one breaks down or becomes too heavily used, data can be directed through a different path.

So... yes, location does matter.

Generally, yes. As the number of hops (router-to-router steps) increases, so does the time to get data from one side of the connection to the other. However, geographically near locations may still be relatively far apart on the network.
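The hops-versus-latency point can be sketched as a toy calculation (the per-hop latencies below are made up for illustration; real routes vary widely and latency is not simply additive per hop):

```python
# Toy illustration: round-trip time grows with the number and
# length of hops. Latencies here are invented example values.

def round_trip_ms(hop_latencies_ms):
    """Total round-trip time: each hop's latency is paid twice
    (once for the request, once for the response)."""
    return 2 * sum(hop_latencies_ms)


nearby_route = [1, 2, 3]                # e.g. a server in the same region
distant_route = [1, 2, 15, 40, 80, 25]  # e.g. an intercontinental route

print(f"nearby:  {round_trip_ms(nearby_route)} ms")
print(f"distant: {round_trip_ms(distant_route)} ms")
```

In practice you would measure this with a tool like traceroute rather than estimate it, but the shape of the argument is the same: more (and longer) hops, more waiting.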


11-02-2006, 10:42 PM
The general idea is all I claim to be correct... the info goes up a line of servers to a "top" point, then transfers to another server and starts going back down.
Thanks for clearing that up :)