
Google Upgrades WebP To Challenge PNG Image Format

Soulskill posted more than 2 years ago | from the more-cat-pictures-per-gigabyte dept.

New submitter more writes with news that Google has added to its WebP image format the ability to losslessly compress images, and to do so with a substantial reduction in file size compared to the PNG format. Quoting: "Our main focus for lossless mode has been in compression density and simplicity in decoding. On average, we get a 45% reduction in size when starting with PNGs found on the web, and a 28% reduction in size compared to PNGs that are re-compressed with pngcrush and pngout. Smaller images on the page mean faster page loads."

249 comments

NIH (3, Insightful)

Anonymous Coward | more than 2 years ago | (#38100222)

Why not update the png format? See subject.

Re:NIH (5, Insightful)

retech (1228598) | more than 2 years ago | (#38100298)

Because that requires a committee and would take 10x as long, if ever, to get done.

Re:NIH (1)

sstamps (39313) | more than 2 years ago | (#38100412)

It shouldn't. It's not as if they didn't set the PNG format up to be extensible in exactly this way.

If it truly is a significant innovation, it should sail through the standards approval process as a recognized extension.

Re:NIH (4, Insightful)

Anonymous Coward | more than 2 years ago | (#38100560)

TIFF exists. The world doesn't need another file format where most clients don't implement the full standard and the user can never expect a file in that format to be reliably readable everywhere.

Re:NIH (3, Insightful)

Anonymous Coward | more than 2 years ago | (#38100580)

If it truly is a significant innovation, it should sail through the standards approval process

Hahahahahahahahahahahahahahahahaaaaaaa

Wow, you've never actually dealt with a standards body before, have you?

Re:NIH (1)

Anonymous Coward | more than 2 years ago | (#38100650)

I agree, just try to read the ECMA OOXML standard [ecma-international.org] in its entirety before it's obsolete.

Re:NIH (5, Insightful)

Anonymous Coward | more than 2 years ago | (#38100596)

But extensions are good for adding information, not removing it. You could probably implement whatever compression enhancements Google made to WebP in PNG through extensions, but probably not in a way that lets old versions of libpng still produce usable results while still having a reduced file size. At which point it doesn't really matter whether you add it to WebP or PNG; the backward-compatibility benefit of PNG extensions can't be exploited either way.

Re:NIH (5, Insightful)

Kjella (173770) | more than 2 years ago | (#38100600)

If it truly is a significant innovation, it should sail through the standards approval process as a recognized extension.

Which is not actually that helpful, because then you have tons of PNG-capable applications that can't read PNGs. TIFF used to be this way, where TIFF actually means it can be compressed like ten different ways and support was very mixed. If you have a significant new non-backwards compatible format, just releasing it as a new format is maybe just as easy.

Re:NIH (5, Interesting)

0123456 (636235) | more than 2 years ago | (#38100832)

Which is not actually that helpful, because then you have tons of PNG-capable applications that can't read PNGs. TIFF used to be this way, where TIFF actually means it can be compressed like ten different ways and support was very mixed.

Only ten different ways? Back in the early 90s I was creating TIFF files that I doubt anyone can display these days; we had our own TIFF tags assigned and could compress files however we wanted to.

This is why TIFF was:

1. Very useful for app developers.
2. A total disaster for interoperability.

Re:NIH (0)

Anonymous Coward | more than 2 years ago | (#38101112)

If it truly is a significant innovation, it should sail through the standards approval process as a recognized extension.

Which is not actually that helpful, because then you have tons of PNG-capable applications that can't read PNGs. TIFF used to be this way, where TIFF actually means it can be compressed like ten different ways and support was very mixed. If you have a significant new non-backwards compatible format, just releasing it as a new format is maybe just as easy.

The PNG format is made of chunks. So one can still keep the current ones and simply have new ones for the lossless stuff.

The new chunks will be ignored by legacy software, but it will still be able to display the "lossy" image using the bits it understands. Newer software will be able to read the whole image in its full glory.
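The chunk mechanism the parent describes is simple enough to sketch. The toy Python below (my own illustration, not from TFA; the `nwCp` chunk name and its payload are made up) builds a minimal PNG containing an unrecognized ancillary chunk and walks the chunk list, which is exactly what lets old decoders skip extensions they don't understand:

```python
import struct
import zlib

def make_chunk(ctype: bytes, data: bytes) -> bytes:
    """A PNG chunk: 4-byte big-endian length, 4-byte type, data, CRC-32."""
    return (struct.pack(">I", len(data)) + ctype + data
            + struct.pack(">I", zlib.crc32(ctype + data)))

def iter_chunks(png: bytes):
    """Yield (type, data) pairs; decoders skip ancillary chunk types they
    don't recognize, which is what makes the format extensible."""
    assert png[:8] == b"\x89PNG\r\n\x1a\n", "not a PNG"
    pos = 8
    while pos < len(png):
        (length,) = struct.unpack(">I", png[pos:pos + 4])
        yield png[pos + 4:pos + 8], png[pos + 8:pos + 8 + length]
        pos += 12 + length  # length field + type + data + CRC

# A 1x1 grayscale PNG with a made-up ancillary chunk "nwCp" in the middle
# (the lowercase first letter marks it as safe for old decoders to ignore).
ihdr = struct.pack(">IIBBBBB", 1, 1, 8, 0, 0, 0, 0)
png = (b"\x89PNG\r\n\x1a\n"
       + make_chunk(b"IHDR", ihdr)
       + make_chunk(b"nwCp", b"hypothetical new compression payload")
       + make_chunk(b"IDAT", zlib.compress(b"\x00\x7f"))
       + make_chunk(b"IEND", b""))

print([ctype.decode("ascii") for ctype, _ in iter_chunks(png)])
```

A legacy decoder would render the IDAT pixels and drop `nwCp` on the floor; the catch, as noted above, is that the baseline image still has to carry full-size data for that fallback to work.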

Re:NIH (5, Informative)

petermgreen (876956) | more than 2 years ago | (#38100794)

One of the key design features of PNG was that any PNG should be readable by any decoder. That is why PNG has relatively few options for how the core data is encoded.*

Adding optional stuff is OK (unless it's animation......), but if you want to make a key change to the core of the format, I suspect the PNG guys would tell you to go make your own format based on PNG but with its own specification, file extension and "magic number" (as was done for MNG and JNG).

* A handful of filter types, all of which are easy to implement; one compression algorithm; one byte-order standard; 15 allowed color/bit-depth combinations (the majority of which represent very common combinations, and all of which can be easily mapped to 24-bit RGB).

Re:NIH (2, Insightful)

Trillan (597339) | more than 2 years ago | (#38100320)

WebP lossy may not catch on, but it isn't pointless. Compared to JPEG, in return for a muddier image (to my eyes, at least) you get alpha support. As Google is one of the biggest distributors of images on the Internet, I think the real purpose is to pay less for licensing JPEG.

WebP lossless seems much less useful to me. Unless there's licensing issues I'm not aware of, it seems pretty pointless.

Re:NIH (4, Insightful)

BitZtream (692029) | more than 2 years ago | (#38100452)

You don't have to pay for a JPEG license, try again.

Re:NIH (0)

Anonymous Coward | more than 2 years ago | (#38100624)

Only the old JPEG is license free. JPEG 2000 is not.

Re:NIH (-1)

Anonymous Coward | more than 2 years ago | (#38100656)

[citation please]

This is just FUD.

Re:NIH (4, Informative)

ThePhilips (752041) | more than 2 years ago | (#38100822)

Here you go, boy. [lmgtfy.com]

Right now JPEG org promises that you will not be sued for implementing the basic JPEG 2000.

Re:NIH (1)

ubrgeek (679399) | more than 2 years ago | (#38100548)

How are they the biggest distributor? Isn't most of their stuff images other people are hosting (image search) or upload themselves (picasa)?

Re:NIH (3, Interesting)

PhilHibbs (4537) | more than 2 years ago | (#38100920)

The images you get from Image Search are Google's version of the image, which has been resized to fit the search layout. I would still be surprised if that made Google number one; I would have thought Akamai would take the top slot, or Facebook.

Re:NIH (1)

ubrgeek (679399) | more than 2 years ago | (#38101098)

I didn't know that. Thanks for the clarification (timely, too. Someone actually asked me a google image related question so this helps me respond.)

Re:NIH (4, Interesting)

Guspaz (556486) | more than 2 years ago | (#38100806)

JPEG XR produces images similar to JPEG-2000 while having complexity similar to JPEG, supports transparencies, requires support for lossless compression (unlike JPEG) since lossless is just a quantizer setting, and it's already supported by IE9.

That last bit is probably the most important part. IE's market share is shrinking, but it's still big enough that any format it doesn't support is unlikely to see widespread adoption as the only format available on a site. I doubt IE will ever support WebP, and as such, no website will ever really be able to use WebP. Not unless they do browser detection, and most sites won't bother with multiple image compression formats; they're going to pick the best common one they can, which is currently PNG or JPEG.

Remember PNG alpha support... Until IE supported it, nobody really used it. Once IE did, it became mainstream.
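For what it's worth, the browser detection mentioned above doesn't have to be ugly UA sniffing: a server can key off the request's Accept header, since WebP-capable browsers advertise `image/webp` there. A minimal sketch (the helper name is hypothetical, and the header strings are illustrative):

```python
def pick_image_format(accept_header: str) -> str:
    """Hypothetical helper: serve WebP only to clients that explicitly
    advertise image/webp in their Accept header, else fall back to PNG.
    (Deliberately ignores wildcards like image/* to stay conservative.)"""
    offered = {part.split(";")[0].strip() for part in accept_header.split(",")}
    return "image/webp" if "image/webp" in offered else "image/png"

# A WebP-capable browser advertises the type; an older one does not.
print(pick_image_format("image/webp,image/*,*/*;q=0.8"))       # image/webp
print(pick_image_format("image/png,image/*;q=0.8,*/*;q=0.5"))  # image/png
```

Of course, this still means storing two copies of every image, which is exactly the bother most sites will skip.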

Re:NIH (1)

SanityInAnarchy (655584) | more than 2 years ago | (#38101040)

most sites won't bother with multiple image compression formats,

Really? I'd think sites would enjoy a 50% reduction in bandwidth in supported browsers, even if they don't get it for IE.

50% reduction of images, not all bandwidth (0)

Anonymous Coward | more than 2 years ago | (#38101426)

That might only amount to 5-10% reduction of total bandwidth.

Re:NIH (2)

sstamps (39313) | more than 2 years ago | (#38100322)

Yeah, I was thinking that it sounds like a good candidate for the long-awaited compression method 1. :P

Re:NIH (1)

yog (19073) | more than 2 years ago | (#38100382)

Google has open-sourced the WebP code and utilities, so (I think) this format will not be encumbered by patents or licensing issues. That is a great contribution in itself. I continue to be amazed by Google and its ability to make money while giving stuff away.

Re:NIH (1, Insightful)

BitZtream (692029) | more than 2 years ago | (#38100462)

As opposed to PNG and JPEG which are both open and have no patent or license issues either?

Re:NIH (1)

yog (19073) | more than 2 years ago | (#38101342)

Not as opposed to PNG and JPEG, simply as a (possibly superior) alternative. JPEG did have some patent issues which fortunately have been resolved (google "jpeg patent"). Probably every graphical format that is successful will attract the attention of lawyers of patent holders and patent trolls. Hopefully Google has thoroughly vetted this technology. Pretty pathetic the hoops one has to jump through these days to create something as abstract as a computer image standard.

Re:NIH (1)

larry bagina (561269) | more than 2 years ago | (#38100496)

Google might not have any patents on the algorithms but there may be a patent troll that does.

Re:NIH (1)

stevenvi (779021) | more than 2 years ago | (#38100512)

Open source != patent free.

I'm not saying that it is patent encumbered, but just pointing out the flaw in your assumption. So long as there is a guarantee that it will be free to use forever, I see no reason why modern browsers shouldn't implement it. What's the downside?

Re:NIH (2)

Daniel_Staal (609844) | more than 2 years ago | (#38100962)

So long as there is a guarantee that it will be free to use forever, I see no reason why modern browsers shouldn't implement it. What's the downside?

Extra code that has to be written, loaded, run, tested, and maintained. This leads to application size bloat, larger memory footprints, and more work for developers.

It may be worth it, if the format is a significant advance. But it's not cost-free, to either the developers or the users.

Re:NIH (1)

marcello_dl (667940) | more than 2 years ago | (#38100566)

open source != free, for a reason...
If they license the patents involved together with the source code in a FOSS-compatible way, good.
If they don't try to pull an Android and divide users into a group with the latest and greatest implementation (Chrome users) and everyone else, great.
I shouldn't be criticizing them ahead of time, though, so let's see.

Re:NIH (5, Insightful)

timeOday (582209) | more than 2 years ago | (#38100694)

Why not update the png format?

Recycling a name for a new incompatible format is a terrible idea. If I have a png image and software that supports pngs, I should be able to read that image, period.

Re:NIH (4, Funny)

Ichijo (607641) | more than 2 years ago | (#38101344)

Recycling a name for a new incompatible format is a terrible idea. If I have a png image and software that supports pngs, I should be able to read that image, period.

And that goes double for .avi files!

Re:NIH (0)

Anonymous Coward | more than 2 years ago | (#38100764)

How about Google learn to optimally compress PNGs on their own sites first?

30192 Downloads/nav_logo_hp2.png
29553 nav_logo_hp2_opt.png

Flawed, Misleading or Fraudulent? (0)

Anonymous Coward | more than 2 years ago | (#38101128)

Google are converting to WebP 'lossless', which their own page shows is not truly lossless, and then back to PNG. If I try to further compress Google's PNG samples I cannot; however, if I grab the originals and resize using "convert orig.png -resize out.png", then I can equal WebP file sizes in (truly lossless) PNG by using 7z deflate...

27451 FizyplanktonOpt.png
33073 FizyplanktonOrig.png

Sterling work as ever guys...

"Yesterday I cudn't spell gogal enginear and now I are one"!

Re:NIH (1)

stanlyb (1839382) | more than 2 years ago | (#38101110)

Because it is GOOGLE. They once again have to prove that they can actually do something better..... Nice try anyway; maybe next time, googlers. Don't give up.

Transparency yet? (0)

Superken7 (893292) | more than 2 years ago | (#38100280)

Last time around, this was planned as an upgrade; has the alpha channel made it into the improvements? It's a bit sad to release an image format for the future of the web that doesn't support transparency, IMHO.

Re:Transparency yet? (4, Informative)

The MAZZTer (911996) | more than 2 years ago | (#38100308)

Yes, it's in TFA url and title. :)

Re:Transparency yet? (2)

BrandonJones (1581809) | more than 2 years ago | (#38100376)

The article title is "Lossless and transparency encoding in WebP", so I'd say that's a yes on transparency.

Animation? (1, Funny)

Comboman (895500) | more than 2 years ago | (#38101350)

Lossless compression and transparency are nice, but unless it allows for looping animated pictures of cats, I'm sticking with GIF.

Awesome (3, Insightful)

Anonymous Coward | more than 2 years ago | (#38100304)

Another unsupported format from Google.

It's interesting how successful they are at dominating/directing so many areas of the Internet, but they seem so ineffectual in other areas like this and the video format they are trying to get the world to switch to.

Re:Awesome (4, Interesting)

Xanny (2500844) | more than 2 years ago | (#38101324)

They are converting all of YouTube to WebM, and it is the only royalty-free web video codec. I'm pretty sure they will beat H.264 in the long run, because free wins in the end. The fact that the encoding is behind the scenes doesn't matter. In a decade, HTML5 video will be defined by WebM, because no one wants to license H.264 for encoding products.

Is google's image format ICC capable? (4, Interesting)

Jackie_Chan_Fan (730745) | more than 2 years ago | (#38100344)

... because Chrome is STILL NOT color managed.

Re:Is google's image format ICC capable? (1)

SadButTrue (848439) | more than 2 years ago | (#38100622)

Too lazy to look that up... What does that mean?

Re:Is google's image format ICC capable? (2, Funny)

Anonymous Coward | more than 2 years ago | (#38100736)

If you played WoW you would know. ICC is Ice Crown Citadel, home of the Lich King formerly known as Prince Arthas. What an outdated dungeon in an MMO has to do with Chrome displaying a new image format I'm unsure of.

Re:Is google's image format ICC capable? (0)

MurukeshM (1901690) | more than 2 years ago | (#38101172)

So Prince Arthas became the Lich King? Sad. I have this jinx when it comes to some games, never getting past a certain point. Not that I can't get through, but something happens to screw my system...

Re:Is google's image format ICC capable? (1)

Guspaz (556486) | more than 2 years ago | (#38100828)

Colour profiles. Chrome ignores them. On my computers, every image I open up in Chrome is oversaturated compared to opening it in an image editor like Photoshop.

Re:Is google's image format ICC capable? (1)

SadButTrue (848439) | more than 2 years ago | (#38101216)

Sorry, I really had never heard of this.
So this is an OS setting?
I assume this would be useful for correcting for differences in displays? If so, how would it differ from the hardware adjustment on monitors?

Re:Is google's image format ICC capable? (4, Informative)

DarkXale (1771414) | more than 2 years ago | (#38100800)

>"Last month we announced WebP support for animation, ICC profile, XMP metadata and tiling."
I assume that's a 'yes'.

Re:Is google's image format ICC capable? (0)

Anonymous Coward | more than 2 years ago | (#38100810)

Yep, color profiles are in [google.com] .

This is a big deal! (0)

Dr. Spork (142693) | more than 2 years ago | (#38100362)

I didn't realize it was even possible to make such a big improvement in lossless image compression. The web definitely needs it - any smartphone user that pays by data volume can confirm this.

Re:This is a big deal! (5, Informative)

Trixter (9555) | more than 2 years ago | (#38100614)

I didn't realize it was even possible to make such a big improvement in lossless image compression.

You falsely assume that PNG was state-of-the-art in lossless compression. PNG took a great idea (filter the image and take advantage of the 2-D correlation present in most real-world images) and coupled with it a terrible idea (zlib for the back-end compression of the filter output). You're supposed to do order-0 compression (i.e. statistical, like Huffman coding) on the filter residuals, not pattern-match searching (zlib). zlib is a great piece of software, but like all tools, there are things it is very well-suited for and others it is not well-suited for. This was a misstep by the PNG team.

The choice the PNG people made was fueled by the Unisys GIF/LZW patent of the time, and at that time IBM also had a patent on range coders. So I guess it's understandable why they didn't use those order-0 methods on the filter residuals. But it was a huge mistake to knee-jerk away from ALL statistical methods and choose zlib as the back-end. They could have used basic Huffman; not sure why they didn't.
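The filtering-vs-back-end split Trixter describes is easy to demonstrate on synthetic data. The toy Python below (my own illustration, not from TFA) applies PNG's Sub filter to a noisy ramp and compares zlib's output sizes:

```python
import random
import zlib

# Toy demo of the PNG "Sub" filter idea: predict each byte from its left
# neighbour and compress the residuals instead of the raw samples.
random.seed(0)

# A drifting ramp with sensor-like noise: raw bytes wander over the whole
# 0..255 range, so the LZ77 match finder gets few exact repeats.
pixels = [0]
for _ in range(16383):
    pixels.append((pixels[-1] + random.choice((-1, 0, 1, 2))) % 256)
raw = bytes(pixels)

# Sub filter: residual[i] = raw[i] - raw[i-1] (mod 256). The residual
# alphabet collapses to just {-1, 0, 1, 2}, which an order-0 coder loves.
residual = bytes((raw[i] - (raw[i - 1] if i else 0)) & 0xFF
                 for i in range(len(raw)))

print(len(zlib.compress(raw, 9)), len(zlib.compress(residual, 9)))
```

On this input the filtered stream compresses several times smaller, because the residuals have low order-0 entropy even though the raw bytes offer almost nothing for pattern matching.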

Re:This is a big deal! (4, Informative)

Edgewize (262271) | more than 2 years ago | (#38101228)

That seems like an oversimplification, since the DEFLATE algorithm includes a Huffman coding step, and it is within spec for the compressor to simply never emit back-references. It would be a horrible bug in the implementation of zlib to have worse compression performance than basic Huffman encoding.
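Indeed, zlib itself lets you test that claim: its `Z_HUFFMAN_ONLY` strategy disables the back-reference search entirely, leaving only the entropy-coding step. A quick sketch (the input text is arbitrary):

```python
import zlib

data = b"the quick brown fox jumps over the lazy dog. " * 200

def deflate(payload: bytes, strategy: int) -> bytes:
    """Raw DEFLATE via zlib with an explicit strategy selection."""
    comp = zlib.compressobj(9, zlib.DEFLATED, 15, 9, strategy)
    return comp.compress(payload) + comp.flush()

default_size = len(deflate(data, zlib.Z_DEFAULT_STRATEGY))
huffman_size = len(deflate(data, zlib.Z_HUFFMAN_ONLY))  # no back-references
print(default_size, huffman_size, len(data))
```

On repetitive input the full LZ77+Huffman pipeline wins by a wide margin, but even Huffman-only output stays below the raw size, matching the grandparent's point that DEFLATE never does worse than order-0 coding by much.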

Re:This is a big deal! (2)

Megane (129182) | more than 2 years ago | (#38100712)

Any smartphone user that pays by data volume would probably be better off with lossy image compression.

Re:This is a big deal! (0)

Anonymous Coward | more than 2 years ago | (#38101070)

Again, depends on the image in question. There are some images where PNGs will be (significantly) smaller than even compression heavy JPG, while remaining completely non-lossy along with Alpha support. Websites are not exactly unlikely to run into such images.

Lossless image compression is a big deal (0)

Anonymous Coward | more than 2 years ago | (#38100368)

Very important for certain classes of photos, medical images for example. Doctors cannot allow any loss to the image due to liability and the chance of the "lost" resolution of the image leading to a missed or incorrect diagnosis. Another is pictures of the Golden Girls. No way I want any loss of quality there.

Which Golden Girl would you plow first? last?

TIA

Re:Lossless image compression is a big deal (2)

Jeng (926980) | more than 2 years ago | (#38100550)

Which Golden Girl would you plow first?

The cosmonaut of course.

Really?? (-1, Troll)

colsandurz45 (1314477) | more than 2 years ago | (#38100430)

Smaller images on the page mean faster page loads.

Really?

Re:Really?? (0)

Anonymous Coward | more than 2 years ago | (#38100610)

Yes, really.

If the emphasis is on compression... (3, Interesting)

Tastecicles (1153671) | more than 2 years ago | (#38100446)

...doesn't anyone think it might be time to revisit fractal image compression [ucsd.edu] and maybe look at ways of improving iterated function systems and their associated algorithms (I might give Mike Barnsley a call and ask him how his IFS patents are developing if you're nice and mod me up)?

Re:If the emphasis is on compression... (0)

Anonymous Coward | more than 2 years ago | (#38100594)

Whatever happened to the FIF format? I remember a cover (floppy) disk with some images compressed this way and was pretty impressed for the time; then something about it being used in Microsoft Encarta (or the dark, dark ages), then nothing.

Re:If the emphasis is on compression... (1)

Tastecicles (1153671) | more than 2 years ago | (#38100754)

I remember that. I was not at all surprised that Microsoft, in the hunt for a compression format they didn't have to pay royalties on (ISI was a community project back then), literally stole the idea of using lossy compression, texturing and wavelets to cram as much image information as they possibly could onto one CD. They took IFS young, as I remember, because the resultant images were unusable anywhere but a 14" CRT. If they'd waited a year they'd've had to have paid royalties to Mike but at least they would have had a workable algorithm (not to mention better quality images in their product) rather than what they ended up with, which was almost purely the result of many thousands of man-hours of bulletin board conversations and napkin chickenscratch.

Re:If the emphasis is on compression... (2)

Trixter (9555) | more than 2 years ago | (#38100658)

...doesn't anyone think it might be time to revisit fractal image compression [ucsd.edu] and maybe look at ways of improving iterated function systems and their associated algorithms?

Considering that the best results were obtained using college grads as the compression engine [dogma.net] , probably not.

Re:If the emphasis is on compression... (1)

Guspaz (556486) | more than 2 years ago | (#38100852)

It could be the first image compression that uses Mechanical Turk as a core component ;)

Re:If the emphasis is on compression... (1)

Anonymous Coward | more than 2 years ago | (#38100714)

Fractal compression is a lossy compression method. This is lossless.

Any guesses on when IE will natively support WebP (0)

Anonymous Coward | more than 2 years ago | (#38100490)

I'm thinking...maybe 2025. Yeah, that sounds about right.

American innovation at work! (-1)

Anonymous Coward | more than 2 years ago | (#38100500)

And this is exactly why we need SOPA. Innovation like this would not be possible anymore if we let rogue foreigners pirate our IP. Please help reelect such fine representatives as the bill's introducer Lamar Smith (R) and true patriot co-sponsors such as Bob Goodlatte (R), Dennis R. Ross (R), Elton Gallegy (R), Marsha Blackburn (R), Mary Bono Mack (R), Steve Chabot (R), Timothy Griffin (R), Lee Terry (R), Mark Amodei (R), John Carter (R), Peter King (R), Thomas Marino (R), Alan Nunnelee (R), Steve Scalise (R). Bring back home the $135 billion being stolen from this country by pirates and counterfeiters.

Re:American innovation at work! (1)

RoccamOccam (953524) | more than 2 years ago | (#38101454)

I'm sure that you didn't mean to leave out all of the esteemed Democratic representatives who are co-sponsoring the bill: Rep. John Barrow [D, GA-12], Rep. Karen Bass [D, CA-33], Rep. Howard Berman [D, CA-28], Rep. John Conyers [D, MI-14], Rep. Ted Deutch [D, FL-19], Rep. Ben Luján [D, NM-3], Rep. William Owens [D, NY-23], Rep. Adam Schiff [D, CA-29], Rep. Debbie Wasserman Schultz [D, FL-20], Rep. Melvin Watt [D, NC-12].

Smaller images on the page mean faster page loads? (1)

knifeyspooney (623953) | more than 2 years ago | (#38100534)

Gol-ly! Is there anything those Google engineers don't know?

An even better way to decrease page load time: (5, Interesting)

larry bagina (561269) | more than 2 years ago | (#38100616)

block google analytics.

Re:An even better way to decrease page load time: (1)

webheaded (997188) | more than 2 years ago | (#38100674)

No kidding. I started putting those at the very bottom of the pages and that seemed to help somewhat. I'll take improvements where I can get them. :p

Re:An even better way to decrease page load time: (1)

Frosty Piss (770223) | more than 2 years ago | (#38101166)

I started putting those at the very bottom of the pages...

That's where everyone else was already putting the Google Analytics code. That's where Google suggests you put it...

It's a service like any other that Google offers: If it's not useful to *YOU*, don't use it on your site.

There are alternatives, but they too affect load time.

If you make a significant $$ off of AdSense, Google Analytics can be very useful.

Just like no one is forcing anyone to post gobs of personal stuff on Facebook, no one is forcing webmasters to use Google Anything.

Re:An even better way to decrease page load time: (0)

Anonymous Coward | more than 2 years ago | (#38101266)

Analytics? Have you guys never heard of AdBlock plus, Ghostery and, you know, just compressing the entire HTTP connection transparently? (Yes, this can still allow different compression algorithms for different mime types.)

What about HDRI? (2)

art6217 (757847) | more than 2 years ago | (#38100638)

Why, with today's bright screens, does no one implement high dynamic range imaging [wikipedia.org] in both GUI environments and common image formats?

"Paper white" is still "all bits on"...

Re:What about HDRI? (1)

Guspaz (556486) | more than 2 years ago | (#38100922)

Most modern screens can't display deep colour. My Dell U2711 can do it, but you really have to buy a high-end or professional display like it to get 10-bit colour support. Since most displays can't show it, there's not all that much demand to support it.

Re:What about HDRI? (1)

Mr.Z of the LotFC (880939) | more than 2 years ago | (#38101362)

Most modern displays also tend not to be able to show anything over 1920x1080 (or thereabouts, often much smaller), & yet we still use formats that support much larger images. Just as you can zoom, you can change the display exposure with an HDR image. I suspect the lack of support is more related to most cameras not producing HDR, while they do have resolutions higher than monitors.

Re:What about HDRI? (1)

Ichijo (607641) | more than 2 years ago | (#38101402)

Since most displays can't show it, there's not all that much demand to support it.

It's a chicken-and-egg problem. What's the cheapest way to break out of the loop: make HDR displays, or give WebP support for HDR?

Re:What about HDRI? (1)

Anonymous Coward | more than 2 years ago | (#38101012)

What you describe as HDRI is not so much "something to be implemented". Any image format with sufficient bit depth can be tone-mapped, or have its contrast curves adjusted, to display "HDRI".

The "cool looking photos" in those articles are simply the result of tone compression.

Your eyes can only do this in a real environment by taking some guesses and actually adjusting to the ambient light. When you look at a dark part of the scene, you take in more light; when you look at a light part, you take in less, and your brain converges the two into a single mental image. Photographers do this and then, instead of sending it to you as a mental image, they use a tone-mapping curve (tone compression) to make it look alright on a static display.

I'm curious what you mean by "implements high dynamic range imaging" in image formats? I'm not sure that it even makes sense, what you're saying, but I wanted to let you clarify before I dismissed it.
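The tone compression described above can be sketched with the classic Reinhard global operator; this particular curve and the gamma value are my choices for the example, not anything from TFA:

```python
def reinhard(luminance: float) -> float:
    """Reinhard global operator: maps scene luminance in [0, inf) into
    [0, 1), compressing highlights instead of clipping them."""
    return luminance / (1.0 + luminance)

def to_8bit(luminance: float, gamma: float = 2.2) -> int:
    """Quantize the tone-compressed value through a display gamma curve."""
    return round(255 * reinhard(luminance) ** (1.0 / gamma))

# Several stops of scene luminance squeezed into one 8-bit display range.
for lum in (0.05, 0.5, 1.0, 10.0, 100.0):
    print(lum, "->", to_8bit(lum))
```

Note how a 2000x spread in input luminance lands on codes that all fit in 0..255: that compression is the whole trick, and it is exactly what an HDR-capable display would let you avoid.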

Re:What about HDRI? (1)

gl4ss (559668) | more than 2 years ago | (#38101076)

..why wouldn't it be? that's the info that goes to the monitor.

you want hdr on your desktop, use a hdr-composited image as a background.

Re:What about HDRI? (1)

nomel (244635) | more than 2 years ago | (#38101212)

Using a tone-mapped image is not the same as an HDR image or display.

Re:What about HDRI? (1)

nomel (244635) | more than 2 years ago | (#38101186)

I think you're confused about HDR. Max brightness will always be "all bits on". It's only paper white because that's the absolute max brightness your display can show!

You need an HDR display to view HDR images; otherwise you're just doing tone mapping. The examples shown in that wiki are not HDR images, they're tone-mapped images. Their dynamic range is exactly the dynamic range of all the other pictures you've seen today: 3 color channels, 256 levels (8 bits) per channel. High dynamic range displays require brighter backlights to make the higher *dynamic range* possible; otherwise you're just increasing bits per pixel and reducing color banding. You'll never find an HDR display in anything powered with our current battery tech because of this.

For a realistic idea of an HDR display, here's an interesting review: http://www.bit-tech.net/hardware/2005/10/04/brightside_hdr_edr/5 [bit-tech.net]

Re:What about HDRI? (0)

Anonymous Coward | more than 2 years ago | (#38101356)

I imagine you could use 16-bit PNG with the gamma adjusted to give it a roughly logarithmic scale: a cheap simulation of floating point, if you will.
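The parent's suggestion can be sketched numerically: a log scale gives each 16-bit code step a constant relative error, which is the "cheap floating point" effect. The luminance range and constants below are arbitrary choices for the example:

```python
import math

# Arbitrary constants for the sketch: cover eight decades of luminance
# with 16-bit codes so each step is a constant ratio (like film stops).
LO, HI, MAX_CODE = 1e-4, 1e4, 65535
SCALE = MAX_CODE / math.log(HI / LO)

def encode(lum: float) -> int:
    """Log-encode a luminance value into a 16-bit integer code."""
    lum = min(max(lum, LO), HI)
    return round(SCALE * math.log(lum / LO))

def decode(code: int) -> float:
    """Invert the log encoding back to linear luminance."""
    return LO * math.exp(code / SCALE)

for lum in (0.01, 1.0, 250.0):
    code = encode(lum)
    print(lum, code, decode(code))
```

Round-tripping any value in range keeps the relative error under about 0.02%, uniformly across the whole eight-decade span, which is the property a plain linear 16-bit encoding can't give you.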

It's been a long time coming (4, Interesting)

Twinbee (767046) | more than 2 years ago | (#38100644)

As someone who would love to use variable transparency (translucency) pictures on my own website, this story is very cool news. For one thing, it allows pictures to have drop shadows on varied backgrounds, without having to be forced to save as full 32bit PNG.

I'm now somewhat disappointed PNG didn't get this far sooner. It's served its purpose well over time, but I didn't realize there was still so much room for compression.

Congrats to Google, and I hope the other browsers quickly adopt this apparently great picture format. I wonder how its animation side compares to APNG or MNG. The PC has always been gasping for decent lossless animation support, even though the Amiga 20 years ago had seemingly a dozen animation formats to choose from. Also, web browsers have (or at least had) great difficulty playing animations at higher than around 16-25 fps (apart from Flash). It's a pretty sad state of affairs all round, really.

Logical fallacy (0)

Anonymous Coward | more than 2 years ago | (#38100672)

"Smaller images on the page mean faster page loads."

Not if decoding them deadlocks the CPU.

"Simplicity in decoding" does not entail any information about the cycles it would eat.

Internet Explorer (0)

Anonymous Coward | more than 2 years ago | (#38100718)

(Posting AC because I'm at work)

That's what web designers need: another image format that Internet Explorer won't support for years, then will support but support poorly, forcing designers to use annoying hacks to get around the inadequacies of MS's support for the format. Yeah. We really need this.

Naming Issue (1)

Zamphatta (1760346) | more than 2 years ago | (#38100732)

I just wish they'd change the name of the format. It feels awkward (in English) to pronounce WebP 'cause of the "b" & "p" next to each other. I'm not exactly sure how to say it smoothly, and I've noticed other people shy away from talkin' about it because of that. That's a problem for any hopeful tech.

Re:Naming Issue (1)

psmears (629712) | more than 2 years ago | (#38101254)

I know what you mean, but it hasn't stopped them talking about "web pages" :-)

is this really necessary for tomorrow's internet? (1)

Cyko_01 (1092499) | more than 2 years ago | (#38100864)

CSS3 will soon eliminate the need for rounded corner images and gradient backgrounds, and even smartphone bandwidth is increasing to reasonable speeds. Most ads these days are displayed with flash, and the quality of thumbnail images really isn't that important either.

Re:is this really necessary for tomorrow's internet? (1)

nitio (825314) | more than 2 years ago | (#38101050)

and the quality of thumbnail images really isn't that important either

(Emphasis mine). Obviously you don't know what the Internet is for...

Re:is this really necessary for tomorrow's internet? (1)

QuasiSteve (2042606) | more than 2 years ago | (#38101090)

CSS3 will soon eliminate the need for rounded corner images

If all you want is single-radius rounded corners on rectangles, yes. While this fits most design needs, it falls well short of the flexibility offered by an alpha channel. On the up side, unlike a bitmap, a CSS corner is independent of resolution, so it stays nice and smooth no matter the zoom level.

and gradient backgrounds

Again, only for simple gradients - yes, you can stack multiple divs together to get something more complex - but at some point the code you generate, even sent gzipped, is going to take more bandwidth than a 1-pixel-wide/high gradient bitmap.
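To put rough numbers on that trade-off, here's a toy sketch (illustrative only - the CSS rule and pixel values are made up): it gzips a many-stop gradient rule and, for comparison, deflates the equivalent 1-pixel-high RGB strip after a PNG-style Sub filter.

```python
import gzip
import zlib

# A many-stop CSS gradient rule, gzipped as it would travel over the wire.
css = (".hdr { background: linear-gradient(90deg,"
       + ",".join("#%02x%02x80 %d%%" % (i, 255 - i, i // 2)
                  for i in range(0, 256, 8))
       + "); }")
css_gz = len(gzip.compress(css.encode()))

# The same ramp as a 1-pixel-high strip of 256 RGB pixels, run through
# a PNG-style "Sub" filter (delta from the pixel to the left), then
# deflate - roughly what a tiny gradient PNG stores.
strip = bytes(b for i in range(256) for b in (i, 255 - i, 0x80))
filtered = strip[:3] + bytes((strip[i] - strip[i - 3]) % 256
                             for i in range(3, len(strip)))
strip_deflate = len(zlib.compress(filtered, 9))

print("gzipped CSS bytes:   ", css_gz)
print("deflated strip bytes:", strip_deflate)
```

For simple ramps the bitmap wins by a wide margin; the CSS only pulls ahead when the gradient really is one or two stops.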

and even smartphone bandwidth is increasing to reasonable speeds

While on the flip side, providers are dropping FUP-style contracts and going with hard limits. Savings do matter.

Most ads these days are displayed with flash

But are likely to be increasingly exchanged for HTML - if only to target iDevices currently but certainly going forward for other devices as well.

and the quality of thumbnail images really isn't that important either

Perhaps not, but I'm sure Google wouldn't mind serving, say, 15% less bandwidth in google image results without appreciable loss in quality (or perhaps even an increase in quality) by simply serving WebP instead of JPG.

Personally I'm all for a format that performs better for a given task. Currently I'm archiving 2nd tier images as JPGs at 100% quality without chroma subsampling, etc. (primaries get the PNG treatment) because I'm confident that, should I care to see them again in say 20 years, JPG will still be well-supported. I could use JPEG2000 but support for that is currently low and I have seen no reason to think that will improve substantially in time.

If Google can actually market WebP to, say, camera makers, (their own) smartphone developers and major platforms like flickr, facebook, imgur, etc. so that it actually gets picked up... it just might make me start saving new images in WebP instead.
Given how slow its acceptance currently is, however - much like Adobe's DNG - I'm not getting my hopes up.

Re:is this really necessary for tomorrow's internet? (1)

Merk42 (1906718) | more than 2 years ago | (#38101156)

PNGS are used for more than just rounded corners and gradient backgrounds.

Bandwidth may be increasing, but not caps

I also doubt “Tomorrow's Internet” will use Flash.

Some irony in this? (1)

N Monkey (313423) | more than 2 years ago | (#38100882)

From the article they seem to be targeting both lossy and lossless:

WebP was proposed as an alternative to JPEG, with 25–34% better compression compared to JPEG images at equivalent SSIM index.

and

Our main focus for lossless mode has been in compression density and simplicity in decoding. On average, we get a 45% reduction in size when starting with PNGs found on the web, and a 28% reduction in size compared to PNGs that are re-compressed with pngcrush and pngout. Smaller images on the page mean faster page loads

So their aim is to reduce bandwidth, which is admirable, yet the video side of Google is choosing to avoid H.264, which, AFAIU, has been shown to be better "bang for the bit" than VP8 - and surely video is a far bigger consumer of bandwidth these days. (I'm not sure the unencumbered argument would stand up to close scrutiny.)

Re:Some irony in this? (0)

Anonymous Coward | more than 2 years ago | (#38101210)

H.264 is not free. Given that they own YouTube, they probably don't want to pay a licensing fee. I don't know why you have a problem with them developing an image compression format (that will likely be free) and not wanting to use a particular commercial video format.

Re:Some irony in this? (1)

Threni (635302) | more than 2 years ago | (#38101252)

"video is a far bigger consumer of bandwidth these days."

Not on my phone it isn't.

are image standards too established? (4, Interesting)

rlwhite (219604) | more than 2 years ago | (#38100896)

As someone who rooted for the adoption of JPEG2000, I wonder, have we reached the point where the existing major image formats are 'good enough' and so established that new standards are unlikely to unseat them?

oblig xkcd (1)

Anonymous Coward | more than 2 years ago | (#38101008)

http://xkcd.com/927/

PNG was designed with room to grow. (1)

Anonymous Coward | more than 2 years ago | (#38101170)

Let's be honest, vanilla PNG right now is only about as complex as running the deltas between neighboring pixels through deflate. PNG was clearly designed to do a reasonable job while not overly taxing CPU or RAM.
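That pipeline - left-neighbor deltas, then deflate - can be sketched in a few lines. A toy illustration (grayscale, no real PNG chunks or headers):

```python
import zlib

# Toy illustration of vanilla PNG's core: per-scanline "Sub" filtering
# (filter type 1: each byte minus its left neighbour, mod 256),
# followed by deflate.
WIDTH, HEIGHT = 64, 64

def scanline(y):
    # A smooth ramp: awkward for plain deflate, trivial once filtered.
    return bytes((x * 4 + y) % 256 for x in range(WIDTH))

def sub_filter(line):
    return bytes([line[0]] + [(line[i] - line[i - 1]) % 256
                              for i in range(1, len(line))])

raw = b"".join(scanline(y) for y in range(HEIGHT))
filtered = b"".join(sub_filter(scanline(y)) for y in range(HEIGHT))

raw_size = len(zlib.compress(raw, 9))
filtered_size = len(zlib.compress(filtered, 9))
print("deflate alone:       ", raw_size, "bytes")
print("Sub filter + deflate:", filtered_size, "bytes")
```

After filtering, each scanline is mostly runs of constant deltas, which deflate eats for breakfast - which is exactly why there's headroom for smarter filters and entropy coders on top of the same chunk structure.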

That's a good thing. PNG is 15 years old now, from a time when a brand new computer had a 166 MHz CPU and 16 MB of RAM.

With how far computers have come since then, there is plenty of reason to want smarter compression methods and data filters in PNG, and the image format actually left a lot of room for adding new things.

Granted, any images that use these new compression or filtering methods won't be viewable with today's software. But if they end up in libpng, they'll become available to every program that uses libpng with absolutely no effort on the part of those programs' developers, making that a far faster route to adoption than trying to get people to add support for an entirely new image format.

thanks google (0)

Anonymous Coward | more than 2 years ago | (#38101184)

We need you to copy yet another thing that has been done before, clearly there aren't enough image formats to deal with already.

IE and Photoshop (0)

Anonymous Coward | more than 2 years ago | (#38101188)

As a creator of images for the web, I don't see this being implemented into my workflow anytime soon because:

1. It will not be backwards compatible. Will Google's format work in IE 6, 7 or other older browsers? Some web developers still have to consider the lowest common denominator. I still don't use PNG for this reason.

2. How long will it be before Photoshop builds this into the "save for web" options? Until then, I doubt you will find many designers opening up a different program just to save a few K on their graphics.

I am all for the supposed savings with this format, but I guess we will have to wait 5 years to use it... Maybe we'll all have fiber by then and the savings will be irrelevant.

PNG is good enough (1)

GuB-42 (2483988) | more than 2 years ago | (#38101388)

PNG is well supported, free, stable and fast. A 28% size gain on something that is not a major problem in the first place is kind of a weak argument, IMHO. The only thing that made PNGs so popular is that GIFs don't support 32-bit (RGBA) color. Look at audio files: most people still use MP3 even if Ogg is superior in nearly every aspect, simply because MP3 is good enough.