
New Optical DSPs With Tera-ops Performance

GFD writes: "The EETimes has a story about a new class of hybrid digital/optical signal processors that are programmable and offer tera-ops performance potential for relatively low cost and power requirements. No fundamental breakthroughs but rather a very slick use of existing optical networking components to create a programmable optical processor that looks to the rest of the world like a single chip digital signal processor. Elegant and impressive if they can deliver."

  • This should help in advancing IP Telephony and Video. Be prepared not only for integrated messaging, but also for IP video-conferencing telephones, more interactive IVR systems, voicemail, and OSes.

  • Very slick! I wonder what this bodes for future motherboard chipset designs.

    When you see what companies like nVidia [nvidia.com] are doing with chipsets like the nForce [anandtech.com] (i.e.: better than mediocre on-board graphics, very capable on-board audio, Ethernet, etc.) we may start seeing motherboards with surplus PCI slots.
    • by grmoc ( 57943 )
      Actually, I'd prefer motherboards with NO PCI slots. Upgrading to a faster bus would be a good idea.
      I'm running up against the bus-bandwidth problem just about every day.
      • We already have standards that make it up to 4x faster: 64-bit PCI and 66 MHz PCI. They are forward and backward compatible, and there are cards that use them today.

        So just start demanding that your motherboard maker give you 64-bit PCI and you will be set.
        • Yes, but for some things even 4x isn't the speed increase you'd like to see. For example: video compositing. It just can't be done effectively on a PC due to the large amount of bandwidth taken up by each stream. If you're doing professional video (SDI) and you'd like to deal with the bitstream itself (which would be really nice) it is 240M/s per stream. That is a lot of data. Then you have to consider that you're going in and out of the video capture card, and that you probably would like to fade between two streams, etc. 4x is nice, but an order of magnitude improvement would be more like it. (Rough numbers are sketched after this sub-thread.)
      • If we were to abandon PCI, what would you have us use? PCI is fast enough for most things, actually does decent P&P, and is cheap.
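As a rough sanity check on the bus-bandwidth argument above, here is the arithmetic spelled out. The peak PCI figures are the standard theoretical numbers; the 240 MB/s-per-stream rate and the two-streams-plus-one-output assumption come from the comment above, not from any measurement.

```python
# Back-of-the-envelope PCI bandwidth vs. uncompressed video compositing.
# Peak theoretical PCI throughput = bus width in bytes * clock rate.
def pci_peak_mb_s(width_bits, clock_mhz):
    return (width_bits / 8) * clock_mhz  # MB/s, ignoring protocol overhead

standard_pci = pci_peak_mb_s(32, 33)    # 132 MB/s (usually quoted as 133 MB/s)
wide_fast_pci = pci_peak_mb_s(64, 66)   # 528 MB/s, the "4x faster" case

stream = 240                # MB/s per stream, the figure quoted above
needed = 3 * stream         # two input streams plus one composited output

print(f"32-bit/33 MHz PCI peak: {standard_pci:.0f} MB/s")
print(f"64-bit/66 MHz PCI peak: {wide_fast_pci:.0f} MB/s")
print(f"Two streams in, one out: {needed} MB/s")
```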

  • I'm not sure that an optical transform is the same as a digital transform, or that they can be used to do the same thing. Can their optical FFT/digital encoding produce the same bits during JPEG encoding as a digital FFT/digital encoding JPEG encoder? This is crucial for image/video compression algorithms.

    • by MarkusQ ( 450076 ) on Tuesday October 09, 2001 @02:43PM (#2407312) Journal
      I'm not sure that an optical transform is the same as a digital transform, or that they can be used to do the same thing. Can their optical FFT/digital encoding produce the same bits during JPEG encoding as a digital FFT/digital encoding JPEG encoder? This is crucial for image/video compression algorithms.

      They are the same, in theory (in practice the tolerance of your components limits you to only a few digits of accuracy). The basic (and very generic) relationship is:

      RW: Some real-world, physical process

      OB: An observational model of RW

      AN: An analytic model of OB

      DI: A digital implementation of AN

      AI: An analog implementation of AN, sometimes even based on RW.

      The wonder of science is that many RW have the same OB, and many OB have the same AN (in both cases allowing for some parameterization). While all of these can "implement the same function" they will have very different time/space/energy/cost/etc. profiles. Digital, in particular, gives you greater precision and flexibility, but at a rather high cost in speed, size, and energy usage. (A toy numeric illustration of the precision point follows this comment.)

      Up until fairly recently (say, the last twenty to fifty years) the DIs were mostly done by hand. The only reason to do them was to get those extra digits, mostly for designers of the AIs (such as tube amplifiers and anti-aircraft guns) or to produce tables for use "in the field".

      -- MarkusQ
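As a toy illustration of that precision point, the sketch below compares a full double-precision FFT against the same transform whose inputs and outputs are rounded to roughly three significant digits. The rounding is a made-up stand-in for component tolerances; real optical hardware has its own error behavior.

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.standard_normal(1024)

# Reference: the all-digital, double-precision FFT.
X_digital = np.fft.fft(x)

def quantize(v, digits=3):
    """Round to roughly `digits` significant figures relative to the largest value.
    A crude stand-in for analog component tolerance, not a real optics model."""
    scale = 10.0 ** (digits - np.ceil(np.log10(np.max(np.abs(v)))))
    return np.round(v * scale) / scale

# "Analog-ish" path: same transform, but the signal is only good to a few digits
# going in and coming back out.
X_analog = quantize(np.fft.fft(quantize(x)))

rel_err = np.max(np.abs(X_analog - X_digital)) / np.max(np.abs(X_digital))
print(f"worst-case relative error: {rel_err:.1e}")  # a few digits of agreement,
                                                    # vs. ~15-16 for pure float64
```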

    • Yes, they are the same. I took a class many years ago in which half the course was on audio and the other half on video; we went through the rigorous math, etc. The most interesting potential application I remember was that a 2D FFT of an image could be used for pattern matching. You take the FFT of the image and correlate it with the FFT of the image you want to match. The correlation output had highlighted "points" whose brightness corresponded to the "goodness" of the match.

      It was really very cool. One image was a bunch of letters on a page, arranged randomly. The thing we were matching was the letter 'h'. The brightest points in the result were indeed the letters 'h' on the page. Some 'n's also correlated to a degree and showed up in the result, although they were not as bright as the 'h's. Most fascinating: it didn't matter what rotation the individual letters had. An upside-down 'h' or a 90-degree rotated 'h' was recognized just as well. If anything, this optical processing is probably purer than current digital methods, which are only approximations.
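The letter-matching demo described above is classic matched filtering: correlation in the image domain becomes a pointwise multiply in the Fourier domain. Below is a small digital analogue of the idea, with a made-up "page" and "letter" purely for illustration.

```python
import numpy as np

def correlate_fft(image, template):
    """Cross-correlate a template against an image via the FFT
    (the digital analogue of the optical matched-filter demo above)."""
    padded = np.zeros_like(image)
    th, tw = template.shape
    padded[:th, :tw] = template
    spectrum = np.fft.fft2(image) * np.conj(np.fft.fft2(padded))
    return np.real(np.fft.ifft2(spectrum))

# Toy data: paste two copies of a small random "letter" onto a noisy page.
rng = np.random.default_rng(1)
letter = rng.random((8, 8))
page = rng.random((64, 64)) * 0.1
page[5:13, 5:13] += letter
page[40:48, 20:28] += letter

score = correlate_fft(page, letter)
peak = np.unravel_index(np.argmax(score), score.shape)
print("brightest correlation peak at", peak)  # lands on one of the pasted copies,
                                              # e.g. (5, 5) or (40, 20)
```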
  • by mmacdona86 ( 524915 ) on Tuesday October 09, 2001 @02:15PM (#2407155)
    They are not actually processing the signal digitally. They are slickly converting it to a light signal, doing the heavy lifting with optical elements (which is essentially analog processing), and then converting it back to a digital signal. A really valuable shortcut for those applications where you can translate what needs to be done into optical elements, but nothing like a general-purpose tera-ops digital computer.
    • I was about to post the same observation; instead (having no mod points) I'll try to draw attention to your post. (And likely get modded down as redundant or offtopic if I succeed. *sigh*)

      IMHO, the optical aspects are a red herring. The real speed advantage comes from going analog, which has always been (and always will be) much faster than digital. This gets rediscovered every few years, and then lost when the harsh limits on analog accuracy become more bothersome at the same time as the speed of digital is creeping upwards.

      -- MarkusQ

  • by grmoc ( 57943 ) on Tuesday October 09, 2001 @02:17PM (#2407169)
    Basically, for those of you too impatient to read the article, it works by shining VSCELs (lasers) through conventional optics and into a high-speed collector. The lasers can work at up to 1 GHz, and the processing is done (it seems) in analog by the optics. Acquisition of the data is performed by the collector, which operates at 10 GHz.

    The theory is that optics can perform FFTs, DCTs, etc. for you at the speed of light, and there are many applications that need these operations done (a small sketch of the transform-as-matrix-product idea follows this sub-thread). Any other processing, correlation, etc. would be done by conventional, low-performance DSPs.

    They also say that their current model works at 20 tera-ops per second at 20 watts, and they list what would be required of typical DSPs, etc., down to ASICs.

    Seems promising, but it is still a long way away from a nice optical CPU.
    • Actually, they are VCSELs (Vertical Cavity Surface Emitting Lasers), and they are made from semiconductors. The reason it matters is that the edge-emitting variety, which was developed first, would be nearly impossible to mount effectively. That is the main advantage of the VCSEL: its ability to be bonded to PCBs or other substrates.

      Just an FYI.
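A loose sketch of why transforms like the DCT map so well onto this kind of hardware: an N-point DCT is just a fixed NxN matrix applied to the sample vector, and a vector-matrix product is exactly what a laser-array-plus-optics stage can evaluate in one pass, leaving the rest to a conventional DSP. The sizes and data below are illustrative, not taken from the article.

```python
import numpy as np

# An N-point DCT-II written as a single fixed matrix-vector product --
# the kind of operation an optical stage can evaluate in one shot.
# (N and the input are illustrative; nothing here comes from the article.)
N = 64
n = np.arange(N)
k = n.reshape(-1, 1)
dct_matrix = np.sqrt(2.0 / N) * np.cos(np.pi * (2 * n + 1) * k / (2 * N))
dct_matrix[0, :] /= np.sqrt(2.0)          # orthonormal scaling for the k = 0 row

x = np.random.default_rng(2).standard_normal(N)   # one block of samples
X = dct_matrix @ x                                # the whole transform: one matrix-vector product

# Optional sanity check against SciPy's DCT-II, if SciPy happens to be installed.
try:
    from scipy.fft import dct
    assert np.allclose(X, dct(x, type=2, norm="ortho"))
except ImportError:
    pass
```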
  • by ispq ( 301273 )
    It always lightens my heart to see technology march onward like this. Hella cool beans.
  • by peter303 ( 12292 ) on Tuesday October 09, 2001 @04:00PM (#2407811)
    Peter Guilfoyle had almost the same optical technology twenty years ago, albeit slower, though still a couple of orders of magnitude faster than the silicon of the day. The military had been using these for decades for analog image processing, but Guilfoyle integrated a digital protocol.

    During the early 1980s Guilfoyle attempted to commercialize this device, but failed. Engineers designed a computer around it, but realized it was more economical and reliable to implement in silicon ASICs (custom gate arrays) than as an optical processor. The venture capitalists sided with the engineers and kicked Guilfoyle out. The company was named "Saxpy" after the fundamental matrix primitive used in the array processors of the era. A couple of prototypes were built, but never really sold.

    The 1980s were the golden age of custom supercomputers; dozens [paralogos.com] were built and had died off by the 1990s. Custom supercomputers could not keep up with the economics of commodity clusters (Beowulfs). Custom machines took 3-5 years to develop a new generation, whereas commodity CPUs became 10-100 times faster and cheaper over the same span. The only way for custom machines to stay ahead of commodity computers is to be at least 10,000 times faster, to avoid the catch-up problem.

    (I bought supers in the 1980s and Saxpy was on the vendors list.)
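For readers who haven't met the term: "saxpy" is the classic BLAS level-1 primitive, single-precision a times x plus y, the inner loop of much array-processor code of that era. A minimal sketch, just to pin down the name:

```python
import numpy as np

def saxpy(a, x, y):
    """BLAS level-1 'saxpy': y <- a*x + y, in single precision."""
    return np.float32(a) * x.astype(np.float32) + y.astype(np.float32)

x = np.arange(4, dtype=np.float32)   # [0, 1, 2, 3]
y = np.ones(4, dtype=np.float32)
print(saxpy(2.0, x, y))              # [1. 3. 5. 7.]
```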
  • by PingXao ( 153057 ) on Tuesday October 09, 2001 @04:50PM (#2408036)
    Must .... resist .... urge ..... AAAAAARRRRRRGGGGHHHHHHH!!!!!!

    Imagine a Beowulf cluster of these?

    Run/Duck/Hide
  • No fundamental breakthroughs but rather a very slick use of existing....

    No fundamental breakthroughs? Then, quick! Better patent it!
  • I remember a friend of mine telling me about a SIGGRAPH demo in which you played a fighting game by punching, kicking, and so on. It used a holographic system to recognize your moves - nothing more than a hologram - which changed the signals sent to the computer running the game.

    I wonder if you might be able to do something clever with programmable optical computing that involved holograms as switches, more or less an optical FPGA.

    • You'd like to hope so.
      From what you're describing it sounds like it would really benefit from some detailed geometry. Perhaps it just needs the right modeling environment.
      And, while we're at it, let's get some standard models for nanotech bioimplants.
      I think the average tradesman would agree that an oil-derrick-like configuration would make a lot of sense. You know, something floating up top and a set of jointed tubes going in. Close the hatch on overpressure and just fall off, no blood.
      At least at the University of Michigan, they say they've got nano light tubes for use within the bloodstream pretty well licked.
      They use UV-light-filled tubes to zap the bad guys, but I've got a better idea, and that's to use nicotine brought in through a tube or set of tubes. Kill the bad guys and get a little buzz when it hits the liver. That must be God's will! This will fly in Kentucky.
      Just have a little refill cartridge on top. Perhaps it won't start in the States, now that I consider it a bit more closely.
      Too hip. Must come from Europe.
