Glaze3D: Yet Another 3D Chipset 83

Dixie_Flatline writes "This chipset looks pretty cool to me. I'm surprised that it hasn't shown up on Slashdot before now, based on its immense hoopiness. And it's got Linux support...and it hasn't even been released yet. Go here. " Basically this thing claims to crush everything else on the market. If it makes for better games, I'm all for it.
This discussion has been archived. No new comments can be posted.

  • I wish em luck (hey, variety is the spice of life), but I'm not too optimistic.

    With 250+ megapixel fillrates these days, more fillrate is nice but is not the key bottleneck. Remember that there are only 1.92 million pixels on a 1600x1200 screen. And Glaze3D's technological approach of using embedded RAM is not particularly unique (see PixelFusion or, to a lesser extent, S3's Savage4). Also, in general, chips with embedded RAM get higher memory bandwidths, but are manufactured with processes such that the logic is not as fast as with logic-optimized ASICs.

    The key bottleneck is vertex geometry processing, and is constrained currently by the Intel CPU floating point, and to a lesser degree, by Intel's AGP bus and memory bus.

    When Microsoft or Linux come out with 3D GUIs that require anisotropic texture filtering (and thus huge fillrates), then Glaze3D and similar chips will be more relevant. Potentially antialiasing (via accumulation or "T" buffers) and/or multi-pass rendering of shadows could be attractive enough features to drive demand for greater fillrates, but I suspect Bitboys' competitors (3dfx, Nvidia, etc.) will be "good enough" on that score.

    --LP
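    The fillrate argument above is simple division, and can be sketched numerically (a back-of-the-envelope sketch; the 250 Mpixel/s figure is from the comment, while the overdraw parameter is an illustrative assumption):

```python
# Rough fillrate ceiling: how many frames/second a given pixel fillrate
# can sustain at a given resolution (ignores geometry, bandwidth, etc.).
def fps_ceiling(fillrate_pixels_per_sec, width, height, overdraw=1.0):
    pixels_per_frame = width * height * overdraw
    return fillrate_pixels_per_sec / pixels_per_frame

# A 250 Mpixel/s card at 1600x1200 with no overdraw:
print(round(fps_ceiling(250e6, 1600, 1200)))  # ~130 fps

# With 3x overdraw (each pixel touched 3 times on average), the
# ceiling drops to ~43 fps -- still far from fillrate-starved.
print(round(fps_ceiling(250e6, 1600, 1200, overdraw=3.0), 1))
```

    Which supports the poster's point: at these resolutions, raw fillrate alone is rarely the limiting factor.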
  • 3DNews has more info here [3dhardware.net].

    Basically, it has every important feature from every card on the market, running about 3-4x faster (I'm not sure if I buy this, but...). It also has full OpenGL and DirectX 7 support, but isn't listed as being Linux compatible.

    Now, what it doesn't have, and what I'm waiting for: full geometry. I don't really care what this thing is packing; if someone (I'm betting on Matrox at the moment) can get a card with full onboard geometry[1], that's what I'm waiting for. (Oh, and at least 128MB memory, preferably 256--I hate AGP.)

    1: I don't mean geometry boosters, or a crutch for my poor Celerons; I want complete FP and vector units on card, with whatever bus mastering it would take to circumvent the processors. (Can AGP tap into core for more than textures? That would just about fit the bill, esp. at 4x.) Sure it'll cost a fortune, but Halo II is worth it ;-)
  • As far as simulating goes (maybe prototyping as well, to a certain extent), if you look on their webpage, they apparently wrote a hardware simulation proggie for Win95/NT.
  • There was an interview [fullon3d.com] posted at FullOn3D that has a few questions and answers from these guys. Mostly stuff about the manufacturing, driver support and features, etc.
  • Something that I think should be pointed out is that current CPU/bus tech. is far behind what 3d accelerators can use. Just as an example, take a voodoo2, and put it on any single processor system (except k7, which may finally do it) currently available... changing the clock speed of the CPU changes the frame rate of the card. Now, unless I'm really smokin' somethin', that is indicative of a CPU-bound operation. IOW, we STILL don't have a CPU that can feed a voodoo2 more than it can handle.

    Sorry I don't have numbers, but I need to do some research before I can say anything about that (if someone would like to do that for me... =)

    -- ioctl
  • The original article states that it supports Linux... not so, say the people at Bitboys. I have e-mailed them about this, and the issue of Linux support is "undecided"...

    So much for the Linux support hype....
  • I don't need a 32" TV either, but I have one! And it has S-video in and out jacks, plus it can do 400x600 res, so I can plug my PC (Viper 330) into my TV and work on a 32-inch screen! Or play on a 32" screen. Whatever. A card as fast as the one they are claiming would make it look even better.
  • Pure vapourware as far as I'm concerned. They don't have it running in silicon, just a software emulation running on NT. They say the emulation runs at the speed the actual silicon product runs at.

    Excuse me while I laugh my ass off. Who gives a crap about the actual card then, just give me the simulator. I'll add my own features thanks.

    And, the images shown could be knocked up in a couple of days in Photoshop or Gimp. Check the water reflection shot. It looks EXTREMELY fake to me.

    The depth-of-field looks plain wrong. Why aren't the shots hi-res?? Photoshop filters look even more obvious in hi-res.

    Imagine the spec of an NT box required to emulate a chip like the Glaze at full speed. They don't exist.

    I wouldn't take it too seriously if I were you. They have a staff of 7, only 2 of which are designing the Glaze hardware. Hmm 2 engineers vs however many NVidia or SGI or 3Dfx employ.

    Someone somewhere is having a big laugh. I'll be very, very surprised if ANYTHING EVER comes of this.

  • Excuse me here, but chip design is nothing like game or demo design. They may be from the coolest demo group ever; it does not mean squat. The two skill sets are entirely different, although extensive knowledge of graphics algorithms is a must for architects specifying/designing high-level specs for graphics accelerators. Being a doctoral student with IC design experience, I can talk for hours on the difficulties and the financial burden associated with designing a 16-bit RISC processor, let alone a state-of-the-art graphics accelerator.

    This sounds like a hoax to me, or a bunch of really hopeful kids with an impressive spec, but nothing else. I would have the following questions for any group venturing on their own to design and build their own graphics accelerator without bundles of cash and/or experience:

    -How will you pay for the design tools? EDA tools to compile the HDL design, place & route, simulate, etc. cost more than $100K per seat. It is nothing like grabbing gcc or egcs from the nearest Linux ftp site and starting to code.

    -How will they prototype their chip? A graphics accelerator will not fit in most FPGAs, so even that cheapest prototyping method is out of the question. They will probably need to rent/acquire hardware emulators or have prototypes built, which cost a lot of $$$$.

    -Then again, it doesn't only take a bunch of us geeks & 3D programming wizzes (plural of 'wiz', anyone?) to put a 3D chip on the market--you need hordes of suits for marketing, advertising and other stuff, too. They don't come cheap.

    With all due respect for Finnish technical talent (Nokia and of course Linus come to mind), I believe they will fail miserably. If this is for real, that is.
  • Yeah, I've heard a lot of things about this thing, but it was announced so far from its release that people didn't really pay any attention (good for them, I wish I had that kind of common sense).

    This thing has a huge coolness factor. Not just because it's damn fast, but also because of the way in which it was designed. Basically, the entire thing was designed with an advanced hardware simulator. They actually had completed drivers before they had any hardware. In fact, I believe they still don't have any hardware. That, when combined with the fact that it was announced so early, kind of makes people think it's vaporware. I sure hope it isn't, but I don't have much to support that hope.
  • I agree 100% with the last paragraph. Most independent programmers seem to get bored with a product they are working on once they've figured out that they /can/ do it.

    It makes sense that the trench work involved in writing a game only seems to happen when you're employed by someone and under a contractual obligation to read those joystick specs and support all those sound cards and all those video cards, etc, etc ...

    This is what made the demoscene so cool; you got to show /what/ you could do, but you never had to do more than you wanted to before moving onto another product. I suspect this was the ultimate fate of alotta the guys outta the demo scene that figured since they had written a demo, they could write a game. It wasn't that they /couldn't/, but rather that it just got a little boring in the end.

    At any rate, to stay kinda on topic, the Glaze3D doesn't quite look like vaporware to me, but I'm sure, just like any news, its significance and excitement tend to decrease exponentially as the release date approaches and other manufacturers flood the market with their competitive offerings.
    We'll see.

    I do admit I get more and more excited as the PC offerings align themselves to the specs of the current generation of arcade consoles.

    SirSlud

    BTW, people seem to forget Scream Tracker, which was finished, for all intents and purposes, and is still being used by trackers all around the world. :) So there's some non-vaporware outta the Future Crew ranks.
  • I don't think anyone has called them out-and-out liars. I'm sure they're working very hard on a product they think is great. If they do build a product with specs like that, it would be impressive. The questions then become, "How much is this thing going to cost?", and "What competing products are going to be on the shelf next to it?"

    On a related note, I would like to take this opportunity to announce that I'm building a cold fusion reactor. It is expected to go live in May of 2048. I am currently up to date with my project plan and the crayon drawing looks great.

    -Barry

    Ummm...this is my sig...or something

  • Maximum PC had an article on it in the April issue.
  • Okay, this text is based on the conversation I had last October with a friend of mine, who happened to be one of the Pyramid3D software developers at VLSI Solution. I have to emphasize that all opinions are his, and this is heavily from the P3D point of view, naturally. Unfortunately he is on holiday right now, and I just have to relay his views here.

    Years ago Bitboys approached VLSI with the promise that they had designed a 3D engine. When the money was assigned, it was found out that it was actually only code taken directly from some game, with no real definitions behind it. The actual P3D development was done by VLSI Solution, who had 20 people working on it full-time since 1995, for over three years. Bitboys were bought out of the project around 1997.

    After Bitboys no longer worked with P3D, Glaze3D appeared out of nowhere. [I remember seeing mostly the same Glaze3D page on Bitboys web site way over a year ago. Now it only has two new images, and probably some revised specs.] A lot of Glaze3D's publicity material is actually from P3D demos, including some of the screenshots on the page. Even if there is something real now, in the beginning Bitboys were advertising technology that didn't exist with the demo material from completely another project.

    In short: Glaze3D is most likely an illegal product that will probably never be released.

    Pyramid3D does exist, and there are beta versions of the actual card. It was demoed at Assembly '97, IIRC, and in summer 1998 it was nearly completed. But as we know, TriTech dropped the project and P3D will never hit the shelves. There may be some P3D-based products, like inexpensive video converters, though. TriTech was supposed to do the marketing for P3D, but they never actually did anything more than host a webpage.

    P3D would have had good 2D (300MHz RAMDAC), good 3D (probably not much chance against today's cards), and both video-in and video-out supporting basically all the usual formats (NTSC, PAL, SECAM) and their mutations. [I have seen some NTSC->PAL conversions done with P3D; very good quality for real-time conversion.] It was the first card to have DirectX6 drivers ready. Open source drivers for Linux/FreeBSD were also developed. The target price was $100 with 16MB RAM. And this was early 1998. What a pity.

    Hope this clears some points about the relationship between Pyramid3D and Glaze3D.

    -sph
  • I've just been through a bit of an upgrade cycle with my primary box - originally I had an S3 ViRGE DX 4MB, then I got hold of a Mystique 220 for £25 secondhand (the retail one with all the toys :), and yesterday I got my 16MB AGP Riva TNT, which is a visible improvement over the ViRGE :).

    The Matrox was the best 2D card I've ever seen - in fact, in some 2D tests it's actually faster than the TNT. (Of course the TNT whomps it on 3D).

    The other nice thing about the Matrox is that kernel 2.2.x has explicit support for the Mystique when it comes to the all important frame buffer console. 1152x864x32bpp with the nice SPARC font (12x22) is a damn nice working environment.

    Also, all the StarOffice/S3V Xserver problems went away, too.

    Finally (in praise of the card I've just ditched) the Mystiques make great partners for a Voodoo2, cos they put out a very strong, clean video signal which survives the passthru better than most.

    Peter.
  • And what processor(s) on earth can possibly feed this monster enough data to use it to its full potential?

    None. At least among the consumer CPUs that will be shipping for the next several years.

    Which is why onboard geometry acceleration, a la NVidia's NV10, is going to become a requirement in fast accelerators. All the extra fill rate power is pretty useless if you're waiting for the CPU the whole time.

  • ummm... 800fps?

    Impossible. A card being able to push that many triangles, while possible, doesn't take into account the other factors of a game. Sony's PS2 can possibly render objects in real time, but that doesn't take AI or game physics or anything else into consideration. If you are only displaying the graphics, then fine, you might get 800 fps (I still doubt it), but you won't get it in Quake.
  • You say the key bottleneck is vertex geometry?

    The next generation nVidia part (NV10) will have Geometric Transform and Lighting handled on the card. That's the end of that bottleneck.

    Intel isn't too happy about this, because all of a sudden a 300 MHz machine becomes very gameworthy.

  • Even in a single rendering pass, pixels are most probably rendered more than once. Basically, all polygons in a scene are rendered whether or not they are hidden behind other polygons.
  • Hey, why don't you read my subject?
  • by DocTee ( 6393 )
    I read thru this the other night and pretty much came to the conclusion that this was a clever hoax/scam. Can anybody else come up with some more concrete evidence?
  • It was somewhere around a year ago, when I was looking at what 3D card to get (I never did decide, so I'm still using my old Voodoo Graphics, which is dying a slow and painful death).

    They said it would be out some time in '99 or '00 - sounds like they're basically on schedule. Should be interesting to see how it compares with what 3Dfx (or is it 3dfx now? I can never remember) comes up with next...
  • These guys have said this before, and nothing materialized. 512MB max of onboard RAM is a hell of a boast. These guys better come through, but I predict about $1000+ for a good board.
  • That is one awesome chip. Can't wait to see JUST ONE board!! Linux support rocks. 12 million triangles. Woah. The images looked almost like they'd been ray-traced. Yeow.

    Make sure it works on Alpha Boxes!! Please!

    pana
  • From the general comments I read about Glaze3D on other message boards, it seems like this company has announced other crazy chipsets before (Pyramid3D?) and they never materialized. So I'll believe it when I see it. Oh yes, and it's due in Q1 2000 at the earliest, which will probably put it a few months behind Voodoo4 and NV10, so who knows what 3dfx and nVidia will have by then.
  • That sounds really sweet, I'll keep my eyes open for when they actually ship something. Are they going to release full specs to the Mesa and XFree groups?

    ----
  • Ok, maybe I'm being ignorant, but I've still got an S3-Virge video card and I don't think I need to replace it.

    Assuming the only game I play on my Linux machine is Civ:CTP and I don't dual boot, do these new cards offer me anything? I don't think so, in fact, it's getting hard for me to justify any new hardware purchases, except a bit more RAM and maybe a faster processor (and that's just for compressing MP3's!).

    Please, correct me if I've missed something.
  • "The product delivers a fillrate of 1200 million texels per second and a geometry throughput of 15 million triangles per second. This translates to a real-world performance in, for example, id Software's Quake III Arena of over 200 frames per second at true color in full monitor resolutions with all details and features enabled."

    That is for a single chipset - single! What the hell do you do with 128MB of graphics memory?

    The quad gives you 4x all those numbers... 4800 million texels, 60 million triangles, Quake at 800fps?! Oh, and of course the ability to have 512MB of graphics memory...

    All this and they say that it will be affordable too.

    And what processor(s) on earth can possibly feed this monster enough data to use it to its full potential?

    Needless to say, I want one ;)
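    The quad arithmetic above is just linear scaling of the quoted single-chip figures (a sketch; it assumes perfect 4x scaling across four chips, which real multi-chip setups rarely achieve):

```python
# Scale the claimed single-chip Glaze3D figures to the quad configuration,
# assuming (optimistically) perfect linear scaling across 4 chips.
single = {"texels_per_sec": 1200e6,   # 1200 million texels/s
          "triangles_per_sec": 15e6,  # 15 million triangles/s
          "quake3_fps": 200}          # claimed Quake III frame rate
quad = {k: v * 4 for k, v in single.items()}

print(quad["texels_per_sec"])     # 4.8 billion texels/s
print(quad["triangles_per_sec"])  # 60 million triangles/s
print(quad["quake3_fps"])         # 800 fps
```

    Note that the 800 fps figure inherits every assumption baked into the 200 fps claim, plus the perfect-scaling one.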

  • Some people might like to research this further [bitboys.fi] before claiming bogus and vaporware.

    The Bitboys site has been around for quite some time, and the Glaze3D has been in development for a few years as well. Until now, though, it hadn't been announced that they had the chip manufactured; this appears to be true now, which follows their development roadmap.

    They already manufactured a chipset a few years ago, back in the Voodoo 1 era. Anyone remember the Pyramid3D by TriTech? This was them.

    So before crying FAKE! learn the facts.

    Oh, and they come from the demoscene (Future Crew, anyone?), so I imagine they know what they're doing.

  • by Vector7 ( 2410 ) on Tuesday August 03, 1999 @06:05AM (#1767292) Journal
    This is interesting; this group had a chip called the Pyramid3D a few years ago that never made it onto the market because its performance wasn't competitive (although it did have a few neat features like antialiasing). Anyway, they say it won't be shipping until the first half of 2000, which could mean at least 6 months from now and likely more, and by then their claimed 3x current performance may not seem as great, as TNT3 and who knows what else will be out before then. It could be another case of too little, too late.

    Interestingly, they are using embedded DRAM, the same technology Rendition is supposedly using in their next-generation chip (if they ever release one).

    Anyway, it looks like a cool chip, provided it has a robust OpenGL implementation for X.
  • When the UPS man comes skipping up to your door, not even struggling to carry the box your brand new Transmeta-fied Amiga with the super-duper 3d comes in, and you open it up to find it full of packing material and nothing else you'll know not to trust people that have said, "It's coming now" for the past few years. (I heard news about this 3D card a *long* time ago, back before 3D cards were anything to speak about. This is non-news to me...)

    - A.P.
    --


    "One World, One Web, One Program" - Microsoft Promotional Ad

  • by Anonymous Coward
    Instead of bitching about whether they'll meet their deadlines, maybe the community should try and help them? With six people in their company, they might have a hard time writing/optimizing drivers, although they DO come from the demo scene, and I'm sure these six people know the hardware inside and out. Anyway, if they would open-source their drivers (maybe even the win32 driver), the 'community' could optimize it and port it to the various platforms. Someone could suggest it to them, maybe?

    In any case, I hope they at least release the specs. Imagine running this card on a variety of platforms: Linux, Be, Win32, but also MacOS/LinuxPPC (hope they make a PCI(66) version too), and maybe even on Alphas (NT/Linux) and Amigas.

    No way they could support all of those on their own. Even though their drivers are probably already pretty mature, due to the way they developed their chip (a PCI card linked to a simulation computer, if I understand correctly), they would probably not think of supporting some of the more 'exotic' platforms.

    Everybody would be happier (them selling more boards, and us being able to use one everywhere).


  • All the effects that 3DFX's marketing dept has renamed the "T-Buffer" have been around (IN OPENGL!!) for years. At least the BitBoys call it by its proper name, which is an accumulation buffer, instead of letting the marketing department rename it, complete with lame story about the chief engineer. I suppose pandering to idiots sells 3D hardware.
  • well, on the other hand this lets you do some neat stuff with, say, antialiasing. with an AA depth of just 2 per axis, that 1600x1200 becomes 800x600, and so on. certainly it's not the most important feature around, but it can enhance the visual quality noticeably...
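    The trade-off in that comment is simple division: ordered-grid supersampling at depth n per axis renders n*n samples for every output pixel, so at a fixed fillrate the usable output resolution shrinks by n on each axis (a sketch of the arithmetic only, not of any particular chip's AA implementation):

```python
# Ordered-grid supersampling: rendering n x n samples per output pixel
# divides the usable output resolution by n on each axis (at fixed fillrate).
def output_resolution(render_w, render_h, aa_depth):
    return render_w // aa_depth, render_h // aa_depth

def fill_cost_multiplier(aa_depth):
    # Samples per output pixel grow quadratically with per-axis depth.
    return aa_depth ** 2

print(output_resolution(1600, 1200, 2))  # (800, 600)
print(fill_cost_multiplier(2))           # 4 samples per output pixel
```

    So a card with "excess" fillrate can spend it on antialiasing instead of raw resolution, which is the poster's point.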
  • Well, you shouldn't get a Voodoo at all; go TNT2 Ultra. Much faster under OpenGL (for Quake 1/2/3), and it looks *much* nicer
    "Subtle mind control? Why do all these HTML buttons say 'Submit' ?"
  • Wow, a lot of memory on the cards; makes today's TNT2 Ultras look horribly bad. Hopefully they will be priced competitively, as Glaze3D's site claims.
  • Roughly where does it mention Linux?
  • Interesting to note they have a motion blurring 'demo' that looks frighteningly like the T-buffer stuff that kicked off two days ago. That was crap too. They both look like the cartoon rendering of Tom (of Tom & Jerry fame) being hit by a plank. Lots of similar images laid on top of one another to give the impression of movement; garbage.

    Presumably the root of all these difficulties is that as we bang vertices over to the accelerator, it considers each frame as a stand-alone rendering. This was what OpenGL was designed for, after all. What, to me, seems to be missing is the ability to pass the velocity for a vertex over to the card as well, hence opening the floodgates for manufacturers to add motion blurring as they see fit.

    Or frame interpolation--good idea? Send to the card at 25fps, render at 75fps, freeing up a truckload of processor time and a truckload of AGP bandwidth into the bargain.

    Crivens, at this rate we may actually see some innovation rather than merely bumping up fillrates month on month.

    Dave :)
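    The frame-interpolation idea above could be sketched as linear interpolation between two submitted keyframes: the CPU sends vertex positions at 25 fps and the card synthesizes the in-between frames (a toy sketch of the concept only; real hardware would need per-vertex velocities or two buffered keyframes, as the comment suggests, and the function below is purely illustrative):

```python
# Toy frame interpolation: given 2D vertex positions submitted at 25 fps,
# linearly interpolate to synthesize a 75 fps output stream (3x).
def interpolate_frames(frame_a, frame_b, steps):
    """Yield `steps` frames blending from frame_a toward frame_b."""
    for i in range(steps):
        t = i / steps  # 0, 1/3, 2/3 for steps=3
        yield [(ax + (bx - ax) * t, ay + (by - ay) * t)
               for (ax, ay), (bx, by) in zip(frame_a, frame_b)]

a = [(0.0, 0.0), (10.0, 0.0)]   # vertices at keyframe n
b = [(3.0, 0.0), (13.0, 0.0)]   # vertices at keyframe n+1
frames = list(interpolate_frames(a, b, 3))  # 3 output frames per keyframe
print(frames[1][0])  # first vertex, one third of the way: (1.0, 0.0)
```

    The catch, of course, is latency: you can't blend toward keyframe n+1 until it has arrived, so this buys bandwidth at the cost of one keyframe of delay.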

  • Fancy-schmancy 3D cards really won't do you much good at all if all you're doing is 2D stuff like CTP. Heck, an ISA SVGA card would probably suit your needs just fine. 3D accelerators are _mostly_ just for gamers, and then mostly FPS games. (How many screen shots are there of Q3A compared to just about _anything_ else for new 3d cards?)

    (Then again, no offense intended, but those ViRGE cards are crap. :) If all you do is 2D, and you'd like smoother X, or better depth/resolution, buying a new Voodoo3 would be a waste of your money. Try digging around for a slightly used Millennium II -- there's probably PCI ones going for pennies now, and you won't be disappointed.)
  • It's in the FAQ section. Go to the bottom of the page when you're looking at the specs and stuff.

  • Future Crew, did have some killer demos, back in the early/mid 90's. Just killer demos!
  • by Anonymous Coward
    In the FAQ!

    Q. What operating systems does the Glaze3D(TM) support?

    A. Glaze3D(TM) supports all versions of Windows and Windows NT, including the new Windows 2000. It also supports Linux and is capable of working in any other PCI/AGP environment.
  • When they say it comes in quad, that doesn't mean 4 cards (e.g. SLI); it means it can process 4 pixels at a time, like how the Voodoo2, TNT, etc. do 2 at a time... 4 is just the next step. And based on an earlier post, it's 64 megs of RAM... again, just the next step above the current 32 on the TNT2 Ultra and V3.

    Wait for them to deliver; vaporware is more common than we like to believe..
  • a game released in 1996 that no one heard of... a 3D benchmarking program that no one uses... a first-person VR game that's yet to be released... all this is just a tragic *waste* of the sort of talent they had... Scream Tracker and the demos were groundbreaking. no one had ever done any of this stuff before--world-class rendering, fantastic speeds, incredible graphics--and all they have to show for it a few years later is *this*?
  • Although demo coding and chip design require completely different knowledge, many of the skills involved are common to both. The most important skill is the willingness to do the extra work on the details. Future Crew demos always stuck out from the herd. I think they could quite easily have the ability to design the silicon.

    To get the manufacturing done and then drivers written is a great deal of work though.

    I think they have a chance.
    If..

    They are phenomenally talented.
    They get venture capital and expand.
    Their technology is licensed by an existing 3D manufacturer who has fallen behind the pack.

    Failing that I think they have a good chance at making the silicon and selling the chips for use in arcade machines. They don't need standardised drivers etc.

    Of course they will sink into nothingness if their fill rate and tri-setup isn't top notch.
    Geometry acceleration would have been an advantage too.

    And then there's the golden rule of 3D cards.
    ---Rendering speed only counts in a shipping product.
  • Being probably the only person in the Linux community to obtain a signed NDA with TriTech for the Pyramid technical data, I have a different view of things. In my opinion, it was the management at TriTech that is to blame for the fiasco that was the Pyramid, not the designers. That chip was ready long before anyone in the regular trade rags heard about it. They spent all their time trying to get in bed with Microsoft (hell, they sponsored a LOT of the Meltdown events and supplied the technology to Microsoft for the DirectX bump mapping system) and not pushing to get a board vendor to make a board with it or make boards themselves. By the time they got "serious" about it, the chip had already been eclipsed in performance by other chips, even though it was superior in image quality to everything else at the time. Nobody wanted quality; they only wanted quantity at that point in time.
  • Since when do 3D cards make better games? Quake I was the best of the series, and Doom II was the best single-player shooter ever.

    The great console games were from the 16 bit wars era, not this new fangled 3d crap they've been pushing.

    I just want a Chrono Trigger sequel really, then I'll be happy.
  • Wrong, wrong, wrong. The chip manufacturer ALWAYS makes a set of reference drivers that work with the chip in question on a reference design board (which most consumer boards don't deviate from much, if at all) so I'd expect them to provide reference Linux drivers. I for one hope this board's FOR REAL, and I hope the drivers will be open source. Here's hopin'...
  • I think I might have been more impressed with their PCI (/AGP) development and testing platform. If it's as good as it sounds, I may even want to buy one for myself for a couple of projects I'm working on (where I'd much rather do high-level programming than embedded coding until it's really ready to become a chip). I don't know how this will compare to the next TNT chip, but I bet (if it ever gets done) it will kick 3dfx's butt (3dfx always skimps on way too many features).
  • I looked up Infineon, and found that it's the recently spun-off Siemens Semiconductor operation. Infineon say they'll have 0.17u EDRAM at about the same time the Glaze is supposed to be available. It's likely that they still have lots of design work to do before this chip tapes out.

    Every graphics chip supplier will need EDRAM for the next generation, and from what I've heard, they're all working on it.

    We'll see who has it first.
  • Oh, I don't know that I'd say the Virge cards are total crap. I've had my Virge 4MB since I bought my PC (circa two years ago) and it's held up pretty well. Anything that'll run Half-life acceptably can't be all bad.


    ...on the other hand, I just bought an STB Velocity 4400 cheap and I can't wait for it to get here. :)

    --
    "Perfection is achieved, not when there is nothing left to add, but when there is nothing left to take away. "

  • I wouldn't automatically assume that these are the same guys involved in the Pyramid3D chip.. They could be though.. Anyone have hard info?

    It is the same guys. Look in the "Team" section.

    Mika Tuomi and Sami Tammilehto are from Future Crew.

    Crystal Dreams II was a much better demo than Second Reality.

    This has been rehashed a million times in comp.sys.ibm.pc.demos, in 1993 that is. I think the basic consensus was that if you are a programmer CD2 was really interesting, but if you just show it to your regular friends/family, it is boring. SR was much more entertaining for regular folks.

    Whatever happened to Triton? What happened to Into the Shadows?

    Probably the same thing as a million other non-commercial game products. It takes so much more to make a playable game than a demo. (demo as in demoscene not as in gamedemo) Demo programming is fun because of programming against unusual constraints, and you can see and hear the results. What fun is there in figuring out how to read the joystick, or figuring out why it crashes on some computers? (That's a rhetorical question.)
