DOOM3 on GeForce3
Software, by Webmaster - Thursday, 22 February 2001 at 19:47 [ Software ]
When Carmack of Id Software took part in the introduction of the GeForce3 for Apple computers on 20 February (MacWorld in Tokyo), he showed the first images from DOOM3. The images are something of a world sensation, since they are not pre-rendered but were running in real time on a GeForce3. Do you think a TNT2 Ultra and a 1.5 GHz Pentium 4 could handle this:


[Image] Source: http://www.voodooextreme.com/

Note the incredibly detailed facial contours. The lighting likewise helps heighten the realism. Here is another image:


[Image] Source: http://www.voodooextreme.com/

Probably one of the many monsters that is always in exactly the corridor you need to pass through... Here, though, is a monster that is sure to bring on a sweat if the grenade launcher is empty:


[Image] Source: http://www.voodooextreme.com/

Once again, an absolutely incredible level of detail. If Id Software can produce a game with details like these and fantastic gameplay to match, then Nvidia, ATI, Intel and AMD will surely be applauding. DOOM3 thus has the potential to turn the PlayStation2 into something we poke at with a stick... Here is a slightly less detailed image, probably owing to the greater distance:


[Image] Source: http://www.voodooextreme.com/

We finish off with what could be your buddy, who has just caught sight of "something moving in the dark":


[Image] Source: http://www.voodooextreme.com/

As mentioned, the images above are taken directly from the game and not from a pre-rendered intro movie produced on giant supercomputers. The question is whether the GeForce2 (Ultra) can keep up here at all. Carmack has, incidentally, stated the following: "The ideal card for DOOM [3, ed.] hasn't shipped yet, but there are a couple good candidates just over the horizon."

Source: http://www.voodooextreme.com/games/interviews/carmack/2.html

Carmack goes on to criticise the GeForce2, Radeon and so on. So here you have it in black and white - the GeForce2 cannot handle DOOM3 properly, while everything now suggests that the GeForce3 should have muscle enough.

At first glance, the monsters and other creatures in DOOM3 look considerably more detailed and lifelike than those in Unreal2 - and that is not because Unreal2 is bad in any way:


[Image] Source: http://www.dailyradar.com/features/game_feature_page_1444_1.html

Unreal2 is likewise outstanding, but clearly lacks the polish that DOOM3 displays. This is probably because DOOM3 uses more polygons in the images above than was the case in the Unreal2 images (from September 2000). Neither DOOM3 nor Unreal2 is finished yet, though, so a lot can still change. But it is interesting that Carmack is over the moon about the GeForce3 - and it normally takes a great deal before Carmack has a hardware orgasm :-) Here are a couple of statements from Carmack, and when he speaks, Bill Gates, hardware manufacturers and gaming enthusiasts all listen:

"We've been doing hacks and tricks for years, but now we'll be able to do things we've been wanting to do for a long time," Carmack said. "For instance, every light has its own highlight and every surface casts a shadow, like in the real world. Everything can behave the same now and we can apply effects for every pixel."

"We're very excited about the quality we're getting," Carmack added. "This is a wonderful time to be in graphics and the GeForce 3 is the most exciting thing we've had to work with in years." (min fremhævelse, red.)

Source: http://www.maccentral.com/news/0102/21.geforce.shtml
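
Carmack's "every light has its own highlight" remark maps to a simple piece of per-pixel math: the Lambert diffuse term evaluated at every pixel instead of once per vertex. Here is a minimal, illustrative C sketch of that idea (our own illustration, not Id's code; all names are hypothetical):

    typedef struct { float x, y, z; } Vec3;

    static float dot3(Vec3 a, Vec3 b) { return a.x*b.x + a.y*b.y + a.z*b.z; }

    /* Per-pixel Lambert diffuse: instead of computing N.L once per vertex
       and interpolating the resulting color (Gouraud shading), the dot
       product is evaluated for every pixel from an interpolated normal,
       which is what keeps each light's highlight and falloff sharp. */
    float diffuse_at_pixel(Vec3 normal, Vec3 light_dir)
    {
        float d = dot3(normal, light_dir);
        return d > 0.0f ? d : 0.0f;   /* back-facing pixels receive no light */
    }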

Carmack's enthusiasm could hardly be put better. Steve Jobs of Apple presented the following additional information:

"Jobs talked about the hours needed to render individual seconds of Luxo Jr ., the first Pixar movie, on a Cray supercomputer -- Kirk demonstrated Nvidia's interpretation of Luxo Jr and showed the cinematic effects being done in real-time."

Source: http://www.maccentral.com/news/0102/21.live.shtml

So perhaps it is correct after all when the GeForce3 was earlier described as "Toy Story On Chip". At any rate, it looks as if the GeForce3 could render the Pixar animated film "Toy Story" in real time without much trouble - the DOOM3 images underline exactly that. Don't expect DOOM3 or Unreal2 to arrive before December 2001 or 2002, though, whereas the GeForce3 can probably be bought in Denmark before the end of March. So only a fool buys a GeForce2 Ultra today :-)

The GeForce3 presentation (MacWorld, Tokyo), where you can see John Carmack demonstrate DOOM3 in real time (quite impressive), has been recorded in fairly good resolution. The video file is 14 MB (8:39 min long). You can download the GeForce3/Doom3 film locally from HardwareTidende.dk - just click this link:

http://www.hardwaretidende.dk/hard/links.php?node=01/04/09/2678336

DOOM3 on a GeForce3 is almost a religious experience :-)



Rune wrote, Thursday, 22 February 2001 at 22:14:
So Nvidia has finally let the cat out of the bag and has just published an official press release about the GEFORCE3. Read it here:

http://biz.yahoo.com/bw/010222/0193_2.html

or here:

http://www.hardocp.com/new_img_01/feb/022201b.html

The long and short of it: the GeForce3 can handle FSAA at high resolutions and on this point is 4 times faster than the GeForce2 ULTRA! The GPU is now programmable, but exactly what all of this means we will have to wait to see until the first official benchmarks arrive next week. The effective RAM bandwidth is reportedly far, far better than the GeForce2 ULTRA's. HARDOCP.com had an article on the way which Nvidia has just vetoed - fortunately, VoodooExtreme.com managed to get hold of some of the content:

"If the Quake3 framerates are correct, the GeForce3 is ungodly fast when it comes to sheer horsepower. Of course what may seem to be excessive now may turn out to not be when games like DOOM3 and Duke Nukem Forever finally make it to the shelves. And from what I hear, the GeForce3 will be THE card to have to play both of those titles.".

So UNGODLY FAST in Quake3 Arena... THE card for DOOM3... yum... yum... hmmm, now where did I put my drool bib?

Later heard on the news: "Masked man yesterday threatened a clerk in a computer shop into handing over a special computer card. The robber showed no interest in the till or other valuables"

- Rune



Rune wrote, Friday, 23 February 2001 at 00:39:
If you want to see more images of Doom3 on GeForce3, Doom3.dk has an impressive collection here:

http://www.doom3.dk/screenshots.asp

You may also be able to find images here:

http://www.shugashack.com/screens.x/doom2k/New%20Doom

but those appear to be the same ones you can find at Doom3.dk... :-)

- Rune



Rune wrote, Friday, 23 February 2001 at 10:50:
John Carmack, the man behind Wolfenstein, Doom 1-2 and Quake 1-3, has described his experiences with the GEFORCE3. It is a long account, and we naturally bring the whole thing. To make it easier for readers who only eat the steak and skip the salad and potatoes, we have highlighted the most important passages in bold - so if you can't face reading it all, look for the highlighted text!

Name: John Carmack
Email: johnc@idsoftware.com
Description: Programmer
Project:

Feb 22, 2001

I just got back from Tokyo, where I demonstrated our new engine running under MacOS-X with a GeForce 3 card. We had quite a bit of discussion about whether we should be showing anything at all, considering how far away we are from having a title on the shelves, so we probably aren't going to be showing it anywhere else for quite a while. We do run a bit better on a high end wintel system, but the Apple performance is still quite good, especially considering the short amount of time that the drivers had before the event. It is still our intention to have a simultaneous release of the next product on Windows, MacOS-X, and Linux.

Here is a dump on the GeForce 3 that I have been seriously working with for a few weeks now:

The short answer is that the GeForce 3 is fantastic. I haven't had such an impression of raising the performance bar since the Voodoo 2 came out, and there are a ton of new features for programmers to play with. Graphics programmers should run out and get one at the earliest possible time. For consumers, it will be a tougher call. There aren't any applications out right now that take proper advantage of it, but you should still be quite a bit faster at everything than GF2, especially with anti-aliasing. Balance that against whatever the price turns out to be.

While the Radeon is a good effort in many ways, it has enough shortfalls that I still generally call the GeForce 2 ultra the best card you can buy right now, so Nvidia is basically dethroning their own product. It is somewhat unfortunate that it is labeled GeForce 3, because GeForce 2 was just a speed bump of GeForce, while GF3 is a major architectural change. I wish they had called the GF2 something else.

The things that are good about it:

Lots of values have additional internal precision, like texture coordinates and rasterization coordinates. There are only a few places where this matters, but it is nice to be cleaning up. Rasterization precision is about the last thing that the multi-thousand dollar workstation boards still do any better than the consumer cards.

Adding more texture units and more register combiners is an obvious evolutionary step. An interesting technical aside: when I first changed something I was doing with five single or dual texture passes on a GF to something that only took two quad texture passes on a GF3, I got a surprisingly modest speedup. It turned out that the texture filtering and bandwidth was the dominant factor, not the frame buffer traffic that was saved with more texture units. When I turned off anisotropic filtering and used compressed textures, the GF3 version became twice as fast.

The 8x anisotropic filtering looks really nice, but it has a 30%+ speed cost. For existing games where you have speed to burn, it is probably a nice thing to force on, but it is a bit much for me to enable on the current project. Radeon supports 16x aniso at a smaller speed cost, but not in conjunction with trilinear, and something is broken in the chip that makes the filtering jump around with triangular rasterization dependencies.

The depth buffer optimizations are similar to what the Radeon provides, giving almost everything some measure of speedup, and larger ones available in some cases with some redesign.

3D textures are implemented with the full, complete generality. Radeon offers 3D textures, but without mip mapping and in a non-orthogonal manner (taking up two texture units).

Vertex programs are probably the most radical new feature, and, unlike most "radical new features", actually turn out to be pretty damn good. The instruction language is clear and obvious, with wonderful features like free arbitrary swizzle and negate on each operand, and the obvious things you want for graphics like dot product instructions. The vertex program instructions are what SSE should have been. A complex setup for a four-texture rendering pass is way easier to understand with a vertex program than with a ton of texgen/texture matrix calls, and it lets you do things that you just couldn't do hardware accelerated at all before. Changing the model from fixed function data like normals, colors, and texcoords to generalized attributes is very important for future progress. Here, I think Microsoft and DX8 are providing a very good benefit by forcing a single vertex program interface down all the hardware vendor's throats.
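
As an aside from us: the instruction language Carmack praises here is the NV_vertex_program OpenGL extension that ships with the GeForce3. Below is a small sketch in C of what it looks like, showing the per-operand swizzle/negate and the DP4 dot-product instruction he mentions. The program text follows the published extension spec; the setup calls are from the same extension, and the function and variable names are our own:

    #include <GL/gl.h>
    #include <GL/glext.h>   /* NV_vertex_program tokens; on Windows the entry
                               points are fetched via wglGetProcAddress */

    /* v[] = per-vertex attributes, c[] = constant registers, o[] = outputs */
    static const char vp[] =
        "!!VP1.0\n"
        /* clip-space position: four DP4s against the tracked
           modelview-projection matrix rows in c[0]..c[3] */
        "DP4 o[HPOS].x, c[0], v[OPOS];\n"
        "DP4 o[HPOS].y, c[1], v[OPOS];\n"
        "DP4 o[HPOS].z, c[2], v[OPOS];\n"
        "DP4 o[HPOS].w, c[3], v[OPOS];\n"
        /* 'free arbitrary swizzle and negate on each operand': reorder and
           negate the normal's components in a single MOV */
        "MOV o[COL0], -v[NRML].zyxw;\n"
        "END\n";

    void load_vertex_program(GLuint progId)
    {
        glLoadProgramNV(GL_VERTEX_PROGRAM_NV, progId,
                        (GLsizei)(sizeof(vp) - 1), (const GLubyte *)vp);
        /* track the modelview-projection matrix into c[0]..c[3] */
        glTrackMatrixNV(GL_VERTEX_PROGRAM_NV, 0,
                        GL_MODELVIEW_PROJECTION_NV, GL_IDENTITY_NV);
        glBindProgramNV(GL_VERTEX_PROGRAM_NV, progId);
        glEnable(GL_VERTEX_PROGRAM_NV);
    }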

This one is truly stunning: the drivers just worked for all the new features that I tried. I have tested a lot of pre-production 3D cards, and it has never been this smooth.

The things that are indifferent:

I'm still not a big believer in hardware accelerated curve tessellation.

I'm not going to go over all the reasons again, but I would have rather seen the features left off and ended up with a cheaper part.

The shadow map support is good to get in, but I am still unconvinced that a fully general engine can be produced with acceptable quality using shadow maps for point lights. I spent a while working with shadow buffers last year, and I couldn't get satisfactory results. I will revisit that work now that I have GeForce 3 cards, and directly compare it with my current approach.

At high triangle rates, the index bandwidth can get to be a significant thing. Other cards that allow static index buffers as well as static vertex buffers will have situations where they provide higher application speed. Still, we do get great throughput on the GF3 using vertex array range and glDrawElements.
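
For reference, the indexed drawing path Carmack refers to looks like this in plain OpenGL 1.1 - a hedged sketch with our own names; with NV_vertex_array_range the vertex data would live in AGP or video memory, but the index array still streams to the card on every call, which is why index bandwidth shows up at high triangle rates:

    #include <GL/gl.h>

    /* Draw a mesh as indexed triangles. The indices are transferred with
       each call, so their bandwidth becomes measurable at high triangle
       rates even when the vertices themselves stay on the card. */
    void draw_mesh(const GLfloat *xyz, const GLushort *indices,
                   GLsizei index_count)
    {
        glEnableClientState(GL_VERTEX_ARRAY);
        glVertexPointer(3, GL_FLOAT, 0, xyz);
        glDrawElements(GL_TRIANGLES, index_count, GL_UNSIGNED_SHORT, indices);
        glDisableClientState(GL_VERTEX_ARRAY);
    }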

The things that are bad about it:

Vertex programs aren't invariant with the fixed function geometry paths. That means that you can't mix vertex program passes with normal passes in a multipass algorithm. This is annoying, and shouldn't have happened.

Now we come to the pixel shaders, where I have the most serious issues. I can just ignore this most of the time, but the way the pixel shader functionality turned out is painfully limited, and not what it should have been. DX8 tries to pretend that pixel shaders live on hardware that is a lot more general than the reality. Nvidia's OpenGL extensions expose things much more the way they actually are: the existing register combiners functionality extended to eight stages with a couple tweaks, and the texture lookup engine is configurable to interact between textures in a list of specific ways. I'm sure it started out as a better design, but it apparently got cut and cut until it really looks like the old BumpEnvMap feature writ large: it does a few specific special effects that were deemed important, at the expense of a properly general solution.

Yes, it does full bumpy cubic environment mapping, but you still can't just do some math ops and look the result up in a texture. I was disappointed on this count with the Radeon as well, which was just slightly too hardwired to the DX BumpEnvMap capabilities to allow more general dependent texture use. Enshrining the capabilities of this mess in DX8 sucks. Other companies had potentially better approaches, but they are now forced to dumb them down to the level of the GF3 for the sake of compatibility. Hopefully we can still see some of the extra flexibility in OpenGL extensions.

The future:

I think things are going to really clean up in the next couple years. All of my advocacy is focused on making sure that there will be a completely clean and flexible interface for me to target in the engine after DOOM, and I think it is going to happen.

The market may have shrunk to just ATI and Nvidia as significant players. Matrox, 3D labs, or one of the dormant companies may surprise us all, but the pace is pretty frantic. I think I would be a little more comfortable if there was a third major player competing, but I can't fault Nvidia's path to success.

Source: http://www.bluesnews.com/cgi-bin/finger.pl?id=1&time=20010222225435



Rune wrote, Monday, 23 July 2001 at 10:22:
DoomWorld has published a FAQ about Doom3 (or whatever the game ends up being called), which can be read here:

http://www.doomworld.com/files/doom3faq.shtml


Rune wrote, Thursday, 2 August 2001 at 12:31:
According to Carmack, a GeForce3 will manage 30 FPS in Doom3 with all details turned on; if you don't have a GeForce3 or better, you will probably have to reduce the level of detail in the game. Here is what the almighty Carmack has just stated:

"Obviously, any game done with the new Doom engine is going to run slower than a game done with Q3 technology. You can make some of it back up by going to the simpler lighting model and running at a lower resolution, but you just won't be able to hit 60+ fps on a GF2. The low end of our supported platforms will be a GF1 / 64 bit GF2Go / Radeon, and it is expected to chug a bit there, even with everything cut down. There are several more Q3 engine games in the works that will continue to run great on existing systems, and Doom is still a long ways off in any case, so there will be a lot more upgrades and new systems. We are aiming to have a GF3 run Doom with all features enabled at 30 fps. We expect the high end cards at the time of release to run it at 60+ fps with improved quality. This is an intentionally lower average FPS for the hardware cross section than we targeted for Q3, but still higher than we targeted Q2 and earlier games (before hardware acceleration was prevalent). In the GLQuake days, light maps were considered an extravagance ("Render the entire screen TWICE? Are you MAD?"), and some unfortunate hardware companies just thought increased performance meant higher resolutions and more triangles instead of more complex pixel operations. Five passes sounds like a lot right now, but it will be just as quaint as dual texturing in the near future. I am quite looking forward to 100+ operations per interaction in future work.

John Carmack"

Source: http://slashdot.org/comments.pl?sid=01/08/01/1220230&cid=220

Hmmm, one wonders when Doom3 will actually be released. If Doom3 is delayed until autumn 2002, Carmack may well be referring to a Radeon3 and a GeForce5 (DreamForce1) when it comes to running Doom3 at 60+ FPS with all details enabled...


Rune wrote, Friday, 3 August 2001 at 12:31:
We have just adjusted the brightness of the DOOM3 images in the article above, so it should now be considerably easier to see what the images show.


Rune wrote, Monday, 3 September 2001 at 11:28:
See the latest new Doom3 images here:

http://www.voodooextreme.com/index.taf?start=2&days=1


Rune wrote, Monday, 3 September 2001 at 11:32:
I also came across this new Doom3 image:



Rune wrote, Thursday, 23 May 2002 at 20:53:
Carmack has once again commented on Doom3 and current graphics cards, among other things:

GameSpy: The world of video cards seems to change on a daily basis. What do you think of the current crop of cards on the market, and where do you see things heading? Are there any new cards that interest you? Where would you like to see things go?

Carmack: There are interesting things to be said about the upcoming cards, but NDAs will force me to just discuss the available cards.

In order from best to worst for Doom:

I still think that overall, the GeForce 4 Ti is the best card you can buy. It has high speed and excellent driver quality.

Based on the feature set, the Radeon 8500 should be a faster card for Doom than the GF4, because it can do the seven texture accesses that I need in a single pass, while it takes two or three passes (depending on details) on the GF4. However, in practice, the GF4 consistently runs faster due to a highly efficient implementation. For programmers, the 8500 has a much nicer fragment path than the GF4, with more general features and increased precision, but the driver quality is still quite a ways from Nvidia's, so I would be a little hesitant to use it as a primary research platform.

The GF4-MX is a very fast card for existing games, but it is less well suited to Doom, due to the lower texture unit count and the lack of vertex shaders.

On a slow CPU with all features enabled, the GF3 will be faster than the GF4-MX, because it offloads some work. On systems with CPU power to burn, the GF4-MX may still be faster.

The 128 bit DDR GF2 systems will be faster than the Radeon-7500 systems, again due to low level implementation details overshadowing the extra texture unit.

The slowest cards will be the 64 bit and SDR ram GF and Radeon cards, which will really not be fast enough to play the game properly unless you run at 320x240 or so. (my emphases, ed.)

Source: http://www.gamespy.com/e32002/pc/carmack/
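
A back-of-the-envelope way to read those pass counts: the minimum number of passes is simply the texture accesses divided by the textures a card can sample per pass, rounded up. A tiny C sketch - the per-pass unit counts in the comment are our illustrative assumptions, not figures from the interview:

    /* Minimum rendering passes for a given number of texture accesses,
       assuming each pass can sample 'units' textures (real pass counts
       also depend on combiner limits, as Carmack notes). */
    int min_passes(int accesses, int units)
    {
        return (accesses + units - 1) / units;   /* ceiling division */
    }

    /* Doom's 7 accesses: min_passes(7, 4) == 2 for a card assumed to
       sample 4 textures per pass, min_passes(7, 7) == 1 for a card that
       can do all seven in a single pass, like the Radeon 8500. */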


Oldtimer wrote, Thursday, 2 September 2004 at 13:05:
whoaaaa, check out these screenshots. They really did polish up the graphics in the 3.5 years since this article *LOL*


GWiZ wrote, Friday, 3 September 2004 at 00:13:
"Forvent dog ikke, at DOOM3 eller Unreal2 ankommer før dec. 2001 eller 2002" Høh, nogen spil tager den tid nogen spil skal tage!


Anders wrote, Friday, 3 September 2004 at 13:23:
"Så her har du det sort på hvidt - GeForce2 kan ikke håndtere DOOM3 ordentligt, mens alt nu tyder på, at GeForce3 derimod burde have armkræfter nok." -hæhæ


Rune wrote, Friday, 3 September 2004 at 13:29:
We keep getting wiser - so this claim, unfortunately, didn't hold water either:

"Så måske er det alligevel korrekt, når GeForce3 tidligere blev omtalt som "Toy Story On Chip". Umiddelbart ser det i hvertfald ud til, at GeForce3 ville kunne rendere tegnefilmen "Toy Story" fra Walt Disney i realtime uden store problemer - billederne fra DOOM3 understreger netop dette."

:-)

Beyond that, Doom3 can reportedly run reasonably on a GeForce3, i.e. without FSAA and anisotropic filtering, in low detail and probably at 640x480 (maybe 800x600) - hardly anything to cheer much about, though...

And should anyone be missing an article extolling Unreal3 and the GeForce 6800 Ultra, perhaps they now know why we have refrained from publishing one :-)


Anders wrote, Friday, 3 September 2004 at 13:35:
Hehe - hardly anyone would blame HardwareTidende for anything, but this article does illustrate how long we have been waiting for DOOM3 and HL2, and how long the hype around these games has been running... Good thing a few fine releases have arrived in the meantime!


Ánders wrote, Tuesday, 30 November 2004 at 17:38:
In light of how much hype some game and hardware producers try to whip up long before their products are finished, it is remarkable how few slip-ups HardwareTidende has had. This article about Id's "presentation" of DOOM3 (three years before the final release) is a bit amusing, though :)

