
Asus C229 Video Card Driver: Download and Install the Latest Version



The built-in Microsoft Windows Update service may not update your drivers properly. Instead, use the Video/Graphics Driver Update Utility for ASUS. It is intelligent software that automatically detects your computer's operating system and your video/graphics card's manufacturer and model to find the most up-to-date drivers for it. There is no risk of installing the wrong driver. The Video/Graphics Driver Update Utility downloads and installs your drivers quickly and easily.




Asus C229 Video card Driver



To find the latest driver, including Windows 11 drivers, choose from our list of most popular ASUS Video / Graphics downloads or search our driver archive for the driver that fits your specific ASUS video / graphics model and your PC's operating system.


After downloading your driver update, you will need to install it. Driver updates come in a variety of file formats with different file extensions. For example, you may have downloaded an EXE, INF, ZIP, or SYS file. Each file type has a slightly different installation procedure to follow. Visit our Driver Support Page to watch helpful step-by-step videos on how to install drivers based on their file extension.
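As one illustration (not specific to any particular ASUS package), if the download extracts to an INF-based driver package, Windows' built-in pnputil tool can install it. A minimal Python sketch follows; the INF path is a hypothetical placeholder, and it assumes Windows 10 or later and an elevated prompt.

import subprocess

# Hypothetical path to an extracted driver package - replace with your own.
inf_path = r"C:\Drivers\ASUS_Video\display.inf"

# pnputil is the built-in Windows driver utility: /add-driver stages the package
# and /install installs it on matching devices. Run from an elevated prompt.
result = subprocess.run(
    ["pnputil", "/add-driver", inf_path, "/install"],
    capture_output=True, text=True
)
print(result.stdout)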


The system specifications will remain the same throughout the testing. No adjustments will be made in the respective control panels during testing, with the exception of the 3DMark 11 runs, where PhysX will be disabled in the NVIDIA Control Panel where applicable. I will first test the cards at stock speeds and then overclocked, to see the effect of the increased clock speeds. The cards are ordered from highest to lowest performance in each graph to show where they fall by comparison. The AMD comparison cards use the 13.3 drivers, and the NVIDIA GT 640 cards use the 314.07 drivers.
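As a small illustration of the graphing step, the sketch below orders a set of results from highest to lowest average fps before charting; the card names and numbers are made up for the example.

# Hypothetical benchmark results (average fps); real numbers come from the test runs.
results = {
    "GeForce GTX 660 (stock)": 52.3,
    "GeForce GTX 660 (overclocked)": 57.1,
    "Radeon HD 7850 (stock)": 49.8,
    "GeForce GT 640 (stock)": 24.6,
}

# Order the cards from highest to lowest performance, as in the graphs.
for card, fps in sorted(results.items(), key=lambda item: item[1], reverse=True):
    print(f"{card:32s} {fps:6.1f} fps")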


Without the crucial drivers and software for your Logitech C922, your video quality can become distorted, and many of the main features may be unavailable. Some streaming software may not recognize your C922 without correct drivers, either.


The setup asked me if I wanted to keep my earlier installation or not... I chose NO... Then it asked for the location of the installation... C:\Windows... and that was all it asked. The installation went smoothly, and after the reboot the Windows XP boot screen was kept intact, except that the default option was now Millennium instead of XP. A huge "?" came up in my head, but since the default option had changed... why not? I selected Millennium and it booted fine. It finished the installation without any issue, and all installed programs were kept intact, except for the graphics card driver, but I installed that again and everything is fine now.


As usual, before we proceed to the analysis of the new accelerator, we recommend reading the analytical article scrutinizing the architecture and specifications of the NVIDIA GeForce FX (NV30).

CONTENTS

  • General information

  • Peculiarities of the NVIDIA GeForce FX 5800 Ultra 128MB video card

  • Test system configuration and drivers' settings

  • Test results: briefly on 2D

  • RightMark3D synthetic tests: philosophy and tests description

  • Test results: RightMark3D: Pixel Filling

  • Test results: RightMark3D: Geometry Processing Speed

  • Test results: RightMark3D: Hidden Surface Removal

  • Test results: RightMark3D: Pixel Shading

  • Test results: RightMark3D: Point Sprites

  • Test results: 3DMark2001 SE synthetic tests

  • Additional theoretical information and summary on the synthetic tests

  • Information on anisotropic filtering and anti-aliasing

  • Architectural features and prospects

  • Test results: 3DMark2001 SE: Game1

  • Test results: 3DMark2001 SE: Game2

  • Test results: 3DMark2001 SE: Game3

  • Test results: 3DMark2001 SE: Game4

  • Test results: 3DMark03: Game1

  • Test results: 3DMark03: Game2

  • Test results: 3DMark03: Game3

  • Test results: 3DMark03: Game4

  • Test results: Quake3 ARENA

  • Test results: Serious Sam: The Second Encounter

  • Test results: Return to Castle Wolfenstein

  • Test results: Code Creatures DEMO

  • Test results: Unreal Tournament 2003 DEMO

  • Test results: AquaMark

  • Test results: RightMark 3D

  • Test results: DOOM III Alpha version

  • 3D quality: Anisotropic filtering

  • 3D quality: Anti-aliasing

  • 3D quality in general

  • Conclusion

General information

We are now on the threshold of spring 2003. The NVIDIA NV25, which became the forefather of the GeForce4 Ti line, was released a year ago. Let's look back over the last two years. In winter 2001 the company launched the NV20 (GeForce3), which gave birth to many currently popular technologies. The new product, the NV25, should have arrived half a year later, were it not for ATI Technologies and its R200. In autumn we got only the Ti 200/500 from the Titanium family, as the GeForce3 line was referred to. They were not solutions based on a new chip, but simply updated cards based on the same NV20, brought to market because of marketing policy. That was the first break in NVIDIA's semi-annual cycle, and the NV25 came only a year after the NV20. The GeForce4 Ti is an improved version of the NV20: a more powerful 3D accelerator thanks to higher frequencies, 128MB memory cards (and the attempt to make this size standard), etc.

It seemed that half a year later the industry could bring to life the NV30, which looked so mysterious. Even if we ignore the previous delay with the NV25 (the NV30 was scheduled for 2002), the most optimistic forecasts pointed to August 2002. It seemed that we were on the edge of a new stage of competition between ATI and NVIDIA. The previous battle was lost by the Canadian company because the RADEON 8500 was too expensive and could compete only against the GeForce3, not the GeForce4 Ti. However, the NV25 had weak anisotropic filtering, and ATI's solution could show comparable performance with that function enabled, though with inferior anisotropic quality. So, in summer 2002 ATI's 3D flagship, the RADEON 9700 PRO, came onto the scene with great fanfare. All previous solutions looked faded, and ATI easily mounted the 3D gaming throne.

Where was the NV30? NVIDIA spent all the summer and autumn months mastering the 0.13 micron fabrication process so that it could make a chip with 125M transistors right on the new process. It's very likely that the product was repeatedly redesigned to get an acceptable yield of valid dies. At last, in November, they announced it, and we could even handle the new cards, but they were still too raw, including their drivers. So, NVIDIA missed one more semi-annual cycle. The second release of the NV25 with AGP 8x support, in the form of the NV28, doesn't count because it was just a marketing trick; besides, AGP 8x didn't help the NV28, because 128MB of local memory is now more than sufficient for all modern games. Thus, the gap between the NV20 and NV25 is actually a year, as is the gap between the NV25 and NV30. As for ATI, it keeps to a 9-month cycle, which is why in spring we expect its new products, the R350 and RV350.

What will they stand against? The NV30? It looks that way. The NV30 turned out to be the most powerful accelerator (see the details below), but such cards will be in short supply. Since NVIDIA was behind schedule and also planned to focus its partners' attention on the improved version of the NV30, the NV35, the output of NV30 dies was cut down: only about 100,000 pieces were produced. Taking into account the very high cost of the NV30 and NVIDIA's strategy of releasing a chip for professional graphics first (NV??GL) and only then a gaming solution (NV??), it was logical for NVIDIA to use most of the NV30 chips for Quadro FX production. Fortunately, prices for professional accelerators are traditionally high, so it's possible to recoup the huge NV30 expenses. As you know, a portion of new cards always goes to OEM companies, and the retail market will get only a small share of GeForce FX 5800 Ultra cards (rumor has it that such cards won't be available on the open market at all). Most likely, the remaining GPUs will be used for GeForce FX 5800 cards.

By the way, the NV30 chip is used for the whole line:

  • GeForce FX 5800 Ultra - 500 MHz chip, 128 MB 500 MHz (DDR II 1000) 128-bit local memory;

  • GeForce FX 5800 - 400 MHz chip, 128 MB 400 MHz (DDR II 800) 128-bit local memory.

We also expect cheaper and slower versions named NV31 and NV34. The company says they will simply be cut-down versions of the NV30, and the expenses of developing the NV3X technologies must be covered by sales of such cards. We will return to this issue later; for now, let's move on to the NV30 itself.

Characteristics:

  • 0.13 micron fabrication process, copper interconnects

  • 125M transistors

  • 3 geometry processors (each exceeds the DX9 VS 2.0 specs)

  • 8 pixel processors (markedly exceeding the DX9 PS 2.0 specs)

  • A flexibly configurable array of 8 pipelined texture filtering units that calculates up to 8 sampled and filtered results per clock

  • AGP 3.0 (8x) system interface

  • 128-bit (!) DDR II local memory interface (a worked peak-bandwidth figure follows this list)

  • Effective 4-channel memory controller with a crossbar

  • Advanced local memory optimization techniques: full frame buffer compression, including color data (for the first time; the compression ratio is fixed at 4:1, MSAA modes only) and depth (Z-buffer compression)

  • Tile optimization: caching, compression and Early Cull HSR

  • Support for precise integer formats (10/16 bits per component) and precise floating-point formats (16 and 32 bits per component, also known as 64- and 128-bit color) for textures and the frame buffer

  • End-to-end precision of all operations: 32-bit floating-point arithmetic

  • When enabled, the new optimized anisotropic filtering algorithm reduces the performance (fps) drop without serious quality degradation

  • Anisotropic quality up to 8x that of an ordinary bilinear texture, i.e. up to 128 discrete samples per texture fetch

  • New hybrid AA modes: 8x (DirectX and OpenGL) and 6xS (DirectX only)

  • Frame buffer compression makes it possible to markedly reduce the performance drop with FSAA enabled

  • Two integrated 10-bit 400 MHz RAMDACs

  • Integrated interface for an external TV-Out chip

  • TV-Out integrated into the GPU

  • Three TMDS interfaces for external DVI interface chips

  • The current consumed by the GeForce FX chip, built on the 0.13 micron process, is comparable to what the AGP 3.0 specification allows; therefore, it's possible to make cards without external power.
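As a quick sanity check on the memory figures above: a 128-bit interface at an effective 1000 MT/s ("DDR II 1000") gives 16 GB/s of peak bandwidth. A short calculation in Python:

# Peak local-memory bandwidth of the GeForce FX 5800 Ultra, from the specs above.
bus_width_bits = 128        # 128-bit DDR II interface
effective_mt_s = 1000       # 500 MHz DDR II, i.e. 1000 MT/s effective
bytes_per_transfer = bus_width_bits // 8

bandwidth_gb_s = bytes_per_transfer * effective_mt_s * 1e6 / 1e9
print(f"Peak bandwidth: {bandwidth_gb_s:.1f} GB/s")   # -> 16.0 GB/s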

And now, the block diagram of the GeForce FX. Functions of the blocks:

  • Cache controller, Memory controller, Crossbar - this block controls the exchange and caching of data coming from the GPU's local memory and the AGP system bus.

  • Vertex Processors - geometry processors that execute vertex shaders and emulate fixed-function T&L. They perform geometric transformations and prepare parameters for rendering and for the pixel processors.

  • Pixel Processors - execute pixel shaders and emulate the fixed pixel stages. They shade pixels and issue requests to the texture fetch units.

  • Texture Fetch & Filtering & Decompression Units - fetch, filter and decompress the texture values requested by the pixel processors.

  • Texture & Color Interpolators - interpolators of the texture coordinates and color values computed as output parameters by the vertex processors. For each pixel processor these units compute its unique input parameters according to the position of the pixel it is shading.

  • Frame Buffer Logic - this unit controls frame buffer operations, including frame buffer compression & decompression, caching, tile HSR logic (the so-called Early Cull HSR), MSAA sample allocation, and post-processing of samples - the final filtering in the FSAA modes (the FSAA post-processor).

  • 2D Core

  • Two display controllers, two RAMDACs and a rich set of interfaces including three integrated DVI and one integrated TV-Out

More detailed information on the NV30 can be found in our analytical article. In closing, here is the list of currently available OpenGL extensions and the OpenGL ICD version:

  • Vendor: NVIDIA Corporation

  • Renderer: GeForce FX 5800 Ultra/AGP/SSE2

  • Version: 1.4.0

  • Extensions:

  • GL_ARB_depth_texture

  • GL_ARB_fragment_program

  • GL_ARB_imaging

  • GL_ARB_multisample

  • GL_ARB_multitexture

  • GL_ARB_point_parameters

  • GL_ARB_shadow

  • GL_ARB_texture_border_clamp

  • GL_ARB_texture_compression

  • GL_ARB_texture_cube_map

  • GL_ARB_texture_env_add

  • GL_ARB_texture_env_combine

  • GL_ARB_texture_env_dot3

  • GL_ARB_texture_mirrored_repeat

  • GL_ARB_transpose_matrix

  • GL_ARB_vertex_program

  • GL_ARB_window_pos

  • GL_S3_s3tc

  • GL_EXT_abgr

  • GL_EXT_bgra

  • GL_EXT_blend_color

  • GL_EXT_blend_func_separate

  • GL_EXT_blend_minmax

  • GL_EXT_blend_subtract

  • GL_EXT_compiled_vertex_array

  • GL_EXT_draw_range_elements

  • GL_EXT_fog_coord

  • GL_EXT_multi_draw_arrays

  • GL_EXT_packed_pixels

  • GL_EXT_point_parameters

  • GL_EXT_rescale_normal

  • GL_EXT_secondary_color

  • GL_EXT_separate_specular_color

  • GL_EXT_shadow_funcs

  • GL_EXT_stencil_two_side

  • GL_EXT_stencil_wrap

  • GL_EXT_texture3D

  • GL_EXT_texture_compression_s3tc

  • GL_EXT_texture_cube_map

  • GL_EXT_texture_edge_clamp

  • GL_EXT_texture_env_add

  • GL_EXT_texture_env_combine

  • GL_EXT_texture_env_dot3

  • GL_EXT_texture_filter_anisotropic

  • GL_EXT_texture_lod

  • GL_EXT_texture_lod_bias

  • GL_EXT_texture_object

  • GL_EXT_vertex_array

  • GL_HP_occlusion_test

  • GL_IBM_texture_mirrored_repeat

  • GL_KTX_buffer_region

  • GL_NV_blend_square

  • GL_NV_copy_depth_to_color

  • GL_NV_depth_clamp

  • GL_NV_fence

  • GL_NV_float_buffer

  • GL_NV_fog_distance

  • GL_NV_fragment_program

  • GL_NV_half_float

  • GL_NV_light_max_exponent

  • GL_NV_multisample_filter_hint

  • GL_NV_occlusion_query

  • GL_NV_packed_depth_stencil

  • GL_NV_pixel_data_range

  • GL_NV_point_sprite

  • GL_NV_primitive_restart

  • GL_NV_register_combiners

  • GL_NV_register_combiners2

  • GL_NV_texgen_reflection

  • GL_NV_texture_compression_vtc

  • GL_NV_texture_env_combine4

  • GL_NV_texture_expand_normal

  • GL_NV_texture_rectangle

  • GL_NV_texture_shader

  • GL_NV_texture_shader2

  • GL_NV_texture_shader3

  • GL_NV_vertex_array_range

  • GL_NV_vertex_array_range2

  • GL_NV_vertex_program

  • GL_NV_vertex_program1_1

  • GL_NV_vertex_program2

  • GL_NVX_ycrcb

  • GL_SGIS_generate_mipmap

  • GL_SGIS_texture_lod

  • GL_SGIX_depth_texture

  • GL_SGIX_shadow

  • GL_WIN_swap_hint

  • WGL_EXT_swap_control
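A list like the one above can be dumped from any card programmatically. Below is a minimal sketch, assuming the PyOpenGL and glfw Python packages are installed and that the driver provides a legacy/compatibility context (glGetString(GL_EXTENSIONS) is not available in core profiles); a hidden window is created only to obtain a GL context.

import glfw
from OpenGL.GL import glGetString, GL_VENDOR, GL_RENDERER, GL_VERSION, GL_EXTENSIONS

# Create a hidden window just to get an OpenGL context.
glfw.init()
glfw.window_hint(glfw.VISIBLE, glfw.FALSE)
window = glfw.create_window(64, 64, "gl-probe", None, None)
glfw.make_context_current(window)

# Query the same strings the review lists: vendor, renderer, version, extensions.
print("Vendor:  ", glGetString(GL_VENDOR).decode())
print("Renderer:", glGetString(GL_RENDERER).decode())
print("Version: ", glGetString(GL_VERSION).decode())
for ext in sorted(glGetString(GL_EXTENSIONS).decode().split()):
    print(" ", ext)

glfw.terminate()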

Card

The card has an AGP 4x/8x interface and 128 MB of GDDR-II SDRAM (8 chips on both sides of the PCB). The card comes with Samsung K4N26323AE-GC1K memory chips in a BGA package. Their maximum rated frequency is 550 (1100) MHz, which means the access time must be 1.8 ns. The memory runs at 500 (1000) MHz in 3D and at 300 (600) MHz in 2D. This is the fastest memory in mass (and pre-mass) production today.

NVIDIA GeForce FX 5800 Ultra 128MB

The first thing that catches the eye is the gigantic cooler. Because the cooler's fastening is very complicated and the heatsink is firmly glued to the chip, I didn't take the cooler off; above you can see one of the cards we tested before. Although there are some slight differences in the PCBs, the boards can be considered practically identical. Here is an example of an attempt to take off the cooler that tore off the chip's surface (courtesy of Yinchu Chan AKA Cho; the picture is from a popular Chinese site). By the way, it lets us estimate the die's size. And since we have touched upon the chip, note that unlike ATI's R300, this die has an all-metal lid, which protects it from chipping and serves as an additional heatsink. The package is the same flip-chip FCPGA. As expected, here we have the NV30GL. Why should we expect it? Above we mentioned NVIDIA's new strategy: it first makes a chip with the full feature set for both professional and gaming needs, and then locks some professional features to make gaming solutions.

The PCB is very complicated. Although it supports only a 128-bit memory bus, the design consists of 12 layers, two of them for shielding, which protects the high-speed memory from interference; GDDR-II is said to be demanding in this respect. By the way, the memory is cooled very carefully: the heatsink is made of a copper alloy, and there is a thermal interface layer to ensure good heat transfer.

Now look at the impressive FlowFX cooler. The device consists of a copper plate and a turbine cooler whose fan pumps air through the pipe heatsink. The pipes connecting the copper plate with this heatsink carry a low-boiling liquid that transfers the heat. Such a huge cooler makes the card wider, which is why the first PCI slot behind the AGP slot can't be used. On the first "floor" of the card are the D-Sub, DVI and S-Video connectors; on the second is the opening for taking in cold air and exhausting hot air. The cooler has a lot of fasteners: two bolts and four clips on the copper plate's edges near the memory chips. On the back are a spring bracket and a copper plate for cooling the memory.

Such a solid cooler is justified, as the chip and the memory generate a great amount of heat. However, such a turbine is not strictly necessary: we have some experience of working with the card using an ordinary cooler, and no stability problems were noticed. Just note that the memory needs cooling in any case, at least passive cooling. The system makes a lot of noise (like any other turbine). Probably to make the whine quieter, the developers decided to lower the fan speed together with the card's clocks in 2D; we will talk about this in the drivers' section. The fan's speed is probably controlled by changing the voltage, because, as the pictures show, it has no tachometer. There may be a logic element under the heatsink that controls it, or the chip itself may do so - the latter is more probable, since the NV30 supports hardware temperature monitoring.
I spent quite a lot of time working with this card, and I must say that the noise the FlowFX makes really gets on one's nerves, since the noise level keeps changing as the card switches between 2D and 3D. When a test suite is run in batch mode, it's not pleasant to hear the cooler's revolutions constantly changing. The card is as long as a GeForce4 Ti 4600 based one. Besides, the card requires an external power supply, for which there is a special socket in the upper right-hand corner (like on the 3dfx Voodoo5 5500; by the way, it's curious that the "FX" is named after 3dfx, the external power connector is like the Voodoo5's, the card is long (almost like the Voodoo5 5500), and the number 5800 is not far from 5500 :-) ). It's possible that such cards won't appear on the retail market at all, and the cooling system may be changed by the manufacturers.

And the last thing to note: we will omit the TV-out operation of this card because the review is already too lengthy, but we will cover it in an upcoming review of a similar solution (probably a production card based on the GeForce FX 5800). Just note that this sample doesn't have a TV-out codec, though a place for one is provided; the drivers report that the TV codec is integrated into the chip. We'll check that.

Overclocking

Unfortunately, overclocking is far from simple. But maybe that is a good thing, given the risk of burning out such an expensive card. The monitoring logic controls the frequencies: when we raise them, it either slows the card down (like the thermal protection of modern Intel processors) or returns the frequencies to the default values via the driver, according to information from the thermal sensors. But we have not stopped looking into this question. [ Part 2 ]

Andrey Vorobiev (anvakams@ixbt.com)
Alexander Medvedev (unclesam@ixbt.com)

