Comments on Zack Rusin: Mesa and LLVM (14 comments)

Anonymous, 2007-05-25 14:32:
I bestow the blue crayon of honor upon thee, turkey.

Zack, 2007-05-25 11:42:
Hi, thank you for all the comments!

To answer all of them in one go:
1) I'll be talking about the special effects library at Akademy.
2) Yes, Apple is already using LLVM in their OpenGL implementation, but only for the software path.
3) Partially; Keith is working on a lot of other amazing things and he's focusing on different areas right now.
4) Personally I'll be working on Intel drivers first, and Intel hardware, especially the 965, is what I'd recommend for general desktop usage (they're not there yet, though). If closed source doesn't bother you, then NVIDIA drivers are still the best.
5) Microsoft Research Accelerator focuses on the general parallel aspects of programming that map to next-gen hardware. We're focusing more on a general architecture that would let us aggressively optimize dedicated shading languages while providing a means of running programs in any arbitrary language (most of them, by definition, not parallel at all) on the GPU.
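To make point 5 concrete, here is a minimal sketch, not Zack's actual code, of the first half of that pipeline: a frontend building a tiny per-fragment function as LLVM IR through the C++ IRBuilder API, which any backend (CPU or GPU) could then lower. The function name brighten is invented for illustration, and the API shown is the modern one; the LLVM API of 2007 differed considerably.

    #include "llvm/IR/IRBuilder.h"
    #include "llvm/IR/LLVMContext.h"
    #include "llvm/IR/Module.h"
    #include "llvm/IR/Verifier.h"
    #include "llvm/Support/raw_ostream.h"
    #include <memory>

    int main() {
      auto ctx = std::make_unique<llvm::LLVMContext>();
      auto mod = std::make_unique<llvm::Module>("shader", *ctx);
      llvm::IRBuilder<> b(*ctx);

      // float brighten(float c) { return c * 1.5f; }
      // A stand-in for the per-fragment arithmetic a shading-language
      // frontend would emit.
      auto *fnTy =
          llvm::FunctionType::get(b.getFloatTy(), {b.getFloatTy()}, false);
      auto *fn = llvm::Function::Create(fnTy, llvm::Function::ExternalLinkage,
                                        "brighten", mod.get());
      b.SetInsertPoint(llvm::BasicBlock::Create(*ctx, "entry", fn));
      b.CreateRet(b.CreateFMul(fn->getArg(0),
                               llvm::ConstantFP::get(b.getFloatTy(), 1.5)));

      llvm::verifyFunction(*fn);
      mod->print(llvm::outs(), nullptr); // dump the IR any backend could lower
      return 0;
    }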
Anonymous, 2007-05-24 23:03:
How does this compare to MS Research Accelerator?
ftp://ftp.research.microsoft.com/pub/tr/TR-2005-184.pdf
and http://research.microsoft.com/act/ (see the Accelerator section)

Anonymous, 2007-05-24 21:19:
Great, so we can also finally replace the nvidia-cg-toolkit with a free alternative without any problems :)

(People who want to be compatible with both DirectX and OpenGL use Cg, and IMHO the non-free NVIDIA toolkit is the only one available at the moment.)

Patrice Tremblay, 2007-05-24 21:00:
Wow! Double Wow! No... I mean Wow.

Anonymous, 2007-05-24 20:23:
"I'll be buying a new computer in a couple of months. What video cards support both Open Source drivers and all the new Qt/KDE eyecandy?"

Current video cards? None.

(The Intel IGP isn't a "card", and its performance is abysmal anyway.)

Anonymous, 2007-05-24 18:14:
I'll be buying a new computer in a couple of months. What video cards support both Open Source drivers and all the new Qt/KDE eyecandy?

Anonymous, 2007-05-24 17:56:
Is this code the "proposal" that Keith Whitwell talks about here?

http://lists.cs.uiuc.edu/pipermail/llvmdev/2007-May/009110.html

Saem, 2007-05-24 17:29:
@illissius: Think of Nvidia's C derivative for shading: the programs get compiled down and are then run. The GPU shouldn't have to play interpreter, merely execute the instructions produced by the compiler (LLVM); the compiler itself runs on the CPU.

IIRC, Apple is using LLVM for exactly this. Also, from what I remember of LLVM's mailing lists a long time ago, some graphics company is also using LLVM in this way.

b10663r, 2007-05-24 17:10:
What is the special effects library to which you refer?

Anonymous, 2007-05-24 17:10:
illissius, I don't think you understand. Python wouldn't be run on the GPU. From what I gather, someone at LLVM is developing a Python frontend which, together with the backend, would turn Python into LLVM bytecode and run it through the same code generation path as the LLVM C++ frontend (a retargeted g++). IOW, the GPU won't know the difference.

Anonymous, 2007-05-24 16:28:
You the MAN, Zack san.

illissius, 2007-05-24 16:05:
While nifty, wouldn't Python on a GPU be shit-fucking slow? GPUs are good at parallel, not sequential, processing, and an interpreted language sounds like it would exacerbate this manyfold. This seems more like "because I can" than actually useful, to me. But it *is* totally awesome. (Like, holy shit. It's like you leapfrogged the entire rest of the industry.)

peppelorum, 2007-05-24 14:53:
Not that I understand everything in the post, but it seems just excellent =)

But "And in KDE 4 most graphics code will be able to utilize eye-popping effects with virtually no CPU price" I understood completely =)
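A sketch of the point the last few comments circle around: once any frontend (a retargeted g++, a Python frontend, or a shading-language compiler) has produced LLVM IR, the same code generation machinery takes over and the source language is invisible. Below, the brighten module from the earlier sketch is JIT-compiled for the host CPU with LLVM's ORC LLJIT; a GPU driver would instead lower the same IR to shader instructions. This is a hypothetical illustration assuming a recent ORC API, not anything from the original post.

    #include "llvm/ExecutionEngine/Orc/LLJIT.h"
    #include "llvm/Support/TargetSelect.h"
    #include <memory>

    // 'mod' and 'ctx' are assumed to hold the "brighten" IR built in the
    // previous sketch. The JIT neither knows nor cares which language the
    // IR originally came from.
    llvm::Expected<float> runBrighten(std::unique_ptr<llvm::Module> mod,
                                      std::unique_ptr<llvm::LLVMContext> ctx) {
      llvm::InitializeNativeTarget();
      llvm::InitializeNativeTargetAsmPrinter();

      auto jit = llvm::orc::LLJITBuilder().create();
      if (!jit)
        return jit.takeError();
      if (auto err = (*jit)->addIRModule(
              llvm::orc::ThreadSafeModule(std::move(mod), std::move(ctx))))
        return std::move(err);

      auto sym = (*jit)->lookup("brighten");
      if (!sym)
        return sym.takeError();
      auto *brighten = sym->toPtr<float (*)(float)>();
      return brighten(0.5f); // 0.75f, computed on the host CPU
    }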