Denarius
Mar 22, 03:50 PM
HAHA, if I lived in Europe I wouldn't want to leave that place, not to say that's where you are, but Europe is great. Not everywhere is, true, but that's a big understatement given the millions of illegal immigrants and the people constantly being killed trying to get here. Ahem, Mexico, for one.
Oh I don't know, Britain attracts illegal immigrants like a turd attracts flies. The French get very annoyed about the number hanging around near the tunnel trying to sneak over.
bigpics
Mar 24, 12:57 PM
Dude, I'm sorry to inform you that what you're saying is an outright lie, and there are guys from the Lossless Compression Clan, called "Apple Lossless codec", "FLAC", and "APE", standing with heavy cluebats in their hands, ready to perform a painful reality sync on anyone thinking compression ALWAYS degrades quality.
Because it doesn't, full stop.

You're (very probably) right. My comments were aimed at those who were saying the Classic is overkill because who could ever "need" anything more than 128 or even 256 kbps AAC's or mp3's. (Nobody even mentioned 320, at which many of my fave songs are ripped.)
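(Aside for anyone who wants to check the lossless claim for themselves rather than take anyone's word for it: a minimal sketch, assuming the command-line `flac` encoder is installed and that "original.wav" is a placeholder for a plain PCM WAV you have lying around - round-trip it and compare the decoded samples.)

```python
# Minimal sketch: round-trip a WAV through FLAC and confirm the decoded
# PCM samples are bit-identical to the original. Assumes the command-line
# `flac` tool is installed; "original.wav" is a hypothetical file name.
import subprocess
import wave

def pcm_frames(path):
    """Return the raw PCM frame data from a WAV file."""
    with wave.open(path, "rb") as w:
        return w.readframes(w.getnframes())

# Encode to FLAC, then decode back to WAV.
subprocess.run(["flac", "--silent", "-f", "-o", "roundtrip.flac", "original.wav"], check=True)
subprocess.run(["flac", "--silent", "-d", "-f", "-o", "roundtrip.wav", "roundtrip.flac"], check=True)

# Lossless means the decoded samples match the original exactly.
print("bit-identical" if pcm_frames("original.wav") == pcm_frames("roundtrip.wav") else "mismatch")
```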
So as for the "lossless" CODECs, my reach exceeds my grasp. When it comes to photo files I pretty much understand the principles of LZW lossless compression in TIFF files and have thousands of 'em. And in case anyone doesn't know, if you work on JPEG's and do multiple editing sessions on a photo, you do introduce new compression artifacts every time you re-save, even at the highest settings. I've done tests for kicks and giggles - repeatedly opening and saving .jpg's - and you reach a point where the image looks like a (very) bad Xerox copy.
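(If anyone wants to reproduce that JPEG test, here's a rough sketch using the Pillow imaging library - the file names and the quality setting are placeholders, not anything from the original test.)

```python
# Rough sketch of the "open and re-save a JPEG over and over" test described
# above, using the Pillow library. File names and quality are placeholders.
from PIL import Image

img = Image.open("photo.jpg").convert("RGB")
for generation in range(100):
    # Every save runs lossy JPEG compression again, so artifacts accumulate
    # even at a fairly high quality setting.
    img.save("photo_resaved.jpg", format="JPEG", quality=90)
    img = Image.open("photo_resaved.jpg")

img.save("photo_after_100_generations.jpg", format="JPEG", quality=90)
```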
Back to audio, I've plowed through a few articles on formats - years ago - and I've seen slightly differing conclusions about Apple Lossless and FLAC ('tho all felt that these were alternatives worth considering for at least the great majority of people serious about sound), but, frankly, I lack the chops to have an informed opinion of my own, and know nada about APE.
And, no, while I can appreciate friends' systems that are tricked out with vacuum tube amps, "reference" speakers and high-end vinyl pressings, I'm hardly one of the hard-core audiophiles in practice. My files are mostly 256 and 320 kbps, my home speaker placements are wrong and I use preset ambiance settings that totally mess with the sound to produce surround effects from AAC's.
Worse, the great majority of my listening is on the mid-level rig in my car at freeway speeds or in city traffic, meaning I and millions of others are constantly fighting like, what, 20-30 dB of non-music noise that totally overwhelms delicate nuances in sound. And worst, some of my earliest pre-iPod rips (back when I had a massive 20 GB HDD) were done in RealPlayer at 96 or even 64 kbps - before I sold or traded those CDs - and yeah, in the car, some of those still sound "pretty good" to me (tho' some clearly don't).
Add the (lack of) quality of most ear buds and headsets used by most people, and there's probably less than 5% of music listeners experiencing "true high-fidelity." To turn around an old ad campaign, no, our music listening today is "not live - it's Memorex."
But my point was and is that there's no reason to champion lossy compression per se other than for the economies of storage space it provides, and for fungible uses like topical podcasts.
As long as we have the space, "data fidelity" is desirable so that the files we produce which will be around for many years - and get spread to many people - don't discard signal for no real gain. No one would put up with "lossy" word processing compression that occasionally turned "i's" into "l's" after all.
And those audio files will still be around in a future of better DAC's, speakers, active systems which routinely monitor and cancel out things like apartment, road and car noise (in quieter electric cars with better road noise suppression in the first place), better mainstream headsets and who knows what other improvements.
Compatibility between players (software or hardware) used to be another reason to choose, say, mp3's, but there's really no meaningful competition to Apple's portable sound wonders any more.
So please keep those "cluebats" holstered! No offense intended. ;)
Roy Hobbs
Jan 2, 11:30 AM
Since Intel is releasing the 2.0 GHz C2Q chip this week, it seems likely to find its way into an iTV and/or iMac device. That's four cores on the cheap.
Rocketman
Highly unlikely that the Quad chip will end up in the iTV. Especially at the already announced $299 price point of the iTV.
Rocketman
W1MRK
Apr 16, 05:17 PM
I can drive an 18-wheeler but I haven't tried a manual car or pickup yet. I think it's different. LOL
BlizzardBomb
Sep 1, 11:58 AM
My guess: 17" dropping to $1,099, 20" to $1,499
$1,999 with more HD, a gig of RAM and, hopefully (I do doubt it though), a nice GPU (at least as BTO, unlikely though for the iMac).
I'd order one right away! :cool:
$1,999 is pushing it a bit IMO. :)
I highly doubt they would kill it off. I think they'd drop the price on it, which would make it even more desirable for standard consumers with a budget. Sort of a "why get the mini when I could just pay a bit more for the 17" iMac" kind of thing.
Good point, although suffocating the Mini would be a problem. If the updated Mini is decent enough it should be able to survive though.
Capt T
Mar 25, 03:55 PM
iPad 1 does not support HDMI out, so I'm assuming no, it doesn't work.
The iPad 1 does support HDMI out. I have the adaptor and checked it out with a movie. It doesn't support mirroring but it does support the output.
aiqw9182
Mar 24, 04:46 PM
On the server side, AMD has had inexpensive 12-core, 4-way CPUs for some time. Now it's going to 16 cores with Bulldozer (well, it will be more like 16-core integer/8-core floating point).
The absolute bargains now are the 8-core, 4-way CPUs. You can have a 32-core machine for very little money.
The next Atom will have a DirectX 10.1 GPU, meanwhile Bobcat Fusion already has DirectX 11 hardware and OpenCL.
AMD's CPUs are trash and they're cheap for a reason.
Oh, and for someone who doesn't use Windows you sure seem interested in Windows-only APIs. Love all of those OpenCL applications you listed, by the way. ;)
treblah
Jul 18, 01:44 AM
I'm sufficiently excited. Here's hoping for higher quality (than the current TV shows) and Netflix-esque pricing.
Start "TS isn't accurate/only for the US" whining in 3, 2, 1…
If I'm going to spend all that time downloading a movie, I should at least be able to keep it. Bah.
You've never streamed a Quicktime movie? You don't have to wait for it to end before you start watching it, unless of course you were going to watch it on an iPod…
Start "TS isn't accurate/only for the US" whining in 3, 2, 1…
If I'm going to spend all that time downloading a movie, I should at least be able to keep it. Bah.
You've never streamed a Quicktime movie? You don't have to wait for it to end before you start watching it, unless of course you were going to watch it on an iPod…
FFTT
Nov 23, 06:30 AM
I think what I said about software developers catching up has merit.
It's not just the pro applications themselves that need to catch up to take advantage of multi-core architecture, but also all those very important plug-ins.
This especially holds true in audio recording software with some critical plug-in developers still struggling to catch up to universal binary versions of their software.
leekohler
Mar 23, 04:19 PM
No, I fully support that.
I am simply disappointed that they pander to special interests. Just one of many reasons I voted with my wallet and bought a droid.
It sounds to me like they made a business decision. Lots of companies respond this way when the public makes its opinion known.
angelwatt
Jul 13, 10:28 PM
Well, I hope it doesn't come too soon. Blu-ray is just too expensive right now and it would jack up Mac costs significantly. It's also better to see how the Blu-ray vs HD DVD thing works out, just to make sure Apple doesn't back the wrong horse.
SciFrog
Feb 17, 12:55 PM
No remote login?
RaceTripper
Jan 10, 03:53 PM
Well, traded the Subaru today. Time to get something a bit more sensible, so I got a 2007 Ford Focus ST-2.
Done all the paperwork today and pick her up tomorrow afternoon, can't wait.
Matt, now if I were in England and I was getting a Ford Focus, I think it would have to be an RS. :D
We don't get to play with those on our side of the pond. :(
Stetrain
Apr 2, 08:18 PM
While you may think your sarcasm-laden post witty, the fact remains that you have not stated any kind of revelation.
They do not care about ONE consumer...but they certainly are going to care about the thousands of units that are being returned and exchanged in hopes of finding one good unit.
I would tell you to review the iPad forum but I have a feeling that message would be lost on somebody so insistent on keeping their head in the sand.
I like how you continue to respond to this one person and ignore the posts about actual personal experience with iPads and those who own them, and those who have seen plenty of others' iPads, all without defects.
knightlie
Jun 23, 03:21 AM
The Magic Trackpad (http://www.macrumors.com/2010/06/07/apples-magic-trackpad-or-magic-slate-revealed/) would allow for multi-touch on desktops, enabling many iOS applications to be used on a desktop computer (and obviously laptops could do the same thing with their trackpads).
Not necessarily. iOS apps need to be touched directly, without a pointer acting as intermediary, whereas a touch/track pad is used to control a pointer on the screen.
Touch screen and touch pad do not have to perform the same function. To enable iOS apps, the Magic Trackpad would need a screen on it, which would turn it into... an iPad.
toddybody
Mar 24, 01:40 PM
power-hungry gpu monsters.
6970 folks, not 6990 :)
Kranchammer
Mar 24, 01:44 PM
6970 folks, not 6990 :)
Still a monster, just a smaller monster. Kinda like 6970 is to Godzukei what 6990 is to Godzilla. ;)
QuarterSwede
Apr 10, 06:18 PM
As the other guys have said, in the UK automatics are pretty rare. I think we all know one friend or so who has an auto-only license; everyone else just gets a normal license.
If you are the sort of person who enjoys driving to any degree then a manual gearbox is much better. Autos are just so boring, they never kick down when you need it or bizarrely hold on to a gear for much longer than you were expecting. I'm sure there are some good autos out there but they will always be more inefficient than a manual.
When is the last time you were in an automatic and what year/make/model was the car?
Automatics these days are generally a LOT better than they used to be. This is coming from someone who really loves driving a stick on country roads and likes the control you get from one.
I'm starting to think most stick drivers are blind to how much automatics have changed.
Frobozz
Mar 25, 09:40 AM
Nope... consider.
2x CPUs rated at 130 W. So that's 260 W right there. However, no CPU consumes its full rating, so call it give or take ~260 W.
Each 5770 is ~108 W; given two, that's ~216 W. Right off the bat we have ~476 W being consumed. Not bad; however, let's look at the case where it's not a dual 5770 setup.
The PSU on the Mac Pro is rated for 980 W of power, but for simplicity's sake let's say 1 kW. Now, factor in the SuperDrive, Ethernet, AirPort, at least 1 HDD and peripheral docks/cards and you are looking at ~100 W. Take into account ~20 W per 1 GB of memory (assume 6 GB) and you've got ~120 W more. So far, ~220 W more.
Now we have ~480 W [~260 W + ~220 W] of consumption, leaving only ~520 W for a GPU. Currently, the HD 6970 requires 2x 8-pin connectors, each providing 150 W. That's 300 W right off. So we are left with ~220 W in the system. Now, factor in the PCIe slot power draw at 75 W and we've got ~145 W left over. ~145 W is cutting it too close and something will yield (yes, I realize 145 W sounds like plenty, but read on). Now, the sad part: we were assuming a 1 kW PSU, which is not the case; it's 980 W, meaning there is less headroom, ~125 W. Also take into consideration that no PSU is 100% efficient, so the draw at the outlet will be greater and the PSU will be operating near its limit, meaning its life span will decrease dramatically under very heavy usage.
In other words, the current PSU may come up short. Add to that the fact that all current shipping and past model Mac Pros don't have dual 8-pin connectors; they have dual 6-pins. There is an adapter to turn a 6-pin into an 8-pin, but it is risky at best - a big no-no.
So as you can see, an HD 6970 would be barely supported on current models. Future models? Perhaps, assuming Apple bumps to a 1.1 kW or 1.2 kW PSU.
Take into account that this was calculated assuming 6 GB of memory and 1 HDD; any more RAM (~20 W/GB) or HDDs (~10 W/disk) and the consumption goes up. It also assumes nothing is hooked up to the peripheral ports, like a small external drive that draws 5-10 W.
I have an 850 watt PSU in my gaming rig with a 4870x2 and custom coolers all around on the CPU, GPU, and case. I think your calculations are pretty close to correct if you wanted to run everything in the case at once. But it's not typical to run everything at max all at once. I suppose Apple might not want to get into the business of telling people it's okay to buy this honking huge GPU as long as you're not running a lot of extra hard drives and extra PCI-E cards.
But for people looking to simply drop in a fast GPU and not have a lot of extra bells and whistles (read: a gaming rig), they would be fine with 850 watts or so, even with a 6970. Or at least damn close.
The tricky part with GPUs is that the high-end units commonly exceed their rated specifications at max load, which complicates these calculations. And your point about running too close to your max is a good one. It's fair to say that when you add up all your max dissipation, add 20% or so, and that's the wattage your PSU needs.
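(To make the arithmetic above easy to re-run with different assumptions, here's a tiny sketch that just plugs in the thread's own estimates - none of these wattages are measured values, and the 20% headroom rule at the end is the rule of thumb from the reply above, not a spec.)

```python
# Back-of-the-envelope version of the power-budget arithmetic above.
# All wattages are the estimates quoted in this thread, not measurements.
psu_rating_w = 980        # Mac Pro PSU rating
cpus_w       = 2 * 130    # two CPUs rated at 130 W each
drives_io_w  = 100        # SuperDrive, Ethernet, AirPort, 1 HDD, ports
ram_w        = 6 * 20     # 6 GB at an assumed 20 W per GB
gpu_6970_w   = 2 * 150    # two 8-pin connectors at 150 W each
pcie_slot_w  = 75         # PCIe slot power draw

load_w = cpus_w + drives_io_w + ram_w + gpu_6970_w + pcie_slot_w
print(f"Estimated load: {load_w} W")                     # ~855 W
print(f"Remaining headroom: {psu_rating_w - load_w} W")  # ~125 W

# Rule of thumb from the reply: size the PSU at max dissipation plus ~20%.
print(f"Suggested PSU size: {load_w * 1.2:.0f} W")       # ~1026 W
```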
HecubusPro
Sep 1, 01:43 PM
I would laugh (because I'm mean like that) if the iMac 23" or iMac with Conroe took a long time to come out. So many of us MBP lovers have been waiting for Merom, and to see others squirm like us... muah hahaha
I 2nd that comment! Though the idea of having an MBP with Merom and a 23" iMac with Merom makes me feel all tingly inside. Apple cannot make these products available fast enough. :)
Ozu
Sep 6, 11:04 PM
It seems to me that the distribution of 480i content is pretty much settled. Netflix and Blockbuster do this well and at very competitive prices. I can't see that Apple would benefit much from trying to compete there.
How high-def content is distributed, on the other hand, is far from settled. In fact, the world of high-def video in 2006 looks a lot like the world of digital music in 1999: a technology consumers clearly want, but one mired in competing standards and confusing technical details. Apple must have noticed that similarity.
I've had a beautiful 720p TV for eight months, and have yet to actually see anything in 720p on it. The closest I've come is hooking my MacBook up to it and watching quicktime trailers. I'm not going to buy a Blu-Ray or HDDVD player until the standards war is over and the players cost less than $300, and that's not going to happen until late 2007 at the earliest.
If I could buy a movie in 720p from the iTunes Music Store and watch it on my TV next Tuesday night I'd do it. Sure it'd take a few hours to download. But the alternative is to wait at least a year.
jgould
Feb 19, 08:09 PM
Hasn't changed too much this time around:
http://link.trekcubed.com/trekmb_Feb2011_s.jpg (http://link.trekcubed.com/trekmb_Feb2011.jpg)
I like the wallpaper... Which Orbiter and where'd ya get it? :)
Will_reed
Jul 18, 02:21 AM
Rental is such a dumb idea. Maybe purchase, but I've seen the quality of the video on the music store and personally I don't think it's worth the money.
Bonsai1214
Sep 20, 12:30 AM
Ah, thanks for clarifying that. It was kind of hard to tell from some of the pictures. Their website said something about "direct access". Is it hard to get to the buttons? Especially the sleep button?