AppleKrate
Sep 19, 10:49 AM
The MacBook Pro is still too new a release to have the major changes you and others are hoping for. All you're going to get for the next year or two is speed bumps and maybe an upgrade in HD capacity, graphics card, or optical drive (Blu-ray or HD DVD).
Basically I see two types of users in here pleading for the newer chips: the average users who just "like the idea of fast" when it really does them no good, and the professionals who are consistently holding out for something better. The professionals are few and far between.
Please tell me what is majorly new about the current MacBook Pro besides an Intel chip :confused: (and the name, of course :rolleyes: )
PS how about an amateur professional? If not, maybe a professional amateur?
Sirmausalot
Apr 6, 08:11 AM
"Come to see a surprise sneak peek at something very special - you really do not want to miss this one!"
Does this mean it's not going to ship yet?
dave420
Apr 25, 01:39 PM
but I really do not like the fact that the iPhone has a breadcrumbs database of my travels for the last 3 years!
This type of thing should not happen without users' knowledge... and yet it was happening. Otherwise this file would not be news!
I too don't like the idea of a device saving my location. On the other hand, when I am using the Maps app for driving directions, which sends my current location to Google, I would be naive to think that information isn't being stored somewhere.
Consultant
Mar 31, 03:09 PM
So Google is becoming big brother of the open wasteland? :D
MacSA
Aug 7, 05:32 PM
As a recent switcher to Mac, I have had a lot of experience with M$'s System Restore function. It is NOT a "go back and find that data I deleted" application. It IS a "can we please go back to a time when this computer wasn't totally ********* up" application.
I know, I can't believe people are comparing it to the system restore on Windows... System Restore on my PC is total bollocks and never solved any problem I had.
Macnoviz
Jul 22, 03:03 AM
So I read in this thread that Kentsfield and Clovertown ARE compatible with Conroe and Woodcrest sockets (respectively) (Cloverton or Clovertown?)
Hope for upgrading an iMac to Quad Core is kindled! At least if Apple releases Conroe iMacs.
BTW, in my opinion, one thing a person should never, ever say is that a computer has too much power and that it will never be needed. So when 128-core CPUs come out in ~10 years' time, will we still consider dual-core CPUs fast enough for our use?
I seem to remember that when the original DOS operating system was created, its RAM was limited. I can't remember exactly to how much, but it was decided that people would never use more than a few kilobytes of memory. Now we are arguing that a Mac should come with no less than a gigabyte! And now that we are moving to 64-bit processing, with its capability to address 16 exabytes, or millions of terabytes, it seems impossible that we will ever need 128-bit computing. But, no doubt, one day we will.
When we are able to download our entire lives, and even consciousness, into a computer, as is said to happen in about 40 years (something I'm very much looking forward to), I dare say it will take a lot of memory to do, and even more processing power to manage effectively, especially if we want to "live" inside computers, as we will no doubt want to do someday.
So, as a conclusion to my most recent rant: please never tell me a computer is too powerful, has too many cores, or has too much storage capacity. If it is there to be used, it will be used. It always is.
I agree with your point on never saying a computer is too powerful, although living in computers is probably not going to happen. Sounds a bit too Matrix-like for me.
ThunderSkunk
Apr 6, 04:03 PM
I guess I see it like this:
We use two models of Motion tablets in our studios, the LE1700 running Win7 and the newer J running XP Tablet, for maximum horsepower. Both allow our designers to work in complex CAD programs with huge 3D files and multi-part assemblies parametric to external data sources, and do it in the field. These tablets have Wacom pressure-sensitive digitizers, highly visible outdoor displays, 3-hour battery life, weigh 4-5 lbs, and cost three to four thousand dollars.
We use iPads for everything else mobile, because they're fast, and light, and we're used to carrying around yellow pads everywhere we go anyway. No more yellow pads. Eventually, when more people start to realize that the platform is a good one for more than just content consumption, we'll get more and bigger functionality in better and better applications.
The Xoom has neither the functionality of Windows nor that of iOS. The day Android's marketplace starts catching up with iOS, we'll reconsider.
But throwing in slightly bigger megapixel cameras and SD card readers really doesn't enter into it.
janstett
Oct 23, 11:44 AM
Unfortunately not many multithreaded apps - yet. For a long time, most of the multi-threaded apps were just a select few pro-level things: 3D/visualization software, CAD, database systems, etc. Those of us who had multiprocessor systems bought them because we had specific software in mind, or a group of applications, that could take advantage of multiple processors. As CPU manufacturing processes started hitting a wall right around the 3GHz mark, chip makers started to transition to multiple CPU cores to boost power - makes sense. Software developers have been lazy for years, just riding the wave of ever-increasing MHz. Now the multi-core CPUs are here and the software is behind, as many applications need serious re-writes in order to take advantage of multiple processors. Intel tried to get a jump on this with their HT (Hyper-Threading) implementation, which essentially simulated dual cores on a CPU by way of two virtual CPUs. Software developers didn't exactly jump on this and warm up to it. But I also don't think the software industry truly believed that CPUs would go multi-core on a mass scale so fast... Intel and AMD both said they would; I don't know why the software industry doubted them. Intel and AMD are uncommonly good about telling the truth about upcoming products. Both will be shipping quad-core CPU offerings by year's end.
What you're saying isn't entirely true and may give some people the wrong idea.
First, a multicore system is helpful when running multiple CPU-intensive single-threaded applications on a proper multitasking operating system. For example, right now I'm ripping CDs on iTunes. One processor gets used a lot and the other three are idle. I could be using this CPU power for another app.
The reality is that to take advantage of multiple cores, you have to take advantage of threads. Now, I was doing this in my programs with OS/2 back in 1992. I've been writing multithreaded apps my entire career. But writing a threaded application requires thought and work, so naturally many programmers are lazy and avoid threads. Plus, it is harder to debug and synchronize a multithreaded application. Windows and Linux people have been doing this since the stone age, and Windows/Linux have had usable multiprocessor systems for more than a decade (it didn't start with Hyper-Threading). I had a dual-processor 486 running NT 3.5 circa 1995. Writing threaded applications has just been more of an optional "cool trick", one that the timid programmer avoids. Also, it's worth noting that it's possible to go overboard with excessive threading, and that leads to problems (context switching, thrashing, synchronization, etc.).
Now, on the Mac side, OS 9 and below couldn't properly support SMP; it required a hacked version of the OS and a special version of the application. So the history of the Mac world, until recently with OS X, has been to avoid threading and multiprocessing unless specially called for, and then at great pain.
So it goes back to getting developers to write threaded applications. Now that we're getting to 4 and 8 core systems, it also presents a problem.
The classic reason to create a thread is to prevent the GUI from locking up while processing. Let's say I write a GUI program that has a calculation that takes 20 seconds. If I do it the lazy way, the GUI will lock up for 20 seconds because it can't process window messages during that time. If I write a thread, the calculation can take place there and leave the GUI thread able to process messages and keep the application alive, and then signal the other thread when it's done.
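A bare-bones sketch of that pattern (my own illustration, not from any particular toolkit), using plain POSIX threads in C. Real GUI frameworks have their own event loops, but the idea of handing the slow work to a worker thread and getting signalled when it finishes is the same:

#include <pthread.h>
#include <stdio.h>

/* Stand-in for the 20-second calculation: runs on a worker thread. */
static void *long_calculation(void *arg)
{
    long *result = (long *)arg;
    long sum = 0;
    for (long i = 0; i < 200000000L; i++)
        sum += i % 7;
    *result = sum;
    return NULL;
}

int main(void)
{
    long result = 0;
    pthread_t worker;

    /* Hand the slow work to a worker thread... */
    pthread_create(&worker, NULL, long_calculation, &result);

    /* ...so the "GUI" thread stays free to keep processing messages. */
    printf("main thread is still responsive while the calculation runs\n");

    /* The join is the "signal the other thread when it's done" step. */
    pthread_join(worker, NULL);
    printf("calculation finished: %ld\n", result);
    return 0;
}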
But now with more than 4 or 8 cores, the problem is how do you break up the work? 9 women can't have a baby in a month. So if your process is still serialized, you still have to wait with 1 processor doing all the work and the others sitting idle. For example, if you encode a video, it is a very serialized process. I hear some work has been done to simultaneously encode macroblocks in parallel, but getting 8 processors to chew on a single video is an interesting problem.
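And a rough sketch of the "break up the work" side (again just an illustration with POSIX threads and made-up numbers): split one big job into per-core chunks. This only pays off when the chunks are independent, which is exactly what a mostly serial job like video encoding makes hard:

#include <pthread.h>
#include <stdio.h>

#define N     4000000
#define CORES 4

static int data[N];

struct chunk { int start, end; long sum; };

/* Each worker sums its own independent slice of the array. */
static void *sum_chunk(void *arg)
{
    struct chunk *c = (struct chunk *)arg;
    c->sum = 0;
    for (int i = c->start; i < c->end; i++)
        c->sum += data[i];
    return NULL;
}

int main(void)
{
    for (int i = 0; i < N; i++)
        data[i] = 1;

    pthread_t threads[CORES];
    struct chunk chunks[CORES];

    /* Divide the range evenly, one chunk per core. */
    for (int t = 0; t < CORES; t++) {
        chunks[t].start = t * (N / CORES);
        chunks[t].end   = (t + 1) * (N / CORES);
        pthread_create(&threads[t], NULL, sum_chunk, &chunks[t]);
    }

    /* Combine the partial results once every worker has finished. */
    long total = 0;
    for (int t = 0; t < CORES; t++) {
        pthread_join(threads[t], NULL);
        total += chunks[t].sum;
    }
    printf("total = %ld\n", total);
    return 0;
}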
Kabeyun
Mar 22, 01:03 PM
BlackBerry PlayBook = the iPad 2 killer - you heard it here first.
...and last, at least as far as the spec-war argument goes. You're grafting a computer-shopping mentality onto a tablet market, and people don't think of tablets as computers. People don't buy tablets based on specs, and the spec difference between current or impending offerings is not what will define the user experience.
BoyBach
Nov 29, 12:52 PM
To those saying they'll boycott, I'd just like to point out...
...Universal is by far the largest record label in the world, and those of you that say you don't listen to anyone of their artists might need to dig deeper into their subsidiaries, as just a few of the musicians in their stable are:...
That's a nice back catalogue, but how many new albums have The Carpenters, Jimi Hendrix, Carole King, John Lennon, etc. released recently that weren't 'Greatest Hits' and 'Best Of's? The challenge for the "Big Boys" of the record industry is to find the next group of artists that will still be selling in 20-30 years' time. I don't think my children and grandchildren will be buying Pussycat Dolls and Britney Spears albums in thirty years' time. This is the reason their music sales have been falling; it's not exclusively piracy.
ChrisA
Jul 20, 10:57 AM
.... Introduction of world's first commercial 8-core system.
Not quite the first. Sun has been shipping a commercial 8-core system for about a year now. The T2000 has all 8 cores on one chip, but each core also does four-way hyper-threading, so they claim 32 hardware threads. The price for an 8-core T1000 is about $8K. A system with 8 cores and 8GB of RAM burns about 250W.
Of course it does not run OS X, but GNOME on Solaris has a very OS X-like "feel" to it.
jfinn1976
Jun 15, 09:56 AM
Those of you still looking to order from the Shack...
The latest I am hearing this morning from at least one store is that preorders start at 1pm... however, they are not calling it preorders. They take down your name, phone, and email and check the system. No deposit. I am being told that you ARE guaranteed a phone with this reservation.
This is the same I was told last night and this morning.
No deposit required; they'll call at 1 and get a PIN number, and he said it will not be a problem getting it.
rolandf
Aug 7, 07:47 PM
Good lord. Whatever happened to simplicity? It looked like a three-ring circus up there today.
Now come on. Time Machine? With a picture of outer space and stars? This looks so gimmicky. They are getting to be like Microsoft, just adding new features instead of making things easier and more streamlined. Why not just improve the Backup program that comes with .Mac, or include it for free? Do we really need another interface? To me it looks like form over function.
Not very innovative so far. The Intel change took the OS's soul and inspiration. Very disappointing. Mail is completely overloaded, like MS Office.
No mention of a resolution-independent GUI, etc. There are a couple of UNIX OSes out there that are more innovative.
All in all, Apple seems on the wrong track.
DocNo
Apr 11, 10:09 AM
This is a little more out there, but my friend has a theory that Apple has let Kevin Smith use the new Final Cut to cut and make his new film that is coming out. The importance of this is that he feels movie making is going the way of music making these days. He believes anything under 20 million is going to be funded independently, not released via movie studios, and sold directly to the theaters.
He feels only the big blockbuster movies like Transformers and such will be left to the studios, much like many musicians are skipping the record companies and making and releasing music themselves.
And as with the iPhone and iPad, if you are hopelessly behind in a traditional market (i.e. Mac OS X vs. Windows), go create a new one (i.e. iOS)! I have no doubt this is where Apple is going...
ehoui
Mar 22, 12:51 PM
Competition is good.
Can we make this a sticky so that we are not compelled to reiterate this basic fact over and over. Yes, competition is good. So is breathing.
ed233
Jul 28, 02:02 PM
Do you have any links that describe Merom's SpeedStep compared to Yonah's? I thought Yonah's was quite good, allowing you to reduce both clock speed and voltage simultaneously. It is always a problem with Intel: they say "improved SpeedStep", but they never tell you "improved compared to what".
I was able to find this about Conroe's implementation, which sounds fairly impressive:
http://www.sharkyextreme.com/hardware/cpu/article.php/3620036
The Conroe core includes support for Intel SpeedStep technology, and in an attempt to lower power and heat requirements, it emulates a mobile processor by lowering the multiplier when idle or in low usage. In the case of the Core 2 Extreme and Duo processors we reviewed, that amounted to a 1.6 GHz clock speed at idle. The Conroe can immediately fire up at full speed and match the system load. Core voltages can also be lowered through similar techniques, such as Intelligent Power Capability, which can turn computing functions on and off when needed, in order to fully maximize power efficiency.
digitalbiker
Aug 25, 07:51 PM
I'm not trying to be a wise a@@, but when did Apple make a Pismo? I do remember them, but not being made by Apple. I am sorry, I don't recall the manufacturer for them at this time.:confused:
Apple always made the Pismo. I don't know the exact years but it was a black G3 PowerBook.
bigmc6000
Aug 11, 05:16 PM
:confused: patent intrusion in europe??? Are you serious? Do you have any examples to verify your claims where a european company violated US patent law and this wasn't enforced by the european judicial system?
Go buy, oh say, Clerks II (or some other movie that just came out) on DVD. It's a hell of a lot easier to find it in Europe than it is here (obviously assuming you don't already know where to get it)...
And seriously what's the EU court going to do? "We'll fine you", "No really we're not kidding", "Ok, we fine you!", "Oh, you want an appeal, ok. We won't fine you yet"
(Has MS ever paid a dime of the millions of dollars they've been "fined"?? Note, I'm not saying the US system is any better, but the EU's certainly isn't.)
The main point is that, as people have continually pointed out, the wireless technology available in Europe is the same as what's being used in India and China. AKA - the reverse-engineers in China just love to get ahold of stuff that works with what they've got...
ethana
Mar 31, 02:43 PM
How is it biting them in the ass? Android is the fastest growing OS, with a larger share than iOS. I think it's been a very successful strategy.
Smartphone OS, yes (iPhone vs. Android phones).
iOS as a whole (iPads + iPods + iPhones) kills Android numbers though. By LARGE margins.
Platform
Sep 13, 09:13 AM
Most people run more than one app at once.
Most are multi-threaded though, and if I am not mistaken it doesn't matter for Photoshop whether there are two or 72 cores...;)
Matthew Yohe
Apr 7, 10:41 PM
:mad:Best Buy told me today that they had them in but Apple would not let them sell them. I have been going every other day for two weeks, and they finally tell me they have them and can't sell them. I hate this crap. I want my iPad 2.
Well of course they say they have it now, because they can't sell you any. They also probably had it the various times you went in, and yet lied to you.
blakbyrd
Aug 5, 04:07 PM
Reposting my prediction from another thread:
Borbarad
Aug 6, 11:29 AM
Mac OS X Leopard
Introducing Vista 2.0
http://www.flickr.com/photo_zoom.gne?id=207241438&size=l
:D
B
MacinDoc
Mar 22, 02:25 PM
The screen is not 50% smaller. Nice way of making yourself look stupid.
What BaldiMac said. The 3" increase in screen size of the iPad more than doubles the screen's area.