I want to hear which you think is better - real reasons, please. Not just "well, I think Mac is stupid and for hipsters" or "aw, PCs are for poor people" or crap like that. Thank you :)

Replies to This Discussion

Oh, I can't disagree with you at all about how hostile Microsoft has been to Linux. I had problems when first setting up my machine for multiboot. I run a (yes, increasingly antiquated) HP G56, and the partitioning scheme it came with was initially wonderful and flexible, but Linux simply could not be installed under it without eliminating other partitions. I had to purchase another piece of software to make an in-place change. I hate the old-style logical/extended partition crap I had to substitute, but at least Linux could finally see all the other partitions on the drive. I LOVE my GRUB 2 loader :) Hewlett-Packard talked about loss of support if Linux was installed (I was out of their warranty anyways - fuck them). I've heard the horror stories of trying to install Linux on an OEM machine with Windows 8 pre-installed. So I'm entirely with you here. But I didn't misuse the word "creed". I've seen the same sort of fervor in the Linux diehards as I've seen with Apple. This isn't true of all such users of course, but I've seen the same degree of cliquish behavior and arrogance in both communities, and it has persisted despite the lack of substantial evidence to justify it.
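For anyone setting up a similar multiboot, the usual drill on an Ubuntu-family install, once everything is on its own partition, is just to let GRUB 2 re-scan the disk - roughly something like this (the exact menu entries it finds will obviously differ per machine):

    sudo os-prober       # scan the drive for other installed operating systems (e.g. the Windows partition)
    sudo update-grub     # regenerate /boot/grub/grub.cfg so those entries show up in the GRUB 2 boot menu

After that, Windows and Linux both appear in the boot menu and you just pick one at startup.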

Of course my system is not a production server system. I still would not use Windows for such. For my home-use system the boot times were actually competitive: Win8 launched from GRUB 2 and had the login screen up and ready faster than Ubuntu. But Windows 8 was optimized for devices where such rapid startup is expected and necessary. Linux still has it beat for the actual login, but the margin seemed pretty thin. I also personally know one of the principal contractors behind the interface design of Vista, Win 7, and Win 8, and have some inside info that Microsoft has been trying to break away from the legacy issues that have tended to compromise performance and security. MS used to be pretty arrogant about that crap, and the competition has been a needed kick in the ass for them.

There is no "if I say so" issue here. My equipment meets, or even exceeds, the necessary requirements to run the operating systems in question. But again, I'm running under a GUI environment, which can easily become unstable. That's also why I get frequent updates: fixes arrive as soon as issues are discovered and resolved. If an OS were bug-free it wouldn't need updates :p You work in a data center with a specific hardware and software setup designed to minimize downtime (thankfully!). But I also doubt you are trying to use that hardware to run a graphics-intensive MMORPG, which a general consumer machine might be expected to do (I also work for a game software company). In my case, though, crashes and freezes occurred with far less intensive use (browsing the web, watching videos, running a local server for web development and testing, text editing with LibreOffice or similar work), so I was certainly shocked it happened at all with Linux. I believe the culprit was Nautilus, and I think the problem has since been resolved with later updates.

Oops. Looks like I'm in that age bracket too (46 this year). I use Ubuntu Linux and I'm quite familiar with its GUI. My mom is also set up to multiboot with Linux (she's approaching her 70s), and she loves it too. But she also completely crashed and lost data when she tried to do an update on it on her own - never recovered that data either :'( - and I still find myself getting instructions like "sudo blah blah blah". Would I expect a typical computer consumer to comprehend such? No. Do I personally understand the intent of such commands? Yes, but I'm also a systems programmer, meaning I've been part of developing operating systems at the lowest level.
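To be concrete, the sort of instructions I mean look something like this (the repository and package names here are made-up placeholders, not a real recipe):

    sudo add-apt-repository ppa:some-team/some-app   # add a third-party package archive (placeholder name)
    sudo apt-get update                               # refresh the package lists
    sudo apt-get install some-app                     # install the actual package (placeholder name)

Perfectly routine if you live in a terminal; complete gibberish to the typical home user.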

And command lines are, again, appropriate in a server farm environment where you work, but you would be hard pressed to justify why a software product like Google Chrome has options that are only available as command-line switches, while other browsers with the same features simply have them directly available within the running browser and don't require the user to fire up another program (the command interpreter shell) just to manipulate them.
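For example, these are real Chrome switches on Linux (whether a given one also has an in-browser equivalent varies by version):

    google-chrome --proxy-server="127.0.0.1:8080"    # route traffic through a specific proxy
    google-chrome --user-data-dir=/tmp/chrome-test   # run against a separate, throwaway profile directory
    google-chrome --disable-gpu                      # turn off GPU compositing when troubleshooting rendering

None of that is hard for an IT person, but expecting a general consumer to relaunch their browser from a shell just to change a setting is exactly the sort of thing I mean.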

My arguments were primarily focused on general consumers and not IT professionals. Personally, after 30+ years in the industry I've found Things That Really Suck (tm) common to all modern OSes. Things have improved, but I still find a number of persistent annoyances that all of them are guilty of.

I've seen the same sort of fervor in the Linux diehards as I've seen with Apple. This isn't true of all such users of course, but I've seen the same degree of cliquish behavior and arrogance in both communities, and it has persisted despite the lack of substantial evidence to justify it.

I'm with you on this one, Richard, but only partway. The Linux cliques do exist, right along with the arrogance and snobbery and the people who don't know what soap is. One of the things I dislike about IT departments is the lack of social skills and the prevalence of personality disorders in people who are good with Unix and Linux.

Nevertheless, tribe Linux has good reason to be deeply suspicious of Microsoft, most recently due to team Redmond's efforts to strangle Linux by pegging Windows to hardware that won't boot anything else. (I suppose they envy Apple's lucrative business model enough to take a stab at emulating it.)

This time the cover story is "better security," and the fact that it's anti-competitive -- great for Microsoft and tough luck for the only other major OS that uses the same PC hardware -- is just a coincidence. (No, really. A coincidence. Would Microsoft lie about a thing like that? Never!)

The legal battle has just started with a lawsuit filed in Spain on March 26. A whole lot more lawsuits are likely coming too.  

My reasons for sticking with Macs are not particularly compelling for most. I work as a commercial photographer and do a heavy load of digital editing on a daily basis, and Macs have simply become the industry standard in my region. While, historically, there were some legitimate reasons for this, I'm actually not sure any of them remain valid, but that's just the way it is. There is a certain degree of convenience in keeping work and home in-line on the same operating system.

Hardware value? It depends on what you want to do. The one point where I am okay with the cost is the built-in monitor. In terms of editing still images, the fidelity is quite good, and I get better detail in the shadows and highlights than I could find in comparably priced monitors at the time I purchased my last desktop machine. Are there better displays for my needs out there? Yes. I haven't searched in some time, but I believe brands such as NEC and Eizo have superior options... at a superior price point. Other aspects of the hardware are less critical to my work, given that the iMac's offerings are competent in this regard, and RAM I can get elsewhere at a fraction of the cost.

The operating system? Ah, OSX had a heyday release for me with which I was quite satisfied. I don't fault Apple for trying to make a GUI so simple that anyone can use it with relative ease and a shallow learning curve. After all, I use a computer to speed up my life, not bog it down in inefficiencies. The problem is that this comes at the expense of user flexibility and power. Apple creates features it thinks the average user will want, disregarding the fact that not everyone wants to use their machines the same way. For instance, in the last OS release, Apple decided that full-screen display would default to your primary monitor only. The reason for this had something to do with a feature called Spaces, which I don't really use or give a fuck about. Why was it an issue? My primary monitor is my shitty monitor. I keep all of my palettes, tool sets, and peripheral windows on that monitor, including my menu bars. This keeps my good monitor clutter-free for image editing. What it means is that Apple decided it knows better than I do which monitor should be primary and how I should use multiple monitors, or else I end up with an inferior, gimped full-screen result. It took another version release to fix the error.

So, I like the OS for its facility and familiarity, but with every release I don't know what they are going to gimp next, because they have decided all on their own what freedoms I do or do not need. Having to find workarounds to workarounds sort of nullifies the facility angle. It doesn't kill the operating system for me, and I am still reasonably content, but I do have my eye on alternatives going forward if this trend continues. I will likely upgrade my machine in two years, so we'll see what's on the market then.

My reasons for sticking with Macs are not particularly compelling for most. I work as a commercial photographer and do a heavy load of digital editing on a daily basis, and Macs have simply become the industry standard in my region. While, historically, there were some legitimate reasons for this, I'm actually not sure any of them remain valid, but that's just the way it is. There is a certain degree of convenience in keeping work and home in-line on the same operating system.

I'm a commercial photographer, too. And much as with the hardware and software others expect you to be using, I'm sure you're aware that you'd be suspect if your main camera (assuming you use a DSLR) were anything but a Nikon or Canon.

For what I do—taking thousands of photos of the same subject much of the time—I have found a program, ThumbsPlus, that does batch processing of things ranging from rotation to resizing to color correction to watermarking, eliminating a lot of labor. It does it more easily than any other software I've found, and having actually had a Mac for a while (because I thought that as a photographer I had to have one), I discovered that (a) there's nothing really equivalent in the Mac world and (b) ThumbsPlus's developer (Cerious Software) tells me they will not be porting their program to Mac. Before anyone mentions it, Photoshop "actions" are much clunkier than ThumbsPlus's approach.

It's rare for photo editing/capture/management software to encapsulate everyone's needs. It is surprising how often I hear people say, "I suppose I'll need to buy Photoshop soon if I am going to get serious." While I am a heavy Photoshop user myself, I typically steer people away from buying software on the sole basis of thinking it's what they are expected to buy. There are plenty of standalone applications catering to a whole bunch of different needs in different ways, at the professional and amateur level alike. And while apps like Gimp don't have what I need, that doesn't mean they aren't obscenely good value for what they do and what they cost.

In my situation, I don't pick my machine for work. I was freelancing with studios and photographers in various roles when I finished school, and I have worked in-house at a retail chain for the last six years. Given only the options of OSX vs. Windows, I would stay with OSX; however, I am not one to perpetuate the notion that Macs are required for professional photographers; they just happen to have a firm chunk of the photography market share in my environment.

I use a variety of different programs when doing art: a mix of Photoshop; Corel PhotoPaint, PhotoImpact, and Paint Shop Pro; the original Jasc Paint Shop Pro; and some freeware as well. ThumbsPlus is just the best and most powerful batch processor I've found. Originally designed to make thumbnail pages (hence the name), it does a whole lot more now.

Bought my first PC in 1986: an IBM PC AT. No Microsoft Windows, just PC DOS. Bought my first Mac the following year, at the end of 1987. It was a Macintosh SE30 - nine-inch monochrome screen.

Around that time Microsoft released MS Windows. I ran out and bought it, along with a really expensive digital art program called "Designer". The promise of GUI-based PCs was great, but the actual product left me wanting. Windows was very slow, even for the most basic art files, and prone to crashing.

The little Mac ran circles around any 80x86-based Intel PC out there in that time period. I did desktop publishing with Aldus PageMaker and vector-based graphics with Adobe Illustrator, and despite its black-and-white monitor it ran faster and printed graphics faster than its bigger and supposedly faster competitor. As a matter of fact, at about that time most graphic art divisions at the ad agency I worked at turned to the little monochrome Mac exclusively.

I've stayed with what worked for me. I used to buy a new, faster, sleeker Mac about once every year and a half, which correlated with improvements in graphics and design software. When my kids were five and six I bought software for them, and it was a whole lot easier to show them how to use a computer on a Mac than it was on my then-current 80386 IBM PS/2 Model 70. At the time Macs were more intuitive.

Today I still use Macs. Why? It's what works for me. I have nothing against PCs, and I still have one, despite the fact that I rarely use it. I purchased it for my son for college, and he uses it to play video games. He uses his MacBook Pro for school. Kids!

I'll certainly concede Mac beat PC all hollow in 1986 for GUIs and WYSIWYG apps!

I started on a PC and will probably stay there because I've got stuff dating back to 1987 on it somewhere. At work I code for Linux boxes, but that's a very different "universe" and the two rarely cross; if I want to do anything but write code I swivel my chair 90 degrees to the right and use the Windows 7 box on the company's main network for email, etc.

Noel - I started with a little "LE," hard drive capacity - get ready for a chuckle - 250 megs! From there to an iMac, to an iBook, to a MacBook. I do computer graphics, and couldn't be happier.

My first hard drive was 40 megs.  The machine was truly awesome for its day with a full MB of RAM.

When I started we didn't have ones and zeros, we only had zeros.

Now I have a flash drive, the size of my little finger, with a 15 gig capacity - it's truly amazing what we've seen happen in only the last 20 years, hardly the blink of an eye, really.

OK, I've got to step up, here. I started programming in 1972. The first computer I worked on was an IBM 1401 with 4K bytes (that's 4000 bytes - count 'em) of core (main memory), punched cards in and out and a printer. But this 4K computer produced all the reports this largish insurance company needed at the time. Later the company stepped up and bought a tape drive for their master files.

As part of my training I was given an existing production program which used 3,998 core positions, and I had to add a column to the report. Great fun!! I just had to use areas of memory which were pre-designated for special uses (like the card-read area, positions 1-80) as temporary storage.
