PlayStation 4 Specs Straight From Sony

By Spencer . February 20, 2013 . 6:09pm

While we didn’t see the PlayStation 4 hardware, Sony Computer Entertainment released specs detailing what’s inside the box (or dodecahedron). PlayStation 4, which comes out this holiday, has a custom 8-core processor, 8GB of RAM, and a hard drive (how much space will be available wasn’t revealed).

 

Main Processor

Single-chip custom processor
CPU: x86-64 AMD “Jaguar”, 8 cores
GPU: 1.84 TFLOPS, AMD next-generation Radeon™-based graphics engine

Memory

GDDR5 8GB

Hard Disk Drive

Built-in

Optical Drive (read only)

BD 6xCAV
DVD 8xCAV

I/O

Super-Speed USB (USB 3.0)
AUX

Communication

Ethernet (10BASE-T, 100BASE-TX, 1000BASE-T)
IEEE 802.11 b/g/n
Bluetooth® 2.1 (EDR)

AV output

HDMI
Analog-AV out
Digital Output (optical)




  • http://www.cubiz.tk/ Auragar

    Sounds like a beast. And since it has an optical drive, I think this means there will be physical media. Great. PS4, you’ve got yourself an interested customer.

    • Testsubject909

      I doubt that retail will die anytime soon. Not everyone is on the almighty fiber-optic internet, and Sony would be losing out on a lot of retail customers by fully alienating them. A mixture of both digital and retail services is optimal at this time, so that’s what I feel they’ll go for.

      • http://www.cubiz.tk/ Auragar

        Yah… my internet is complete ass. I NEED retail.

  • http://www.youtube.com/user/xxHiryuuxx Tohsaka

    ‘Memory

    GDDR5 8GB’

    Graphics…DDR? Are we sure that isn’t something else? GDDR5 goes on a video card, not into main system RAM…

    I know I’m going into technical details here, but still, something’s not right about that. And 8GB of GDDR5 is way high considering Nvidia’s Titan debuted at 1,000 bucks a few days ago, only sports 6GB, and could very easily clean this console’s clock.

    Unless it’s going to debut for one thousand five hundred and ninety-nine US dollars.

    • Locklear93

      Nope, we’re sure. They specifically said during the presentation that this is the kind of memory usually reserved for high end graphics cards.

      • http://www.youtube.com/user/xxHiryuuxx Tohsaka

        High-end is, again, an Nvidia Titan that retails for 1,000 dollars with 6GB of GDDR5. I doubt this is correct.

        • Niyi Aguaze

          Not the “Ultimate High-End”. Just Low “High-End”.

        • Wesley Kenneth Houpt Mattingly

          Arguing against facts is pointless; your opinion is noted, and wrong. Sony clearly stated that, and you can complain as much as you want, but you just sound like a butthurt PC fanboy more than anything. Be like me, enjoy all platforms, it’s the way life was intended to be.

    • drproton

      GDDR5 is basically DDR3 specialized for high-bandwidth processing. There’s no technical reason a computer couldn’t use it for main memory; it’s just not how PCs are typically designed.

      • http://www.youtube.com/user/xxHiryuuxx Tohsaka

        But WHY in a case such as this? What is so special about this console that would warrant such an odd use?

        • Locklear93

          My first guess would be the switch to unified memory. Unlike the PS3, the PS4 doesn’t have reserved video RAM. While this is a boon for developers in that they can use as much or as little as they need, for what they need, it also means Sony can’t predict how much, or which physical modules are likely to be used for things that need to load extremely quickly, like textures, shaders, and so on.

          • drproton

            They also might be able to avoid a lot of the overhead of moving things back and forth between main memory and graphics memory that exists in conventional configurations. Since GPGPU computing is supposedly part of the design philosophy, sharing memory between the CPU and GPU might pay off there as well.
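
            As a rough back-of-envelope illustration of the copy overhead a unified pool avoids (the link speed and per-frame data size below are assumptions for the sake of arithmetic, not PS4 figures):

            ```python
            # Sketch: cost of shuttling data between system RAM and a discrete GPU's
            # memory over PCIe, which a shared CPU/GPU memory pool largely avoids.
            # Both numbers are illustrative assumptions, not published PS4 specs.
            pcie3_x16_gb_per_s = 16.0   # rough practical bandwidth of a PCIe 3.0 x16 link
            frame_assets_gb = 0.5       # hypothetical data touched by the GPU per frame

            copy_time_ms = frame_assets_gb / pcie3_x16_gb_per_s * 1000
            print(f"One-way copy of {frame_assets_gb} GB: ~{copy_time_ms:.1f} ms")
            # ~31 ms: nearly two whole 60 fps frames (16.7 ms each) spent just moving
            # data. With the CPU and GPU sharing one pool of GDDR5, that copy goes away.
            ```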

        • Kevadu

          The why is simple: price. GDDR is significantly more expensive. The fact that the PS4 has 8 gigs of it is actually kind of crazy… I know most pre-announcement speculation said it would only be 4GB, mainly due to price concerns.

        • malek86

          Nothing so weird about it. The 360 also used shared GDDR3 memory. As mirumu explained, graphics memory is good for games but not so good for general OS tasks, which is why you wouldn’t use it for anything other than a GPU or a console.

    • mirumu

      Usually graphics memory is optimised more for bandwidth than latency. That’s not what you want for a desktop CPU that is constantly multitasking between applications, usually sitting idle, and ramping up speed on demand to save power. Some applications have unpredictable memory use, and high-latency RAM will slow them down. You do need higher bandwidth for games, though, and that’s why GPUs tend to use high-bandwidth memory like GDDR5. Basically it’s about using the right tool for the job, and desktop PCs need to be able to handle anything.

      In the case of the PS4, games are all that really matters. Everything else is secondary. Having two different types of memory would only drive up system cost and complexity. Using high-bandwidth RAM here makes more sense. Even if certain tasks will run slower, the high bandwidth will more than compensate for that when the system is under load, moving large blocks of memory around between disk, network, GPU and other I/O devices. Sony are making a tradeoff. Sure, they could use DDR3, and things like AI code might actually run faster, but given the inevitable graphics and sound workload the end result would still be slower overall.

      Keep in mind too that Sony used Rambus XDR memory in both the PS2 and PS3 for much the same reason, so it’s not like this is something new for them.

      • malek86

        That also explains why the next Xbox would use DDR3; rumor has it MS wants it to handle a lot of app multitasking too, like a sort of PC.

        • mirumu

          It could be. There are a few other reasons Microsoft might still prefer DDR3, though. The most obvious one that comes to mind is that each memory type requires its own memory controller integrated into the CPU. The bulk of x86 CPUs today are designed to work only with DDR3, so sticking to that would make things easy. Sony’s CPU by comparison must be a custom design. GDDR5 is based upon DDR3, however, and ATI/AMD was the first company to use GDDR5 in graphics cards, so I’m sure it was still fairly easy either way.

          It could also be that Microsoft just see GDDR5 as only offering diminishing returns for the extra cost. DDR3 is still plenty fast and is very inexpensive.

          • malek86

            That’s only if their memory bus is wide enough. If not, the extra bandwidth of GDDR5 should make quite a big difference in intensive games. In some benchmarks I’ve seen comparing DDR3 to GDDR5 on the same card, it can increase performance quite a bit.

          • mirumu

            Yeah, I’d assume the CPU and GPU would access the memory over different bus widths. Otherwise DDR3 would be much slower if the GPU were limited to the typical 64-bit data path of a CPU. I haven’t seen AMD mention what memory controller Jaguar has; supposedly it has a wider 128-bit data path, but even that would still be too narrow for use as a fast GPU bus, I’d have thought.

          • Staryj

            It’s an APU, so the memory controller will be shared by the CPU and GPU, and it’s 256 bits wide. Don’t go by the PC Jaguar specs; this is a custom chip and not directly comparable.
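
            To put rough numbers on the bus-width point, peak bandwidth is just bus width times effective transfer rate. The figures below are generic for each memory type rather than confirmed PS4 clocks:

            ```python
            # Peak theoretical bandwidth = (bus width in bytes) x (effective transfer rate).
            # Generic illustrative figures; actual PS4 memory clocks weren't on the spec sheet.
            def peak_bandwidth_gb_s(bus_width_bits, transfer_rate_gt_s):
                return bus_width_bits / 8 * transfer_rate_gt_s   # GB/s

            # 256-bit GDDR5 at an effective 5.5 GT/s (a common GDDR5 speed at the time)
            print(peak_bandwidth_gb_s(256, 5.5))   # -> 176.0 GB/s

            # 128-bit (dual-channel) DDR3-1600 at 1.6 GT/s effective
            print(peak_bandwidth_gb_s(128, 1.6))   # -> 25.6 GB/s
            ```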

  • Charlie

    But still no pictures of the console. C’mon Sony!

  • Herok♞

    Looking at this what do you guys think the price will be as a guess?

    • http://www.youtube.com/user/xxHiryuuxx Tohsaka

      Tough to tell. Looking at the hardware they’re going for with that Radeon HD card of theirs, it’s only about half the power of an HD 7970 in raw processing power (rated 3.5TF). So let’s assume it’s on par with a 7870. 8GB of RAM is about the standard if it’s DDR3 (instead of the listed GDDR5). All told, and selling it below cost as Sony typically does? …400-500 bucks?

      • Niyi Aguaze

        About the PS4 GPU, it’s rated at 3TF (I think). So it’s more on par with the Radeon HD 7950.

        • Wesley Kenneth Houpt Mattingly

          I’ve got the 7950, great card! If the PS4’s GPU is comparable to that, I am excited to see what it will do. Sad that the card is $300 and the console won’t be too much more than that. Console gaming is so much cheaper, lol!

    • Krisi92

      599 US Dollars.

  • sibarraz

    Jaguar? Shoutouts to Atari.

    • Testsubject909

      The worst kind. Ever seen someone try to work a Jaguar?

      Poor AVGN and Spoony…

  • Andres Pena

    as long as i can play ps3 games on it, then its all good lol

    • Locklear93

      You can’t. <_<

      Edit: Let me qualify that statement a bit. There's no backward compatibility per se, but they could always do software emulation at some point. As for sticking a PS3 disc in and expecting it to run, though… nope.

      • Ehren Rivers

        In the presentation they were saying it would be pretty easy to port older PS games to PS4, and that they hope to eventually offer the full range of Playstation titles for download and play on the PS4.

      • Niyi Aguaze

        Through Gaikai. Insert the CD Game. Then it’ll automatically detect the game and it’ll play it, through streaming the data. So far it’ll be for PS3 games, but soon PS1/PS2 games will also have that capability.

        • http://simplephilistine.wordpress.com/ Arla

          Mind posting a source to this information? I can’t find it.

          • Niyi Aguaze
          • Locklear93

            With all due respect, that article doesn’t mention PS3 games at all, and I don’t remember David Perry mentioning them when he was on stage. It could conceivably be done, but the article you linked, and the presentation itself, didn’t confirm that it’d be used for that.

          • http://simplephilistine.wordpress.com/ Arla

            Yea this “Insert the CD Game. Then it’ll automatically detect the game and it’ll play it,” is not in there at all.

    • Niyi Aguaze

      Through Gaikai. Insert the CD, then it’ll detect the game and stream the data, so you can play it.

  • Odin

    Ha, the only thing that’s “next-gen” about the PS4 would be the ram amount. Once the Titan is released, it will make this GPU look old.

    • Niyi Aguaze

      It’ll make the GPU look Mid to Low High-End at the very most.

    • Luna Kazemaru

      Hahaha, that’s if the Titan ever comes out.

    • mirumu

      We don’t really know anything about the PS4’s GPU, but in the presentation, when they talked about Heavy Rain and Beyond: Two Souls, they claimed the former managed 15,000 polygons on the PS3, while the latter manages 30,000 polygons on the PS4.

      If that’s the case it would put the PS4’s graphics at around the level of a mid-range card, very roughly in the same ballpark as the Wii U.

      Of course we don’t know what effects may be applied on those polygons or if it’s even an accurate number.

      Honestly I think anyone expecting the PS4 to beat a high end PC graphics card like a Titan, GTX680 or HD7970 is in for disappointment.

      • Geoff Kelly

        Actually no, Heavy Rain and Beyond: Two Souls are both PS3 titles… they were saying that they progressed up to 30k polys with the PS3. They did not reveal how many polys their tech demo used, for some reason.

        • mirumu

          I re-watched the video and yeah, you’re right. Every site I can find talking about it in Google says the 30,000 poly quote was for Beyond: Two Souls on the PS4, but yes, he didn’t actually say that. I wonder if perhaps Sony asked them not to give any raw numbers for PS4 performance?

        • D H

          To be fair, the PS4 old man demo wasn’t a full body, so they may have withheld the number of polygons because it may not have been as impressive a gain as it could be.

      • Adol Christin

        “If that’s the case it would put the PS4′s graphics at around the level
        of a mid-range card, very roughly in the same ballpark as the Wii-U.”

        I can guarantee it’s much more powerful. We already know the Wii U GPU is based on the 4000 series. The PS4 GPU is probably based on the 7000 or maybe future 8000 series. A huge difference in performance.

        How did you come up with “same as Wii U”???

        • mirumu

          I never claimed it was the same as the Wii-U, just comparable. I was going purely on the polygon numbers David Cage mentioned. As Geoff Kelly said in his reply however Cage didn’t actually say the numbers were from the PS4. So yeah, my rough unscientific estimate would be totally and completely wrong.

          I would add however that just because something uses a more modern architecture doesn’t automatically mean it’s faster. The number of texture and shader units matters too.

          • Adol Christin

            I have done my research and all evidence points at it being based on the 7800 series. A big improvement from the Wii U GPU.

            Sorry, but I don’t see it being comparable.

          • mirumu

            It obviously isn’t comparable. As I said the numbers I based that opinion on turned out to be wrong.

          • Adol Christin

            No worries. I wonder if multi-plat titles will look that much better on the PS4.

          • mirumu

            Yeah, I wonder about that too. I saw Naughty Dog say the other day that they design all of their games’ assets at a higher resolution so they can release better-looking versions for more powerful consoles later. I hope the other companies doing multi-platform releases are doing this too.

          • http://twitter.com/Vietgeta VietGeta

            We’ll have to see with Watch Dogs’ release. Despite being shown at Sony’s E3 and this PS4 meeting, it’ll be on PS3, PS4, 360, and Wii U.

            But from the looks of it, Watch Dogs didn’t seem to take much advantage of the PS4 in that gameplay. It still looked like the PS3 version.

      • malek86

        It will be much more powerful than the Wii U, that’s guaranteed.

        Still, in the beginning it probably won’t matter much, since chances are we are gonna see a lot of PS4/PS3/360 cross-developed titles.

      • Barzh

        The GPU in the Wii U is comparable to a low-end GPU from 2010 so no way (HD 5670, c’mon!).

    • Skeima

      If the PS4 used the Titan, it’d cost a kidney… so no.

    • asch999

      And when AMD launches their new GPU that surpasses the Titan, the Titan will look old. Also, don’t forget that the price of the Titan and the other components to build a PC will be more than a PS4.

    • Isaac Newton

      This Titan you speak of, can it play Artificial Academy?

  • riceisnice

    So how expensive is this going to be?

    • MrRobbyM

      With PC-like hardware and no built-in BC, I’d say 400-500 bucks. Like we’ve been hearing, there will probably be two models at launch.

      • CirnoLakes

        I’d say $400 at most. 500 seems like an overestimation. At that price, they’d be selling the systems at a decent gain on day one.

        The CPU in there is basically a Zambezi.
        http://www.newegg.com/Product/Product.aspx?Item=N82E16819103960

        This thing right here. The customization between the PS4’s 8-core and the consumer Zambezi is probably negligible. It’s a CPU that is weaker and cheaper than my slightly outdated Intel 2500K. And they’re undoubtedly paying less than $100 per CPU thanks to an agreement between AMD and Sony (less than $60 per CPU, more than likely).

        The GPU probably doesn’t amount to anything expensive either, and Sony is responsible for all the other parts, the OS, PSU, and all the other highly specialized in-house components. They could probably sell this thing for $250 and make a profit.

        So I’d say the starting price is going to be either $350 or $400, with $400 as the top number. And even at that they’d be making a killing selling the hardware.

        • malek86

          I’m thinking more than that. While the hardware does look inexpensive, we still don’t know much about the GPU – which could be the problem here. As the CPU is not terribly powerful, they’ll have to rely on a powerful GPU for graphics, so I wouldn’t discount it so easily. We only know it’s 1.84TF, so assuming it’s single precision, that would put it pretty much on par with a 7850.

          Frankly, I think $350 will be the starter model and $450 the highest one. Remember, they can’t make the premium model’s price too close to the basic model’s, or they could risk a Wii U situation where the Basic model is unwanted and left sitting on the shelves.

          Of course those prices might go up to $400 and $500 with a free game, if they choose to increase their profit margin (or perhaps break even? We really don’t know anything about the production costs yet). It’s not impossible.
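
          For reference, 1.84 TF does line up with single-precision math on a GCN-style GPU. A sketch of the arithmetic, where the shader count and clock are just one plausible combination that produces Sony’s number, not confirmed specs:

          ```python
          # Single-precision throughput of an AMD GCN-style GPU:
          #   FLOPS = shader_count x clock x 2 (a fused multiply-add counts as 2 FLOPs).
          # Shader count and clock below are illustrative guesses, not confirmed PS4 specs.
          shaders = 1152        # e.g. 18 compute units x 64 shaders each
          clock_ghz = 0.8       # 800 MHz

          tflops = shaders * clock_ghz * 2 / 1000
          print(f"{tflops:.2f} TFLOPS")   # -> 1.84 TFLOPS

          # Radeon HD 7850 for comparison: 1024 shaders x 0.86 GHz x 2 ≈ 1.76 TFLOPS,
          # which is why the two keep being mentioned in the same breath.
          ```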

  • CirnoLakes

    If developers get used to developing for 8 cores instead of two, I can say goodbye to my 2500K being near top of the line in PC gaming. My 2500K is just slightly outdated as we speak.

    But it will become a heck of a lot more outdated if most games are developed for 8 cores.

    • malek86

      Outdated? Oh come on, Sandy Bridges were and are still great for gaming. Even my Core i3 easily outperforms pretty much every previous generation’s CPU, including quad-core ones. Of course, if devs get used to multicore development, things will get rougher. But when that happens, Intel/AMD will also start making them cheaper, so it’s not a big problem.

      Anyway, I last updated my rig in May 2011 (when The Witcher 2 came out), I wouldn’t mind doing it again after 3 years or so.

      • CirnoLakes

        By “slightly outdated”, I mean in comparison to top of the line. And by that, I mean the higher end models of Ivy Bridge that have come out recently. I’m not degrading my legitimately powerful CPU by admitting it is weaker than an Ivy Bridge.

        When I bought my 2500K, it and the 2600K were together the best consumer LGA 1155 CPUs on the market. That isn’t so anymore, because Intel has moved on to LGA 1155 chips numbered 3000 and higher.

        “Even my Core i3 easily outperforms pretty much every previous generation’s CPU”

        Of course. But that still doesn’t mean my 2500K is “top of the line”. As far as consumer models are concerned, that title is reserved for the 3770K. Which is easily more powerful than my 2500K.

        “Of course, if devs get used to multicore development, things will get rougher”
        Of course, and to the point I may want to upgrade my 2500K to whatever 8-core Intel puts out. That’s the point I’m making.

        Getting an 8-core used to not be useful because most games utilized no more than 4 cores. With the PlayStation 4, that may change. In fact, given how some PC-exclusive titles push current consumer computers, some PC game two years or less down the road may push my little quad core to its limits, in which case I’ll be looking into a high-end Intel 8-core, not another quad core.
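
        For what it’s worth, “developing for 8 cores” mostly means farming independent per-frame jobs out to more workers instead of serializing them on a couple of threads. A minimal sketch (the subsystem names and workloads are invented for illustration):

        ```python
        # Minimal sketch of task-parallel, CPU-bound game work spread across 8 cores.
        # Subsystems and iteration counts are made up for illustration only.
        from concurrent.futures import ProcessPoolExecutor

        def run_subsystem(job):
            name, iterations = job
            total = 0
            for i in range(iterations):   # stand-in for one frame's CPU-bound work
                total += (i * i) % 7
            return name, total

        if __name__ == "__main__":
            jobs = [("ai", 200_000), ("physics", 300_000), ("animation", 250_000),
                    ("pathfinding", 180_000), ("particles", 150_000), ("audio", 100_000)]
            # On an 8-core CPU these all run in parallel instead of queuing on 2-4 cores.
            with ProcessPoolExecutor(max_workers=8) as pool:
                for name, result in pool.map(run_subsystem, jobs):
                    print(name, result)
        ```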

  • http://twitter.com/machinebuilder Odell

    It’s an APU, so both the CPU and GPU share the RAM, right?

    • http://www.cubiz.tk/ Auragar

      I believe they said this, yes.

  • AKA Takumix

    Nvidia has announced that their next GeForce Titan card, coming by the end of 2013 and priced at $900, will have a single-precision floating-point figure of 4.5 TFLOP/s and 1.3 TFLOP/s in double precision. Now I wonder whether the 1.84 TFLOPS spec Sony released for the PS4 GPU is single or double precision… http://www.techpowerup.com/180364/NVIDIA-GeForce-GTX-Titan-Final-Specifications-Internal-Benchmarks-Revealed.html
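
    It is almost certainly single precision, since that is the convention for consumer GPU marketing. As a rough sketch of what double precision would look like, assuming (this is an assumption, not a published figure) the 1:16 DP rate typical of consumer GCN parts of that era:

    ```python
    # If Sony's 1.84 TFLOPS is single precision and the GPU uses the 1:16
    # double-precision rate typical of consumer GCN chips (an assumption),
    # the double-precision figure would be roughly:
    sp_tflops = 1.84
    dp_rate = 1 / 16
    print(f"~{sp_tflops * dp_rate:.3f} TFLOPS double precision")   # -> ~0.115
    ```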

  • Larry L Harris Jr.

    http://threecheersformedianews.blogspot.com/2013/02/my-thoughts-on-ps4.html

    Seriously, why didn’t they just wait for E3? I do not understand…

  • Go2hell66

    $599 US dollars!!!

  • http://vindictushots.tumblr.com/ Okuni-chan

    Interesting they went with AMD.

    • CirnoLakes

      AMD has been losing heavily to Intel in the core PC market in recent years, as their recent FM1 series has been a disappointment. Even their best chip, the Zambezi (the chip that is essentially in the PS4), has failed to perform nearly as well as the 2500K (which is why I bought it).

      I’m thinking that, due to this competition and losing market share to Intel, they went to Sony and struck up an insanely cheap deal to recoup losses, playing to their strength of making and selling middling parts at cheap prices. While nothing AMD has put out lately can compete with Intel’s top-of-the-line parts, their chips are more than powerful enough to qualify as next-gen console parts. AMD locking down the consoles this generation more than compensates for losing the PC space to Intel, and it’s a smart business move.

      People who have enthusiastically kept up with the PC market in the past few years see this as an obvious business move for both AMD and Sony. AMD is poised to sell its CPUs dirt cheap to stay competitive with Intel, and Sony is looking for the most powerful yet cheapest CPUs possible; as a company that normally makes specialized hardware, Sony can afford to push for the leap that will move the software market to developing for 8 cores instead of merely 4. In fact, for most developers, developing for 8 cores on x86 architecture will be easier than developing for 4 cores on the Power architecture.

      So, while the choice of AMD is still interesting (mostly for the architecture), many of us admittedly did see it coming from miles away.

      • http://vindictushots.tumblr.com/ Okuni-chan

        Oh wow I didn’t know that. I knew AMD was having a bit of a struggle but didn’t know the full story to it. It does make sense that they went to Sony when put that way. I hope it works out for the both of them.

  • Learii

    I’ll wait for the 2nd model.

  • malek86

    I knew the next consoles would have an optical out. And people told me Nintendo didn’t include it “because they wanted to get ahead of a bad standard”, hah. So naive.

  • Red Veron

    I didn’t know they still have that legacy AV out they’ve had since PS1. I feel sorry for the people who run HD consoles on AV cables since I did that up to last year on my old TV.

    • katzedan

      I think it’s good, since I (and probably other people) use this AV port/cable to connect to external audio, like my headphones or guitar amplifier…
      At least in some games (like Rocksmith) this will correct the audio delay that HDMI causes!

  • http://www.facebook.com/profile.php?id=1598149010 Rayhan PromisedGallery

    it seems i will wait 6 years to buy this. i’ll wait until huge price drop, tons of games (and jailbreak haha)

    heck, i even just bought PS3 last year, and it’s a fat one

  • Isaac Newton

    Why do so many of those people want to make the PS4 a PC?
    I mean come on PS4 is a PS4.
    And PC is a PC.
    PS4 is purely for game and PC is for Por…. I mean Browsing and any crappy things!

    • Ritsujun

      They’re mad that publishers don’t support PC.

      • Isaac Newton

        Mad or Bitter….
        Well if those publishers support PC then HELL YEAH
        HENTAI GAMES for console no hold bars ROFL

    • CirnoLakes

      >PS4 is purely for game and PC is for Por…. I mean Browsing and any crappy things!
      Cannot tell if joke or just one of the most ridiculously ignorant statements I have ever heard in my entire life.

      • Isaac Newton

        for PORN!

      • Evan Groman

        Oh, why is it ridiculously ignorant? Because oh god gaming PC’s are so orgasmic?

        Yes, it’s a joke, you fuggin tool. Of course it’s a joke.

  • isfuturebright

    It’s funny that they didn’t show the console. I think they’re waiting for E3?

  • http://www.facebook.com/people/Bob-Obb/100001994017630 Jimmy Dean

    6x BD… hell yes. What’s the PS3, like 1x or 2x? No more mandatory installs I hope.

    Also, 8GB of GDDR5 RAM is HUGE. HUGE. RAM was one of the PS3’s biggest crutches; they seriously went all out on this one.

    • CirnoLakes

      >6x BD
      >No more mandatory installs I hope
      Nope. Discs are still outdated technology in comparison to hard disks, and many developers aren’t going to want to put up with that bottleneck.

      Pretty soon all games are going to run exclusively from hard disks. And what replaces them won’t be a new, faster, higher-capacity disc; it will be SSDs. The death of disc-based gaming is long overdue. In fact, the only reason disc gaming still exists at all is because gamers like yourself are used to it and are thus holding technology back, just like cartridges were holding gaming back during the 5th generation, which is why Sony didn’t use them in the PlayStation.

  • Sergio Briceño

    At least now we can safely say that the Wii U isn’t doomed. The difference isn’t as big as everyone was making it out to be, but it will be big indeed.

    • CirnoLakes

      Lower specs have never made a system “doomed” to begin with.

      The Wii sold competitively to the XBOX 360 and PlayStation 3 despite significantly inferior specs. And the Vita is struggling to compete with the 3DS despite having comparatively superior specs. If specs mattered, the Vita would have clearly won over the 3DS.

      I don’t see why the same couldn’t apply to this generation.

      • Sergio Briceño

        I’ll tell you. The Wii won, by a margin of about 50%. But its sales have been steadily decreasing. Why? Because the past generation has dragged on far longer than previous ones and the hardware in the Wii has been inferior ever since it was released.

        This time around, at least Nintendo has a head start. I don’t think there will be people complaining about graphics and the lack of power until 2016-2017, unlike with the Wii, which nobody thought was going to be successful based solely on its capabilities and its focus on motion-controlled gaming.

        Don’t get me wrong, I love games on my console over anything else and I know Nintendo are the ones that bring the good games. I laughed at the PS4 presentation when they said that connectivity and social features were their basis because that’s what both casual and core gamers have come to expect and demand in their consoles. I must be neither; I don’t really give a damn about the share button.

        All I wanted to say is that, for the last couple of weeks, people on forums have been claiming the PS4 reveal would be a Wii U killer, heck, even an Xbox-whatever killer, and it hadn’t even been announced yet. Well, if those assumptions were made based solely on predictions about its power, then the PS4 didn’t amaze me, not the way the Xbox 360 and PS3 did when they were announced. So I think, in that regard, the Wii U can rest assured it’s not going to die anytime soon.

        That’s all.
