General Computer Thread

Discussion in 'Science and Technology' started by Amaris, May 26, 2016.

  1. Ar-Pharazon

    Ar-Pharazon Admiral Premium Member

    Joined:
    May 19, 2005
    Location:
    Far North Chicago Suburbs
A Heisenberg compensator? Or maybe a fluid link.
     
  2. Gingerbread Demon

    Gingerbread Demon I love Star Trek Discovery Premium Member

    Joined:
    May 31, 2015
    Location:
    The Other Realms
    OK thank you MS for screwing with Windows yet again.

    The Creators Update just dropped here, and they removed the ability to scale text on icons and other elements... FFS, how do we change those things for those of us who can't read the default fonts?
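
    One workaround people mention is editing the classic WindowMetrics values in the registry directly. A sketch in Python - whether the update still honours these values is exactly the open question, so treat it as a guess, back up the key first, and sign out/in afterwards:

    [CODE]
    # Python sketch of the registry workaround for the removed scaling UI.
    # Assumptions: the classic WindowMetrics values are still honoured by
    # the Creators Update (that's the open question), and the usual
    # "-15 per pixel" convention applies. Back up the key before touching
    # it, and sign out/in for changes to take effect.
    import winreg

    KEY = r"Control Panel\Desktop\WindowMetrics"

    with winreg.OpenKey(winreg.HKEY_CURRENT_USER, KEY, 0,
                        winreg.KEY_READ | winreg.KEY_SET_VALUE) as k:
        current, _ = winreg.QueryValueEx(k, "CaptionHeight")
        print("current CaptionHeight:", current)  # default "-330" (~22 px)

        # Make title bars ~30 px tall: 30 * -15 = -450.
        winreg.SetValueEx(k, "CaptionHeight", 0, winreg.REG_SZ, "-450")
    [/CODE]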
     
  3. John Clark

    John Clark Rear Admiral

    Joined:
    Jul 4, 2008
    Location:
    There
    Have they taken the slider bar out of the display options?
    (I've still got that on Win10, but I'm not on the Creators Update.)

    For Windows 7, I'm having to mess about with font size/icon size manually, as one of the programmes we use at work hides options at standard size, but other programmes take up too much screen at the magnification needed. Of course, it doesn't help that the monitors we're stuck with won't go higher than 1280*1024 :(
     
  4. Gingerbread Demon

    Gingerbread Demon I love Star Trek Discovery Premium Member

    Joined:
    May 31, 2015
    Location:
    The Other Realms

    No slider bar. There's just one generic scaling option for everything.

    First this:
    [image: scaling01.jpg]

    Then this:
    [image: scaling02.jpg]

    And where I've marked are the only changes you can make. You can't change title bars or icon text anymore. Grrrrr
     
  5. Marc

    Marc Fleet Admiral Premium Member

    Joined:
    Nov 14, 2003
    Location:
    Shinning Waters

    Read a few comments around the net that Microsoft is still having headaches with scaling because of having to support 4K and higher monitors.

    Can imagine it's quite a headache when you have to cover the range from someone with a cheap laptop running a 14" screen at 1440x900 to someone with a high-end video card and a UHD display. Then you have people running 16:9 and 16:10 aspect ratios, and if you've got an MS Surface 3 or later it's 3:2, and finally the poor victims still on 4:3 :)

    Think allowing users to put in a custom % rather than the slider probably gives a lot more control. Sure, if you want to really scale then sliding all the way in one direction is one thing, but if you're aiming for a particular point it can be easier to just type 116%.

    Should also still be doable fairly easily even if one can't totally read the screen.
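
    For a rough idea of what a given percentage works out to, a sketch assuming the standard 96 DPI baseline that Windows scales from:

    [CODE]
    # What a custom scaling percentage means in practice, assuming the
    # usual 96 DPI baseline that Windows scaling is defined against.
    BASE_DPI = 96

    def scaled(px, percent):
        """Physical size of a UI element that is `px` pixels at 100%."""
        return round(px * percent / 100)

    for pct in (100, 116, 125, 150):
        print(f"{pct:>3}%: effective {BASE_DPI * pct // 100} DPI, "
              f"9pt text ~{scaled(12, pct)} px tall")
    [/CODE]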
     
  6. Gingerbread Demon

    Gingerbread Demon I love Star Trek Discovery Premium Member

    Joined:
    May 31, 2015
    Location:
    The Other Realms

    I guess, but it's a pain. I liked that I used to be able to scale the desktop icons and have their text a bit larger, and my title bars also bigger so I can actually read them.

    Changing icons also changed the size of them in File Explorer, which is where I spend a lot of time. Now they are not very readable. I'll try the new settings and see how I go.
     
  7. John Clark

    John Clark Rear Admiral

    Joined:
    Jul 4, 2008
    Location:
    There
    That's a bother :(

    I know on my work PC, to have all the buttons properly available in one programme, I should have it at 125%, but then other MS programme toolbars take up way too much space. I've currently got it set to 115%, which at least makes both sides vaguely usable.
     
  8. Robert Maxwell

    Robert Maxwell memelord Premium Member

    Joined:
    Jun 12, 2001
    Location:
    space
    Pretty much. Microsoft and Intel never intended 64-bit for the desktop user. It was Itanium if you wanted 64-bit, 32-bit otherwise.

    x86 was/is a very well-supported architecture but also very complicated and with lots of legacy junk in it that Intel wanted to get away from and which makes little sense in a server environment. Itanium allowed them to design a new architecture from scratch, one that could perform better and be easier to write code for. It was designed from the ground up for parallelization--an extremely important factor for servers at the time. But it wasn't supposed to take so long to develop, and the results were lackluster, to Intel's great embarrassment.

    Intel did not initially intend to move everyone over to IA-64. They wanted IA-64 for the enterprise, x86 for everyone else. Once IA-64 did so poorly at the high end, however, Intel tried to reposition it as a workstation-quality product--not for low-end users but certainly less demanding desktop work (compared to server loads). This just didn't work because performance was terrible compared to x86 and software support was limited. These two factors really teamed up to kill it. Why switch to a platform that's more expensive and performs worse? There was just no upside for users.

    AMD's glory days may be behind it, but forcing Intel's hand with the x86_64 extension was the right move and the consumer PC market is all the better for it.
     
  9. Marc

    Marc Fleet Admiral Premium Member

    Joined:
    Nov 14, 2003
    Location:
    Shinning Waters
    They might be behind them, but with Epyc and Threadripper, and to a lesser extent Ryzen, AMD is getting back in the game to a large degree. The lack of real competition has meant that Intel hasn't really had to innovate, let alone push the boundaries. Kaby Lake offered little over Skylake when it was released last year, and in a couple of weeks Intel will announce the details of Coffee Lake. Though they'll continue to kick users in the teeth by requiring a new motherboard if they want a new processor.

    Now if they could just get their act together with the video cards.
     
  10. Santaman

    Santaman Vice Admiral Admiral

    Joined:
    Jul 27, 2001
    Location:
    Tyre city
    As far as I know, all of a sudden Intel will make 6-core i7 and i5 chips... gosh... hmm, why would that be?

    Ryzen isn't perfect and it might not be the fastest chip, but price/performance-wise it hurts Intel; at about every price point you do get more cores/more threads with AMD.
    Ryzen is a first attempt, again a whole new architecture compared to Bulldozer; on their budget this is something that deserves respect. It hasn't even been out that long; as soon as it becomes really mainstream it will get more support.

    As for the rest, IPC improvements are getting harder and harder to do, so the next stop will be more cores/more threads. I mean, take a look at POWER9, which has 4 or 8 threads per core - you won't see those in a PC, but as an example of what is possible...
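
    If you're curious about your own machine's core/thread split, a quick sketch (it assumes the third-party psutil package, which is just one of several ways to get the physical count):

    [CODE]
    # Quick look at the core/thread split. os.cpu_count() is stdlib;
    # psutil is third-party (pip install psutil) - an assumption here.
    import os
    import psutil

    logical = os.cpu_count()                    # hardware threads the OS sees
    physical = psutil.cpu_count(logical=False)  # physical cores (can be None)

    if logical and physical:
        print(f"{physical} cores x {logical // physical} threads/core "
              f"= {logical} hardware threads")
    [/CODE]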
     
  11. Gingerbread Demon

    Gingerbread Demon I love Star Trek Discovery Premium Member

    Joined:
    May 31, 2015
    Location:
    The Other Realms
    Power supply question.

    I have a Corsair RMX750, which is a 750-watt PSU.

    I'm considering a CPU upgrade again, to an AMD FX-8350 (4 GHz, 8 cores), which has a power rating of 125 watts.

    My FX-6300 is rated at 90 watts.

    My video card is the Radeon R9 380 4 GB, which I think uses 190 watts at full load.

    Do you think I'd have to change the PSU for the next CPU upgrade, or should I be OK?
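
    Rough back-of-envelope with the numbers above (the "other" figure is just a guess for the rest of the system):

    [CODE]
    # Back-of-envelope PSU headroom check. The CPU and GPU figures are
    # the ones quoted above; the "other" allowance is a rough guess.
    cpu_w   = 125   # FX-8350 TDP
    gpu_w   = 190   # R9 380 at full load (as quoted)
    other_w = 100   # generous guess for board, RAM, drives, fans

    load_w = cpu_w + gpu_w + other_w
    psu_w  = 750

    print(f"Estimated load: {load_w} W of {psu_w} W "
          f"({load_w / psu_w:.0%} of capacity)")
    # ~415 W, about 55% of the 750 W unit -- plenty of headroom.
    [/CODE]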
     
  12. Santaman

    Santaman Vice Admiral Admiral

    Joined:
    Jul 27, 2001
    Location:
    Tyre city
    More than enough. I run an FX-8350 with a GTX 780 and it works fine on a 750-watt PSU. Before the 780 I used an ASUS HD 6970 - this beast: https://www.asus.com/Graphics-Cards/EAH6970_DCII2DI4S2GD5/ - which uses even more power. The PSU is a Be Quiet! Power Zone 750.

    The only thing I would look into is the mainboard; 8350s draw a lot of amps and need a good VRM*. I use an Asus Sabertooth, which has a 12-phase VRM that can take 235 watts.

    *voltage regulator module
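
    To put "a lot of amps" in numbers - a sketch, with the core voltage an assumed typical stock value:

    [CODE]
    # Rough current estimate; the ~1.35 V core voltage is an assumed
    # typical stock value, not a measured one.
    cpu_w = 125    # FX-8350 TDP
    vcore = 1.35   # assumed stock core voltage

    print(f"At the core:   {cpu_w / vcore:.0f} A")  # ~93 A through the VRM
    print(f"At 12 V input: {cpu_w / 12:.1f} A")     # ~10 A from the PSU
    # The VRM steps 12 V down to ~1.35 V, so it has to carry ~90+ A.
    # That's why phase count and VRM quality matter for a 125 W chip.
    [/CODE]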
     
  13. Gingerbread Demon

    Gingerbread Demon I love Star Trek Discovery Premium Member

    Joined:
    May 31, 2015
    Location:
    The Other Realms

    I'm using an MSI 970 Gaming and it says it will run the 8350 without issue.
     
  14. Marc

    Marc Fleet Admiral Premium Member

    Joined:
    Nov 14, 2003
    Location:
    Shinning Waters
    Does seem a tad counterintuitive - CPU makers work to cut the power consumption of their chips, and then people stick in these honking great power-hog video cards, and multiple ones at that! The latest AMD Radeon RX Vega 64 pulls up to 375 watts (hmm, two Epyc CPUs or one Vega 64).
     
  15. Santaman

    Santaman Vice Admiral Admiral

    Joined:
    Jul 27, 2001
    Location:
    Tyre city
    It is mainly because CPUs are multifunctional, so they don't do any one thing really well, else something else would get cannibalized. GPUs only do a limited number of calculation types; however, they are REAHEEAAHEAAHEEEEEAAAALY good at them - like mining crypto coins; no CPU is good at that. Also, the computing power needed to calculate 3D stuff is staggering...
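
    A toy way to see the same data-parallel idea, with numpy standing in for the "wide" hardware (the exact ratio varies wildly by machine; only the shape of the result matters):

    [CODE]
    # Toy illustration of data parallelism on an ordinary CPU: numpy's
    # vectorised path stands in for "wide" GPU-style hardware.
    import time
    import numpy as np

    xs = np.random.rand(10_000_000)

    t0 = time.perf_counter()
    total = 0.0
    for x in xs:            # one element at a time, like serial CPU code
        total += x * x
    t_loop = time.perf_counter() - t0

    t0 = time.perf_counter()
    total_vec = float(np.dot(xs, xs))   # the same sum as one wide operation
    t_vec = time.perf_counter() - t0

    print(f"loop: {t_loop:.2f} s, vectorised: {t_vec:.4f} s "
          f"(~{t_loop / t_vec:.0f}x faster, same answer)")
    [/CODE]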

    @Tetra, that board will do well.
     
  16. Marc

    Marc Fleet Admiral Premium Member

    Joined:
    Nov 14, 2003
    Location:
    Shinning Waters
    Yeah I know - people want to play their games at 4K resolutions etc. etc.

    Not sure if cryptocurrency is a zero-sum game by the time people have bought the motherboards and video cards, but there seems to be big enough demand that vendors are releasing specific boards for crypto-mining and even specific video card models (though probably not as much of a dead end as the dedicated ASIC systems). Still pissing off the enthusiasts, as it's cutting into the supplies for the regular market.
     
  17. Santaman

    Santaman Vice Admiral Admiral

    Joined:
    Jul 27, 2001
    Location:
    Tyre city
    GPUs have come a long way; the last decade or so they've developed at an insane rate, but like CPUs I expect them to hit a performance plateau quite soon as well.
     
  18. Marc

    Marc Fleet Admiral Premium Member

    Joined:
    Nov 14, 2003
    Location:
    Shinning Waters
    As much as people deride them, the onboard graphics from AMD and Intel are faster than cards that people were paying good money for not that many years ago.

    AMD's Vega chip has 12.5 billion transistors on a 486 mm² die using a 14 nm process (the same node size Intel will be using for the upcoming 8th-gen Core processors, aka Coffee Lake); the top-end Nvidia chip has 12 billion.

    Those are big chips!!

    A Threadripper has 4.2 billion transistors on a 192 mm² die.
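
    Quick density math on those figures (using only the numbers quoted here):

    [CODE]
    # Transistor density from the figures quoted in this post.
    chips = {
        "Vega":         (12.5e9, 486),  # transistors, die area in mm^2
        "Threadripper": (4.2e9, 192),
    }

    for name, (transistors, area_mm2) in chips.items():
        print(f"{name}: {transistors / area_mm2 / 1e6:.1f}M transistors/mm^2")
    # Vega ~25.7M/mm^2 vs Threadripper ~21.9M/mm^2 -- similar density,
    # the GPU is simply a much bigger die.
    [/CODE]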
     
  19. Santaman

    Santaman Vice Admiral Admiral

    Joined:
    Jul 27, 2001
    Location:
    Tyre city
    AMD's APU graphics, especially the later generations, are quite good; I assume Ryzen-derived APUs will be even more powerful. Intel is hampered by a lack of patents; they're limited in what they can do - some stuff is owned by Nvidia, some by AMD, and even VIA because they own S3.

    I think graphics power is more in demand than CPU power. I'm using an AM1 Athlon 5350, a tiny 25-watt chip, as a main machine, and its graphics core, when it can use all its decoders, can play 4K content etc. The CPU itself is of course not all that fast - it's a 2.05 GHz quad-core - but when something can use the on-chip GPU the machine is actually quite spiffy.
    I assume AMD has been quite on the right track with this, making the GPU and CPU able to offload tasks to each other; of course, the first real APU was a Cyrix MediaGX...
     
  20. Gingerbread Demon

    Gingerbread Demon I love Star Trek Discovery Premium Member

    Joined:
    May 31, 2015
    Location:
    The Other Realms
    How come no one makes a motherboard with more than one CPU socket, so say you could have two 8-core CPUs and double the processing power?