2 games made by Valve that are hidden on Steam for whatever reason

Well, all I can do is provide proof and try to educate you; it's up to you whether you ignore it or not, I guess. And you are ignoring it. You can't just call BS on something you don't believe; you need to back it up and tell me why. There's no reason for me to turn around and say this software, made by NVIDIA (not to mention FRAPS, which gets the same reading), that counts frames accurately is now wrong.

Well, I called BS and backed it up. FRAPS has always been crap. I didn't ignore it; I addressed it. Unlike you, who didn't address the hard drive comment. Why aren't you using Steam's built-in counter, anyway?

It's a technically wrong reading. In fact, it's completely wrong unless it's a specific number, and it's not their fault. Let me explain:

I'm not going to explain too far in depth, because you'll probably ignore me and won't understand anyway. The video is a .bik (Bink) file made by RAD Game Tools. They also make tools to tell you information about .bik files and to convert, make, and play them back.

According to those tools, the Valve splashscreen has a framerate of 23.98 (commonly rounded to 24 FPS, but that's incorrect; 23.976 is closer, and still an approximation), and each game has a different fixed width and height. No, it's not affected by your monitor.
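For reference, both readings are roundings of the NTSC film rate of 24000/1001 frames per second (assuming, as the numbers suggest, that the file uses that standard rate). A quick check:

```python
from fractions import Fraction

# NTSC film rate: exactly 24000/1001 frames per second
ntsc_film = Fraction(24000, 1001)

print(round(float(ntsc_film), 3))  # 23.976
print(round(float(ntsc_film), 2))  # 23.98
```

So 23.98, 23.976, and "24 FPS" are all the same rate reported at different precisions.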

The .bik file format was designed to be small and support shaders, but the scenes are essentially pre-rendered. That is to say, they aren't going to change framerates. Due to how framerate calculations are done, you will never get a 100% correct answer. These cutscenes "confuse" the program, making it give a larger output than what's there.

So, like it or not, no, you are not getting 200+ FPS. You're getting ≈ 23.98 frames per second. I didn't know what framerate it was at before, but I knew it would be under 200. Even if it were 200, especially with the Source 2 engine loading up while it played, you ain't gettin' 200 FPS.


Here's a better demonstration of coil whine. It's a phenomenon known to be caused by the GPU, the CPU (more accurately, the power coils surrounding the CPU bracket on the motherboard), and sometimes the PSU too. I have zero mechanical HDDs inside my system. The coil whine is audible when running any game above around 100-200 FPS. It happens both with and without the mechanical USB drive attached.

Huh, I was wrong. I thought it was more likely that it'd be a mechanical hard drive.

Yes, when you are loading something, but not in something like a splashscreen or a menu. There's 0% usage during these splashscreens and menus. It's not doing anything until you load something. Storage devices don't get polled every single frame; that would be absolutely ridiculous.

Okay, if you're going to try to educate someone, be correct. Your computer cannot have 0% resource usage unless it's off. Splashscreens are intended to hide the loading of resources. This is a thing. It exists. It is used. It is doing something. I measured the resource usage spike and FPS drop. Multiple times. I didn't say they did. You do know that computers can do multiple things at one time, right? You know what happens while a cutscene is being played? Resources are being moved around. It makes for a smoother (or, rather, prettier) experience.

Load times are smaller while actually playing because the engine is already in memory, as are many textures and sound files. You're just loading the map(s), new sound files, and textures. How would it make sense to drop the engine and everything you already need, then reload it?

When I tested it, oddly enough I got 33% CPU usage, but no disk usage whatsoever until after the splashscreen finished. This makes sense, because the Source engine does not load anything until the splashscreen ends.

Here's a screenshot from a PC with no disk usage.

The Source engine does load something while the splashscreen is playing. Here, let me do a comprehensive test:

| Load time, splash screen skipped immediately | Load time, splash screen fully played (splash screen = 10 seconds) |
|---|---|
| 32 seconds | 21 seconds |
| 27 seconds | 18 seconds |
| 40 seconds | 30 seconds |
| 28 seconds | 21 seconds |
| 30 seconds | 21 seconds |

Note that piss-poor hardware (AKA: Not my daily driver) was used to exacerbate the effect. There was a major resource spike during and after the splashscreen that is consistent.

Notice something about those times?

32-21: 11

27-18: 9

40-30: 10

28-21: 7

30-21: 9

Average: 9.2 seconds.
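The arithmetic checks out; here's the same calculation as a quick script, with the times taken straight from the table:

```python
# Load times in seconds from the table: (splash skipped, splash fully played)
times = [(32, 21), (27, 18), (40, 30), (28, 21), (30, 21)]

# Difference per run: how much longer loading takes when the splash is skipped
diffs = [skipped - played for skipped, played in times]

print(diffs)                    # [11, 9, 10, 7, 9]
print(sum(diffs) / len(diffs))  # 9.2
```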

Something is being loaded. It may not be cumbersome, but resource usage, time after time, shows that when the cutscene is cut short by a few seconds, resource usage tends to be lower afterwards. However, when the intro isn't played at all, resources used tend to be higher and don't drop for 10 or so seconds. It could be one large thing or a few small things. You wouldn't notice as much because your hard drive is not mechanical, your CPU is fast, and the Source engine is optimized.

It's not incorrect, and you can prove it using a camera pointed at your monitor. As the image scans, the GPU keeps delivering new frames WHILE the monitor is displaying the image. This is what tearing is. At 120 FPS on a 60 Hz monitor, you are getting 2 frames for every refresh. So on the monitor, if you turn suddenly, you get the top half from the first frame and then the bottom half from the second frame. At 600-900 FPS, you get about 10-15 frames on the screen at once, all torn up and divided as the monitor scans down. You can visibly see it even without a camera. It's not a myth, and you can't just ignore facts because you don't believe them.

Your own advice: Take it.

It is. Do you have a high-speed camera? 'Cause otherwise, you're getting 30 or 60 FPS from the camera.

That's not what causes tearing. Your monitor refreshes at a constant rate, let's say 60 Hz. Your GPU's frame rate changes. LCD monitors redraw data all at once. CRTs redraw data from the top left, fading and moving. Tearing is caused when the GPU sends information faster than the monitor can handle. The monitor essentially updates the wrong pixels.

You weren't wrong. Just being a pedant.

I get 600-900 so long as there's less than 10 people around.

Baseline is all that's important. So, you get less than 600. It doesn't matter if you can flicker between 600 and 900, it matters what you can consistently get.

Actually, in my own testing on servers with bigger maps (other gametypes) and 50 people, I generally get 150-300 FPS.

Baseline is 150 fps.

When I played the same map locally, I still only got around 200-350.

50 people + playing on the server does not affect your FPS that much. If you want to dispute that, you need to get proof and show me.

Here's your proof:

You can pull 600-900 FPS with less than 10 people.

With 50 you can pull 150-300.

That's a reduction of two thirds off the maximum (900 down to 300). That's "a lot."

inb4 "subjactive, lawl."

I understand what you mean, but I'm not talking about a visual difference, which there is none.

Yes there is. Minor=/=None.

I'm talking about the input, your mouse movements.

Mouse movements are not tied to FPS; they're tied to polling rate, DPI, and speed settings.

Keyboard input can be changed, but still has a minimum of around 2 ms even with many high-end keyboards.

Due to the way the source engine calculates input from your mouse, having a higher framerate actually changes how your character moves and aims in the game.

Let me find the comment...

"but I'm not talking about a visual difference" - You

"but I'm not talking about a visual difference" - You

"but I'm not talking about a visual difference" - You

This is even apparent back in Quake. If you set your FPS higher in that game, you could actually run faster AND jump higher. This has been known for a very long time.

That was Quake. This is Counter-Strike. Movement speed in Counter-Strike is capped pretty low, and bhopping requires you to move at least 250 map units/s, and it's generally faster. You can move a hell of a lot faster, with more usefulness, in HL2:DM.

Movement speed is based on map units/s, not framerate. You only appear to be slower at lower framerates. But then again, you're not talking about visual differences, right?

You can see a game that operates like this here. This is one of the only games I know past the year 1999 that runs with framerate dependent timings.

It's been bad practice to make games dependent on framerate since, like, 1995, yet you're saying movement in Counter-Strike is dependent on frames and is totally not a visual thing.

That's because emulators lock their game timings to the framerate. I fucking said that.

They're supposed to run at 24 or 20 or 30 FPS, depending on the game.

HEADDESK

If you speed them up, the timing speeds up alongside the framerate.

groan

This is not the case with the source engine.

sigh

They are so vastly different it's not funny.

Oh Captain my captain

If it were, you would run slower at 30 FPS and then faster at 60 FPS. That's just obviously not the case. That's not how any of this works.
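The distinction both sides are circling can be sketched in a few lines. This is a toy model with made-up speeds and units, not actual Quake or Source code: frame-locked movement applies a fixed step per frame, so FPS changes your speed; delta-time movement scales by frame time, so it doesn't.

```python
def distance_frame_locked(units_per_frame, fps, seconds):
    # Frame-locked: movement is applied once per rendered frame,
    # so total distance scales directly with framerate.
    return units_per_frame * fps * seconds

def distance_delta_time(units_per_second, fps, seconds):
    # Delta-time: each frame's movement is scaled by the frame duration,
    # so total distance is (nearly) the same at any framerate.
    dt = 1.0 / fps
    frames = int(fps * seconds)
    return sum(units_per_second * dt for _ in range(frames))

print(distance_frame_locked(5, 30, 1))   # 150  (slower at 30 FPS)
print(distance_frame_locked(5, 60, 1))   # 300  (twice as fast at 60 FPS)
print(distance_delta_time(250, 30, 1))   # ~250
print(distance_delta_time(250, 60, 1))   # ~250 (same speed at any FPS)
```

Residual framerate effects in real engines (like the old Quake bhop quirks) come from how per-frame rounding and acceleration interact, not from movement being directly frame-locked like the first function.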

Comment over 10,000 characters. Part 2 coming.

/r/HalfLife Thread Parent