When it comes to the topic of IMAGE QUALITY, most people are lacking some vital information (and common-sense logic), which I believe can help you (and anyone who cares to learn) a whole lot.
Most people focus on the 'megapixels.' Those are basically the pixel dimensions of an image, multiplied together.
For example, 1920x1080 = 2,073,600 pixels, or basically '2 megapixels.' Meanwhile, an image with the dimensions 5472x3648 = 19,961,856 pixels, or basically 20MP for all intents and purposes.
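That megapixel arithmetic is easy to verify (a quick Python sketch, nothing camera-specific):

```python
# Megapixels are just width x height, divided by one million.
def megapixels(width: int, height: int) -> float:
    """Return an image's size in megapixels from its pixel dimensions."""
    return width * height / 1_000_000

print(megapixels(1920, 1080))   # 2.0736  -> "2 megapixels"
print(megapixels(5472, 3648))   # 19.961856 -> "20 megapixels"
```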
While those numbers are relevant, especially if you are cropping into that image… or trying to make it bigger, the reality is that it is the dpi (dots per inch) for print, or ppi (pixels per inch) for screen resolution, that is really the MOST important.
The "PI" absolutely impacts the 'quality' of the image, much faster and much more than the 'dimensions' in most cases (even when simply displaying an image, but especially when enlarging it, or when cropping it down and then displaying the crop at the original size).
If you take a small 200x200-pixel block, leave it displayed at the same size, but increase the dpi… the average person will NOT see any difference with the naked eye!
However, the 'math' and logic are simple. Think basketballs vs. softballs vs. BBs: what you can see is limited by the display, but WHY and HOW you 'can see' that image is determined by far more than just its dimensions.
Background: Most cameras capture a certain pixel dimension (their 'megapixels') and simply tag a dpi value (often 300) in the file's metadata. The camera then has basically two options: a) JPG (compressed, and the most common), or b) RAW (uncompressed). This difference plays a big part in 'color depth': JPGs store 8 bits per channel (256 shades each of red, green, and blue, for roughly 16.7 million possible colors), while RAW files typically store 12-14 bits per channel (billions of possible colors). That's a huge difference in photo-editing headroom… and in the final edited print, but little visible difference in unedited images.
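For the curious, color depth comes down to bits per channel: 8 bits gives 256 shades of each of red, green, and blue, and combining the three channels is where the 'millions of colors' figures come from (a quick Python sketch; the bit depths shown are typical values, not specific to any one camera):

```python
# Bit depth per channel determines how many distinct colors a file can encode.
def total_colors(bits_per_channel: int) -> int:
    """Colors representable with R, G, B each at the given bit depth."""
    shades_per_channel = 2 ** bits_per_channel
    return shades_per_channel ** 3

print(total_colors(8))    # 16777216 -- standard 8-bit JPG, ~16.7 million colors
print(total_colors(14))   # 4398046511104 -- a typical 14-bit RAW, ~4.4 trillion
```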
Next: how RESOLUTION impacts WEBSITE SPEED.
The old CRT (cathode-ray tube) displays were limited to a 72dpi DISPLAY, regardless of the starting image quality. Displays have a maximum of what they are capable of showing, and it's not until you drastically enlarge an image (zoom in) beyond its established dimensions that you 'see' pixelation. Somewhere I have an old full-page 'production monitor' that was a whopping 144dpi, which cost me over $5,000 back in the early 90's… just so I could gain a more accurate 'actual' view, at the size that would be printed, without losing much quality. 72+72=144, which meant I could zoom in 200%, or crop the image by 50%, and either way I'd still SEE perfect screen resolution. The actual image may or may not have been changed, depending on the process. If you think of the enlarging process like a rubber band being PULLED, it thins out the further its ends are pulled away from each other.
The newer LCD & LED displays changed things… from dpi to ppi (dots per inch vs. pixels per inch). Most people who have been 'in the industry' for years still refer to 'dots,' though 'pixels' is more correct today. Both are technically 'dots' made up of RGB (red, green, blue) sub-elements with variable intensities, which in combination can make up a visual spectrum of about 16.7 million different colors, shades, and shadows. The terms 'dpi' and 'ppi' are somewhat interchangeable, but technology jumped from the 'basketballs' (72dpi) up to soccer balls (96-120ppi) as the display-capability standard, and even the best displays still have significant room to improve… however, the limits of the human eye generally make pushing those limits less worthwhile, except for the most creative users who might actually CROP INTO an image (and still want a good, clean, sharp resolution after the crop).
I've heard some monitor and TV salespeople claim that the math is simply "the display's diagonal resolution divided by the display's diagonal size in inches." That formula does describe pixel density, but as an explanation of quality it is an over-simplified thought, failing to accept that there is an inherent limit to the SIZE of the actual pixels.
Monitor/screen manufacturers have been notorious for 'over-selling' and overstating their displays' actual capabilities for decades. Obviously 1920 BBs lined up would look significantly different than 1920 basketballs lined up… both 1920, both 'dots' (or pixels). But it is the SIZE of those dots that ultimately sets the 'resolution' limitations that particular display will have.
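That diagonal-resolution-divided-by-inches formula the salespeople quote can be sketched in a few lines of Python (a minimal illustration; the screen sizes below are hypothetical examples, not measurements of any particular product):

```python
import math

# The commonly quoted formula: diagonal resolution in pixels divided by
# diagonal screen size in inches. It describes pixel density only.
def ppi(width_px: int, height_px: int, diagonal_inches: float) -> float:
    diagonal_px = math.hypot(width_px, height_px)  # sqrt(w^2 + h^2)
    return diagonal_px / diagonal_inches

# The same 1920x1080 panel resolution on two very different screen sizes:
print(round(ppi(1920, 1080, 24), 1))   # ~91.8 ppi on a 24" desktop monitor
print(round(ppi(1920, 1080, 5.5), 1))  # ~400.5 ppi on a 5.5" phone
```

Same pixel count, wildly different density, which is exactly the basketball-vs-BB point: the physical size of each dot matters.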
There is also a factor of ‘proximity.’
How close we are to the screen. Las Vegas and Times Square in NYC prove that, with their tiled mega-screens and animated billboards. From a distance, the large panels making up a billboard on the side of a building appear as 'pixels'… the closer we get, the larger they get, and soon we can see they are a bank of screens stacked together to display the larger image.
Generally, a native 800x600 image is 800x600 on any display and any website… capable of, but limited to, displaying just 800x600 pixels' worth of detail.
Some people will claim there is absolutely no difference in image quality, regardless of dpi (ppi), unless the image is printed… which is true ONLY if the image is never, ever displayed above that 800x600 size.
Those same people are, however, absolutely wrong IF or WHEN THAT SAME IMAGE IS EXPANDED, ENLARGED, OR CROPPED INTO (expecting the 'new version' to be displayed at the same 800x600 dimensions the original was). Composition material aside, you can compress a basketball, but it's impossible to enlarge a BB (and maintain the quality, integrity, and appearance). Same with over-inflating that basketball to twice its normal size: distortion is necessarily created.
Some of the newest LED displays (i.e., true 4K) have gone up to as much as 192dpi/ppi (note: 72x3 = 216, but 96x2 = 192).
Understanding that 1920x1080 (standard Full HD quality) is about 2 megapixels, it makes sense that 4K UHD (3840x2160, about 8.3 megapixels, named for its roughly 4,000 horizontal pixels rather than its megapixel count) would be the next logical step up: double the width and double the height means four times the pixels.
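As a sanity check on those pixel counts (plain Python, using only the standard resolution figures themselves):

```python
# Pixel counts for common display standards.
def megapixel_count(width: int, height: int) -> float:
    return width * height / 1_000_000

standards = {
    "Full HD (1080p)": (1920, 1080),
    "4K UHD": (3840, 2160),
}
for name, (w, h) in standards.items():
    print(f"{name}: {megapixel_count(w, h):.2f} MP")

# 4K UHD has exactly four times the pixels of Full HD, because both the
# width and the height are doubled (2 x 2 = 4).
```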
To the best of my knowledge, there is little on the consumer desktop-display market going much beyond that range (that a normal human being can afford), though small phone screens are a notable exception, since packing the same pixel counts into a few inches pushes their density far higher. Frankly, it doesn't much matter, because at normal viewing distance the average human eye can't easily see anything 'crisper' than that (until the image is enlarged or magnified). The vast majority of the screens sold today are still down in the 96dpi range.
The dpi is most easily determined WHEN THE JPG IS CREATED…
If the JPG was created IN CAMERA… it is usually tagged at 72dpi (which is usually still great up to about 8x10 PRINT sizes, because the pixel dimensions are far higher than what an 8x10 print requires) without losing any more detail in the image than necessary. However, a RAW image can be nearly 10 times the file size of a JPG with exactly the same dimensions. CAMERA RAW is generally the highest quality most cameras are capable of capturing, and usually requires a secondary program (like Photoshop) to 'post-process' and export to a printable file type.
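The pixel-dimensions-vs-print-size rule of thumb above is easy to sketch (an illustration only; the 300dpi target is a common print convention, not a hard standard for every printer or viewing distance):

```python
# Rule of thumb: maximum "sharp" print size in inches is the pixel
# dimensions divided by the target print dpi.
def max_print_size(width_px: int, height_px: int, dpi: int = 300) -> tuple:
    """Return (width_in, height_in) for a clean print at the given dpi."""
    return (width_px / dpi, height_px / dpi)

# A 20-megapixel capture (5472x3648) comfortably covers an 8x10 print:
print(max_print_size(5472, 3648))   # (18.24, 12.16) inches at 300 dpi
```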
Without getting too geeky...