For detail: yes, because you'd be able to zoom into the picture and see details that you wouldn't be able to see with the naked eye. However, if you viewed the photo at the same angle of view the lens captured it with, that might not be the case.
As for color, that can be a bit subjective, because cameras already capture more colors than the average human eye can distinguish (our eyes can tell apart roughly 1 million colors, while cameras can record 16 million or more). The issue comes down to accuracy: people will often prefer a picture with more saturation (vibrance, colors that "pop") over one that looks exactly as their eyes saw the scene in real life.
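The "16 million" figure comes from standard 24-bit color, which allots 8 bits to each of the red, green, and blue channels. A quick sketch of the arithmetic:

```python
# Standard 24-bit color: 8 bits per channel across red, green, and blue
channels = 3
bits_per_channel = 8
total_colors = 2 ** (channels * bits_per_channel)
print(total_colors)  # 16,777,216 distinct representable colors
```

So "16 million or more" is really 2^24 ≈ 16.8 million for ordinary 8-bit-per-channel images; cameras shooting 10-bit or higher raw files can represent far more.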
It's already been done: displays can already show as much detail as your eyes and brain can perceive. From 20 feet away, almost nobody can tell the difference between a good 1080p TV and a 4K one. You can see that the pixels are different when you get up close, but from any reasonable distance our eyes are not precise enough to notice the extra detail.
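The 20-foot claim can be sanity-checked with a little geometry. A common rule of thumb puts normal visual acuity around 1 arcminute; if a single pixel subtends less than that, adjacent pixels blur together. The sketch below assumes a hypothetical 65-inch 16:9 TV (the screen size is my assumption, not from the answer above):

```python
import math

def pixel_arcminutes(diag_inches, horiz_pixels, distance_inches):
    """Angular size of one pixel, in arcminutes, on a 16:9 screen."""
    # Screen width from the diagonal of a 16:9 rectangle
    width = diag_inches * 16 / math.hypot(16, 9)
    pixel = width / horiz_pixels  # width of one pixel in inches
    return math.degrees(math.atan2(pixel, distance_inches)) * 60

# Hypothetical 65-inch TV viewed from 20 feet (240 inches)
for name, px in [("1080p", 1920), ("4K", 3840)]:
    print(f"{name}: {pixel_arcminutes(65, px, 240):.2f} arcmin per pixel")
```

Under these assumptions, even the 1080p pixel subtends well under 1 arcminute at 20 feet, so both sets of pixels are already below the acuity threshold and the extra 4K detail is invisible from that distance.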
I mean, we already have lenses with very long focal lengths that produce a strong magnification effect, allowing us to take pictures of small objects in high resolution.
We can't do the same with our eyes, since the eye can only focus so close; we have to hold an object farther away or the view is blurred. And if you hold it farther away, you can't see the fine details of the object.