Heat shouldn’t really make a computer perform worse until it fails, at which point it likely stops working completely.
However, most computers are designed with the understanding that running to failure isn’t a good idea: they have sensors to monitor temperature and limit their processing if they’re getting too hot, so they don’t fry themselves.
There may be some minor changes in performance with temperature, since electrical properties can shift as components heat up, but I don’t think any consumer-level computer would need to worry about that. It’s the self-limiting (thermal throttling) that you’ll actually notice.
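That self-limiting behavior can be sketched as a simple control loop. This is only an illustration of the idea, not real firmware: the temperatures, clock steps, and the `read` of the sensor are all made-up values for the example.

```python
# Hypothetical sketch of a thermal-throttling loop. The thresholds and
# clock steps below are illustrative, not real firmware values.

THROTTLE_AT_C = 95   # start limiting above this temperature
RESUME_AT_C = 85     # restore speed below this (hysteresis gap)

def next_clock_ghz(temp_c, current_ghz, max_ghz=4.0, min_ghz=1.0):
    """Pick the next clock speed based on the current temperature."""
    if temp_c >= THROTTLE_AT_C:
        # Too hot: step the clock down to shed heat.
        return max(min_ghz, current_ghz - 0.5)
    if temp_c <= RESUME_AT_C:
        # Cool enough again: step back toward full speed.
        return min(max_ghz, current_ghz + 0.5)
    return current_ghz  # inside the hysteresis band, hold steady

# Simulated run: the chip heats up, gets throttled, then recovers.
clock = 4.0
for temp in (70, 90, 96, 98, 92, 84, 80):
    clock = next_clock_ghz(temp, clock)
```

The gap between the two thresholds (hysteresis) is what keeps real systems from rapidly flip-flopping between full speed and throttled when the temperature hovers near the limit.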