Most of these rechargeable devices charge over USB nowadays.
USB is not only hardware; it comes with mandated protocols implemented in firmware. These include a way for the device and the charger to negotiate how much current the device will draw while charging.
Other connectors either have something similar (Thunderbolt) or have a set voltage/amperage associated with them.
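Very roughly, the negotiation works like this: the charger advertises the voltage/current combinations it can supply, and the device picks the one it can actually use. Here's a minimal sketch in Python; the profiles, names, and numbers are made up for illustration and are not the real USB Power Delivery messages.

```python
# Hypothetical sketch of a USB-style charge negotiation.
# Profiles, names, and numbers are illustrative, not the real USB-PD format.

CHARGER_PROFILES = [
    (5.0, 0.5),   # volts, max amps (basic USB fallback)
    (5.0, 3.0),
    (9.0, 2.0),
    (12.0, 1.5),
]

def negotiate(device_max_volts, device_wanted_amps):
    """Pick the profile that delivers the most usable power to the device."""
    best = (5.0, 0.5)                 # safe fallback if nothing better fits
    best_power = best[0] * best[1]
    for volts, max_amps in CHARGER_PROFILES:
        if volts > device_max_volts:
            continue                  # device can't handle this voltage
        amps = min(max_amps, device_wanted_amps)   # device only draws what it wants
        if volts * amps > best_power:
            best, best_power = (volts, amps), volts * amps
    return best

print(negotiate(device_max_volts=9.0, device_wanted_amps=2.0))   # -> (9.0, 2.0)
print(negotiate(device_max_volts=5.0, device_wanted_amps=0.5))   # -> (5.0, 0.5)
```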
There just aren’t that many configurations for charging. The current used to charge a battery has no real relation to how much current the battery delivers when the device is in use.
You can fill your bathtub by opening the tap fully or just a little, or fill it by the bucket. In the end you have a full tub, and the drain will still drain at the same speed.
So some devices that run directly off mains power, like ovens and water heaters, will absolutely burn out if you hook them up to the wrong voltage. The same goes for devices that use simple fixed transformers, like an old 24V doorbell transformer.
Most devices nowadays use smarter switching power supplies. By measuring the output voltage and switching the transformer on only long enough to supply the current being drawn, a switching supply can provide as much current as needed at a stable, arbitrary voltage.
If the mains input voltage is higher, the supply switches on for a smaller fraction of the time and draws less current; if the mains input is lower, it switches on for a larger fraction of the time and draws more current.
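As a rough back-of-the-envelope sketch (an idealized, lossless buck-style converter with made-up numbers; real mains supplies rectify first and use other topologies):

```python
# Rough sketch of how a switching supply adapts to input voltage.
# Idealized: no losses, buck-style converter where the on-time fraction ~ Vout/Vin.

def switching_supply(input_volts, output_volts=5.0, load_amps=2.0):
    duty_cycle = output_volts / input_volts      # fraction of the time "on"
    output_power = output_volts * load_amps      # what the device is drawing
    input_amps = output_power / input_volts      # power in = power out (ideal)
    return duty_cycle, input_amps

for vin in (120.0, 230.0):
    d, i_in = switching_supply(vin)
    print(f"{vin:>5.0f} V in -> on {d:.1%} of the time, drawing {i_in:.3f} A")
# 120 V in -> on 4.2% of the time, drawing 0.083 A
# 230 V in -> on 2.2% of the time, drawing 0.043 A
```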
As far as amperage goes: the wall doesn’t push amperage into a device; the device pulls it. A device will not receive any more amperage than it “wants”.
As far as voltage goes: a device that uses anything less than the wall voltage (230V or 120V depending on where you live) will have a “transformer” which reduces the voltage as it heads into the device.
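A quick worked example of both points, with made-up numbers: the transformer fixes the voltage, and the device’s own resistance fixes how much current it pulls.

```python
# Made-up numbers: the transformer fixes the voltage, Ohm's law fixes the current.

wall_volts = 120.0
transformer_output_volts = 5.0           # the "transformer" inside the charger
device_resistance_ohms = 2.5             # effective resistance of the device

device_amps = transformer_output_volts / device_resistance_ohms   # I = V / R
device_watts = transformer_output_volts * device_amps             # P = V * I
wall_amps = device_watts / wall_volts    # ignoring losses, power in = power out

print(f"Device pulls {device_amps:.1f} A at {transformer_output_volts:.0f} V "
      f"({device_watts:.0f} W), which is about {wall_amps:.3f} A from the wall.")
```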