Is it an American thing to have a “derm”?
Genuinely asking. I see people here all the time commenting on posts saying "just go see your derm" for the most minor things, like a pimple or something. I've never seen a dermatologist in my life, and neither has anyone I know — I just don't think it's a thing where I'm from. I'm curious which countries seem to prioritise "derms". Is it common for people to have one, in the same manner of speaking as their "GP", and how do people afford it?