No, one of String's constructors takes nothing but a char[]. You can pass in an arbitrary array of chars, even one that isn't valid UTF-16 (for example, one containing an unpaired surrogate).
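A minimal sketch of what that looks like (the class name and the particular surrogate value are just for illustration):

    public class LoneSurrogateDemo {
        public static void main(String[] args) {
            char[] chars = { 'a', '\uD800', 'b' };  // '\uD800' is an unpaired high surrogate
            String s = new String(chars);           // the chars are copied verbatim, no validation
            // The resulting String is not well-formed UTF-16:
            System.out.println(Character.isHighSurrogate(s.charAt(1)));  // true
        }
    }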
You're correct that well-written code should never do this, but there is no guarantee that some library you're using doesn't. You can never assume that 'new String(oldString.getBytes("UTF-8"), "UTF-8").equals(oldString)' holds, which has some unfortunate consequences if you're doing anything involving serialization and equality.
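A sketch of that failure mode, assuming the string contains an unpaired surrogate as above (again, the names are illustrative):

    import java.nio.charset.StandardCharsets;

    public class RoundTripDemo {
        public static void main(String[] args) {
            // A String holding invalid UTF-16 (a lone high surrogate).
            String original = new String(new char[] { 'a', '\uD800', 'b' });

            // getBytes() cannot encode the lone surrogate, so the default
            // encoder substitutes the replacement character '?'.
            byte[] utf8 = original.getBytes(StandardCharsets.UTF_8);
            String roundTripped = new String(utf8, StandardCharsets.UTF_8);

            System.out.println(roundTripped);                   // a?b
            System.out.println(roundTripped.equals(original));  // false
        }
    }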
I agree that Java's String API is generally quite well-designed, but the ability to access the raw UTF-16 is a very big leak in the abstraction.
If that ability were lacking, other people would be complaining about it. Abstractions should not prevent you from accessing the bits underneath; they should make it unnecessary, which they never completely succeed at, because there are always fringe use cases you didn't foresee.