Why does Oracle require you to define the maximum size of a varchar2, if a
varchar2(10) and a varchar2(4000) both use the same amount of memory to
store a given string? Is it merely for type checking, i.e., to give an
automatic error if you try to store something longer than you expected? Or
could there be some advantage to declaring a varchar2(10) rather than a
varchar2(4000)?
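A minimal sketch of the type-checking behavior in question (the table and column names here are hypothetical, purely for illustration):

```sql
CREATE TABLE t (short_col VARCHAR2(10), long_col VARCHAR2(4000));

-- Succeeds: the 5-character string fits within the declared limit
INSERT INTO t (short_col) VALUES ('hello');

-- Fails with ORA-12899: value too large for column,
-- because the string exceeds the declared 10-character limit
INSERT INTO t (short_col) VALUES ('considerably longer than ten characters');
```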

A question given to me by a developer.

Shawn
Oracle DBA