You already have some great responses here, but I'll weigh in just for the heck of it: I've also seen +24VDC and 0VDC, as well as +24VDC and -24VDC. In college I grew accustomed to using +24VDC and -24VDC, but at my place of employment we always use +24VDC and 0VDC.
In most cases the notation depends on the established practice of your immediate company or organization. The logic behind the notation, however, comes down to your chosen reference node, because a voltage reading is the potential difference V(node) − V(reference). If your reference is the +24VDC rail, you'll measure a negative voltage at the other rail. On the other hand, if your reference is real ground (in the case of a floating ground, a real ground may be assumed for the sake of consistent nomenclature), then you'll measure 0VDC on that rail. To summarize, the notation is intended to illustrate the potential difference between two points, a reference node and a source node. The reference and source nodes are often assumed based on known or inferred polarities, but it is dangerous to make that assumption.
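To make the reference-node point concrete, here's a minimal sketch (the rail names and values are hypothetical, chosen to match the two-rail supply discussed above): the same physical rail reads +24, −24, or 0 depending only on where you put your black probe.

```python
# A voltage reading is V(node) - V(reference); the reference node
# determines the sign. Hypothetical rails, relative to earth ground:
rail_hi = 24.0  # the +24VDC rail
rail_lo = 0.0   # the return rail

def measure(node: float, reference: float) -> float:
    """Potential difference between node and the chosen reference."""
    return node - reference

# Reference at the return rail (the "+24VDC and 0VDC" convention):
print(measure(rail_hi, rail_lo))  # 24.0
print(measure(rail_lo, rail_lo))  # 0.0

# Reference at the +24VDC rail: the same return rail now reads negative:
print(measure(rail_lo, rail_hi))  # -24.0
```

Nothing about the circuit changed between the two readings; only the assumed reference did, which is exactly why the labels alone can mislead.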
As Jabberwoky pointed out, 0VDC can be misleading, which is why I still personally prefer the + and - nomenclature. +24VDC and -24VDC tells me more about the system, which is useful information, especially in complicated power distribution.
I don't believe either nomenclature is wrong, but it would be interesting to know whether IEEE, UL, IEC, or NEC guidelines address this question. If any of them do, I suspect it would be as a recommended best practice rather than a requirement.