Why do reference data types point?

Consider:

[Diagram: variables drawn as orange rectangles. The primitive's rectangle holds the value itself; the reference type's rectangle holds a pointer to an object stored elsewhere in memory.]

I understand that for primitive data types, the memory allocated (orange rectangle) contains the thing you want, but for reference data types, the memory allocated contains a reference/pointer to the thing that you want.

Why design it like this? For things that are usually reference data types (Objects, Arrays etc.), why not just put the thing you want in the orange rectangle?

Jon Skeet

How big is that orange rectangle going to be?

For primitives, you already know the size. But what about for objects? For concrete final classes, you'd know in advance how much memory is going to be used... but what about other cases?

For example:

InputStream x = new FileInputStream("foo");

If the variable x has to contain all the fields of the object (and know what type it is), then it's got to be big enough for all the FileInputStream fields. Okay, in this case we can manage that, although it's slightly odd for the size of an InputStream variable to be defined by its usage. What about this:

InputStream x = getInputStream("foo");

The compiler can't know what type of object would be returned by getInputStream - so how can it know how big x is going to be? When the value of x is a reference, it's much simpler - it's the same size regardless of the actual type of the object it refers to.
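To make that concrete, here's a minimal sketch of such a factory (the getInputStream below, its class wrapper and its branching are my own hypothetical illustration, not from the original question). The concrete subclass is only picked at runtime, so the compiler sees nothing but InputStream:

import java.io.*;
import java.util.zip.GZIPInputStream;

class StreamFactory {
    // Hypothetical factory: the caller's compiler only knows it gets
    // *some* InputStream; the concrete subclass (and hence its field
    // layout and size) is decided here at runtime.
    static InputStream getInputStream(String name) throws IOException {
        if (name.endsWith(".gz")) {
            return new GZIPInputStream(new FileInputStream(name));
        }
        return new FileInputStream(name);
    }
}

A fixed-size reference sidesteps the whole problem: x is the same size whether it ends up referring to a FileInputStream or a GZIPInputStream.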

Then of course you want to be able to share objects between various "users" of that object, which is where using references becomes very efficient. (We don't have to copy all the fields every time we pass a value around - we just copy the reference.) Treating objects as values instead of references changes the semantics a lot. For example:

// A simple mutable type with get/setValue doing the obvious thing
class MutableType {
    private int value;
    void setValue(int v) { value = v; }
    int getValue() { return value; }
}
MutableType x = new MutableType();
x.setValue(5);
MutableType y = x;   // copies the reference, not the object
x.setValue(10);
System.out.println(y.getValue());   // prints 10

With references, this prints 10 because x and y are just references to the same object. With value type semantics, presumably it would print 5 instead. Not unreasonable - but a very big change in how the language works.
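Java's primitives already behave with those value semantics, which makes the contrast easy to see:

int a = 5;
int b = a;              // copies the value itself, not a reference
a = 10;
System.out.println(b);  // prints 5: b is an independent copy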

Then there are other issues besides, but I think that's enough to start with :)

Of course, C++ has answers to all of these things, but the designers of Java deemed it simpler to just use references for all non-primitive types.


See more on this question at Stack Overflow