Does a MemoryBarrier guarantee memory visibility for all memory?

If I understand correctly, in C#, a lock block guarantees exclusive access to a set of instructions, but it also guarantees that any reads from memory reflect the latest version of that memory in any CPU cache. We think of lock blocks as protecting the variables read and modified within the block, which means:

  1. Assuming you've properly implemented locking where necessary, those variables can only be read and written to by one thread at a time, and
  2. Reads within the lock block see the latest versions of a variable and writes within the lock block become visible to all threads.

(Right?)

This second point is what interests me. Is there some magic by which only variables read and written in code protected by the lock block are guaranteed fresh, or do the memory barriers employed in the implementation of lock guarantee that all memory is now equally fresh for all threads? Pardon my mental fuzziness here about how caches work, but I've read that caches hold several multi-byte "lines" of data. I think what I'm asking is, does a memory barrier force synchronization of all "dirty" cache lines or just some, and if just some, what determines which lines get synchronized?
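To make that concrete, here's a rough sketch of the kind of code I have in mind (the field and method names are just made up for illustration; assume the surrounding class and a using System.Threading; directive). What I'm asking is whether the full fence from Thread.MemoryBarrier() publishes everything this thread has read and written, or only some subset of it:

private int result;
private bool completed;

public void Produce()
{
    result = 42;                // ordinary write
    Thread.MemoryBarrier();     // full fence: writes above can't be reordered past writes below
    completed = true;           // flag write happens after the result write is published
}

public void Consume()
{
    if (completed)
    {
        Thread.MemoryBarrier(); // full fence: the read of result below can't be satisfied
                                // by a value cached from before the flag was observed
        Console.WriteLine(result);
    }
}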

Jon Skeet

"Reads within the lock block see the latest versions of a variable and writes within the lock block become visible to all threads."

No, that's definitely a harmful oversimplification.

When you enter the lock statement, there's a memory fence which sort of means that you'll always read "fresh" data. When you exit the lock statement, there's a memory fence which sort of means that all the data you've written is guaranteed to be written to main memory and available to other threads.
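To see where those fences live, it helps to remember that lock is essentially sugar over Monitor.Enter and Monitor.Exit. Very roughly (this is a sketch of the expansion, not the exact code the compiler emits), lock (padlock) { x++; } becomes:

bool lockTaken = false;
try
{
    // Acquire: reads inside the block can't appear to happen
    // before the lock has been taken.
    Monitor.Enter(padlock, ref lockTaken);

    x++;
}
finally
{
    if (lockTaken)
    {
        // Release: writes inside the block are published before the
        // lock is released, so the next thread to acquire padlock sees them.
        Monitor.Exit(padlock);
    }
}

It's the pairing of an exit on one thread with a later enter on another thread that gives you the visibility guarantee.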

The important point is that if multiple threads only ever read/write memory when they "own" a particular lock, then by definition one of them will have exited the lock before the next one enters it... so all those reads and writes will be simple and correct.

If you have code which reads and writes a variable without taking a lock, then there's no guarantee that it will "see" data written by well-behaved code (i.e. code using the lock), or that well-behaved threads will "see" the data written by that bad code.

For example:

private readonly object padlock = new object();
private int x;

public void A()
{
    lock (padlock)
    {
        // Will see changes made in A and B; may not see changes made in C
        x++;
    }
}

public void B()
{
    lock (padlock)
    {
        // Will see changes made in A and B; may not see changes made in C
        x--;
    }
}

public void C()
{
    // Might not see changes made in A, B, or C. Changes made here
    // might not be visible in other threads calling A, B or C.
    x = x + 10;
}

Now it's more subtle than that, but that's why using a common lock to protect a set of variables works.
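For completeness, the fix for C is simply to take the same lock. A rough sketch (the method name is mine):

public void FixedC()
{
    lock (padlock)
    {
        // Now sees changes made in A and B, and this write is
        // visible to the next thread that takes padlock.
        x = x + 10;
    }
}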


See more on this question at Stackoverflow