Bug in SoftReference?

  1. Bug in SoftReference? (6 messages)

    Hello,

    I've got a problem with SoftReferences. According to the specification, all soft references must be cleared before an OutOfMemoryError is thrown, but that is not the case here. The code below throws OutOfMemoryError with -Xmx32m. When I change SoftReference to WeakReference, everything works fine. Does anybody know whether this is a bug in the JRE or whether there is something wrong with my code?

    Code:

    import java.lang.ref.Reference;
    import java.lang.ref.SoftReference;
    import java.util.Collections;
    import java.util.HashMap;
    import java.util.Iterator;
    import java.util.LinkedList;
    import java.util.Map;

    public class Test {
        private byte[] buf = new byte[100000]; // ~100 KB payload per instance
        private Integer key;
        private Map map;

        // Remove this object's map entry when it is finalized.
        protected void finalize() throws Throwable {
            map.remove(key);
            super.finalize();
        }

        public static void main(String[] args) throws InterruptedException {
            Map map = new HashMap();

            // Create 1000 softly-referenced Test instances (~100 MB in total),
            // far more than fits in a 32 MB heap.
            for (int i = 0; i < 1000; i++) {
                Test test = new Test();
                test.key = new Integer(i);
                test.map = map;

                map.put(test.key, new SoftReference(test));
            }

            // Print the entries that survived; the loop above should have forced
            // the GC to clear soft references rather than throw OutOfMemoryError.
            LinkedList keyList = new LinkedList(map.keySet());
            Collections.sort(keyList);

            Iterator it = keyList.iterator();
            while (it.hasNext()) {
                Object o = it.next();
                synchronized (map) {
                    if (map.containsKey(o)) {
                        System.out.println(o + " -> " + ((Reference) map.get(o)).get());
                    }
                }
            }
        }
    }

  2. Bug in SoftReference?

    Hi,
    It might be a JVM thing, because it works on some JVMs and doesn't on others. Here are the results:
    It does not work on Sun JDK 1.4.2_03 Linux, Sun JDK 1.5.0 Linux, Blackdown-1.4.2-rc1 Linux, or Sun JDK 1.4.2_02 Windows.
    It works on BEA JRockit 1.4.2_03-b02 Linux and IBM 1.2.2 Windows.
    However, if you comment out the finalize method, it works on all the platforms I have mentioned.
    I don't know if this is a bug or a feature; I am looking forward to finding out.
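
    For reference, here is a minimal, untested sketch of how the map cleanup could be kept without finalize(), using a ReferenceQueue instead (the KeyedSoftReference class and the drain-on-insert approach are made up for illustration, not something tested on the JVMs listed above):

    Code:

    import java.lang.ref.Reference;
    import java.lang.ref.ReferenceQueue;
    import java.lang.ref.SoftReference;
    import java.util.HashMap;
    import java.util.Map;

    public class QueueCleanupTest {
        public static void main(String[] args) {
            Map map = new HashMap();
            ReferenceQueue queue = new ReferenceQueue();

            for (int i = 0; i < 1000; i++) {
                Integer key = new Integer(i);
                // The reference remembers its key so its map entry can be removed
                // once the GC clears and enqueues it.
                map.put(key, new KeyedSoftReference(key, new byte[100000], queue));

                // Drain references the GC has already cleared, instead of relying
                // on a finalizer to do the cleanup.
                Reference cleared;
                while ((cleared = queue.poll()) != null) {
                    map.remove(((KeyedSoftReference) cleared).key);
                }
            }
            System.out.println("entries still mapped: " + map.size());
        }

        static class KeyedSoftReference extends SoftReference {
            final Object key;

            KeyedSoftReference(Object key, Object value, ReferenceQueue queue) {
                super(value, queue);
                this.key = key;
            }
        }
    }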
    Best regards, Mircea
  3. Bug in SoftReference?

    Hello,

    If you change 100000 to 1000000 in private byte[] buf=new byte[100000];,
    it should fail even with finalize() commented out. I'm under the impression that clearing SoftReferences does not suspend the thread that creates them, and that is why it fails with OutOfMemoryError: the main thread pollutes the heap faster than the garbage collector can clear the references.
    ---
    Regards, Pavel.
  4. Bug in SoftReference?

    > I'm under the impression that clearing SoftReferences does not suspend the
    > thread that creates them, and that is why it fails with OutOfMemoryError:
    > the main thread pollutes the heap faster than the garbage collector can
    > clear the references.

    That's not a bug, it's a feature. I'm sure you do not want the garbage collector to stop your worker threads completely when it is not necessary (as with minor garbage collections). If you need this behaviour, use System.gc() (which triggers a full, "stop-the-world" garbage collection), but then you can stop using SoftReferences altogether and watch your application's performance degrade to death.
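
    For illustration, here is a rough sketch of that approach applied to the original test: periodically requesting a full collection from the allocating loop so the collector gets a chance to clear softly-reachable objects before the heap fills. The every-100-iterations interval is an arbitrary assumption, and whether this actually keeps the test alive will depend on the JVM:

    Code:

    import java.lang.ref.SoftReference;
    import java.util.HashMap;
    import java.util.Map;

    public class ExplicitGcTest {
        public static void main(String[] args) {
            Map map = new HashMap();
            for (int i = 0; i < 1000; i++) {
                // Periodically request a full, stop-the-world collection so the
                // collector has a chance to clear softly-reachable objects before
                // the heap fills. Each such pause is the performance cost
                // mentioned above.
                if (i % 100 == 0) {
                    System.gc();
                }
                map.put(new Integer(i), new SoftReference(new byte[100000]));
            }
            System.out.println("done, map size = " + map.size());
        }
    }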

    Is your interest in this subject purely of an academic nature or do you have a business case where you need a solution to this problem?

    Cheers, Lars
  5. Bug in SoftReference

    If it is a feature, then the Java specification should not say: "All soft references to softly-reachable objects are guaranteed to have been cleared before the virtual machine throws an OutOfMemoryError." I don't want the GC to stop my threads every time, but I do want it to stop my thread and clean memory before throwing OutOfMemoryError, as promised.

    Calling gc manually kills performance.

    I have a business case. I need a cache of parsed Java files which can be reparsed from disk, but I want to keep them in memory for as long as possible. I used SoftReferences and ran into OutOfMemoryError on projects bigger than one thousand files. Now I have to use WeakReference, but the GC makes no effort to keep weakly-referenced objects in memory for as long as it can.
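
    For concreteness, here is a minimal sketch of the kind of cache described above, assuming a hypothetical parse(String) method that re-reads and parses a file from disk (none of these names come from the actual project):

    Code:

    import java.lang.ref.SoftReference;
    import java.util.HashMap;
    import java.util.Map;

    public class ParsedFileCache {
        private final Map cache = new HashMap(); // file name -> SoftReference

        public synchronized Object get(String fileName) {
            SoftReference ref = (SoftReference) cache.get(fileName);
            Object parsed = (ref != null) ? ref.get() : null;
            if (parsed == null) {                 // never cached, or cleared by the GC
                parsed = parse(fileName);         // hypothetical reparse from disk
                cache.put(fileName, new SoftReference(parsed));
            }
            return parsed;
        }

        // Stand-in for the real parser; assumed for illustration only.
        private Object parse(String fileName) {
            return "parsed contents of " + fileName;
        }
    }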

    I posted a bug report to Sun, but I don't expect them to fix it any time soon.
  6. I got a confirmation from Sun that this is a bug
  7. Sun confirmed that this is a bug

    Is there a formal report of the bug? I am currently pursuing several different paths for implementing a memory cache. We are using JDK 1.3, so I would like to see a formal acknowledgement from Sun accepting the bug, so that I could document it in my report as justification for not using this approach.

    Thanks
    Sy