Wednesday, November 30, 2016

[ Java 文章收集 ] HashMap Vs. ConcurrentHashMap Vs. SynchronizedMap – How a HashMap can be Synchronized

Source From Here
Preface
HashMap is a very powerful data structure in Java. We use it every day and in almost all applications. I have written a few examples before, such as how to implement a thread-safe cache and how to convert a HashMap to an ArrayList. We used HashMap in both of those examples, but they were pretty simple use cases. Keep in mind that HashMap is a non-synchronized collection class.


Do you have any of the below questions?
* What’s the difference between ConcurrentHashMap and Collections.synchronizedMap(Map)?
* What’s the difference between ConcurrentHashMap and Collections.synchronizedMap(Map) in terms of performance?
* ConcurrentHashMap vs. Collections.synchronizedMap(Map)
* Popular HashMap and ConcurrentHashMap interview questions

In this tutorial we will go over all of the above questions, and the reasons why and how we can synchronize a HashMap.

Why?
A Map is an associative container that stores elements formed by the combination of a unique key and a mapped value. If you have a highly concurrent application in which you modify or read key/value pairs from different threads, then ConcurrentHashMap is ideal. The best example is a producer-consumer setup, which handles concurrent reads and writes.

So what does a thread-safe Map mean? If multiple threads access a hash map concurrently, and at least one of the threads modifies the map structurally, the map must be synchronized externally to avoid an inconsistent view of its contents.
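To make the external-synchronization requirement concrete, here is a minimal sketch (the class name, key, and iteration counts are made up for illustration): two threads increment the same key in a plain HashMap, with every access guarded by a shared lock. Without the synchronized block, updates could be lost or the map's internal structure corrupted.

```java
import java.util.HashMap;
import java.util.Map;

public class ExternalSyncSketch {

    // Two threads each apply 10,000 increments to the same key, with every
    // access guarded by one shared lock. Dropping the synchronized block
    // would make the result unpredictable (lost updates, or worse).
    static int concurrentIncrement() {
        final Map<String, Integer> map = new HashMap<>();
        final Object lock = new Object();

        Runnable task = () -> {
            for (int i = 0; i < 10_000; i++) {
                synchronized (lock) {            // external synchronization
                    map.merge("counter", 1, Integer::sum);
                }
            }
        };

        Thread t1 = new Thread(task);
        Thread t2 = new Thread(task);
        t1.start();
        t2.start();
        try {
            t1.join();
            t2.join();
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
        }
        return map.get("counter");
    }

    public static void main(String[] args) {
        System.out.println(concurrentIncrement()); // always 20000 with the lock in place
    }
}
```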

How?
There are two ways we can synchronize a HashMap:
1. Java Collections synchronizedMap() method
2. Use ConcurrentHashMap

- HashMap Vs. synchronizedMap Vs. ConcurrentHashMap
  // Hashtable
  Map normalMap = new Hashtable();

  // synchronizedMap
  Map synchronizedHashMap = Collections.synchronizedMap(new HashMap());

  // ConcurrentHashMap
  Map concurrentHashMap = new ConcurrentHashMap();
ConcurrentHashMap
* You should use ConcurrentHashMap when you need very high concurrency in your project.
* It is thread-safe without synchronizing the whole map.
* Reads can happen very fast, while writes are done with a lock.
* There is no locking at the object level.
* The locking is at a much finer granularity: the hash-map bucket level.
* ConcurrentHashMap doesn’t throw a ConcurrentModificationException if one thread tries to modify it while another is iterating over it.
* ConcurrentHashMap uses a multitude of locks.
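The no-ConcurrentModificationException point can be demonstrated even in a single thread, because ConcurrentHashMap's iterators are weakly consistent rather than fail-fast. A small sketch (the class name is made up for illustration):

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

public class WeaklyConsistentSketch {

    // Structurally modify the map while iterating over it. On a
    // ConcurrentHashMap this is legal: the iterator is weakly consistent,
    // so no ConcurrentModificationException is thrown.
    static int modifyDuringIteration() {
        Map<String, Integer> map = new ConcurrentHashMap<>();
        map.put("a", 1);
        map.put("b", 2);

        for (String key : map.keySet()) {
            map.put("c", 3);   // structural modification mid-iteration
        }
        return map.size();
    }

    public static void main(String[] args) {
        System.out.println(modifyDuringIteration()); // 3 — the put succeeded, no exception
    }
}
```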

SynchronizedHashMap
* Synchronization at the object level.
* Every read/write operation needs to acquire the lock.
* Locking the entire collection is a performance overhead.
* This essentially gives access to only one thread for the entire map and blocks all the other threads.
* It may cause contention.
* SynchronizedHashMap returns an Iterator, which fails fast on concurrent modification.
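The fail-fast behavior means a map returned by Collections.synchronizedMap only synchronizes individual calls; iteration still uses the backing HashMap's fail-fast iterator, so you must hold the map's own monitor while iterating (this is documented in the javadoc). A sketch with a made-up class name:

```java
import java.util.Collections;
import java.util.ConcurrentModificationException;
import java.util.HashMap;
import java.util.Map;

public class FailFastSketch {

    // Modifying a synchronizedMap while iterating over it (even from the
    // same thread) trips the backing HashMap's fail-fast iterator.
    static boolean modifyDuringIterationThrows() {
        Map<String, Integer> map = Collections.synchronizedMap(new HashMap<>());
        map.put("a", 1);
        map.put("b", 2);

        try {
            for (String key : map.keySet()) {
                map.put("c", 3);   // structural modification mid-iteration
            }
        } catch (ConcurrentModificationException e) {
            return true;           // fail-fast iterator detected the change
        }
        return false;
    }

    public static void main(String[] args) {
        System.out.println(modifyDuringIterationThrows()); // true

        // Correct pattern from the javadoc: hold the map's monitor while iterating.
        Map<String, Integer> map = Collections.synchronizedMap(new HashMap<>());
        map.put("a", 1);
        synchronized (map) {
            for (String key : map.keySet()) {
                System.out.println(key);
            }
        }
    }
}
```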

Now let’s take a look at the code:
1. Create class CrunchifyConcurrentHashMapVsSynchronizedMap.java
2. Create an object for each of Hashtable, SynchronizedMap and ConcurrentHashMap
3. Add and retrieve 500k entries per thread from each Map
4. Measure the start and end time and display the total time in milliseconds
5. We will use ExecutorService to run 5 threads in parallel
- CrunchifyConcurrentHashMapVsSynchronizedMap.java
  package crunchify.com.tutorials;

  import java.util.Collections;
  import java.util.HashMap;
  import java.util.Hashtable;
  import java.util.Map;
  import java.util.concurrent.ConcurrentHashMap;
  import java.util.concurrent.ExecutorService;
  import java.util.concurrent.Executors;
  import java.util.concurrent.TimeUnit;

  /**
   * @author Crunchify.com
   */
  public class CrunchifyConcurrentHashMapVsSynchronizedMap {

      public final static int THREAD_POOL_SIZE = 5;

      public static Map<String, Integer> crunchifyHashTableObject = null;
      public static Map<String, Integer> crunchifySynchronizedMapObject = null;
      public static Map<String, Integer> crunchifyConcurrentHashMapObject = null;

      public static void main(String[] args) throws InterruptedException {

          // Test with Hashtable Object
          crunchifyHashTableObject = new Hashtable<>();
          crunchifyPerformTest(crunchifyHashTableObject);

          // Test with synchronizedMap Object
          crunchifySynchronizedMapObject = Collections.synchronizedMap(new HashMap<String, Integer>());
          crunchifyPerformTest(crunchifySynchronizedMapObject);

          // Test with ConcurrentHashMap Object
          crunchifyConcurrentHashMapObject = new ConcurrentHashMap<>();
          crunchifyPerformTest(crunchifyConcurrentHashMapObject);
      }

      public static void crunchifyPerformTest(final Map<String, Integer> crunchifyThreads) throws InterruptedException {

          System.out.println("Test started for: " + crunchifyThreads.getClass());
          long averageTime = 0;
          for (int i = 0; i < 5; i++) {

              long startTime = System.nanoTime();
              ExecutorService crunchifyExServer = Executors.newFixedThreadPool(THREAD_POOL_SIZE);

              for (int j = 0; j < THREAD_POOL_SIZE; j++) {
                  crunchifyExServer.execute(new Runnable() {
                      @SuppressWarnings("unused")
                      @Override
                      public void run() {

                          for (int k = 0; k < 500000; k++) {
                              Integer crunchifyRandomNumber = (int) Math.ceil(Math.random() * 550000);

                              // Retrieve value. We are not using it anywhere
                              Integer crunchifyValue = crunchifyThreads.get(String.valueOf(crunchifyRandomNumber));

                              // Put value
                              crunchifyThreads.put(String.valueOf(crunchifyRandomNumber), crunchifyRandomNumber);
                          }
                      }
                  });
              }

              // Make sure executor stops
              crunchifyExServer.shutdown();

              // Blocks until all tasks have completed execution after a shutdown request
              crunchifyExServer.awaitTermination(Long.MAX_VALUE, TimeUnit.DAYS);

              long endTime = System.nanoTime();
              long totalTime = (endTime - startTime) / 1000000L;
              averageTime += totalTime;
              System.out.println("2500K entries added/retrieved in " + totalTime + " ms");
          }
          System.out.println("For " + crunchifyThreads.getClass() + " the average time is " + averageTime / 5 + " ms\n");
      }
  }
One possible execution result:
Test started for: class java.util.Hashtable
2500K entries added/retrieved in 1432 ms
2500K entries added/retrieved in 1425 ms
2500K entries added/retrieved in 1373 ms
2500K entries added/retrieved in 1369 ms
2500K entries added/retrieved in 1438 ms
For class java.util.Hashtable the average time is 1407 ms

Test started for: class java.util.Collections$SynchronizedMap
2500K entries added/retrieved in 1431 ms
2500K entries added/retrieved in 1460 ms
2500K entries added/retrieved in 1387 ms
2500K entries added/retrieved in 1456 ms
2500K entries added/retrieved in 1406 ms
For class java.util.Collections$SynchronizedMap the average time is 1428 ms

Test started for: class java.util.concurrent.ConcurrentHashMap
2500K entries added/retrieved in 413 ms
2500K entries added/retrieved in 351 ms
2500K entries added/retrieved in 427 ms
2500K entries added/retrieved in 337 ms
2500K entries added/retrieved in 339 ms
For class java.util.concurrent.ConcurrentHashMap the average time is 373 ms <== Much faster
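One caveat worth noting, since it comes up in the interview questions mentioned earlier: even ConcurrentHashMap only makes each individual call thread-safe. A check-then-act sequence written as two separate calls is still a race, so compound operations should use the atomic single-call methods. A minimal sketch (the class name and key are made up for illustration):

```java
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.ConcurrentMap;

public class AtomicCompoundOps {

    // Insert-if-absent and read-modify-write done with the atomic
    // single-call methods, instead of a racy containsKey/get + put pair.
    static int incrementOnce() {
        ConcurrentMap<String, Integer> map = new ConcurrentHashMap<>();

        map.putIfAbsent("hits", 0);             // atomic insert-if-absent
        map.compute("hits", (k, v) -> v + 1);   // atomic read-modify-write

        return map.get("hits");
    }

    public static void main(String[] args) {
        System.out.println(incrementOnce()); // 1
    }
}
```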

