Preface
HashMap is a very powerful data structure in Java. We use it every day and in almost all applications. There are quite a few examples which I have written before, such as How to Implement a Threadsafe Cache and How to Convert HashMap to ArrayList. We used HashMap in both of those examples, but they are pretty simple use cases of HashMap.
HashMap is a non-synchronized collection class.
Have you ever wondered how and why to synchronize a HashMap? In this tutorial we will go over exactly that: the reasons why, and the ways how, we could synchronize a HashMap.
Why?
The Map object is an associative container that stores elements formed by a combination of a uniquely identifying key and a mapped value.
If you have a highly concurrent application in which you may want to modify or read key-value pairs from different threads, then it is ideal to use ConcurrentHashMap. The best example is a producer-consumer setup, which handles concurrent reads and writes.
So what does a thread-safe Map mean?
If multiple threads access a hash map concurrently, and at least one of the threads modifies the map structurally, it must be synchronized externally to avoid an inconsistent view of the contents.
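To make the risk concrete, here is a minimal sketch (the class and variable names are my own illustration, not part of the original tutorial) in which two threads put entries into a plain HashMap with no synchronization at all. Both threads write the same 100,000 keys, yet the final size is unpredictable and the map's internal structure may even end up corrupted:
- import java.util.HashMap;
- import java.util.Map;
-
- public class CrunchifyUnsafeHashMapSketch {
- public static void main(String[] args) throws InterruptedException {
- // A plain, non-synchronized HashMap shared by two writer threads
- final Map<Integer, Integer> crunchifyUnsafeMap = new HashMap<>();
-
- Runnable crunchifyWriter = new Runnable() {
- @Override
- public void run() {
- for (int i = 0; i < 100000; i++) {
- crunchifyUnsafeMap.put(i, i); // structural modification without any lock
- }
- }
- };
-
- Thread crunchifyThread1 = new Thread(crunchifyWriter);
- Thread crunchifyThread2 = new Thread(crunchifyWriter);
- crunchifyThread1.start();
- crunchifyThread2.start();
- crunchifyThread1.join();
- crunchifyThread2.join();
-
- // Both threads wrote keys 0..99999, so the size should be 100000,
- // but without external synchronization the printed value is unpredictable
- System.out.println("Size after concurrent puts: " + crunchifyUnsafeMap.size());
- }
- }
Wrapping the same map with Collections.synchronizedMap() or replacing it with a ConcurrentHashMap makes the result deterministic, which is exactly what the next section covers.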
How?
There are two ways we could synchronize a HashMap: wrap it with Collections.synchronizedMap(), or replace it with a ConcurrentHashMap. The legacy, fully synchronized Hashtable is included below as a third point of comparison:
- // HashMap Vs. synchronizedMap Vs. ConcurrentHashMap
-
- // Legacy, fully synchronized map
- Map<String, Integer> normalMap = new Hashtable<>();
-
- // Synchronized wrapper around a plain HashMap
- Map<String, Integer> synchronizedHashMap = Collections.synchronizedMap(new HashMap<String, Integer>());
-
- // Thread-safe map with fine-grained (per-bucket) locking for writes
- Map<String, Integer> concurrentHashMap = new ConcurrentHashMap<>();
ConcurrentHashMap
* You should use ConcurrentHashMap when you need very high concurrency in your project.
* It is thread safe without synchronizing the whole map.
* Reads can happen very fast, while writes are done with a lock.
* There is no locking at the object level.
* The locking is at a much finer granularity at a hashmap bucket level.
* ConcurrentHashMap doesn’t throw a ConcurrentModificationException if one thread tries to modify it while another is iterating over it (see the sketch after this list).
* ConcurrentHashMap uses a multitude of locks.
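To see this in action, here is a minimal sketch (the class and key names are my own illustration): the loop removes and adds entries while iterating over a ConcurrentHashMap, and no ConcurrentModificationException is thrown because its iterators are weakly consistent. A plain HashMap would typically blow up on the same code:
- import java.util.Map;
- import java.util.concurrent.ConcurrentHashMap;
-
- public class CrunchifyConcurrentIterationSketch {
- public static void main(String[] args) {
- Map<String, Integer> crunchifyMap = new ConcurrentHashMap<>();
- for (int i = 0; i < 5; i++) {
- crunchifyMap.put("key" + i, i);
- }
-
- // Structurally modify the map while iterating over it:
- // ConcurrentHashMap's weakly consistent iterator keeps going without complaint
- for (String crunchifyKey : crunchifyMap.keySet()) {
- if (crunchifyKey.equals("key2")) {
- crunchifyMap.remove("key2");
- crunchifyMap.put("newKey", 99);
- }
- System.out.println(crunchifyKey + " -> " + crunchifyMap.get(crunchifyKey));
- }
-
- System.out.println("Final size: " + crunchifyMap.size());
- }
- }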
SynchronizedHashMap
* Synchronization at Object level.
* Every read/write operation needs to acquire the lock.
* Locking the entire collection is a performance overhead.
* This essentially gives only one thread access to the entire map and blocks all the other threads.
* It may cause contention.
* SynchronizedHashMap returns iterators that fail fast on concurrent modification (see the sketch below).
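One practical consequence: individual put() and get() calls on a synchronized map are locked for you, but the Javadoc of Collections.synchronizedMap() requires you to synchronize manually on the returned map while iterating over any of its views; otherwise the fail-fast iterator may throw a ConcurrentModificationException. A minimal sketch of that pattern (the class name is my own illustration):
- import java.util.Collections;
- import java.util.HashMap;
- import java.util.Map;
-
- public class CrunchifySynchronizedMapIterationSketch {
- public static void main(String[] args) {
- Map<String, Integer> crunchifySyncMap = Collections.synchronizedMap(new HashMap<String, Integer>());
- for (int i = 0; i < 5; i++) {
- crunchifySyncMap.put("key" + i, i);
- }
-
- // put()/get() are synchronized internally, but iteration is not:
- // we must hold the map's own lock for the whole traversal
- synchronized (crunchifySyncMap) {
- for (Map.Entry<String, Integer> crunchifyEntry : crunchifySyncMap.entrySet()) {
- System.out.println(crunchifyEntry.getKey() + " -> " + crunchifyEntry.getValue());
- }
- }
- }
- }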
Now let’s take a look at the code:
1. Create class CrunchifyConcurrentHashMapVsSynchronizedMap.java
2. Create an object for each of Hashtable, SynchronizedMap and ConcurrentHashMap
3. Add and retrieve 500K entries from each Map
4. Measure start and end time and display the time in milliseconds
5. We will use ExecutorService to run 5 threads in parallel
- CrunchifyConcurrentHashMapVsSynchronizedMap.java
- package crunchify.com.tutorials;
-
- import java.util.Collections;
- import java.util.HashMap;
- import java.util.Hashtable;
- import java.util.Map;
- import java.util.concurrent.ConcurrentHashMap;
- import java.util.concurrent.ExecutorService;
- import java.util.concurrent.Executors;
- import java.util.concurrent.TimeUnit;
-
- // Benchmark: 5 worker threads each perform 500K get/put operations against
- // Hashtable, Collections.synchronizedMap and ConcurrentHashMap in turn;
- // each test runs 5 times and the average time in ms is printed.
- public class CrunchifyConcurrentHashMapVsSynchronizedMap {
-
- public final static int THREAD_POOL_SIZE = 5;
-
- public static Map<String, Integer> crunchifyHashTableObject = null;
- public static Map<String, Integer> crunchifySynchronizedMapObject = null;
- public static Map<String, Integer> crunchifyConcurrentHashMapObject = null;
-
- public static void main(String[] args) throws InterruptedException {
-
-
- // Test 1: Hashtable (legacy, fully synchronized)
- crunchifyHashTableObject = new Hashtable<>();
- crunchifyPerformTest(crunchifyHashTableObject);
-
-
- // Test 2: HashMap wrapped with Collections.synchronizedMap()
- crunchifySynchronizedMapObject = Collections.synchronizedMap(new HashMap<String, Integer>());
- crunchifyPerformTest(crunchifySynchronizedMapObject);
-
-
- // Test 3: ConcurrentHashMap
- crunchifyConcurrentHashMapObject = new ConcurrentHashMap<>();
- crunchifyPerformTest(crunchifyConcurrentHashMapObject);
-
- }
-
- public static void crunchifyPerformTest(final Map<String, Integer> crunchifyThreads) throws InterruptedException {
-
- System.out.println("Test started for: " + crunchifyThreads.getClass());
- long averageTime = 0;
- for (int i = 0; i < 5; i++) {
-
- long startTime = System.nanoTime();
- ExecutorService crunchifyExServer = Executors.newFixedThreadPool(THREAD_POOL_SIZE);
-
- for (int j = 0; j < THREAD_POOL_SIZE; j++) {
- crunchifyExServer.execute(new Runnable() {
- @SuppressWarnings("unused")
- @Override
- public void run() {
-
- for (int i = 0; i < 500000; i++) {
- Integer crunchifyRandomNumber = (int) Math.ceil(Math.random() * 550000);
-
- // Retrieve the value for a random key (null if not inserted yet)
- Integer crunchifyValue = crunchifyThreads.get(String.valueOf(crunchifyRandomNumber));
-
- // Insert (or overwrite) the random key
- crunchifyThreads.put(String.valueOf(crunchifyRandomNumber), crunchifyRandomNumber);
- }
- }
- });
- }
-
-
- // Stop accepting new tasks...
- crunchifyExServer.shutdown();
-
- // ...and block until all submitted tasks have finished
- crunchifyExServer.awaitTermination(Long.MAX_VALUE, TimeUnit.DAYS);
-
- long endTime = System.nanoTime();
- long totalTime = (endTime - startTime) / 1000000L;
- averageTime += totalTime;
- System.out.println("500K entries added/retrieved in " + totalTime + " ms");
- }
- System.out.println("For " + crunchifyThreads.getClass() + " the average time is " + averageTime / 5 + " ms\n");
- }
- }
One possible execution result:
Test started for: class java.util.Hashtable
500K entries added/retrieved in 1432 ms
500K entries added/retrieved in 1425 ms
500K entries added/retrieved in 1373 ms
500K entries added/retrieved in 1369 ms
500K entries added/retrieved in 1438 ms
For class java.util.Hashtable the average time is 1407 ms
Test started for: class java.util.Collections$SynchronizedMap
500K entries added/retrieved in 1431 ms
500K entries added/retrieved in 1460 ms
500K entries added/retrieved in 1387 ms
500K entries added/retrieved in 1456 ms
500K entries added/retrieved in 1406 ms
For class java.util.Collections$SynchronizedMap the average time is 1428 ms
Test started for: class java.util.concurrent.ConcurrentHashMap
500K entries added/retrieved in 413 ms
500K entries added/retrieved in 351 ms
500K entries added/retrieved in 427 ms
500K entries added/retrieved in 337 ms
500K entries added/retrieved in 339 ms
For class java.util.concurrent.ConcurrentHashMap the average time is 373 ms <== Much faster