Saturday, 31 August 2013

Using ConcurrentHashMap efficiently?

I have an Android application whose core component is a
HashMap<String, float[]>. The system has high concurrency. For example, here
are three situations that occur frequently and overlap heavily:

1. Iterate through all the keys in the HashMap and do some operation on each value (read-only operations).
2. Add new key/value pairs to the HashMap.
3. Remove certain keys from the HashMap.
I do all these operations in different threads and thus am using a
ConcurrentHashMap, since some inconsistency in retrievals does not matter.
For example, while iterating over the map, it does not matter if newly added
entries are not read immediately, as I make sure they are read the next time
around.
Also, while removing entries, I recreate the iterator every time to avoid a
ConcurrentModificationException.
Suppose there is the following map (i.e. a ConcurrentHashMap):

ConcurrentHashMap<String, float[]> test = new ConcurrentHashMap<String, float[]>(200);
Now, for retrieval I do the following:

Iterator<String> reader = test.keySet().iterator();
while (reader.hasNext()) {
    String s = reader.next();
    float[] temp = test.get(s);
    // do some operation with float[] temp here (read-only operation)
}
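One alternative I have been considering (not sure how much it helps in
practice) is to iterate over entrySet() instead, so that each value arrives
together with its key and the extra test.get(s) lookup is avoided; a rough
sketch, assuming java.util.Map is imported:

for (Map.Entry<String, float[]> entry : test.entrySet()) {
    float[] temp = entry.getValue();
    // do some operation with float[] temp here (read-only operation)
}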
For removal, I do the following:

boolean temp = true;
while (temp) {
    for (String key : test.keySet()) {
        temp = false;
        if (key.equals("abc")) {
            test.remove(key);
            temp = true;
            break;
        }
    }
}
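Since in this case the key to remove ("abc") is already known, I wonder
whether the whole loop could be replaced by a single call; remove() on a
ConcurrentHashMap is itself thread-safe, so a sketch would simply be:

// remove() is atomic on ConcurrentHashMap, so no iteration (and no iterator
// recreation) is needed when the key is known up front
test.remove("abc");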
When inserting new values, I simply do:

test.put("temp value", new float[10]);
I am not sure whether this is a very efficient way to use the map. Also, it
does matter not to read removed values; however, I need efficiency, and since
the iterator is created again on each function call, it is guaranteed that the
next time around I do not get the removed values, so that much inconsistency
can be tolerated.
Could someone please tell me an efficient way to do this?
