I have a ConcurrentHashMap, orignalCache, to which multiple threads write concurrently from many places in the code. I have to flush the map periodically, so in the method below I create a temporary copy of the map and then clear the original:
void flushcopyOfOriginalCache(String queueName) {
    Map<Integer, Date> copyOfOriginalCache = null;
    Object lock = getLastscheduledTimeUpdateLock(queueName);
    // This lock is NOT acquired by the writers: writing takes place
    // in many places in the code without synchronizing on it.
    synchronized (lock) {
        // Re-check whether the cache has expired
        if (isOrigCacheExpired(queueName)) {
            log.info("origCache Expired for queue {}", queueName);
            if (orignalCache.containsKey(queueName)) {
                copyOfOriginalCache = new ConcurrentHashMap<>(
                        orignalCache.get(queueName));
                // How safe is the clear() below? Parallel writes
                // may still be going on against orignalCache!
                orignalCache.get(queueName).clear();
            }
        }
    }
    if (copyOfOriginalCache == null || copyOfOriginalCache.isEmpty()) {
        return;
    }
    // Below: expensive DB operations on the data in copyOfOriginalCache
}
The problem: an entry written between the copy-constructor snapshot and the clear() call is wiped without ever being flushed. How can I make sure the clear does not delete entries that are being written concurrently? Please guide.
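One way to close that window is to avoid clear() entirely and instead drain the inner map entry by entry with ConcurrentHashMap.remove(key, value), which removes a mapping atomically only if it still has the value you snapshotted; an entry overwritten by a concurrent writer survives and is picked up on the next flush. Below is a minimal sketch of the idea; the class and method names (CacheFlusher, write, flush) are hypothetical stand-ins, not your actual code:

```java
import java.util.Date;
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

class CacheFlusher {
    // Same shape as orignalCache: queue name -> (key -> timestamp)
    private final ConcurrentHashMap<String, ConcurrentHashMap<Integer, Date>> cache =
            new ConcurrentHashMap<>();

    // Writers call this from anywhere, with no shared lock.
    void write(String queueName, Integer key, Date value) {
        cache.computeIfAbsent(queueName, q -> new ConcurrentHashMap<>())
             .put(key, value);
    }

    // Drains the inner map without clear(), so concurrent writes are never lost.
    Map<Integer, Date> flush(String queueName) {
        ConcurrentHashMap<Integer, Date> inner = cache.get(queueName);
        if (inner == null) {
            return Map.of();
        }
        Map<Integer, Date> snapshot = new ConcurrentHashMap<>();
        // ConcurrentHashMap iterators are weakly consistent, so removing
        // while iterating is safe.
        for (Map.Entry<Integer, Date> e : inner.entrySet()) {
            // remove(k, v) succeeds only if the mapping is still k -> v.
            // If a writer replaced the value in the meantime, the entry
            // stays in the cache for the next flush instead of being lost.
            if (inner.remove(e.getKey(), e.getValue())) {
                snapshot.put(e.getKey(), e.getValue());
            }
        }
        return snapshot; // hand this to the expensive DB operations
    }
}
```

With this pattern the synchronized block is only needed to stop two flushes from running at once; writers never have to take the lock, and nothing a writer puts in can be silently discarded.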