path: root/bitbake/lib/bb/cache.py
author     Richard Purdie <richard.purdie@linuxfoundation.org>  2014-07-25 14:54:23 +0100
committer  Richard Purdie <richard.purdie@linuxfoundation.org>  2014-07-26 08:50:14 +0100
commit     89d178841208557b030103cf0ae813a42550487c (patch)
tree       8fee8247d56c36cf28a5b36671725e01325eb58f /bitbake/lib/bb/cache.py
parent     a05435fc59d32f2fcf4ea4185cb0655eeb343211 (diff)
download   openembedded-core-contrib-89d178841208557b030103cf0ae813a42550487c.tar.gz
bitbake: codeparser cache improvements
It turns out the codeparser cache is the bottleneck I've been observing when running bitbake commands, particularly as it grows. There are some things we can do about this:

* We were processing the cache with "intern()" at save time. It's actually much more memory efficient to do this at creation time.
* Use hashable objects such as frozenset rather than set so that we can compare objects.
* De-duplicate the cache objects, linking duplicates to the same object, which saves memory and disk usage and improves speed.
* Use custom setstate/getstate to avoid the overhead of object attribute names in the cache file.

To make this work, a global cache was needed for the list of set objects, as this was the only way I could find to get the data in at setstate object creation time :(.

Parsing shows a modest improvement with these changes, cache load time is significantly better, and cache save time is reduced since there is now no need to reprocess the data; the cache is also much smaller.

We can drop the compress_keys() code and internSet code from the shared cache core since they are no longer used, having been replaced by codeparser-specific pieces.

(Bitbake rev: 4aaf56bfbad4aa626be8a2f7a5f70834c3311dd3)

Signed-off-by: Richard Purdie <richard.purdie@linuxfoundation.org>
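For illustration only (not part of this commit's diff): a minimal sketch of the kind of codeparser-side de-duplication described above, assuming a hypothetical SetCache pool and cache-line class. The names SetCache, codecache and pythonCacheLine are illustrative, and sys.intern stands in for the intern() builtin available when this change was written.

    import sys

    class SetCache(object):
        """Global pool of interned frozensets so duplicate sets share one object."""
        def __init__(self):
            self.setcache = {}

        def internSet(self, items):
            # Intern the strings, freeze the set, then reuse an identical
            # frozenset if one is already in the pool.
            s = frozenset(sys.intern(i) for i in items)
            return self.setcache.setdefault(s, s)

    codecache = SetCache()

    class pythonCacheLine(object):
        def __init__(self, refs, execs):
            # De-duplicate at creation time rather than at cache save time.
            self.refs = codecache.internSet(refs)
            self.execs = codecache.internSet(execs)

        def __getstate__(self):
            # Pickle a bare tuple to avoid storing attribute names in the cache file.
            return (self.refs, self.execs)

        def __setstate__(self, state):
            # Rebuild through __init__ so unpickled objects pass through the
            # global SetCache and duplicates collapse to the same object.
            refs, execs = state
            self.__init__(refs, execs)

A module-level pool like codecache is the kind of global cache the commit message mentions: during unpickling, __setstate__ has no other convenient way to reach shared state while objects are being created.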
Diffstat (limited to 'bitbake/lib/bb/cache.py')
-rw-r--r--  bitbake/lib/bb/cache.py  12
1 file changed, 0 insertions(+), 12 deletions(-)
diff --git a/bitbake/lib/bb/cache.py b/bitbake/lib/bb/cache.py
index c7f3b7ab71..f892d7dc32 100644
--- a/bitbake/lib/bb/cache.py
+++ b/bitbake/lib/bb/cache.py
@@ -764,16 +764,6 @@ class MultiProcessCache(object):
 
         self.cachedata = data
 
-    def internSet(self, items):
-        new = set()
-        for i in items:
-            new.add(intern(i))
-        return new
-
-    def compress_keys(self, data):
-        # Override in subclasses if desired
-        return
-
     def create_cachedata(self):
         data = [{}]
         return data
@@ -833,8 +823,6 @@
             self.merge_data(extradata, data)
             os.unlink(f)
 
-        self.compress_keys(data)
-
         with open(self.cachefile, "wb") as f:
             p = pickle.Pickler(f, -1)
             p.dump([data, self.__class__.CACHE_VERSION])
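As a side note (an assumption-laden sketch, not the actual bb/cache.py API): the save path above pickles [data, CACHE_VERSION] together, so a matching load path can discard stale caches by comparing the stored version. The method name load_cache and the attribute layout below are hypothetical.

    import os
    import pickle

    class MultiProcessCache(object):
        CACHE_VERSION = 1        # bumped whenever the on-disk format changes
        cachefile = None
        cachedata = None

        def create_cachedata(self):
            return [{}]

        def load_cache(self):
            # Start from an empty structure, then replace it with the pickled
            # data only if the file exists and its version matches.
            data = self.create_cachedata()
            if self.cachefile and os.path.exists(self.cachefile):
                with open(self.cachefile, "rb") as f:
                    cached, version = pickle.Unpickler(f).load()
                    if version == self.CACHE_VERSION:
                        data = cached
            self.cachedata = data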