# logexchange.py
#
# Copyright 2017 Augie Fackler <raf@durin42.com>
# Copyright 2017 Sean Farley <sean@farley.io>
#
# This software may be used and distributed according to the terms of the
# GNU General Public License version 2 or any later version.

from __future__ import absolute_import

from .node import hex

from . import (
    vfs as vfsmod,
)

# directory name in .hg/ in which remotenames files will be present
remotenamedir = 'logexchange'
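
# An illustrative sketch (not an authoritative spec) of what one of these
# files looks like, pieced together from writeremotenamefile() below: a
# version line, a blank line, then one record per entry with fields joined
# by a '\0' byte (rendered as <NUL> here). The hashes, URL and bookmark
# names are made up and truncated for readability; this sketches a
# 'bookmarks' file:
#
#   0
#
#   0123456789ab...<NUL>https://example.com/repo<NUL>@
#   0123456789ab...<NUL>https://example.com/repo<NUL>release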

def readremotenamefile(repo, filename):
    """
    reads a file from .hg/logexchange/ directory and yields its content
    filename: the file to be read
    yield a tuple (node, remotepath, name)
    """

    vfs = vfsmod.vfs(repo.vfs.join(remotenamedir))
    if not vfs.exists(filename):
        return
    f = vfs(filename)
    lineno = 0
    for line in f:
        line = line.strip()
        if not line:
            continue
        # the first non-empty line contains the version number; it has no
        # '\0' separators, so the unpacking below raises ValueError and the
        # line is skipped
        if lineno == 0:
            lineno += 1
        try:
            node, remote, rname = line.split('\0')
            yield node, remote, rname
        except ValueError:
            pass

    f.close()

def readremotenames(repo):
    """
    read the details about the remotenames stored in .hg/logexchange/ and
    yield tuples of (node, remotepath, name). It does not say whether an
    entry is a branch or a bookmark; to get that information, call the
    respective functions.
    """

    for bmentry in readremotenamefile(repo, 'bookmarks'):
        yield bmentry
    for branchentry in readremotenamefile(repo, 'branches'):
        yield branchentry
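
# A minimal usage sketch for the readers above, assuming `repo` is an
# already-open local repository object; each entry is a (hex node,
# remote path, name) tuple, with bookmark entries yielded before branch
# entries:
#
#   for node, remotepath, name in readremotenames(repo):
#       repo.ui.write('%s %s %s\n' % (node, remotepath, name))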

def writeremotenamefile(repo, remotepath, names, nametype):
    """
    write the remote names of type `nametype` ('bookmarks' or 'branches')
    for `remotepath` to .hg/logexchange/<nametype>, preserving entries that
    belong to other remotes
    """
    vfs = vfsmod.vfs(repo.vfs.join(remotenamedir))
    f = vfs(nametype, 'w', atomictemp=True)
    # write the storage version info on top of file
    # version '0' represents the very initial version of the storage format
    f.write('0\n\n')

    olddata = set(readremotenamefile(repo, nametype))
    # re-save the data that belongs to remotes other than this one
    for node, oldpath, rname in sorted(olddata):
        if oldpath != remotepath:
            f.write('%s\0%s\0%s\n' % (node, oldpath, rname))

    for name, node in sorted(names.iteritems()):
        if nametype == "branches":
            for n in node:
                f.write('%s\0%s\0%s\n' % (n, remotepath, name))
        elif nametype == "bookmarks":
            if node:
                f.write('%s\0%s\0%s\n' % (node, remotepath, name))

    f.close()

def saveremotenames(repo, remotepath, branches=None, bookmarks=None):
    """
    save remotenames, i.e. remote bookmarks and remote branches, in their
    respective files under the ".hg/logexchange/" directory.
    """
    wlock = repo.wlock()
    try:
        if bookmarks:
            writeremotenamefile(repo, remotepath, bookmarks, 'bookmarks')
        if branches:
            writeremotenamefile(repo, remotepath, branches, 'branches')
    finally:
        wlock.release()
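
# A hypothetical call sketch for the writers above: `branches` maps a branch
# name to a list of hex heads and `bookmarks` maps a bookmark name to a
# single hex node, matching what writeremotenamefile() expects for each
# nametype. The URL, names and (truncated) hashes are invented:
#
#   saveremotenames(repo, 'https://example.com/repo',
#                   branches={'default': ['0123456789ab...']},
#                   bookmarks={'@': '0123456789ab...'})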

def pullremotenames(localrepo, remoterepo):
    """
    pulls bookmark and branch information of the remote repo during a
    pull or clone operation.

    localrepo is our local repository
    remoterepo is the peer instance
    """
    remotepath = remoterepo.url()
    bookmarks = remoterepo.listkeys('bookmarks')
    # on a push, we don't want to keep obsolete heads since
    # they won't show up as heads on the next pull, so we
    # remove them here otherwise we would require the user
    # to issue a pull to refresh the storage
    bmap = {}
    repo = localrepo.unfiltered()
    for branch, nodes in remoterepo.branchmap().iteritems():
        bmap[branch] = []
        for node in nodes:
            if node in repo and not repo[node].obsolete():
                bmap[branch].append(hex(node))

    saveremotenames(localrepo, remotepath, bmap, bookmarks)
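
# A rough sketch of how this function might be driven from the pull path,
# assuming `repo` and a remote URL are available; hg.peer() is used here as
# one way to obtain a peer instance, which is an assumption about the
# caller rather than something this module requires:
#
#   from mercurial import hg
#   other = hg.peer(repo, {}, 'https://example.com/repo')
#   pullremotenames(repo, other)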