lfs: add the 'lfs' requirement in the changegroup transaction introducing lfs

A hook like this is how largefiles manages to do the same. Largefiles uses a changegroup hook, but this uses pretxnchangegroup because that actually causes the transaction to roll back in the unlikely event that writing the requirements out fails. Sadly, the requires file itself isn't rolled back if a subsequent hook fails, but that seems trivial.

Now that commit, changegroup and convert are covered, I don't think there's any way to get an lfs repo without the requirement.

The grep exit code is blotted out of some test-lfs-serve.t tests that now show the requirement, because run-tests.py doesn't support conditionalizing the exit code.
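
A rough sketch of that mechanism, assuming the Mercurial internals of this era (repo.requirements as a set, the private repo._writerequirements() helper, and hook registration through ui.setconfig); the hook name and function below are illustrative, not the actual hgext/lfs code:

def addlfsrequirement(ui, repo, **kwargs):
    # Runs as a pretxnchangegroup hook: the changegroup has been applied but
    # the transaction is still open, so raising here (or returning non-zero)
    # rolls the incoming changes back.
    if 'lfs' not in repo.requirements:
        repo.requirements.add('lfs')
        repo._writerequirements()  # rewrite .hg/requires
    return 0

def reposetup(ui, repo):
    # 'pretxnchangegroup.lfs' is an illustrative hook name.
    ui.setconfig('hooks', 'pretxnchangegroup.lfs', addlfsrequirement, 'lfs')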

wirestore.py
# Copyright 2010-2011 Fog Creek Software
#
# This software may be used and distributed according to the terms of the
# GNU General Public License version 2 or any later version.

'''largefile store working over Mercurial's wire protocol'''

from __future__ import absolute_import

from . import (
    lfutil,
    remotestore,
)

class wirestore(remotestore.remotestore):
    def __init__(self, ui, repo, remote):
        # The 'largefiles' capability value is a comma-separated list of
        # store types the peer supports; it must include 'serve'.
        cap = remote.capable('largefiles')
        if not cap:
            raise lfutil.storeprotonotcapable([])
        storetypes = cap.split(',')
        if 'serve' not in storetypes:
            raise lfutil.storeprotonotcapable(storetypes)
        self.remote = remote
        super(wirestore, self).__init__(ui, repo, remote.url())

    def _put(self, hash, fd):
        return self.remote.putlfile(hash, fd)

    def _get(self, hash):
        return self.remote.getlfile(hash)

    def _stat(self, hashes):
        '''For each hash, return 0 if it is available, other values if not.
        It is usually 2 if the largefile is missing, but might be 1 if the
        server has a corrupted copy.'''
        batch = self.remote.iterbatch()
        for hash in hashes:
            batch.statlfile(hash)
        batch.submit()
        return dict(zip(hashes, batch.results()))
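
A hypothetical caller, sketched here only to illustrate the status convention documented in _stat(); 'store' and 'hashes' are assumed names, not part of wirestore.py:

# 'store' is assumed to be a wirestore instance, 'hashes' a list of
# largefile sha1 hex strings.
stats = store._stat(hashes)
available = [h for h, code in stats.items() if code == 0]
missing = [h for h, code in stats.items() if code == 2]
corrupt = [h for h, code in stats.items() if code not in (0, 2)]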