update: fix spurious unclean status bug shown by previous commit

The crux of the problem:

- The dirstate is corrupted (the sizes/dates are assigned to the wrong files).
- That happens because when worker.worker is used with a return value
  (batchget in merge.py here), the return value is permuted whenever
  worker.worker actually parallelizes.
- This is because worker.worker's partition of the input and combination of
  the output values are not inverses of one another: it splits
  [1, 2, 3, 4, 5, 6] into [[1, 3, 5], [2, 4, 6]], but combines that into
  [1, 3, 5, 2, 4, 6].

Given that worker.worker doesn't call its function argument on contiguous
chunks of the input arguments, sticking with lists means we'd need to know the
relation between the inputs and outputs of worker.worker's function argument
(for instance, requiring that every input element is mapped to exactly one
output element). It seems better to instead switch return values to dicts,
which can be combined reliably with a straightforward restriction.

Differential Revision: https://phab.mercurial-scm.org/D6581
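To make the ordering problem concrete, here is a minimal sketch (hypothetical
illustration, not Mercurial's worker.py code) of why recombining list-shaped
results after a round-robin partition permutes them, while dict-shaped results
merge independently of the partition:

    # Minimal sketch (hypothetical, not Mercurial's worker.py): mimic a
    # round-robin split of the work across two workers, then combine results.
    items = [1, 2, 3, 4, 5, 6]
    chunks = [items[0::2], items[1::2]]          # [[1, 3, 5], [2, 4, 6]]

    # Combining per-worker *lists* by concatenation permutes the original
    # order, so positional results (sizes/dates per file) line up with the
    # wrong inputs:
    combined = [x for chunk in chunks for x in chunk]
    assert combined == [1, 3, 5, 2, 4, 6]        # not [1, 2, 3, 4, 5, 6]

    # Combining per-worker *dicts* keyed by the input does not depend on how
    # the input was partitioned:
    def process(chunk):
        return {item: item * 10 for item in chunk}   # stand-in per-item result

    result = {}
    for chunk in chunks:
        result.update(process(chunk))
    assert result == {i: i * 10 for i in items}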

File last commit: r42720:898b36f7 default
Changeset: r42722:d29db0a0 default
copies.py
805 lines | 29.3 KiB | text/x-python
# copies.py - copy detection for Mercurial
#
# Copyright 2008 Matt Mackall <mpm@selenic.com>
#
# This software may be used and distributed according to the terms of the
# GNU General Public License version 2 or any later version.

from __future__ import absolute_import

import collections
import heapq
import os

from .i18n import _
from . import (
    match as matchmod,
    node,
    pathutil,
    util,
)
from .utils import (
    stringutil,
)

def _findlimit(repo, ctxa, ctxb):
    """
    Find the last revision that needs to be checked to ensure that a full
    transitive closure for file copies can be properly calculated.
    Generally, this means finding the earliest revision number that's an
    ancestor of a or b but not both, except when a or b is a direct
    descendant of the other, in which case we can return the minimum revnum
    of a and b.
    """

    # basic idea:
    # - mark a and b with different sides
    # - if a parent's children are all on the same side, the parent is
    #   on that side, otherwise it is on no side
    # - walk the graph in topological order with the help of a heap;
    #   - add unseen parents to side map
    #   - clear side of any parent that has children on different sides
    #   - track number of interesting revs that might still be on a side
    #   - track the lowest interesting rev seen
    #   - quit when interesting revs is zero

    cl = repo.changelog
    wdirparents = None
    a = ctxa.rev()
    b = ctxb.rev()
    if a is None:
        wdirparents = (ctxa.p1(), ctxa.p2())
        a = node.wdirrev
    if b is None:
        assert not wdirparents
        wdirparents = (ctxb.p1(), ctxb.p2())
        b = node.wdirrev

    side = {a: -1, b: 1}
    visit = [-a, -b]
    heapq.heapify(visit)
    interesting = len(visit)
    limit = node.wdirrev

    while interesting:
        r = -heapq.heappop(visit)
        if r == node.wdirrev:
            parents = [pctx.rev() for pctx in wdirparents]
        else:
            parents = cl.parentrevs(r)
        if parents[1] == node.nullrev:
            parents = parents[:1]
        for p in parents:
            if p not in side:
                # first time we see p; add it to visit
                side[p] = side[r]
                if side[p]:
                    interesting += 1
                heapq.heappush(visit, -p)
            elif side[p] and side[p] != side[r]:
                # p was interesting but now we know better
                side[p] = 0
                interesting -= 1
        if side[r]:
            limit = r  # lowest rev visited
            interesting -= 1

    # Consider the following flow (see test-commit-amend.t under issue4405):
    # 1/ File 'a0' committed
    # 2/ File renamed from 'a0' to 'a1' in a new commit (call it 'a1')
    # 3/ Move back to first commit
    # 4/ Create a new commit via revert to contents of 'a1' (call it 'a1-amend')
    # 5/ Rename file from 'a1' to 'a2' and commit --amend 'a1-msg'
    #
    # During the amend in step five, we will be in this state:
    #
    # @  3 temporary amend commit for a1-amend
    # |
    # o  2 a1-amend
    # |
    # | o  1 a1
    # |/
    # o  0 a0
    #
    # When _findlimit is called, a and b are revs 3 and 0, so limit will be 2,
    # yet the filelog has the copy information in rev 1 and we will not look
    # back far enough unless we also look at the a and b as candidates.
    # This only occurs when a is a descendant of b or vice-versa.
    return min(limit, a, b)

def _chainandfilter(src, dst, a, b):
    """chain two sets of copies 'a' and 'b' and filter result"""

    # When chaining copies in 'a' (from 'src' via some other commit 'mid')
    # with copies in 'b' (from 'mid' to 'dst'), we can get the different cases
    # in the following table (not including trivial cases). For example, case
    # 2 is where a file existed in 'src' and remained under that name in 'mid'
    # and then was renamed between 'mid' and 'dst'.
    #
    # case  src  mid  dst  result
    #   1    x    y    -     -
    #   2    x    y    y    x->y
    #   3    x    y    x     -
    #   4    x    y    z    x->z
    #   5    -    x    y     -
    #   6    x    x    y    x->y
    #
    # _chain() takes care of chaining the copies in 'a' and 'b', but it
    # cannot tell the difference between cases 1 and 2, between 3 and 4, or
    # between 5 and 6, so it includes all cases in its result.
    # Cases 1, 3, and 5 are then removed by _filter().

    t = _chain(a, b)
    _filter(src, dst, t)
    return t

def _filter(src, dst, t):
    """filters out invalid copies after chaining"""
    for k, v in list(t.items()):
        # remove copies from files that didn't exist
        if v not in src:
            del t[k]
        # remove criss-crossed copies
        elif k in src and v in dst:
            del t[k]
        # remove copies to files that were then removed
        elif k not in dst:
            del t[k]

def _chain(a, b):
    """chain two sets of copies 'a' and 'b'"""
    t = a.copy()
    for k, v in b.iteritems():
        if v in t:
            t[k] = t[v]
        else:
            t[k] = v
    return t
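
# Illustrative sketch, not part of the original module: given {dst: src}
# maps a = {'b': 'a'} (src -> mid) and b = {'c': 'b'} (mid -> dst),
# _chain(a, b) yields {'b': 'a', 'c': 'a'}. If 'b' no longer exists in
# 'dst', _filter() then drops the 'b' entry (case 1 in the table above),
# leaving {'c': 'a'}.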

def _tracefile(fctx, am, limit):
    """return file context that is the ancestor of fctx present in ancestor
    manifest am, stopping after the first ancestor lower than limit"""

    for f in fctx.ancestors():
        if am.get(f.path(), None) == f.filenode():
            return f
        if not f.isintroducedafter(limit):
            return None

def _dirstatecopies(repo, match=None):
    ds = repo.dirstate
    c = ds.copies().copy()
    for k in list(c):
        if ds[k] not in 'anm' or (match and not match(k)):
            del c[k]
    return c

def _computeforwardmissing(a, b, match=None):
    """Computes which files are in b but not a.
    This is its own function so extensions can easily wrap this call to see
    what files _forwardcopies is about to process.
    """
    ma = a.manifest()
    mb = b.manifest()
    return mb.filesnotin(ma, match=match)

def usechangesetcentricalgo(repo):
    """Checks if we should use changeset-centric copy algorithms"""
    return (repo.ui.config('experimental', 'copies.read-from') in
            ('changeset-only', 'compatibility'))
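
# Illustrative hgrc snippet, not part of the original module: the
# changeset-centric algorithms are selected by setting either
#
#   [experimental]
#   copies.read-from = compatibility
#   # or: copies.read-from = changeset-only
#
# Any other value keeps the filelog-based copy tracing.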

def _committedforwardcopies(a, b, match):
    """Like _forwardcopies(), but b.rev() cannot be None (working copy)"""
    # files might have to be traced back to the fctx parent of the last
    # one-side-only changeset, but not further back than that
    repo = a._repo

    if usechangesetcentricalgo(repo):
        return _changesetforwardcopies(a, b, match)

    debug = repo.ui.debugflag and repo.ui.configbool('devel', 'debug.copies')
    dbg = repo.ui.debug
    if debug:
        dbg('debug.copies: looking into rename from %s to %s\n'
            % (a, b))
    limit = _findlimit(repo, a, b)
    if debug:
        dbg('debug.copies: search limit: %d\n' % limit)
    am = a.manifest()

    # find where new files came from
    # we currently don't try to find where old files went, too expensive
    # this means we can miss a case like 'hg rm b; hg cp a b'
    cm = {}

    # Computing the forward missing is quite expensive on large manifests,
    # since it compares the entire manifests. We can optimize it in the common
    # use case of computing what copies are in a commit versus its parent (like
    # during a rebase or histedit). Note, we exclude merge commits from this
    # optimization, since the ctx.files() for a merge commit is not correct for
    # this comparison.
    forwardmissingmatch = match
    if b.p1() == a and b.p2().node() == node.nullid:
        filesmatcher = matchmod.exact(b.files())
        forwardmissingmatch = matchmod.intersectmatchers(match, filesmatcher)
    missing = _computeforwardmissing(a, b, match=forwardmissingmatch)

    ancestrycontext = a._repo.changelog.ancestors([b.rev()], inclusive=True)

    if debug:
        dbg('debug.copies: missing files to search: %d\n' % len(missing))

    for f in sorted(missing):
        if debug:
            dbg('debug.copies: tracing file: %s\n' % f)
        fctx = b[f]
        fctx._ancestrycontext = ancestrycontext

        if debug:
            start = util.timer()
        ofctx = _tracefile(fctx, am, limit)
        if ofctx:
            if debug:
                dbg('debug.copies: rename of: %s\n' % ofctx._path)
            cm[f] = ofctx.path()
        if debug:
            dbg('debug.copies: time: %f seconds\n'
                % (util.timer() - start))
    return cm

def _changesetforwardcopies(a, b, match):
    if a.rev() == node.nullrev:
        return {}

    repo = a.repo()
    children = {}
    cl = repo.changelog
    missingrevs = cl.findmissingrevs(common=[a.rev()], heads=[b.rev()])
    for r in missingrevs:
        for p in cl.parentrevs(r):
            if p == node.nullrev:
                continue
            if p not in children:
                children[p] = [r]
            else:
                children[p].append(r)

    roots = set(children) - set(missingrevs)
    # 'work' contains 3-tuples of a (revision number, parent number, copies).
    # The parent number is only used for knowing which parent the copies dict
    # came from.
    # NOTE: To reduce costly copying the 'copies' dicts, we reuse the same
    # instance for *one* of the child nodes (the last one). Once an instance
    # has been put on the queue, it is thus no longer safe to modify it.
    # Conversely, it *is* safe to modify an instance popped off the queue.
    work = [(r, 1, {}) for r in roots]
    heapq.heapify(work)
    alwaysmatch = match.always()
    while work:
        r, i1, copies = heapq.heappop(work)
        if work and work[0][0] == r:
            # We are tracing copies from both parents
            r, i2, copies2 = heapq.heappop(work)
            for dst, src in copies2.items():
                # Unlike when copies are stored in the filelog, we consider
                # it a copy even if the destination already existed on the
                # other branch. It's simply too expensive to check if the
                # file existed in the manifest.
                if dst not in copies:
                    # If it was copied on the p1 side, leave it as copied from
                    # that side, even if it was also copied on the p2 side.
                    copies[dst] = copies2[dst]
        if r == b.rev():
            _filter(a, b, copies)
            return copies
        for i, c in enumerate(children[r]):
            childctx = repo[c]
            if r == childctx.p1().rev():
                parent = 1
                childcopies = childctx.p1copies()
            else:
                assert r == childctx.p2().rev()
                parent = 2
                childcopies = childctx.p2copies()
            if not alwaysmatch:
                childcopies = {dst: src for dst, src in childcopies.items()
                               if match(dst)}
            # Copy the dict only if later iterations will also need it
            if i != len(children[r]) - 1:
                newcopies = copies.copy()
            else:
                newcopies = copies
            if childcopies:
                newcopies = _chain(newcopies, childcopies)
            for f in childctx.filesremoved():
                if f in newcopies:
                    del newcopies[f]
            heapq.heappush(work, (c, parent, newcopies))
    assert False

def _forwardcopies(a, b, match=None):
    """find {dst@b: src@a} copy mapping where a is an ancestor of b"""

    match = a.repo().narrowmatch(match)
    # check for working copy
    if b.rev() is None:
        if a == b.p1():
            # short-circuit to avoid issues with merge states
            return _dirstatecopies(b._repo, match)

        cm = _committedforwardcopies(a, b.p1(), match)
        # combine copies from dirstate if necessary
        return _chainandfilter(a, b, cm, _dirstatecopies(b._repo, match))
    return _committedforwardcopies(a, b, match)

def _backwardrenames(a, b, match):
    if a._repo.ui.config('experimental', 'copytrace') == 'off':
        return {}

    # Even though we're not taking copies into account, 1:n rename situations
    # can still exist (e.g. hg cp a b; hg mv a c). In those cases we
    # arbitrarily pick one of the renames.
    # We don't want to pass in "match" here, since that would filter
    # the destination by it. Since we're reversing the copies, we want
    # to filter the source instead.
    f = _forwardcopies(b, a)
    r = {}
    for k, v in sorted(f.iteritems()):
        if match and not match(v):
            continue
        # remove copies
        if v in a:
            continue
        r[v] = k
    return r

def pathcopies(x, y, match=None):
    """find {dst@y: src@x} copy mapping for directed compare"""
    repo = x._repo
    debug = repo.ui.debugflag and repo.ui.configbool('devel', 'debug.copies')
    if debug:
        repo.ui.debug('debug.copies: searching copies from %s to %s\n'
                      % (x, y))
    if x == y or not x or not y:
        return {}
    a = y.ancestor(x)
    if a == x:
        if debug:
            repo.ui.debug('debug.copies: search mode: forward\n')
        return _forwardcopies(x, y, match=match)
    if a == y:
        if debug:
            repo.ui.debug('debug.copies: search mode: backward\n')
        return _backwardrenames(x, y, match=match)
    if debug:
        repo.ui.debug('debug.copies: search mode: combined\n')
    return _chainandfilter(x, y, _backwardrenames(x, a, match=match),
                           _forwardcopies(a, y, match=match))

def mergecopies(repo, c1, c2, base):
    """
    Finds moves and copies between context c1 and c2 that are relevant for
    merging. 'base' will be used as the merge base.

    Copytracing is used in commands like rebase, merge, unshelve, etc to merge
    files that were moved/copied in one merge parent and modified in another.
    For example:

    o          ---> 4 another commit
    |
    |   o      ---> 3 commit that modifies a.txt
    |  /
    o /        ---> 2 commit that moves a.txt to b.txt
    |/
    o          ---> 1 merge base

    If we try to rebase revision 3 on revision 4, since there is no a.txt in
    revision 4, and if the user has copytrace disabled, we print the following
    message:

    ```other changed <file> which local deleted```

    Returns five dicts: "copy", "movewithdir", "diverge", "renamedelete" and
    "dirmove".

    "copy" is a mapping from destination name -> source name,
    where source is in c1 and destination is in c2 or vice-versa.

    "movewithdir" is a mapping from source name -> destination name,
    where the file at source present in one context but not the other
    needs to be moved to destination by the merge process, because the
    other context moved the directory it is in.

    "diverge" is a mapping of source name -> list of destination names
    for divergent renames.

    "renamedelete" is a mapping of source name -> list of destination
    names for files deleted in c1 that were renamed in c2 or vice-versa.

    "dirmove" is a mapping of detected source dir -> destination dir renames.
    This is needed for handling changes to new files previously grafted into
    renamed directories.

    This function calls different copytracing algorithms based on config.
    """
    # avoid silly behavior for update from empty dir
    if not c1 or not c2 or c1 == c2:
        return {}, {}, {}, {}, {}

    narrowmatch = c1.repo().narrowmatch()

    # avoid silly behavior for parent -> working dir
    if c2.node() is None and c1.node() == repo.dirstate.p1():
        return _dirstatecopies(repo, narrowmatch), {}, {}, {}, {}

    copytracing = repo.ui.config('experimental', 'copytrace')
    if stringutil.parsebool(copytracing) is False:
        # stringutil.parsebool() returns None when it is unable to parse the
        # value, so copytracing is only skipped for values that explicitly
        # parse to False
        return {}, {}, {}, {}, {}

    if usechangesetcentricalgo(repo):
        # The heuristics don't make sense when we need changeset-centric algos
        return _fullcopytracing(repo, c1, c2, base)

    # Copy trace disabling is explicitly below the node == p1 logic above
    # because the logic above is required for a simple copy to be kept across a
    # rebase.
    if copytracing == 'heuristics':
        # Do full copytracing if only non-public revisions are involved as
        # that will be fast enough and will also cover the copies which could
        # be missed by heuristics
        if _isfullcopytraceable(repo, c1, base):
            return _fullcopytracing(repo, c1, c2, base)
        return _heuristicscopytracing(repo, c1, c2, base)
    else:
        return _fullcopytracing(repo, c1, c2, base)

def _isfullcopytraceable(repo, c1, base):
    """ Checks if base, source and destination are all non-public; if yes,
    use the full copytrace algorithm for increased capabilities, since it
    will be fast enough.

    `experimental.copytrace.sourcecommitlimit` can be used to set a limit for
    the number of changesets from c1 to base such that if the number of
    changesets is more than the limit, the full copytracing algorithm won't
    be used.
    """
    if c1.rev() is None:
        c1 = c1.p1()
    if c1.mutable() and base.mutable():
        sourcecommitlimit = repo.ui.configint('experimental',
                                              'copytrace.sourcecommitlimit')
        commits = len(repo.revs('%d::%d', base.rev(), c1.rev()))
        return commits < sourcecommitlimit
    return False

def _checksinglesidecopies(src, dsts1, m1, m2, mb, c2, base,
                           copy, renamedelete):
    if src not in m2:
        # deleted on side 2
        if src not in m1:
            # renamed on side 1, deleted on side 2
            renamedelete[src] = dsts1
    elif m2[src] != mb[src]:
        if not _related(c2[src], base[src]):
            return
        # modified on side 2
        for dst in dsts1:
            if dst not in m2:
                # dst not added on side 2 (handle as regular
                # "both created" case in manifestmerge otherwise)
                copy[dst] = src

def _fullcopytracing(repo, c1, c2, base):
    """ The full copytracing algorithm which finds all the new files that were
    added from merge base up to the top commit and for each file it checks if
    this file was copied from another file.

    This is pretty slow when a lot of changesets are involved but will track
    all the copies.
    """
    m1 = c1.manifest()
    m2 = c2.manifest()
    mb = base.manifest()

    copies1 = pathcopies(base, c1)
    copies2 = pathcopies(base, c2)

    inversecopies1 = {}
    inversecopies2 = {}
    for dst, src in copies1.items():
        inversecopies1.setdefault(src, []).append(dst)
    for dst, src in copies2.items():
        inversecopies2.setdefault(src, []).append(dst)

    copy = {}
    diverge = {}
    renamedelete = {}
    allsources = set(inversecopies1) | set(inversecopies2)
    for src in allsources:
        dsts1 = inversecopies1.get(src)
        dsts2 = inversecopies2.get(src)
        if dsts1 and dsts2:
            # copied/renamed on both sides
            if src not in m1 and src not in m2:
                # renamed on both sides
                dsts1 = set(dsts1)
                dsts2 = set(dsts2)
                # If there's some overlap in the rename destinations, we
                # consider it not divergent. For example, if side 1 copies 'a'
                # to 'b' and 'c' and deletes 'a', and side 2 copies 'a' to 'c'
                # and 'd' and deletes 'a'.
                if dsts1 & dsts2:
                    for dst in (dsts1 & dsts2):
                        copy[dst] = src
                else:
                    diverge[src] = sorted(dsts1 | dsts2)
            elif src in m1 and src in m2:
                # copied on both sides
                dsts1 = set(dsts1)
                dsts2 = set(dsts2)
                for dst in (dsts1 & dsts2):
                    copy[dst] = src
            # TODO: Handle cases where it was renamed on one side and copied
            # on the other side
        elif dsts1:
            # copied/renamed only on side 1
            _checksinglesidecopies(src, dsts1, m1, m2, mb, c2, base,
                                   copy, renamedelete)
        elif dsts2:
            # copied/renamed only on side 2
            _checksinglesidecopies(src, dsts2, m2, m1, mb, c1, base,
                                   copy, renamedelete)

    renamedeleteset = set()
    divergeset = set()
    for dsts in diverge.values():
        divergeset.update(dsts)
    for dsts in renamedelete.values():
        renamedeleteset.update(dsts)

    # find interesting file sets from manifests
    addedinm1 = m1.filesnotin(mb, repo.narrowmatch())
    addedinm2 = m2.filesnotin(mb, repo.narrowmatch())
    u1 = sorted(addedinm1 - addedinm2)
    u2 = sorted(addedinm2 - addedinm1)

    header = " unmatched files in %s"
    if u1:
        repo.ui.debug("%s:\n %s\n" % (header % 'local', "\n ".join(u1)))
    if u2:
        repo.ui.debug("%s:\n %s\n" % (header % 'other', "\n ".join(u2)))

    fullcopy = copies1.copy()
    fullcopy.update(copies2)
    if not fullcopy:
        return copy, {}, diverge, renamedelete, {}

    if repo.ui.debugflag:
        repo.ui.debug(" all copies found (* = to merge, ! = divergent, "
                      "% = renamed and deleted):\n")
        for f in sorted(fullcopy):
            note = ""
            if f in copy:
                note += "*"
            if f in divergeset:
                note += "!"
            if f in renamedeleteset:
                note += "%"
            repo.ui.debug(" src: '%s' -> dst: '%s' %s\n" % (fullcopy[f], f,
                                                            note))
    del divergeset

    repo.ui.debug(" checking for directory renames\n")

    # generate a directory move map
    d1, d2 = c1.dirs(), c2.dirs()
    invalid = set()
    dirmove = {}

    # examine each file copy for a potential directory move, which is
    # when all the files in a directory are moved to a new directory
    for dst, src in fullcopy.iteritems():
        dsrc, ddst = pathutil.dirname(src), pathutil.dirname(dst)
        if dsrc in invalid:
            # already seen to be uninteresting
            continue
        elif dsrc in d1 and ddst in d1:
            # directory wasn't entirely moved locally
            invalid.add(dsrc)
        elif dsrc in d2 and ddst in d2:
            # directory wasn't entirely moved remotely
            invalid.add(dsrc)
        elif dsrc in dirmove and dirmove[dsrc] != ddst:
            # files from the same directory moved to two different places
            invalid.add(dsrc)
        else:
            # looks good so far
            dirmove[dsrc] = ddst

    for i in invalid:
        if i in dirmove:
            del dirmove[i]
    del d1, d2, invalid

    if not dirmove:
        return copy, {}, diverge, renamedelete, {}

    dirmove = {k + "/": v + "/" for k, v in dirmove.iteritems()}

    for d in dirmove:
        repo.ui.debug(" discovered dir src: '%s' -> dst: '%s'\n" %
                      (d, dirmove[d]))

    movewithdir = {}
    # check unaccounted nonoverlapping files against directory moves
    for f in u1 + u2:
        if f not in fullcopy:
            for d in dirmove:
                if f.startswith(d):
                    # new file added in a directory that was moved, move it
                    df = dirmove[d] + f[len(d):]
                    if df not in copy:
                        movewithdir[f] = df
                        repo.ui.debug((" pending file src: '%s' -> "
                                       "dst: '%s'\n") % (f, df))
                    break

    return copy, movewithdir, diverge, renamedelete, dirmove

def _heuristicscopytracing(repo, c1, c2, base):
    """ Fast copytracing using filename heuristics

    Assumes that moves or renames are of the following two types:

    1) Inside a directory only (same directory name but different filenames)
    2) Move from one directory to another
       (same filenames but different directory names)

    Works only when there are no merge commits in the "source branch".
    Source branch is commits from base up to c2 not including base.

    If a merge is involved it falls back to _fullcopytracing().

    Can be used by setting the following config:

        [experimental]
        copytrace = heuristics

    In some cases the copy/move candidates found by heuristics can be very
    large in number and that will make the algorithm slow. The number of
    possible candidates to check can be limited by using the config
    `experimental.copytrace.movecandidateslimit` which defaults to 100.
    """
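    # Illustrative examples (not from the original module) of the two rename
    # shapes the heuristics can match:
    #   1) dir/file1.txt -> dir/file2.txt   (same directory, new basename)
    #   2) dir1/file.txt -> dir2/file.txt   (same basename, new directory)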
    if c1.rev() is None:
        c1 = c1.p1()
    if c2.rev() is None:
        c2 = c2.p1()

    copies = {}

    changedfiles = set()
    m1 = c1.manifest()
    if not repo.revs('%d::%d', base.rev(), c2.rev()):
        # If base is not in c2 branch, we switch to fullcopytracing
        repo.ui.debug("switching to full copytracing as base is not "
                      "an ancestor of c2\n")
        return _fullcopytracing(repo, c1, c2, base)

    ctx = c2
    while ctx != base:
        if len(ctx.parents()) == 2:
            # To keep things simple let's not handle merges
            repo.ui.debug("switching to full copytracing because of merges\n")
            return _fullcopytracing(repo, c1, c2, base)
        changedfiles.update(ctx.files())
        ctx = ctx.p1()

    cp = _forwardcopies(base, c2)
    for dst, src in cp.iteritems():
        if src in m1:
            copies[dst] = src

    # file is missing if it isn't present in the destination, but is present in
    # the base and present in the source.
    # Presence in the base is important to exclude added files, presence in the
    # source is important to exclude removed files.
    filt = lambda f: f not in m1 and f in base and f in c2
    missingfiles = [f for f in changedfiles if filt(f)]

    if missingfiles:
        basenametofilename = collections.defaultdict(list)
        dirnametofilename = collections.defaultdict(list)
        for f in m1.filesnotin(base.manifest()):
            basename = os.path.basename(f)
            dirname = os.path.dirname(f)
            basenametofilename[basename].append(f)
            dirnametofilename[dirname].append(f)

        for f in missingfiles:
            basename = os.path.basename(f)
            dirname = os.path.dirname(f)
            samebasename = basenametofilename[basename]
            samedirname = dirnametofilename[dirname]
            movecandidates = samebasename + samedirname
            # f is guaranteed to be present in c2, that's why
            # c2.filectx(f) won't fail
            f2 = c2.filectx(f)
            # we can have a lot of candidates which can slow down the heuristics
            # config value to limit the number of candidate moves to check
            maxcandidates = repo.ui.configint('experimental',
                                              'copytrace.movecandidateslimit')
            if len(movecandidates) > maxcandidates:
                repo.ui.status(_("skipping copytracing for '%s', more "
                                 "candidates than the limit: %d\n")
                               % (f, len(movecandidates)))
                continue

            for candidate in movecandidates:
                f1 = c1.filectx(candidate)
                if _related(f1, f2):
                    # if there are a few related copies then we'll merge
                    # changes into all of them. This matches the behaviour
                    # of upstream copytracing
                    copies[candidate] = f

    return copies, {}, {}, {}, {}

def _related(f1, f2):
    """return True if f1 and f2 filectx have a common ancestor

    Walk back to common ancestor to see if the two files originate
    from the same file. Since workingfilectx's rev() is None it messes
    up the integer comparison logic, hence the pre-step check for
    None (f1 and f2 can only be workingfilectx's initially).
    """

    if f1 == f2:
        return True  # a match

    g1, g2 = f1.ancestors(), f2.ancestors()
    try:
        f1r, f2r = f1.linkrev(), f2.linkrev()

        if f1r is None:
            f1 = next(g1)
        if f2r is None:
            f2 = next(g2)

        while True:
            f1r, f2r = f1.linkrev(), f2.linkrev()
            if f1r > f2r:
                f1 = next(g1)
            elif f2r > f1r:
                f2 = next(g2)
            else:  # f1 and f2 point to files in the same linkrev
                return f1 == f2  # true if they point to the same file
    except StopIteration:
        return False

def duplicatecopies(repo, wctx, rev, fromrev, skiprev=None):
    """reproduce copies from fromrev to rev in the dirstate

    If skiprev is specified, it's a revision that should be used to
    filter copy records. Any copies that occur between fromrev and
    skiprev will not be duplicated, even if they appear in the set of
    copies between fromrev and rev.
    """
    exclude = {}
    ctraceconfig = repo.ui.config('experimental', 'copytrace')
    bctrace = stringutil.parsebool(ctraceconfig)
    if (skiprev is not None and
        (ctraceconfig == 'heuristics' or bctrace or bctrace is None)):
        # copytrace='off' skips this line, but not the entire function because
        # the line below is O(size of the repo) during a rebase, while the rest
        # of the function is much faster (and is required for carrying copy
        # metadata across the rebase anyway).
        exclude = pathcopies(repo[fromrev], repo[skiprev])
    for dst, src in pathcopies(repo[fromrev], repo[rev]).iteritems():
        if dst in exclude:
            continue
        if dst in wctx:
            wctx[dst].markcopied(src)