inno: script to automate building Inno installer

The official Inno installer build process is poorly documented, and attempting to reproduce the behavior of the installer uploaded to www.mercurial-scm.org has revealed a number of unexpected behaviors. This commit attempts to improve the reproducibility of the Inno installer by introducing a Python script that largely automates building it.

The new script (which must be run from an environment with the Visual C++ environment configured) takes care of producing an Inno installer. When run from a fresh Mercurial source checkout with all the proper system dependencies (the VC++ toolchain, Windows 10 SDK, and Inno tools) installed, it "just works." The script downloads all the Python dependencies in a secure manner and manages the build environment for you. You don't need any additional config files: just launch the script, pointing it at an existing Python and ISCC binary, and it takes care of the rest.

The produced installer creates a Mercurial installation with a handful of differences from the existing 4.9 installers (produced by someone else):

* add_path.exe is missing (this was removed a few changesets ago).
* The set of api-ms-win-core-* DLLs is different (I suspect this is due to me using a different UCRT / Windows version).
* kernelbase.dll and msasn1.dll are missing.
* There is a different set of .pyc files for dulwich, keyring, and pygments due to us using the latest versions of each.
* We include Tcl/Tk DLLs and .pyc files (I'm not sure why these are missing from the existing installers).
* We include the urllib3 and win32ctypes packages (which are dependencies of dulwich and pywin32, respectively). I'm not sure why these aren't present in the existing installers.
* We include a different set of files for the distutils package. I'm not sure why, but it should be harmless.
* We include the docutils package (it is getting picked up as a dependency somehow). I think this is fine.
* We include a copy of argparse.pyc. I'm not sure why this was missing from existing installers.
* We don't have a copy of sqlite3/dump.pyc. I'm not sure why. The SQLite C extension code only imports this module when conn.iterdump() is called, so it should be safe to omit.
* We include files in the email.test and test packages. The set of files is small and their presence should be harmless.

The new script and support code are written in Python 3 because they are brand new, independent code, and I don't believe new Python projects should be using Python 2 in 2019 if they have a choice about it.

The readme.txt file has been renamed to readme.rst and overhauled to reflect the existence of build.py.

Differential Revision: https://phab.mercurial-scm.org/D6066
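For illustration only, here is a minimal sketch of the kind of hash-pinned fetching the commit message alludes to when it says dependencies are downloaded "in a secure manner". The function name, arguments, and error handling below are hypothetical placeholders, not code taken from build.py.

# Hypothetical sketch (Python 3): fetch a pinned artifact and verify its
# SHA-256 digest before trusting it. Not the actual build.py logic.
import hashlib
import pathlib
import urllib.request


def secure_download(url, expected_sha256, dest):
    """Download url to dest, refusing the file on a digest mismatch."""
    data = urllib.request.urlopen(url).read()
    digest = hashlib.sha256(data).hexdigest()
    if digest != expected_sha256:
        raise ValueError('hash mismatch for %s: got %s, expected %s'
                         % (url, digest, expected_sha256))
    pathlib.Path(dest).write_bytes(data)
    return dest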

File last commit: r30180:736f92c4 default
Current changeset: r42019:d7dc4ac1 default
localstore.py
68 lines | 2.4 KiB | text/x-python
# Copyright 2009-2010 Gregory P. Ward
# Copyright 2009-2010 Intelerad Medical Systems Incorporated
# Copyright 2010-2011 Fog Creek Software
# Copyright 2010-2011 Unity Technologies
#
# This software may be used and distributed according to the terms of the
# GNU General Public License version 2 or any later version.
'''store class for local filesystem'''
from __future__ import absolute_import

from mercurial.i18n import _
from mercurial import util

from . import (
    basestore,
    lfutil,
)


class localstore(basestore.basestore):
    '''localstore first attempts to grab files out of the store in the remote
    Mercurial repository. Failing that, it attempts to grab the files from
    the user cache.'''

    def __init__(self, ui, repo, remote):
        self.remote = remote.local()
        super(localstore, self).__init__(ui, repo, self.remote.url())

    def put(self, source, hash):
        if lfutil.instore(self.remote, hash):
            return
        lfutil.link(source, lfutil.storepath(self.remote, hash))

    def exists(self, hashes):
        retval = {}
        for hash in hashes:
            retval[hash] = lfutil.instore(self.remote, hash)
        return retval

    def _getfile(self, tmpfile, filename, hash):
        path = lfutil.findfile(self.remote, hash)
        if not path:
            raise basestore.StoreError(filename, hash, self.url,
                                       _("can't get file locally"))
        with open(path, 'rb') as fd:
            return lfutil.copyandhash(
                util.filechunkiter(fd), tmpfile)

    def _verifyfiles(self, contents, filestocheck):
        failed = False
        for cset, filename, expectedhash in filestocheck:
            storepath, exists = lfutil.findstorepath(self.repo, expectedhash)
            if not exists:
                storepath, exists = lfutil.findstorepath(
                    self.remote, expectedhash)
            if not exists:
                self.ui.warn(
                    _('changeset %s: %s references missing %s\n')
                    % (cset, filename, storepath))
                failed = True
            elif contents:
                actualhash = lfutil.hashfile(storepath)
                if actualhash != expectedhash:
                    self.ui.warn(
                        _('changeset %s: %s references corrupted %s\n')
                        % (cset, filename, storepath))
                    failed = True
        return failed
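The lookup order described in the class docstring (the store in the remote repository first, then the user cache) can be illustrated with a small standalone sketch. The function and path parameters below are hypothetical stand-ins for what lfutil.findfile() and lfutil.findstorepath() do inside Mercurial; this is not largefiles code.

# Standalone illustration (not Mercurial code): prefer the repository's own
# largefile store, then fall back to the per-user cache.
import os


def find_largefile(repo_store, user_cache, hash):
    """Return the path of the stored file named by hash, or None."""
    for root in (repo_store, user_cache):
        candidate = os.path.join(root, hash)
        if os.path.isfile(candidate):
            return candidate
    return None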