@@ -0,0 +1,28 b'' | |||
|
1 | .. _repo-admin-set: | |
|
2 | .. _permissions-info-add-group-ref: | |
|
3 | ||
|
4 | Repository Administration | |
|
5 | ========================= | |
|
6 | ||
|
7 | Repository permissions in |RCE| can be managed in a number of different ways. | |
|
8 | This overview should give you insight into how to adopt particular | |
|
9 | settings for your needs: | |
|
10 | ||
|
11 | * Global |repo| permissions: This allows you to set the default permissions | |
|
12 | for each new |repo| created within |RCE|, see :ref:`repo-default-ref`. All | |
|
13 | |repos| created will inherit these permissions unless explicitly configured. | |
|
14 | * Individual |repo| permissions: To set individual |repo| permissions, | |
|
15 | see :ref:`set-repo-perms`. | |
|
16 | * Repository Group permissions: This allows you to define the permissions for | |
|
17 | a group, and all |repos| created within that group will inherit the same | |
|
18 | permissions. | |
|
19 | ||
|
20 | .. toctree:: | |
|
21 | ||
|
22 | repo_admin/repo-perm-steps | |
|
23 | repo_admin/repo-extra-fields | |
|
24 | repo_admin/repo-hooks | |
|
25 | repo_admin/repo-issue-tracker | |
|
26 | repo_admin/repo-vcs | |
|
27 | repo_admin/restore-deleted-repositories | |
|
28 | repo_admin/repo-admin-tasks No newline at end of file |
@@ -0,0 +1,24 b'' | |||
|
1 | .. _repo-admin-tasks: | |
|
2 | ||
|
3 | Common Admin Tasks for Repositories | |
|
4 | ----------------------------------- | |
|
5 | ||
|
6 | ||
|
7 | Manually Force Delete Repository | |
|
8 | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ | |
|
9 | ||
|
10 | Repositories that have attached forks or pull requests should be archived. | |
|
11 | If you still need to force delete a repository and remove all dependent objects, run: | |
|
12 | ||
|
13 | ||
|
14 | .. code-block:: bash | |
|
15 | ||
|
16 | # starts the ishell interactive prompt | |
|
17 | $ rccontrol ishell enterprise-1 | |
|
18 | ||
|
19 | .. code-block:: python | |
|
20 | ||
|
21 | In [1]: from rhodecode.model.repo import RepoModel | |
|
22 | In [2]: repo = Repository.get_by_repo_name('test_repos/repo_with_prs') | |
|
23 | In [3]: RepoModel().delete(repo, forks='detach', pull_requests='delete') | |
|
24 | In [4]: Session().commit() |
@@ -0,0 +1,24 b'' | |||
|
1 | .. _user-admin-tasks: | |
|
2 | ||
|
3 | Common Admin Tasks for Users | |
|
4 | ---------------------------- | |
|
5 | ||
|
6 | ||
|
7 | Manually Set Personal Repository Group | |
|
8 | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ | |
|
9 | ||
|
10 | Here is how to set a repository group as personal for a user using ishell. | |
|
11 | ||
|
12 | ||
|
13 | .. code-block:: bash | |
|
14 | ||
|
15 | # starts the ishell interactive prompt | |
|
16 | $ rccontrol ishell enterprise-1 | |
|
17 | ||
|
18 | .. code-block:: python | |
|
19 | ||
|
20 | In [1]: repo_group = RepoGroup.get_by_group_name('some_group_name') | |
|
21 | In [2]: user = User.get_by_username('some_user') | |
|
22 | In [3]: repo_group.user = user | |
|
23 | In [4]: repo_group.personal = True | |
|
24 | In [5]: Session().add(repo_group); Session().commit() |
@@ -0,0 +1,35 b'' | |||
|
1 | .. _search-methods-ref: | |
|
2 | ||
|
3 | search methods | |
|
4 | ============== | |
|
5 | ||
|
6 | search | |
|
7 | ------ | |
|
8 | ||
|
9 | .. py:function:: search(apiuser, search_query, search_type, page_limit=<Optional:10>, page=<Optional:1>, search_sort=<Optional:'newfirst'>, repo_name=<Optional:None>, repo_group_name=<Optional:None>) | |
|
10 | ||
|
11 | Fetch Full Text Search results using API. | |
|
12 | ||
|
13 | :param apiuser: This is filled automatically from the |authtoken|. | |
|
14 | :type apiuser: AuthUser | |
|
15 | :param search_query: Search query. | |
|
16 | :type search_query: str | |
|
17 | :param search_type: Search type. The following are valid options: | |
|
18 | * commit | |
|
19 | * content | |
|
20 | * path | |
|
21 | :type search_type: str | |
|
22 | :param page_limit: Page item limit, from 1 to 500. Default 10 items. | |
|
23 | :type page_limit: Optional(int) | |
|
24 | :param page: Page number. Default first page. | |
|
25 | :type page: Optional(int) | |
|
26 | :param search_sort: Search sort order. Default newfirst. The following are valid options: | |
|
27 | * newfirst | |
|
28 | * oldfirst | |
|
29 | :type search_sort: Optional(str) | |
|
30 | :param repo_name: Filter by one repo. Default is all. | |
|
31 | :type repo_name: Optional(str) | |
|
32 | :param repo_group_name: Filter by one repo group. Default is all. | |
|
33 | :type repo_group_name: Optional(str) | |
|
34 | ||
|
35 |
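For reference, a call to this method can be sketched as a JSON-RPC payload. The envelope shape below (`id`, `auth_token`, `method`, `args`) follows the usual RhodeCode API convention; the token and query values are placeholders, not real credentials.

```python
import json

# Sketch of a JSON-RPC payload for the search method; the auth token
# value is a placeholder and must be replaced with a real one.
payload = {
    "id": 1,
    "auth_token": "<your-auth-token>",  # placeholder
    "method": "search",
    "args": {
        "search_query": "def main",
        "search_type": "content",   # one of: commit, content, path
        "page_limit": 10,
        "page": 1,
        "search_sort": "newfirst",  # or: oldfirst
    },
}
body = json.dumps(payload)
```

The resulting `body` would typically be POSTed to the instance's API endpoint.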
@@ -0,0 +1,85 b'' | |||
|
1 | |RCE| 4.17.0 |RNS| | |
|
2 | ------------------ | |
|
3 | ||
|
4 | Release Date | |
|
5 | ^^^^^^^^^^^^ | |
|
6 | ||
|
7 | - 2019-07-04 | |
|
8 | ||
|
9 | ||
|
10 | New Features | |
|
11 | ^^^^^^^^^^^^ | |
|
12 | ||
|
13 | - New artifacts feature. | |
|
14 | Ability to store binary artifacts for repository with ACL | |
|
15 | - UI/UX refresh for most of the pages. This includes multiple fixes and improvements. | |
|
16 | - Diffs: store wide-diff mode in user sessions to store user preference for diff display. | |
|
17 | - Mercurial: added support for Mercurial 4.9 | |
|
18 | - API: Added search API methods | |
|
19 | - Files: adding/editing allows previews for generated content. | |
|
20 | - Files: allowed multi file upload using UI. | |
|
21 | - Repository Groups: last change is now smartly calculated based on latest change | |
|
22 | from all its child repositories. | |
|
23 | - Archives: it's now possible to download partial directories from the files view. | |
|
24 | - SVN: allowed executing pre-commit code with rcextensions, also added example to | |
|
25 | validate SVN file size and paths on pre-commit level. | |
|
26 | ||
|
27 | ||
|
28 | General | |
|
29 | ^^^^^^^ | |
|
30 | - Exception store: add filter for display and deletion. | |
|
31 | - Files: loading history doesn't display hidden and obsolete commits anymore. | |
|
32 | - Repositories: bring back missing watch action in summary view. | |
|
33 | - Admin: user groups is now using pure DB filtering to speed up display | |
|
34 | for a large number of groups. | |
|
35 | - Mercurial: enabled full evolve+topic extensions when evolve is enabled. | |
|
36 | - Dependencies: bumped evolve to 8.5.1 | |
|
37 | - Dependencies: bumped pyramid to 1.10.4 | |
|
38 | - Dependencies: bumped psutil to 5.5.1 | |
|
39 | - Dependencies: bumped pygments to 2.4.2 | |
|
41 | - Dependencies: bumped psycopg2 to 2.8.3 | |
|
42 | - Dependencies [security]: updated colander to 1.7.0 | |
|
43 | ||
|
44 | ||
|
45 | Security | |
|
46 | ^^^^^^^^ | |
|
47 | ||
|
48 | - SSH: replaced pycrypto with cryptography to generate SSH keys as pycrypto isn't | |
|
49 | considered safe anymore. | |
|
50 | ||
|
51 | ||
|
52 | Performance | |
|
53 | ^^^^^^^^^^^ | |
|
54 | ||
|
55 | - Config: updated header limits on gunicorn to prevent errors on large Mercurial repositories. | |
|
56 | - User sessions: added option to cleanup redis based sessions in user session interface. | |
|
57 | - Authentication: reduced usage of raw auth calls inside templates to speed up rendering. | |
|
58 | - Sessions: don't touch the session for API calls. Previously each API call created a new session | |
|
59 | object, which wasn't required. | |
|
60 | ||
|
61 | ||
|
62 | Fixes | |
|
63 | ^^^^^ | |
|
64 | ||
|
65 | - Hooks: fixed more unicode problems with the new pull-request link generator. | |
|
66 | - Mercurial: fix ssh-server support for mercurial custom options. | |
|
67 | - Pull requests: updated metadata information for failed merges with multiple heads. | |
|
68 | - Pull requests: calculate ancestor in the same way as creation mode. | |
|
69 | Fixed problem with updates generating wrong diffs in case of merges. | |
|
70 | - Pull requests: fixed a bug in removal of multiple reviewers at once. | |
|
71 | - Summary: fix timeout issues loading summary page without styling. | |
|
72 | - SSH: fix invocation of custom hgrc. | |
|
73 | - SSH: call custom hooks via the SSH backend. | |
|
74 | - Markup: fix styling for check-lists. | |
|
75 | - Archives: allow downloading refs that contain slashes and special characters, e.g. f/feat1 branch names. | |
|
76 | - Files: ensure we generate archives with consistent hashing (except for .tar.gz, which uses temp file names in the header). | |
|
77 | - Files: fixed rendering of readme files under non-ascii paths. | |
|
78 | ||
|
79 | ||
|
80 | Upgrade notes | |
|
81 | ^^^^^^^^^^^^^ | |
|
82 | ||
|
83 | - In this release we introduced a new UI across the application. | |
|
84 | If you encounter display problems on your systems, please send details to support@rhodecode.com. | |
|
85 |
@@ -0,0 +1,19 b'' | |||
|
1 | # contains not directly required libraries we want to pin the version. | |
|
2 | ||
|
3 | atomicwrites==1.2.1 | |
|
4 | attrs==18.2.0 | |
|
5 | billiard==3.5.0.3 | |
|
6 | chameleon==2.24 | |
|
7 | cffi==1.12.2 | |
|
8 | ecdsa==0.13.2 | |
|
9 | hupper==1.6.1 | |
|
10 | gnureadline==6.3.8 | |
|
11 | jinja2==2.9.6 | |
|
12 | jsonschema==2.6.0 | |
|
13 | pyramid-jinja2==2.7 | |
|
14 | pluggy==0.11.0 | |
|
15 | setproctitle==1.1.10 | |
|
16 | scandir==1.10.0 | |
|
17 | tempita==0.5.2 | |
|
18 | vine==1.3.0 | |
|
19 | configparser==3.7.4 |
@@ -0,0 +1,93 b'' | |||
|
1 | # -*- coding: utf-8 -*- | |
|
2 | ||
|
3 | # Copyright (C) 2010-2019 RhodeCode GmbH | |
|
4 | # | |
|
5 | # This program is free software: you can redistribute it and/or modify | |
|
6 | # it under the terms of the GNU Affero General Public License, version 3 | |
|
7 | # (only), as published by the Free Software Foundation. | |
|
8 | # | |
|
9 | # This program is distributed in the hope that it will be useful, | |
|
10 | # but WITHOUT ANY WARRANTY; without even the implied warranty of | |
|
11 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the | |
|
12 | # GNU General Public License for more details. | |
|
13 | # | |
|
14 | # You should have received a copy of the GNU Affero General Public License | |
|
15 | # along with this program. If not, see <http://www.gnu.org/licenses/>. | |
|
16 | # | |
|
17 | # This program is dual-licensed. If you wish to learn more about the | |
|
18 | # RhodeCode Enterprise Edition, including its added features, Support services, | |
|
19 | # and proprietary license terms, please see https://rhodecode.com/licenses/ | |
|
20 | ||
|
21 | import pytest | |
|
22 | from rhodecode.tests import HG_REPO | |
|
23 | from rhodecode.api.tests.utils import ( | |
|
24 | build_data, api_call, assert_error, assert_ok) | |
|
25 | ||
|
26 | ||
|
27 | @pytest.mark.usefixtures("testuser_api", "app") | |
|
28 | class TestApiSearch(object): | |
|
29 | ||
|
30 | @pytest.mark.parametrize("query, expected_hits, expected_paths", [ | |
|
31 | ('todo', 23, [ | |
|
32 | 'vcs/backends/hg/inmemory.py', | |
|
33 | 'vcs/tests/test_git.py']), | |
|
34 | ('extension:rst installation', 6, [ | |
|
35 | 'docs/index.rst', | |
|
36 | 'docs/installation.rst']), | |
|
37 | ('def repo', 87, [ | |
|
38 | 'vcs/tests/test_git.py', | |
|
39 | 'vcs/tests/test_changesets.py']), | |
|
40 | ('repository:%s def test' % HG_REPO, 18, [ | |
|
41 | 'vcs/tests/test_git.py', | |
|
42 | 'vcs/tests/test_changesets.py']), | |
|
43 | ('"def main"', 9, [ | |
|
44 | 'vcs/__init__.py', | |
|
45 | 'vcs/tests/__init__.py', | |
|
46 | 'vcs/utils/progressbar.py']), | |
|
47 | ('owner:test_admin', 358, [ | |
|
48 | 'vcs/tests/base.py', | |
|
49 | 'MANIFEST.in', | |
|
50 | 'vcs/utils/termcolors.py', | |
|
51 | 'docs/theme/ADC/static/documentation.png']), | |
|
52 | ('owner:test_admin def main', 72, [ | |
|
53 | 'vcs/__init__.py', | |
|
54 | 'vcs/tests/test_utils_filesize.py', | |
|
55 | 'vcs/tests/test_cli.py']), | |
|
56 | ('owner:michał test', 0, []), | |
|
57 | ]) | |
|
58 | def test_search_content_results(self, query, expected_hits, expected_paths): | |
|
59 | id_, params = build_data( | |
|
60 | self.apikey_regular, 'search', | |
|
61 | search_query=query, | |
|
62 | search_type='content') | |
|
63 | ||
|
64 | response = api_call(self.app, params) | |
|
65 | json_response = response.json | |
|
66 | ||
|
67 | assert json_response['result']['item_count'] == expected_hits | |
|
68 | paths = [x['f_path'] for x in json_response['result']['results']] | |
|
69 | ||
|
70 | for expected_path in expected_paths: | |
|
71 | assert expected_path in paths | |
|
72 | ||
|
73 | @pytest.mark.parametrize("query, expected_hits, expected_paths", [ | |
|
74 | ('readme.rst', 3, []), | |
|
75 | ('test*', 75, []), | |
|
76 | ('*model*', 1, []), | |
|
77 | ('extension:rst', 48, []), | |
|
78 | ('extension:rst api', 24, []), | |
|
79 | ]) | |
|
80 | def test_search_file_paths(self, query, expected_hits, expected_paths): | |
|
81 | id_, params = build_data( | |
|
82 | self.apikey_regular, 'search', | |
|
83 | search_query=query, | |
|
84 | search_type='path') | |
|
85 | ||
|
86 | response = api_call(self.app, params) | |
|
87 | json_response = response.json | |
|
88 | ||
|
89 | assert json_response['result']['item_count'] == expected_hits | |
|
90 | paths = [x['f_path'] for x in json_response['result']['results']] | |
|
91 | ||
|
92 | for expected_path in expected_paths: | |
|
93 | assert expected_path in paths |
@@ -0,0 +1,112 b'' | |||
|
1 | # -*- coding: utf-8 -*- | |
|
2 | ||
|
3 | # Copyright (C) 2011-2019 RhodeCode GmbH | |
|
4 | # | |
|
5 | # This program is free software: you can redistribute it and/or modify | |
|
6 | # it under the terms of the GNU Affero General Public License, version 3 | |
|
7 | # (only), as published by the Free Software Foundation. | |
|
8 | # | |
|
9 | # This program is distributed in the hope that it will be useful, | |
|
10 | # but WITHOUT ANY WARRANTY; without even the implied warranty of | |
|
11 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the | |
|
12 | # GNU General Public License for more details. | |
|
13 | # | |
|
14 | # You should have received a copy of the GNU Affero General Public License | |
|
15 | # along with this program. If not, see <http://www.gnu.org/licenses/>. | |
|
16 | # | |
|
17 | # This program is dual-licensed. If you wish to learn more about the | |
|
18 | # RhodeCode Enterprise Edition, including its added features, Support services, | |
|
19 | # and proprietary license terms, please see https://rhodecode.com/licenses/ | |
|
20 | ||
|
21 | ||
|
22 | import logging | |
|
23 | ||
|
24 | from rhodecode.api import jsonrpc_method | |
|
25 | from rhodecode.api.exc import JSONRPCValidationError | |
|
26 | from rhodecode.api.utils import Optional | |
|
27 | from rhodecode.lib.index import searcher_from_config | |
|
28 | from rhodecode.model import validation_schema | |
|
29 | from rhodecode.model.validation_schema.schemas import search_schema | |
|
30 | ||
|
31 | log = logging.getLogger(__name__) | |
|
32 | ||
|
33 | ||
|
34 | @jsonrpc_method() | |
|
35 | def search(request, apiuser, search_query, search_type, page_limit=Optional(10), | |
|
36 | page=Optional(1), search_sort=Optional('newfirst'), | |
|
37 | repo_name=Optional(None), repo_group_name=Optional(None)): | |
|
38 | """ | |
|
39 | Fetch Full Text Search results using API. | |
|
40 | ||
|
41 | :param apiuser: This is filled automatically from the |authtoken|. | |
|
42 | :type apiuser: AuthUser | |
|
43 | :param search_query: Search query. | |
|
44 | :type search_query: str | |
|
45 | :param search_type: Search type. The following are valid options: | |
|
46 | * commit | |
|
47 | * content | |
|
48 | * path | |
|
49 | :type search_type: str | |
|
50 | :param page_limit: Page item limit, from 1 to 500. Default 10 items. | |
|
51 | :type page_limit: Optional(int) | |
|
52 | :param page: Page number. Default first page. | |
|
53 | :type page: Optional(int) | |
|
54 | :param search_sort: Search sort order. Default newfirst. The following are valid options: | |
|
55 | * newfirst | |
|
56 | * oldfirst | |
|
57 | :type search_sort: Optional(str) | |
|
58 | :param repo_name: Filter by one repo. Default is all. | |
|
59 | :type repo_name: Optional(str) | |
|
60 | :param repo_group_name: Filter by one repo group. Default is all. | |
|
61 | :type repo_group_name: Optional(str) | |
|
62 | """ | |
|
63 | ||
|
64 | data = {'execution_time': ''} | |
|
65 | repo_name = Optional.extract(repo_name) | |
|
66 | repo_group_name = Optional.extract(repo_group_name) | |
|
67 | ||
|
68 | schema = search_schema.SearchParamsSchema() | |
|
69 | ||
|
70 | try: | |
|
71 | search_params = schema.deserialize( | |
|
72 | dict(search_query=search_query, | |
|
73 | search_type=search_type, | |
|
74 | search_sort=Optional.extract(search_sort), | |
|
75 | page_limit=Optional.extract(page_limit), | |
|
76 | requested_page=Optional.extract(page)) | |
|
77 | ) | |
|
78 | except validation_schema.Invalid as err: | |
|
79 | raise JSONRPCValidationError(colander_exc=err) | |
|
80 | ||
|
81 | search_query = search_params.get('search_query') | |
|
82 | search_type = search_params.get('search_type') | |
|
83 | search_sort = search_params.get('search_sort') | |
|
84 | ||
|
85 | if search_params.get('search_query'): | |
|
86 | page_limit = search_params['page_limit'] | |
|
87 | requested_page = search_params['requested_page'] | |
|
88 | ||
|
89 | searcher = searcher_from_config(request.registry.settings) | |
|
90 | ||
|
91 | try: | |
|
92 | search_result = searcher.search( | |
|
93 | search_query, search_type, apiuser, repo_name, repo_group_name, | |
|
94 | requested_page=requested_page, page_limit=page_limit, sort=search_sort) | |
|
95 | ||
|
96 | data.update(dict( | |
|
97 | results=list(search_result['results']), page=requested_page, | |
|
98 | item_count=search_result['count'], | |
|
99 | items_per_page=page_limit)) | |
|
100 | finally: | |
|
101 | searcher.cleanup() | |
|
102 | ||
|
103 | if not search_result['error']: | |
|
104 | data['execution_time'] = '%s results (%.3f seconds)' % ( | |
|
105 | search_result['count'], | |
|
106 | search_result['runtime']) | |
|
107 | else: | |
|
108 | node = schema['search_query'] | |
|
109 | raise JSONRPCValidationError( | |
|
110 | colander_exc=validation_schema.Invalid(node, search_result['error'])) | |
|
111 | ||
|
112 | return data |
@@ -0,0 +1,48 b'' | |||
|
1 | # -*- coding: utf-8 -*- | |
|
2 | ||
|
3 | # Copyright (C) 2011-2019 RhodeCode GmbH | |
|
4 | # | |
|
5 | # This program is free software: you can redistribute it and/or modify | |
|
6 | # it under the terms of the GNU Affero General Public License, version 3 | |
|
7 | # (only), as published by the Free Software Foundation. | |
|
8 | # | |
|
9 | # This program is distributed in the hope that it will be useful, | |
|
10 | # but WITHOUT ANY WARRANTY; without even the implied warranty of | |
|
11 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the | |
|
12 | # GNU General Public License for more details. | |
|
13 | # | |
|
14 | # You should have received a copy of the GNU Affero General Public License | |
|
15 | # along with this program. If not, see <http://www.gnu.org/licenses/>. | |
|
16 | # | |
|
17 | # This program is dual-licensed. If you wish to learn more about the | |
|
18 | # RhodeCode Enterprise Edition, including its added features, Support services, | |
|
19 | # and proprietary license terms, please see https://rhodecode.com/licenses/ | |
|
20 | ||
|
21 | import logging | |
|
22 | ||
|
23 | from pyramid.view import view_config | |
|
24 | ||
|
25 | from rhodecode.apps._base import RepoAppView | |
|
26 | from rhodecode.lib.auth import ( | |
|
27 | LoginRequired, HasRepoPermissionAnyDecorator) | |
|
28 | ||
|
29 | log = logging.getLogger(__name__) | |
|
30 | ||
|
31 | ||
|
32 | class RepoArtifactsView(RepoAppView): | |
|
33 | ||
|
34 | def load_default_context(self): | |
|
35 | c = self._get_local_tmpl_context(include_app_defaults=True) | |
|
36 | c.rhodecode_repo = self.rhodecode_vcs_repo | |
|
37 | return c | |
|
38 | ||
|
39 | @LoginRequired() | |
|
40 | @HasRepoPermissionAnyDecorator( | |
|
41 | 'repository.read', 'repository.write', 'repository.admin') | |
|
42 | @view_config( | |
|
43 | route_name='repo_artifacts_list', request_method='GET', | |
|
44 | renderer='rhodecode:templates/artifacts/artifact_list.mako') | |
|
45 | def repo_artifacts(self): | |
|
46 | c = self.load_default_context() | |
|
47 | c.active = 'artifacts' | |
|
48 | return self._get_template_context(c) |
@@ -0,0 +1,55 b'' | |||
|
1 | # -*- coding: utf-8 -*- | |
|
2 | ||
|
3 | import logging | |
|
4 | ||
|
5 | from alembic.migration import MigrationContext | |
|
6 | from alembic.operations import Operations | |
|
7 | from sqlalchemy import String, Column | |
|
8 | from sqlalchemy.sql import text | |
|
9 | ||
|
10 | from rhodecode.lib.dbmigrate.versions import _reset_base | |
|
11 | from rhodecode.lib.utils2 import safe_str | |
|
12 | from rhodecode.model import meta, init_model_encryption | |
|
13 | from rhodecode.model.db import RepoGroup | |
|
14 | ||
|
15 | ||
|
16 | log = logging.getLogger(__name__) | |
|
17 | ||
|
18 | ||
|
19 | def upgrade(migrate_engine): | |
|
20 | """ | |
|
21 | Upgrade operations go here. | |
|
22 | Don't create your own engine; bind migrate_engine to your metadata | |
|
23 | """ | |
|
24 | _reset_base(migrate_engine) | |
|
25 | from rhodecode.lib.dbmigrate.schema import db_4_16_0_2 | |
|
26 | ||
|
27 | init_model_encryption(db_4_16_0_2) | |
|
28 | ||
|
29 | context = MigrationContext.configure(migrate_engine.connect()) | |
|
30 | op = Operations(context) | |
|
31 | ||
|
32 | repo_group = db_4_16_0_2.RepoGroup.__table__ | |
|
33 | ||
|
34 | with op.batch_alter_table(repo_group.name) as batch_op: | |
|
35 | batch_op.add_column( | |
|
36 | Column("repo_group_name_hash", String(1024), nullable=True, unique=False)) | |
|
37 | ||
|
38 | _generate_repo_group_name_hashes(db_4_16_0_2, op, meta.Session) | |
|
39 | ||
|
40 | ||
|
41 | def downgrade(migrate_engine): | |
|
42 | pass | |
|
43 | ||
|
44 | ||
|
45 | def _generate_repo_group_name_hashes(models, op, session): | |
|
46 | repo_groups = models.RepoGroup.get_all() | |
|
47 | for repo_group in repo_groups: | |
|
48 | print(safe_str(repo_group.group_name)) | |
|
49 | hash_ = RepoGroup.hash_repo_group_name(repo_group.group_name) | |
|
50 | params = {'hash': hash_, 'id': repo_group.group_id} | |
|
51 | query = text( | |
|
52 | 'UPDATE groups SET repo_group_name_hash = :hash' | |
|
53 | ' WHERE group_id = :id').bindparams(**params) | |
|
54 | op.execute(query) | |
|
55 | session().commit() |
@@ -0,0 +1,35 b'' | |||
|
1 | # -*- coding: utf-8 -*- | |
|
2 | ||
|
3 | import logging | |
|
4 | ||
|
5 | from alembic.migration import MigrationContext | |
|
6 | from alembic.operations import Operations | |
|
7 | ||
|
8 | from rhodecode.lib.dbmigrate.versions import _reset_base | |
|
9 | from rhodecode.model import init_model_encryption | |
|
10 | ||
|
11 | ||
|
12 | log = logging.getLogger(__name__) | |
|
13 | ||
|
14 | ||
|
15 | def upgrade(migrate_engine): | |
|
16 | """ | |
|
17 | Upgrade operations go here. | |
|
18 | Don't create your own engine; bind migrate_engine to your metadata | |
|
19 | """ | |
|
20 | _reset_base(migrate_engine) | |
|
21 | from rhodecode.lib.dbmigrate.schema import db_4_16_0_2 | |
|
22 | ||
|
23 | init_model_encryption(db_4_16_0_2) | |
|
24 | ||
|
25 | context = MigrationContext.configure(migrate_engine.connect()) | |
|
26 | op = Operations(context) | |
|
27 | ||
|
28 | repo_group = db_4_16_0_2.RepoGroup.__table__ | |
|
29 | ||
|
30 | with op.batch_alter_table(repo_group.name) as batch_op: | |
|
31 | batch_op.alter_column("repo_group_name_hash", nullable=False) | |
|
32 | ||
|
33 | ||
|
34 | def downgrade(migrate_engine): | |
|
35 | pass |
@@ -0,0 +1,37 b'' | |||
|
1 | # -*- coding: utf-8 -*- | |
|
2 | ||
|
3 | import logging | |
|
4 | ||
|
5 | from alembic.migration import MigrationContext | |
|
6 | from alembic.operations import Operations | |
|
7 | from sqlalchemy import Column, LargeBinary | |
|
8 | ||
|
9 | from rhodecode.lib.dbmigrate.versions import _reset_base | |
|
10 | from rhodecode.model import init_model_encryption | |
|
11 | ||
|
12 | ||
|
13 | log = logging.getLogger(__name__) | |
|
14 | ||
|
15 | ||
|
16 | def upgrade(migrate_engine): | |
|
17 | """ | |
|
18 | Upgrade operations go here. | |
|
19 | Don't create your own engine; bind migrate_engine to your metadata | |
|
20 | """ | |
|
21 | _reset_base(migrate_engine) | |
|
22 | from rhodecode.lib.dbmigrate.schema import db_4_16_0_2 | |
|
23 | ||
|
24 | init_model_encryption(db_4_16_0_2) | |
|
25 | ||
|
26 | context = MigrationContext.configure(migrate_engine.connect()) | |
|
27 | op = Operations(context) | |
|
28 | ||
|
29 | repo_group = db_4_16_0_2.RepoGroup.__table__ | |
|
30 | ||
|
31 | with op.batch_alter_table(repo_group.name) as batch_op: | |
|
32 | batch_op.add_column( | |
|
33 | Column("changeset_cache", LargeBinary(1024), nullable=True)) | |
|
34 | ||
|
35 | ||
|
36 | def downgrade(migrate_engine): | |
|
37 | pass |
@@ -0,0 +1,69 b'' | |||
|
1 | import os | |
|
2 | import base64 | |
|
3 | from cryptography.fernet import Fernet, InvalidToken | |
|
4 | from cryptography.hazmat.backends import default_backend | |
|
5 | from cryptography.hazmat.primitives import hashes | |
|
6 | from cryptography.hazmat.primitives.kdf.pbkdf2 import PBKDF2HMAC | |
|
7 | ||
|
8 | ||
|
9 | class Encryptor(object): | |
|
10 | key_format = 'enc2$salt:{}$data:{}' | |
|
11 | pref_len = 5  # length of the 'salt:' / 'data:' prefixes | |
|
12 | ||
|
13 | def __init__(self, enc_key): | |
|
14 | self.enc_key = enc_key | |
|
15 | ||
|
16 | def b64_encode(self, data): | |
|
17 | return base64.urlsafe_b64encode(data) | |
|
18 | ||
|
19 | def b64_decode(self, data): | |
|
20 | return base64.urlsafe_b64decode(data) | |
|
21 | ||
|
22 | def get_encryptor(self, salt): | |
|
23 | """ | |
|
24 | Uses Fernet as encryptor with HMAC signature | |
|
25 | :param salt: random salt used for encrypting the data | |
|
26 | """ | |
|
27 | kdf = PBKDF2HMAC( | |
|
28 | algorithm=hashes.SHA512(), | |
|
29 | length=32, | |
|
30 | salt=salt, | |
|
31 | iterations=100000, | |
|
32 | backend=default_backend() | |
|
33 | ) | |
|
34 | key = self.b64_encode(kdf.derive(self.enc_key)) | |
|
35 | return Fernet(key) | |
|
36 | ||
|
37 | def _get_parts(self, enc_data): | |
|
38 | parts = enc_data.split('$', 3) | |
|
39 | if len(parts) != 3: | |
|
40 | raise ValueError('Encrypted Data has invalid format, expected {}'.format(self.key_format)) | |
|
41 | prefix, salt, enc_data = parts | |
|
42 | ||
|
43 | try: | |
|
44 | salt = self.b64_decode(salt[self.pref_len:]) | |
|
45 | except TypeError: | |
|
46 | # bad base64 | |
|
47 | raise ValueError('Encrypted Data salt invalid format, expected base64 format') | |
|
48 | ||
|
49 | enc_data = enc_data[self.pref_len:] | |
|
50 | return prefix, salt, enc_data | |
|
51 | ||
|
52 | def encrypt(self, data): | |
|
53 | salt = os.urandom(64) | |
|
54 | encryptor = self.get_encryptor(salt) | |
|
55 | enc_data = encryptor.encrypt(data) | |
|
56 | return self.key_format.format(self.b64_encode(salt), enc_data) | |
|
57 | ||
|
58 | def decrypt(self, data, safe=True): | |
|
59 | parts = self._get_parts(data) | |
|
60 | salt = parts[1] | |
|
61 | enc_data = parts[2] | |
|
62 | encryptor = self.get_encryptor(salt) | |
|
63 | try: | |
|
64 | return encryptor.decrypt(enc_data) | |
|
65 | except (InvalidToken,): | |
|
66 | if safe: | |
|
67 | return '' | |
|
68 | else: | |
|
69 | raise |
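The `enc2$salt:...$data:...` layout produced by `key_format` can be exercised in isolation. The sketch below re-implements just the pack/unpack steps with stdlib base64; `pack` and `unpack` are illustrative helper names, not part of the class.

```python
import base64

# Standalone sketch of the Encryptor on-disk layout.
KEY_FORMAT = 'enc2$salt:{}$data:{}'
PREF_LEN = 5  # length of the 'salt:' / 'data:' prefixes

def pack(salt, token):
    # salt: raw bytes; token: a base64url Fernet token (contains no '$')
    b64_salt = base64.urlsafe_b64encode(salt).decode('ascii')
    return KEY_FORMAT.format(b64_salt, token)

def unpack(blob):
    prefix, salt_part, data_part = blob.split('$', 2)
    salt = base64.urlsafe_b64decode(salt_part[PREF_LEN:])
    return prefix, salt, data_part[PREF_LEN:]

blob = pack(b'0123456789abcdef', 'TOKEN')
prefix, salt, token = unpack(blob)
```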
@@ -0,0 +1,200 b'' | |||
|
1 | # Copyright (c) 2010 Agendaless Consulting and Contributors. | |
|
2 | # (http://www.agendaless.com), All Rights Reserved | |
|
3 | # License: BSD-derived (http://www.repoze.org/LICENSE.txt) | |
|
4 | # With Patches from RhodeCode GmBH | |
|
5 | ||
|
6 | ||
|
7 | import os | |
|
8 | ||
|
9 | from beaker import cache | |
|
10 | from beaker.session import SessionObject | |
|
11 | from beaker.util import coerce_cache_params | |
|
12 | from beaker.util import coerce_session_params | |
|
13 | ||
|
14 | from pyramid.interfaces import ISession | |
|
15 | from pyramid.settings import asbool | |
|
16 | from zope.interface import implementer | |
|
17 | ||
|
18 | from binascii import hexlify | |
|
19 | ||
|
20 | ||
|
21 | def BeakerSessionFactoryConfig(**options): | |
|
22 | """ Return a Pyramid session factory using Beaker session settings | |
|
23 | supplied directly as ``**options``""" | |
|
24 | ||
|
25 | class PyramidBeakerSessionObject(SessionObject): | |
|
26 | _options = options | |
|
27 | _cookie_on_exception = _options.pop('cookie_on_exception', True) | |
|
28 | _constant_csrf_token = _options.pop('constant_csrf_token', False) | |
|
29 | ||
|
30 | def __init__(self, request): | |
|
31 | SessionObject.__init__(self, request.environ, **self._options) | |
|
32 | ||
|
33 | def session_callback(request, response): | |
|
34 | exception = getattr(request, 'exception', None) | |
|
35 | if (exception is None or self._cookie_on_exception) and self.accessed(): | |
|
36 | self.persist() | |
|
37 | headers = self.__dict__['_headers'] | |
|
38 | if headers['set_cookie'] and headers['cookie_out']: | |
|
39 | response.headerlist.append(('Set-Cookie', headers['cookie_out'])) | |
|
40 | request.add_response_callback(session_callback) | |
|
41 | ||
|
42 | # ISession API | |
|
43 | ||
|
44 | @property | |
|
45 | def id(self): | |
|
46 | # this is as inspected in SessionObject.__init__ | |
|
47 | if self.__dict__['_params'].get('type') != 'cookie': | |
|
48 | return self._session().id | |
|
49 | return None | |
|
50 | ||
|
51 | @property | |
|
52 | def new(self): | |
|
53 | return self.last_accessed is None | |
|
54 | ||
|
55 | changed = SessionObject.save | |
|
56 | ||
|
57 | # modifying dictionary methods | |
|
58 | ||
|
59 | @call_save | |
|
60 | def clear(self): | |
|
61 | return self._session().clear() | |
|
62 | ||
|
63 | @call_save | |
|
64 | def update(self, d, **kw): | |
|
65 | return self._session().update(d, **kw) | |
|
66 | ||
|
67 | @call_save | |
|
68 | def setdefault(self, k, d=None): | |
|
69 | return self._session().setdefault(k, d) | |
|
70 | ||
|
71 | @call_save | |
|
72 | def pop(self, k, d=None): | |
|
73 | return self._session().pop(k, d) | |
|
74 | ||
|
75 | @call_save | |
|
76 | def popitem(self): | |
|
77 | return self._session().popitem() | |
|
78 | ||
|
79 | __setitem__ = call_save(SessionObject.__setitem__) | |
|
80 | __delitem__ = call_save(SessionObject.__delitem__) | |
|
81 | ||
|
82 | # Flash API methods | |
|
83 | def flash(self, msg, queue='', allow_duplicate=True): | |
|
84 | storage = self.setdefault('_f_' + queue, []) | |
|
85 | if allow_duplicate or (msg not in storage): | |
|
86 | storage.append(msg) | |
|
87 | ||
|
88 | def pop_flash(self, queue=''): | |
|
89 | storage = self.pop('_f_' + queue, []) | |
|
90 | return storage | |
|
91 | ||
|
92 | def peek_flash(self, queue=''): | |
|
93 | storage = self.get('_f_' + queue, []) | |
|
94 | return storage | |
|
95 | ||
|
96 | # CSRF API methods | |
|
97 | def new_csrf_token(self): | |
|
98 | token = (self._constant_csrf_token | |
|
99 | or hexlify(os.urandom(20)).decode('ascii')) | |
|
100 | self['_csrft_'] = token | |
|
101 | return token | |
|
102 | ||
|
103 | def get_csrf_token(self): | |
|
104 | token = self.get('_csrft_', None) | |
|
105 | if token is None: | |
|
106 | token = self.new_csrf_token() | |
|
107 | return token | |
|
108 | ||
|
109 | return implementer(ISession)(PyramidBeakerSessionObject) | |
|
110 | ||
|
111 | ||
|
112 | def call_save(wrapped): | |
|
113 | """ By default, in non-auto-mode beaker badly wants people to | |
|
114 | call save even though it should know something has changed when | |
|
115 | a mutating method is called. This hack should be removed if | |
|
116 | Beaker ever starts to do this by default. """ | |
|
117 | def save(session, *arg, **kw): | |
|
118 | value = wrapped(session, *arg, **kw) | |
|
119 | session.save() | |
|
120 | return value | |
|
121 | save.__doc__ = wrapped.__doc__ | |
|
122 | return save | |
|
123 | ||
|
124 | ||
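The effect of the `call_save` decorator defined in this module can be sketched in isolation with a fake session object (the `FakeSession` class and its counter are illustrative, not part of the module):

```python
# Minimal sketch of the call_save pattern: wrap a mutating method so the
# session's save() runs automatically after every call.
def call_save(wrapped):
    def save(session, *args, **kw):
        value = wrapped(session, *args, **kw)
        session.save()
        return value
    save.__doc__ = wrapped.__doc__
    return save

class FakeSession(dict):
    def __init__(self):
        super().__init__()
        self.save_calls = 0  # counts how often save() was triggered

    def save(self):
        self.save_calls += 1

    # wrap dict.pop exactly the way the module wraps SessionObject methods
    pop = call_save(dict.pop)

s = FakeSession()
s['token'] = 'abc'
s.pop('token')
assert s.save_calls == 1  # save() ran automatically after the mutation
```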
|
125 | def session_factory_from_settings(settings): | |
|
126 | """ Return a Pyramid session factory using Beaker session settings | |
|
127 | supplied from a Paste configuration file""" | |
|
128 | prefixes = ('session.', 'beaker.session.') | |
|
129 | options = {} | |
|
130 | ||
|
131 | # Pull out any config args meant for beaker session. if there are any | |
|
132 | for k, v in settings.items(): | |
|
133 | for prefix in prefixes: | |
|
134 | if k.startswith(prefix): | |
|
135 | option_name = k[len(prefix):] | |
|
136 | if option_name == 'cookie_on_exception': | |
|
137 | v = asbool(v) | |
|
138 | options[option_name] = v | |
|
139 | ||
|
140 | options = coerce_session_params(options) | |
|
141 | return BeakerSessionFactoryConfig(**options) | |
|
142 | ||
|
143 | ||
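The prefix handling above can be exercised standalone; this sketch (with made-up setting keys) mirrors the extraction loop in `session_factory_from_settings`:

```python
# Mirror of the prefix-stripping loop: keys under 'session.' or
# 'beaker.session.' are collected with the prefix removed; other keys
# are ignored.
settings = {
    'session.type': 'file',
    'beaker.session.cookie_on_exception': 'true',
    'sqlalchemy.url': 'sqlite://',   # unrelated key, must be ignored
}
prefixes = ('session.', 'beaker.session.')
options = {}
for k, v in settings.items():
    for prefix in prefixes:
        if k.startswith(prefix):
            options[k[len(prefix):]] = v
assert options == {'type': 'file', 'cookie_on_exception': 'true'}
```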
|
144 | def set_cache_regions_from_settings(settings): | |
|
145 | """ Add cache support to the Pylons application. | |
|
146 | ||
|
147 | The ``settings`` passed to the configurator are used to setup | |
|
148 | the cache options. Cache options in the settings should start | |
|
149 | with either 'beaker.cache.' or 'cache.'. | |
|
150 | ||
|
151 | """ | |
|
152 | cache_settings = {'regions': []} | |
|
153 | for key in settings.keys(): | |
|
154 | for prefix in ['beaker.cache.', 'cache.']: | |
|
155 | if key.startswith(prefix): | |
|
156 | name = key.split(prefix)[1].strip() | |
|
157 | cache_settings[name] = settings[key].strip() | |
|
158 | ||
|
159 | if ('expire' in cache_settings | |
|
160 | and isinstance(cache_settings['expire'], basestring) | |
|
161 | and cache_settings['expire'].lower() in ['none', 'no']): | |
|
162 | cache_settings['expire'] = None | |
|
163 | ||
|
164 | coerce_cache_params(cache_settings) | |
|
165 | ||
|
166 | if 'enabled' not in cache_settings: | |
|
167 | cache_settings['enabled'] = True | |
|
168 | ||
|
169 | regions = cache_settings['regions'] | |
|
170 | if regions: | |
|
171 | for region in regions: | |
|
172 | if not region: | |
|
173 | continue | |
|
174 | ||
|
175 | region_settings = { | |
|
176 | 'data_dir': cache_settings.get('data_dir'), | |
|
177 | 'lock_dir': cache_settings.get('lock_dir'), | |
|
178 | 'expire': cache_settings.get('expire', 60), | |
|
179 | 'enabled': cache_settings['enabled'], | |
|
180 | 'key_length': cache_settings.get('key_length', 250), | |
|
181 | 'type': cache_settings.get('type'), | |
|
182 | 'url': cache_settings.get('url'), | |
|
183 | } | |
|
184 | region_prefix = '%s.' % region | |
|
185 | region_len = len(region_prefix) | |
|
186 | for key in list(cache_settings.keys()): | |
|
187 | if key.startswith(region_prefix): | |
|
188 | region_settings[key[region_len:]] = cache_settings.pop(key) | |
|
189 | ||
|
190 | if (isinstance(region_settings['expire'], basestring) | |
|
191 | and region_settings['expire'].lower() in ['none', 'no']): | |
|
192 | region_settings['expire'] = None | |
|
193 | coerce_cache_params(region_settings) | |
|
194 | cache.cache_regions[region] = region_settings | |
|
195 | ||
|
196 | ||
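The region handling above splits flat `beaker.cache.<region>.<option>` keys into per-region dictionaries; a standalone sketch of that key-splitting (the region and option names are invented for illustration):

```python
# Sketch of splitting 'beaker.cache.<region>.<opt>' keys into per-region
# settings, mirroring the loops in set_cache_regions_from_settings.
settings = {
    'beaker.cache.regions': 'short_term, long_term',
    'beaker.cache.short_term.expire': '60',
    'cache.long_term.expire': '3600',
}
cache_settings = {}
for key, value in settings.items():
    for prefix in ('beaker.cache.', 'cache.'):
        if key.startswith(prefix):
            cache_settings[key.split(prefix)[1].strip()] = value.strip()

regions = [r.strip() for r in cache_settings.pop('regions').split(',')]
per_region = {}
for region in regions:
    region_prefix = region + '.'
    per_region[region] = {
        k[len(region_prefix):]: v
        for k, v in cache_settings.items() if k.startswith(region_prefix)}

assert per_region == {'short_term': {'expire': '60'},
                      'long_term': {'expire': '3600'}}
```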
|
197 | def includeme(config): | |
|
198 | session_factory = session_factory_from_settings(config.registry.settings) | |
|
199 | config.set_session_factory(session_factory) | |
|
200 | set_cache_regions_from_settings(config.registry.settings) |
|
1 | NO CONTENT: new file 100644 | |
The requested commit or file is too big and content was truncated.
|
1 | NO CONTENT: new file 100755 | |
|
1 | NO CONTENT: new file 100644 | |
|
1 | NO CONTENT: new file 100644 | |
|
1 | NO CONTENT: new file 100644 | |
|
1 | NO CONTENT: new file 100644 | |
|
1 | NO CONTENT: new file 100644 | |
|
1 | NO CONTENT: new file 100644 | |
@@ -1,5 +1,5 b'' | |||
|
1 | 1 | [bumpversion] |
|
2 | current_version = 4.1

2 | current_version = 4.17.0
|
3 | 3 | message = release: Bump version {current_version} to {new_version} |
|
4 | 4 | |
|
5 | 5 | [bumpversion:file:rhodecode/VERSION] |
@@ -5,25 +5,20 b' done = false' | |||
|
5 | 5 | done = true |
|
6 | 6 | |
|
7 | 7 | [task:rc_tools_pinned] |
|
8 | done = true | |
|
9 | 8 | |
|
10 | 9 | [task:fixes_on_stable] |
|
11 | done = true | |
|
12 | 10 | |
|
13 | 11 | [task:pip2nix_generated] |
|
14 | done = true | |
|
15 | 12 | |
|
16 | 13 | [task:changelog_updated] |
|
17 | done = true | |
|
18 | 14 | |
|
19 | 15 | [task:generate_api_docs] |
|
20 | done = true | |
|
16 | ||
|
17 | [task:updated_translation] | |
|
21 | 18 | |
|
22 | 19 | [release] |
|
23 | state =

24 | version = 4.1
|
|
25 | ||
|
26 | [task:updated_translation] | |
|
20 | state = in_progress | |
|
21 | version = 4.17.0 | |
|
27 | 22 | |
|
28 | 23 | [task:generate_js_routes] |
|
29 | 24 |
@@ -53,3 +53,12 b' web-build:' | |||
|
53 | 53 | |
|
54 | 54 | generate-pkgs: |
|
55 | 55 | nix-shell pkgs/shell-generate.nix --command "pip2nix generate --licenses" |
|
56 | ||
|
57 | generate-js-pkgs: | |
|
58 | rm -rf node_modules && \ | |
|
59 | nix-shell pkgs/shell-generate.nix --command "node2nix --input package.json -o pkgs/node-packages.nix -e pkgs/node-env.nix -c pkgs/node-default.nix -d --flatten --nodejs-8" && \ | |
|
60 | sed -i -e 's/http:\/\//https:\/\//g' pkgs/node-packages.nix | |
|
61 | ||
|
62 | generate-license-meta: | |
|
63 | nix-build pkgs/license-generate.nix -o result-license && \ | |
|
64 | cat result-license/licenses.json | python -m json.tool > rhodecode/config/licenses.json No newline at end of file |
@@ -133,6 +133,11 b' rhodecode.api.url = /_admin/api' | |||
|
133 | 133 | ## `SignatureVerificationError` in case of wrong key, or damaged encryption data. |
|
134 | 134 | #rhodecode.encrypted_values.strict = false |
|
135 | 135 | |
|
136 | ## Pick algorithm for encryption. Either fernet (more secure) or aes (default) | |
|
137 | ## fernet is safer, and we strongly recommend switching to it. | |
|
138 | ## Due to backward compatibility aes is used as default. | |
|
139 | #rhodecode.encrypted_values.algorithm = fernet | |
|
140 | ||
|
136 | 141 | ## return gzipped responses from RhodeCode (static files/application) |
|
137 | 142 | gzip_responses = false |
|
138 | 143 | |
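Fernet's advantage over the plain AES mode is that it authenticates ciphertexts, so tampering is detected before decryption is attempted. The idea can be sketched with the standard library's HMAC (a toy encrypt-then-MAC construction for illustration only; real deployments should use the `cryptography` package's Fernet, as the config option does):

```python
import hashlib
import hmac
import os

def seal(key: bytes, plaintext: bytes) -> bytes:
    """Toy encrypt-then-MAC: XOR keystream plus HMAC-SHA256 tag."""
    nonce = os.urandom(16)
    stream = hashlib.sha256(key + nonce).digest()  # supports <= 32-byte messages
    ct = bytes(p ^ s for p, s in zip(plaintext, stream))
    tag = hmac.new(key, nonce + ct, hashlib.sha256).digest()
    return nonce + ct + tag

def unseal(key: bytes, blob: bytes) -> bytes:
    nonce, ct, tag = blob[:16], blob[16:-32], blob[-32:]
    expected = hmac.new(key, nonce + ct, hashlib.sha256).digest()
    # verify the tag first: tampered ciphertext never reaches "decryption"
    if not hmac.compare_digest(tag, expected):
        raise ValueError('ciphertext was tampered with')
    stream = hashlib.sha256(key + nonce).digest()
    return bytes(c ^ s for c, s in zip(ct, stream))

key = os.urandom(32)
blob = seal(key, b'db-password')
assert unseal(key, blob) == b'db-password'
```

Unauthenticated AES, by contrast, would silently decrypt a modified ciphertext to garbage, which is why the comment above recommends switching to fernet.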
@@ -374,6 +379,10 b' rc_cache.cache_repo_longterm.max_size = ' | |||
|
374 | 379 | beaker.session.type = file |
|
375 | 380 | beaker.session.data_dir = %(here)s/data/sessions |
|
376 | 381 | |
|
382 | ## redis sessions | |
|
383 | #beaker.session.type = ext:redis | |
|
384 | #beaker.session.url = redis://127.0.0.1:6379/2 | |
|
385 | ||
|
377 | 386 | ## db based session, fast, and allows easy management over logged in users |
|
378 | 387 | #beaker.session.type = ext:database |
|
379 | 388 | #beaker.session.table_name = db_session |
@@ -30,10 +30,12 b" loglevel = 'debug'" | |||
|
30 | 30 | # SECURITY |
|
31 | 31 | |
|
32 | 32 | # The maximum size of HTTP request line in bytes. |
|
33 | limit_request_line = 4094 | |
|
33 | # 0 for unlimited | |
|
34 | limit_request_line = 0 | |
|
34 | 35 | |
|
35 | 36 | # Limit the number of HTTP headers fields in a request. |
|
36 | limit_request_fields = 1024 | |
|
37 | # By default this value is 100 and can’t be larger than 32768. | |
|
38 | limit_request_fields = 10240 | |
|
37 | 39 | |
|
38 | 40 | # Limit the allowed size of an HTTP request header field. |
|
39 | 41 | # Value is a positive number or 0. |
@@ -108,6 +108,11 b' use = egg:rhodecode-enterprise-ce' | |||
|
108 | 108 | ## `SignatureVerificationError` in case of wrong key, or damaged encryption data. |
|
109 | 109 | #rhodecode.encrypted_values.strict = false |
|
110 | 110 | |
|
111 | ## Pick algorithm for encryption. Either fernet (more secure) or aes (default) | |
|
112 | ## fernet is safer, and we strongly recommend switching to it. | |
|
113 | ## Due to backward compatibility aes is used as default. | |
|
114 | #rhodecode.encrypted_values.algorithm = fernet | |
|
115 | ||
|
111 | 116 | ## return gzipped responses from RhodeCode (static files/application) |
|
112 | 117 | gzip_responses = false |
|
113 | 118 | |
@@ -349,6 +354,10 b' rc_cache.cache_repo_longterm.max_size = ' | |||
|
349 | 354 | beaker.session.type = file |
|
350 | 355 | beaker.session.data_dir = %(here)s/data/sessions |
|
351 | 356 | |
|
357 | ## redis sessions | |
|
358 | #beaker.session.type = ext:redis | |
|
359 | #beaker.session.url = redis://127.0.0.1:6379/2 | |
|
360 | ||
|
352 | 361 | ## db based session, fast, and allows easy management over logged in users |
|
353 | 362 | #beaker.session.type = ext:database |
|
354 | 363 | #beaker.session.table_name = db_session |
@@ -191,7 +191,7 b' let' | |||
|
191 | 191 | # check required files |
|
192 | 192 | STATIC_CHECK="/robots.txt /502.html |
|
193 | 193 | /js/scripts.js /js/rhodecode-components.js |
|
194 | /css/style.css /css/style-polymer.css" | |
|
194 | /css/style.css /css/style-polymer.css /css/style-ipython.css" | |
|
195 | 195 | |
|
196 | 196 | for file in $STATIC_CHECK; |
|
197 | 197 | do |
@@ -29,7 +29,7 b' 1. Go to :menuselection:`Admin --> Repos' | |||
|
29 | 29 | beside the |repo| to which you wish to add extra fields. |
|
30 | 30 | 2. On the |repo| settings page, select the :guilabel:`Extra fields` tab. |
|
31 | 31 | |
|
32 | .. image:: ../images/extra-repo-fields.png | |
|
32 | .. image:: ../../images/extra-repo-fields.png | |
|
33 | 33 | |
|
34 | 34 | The most important is the `New field key` variable, under which the value will

35 | 35 | be stored. It needs to be unique for each repository. The label and description
|
1 | NO CONTENT: file renamed from docs/admin/repo-hooks.rst to docs/admin/repo_admin/repo-hooks.rst |
|
1 | NO CONTENT: file renamed from docs/admin/repo-issue-tracker.rst to docs/admin/repo_admin/repo-issue-tracker.rst |
|
1 | NO CONTENT: file renamed from docs/admin/repo-perm-steps.rst to docs/admin/repo_admin/repo-perm-steps.rst |
|
1 | NO CONTENT: file renamed from docs/admin/repo-vcs.rst to docs/admin/repo_admin/repo-vcs.rst |
|
1 | NO CONTENT: file renamed from docs/admin/restore-deleted-repositories.rst to docs/admin/repo_admin/restore-deleted-repositories.rst |
@@ -16,19 +16,17 b' The following are the most common system' | |||
|
16 | 16 | |
|
17 | 17 | .. toctree:: |
|
18 | 18 | |
|
19 | config-files-overview | |
|
20 | vcs-server | |
|
21 | svn-http | |
|
22 | svn-path-permissions | |
|
23 | gunicorn-ssl-support | |
|
24 | apache-config | |
|
25 | nginx-config | |
|
26 | backup-restore | |
|
27 | tuning-rhodecode | |
|
28 | indexing | |
|
29 | reset-information | |
|
30 | enable-debug | |
|
31 | admin-tricks | |
|
32 | cleanup-cmds | |
|
33 | restore-deleted-repositories | |
|
34 | ||
|
19 | system_admin/config-files-overview | |
|
20 | system_admin/vcs-server | |
|
21 | system_admin/svn-http | |
|
22 | system_admin/svn-path-permissions | |
|
23 | system_admin/gunicorn-ssl-support | |
|
24 | system_admin/apache-config | |
|
25 | system_admin/nginx-config | |
|
26 | system_admin/backup-restore | |
|
27 | system_admin/tuning-rhodecode | |
|
28 | system_admin/indexing | |
|
29 | system_admin/reset-information | |
|
30 | system_admin/enable-debug | |
|
31 | system_admin/admin-tricks | |
|
32 | system_admin/cleanup-cmds |
@@ -57,7 +57,7 b' 2. To add a message that will be display' | |||
|
57 | 57 | 3. Select :guilabel:`Save`, and you will see the message once your page |
|
58 | 58 | refreshes. |
|
59 | 59 | |
|
60 | .. image:: ../images/server-wide-announcement.png | |
|
60 | .. image:: ../../images/server-wide-announcement.png | |
|
61 | 61 | :alt: Server Wide Announcement |
|
62 | 62 | |
|
63 | 63 | .. _md-rst: |
@@ -207,7 +207,7 b' 2. Restart the |RCE| instance and check ' | |||
|
207 | 207 | Instance "enterprise-2" successfully stopped. |
|
208 | 208 | Instance "enterprise-2" successfully started. |
|
209 | 209 | |
|
210 | .. image:: ../images/language.png | |
|
210 | .. image:: ../../images/language.png | |
|
211 | 211 | |
|
212 | 212 | .. _set-repo-pub: |
|
213 | 213 | |
@@ -239,3 +239,26 b' following URL: ``{instance-URL}/_admin/p' | |||
|
239 | 239 | |
|
240 | 240 | .. _Markdown: http://daringfireball.net/projects/markdown/ |
|
241 | 241 | .. _reStructured Text: http://docutils.sourceforge.net/docs/index.html |
|
242 | ||
|
243 | ||
|
244 | Unarchiving a repository | |
|
245 | ^^^^^^^^^^^^^^^^^^^^^^^^^ | |
|
246 | ||
|
247 | Archiving a repository is similar to deleting it, except that archiving keeps the data for future reference

248 | but makes the repository read-only. After archiving, the repository shouldn't be modified in any way.

249 | This is why repository settings are disabled for an archived repository.
|
250 | ||
|
251 | If a repository needs to be unarchived for some reason, the interactive

252 | ishell interface should be used.
|
253 | ||
|
254 | .. code-block:: bash | |
|
255 | ||
|
256 | # Open iShell from the terminal | |
|
257 | $ rccontrol ishell enterprise-1/community-1 | |
|
258 | ||
|
259 | .. code-block:: python | |
|
260 | ||
|
261 | # Set repository as un-archived | |
|
262 | In [1]: repo = Repository.get_by_repo_name('SOME_REPO_NAME') | |
|
263 | In [2]: repo.archived = False | |
|
264 | In [3]: Session().add(repo);Session().commit() |
@@ -8,7 +8,7 b' the information in the following section' | |||
|
8 | 8 | |
|
9 | 9 | .. toctree:: |
|
10 | 10 | |
|
11 | apache-conf-example | |
|
12 | apache-diffie-hellman | |
|
13 | apache-subdirectory | |
|
14 | apache-wsgi-coding | |
|
11 | apache/apache-conf-example | |
|
12 | apache/apache-diffie-hellman | |
|
13 | apache/apache-subdirectory | |
|
14 | apache/apache-wsgi-coding |
@@ -66,14 +66,18 b' Below config if for an Apache Reverse Pr' | |||
|
66 | 66 | # Directive to properly generate url (clone url) for RhodeCode |
|
67 | 67 | ProxyPreserveHost On |
|
68 | 68 | |
|
69 | # It allows request bodies to be sent to the backend using chunked transfer encoding. | |
|
70 | SetEnv proxy-sendchunked 1 | |
|
71 | ||
|
72 | # Increase headers size for large Mercurial headers sent with many branches | |
|
73 | LimitRequestLine 16380 | |
|
74 | ||
|
69 | 75 | # Url to running RhodeCode instance. This is shown as `- URL:` when |
|
70 | 76 | # running rccontrol status. |
|
77 | ||
|
71 | 78 | ProxyPass / http://127.0.0.1:10002/ timeout=7200 Keepalive=On |
|
72 | 79 | ProxyPassReverse / http://127.0.0.1:10002/ |
|
73 | 80 | |
|
74 | # Increase headers for large Mercurial headers | |
|
75 | LimitRequestLine 16380 | |
|
76 | ||
|
77 | 81 | # strict http prevents from https -> http downgrade |
|
78 | 82 | Header always set Strict-Transport-Security "max-age=63072000; includeSubdomains; preload" |
|
79 | 83 |
|
1 | NO CONTENT: file renamed from docs/admin/apache-diffie-hellman.rst to docs/admin/system_admin/apache/apache-diffie-hellman.rst |
|
1 | NO CONTENT: file renamed from docs/admin/apache-subdirectory.rst to docs/admin/system_admin/apache/apache-subdirectory.rst |
|
1 | NO CONTENT: file renamed from docs/admin/apache-wsgi-coding.rst to docs/admin/system_admin/apache/apache-wsgi-coding.rst |
|
1 | NO CONTENT: file renamed from docs/admin/backup-restore.rst to docs/admin/system_admin/backup-restore.rst |
|
1 | NO CONTENT: file renamed from docs/admin/cleanup-cmds.rst to docs/admin/system_admin/cleanup-cmds.rst |
|
1 | NO CONTENT: file renamed from docs/admin/config-files-overview.rst to docs/admin/system_admin/config-files-overview.rst |
|
1 | NO CONTENT: file renamed from docs/admin/enable-debug.rst to docs/admin/system_admin/enable-debug.rst |
|
1 | NO CONTENT: file renamed from docs/admin/gunicorn-ssl-support.rst to docs/admin/system_admin/gunicorn-ssl-support.rst |
|
1 | NO CONTENT: file renamed from docs/admin/indexing.rst to docs/admin/system_admin/indexing.rst |
@@ -8,7 +8,7 b' the information in the following section' | |||
|
8 | 8 | |
|
9 | 9 | .. toctree:: |
|
10 | 10 | |
|
11 | nginx-config-example | |
|
12 | nginx-diffie-hellman | |
|
13 | nginx-proxy-conf | |
|
14 | nginx-url-prefix | |
|
11 | nginx/nginx-config-example | |
|
12 | nginx/nginx-diffie-hellman | |
|
13 | nginx/nginx-proxy-conf | |
|
14 | nginx/nginx-url-prefix |
|
1 | NO CONTENT: file renamed from docs/admin/nginx-config-example.rst to docs/admin/system_admin/nginx/nginx-config-example.rst |
|
1 | NO CONTENT: file renamed from docs/admin/nginx-diffie-hellman.rst to docs/admin/system_admin/nginx/nginx-diffie-hellman.rst |
|
1 | NO CONTENT: file renamed from docs/admin/nginx-proxy-conf.rst to docs/admin/system_admin/nginx/nginx-proxy-conf.rst |
|
1 | NO CONTENT: file renamed from docs/admin/nginx-url-prefix.rst to docs/admin/system_admin/nginx/nginx-url-prefix.rst |
|
1 | NO CONTENT: file renamed from docs/admin/reset-information.rst to docs/admin/system_admin/reset-information.rst |
|
1 | NO CONTENT: file renamed from docs/admin/svn-http.rst to docs/admin/system_admin/svn-http.rst |
|
1 | NO CONTENT: file renamed from docs/admin/svn-path-permissions.rst to docs/admin/system_admin/svn-path-permissions.rst |
@@ -8,14 +8,14 b' may find some of the following methods u' | |||
|
8 | 8 | |
|
9 | 9 | .. toctree:: |
|
10 | 10 | |
|
11 | tuning-gunicorn | |
|
12 | tuning-vcs-memory-cache | |
|
13 | tuning-user-sessions-performance | |
|
14 | tuning-increase-db-performance | |
|
15 | tuning-scale-horizontally-cluster | |
|
16 | tuning-mount-cache-memory | |
|
17 | tuning-change-encoding | |
|
18 | tuning-change-large-file-dir | |
|
19 | tuning-change-lfs-dir | |
|
20 | tuning-hg-auth-loop | |
|
11 | tuning/tuning-gunicorn | |
|
12 | tuning/tuning-vcs-memory-cache | |
|
13 | tuning/tuning-user-sessions-performance | |
|
14 | tuning/tuning-increase-db-performance | |
|
15 | tuning/tuning-scale-horizontally-cluster | |
|
16 | tuning/tuning-mount-cache-memory | |
|
17 | tuning/tuning-change-encoding | |
|
18 | tuning/tuning-change-large-file-dir | |
|
19 | tuning/tuning-change-lfs-dir | |
|
20 | tuning/tuning-hg-auth-loop | |
|
21 | 21 |
|
1 | NO CONTENT: file renamed from docs/admin/tuning-change-encoding.rst to docs/admin/system_admin/tuning/tuning-change-encoding.rst |
|
1 | NO CONTENT: file renamed from docs/admin/tuning-change-large-file-dir.rst to docs/admin/system_admin/tuning/tuning-change-large-file-dir.rst |
|
1 | NO CONTENT: file renamed from docs/admin/tuning-change-lfs-dir.rst to docs/admin/system_admin/tuning/tuning-change-lfs-dir.rst |
@@ -42,7 +42,7 b' 2. In the ``[server:main]`` section, cha' | |||
|
42 | 42 | ## restarted, could prevent memory leaks |
|
43 | 43 | max_requests = 1000 |
|
44 | 44 | max_requests_jitter = 30 |
|
45 | ## amount of time a worker can spend with handling a request before it | |
|
45 | ## amount of time a worker can spend handling a request before it
|
46 | 46 | ## gets killed and restarted. Set to 6hrs |
|
47 | 47 | timeout = 21600 |
|
48 | 48 |
|
1 | NO CONTENT: file renamed from docs/admin/tuning-hg-auth-loop.rst to docs/admin/system_admin/tuning/tuning-hg-auth-loop.rst |
|
1 | NO CONTENT: file renamed from docs/admin/tuning-increase-db-performance.rst to docs/admin/system_admin/tuning/tuning-increase-db-performance.rst |
|
1 | NO CONTENT: file renamed from docs/admin/tuning-mount-cache-memory.rst to docs/admin/system_admin/tuning/tuning-mount-cache-memory.rst |
|
1 | NO CONTENT: file renamed from docs/admin/tuning-scale-horizontally-cluster.rst to docs/admin/system_admin/tuning/tuning-scale-horizontally-cluster.rst |
|
1 | NO CONTENT: file renamed from docs/admin/tuning-user-sessions-performance.rst to docs/admin/system_admin/tuning/tuning-user-sessions-performance.rst |
|
1 | NO CONTENT: file renamed from docs/admin/tuning-vcs-memory-cache.rst to docs/admin/system_admin/tuning/tuning-vcs-memory-cache.rst |
|
1 | NO CONTENT: file renamed from docs/admin/vcs-server.rst to docs/admin/system_admin/vcs-server.rst |
@@ -13,11 +13,12 b' permissions applied to it; |perm|.' | |||
|
13 | 13 | |
|
14 | 14 | .. toctree:: |
|
15 | 15 | |
|
16 | public-access | |
|
17 | default-user-perms | |
|
18 | adding-anonymous-user | |
|
19 | adding-new-user | |
|
20 | setting-default-permissions | |
|
21 | setting-usergroup-permissions | |
|
16 | user_admin/public-access | |
|
17 | user_admin/default-user-perms | |
|
18 | user_admin/adding-anonymous-user | |
|
19 | user_admin/adding-new-user | |
|
20 | user_admin/setting-default-permissions | |
|
21 | user_admin/setting-usergroup-permissions | |
|
22 | user_admin/user-admin-tasks | |
|
22 | 23 | |
|
23 | .. |perm| replace:: **None**, **Read**, **Write**, or **Admin** No newline at end of file | |
|
24 | .. |perm| replace:: **None**, **Read**, **Write**, or **Admin** |
|
1 | NO CONTENT: file renamed from docs/admin/adding-anonymous-user.rst to docs/admin/user_admin/adding-anonymous-user.rst |
|
1 | NO CONTENT: file renamed from docs/admin/adding-new-user.rst to docs/admin/user_admin/adding-new-user.rst |
|
1 | NO CONTENT: file renamed from docs/admin/default-user-perms.rst to docs/admin/user_admin/default-user-perms.rst |
|
1 | NO CONTENT: file renamed from docs/admin/public-access.rst to docs/admin/user_admin/public-access.rst |
|
1 | NO CONTENT: file renamed from docs/admin/setting-default-permissions.rst to docs/admin/user_admin/setting-default-permissions.rst |
|
1 | NO CONTENT: file renamed from docs/admin/setting-usergroup-permissions.rst to docs/admin/user_admin/setting-usergroup-permissions.rst |
@@ -204,6 +204,7 b' are not required in args.' | |||
|
204 | 204 | methods/pull-request-methods |
|
205 | 205 | methods/repo-methods |
|
206 | 206 | methods/repo-group-methods |
|
207 | methods/search-methods | |
|
207 | 208 | methods/server-methods |
|
208 | 209 | methods/user-methods |
|
209 | 210 | methods/user-group-methods |
@@ -462,6 +462,7 b' get_repo_file' | |||
|
462 | 462 | :param cache: Use internal caches for fetching files. If disabled fetching |
|
463 | 463 | files is slower but more memory efficient |
|
464 | 464 | :type cache: Optional(bool) |
|
465 | ||
|
465 | 466 | Example output: |
|
466 | 467 | |
|
467 | 468 | .. code-block:: bash |
@@ -499,53 +500,51 b' get_repo_nodes' | |||
|
499 | 500 | .. py:function:: get_repo_nodes(apiuser, repoid, revision, root_path, ret_type=<Optional:'all'>, details=<Optional:'basic'>, max_file_bytes=<Optional:None>) |
|
500 | 501 | |
|
501 | 502 | Returns a list of nodes and children in a flat list for a given |
|
502 |
|
|
|
503 | path at given revision. | |
|
503 | 504 | |
|
504 |
|
|
|
505 | It's possible to specify ret_type to show only `files` or `dirs`. | |
|
505 | 506 | |
|
506 |
|
|
|
507 |
|
|
|
507 | This command can only be run using an |authtoken| with admin rights, | |
|
508 | or users with at least read rights to |repos|. | |
|
508 | 509 | |
|
509 |

510 |

511 |

512 |

513 |

514 |

515 |

516 |

517 |

518 |

519 |

520 |

521 |

522 |

523 |

524 |

525 |

526 |

527 | Example output:
|
510 | :param apiuser: This is filled automatically from the |authtoken|. | |
|
511 | :type apiuser: AuthUser | |
|
512 | :param repoid: The repository name or repository ID. | |
|
513 | :type repoid: str or int | |
|
514 | :param revision: The revision for which listing should be done. | |
|
515 | :type revision: str | |
|
516 | :param root_path: The path from which to start displaying. | |
|
517 | :type root_path: str | |
|
518 | :param ret_type: Set the return type. Valid options are | |
|
519 | ``all`` (default), ``files`` and ``dirs``. | |
|
520 | :type ret_type: Optional(str) | |
|
521 | :param details: Returns extended information about nodes, such as | |
|
522 | md5, binary, and or content. | |
|
523 | The valid options are ``basic`` and ``full``. | |
|
524 | :type details: Optional(str) | |
|
525 | :param max_file_bytes: Only return file content under this file size bytes | |
|
526 | :type details: Optional(int) | |
|
528 | 527 | |
|
529 | .. code-block:: bash | |
|
528 | Example output: | |
|
529 | ||
|
530 | .. code-block:: bash | |
|
530 | 531 | |
|
531 |

532 |

533 |

534 |

535 |

536 | Line2

537 | ",

538 | "extension": "md",

539 |

540 | "md5": "059fa5d29b19c0657e384749480f6422",

541 |

542 |

543 |

544 | "type": "file"

545 |

546 | ...

547 | ]

548 | error: null
|
532 | id : <id_given_in_input> | |
|
533 | result: [ | |
|
534 | { | |
|
535 | "binary": false, | |
|
536 | "content": "File line", | |
|
537 | "extension": "md", | |
|
538 | "lines": 2, | |
|
539 | "md5": "059fa5d29b19c0657e384749480f6422", | |
|
540 | "mimetype": "text/x-minidsrc", | |
|
541 | "name": "file.md", | |
|
542 | "size": 580, | |
|
543 | "type": "file" | |
|
544 | }, | |
|
545 | ... | |
|
546 | ] | |
|
547 | error: null | |
|
549 | 548 | |
|
550 | 549 | |
|
551 | 550 | get_repo_refs |
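API methods such as `get_repo_nodes` are invoked as plain JSON POSTs against the instance's API endpoint (`rhodecode.api.url`, `/_admin/api` by default). A minimal sketch of assembling such a request body; the token, repository name, and URL below are illustrative:

```python
import json

API_URL = 'https://code.example.com/_admin/api'  # illustrative instance URL

def build_api_request(auth_token, method, args, request_id=1):
    # Every RhodeCode API call carries an id, the auth token, the method
    # name, and its arguments in a single JSON body.
    return json.dumps({
        'id': request_id,
        'auth_token': auth_token,
        'method': method,
        'args': args,
    })

payload = build_api_request(
    'SECRET_TOKEN', 'get_repo_nodes',
    {'repoid': 'my-repo', 'revision': 'tip',
     'root_path': '/', 'ret_type': 'files', 'details': 'basic'})
assert json.loads(payload)['method'] == 'get_repo_nodes'
```

The response mirrors the shape shown in the example output above: the same `id`, a `result` payload, and an `error` field that is null on success.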
@@ -32,9 +32,9 b' The most important this id needs to be u' | |||
|
32 | 32 | |
|
33 | 33 | In [1]: saml2user = { |
|
34 | 34 | ...: # OneLogin, uses externalID available to read from in the UI |
|
35 | ...: 123: {'id: '48253211'}, | |
|
35 | ...: 123: {'id': '48253211'}, | |
|
36 | 36 | ...: # for Google/DuoSecurity email is also an option for unique ID |
|
37 | ...: 124: {'id: 'email@domain.com'},

37 | ...: 124: {'id': 'email@domain.com'},
|
38 | 38 | ...: } |
|
39 | 39 | |
|
40 | 40 | |
@@ -70,7 +70,7 b' Enter in the ishell prompt' | |||
|
70 | 70 | ...: new_external_identity.external_id = external_id |
|
71 | 71 | ...: new_external_identity.external_username = '{}-saml-{}'.format(user.username, user.user_id) |
|
72 | 72 | ...: new_external_identity.provider_name = provider |
|
73 | ...: new_external_identity.local_user_id = user_id | |
|
73 | ...: new_external_identity.local_user_id = user.user_id | |
|
74 | 74 | ...: new_external_identity.access_token = '' |
|
75 | 75 | ...: new_external_identity.token_secret = '' |
|
76 | 76 | ...: new_external_identity.alt_token = '' |
@@ -46,7 +46,7 b' and commit files and |repos| while manag' | |||
|
46 | 46 | nix/default-env |
|
47 | 47 | admin/system-admin |
|
48 | 48 | admin/user-admin |
|
49 | admin/

49 | admin/repo-admin
|
50 | 50 | admin/security-tips |
|
51 | 51 | auth/auth |
|
52 | 52 | issue-trackers/issue-trackers |
@@ -64,6 +64,13 b' and commit files and |repos| while manag' | |||
|
64 | 64 | |
|
65 | 65 | .. toctree:: |
|
66 | 66 | :maxdepth: 1 |
|
67 | :caption: User Documentation | |
|
68 | ||
|
69 | usage/basic-usage | |
|
70 | tutorials/tutorials | |
|
71 | ||
|
72 | .. toctree:: | |
|
73 | :maxdepth: 1 | |
|
67 | 74 | :caption: Developer Documentation |
|
68 | 75 | |
|
69 | 76 | api/api |
@@ -73,13 +80,6 b' and commit files and |repos| while manag' | |||
|
73 | 80 | |
|
74 | 81 | .. toctree:: |
|
75 | 82 | :maxdepth: 1 |
|
76 | :caption: User Documentation | |
|
77 | ||
|
78 | usage/basic-usage | |
|
79 | tutorials/tutorials | |
|
80 | ||
|
81 | .. toctree:: | |
|
82 | :maxdepth: 1 | |
|
83 | 83 | :caption: About |
|
84 | 84 | |
|
85 | 85 | known-issues/known-issues |
@@ -9,6 +9,7 b' Release Notes' | |||
|
9 | 9 | .. toctree:: |
|
10 | 10 | :maxdepth: 1 |
|
11 | 11 | |
|
12 | release-notes-4.17.0.rst | |
|
12 | 13 | release-notes-4.16.2.rst |
|
13 | 14 | release-notes-4.16.1.rst |
|
14 | 15 | release-notes-4.16.0.rst |
@@ -35,6 +35,7 b'' | |||
|
35 | 35 | "<%= dirs.js.node_modules %>/moment/min/moment.min.js", |
|
36 | 36 | "<%= dirs.js.node_modules %>/clipboard/dist/clipboard.min.js", |
|
37 | 37 | "<%= dirs.js.node_modules %>/favico.js/favico-0.3.10.min.js", |
|
38 | "<%= dirs.js.node_modules %>/dropzone/dist/dropzone.js", | |
|
38 | 39 | "<%= dirs.js.node_modules %>/sticky-sidebar/dist/sticky-sidebar.min.js", |
|
39 | 40 | "<%= dirs.js.node_modules %>/sticky-sidebar/dist/jquery.sticky-sidebar.min.js", |
|
40 | 41 | "<%= dirs.js.node_modules %>/waypoints/lib/noframework.waypoints.min.js", |
@@ -107,7 +108,8 b'' | |||
|
107 | 108 | }, |
|
108 | 109 | "files": { |
|
109 | 110 | "<%= dirs.css.dest %>/style.css": "<%= dirs.css.src %>/main.less", |
|
110 | "<%= dirs.css.dest %>/style-polymer.css": "<%= dirs.css.src %>/polymer.less" | |
|
111 | "<%= dirs.css.dest %>/style-polymer.css": "<%= dirs.css.src %>/polymer.less", | |
|
112 | "<%= dirs.css.dest %>/style-ipython.css": "<%= dirs.css.src %>/ipython.less" | |
|
111 | 113 | } |
|
112 | 114 | }, |
|
113 | 115 | "production": { |
@@ -118,7 +120,8 b'' | |||
|
118 | 120 | }, |
|
119 | 121 | "files": { |
|
120 | 122 | "<%= dirs.css.dest %>/style.css": "<%= dirs.css.src %>/main.less", |
|
121 | "<%= dirs.css.dest %>/style-polymer.css": "<%= dirs.css.src %>/polymer.less" | |
|
123 | "<%= dirs.css.dest %>/style-polymer.css": "<%= dirs.css.src %>/polymer.less", | |
|
124 | "<%= dirs.css.dest %>/style-ipython.css": "<%= dirs.css.src %>/ipython.less" | |
|
122 | 125 | } |
|
123 | 126 | }, |
|
124 | 127 | "components": { |
@@ -13,6 +13,7 b'' | |||
|
13 | 13 | "clipboard": "^2.0.1", |
|
14 | 14 | "exports-loader": "^0.6.4", |
|
15 | 15 | "favico.js": "^0.3.10", |
|
16 | "dropzone": "^5.5.0", | |
|
16 | 17 | "grunt": "^0.4.5", |
|
17 | 18 | "grunt-cli": "^1.3.1", |
|
18 | 19 | "grunt-contrib-concat": "^0.5.1", |
@@ -1,3 +1,3 b'' | |||
|
1 | 1 | [pip2nix] |
|
2 | requirements = ., -r ./requirements.txt | |
|
2 | requirements = ., -r ./requirements.txt, -r ./requirements_pinned.txt | |
|
3 | 3 | output = ./pkgs/python-packages.nix |
This diff has been collapsed as it changes many lines (990 lines changed).
@@ -166,13 +166,13 b' let' | |||
|
166 | 166 | sha512 = "dgOe12GyCF1VZBLUQqnzGWlf3xb255FajNCVB1VFj/AtskYtoamnafa7m3a+1vs+C8qbg4Benn5KwgxVDSW4cg=="; |
|
167 | 167 | }; |
|
168 | 168 | }; |
|
169 |
"@polymer/paper-spinner-3.0. |
|
|
169 | "@polymer/paper-spinner-3.0.2" = { | |
|
170 | 170 | name = "_at_polymer_slash_paper-spinner"; |
|
171 | 171 | packageName = "@polymer/paper-spinner"; |
|
172 |
version = "3.0. |
|
|
173 | src = fetchurl { | |
|
174 |
url = "https://registry.npmjs.org/@polymer/paper-spinner/-/paper-spinner-3.0. |
|
|
175 | sha512 = "MYIU6qWZnhZ5yNFOBzROPgBteGfxKEnDZ6bCgjrvUtJkBuQEz0MQZzSE/zmZc0oaJ9u5QK5xAFuYdudsGv7+sQ=="; | |
|
172 | version = "3.0.2"; | |
|
173 | src = fetchurl { | |
|
174 | url = "https://registry.npmjs.org/@polymer/paper-spinner/-/paper-spinner-3.0.2.tgz"; | |
|
175 | sha512 = "XUzu8/4NH+pnNZUTI2MxtOKFAr0EOsW7eGhTg3VBhTh7DDW/q3ewzwYRWnqNJokX9BEnxKMiXXaIeTEBq4k2dw=="; | |
|
176 | 176 | }; |
|
177 | 177 | }; |
|
178 | 178 | "@polymer/paper-styles-3.0.1" = { |
@@ -211,13 +211,13 b' let' | |||
|
211 | 211 | sha512 = "yiUk09opTEnE1lK+tb501ENb+yQBi4p++Ep0eGJAHesVYKVMPNgPphVKkIizkDaU+n0SE+zXfTsRbYyOMDYXSg=="; |
|
212 | 212 | }; |
|
213 | 213 | }; |
|
214 |
"@polymer/polymer-3. |
|
|
214 | "@polymer/polymer-3.2.0" = { | |
|
215 | 215 | name = "_at_polymer_slash_polymer"; |
|
216 | 216 | packageName = "@polymer/polymer"; |
|
217 |
version = "3. |
|
|
218 | src = fetchurl { | |
|
219 |
url = "https://registry.npmjs.org/@polymer/polymer/-/polymer-3. |
|
|
220 | sha512 = "hwN8IMERsFATz/9dSMxYHL+84J9uBkPuuarxJWlTsppZ4CAYTZKnepBfNrKoyNsafBmA3yXBiiKPPf+fJtza7A=="; | |
|
217 | version = "3.2.0"; | |
|
218 | src = fetchurl { | |
|
219 | url = "https://registry.npmjs.org/@polymer/polymer/-/polymer-3.2.0.tgz"; | |
|
220 | sha512 = "L6uV1oM6T6xbwbVx6t3biG5T2VSSB03LxnIrUd9M2pr6RkHVPFHJ37pC5MUwBAEhkGFJif7eks7fdMMSGZTeEQ=="; | |
|
221 | 221 | }; |
|
222 | 222 | }; |
|
223 | 223 | "@types/clone-0.1.30" = { |
@@ -229,13 +229,13 b' let' | |||
|
229 | 229 | sha1 = "e7365648c1b42136a59c7d5040637b3b5c83b614"; |
|
230 | 230 | }; |
|
231 | 231 | }; |
|
232 |
"@types/node-6.14. |
|
|
232 | "@types/node-6.14.6" = { | |
|
233 | 233 | name = "_at_types_slash_node"; |
|
234 | 234 | packageName = "@types/node"; |
|
235 |
version = "6.14. |
|
|
236 | src = fetchurl { | |
|
237 |
url = "https://registry.npmjs.org/@types/node/-/node-6.14. |
|
|
238 | sha512 = "JWB3xaVfsfnFY8Ofc9rTB/op0fqqTSqy4vBcVk1LuRJvta7KTX+D//fCkiTMeLGhdr2EbFZzQjC97gvmPilk9Q=="; | |
|
235 | version = "6.14.6"; | |
|
236 | src = fetchurl { | |
|
237 | url = "https://registry.npmjs.org/@types/node/-/node-6.14.6.tgz"; | |
|
238 | sha512 = "rFs9zCFtSHuseiNXxYxFlun8ibu+jtZPgRM+2ILCmeLiGeGLiIGxuOzD+cNyHegI1GD+da3R/cIbs9+xCLp13w=="; | |
|
239 | 239 | }; |
|
240 | 240 | }; |
|
241 | 241 | "@types/parse5-2.2.34" = { |
@@ -409,22 +409,22 b' let' | |||
|
409 | 409 | sha512 = "mJ3QKWtCchL1vhU/kZlJnLPuQZnlDOdZsyP0bbLWPGdYsQDnSBvyTLhzwBA3QAMlzEL9V4JHygEmK6/OTEyytA=="; |
|
410 | 410 | }; |
|
411 | 411 | }; |
|
412 |
"@webcomponents/shadycss-1. |
|
|
412 | "@webcomponents/shadycss-1.9.1" = { | |
|
413 | 413 | name = "_at_webcomponents_slash_shadycss"; |
|
414 | 414 | packageName = "@webcomponents/shadycss"; |
|
415 |
version = "1. |
|
|
416 | src = fetchurl { | |
|
417 |
url = "https://registry.npmjs.org/@webcomponents/shadycss/-/shadycss-1. |
|
|
418 | sha512 = "6SZqLajRPWL0rrKDZOGF8PCBq5B9JqgFmE5rX5psk6i8WrqiMkSCuO8+rnirzViTsU5CqnjQPFC3OvG4YJdMrQ=="; | |
|
419 | }; | |
|
420 | }; | |
|
421 | "@webcomponents/webcomponentsjs-2.2.1" = { | |
|
415 | version = "1.9.1"; | |
|
416 | src = fetchurl { | |
|
417 | url = "https://registry.npmjs.org/@webcomponents/shadycss/-/shadycss-1.9.1.tgz"; | |
|
418 | sha512 = "IaZOnWOKXHghqk/WfPNDRIgDBi3RsVPY2IFAw6tYiL9UBGvQRy5R6uC+Fk7qTZsReTJ0xh5MTT8yAcb3MUR4mQ=="; | |
|
419 | }; | |
|
420 | }; | |
|
421 | "@webcomponents/webcomponentsjs-2.2.10" = { | |
|
422 | 422 | name = "_at_webcomponents_slash_webcomponentsjs"; |
|
423 | 423 | packageName = "@webcomponents/webcomponentsjs"; |
|
424 | version = "2.2.1"; | |
|
425 | src = fetchurl { | |
|
426 | url = "https://registry.npmjs.org/@webcomponents/webcomponentsjs/-/webcomponentsjs-2.2.1.tgz"; | |
|
427 | sha512 = "lZZ+Lkke6JhsJcQQqSVk1Pny6/8y4qhJ98LO7a/MwBSRO8WqHqK1X2vscfeL8vOnYGFnmBUyVG95lwYv/AXyLQ=="; | |
|
424 | version = "2.2.10"; | |
|
425 | src = fetchurl { | |
|
426 | url = "https://registry.npmjs.org/@webcomponents/webcomponentsjs/-/webcomponentsjs-2.2.10.tgz"; | |
|
427 | sha512 = "5dzhUhP+h0qMiK0IWb7VNb0OGBoXO3AuI6Qi8t9PoKT50s5L1jv0xnwnLq+cFgPuTB8FLTNP8xIDmyoOsKBy9Q=="; | |
|
428 | 428 | }; |
|
429 | 429 | }; |
|
430 | 430 | "@xtuc/ieee754-1.2.0" = { |
@@ -499,22 +499,22 b' let' | |||
|
499 | 499 | sha1 = "82ffb02b29e662ae53bdc20af15947706739c536"; |
|
500 | 500 | }; |
|
501 | 501 | }; |
|
502 |
"ajv-6. |
|
|
502 | "ajv-6.10.0" = { | |
|
503 | 503 | name = "ajv"; |
|
504 | 504 | packageName = "ajv"; |
|
505 |
version = "6. |
|
|
506 | src = fetchurl { | |
|
507 |
url = "https://registry.npmjs.org/ajv/-/ajv-6. |
|
|
508 | sha512 = "FBHEW6Jf5TB9MGBgUUA9XHkTbjXYfAUjY43ACMfmdMRHniyoMHjHjzD50OK8LGDWQwp4rWEsIq5kEqq7rvIM1g=="; | |
|
509 | }; | |
|
510 | }; | |
|
511 |
"ajv-keywords-3. |
|
|
505 | version = "6.10.0"; | |
|
506 | src = fetchurl { | |
|
507 | url = "https://registry.npmjs.org/ajv/-/ajv-6.10.0.tgz"; | |
|
508 | sha512 = "nffhOpkymDECQyR0mnsUtoCE8RlX38G0rYP+wgLWFyZuUyuuojSSvi/+euOiQBIn63whYwYVIIH1TvE3tu4OEg=="; | |
|
509 | }; | |
|
510 | }; | |
|
511 | "ajv-keywords-3.4.0" = { | |
|
512 | 512 | name = "ajv-keywords"; |
|
513 | 513 | packageName = "ajv-keywords"; |
|
514 |
version = "3. |
|
|
515 | src = fetchurl { | |
|
516 |
url = "https://registry.npmjs.org/ajv-keywords/-/ajv-keywords-3. |
|
|
517 | sha1 = "e86b819c602cf8821ad637413698f1dec021847a"; | |
|
514 | version = "3.4.0"; | |
|
515 | src = fetchurl { | |
|
516 | url = "https://registry.npmjs.org/ajv-keywords/-/ajv-keywords-3.4.0.tgz"; | |
|
517 | sha512 = "aUjdRFISbuFOl0EIZc+9e4FfZp0bDZgAdOOf30bJmw8VM9v84SHyVyxDfbWxpGYbdZD/9XoKxfHVNmxPkhwyGw=="; | |
|
518 | 518 | }; |
|
519 | 519 | }; |
|
520 | 520 | "align-text-0.1.4" = { |
@@ -716,15 +716,6 b' let' | |||
|
716 | 716 | sha1 = "a894b75d4bc4f6cd679ef3244a9fd8f46ae2d428"; |
|
717 | 717 | }; |
|
718 | 718 | }; |
|
719 | "arrify-1.0.1" = { | |
|
720 | name = "arrify"; | |
|
721 | packageName = "arrify"; | |
|
722 | version = "1.0.1"; | |
|
723 | src = fetchurl { | |
|
724 | url = "https://registry.npmjs.org/arrify/-/arrify-1.0.1.tgz"; | |
|
725 | sha1 = "898508da2226f380df904728456849c1501a4b0d"; | |
|
726 | }; | |
|
727 | }; | |
|
728 | 719 | "asap-2.0.6" = { |
|
729 | 720 | name = "asap"; |
|
730 | 721 | packageName = "asap"; |
@@ -752,13 +743,13 b' let' | |||
|
752 | 743 | sha512 = "p32cOF5q0Zqs9uBiONKYLm6BClCoBCM5O9JfeUSlnQLBTxYdTK+pW+nXflm8UkKd2UYlEbYz5qEi0JuZR9ckSw=="; |
|
753 | 744 | }; |
|
754 | 745 | }; |
|
755 |
"assert-1. |
|
|
746 | "assert-1.5.0" = { | |
|
756 | 747 | name = "assert"; |
|
757 | 748 | packageName = "assert"; |
|
758 |
version = "1. |
|
|
759 | src = fetchurl { | |
|
760 |
url = "https://registry.npmjs.org/assert/-/assert-1. |
|
|
761 | sha1 = "99912d591836b5a6f5b345c0f07eefc08fc65d91"; | |
|
749 | version = "1.5.0"; | |
|
750 | src = fetchurl { | |
|
751 | url = "https://registry.npmjs.org/assert/-/assert-1.5.0.tgz"; | |
|
752 | sha512 = "EDsgawzwoun2CZkCgtxJbv392v4nbk9XDD06zI+kQYoBM/3RBWLlEyJARDOmhAAosBjWACEkKL6S+lIZtcAubA=="; | |
|
762 | 753 | }; |
|
763 | 754 | }; |
|
764 | 755 | "assert-plus-0.2.0" = { |
@@ -815,22 +806,22 b' let' | |||
|
815 | 806 | sha1 = "b6bbe0b0674b9d719708ca38de8c237cb526c3d1"; |
|
816 | 807 | }; |
|
817 | 808 | }; |
|
818 |
"async-2.6. |
|
|
809 | "async-2.6.2" = { | |
|
819 | 810 | name = "async"; |
|
820 | 811 | packageName = "async"; |
|
821 |
version = "2.6. |
|
|
822 | src = fetchurl { | |
|
823 |
url = "https://registry.npmjs.org/async/-/async-2.6. |
|
|
824 | sha512 = "fNEiL2+AZt6AlAw/29Cr0UDe4sRAHCpEHh54WMz+Bb7QfNcFw4h3loofyJpLeQs4Yx7yuqu/2dLgM5hKOs6HlQ=="; | |
|
825 | }; | |
|
826 | }; | |
|
827 |
"async-each-1.0. |
|
|
812 | version = "2.6.2"; | |
|
813 | src = fetchurl { | |
|
814 | url = "https://registry.npmjs.org/async/-/async-2.6.2.tgz"; | |
|
815 | sha512 = "H1qVYh1MYhEEFLsP97cVKqCGo7KfCyTt6uEWqsTBr9SO84oK9Uwbyd/yCW+6rKJLHksBNUVWZDAjfS+Ccx0Bbg=="; | |
|
816 | }; | |
|
817 | }; | |
|
818 | "async-each-1.0.3" = { | |
|
828 | 819 | name = "async-each"; |
|
829 | 820 | packageName = "async-each"; |
|
830 |
version = "1.0. |
|
|
831 | src = fetchurl { | |
|
832 |
url = "https://registry.npmjs.org/async-each/-/async-each-1.0. |
|
|
833 | sha1 = "19d386a1d9edc6e7c1c85d388aedbcc56d33602d"; | |
|
821 | version = "1.0.3"; | |
|
822 | src = fetchurl { | |
|
823 | url = "https://registry.npmjs.org/async-each/-/async-each-1.0.3.tgz"; | |
|
824 | sha512 = "z/WhQ5FPySLdvREByI2vZiTWwCnF0moMJ1hK9YQwDTHKh6I7/uSckMetoRGb5UBZPC1z0jlw+n/XCgjeH7y1AQ=="; | |
|
834 | 825 | }; |
|
835 | 826 | }; |
|
836 | 827 | "asynckit-0.4.0" = { |
@@ -1445,22 +1436,22 b' let' | |||
|
1445 | 1436 | sha512 = "vyL2OymJxmarO8gxMr0mhChsO9QGwhynfuu4+MHTAW6czfq9humCB7rKpUjDd9YUiDPU4mzpyupFSvOClAwbmQ=="; |
|
1446 | 1437 | }; |
|
1447 | 1438 | }; |
|
1448 |
"binary-extensions-1.1 |
|
|
1439 | "binary-extensions-1.13.1" = { | |
|
1449 | 1440 | name = "binary-extensions"; |
|
1450 | 1441 | packageName = "binary-extensions"; |
|
1451 |
version = "1.1 |
|
|
1452 | src = fetchurl { | |
|
1453 |
url = "https://registry.npmjs.org/binary-extensions/-/binary-extensions-1.1 |
|
|
1454 | sha512 = "DYWGk01lDcxeS/K9IHPGWfT8PsJmbXRtRd2Sx72Tnb8pcYZQFF1oSDb8hJtS1vhp212q1Rzi5dUf9+nq0o9UIg=="; | |
|
1455 | }; | |
|
1456 | }; | |
|
1457 |
"bluebird-3.5. |
|
|
1442 | version = "1.13.1"; | |
|
1443 | src = fetchurl { | |
|
1444 | url = "https://registry.npmjs.org/binary-extensions/-/binary-extensions-1.13.1.tgz"; | |
|
1445 | sha512 = "Un7MIEDdUC5gNpcGDV97op1Ywk748MpHcFTHoYs6qnj1Z3j7I53VG3nwZhKzoBZmbdRNnb6WRdFlwl7tSDuZGw=="; | |
|
1446 | }; | |
|
1447 | }; | |
|
1448 | "bluebird-3.5.4" = { | |
|
1458 | 1449 | name = "bluebird"; |
|
1459 | 1450 | packageName = "bluebird"; |
|
1460 |
version = "3.5. |
|
|
1461 | src = fetchurl { | |
|
1462 |
url = "https://registry.npmjs.org/bluebird/-/bluebird-3.5. |
|
|
1463 | sha512 = "/qKPUQlaW1OyR51WeCPBvRnAlnZFUJkCSG5HzGnuIqhgyJtF+T94lFnn33eiazjRm2LAHVy2guNnaq48X9SJuw=="; | |
|
1451 | version = "3.5.4"; | |
|
1452 | src = fetchurl { | |
|
1453 | url = "https://registry.npmjs.org/bluebird/-/bluebird-3.5.4.tgz"; | |
|
1454 | sha512 = "FG+nFEZChJrbQ9tIccIfZJBz3J7mLrAhxakAbnrJWn8d7aKOC+LWifa0G+p4ZqKp4y13T7juYvdhq9NzKdsrjw=="; | |
|
1464 | 1455 | }; |
|
1465 | 1456 | }; |
|
1466 | 1457 | "bn.js-4.11.8" = { |
@@ -1661,13 +1652,13 b' let' | |||
|
1661 | 1652 | sha1 = "9bb5304d2e0b56698b2c758b08a3eaa9daa58a39"; |
|
1662 | 1653 | }; |
|
1663 | 1654 | }; |
|
1664 |
"camelcase-5. |
|
|
1655 | "camelcase-5.3.1" = { | |
|
1665 | 1656 | name = "camelcase"; |
|
1666 | 1657 | packageName = "camelcase"; |
|
1667 |
version = "5. |
|
|
1668 | src = fetchurl { | |
|
1669 |
url = "https://registry.npmjs.org/camelcase/-/camelcase-5. |
|
|
1670 | sha512 = "faqwZqnWxbxn+F1d399ygeamQNy3lPp/H9H6rNrqYh4FSVCtcY+3cub1MxA8o9mDd55mM8Aghuu/kuyYA6VTsA=="; | |
|
1658 | version = "5.3.1"; | |
|
1659 | src = fetchurl { | |
|
1660 | url = "https://registry.npmjs.org/camelcase/-/camelcase-5.3.1.tgz"; | |
|
1661 | sha512 = "L28STB170nwWS63UjtlEOE3dldQApaJXZkOI1uMFfzf3rRuPegHaHesyee+YxQ+W6SvRDQV6UrdOdRiR153wJg=="; | |
|
1671 | 1662 | }; |
|
1672 | 1663 | }; |
|
1673 | 1664 | "caniuse-api-1.6.1" = { |
@@ -1679,22 +1670,22 b' let' | |||
|
1679 | 1670 | sha1 = "b534e7c734c4f81ec5fbe8aca2ad24354b962c6c"; |
|
1680 | 1671 | }; |
|
1681 | 1672 | }; |
|
1682 |
"caniuse-db-1.0.300009 |
|
|
1673 | "caniuse-db-1.0.30000967" = { | |
|
1683 | 1674 | name = "caniuse-db"; |
|
1684 | 1675 | packageName = "caniuse-db"; |
|
1685 |
version = "1.0.300009 |
|
|
1686 | src = fetchurl { | |
|
1687 |
url = "https://registry.npmjs.org/caniuse-db/-/caniuse-db-1.0.300009 |
|
|
1688 | sha512 = "CX/QvLA8oh7kQ9cHCCzFm0UZW4KwSyQSRJ5A1XtH42HaMJQ0yh+9fEVWagMqv9I1vSCtaqA5Mb8k0uKfv7jhDw=="; | |
|
1689 | }; | |
|
1690 | }; | |
|
1691 |
"caniuse-lite-1.0.300009 |
|
|
1676 | version = "1.0.30000967"; | |
|
1677 | src = fetchurl { | |
|
1678 | url = "https://registry.npmjs.org/caniuse-db/-/caniuse-db-1.0.30000967.tgz"; | |
|
1679 | sha512 = "70gk6cLSD5rItxnZ7WUxyCpM9LAjEb1tVzlENQfXQXZS/IiGnfAC6u32G5cZFlDBKjNPBIta/QSx5CZLZepxRA=="; | |
|
1680 | }; | |
|
1681 | }; | |
|
1682 | "caniuse-lite-1.0.30000967" = { | |
|
1692 | 1683 | name = "caniuse-lite"; |
|
1693 | 1684 | packageName = "caniuse-lite"; |
|
1694 |
version = "1.0.300009 |
|
|
1695 | src = fetchurl { | |
|
1696 |
url = "https://registry.npmjs.org/caniuse-lite/-/caniuse-lite-1.0.300009 |
|
|
1697 | sha512 = "ogq4NbUWf1uG/j66k0AmiO3GjqJAlQyF8n4w8a954cbCyFKmYGvRtgz6qkq2fWuduTXHibX7GyYL5Pg58Aks2g=="; | |
|
1685 | version = "1.0.30000967"; | |
|
1686 | src = fetchurl { | |
|
1687 | url = "https://registry.npmjs.org/caniuse-lite/-/caniuse-lite-1.0.30000967.tgz"; | |
|
1688 | sha512 = "rUBIbap+VJfxTzrM4akJ00lkvVb5/n5v3EGXfWzSH5zT8aJmGzjA8HWhJ4U6kCpzxozUSnB+yvAYDRPY6mRpgQ=="; | |
|
1698 | 1689 | }; |
|
1699 | 1690 | }; |
|
1700 | 1691 | "caseless-0.12.0" = { |
@@ -1742,13 +1733,13 b' let' | |||
|
1742 | 1733 | sha512 = "Mti+f9lpJNcwF4tWV8/OrTTtF1gZi+f8FqlyAdouralcFWFQWF2+NgCHShjkCb+IFBLq9buZwE1xckQU4peSuQ=="; |
|
1743 | 1734 | }; |
|
1744 | 1735 | }; |
|
1745 |
"chokidar-2. |
|
|
1736 | "chokidar-2.1.5" = { | |
|
1746 | 1737 | name = "chokidar"; |
|
1747 | 1738 | packageName = "chokidar"; |
|
1748 |
version = "2. |
|
|
1749 | src = fetchurl { | |
|
1750 |
url = "https://registry.npmjs.org/chokidar/-/chokidar-2. |
|
|
1751 | sha512 = "z9n7yt9rOvIJrMhvDtDictKrkFHeihkNl6uWMmZlmL6tJtX9Cs+87oK+teBx+JIgzvbX3yZHT3eF8vpbDxHJXQ=="; | |
|
1739 | version = "2.1.5"; | |
|
1740 | src = fetchurl { | |
|
1741 | url = "https://registry.npmjs.org/chokidar/-/chokidar-2.1.5.tgz"; | |
|
1742 | sha512 = "i0TprVWp+Kj4WRPtInjexJ8Q+BqTE909VpH8xVhXrJkoc5QC8VO9TryGOqTr+2hljzc1sC62t22h5tZePodM/A=="; | |
|
1752 | 1743 | }; |
|
1753 | 1744 | }; |
|
1754 | 1745 | "chownr-1.1.1" = { |
@@ -1976,13 +1967,13 b' let' | |||
|
1976 | 1967 | sha512 = "mmGt/1pZqYRjMxB1axhTo16/snVZ5krrKkcmMeVKxzECMMXoCgnvTPp10QgHfcbQZw8Dq2jMNG6je4JlWU0gWg=="; |
|
1977 | 1968 | }; |
|
1978 | 1969 | }; |
|
1979 |
"combined-stream-1.0. |
|
|
1970 | "combined-stream-1.0.8" = { | |
|
1980 | 1971 | name = "combined-stream"; |
|
1981 | 1972 | packageName = "combined-stream"; |
|
1982 |
version = "1.0. |
|
|
1983 | src = fetchurl { | |
|
1984 |
url = "https://registry.npmjs.org/combined-stream/-/combined-stream-1.0. |
|
|
1985 | sha512 = "brWl9y6vOB1xYPZcpZde3N9zDByXTosAeMDo4p1wzo6UMOX4vumB+TP1RZ76sfE6Md68Q0NJSrE/gbezd4Ul+w=="; | |
|
1973 | version = "1.0.8"; | |
|
1974 | src = fetchurl { | |
|
1975 | url = "https://registry.npmjs.org/combined-stream/-/combined-stream-1.0.8.tgz"; | |
|
1976 | sha512 = "FQN4MRfuJeHf7cBbBMJFXhKSDq+2kAArBlmRBvcvFE5BB1HZKXtSFASDhdlz9zOYwxh8lDdnvmMOe/+5cdoEdg=="; | |
|
1986 | 1977 | }; |
|
1987 | 1978 | }; |
|
1988 | 1979 | "commander-2.14.1" = { |
@@ -2003,6 +1994,15 b' let' | |||
|
2003 | 1994 | sha512 = "wPMUt6FnH2yzG95SA6mzjQOEKUU3aLaDEmzs1ti+1E9h+CsrZghRlqEM/EJ4KscsQVG8uNN4uVreUeT8+drlgg=="; |
|
2004 | 1995 | }; |
|
2005 | 1996 | }; |
|
1997 | "commander-2.19.0" = { | |
|
1998 | name = "commander"; | |
|
1999 | packageName = "commander"; | |
|
2000 | version = "2.19.0"; | |
|
2001 | src = fetchurl { | |
|
2002 | url = "https://registry.npmjs.org/commander/-/commander-2.19.0.tgz"; | |
|
2003 | sha512 = "6tvAOO+D6OENvRAh524Dh9jcfKTYDQAqvqezbCW82xj5X0pSrcpxtvRKHLG0yBY6SD7PSDrJaj+0AiOcKVd1Xg=="; | |
|
2004 | }; | |
|
2005 | }; | |
|
2006 | 2006 | "commondir-1.0.1" = { |
|
2007 | 2007 | name = "commondir"; |
|
2008 | 2008 | packageName = "commondir"; |
@@ -2012,13 +2012,13 b' let' | |||
|
2012 | 2012 | sha1 = "ddd800da0c66127393cca5950ea968a3aaf1253b"; |
|
2013 | 2013 | }; |
|
2014 | 2014 | }; |
|
2015 |
"component-emitter-1. |
|
|
2015 | "component-emitter-1.3.0" = { | |
|
2016 | 2016 | name = "component-emitter"; |
|
2017 | 2017 | packageName = "component-emitter"; |
|
2018 |
version = "1. |
|
|
2019 | src = fetchurl { | |
|
2020 |
url = "https://registry.npmjs.org/component-emitter/-/component-emitter-1. |
|
|
2021 | sha1 = "137918d6d78283f7df7a6b7c5a63e140e69425e6"; | |
|
2018 | version = "1.3.0"; | |
|
2019 | src = fetchurl { | |
|
2020 | url = "https://registry.npmjs.org/component-emitter/-/component-emitter-1.3.0.tgz"; | |
|
2021 | sha512 = "Rd3se6QB+sO1TwqZjscQrurpEPIfO0/yYnSin6Q/rD3mOutHvUrCAhJub3r90uNb+SESBuE0QYoB90YdfatsRg=="; | |
|
2022 | 2022 | }; |
|
2023 | 2023 | }; |
|
2024 | 2024 | "concat-map-0.0.1" = { |
@@ -2093,13 +2093,13 b' let' | |||
|
2093 | 2093 | sha512 = "Y+SQCF+0NoWQryez2zXn5J5knmr9z/9qSQt7fbL78u83rxmigOy8X5+BFn8CFSuX+nKT8gpYwJX68ekqtQt6ZA=="; |
|
2094 | 2094 | }; |
|
2095 | 2095 | }; |
|
2096 |
"core-js-2.6. |
|
|
2096 | "core-js-2.6.5" = { | |
|
2097 | 2097 | name = "core-js"; |
|
2098 | 2098 | packageName = "core-js"; |
|
2099 |
version = "2.6. |
|
|
2100 | src = fetchurl { | |
|
2101 |
url = "https://registry.npmjs.org/core-js/-/core-js-2.6. |
|
|
2102 | sha512 = "L72mmmEayPJBejKIWe2pYtGis5r0tQ5NaJekdhyXgeMQTpJoBsH0NL4ElY2LfSoV15xeQWKQ+XTTOZdyero5Xg=="; | |
|
2099 | version = "2.6.5"; | |
|
2100 | src = fetchurl { | |
|
2101 | url = "https://registry.npmjs.org/core-js/-/core-js-2.6.5.tgz"; | |
|
2102 | sha512 = "klh/kDpwX8hryYL14M9w/xei6vrv6sE8gTHDG7/T/+SEovB/G4ejwcfE/CBzO6Edsu+OETZMZ3wcX/EjUkrl5A=="; | |
|
2103 | 2103 | }; |
|
2104 | 2104 | }; |
|
2105 | 2105 | "core-util-is-1.0.2" = { |
@@ -2201,13 +2201,13 b' let' | |||
|
2201 | 2201 | sha512 = "xYL0AMZJ4gFzJQsHUKa5jiWWi2vH77WVNg7JYRyewwj6oPh4yb/y6Y9ZCw9dsj/9UauMhtuxR+ogQd//EdEVNA=="; |
|
2202 | 2202 | }; |
|
2203 | 2203 | }; |
|
2204 |
"css-what-2.1. |
|
|
2204 | "css-what-2.1.3" = { | |
|
2205 | 2205 | name = "css-what"; |
|
2206 | 2206 | packageName = "css-what"; |
|
2207 |
version = "2.1. |
|
|
2208 | src = fetchurl { | |
|
2209 |
url = "https://registry.npmjs.org/css-what/-/css-what-2.1. |
|
|
2210 | sha512 = "wan8dMWQ0GUeF7DGEPVjhHemVW/vy6xUYmFzRY8RYqgA0JtXC9rJmbScBjqSu6dg9q0lwPQy6ZAmJVr3PPTvqQ=="; | |
|
2207 | version = "2.1.3"; | |
|
2208 | src = fetchurl { | |
|
2209 | url = "https://registry.npmjs.org/css-what/-/css-what-2.1.3.tgz"; | |
|
2210 | sha512 = "a+EPoD+uZiNfh+5fxw2nO9QwFa6nJe2Or35fGY6Ipw1R3R4AGz1d1TEZrCegvw2YTmZ0jXirGYlzxxpYSHwpEg=="; | |
|
2211 | 2211 | }; |
|
2212 | 2212 | }; |
|
2213 | 2213 | "cssesc-0.1.0" = { |
@@ -2417,13 +2417,13 b' let' | |||
|
2417 | 2417 | sha512 = "kqag/Nl+f3GwyK25fhUMYj81BUOrZ9IuJsjIcDE5icNM9FJHAVm3VcUDxdLPoQtTuUylWm6ZIknYJwwaPxsUzg=="; |
|
2418 | 2418 | }; |
|
2419 | 2419 | }; |
|
2420 |
"dir-glob-2. |
|
|
2420 | "dir-glob-2.2.2" = { | |
|
2421 | 2421 | name = "dir-glob"; |
|
2422 | 2422 | packageName = "dir-glob"; |
|
2423 |
version = "2. |
|
|
2424 | src = fetchurl { | |
|
2425 |
url = "https://registry.npmjs.org/dir-glob/-/dir-glob-2. |
|
|
2426 | sha512 = "37qirFDz8cA5fimp9feo43fSuRo2gHwaIn6dXL8Ber1dGwUosDrGZeCCXq57WnIqE4aQ+u3eQZzsk1yOzhdwag=="; | |
|
2423 | version = "2.2.2"; | |
|
2424 | src = fetchurl { | |
|
2425 | url = "https://registry.npmjs.org/dir-glob/-/dir-glob-2.2.2.tgz"; | |
|
2426 | sha512 = "f9LBi5QWzIW3I6e//uxZoLBlUt9kcp66qo0sSCxL6YZKc75R1c4MFCoe/LaZiBGmgujvQdxc5Bn3QhfyvK5Hsw=="; | |
|
2427 | 2427 | }; |
|
2428 | 2428 | }; |
|
2429 | 2429 | "dom-converter-0.2.0" = { |
@@ -2435,13 +2435,13 b' let' | |||
|
2435 | 2435 | sha512 = "gd3ypIPfOMr9h5jIKq8E3sHOTCjeirnl0WK5ZdS1AW0Odt0b1PaWaHdJ4Qk4klv+YB9aJBS7mESXjFoDQPu6DA=="; |
|
2436 | 2436 | }; |
|
2437 | 2437 | }; |
|
2438 |
"dom-serializer-0.1. |
|
|
2438 | "dom-serializer-0.1.1" = { | |
|
2439 | 2439 | name = "dom-serializer"; |
|
2440 | 2440 | packageName = "dom-serializer"; |
|
2441 |
version = "0.1. |
|
|
2442 | src = fetchurl { | |
|
2443 |
url = "https://registry.npmjs.org/dom-serializer/-/dom-serializer-0.1. |
|
|
2444 | sha1 = "073c697546ce0780ce23be4a28e293e40bc30c82"; | |
|
2441 | version = "0.1.1"; | |
|
2442 | src = fetchurl { | |
|
2443 | url = "https://registry.npmjs.org/dom-serializer/-/dom-serializer-0.1.1.tgz"; | |
|
2444 | sha512 = "l0IU0pPzLWSHBcieZbpOKgkIn3ts3vAh7ZuFyXNwJxJXk/c4Gwj9xaTJwIDVQCXawWD0qb3IzMGH5rglQaO0XA=="; | |
|
2445 | 2445 | }; |
|
2446 | 2446 | }; |
|
2447 | 2447 | "dom5-2.3.0" = { |
@@ -2462,15 +2462,6 b' let' | |||
|
2462 | 2462 | sha512 = "jnjyiM6eRyZl2H+W8Q/zLMA481hzi0eszAaBUzIVnmYVDBbnLxVNnfu1HgEBvCbL+71FrxMl3E6lpKH7Ge3OXA=="; |
|
2463 | 2463 | }; |
|
2464 | 2464 | }; |
|
2465 | "domelementtype-1.1.3" = { | |
|
2466 | name = "domelementtype"; | |
|
2467 | packageName = "domelementtype"; | |
|
2468 | version = "1.1.3"; | |
|
2469 | src = fetchurl { | |
|
2470 | url = "https://registry.npmjs.org/domelementtype/-/domelementtype-1.1.3.tgz"; | |
|
2471 | sha1 = "bd28773e2642881aec51544924299c5cd822185b"; | |
|
2472 | }; | |
|
2473 | }; | |
|
2474 | 2465 | "domelementtype-1.3.1" = { |
|
2475 | 2466 | name = "domelementtype"; |
|
2476 | 2467 | packageName = "domelementtype"; |
@@ -2480,15 +2471,6 b' let' | |||
|
2480 | 2471 | sha512 = "BSKB+TSpMpFI/HOxCNr1O8aMOTZ8hT3pM3GQ0w/mWRmkhEDSFJkkyzz4XQsBV44BChwGkrDfMyjVD0eA2aFV3w=="; |
|
2481 | 2472 | }; |
|
2482 | 2473 | }; |
|
2483 | "domhandler-2.1.0" = { | |
|
2484 | name = "domhandler"; | |
|
2485 | packageName = "domhandler"; | |
|
2486 | version = "2.1.0"; | |
|
2487 | src = fetchurl { | |
|
2488 | url = "https://registry.npmjs.org/domhandler/-/domhandler-2.1.0.tgz"; | |
|
2489 | sha1 = "d2646f5e57f6c3bab11cf6cb05d3c0acf7412594"; | |
|
2490 | }; | |
|
2491 | }; | |
|
2492 | 2474 | "domhandler-2.3.0" = { |
|
2493 | 2475 | name = "domhandler"; |
|
2494 | 2476 | packageName = "domhandler"; |
@@ -2498,15 +2480,6 b' let' | |||
|
2498 | 2480 | sha1 = "2de59a0822d5027fabff6f032c2b25a2a8abe738"; |
|
2499 | 2481 | }; |
|
2500 | 2482 | }; |
|
2501 | "domutils-1.1.6" = { | |
|
2502 | name = "domutils"; | |
|
2503 | packageName = "domutils"; | |
|
2504 | version = "1.1.6"; | |
|
2505 | src = fetchurl { | |
|
2506 | url = "https://registry.npmjs.org/domutils/-/domutils-1.1.6.tgz"; | |
|
2507 | sha1 = "bddc3de099b9a2efacc51c623f28f416ecc57485"; | |
|
2508 | }; | |
|
2509 | }; | |
|
2510 | 2483 | "domutils-1.5.1" = { |
|
2511 | 2484 | name = "domutils"; |
|
2512 | 2485 | packageName = "domutils"; |
@@ -2516,13 +2489,22 b' let' | |||
|
2516 | 2489 | sha1 = "dcd8488a26f563d61079e48c9f7b7e32373682cf"; |
|
2517 | 2490 | }; |
|
2518 | 2491 | }; |
|
2519 |
"d |
|
|
2492 | "dropzone-5.5.1" = { | |
|
2493 | name = "dropzone"; | |
|
2494 | packageName = "dropzone"; | |
|
2495 | version = "5.5.1"; | |
|
2496 | src = fetchurl { | |
|
2497 | url = "https://registry.npmjs.org/dropzone/-/dropzone-5.5.1.tgz"; | |
|
2498 | sha512 = "3VduRWLxx9hbVr42QieQN25mx/I61/mRdUSuxAmDGdDqZIN8qtP7tcKMa3KfpJjuGjOJGYYUzzeq6eGDnkzesA=="; | |
|
2499 | }; | |
|
2500 | }; | |
|
2501 | "duplexify-3.7.1" = { | |
|
2520 | 2502 | name = "duplexify"; |
|
2521 | 2503 | packageName = "duplexify"; |
|
2522 |
version = "3. |
|
|
2523 | src = fetchurl { | |
|
2524 |
url = "https://registry.npmjs.org/duplexify/-/duplexify-3. |
|
|
2525 | sha512 = "vM58DwdnKmty+FSPzT14K9JXb90H+j5emaR4KYbr2KTIz00WHGbWOe5ghQTx233ZCLZtrGDALzKwcjEtSt35mA=="; | |
|
2504 | version = "3.7.1"; | |
|
2505 | src = fetchurl { | |
|
2506 | url = "https://registry.npmjs.org/duplexify/-/duplexify-3.7.1.tgz"; | |
|
2507 | sha512 = "07z8uv2wMyS51kKhD1KsdXJg5WQ6t93RneqRxUHnskXVtlYYkLqM0gqStQZ3pj073g687jPCHrqNfCzawLYh5g=="; | |
|
2526 | 2508 | }; |
|
2527 | 2509 | }; |
|
2528 | 2510 | "ecc-jsbn-0.1.2" = { |
@@ -2534,13 +2516,13 b' let' | |||
|
2534 | 2516 | sha1 = "3a83a904e54353287874c564b7549386849a98c9"; |
|
2535 | 2517 | }; |
|
2536 | 2518 | }; |
|
2537 |
"electron-to-chromium-1.3. |
|
|
2519 | "electron-to-chromium-1.3.133" = { | |
|
2538 | 2520 | name = "electron-to-chromium"; |
|
2539 | 2521 | packageName = "electron-to-chromium"; |
|
2540 |
version = "1.3. |
|
|
2541 | src = fetchurl { | |
|
2542 |
url = "https://registry.npmjs.org/electron-to-chromium/-/electron-to-chromium-1.3. |
|
|
2543 | sha512 = "WIZdNuvE3dFr6kkPgv4d/cfswNZD6XbeLBM8baOIQTsnbf4xWrVEaLvp7oNnbnMWWXDqq7Tbv+H5JfciLTJm4Q=="; | |
|
2522 | version = "1.3.133"; | |
|
2523 | src = fetchurl { | |
|
2524 | url = "https://registry.npmjs.org/electron-to-chromium/-/electron-to-chromium-1.3.133.tgz"; | |
|
2525 | sha512 = "lyoC8aoqbbDqsprb6aPdt9n3DpOZZzdz/T4IZKsR0/dkZIxnJVUjjcpOSwA66jPRIOyDAamCTAUqweU05kKNSg=="; | |
|
2544 | 2526 | }; |
|
2545 | 2527 | }; |
|
2546 | 2528 | "elliptic-6.4.1" = { |
@@ -2651,13 +2633,13 b' let' | |||
|
2651 | 2633 | sha1 = "1b61c0562190a8dff6ae3bb2cf0200ca130b86d4"; |
|
2652 | 2634 | }; |
|
2653 | 2635 | }; |
|
2654 |
"eslint-scope-4.0. |
|
|
2636 | "eslint-scope-4.0.3" = { | |
|
2655 | 2637 | name = "eslint-scope"; |
|
2656 | 2638 | packageName = "eslint-scope"; |
|
2657 |
version = "4.0. |
|
|
2658 | src = fetchurl { | |
|
2659 |
url = "https://registry.npmjs.org/eslint-scope/-/eslint-scope-4.0. |
|
|
2660 | sha512 = "1G6UTDi7Jc1ELFwnR58HV4fK9OQK4S6N985f166xqXxpjU6plxFISJa2Ba9KCQuFa8RCnj/lSFJbHo7UFDBnUA=="; | |
|
2639 | version = "4.0.3"; | |
|
2640 | src = fetchurl { | |
|
2641 | url = "https://registry.npmjs.org/eslint-scope/-/eslint-scope-4.0.3.tgz"; | |
|
2642 | sha512 = "p7VutNr1O/QrxysMo3E45FjYDTeXBy0iTltPFNSqKAIfjDSXC+4dj+qfyuD8bfAXrW/y6lW3O76VaYNPKfpKrg=="; | |
|
2661 | 2643 | }; |
|
2662 | 2644 | }; |
|
2663 | 2645 | "espree-3.5.4" = { |
@@ -2732,13 +2714,13 b' let' | |||
|
2732 | 2714 | sha1 = "8f61b75cde012b2e9eb284d4545583b5643b61ab"; |
|
2733 | 2715 | }; |
|
2734 | 2716 | }; |
|
2735 |
"events- |
|
|
2717 | "events-3.0.0" = { | |
|
2736 | 2718 | name = "events"; |
|
2737 | 2719 | packageName = "events"; |
|
2738 |
version = " |
|
|
2739 | src = fetchurl { | |
|
2740 |
url = "https://registry.npmjs.org/events/-/events- |
|
|
2741 | sha1 = "9ebdb7635ad099c70dcc4c2a1f5004288e8bd924"; | |
|
2720 | version = "3.0.0"; | |
|
2721 | src = fetchurl { | |
|
2722 | url = "https://registry.npmjs.org/events/-/events-3.0.0.tgz"; | |
|
2723 | sha512 = "Dc381HFWJzEOhQ+d8pkNon++bk9h6cdAoAj4iE6Q4y6xgTzySWXlKn05/TVNpjnfRqi/X0EpJEJohPjNI3zpVA=="; | |
|
2742 | 2724 | }; |
|
2743 | 2725 | }; |
|
2744 | 2726 | "evp_bytestokey-1.0.3" = { |
@@ -2948,13 +2930,13 b' let' | |||
|
2948 | 2930 | sha1 = "9326b1488c22d1a6088650a86901b2d9a90a2cbc"; |
|
2949 | 2931 | }; |
|
2950 | 2932 | }; |
|
2951 |
"fined-1. |
|
|
2933 | "fined-1.2.0" = { | |
|
2952 | 2934 | name = "fined"; |
|
2953 | 2935 | packageName = "fined"; |
|
2954 |
version = "1. |
|
|
2955 | src = fetchurl { | |
|
2956 |
url = "https://registry.npmjs.org/fined/-/fined-1. |
|
|
2957 | sha512 = "jQp949ZmEbiYHk3gkbdtpJ0G1+kgtLQBNdP5edFP7Fh+WAYceLQz6yO1SBj72Xkg8GVyTB3bBzAYrHJVh5Xd5g=="; | |
|
2936 | version = "1.2.0"; | |
|
2937 | src = fetchurl { | |
|
2938 | url = "https://registry.npmjs.org/fined/-/fined-1.2.0.tgz"; | |
|
2939 | sha512 = "ZYDqPLGxDkDhDZBjZBb+oD1+j0rA4E0pXY50eplAAOPg2N/gUBSSk5IM1/QhPfyVo19lJ+CvXpqfvk+b2p/8Ng=="; | |
|
2958 | 2940 | }; |
|
2959 | 2941 | }; |
|
2960 | 2942 | "flagged-respawn-1.0.1" = { |
@@ -2975,13 +2957,13 b' let' | |||
|
2975 | 2957 | sha1 = "dae46a9d78fbe25292258cc1e780a41d95c03782"; |
|
2976 | 2958 | }; |
|
2977 | 2959 | }; |
|
2978 |
"flush-write-stream-1. |
|
|
2960 | "flush-write-stream-1.1.1" = { | |
|
2979 | 2961 | name = "flush-write-stream"; |
|
2980 | 2962 | packageName = "flush-write-stream"; |
|
2981 |
version = "1. |
|
|
2982 | src = fetchurl { | |
|
2983 |
url = "https://registry.npmjs.org/flush-write-stream/-/flush-write-stream-1. |
|
|
2984 | sha512 = "calZMC10u0FMUqoiunI2AiGIIUtUIvifNwkHhNupZH4cbNnW1Itkoh/Nf5HFYmDrwWPjrUxpkZT0KhuCq0jmGw=="; | |
|
2963 | version = "1.1.1"; | |
|
2964 | src = fetchurl { | |
|
2965 | url = "https://registry.npmjs.org/flush-write-stream/-/flush-write-stream-1.1.1.tgz"; | |
|
2966 | sha512 = "3Z4XhFZ3992uIq0XOqb9AreonueSYphE6oYbpt5+3u06JWklbsPkNv3ZKkP9Bz/r+1MWCaMoSQ28P85+1Yc77w=="; | |
|
2985 | 2967 | }; |
|
2986 | 2968 | }; |
|
2987 | 2969 | "for-in-1.0.2" = { |
@@ -3056,13 +3038,13 b' let' | |||
|
3056 | 3038 | sha1 = "1504ad2523158caa40db4a2787cb01411994ea4f"; |
|
3057 | 3039 | }; |
|
3058 | 3040 | }; |
|
3059 |
"fsevents-1.2. |
|
|
3041 | "fsevents-1.2.9" = { | |
|
3060 | 3042 | name = "fsevents"; |
|
3061 | 3043 | packageName = "fsevents"; |
|
3062 |
version = "1.2. |
|
|
3063 | src = fetchurl { | |
|
3064 |
url = "https://registry.npmjs.org/fsevents/-/fsevents-1.2. |
|
|
3065 | sha512 = "z8H8/diyk76B7q5wg+Ud0+CqzcAF3mBBI/bA5ne5zrRUUIvNkJY//D3BqyH571KuAC4Nr7Rw7CjWX4r0y9DvNg=="; | |
|
3044 | version = "1.2.9"; | |
|
3045 | src = fetchurl { | |
|
3046 | url = "https://registry.npmjs.org/fsevents/-/fsevents-1.2.9.tgz"; | |
|
3047 | sha512 = "oeyj2H3EjjonWcFjD5NvZNE9Rqe4UW+nQBU2HNeKw0koVLEFIhtyETyAakeAM3de7Z/SW5kcA+fZUait9EApnw=="; | |
|
3066 | 3048 | }; |
|
3067 | 3049 | }; |
|
3068 | 3050 | "function-bind-1.1.1" = { |
@@ -3146,13 +3128,13 b' let' | |||
|
3146 | 3128 | sha1 = "4a973f635b9190f715d10987d5c00fd2815ebe3d"; |
|
3147 | 3129 | }; |
|
3148 | 3130 | }; |
|
3149 |
"glob-7.1. |
|
|
3131 | "glob-7.1.4" = { | |
|
3150 | 3132 | name = "glob"; |
|
3151 | 3133 | packageName = "glob"; |
|
3152 |
version = "7.1. |
|
|
3153 | src = fetchurl { | |
|
3154 |
url = "https://registry.npmjs.org/glob/-/glob-7.1. |
|
|
3155 | sha512 = "vcfuiIxogLV4DlGBHIUOwI0IbrJ8HWPc4MU7HzviGeNho/UJDfi6B5p3sHeWIQ0KGIU0Jpxi5ZHxemQfLkkAwQ=="; | |
|
3134 | version = "7.1.4"; | |
|
3135 | src = fetchurl { | |
|
3136 | url = "https://registry.npmjs.org/glob/-/glob-7.1.4.tgz"; | |
|
3137 | sha512 = "hkLPepehmnKk41pUGm3sYxoFs/umurYfYJCerbXEyFIWcAzvpipAgVkBqqT9RBKMGjnq6kMuyYwha6csxbiM1A=="; | |
|
3156 | 3138 | }; |
|
3157 | 3139 | }; |
|
3158 | 3140 | "glob-parent-3.1.0" = { |
@@ -3524,13 +3506,13 b' let' | |||
|
3524 | 3506 | sha1 = "e36c3f2d2cae7d746a857e38d18d5f32a7882db8"; |
|
3525 | 3507 | }; |
|
3526 | 3508 | }; |
|
3527 |
"homedir-polyfill-1.0. |
|
|
3509 | "homedir-polyfill-1.0.3" = { | |
|
3528 | 3510 | name = "homedir-polyfill"; |
|
3529 | 3511 | packageName = "homedir-polyfill"; |
|
3530 |
version = "1.0. |
|
|
3531 | src = fetchurl { | |
|
3532 |
url = "https://registry.npmjs.org/homedir-polyfill/-/homedir-polyfill-1.0. |
|
|
3533 | sha1 = "4c2bbc8a758998feebf5ed68580f76d46768b4bc"; | |
|
3512 | version = "1.0.3"; | |
|
3513 | src = fetchurl { | |
|
3514 | url = "https://registry.npmjs.org/homedir-polyfill/-/homedir-polyfill-1.0.3.tgz"; | |
|
3515 | sha512 = "eSmmWE5bZTK2Nou4g0AI3zZ9rswp7GRKoKXS1BLUkvPviOqs4YTN1djQIqrXy9k5gEtdLPy86JjRwsNM9tnDcA=="; | |
|
3534 | 3516 | }; |
|
3535 | 3517 | }; |
|
3536 | 3518 | "hooker-0.2.3" = { |
@@ -3587,15 +3569,6 b' let' | |||
|
3587 | 3569 | sha1 = "b01abbd723acaaa7b37b6af4492ebda03d9dd37b"; |
|
3588 | 3570 | }; |
|
3589 | 3571 | }; |
|
3590 | "htmlparser2-3.3.0" = { | |
|
3591 | name = "htmlparser2"; | |
|
3592 | packageName = "htmlparser2"; | |
|
3593 | version = "3.3.0"; | |
|
3594 | src = fetchurl { | |
|
3595 | url = "https://registry.npmjs.org/htmlparser2/-/htmlparser2-3.3.0.tgz"; | |
|
3596 | sha1 = "cc70d05a59f6542e43f0e685c982e14c924a9efe"; | |
|
3597 | }; | |
|
3598 | }; | |
|
3599 | 3572 | "htmlparser2-3.8.3" = { |
|
3600 | 3573 | name = "htmlparser2"; |
|
3601 | 3574 | packageName = "htmlparser2"; |
@@ -3650,13 +3623,13 b' let' | |||
|
3650 | 3623 | sha1 = "83f0a0ec378bf3246178b6c2ad9136f135b1c962"; |
|
3651 | 3624 | }; |
|
3652 | 3625 | }; |
|
3653 |
"ieee754-1.1.1 |
|
|
3626 | "ieee754-1.1.13" = { | |
|
3654 | 3627 | name = "ieee754"; |
|
3655 | 3628 | packageName = "ieee754"; |
|
3656 |
version = "1.1.1 |
|
|
3657 | src = fetchurl { | |
|
3658 |
url = "https://registry.npmjs.org/ieee754/-/ieee754-1.1.1 |
|
|
3659 | sha512 = "GguP+DRY+pJ3soyIiGPTvdiVXjZ+DbXOxGpXn3eMvNW4x4irjqXm4wHKscC+TfxSJ0yw/S1F24tqdMNsMZTiLA=="; | |
|
3629 | version = "1.1.13"; | |
|
3630 | src = fetchurl { | |
|
3631 | url = "https://registry.npmjs.org/ieee754/-/ieee754-1.1.13.tgz"; | |
|
3632 | sha512 = "4vf7I2LYV/HaWerSo3XmlMkp5eZ83i+/CDluXi/IGTs/O1sejBNhTtnxzmRZfvOUqj7lZjqHkeTvpgSFDlWZTg=="; | |
|
3660 | 3633 | }; |
|
3661 | 3634 | }; |
|
3662 | 3635 | "iferr-0.1.5" = { |
@@ -3974,13 +3947,13 b' let' | |||
|
3974 | 3947 | sha1 = "7ba5ae24217804ac70707b96922567486cc3e84a"; |
|
3975 | 3948 | }; |
|
3976 | 3949 | }; |
|
3977 |
"is-glob-4.0. |
|
|
3950 | "is-glob-4.0.1" = { | |
|
3978 | 3951 | name = "is-glob"; |
|
3979 | 3952 | packageName = "is-glob"; |
|
3980 |
version = "4.0. |
|
|
3981 | src = fetchurl { | |
|
3982 |
url = "https://registry.npmjs.org/is-glob/-/is-glob-4.0. |
|
|
3983 | sha1 = "9521c76845cc2610a85203ddf080a958c2ffabc0"; | |
|
3953 | version = "4.0.1"; | |
|
3954 | src = fetchurl { | |
|
3955 | url = "https://registry.npmjs.org/is-glob/-/is-glob-4.0.1.tgz"; | |
|
3956 | sha512 = "5G0tKtBTFImOqDnLB2hG6Bp2qcKEFduo4tZu9MT/H6NQv/ghhy30o55ufafxJ/LdH79LLs2Kfrn85TLKyA7BUg=="; | |
|
3984 | 3957 | }; |
|
3985 | 3958 | }; |
|
3986 | 3959 | "is-number-3.0.0" = { |
@@ -4145,13 +4118,13 b' let' | |||
|
4145 | 4118 | sha1 = "dd8b74278b27102d29df63eae28308a8cfa1b583"; |
|
4146 | 4119 | }; |
|
4147 | 4120 | }; |
|
4148 |
"js-base64-2.5. |
|
|
4121 | "js-base64-2.5.1" = { | |
|
4149 | 4122 | name = "js-base64"; |
|
4150 | 4123 | packageName = "js-base64"; |
|
4151 |
version = "2.5. |
|
|
4152 | src = fetchurl { | |
|
4153 |
url = "https://registry.npmjs.org/js-base64/-/js-base64-2.5. |
|
|
4154 | sha512 = "wlEBIZ5LP8usDylWbDNhKPEFVFdI5hCHpnVoT/Ysvoi/PRhJENm/Rlh9TvjYB38HFfKZN7OzEbRjmjvLkFw11g=="; | |
|
4124 | version = "2.5.1"; | |
|
4125 | src = fetchurl { | |
|
4126 | url = "https://registry.npmjs.org/js-base64/-/js-base64-2.5.1.tgz"; | |
|
4127 | sha512 = "M7kLczedRMYX4L8Mdh4MzyAMM9O5osx+4FcOQuTvr3A9F2D9S5JXheN0ewNbrvK2UatkTRhL5ejGmGSjNMiZuw=="; | |
|
4155 | 4128 | }; |
|
4156 | 4129 | }; |
|
4157 | 4130 | "js-tokens-3.0.2" = { |
@@ -4208,6 +4181,15 b' let' | |||
|
4208 | 4181 | sha1 = "46c3fec8c1892b12b0833db9bc7622176dbab34b"; |
|
4209 | 4182 | }; |
|
4210 | 4183 | }; |
|
4184 | "jshint-2.10.2" = { | |
|
4185 | name = "jshint"; | |
|
4186 | packageName = "jshint"; | |
|
4187 | version = "2.10.2"; | |
|
4188 | src = fetchurl { | |
|
4189 | url = "https://registry.npmjs.org/jshint/-/jshint-2.10.2.tgz"; | |
|
4190 | sha512 = "e7KZgCSXMJxznE/4WULzybCMNXNAd/bf5TSrvVEq78Q/K8ZwFpmBqQeDtNiHc3l49nV4E/+YeHU/JZjSUIrLAA=="; | |
|
4191 | }; | |
|
4192 | }; | |
|
4211 | 4193 | "jshint-2.9.7" = { |
|
4212 | 4194 | name = "jshint"; |
|
4213 | 4195 | packageName = "jshint"; |
@@ -4370,13 +4352,13 b' let' | |||
|
4370 | 4352 | sha1 = "2009291bb31cea861bbf10a7c15a28caf75c31ec"; |
|
4371 | 4353 | }; |
|
4372 | 4354 | }; |
|
4373 | "loader-runner-2. |
|
4355 | "loader-runner-2.4.0" = { | |
|
4374 | 4356 | name = "loader-runner"; |
|
4375 | 4357 | packageName = "loader-runner"; |
|
4376 | version = "2. |
|
4377 | src = fetchurl { | |
|
4378 | url = "https://registry.npmjs.org/loader-runner/-/loader-runner-2. |
|
4379 | sha512 = "By6ZFY7ETWOc9RFaAIb23IjJVcM4dvJC/N57nmdz9RSkMXvAXGI7SyVlAw3v8vjtDRlqThgVDVmTnr9fqMlxkw=="; | |
|
4358 | version = "2.4.0"; | |
|
4359 | src = fetchurl { | |
|
4360 | url = "https://registry.npmjs.org/loader-runner/-/loader-runner-2.4.0.tgz"; | |
|
4361 | sha512 = "Jsmr89RcXGIwivFY21FcRrisYZfvLMTWx5kOLc+JTxtpBOG6xML0vzbc6SEQG2FO9/4Fc3wW4LVcB5DmGflaRw=="; | |
|
4380 | 4362 | }; |
|
4381 | 4363 | }; |
|
4382 | 4364 | "loader-utils-0.2.17" = { |
@@ -4460,15 +4442,6 b' let' | |||
|
4460 | 4442 | sha1 = "b28aa6288a2b9fc651035c7711f65ab6190331a6"; |
|
4461 | 4443 | }; |
|
4462 | 4444 | }; |
|
4463 | "lodash.debounce-4.0.8" = { | |
|
4464 | name = "lodash.debounce"; | |
|
4465 | packageName = "lodash.debounce"; | |
|
4466 | version = "4.0.8"; | |
|
4467 | src = fetchurl { | |
|
4468 | url = "https://registry.npmjs.org/lodash.debounce/-/lodash.debounce-4.0.8.tgz"; | |
|
4469 | sha1 = "82d79bff30a67c4005ffd5e2515300ad9ca4d7af"; | |
|
4470 | }; | |
|
4471 | }; | |
|
4472 | 4445 | "lodash.isplainobject-4.0.6" = { |
|
4473 | 4446 | name = "lodash.isplainobject"; |
|
4474 | 4447 | packageName = "lodash.isplainobject"; |
@@ -4613,13 +4586,13 b' let' | |||
|
4613 | 4586 | sha512 = "xitP+WxNPcTTOgnTJcrhM0xvdPepipPSf3I8EIpGKeFLjt3PlJLIDG3u8EX53ZIubkb+5U2+3rELYpEhHhzdkg=="; |
|
4614 | 4587 | }; |
|
4615 | 4588 | }; |
|
4616 | "mem-4. |
|
4589 | "mem-4.3.0" = { | |
|
4617 | 4590 | name = "mem"; |
|
4618 | 4591 | packageName = "mem"; |
|
4619 | version = "4. |
|
4620 | src = fetchurl { | |
|
4621 | url = "https://registry.npmjs.org/mem/-/mem-4. |
|
4622 | sha512 = "WQxG/5xYc3tMbYLXoXPm81ET2WDULiU5FxbuIoNbJqLOOI8zehXFdZuiUEgfdrU2mVB1pxBZUGlYORSrpuJreA=="; | |
|
4592 | version = "4.3.0"; | |
|
4593 | src = fetchurl { | |
|
4594 | url = "https://registry.npmjs.org/mem/-/mem-4.3.0.tgz"; | |
|
4595 | sha512 = "qX2bG48pTqYRVmDB37rn/6PT7LcR8T7oAX3bf99u1Tt1nzxYfxkgqDwUwolPlXweM0XzBOBFzSx4kfp7KP1s/w=="; | |
|
4623 | 4596 | }; |
|
4624 | 4597 | }; |
|
4625 | 4598 | "memory-fs-0.4.1" = { |
@@ -4658,31 +4631,31 b' let' | |||
|
4658 | 4631 | sha512 = "x0Vn8spI+wuJ1O6S7gnbaQg8Pxh4NNHb7KSINmEWKiPE4RKOplvijn+NkmYmmRgP68mc70j2EbeTFRsrswaQeg=="; |
|
4659 | 4632 | }; |
|
4660 | 4633 | }; |
|
4661 | "mime-db-1. |
|
4634 | "mime-db-1.40.0" = { | |
|
4662 | 4635 | name = "mime-db"; |
|
4663 | 4636 | packageName = "mime-db"; |
|
4664 | version = "1. |
|
4665 | src = fetchurl { | |
|
4666 | url = "https://registry.npmjs.org/mime-db/-/mime-db-1. |
|
4667 | sha512 = "R3C4db6bgQhlIhPU48fUtdVmKnflq+hRdad7IyKhtFj06VPNVdk2RhiYL3UjQIlso8L+YxAtFkobT0VK+S/ybg=="; | |
|
4668 | }; | |
|
4669 | }; | |
|
4670 | "mime-types-2.1.2 |
|
4637 | version = "1.40.0"; | |
|
4638 | src = fetchurl { | |
|
4639 | url = "https://registry.npmjs.org/mime-db/-/mime-db-1.40.0.tgz"; | |
|
4640 | sha512 = "jYdeOMPy9vnxEqFRRo6ZvTZ8d9oPb+k18PKoYNYUe2stVEBPPwsln/qWzdbmaIvnhZ9v2P+CuecK+fpUfsV2mA=="; | |
|
4641 | }; | |
|
4642 | }; | |
|
4643 | "mime-types-2.1.24" = { | |
|
4671 | 4644 | name = "mime-types"; |
|
4672 | 4645 | packageName = "mime-types"; |
|
4673 | version = "2.1.2 |
|
4674 | src = fetchurl { | |
|
4675 | url = "https://registry.npmjs.org/mime-types/-/mime-types-2.1.2 |
|
4676 | sha512 = "3iL6DbwpyLzjR3xHSFNFeb9Nz/M8WDkX33t1GFQnFOllWk8pOrh/LSrB5OXlnlW5P9LH73X6loW/eogc+F5lJg=="; | |
|
4677 | }; | |
|
4678 | }; | |
|
4679 | "mimic-fn-1 |
|
4646 | version = "2.1.24"; | |
|
4647 | src = fetchurl { | |
|
4648 | url = "https://registry.npmjs.org/mime-types/-/mime-types-2.1.24.tgz"; | |
|
4649 | sha512 = "WaFHS3MCl5fapm3oLxU4eYDw77IQM2ACcxQ9RIxfaC3ooc6PFuBMGZZsYpvoXS5D5QTWPieo1jjLdAm3TBP3cQ=="; | |
|
4650 | }; | |
|
4651 | }; | |
|
4652 | "mimic-fn-2.1.0" = { | |
|
4680 | 4653 | name = "mimic-fn"; |
|
4681 | 4654 | packageName = "mimic-fn"; |
|
4682 | version = "1 |
|
4683 | src = fetchurl { | |
|
4684 | url = "https://registry.npmjs.org/mimic-fn/-/mimic-fn-1 |
|
4685 | sha512 = "jf84uxzwiuiIVKiOLpfYk7N46TSy8ubTonmneY9vrpHNAnp0QBt2BxWV9dO3/j+BoVAb+a5G6YDPW3M5HOdMWQ=="; | |
|
4655 | version = "2.1.0"; | |
|
4656 | src = fetchurl { | |
|
4657 | url = "https://registry.npmjs.org/mimic-fn/-/mimic-fn-2.1.0.tgz"; | |
|
4658 | sha512 = "OqbOk5oEQeAZ8WXWydlu9HJjz9WVdEIvamMCcXmuqUYjTknH/sqsWvhQ3vgwKFRR1HpjvNBKQ37nbJgYzGqGcg=="; | |
|
4686 | 4659 | }; |
|
4687 | 4660 | }; |
|
4688 | 4661 | "minimalistic-assert-1.0.1" = { |
@@ -4775,22 +4748,22 b' let' | |||
|
4775 | 4748 | sha1 = "30057438eac6cf7f8c4767f38648d6697d75c903"; |
|
4776 | 4749 | }; |
|
4777 | 4750 | }; |
|
4778 | "moment-2.2 |
|
4751 | "moment-2.24.0" = { | |
|
4779 | 4752 | name = "moment"; |
|
4780 | 4753 | packageName = "moment"; |
|
4781 | version = "2.2 |
|
4782 | src = fetchurl { | |
|
4783 | url = "https://registry.npmjs.org/moment/-/moment-2.2 |
|
4784 | sha512 = "3IE39bHVqFbWWaPOMHZF98Q9c3LDKGTmypMiTM2QygGXXElkFWIH7GxfmlwmY2vwa+wmNsoYZmG2iusf1ZjJoA=="; | |
|
4785 | }; | |
|
4786 | }; | |
|
4787 | "mousetrap-1.6. |
|
4754 | version = "2.24.0"; | |
|
4755 | src = fetchurl { | |
|
4756 | url = "https://registry.npmjs.org/moment/-/moment-2.24.0.tgz"; | |
|
4757 | sha512 = "bV7f+6l2QigeBBZSM/6yTNq4P2fNpSWj/0e7jQcy87A8e7o2nAfP/34/2ky5Vw4B9S446EtIhodAzkFCcR4dQg=="; | |
|
4758 | }; | |
|
4759 | }; | |
|
4760 | "mousetrap-1.6.3" = { | |
|
4788 | 4761 | name = "mousetrap"; |
|
4789 | 4762 | packageName = "mousetrap"; |
|
4790 | version = "1.6. |
|
4791 | src = fetchurl { | |
|
4792 | url = "https://registry.npmjs.org/mousetrap/-/mousetrap-1.6. |
|
4793 | sha512 = "jDjhi7wlHwdO6q6DS7YRmSHcuI+RVxadBkLt3KHrhd3C2b+w5pKefg3oj5beTcHZyVFA9Aksf+yEE1y5jxUjVA=="; | |
|
4763 | version = "1.6.3"; | |
|
4764 | src = fetchurl { | |
|
4765 | url = "https://registry.npmjs.org/mousetrap/-/mousetrap-1.6.3.tgz"; | |
|
4766 | sha512 = "bd+nzwhhs9ifsUrC2tWaSgm24/oo2c83zaRyZQF06hYA6sANfsXHtnZ19AbbbDXCDzeH5nZBSQ4NvCjgD62tJA=="; | |
|
4794 | 4767 | }; |
|
4795 | 4768 | }; |
|
4796 | 4769 | "move-concurrently-1.0.1" = { |
@@ -4811,13 +4784,13 b' let' | |||
|
4811 | 4784 | sha1 = "5608aeadfc00be6c2901df5f9861788de0d597c8"; |
|
4812 | 4785 | }; |
|
4813 | 4786 | }; |
|
4814 | "nan-2.12 |
|
4787 | "nan-2.13.2" = { | |
|
4815 | 4788 | name = "nan"; |
|
4816 | 4789 | packageName = "nan"; |
|
4817 | version = "2.12 |
|
4818 | src = fetchurl { | |
|
4819 | url = "https://registry.npmjs.org/nan/-/nan-2.12 |
|
4820 | sha512 = "JY7V6lRkStKcKTvHO5NVSQRv+RV+FIL5pvDoLiAtSL9pKlC5x9PKQcZDsq7m4FO4d57mkhC6Z+QhAh3Jdk5JFw=="; | |
|
4790 | version = "2.13.2"; | |
|
4791 | src = fetchurl { | |
|
4792 | url = "https://registry.npmjs.org/nan/-/nan-2.13.2.tgz"; | |
|
4793 | sha512 = "TghvYc72wlMGMVMluVo9WRJc0mB8KxxF/gZ4YYFy7V2ZQX9l7rgbPg7vjS9mt6U5HXODVFVI2bOduCzwOMv/lw=="; | |
|
4821 | 4794 | }; |
|
4822 | 4795 | }; |
|
4823 | 4796 | "nanomatch-1.2.13" = { |
@@ -4829,13 +4802,13 b' let' | |||
|
4829 | 4802 | sha512 = "fpoe2T0RbHwBTBUOftAfBPaDEi06ufaUai0mE6Yn1kacc3SnTErfb/h+X94VXzI64rKFHYImXSvdwGGCmwOqCA=="; |
|
4830 | 4803 | }; |
|
4831 | 4804 | }; |
|
4832 | "neo-async-2.6. |
|
4805 | "neo-async-2.6.1" = { | |
|
4833 | 4806 | name = "neo-async"; |
|
4834 | 4807 | packageName = "neo-async"; |
|
4835 | version = "2.6. |
|
4836 | src = fetchurl { | |
|
4837 | url = "https://registry.npmjs.org/neo-async/-/neo-async-2.6. |
|
4838 | sha512 = "MFh0d/Wa7vkKO3Y3LlacqAEeHK0mckVqzDieUKTT+KGxi+zIpeVsFxymkIiRpbpDziHc290Xr9A1O4Om7otoRA=="; | |
|
4808 | version = "2.6.1"; | |
|
4809 | src = fetchurl { | |
|
4810 | url = "https://registry.npmjs.org/neo-async/-/neo-async-2.6.1.tgz"; | |
|
4811 | sha512 = "iyam8fBuCUpWeKPGpaNMetEocMt364qkCsfL9JuhjXX6dRnguRVOfk2GZaDpPjcOKiiXCPINZC1GczQ7iTq3Zw=="; | |
|
4839 | 4812 | }; |
|
4840 | 4813 | }; |
|
4841 | 4814 | "nice-try-1.0.5" = { |
@@ -4856,13 +4829,13 b' let' | |||
|
4856 | 4829 | sha512 = "rmTZ9kz+f3rCvK2TD1Ue/oZlns7OGoIWP4fc3llxxRXlOkHKoWPPWJOfFYpITabSow43QJbRIoHQXtt10VldyQ=="; |
|
4857 | 4830 | }; |
|
4858 | 4831 | }; |
|
4859 | "node-libs-browser-2. |
|
4832 | "node-libs-browser-2.2.0" = { | |
|
4860 | 4833 | name = "node-libs-browser"; |
|
4861 | 4834 | packageName = "node-libs-browser"; |
|
4862 | version = "2. |
|
4863 | src = fetchurl { | |
|
4864 | url = "https://registry.npmjs.org/node-libs-browser/-/node-libs-browser-2. |
|
4865 | sha512 = "5AzFzdoIMb89hBGMZglEegffzgRg+ZFoUmisQ8HI4j1KDdpx13J0taNp2y9xPbur6W61gepGDDotGBVQ7mfUCg=="; | |
|
4835 | version = "2.2.0"; | |
|
4836 | src = fetchurl { | |
|
4837 | url = "https://registry.npmjs.org/node-libs-browser/-/node-libs-browser-2.2.0.tgz"; | |
|
4838 | sha512 = "5MQunG/oyOaBdttrL40dA7bUfPORLRWMUJLQtMg7nluxUvk5XwnLdL9twQHFAjRx/y7mIMkLKT9++qPbbk6BZA=="; | |
|
4866 | 4839 | }; |
|
4867 | 4840 | }; |
|
4868 | 4841 | "nopt-1.0.10" = { |
@@ -4910,6 +4883,15 b' let' | |||
|
4910 | 4883 | sha1 = "1ab28b556e198363a8c1a6f7e6fa20137fe6aed9"; |
|
4911 | 4884 | }; |
|
4912 | 4885 | }; |
|
4886 | "normalize-path-3.0.0" = { | |
|
4887 | name = "normalize-path"; | |
|
4888 | packageName = "normalize-path"; | |
|
4889 | version = "3.0.0"; | |
|
4890 | src = fetchurl { | |
|
4891 | url = "https://registry.npmjs.org/normalize-path/-/normalize-path-3.0.0.tgz"; | |
|
4892 | sha512 = "6eZs5Ls3WtCisHWp9S2GUy8dqkpGi4BVSz3GaqiE6ezub0512ESztXUwUB6C6IKbQkY2Pnb/mD4WYojCRwcwLA=="; | |
|
4893 | }; | |
|
4894 | }; | |
|
4913 | 4895 | "normalize-range-0.1.2" = { |
|
4914 | 4896 | name = "normalize-range"; |
|
4915 | 4897 | packageName = "normalize-range"; |
@@ -4991,13 +4973,13 b' let' | |||
|
4991 | 4973 | sha1 = "7e7d858b781bd7c991a41ba975ed3812754e998c"; |
|
4992 | 4974 | }; |
|
4993 | 4975 | }; |
|
4994 | "object-keys-1. |
|
4976 | "object-keys-1.1.1" = { | |
|
4995 | 4977 | name = "object-keys"; |
|
4996 | 4978 | packageName = "object-keys"; |
|
4997 | version = "1. |
|
4998 | src = fetchurl { | |
|
4999 | url = "https://registry.npmjs.org/object-keys/-/object-keys-1. |
|
5000 | sha512 = "FTMyFUm2wBcGHnH2eXmz7tC6IwlqQZ6mVZ+6dm6vZ4IQIHjs6FdNsQBuKGPuUUUY6NfJw2PshC08Tn6LzLDOag=="; | |
|
4979 | version = "1.1.1"; | |
|
4980 | src = fetchurl { | |
|
4981 | url = "https://registry.npmjs.org/object-keys/-/object-keys-1.1.1.tgz"; | |
|
4982 | sha512 = "NuAESUOUMrlIXOfHKzD6bpPu3tYt3xvjNdRIQ+FeT0lNb4K8WR70CaDxhuNguS2XG+GjkyMwOzsN5ZktImfhLA=="; | |
|
5001 | 4983 | }; |
|
5002 | 4984 | }; |
|
5003 | 4985 | "object-visit-1.0.1" = { |
@@ -5117,13 +5099,13 b' let' | |||
|
5117 | 5099 | sha1 = "3fbcfb15b899a44123b34b6dcc18b724336a2cae"; |
|
5118 | 5100 | }; |
|
5119 | 5101 | }; |
|
5120 | "p-is-promise- |
|
5102 | "p-is-promise-2.1.0" = { | |
|
5121 | 5103 | name = "p-is-promise"; |
|
5122 | 5104 | packageName = "p-is-promise"; |
|
5123 | version = " |
|
5124 | src = fetchurl { | |
|
5125 | url = "https://registry.npmjs.org/p-is-promise/-/p-is-promise- |
|
5126 | sha1 = "9c9456989e9f6588017b0434d56097675c3da05e"; | |
|
5105 | version = "2.1.0"; | |
|
5106 | src = fetchurl { | |
|
5107 | url = "https://registry.npmjs.org/p-is-promise/-/p-is-promise-2.1.0.tgz"; | |
|
5108 | sha512 = "Y3W0wlRPK8ZMRbNq97l4M5otioeA5lm1z7bkNkxCka8HSPjR0xRWmpCmc9utiaLP9Jb1eD8BgeIxTW4AIF45Pg=="; | |
|
5127 | 5109 | }; |
|
5128 | 5110 | }; |
|
5129 | 5111 | "p-limit-1.3.0" = { |
@@ -5135,13 +5117,13 b' let' | |||
|
5135 | 5117 | sha512 = "vvcXsLAJ9Dr5rQOPk7toZQZJApBl2K4J6dANSsEuh6QI41JYcsS/qhTGa9ErIUUgK3WNQoJYvylxvjqmiqEA9Q=="; |
|
5136 | 5118 | }; |
|
5137 | 5119 | }; |
|
5138 | "p-limit-2. |
|
5120 | "p-limit-2.2.0" = { | |
|
5139 | 5121 | name = "p-limit"; |
|
5140 | 5122 | packageName = "p-limit"; |
|
5141 | version = "2. |
|
5142 | src = fetchurl { | |
|
5143 | url = "https://registry.npmjs.org/p-limit/-/p-limit-2. |
|
5144 | sha512 = "NhURkNcrVB+8hNfLuysU8enY5xn2KXphsHBaC2YmRNTZRc7RWusw6apSpdEj3jo4CMb6W9nrF6tTnsJsJeyu6g=="; | |
|
5123 | version = "2.2.0"; | |
|
5124 | src = fetchurl { | |
|
5125 | url = "https://registry.npmjs.org/p-limit/-/p-limit-2.2.0.tgz"; | |
|
5126 | sha512 = "pZbTJpoUsCzV48Mc9Nh51VbwO0X9cuPFE8gYwx9BTCt9SF8/b7Zljd2fVgOxhIF/HDTKgpVzs+GPhyKfjLLFRQ=="; | |
|
5145 | 5127 | }; |
|
5146 | 5128 | }; |
|
5147 | 5129 | "p-locate-2.0.0" = { |
@@ -5171,22 +5153,22 b' let' | |||
|
5171 | 5153 | sha1 = "cbc79cdbaf8fd4228e13f621f2b1a237c1b207b3"; |
|
5172 | 5154 | }; |
|
5173 | 5155 | }; |
|
5174 | "p-try-2. |
|
5156 | "p-try-2.2.0" = { | |
|
5175 | 5157 | name = "p-try"; |
|
5176 | 5158 | packageName = "p-try"; |
|
5177 | version = "2. |
|
5178 | src = fetchurl { | |
|
5179 | url = "https://registry.npmjs.org/p-try/-/p-try-2. |
|
5180 | sha512 = "hMp0onDKIajHfIkdRk3P4CdCmErkYAxxDtP3Wx/4nZ3aGlau2VKh3mZpcuFkH27WQkL/3WBCPOktzA9ZOAnMQQ=="; | |
|
5181 | }; | |
|
5182 | }; | |
|
5183 | "pako-1.0. |
|
5159 | version = "2.2.0"; | |
|
5160 | src = fetchurl { | |
|
5161 | url = "https://registry.npmjs.org/p-try/-/p-try-2.2.0.tgz"; | |
|
5162 | sha512 = "R4nPAVTAU0B9D35/Gk3uJf/7XYbQcyohSKdvAxIRSNghFl4e71hVoGnBNQz9cWaXxO2I10KTC+3jMdvvoKw6dQ=="; | |
|
5163 | }; | |
|
5164 | }; | |
|
5165 | "pako-1.0.10" = { | |
|
5184 | 5166 | name = "pako"; |
|
5185 | 5167 | packageName = "pako"; |
|
5186 | version = "1.0. |
|
5187 | src = fetchurl { | |
|
5188 | url = "https://registry.npmjs.org/pako/-/pako-1.0. |
|
5189 | sha512 = "3HNK5tW4x8o5mO8RuHZp3Ydw9icZXx0RANAOMzlMzx7LVXhMJ4mo3MOBpzyd7r/+RUu8BmndP47LXT+vzjtWcQ=="; | |
|
5168 | version = "1.0.10"; | |
|
5169 | src = fetchurl { | |
|
5170 | url = "https://registry.npmjs.org/pako/-/pako-1.0.10.tgz"; | |
|
5171 | sha512 = "0DTvPVU3ed8+HNXOu5Bs+o//Mbdj9VNQMUOe9oKCwh8l0GNwpTDMKCWbRjgtD291AWnkAgkqA/LOnQS8AmS1tw=="; | |
|
5190 | 5172 | }; |
|
5191 | 5173 | }; |
|
5192 | 5174 | "parallel-transform-1.1.0" = { |
@@ -5207,13 +5189,13 b' let' | |||
|
5207 | 5189 | sha1 = "df94fd8cf6531ecf75e6bef9a0858fbc72be2247"; |
|
5208 | 5190 | }; |
|
5209 | 5191 | }; |
|
5210 | "parse-asn1-5.1. |
|
5192 | "parse-asn1-5.1.4" = { | |
|
5211 | 5193 | name = "parse-asn1"; |
|
5212 | 5194 | packageName = "parse-asn1"; |
|
5213 | version = "5.1. |
|
5214 | src = fetchurl { | |
|
5215 | url = "https://registry.npmjs.org/parse-asn1/-/parse-asn1-5.1. |
|
5216 | sha512 = "KPx7flKXg775zZpnp9SxJlz00gTd4BmJ2yJufSc44gMCRrRQ7NSzAcSJQfifuOLgW6bEi+ftrALtsgALeB2Adw=="; | |
|
5195 | version = "5.1.4"; | |
|
5196 | src = fetchurl { | |
|
5197 | url = "https://registry.npmjs.org/parse-asn1/-/parse-asn1-5.1.4.tgz"; | |
|
5198 | sha512 = "Qs5duJcuvNExRfFZ99HDD3z4mAi3r9Wl/FOjEOijlxwCZs7E7mW2vjTpgQ4J8LpTF8x5v+1Vn5UQFejmWT11aw=="; | |
|
5217 | 5199 | }; |
|
5218 | 5200 | }; |
|
5219 | 5201 | "parse-filepath-1.0.2" = { |
@@ -5927,13 +5909,13 b' let' | |||
|
5927 | 5909 | sha1 = "9ec61f79049875707d69414596fd907a4d711e73"; |
|
5928 | 5910 | }; |
|
5929 | 5911 | }; |
|
5930 | "randombytes-2.0 |
|
5912 | "randombytes-2.1.0" = { | |
|
5931 | 5913 | name = "randombytes"; |
|
5932 | 5914 | packageName = "randombytes"; |
|
5933 | version = "2.0 |
|
5934 | src = fetchurl { | |
|
5935 | url = "https://registry.npmjs.org/randombytes/-/randombytes-2.0 |
|
5936 | sha512 = "CIQ5OFxf4Jou6uOKe9t1AOgqpeU5fd70A8NPdHSGeYXqXsPe6peOwI0cUl88RWZ6sP1vPMV3avd/R6cZ5/sP1A=="; | |
|
5915 | version = "2.1.0"; | |
|
5916 | src = fetchurl { | |
|
5917 | url = "https://registry.npmjs.org/randombytes/-/randombytes-2.1.0.tgz"; | |
|
5918 | sha512 = "vYl3iOX+4CKUWuxGi9Ukhie6fsqXqS9FE2Zaic4tNFD2N2QQaXOMFbuKK4QmDHC0JO6B1Zp41J0LpT0oR68amQ=="; | |
|
5937 | 5919 | }; |
|
5938 | 5920 | }; |
|
5939 | 5921 | "randomfill-1.0.4" = { |
@@ -5954,15 +5936,6 b' let' | |||
|
5954 | 5936 | sha512 = "guh4ZNAf96f+CDwfnPbFeFiO5YcfPllUmZrgcoOmx6iqZPq+DcKbnyjPuBxEAtQ3tqqd++qChsQfQB+VBzFT0Q=="; |
|
5955 | 5937 | }; |
|
5956 | 5938 | }; |
|
5957 | "readable-stream-1.0.34" = { | |
|
5958 | name = "readable-stream"; | |
|
5959 | packageName = "readable-stream"; | |
|
5960 | version = "1.0.34"; | |
|
5961 | src = fetchurl { | |
|
5962 | url = "https://registry.npmjs.org/readable-stream/-/readable-stream-1.0.34.tgz"; | |
|
5963 | sha1 = "125820e34bc842d2f2aaafafe4c2916ee32c157c"; | |
|
5964 | }; | |
|
5965 | }; | |
|
5966 | 5939 | "readable-stream-1.1.14" = { |
|
5967 | 5940 | name = "readable-stream"; |
|
5968 | 5941 | packageName = "readable-stream"; |
@@ -6116,13 +6089,13 b' let' | |||
|
6116 | 6089 | sha1 = "c24bce2a283adad5bc3f58e0d48249b92379d8ef"; |
|
6117 | 6090 | }; |
|
6118 | 6091 | }; |
|
6119 | "renderkid-2.0. |
|
6092 | "renderkid-2.0.3" = { | |
|
6120 | 6093 | name = "renderkid"; |
|
6121 | 6094 | packageName = "renderkid"; |
|
6122 | version = "2.0. |
|
6123 | src = fetchurl { | |
|
6124 | url = "https://registry.npmjs.org/renderkid/-/renderkid-2.0. |
|
6125 | sha512 = "FsygIxevi1jSiPY9h7vZmBFUbAOcbYm9UwyiLNdVsLRs/5We9Ob5NMPbGYUTWiLq5L+ezlVdE0A8bbME5CWTpg=="; | |
|
6095 | version = "2.0.3"; | |
|
6096 | src = fetchurl { | |
|
6097 | url = "https://registry.npmjs.org/renderkid/-/renderkid-2.0.3.tgz"; | |
|
6098 | sha512 = "z8CLQp7EZBPCwCnncgf9C4XAi3WR0dv+uWu/PjIyhhAb5d6IJ/QZqlHFprHeKT+59//V6BNUsLbvN8+2LarxGA=="; | |
|
6126 | 6099 | }; |
|
6127 | 6100 | }; |
|
6128 | 6101 | "repeat-element-1.1.3" = { |
@@ -6179,13 +6152,13 b' let' | |||
|
6179 | 6152 | sha1 = "97f717b69d48784f5f526a6c5aa8ffdda055a4d1"; |
|
6180 | 6153 | }; |
|
6181 | 6154 | }; |
|
6182 | "resolve-1. |
|
6155 | "resolve-1.10.1" = { | |
|
6183 | 6156 | name = "resolve"; |
|
6184 | 6157 | packageName = "resolve"; |
|
6185 | version = "1. |
|
6186 | src = fetchurl { | |
|
6187 | url = "https://registry.npmjs.org/resolve/-/resolve-1. |
|
6188 | sha512 = "TZNye00tI67lwYvzxCxHGjwTNlUV70io54/Ed4j6PscB8xVfuBJpRenI/o6dVk0cY0PYTY27AgCoGGxRnYuItQ=="; | |
|
6158 | version = "1.10.1"; | |
|
6159 | src = fetchurl { | |
|
6160 | url = "https://registry.npmjs.org/resolve/-/resolve-1.10.1.tgz"; | |
|
6161 | sha512 = "KuIe4mf++td/eFb6wkaPbMDnP6kObCaEtIDuHOUED6MNUo4K670KZUHuuvYPZDxNF0WVLw49n06M2m2dXphEzA=="; | |
|
6189 | 6162 | }; |
|
6190 | 6163 | }; |
|
6191 | 6164 | "resolve-cwd-2.0.0" = { |
@@ -6332,22 +6305,22 b' let' | |||
|
6332 | 6305 | sha1 = "0e7350acdec80b1108528786ec1d4418d11b396d"; |
|
6333 | 6306 | }; |
|
6334 | 6307 | }; |
|
6335 | "semver-5. |
|
6308 | "semver-5.7.0" = { | |
|
6336 | 6309 | name = "semver"; |
|
6337 | 6310 | packageName = "semver"; |
|
6338 | version = "5. |
|
6339 | src = fetchurl { | |
|
6340 | url = "https://registry.npmjs.org/semver/-/semver-5. |
|
6341 | sha512 = "RS9R6R35NYgQn++fkDWaOmqGoj4Ek9gGs+DPxNUZKuwE183xjJroKvyo1IzVFeXvUrvmALy6FWD5xrdJT25gMg=="; | |
|
6342 | }; | |
|
6343 | }; | |
|
6344 | "serialize-javascript-1. |
|
6311 | version = "5.7.0"; | |
|
6312 | src = fetchurl { | |
|
6313 | url = "https://registry.npmjs.org/semver/-/semver-5.7.0.tgz"; | |
|
6314 | sha512 = "Ya52jSX2u7QKghxeoFGpLwCtGlt7j0oY9DYb5apt9nPlJ42ID+ulTXESnt/qAQcoSERyZ5sl3LDIOw0nAn/5DA=="; | |
|
6315 | }; | |
|
6316 | }; | |
|
6317 | "serialize-javascript-1.7.0" = { | |
|
6345 | 6318 | name = "serialize-javascript"; |
|
6346 | 6319 | packageName = "serialize-javascript"; |
|
6347 | version = "1. |
|
6348 | src = fetchurl { | |
|
6349 | url = "https://registry.npmjs.org/serialize-javascript/-/serialize-javascript-1. |
|
6350 | sha512 = "A5MOagrPFga4YaKQSWHryl7AXvbQkEqpw4NNYMTNYUNV51bA8ABHgYFpqKx+YFFrw59xMV1qGH1R4AgoNIVgCw=="; | |
|
6320 | version = "1.7.0"; | |
|
6321 | src = fetchurl { | |
|
6322 | url = "https://registry.npmjs.org/serialize-javascript/-/serialize-javascript-1.7.0.tgz"; | |
|
6323 | sha512 = "ke8UG8ulpFOxO8f8gRYabHQe/ZntKlcig2Mp+8+URDP1D8vJZ0KUt7LYo07q25Z/+JVSgpr/cui9PIp5H6/+nA=="; | |
|
6351 | 6324 | }; |
|
6352 | 6325 | }; |
|
6353 | 6326 | "set-blocking-2.0.0" = { |
@@ -6593,13 +6566,13 b' let' | |||
|
6593 | 6566 | sha1 = "04e6926f662895354f3dd015203633b857297e2c"; |
|
6594 | 6567 | }; |
|
6595 | 6568 | }; |
|
6596 | "sshpk-1.16. |
|
6569 | "sshpk-1.16.1" = { | |
|
6597 | 6570 | name = "sshpk"; |
|
6598 | 6571 | packageName = "sshpk"; |
|
6599 | version = "1.16. |
|
6600 | src = fetchurl { | |
|
6601 | url = "https://registry.npmjs.org/sshpk/-/sshpk-1.16. |
|
6602 | sha512 = "Zhev35/y7hRMcID/upReIvRse+I9SVhyVre/KTJSJQWMz3C3+G+HpO7m1wK/yckEtujKZ7dS4hkVxAnmHaIGVQ=="; | |
|
6572 | version = "1.16.1"; | |
|
6573 | src = fetchurl { | |
|
6574 | url = "https://registry.npmjs.org/sshpk/-/sshpk-1.16.1.tgz"; | |
|
6575 | sha512 = "HXXqVUq7+pcKeLqqZj6mHFUMvXtOJt1uoUx09pFW6011inTMxqI8BA8PM95myrIyyKwdnzjdFjLiE6KBPVtJIg=="; | |
|
6603 | 6576 | }; |
|
6604 | 6577 | }; |
|
6605 | 6578 | "ssri-5.3.0" = { |
@@ -6629,13 +6602,13 b' let' | |||
|
6629 | 6602 | sha1 = "142bf6b64c2b416e4b707ebf8f09b8b5a5043877"; |
|
6630 | 6603 | }; |
|
6631 | 6604 | }; |
|
6632 | "stream-browserify-2.0. |
|
6605 | "stream-browserify-2.0.2" = { | |
|
6633 | 6606 | name = "stream-browserify"; |
|
6634 | 6607 | packageName = "stream-browserify"; |
|
6635 | version = "2.0. |
|
6636 | src = fetchurl { | |
|
6637 | url = "https://registry.npmjs.org/stream-browserify/-/stream-browserify-2.0. |
|
6638 | sha1 = "66266ee5f9bdb9940a4e4514cafb43bb71e5c9db"; | |
|
6608 | version = "2.0.2"; | |
|
6609 | src = fetchurl { | |
|
6610 | url = "https://registry.npmjs.org/stream-browserify/-/stream-browserify-2.0.2.tgz"; | |
|
6611 | sha512 = "nX6hmklHs/gr2FuxYDltq8fJA1GDlxKQCz8O/IM4atRqBH8OORmBNgfvW5gG10GT/qQ9u0CzIvr2X5Pkt6ntqg=="; | |
|
6639 | 6612 | }; |
|
6640 | 6613 | }; |
|
6641 | 6614 | "stream-each-1.2.3" = { |
@@ -6836,13 +6809,13 b' let' | |||
|
6836 | 6809 | sha512 = "2wsvQ+4GwBvLPLWsNfLCDYGsW6xb7aeC6utq2Qh0PFwgEy7K7dsma9Jsmb2zSQj7GvYAyUGSntLtsv++GmgL1A=="; |
|
6837 | 6810 | }; |
|
6838 | 6811 | }; |
|
6839 | "tapable-1.1. |
|
6812 | "tapable-1.1.3" = { | |
|
6840 | 6813 | name = "tapable"; |
|
6841 | 6814 | packageName = "tapable"; |
|
6842 | version = "1.1. |
|
6843 | src = fetchurl { | |
|
6844 | url = "https://registry.npmjs.org/tapable/-/tapable-1.1. |
|
6845 | sha512 = "9I2ydhj8Z9veORCw5PRm4u9uebCn0mcCa6scWoNcbZ6dAtoo2618u9UUzxgmsCOreJpqDDuv61LvwofW7hLcBA=="; | |
|
6815 | version = "1.1.3"; | |
|
6816 | src = fetchurl { | |
|
6817 | url = "https://registry.npmjs.org/tapable/-/tapable-1.1.3.tgz"; | |
|
6818 | sha512 = "4WK/bYZmj8xLr+HUCODHGF1ZFzsYffasLUgEiMBY4fgtltdO6B4WJtlSbPaDTLpYTcGVwM2qLnFTICEcNxs3kA=="; | |
|
6846 | 6819 | }; |
|
6847 | 6820 | }; |
|
6848 | 6821 | "through-2.3.8" = { |
@@ -6872,13 +6845,13 b' let' | |||
|
6872 | 6845 | sha512 = "YvC1SV1XdOUaL6gx5CoGroT3Gu49pK9+TZ38ErPldOWW4j49GI1HKs9DV+KGq/w6y+LZ72W1c8cKz2vzY+qpzg=="; |
|
6873 | 6846 | }; |
|
6874 | 6847 | }; |
|
6875 | "tiny-emitter-2.0 |
|
6848 | "tiny-emitter-2.1.0" = { | |
|
6876 | 6849 | name = "tiny-emitter"; |
|
6877 | 6850 | packageName = "tiny-emitter"; |
|
6878 | version = "2.0 |
|
6879 | src = fetchurl { | |
|
6880 | url = "https://registry.npmjs.org/tiny-emitter/-/tiny-emitter-2.0 |
|
6881 | sha512 = "2NM0auVBGft5tee/OxP4PI3d8WItkDM+fPnaRAVo6xTDI2knbz9eC5ArWGqtGlYqiH3RU5yMpdyTTO7MguC4ow=="; | |
|
6851 | version = "2.1.0"; | |
|
6852 | src = fetchurl { | |
|
6853 | url = "https://registry.npmjs.org/tiny-emitter/-/tiny-emitter-2.1.0.tgz"; | |
|
6854 | sha512 = "NB6Dk1A9xgQPMoGqC5CVXn123gWyte215ONT5Pp5a0yt4nlEoO1ZWeCwpncaekPHXO60i47ihFnZPiRPjRMq4Q=="; | |
|
6882 | 6855 | }; |
|
6883 | 6856 | }; |
|
6884 | 6857 | "tiny-lr-fork-0.0.5" = { |
@@ -7034,13 +7007,13 b' let' | |||
|
7034 | 7007 | sha1 = "29c5733148057bb4e1f75df35b7a9cb72e6a59dd"; |
|
7035 | 7008 | }; |
|
7036 | 7009 | }; |
|
7037 | "uglify-js-3.4. |
|
7010 | "uglify-js-3.4.10" = { | |
|
7038 | 7011 | name = "uglify-js"; |
|
7039 | 7012 | packageName = "uglify-js"; |
|
7040 | version = "3.4. |
|
7041 | src = fetchurl { | |
|
7042 | url = "https://registry.npmjs.org/uglify-js/-/uglify-js-3.4. |
|
7043 | sha512 = "8CJsbKOtEbnJsTyv6LE6m6ZKniqMiFWmm9sRbopbkGs3gMPPfd3Fh8iIA4Ykv5MgaTbqHr4BaoGLJLZNhsrW1Q=="; | |
|
7013 | version = "3.4.10"; | |
|
7014 | src = fetchurl { | |
|
7015 | url = "https://registry.npmjs.org/uglify-js/-/uglify-js-3.4.10.tgz"; | |
|
7016 | sha512 = "Y2VsbPVs0FIshJztycsO2SfPk7/KAF/T72qzv9u5EpQ4kB2hQoHlhNQTsNyy6ul7lQtqJN/AoWeS23OzEiEFxw=="; | |
|
7044 | 7017 | }; |
|
7045 | 7018 | }; |
|
7046 | 7019 | "uglify-to-browserify-1.0.2" = { |
@@ -7160,13 +7133,13 b' let' | |||
|
7160 | 7133 | sha1 = "8376873f7d2335179ffb1e6fc3a8ed0dfc8ab559"; |
|
7161 | 7134 | }; |
|
7162 | 7135 | }; |
|
7163 | "upath-1.1. |
|
7136 | "upath-1.1.2" = { | |
|
7164 | 7137 | name = "upath"; |
|
7165 | 7138 | packageName = "upath"; |
|
7166 | version = "1.1. |
|
7167 | src = fetchurl { | |
|
7168 | url = "https://registry.npmjs.org/upath/-/upath-1.1. |
|
7169 | sha512 = "bzpH/oBhoS/QI/YtbkqCg6VEiPYjSZtrHQM6/QnJS6OL9pKUFLqb3aFh4Scvwm45+7iAgiMkLhSbaZxUqmrprw=="; | |
|
7139 | version = "1.1.2"; | |
|
7140 | src = fetchurl { | |
|
7141 | url = "https://registry.npmjs.org/upath/-/upath-1.1.2.tgz"; | |
|
7142 | sha512 = "kXpym8nmDmlCBr7nKdIx8P2jNBa+pBpIUFRnKJ4dr8htyYGJFokkr2ZvERRtUN+9SY+JqXouNgUPtv6JQva/2Q=="; | |
|
7170 | 7143 | }; |
|
7171 | 7144 | }; |
|
7172 | 7145 | "upper-case-1.1.3" = { |
@@ -7223,13 +7196,13 b' let' | |||
|
7223 | 7196 | sha1 = "7afb1afe50805246489e3db7fe0ed379336ac0f9"; |
|
7224 | 7197 | }; |
|
7225 | 7198 | }; |
|
7226 | "util-0.1 |
|
7199 | "util-0.11.1" = { | |
|
7227 | 7200 | name = "util"; |
|
7228 | 7201 | packageName = "util"; |
|
7229 | version = "0.1 |
|
7230 | src = fetchurl { | |
|
7231 | url = "https://registry.npmjs.org/util/-/util-0.1 |
|
7232 | sha512 = "0Pm9hTQ3se5ll1XihRic3FDIku70C+iHUdT/W926rSgHV5QgXsYbKZN8MSC3tJtSkhuROzvsQjAaFENRXr+19A=="; | |
|
7202 | version = "0.11.1"; | |
|
7203 | src = fetchurl { | |
|
7204 | url = "https://registry.npmjs.org/util/-/util-0.11.1.tgz"; | |
|
7205 | sha512 = "HShAsny+zS2TZfaXxD9tYj4HQGlBezXZMZuM/S5PKLLoZkShZiGk9o5CzukI1LVHZvjdvZ2Sj1aW/Ndn2NB/HQ=="; | |
|
7233 | 7206 | }; |
|
7234 | 7207 | }; |
|
7235 | 7208 | "util-deprecate-1.0.2" = { |
@@ -7268,31 +7241,31 b' let' | |||
|
7268 | 7241 | sha512 = "yXJmeNaw3DnnKAOKJE51sL/ZaYfWJRl1pK9dr19YFCu0ObS231AB1/LbqTKRAQ5kw8A90rA6fr4riOUpTZvQZA=="; |
|
7269 | 7242 | }; |
|
7270 | 7243 | }; |
|
7271 | "v8-compile-cache-2.0. |
|
7244 | "v8-compile-cache-2.0.3" = { | |
|
7272 | 7245 | name = "v8-compile-cache"; |
|
7273 | 7246 | packageName = "v8-compile-cache"; |
|
7274 | version = "2.0. |
|
7275 | src = fetchurl { | |
|
7276 | url = "https://registry.npmjs.org/v8-compile-cache/-/v8-compile-cache-2.0. |
|
7277 | sha512 = "1wFuMUIM16MDJRCrpbpuEPTUGmM5QMUg0cr3KFwra2XgOgFcPGDQHDh3CszSCD2Zewc/dh/pamNEW8CbfDebUw=="; | |
|
7278 | }; | |
|
7279 | }; | |
|
7280 | "v8flags-3.1. |
|
7247 | version = "2.0.3"; | |
|
7248 | src = fetchurl { | |
|
7249 | url = "https://registry.npmjs.org/v8-compile-cache/-/v8-compile-cache-2.0.3.tgz"; | |
|
7250 | sha512 = "CNmdbwQMBjwr9Gsmohvm0pbL954tJrNzf6gWL3K+QMQf00PF7ERGrEiLgjuU3mKreLC2MeGhUsNV9ybTbLgd3w=="; | |
|
7251 | }; | |
|
7252 | }; | |
|
7253 | "v8flags-3.1.3" = { | |
|
7281 | 7254 | name = "v8flags"; |
|
7282 | 7255 | packageName = "v8flags"; |
|
7283 | version = "3.1. |
|
7284 | src = fetchurl { | |
|
7285 | url = "https://registry.npmjs.org/v8flags/-/v8flags-3.1. |
|
7286 | sha512 = "MtivA7GF24yMPte9Rp/BWGCYQNaUj86zeYxV/x2RRJMKagImbbv3u8iJC57lNhWLPcGLJmHcHmFWkNsplbbLWw=="; | |
|
7287 | }; | |
|
7288 | }; | |
|
7289 | "vendors-1.0. |
|
7256 | version = "3.1.3"; | |
|
7257 | src = fetchurl { | |
|
7258 | url = "https://registry.npmjs.org/v8flags/-/v8flags-3.1.3.tgz"; | |
|
7259 | sha512 = "amh9CCg3ZxkzQ48Mhcb8iX7xpAfYJgePHxWMQCBWECpOSqJUXgY26ncA61UTV0BkPqfhcy6mzwCIoP4ygxpW8w=="; | |
|
7260 | }; | |
|
7261 | }; | |
|
7262 | "vendors-1.0.3" = { | |
|
7290 | 7263 | name = "vendors"; |
|
7291 | 7264 | packageName = "vendors"; |
|
7292 | version = "1.0. |
|
7293 | src = fetchurl { | |
|
7294 | url = "https://registry.npmjs.org/vendors/-/vendors-1.0. |
|
7295 | sha512 = "w/hry/368nO21AN9QljsaIhb9ZiZtZARoVH5f3CsFbawdLdayCgKRPup7CggujvySMxx0I91NOyxdVENohprLQ=="; | |
|
7265 | version = "1.0.3"; | |
|
7266 | src = fetchurl { | |
|
7267 | url = "https://registry.npmjs.org/vendors/-/vendors-1.0.3.tgz"; | |
|
7268 | sha512 = "fOi47nsJP5Wqefa43kyWSg80qF+Q3XA6MUkgi7Hp1HQaKDQW4cQrK2D0P7mmbFtsV1N89am55Yru/nyEwRubcw=="; | |
|
7296 | 7269 | }; |
|
7297 | 7270 | }; |
|
7298 | 7271 | "verror-1.10.0" = { |
@@ -7430,13 +7403,13 b' let' | |||
|
7430 | 7403 | sha1 = "b79669bb42ecb409f83d583cad52ca17eaa1643f"; |
|
7431 | 7404 | }; |
|
7432 | 7405 | }; |
|
7433 | "worker-farm-1. |
|
7406 | "worker-farm-1.7.0" = { | |
|
7434 | 7407 | name = "worker-farm"; |
|
7435 | 7408 | packageName = "worker-farm"; |
|
7436 | version = "1. |
|
7437 | src = fetchurl { | |
|
7438 | url = "https://registry.npmjs.org/worker-farm/-/worker-farm-1. |
|
7439 | sha512 = "6w+3tHbM87WnSWnENBUvA2pxJPLhQUg5LKwUQHq3r+XPhIM+Gh2R5ycbwPCyuGbNg+lPgdcnQUhuC02kJCvffQ=="; | |
|
7409 | version = "1.7.0"; | |
|
7410 | src = fetchurl { | |
|
7411 | url = "https://registry.npmjs.org/worker-farm/-/worker-farm-1.7.0.tgz"; | |
|
7412 | sha512 = "rvw3QTZc8lAxyVrqcSGVm5yP/IJ2UcB3U0graE3LCFoZ0Yn2x4EoVSqJKdB/T5M+FLcRPjz4TDacRf3OCfNUzw=="; | |
|
7440 | 7413 | }; |
|
7441 | 7414 | }; |
|
7442 | 7415 | "wrap-ansi-2.1.0" = { |
@@ -7536,14 +7509,14 b' let' | |||
|
7536 | 7509 | sources."@polymer/paper-behaviors-3.0.1" |
|
7537 | 7510 | sources."@polymer/paper-button-3.0.1" |
|
7538 | 7511 | sources."@polymer/paper-ripple-3.0.1" |
|
7539 | sources."@polymer/paper-spinner-3.0. |
|
7512 | sources."@polymer/paper-spinner-3.0.2" | |
|
7540 | 7513 | sources."@polymer/paper-styles-3.0.1" |
|
7541 | 7514 | sources."@polymer/paper-toast-3.0.1" |
|
7542 | 7515 | sources."@polymer/paper-toggle-button-3.0.1" |
|
7543 | 7516 | sources."@polymer/paper-tooltip-3.0.1" |
|
7544 | sources."@polymer/polymer-3. |
|
7517 | sources."@polymer/polymer-3.2.0" | |
|
7545 | 7518 | sources."@types/clone-0.1.30" |
|
7546 | sources."@types/node-6.14. |
|
7519 | sources."@types/node-6.14.6" | |
|
7547 | 7520 | sources."@types/parse5-2.2.34" |
|
7548 | 7521 | sources."@webassemblyjs/ast-1.7.10" |
|
7549 | 7522 | sources."@webassemblyjs/floating-point-hex-parser-1.7.10" |
@@ -7563,8 +7536,8 b' let' | |||
|
7563 | 7536 | sources."@webassemblyjs/wasm-parser-1.7.10" |
|
7564 | 7537 | sources."@webassemblyjs/wast-parser-1.7.10" |
|
7565 | 7538 | sources."@webassemblyjs/wast-printer-1.7.10" |
|
7566 | sources."@webcomponents/shadycss-1. |
|
7567 | sources."@webcomponents/webcomponentsjs-2.2.1" | |
|
7539 | sources."@webcomponents/shadycss-1.9.1" | |
|
7540 | sources."@webcomponents/webcomponentsjs-2.2.10" | |
|
7568 | 7541 | sources."@xtuc/ieee754-1.2.0" |
|
7569 | 7542 | sources."@xtuc/long-4.2.1" |
|
7570 | 7543 | sources."abbrev-1.1.1" |
@@ -7576,7 +7549,7 b' let' | |||
|
7576 | 7549 | ]; |
|
7577 | 7550 | }) |
|
7578 | 7551 | sources."ajv-4.11.8" |
|
7579 | sources."ajv-keywords-3. |
|
7552 | sources."ajv-keywords-3.4.0" | |
|
7580 | 7553 | (sources."align-text-0.1.4" // { |
|
7581 | 7554 | dependencies = [ |
|
7582 | 7555 | sources."kind-of-3.2.2" |
@@ -7586,7 +7559,11 b' let' | |||
|
7586 | 7559 | sources."amdefine-1.0.1" |
|
7587 | 7560 | sources."ansi-regex-0.2.1" |
|
7588 | 7561 | sources."ansi-styles-1.1.0" |
|
7589 | sources."anymatch-2.0.0" | |
|
7562 | (sources."anymatch-2.0.0" // { | |
|
7563 | dependencies = [ | |
|
7564 | sources."normalize-path-2.1.1" | |
|
7565 | ]; | |
|
7566 | }) | |
|
7590 | 7567 | sources."appenlight-client-git+https://git@github.com/AppEnlight/appenlight-client-js.git#0.5.1" |
|
7591 | 7568 | sources."aproba-1.2.0" |
|
7592 | 7569 | (sources."argparse-0.1.16" // { |
@@ -7602,11 +7579,10 b' let' | |||
|
7602 | 7579 | sources."array-union-1.0.2" |
|
7603 | 7580 | sources."array-uniq-1.0.3" |
|
7604 | 7581 | sources."array-unique-0.3.2" |
|
7605 | sources."arrify-1.0.1" | |
|
7606 | 7582 | sources."asap-2.0.6" |
|
7607 | 7583 | sources."asn1-0.2.4" |
|
7608 | 7584 | sources."asn1.js-4.10.1" |
|
7609 |
(sources."assert-1. |
|
|
7585 | (sources."assert-1.5.0" // { | |
|
7610 | 7586 | dependencies = [ |
|
7611 | 7587 | sources."inherits-2.0.1" |
|
7612 | 7588 | sources."util-0.10.3" |
@@ -7616,7 +7592,7 b' let' | |||
|
7616 | 7592 | sources."assign-symbols-1.0.0" |
|
7617 | 7593 | sources."ast-types-0.9.6" |
|
7618 | 7594 | sources."async-0.1.22" |
|
7619 |
sources."async-each-1.0. |
|
|
7595 | sources."async-each-1.0.3" | |
|
7620 | 7596 | sources."asynckit-0.4.0" |
|
7621 | 7597 | sources."atob-2.1.2" |
|
7622 | 7598 | (sources."autoprefixer-6.7.7" // { |
@@ -7738,8 +7714,8 b' let' | |||
|
7738 | 7714 | sources."base64-js-1.3.0" |
|
7739 | 7715 | sources."bcrypt-pbkdf-1.0.2" |
|
7740 | 7716 | sources."big.js-5.2.2" |
|
7741 |
sources."binary-extensions-1.1 |
|
|
7742 |
sources."bluebird-3.5. |
|
|
7717 | sources."binary-extensions-1.13.1" | |
|
7718 | sources."bluebird-3.5.4" | |
|
7743 | 7719 | sources."bn.js-4.11.8" |
|
7744 | 7720 | sources."boolbase-1.0.0" |
|
7745 | 7721 | sources."boom-2.10.1" |
@@ -7763,7 +7739,7 b' let' | |||
|
7763 | 7739 | sources."builtin-status-codes-3.0.0" |
|
7764 | 7740 | (sources."cacache-10.0.4" // { |
|
7765 | 7741 | dependencies = [ |
|
7766 |
sources."glob-7.1. |
|
|
7742 | sources."glob-7.1.4" | |
|
7767 | 7743 | sources."graceful-fs-4.1.15" |
|
7768 | 7744 | sources."lru-cache-4.1.5" |
|
7769 | 7745 | sources."minimatch-3.0.4" |
@@ -7772,20 +7748,20 b' let' | |||
|
7772 | 7748 | }) |
|
7773 | 7749 | sources."cache-base-1.0.1" |
|
7774 | 7750 | sources."camel-case-3.0.0" |
|
7775 |
sources."camelcase-5. |
|
|
7751 | sources."camelcase-5.3.1" | |
|
7776 | 7752 | (sources."caniuse-api-1.6.1" // { |
|
7777 | 7753 | dependencies = [ |
|
7778 | 7754 | sources."browserslist-1.7.7" |
|
7779 | 7755 | ]; |
|
7780 | 7756 | }) |
|
7781 |
sources."caniuse-db-1.0.300009 |
|
|
7782 |
sources."caniuse-lite-1.0.300009 |
|
|
7757 | sources."caniuse-db-1.0.30000967" | |
|
7758 | sources."caniuse-lite-1.0.30000967" | |
|
7783 | 7759 | sources."caseless-0.12.0" |
|
7784 | 7760 | sources."center-align-0.1.3" |
|
7785 | 7761 | sources."chalk-0.5.1" |
|
7786 |
(sources."chokidar-2. |
|
|
7787 | dependencies = [ | |
|
7788 |
sources."is-glob-4.0. |
|
|
7762 | (sources."chokidar-2.1.5" // { | |
|
7763 | dependencies = [ | |
|
7764 | sources."is-glob-4.0.1" | |
|
7789 | 7765 | ]; |
|
7790 | 7766 | }) |
|
7791 | 7767 | sources."chownr-1.1.1" |
@@ -7825,7 +7801,7 b' let' | |||
|
7825 | 7801 | }) |
|
7826 | 7802 | (sources."cli-1.0.1" // { |
|
7827 | 7803 | dependencies = [ |
|
7828 |
sources."glob-7.1. |
|
|
7804 | sources."glob-7.1.4" | |
|
7829 | 7805 | sources."minimatch-3.0.4" |
|
7830 | 7806 | ]; |
|
7831 | 7807 | }) |
@@ -7848,10 +7824,10 b' let' | |||
|
7848 | 7824 | sources."color-string-0.3.0" |
|
7849 | 7825 | sources."colormin-1.1.2" |
|
7850 | 7826 | sources."colors-0.6.2" |
|
7851 |
sources."combined-stream-1.0. |
|
|
7827 | sources."combined-stream-1.0.8" | |
|
7852 | 7828 | sources."commander-2.14.1" |
|
7853 | 7829 | sources."commondir-1.0.1" |
|
7854 |
sources."component-emitter-1. |
|
|
7830 | sources."component-emitter-1.3.0" | |
|
7855 | 7831 | sources."concat-map-0.0.1" |
|
7856 | 7832 | (sources."concat-stream-1.6.2" // { |
|
7857 | 7833 | dependencies = [ |
@@ -7864,7 +7840,7 b' let' | |||
|
7864 | 7840 | sources."convert-source-map-1.6.0" |
|
7865 | 7841 | (sources."copy-concurrently-1.0.5" // { |
|
7866 | 7842 | dependencies = [ |
|
7867 |
sources."glob-7.1. |
|
|
7843 | sources."glob-7.1.4" | |
|
7868 | 7844 | sources."minimatch-3.0.4" |
|
7869 | 7845 | sources."rimraf-2.6.3" |
|
7870 | 7846 | ]; |
@@ -7872,11 +7848,11 b' let' | |||
|
7872 | 7848 | sources."copy-descriptor-0.1.1" |
|
7873 | 7849 | (sources."copy-webpack-plugin-4.6.0" // { |
|
7874 | 7850 | dependencies = [ |
|
7875 |
sources."is-glob-4.0. |
|
|
7851 | sources."is-glob-4.0.1" | |
|
7876 | 7852 | sources."minimatch-3.0.4" |
|
7877 | 7853 | ]; |
|
7878 | 7854 | }) |
|
7879 |
sources."core-js-2.6. |
|
|
7855 | sources."core-js-2.6.5" | |
|
7880 | 7856 | sources."core-util-is-1.0.2" |
|
7881 | 7857 | sources."create-ecdh-4.0.3" |
|
7882 | 7858 | sources."create-hash-1.2.0" |
@@ -7896,7 +7872,7 b' let' | |||
|
7896 | 7872 | sources."regexpu-core-1.0.0" |
|
7897 | 7873 | ]; |
|
7898 | 7874 | }) |
|
7899 |
sources."css-what-2.1. |
|
|
7875 | sources."css-what-2.1.3" | |
|
7900 | 7876 | sources."cssesc-0.1.0" |
|
7901 | 7877 | sources."cssnano-3.10.0" |
|
7902 | 7878 | sources."csso-2.3.2" |
@@ -7921,11 +7897,10 b' let' | |||
|
7921 | 7897 | sources."detect-file-1.0.0" |
|
7922 | 7898 | sources."detect-indent-4.0.0" |
|
7923 | 7899 | sources."diffie-hellman-5.0.3" |
|
7924 |
sources."dir-glob-2. |
|
|
7900 | sources."dir-glob-2.2.2" | |
|
7925 | 7901 | sources."dom-converter-0.2.0" |
|
7926 |
(sources."dom-serializer-0.1. |
|
|
7927 | dependencies = [ | |
|
7928 | sources."domelementtype-1.1.3" | |
|
7902 | (sources."dom-serializer-0.1.1" // { | |
|
7903 | dependencies = [ | |
|
7929 | 7904 | sources."entities-1.1.2" |
|
7930 | 7905 | ]; |
|
7931 | 7906 | }) |
@@ -7939,14 +7914,15 b' let' | |||
|
7939 | 7914 | sources."domelementtype-1.3.1" |
|
7940 | 7915 | sources."domhandler-2.3.0" |
|
7941 | 7916 | sources."domutils-1.5.1" |
|
7942 |
|
|
|
7917 | sources."dropzone-5.5.1" | |
|
7918 | (sources."duplexify-3.7.1" // { | |
|
7943 | 7919 | dependencies = [ |
|
7944 | 7920 | sources."readable-stream-2.3.6" |
|
7945 | 7921 | sources."string_decoder-1.1.1" |
|
7946 | 7922 | ]; |
|
7947 | 7923 | }) |
|
7948 | 7924 | sources."ecc-jsbn-0.1.2" |
|
7949 |
sources."electron-to-chromium-1.3. |
|
|
7925 | sources."electron-to-chromium-1.3.133" | |
|
7950 | 7926 | sources."elliptic-6.4.1" |
|
7951 | 7927 | sources."emojis-list-2.1.0" |
|
7952 | 7928 | sources."end-of-stream-1.4.1" |
@@ -7961,14 +7937,14 b' let' | |||
|
7961 | 7937 | sources."es-to-primitive-1.2.0" |
|
7962 | 7938 | sources."es6-templates-0.2.3" |
|
7963 | 7939 | sources."escape-string-regexp-1.0.5" |
|
7964 |
sources."eslint-scope-4.0. |
|
|
7940 | sources."eslint-scope-4.0.3" | |
|
7965 | 7941 | sources."espree-3.5.4" |
|
7966 | 7942 | sources."esprima-1.0.4" |
|
7967 | 7943 | sources."esrecurse-4.2.1" |
|
7968 | 7944 | sources."estraverse-4.2.0" |
|
7969 | 7945 | sources."esutils-2.0.2" |
|
7970 | 7946 | sources."eventemitter2-0.4.14" |
|
7971 |
sources."events- |
|
|
7947 | sources."events-3.0.0" | |
|
7972 | 7948 | sources."evp_bytestokey-1.0.3" |
|
7973 | 7949 | sources."execa-1.0.0" |
|
7974 | 7950 | sources."exit-0.1.2" |
@@ -8025,10 +8001,10 b' let' | |||
|
8025 | 8001 | sources."minimatch-0.3.0" |
|
8026 | 8002 | ]; |
|
8027 | 8003 | }) |
|
8028 |
sources."fined-1. |
|
|
8004 | sources."fined-1.2.0" | |
|
8029 | 8005 | sources."flagged-respawn-1.0.1" |
|
8030 | 8006 | sources."flatten-1.0.2" |
|
8031 |
(sources."flush-write-stream-1. |
|
|
8007 | (sources."flush-write-stream-1.1.1" // { | |
|
8032 | 8008 | dependencies = [ |
|
8033 | 8009 | sources."readable-stream-2.3.6" |
|
8034 | 8010 | sources."string_decoder-1.1.1" |
@@ -8051,7 +8027,7 b' let' | |||
|
8051 | 8027 | ]; |
|
8052 | 8028 | }) |
|
8053 | 8029 | sources."fs.realpath-1.0.0" |
|
8054 |
sources."fsevents-1.2. |
|
|
8030 | sources."fsevents-1.2.9" | |
|
8055 | 8031 | sources."function-bind-1.1.1" |
|
8056 | 8032 | sources."gaze-0.5.2" |
|
8057 | 8033 | sources."get-caller-file-1.0.3" |
@@ -8083,7 +8059,7 b' let' | |||
|
8083 | 8059 | sources."globals-9.18.0" |
|
8084 | 8060 | (sources."globby-7.1.1" // { |
|
8085 | 8061 | dependencies = [ |
|
8086 |
sources."glob-7.1. |
|
|
8062 | sources."glob-7.1.4" | |
|
8087 | 8063 | sources."minimatch-3.0.4" |
|
8088 | 8064 | ]; |
|
8089 | 8065 | }) |
@@ -8115,12 +8091,18 b' let' | |||
|
8115 | 8091 | sources."supports-color-2.0.0" |
|
8116 | 8092 | ]; |
|
8117 | 8093 | }) |
|
8118 | sources."grunt-contrib-jshint-0.12.0" | |
|
8094 | (sources."grunt-contrib-jshint-0.12.0" // { | |
|
8095 | dependencies = [ | |
|
8096 | sources."jshint-2.9.7" | |
|
8097 | sources."lodash-4.17.11" | |
|
8098 | sources."minimatch-3.0.4" | |
|
8099 | ]; | |
|
8100 | }) | |
|
8119 | 8101 | (sources."grunt-contrib-less-1.4.1" // { |
|
8120 | 8102 | dependencies = [ |
|
8121 | 8103 | sources."ansi-regex-2.1.1" |
|
8122 | 8104 | sources."ansi-styles-2.2.1" |
|
8123 |
sources."async-2.6. |
|
|
8105 | sources."async-2.6.2" | |
|
8124 | 8106 | sources."chalk-1.1.3" |
|
8125 | 8107 | sources."has-ansi-2.0.0" |
|
8126 | 8108 | sources."lodash-4.17.11" |
@@ -8172,7 +8154,7 b' let' | |||
|
8172 | 8154 | sources."hmac-drbg-1.0.1" |
|
8173 | 8155 | sources."hoek-2.16.3" |
|
8174 | 8156 | sources."home-or-tmp-2.0.0" |
|
8175 |
sources."homedir-polyfill-1.0. |
|
|
8157 | sources."homedir-polyfill-1.0.3" | |
|
8176 | 8158 | sources."hooker-0.2.3" |
|
8177 | 8159 | sources."html-comment-regex-1.1.2" |
|
8178 | 8160 | sources."html-loader-0.4.5" |
@@ -8203,7 +8185,7 b' let' | |||
|
8203 | 8185 | sources."supports-color-5.5.0" |
|
8204 | 8186 | ]; |
|
8205 | 8187 | }) |
|
8206 |
sources."ieee754-1.1.1 |
|
|
8188 | sources."ieee754-1.1.13" | |
|
8207 | 8189 | sources."iferr-0.1.5" |
|
8208 | 8190 | sources."ignore-3.3.10" |
|
8209 | 8191 | sources."image-size-0.5.5" |
@@ -8211,9 +8193,9 b' let' | |||
|
8211 | 8193 | dependencies = [ |
|
8212 | 8194 | sources."find-up-3.0.0" |
|
8213 | 8195 | sources."locate-path-3.0.0" |
|
8214 |
sources."p-limit-2. |
|
|
8196 | sources."p-limit-2.2.0" | |
|
8215 | 8197 | sources."p-locate-3.0.0" |
|
8216 |
sources."p-try-2. |
|
|
8198 | sources."p-try-2.2.0" | |
|
8217 | 8199 | sources."pkg-dir-3.0.0" |
|
8218 | 8200 | ]; |
|
8219 | 8201 | }) |
@@ -8261,12 +8243,12 b' let' | |||
|
8261 | 8243 | sources."isobject-3.0.1" |
|
8262 | 8244 | sources."isstream-0.1.2" |
|
8263 | 8245 | sources."jquery-1.11.3" |
|
8264 |
sources."js-base64-2.5. |
|
|
8246 | sources."js-base64-2.5.1" | |
|
8265 | 8247 | sources."js-tokens-3.0.2" |
|
8266 | 8248 | sources."js-yaml-2.0.5" |
|
8267 | 8249 | sources."jsbn-0.1.1" |
|
8268 | 8250 | sources."jsesc-1.3.0" |
|
8269 |
(sources."jshint-2. |
|
|
8251 | (sources."jshint-2.10.2" // { | |
|
8270 | 8252 | dependencies = [ |
|
8271 | 8253 | sources."lodash-4.17.11" |
|
8272 | 8254 | sources."minimatch-3.0.4" |
@@ -8297,12 +8279,11 b' let' | |||
|
8297 | 8279 | sources."findup-sync-2.0.0" |
|
8298 | 8280 | ]; |
|
8299 | 8281 | }) |
|
8300 |
sources."loader-runner-2. |
|
|
8282 | sources."loader-runner-2.4.0" | |
|
8301 | 8283 | sources."loader-utils-1.2.3" |
|
8302 | 8284 | sources."locate-path-2.0.0" |
|
8303 | 8285 | sources."lodash-0.9.2" |
|
8304 | 8286 | sources."lodash.camelcase-4.3.0" |
|
8305 | sources."lodash.debounce-4.0.8" | |
|
8306 | 8287 | sources."lodash.isplainobject-4.0.6" |
|
8307 | 8288 | sources."lodash.memoize-4.1.2" |
|
8308 | 8289 | sources."lodash.uniq-4.5.0" |
@@ -8318,7 +8299,7 b' let' | |||
|
8318 | 8299 | sources."mark.js-8.11.1" |
|
8319 | 8300 | sources."math-expression-evaluator-1.2.17" |
|
8320 | 8301 | sources."md5.js-1.3.5" |
|
8321 |
sources."mem-4. |
|
|
8302 | sources."mem-4.3.0" | |
|
8322 | 8303 | (sources."memory-fs-0.4.1" // { |
|
8323 | 8304 | dependencies = [ |
|
8324 | 8305 | sources."readable-stream-2.3.6" |
@@ -8328,9 +8309,9 b' let' | |||
|
8328 | 8309 | sources."micromatch-3.1.10" |
|
8329 | 8310 | sources."miller-rabin-4.0.1" |
|
8330 | 8311 | sources."mime-1.6.0" |
|
8331 |
sources."mime-db-1. |
|
|
8332 |
sources."mime-types-2.1.2 |
|
|
8333 |
sources."mimic-fn-1 |
|
|
8312 | sources."mime-db-1.40.0" | |
|
8313 | sources."mime-types-2.1.24" | |
|
8314 | sources."mimic-fn-2.1.0" | |
|
8334 | 8315 | sources."minimalistic-assert-1.0.1" |
|
8335 | 8316 | sources."minimalistic-crypto-utils-1.0.1" |
|
8336 | 8317 | sources."minimatch-0.2.14" |
@@ -8346,22 +8327,22 b' let' | |||
|
8346 | 8327 | sources."minimist-0.0.8" |
|
8347 | 8328 | ]; |
|
8348 | 8329 | }) |
|
8349 |
sources."moment-2.2 |
|
|
8350 |
sources."mousetrap-1.6. |
|
|
8330 | sources."moment-2.24.0" | |
|
8331 | sources."mousetrap-1.6.3" | |
|
8351 | 8332 | (sources."move-concurrently-1.0.1" // { |
|
8352 | 8333 | dependencies = [ |
|
8353 |
sources."glob-7.1. |
|
|
8334 | sources."glob-7.1.4" | |
|
8354 | 8335 | sources."minimatch-3.0.4" |
|
8355 | 8336 | sources."rimraf-2.6.3" |
|
8356 | 8337 | ]; |
|
8357 | 8338 | }) |
|
8358 | 8339 | sources."ms-2.0.0" |
|
8359 |
sources."nan-2.12 |
|
|
8340 | sources."nan-2.13.2" | |
|
8360 | 8341 | sources."nanomatch-1.2.13" |
|
8361 |
sources."neo-async-2.6. |
|
|
8342 | sources."neo-async-2.6.1" | |
|
8362 | 8343 | sources."nice-try-1.0.5" |
|
8363 | 8344 | sources."no-case-2.3.2" |
|
8364 |
(sources."node-libs-browser-2. |
|
|
8345 | (sources."node-libs-browser-2.2.0" // { | |
|
8365 | 8346 | dependencies = [ |
|
8366 | 8347 | (sources."readable-stream-2.3.6" // { |
|
8367 | 8348 | dependencies = [ |
@@ -8377,7 +8358,7 b' let' | |||
|
8377 | 8358 | sources."nopt-2.0.0" |
|
8378 | 8359 | ]; |
|
8379 | 8360 | }) |
|
8380 |
sources."normalize-path- |
|
|
8361 | sources."normalize-path-3.0.0" | |
|
8381 | 8362 | sources."normalize-range-0.1.2" |
|
8382 | 8363 | sources."normalize-url-1.9.1" |
|
8383 | 8364 | sources."npm-run-path-2.0.2" |
@@ -8399,7 +8380,7 b' let' | |||
|
8399 | 8380 | sources."kind-of-3.2.2" |
|
8400 | 8381 | ]; |
|
8401 | 8382 | }) |
|
8402 |
sources."object-keys-1. |
|
|
8383 | sources."object-keys-1.1.1" | |
|
8403 | 8384 | sources."object-visit-1.0.1" |
|
8404 | 8385 | sources."object.defaults-1.1.0" |
|
8405 | 8386 | sources."object.getownpropertydescriptors-2.0.3" |
@@ -8413,11 +8394,11 b' let' | |||
|
8413 | 8394 | sources."osenv-0.1.5" |
|
8414 | 8395 | sources."p-defer-1.0.0" |
|
8415 | 8396 | sources."p-finally-1.0.0" |
|
8416 |
sources."p-is-promise- |
|
|
8397 | sources."p-is-promise-2.1.0" | |
|
8417 | 8398 | sources."p-limit-1.3.0" |
|
8418 | 8399 | sources."p-locate-2.0.0" |
|
8419 | 8400 | sources."p-try-1.0.0" |
|
8420 |
sources."pako-1.0. |
|
|
8401 | sources."pako-1.0.10" | |
|
8421 | 8402 | (sources."parallel-transform-1.1.0" // { |
|
8422 | 8403 | dependencies = [ |
|
8423 | 8404 | sources."readable-stream-2.3.6" |
@@ -8425,7 +8406,7 b' let' | |||
|
8425 | 8406 | ]; |
|
8426 | 8407 | }) |
|
8427 | 8408 | sources."param-case-2.1.1" |
|
8428 |
sources."parse-asn1-5.1. |
|
|
8409 | sources."parse-asn1-5.1.4" | |
|
8429 | 8410 | sources."parse-filepath-1.0.2" |
|
8430 | 8411 | sources."parse-passwd-1.0.0" |
|
8431 | 8412 | sources."parse5-3.0.3" |
@@ -8564,7 +8545,7 b' let' | |||
|
8564 | 8545 | sources."query-string-4.3.4" |
|
8565 | 8546 | sources."querystring-0.2.0" |
|
8566 | 8547 | sources."querystring-es3-0.2.1" |
|
8567 |
sources."randombytes-2.0 |
|
|
8548 | sources."randombytes-2.1.0" | |
|
8568 | 8549 | sources."randomfill-1.0.4" |
|
8569 | 8550 | sources."raw-loader-1.0.0-beta.0" |
|
8570 | 8551 | (sources."readable-stream-1.1.14" // { |
@@ -8608,14 +8589,9 b' let' | |||
|
8608 | 8589 | }) |
|
8609 | 8590 | sources."relateurl-0.2.7" |
|
8610 | 8591 | sources."remove-trailing-separator-1.1.0" |
|
8611 |
(sources."renderkid-2.0. |
|
|
8592 | (sources."renderkid-2.0.3" // { | |
|
8612 | 8593 | dependencies = [ |
|
8613 | 8594 | sources."ansi-regex-2.1.1" |
|
8614 | sources."domhandler-2.1.0" | |
|
8615 | sources."domutils-1.1.6" | |
|
8616 | sources."htmlparser2-3.3.0" | |
|
8617 | sources."isarray-0.0.1" | |
|
8618 | sources."readable-stream-1.0.34" | |
|
8619 | 8595 | sources."strip-ansi-3.0.1" |
|
8620 | 8596 | ]; |
|
8621 | 8597 | }) |
@@ -8625,7 +8601,7 b' let' | |||
|
8625 | 8601 | sources."request-2.81.0" |
|
8626 | 8602 | sources."require-directory-2.1.1" |
|
8627 | 8603 | sources."require-main-filename-1.0.1" |
|
8628 |
sources."resolve-1. |
|
|
8604 | sources."resolve-1.10.1" | |
|
8629 | 8605 | sources."resolve-cwd-2.0.0" |
|
8630 | 8606 | sources."resolve-dir-1.0.1" |
|
8631 | 8607 | sources."resolve-from-3.0.0" |
@@ -8641,12 +8617,12 b' let' | |||
|
8641 | 8617 | sources."sax-1.2.4" |
|
8642 | 8618 | (sources."schema-utils-0.4.7" // { |
|
8643 | 8619 | dependencies = [ |
|
8644 |
sources."ajv-6. |
|
|
8620 | sources."ajv-6.10.0" | |
|
8645 | 8621 | ]; |
|
8646 | 8622 | }) |
|
8647 | 8623 | sources."select-1.1.2" |
|
8648 |
sources."semver-5. |
|
|
8649 |
sources."serialize-javascript-1. |
|
|
8624 | sources."semver-5.7.0" | |
|
8625 | sources."serialize-javascript-1.7.0" | |
|
8650 | 8626 | sources."set-blocking-2.0.0" |
|
8651 | 8627 | (sources."set-value-2.0.0" // { |
|
8652 | 8628 | dependencies = [ |
@@ -8698,7 +8674,7 b' let' | |||
|
8698 | 8674 | sources."source-map-url-0.4.0" |
|
8699 | 8675 | sources."split-string-3.1.0" |
|
8700 | 8676 | sources."sprintf-js-1.0.3" |
|
8701 |
(sources."sshpk-1.16. |
|
|
8677 | (sources."sshpk-1.16.1" // { | |
|
8702 | 8678 | dependencies = [ |
|
8703 | 8679 | sources."assert-plus-1.0.0" |
|
8704 | 8680 | ]; |
@@ -8722,7 +8698,7 b' let' | |||
|
8722 | 8698 | ]; |
|
8723 | 8699 | }) |
|
8724 | 8700 | sources."sticky-sidebar-3.3.1" |
|
8725 |
(sources."stream-browserify-2.0. |
|
|
8701 | (sources."stream-browserify-2.0.2" // { | |
|
8726 | 8702 | dependencies = [ |
|
8727 | 8703 | sources."readable-stream-2.3.6" |
|
8728 | 8704 | sources."string_decoder-1.1.1" |
@@ -8759,7 +8735,7 b' let' | |||
|
8759 | 8735 | sources."js-yaml-3.7.0" |
|
8760 | 8736 | ]; |
|
8761 | 8737 | }) |
|
8762 |
sources."tapable-1.1. |
|
|
8738 | sources."tapable-1.1.3" | |
|
8763 | 8739 | sources."through-2.3.8" |
|
8764 | 8740 | (sources."through2-2.0.5" // { |
|
8765 | 8741 | dependencies = [ |
@@ -8768,7 +8744,7 b' let' | |||
|
8768 | 8744 | ]; |
|
8769 | 8745 | }) |
|
8770 | 8746 | sources."timers-browserify-2.0.10" |
|
8771 |
sources."tiny-emitter-2.0 |
|
|
8747 | sources."tiny-emitter-2.1.0" | |
|
8772 | 8748 | (sources."tiny-lr-fork-0.0.5" // { |
|
8773 | 8749 | dependencies = [ |
|
8774 | 8750 | sources."debug-0.7.4" |
@@ -8808,9 +8784,9 b' let' | |||
|
8808 | 8784 | sources."source-map-0.6.1" |
|
8809 | 8785 | ]; |
|
8810 | 8786 | }) |
|
8811 |
(sources."uglify-js-3.4. |
|
|
8812 | dependencies = [ | |
|
8813 |
sources."commander-2.1 |
|
|
8787 | (sources."uglify-js-3.4.10" // { | |
|
8788 | dependencies = [ | |
|
8789 | sources."commander-2.19.0" | |
|
8814 | 8790 | sources."source-map-0.6.1" |
|
8815 | 8791 | ]; |
|
8816 | 8792 | }) |
@@ -8843,7 +8819,7 b' let' | |||
|
8843 | 8819 | sources."has-values-0.1.4" |
|
8844 | 8820 | ]; |
|
8845 | 8821 | }) |
|
8846 |
sources."upath-1.1. |
|
|
8822 | sources."upath-1.1.2" | |
|
8847 | 8823 | sources."upper-case-1.1.3" |
|
8848 | 8824 | (sources."uri-js-4.2.2" // { |
|
8849 | 8825 | dependencies = [ |
@@ -8857,14 +8833,14 b' let' | |||
|
8857 | 8833 | ]; |
|
8858 | 8834 | }) |
|
8859 | 8835 | sources."use-3.1.1" |
|
8860 |
sources."util-0.1 |
|
|
8836 | sources."util-0.11.1" | |
|
8861 | 8837 | sources."util-deprecate-1.0.2" |
|
8862 | 8838 | sources."util.promisify-1.0.0" |
|
8863 | 8839 | sources."utila-0.4.0" |
|
8864 | 8840 | sources."uuid-3.3.2" |
|
8865 |
sources."v8-compile-cache-2.0. |
|
|
8866 |
sources."v8flags-3.1. |
|
|
8867 |
sources."vendors-1.0. |
|
|
8841 | sources."v8-compile-cache-2.0.3" | |
|
8842 | sources."v8flags-3.1.3" | |
|
8843 | sources."vendors-1.0.3" | |
|
8868 | 8844 | (sources."verror-1.10.0" // { |
|
8869 | 8845 | dependencies = [ |
|
8870 | 8846 | sources."assert-plus-1.0.0" |
@@ -8879,7 +8855,7 b' let' | |||
|
8879 | 8855 | sources."waypoints-4.0.1" |
|
8880 | 8856 | (sources."webpack-4.23.1" // { |
|
8881 | 8857 | dependencies = [ |
|
8882 |
sources."ajv-6. |
|
|
8858 | sources."ajv-6.10.0" | |
|
8883 | 8859 | ]; |
|
8884 | 8860 | }) |
|
8885 | 8861 | (sources."webpack-cli-3.1.2" // { |
@@ -8919,7 +8895,7 b' let' | |||
|
8919 | 8895 | sources."which-module-2.0.0" |
|
8920 | 8896 | sources."window-size-0.1.0" |
|
8921 | 8897 | sources."wordwrap-0.0.2" |
|
8922 |
sources."worker-farm-1. |
|
|
8898 | sources."worker-farm-1.7.0" | |
|
8923 | 8899 | (sources."wrap-ansi-2.1.0" // { |
|
8924 | 8900 | dependencies = [ |
|
8925 | 8901 | sources."ansi-regex-2.1.1" |
@@ -8935,9 +8911,9 b' let' | |||
|
8935 | 8911 | dependencies = [ |
|
8936 | 8912 | sources."find-up-3.0.0" |
|
8937 | 8913 | sources."locate-path-3.0.0" |
|
8938 |
sources."p-limit-2. |
|
|
8914 | sources."p-limit-2.2.0" | |
|
8939 | 8915 | sources."p-locate-3.0.0" |
|
8940 |
sources."p-try-2. |
|
|
8916 | sources."p-try-2.2.0" | |
|
8941 | 8917 | ]; |
|
8942 | 8918 | }) |
|
8943 | 8919 | sources."yargs-parser-11.1.1" |
@@ -35,6 +35,18 b' self: super: {' | |||
|
35 | 35 | ]; |
|
36 | 36 | }); |
|
37 | 37 | |
|
38 | "cffi" = super."cffi".override (attrs: { | |
|
39 | buildInputs = [ | |
|
40 | pkgs.libffi | |
|
41 | ]; | |
|
42 | }); | |
|
43 | ||
|
44 | "cryptography" = super."cryptography".override (attrs: { | |
|
45 | buildInputs = [ | |
|
46 | pkgs.openssl | |
|
47 | ]; | |
|
48 | }); | |
|
49 | ||
|
38 | 50 | "gevent" = super."gevent".override (attrs: { |
|
39 | 51 | propagatedBuildInputs = attrs.propagatedBuildInputs ++ [ |
|
40 | 52 | # NOTE: (marcink) odd requirements from gevent aren't set properly, |
@@ -152,12 +164,6 b' self: super: {' | |||
|
152 | 164 | }; |
|
153 | 165 | }); |
|
154 | 166 | |
|
155 | "pytest-runner" = super."pytest-runner".override (attrs: { | |
|
156 | propagatedBuildInputs = [ | |
|
157 | self."setuptools-scm" | |
|
158 | ]; | |
|
159 | }); | |
|
160 | ||
|
161 | 167 | "python-ldap" = super."python-ldap".override (attrs: { |
|
162 | 168 | propagatedBuildInputs = attrs.propagatedBuildInputs ++ [ |
|
163 | 169 | pkgs.openldap |
@@ -5,7 +5,7 b'' | |||
|
5 | 5 | |
|
6 | 6 | self: super: { |
|
7 | 7 | "alembic" = super.buildPythonPackage { |
|
8 |
name = "alembic-1.0. |
|
|
8 | name = "alembic-1.0.10"; | |
|
9 | 9 | doCheck = false; |
|
10 | 10 | propagatedBuildInputs = [ |
|
11 | 11 | self."sqlalchemy" |
@@ -14,8 +14,8 b' self: super: {' | |||
|
14 | 14 | self."python-dateutil" |
|
15 | 15 | ]; |
|
16 | 16 | src = fetchurl { |
|
17 |
url = "https://files.pythonhosted.org/packages/ |
|
|
18 | sha256 = "0rpjqp2iq6p49x1nli18ivak1izz547nnjxi110mzrgc1v7dxzz9"; | |
|
17 | url = "https://files.pythonhosted.org/packages/6e/8b/fa3bd058cccd5e9177fea4efa26bfb769228fdd3178436ad5e05830ef6ef/alembic-1.0.10.tar.gz"; | |
|
18 | sha256 = "1dwl0264r6ri2jyrjr68am04x538ab26xwy4crqjnnhm4alwm3c2"; | |
|
19 | 19 | }; |
|
20 | 20 | meta = { |
|
21 | 21 | license = [ pkgs.lib.licenses.mit ]; |
@@ -51,6 +51,17 b' self: super: {' | |||
|
51 | 51 | license = [ pkgs.lib.licenses.bsdOriginal ]; |
|
52 | 52 | }; |
|
53 | 53 | }; |
|
54 | "asn1crypto" = super.buildPythonPackage { | |
|
55 | name = "asn1crypto-0.24.0"; | |
|
56 | doCheck = false; | |
|
57 | src = fetchurl { | |
|
58 | url = "https://files.pythonhosted.org/packages/fc/f1/8db7daa71f414ddabfa056c4ef792e1461ff655c2ae2928a2b675bfed6b4/asn1crypto-0.24.0.tar.gz"; | |
|
59 | sha256 = "0jaf8rf9dx1lf23xfv2cdd5h52f1qr3w8k63985bc35g3d220p4x"; | |
|
60 | }; | |
|
61 | meta = { | |
|
62 | license = [ pkgs.lib.licenses.mit ]; | |
|
63 | }; | |
|
64 | }; | |
|
54 | 65 | "atomicwrites" = super.buildPythonPackage { |
|
55 | 66 | name = "atomicwrites-1.2.1"; |
|
56 | 67 | doCheck = false; |
@@ -77,8 +88,8 b' self: super: {' | |||
|
77 | 88 | name = "authomatic-0.1.0.post1"; |
|
78 | 89 | doCheck = false; |
|
79 | 90 | src = fetchurl { |
|
80 | url = "https://code.rhodecode.com/upstream/authomatic/archive/90a9ce60cc405ae8a2bf5c3713acd5d78579a04e.tar.gz?md5=3c68720a1322b25254009518d1ff6801"; | |
|
81 | sha256 = "1cgk0a86sbsjbri06gf5z5l4npwkjdxw6fdnwl4vvfmxs2sx9yxw"; | |
|
91 | url = "https://code.rhodecode.com/upstream/authomatic/artifacts/download/0-4fe9c041-a567-4f84-be4c-7efa2a606d3c.tar.gz?md5=f6bdc3c769688212db68233e8d2b0383"; | |
|
92 | sha256 = "0pc716mva0ym6xd8jwzjbjp8dqxy9069wwwv2aqwb8lyhl4757ab"; | |
|
82 | 93 | }; |
|
83 | 94 | meta = { |
|
84 | 95 | license = [ pkgs.lib.licenses.mit ]; |
@@ -146,15 +157,15 b' self: super: {' | |||
|
146 | 157 | }; |
|
147 | 158 | }; |
|
148 | 159 | "bleach" = super.buildPythonPackage { |
|
149 |
name = "bleach-3.0 |
|
|
160 | name = "bleach-3.1.0"; | |
|
150 | 161 | doCheck = false; |
|
151 | 162 | propagatedBuildInputs = [ |
|
152 | 163 | self."six" |
|
153 | 164 | self."webencodings" |
|
154 | 165 | ]; |
|
155 | 166 | src = fetchurl { |
|
156 |
url = "https://files.pythonhosted.org/packages/ae |
|
|
157 | sha256 = "06474zg7f73hv8h1xw2wcsmvn2ygj73zxgxxqg8zcx8ap1srdls8"; | |
|
167 | url = "https://files.pythonhosted.org/packages/78/5a/0df03e8735cd9c75167528299c738702437589b9c71a849489d00ffa82e8/bleach-3.1.0.tar.gz"; | |
|
168 | sha256 = "1yhrgrhkln8bd6gn3imj69g1h4xqah9gaz9q26crqr6gmmvpzprz"; | |
|
158 | 169 | }; |
|
159 | 170 | meta = { |
|
160 | 171 | license = [ pkgs.lib.licenses.asl20 ]; |
@@ -187,6 +198,20 b' self: super: {' | |||
|
187 | 198 | license = [ pkgs.lib.licenses.bsdOriginal ]; |
|
188 | 199 | }; |
|
189 | 200 | }; |
|
201 | "cffi" = super.buildPythonPackage { | |
|
202 | name = "cffi-1.12.2"; | |
|
203 | doCheck = false; | |
|
204 | propagatedBuildInputs = [ | |
|
205 | self."pycparser" | |
|
206 | ]; | |
|
207 | src = fetchurl { | |
|
208 | url = "https://files.pythonhosted.org/packages/64/7c/27367b38e6cc3e1f49f193deb761fe75cda9f95da37b67b422e62281fcac/cffi-1.12.2.tar.gz"; | |
|
209 | sha256 = "19qfks2djya8vix95bmg3xzipjb8w9b8mbj4j5k2hqkc8j58f4z1"; | |
|
210 | }; | |
|
211 | meta = { | |
|
212 | license = [ pkgs.lib.licenses.mit ]; | |
|
213 | }; | |
|
214 | }; | |
|
190 | 215 | "chameleon" = super.buildPythonPackage { |
|
191 | 216 | name = "chameleon-2.24"; |
|
192 | 217 | doCheck = false; |
@@ -230,7 +255,7 b' self: super: {' | |||
|
230 | 255 | }; |
|
231 | 256 | }; |
|
232 | 257 | "colander" = super.buildPythonPackage { |
|
233 |
name = "colander-1. |
|
|
258 | name = "colander-1.7.0"; | |
|
234 | 259 | doCheck = false; |
|
235 | 260 | propagatedBuildInputs = [ |
|
236 | 261 | self."translationstring" |
@@ -238,8 +263,8 b' self: super: {' | |||
|
238 | 263 | self."enum34" |
|
239 | 264 | ]; |
|
240 | 265 | src = fetchurl { |
|
241 |
url = "https://files.pythonhosted.org/packages/ec |
|
|
242 | sha256 = "18ah4cwwxnpm6qxi6x9ipy51dal4spd343h44s5wd01cnhgrwsyq"; | |
|
266 | url = "https://files.pythonhosted.org/packages/db/e4/74ab06f54211917b41865cafc987ce511e35503de48da9bfe9358a1bdc3e/colander-1.7.0.tar.gz"; | |
|
267 | sha256 = "1wl1bqab307lbbcjx81i28s3yl6dlm4rf15fxawkjb6j48x1cn6p"; | |
|
243 | 268 | }; |
|
244 | 269 | meta = { |
|
245 | 270 | license = [ { fullName = "BSD-derived (http://www.repoze.org/LICENSE.txt)"; } ]; |
@@ -252,19 +277,19 b' self: super: {' | |||
|
252 | 277 | self."six" |
|
253 | 278 | ]; |
|
254 | 279 | src = fetchurl { |
|
255 |
url = "https://code.rhodecode.com/upstream/configobj/arc |
|
|
256 | sha256 = "1hhcxirwvg58grlfr177b3awhbq8hlx1l3lh69ifl1ki7lfd1s1x"; | |
|
280 | url = "https://code.rhodecode.com/upstream/configobj/artifacts/download/0-012de99a-b1e1-4f64-a5c0-07a98a41b324.tar.gz?md5=6a513f51fe04b2c18cf84c1395a7c626"; | |
|
281 | sha256 = "0kqfrdfr14mw8yd8qwq14dv2xghpkjmd3yjsy8dfcbvpcc17xnxp"; | |
|
257 | 282 | }; |
|
258 | 283 | meta = { |
|
259 | 284 | license = [ pkgs.lib.licenses.bsdOriginal ]; |
|
260 | 285 | }; |
|
261 | 286 | }; |
|
262 | 287 | "configparser" = super.buildPythonPackage { |
|
263 |
name = "configparser-3.7. |
|
|
288 | name = "configparser-3.7.4"; | |
|
264 | 289 | doCheck = false; |
|
265 | 290 | src = fetchurl { |
|
266 |
url = "https://files.pythonhosted.org/packages/b |
|
|
267 | sha256 = "0cnz213il9lhgda6x70fw7mfqr8da43s3wm343lwzhqx94mgmmav"; | |
|
291 | url = "https://files.pythonhosted.org/packages/e2/1c/83fd53748d8245cb9a3399f705c251d3fc0ce7df04450aac1cfc49dd6a0f/configparser-3.7.4.tar.gz"; | |
|
292 | sha256 = "0xac32886ihs2xg7w1gppcq2sgin5qsm8lqwijs5xifq9w0x0q6s"; | |
|
268 | 293 | }; |
|
269 | 294 | meta = { |
|
270 | 295 | license = [ pkgs.lib.licenses.mit ]; |
@@ -285,16 +310,34 b' self: super: {' | |||
|
285 | 310 | }; |
|
286 | 311 | }; |
|
287 | 312 | "coverage" = super.buildPythonPackage { |
|
288 |
name = "coverage-4.5. |
|
|
313 | name = "coverage-4.5.3"; | |
|
289 | 314 | doCheck = false; |
|
290 | 315 | src = fetchurl { |
|
291 |
url = "https://files.pythonhosted.org/packages/35 |
|
|
292 | sha256 = "1wbrzpxka3xd4nmmkc6q0ir343d91kymwsm8pbmwa0d2a7q4ir2n"; | |
|
316 | url = "https://files.pythonhosted.org/packages/82/70/2280b5b29a0352519bb95ab0ef1ea942d40466ca71c53a2085bdeff7b0eb/coverage-4.5.3.tar.gz"; | |
|
317 | sha256 = "02f6m073qdispn96rc616hg0rnmw1pgqzw3bgxwiwza4zf9hirlx"; | |
|
293 | 318 | }; |
|
294 | 319 | meta = { |
|
295 | 320 | license = [ pkgs.lib.licenses.asl20 ]; |
|
296 | 321 | }; |
|
297 | 322 | }; |
|
323 | "cryptography" = super.buildPythonPackage { | |
|
324 | name = "cryptography-2.6.1"; | |
|
325 | doCheck = false; | |
|
326 | propagatedBuildInputs = [ | |
|
327 | self."asn1crypto" | |
|
328 | self."six" | |
|
329 | self."cffi" | |
|
330 | self."enum34" | |
|
331 | self."ipaddress" | |
|
332 | ]; | |
|
333 | src = fetchurl { | |
|
334 | url = "https://files.pythonhosted.org/packages/07/ca/bc827c5e55918ad223d59d299fff92f3563476c3b00d0a9157d9c0217449/cryptography-2.6.1.tar.gz"; | |
|
335 | sha256 = "19iwz5avym5zl6jrrrkym1rdaa9h61j20ph4cswsqgv8xg5j3j16"; | |
|
336 | }; | |
|
337 | meta = { | |
|
338 | license = [ pkgs.lib.licenses.bsdOriginal { fullName = "BSD or Apache License, Version 2.0"; } pkgs.lib.licenses.asl20 ]; | |
|
339 | }; | |
|
340 | }; | |
|
298 | 341 | "cssselect" = super.buildPythonPackage { |
|
299 | 342 | name = "cssselect-1.0.3"; |
|
300 | 343 | doCheck = false; |
@@ -337,11 +380,11 b' self: super: {' | |||
|
337 | 380 | }; |
|
338 | 381 | }; |
|
339 | 382 | "defusedxml" = super.buildPythonPackage { |
|
340 |
name = "defusedxml-0. |
|
|
383 | name = "defusedxml-0.6.0"; | |
|
341 | 384 | doCheck = false; |
|
342 | 385 | src = fetchurl { |
|
343 |
url = "https://files.pythonhosted.org/packages/74 |
|
|
344 | sha256 = "1x54n0h8hl92vvwyymx883fbqpqjwn2mc8fb383bcg3z9zwz5mr4"; | |
|
386 | url = "https://files.pythonhosted.org/packages/a4/5f/f8aa58ca0cf01cbcee728abc9d88bfeb74e95e6cb4334cfd5bed5673ea77/defusedxml-0.6.0.tar.gz"; | |
|
387 | sha256 = "1xbp8fivl3wlbyg2jrvs4lalaqv1xp9a9f29p75wdx2s2d6h717n"; | |
|
345 | 388 | }; |
|
346 | 389 | meta = { |
|
347 | 390 | license = [ pkgs.lib.licenses.psfl ]; |
@@ -399,11 +442,11 b' self: super: {' | |||
|
399 | 442 | }; |
|
400 | 443 | }; |
|
401 | 444 | "ecdsa" = super.buildPythonPackage { |
|
402 | name = "ecdsa-0.13"; | |
|
445 | name = "ecdsa-0.13.2"; | |
|
403 | 446 | doCheck = false; |
|
404 | 447 | src = fetchurl { |
|
405 |
url = "https://files.pythonhosted.org/packages/ |
|
|
406 | sha256 = "1yj31j0asmrx4an9xvsaj2icdmzy6pw0glfpqrrkrphwdpi1xkv4"; | |
|
448 | url = "https://files.pythonhosted.org/packages/51/76/139bf6e9b7b6684d5891212cdbd9e0739f2bfc03f380a1a6ffa700f392ac/ecdsa-0.13.2.tar.gz"; | |
|
449 | sha256 = "116qaq7bh4lcynzi613960jhsnn19v0kmsqwahiwjfj14gx4y0sw"; | |
|
407 | 450 | }; |
|
408 | 451 | meta = { |
|
409 | 452 | license = [ pkgs.lib.licenses.mit ]; |
@@ -491,8 +534,8 b' self: super: {' | |||
|
491 | 534 | self."configparser" |
|
492 | 535 | ]; |
|
493 | 536 | src = fetchurl { |
|
494 | url = "https://code.rhodecode.com/upstream/entrypoints/archive/96e6d645684e1af3d7df5b5272f3fe85a546b233.tar.gz?md5=7db37771aea9ac9fefe093e5d6987313"; | |
|
495 | sha256 = "0bihrdp8ahsys437kxdhk52gz6kib8rxjv71i93wkw7594fcaxll"; | |
|
537 | url = "https://code.rhodecode.com/upstream/entrypoints/artifacts/download/0-8e9ee9e4-c4db-409c-b07e-81568fd1832d.tar.gz?md5=3a027b8ff1d257b91fe257de6c43357d"; | |
|
538 | sha256 = "0qih72n2myclanplqipqxpgpj9d2yhff1pz5d02zq1cfqyd173w5"; | |
|
496 | 539 | }; |
|
497 | 540 | meta = { |
|
498 | 541 | license = [ pkgs.lib.licenses.mit ]; |
@@ -623,11 +666,11 b' self: super: {' | |||
|
623 | 666 | }; |
|
624 | 667 | }; |
|
625 | 668 | "hupper" = super.buildPythonPackage { |
|
626 | name = "hupper-1. | |
|
669 | name = "hupper-1.6.1"; | |
|
627 | 670 | doCheck = false; |
|
628 | 671 | src = fetchurl { |
|
629 | url = "https://files.pythonhosted.org/packages/f1/75/1915dc7650b4867fa3049256e24ca8eddb5989998fcec788cf52b9812dfc/hupper-1.4.2.tar.gz"; | |
|
630 | sha256 = "16vb9fkiaakdpcp6pn56h3w0dwvm67bxq2k2dv4i382qhqwphdzb"; | |
|
672 | url = "https://files.pythonhosted.org/packages/85/d9/e005d357b11249c5d70ddf5b7adab2e4c0da4e8b0531ff146917a04fe6c0/hupper-1.6.1.tar.gz"; | |
|
673 | sha256 = "0d3cvkc8ssgwk54wvhbifj56ry97qi10pfzwfk8vwzzcikbfp3zy"; | |
|
631 | 674 | }; |
|
632 | 675 | meta = { |
|
633 | 676 | license = [ pkgs.lib.licenses.mit ]; |
@@ -671,15 +714,15 b' self: super: {' | |||
|
671 | 714 | }; |
|
672 | 715 | }; |
|
673 | 716 | "ipdb" = super.buildPythonPackage { |
|
674 | name = "ipdb-0.1 | |
|
717 | name = "ipdb-0.12"; | |
|
675 | 718 | doCheck = false; |
|
676 | 719 | propagatedBuildInputs = [ |
|
677 | 720 | self."setuptools" |
|
678 | 721 | self."ipython" |
|
679 | 722 | ]; |
|
680 | 723 | src = fetchurl { |
|
681 | url = "https://files.pythonhosted.org/packages/80 | |
|
682 | sha256 = "02m0l8wrhhd3z7dg3czn5ys1g5pxib516hpshdzp7rxzsxgcd0bh"; | |
|
724 | url = "https://files.pythonhosted.org/packages/6d/43/c3c2e866a8803e196d6209595020a4a6db1a3c5d07c01455669497ae23d0/ipdb-0.12.tar.gz"; | |
|
725 | sha256 = "1khr2n7xfy8hg65kj1bsrjq9g7656pp0ybfa8abpbzpdawji3qnw"; | |
|
683 | 726 | }; |
|
684 | 727 | meta = { |
|
685 | 728 | license = [ pkgs.lib.licenses.bsdOriginal ]; |
@@ -802,14 +845,14 b' self: super: {' | |||
|
802 | 845 | }; |
|
803 | 846 | }; |
|
804 | 847 | "jupyter-core" = super.buildPythonPackage { |
|
805 | name = "jupyter-core-4. | |
|
848 | name = "jupyter-core-4.5.0"; | |
|
806 | 849 | doCheck = false; |
|
807 | 850 | propagatedBuildInputs = [ |
|
808 | 851 | self."traitlets" |
|
809 | 852 | ]; |
|
810 | 853 | src = fetchurl { |
|
811 | url = "https://files.pythonhosted.org/packages/b | |
|
812 | sha256 = "1dy083rarba8prn9f9srxq3c7n7vyql02ycrqq306c40lr57aw5s"; | |
|
854 | url = "https://files.pythonhosted.org/packages/4a/de/ff4ca734656d17ebe0450807b59d728f45277e2e7f4b82bc9aae6cb82961/jupyter_core-4.5.0.tar.gz"; | |
|
855 | sha256 = "1xr4pbghwk5hayn5wwnhb7z95380r45p79gf5if5pi1akwg7qvic"; | |
|
813 | 856 | }; |
|
814 | 857 | meta = { |
|
815 | 858 | license = [ pkgs.lib.licenses.bsdOriginal ]; |
@@ -1008,14 +1051,14 b' self: super: {' | |||
|
1008 | 1051 | }; |
|
1009 | 1052 | }; |
|
1010 | 1053 | "paste" = super.buildPythonPackage { |
|
1011 | name = "paste-3.0. | |
|
1054 | name = "paste-3.0.8"; | |
|
1012 | 1055 | doCheck = false; |
|
1013 | 1056 | propagatedBuildInputs = [ |
|
1014 | 1057 | self."six" |
|
1015 | 1058 | ]; |
|
1016 | 1059 | src = fetchurl { |
|
1017 | url = "https://files.pythonhosted.org/packages/ | |
|
1018 | sha256 = "1a6i8fh1fg8r4x800fvy9r82m15clwjim6yf2g9r4dff0y40dchv"; | |
|
1060 | url = "https://files.pythonhosted.org/packages/66/65/e3acf1663438483c1f6ced0b6c6f3b90da9f0faacb0a6e2aa0f3f9f4b235/Paste-3.0.8.tar.gz"; | |
|
1061 | sha256 = "05w1sh6ky4d7pmdb8nv82n13w22jcn3qsagg5ih3hjmbws9kkwf4"; | |
|
1019 | 1062 | }; |
|
1020 | 1063 | meta = { |
|
1021 | 1064 | license = [ pkgs.lib.licenses.mit ]; |
@@ -1033,7 +1076,7 b' self: super: {' | |||
|
1033 | 1076 | }; |
|
1034 | 1077 | }; |
|
1035 | 1078 | "pastescript" = super.buildPythonPackage { |
|
1036 | name = "pastescript-3. | |
|
1079 | name = "pastescript-3.1.0"; | |
|
1037 | 1080 | doCheck = false; |
|
1038 | 1081 | propagatedBuildInputs = [ |
|
1039 | 1082 | self."paste" |
@@ -1041,23 +1084,23 b' self: super: {' | |||
|
1041 | 1084 | self."six" |
|
1042 | 1085 | ]; |
|
1043 | 1086 | src = fetchurl { |
|
1044 | url = "https://files.pythonhosted.org/packages/0 | |
|
1045 | sha256 = "1hvmyz1sbn7ws1syw567ph7km9fi0wi75r3vlyzx6sk0z26xkm6r"; | |
|
1087 | url = "https://files.pythonhosted.org/packages/9e/1d/14db1c283eb21a5d36b6ba1114c13b709629711e64acab653d9994fe346f/PasteScript-3.1.0.tar.gz"; | |
|
1088 | sha256 = "02qcxjjr32ks7a6d4f533wl34ysc7yhwlrfcyqwqbzr52250v4fs"; | |
|
1046 | 1089 | }; |
|
1047 | 1090 | meta = { |
|
1048 | 1091 | license = [ pkgs.lib.licenses.mit ]; |
|
1049 | 1092 | }; |
|
1050 | 1093 | }; |
|
1051 | 1094 | "pathlib2" = super.buildPythonPackage { |
|
1052 | name = "pathlib2-2.3. | |
|
1095 | name = "pathlib2-2.3.4"; | |
|
1053 | 1096 | doCheck = false; |
|
1054 | 1097 | propagatedBuildInputs = [ |
|
1055 | 1098 | self."six" |
|
1056 | 1099 | self."scandir" |
|
1057 | 1100 | ]; |
|
1058 | 1101 | src = fetchurl { |
|
1059 | url = "https://files.pythonhosted.org/packages/b | |
|
1060 | sha256 = "0hpp92vqqgcd8h92msm9slv161b1q160igjwnkf2ag6cx0c96695"; | |
|
1102 | url = "https://files.pythonhosted.org/packages/b5/f4/9c7cc726ece2498b6c8b62d3262aa43f59039b953fe23c9964ac5e18d40b/pathlib2-2.3.4.tar.gz"; | |
|
1103 | sha256 = "1y0f9rkm1924zrc5dn4bwxlhgdkbml82lkcc28l5rgmr7d918q24"; | |
|
1061 | 1104 | }; |
|
1062 | 1105 | meta = { |
|
1063 | 1106 | license = [ pkgs.lib.licenses.mit ]; |
@@ -1075,14 +1118,14 b' self: super: {' | |||
|
1075 | 1118 | }; |
|
1076 | 1119 | }; |
|
1077 | 1120 | "pexpect" = super.buildPythonPackage { |
|
1078 | name = "pexpect-4. | |
|
1121 | name = "pexpect-4.7.0"; | |
|
1079 | 1122 | doCheck = false; |
|
1080 | 1123 | propagatedBuildInputs = [ |
|
1081 | 1124 | self."ptyprocess" |
|
1082 | 1125 | ]; |
|
1083 | 1126 | src = fetchurl { |
|
1084 | url = "https://files.pythonhosted.org/packages/ | |
|
1085 | sha256 = "1fla85g47iaxxpjhp9vkxdnv4pgc7rplfy6ja491smrrk0jqi3ia"; | |
|
1127 | url = "https://files.pythonhosted.org/packages/1c/b1/362a0d4235496cb42c33d1d8732b5e2c607b0129ad5fdd76f5a583b9fcb3/pexpect-4.7.0.tar.gz"; | |
|
1128 | sha256 = "1sv2rri15zwhds85a4kamwh9pj49qcxv7m4miyr4jfpfwv81yb4y"; | |
|
1086 | 1129 | }; |
|
1087 | 1130 | meta = { |
|
1088 | 1131 | license = [ pkgs.lib.licenses.isc { fullName = "ISC License (ISCL)"; } ]; |
@@ -1117,63 +1160,63 b' self: super: {' | |||
|
1117 | 1160 | }; |
|
1118 | 1161 | }; |
|
1119 | 1162 | "plaster-pastedeploy" = super.buildPythonPackage { |
|
1120 | name = "plaster-pastedeploy-0. | |
|
1163 | name = "plaster-pastedeploy-0.7"; | |
|
1121 | 1164 | doCheck = false; |
|
1122 | 1165 | propagatedBuildInputs = [ |
|
1123 | 1166 | self."pastedeploy" |
|
1124 | 1167 | self."plaster" |
|
1125 | 1168 | ]; |
|
1126 | 1169 | src = fetchurl { |
|
1127 | url = "https://files.pythonhosted.org/packages/ | |
|
1128 | sha256 = "1bkggk18f4z2bmsmxyxabvf62znvjwbivzh880419r3ap0616cf2"; | |
|
1170 | url = "https://files.pythonhosted.org/packages/99/69/2d3bc33091249266a1bd3cf24499e40ab31d54dffb4a7d76fe647950b98c/plaster_pastedeploy-0.7.tar.gz"; | |
|
1171 | sha256 = "1zg7gcsvc1kzay1ry5p699rg2qavfsxqwl17mqxzr0gzw6j9679r"; | |
|
1129 | 1172 | }; |
|
1130 | 1173 | meta = { |
|
1131 | 1174 | license = [ pkgs.lib.licenses.mit ]; |
|
1132 | 1175 | }; |
|
1133 | 1176 | }; |
|
1134 | 1177 | "pluggy" = super.buildPythonPackage { |
|
1135 | name = "pluggy-0. | |
|
1178 | name = "pluggy-0.11.0"; | |
|
1136 | 1179 | doCheck = false; |
|
1137 | 1180 | src = fetchurl { |
|
1138 | url = "https://files.pythonhosted.org/packages/38/e1/83b10c17688af7b2998fa5342fec58ecbd2a5a7499f31e606ae6640b71ac/pluggy-0.8.1.tar.gz"; | |
|
1139 | sha256 = "05l6g42p9ilmabw0hlbiyxy6gyzjri41m5l11a8dzgvi77q35p4d"; | |
|
1181 | url = "https://files.pythonhosted.org/packages/0d/a1/862ab336e8128fde20981d2c1aa8506693412daf5083b1911d539412676b/pluggy-0.11.0.tar.gz"; | |
|
1182 | sha256 = "10511a54dvafw1jrk75mrhml53c7b7w4yaw7241696lc2hfvr895"; | |
|
1140 | 1183 | }; |
|
1141 | 1184 | meta = { |
|
1142 | 1185 | license = [ pkgs.lib.licenses.mit ]; |
|
1143 | 1186 | }; |
|
1144 | 1187 | }; |
|
1145 | 1188 | "prompt-toolkit" = super.buildPythonPackage { |
|
1146 | name = "prompt-toolkit-1.0.1 | |
|
1189 | name = "prompt-toolkit-1.0.16"; | |
|
1147 | 1190 | doCheck = false; |
|
1148 | 1191 | propagatedBuildInputs = [ |
|
1149 | 1192 | self."six" |
|
1150 | 1193 | self."wcwidth" |
|
1151 | 1194 | ]; |
|
1152 | 1195 | src = fetchurl { |
|
1153 | url = "https://files.pythonhosted.org/packages/ | |
|
1154 | sha256 = "05v9h5nydljwpj5nm8n804ms0glajwfy1zagrzqrg91wk3qqi1c5"; | |
|
1196 | url = "https://files.pythonhosted.org/packages/f1/03/bb36771dc9fa7553ac4bdc639a9ecdf6fda0ff4176faf940d97e3c16e41d/prompt_toolkit-1.0.16.tar.gz"; | |
|
1197 | sha256 = "1d65hm6nf0cbq0q0121m60zzy4s1fpg9fn761s1yxf08dridvkn1"; | |
|
1155 | 1198 | }; |
|
1156 | 1199 | meta = { |
|
1157 | 1200 | license = [ pkgs.lib.licenses.bsdOriginal ]; |
|
1158 | 1201 | }; |
|
1159 | 1202 | }; |
|
1160 | 1203 | "psutil" = super.buildPythonPackage { |
|
1161 | name = "psutil-5. | |
|
1204 | name = "psutil-5.5.1"; | |
|
1162 | 1205 | doCheck = false; |
|
1163 | 1206 | src = fetchurl { |
|
1164 | url = "https://files.pythonhosted.org/packages/ | |
|
1165 | sha256 = "1hyna338sml2cl1mfb2gs89np18z27mvyhmq4ifh22x07n7mq9kf"; | |
|
1207 | url = "https://files.pythonhosted.org/packages/c7/01/7c30b247cdc5ba29623faa5c8cf1f1bbf7e041783c340414b0ed7e067c64/psutil-5.5.1.tar.gz"; | |
|
1208 | sha256 = "045qaqvn6k90bj5bcy259yrwcd2afgznaav3sfhphy9b8ambzkkj"; | |
|
1166 | 1209 | }; |
|
1167 | 1210 | meta = { |
|
1168 | 1211 | license = [ pkgs.lib.licenses.bsdOriginal ]; |
|
1169 | 1212 | }; |
|
1170 | 1213 | }; |
|
1171 | 1214 | "psycopg2" = super.buildPythonPackage { |
|
1172 | name = "psycopg2-2. | |
|
1215 | name = "psycopg2-2.8.3"; | |
|
1173 | 1216 | doCheck = false; |
|
1174 | 1217 | src = fetchurl { |
|
1175 | url = "https://files.pythonhosted.org/packages/b | |
|
1176 | sha256 = "17klx964gw8z0znl0raz3by8vdc7cq5gxj4pdcrfcina84nrdkzc"; | |
|
1218 | url = "https://files.pythonhosted.org/packages/5c/1c/6997288da181277a0c29bc39a5f9143ff20b8c99f2a7d059cfb55163e165/psycopg2-2.8.3.tar.gz"; | |
|
1219 | sha256 = "0ms4kx0p5n281l89awccix4d05ybmdngnjjpi9jbzd0rhf1nwyl9"; | |
|
1177 | 1220 | }; |
|
1178 | 1221 | meta = { |
|
1179 | 1222 | license = [ pkgs.lib.licenses.zpl21 { fullName = "GNU Library or Lesser General Public License (LGPL)"; } { fullName = "LGPL with exceptions or ZPL"; } ]; |
@@ -1239,14 +1282,25 b' self: super: {' | |||
|
1239 | 1282 | }; |
|
1240 | 1283 | }; |
|
1241 | 1284 | "pyasn1-modules" = super.buildPythonPackage { |
|
1242 | name = "pyasn1-modules-0.2. | |
|
1285 | name = "pyasn1-modules-0.2.5"; | |
|
1243 | 1286 | doCheck = false; |
|
1244 | 1287 | propagatedBuildInputs = [ |
|
1245 | 1288 | self."pyasn1" |
|
1246 | 1289 | ]; |
|
1247 | 1290 | src = fetchurl { |
|
1248 | url = "https://files.pythonhosted.org/packages/b | |
|
1249 | sha256 = "0z3w5dqrrvdplg9ma45j8n23xvyrj9ki8mg4ibqbn7l4qpl90855"; | |
|
1291 | url = "https://files.pythonhosted.org/packages/ec/0b/69620cb04a016e4a1e8e352e8a42717862129b574b3479adb2358a1f12f7/pyasn1-modules-0.2.5.tar.gz"; | |
|
1292 | sha256 = "15nvfx0vnl8akdlv3k6s0n80vqvryj82bm040jdsn7wmyxl1ywpg"; | |
|
1293 | }; | |
|
1294 | meta = { | |
|
1295 | license = [ pkgs.lib.licenses.bsdOriginal ]; | |
|
1296 | }; | |
|
1297 | }; | |
|
1298 | "pycparser" = super.buildPythonPackage { | |
|
1299 | name = "pycparser-2.19"; | |
|
1300 | doCheck = false; | |
|
1301 | src = fetchurl { | |
|
1302 | url = "https://files.pythonhosted.org/packages/68/9e/49196946aee219aead1290e00d1e7fdeab8567783e83e1b9ab5585e6206a/pycparser-2.19.tar.gz"; | |
|
1303 | sha256 = "1cr5dcj9628lkz1qlwq3fv97c25363qppkmcayqvd05dpy573259"; | |
|
1250 | 1304 | }; |
|
1251 | 1305 | meta = { |
|
1252 | 1306 | license = [ pkgs.lib.licenses.bsdOriginal ]; |
@@ -1274,23 +1328,12 b' self: super: {' | |||
|
1274 | 1328 | license = [ pkgs.lib.licenses.mit { fullName = "LGPL/MIT"; } { fullName = "GNU Library or Lesser General Public License (LGPL)"; } ]; |
|
1275 | 1329 | }; |
|
1276 | 1330 | }; |
|
1277 | "py | |
|
1278 | name = "py | |
|
1331 | "pygments" = super.buildPythonPackage { | |
|
1332 | name = "pygments-2.4.2"; | |
|
1279 | 1333 | doCheck = false; |
|
1280 | 1334 | src = fetchurl { |
|
1281 | url = "https://files.pythonhosted.org/packages/75/22/a90ec0252f4f87f3ffb6336504de71fe16a49d69c4538dae2f12b9360a38/pyflakes-0.8.1.tar.gz"; | |
|
1282 | sha256 = "0sbpq6pqm1i9wqi41mlfrsc5rk92jv4mskvlyxmnhlbdnc80ma1z"; | |
|
1283 | }; | |
|
1284 | meta = { | |
|
1285 | license = [ pkgs.lib.licenses.mit ]; | |
|
1286 | }; | |
|
1287 | }; | |
|
1288 | "pygments" = super.buildPythonPackage { | |
|
1289 | name = "pygments-2.3.1"; | |
|
1290 | doCheck = false; | |
|
1291 | src = fetchurl { | |
|
1292 | url = "https://files.pythonhosted.org/packages/64/69/413708eaf3a64a6abb8972644e0f20891a55e621c6759e2c3f3891e05d63/Pygments-2.3.1.tar.gz"; | |
|
1293 | sha256 = "0ji87g09jph8jqcvclgb02qvxasdnr9pzvk90rl66d90yqcxmyjz"; | |
|
1335 | url = "https://files.pythonhosted.org/packages/7e/ae/26808275fc76bf2832deb10d3a3ed3107bc4de01b85dcccbe525f2cd6d1e/Pygments-2.4.2.tar.gz"; | |
|
1336 | sha256 = "15v2sqm5g12bqa0c7wikfh9ck2nl97ayizy1hpqhmws5gqalq748"; | |
|
1294 | 1337 | }; |
|
1295 | 1338 | meta = { |
|
1296 | 1339 | license = [ pkgs.lib.licenses.bsdOriginal ]; |
@@ -1330,7 +1373,7 b' self: super: {' | |||
|
1330 | 1373 | }; |
|
1331 | 1374 | }; |
|
1332 | 1375 | "pyramid" = super.buildPythonPackage { |
|
1333 | name = "pyramid-1.10. | |
|
1376 | name = "pyramid-1.10.4"; | |
|
1334 | 1377 | doCheck = false; |
|
1335 | 1378 | propagatedBuildInputs = [ |
|
1336 | 1379 | self."hupper" |
@@ -1345,28 +1388,13 b' self: super: {' | |||
|
1345 | 1388 | self."repoze.lru" |
|
1346 | 1389 | ]; |
|
1347 | 1390 | src = fetchurl { |
|
1348 | url = "https://files.pythonhosted.org/packages/0 | |
|
1349 | sha256 = "1h5105nfh6rsrfjiyw20aavyibj36la3hajy6vh1fa77xb4y3hrp"; | |
|
1391 | url = "https://files.pythonhosted.org/packages/c2/43/1ae701c9c6bb3a434358e678a5e72c96e8aa55cf4cb1d2fa2041b5dd38b7/pyramid-1.10.4.tar.gz"; | |
|
1392 | sha256 = "0rkxs1ajycg2zh1c94xlmls56mx5m161sn8112skj0amza6cn36q"; | |
|
1350 | 1393 | }; |
|
1351 | 1394 | meta = { |
|
1352 | 1395 | license = [ { fullName = "Repoze Public License"; } { fullName = "BSD-derived (http://www.repoze.org/LICENSE.txt)"; } ]; |
|
1353 | 1396 | }; |
|
1354 | 1397 | }; |
|
1355 | "pyramid-beaker" = super.buildPythonPackage { | |
|
1356 | name = "pyramid-beaker-0.8"; | |
|
1357 | doCheck = false; | |
|
1358 | propagatedBuildInputs = [ | |
|
1359 | self."pyramid" | |
|
1360 | self."beaker" | |
|
1361 | ]; | |
|
1362 | src = fetchurl { | |
|
1363 | url = "https://files.pythonhosted.org/packages/d9/6e/b85426e00fd3d57f4545f74e1c3828552d8700f13ededeef9233f7bca8be/pyramid_beaker-0.8.tar.gz"; | |
|
1364 | sha256 = "0hflx3qkcdml1mwpq53sz46s7jickpfn0zy0ns2c7j445j66bp3p"; | |
|
1365 | }; | |
|
1366 | meta = { | |
|
1367 | license = [ { fullName = "BSD-derived (http://www.repoze.org/LICENSE.txt)"; } ]; | |
|
1368 | }; | |
|
1369 | }; | |
|
1370 | 1398 | "pyramid-debugtoolbar" = super.buildPythonPackage { |
|
1371 | 1399 | name = "pyramid-debugtoolbar-4.5"; |
|
1372 | 1400 | doCheck = false; |
@@ -1538,14 +1566,14 b' self: super: {' | |||
|
1538 | 1566 | }; |
|
1539 | 1567 | }; |
|
1540 | 1568 | "python-dateutil" = super.buildPythonPackage { |
|
1541 | name = "python-dateutil-2. | |
|
1569 | name = "python-dateutil-2.8.0"; | |
|
1542 | 1570 | doCheck = false; |
|
1543 | 1571 | propagatedBuildInputs = [ |
|
1544 | 1572 | self."six" |
|
1545 | 1573 | ]; |
|
1546 | 1574 | src = fetchurl { |
|
1547 | url = "https://files.pythonhosted.org/packages/ | |
|
1548 | sha256 = "00ngwcdw36w5b37b51mdwn3qxid9zdf3kpffv2q6n9kl05y2iyc8"; | |
|
1575 | url = "https://files.pythonhosted.org/packages/ad/99/5b2e99737edeb28c71bcbec5b5dda19d0d9ef3ca3e92e3e925e7c0bb364c/python-dateutil-2.8.0.tar.gz"; | |
|
1576 | sha256 = "17nsfhy4xdz1khrfxa61vd7pmvd5z0wa3zb6v4gb4kfnykv0b668"; | |
|
1549 | 1577 | }; |
|
1550 | 1578 | meta = { |
|
1551 | 1579 | license = [ pkgs.lib.licenses.bsdOriginal pkgs.lib.licenses.asl20 { fullName = "Dual License"; } ]; |
@@ -1690,7 +1718,7 b' self: super: {' | |||
|
1690 | 1718 | }; |
|
1691 | 1719 | }; |
|
1692 | 1720 | "rhodecode-enterprise-ce" = super.buildPythonPackage { |
|
1693 | name = "rhodecode-enterprise-ce-4.1 | |
|
1721 | name = "rhodecode-enterprise-ce-4.17.0"; | |
|
1694 | 1722 | buildInputs = [ |
|
1695 | 1723 | self."pytest" |
|
1696 | 1724 | self."py" |
@@ -1709,36 +1737,29 b' self: super: {' | |||
|
1709 | 1737 | ]; |
|
1710 | 1738 | doCheck = true; |
|
1711 | 1739 | propagatedBuildInputs = [ |
|
1712 | self."setuptools-scm" | |
|
1713 | 1740 | self."amqp" |
|
1714 | 1741 | self."authomatic" |
|
1715 | self."atomicwrites" | |
|
1716 | self."attrs" | |
|
1717 | 1742 | self."babel" |
|
1718 | 1743 | self."beaker" |
|
1719 | 1744 | self."bleach" |
|
1720 | 1745 | self."celery" |
|
1721 | self."chameleon" | |
|
1722 | 1746 | self."channelstream" |
|
1723 | 1747 | self."click" |
|
1724 | 1748 | self."colander" |
|
1725 | 1749 | self."configobj" |
|
1726 | 1750 | self."cssselect" |
|
1751 | self."cryptography" | |
|
1727 | 1752 | self."decorator" |
|
1728 | 1753 | self."deform" |
|
1729 | 1754 | self."docutils" |
|
1730 | 1755 | self."dogpile.cache" |
|
1731 | 1756 | self."dogpile.core" |
|
1732 | self."ecdsa" | |
|
1733 | 1757 | self."formencode" |
|
1734 | 1758 | self."future" |
|
1735 | 1759 | self."futures" |
|
1736 | self."gnureadline" | |
|
1737 | 1760 | self."infrae.cache" |
|
1738 | 1761 | self."iso8601" |
|
1739 | 1762 | self."itsdangerous" |
|
1740 | self."jinja2" | |
|
1741 | self."billiard" | |
|
1742 | 1763 | self."kombu" |
|
1743 | 1764 | self."lxml" |
|
1744 | 1765 | self."mako" |
@@ -1747,21 +1768,18 b' self: super: {' | |||
|
1747 | 1768 | self."msgpack-python" |
|
1748 | 1769 | self."pyotp" |
|
1749 | 1770 | self."packaging" |
|
1771 | self."pathlib2" | |
|
1750 | 1772 | self."paste" |
|
1751 | 1773 | self."pastedeploy" |
|
1752 | 1774 | self."pastescript" |
|
1753 | self."pathlib2" | |
|
1754 | 1775 | self."peppercorn" |
|
1755 | 1776 | self."psutil" |
|
1756 | 1777 | self."py-bcrypt" |
|
1778 | self."pycurl" | |
|
1757 | 1779 | self."pycrypto" |
|
1758 | self."pycurl" | |
|
1759 | self."pyflakes" | |
|
1760 | 1780 | self."pygments" |
|
1761 | 1781 | self."pyparsing" |
|
1762 | self."pyramid-beaker" | |
|
1763 | 1782 | self."pyramid-debugtoolbar" |
|
1764 | self."pyramid-jinja2" | |
|
1765 | 1783 | self."pyramid-mako" |
|
1766 | 1784 | self."pyramid" |
|
1767 | 1785 | self."pyramid-mailer" |
@@ -1784,7 +1802,6 b' self: super: {' | |||
|
1784 | 1802 | self."sshpubkeys" |
|
1785 | 1803 | self."subprocess32" |
|
1786 | 1804 | self."supervisor" |
|
1787 | self."tempita" | |
|
1788 | 1805 | self."translationstring" |
|
1789 | 1806 | self."urllib3" |
|
1790 | 1807 | self."urlobject" |
@@ -1813,7 +1830,6 b' self: super: {' | |||
|
1813 | 1830 | self."greenlet" |
|
1814 | 1831 | self."gunicorn" |
|
1815 | 1832 | self."waitress" |
|
1816 | self."setproctitle" | |
|
1817 | 1833 | self."ipdb" |
|
1818 | 1834 | self."ipython" |
|
1819 | 1835 | self."rhodecode-tools" |
@@ -1855,8 +1871,8 b' self: super: {' | |||
|
1855 | 1871 | self."elasticsearch1-dsl" |
|
1856 | 1872 | ]; |
|
1857 | 1873 | src = fetchurl { |
|
1858 | url = "https://code.rhodecode.com/rhodecode-tools-ce/arc | |
|
1859 | sha256 = "1k8l3s4mvshza1zay6dfxprq54fyb5dc85dqdva9wa3f466y0adk"; | |
|
1874 | url = "https://code.rhodecode.com/rhodecode-tools-ce/artifacts/download/0-10ac93f4-bb7d-4b97-baea-68110743dd5a.tar.gz?md5=962dc77c06aceee62282b98d33149661"; | |
|
1875 | sha256 = "1vfhgf46inbx7jvlfx4fdzh3vz7lh37r291gzb5hx447pfm3qllg"; | |
|
1860 | 1876 | }; |
|
1861 | 1877 | meta = { |
|
1862 | 1878 | license = [ { fullName = "Apache 2.0 and Proprietary"; } ]; |
@@ -1878,11 +1894,11 b' self: super: {' | |||
|
1878 | 1894 | }; |
|
1879 | 1895 | }; |
|
1880 | 1896 | "scandir" = super.buildPythonPackage { |
|
1881 | name = "scandir-1. | |
|
1897 | name = "scandir-1.10.0"; | |
|
1882 | 1898 | doCheck = false; |
|
1883 | 1899 | src = fetchurl { |
|
1884 | url = "https://files.pythonhosted.org/packages/ | |
|
1885 | sha256 = "0r3hvf1a9jm1rkqgx40gxkmccknkaiqjavs8lccgq9s8khh5x5s4"; | |
|
1900 | url = "https://files.pythonhosted.org/packages/df/f5/9c052db7bd54d0cbf1bc0bb6554362bba1012d03e5888950a4f5c5dadc4e/scandir-1.10.0.tar.gz"; | |
|
1901 | sha256 = "1bkqwmf056pkchf05ywbnf659wqlp6lljcdb0y88wr9f0vv32ijd"; | |
|
1886 | 1902 | }; |
|
1887 | 1903 | meta = { |
|
1888 | 1904 | license = [ pkgs.lib.licenses.bsdOriginal { fullName = "New BSD License"; } ]; |
@@ -1900,22 +1916,11 b' self: super: {' | |||
|
1900 | 1916 | }; |
|
1901 | 1917 | }; |
|
1902 | 1918 | "setuptools" = super.buildPythonPackage { |
|
1903 | name = "setuptools-40. | |
|
1919 | name = "setuptools-41.0.1"; | |
|
1904 | 1920 | doCheck = false; |
|
1905 | 1921 | src = fetchurl { |
|
1906 | url = "https://files.pythonhosted.org/packages/c2/f7/c7b501b783e5a74cf1768bc174ee4fb0a8a6ee5af6afa92274ff964703e0/setuptools-40.8.0.zip"; | |
|
1907 | sha256 = "0k9hifpgahnw2a26w3cr346iy733k6d3nwh3f7g9m13y6f8fqkkf"; | |
|
1908 | }; | |
|
1909 | meta = { | |
|
1910 | license = [ pkgs.lib.licenses.mit ]; | |
|
1911 | }; | |
|
1912 | }; | |
|
1913 | "setuptools-scm" = super.buildPythonPackage { | |
|
1914 | name = "setuptools-scm-2.1.0"; | |
|
1915 | doCheck = false; | |
|
1916 | src = fetchurl { | |
|
1917 | url = "https://files.pythonhosted.org/packages/e5/62/f9e1ac314464eb5945c97542acb6bf6f3381dfa5d7a658de7730c36f31a1/setuptools_scm-2.1.0.tar.gz"; | |
|
1918 | sha256 = "0yb364cgk15sfw3x8ln4ssh98z1dj6n8iiz4r2rw1cfsxhgi8rx7"; | |
|
1922 | url = "https://files.pythonhosted.org/packages/1d/64/a18a487b4391a05b9c7f938b94a16d80305bf0369c6b0b9509e86165e1d3/setuptools-41.0.1.zip"; | |
|
1923 | sha256 = "04sns22y2hhsrwfy1mha2lgslvpjsjsz8xws7h2rh5a7ylkd28m2"; | |
|
1919 | 1924 | }; |
|
1920 | 1925 | meta = { |
|
1921 | 1926 | license = [ pkgs.lib.licenses.mit ]; |
@@ -1966,40 +1971,40 b' self: super: {' | |||
|
1966 | 1971 | }; |
|
1967 | 1972 | }; |
|
1968 | 1973 | "sshpubkeys" = super.buildPythonPackage { |
|
1969 | name = "sshpubkeys- | |
|
1974 | name = "sshpubkeys-3.1.0"; | |
|
1970 | 1975 | doCheck = false; |
|
1971 | 1976 | propagatedBuildInputs = [ |
|
1972 | self." | |
|
1977 | self."cryptography" | |
|
1973 | 1978 | self."ecdsa" |
|
1974 | 1979 | ]; |
|
1975 | 1980 | src = fetchurl { |
|
1976 | url = "https://files.pythonhosted.org/packages/27 | |
|
1977 | sha256 = "0r4kpwzmg96a2x56pllik7dmc3fnqk189v3sfgsi07q2ryrhr6xm"; | |
|
1981 | url = "https://files.pythonhosted.org/packages/00/23/f7508a12007c96861c3da811992f14283d79c819d71a217b3e12d5196649/sshpubkeys-3.1.0.tar.gz"; | |
|
1982 | sha256 = "105g2li04nm1hb15a2y6hm9m9k7fbrkd5l3gy12w3kgcmsf3k25k"; | |
|
1978 | 1983 | }; |
|
1979 | 1984 | meta = { |
|
1980 | 1985 | license = [ pkgs.lib.licenses.bsdOriginal ]; |
|
1981 | 1986 | }; |
|
1982 | 1987 | }; |
|
1983 | 1988 | "subprocess32" = super.buildPythonPackage { |
|
1984 | name = "subprocess32-3.5. | |
|
1989 | name = "subprocess32-3.5.4"; | |
|
1985 | 1990 | doCheck = false; |
|
1986 | 1991 | src = fetchurl { |
|
1987 | url = "https://files.pythonhosted.org/packages/be/2b/beeba583e9877e64db10b52a96915afc0feabf7144dcbf2a0d0ea68bf73d/subprocess32-3.5.3.tar.gz"; | |
|
1988 | sha256 = "1hr5fan8i719hmlmz73hf8rhq74014w07d8ryg7krvvf6692kj3b"; | |
|
1992 | url = "https://files.pythonhosted.org/packages/32/c8/564be4d12629b912ea431f1a50eb8b3b9d00f1a0b1ceff17f266be190007/subprocess32-3.5.4.tar.gz"; | |
|
1993 | sha256 = "17f7mvwx2271s1wrl0qac3wjqqnrqag866zs3qc8v5wp0k43fagb"; | |
|
1989 | 1994 | }; |
|
1990 | 1995 | meta = { |
|
1991 | 1996 | license = [ pkgs.lib.licenses.psfl ]; |
|
1992 | 1997 | }; |
|
1993 | 1998 | }; |
|
1994 | 1999 | "supervisor" = super.buildPythonPackage { |
|
1995 | name = "supervisor- | |
|
2000 | name = "supervisor-4.0.3"; | |
|
1996 | 2001 | doCheck = false; |
|
1997 | 2002 | propagatedBuildInputs = [ |
|
1998 | 2003 | self."meld3" |
|
1999 | 2004 | ]; |
|
2000 | 2005 | src = fetchurl { |
|
2001 | url = "https://files.pythonhosted.org/packages/ba | |
|
2002 | sha256 = "1w3ahridzbc6rxfpbyx8lij6pjlcgf2ymzyg53llkjqxalp6sk8v"; | |
|
2006 | url = "https://files.pythonhosted.org/packages/97/48/f38bf70bd9282d1a18d591616557cc1a77a1c627d57dff66ead65c891dc8/supervisor-4.0.3.tar.gz"; | |
|
2007 | sha256 = "17hla7mx6w5m5jzkkjxgqa8wpswqmfhbhf49f692hw78fg0ans7p"; | |
|
2003 | 2008 | }; |
|
2004 | 2009 | meta = { |
|
2005 | 2010 | license = [ { fullName = "BSD-derived (http://www.repoze.org/LICENSE.txt)"; } ]; |
@@ -2128,22 +2133,22 b' self: super: {' | |||
|
2128 | 2133 | }; |
|
2129 | 2134 | }; |
|
2130 | 2135 | "vine" = super.buildPythonPackage { |
|
2131 | name = "vine-1. | |
|
2136 | name = "vine-1.3.0"; | |
|
2132 | 2137 | doCheck = false; |
|
2133 | 2138 | src = fetchurl { |
|
2134 | url = "https://files.pythonhosted.org/packages/ | |
|
2135 | sha256 = "0xjz2sjbr5jrpjk411b7alkghdskhphgsqqrbi7abqfh2pli6j7f"; | |
|
2139 | url = "https://files.pythonhosted.org/packages/1c/e1/79fb8046e607dd6c2ad05c9b8ebac9d0bd31d086a08f02699e96fc5b3046/vine-1.3.0.tar.gz"; | |
|
2140 | sha256 = "11ydsbhl1vabndc2r979dv61s6j2b0giq6dgvryifvq1m7bycghk"; | |
|
2136 | 2141 | }; |
|
2137 | 2142 | meta = { |
|
2138 | 2143 | license = [ pkgs.lib.licenses.bsdOriginal ]; |
|
2139 | 2144 | }; |
|
2140 | 2145 | }; |
|
2141 | 2146 | "waitress" = super.buildPythonPackage { |
|
2142 | name = "waitress-1. | |
|
2147 | name = "waitress-1.3.0"; | |
|
2143 | 2148 | doCheck = false; |
|
2144 | 2149 | src = fetchurl { |
|
2145 | url = "https://files.pythonhosted.org/packages/3 | |
|
2146 | sha256 = "1a85gyji0kajc3p0s1pwwfm06w4wfxjkvvl4rnrz3h164kbd6g6k"; | |
|
2150 | url = "https://files.pythonhosted.org/packages/43/50/9890471320d5ad22761ae46661cf745f487b1c8c4ec49352b99e1078b970/waitress-1.3.0.tar.gz"; | |
|
2151 | sha256 = "09j5dzbbcxib7vdskhx39s1qsydlr4n2p2png71d7mjnr9pnwajf"; | |
|
2147 | 2152 | }; |
|
2148 | 2153 | meta = { |
|
2149 | 2154 | license = [ pkgs.lib.licenses.zpl21 ]; |
@@ -2218,18 +2223,18 b' self: super: {' | |||
|
2218 | 2223 | }; |
|
2219 | 2224 | }; |
|
2220 | 2225 | "webob" = super.buildPythonPackage { |
|
2221 | name = "webob-1.8. | |
|
2226 | name = "webob-1.8.5"; | |
|
2222 | 2227 | doCheck = false; |
|
2223 | 2228 | src = fetchurl { |
|
2224 | url = "https://files.pythonhosted.org/packages/e4/6c/99e322c3d4cc11d9060a67a9bf2f7c9c581f40988c11fffe89bb8c36bc5e/WebOb-1.8.4.tar.gz"; | |
|
2225 | sha256 = "16cfg5y4n6sihz59vsmns2yqbfm0gfsn3l5xgz2g0pdhilaib0x4"; | |
|
2229 | url = "https://files.pythonhosted.org/packages/9d/1a/0c89c070ee2829c934cb6c7082287c822e28236a4fcf90063e6be7c35532/WebOb-1.8.5.tar.gz"; | |
|
2230 | sha256 = "11khpzaxc88q31v25ic330gsf56fwmbdc9b30br8mvp0fmwspah5"; | |
|
2226 | 2231 | }; |
|
2227 | 2232 | meta = { |
|
2228 | 2233 | license = [ pkgs.lib.licenses.mit ]; |
|
2229 | 2234 | }; |
|
2230 | 2235 | }; |
|
2231 | 2236 | "webtest" = super.buildPythonPackage { |
|
2232 | name = "webtest-2.0.3 | |
|
2237 | name = "webtest-2.0.33"; | |
|
2233 | 2238 | doCheck = false; |
|
2234 | 2239 | propagatedBuildInputs = [ |
|
2235 | 2240 | self."six" |
@@ -2238,8 +2243,8 b' self: super: {' | |||
|
2238 | 2243 | self."beautifulsoup4" |
|
2239 | 2244 | ]; |
|
2240 | 2245 | src = fetchurl { |
|
2241 | url = "https://files.pythonhosted.org/packages/ | |
|
2242 | sha256 = "0qp0nnbazzm4ibjiyqfcn6f230svk09i4g58zg2i9x1ga06h48a2"; | |
|
2246 | url = "https://files.pythonhosted.org/packages/a8/b0/ffc9413b637dbe26e291429bb0f6ed731e518d0cd03da28524a8fe2e8a8f/WebTest-2.0.33.tar.gz"; | |
|
2247 | sha256 = "1l3z0cwqslsf4rcrhi2gr8kdfh74wn2dw76376i4g9i38gz8wd21"; | |
|
2243 | 2248 | }; |
|
2244 | 2249 | meta = { |
|
2245 | 2250 | license = [ pkgs.lib.licenses.mit ]; |
@@ -2293,42 +2298,42 b' self: super: {' | |||
|
2293 | 2298 | }; |
|
2294 | 2299 | }; |
|
2295 | 2300 | "zope.deprecation" = super.buildPythonPackage { |
|
2296 | name = "zope.deprecation-4. | |
|
2301 | name = "zope.deprecation-4.4.0"; | |
|
2297 | 2302 | doCheck = false; |
|
2298 | 2303 | propagatedBuildInputs = [ |
|
2299 | 2304 | self."setuptools" |
|
2300 | 2305 | ]; |
|
2301 | 2306 | src = fetchurl { |
|
2302 | url = "https://files.pythonhosted.org/packages/a | |
|
2303 | sha256 = "095jas41wbxgmw95kwdxqhbc3bgihw2hzj9b3qpdg85apcsf2lkx"; | |
|
2307 | url = "https://files.pythonhosted.org/packages/34/da/46e92d32d545dd067b9436279d84c339e8b16de2ca393d7b892bc1e1e9fd/zope.deprecation-4.4.0.tar.gz"; | |
|
2308 | sha256 = "1pz2cv7gv9y1r3m0bdv7ks1alagmrn5msm5spwdzkb2by0w36i8d"; | |
|
2304 | 2309 | }; |
|
2305 | 2310 | meta = { |
|
2306 | 2311 | license = [ pkgs.lib.licenses.zpl21 ]; |
|
2307 | 2312 | }; |
|
2308 | 2313 | }; |
|
2309 | 2314 | "zope.event" = super.buildPythonPackage { |
|
2310 | name = "zope.event-4. | |
|
2315 | name = "zope.event-4.4"; | |
|
2311 | 2316 | doCheck = false; |
|
2312 | 2317 | propagatedBuildInputs = [ |
|
2313 | 2318 | self."setuptools" |
|
2314 | 2319 | ]; |
|
2315 | 2320 | src = fetchurl { |
|
2316 | url = "https://files.pythonhosted.org/packages/ | |
|
2317 | sha256 = "1rrkyx42bcq8dkpj23c2v99kczlrg8d39c06q5qpr0vs4hjfmv70"; | |
|
2321 | url = "https://files.pythonhosted.org/packages/4c/b2/51c0369adcf5be2334280eed230192ab3b03f81f8efda9ddea6f65cc7b32/zope.event-4.4.tar.gz"; | |
|
2322 | sha256 = "1ksbc726av9xacml6jhcfyn828hlhb9xlddpx6fcvnlvmpmpvhk9"; | |
|
2318 | 2323 | }; |
|
2319 | 2324 | meta = { |
|
2320 | 2325 | license = [ pkgs.lib.licenses.zpl21 ]; |
|
2321 | 2326 | }; |
|
2322 | 2327 | }; |
|
2323 | 2328 | "zope.interface" = super.buildPythonPackage { |
|
2324 | name = "zope.interface-4. | |
|
2329 | name = "zope.interface-4.6.0"; | |
|
2325 | 2330 | doCheck = false; |
|
2326 | 2331 | propagatedBuildInputs = [ |
|
2327 | 2332 | self."setuptools" |
|
2328 | 2333 | ]; |
|
2329 | 2334 | src = fetchurl { |
|
2330 | url = "https://files.pythonhosted.org/packages/ac | |
|
2331 | sha256 = "0k67m60ij06wkg82n15qgyn96waf4pmrkhv0njpkfzpmv5q89hsp"; | |
|
2335 | url = "https://files.pythonhosted.org/packages/4e/d0/c9d16bd5b38de44a20c6dc5d5ed80a49626fafcb3db9f9efdc2a19026db6/zope.interface-4.6.0.tar.gz"; | |
|
2336 | sha256 = "1rgh2x3rcl9r0v0499kf78xy86rnmanajf4ywmqb943wpk50sg8v"; | |
|
2332 | 2337 | }; |
|
2333 | 2338 | meta = { |
|
2334 | 2339 | license = [ pkgs.lib.licenses.zpl21 ]; |
@@ -1,37 +1,31 b'' | |||
|
1 | 1 | ## dependencies |
|
2 | 2 | |
|
3 | setuptools-scm==2.1.0 | |
|
4 | 3 | amqp==2.3.1 |
|
5 | 4 | # not released authomatic that has updated some oauth providers |
|
6 | https://code.rhodecode.com/upstream/authomatic/ar | |
|
7 | atomicwrites==1.2.1 | |
|
8 | attrs==18.2.0 | |
|
5 | https://code.rhodecode.com/upstream/authomatic/artifacts/download/0-4fe9c041-a567-4f84-be4c-7efa2a606d3c.tar.gz?md5=f6bdc3c769688212db68233e8d2b0383#egg=authomatic==0.1.0.post1 | |
|
6 | ||
|
9 | 7 | babel==1.3 |
|
10 | 8 | beaker==1.9.1 |
|
11 | bleach==3. | |
|
9 | bleach==3.1.0 | |
|
12 | 10 | celery==4.1.1 |
|
13 | chameleon==2.24 | |
|
14 | 11 | channelstream==0.5.2 |
|
15 | 12 | click==7.0 |
|
16 | colander==1. | |
|
13 | colander==1.7.0 | |
|
17 | 14 | # our custom configobj |
|
18 | https://code.rhodecode.com/upstream/configobj/ar | |
|
15 | https://code.rhodecode.com/upstream/configobj/artifacts/download/0-012de99a-b1e1-4f64-a5c0-07a98a41b324.tar.gz?md5=6a513f51fe04b2c18cf84c1395a7c626#egg=configobj==5.0.6 | |
|
19 | 16 | cssselect==1.0.3 |
|
17 | cryptography==2.6.1 | |
|
20 | 18 | decorator==4.1.2 |
|
21 | 19 | deform==2.0.7 |
|
22 | 20 | docutils==0.14.0 |
|
23 | 21 | dogpile.cache==0.7.1 |
|
24 | 22 | dogpile.core==0.4.1 |
|
25 | ecdsa==0.13 | |
|
26 | 23 | formencode==1.2.4 |
|
27 | 24 | future==0.14.3 |
|
28 | 25 | futures==3.0.2 |
|
29 | gnureadline==6.3.8 | |
|
30 | 26 | infrae.cache==1.0.1 |
|
31 | 27 | iso8601==0.1.12 |
|
32 | 28 | itsdangerous==0.24 |
|
33 | jinja2==2.9.6 | |
|
34 | billiard==3.5.0.3 | |
|
35 | 29 | kombu==4.2.1 |
|
36 | 30 | lxml==4.2.5 |
|
37 | 31 | mako==1.0.7 |
@@ -40,23 +34,20 b' markupsafe==1.1.0' | |||
|
40 | 34 | msgpack-python==0.5.6 |
|
41 | 35 | pyotp==2.2.7 |
|
42 | 36 | packaging==15.2 |
|
43 | paste==3.0.5 | |
|
37 | pathlib2==2.3.4 | |
|
38 | paste==3.0.8 | |
|
44 | 39 | pastedeploy==2.0.1 |
|
45 | pastescript==3. | |
|
46 | pathlib2==2.3.3 | |
|
40 | pastescript==3.1.0 | |
|
47 | 41 | peppercorn==0.6 |
|
48 | psutil==5. | |
|
42 | psutil==5.5.1 | |
|
49 | 43 | py-bcrypt==0.4 |
|
50 | pycrypto==2.6.1 | |
|
51 | 44 | pycurl==7.43.0.2 |
|
52 | pyflakes==0.8.1 | |
|
53 | pygments==2. | |
|
45 | pycrypto==2.6.1 | |
|
46 | pygments==2.4.2 | |
|
54 | 47 | pyparsing==2.3.0 |
|
55 | pyramid-beaker==0.8 | |
|
56 | 48 | pyramid-debugtoolbar==4.5.0 |
|
57 | pyramid-jinja2==2.7 | |
|
58 | 49 | pyramid-mako==1.0.2 |
|
59 | pyramid==1.10. | |
|
50 | pyramid==1.10.4 | |
|
60 | 51 | pyramid_mailer==0.15.1 |
|
61 | 52 | python-dateutil |
|
62 | 53 | python-ldap==3.1.0 |
@@ -74,10 +65,9 b' routes==2.4.1' | |||
|
74 | 65 | simplejson==3.16.0 |
|
75 | 66 | six==1.11.0 |
|
76 | 67 | sqlalchemy==1.1.18 |
|
77 | sshpubkeys== | |
|
78 | subprocess32==3.5. | |
|
79 | supervisor== | |
|
80 | tempita==0.5.2 | |
|
68 | sshpubkeys==3.1.0 | |
|
69 | subprocess32==3.5.4 | |
|
70 | supervisor==4.0.3 | |
|
81 | 71 | translationstring==1.3 |
|
82 | 72 | urllib3==1.24.1 |
|
83 | 73 | urlobject==2.4.3 |
@@ -85,29 +75,29 b' venusian==1.2.0' | |||
|
85 | 75 | weberror==0.10.3 |
|
86 | 76 | webhelpers2==2.0 |
|
87 | 77 | webhelpers==1.3 |
|
88 | webob==1.8. | |
|
78 | webob==1.8.5 | |
|
89 | 79 | whoosh==2.7.4 |
|
90 | 80 | wsgiref==0.1.2 |
|
91 | 81 | zope.cachedescriptors==4.3.1 |
|
92 | zope.deprecation==4. | |
|
93 | zope.event==4. | |
|
94 | zope.interface==4. | |
|
82 | zope.deprecation==4.4.0 | |
|
83 | zope.event==4.4.0 | |
|
84 | zope.interface==4.6.0 | |
|
95 | 85 | |
|
96 | 86 | # DB drivers |
|
97 | 87 | mysql-python==1.2.5 |
|
98 | 88 | pymysql==0.8.1 |
|
99 | 89 | pysqlite==2.8.3 |
|
100 | psycopg2==2. | |
|
90 | psycopg2==2.8.3 | |
|
101 | 91 | |
|
102 | 92 | # IPYTHON RENDERING |
|
103 | 93 | # entrypoints backport, pypi version doesn't support egg installs |
|
104 | https://code.rhodecode.com/upstream/entrypoints/ar | |
|
94 | https://code.rhodecode.com/upstream/entrypoints/artifacts/download/0-8e9ee9e4-c4db-409c-b07e-81568fd1832d.tar.gz?md5=3a027b8ff1d257b91fe257de6c43357d#egg=entrypoints==0.2.2.rhodecode-upstream1 | |
|
105 | 95 | nbconvert==5.3.1 |
|
106 | 96 | nbformat==4.4.0 |
|
107 | 97 | jupyter_client==5.0.0 |
|
108 | 98 | |
|
109 | 99 | ## cli tools |
|
110 | alembic==1.0. | |
|
100 | alembic==1.0.10 | |
|
111 | 101 | invoke==0.13.0 |
|
112 | 102 | bumpversion==0.5.3 |
|
113 | 103 | |
@@ -115,15 +105,14 b' bumpversion==0.5.3' | |||
|
115 | 105 | gevent==1.4.0 |
|
116 | 106 | greenlet==0.4.15 |
|
117 | 107 | gunicorn==19.9.0 |
|
118 | waitress==1. | |
|
119 | setproctitle==1.1.10 | |
|
108 | waitress==1.3.0 | |
|
120 | 109 | |
|
121 | 110 | ## debug |
|
122 | ipdb==0.1 | |
|
111 | ipdb==0.12.0 | |
|
123 | 112 | ipython==5.1.0 |
|
124 | 113 | |
|
125 | 114 | ## rhodecode-tools, special case |
|
126 | https://code.rhodecode.com/rhodecode-tools-ce/ar | |
|
115 | https://code.rhodecode.com/rhodecode-tools-ce/artifacts/download/0-10ac93f4-bb7d-4b97-baea-68110743dd5a.tar.gz?md5=962dc77c06aceee62282b98d33149661#egg=rhodecode-tools==1.2.1 | |
|
127 | 116 | |
|
128 | 117 | ## appenlight |
|
129 | 118 | appenlight-client==0.6.26 |
@@ -10,7 +10,7 b' gprof2dot==2017.9.19' | |||
|
10 | 10 | |
|
11 | 11 | mock==1.0.1 |
|
12 | 12 | cov-core==1.15.0 |
|
13 | coverage==4.5. | |
|
13 | coverage==4.5.3 | |
|
14 | 14 | |
|
15 | webtest==2.0.3 | |
|
15 | webtest==2.0.33 | |
|
16 | 16 | beautifulsoup4==4.6.3 |
@@ -45,7 +45,7 b' PYRAMID_SETTINGS = {}' | |||
|
45 | 45 | EXTENSIONS = {} |
|
46 | 46 | |
|
47 | 47 | __version__ = ('.'.join((str(each) for each in VERSION[:3]))) |
|
48 | __dbversion__ = 9 | |
|
48 | __dbversion__ = 98 # defines current db version for migrations | |
|
49 | 49 | __platform__ = platform.system() |
|
50 | 50 | __license__ = 'AGPLv3, and Commercial License' |
|
51 | 51 | __author__ = 'RhodeCode GmbH' |
@@ -37,11 +37,11 b' class TestCommentCommit(object):' | |||
|
37 | 37 | assert_error(id_, expected, given=response.body) |
|
38 | 38 | |
|
39 | 39 | @pytest.mark.parametrize("commit_id, expected_err", [ |
|
40 | ('abcabca', {'hg': 'Commit {commit} does not exist for {repo}', | |
|
41 | 'git': 'Commit {commit} does not exist for {repo}', | |
|
40 | ('abcabca', {'hg': 'Commit {commit} does not exist for `{repo}`', | |
|
41 | 'git': 'Commit {commit} does not exist for `{repo}`', | |
|
42 | 42 | 'svn': 'Commit id {commit} not understood.'}), |
|
43 | ('idontexist', {'hg': 'Commit {commit} does not exist for {repo}', | |
|
44 | 'git': 'Commit {commit} does not exist for {repo}', | |
|
43 | ('idontexist', {'hg': 'Commit {commit} does not exist for `{repo}`', | |
|
44 | 'git': 'Commit {commit} does not exist for `{repo}`', | |
|
45 | 45 | 'svn': 'Commit id {commit} not understood.'}), |
|
46 | 46 | ]) |
|
47 | 47 | def test_api_comment_commit_wrong_hash(self, backend, commit_id, expected_err): |
@@ -53,7 +53,7 b' class TestCommentCommit(object):' | |||
|
53 | 53 | |
|
54 | 54 | expected_err = expected_err[backend.alias] |
|
55 | 55 | expected_err = expected_err.format( |
|
56 | repo=backend.repo.scm_instance(), commit=commit_id) | |
|
56 | repo=backend.repo.scm_instance().name, commit=commit_id) | |
|
57 | 57 | assert_error(id_, expected_err, given=response.body) |
|
58 | 58 | |
|
59 | 59 | @pytest.mark.parametrize("status_change, message, commit_id", [ |
@@ -44,6 +44,7 b' class TestGetRepo(object):' | |||
|
44 | 44 | self, apikey_attr, expect_secrets, cache_param, backend, |
|
45 | 45 | user_util): |
|
46 | 46 | repo = backend.create_repo() |
|
47 | repo_id = repo.repo_id | |
|
47 | 48 | usr = UserModel().get_by_username(TEST_USER_ADMIN_LOGIN) |
|
48 | 49 | group = user_util.create_user_group(members=[usr]) |
|
49 | 50 | user_util.grant_user_group_permission_to_repo( |
@@ -64,6 +65,8 b' class TestGetRepo(object):' | |||
|
64 | 65 | permissions = expected_permissions(repo) |
|
65 | 66 | |
|
66 | 67 | followers = [] |
|
68 | ||
|
69 | repo = RepoModel().get(repo_id) | |
|
67 | 70 | for user in repo.followers: |
|
68 | 71 | followers.append(user.user.get_api_data( |
|
69 | 72 | include_secrets=expect_secrets)) |
@@ -84,6 +87,7 b' class TestGetRepo(object):' | |||
|
84 | 87 | # TODO: Depending on which tests are running before this one, we |
|
85 | 88 | # start with a different number of permissions in the database. |
|
86 | 89 | repo = RepoModel().get_by_repo_name(backend.repo_name) |
|
90 | repo_id = repo.repo_id | |
|
87 | 91 | permission_count = len(repo.repo_to_perm) |
|
88 | 92 | |
|
89 | 93 | RepoModel().grant_user_permission(repo=backend.repo_name, |
@@ -102,6 +106,8 b' class TestGetRepo(object):' | |||
|
102 | 106 | permissions = expected_permissions(repo) |
|
103 | 107 | |
|
104 | 108 | followers = [] |
|
109 | ||
|
110 | repo = RepoModel().get(repo_id) | |
|
105 | 111 | for user in repo.followers: |
|
106 | 112 | followers.append(user.user.get_api_data()) |
|
107 | 113 |
@@ -444,7 +444,7 b' def get_repo_nodes(request, apiuser, rep' | |||
|
444 | 444 | result: [ |
|
445 | 445 | { |
|
446 | 446 | "binary": false, |
|
447 | "content": "File line | |
|
447 | "content": "File line", | |
|
448 | 448 | "extension": "md", |
|
449 | 449 | "lines": 2, |
|
450 | 450 | "md5": "059fa5d29b19c0657e384749480f6422", |
@@ -529,6 +529,7 b' def get_repo_file(request, apiuser, repo' | |||
|
529 | 529 | :param cache: Use internal caches for fetching files. If disabled fetching |
|
530 | 530 | files is slower but more memory efficient |
|
531 | 531 | :type cache: Optional(bool) |
|
532 | ||
|
532 | 533 | Example output: |
|
533 | 534 | |
|
534 | 535 | .. code-block:: bash |
@@ -168,6 +168,28 b' class BaseAppView(object):' | |||
|
168 | 168 | from rhodecode.lib.base import attach_context_attributes |
|
169 | 169 | attach_context_attributes(c, self.request, self.request.user.user_id) |
|
170 | 170 | |
|
171 | c.is_super_admin = c.auth_user.is_admin | |
|
172 | ||
|
173 | c.can_create_repo = c.is_super_admin | |
|
174 | c.can_create_repo_group = c.is_super_admin | |
|
175 | c.can_create_user_group = c.is_super_admin | |
|
176 | ||
|
177 | c.is_delegated_admin = False | |
|
178 | ||
|
179 | if not c.auth_user.is_default and not c.is_super_admin: | |
|
180 | c.can_create_repo = h.HasPermissionAny('hg.create.repository')( | |
|
181 | user=self.request.user) | |
|
182 | repositories = c.auth_user.repositories_admin or c.can_create_repo | |
|
183 | ||
|
184 | c.can_create_repo_group = h.HasPermissionAny('hg.repogroup.create.true')( | |
|
185 | user=self.request.user) | |
|
186 | repository_groups = c.auth_user.repository_groups_admin or c.can_create_repo_group | |
|
187 | ||
|
188 | c.can_create_user_group = h.HasPermissionAny('hg.usergroup.create.true')( | |
|
189 | user=self.request.user) | |
|
190 | user_groups = c.auth_user.user_groups_admin or c.can_create_user_group | |
|
191 | # delegated admin can create, or manage some objects | |
|
192 | c.is_delegated_admin = repositories or repository_groups or user_groups | |
|
171 | 193 | return c |
|
172 | 194 | |
|
173 | 195 | def _get_template_context(self, tmpl_args, **kwargs): |
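The hunk above introduces a `c.is_delegated_admin` flag: a non-super-admin counts as a delegated admin when they can create, or already administer, at least one repository, repository group, or user group. A minimal standalone sketch of that rule, with plain booleans standing in for the permission lookups and ACL lists (the function name and signature are illustrative, not RhodeCode API):

```python
def is_delegated_admin(is_super_admin, can_create_repo, admin_repos,
                       can_create_repo_group, admin_repo_groups,
                       can_create_user_group, admin_user_groups):
    # super admins are handled by their own flag, never the delegated one
    if is_super_admin:
        return False
    # delegated admin can create, or already manages, some objects
    repositories = admin_repos or can_create_repo
    repository_groups = admin_repo_groups or can_create_repo_group
    user_groups = admin_user_groups or can_create_user_group
    return bool(repositories or repository_groups or user_groups)
```

The view code computes the same three intermediate booleans and ORs them together, so the menu can be shown to partial admins without granting them the full admin flag.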
@@ -215,12 +237,17 b' class RepoAppView(BaseAppView):' | |||
|
215 | 237 | c.rhodecode_db_repo = self.db_repo |
|
216 | 238 | c.repo_name = self.db_repo_name |
|
217 | 239 | c.repository_pull_requests = self.db_repo_pull_requests |
|
240 | c.repository_is_user_following = ScmModel().is_following_repo( | |
|
241 | self.db_repo_name, self._rhodecode_user.user_id) | |
|
218 | 242 | self.path_filter = PathFilter(None) |
|
219 | 243 | |
|
220 | 244 | c.repository_requirements_missing = {} |
|
221 | 245 | try: |
|
222 | 246 | self.rhodecode_vcs_repo = self.db_repo.scm_instance() |
|
223 | if self.rhodecode_vcs_repo: | |
|
247 | # NOTE(marcink): | |
|
248 | # comparison to None since if it's an object __bool__ is expensive to | |
|
249 | # calculate | |
|
250 | if self.rhodecode_vcs_repo is not None: | |
|
224 | 251 | path_perms = self.rhodecode_vcs_repo.get_path_permissions( |
|
225 | 252 | c.auth_user.username) |
|
226 | 253 | self.path_filter = PathFilter(path_perms) |
@@ -85,8 +85,6 b' class NavigationRegistry(object):' | |||
|
85 | 85 | 'admin_settings_hooks'), |
|
86 | 86 | NavEntry('search', _('Full Text Search'), |
|
87 | 87 | 'admin_settings_search'), |
|
88 | NavEntry('integrations', _('Integrations'), | |
|
89 | 'global_integrations_home'), | |
|
90 | 88 | NavEntry('system', _('System Info'), |
|
91 | 89 | 'admin_settings_system'), |
|
92 | 90 | NavEntry('exceptions', _('Exceptions Tracker'), |
@@ -424,6 +424,10 b' def admin_routes(config):' | |||
|
424 | 424 | pattern='/repo_groups') |
|
425 | 425 | |
|
426 | 426 | config.add_route( |
|
427 | name='repo_groups_data', | |
|
428 | pattern='/repo_groups_data') | |
|
429 | ||
|
430 | config.add_route( | |
|
427 | 431 | name='repo_group_new', |
|
428 | 432 | pattern='/repo_group/new') |
|
429 | 433 |
@@ -46,10 +46,10 b' def route_path(name, params=None, **kwar' | |||
|
46 | 46 | |
|
47 | 47 | class TestAdminMainView(TestController): |
|
48 | 48 | |
|
49 | def test_ | |
|
49 | def test_access_admin_home(self): | |
|
50 | 50 | self.log_user() |
|
51 | response = self.app.get(route_path('admin_home'), status= | |
|
52 | assert response.location.endswith('/audit_logs') | |
|
51 | response = self.app.get(route_path('admin_home'), status=200) | |
|
52 | response.mustcontain("Administration area") | |
|
53 | 53 | |
|
54 | 54 | def test_redirect_pull_request_view(self, view): |
|
55 | 55 | self.log_user() |
@@ -23,11 +23,11 b' import pytest' | |||
|
23 | 23 | |
|
24 | 24 | from rhodecode.apps._base import ADMIN_PREFIX |
|
25 | 25 | from rhodecode.lib import helpers as h |
|
26 | from rhodecode.model.db import Repository, UserRepoToPerm, User | |
|
26 | from rhodecode.model.db import Repository, UserRepoToPerm, User, RepoGroup | |
|
27 | 27 | from rhodecode.model.meta import Session |
|
28 | 28 | from rhodecode.model.repo_group import RepoGroupModel |
|
29 | 29 | from rhodecode.tests import ( |
|
30 | assert_session_flash, TEST_USER_REGULAR_LOGIN, TESTS_TMP_PATH | |
|
30 | assert_session_flash, TEST_USER_REGULAR_LOGIN, TESTS_TMP_PATH) | |
|
31 | 31 | from rhodecode.tests.fixture import Fixture |
|
32 | 32 | |
|
33 | 33 | fixture = Fixture() |
@@ -38,6 +38,7 b' def route_path(name, params=None, **kwar' | |||
|
38 | 38 | |
|
39 | 39 | base_url = { |
|
40 | 40 | 'repo_groups': ADMIN_PREFIX + '/repo_groups', |
|
41 | 'repo_groups_data': ADMIN_PREFIX + '/repo_groups_data', | |
|
41 | 42 | 'repo_group_new': ADMIN_PREFIX + '/repo_group/new', |
|
42 | 43 | 'repo_group_create': ADMIN_PREFIX + '/repo_group/create', |
|
43 | 44 | |
@@ -59,13 +60,30 b' def _get_permission_for_user(user, repo)' | |||
|
59 | 60 | |
|
60 | 61 | @pytest.mark.usefixtures("app") |
|
61 | 62 | class TestAdminRepositoryGroups(object): |
|
63 | ||
|
62 | 64 | def test_show_repo_groups(self, autologin_user): |
|
63 | | |
|
64 | response.mustcontain('data: []') | |
|
65 | self.app.get(route_path('repo_groups')) | |
|
66 | ||
|
67 | def test_show_repo_groups_data(self, autologin_user, xhr_header): | |
|
68 | response = self.app.get(route_path( | |
|
69 | 'repo_groups_data'), extra_environ=xhr_header) | |
|
70 | ||
|
71 | all_repo_groups = RepoGroup.query().count() | |
|
72 | assert response.json['recordsTotal'] == all_repo_groups | |
|
65 | 73 | |
|
66 | def test_show_repo_groups_ | |
|
74 | def test_show_repo_groups_data_filtered(self, autologin_user, xhr_header): | |
|
75 | response = self.app.get(route_path( | |
|
76 | 'repo_groups_data', params={'search[value]': 'empty_search'}), | |
|
77 | extra_environ=xhr_header) | |
|
78 | ||
|
79 | all_repo_groups = RepoGroup.query().count() | |
|
80 | assert response.json['recordsTotal'] == all_repo_groups | |
|
81 | assert response.json['recordsFiltered'] == 0 | |
|
82 | ||
|
83 | def test_show_repo_groups_after_creating_group(self, autologin_user, xhr_header): | |
|
67 | 84 | fixture.create_repo_group('test_repo_group') |
|
68 | response = self.app.get(route_path( | |
|
85 | response = self.app.get(route_path( | |
|
86 | 'repo_groups_data'), extra_environ=xhr_header) | |
|
69 | 87 | response.mustcontain('"name_raw": "test_repo_group"') |
|
70 | 88 | fixture.destroy_repo_group('test_repo_group') |
|
71 | 89 |
@@ -214,8 +214,7 b' class TestAdminSettingsGlobal(object):' | |||
|
214 | 214 | }) |
|
215 | 215 | |
|
216 | 216 | response = response.follow() |
|
217 | response.mustcontain( | |
|
218 | """<div class="branding">- %s</div>""" % new_title) | |
|
217 | response.mustcontain(new_title) | |
|
219 | 218 | |
|
220 | 219 | def post_and_verify_settings(self, settings): |
|
221 | 220 | old_title = 'RhodeCode' |
@@ -173,4 +173,4 b' class TestAdminUsersSshKeysView(TestCont' | |||
|
173 | 173 | |
|
174 | 174 | response.mustcontain('Private key') |
|
175 | 175 | response.mustcontain('Public key') |
|
176 | response.mustcontain('-----BEGIN | |
|
176 | response.mustcontain('-----BEGIN PRIVATE KEY-----') |
@@ -50,7 +50,7 b' class ExceptionsTrackerView(BaseAppView)' | |||
|
50 | 50 | count +=1 |
|
51 | 51 | return count |
|
52 | 52 | |
|
53 | def get_all_exceptions(self, read_metadata=False, limit=None): | |
|
53 | def get_all_exceptions(self, read_metadata=False, limit=None, type_filter=None): | |
|
54 | 54 | exc_store_path = exc_tracking.get_exc_store() |
|
55 | 55 | exception_list = [] |
|
56 | 56 | |
@@ -59,7 +59,7 b' class ExceptionsTrackerView(BaseAppView)' | |||
|
59 | 59 | return val.split('_')[-1] |
|
60 | 60 | except Exception: |
|
61 | 61 | return 0 |
|
62 | count = 0 | |
|
62 | ||
|
63 | 63 | for fname in reversed(sorted(os.listdir(exc_store_path), key=key_sorter)): |
|
64 | 64 | |
|
65 | 65 | parts = fname.split('_', 2) |
@@ -83,13 +83,17 b' class ExceptionsTrackerView(BaseAppView)' | |||
|
83 | 83 | except Exception: |
|
84 | 84 | log.exception('Failed to read exc data from:{}'.format(full_path)) |
|
85 | 85 | pass |
|
86 | ||
|
87 | 86 | # convert our timestamp to a date obj, for nicer representation |
|
88 | 87 | exc['exc_utc_date'] = time_to_utcdatetime(exc['exc_timestamp']) |
|
89 | exception_list.append(exc) | |
|
90 | 88 | |
|
91 | count += 1 | |
|
92 | if limit and count >= limit: | |
|
89 | type_present = exc.get('exc_type') | |
|
90 | if type_filter: | |
|
91 | if type_present and type_present == type_filter: | |
|
92 | exception_list.append(exc) | |
|
93 | else: | |
|
94 | exception_list.append(exc) | |
|
95 | ||
|
96 | if limit and len(exception_list) >= limit: | |
|
93 | 97 | break |
|
94 | 98 | return exception_list |
|
95 | 99 | |
@@ -103,8 +107,10 b' class ExceptionsTrackerView(BaseAppView)' | |||
|
103 | 107 | c = self.load_default_context() |
|
104 | 108 | c.active = 'exceptions_browse' |
|
105 | 109 | c.limit = safe_int(self.request.GET.get('limit')) or 50 |
|
110 | c.type_filter = self.request.GET.get('type_filter') | |
|
106 | 111 | c.next_limit = c.limit + 50 |
|
107 | c.exception_list = self.get_all_exceptions( | |
|
112 | c.exception_list = self.get_all_exceptions( | |
|
113 | read_metadata=True, limit=c.limit, type_filter=c.type_filter) | |
|
108 | 114 | c.exception_list_count = self.count_all_exceptions() |
|
109 | 115 | c.exception_store_dir = exc_tracking.get_exc_store() |
|
110 | 116 | return self._get_template_context(c) |
@@ -132,12 +138,20 b' class ExceptionsTrackerView(BaseAppView)' | |||
|
132 | 138 | def exception_delete_all(self): |
|
133 | 139 | _ = self.request.translate |
|
134 | 140 | c = self.load_default_context() |
|
141 | type_filter = self.request.POST.get('type_filter') | |
|
135 | 142 | |
|
136 | 143 | c.active = 'exceptions' |
|
137 | all_exc = self.get_all_exceptions() | |
|
138 | exc_count = | |
|
144 | all_exc = self.get_all_exceptions(read_metadata=bool(type_filter), type_filter=type_filter) | |
|
145 | exc_count = 0 | |
|
146 | ||
|
139 | 147 | for exc in all_exc: |
|
140 | exc_tracking.delete_exception(exc['exc_id'], prefix=None) | |
|
148 | if type_filter: | |
|
149 | if exc.get('exc_type') == type_filter: | |
|
150 | exc_tracking.delete_exception(exc['exc_id'], prefix=None) | |
|
151 | exc_count += 1 | |
|
152 | else: | |
|
153 | exc_tracking.delete_exception(exc['exc_id'], prefix=None) | |
|
154 | exc_count += 1 | |
|
141 | 155 | |
|
142 | 156 | h.flash(_('Removed {} Exceptions').format(exc_count), category='success') |
|
143 | 157 | raise HTTPFound(h.route_path('admin_settings_exception_tracker')) |
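The `exception_delete_all` change above deletes either every stored exception or only those of one type, counting deletions as it goes. A self-contained sketch of that filter-and-count loop (the store entries and the delete callback are simplified stand-ins, not the actual `exc_tracking` API):

```python
def delete_exceptions(all_exc, delete_exception, type_filter=None):
    exc_count = 0
    for exc in all_exc:
        # with a filter, only entries of the matching type are removed
        if type_filter and exc.get('exc_type') != type_filter:
            continue
        delete_exception(exc['exc_id'])
        exc_count += 1
    return exc_count

removed = []
store = [{'exc_id': 1, 'exc_type': 'ValueError'},
         {'exc_id': 2, 'exc_type': 'KeyError'}]
print(delete_exceptions(store, removed.append, type_filter='KeyError'))  # 1
```

Counting actual deletions (rather than `len(all_exc)`, as before the change) keeps the flash message accurate when a type filter skips entries.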
@@ -20,12 +20,12 b'' | |||
|
20 | 20 | |
|
21 | 21 | import logging |
|
22 | 22 | |
|
23 | from pyramid.httpexceptions import HTTPFound | |
|
23 | from pyramid.httpexceptions import HTTPFound, HTTPNotFound | |
|
24 | 24 | from pyramid.view import view_config |
|
25 | 25 | |
|
26 | 26 | from rhodecode.apps._base import BaseAppView |
|
27 | 27 | from rhodecode.lib import helpers as h |
|
28 | from rhodecode.lib.auth import (LoginRequired, | |
|
28 | from rhodecode.lib.auth import (LoginRequired, NotAnonymous) | |
|
29 | 29 | from rhodecode.model.db import PullRequest |
|
30 | 30 | |
|
31 | 31 | |
@@ -33,14 +33,23 b' log = logging.getLogger(__name__)' | |||
|
33 | 33 | |
|
34 | 34 | |
|
35 | 35 | class AdminMainView(BaseAppView): |
|
36 | def load_default_context(self): | |
|
37 | c = self._get_local_tmpl_context() | |
|
38 | return c | |
|
36 | 39 | |
|
37 | 40 | @LoginRequired() |
|
38 | @HasPermissionAllDecorator('hg.admin') | |
|
41 | @NotAnonymous() | |
|
39 | 42 | @view_config( |
|
40 | route_name='admin_home', request_method='GET' | |
|
43 | route_name='admin_home', request_method='GET', | |
|
44 | renderer='rhodecode:templates/admin/main.mako') | |
|
41 | 45 | def admin_main(self): |
|
42 | # redirect _admin to audit logs... | |
|
43 | raise HTTPFound(h.route_path('admin_audit_logs')) | |
|
46 | c = self.load_default_context() | |
|
47 | c.active = 'admin' | |
|
48 | ||
|
49 | if not (c.is_super_admin or c.is_delegated_admin): | |
|
50 | raise HTTPNotFound() | |
|
51 | ||
|
52 | return self._get_template_context(c) | |
|
44 | 53 | |
|
45 | 54 | @LoginRequired() |
|
46 | 55 | @view_config(route_name='pull_requests_global_0', request_method='GET') |
@@ -49,8 +58,7 b' class AdminMainView(BaseAppView):' | |||
|
49 | 58 | def pull_requests(self): |
|
50 | 59 | """ |
|
51 | 60 | Global redirect for Pull Requests |
|
52 | ||
|
53 | :param pull_request_id: id of pull requests in the system | |
|
61 | pull_request_id: id of pull requests in the system | |
|
54 | 62 | """ |
|
55 | 63 | |
|
56 | 64 | pull_request = PullRequest.get_or_404( |
@@ -17,7 +17,7 b'' | |||
|
17 | 17 | # This program is dual-licensed. If you wish to learn more about the |
|
18 | 18 | # RhodeCode Enterprise Edition, including its added features, Support services, |
|
19 | 19 | # and proprietary license terms, please see https://rhodecode.com/licenses/ |
|
20 | ||
|
20 | import datetime | |
|
21 | 21 | import logging |
|
22 | 22 | import formencode |
|
23 | 23 | import formencode.htmlfill |
@@ -30,16 +30,16 b' from pyramid.response import Response' | |||
|
30 | 30 | from rhodecode import events |
|
31 | 31 | from rhodecode.apps._base import BaseAppView, DataGridAppView |
|
32 | 32 | |
|
33 | from rhodecode.lib.ext_json import json | |
|
34 | 33 | from rhodecode.lib.auth import ( |
|
35 | 34 | LoginRequired, CSRFRequired, NotAnonymous, |
|
36 | 35 | HasPermissionAny, HasRepoGroupPermissionAny) |
|
37 | 36 | from rhodecode.lib import helpers as h, audit_logger |
|
38 | from rhodecode.lib.utils2 import safe_int, safe_unicode | |
|
37 | from rhodecode.lib.utils2 import safe_int, safe_unicode, datetime_to_time | |
|
39 | 38 | from rhodecode.model.forms import RepoGroupForm |
|
40 | 39 | from rhodecode.model.repo_group import RepoGroupModel |
|
41 | 40 | from rhodecode.model.scm import RepoGroupList |
|
42 | from rhodecode.model.db import | |
|
41 | from rhodecode.model.db import ( | |
|
42 | or_, count, func, in_filter_generator, Session, RepoGroup, User, Repository) | |
|
43 | 43 | |
|
44 | 44 | log = logging.getLogger(__name__) |
|
45 | 45 | |
@@ -88,22 +88,168 b' class AdminRepoGroupsView(BaseAppView, D' | |||
|
88 | 88 | return False |
|
89 | 89 | return False |
|
90 | 90 | |
|
91 | # permission check in data loading of | |
|
92 | # `repo_group_list_data` via RepoGroupList | |
|
91 | 93 | @LoginRequired() |
|
92 | 94 | @NotAnonymous() |
|
93 | # perms check inside | |
|
94 | 95 | @view_config( |
|
95 | 96 | route_name='repo_groups', request_method='GET', |
|
96 | 97 | renderer='rhodecode:templates/admin/repo_groups/repo_groups.mako') |
|
97 | 98 | def repo_group_list(self): |
|
98 | 99 | c = self.load_default_context() |
|
100 | return self._get_template_context(c) | |
|
99 | 101 | |
|
100 | repo_group_list = RepoGroup.get_all_repo_groups() | |
|
101 | repo_group_list_acl = RepoGroupList( | |
|
102 | repo_group_list, perm_set=['group.admin']) | |
|
103 | repo_group_data = RepoGroupModel().get_repo_groups_as_dict( | |
|
104 | repo_group_list=repo_group_list_acl, admin=True) | |
|
105 | c.data = json.dumps(repo_group_data) | |
|
106 | return self._get_template_context(c) | |
|
102 | # permission check inside | |
|
103 | @LoginRequired() | |
|
104 | @NotAnonymous() | |
|
105 | @view_config( | |
|
106 | route_name='repo_groups_data', request_method='GET', | |
|
107 | renderer='json_ext', xhr=True) | |
|
108 | def repo_group_list_data(self): | |
|
109 | self.load_default_context() | |
|
110 | column_map = { | |
|
111 | 'name_raw': 'group_name_hash', | |
|
112 | 'desc': 'group_description', | |
|
113 | 'last_change_raw': 'updated_on', | |
|
114 | 'top_level_repos': 'repos_total', | |
|
115 | 'owner': 'user_username', | |
|
116 | } | |
|
117 | draw, start, limit = self._extract_chunk(self.request) | |
|
118 | search_q, order_by, order_dir = self._extract_ordering( | |
|
119 | self.request, column_map=column_map) | |
|
120 | ||
|
121 | _render = self.request.get_partial_renderer( | |
|
122 | 'rhodecode:templates/data_table/_dt_elements.mako') | |
|
123 | c = _render.get_call_context() | |
|
124 | ||
|
125 | def quick_menu(repo_group_name): | |
|
126 | return _render('quick_repo_group_menu', repo_group_name) | |
|
127 | ||
|
128 | def repo_group_lnk(repo_group_name): | |
|
129 | return _render('repo_group_name', repo_group_name) | |
|
130 | ||
|
131 | def last_change(last_change): | |
|
132 | if isinstance(last_change, datetime.datetime) and not last_change.tzinfo: | |
|
133 | delta = datetime.timedelta( | |
|
134 | seconds=(datetime.datetime.now() - datetime.datetime.utcnow()).seconds) | |
|
135 | last_change = last_change + delta | |
|
136 | return _render("last_change", last_change) | |
|
137 | ||
|
138 | def desc(desc, personal): | |
|
139 | return _render( | |
|
140 | 'repo_group_desc', desc, personal, c.visual.stylify_metatags) | |
|
141 | ||
|
142 | def repo_group_actions(repo_group_id, repo_group_name, gr_count): | |
|
143 | return _render( | |
|
144 | 'repo_group_actions', repo_group_id, repo_group_name, gr_count) | |
|
145 | ||
|
146 | def user_profile(username): | |
|
147 | return _render('user_profile', username) | |
|
148 | ||
|
149 | auth_repo_group_list = RepoGroupList( | |
|
150 | RepoGroup.query().all(), perm_set=['group.admin']) | |
|
151 | ||
|
152 | allowed_ids = [-1] | |
|
153 | for repo_group in auth_repo_group_list: | |
|
154 | allowed_ids.append(repo_group.group_id) | |
|
155 | ||
|
156 | repo_groups_data_total_count = RepoGroup.query()\ | |
|
157 | .filter(or_( | |
|
158 | # generate multiple IN to fix limitation problems | |
|
159 | *in_filter_generator(RepoGroup.group_id, allowed_ids) | |
|
160 | )) \ | |
|
161 | .count() | |
|
162 | ||
|
163 | repo_groups_data_total_inactive_count = RepoGroup.query()\ | |
|
164 | .filter(RepoGroup.group_id.in_(allowed_ids))\ | |
|
165 | .count() | |
|
166 | ||
|
167 | repo_count = count(Repository.repo_id) | |
|
168 | base_q = Session.query( | |
|
169 | RepoGroup.group_name, | |
|
170 | RepoGroup.group_name_hash, | |
|
171 | RepoGroup.group_description, | |
|
172 | RepoGroup.group_id, | |
|
173 | RepoGroup.personal, | |
|
174 | RepoGroup.updated_on, | |
|
175 | User, | |
|
176 | repo_count.label('repos_count') | |
|
177 | ) \ | |
|
178 | .filter(or_( | |
|
179 | # generate multiple IN to fix limitation problems | |
|
180 | *in_filter_generator(RepoGroup.group_id, allowed_ids) | |
|
181 | )) \ | |
|
182 | .outerjoin(Repository) \ | |
|
183 | .join(User, User.user_id == RepoGroup.user_id) \ | |
|
184 | .group_by(RepoGroup, User) | |
|
185 | ||
|
186 | if search_q: | |
|
187 | like_expression = u'%{}%'.format(safe_unicode(search_q)) | |
|
188 | base_q = base_q.filter(or_( | |
|
189 | RepoGroup.group_name.ilike(like_expression), | |
|
190 | )) | |
|
191 | ||
|
192 | repo_groups_data_total_filtered_count = base_q.count() | |
|
193 | # the inactive isn't really used, but we still make it same as other data grids | |
|
194 | # which use inactive (users,user groups) | |
|
195 | repo_groups_data_total_filtered_inactive_count = repo_groups_data_total_filtered_count | |
|
196 | ||
|
197 | sort_defined = False | |
|
198 | if order_by == 'group_name': | |
|
199 | sort_col = func.lower(RepoGroup.group_name) | |
|
200 | sort_defined = True | |
|
201 | elif order_by == 'repos_total': | |
|
202 | sort_col = repo_count | |
|
203 | sort_defined = True | |
|
204 | elif order_by == 'user_username': | |
|
205 | sort_col = User.username | |
|
206 | else: | |
|
207 | sort_col = getattr(RepoGroup, order_by, None) | |
|
208 | ||
|
209 | if sort_defined or sort_col: | |
|
210 | if order_dir == 'asc': | |
|
211 | sort_col = sort_col.asc() | |
|
212 | else: | |
|
213 | sort_col = sort_col.desc() | |
|
214 | ||
|
215 | base_q = base_q.order_by(sort_col) | |
|
216 | base_q = base_q.offset(start).limit(limit) | |
|
217 | ||
|
218 | # authenticated access to user groups | |
|
219 | auth_repo_group_list = base_q.all() | |
|
220 | ||
|
221 | repo_groups_data = [] | |
|
222 | for repo_gr in auth_repo_group_list: | |
|
223 | row = { | |
|
224 | "menu": quick_menu(repo_gr.group_name), | |
|
225 | "name": repo_group_lnk(repo_gr.group_name), | |
|
226 | "name_raw": repo_gr.group_name, | |
|
227 | "last_change": last_change(repo_gr.updated_on), | |
|
228 | "last_change_raw": datetime_to_time(repo_gr.updated_on), | |
|
229 | ||
|
230 | "last_changeset": "", | |
|
231 | "last_changeset_raw": "", | |
|
232 | ||
|
233 | "desc": desc(repo_gr.group_description, repo_gr.personal), | |
|
234 | "owner": user_profile(repo_gr.User.username), | |
|
235 | "top_level_repos": repo_gr.repos_count, | |
|
236 | "action": repo_group_actions( | |
|
237 | repo_gr.group_id, repo_gr.group_name, repo_gr.repos_count), | |
|
238 | ||
|
239 | } | |
|
240 | ||
|
241 | repo_groups_data.append(row) | |
|
242 | ||
|
243 | data = ({ | |
|
244 | 'draw': draw, | |
|
245 | 'data': repo_groups_data, | |
|
246 | 'recordsTotal': repo_groups_data_total_count, | |
|
247 | 'recordsTotalInactive': repo_groups_data_total_inactive_count, | |
|
248 | 'recordsFiltered': repo_groups_data_total_filtered_count, | |
|
249 | 'recordsFilteredInactive': repo_groups_data_total_filtered_inactive_count, | |
|
250 | }) | |
|
251 | ||
|
252 | return data | |
|
107 | 253 | |
|
108 | 254 | @LoginRequired() |
|
109 | 255 | @NotAnonymous() |
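The new `repo_groups_data` endpoint above follows the server-side DataTables contract: echo back `draw`, report total and filtered record counts, and return only the requested page. A toy sketch of that response shape over plain lists (the SQLAlchemy query, partial renderers, and permission filtering from the real view are omitted):

```python
def grid_page(rows, draw, start, limit, search=''):
    # filter first, then page, so recordsFiltered reflects the search
    filtered = [r for r in rows if search.lower() in r['name'].lower()]
    return {
        'draw': draw,                      # echoed so the client can drop stale responses
        'data': filtered[start:start + limit],
        'recordsTotal': len(rows),         # count before filtering
        'recordsFiltered': len(filtered),  # count after the search filter
    }
```

Moving this work server-side replaces the old pattern of dumping every group into the page as JSON, which is why the tests now hit a separate `repo_groups_data` route with an XHR header.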
@@ -39,7 +39,7 b' from rhodecode.model.forms import UserGr' | |||
|
39 | 39 | from rhodecode.model.permission import PermissionModel |
|
40 | 40 | from rhodecode.model.scm import UserGroupList |
|
41 | 41 | from rhodecode.model.db import ( |
|
42 | or_, count, User, UserGroup, UserGroupMember) | |
|
42 | or_, count, User, UserGroup, UserGroupMember, in_filter_generator) | |
|
43 | 43 | from rhodecode.model.meta import Session |
|
44 | 44 | from rhodecode.model.user_group import UserGroupModel |
|
45 | 45 | from rhodecode.model.db import true |
@@ -107,11 +107,17 b' class AdminUserGroupsView(BaseAppView, D' | |||
|
107 | 107 | allowed_ids.append(user_group.users_group_id) |
|
108 | 108 | |
|
109 | 109 | user_groups_data_total_count = UserGroup.query()\ |
|
110 | .filter(UserGroup.users_group_id.in_(allowed_ids))\ | |
|
110 | .filter(or_( | |
|
111 | # generate multiple IN to fix limitation problems | |
|
112 | *in_filter_generator(UserGroup.users_group_id, allowed_ids) | |
|
113 | ))\ | |
|
111 | 114 | .count() |
|
112 | 115 | |
|
113 | 116 | user_groups_data_total_inactive_count = UserGroup.query()\ |
|
114 | .filter(UserGroup.users_group_id.in_(allowed_ids))\ | |
|
117 | .filter(or_( | |
|
118 | # generate multiple IN to fix limitation problems | |
|
119 | *in_filter_generator(UserGroup.users_group_id, allowed_ids) | |
|
120 | ))\ | |
|
115 | 121 | .filter(UserGroup.users_group_active != true()).count() |
|
116 | 122 | |
|
117 | 123 | member_count = count(UserGroupMember.user_id) |
@@ -123,11 +129,14 b' class AdminUserGroupsView(BaseAppView, D' | |||
|
123 | 129 | UserGroup.group_data, |
|
124 | 130 | User, |
|
125 | 131 | member_count.label('member_count') |
|
126 | ) \ | |
|
127 | .filter(UserGroup.users_group_id.in_(allowed_ids)) \ | |
|
128 | .outerjoin(UserGroupMember) \ | |
|
129 | .join(User, User.user_id == UserGroup.user_id) \ | |
|
130 | .group_by(UserGroup, User) | |
|
132 | ) \ | |
|
133 | .filter(or_( | |
|
134 | # generate multiple IN to fix limitation problems | |
|
135 | *in_filter_generator(UserGroup.users_group_id, allowed_ids) | |
|
136 | )) \ | |
|
137 | .outerjoin(UserGroupMember) \ | |
|
138 | .join(User, User.user_id == UserGroup.user_id) \ | |
|
139 | .group_by(UserGroup, User) | |
|
131 | 140 | |
|
132 | 141 | base_q_inactive = base_q.filter(UserGroup.users_group_active != true()) |
|
133 | 142 | |
@@ -141,14 +150,16 b' class AdminUserGroupsView(BaseAppView, D' | |||
|
141 | 150 | user_groups_data_total_filtered_count = base_q.count() |
|
142 | 151 | user_groups_data_total_filtered_inactive_count = base_q_inactive.count() |
|
143 | 152 | |
|
153 | sort_defined = False | |
|
144 | 154 | if order_by == 'members_total': |
|
145 | 155 | sort_col = member_count |
|
156 | sort_defined = True | |
|
146 | 157 | elif order_by == 'user_username': |
|
147 | 158 | sort_col = User.username |
|
148 | 159 | else: |
|
149 | 160 | sort_col = getattr(UserGroup, order_by, None) |
|
150 | 161 | |
|
151 | if sort_col: | |
|
162 | if sort_defined or sort_col: | |
|
152 | 163 | if order_dir == 'asc': |
|
153 | 164 | sort_col = sort_col.asc() |
|
154 | 165 | else: |
@@ -162,7 +173,7 b' class AdminUserGroupsView(BaseAppView, D' | |||
|
162 | 173 | |
|
163 | 174 | user_groups_data = [] |
|
164 | 175 | for user_gr in auth_user_group_list: |
|
165 | user_groups_data.append({ | |
|
176 | row = { | |
|
166 | 177 | "users_group_name": user_group_name(user_gr.users_group_name), |
|
167 | 178 | "name_raw": h.escape(user_gr.users_group_name), |
|
168 | 179 | "description": h.escape(user_gr.user_group_description), |
@@ -175,7 +186,8 b' class AdminUserGroupsView(BaseAppView, D' | |||
|
175 | 186 | "owner": user_profile(user_gr.User.username), |
|
176 | 187 | "action": user_group_actions( |
|
177 | 188 | user_gr.users_group_id, user_gr.users_group_name) |
|
178 | }) | |
|
189 | } | |
|
190 | user_groups_data.append(row) | |
|
179 | 191 | |
|
180 | 192 | data = ({ |
|
181 | 193 | 'draw': draw, |
@@ -682,8 +682,7 b' class UsersView(UserAppView):' | |||
|
682 | 682 | if personal_repo_group: |
|
683 | 683 | raise HTTPFound(h.route_path('user_edit_advanced', user_id=user_id)) |
|
684 | 684 | |
|
685 | personal_repo_group_name = RepoGroupModel().get_personal_group_name( | |
|
686 | c.user) | |
|
685 | personal_repo_group_name = RepoGroupModel().get_personal_group_name(c.user) | |
|
687 | 686 | named_personal_group = RepoGroup.get_by_group_name( |
|
688 | 687 | personal_repo_group_name) |
|
689 | 688 | try: |
@@ -38,19 +38,18 b' class LocalFileStorage(object):' | |||
|
38 | 38 | """ |
|
39 | 39 | Resolves a unique name and the correct path. If a filename |
|
40 | 40 | for that path already exists then a numeric prefix with values > 0 will be |
|
41 | added, for example test.jpg -> test-1.jpg etc. | |
|
41 | added, for example test.jpg -> 1-test.jpg etc. initially file would have 0 prefix. | |
|
42 | 42 | |
|
43 | 43 | :param name: base name of file |
|
44 | 44 | :param directory: absolute directory path |
|
45 | 45 | """ |
|
46 | 46 | |
|
47 | basename, ext = os.path.splitext(name) | |
|
48 | 47 | counter = 0 |
|
49 | 48 | while True: |
|
50 | name = '%s-%d%s' % (basename, counter, ext) | |
|
49 | name = '%d-%s' % (counter, name) | |
|
51 | 50 | |
|
52 | 51 | # sub_store prefix to optimize disk usage, e.g some_path/ab/final_file |
|
53 | sub_store = cls._sub_store_from_filename(basename) | |
|
52 | sub_store = cls._sub_store_from_filename(name) | |
|
54 | 53 | sub_store_path = os.path.join(directory, sub_store) |
|
55 | 54 | if not os.path.exists(sub_store_path): |
|
56 | 55 | os.makedirs(sub_store_path) |
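The counter-prefix scheme from the updated docstring (`0-test.jpg`, then `1-test.jpg`, ...) can be sketched in isolation; the sub-store hashing from the diff is left out here, and `resolve_unique_name` is an illustrative helper, not the actual `LocalFileStorage` API:

```python
import os


def resolve_unique_name(name, directory):
    """Try '0-<name>', '1-<name>', ... until a free filename is found;
    the counter starts at 0, matching the docstring in the diff."""
    counter = 0
    while True:
        candidate = '%d-%s' % (counter, name)
        if not os.path.exists(os.path.join(directory, candidate)):
            return candidate
        counter += 1
```

Building each `candidate` from the original `name` avoids stacking prefixes (`1-0-test.jpg`) across iterations.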
@@ -209,3 +208,16 b' class LocalFileStorage(object):' | |||
|
209 | 208 | filename = os.path.join(directory, filename) |
|
210 | 209 | |
|
211 | 210 | return filename, metadata |
|
211 | ||
|
212 | def get_metadata(self, filename): | |
|
213 | """ | |
|
214 | Reads JSON stored metadata for a file | |
|
215 | ||
|
216 | :param filename: | |
|
217 | :return: | |
|
218 | """ | |
|
219 | filename = self.store_path(filename) | |
|
220 | filename_meta = filename + '.meta' | |
|
221 | ||
|
222 | with open(filename_meta, "rb") as source_meta: | |
|
223 | return json.loads(source_meta.read()) |
@@ -21,6 +21,7 b' import os' | |||
|
21 | 21 | import pytest |
|
22 | 22 | |
|
23 | 23 | from rhodecode.lib.ext_json import json |
|
24 | from rhodecode.model.db import Session, FileStore | |
|
24 | 25 | from rhodecode.tests import TestController |
|
25 | 26 | from rhodecode.apps.file_store import utils, config_keys |
|
26 | 27 | |
@@ -46,9 +47,12 b' class TestFileStoreViews(TestController)' | |||
|
46 | 47 | ('abcde-0.exe', "1234567", True), |
|
47 | 48 | ('abcde-0.jpg', "xxxxx", False), |
|
48 | 49 | ]) |
|
49 | def test_get_files_from_store(self, fid, content, exists, tmpdir): | |
|
50 | self.log_user() | |
|
50 | def test_get_files_from_store(self, fid, content, exists, tmpdir, user_util): | |
|
51 | user = self.log_user() | |
|
52 | user_id = user['user_id'] | |
|
53 | repo_id = user_util.create_repo().repo_id | |
|
51 | 54 | store_path = self.app._pyramid_settings[config_keys.store_path] |
|
55 | store_uid = fid | |
|
52 | 56 | |
|
53 | 57 | if exists: |
|
54 | 58 | status = 200 |
@@ -58,17 +62,28 b' class TestFileStoreViews(TestController)' | |||
|
58 | 62 | f.write(content) |
|
59 | 63 | |
|
60 | 64 | with open(filesystem_file, 'rb') as f: |
|
61 | store.save_file(f, fid) | |
|
65 | store_uid, metadata = store.save_file(f, fid, extra_metadata={'filename': fid}) | |
|
66 | ||
|
67 | entry = FileStore.create( | |
|
68 | file_uid=store_uid, filename=metadata["filename"], | |
|
69 | file_hash=metadata["sha256"], file_size=metadata["size"], | |
|
70 | file_display_name='file_display_name', | |
|
71 | file_description='repo artifact `{}`'.format(metadata["filename"]), | |
|
72 | check_acl=True, user_id=user_id, | |
|
73 | scope_repo_id=repo_id | |
|
74 | ) | |
|
75 | Session().add(entry) | |
|
76 | Session().commit() | |
|
62 | 77 | |
|
63 | 78 | else: |
|
64 | 79 | status = 404 |
|
65 | 80 | |
|
66 | response = self.app.get(route_path('download_file', fid=fid), status=status) | |
|
81 | response = self.app.get(route_path('download_file', fid=store_uid), status=status) | |
|
67 | 82 | |
|
68 | 83 | if exists: |
|
69 | 84 | assert response.text == content |
|
70 | file_store_path = os.path.dirname(store.resolve_name(fid, store_path)[1]) | |
|
71 | metadata_file = os.path.join(file_store_path, fid + '.meta') | |
|
85 | file_store_path = os.path.dirname(store.resolve_name(store_uid, store_path)[1]) | |
|
86 | metadata_file = os.path.join(file_store_path, store_uid + '.meta') | |
|
72 | 87 | assert os.path.exists(metadata_file) |
|
73 | 88 | with open(metadata_file, 'rb') as f: |
|
74 | 89 | json_data = json.loads(f.read()) |
@@ -19,9 +19,10 b'' | |||
|
19 | 19 | # and proprietary license terms, please see https://rhodecode.com/licenses/ |
|
20 | 20 | |
|
21 | 21 | |
|
22 | import os | |
|
23 | 22 | import uuid |
|
24 | 23 | |
|
24 | import pathlib2 | |
|
25 | ||
|
25 | 26 | |
|
26 | 27 | def get_file_storage(settings): |
|
27 | 28 | from rhodecode.apps.file_store.local_store import LocalFileStorage |
@@ -30,6 +31,11 b' def get_file_storage(settings):' | |||
|
30 | 31 | return LocalFileStorage(base_path=store_path) |
|
31 | 32 | |
|
32 | 33 | |
|
34 | def splitext(filename): | |
|
35 | ext = ''.join(pathlib2.Path(filename).suffixes) | |
|
36 | return filename, ext | |
|
37 | ||
|
38 | ||
|
33 | 39 | def uid_filename(filename, randomized=True): |
|
34 | 40 | """ |
|
35 | 41 | Generates a randomized or stable (uuid) filename, |
@@ -38,7 +44,8 b' def uid_filename(filename, randomized=Tr' | |||
|
38 | 44 | :param filename: the original filename |
|
39 | 45 | :param randomized: define if filename should be stable (sha1 based) or randomized |
|
40 | 46 | """ |
|
41 | _, ext = os.path.splitext(filename) | |
|
47 | ||
|
48 | _, ext = splitext(filename) | |
|
42 | 49 | if randomized: |
|
43 | 50 | uid = uuid.uuid4() |
|
44 | 51 | else: |
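The switch from `os.path.splitext` to joining `Path(...).suffixes` keeps compound extensions intact, which matters for artifact names such as `backup.tar.gz`. A sketch using the stdlib `pathlib` (the diff imports the `pathlib2` backport because this codebase runs on Python 2):

```python
import pathlib  # stands in for the `pathlib2` backport used in the diff


def splitext(filename):
    # 'backup.tar.gz' -> suffixes ['.tar', '.gz'] -> ext '.tar.gz',
    # where os.path.splitext would report only '.gz'
    ext = ''.join(pathlib.Path(filename).suffixes)
    return filename, ext
```

Note that `suffixes` treats every dot-separated tail as an extension, so names like `release1.2.3` yield `.2.3`; that trade-off is inherent to the approach.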
@@ -30,7 +30,7 b' from rhodecode.apps.file_store.exception' | |||
|
30 | 30 | |
|
31 | 31 | from rhodecode.lib import helpers as h |
|
32 | 32 | from rhodecode.lib import audit_logger |
|
33 | from rhodecode.lib.auth import (CSRFRequired, NotAnonymous) | |
|
33 | from rhodecode.lib.auth import (CSRFRequired, NotAnonymous, HasRepoPermissionAny, HasRepoGroupPermissionAny) | |
|
34 | 34 | from rhodecode.model.db import Session, FileStore |
|
35 | 35 | |
|
36 | 36 | log = logging.getLogger(__name__) |
@@ -68,7 +68,7 b' class FileStoreView(BaseAppView):' | |||
|
68 | 68 | 'user_id': self._rhodecode_user.user_id, |
|
69 | 69 | 'ip': self._rhodecode_user.ip_addr}} |
|
70 | 70 | try: |
|
71 | store_fid, metadata = self.storage.save_file( | |
|
71 | store_uid, metadata = self.storage.save_file( | |
|
72 | 72 | file_obj.file, filename, extra_metadata=metadata) |
|
73 | 73 | except FileNotAllowedException: |
|
74 | 74 | return {'store_fid': None, |
@@ -82,7 +82,7 b' class FileStoreView(BaseAppView):' | |||
|
82 | 82 | |
|
83 | 83 | try: |
|
84 | 84 | entry = FileStore.create( |
|
85 | file_uid=store_fid, filename=metadata["filename"], | |
|
85 | file_uid=store_uid, filename=metadata["filename"], | |
|
86 | 86 | file_hash=metadata["sha256"], file_size=metadata["size"], |
|
87 | 87 | file_description='upload attachment', |
|
88 | 88 | check_acl=False, user_id=self._rhodecode_user.user_id |
@@ -96,8 +96,8 b' class FileStoreView(BaseAppView):' | |||
|
96 | 96 | 'access_path': None, |
|
97 | 97 | 'error': 'File {} failed to store in DB.'.format(filename)} |
|
98 | 98 | |
|
99 | return {'store_fid': store_fid, | |
|
100 | 'access_path': h.route_path('download_file', fid=store_fid)} | |
|
99 | return {'store_fid': store_uid, | |
|
100 | 'access_path': h.route_path('download_file', fid=store_uid)} | |
|
101 | 101 | |
|
102 | 102 | @view_config(route_name='download_file') |
|
103 | 103 | def download_file(self): |
@@ -109,6 +109,35 b' class FileStoreView(BaseAppView):' | |||
|
109 | 109 | log.debug('File with FID:%s not found in the store', file_uid) |
|
110 | 110 | raise HTTPNotFound() |
|
111 | 111 | |
|
112 | db_obj = FileStore().query().filter(FileStore.file_uid == file_uid).scalar() | |
|
113 | if not db_obj: | |
|
114 | raise HTTPNotFound() | |
|
115 | ||
|
116 | # private upload for user | |
|
117 | if db_obj.check_acl and db_obj.scope_user_id: | |
|
118 | user = db_obj.user | |
|
119 | if self._rhodecode_db_user.user_id != user.user_id: | |
|
120 | log.warning('Access to file store object forbidden') | |
|
121 | raise HTTPNotFound() | |
|
122 | ||
|
123 | # scoped to repository permissions | |
|
124 | if db_obj.check_acl and db_obj.scope_repo_id: | |
|
125 | repo = db_obj.repo | |
|
126 | perm_set = ['repository.read', 'repository.write', 'repository.admin'] | |
|
127 | has_perm = HasRepoPermissionAny(*perm_set)(repo.repo_name, 'FileStore check') | |
|
128 | if not has_perm: | |
|
129 | log.warning('Access to file store object forbidden') | |
|
130 | raise HTTPNotFound() | |
|
131 | ||
|
132 | # scoped to repository group permissions | |
|
133 | if db_obj.check_acl and db_obj.scope_repo_group_id: | |
|
134 | repo_group = db_obj.repo_group | |
|
135 | perm_set = ['group.read', 'group.write', 'group.admin'] | |
|
136 | has_perm = HasRepoGroupPermissionAny(*perm_set)(repo_group.group_name, 'FileStore check') | |
|
137 | if not has_perm: | |
|
138 | log.warning('Access to file store object forbidden') | |
|
139 | raise HTTPNotFound() | |
|
140 | ||
|
112 | 141 | FileStore.bump_access_counter(file_uid) |
|
113 | 142 | |
|
114 | 143 | file_path = self.storage.store_path(file_uid) |
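The new `download_file` guards can be read as three independent scope checks on the `FileStore` row: user-scoped uploads are owner-only, repo-scoped uploads require repository read access, group-scoped uploads require group read access, and every failure surfaces as 404 rather than 403 so the object's existence is not leaked. A condensed sketch of that decision; the permission callables are stand-ins for `HasRepoPermissionAny` / `HasRepoGroupPermissionAny`, and `StoreObj` is an illustrative stand-in for the DB row:

```python
from collections import namedtuple

# illustrative stand-in for the FileStore row used by the view
StoreObj = namedtuple(
    'StoreObj', 'check_acl scope_user_id scope_repo_id scope_repo_group_id')


def acl_allows(obj, current_user_id, repo_read=lambda rid: False,
               group_read=lambda gid: False):
    """Return True when the current user may download the object."""
    if not obj.check_acl:
        return True  # ACL disabled: object is public
    if obj.scope_user_id and current_user_id != obj.scope_user_id:
        return False  # private upload, owner only
    if obj.scope_repo_id and not repo_read(obj.scope_repo_id):
        return False  # needs repository.read or better
    if obj.scope_repo_group_id and not group_read(obj.scope_repo_group_id):
        return False  # needs group.read or better
    return True
```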
@@ -25,7 +25,7 b' import formencode' | |||
|
25 | 25 | import formencode.htmlfill |
|
26 | 26 | import peppercorn |
|
27 | 27 | |
|
28 | from pyramid.httpexceptions import HTTPNotFound, HTTPFound | |
|
28 | from pyramid.httpexceptions import HTTPNotFound, HTTPFound, HTTPBadRequest | |
|
29 | 29 | from pyramid.view import view_config |
|
30 | 30 | from pyramid.renderers import render |
|
31 | 31 | from pyramid.response import Response |
@@ -67,7 +67,6 b' class GistView(BaseAppView):' | |||
|
67 | 67 | (Gist.ACL_LEVEL_PUBLIC, _("Can be accessed by anonymous users")) |
|
68 | 68 | ] |
|
69 | 69 | |
|
70 | ||
|
71 | 70 | return c |
|
72 | 71 | |
|
73 | 72 | @LoginRequired() |
@@ -296,6 +295,8 b' class GistView(BaseAppView):' | |||
|
296 | 295 | response = Response(content) |
|
297 | 296 | response.content_type = 'text/plain' |
|
298 | 297 | return response |
|
298 | elif return_format: | |
|
299 | raise HTTPBadRequest() | |
|
299 | 300 | |
|
300 | 301 | return self._get_template_context(c) |
|
301 | 302 |
@@ -68,6 +68,10 b' def includeme(config):' | |||
|
68 | 68 | pattern='/_markup_preview') |
|
69 | 69 | |
|
70 | 70 | config.add_route( |
|
71 | name='file_preview', | |
|
72 | pattern='/_file_preview') | |
|
73 | ||
|
74 | config.add_route( | |
|
71 | 75 | name='store_user_session_value', |
|
72 | 76 | pattern='/_store_session_attr') |
|
73 | 77 |
@@ -27,13 +27,14 b' from pyramid.view import view_config' | |||
|
27 | 27 | from rhodecode.apps._base import BaseAppView |
|
28 | 28 | from rhodecode.lib import helpers as h |
|
29 | 29 | from rhodecode.lib.auth import ( |
|
30 | LoginRequired, NotAnonymous, HasRepoGroupPermissionAnyDecorator, | |
|
31 | CSRFRequired) | |
|
30 | LoginRequired, NotAnonymous, HasRepoGroupPermissionAnyDecorator, CSRFRequired) | |
|
31 | from rhodecode.lib.codeblocks import filenode_as_lines_tokens | |
|
32 | 32 | from rhodecode.lib.index import searcher_from_config |
|
33 | 33 | from rhodecode.lib.utils2 import safe_unicode, str2bool, safe_int |
|
34 | 34 | from rhodecode.lib.ext_json import json |
|
35 | from rhodecode.lib.vcs.nodes import FileNode | |
|
35 | 36 | from rhodecode.model.db import ( |
|
36 | func, true, or_, in_filter_generator, Repository, RepoGroup, User, UserGroup) | |
|
37 | func, true, or_, case, in_filter_generator, Repository, RepoGroup, User, UserGroup) | |
|
37 | 38 | from rhodecode.model.repo import RepoModel |
|
38 | 39 | from rhodecode.model.repo_group import RepoGroupModel |
|
39 | 40 | from rhodecode.model.scm import RepoGroupList, RepoList |
@@ -105,21 +106,27 b' class HomeView(BaseAppView):' | |||
|
105 | 106 | |
|
106 | 107 | return {'suggestions': _user_groups} |
|
107 | 108 | |
|
108 | def _get_repo_list(self, name_contains=None, repo_type=None, limit=20): | |
|
109 | def _get_repo_list(self, name_contains=None, repo_type=None, repo_group_name='', limit=20): | |
|
109 | 110 | org_query = name_contains |
|
110 | 111 | allowed_ids = self._rhodecode_user.repo_acl_ids( |
|
111 | 112 | ['repository.read', 'repository.write', 'repository.admin'], |
|
112 | 113 | cache=False, name_filter=name_contains) or [-1] |
|
113 | 114 | |
|
114 | 115 | query = Repository.query()\ |
|
115 | .order_by(func.length(Repository.repo_name))\ | |
|
116 | .order_by(Repository.repo_name)\ | |
|
117 | 116 | .filter(Repository.archived.isnot(true()))\ |
|
118 | 117 | .filter(or_( |
|
119 | 118 | # generate multiple IN to fix limitation problems |
|
120 | 119 | *in_filter_generator(Repository.repo_id, allowed_ids) |
|
121 | 120 | )) |
|
122 | 121 | |
|
122 | query = query.order_by(case( | |
|
123 | [ | |
|
124 | (Repository.repo_name.startswith(repo_group_name), repo_group_name+'/'), | |
|
125 | ], | |
|
126 | )) | |
|
127 | query = query.order_by(func.length(Repository.repo_name)) | |
|
128 | query = query.order_by(Repository.repo_name) | |
|
129 | ||
|
123 | 130 | if repo_type: |
|
124 | 131 | query = query.filter(Repository.repo_type == repo_type) |
|
125 | 132 | |
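The added `case()` ordering floats repositories that live under the active group to the top before the pre-existing length/name sort. The same three-level priority, expressed as a plain Python sort key for illustration (this mirrors the intent, not the SQL that SQLAlchemy emits):

```python
def repo_sort_key(repo_name, repo_group_name=''):
    """Group-local repos first (the case() branch), then shorter
    names, then alphabetical order."""
    in_group = 0 if repo_group_name and repo_name.startswith(
        repo_group_name + '/') else 1
    return (in_group, len(repo_name), repo_name)


repos = ['other', 'tools/zz', 'a-repo', 'tools/a-long-name']
ordered = sorted(repos, key=lambda n: repo_sort_key(n, 'tools'))
# group-local entries sort first: 'tools/zz', 'tools/a-long-name', ...
```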
@@ -145,20 +152,26 b' class HomeView(BaseAppView):' | |||
|
145 | 152 | } |
|
146 | 153 | for obj in acl_iter] |
|
147 | 154 | |
|
148 | def _get_repo_group_list(self, name_contains=None, limit=20): | |
|
155 | def _get_repo_group_list(self, name_contains=None, repo_group_name='', limit=20): | |
|
149 | 156 | org_query = name_contains |
|
150 | 157 | allowed_ids = self._rhodecode_user.repo_group_acl_ids( |
|
151 | 158 | ['group.read', 'group.write', 'group.admin'], |
|
152 | 159 | cache=False, name_filter=name_contains) or [-1] |
|
153 | 160 | |
|
154 | 161 | query = RepoGroup.query()\ |
|
155 | .order_by(func.length(RepoGroup.group_name))\ | |
|
156 | .order_by(RepoGroup.group_name) \ | |
|
157 | 162 | .filter(or_( |
|
158 | 163 | # generate multiple IN to fix limitation problems |
|
159 | 164 | *in_filter_generator(RepoGroup.group_id, allowed_ids) |
|
160 | 165 | )) |
|
161 | 166 | |
|
167 | query = query.order_by(case( | |
|
168 | [ | |
|
169 | (RepoGroup.group_name.startswith(repo_group_name), repo_group_name+'/'), | |
|
170 | ], | |
|
171 | )) | |
|
172 | query = query.order_by(func.length(RepoGroup.group_name)) | |
|
173 | query = query.order_by(RepoGroup.group_name) | |
|
174 | ||
|
162 | 175 | if name_contains: |
|
163 | 176 | ilike_expression = u'%{}%'.format(safe_unicode(name_contains)) |
|
164 | 177 | query = query.filter( |
@@ -183,11 +196,17 b' class HomeView(BaseAppView):' | |||
|
183 | 196 | def _get_user_list(self, name_contains=None, limit=20): |
|
184 | 197 | org_query = name_contains |
|
185 | 198 | if not name_contains: |
|
186 | return [] | |
|
199 | return [], False | |
|
187 | 200 | |
|
188 | name_contains = re.compile('(?:user:)(.+)').findall(name_contains) | |
|
201 | # TODO(marcink): should all logged in users be allowed to search others? | |
|
202 | allowed_user_search = self._rhodecode_user.username != User.DEFAULT_USER | |
|
203 | if not allowed_user_search: | |
|
204 | return [], False | |
|
205 | ||
|
206 | name_contains = re.compile('(?:user:[ ]?)(.+)').findall(name_contains) | |
|
189 | 207 | if len(name_contains) != 1: |
|
190 | return [] | |
|
208 | return [], False | |
|
209 | ||
|
191 | 210 | name_contains = name_contains[0] |
|
192 | 211 | |
|
193 | 212 | query = User.query()\ |
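The regexes gain `[ ]?` so both `user:marcink` and `user: marcink` are recognized; a query only activates a specialized search when the prefix captures exactly one term. A small sketch of that parsing rule (`extract_prefixed_term` is a hypothetical helper, not part of the codebase):

```python
import re


def extract_prefixed_term(query, prefix):
    """Return the term after 'prefix:' (optional single space), or None
    when the prefix is absent or does not match exactly once."""
    matches = re.findall('(?:%s:[ ]?)(.+)' % re.escape(prefix), query)
    if len(matches) != 1:
        return None
    return matches[0]
```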
@@ -207,22 +226,28 b' class HomeView(BaseAppView):' | |||
|
207 | 226 | { |
|
208 | 227 | 'id': obj.user_id, |
|
209 | 228 | 'value': org_query, |
|
210 | 'value_display': obj.username, | |
|
229 | 'value_display': 'user: `{}`'.format(obj.username), | |
|
211 | 230 | 'type': 'user', |
|
212 | 231 | 'icon_link': h.gravatar_url(obj.email, 30), |
|
213 | 232 | 'url': h.route_path( |
|
214 | 233 | 'user_profile', username=obj.username) |
|
215 | 234 | } |
|
216 | for obj in acl_iter] | |
|
235 | for obj in acl_iter], True | |
|
217 | 236 | |
|
218 | 237 | def _get_user_groups_list(self, name_contains=None, limit=20): |
|
219 | 238 | org_query = name_contains |
|
220 | 239 | if not name_contains: |
|
221 | return [] | |
|
240 | return [], False | |
|
222 | 241 | |
|
223 | name_contains = re.compile('(?:user_group:)(.+)').findall(name_contains) | |
|
242 | # TODO(marcink): should all logged in users be allowed to search others? | |
|
243 | allowed_user_search = self._rhodecode_user.username != User.DEFAULT_USER | |
|
244 | if not allowed_user_search: | |
|
245 | return [], False | |
|
246 | ||
|
247 | name_contains = re.compile('(?:user_group:[ ]?)(.+)').findall(name_contains) | |
|
224 | 248 | if len(name_contains) != 1: |
|
225 | return [] | |
|
249 | return [], False | |
|
250 | ||
|
226 | 251 | name_contains = name_contains[0] |
|
227 | 252 | |
|
228 | 253 | query = UserGroup.query()\ |
@@ -241,27 +266,34 b' class HomeView(BaseAppView):' | |||
|
241 | 266 | { |
|
242 | 267 | 'id': obj.users_group_id, |
|
243 | 268 | 'value': org_query, |
|
244 | 'value_display': obj.users_group_name, | |
|
269 | 'value_display': 'user_group: `{}`'.format(obj.users_group_name), | |
|
245 | 270 | 'type': 'user_group', |
|
246 | 271 | 'url': h.route_path( |
|
247 | 272 | 'user_group_profile', user_group_name=obj.users_group_name) |
|
248 | 273 | } |
|
249 | for obj in acl_iter] | |
|
274 | for obj in acl_iter], True | |
|
250 | 275 | |
|
251 | def _get_hash_commit_list(self, auth_user, searcher, query): | |
|
276 | def _get_hash_commit_list(self, auth_user, searcher, query, repo=None, repo_group=None): | |
|
277 | repo_name = repo_group_name = None | |
|
278 | if repo: | |
|
279 | repo_name = repo.repo_name | |
|
280 | if repo_group: | |
|
281 | repo_group_name = repo_group.group_name | |
|
282 | ||
|
252 | 283 | org_query = query |
|
253 | 284 | if not query or len(query) < 3 or not searcher: |
|
254 | return [] | |
|
285 | return [], False | |
|
255 | 286 | |
|
256 | commit_hashes = re.compile('(?:commit:)([0-9a-f]{2,40})').findall(query) | |
|
287 | commit_hashes = re.compile('(?:commit:[ ]?)([0-9a-f]{2,40})').findall(query) | |
|
257 | 288 | |
|
258 | 289 | if len(commit_hashes) != 1: |
|
259 | return [] | |
|
290 | return [], False | |
|
291 | ||
|
260 | 292 | commit_hash = commit_hashes[0] |
|
261 | 293 | |
|
262 | 294 | result = searcher.search( |
|
263 | 295 | 'commit_id:{}*'.format(commit_hash), 'commit', auth_user, |
|
264 | raise_on_exc=False) | |
|
296 | repo_name, repo_group_name, raise_on_exc=False) | |
|
265 | 297 | |
|
266 | 298 | commits = [] |
|
267 | 299 | for entry in result['results']: |
@@ -286,7 +318,55 b' class HomeView(BaseAppView):' | |||
|
286 | 318 | } |
|
287 | 319 | |
|
288 | 320 | commits.append(commit_entry) |
|
289 | return commits | |
|
321 | return commits, True | |
|
322 | ||
|
323 | def _get_path_list(self, auth_user, searcher, query, repo=None, repo_group=None): | |
|
324 | repo_name = repo_group_name = None | |
|
325 | if repo: | |
|
326 | repo_name = repo.repo_name | |
|
327 | if repo_group: | |
|
328 | repo_group_name = repo_group.group_name | |
|
329 | ||
|
330 | org_query = query | |
|
331 | if not query or len(query) < 3 or not searcher: | |
|
332 | return [], False | |
|
333 | ||
|
334 | paths_re = re.compile('(?:file:[ ]?)(.+)').findall(query) | |
|
335 | if len(paths_re) != 1: | |
|
336 | return [], False | |
|
337 | ||
|
338 | file_path = paths_re[0] | |
|
339 | ||
|
340 | search_path = searcher.escape_specials(file_path) | |
|
341 | result = searcher.search( | |
|
342 | 'file.raw:*{}*'.format(search_path), 'path', auth_user, | |
|
343 | repo_name, repo_group_name, raise_on_exc=False) | |
|
344 | ||
|
345 | files = [] | |
|
346 | for entry in result['results']: | |
|
347 | repo_data = { | |
|
348 | 'repository_id': entry.get('repository_id'), | |
|
349 | 'repository_type': entry.get('repo_type'), | |
|
350 | 'repository_name': entry.get('repository'), | |
|
351 | } | |
|
352 | ||
|
353 | file_entry = { | |
|
354 | 'id': entry['commit_id'], | |
|
355 | 'value': org_query, | |
|
356 | 'value_display': '`{}` file: {}'.format( | |
|
357 | entry['repository'], entry['file']), | |
|
358 | 'type': 'file', | |
|
359 | 'repo': entry['repository'], | |
|
360 | 'repo_data': repo_data, | |
|
361 | ||
|
362 | 'url': h.route_path( | |
|
363 | 'repo_files', | |
|
364 | repo_name=entry['repository'], commit_id=entry['commit_id'], | |
|
365 | f_path=entry['file']) | |
|
366 | } | |
|
367 | ||
|
368 | files.append(file_entry) | |
|
369 | return files, True | |
|
290 | 370 | |
|
291 | 371 | @LoginRequired() |
|
292 | 372 | @view_config( |
@@ -363,36 +443,42 b' class HomeView(BaseAppView):' | |||
|
363 | 443 | def query_modifier(): |
|
364 | 444 | qry = query |
|
365 | 445 | return {'q': qry, 'type': 'content'} |
|
366 | label = u'File search for `{}` in this repository.'.format(query) | |
|
367 | queries.append( | |
|
368 | { | |
|
369 | 'id': -10, | |
|
370 | 'value': query, | |
|
371 | 'value_display': label, | |
|
372 | 'type': 'search', | |
|
373 | 'url': h.route_path('search_repo', | |
|
374 | repo_name=repo_name, | |
|
375 | _query=query_modifier()) | |
|
446 | ||
|
447 | label = u'File search for `{}`'.format(h.escape(query)) | |
|
448 | file_qry = { | |
|
449 | 'id': -10, | |
|
450 | 'value': query, | |
|
451 | 'value_display': label, | |
|
452 | 'type': 'search', | |
|
453 | 'subtype': 'repo', | |
|
454 | 'url': h.route_path('search_repo', | |
|
455 | repo_name=repo_name, | |
|
456 | _query=query_modifier()) | |
|
376 | 457 | } |
|
377 | ) | |
|
378 | 458 | |
|
379 | 459 | # commits |
|
380 | 460 | def query_modifier(): |
|
381 | 461 | qry = query |
|
382 | 462 | return {'q': qry, 'type': 'commit'} |
|
383 | 463 | |
|
384 | label = u'Commit search for `{}` in this repository.'.format(query) | |
|
385 | queries.append( | |
|
386 | { | |
|
387 | 'id': -20, | |
|
388 | 'value': query, | |
|
389 | 'value_display': label, | |
|
390 | 'type': 'search', | |
|
391 | 'url': h.route_path('search_repo', | |
|
392 | repo_name=repo_name, | |
|
393 | _query=query_modifier()) | |
|
464 | label = u'Commit search for `{}`'.format(h.escape(query)) | |
|
465 | commit_qry = { | |
|
466 | 'id': -20, | |
|
467 | 'value': query, | |
|
468 | 'value_display': label, | |
|
469 | 'type': 'search', | |
|
470 | 'subtype': 'repo', | |
|
471 | 'url': h.route_path('search_repo', | |
|
472 | repo_name=repo_name, | |
|
473 | _query=query_modifier()) | |
|
394 | 474 | } |
|
395 | ) | |
|
475 | ||
|
476 | if repo_context in ['commit', 'commits']: | |
|
477 | queries.extend([commit_qry, file_qry]) | |
|
478 | elif repo_context in ['files', 'summary']: | |
|
479 | queries.extend([file_qry, commit_qry]) | |
|
480 | else: | |
|
481 | queries.extend([commit_qry, file_qry]) | |
|
396 | 482 | |
|
397 | 483 | elif is_es_6 and repo_group_name: |
|
398 | 484 | # files |
@@ -400,37 +486,43 b' class HomeView(BaseAppView):' | |||
|
400 | 486 | qry = query |
|
401 | 487 | return {'q': qry, 'type': 'content'} |
|
402 | 488 | |
|
403 | label = u'File search for `{}` in this repository group.'.format(query) | |
|
404 | queries.append( | |
|
405 | { | |
|
406 | 'id': -30, | |
|
407 | 'value': query, | |
|
408 | 'value_display': label, | |
|
409 | 'type': 'search', | |
|
410 | 'url': h.route_path('search_repo_group', | |
|
411 | repo_group_name=repo_group_name, | |
|
412 | _query=query_modifier()) | |
|
489 | label = u'File search for `{}`'.format(query) | |
|
490 | file_qry = { | |
|
491 | 'id': -30, | |
|
492 | 'value': query, | |
|
493 | 'value_display': label, | |
|
494 | 'type': 'search', | |
|
495 | 'subtype': 'repo_group', | |
|
496 | 'url': h.route_path('search_repo_group', | |
|
497 | repo_group_name=repo_group_name, | |
|
498 | _query=query_modifier()) | |
|
413 | 499 | } |
|
414 | ) | |
|
415 | 500 | |
|
416 | 501 | # commits |
|
417 | 502 | def query_modifier(): |
|
418 | 503 | qry = query |
|
419 | 504 | return {'q': qry, 'type': 'commit'} |
|
420 | 505 | |
|
421 | label = u'Commit search for `{}` in this repository group.'.format(query) | |
|
422 | queries.append( | |
|
423 | { | |
|
424 | 'id': -40, | |
|
425 | 'value': query, | |
|
426 | 'value_display': label, | |
|
427 | 'type': 'search', | |
|
428 | 'url': h.route_path('search_repo_group', | |
|
429 | repo_group_name=repo_group_name, | |
|
430 | _query=query_modifier()) | |
|
506 | label = u'Commit search for `{}`'.format(query) | |
|
507 | commit_qry = { | |
|
508 | 'id': -40, | |
|
509 | 'value': query, | |
|
510 | 'value_display': label, | |
|
511 | 'type': 'search', | |
|
512 | 'subtype': 'repo_group', | |
|
513 | 'url': h.route_path('search_repo_group', | |
|
514 | repo_group_name=repo_group_name, | |
|
515 | _query=query_modifier()) | |
|
431 | 516 | } |
|
432 | ) | |
|
433 | 517 | |
|
518 | if repo_context in ['commit', 'commits']: | |
|
519 | queries.extend([commit_qry, file_qry]) | |
|
520 | elif repo_context in ['files', 'summary']: | |
|
521 | queries.extend([file_qry, commit_qry]) | |
|
522 | else: | |
|
523 | queries.extend([commit_qry, file_qry]) | |
|
524 | ||
|
525 | # Global, not scoped | |
|
434 | 526 | if not queries: |
|
435 | 527 | queries.append( |
|
436 | 528 | { |
@@ -438,6 +530,7 b' class HomeView(BaseAppView):' | |||
|
438 | 530 | 'value': query, |
|
439 | 531 | 'value_display': u'File search for: `{}`'.format(query), |
|
440 | 532 | 'type': 'search', |
|
533 | 'subtype': 'global', | |
|
441 | 534 | 'url': h.route_path('search', |
|
442 | 535 | _query={'q': query, 'type': 'content'}) |
|
443 | 536 | }) |
@@ -447,6 +540,7 b' class HomeView(BaseAppView):' | |||
|
447 | 540 | 'value': query, |
|
448 | 541 | 'value_display': u'Commit search for: `{}`'.format(query), |
|
449 | 542 | 'type': 'search', |
|
543 | 'subtype': 'global', | |
|
450 | 544 | 'url': h.route_path('search', |
|
451 | 545 | _query={'q': query, 'type': 'commit'}) |
|
452 | 546 | }) |
@@ -469,54 +563,107 b' class HomeView(BaseAppView):' | |||
|
469 | 563 | if not query: |
|
470 | 564 | return {'suggestions': res} |
|
471 | 565 | |
|
566 | def no_match(name): | |
|
567 | return { | |
|
568 | 'id': -1, | |
|
569 | 'value': "", | |
|
570 | 'value_display': name, | |
|
571 | 'type': 'text', | |
|
572 | 'url': "" | |
|
573 | } | |
|
472 | 574 | searcher = searcher_from_config(self.request.registry.settings) |
|
473 | for _q in self._get_default_search_queries(self.request.GET, searcher, query): | |
|
474 | res.append(_q) | |
|
575 | has_specialized_search = False | |
|
475 | 576 | |
|
577 | # set repo context | |
|
578 | repo = None | |
|
579 | repo_id = safe_int(self.request.GET.get('search_context[repo_id]')) | |
|
580 | if repo_id: | |
|
581 | repo = Repository.get(repo_id) | |
|
582 | ||
|
583 | # set group context | |
|
584 | repo_group = None | |
|
476 | 585 | repo_group_id = safe_int(self.request.GET.get('search_context[repo_group_id]')) |
|
477 | 586 | if repo_group_id: |
|
478 | 587 | repo_group = RepoGroup.get(repo_group_id) |
|
479 | composed_hint = '{}/{}'.format(repo_group.group_name, query) | |
|
480 | show_hint = not query.startswith(repo_group.group_name) | |
|
481 | if repo_group and show_hint: | |
|
482 | hint = u'Repository search inside: `{}`'.format(composed_hint) | |
|
483 | res.append({ | |
|
484 | 'id': -1, | |
|
485 | 'value': composed_hint, | |
|
486 | 'value_display': hint, | |
|
487 | 'type': 'hint', | |
|
488 | 'url': "" | |
|
489 | }) | |
|
588 | prefix_match = False | |
|
589 | ||
|
590 | # user: type search | |
|
591 | if not prefix_match: | |
|
592 | users, prefix_match = self._get_user_list(query) | |
|
593 | if users: | |
|
594 | has_specialized_search = True | |
|
595 | for serialized_user in users: | |
|
596 | res.append(serialized_user) | |
|
597 | elif prefix_match: | |
|
598 | has_specialized_search = True | |
|
599 | res.append(no_match('No matching users found')) | |
|
490 | 600 | |
|
491 | repo_groups = self._get_repo_group_list(query) | |
|
492 | for serialized_repo_group in repo_groups: | |
|
493 | res.append(serialized_repo_group) | |
|
601 | # user_group: type search | |
|
602 | if not prefix_match: | |
|
603 | user_groups, prefix_match = self._get_user_groups_list(query) | |
|
604 | if user_groups: | |
|
605 | has_specialized_search = True | |
|
606 | for serialized_user_group in user_groups: | |
|
607 | res.append(serialized_user_group) | |
|
608 | elif prefix_match: | |
|
609 | has_specialized_search = True | |
|
610 | res.append(no_match('No matching user groups found')) | |
|
494 | 611 | |
|
495 | repos = self._get_repo_list(query) | |
|
496 | for serialized_repo in repos: | |
|
497 | res.append(serialized_repo) | |
|
612 | # FTS commit: type search | |
|
613 | if not prefix_match: | |
|
614 | commits, prefix_match = self._get_hash_commit_list( | |
|
615 | c.auth_user, searcher, query, repo, repo_group) | |
|
616 | if commits: | |
|
617 | has_specialized_search = True | |
|
618 | unique_repos = collections.OrderedDict() | |
|
619 | for commit in commits: | |
|
620 | repo_name = commit['repo'] | |
|
621 | unique_repos.setdefault(repo_name, []).append(commit) | |
|
498 | 622 | |
|
499 | # TODO(marcink): should all logged in users be allowed to search others? | |
|
500 | allowed_user_search = self._rhodecode_user.username != User.DEFAULT_USER | |
|
501 | if allowed_user_search: | |
|
502 | users = self._get_user_list(query) | |
|
503 | for serialized_user in users: | |
|
504 | res.append(serialized_user) | |
|
623 | for _repo, commits in unique_repos.items(): | |
|
624 | for commit in commits: | |
|
625 | res.append(commit) | |
|
626 | elif prefix_match: | |
|
627 | has_specialized_search = True | |
|
628 | res.append(no_match('No matching commits found')) | |
|
505 | 629 | |
|
506 | user_groups = self._get_user_groups_list(query) | |
|
507 | for serialized_user_group in user_groups: | |
|
508 | res.append(serialized_user_group) | |
|
630 | # FTS file: type search | |
|
631 | if not prefix_match: | |
|
632 | paths, prefix_match = self._get_path_list( | |
|
633 | c.auth_user, searcher, query, repo, repo_group) | |
|
634 | if paths: | |
|
635 | has_specialized_search = True | |
|
636 | unique_repos = collections.OrderedDict() | |
|
637 | for path in paths: | |
|
638 | repo_name = path['repo'] | |
|
639 | unique_repos.setdefault(repo_name, []).append(path) | |
|
509 | 640 | |
|
510 | commits = self._get_hash_commit_list(c.auth_user, searcher, query) | |
|
511 | if commits: | |
|
512 | unique_repos = collections.OrderedDict() | |
|
513 | for commit in commits: | |
|
514 | repo_name = commit['repo'] | |
|
515 | unique_repos.setdefault(repo_name, []).append(commit) | |
|
641 | for repo, paths in unique_repos.items(): | |
|
642 | for path in paths: | |
|
643 | res.append(path) | |
|
644 | elif prefix_match: | |
|
645 | has_specialized_search = True | |
|
646 | res.append(no_match('No matching files found')) | |
|
647 | ||
|
648 | # main suggestions | |
|
649 | if not has_specialized_search: | |
|
650 | repo_group_name = '' | |
|
651 | if repo_group: | |
|
652 | repo_group_name = repo_group.group_name | |
|
516 | 653 | |
|
517 | for repo, commits in unique_repos.items(): | |
|
518 | for commit in commits: | |
|
519 | res.append(commit) | |
|
654 | for _q in self._get_default_search_queries(self.request.GET, searcher, query): | |
|
655 | res.append(_q) | |
|
656 | ||
|
657 | repo_groups = self._get_repo_group_list(query, repo_group_name=repo_group_name) | |
|
658 | for serialized_repo_group in repo_groups: | |
|
659 | res.append(serialized_repo_group) | |
|
660 | ||
|
661 | repos = self._get_repo_list(query, repo_group_name=repo_group_name) | |
|
662 | for serialized_repo in repos: | |
|
663 | res.append(serialized_repo) | |
|
664 | ||
|
665 | if not repos and not repo_groups: | |
|
666 | res.append(no_match('No matches found')) | |
|
520 | 667 | |
|
521 | 668 | return {'suggestions': res} |
|
522 | 669 | |
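Both specialized branches in the hunk above collapse their hits into per-repository buckets with `collections.OrderedDict.setdefault` before flattening them back into `res`. A minimal, self-contained sketch of that grouping idiom (the item dicts here are hypothetical):

```python
import collections

def group_by_repo(items):
    """Group search-result dicts by their 'repo' key, preserving
    first-seen repository order, as the view code above does."""
    unique_repos = collections.OrderedDict()
    for item in items:
        unique_repos.setdefault(item['repo'], []).append(item)
    return unique_repos

paths = [
    {'repo': 'docs', 'path': 'index.rst'},
    {'repo': 'vcs', 'path': 'nodes.py'},
    {'repo': 'docs', 'path': 'conf.py'},
]
grouped = group_by_repo(paths)
```

The `OrderedDict` keeps suggestion groups in the order repositories were first encountered, which a plain pre-3.7 `dict` would not guarantee.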
@@ -564,8 +711,11 b' class HomeView(BaseAppView):' | |||
|
564 | 711 | def repo_group_main_page(self): |
|
565 | 712 | c = self.load_default_context() |
|
566 | 713 | c.repo_group = self.request.db_repo_group |
|
567 | repo_data, repo_group_data = self._get_groups_and_repos( | |
|
568 | c.repo_group.group_id) | |
|
714 | repo_data, repo_group_data = self._get_groups_and_repos(c.repo_group.group_id) | |
|
715 | ||
|
716 | # update every 5 min | |
|
717 | if self.request.db_repo_group.last_commit_cache_update_diff > 60 * 5: | |
|
718 | self.request.db_repo_group.update_commit_cache() | |
|
569 | 719 | |
|
570 | 720 | # json used to render the grids |
|
571 | 721 | c.repos_data = json.dumps(repo_data) |
@@ -594,6 +744,34 b' class HomeView(BaseAppView):' | |||
|
594 | 744 | @LoginRequired() |
|
595 | 745 | @CSRFRequired() |
|
596 | 746 | @view_config( |
|
747 | route_name='file_preview', request_method='POST', | |
|
748 | renderer='string', xhr=True) | |
|
749 | def file_preview(self): | |
|
750 | # Technically a CSRF token is not needed as no state changes with this | |
|
751 | # call. However, as this is a POST is better to have it, so automated | |
|
752 | # tools don't flag it as potential CSRF. | |
|
753 | # Post is required because the payload could be bigger than the maximum | |
|
754 | # allowed by GET. | |
|
755 | ||
|
756 | text = self.request.POST.get('text') | |
|
757 | file_path = self.request.POST.get('file_path') | |
|
758 | ||
|
759 | renderer = h.renderer_from_filename(file_path) | |
|
760 | ||
|
761 | if renderer: | |
|
762 | return h.render(text, renderer=renderer, mentions=True) | |
|
763 | else: | |
|
764 | self.load_default_context() | |
|
765 | _render = self.request.get_partial_renderer( | |
|
766 | 'rhodecode:templates/files/file_content.mako') | |
|
767 | ||
|
768 | lines = filenode_as_lines_tokens(FileNode(file_path, text)) | |
|
769 | ||
|
770 | return _render('render_lines', lines) | |
|
771 | ||
|
772 | @LoginRequired() | |
|
773 | @CSRFRequired() | |
|
774 | @view_config( | |
|
597 | 775 | route_name='store_user_session_value', request_method='POST', |
|
598 | 776 | renderer='string', xhr=True) |
|
599 | 777 | def store_user_session_attr(self): |
@@ -604,4 +782,4 b' class HomeView(BaseAppView):' | |||
|
604 | 782 | if existing_value != val: |
|
605 | 783 | self.request.session[key] = val |
|
606 | 784 | |
|
607 | return 'stored:{}'.format(key) | |
|
785 | return 'stored:{}:{}'.format(key, val) |
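The changed return line now echoes the stored value along with the key. The surrounding store-only-on-change logic reduces to the following sketch, with a plain dict standing in for the Pyramid session:

```python
def store_session_value(session, key, val):
    # Write the key only when the value actually changed, so an
    # unchanged POST does not dirty (and re-serialize) the session.
    existing_value = session.get(key)
    if existing_value != val:
        session[key] = val
    return 'stored:{}:{}'.format(key, val)

session = {}
reply = store_session_value(session, 'diffmode', 'sidebyside')
```

Echoing both key and value lets the client-side caller confirm what was persisted without a second round trip.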
@@ -31,12 +31,12 b' from pyramid.renderers import render' | |||
|
31 | 31 | |
|
32 | 32 | from rhodecode.apps._base import BaseAppView |
|
33 | 33 | from rhodecode.model.db import ( |
|
34 | or_, joinedload, UserLog, UserFollowing, User, UserApiKeys) | |
|
34 | or_, joinedload, Repository, UserLog, UserFollowing, User, UserApiKeys) | |
|
35 | 35 | from rhodecode.model.meta import Session |
|
36 | 36 | import rhodecode.lib.helpers as h |
|
37 | 37 | from rhodecode.lib.helpers import Page |
|
38 | 38 | from rhodecode.lib.user_log_filter import user_log_filter |
|
39 | from rhodecode.lib.auth import LoginRequired, NotAnonymous, CSRFRequired | |
|
39 | from rhodecode.lib.auth import LoginRequired, NotAnonymous, CSRFRequired, HasRepoPermissionAny | |
|
40 | 40 | from rhodecode.lib.utils2 import safe_int, AttributeDict, md5_safe |
|
41 | 41 | from rhodecode.model.scm import ScmModel |
|
42 | 42 | |
@@ -153,7 +153,7 b' class JournalView(BaseAppView):' | |||
|
153 | 153 | desc = action_extra() |
|
154 | 154 | _url = h.route_url('home') |
|
155 | 155 | if entry.repository is not None: |
|
156 | _url = h.route_url('repo_changelog', |
|
|
156 | _url = h.route_url('repo_commits', | |
|
157 | 157 | repo_name=entry.repository.repo_name) |
|
158 | 158 | |
|
159 | 159 | feed.add_item( |
@@ -199,7 +199,7 b' class JournalView(BaseAppView):' | |||
|
199 | 199 | desc = action_extra() |
|
200 | 200 | _url = h.route_url('home') |
|
201 | 201 | if entry.repository is not None: |
|
202 | _url = h.route_url('repo_changelog', |
|
|
202 | _url = h.route_url('repo_commits', | |
|
203 | 203 | repo_name=entry.repository.repo_name) |
|
204 | 204 | |
|
205 | 205 | feed.add_item( |
@@ -297,18 +297,19 b' class JournalView(BaseAppView):' | |||
|
297 | 297 | user_id = self.request.POST.get('follows_user_id') |
|
298 | 298 | if user_id: |
|
299 | 299 | try: |
|
300 | ScmModel().toggle_following_user( | |
|
301 | user_id, self._rhodecode_user.user_id) | |
|
300 | ScmModel().toggle_following_user(user_id, self._rhodecode_user.user_id) | |
|
302 | 301 | Session().commit() |
|
303 | 302 | return 'ok' |
|
304 | 303 | except Exception: |
|
305 | 304 | raise HTTPBadRequest() |
|
306 | 305 | |
|
307 | 306 | repo_id = self.request.POST.get('follows_repo_id') |
|
308 | if repo_id: | |
|
307 | repo = Repository.get_or_404(repo_id) | |
|
308 | perm_set = ['repository.read', 'repository.write', 'repository.admin'] | |
|
309 | has_perm = HasRepoPermissionAny(*perm_set)(repo.repo_name, 'RepoWatch check') | |
|
310 | if repo and has_perm: | |
|
309 | 311 | try: |
|
310 | ScmModel().toggle_following_repo( | |
|
311 | repo_id, self._rhodecode_user.user_id) | |
|
312 | ScmModel().toggle_following_repo(repo_id, self._rhodecode_user.user_id) | |
|
312 | 313 | Session().commit() |
|
313 | 314 | return 'ok' |
|
314 | 315 | except Exception: |
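The new guard in this hunk resolves the repository first and toggles the follow flag only when the caller holds at least read permission. A schematic version of that gate, with plain dicts standing in for RhodeCode's permission checker and follow state (the names here are illustrative, not the real API):

```python
# Permissions that are "read or better" on a repository.
READ_OR_BETTER = {'repository.read', 'repository.write', 'repository.admin'}

def toggle_follow(repo_perms, repo_name, following):
    """Flip the follow flag for repo_name iff the caller may read it."""
    if repo_perms.get(repo_name) not in READ_OR_BETTER:
        return 'bad request'
    following[repo_name] = not following.get(repo_name, False)
    return 'ok'

perms = {'public-repo': 'repository.read'}
following = {}
result_ok = toggle_follow(perms, 'public-repo', following)
result_denied = toggle_follow(perms, 'hidden-repo', following)
```

Checking permission before the toggle closes the hole where any authenticated user could follow (and thus probe the existence of) a repository they cannot see.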
@@ -160,4 +160,4 b' class TestMyAccountSshKeysView(TestContr' | |||
|
160 | 160 | |
|
161 | 161 | response.mustcontain('Private key') |
|
162 | 162 | response.mustcontain('Public key') |
|
163 | response.mustcontain('-----BEGIN RSA PRIVATE KEY-----') |
|
|
163 | response.mustcontain('-----BEGIN PRIVATE KEY-----') |
@@ -360,7 +360,7 b' class MyAccountView(BaseAppView, DataGri' | |||
|
360 | 360 | 'repository.read', 'repository.write', 'repository.admin']) |
|
361 | 361 | |
|
362 | 362 | repos_data = RepoModel().get_repos_as_dict( |
|
363 | repo_list=repo_list, admin=admin) | |
|
363 | repo_list=repo_list, admin=admin, short_name=False) | |
|
364 | 364 | # json used to render the grid |
|
365 | 365 | return json.dumps(repos_data) |
|
366 | 366 | |
@@ -423,7 +423,7 b' class MyAccountView(BaseAppView, DataGri' | |||
|
423 | 423 | default_redirect_url = '' |
|
424 | 424 | |
|
425 | 425 | # save repo |
|
426 | if entry.get('bookmark_repo'): | |
|
426 | if entry.get('bookmark_repo') and safe_int(entry.get('bookmark_repo')): | |
|
427 | 427 | repo = Repository.get(entry['bookmark_repo']) |
|
428 | 428 | perm_check = HasRepoPermissionAny( |
|
429 | 429 | 'repository.read', 'repository.write', 'repository.admin') |
@@ -432,7 +432,7 b' class MyAccountView(BaseAppView, DataGri' | |||
|
432 | 432 | should_save = True |
|
433 | 433 | default_redirect_url = '${repo_url}' |
|
434 | 434 | # save repo group |
|
435 | elif entry.get('bookmark_repo_group'): | |
|
435 | elif entry.get('bookmark_repo_group') and safe_int(entry.get('bookmark_repo_group')): | |
|
436 | 436 | repo_group = RepoGroup.get(entry['bookmark_repo_group']) |
|
437 | 437 | perm_check = HasRepoGroupPermissionAny( |
|
438 | 438 | 'group.read', 'group.write', 'group.admin') |
@@ -496,6 +496,7 b' class MyAccountView(BaseAppView, DataGri' | |||
|
496 | 496 | if not user_bookmark: |
|
497 | 497 | raise HTTPFound(redirect_url) |
|
498 | 498 | |
|
499 | # repository set | |
|
499 | 500 | if user_bookmark.repository: |
|
500 | 501 | repo_name = user_bookmark.repository.repo_name |
|
501 | 502 | base_redirect_url = h.route_path( |
@@ -506,7 +507,7 b' class MyAccountView(BaseAppView, DataGri' | |||
|
506 | 507 | .safe_substitute({'repo_url': base_redirect_url}) |
|
507 | 508 | else: |
|
508 | 509 | redirect_url = base_redirect_url |
|
509 | ||
|
510 | # repository group set | |
|
510 | 511 | elif user_bookmark.repository_group: |
|
511 | 512 | repo_group_name = user_bookmark.repository_group.group_name |
|
512 | 513 | base_redirect_url = h.route_path( |
@@ -517,9 +518,11 b' class MyAccountView(BaseAppView, DataGri' | |||
|
517 | 518 | .safe_substitute({'repo_group_url': base_redirect_url}) |
|
518 | 519 | else: |
|
519 | 520 | redirect_url = base_redirect_url |
|
520 | ||
|
521 | # custom URL set | |
|
521 | 522 | elif user_bookmark.redirect_url: |
|
522 | redirect_url = user_bookmark.redirect_url | |
|
523 | server_url = h.route_url('home').rstrip('/') | |
|
524 | redirect_url = string.Template(user_bookmark.redirect_url) \ | |
|
525 | .safe_substitute({'server_url': server_url}) | |
|
523 | 526 | |
|
524 | 527 | log.debug('Redirecting bookmark %s to %s', user_bookmark, redirect_url) |
|
525 | 528 | raise HTTPFound(redirect_url) |
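The new custom-URL branch feeds the bookmark through `string.Template.safe_substitute`, so a `${server_url}` placeholder expands while any placeholder it does not recognize survives verbatim instead of raising `KeyError`. For example (the URL is hypothetical):

```python
import string

# safe_substitute expands known placeholders and leaves unknown ones
# intact, unlike substitute(), which would raise KeyError on ${user}.
template = string.Template('${server_url}/gists?user=${user}')
redirect_url = template.safe_substitute(
    {'server_url': 'https://code.example.com'})
```

That forgiving behavior matters for user-supplied bookmark URLs, where stray `$`-sequences must not break the redirect.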
@@ -90,7 +90,7 b' def includeme(config):' | |||
|
90 | 90 | # Files |
|
91 | 91 | config.add_route( |
|
92 | 92 | name='repo_archivefile', |
|
93 | pattern='/{repo_name:.*?[^/]}/archive/{fname}', repo_route=True) | |
|
93 | pattern='/{repo_name:.*?[^/]}/archive/{fname:.*}', repo_route=True) | |
|
94 | 94 | |
|
95 | 95 | config.add_route( |
|
96 | 96 | name='repo_files_diff', |
@@ -172,6 +172,10 b' def includeme(config):' | |||
|
172 | 172 | pattern='/{repo_name:.*?[^/]}/add_file/{commit_id}/{f_path:.*}', |
|
173 | 173 | repo_route=True) |
|
174 | 174 | config.add_route( |
|
175 | name='repo_files_upload_file', | |
|
176 | pattern='/{repo_name:.*?[^/]}/upload_file/{commit_id}/{f_path:.*}', | |
|
177 | repo_route=True) | |
|
178 | config.add_route( | |
|
175 | 179 | name='repo_files_create_file', |
|
176 | 180 | pattern='/{repo_name:.*?[^/]}/create_file/{commit_id}/{f_path:.*}', |
|
177 | 181 | repo_route=True) |
@@ -189,19 +193,27 b' def includeme(config):' | |||
|
189 | 193 | name='repo_stats', |
|
190 | 194 | pattern='/{repo_name:.*?[^/]}/repo_stats/{commit_id}', repo_route=True) |
|
191 | 195 | |
|
192 | # Changelog | |
|
196 | # Commits | |
|
197 | config.add_route( | |
|
198 | name='repo_commits', | |
|
199 | pattern='/{repo_name:.*?[^/]}/commits', repo_route=True) | |
|
200 | config.add_route( | |
|
201 | name='repo_commits_file', | |
|
202 | pattern='/{repo_name:.*?[^/]}/commits/{commit_id}/{f_path:.*}', repo_route=True) | |
|
203 | config.add_route( | |
|
204 | name='repo_commits_elements', | |
|
205 | pattern='/{repo_name:.*?[^/]}/commits_elements', repo_route=True) | |
|
206 | config.add_route( | |
|
207 | name='repo_commits_elements_file', | |
|
208 | pattern='/{repo_name:.*?[^/]}/commits_elements/{commit_id}/{f_path:.*}', repo_route=True) | |
|
209 | ||
|
210 | # Changelog (old deprecated name for commits page) | |
|
193 | 211 | config.add_route( |
|
194 | 212 | name='repo_changelog', |
|
195 | 213 | pattern='/{repo_name:.*?[^/]}/changelog', repo_route=True) |
|
196 | 214 | config.add_route( |
|
197 | 215 | name='repo_changelog_file', |
|
198 | 216 | pattern='/{repo_name:.*?[^/]}/changelog/{commit_id}/{f_path:.*}', repo_route=True) |
|
199 | config.add_route( | |
|
200 | name='repo_changelog_elements', | |
|
201 | pattern='/{repo_name:.*?[^/]}/changelog_elements', repo_route=True) | |
|
202 | config.add_route( | |
|
203 | name='repo_changelog_elements_file', | |
|
204 | pattern='/{repo_name:.*?[^/]}/changelog_elements/{commit_id}/{f_path:.*}', repo_route=True) | |
|
205 | 217 | |
|
206 | 218 | # Compare |
|
207 | 219 | config.add_route( |
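The hunk above introduces the `repo_commits*` routes while keeping `repo_changelog` and `repo_changelog_file` as deprecated aliases (and dropping the old `repo_changelog_elements*` routes). One way to sketch the alias resolution implied by that layout (the mapping is an assumption drawn from the route names, not RhodeCode code):

```python
# Old changelog route names that now have canonical 'commits' equivalents.
DEPRECATED_ROUTES = {
    'repo_changelog': 'repo_commits',
    'repo_changelog_file': 'repo_commits_file',
}

def canonical_route(name):
    """Map a possibly-deprecated route name to its replacement."""
    return DEPRECATED_ROUTES.get(name, name)

resolved = canonical_route('repo_changelog')
```

Keeping the deprecated names registered means old bookmarks and external links keep resolving while new code targets the `repo_commits*` names.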
@@ -312,6 +324,11 b' def includeme(config):' | |||
|
312 | 324 | pattern='/{repo_name:.*?[^/]}/pull-request/{pull_request_id:\d+}/comment/{comment_id}/delete', |
|
313 | 325 | repo_route=True, repo_accepted_types=['hg', 'git']) |
|
314 | 326 | |
|
327 | # Artifacts, (EE feature) | |
|
328 | config.add_route( | |
|
329 | name='repo_artifacts_list', | |
|
330 | pattern='/{repo_name:.*?[^/]}/artifacts', repo_route=True) | |
|
331 | ||
|
315 | 332 | # Settings |
|
316 | 333 | config.add_route( |
|
317 | 334 | name='edit_repo', |
@@ -40,11 +40,11 b' def route_path(name, params=None, **kwar' | |||
|
40 | 40 | class TestPullRequestList(object): |
|
41 | 41 | |
|
42 | 42 | @pytest.mark.parametrize('params, expected_title', [ |
|
43 | ({'source': 0, 'closed': 1}, 'Closed |

44 | ({'source': 0, 'my': 1}, ' |

45 | ({'source': 0, 'awaiting_review': 1}, ' |

46 | ({'source': 0, 'awaiting_my_review': 1}, ' |

47 | ({'source': 1}, ' |
|
|
43 | ({'source': 0, 'closed': 1}, 'Closed'), | |
|
44 | ({'source': 0, 'my': 1}, 'Opened by me'), | |
|
45 | ({'source': 0, 'awaiting_review': 1}, 'Awaiting review'), | |
|
46 | ({'source': 0, 'awaiting_my_review': 1}, 'Awaiting my review'), | |
|
47 | ({'source': 1}, 'From this repo'), | |
|
48 | 48 | ]) |
|
49 | 49 | def test_showing_list_page(self, backend, pr_util, params, expected_title): |
|
50 | 50 | pull_request = pr_util.create_pull_request() |
@@ -55,9 +55,10 b' class TestPullRequestList(object):' | |||
|
55 | 55 | params=params)) |
|
56 | 56 | |
|
57 | 57 | assert_response = response.assert_response() |
|
58 | assert_response.element_equals_to('.panel-title', expected_title) | |
|
59 | element = assert_response.get_element('. |

60 | element_text = |
|
|
58 | ||
|
59 | element = assert_response.get_element('.title .active') | |
|
60 | element_text = element.text_content() | |
|
61 | assert expected_title == element_text | |
|
61 | 62 | |
|
62 | 63 | def test_showing_list_page_data(self, backend, pr_util, xhr_header): |
|
63 | 64 | pull_request = pr_util.create_pull_request() |
@@ -32,9 +32,10 b' def route_path(name, params=None, **kwar' | |||
|
32 | 32 | import urllib |
|
33 | 33 | |
|
34 | 34 | base_url = { |
|
35 | 'repo_changelog':'/{repo_name}/changelog', | |
|
36 | 'repo_changelog_file':'/{repo_name}/changelog/{commit_id}/{f_path}', |
|
|
37 | 'repo_changelog_elements':'/{repo_name}/changelog_elements', | |
|
35 | 'repo_changelog': '/{repo_name}/changelog', | |
|
36 | 'repo_commits': '/{repo_name}/commits', | |
|
37 | 'repo_commits_file': '/{repo_name}/commits/{commit_id}/{f_path}', | |
|
38 | 'repo_commits_elements': '/{repo_name}/commits_elements', | |
|
38 | 39 | }[name].format(**kwargs) |
|
39 | 40 | |
|
40 | 41 | if params: |
@@ -42,8 +43,22 b' def route_path(name, params=None, **kwar' | |||
|
42 | 43 | return base_url |
|
43 | 44 | |
|
44 | 45 | |
|
46 | def assert_commits_on_page(response, indexes): | |
|
47 | found_indexes = [int(idx) for idx in MATCH_HASH.findall(response.body)] | |
|
48 | assert found_indexes == indexes | |
|
49 | ||
|
50 | ||
|
45 | 51 | class TestChangelogController(TestController): |
|
46 | 52 | |
|
53 | def test_commits_page(self, backend): | |
|
54 | self.log_user() | |
|
55 | response = self.app.get( | |
|
56 | route_path('repo_commits', repo_name=backend.repo_name)) | |
|
57 | ||
|
58 | first_idx = -1 | |
|
59 | last_idx = -DEFAULT_CHANGELOG_SIZE | |
|
60 | self.assert_commit_range_on_page(response, first_idx, last_idx, backend) | |
|
61 | ||
|
47 | 62 | def test_changelog(self, backend): |
|
48 | 63 | self.log_user() |
|
49 | 64 | response = self.app.get( |
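`assert_commits_on_page` was promoted from a method to a module-level helper that scrapes commit indexes out of the response body with the module's `MATCH_HASH` regex. A self-contained sketch, assuming `MATCH_HASH` matches the `r<idx>:<short_id>` spans asserted elsewhere in these tests (the exact pattern lives in the test module, not this diff):

```python
import re

# Assumed shape of MATCH_HASH: capture the numeric index from
# spans like <span class="commit_hash">r15:deadbeef00</span>.
MATCH_HASH = re.compile(r'<span class="commit_hash">r(\d+):[0-9a-f]+</span>')

def commit_indexes(body):
    """Return the commit indexes rendered on a commits/changelog page."""
    return [int(idx) for idx in MATCH_HASH.findall(body)]

body = ('<span class="commit_hash">r15:deadbeef00</span>'
        '<span class="commit_hash">r12:cafebabe11</span>')
found = commit_indexes(body)
```

Making it a free function is what lets both the old changelog tests and the new commits tests share it without inheriting from the test class.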
@@ -62,6 +77,14 b' class TestChangelogController(TestContro' | |||
|
62 | 77 | params=dict(branch=backend.default_branch_name)), |
|
63 | 78 | status=200) |
|
64 | 79 | |
|
80 | @pytest.mark.backends("hg", "git") | |
|
81 | def test_commits_filtered_by_branch(self, backend): | |
|
82 | self.log_user() | |
|
83 | self.app.get( | |
|
84 | route_path('repo_commits', repo_name=backend.repo_name, | |
|
85 | params=dict(branch=backend.default_branch_name)), | |
|
86 | status=200) | |
|
87 | ||
|
65 | 88 | @pytest.mark.backends("svn") |
|
66 | 89 | def test_changelog_filtered_by_branch_svn(self, autologin_user, backend): |
|
67 | 90 | repo = backend['svn-simple-layout'] |
@@ -70,27 +93,22 b' class TestChangelogController(TestContro' | |||
|
70 | 93 | params=dict(branch='trunk')), |
|
71 | 94 | status=200) |
|
72 | 95 | |
|
73 | self.assert_commits_on_page( | |
|
74 | response, indexes=[15, 12, 7, 3, 2, 1]) | |
|
96 | assert_commits_on_page(response, indexes=[15, 12, 7, 3, 2, 1]) | |
|
75 | 97 | |
|
76 | def test_changelog_filtered_by_wrong_branch(self, backend): |
|
|
98 | def test_commits_filtered_by_wrong_branch(self, backend): | |
|
77 | 99 | self.log_user() |
|
78 | 100 | branch = 'wrong-branch-name' |
|
79 | 101 | response = self.app.get( |
|
80 | route_path('repo_changelog', repo_name=backend.repo_name, |
|
|
102 | route_path('repo_commits', repo_name=backend.repo_name, | |
|
81 | 103 | params=dict(branch=branch)), |
|
82 | 104 | status=302) |
|
83 | expected_url = '/{repo}/changelog/{branch}'.format( |
|
|
105 | expected_url = '/{repo}/commits/{branch}'.format( | |
|
84 | 106 | repo=backend.repo_name, branch=branch) |
|
85 | 107 | assert expected_url in response.location |
|
86 | 108 | response = response.follow() |
|
87 | 109 | expected_warning = 'Branch {} is not found.'.format(branch) |
|
88 | 110 | assert expected_warning in response.body |
|
89 | 111 | |
|
90 | def assert_commits_on_page(self, response, indexes): | |
|
91 | found_indexes = [int(idx) for idx in MATCH_HASH.findall(response.body)] | |
|
92 | assert found_indexes == indexes | |
|
93 | ||
|
94 | 112 | @pytest.mark.xfail_backends("svn", reason="Depends on branch support") |
|
95 | 113 | def test_changelog_filtered_by_branch_with_merges( |
|
96 | 114 | self, autologin_user, backend): |
@@ -112,21 +130,20 b' class TestChangelogController(TestContro' | |||
|
112 | 130 | status=200) |
|
113 | 131 | |
|
114 | 132 | @pytest.mark.backends("hg") |
|
115 | def test_changelog_closed_branches(self, autologin_user, backend): |
|
|
133 | def test_commits_closed_branches(self, autologin_user, backend): | |
|
116 | 134 | repo = backend['closed_branch'] |
|
117 | 135 | response = self.app.get( |
|
118 | route_path('repo_changelog', repo_name=repo.repo_name, |
|
|
136 | route_path('repo_commits', repo_name=repo.repo_name, | |
|
119 | 137 | params=dict(branch='experimental')), |
|
120 | 138 | status=200) |
|
121 | 139 | |
|
122 | self.assert_commits_on_page( |
|
|
|
123 | response, indexes=[3, 1]) | |
|
140 | assert_commits_on_page(response, indexes=[3, 1]) | |
|
124 | 141 | |
|
125 | 142 | def test_changelog_pagination(self, backend): |
|
126 | 143 | self.log_user() |
|
127 | 144 | # pagination, walk up to page 6 |
|
128 | 145 | changelog_url = route_path( |
|
129 | 'repo_changelog', repo_name=backend.repo_name) |
|
|
146 | 'repo_commits', repo_name=backend.repo_name) | |
|
130 | 147 | |
|
131 | 148 | for page in range(1, 7): |
|
132 | 149 | response = self.app.get(changelog_url, {'page': page}) |
@@ -138,22 +155,30 b' class TestChangelogController(TestContro' | |||
|
138 | 155 | def assert_commit_range_on_page( |
|
139 | 156 | self, response, first_idx, last_idx, backend): |
|
140 | 157 | input_template = ( |
|
141 | """<input class="commit-range" |
|
|
158 | """<input class="commit-range" """ | |
|
159 | """data-commit-id="%(raw_id)s" data-commit-idx="%(idx)s" """ | |
|
160 | """data-short-id="%(short_id)s" id="%(raw_id)s" """ | |
|
142 | 161 | """name="%(raw_id)s" type="checkbox" value="1" />""" |
|
143 | 162 | ) |
|
163 | ||
|
144 | 164 | commit_span_template = """<span class="commit_hash">r%s:%s</span>""" |
|
145 | 165 | repo = backend.repo |
|
146 | 166 | |
|
147 | 167 | first_commit_on_page = repo.get_commit(commit_idx=first_idx) |
|
148 | 168 | response.mustcontain( |
|
149 | input_template % {'raw_id': first_commit_on_page.raw_id}) |
|
|
169 | input_template % {'raw_id': first_commit_on_page.raw_id, | |
|
170 | 'idx': first_commit_on_page.idx, | |
|
171 | 'short_id': first_commit_on_page.short_id}) | |
|
172 | ||
|
150 | 173 | response.mustcontain(commit_span_template % ( |
|
151 | 174 | first_commit_on_page.idx, first_commit_on_page.short_id) |
|
152 | 175 | ) |
|
153 | 176 | |
|
154 | 177 | last_commit_on_page = repo.get_commit(commit_idx=last_idx) |
|
155 | 178 | response.mustcontain( |
|
156 | input_template % {'raw_id': last_commit_on_page.raw_id}) |
|
|
179 | input_template % {'raw_id': last_commit_on_page.raw_id, | |
|
180 | 'idx': last_commit_on_page.idx, | |
|
181 | 'short_id': last_commit_on_page.short_id}) | |
|
157 | 182 | response.mustcontain(commit_span_template % ( |
|
158 | 183 | last_commit_on_page.idx, last_commit_on_page.short_id) |
|
159 | 184 | ) |
@@ -168,10 +193,10 b' class TestChangelogController(TestContro' | |||
|
168 | 193 | '/vcs/exceptions.py', |
|
169 | 194 | '//vcs/exceptions.py' |
|
170 | 195 | ]) |
|
171 | def test_changelog_with_filenode(self, backend, test_path): |
|
|
196 | def test_commits_with_filenode(self, backend, test_path): | |
|
172 | 197 | self.log_user() |
|
173 | 198 | response = self.app.get( |
|
174 | route_path('repo_changelog_file', repo_name=backend.repo_name, |
|
|
199 | route_path('repo_commits_file', repo_name=backend.repo_name, | |
|
175 | 200 | commit_id='tip', f_path=test_path), |
|
176 | 201 | ) |
|
177 | 202 | |
@@ -180,16 +205,16 b' class TestChangelogController(TestContro' | |||
|
180 | 205 | response.mustcontain('Added not implemented hg backend test case') |
|
181 | 206 | response.mustcontain('Added BaseChangeset class') |
|
182 | 207 | |
|
183 | def test_changelog_with_filenode_that_is_dirnode(self, backend): |
|
|
208 | def test_commits_with_filenode_that_is_dirnode(self, backend): | |
|
184 | 209 | self.log_user() |
|
185 | 210 | self.app.get( |
|
186 | route_path('repo_changelog_file', repo_name=backend.repo_name, |
|
|
211 | route_path('repo_commits_file', repo_name=backend.repo_name, | |
|
187 | 212 | commit_id='tip', f_path='/tests'), |
|
188 | 213 | status=302) |
|
189 | 214 | |
|
190 | def test_changelog_with_filenode_not_existing(self, backend): |
|
|
215 | def test_commits_with_filenode_not_existing(self, backend): | |
|
191 | 216 | self.log_user() |
|
192 | 217 | self.app.get( |
|
193 | route_path('repo_changelog_file', repo_name=backend.repo_name, |
|
|
218 | route_path('repo_commits_file', repo_name=backend.repo_name, | |
|
194 | 219 | commit_id='tip', f_path='wrong_path'), |
|
195 | 220 | status=302) |
@@ -94,6 +94,7 b' class TestCompareView(object):' | |||
|
94 | 94 | origin_repo = origin.scm_instance(cache=False) |
|
95 | 95 | origin_repo.config.clear_section('hooks') |
|
96 | 96 | origin_repo.pull(fork.repo_full_path, commit_ids=[commit3.raw_id]) |
|
97 | origin_repo = origin.scm_instance(cache=False) # cache rebuild | |
|
97 | 98 | |
|
98 | 99 | # Verify test fixture setup |
|
99 | 100 | # This does not work for git |
@@ -162,8 +163,7 b' class TestCompareView(object):' | |||
|
162 | 163 | compare_page.target_source_are_disabled() |
|
163 | 164 | |
|
164 | 165 | @pytest.mark.xfail_backends("svn", reason="Depends on branch support") |
|
165 | def test_compare_forks_on_branch_extra_commits_origin_has_incomming( | |
|
166 | self, backend): | |
|
166 | def test_compare_forks_on_branch_extra_commits_origin_has_incomming(self, backend): | |
|
167 | 167 | repo1 = backend.create_repo() |
|
168 | 168 | |
|
169 | 169 | # commit something ! |
@@ -21,6 +21,7 b'' | |||
|
21 | 21 | import pytest |
|
22 | 22 | |
|
23 | 23 | from rhodecode.lib.vcs import nodes |
|
24 | from rhodecode.lib.vcs.backends.base import EmptyCommit | |
|
24 | 25 | from rhodecode.tests.fixture import Fixture |
|
25 | 26 | from rhodecode.tests.utils import commit_change |
|
26 | 27 | |
@@ -43,70 +44,7 b' def route_path(name, params=None, **kwar' | |||
|
43 | 44 | @pytest.mark.usefixtures("autologin_user", "app") |
|
44 | 45 | class TestSideBySideDiff(object): |
|
45 | 46 | |
|
46 | def test_diff_side |
|
|
47 | f_path = 'test_sidebyside_file.py' | |
|
48 | commit1_content = 'content-25d7e49c18b159446c\n' | |
|
49 | commit2_content = 'content-603d6c72c46d953420\n' | |
|
50 | repo = backend.create_repo() | |
|
51 | ||
|
52 | commit1 = commit_change( | |
|
53 | repo.repo_name, filename=f_path, content=commit1_content, | |
|
54 | message='A', vcs_type=backend.alias, parent=None, newfile=True) | |
|
55 | ||
|
56 | commit2 = commit_change( | |
|
57 | repo.repo_name, filename=f_path, content=commit2_content, | |
|
58 | message='B, child of A', vcs_type=backend.alias, parent=commit1) | |
|
59 | ||
|
60 | response = self.app.get(route_path( | |
|
61 | 'repo_compare', | |
|
62 | repo_name=repo.repo_name, | |
|
63 | source_ref_type='rev', | |
|
64 | source_ref=commit1.raw_id, | |
|
65 | target_ref_type='rev', | |
|
66 | target_ref=commit2.raw_id, | |
|
67 | params=dict(f_path=f_path, target_repo=repo.repo_name, diffmode='sidebyside') | |
|
68 | )) | |
|
69 | ||
|
70 | response.mustcontain('Expand 1 commit') | |
|
71 | response.mustcontain('1 file changed') | |
|
72 | ||
|
73 | response.mustcontain( | |
|
74 | 'r%s:%s...r%s:%s' % ( | |
|
75 | commit1.idx, commit1.short_id, commit2.idx, commit2.short_id)) | |
|
76 | ||
|
77 | response.mustcontain('<strong>{}</strong>'.format(f_path)) | |
|
78 | ||
|
79 | def test_diff_side_by_side_with_empty_file(self, app, backend, backend_stub): | |
|
80 | commits = [ | |
|
81 | {'message': 'First commit'}, | |
|
82 | {'message': 'Commit with binary', | |
|
83 | 'added': [nodes.FileNode('file.empty', content='')]}, | |
|
84 | ] | |
|
85 | f_path = 'file.empty' | |
|
86 | repo = backend.create_repo(commits=commits) | |
|
87 | commit1 = repo.get_commit(commit_idx=0) | |
|
88 | commit2 = repo.get_commit(commit_idx=1) | |
|
89 | ||
|
90 | response = self.app.get(route_path( | |
|
91 | 'repo_compare', | |
|
92 | repo_name=repo.repo_name, | |
|
93 | source_ref_type='rev', | |
|
94 | source_ref=commit1.raw_id, | |
|
95 | target_ref_type='rev', | |
|
96 | target_ref=commit2.raw_id, | |
|
97 | params=dict(f_path=f_path, target_repo=repo.repo_name, diffmode='sidebyside') | |
|
98 | )) | |
|
99 | ||
|
100 | response.mustcontain('Expand 1 commit') | |
|
101 | response.mustcontain('1 file changed') | |
|
102 | ||
|
103 | response.mustcontain( | |
|
104 | 'r%s:%s...r%s:%s' % ( | |
|
105 | commit1.idx, commit1.short_id, commit2.idx, commit2.short_id)) | |
|
106 | ||
|
107 | response.mustcontain('<strong>{}</strong>'.format(f_path)) | |
|
108 | ||
|
109 | def test_diff_sidebyside_two_commits(self, app, backend): | |
|
47 | def test_diff_sidebyside_single_commit(self, app, backend): | |
|
110 | 48 | commit_id_range = { |
|
111 | 49 | 'hg': { |
|
112 | 50 | 'commits': ['25d7e49c18b159446cadfa506a5cf8ad1cb04067', |
@@ -141,26 +79,164 b' class TestSideBySideDiff(object):' | |||
|
141 | 79 | params=dict(target_repo=backend.repo_name, diffmode='sidebyside') |
|
142 | 80 | )) |
|
143 | 81 | |
|
82 | response.mustcontain(file_changes) | |
|
144 | 83 | response.mustcontain('Expand 1 commit') |
|
145 | response.mustcontain(file_changes) | |
|
146 | 84 | |
|
147 | def test_diff_sidebyside_two_commits_with_file_filter(self, app, backend): |
|
|
85 | def test_diff_sidebyside_two_commits(self, app, backend): | |
|
148 | 86 | commit_id_range = { |
|
149 | 87 | 'hg': { |
|
150 | 'commits': ['25d7e49c18b159446cadfa506a5cf8ad1cb04067', | |
|
88 | 'commits': ['4fdd71e9427417b2e904e0464c634fdee85ec5a7', | |
|
151 | 89 | '603d6c72c46d953420c89d36372f08d9f305f5dd'], |
|
152 | 'changes': ' |
|
|
90 | 'changes': '32 files changed: 1165 inserted, 308 deleted' | |
|
153 | 91 | }, |
|
154 | 92 | 'git': { |
|
155 | 'commits': ['6fc9270775aaf5544c1deb014f4ddd60c952fcbb', | |
|
93 | 'commits': ['f5fbf9cfd5f1f1be146f6d3b38bcd791a7480c13', | |
|
156 | 94 | '03fa803d7e9fb14daa9a3089e0d1494eda75d986'], |
|
157 | 'changes': ' |
|
|
95 | 'changes': '32 files changed: 1165 inserted, 308 deleted' | |
|
158 | 96 | }, |
|
159 | 97 | |
|
160 | 98 | 'svn': { |
|
161 | 'commits': ['33 |
|
|
99 | 'commits': ['335', | |
|
162 | 100 | '337'], |
|
163 | 'changes': ' |
|
|
101 | 'changes': '32 files changed: 1179 inserted, 310 deleted' | |
|
102 | }, | |
|
103 | } | |
|
104 | ||
|
105 | commit_info = commit_id_range[backend.alias] | |
|
106 | commit2, commit1 = commit_info['commits'] | |
|
107 | file_changes = commit_info['changes'] | |
|
108 | ||
|
109 | response = self.app.get(route_path( | |
|
110 | 'repo_compare', | |
|
111 | repo_name=backend.repo_name, | |
|
112 | source_ref_type='rev', | |
|
113 | source_ref=commit2, | |
|
114 | target_repo=backend.repo_name, | |
|
115 | target_ref_type='rev', | |
|
116 | target_ref=commit1, | |
|
117 | params=dict(target_repo=backend.repo_name, diffmode='sidebyside') | |
|
118 | )) | |
|
119 | ||
|
120 | response.mustcontain(file_changes) | |
|
121 | response.mustcontain('Expand 2 commits') | |
|
122 | ||
|
123 | @pytest.mark.xfail(reason='GIT does not handle empty commit compare correct (missing 1 commit)') | |
|
124 | def test_diff_side_by_side_from_0_commit(self, app, backend, backend_stub): | |
|
125 | f_path = 'test_sidebyside_file.py' | |
|
126 | commit1_content = 'content-25d7e49c18b159446c\n' | |
|
127 | commit2_content = 'content-603d6c72c46d953420\n' | |
|
128 | repo = backend.create_repo() | |
|
129 | ||
|
130 | commit1 = commit_change( | |
|
131 | repo.repo_name, filename=f_path, content=commit1_content, | |
|
132 | message='A', vcs_type=backend.alias, parent=None, newfile=True) | |
|
133 | ||
|
134 | commit2 = commit_change( | |
|
135 | repo.repo_name, filename=f_path, content=commit2_content, | |
|
136 | message='B, child of A', vcs_type=backend.alias, parent=commit1) | |
|
137 | ||
|
138 | response = self.app.get(route_path( | |
|
139 | 'repo_compare', | |
|
140 | repo_name=repo.repo_name, | |
|
141 | source_ref_type='rev', | |
|
142 | source_ref=EmptyCommit().raw_id, | |
|
143 | target_ref_type='rev', | |
|
144 | target_ref=commit2.raw_id, | |
|
145 | params=dict(diffmode='sidebyside') | |
|
146 | )) | |
|
147 | ||
|
148 | response.mustcontain('Expand 2 commits') | |
|
149 | response.mustcontain('123 file changed') | |
|
150 | ||
|
151 | response.mustcontain( | |
|
152 | 'r%s:%s...r%s:%s' % ( | |
|
153 | commit1.idx, commit1.short_id, commit2.idx, commit2.short_id)) | |
|
154 | ||
|
155 | response.mustcontain('<strong>{}</strong>'.format(f_path)) | |
|
156 | ||
|
157 | @pytest.mark.xfail(reason='GIT does not handle empty commit compare correct (missing 1 commit)') | |
|
158 | def test_diff_side_by_side_from_0_commit_with_file_filter(self, app, backend, backend_stub): | |
|
159 | f_path = 'test_sidebyside_file.py' | |
|
160 | commit1_content = 'content-25d7e49c18b159446c\n' | |
|
161 | commit2_content = 'content-603d6c72c46d953420\n' | |
|
162 | repo = backend.create_repo() | |
|
163 | ||
|
164 | commit1 = commit_change( | |
|
165 | repo.repo_name, filename=f_path, content=commit1_content, | |
|
166 | message='A', vcs_type=backend.alias, parent=None, newfile=True) | |
|
167 | ||
|
168 | commit2 = commit_change( | |
|
169 | repo.repo_name, filename=f_path, content=commit2_content, | |
|
170 | message='B, child of A', vcs_type=backend.alias, parent=commit1) | |
|
171 | ||
|
172 | response = self.app.get(route_path( | |
|
173 | 'repo_compare', | |
|
174 | repo_name=repo.repo_name, | |
|
175 | source_ref_type='rev', | |
|
176 | source_ref=EmptyCommit().raw_id, | |
|
177 | target_ref_type='rev', | |
|
178 | target_ref=commit2.raw_id, | |
|
179 | params=dict(f_path=f_path, target_repo=repo.repo_name, diffmode='sidebyside') | |
|
180 | )) | |
|
181 | ||
|
182 | response.mustcontain('Expand 2 commits') | |
|
183 | response.mustcontain('1 file changed') | |
|
184 | ||
|
185 | response.mustcontain( | |
|
186 | 'r%s:%s...r%s:%s' % ( | |
|
187 | commit1.idx, commit1.short_id, commit2.idx, commit2.short_id)) | |
|
188 | ||
|
189 | response.mustcontain('<strong>{}</strong>'.format(f_path)) | |
|
190 | ||
|
191 | def test_diff_side_by_side_with_empty_file(self, app, backend, backend_stub): | |
|
192 | commits = [ | |
|
193 | {'message': 'First commit'}, | |
|
194 | {'message': 'Second commit'}, | |
|
195 | {'message': 'Commit with binary', | |
|
196 | 'added': [nodes.FileNode('file.empty', content='')]}, | |
|
197 | ] | |
|
198 | f_path = 'file.empty' | |
|
199 | repo = backend.create_repo(commits=commits) | |
|
200 | commit1 = repo.get_commit(commit_idx=0) | |
|
201 | commit2 = repo.get_commit(commit_idx=1) | |
|
202 | commit3 = repo.get_commit(commit_idx=2) | |
|
203 | ||
|
204 | response = self.app.get(route_path( | |
|
205 | 'repo_compare', | |
|
206 | repo_name=repo.repo_name, | |
|
207 | source_ref_type='rev', | |
|
208 | source_ref=commit1.raw_id, | |
|
209 | target_ref_type='rev', | |
|
210 | target_ref=commit3.raw_id, | |
|
211 | params=dict(f_path=f_path, target_repo=repo.repo_name, diffmode='sidebyside') | |
|
212 | )) | |
|
213 | ||
|
214 | response.mustcontain('Expand 2 commits') | |
|
215 | response.mustcontain('1 file changed') | |
|
216 | ||
|
217 | response.mustcontain( | |
|
218 | 'r%s:%s...r%s:%s' % ( | |
|
219 | commit2.idx, commit2.short_id, commit3.idx, commit3.short_id)) | |
|
220 | ||
|
221 | response.mustcontain('<strong>{}</strong>'.format(f_path)) | |
|
222 | ||
|
223 | def test_diff_sidebyside_two_commits_with_file_filter(self, app, backend): | |
|
224 | commit_id_range = { | |
|
225 | 'hg': { | |
|
226 | 'commits': ['4fdd71e9427417b2e904e0464c634fdee85ec5a7', | |
|
227 | '603d6c72c46d953420c89d36372f08d9f305f5dd'], | |
|
228 | 'changes': '1 file changed: 3 inserted, 3 deleted' | |
|
229 | }, | |
|
230 | 'git': { | |
|
231 | 'commits': ['f5fbf9cfd5f1f1be146f6d3b38bcd791a7480c13', | |
|
232 | '03fa803d7e9fb14daa9a3089e0d1494eda75d986'], | |
|
233 | 'changes': '1 file changed: 3 inserted, 3 deleted' | |
|
234 | }, | |
|
235 | ||
|
236 | 'svn': { | |
|
237 | 'commits': ['335', | |
|
238 | '337'], | |
|
239 | 'changes': '1 file changed: 3 inserted, 3 deleted' | |
|
164 | 240 | }, |
|
165 | 241 | } |
|
166 | 242 | f_path = 'docs/conf.py' |
@@ -179,5 +255,5 b' class TestSideBySideDiff(object):' | |||
|
179 | 255 | params=dict(f_path=f_path, target_repo=backend.repo_name, diffmode='sidebyside') |
|
180 | 256 | )) |
|
181 | 257 | |
|
182 | response.mustcontain('Expand
258 | response.mustcontain('Expand 2 commits')
|
258 | response.mustcontain('Expand 2 commits') | |
|
183 | 259 | response.mustcontain(file_changes) |
@@ -243,7 +243,7 b' class TestFilesViews(object):' | |||
|
243 | 243 | repo_name=backend.repo_name, |
|
244 | 244 | commit_id=commit.raw_id, f_path='vcs/nodes.py')) |
|
245 | 245 | |
|
246 | msgbox = """<div class="commit
|
246 | msgbox = """<div class="commit">%s</div>""" | |
|
247 | 247 | response.mustcontain(msgbox % (commit.message, )) |
|
248 | 248 | |
|
249 | 249 | assert_response = response.assert_response() |
@@ -313,6 +313,7 b' class TestFilesViews(object):' | |||
|
313 | 313 | |
|
314 | 314 | expected_data = json.loads( |
|
315 | 315 | fixture.load_resource('svn_node_history_branches.json')) |
|
316 | ||
|
316 | 317 | assert expected_data == response.json |
|
317 | 318 | |
|
318 | 319 | def test_file_source_history_with_annotation(self, backend, xhr_header): |
@@ -521,10 +522,10 b' class TestRepositoryArchival(object):' | |||
|
521 | 522 | def test_archival(self, backend): |
|
522 | 523 | backend.enable_downloads() |
|
523 | 524 | commit = backend.repo.get_commit(commit_idx=173) |
|
524 | for a
525 | mime_type, arch_ext = info
526 | short = commit.short_id +
527 | fname = commit.raw_id +
525 | for a_type, content_type, extension in settings.ARCHIVE_SPECS: | |
|
526 | ||
|
527 | short = commit.short_id + extension | |
|
528 | fname = commit.raw_id + extension | |
|
528 | 529 | filename = '%s-%s' % (backend.repo_name, short) |
|
529 | 530 | response = self.app.get( |
|
530 | 531 | route_path('repo_archivefile', |
@@ -534,7 +535,7 b' class TestRepositoryArchival(object):' | |||
|
534 | 535 | assert response.status == '200 OK' |
|
535 | 536 | headers = [ |
|
536 | 537 | ('Content-Disposition', 'attachment; filename=%s' % filename), |
|
537 | ('Content-Type', '%s' %
|
538 | ('Content-Type', '%s' % content_type), | |
|
538 | 539 | ] |
|
539 | 540 | |
|
540 | 541 | for header in headers: |
@@ -761,7 +762,7 b' class TestModifyFilesWithWebInterface(ob' | |||
|
761 | 762 | |
|
762 | 763 | @pytest.mark.xfail_backends("svn", reason="Depends on online editing") |
|
763 | 764 | def test_add_file_into_repo_missing_content(self, backend, csrf_token): |
|
764 |
|
765 | backend.create_repo() | |
|
765 | 766 | filename = 'init.py' |
|
766 | 767 | response = self.app.post( |
|
767 | 768 | route_path('repo_files_create_file', |
@@ -770,26 +771,25 b' class TestModifyFilesWithWebInterface(ob' | |||
|
770 | 771 | params={ |
|
771 | 772 | 'content': "", |
|
772 | 773 | 'filename': filename, |
|
773 | 'location': "", | |
|
774 | 774 | 'csrf_token': csrf_token, |
|
775 | 775 | }, |
|
776 | 776 | status=302) |
|
777 | assert_session_flash(response, | |
|
778 | 'Successfully committed new file `{}`'.format( | |
|
779 | os.path.join(filename))) | |
|
777 | expected_msg = 'Successfully committed new file `{}`'.format(os.path.join(filename)) | |
|
778 | assert_session_flash(response, expected_msg) | |
|
780 | 779 | |
|
781 | 780 | def test_add_file_into_repo_missing_filename(self, backend, csrf_token): |
|
781 | commit_id = backend.repo.get_commit().raw_id | |
|
782 | 782 | response = self.app.post( |
|
783 | 783 | route_path('repo_files_create_file', |
|
784 | 784 | repo_name=backend.repo_name, |
|
785 | commit_id=
|
785 | commit_id=commit_id, f_path='/'), | |
|
786 | 786 | params={ |
|
787 | 787 | 'content': "foo", |
|
788 | 788 | 'csrf_token': csrf_token, |
|
789 | 789 | }, |
|
790 | 790 | status=302) |
|
791 | 791 | |
|
792 | assert_session_flash(response, 'No filename') | |
|
792 | assert_session_flash(response, 'No filename specified') | |
|
793 | 793 | |
|
794 | 794 | def test_add_file_into_repo_errors_and_no_commits( |
|
795 | 795 | self, backend, csrf_token): |
@@ -806,7 +806,7 b' class TestModifyFilesWithWebInterface(ob' | |||
|
806 | 806 | }, |
|
807 | 807 | status=302) |
|
808 | 808 | |
|
809 | assert_session_flash(response, 'No filename') | |
|
809 | assert_session_flash(response, 'No filename specified') | |
|
810 | 810 | |
|
811 | 811 | # Not allowed, redirect to the summary |
|
812 | 812 | redirected = response.follow() |
@@ -817,52 +817,51 b' class TestModifyFilesWithWebInterface(ob' | |||
|
817 | 817 | |
|
818 | 818 | assert redirected.request.path == summary_url |
|
819 | 819 | |
|
820 | @pytest.mark.parametrize("
|
821 | ('/abs', 'foo'), | |
|
822 | ('../rel', 'foo'), | |
|
823 | ('file/../foo', 'foo'), | |
|
820 | @pytest.mark.parametrize("filename, clean_filename", [ | |
|
821 | ('/abs/foo', 'abs/foo'), | |
|
822 | ('../rel/foo', 'rel/foo'), | |
|
823 | ('file/../foo/foo', 'file/foo/foo'), | |
|
824 | 824 | ]) |
|
825 | def test_add_file_into_repo_bad_filenames( | |
|
826 | self, location, filename, backend, csrf_token): | |
|
825 | def test_add_file_into_repo_bad_filenames(self, filename, clean_filename, backend, csrf_token): | |
|
826 | repo = backend.create_repo() | |
|
827 | commit_id = repo.get_commit().raw_id | |
|
828 | ||
|
827 | 829 | response = self.app.post( |
|
828 | 830 | route_path('repo_files_create_file', |
|
829 | repo_name=
830 | commit_id=
831 | repo_name=repo.repo_name, | |
|
832 | commit_id=commit_id, f_path='/'), | |
|
831 | 833 | params={ |
|
832 | 834 | 'content': "foo", |
|
833 | 835 | 'filename': filename, |
|
834 | 'location': location, | |
|
835 | 836 | 'csrf_token': csrf_token, |
|
836 | 837 | }, |
|
837 | 838 | status=302) |
|
838 | 839 | |
|
839 | assert_session_flash( | |
|
840 | response, | |
|
841 | 'The location specified must be a relative path and must not ' | |
|
842 | 'contain .. in the path') | |
|
840 | expected_msg = 'Successfully committed new file `{}`'.format(clean_filename) | |
|
841 | assert_session_flash(response, expected_msg) | |
|
843 | 842 | |
|
844 | @pytest.mark.parametrize("cnt,
845 | (1, '',
846 | (2, 'dir',
847 | (3, 'rel/dir', 'foo.bar'), | |
|
843 | @pytest.mark.parametrize("cnt, filename, content", [ | |
|
844 | (1, 'foo.txt', "Content"), | |
|
845 | (2, 'dir/foo.rst', "Content"), | |
|
846 | (3, 'dir/foo-second.rst', "Content"), | |
|
847 | (4, 'rel/dir/foo.bar', "Content"), | |
|
848 | 848 | ]) |
|
849 | def test_add_file_into_repo(self, cnt,
|
850 | csrf_token): | |
|
849 | def test_add_file_into_empty_repo(self, cnt, filename, content, backend, csrf_token): | |
|
851 | 850 | repo = backend.create_repo() |
|
851 | commit_id = repo.get_commit().raw_id | |
|
852 | 852 | response = self.app.post( |
|
853 | 853 | route_path('repo_files_create_file', |
|
854 | 854 | repo_name=repo.repo_name, |
|
855 | commit_id=
|
855 | commit_id=commit_id, f_path='/'), | |
|
856 | 856 | params={ |
|
857 | 'content':
|
857 | 'content': content, | |
|
858 | 858 | 'filename': filename, |
|
859 | 'location': location, | |
|
860 | 859 | 'csrf_token': csrf_token, |
|
861 | 860 | }, |
|
862 | 861 | status=302) |
|
863 | assert_session_flash(response, | |
|
864 |
|
865 | os.path.join(location, filename))) | |
|
862 | ||
|
863 | expected_msg = 'Successfully committed new file `{}`'.format(filename) | |
|
864 | assert_session_flash(response, expected_msg) | |
|
866 | 865 | |
|
867 | 866 | def test_edit_file_view(self, backend): |
|
868 | 867 | response = self.app.get( |
@@ -884,8 +883,7 b' class TestModifyFilesWithWebInterface(ob' | |||
|
884 | 883 | f_path='vcs/nodes.py'), |
|
885 | 884 | status=302) |
|
886 | 885 | assert_session_flash( |
|
887 | response, | |
|
888 | 'You can only edit files with commit being a valid branch') | |
|
886 | response, 'Cannot modify file. Given commit `tip` is not head of a branch.') | |
|
889 | 887 | |
|
890 | 888 | def test_edit_file_view_commit_changes(self, backend, csrf_token): |
|
891 | 889 | repo = backend.create_repo() |
@@ -953,8 +951,7 b' class TestModifyFilesWithWebInterface(ob' | |||
|
953 | 951 | f_path='vcs/nodes.py'), |
|
954 | 952 | status=302) |
|
955 | 953 | assert_session_flash( |
|
956 | response, | |
|
957 | 'You can only delete files with commit being a valid branch') | |
|
954 | response, 'Cannot modify file. Given commit `tip` is not head of a branch.') | |
|
958 | 955 | |
|
959 | 956 | def test_delete_file_view_commit_changes(self, backend, csrf_token): |
|
960 | 957 | repo = backend.create_repo() |
@@ -992,7 +989,7 b' class TestFilesViewOtherCases(object):' | |||
|
992 | 989 | repo_file_add_url = route_path( |
|
993 | 990 | 'repo_files_add_file', |
|
994 | 991 | repo_name=repo.repo_name, |
|
995 | commit_id=0, f_path='')
|
992 | commit_id=0, f_path='') | |
|
996 | 993 | |
|
997 | 994 | assert_session_flash( |
|
998 | 995 | response, |
@@ -1009,7 +1006,7 b' class TestFilesViewOtherCases(object):' | |||
|
1009 | 1006 | repo_file_add_url = route_path( |
|
1010 | 1007 | 'repo_files_add_file', |
|
1011 | 1008 | repo_name=repo.repo_name, |
|
1012 | commit_id=0, f_path='')
|
1009 | commit_id=0, f_path='') | |
|
1013 | 1010 | |
|
1014 | 1011 | response = self.app.get( |
|
1015 | 1012 | route_path('repo_files', |
@@ -40,6 +40,8 b' def route_path(name, params=None, **kwar' | |||
|
40 | 40 | base_url = { |
|
41 | 41 | 'repo_changelog': '/{repo_name}/changelog', |
|
42 | 42 | 'repo_changelog_file': '/{repo_name}/changelog/{commit_id}/{f_path}', |
|
43 | 'repo_commits': '/{repo_name}/commits', | |
|
44 | 'repo_commits_file': '/{repo_name}/commits/{commit_id}/{f_path}', | |
|
43 | 45 | 'pullrequest_show': '/{repo_name}/pull-request/{pull_request_id}', |
|
44 | 46 | 'pullrequest_show_all': '/{repo_name}/pull-request', |
|
45 | 47 | 'pullrequest_show_all_data': '/{repo_name}/pull-request-data', |
@@ -998,11 +1000,11 b' class TestPullrequestsView(object):' | |||
|
998 | 1000 | assert len(target_children) == 1 |
|
999 | 1001 | |
|
1000 | 1002 | expected_origin_link = route_path( |
|
1001 | 'repo_c
|
1003 | 'repo_commits', | |
|
1002 | 1004 | repo_name=pull_request.source_repo.scm_instance().name, |
|
1003 | 1005 | params=dict(branch='origin')) |
|
1004 | 1006 | expected_target_link = route_path( |
|
1005 | 'repo_c
|
1007 | 'repo_commits', | |
|
1006 | 1008 | repo_name=pull_request.target_repo.scm_instance().name, |
|
1007 | 1009 | params=dict(branch='target')) |
|
1008 | 1010 | assert origin_children[0].attrib['href'] == expected_origin_link |
@@ -350,11 +350,11 b' class TestCreateReferenceData(object):' | |||
|
350 | 350 | { |
|
351 | 351 | 'children': [ |
|
352 | 352 | { |
|
353 | 'id': 'a', 'raw_id': 'a_id', 'text': 'a', 'type': 't1', | |
|
353 | 'id': 'a', 'idx': 0, 'raw_id': 'a_id', 'text': 'a', 'type': 't1', | |
|
354 | 354 | 'files_url': expected_files_url + 'a/?at=a', |
|
355 | 355 | }, |
|
356 | 356 | { |
|
357 | 'id': 'b', 'raw_id': 'b_id', 'text': 'b', 'type': 't1', | |
|
357 | 'id': 'b', 'idx': 0, 'raw_id': 'b_id', 'text': 'b', 'type': 't1', | |
|
358 | 358 | 'files_url': expected_files_url + 'b/?at=b', |
|
359 | 359 | } |
|
360 | 360 | ], |
@@ -363,7 +363,7 b' class TestCreateReferenceData(object):' | |||
|
363 | 363 | { |
|
364 | 364 | 'children': [ |
|
365 | 365 | { |
|
366 | 'id': 'c', 'raw_id': 'c_id', 'text': 'c', 'type': 't2', | |
|
366 | 'id': 'c', 'idx': 0, 'raw_id': 'c_id', 'text': 'c', 'type': 't2', | |
|
367 | 367 | 'files_url': expected_files_url + 'c/?at=c', |
|
368 | 368 | } |
|
369 | 369 | ], |
@@ -385,12 +385,12 b' class TestCreateReferenceData(object):' | |||
|
385 | 385 | { |
|
386 | 386 | 'children': [ |
|
387 | 387 | { |
|
388 | 'id': 'a@a_id', 'raw_id': 'a_id', | |
|
388 | 'id': 'a@a_id', 'idx': 0, 'raw_id': 'a_id', | |
|
389 | 389 | 'text': 'a', 'type': 't1', |
|
390 | 390 | 'files_url': expected_files_url + 'a_id/a?at=a', |
|
391 | 391 | }, |
|
392 | 392 | { |
|
393 | 'id': 'b@b_id', 'raw_id': 'b_id', | |
|
393 | 'id': 'b@b_id', 'idx': 0, 'raw_id': 'b_id', | |
|
394 | 394 | 'text': 'b', 'type': 't1', |
|
395 | 395 | 'files_url': expected_files_url + 'b_id/b?at=b', |
|
396 | 396 | } |
@@ -400,7 +400,7 b' class TestCreateReferenceData(object):' | |||
|
400 | 400 | { |
|
401 | 401 | 'children': [ |
|
402 | 402 | { |
|
403 | 'id': 'c@c_id', 'raw_id': 'c_id', | |
|
403 | 'id': 'c@c_id', 'idx': 0, 'raw_id': 'c_id', | |
|
404 | 404 | 'text': 'c', 'type': 't2', |
|
405 | 405 | 'files_url': expected_files_url + 'c_id/c?at=c', |
|
406 | 406 | } |
@@ -516,6 +516,7 b' class TestReferenceItems(object):' | |||
|
516 | 516 | 'text': ref_name, |
|
517 | 517 | 'id': self._format_function(ref_name, ref_id), |
|
518 | 518 | 'raw_id': ref_id, |
|
519 | 'idx': 0, | |
|
519 | 520 | 'type': self.ref_type, |
|
520 | 521 | 'files_url': self.fake_url |
|
521 | 522 | } |
@@ -113,7 +113,7 b' class RepoChangelogView(RepoAppView):' | |||
|
113 | 113 | h.flash('Branch {} is not found.'.format(h.escape(branch_name)), |
|
114 | 114 | category='warning') |
|
115 | 115 | redirect_url = h.route_path( |
|
116 | 'repo_c
|
116 | 'repo_commits_file', repo_name=repo_name, | |
|
117 | 117 | commit_id=branch_name, f_path=f_path or '') |
|
118 | 118 | raise HTTPFound(redirect_url) |
|
119 | 119 | |
@@ -127,13 +127,13 b' class RepoChangelogView(RepoAppView):' | |||
|
127 | 127 | if f_path: |
|
128 | 128 | # changelog for file |
|
129 | 129 | return h.route_path( |
|
130 | 'repo_c
|
130 | 'repo_commits_file', | |
|
131 | 131 | repo_name=c.rhodecode_db_repo.repo_name, |
|
132 | 132 | commit_id=commit_id, f_path=f_path, |
|
133 | 133 | _query=query_params) |
|
134 | 134 | else: |
|
135 | 135 | return h.route_path( |
|
136 | 'repo_c
|
136 | 'repo_commits', | |
|
137 | 137 | repo_name=c.rhodecode_db_repo.repo_name, _query=query_params) |
|
138 | 138 | |
|
139 | 139 | c.total_cs = len(collection) |
@@ -171,11 +171,18 b' class RepoChangelogView(RepoAppView):' | |||
|
171 | 171 | @HasRepoPermissionAnyDecorator( |
|
172 | 172 | 'repository.read', 'repository.write', 'repository.admin') |
|
173 | 173 | @view_config( |
|
174 | route_name='repo_commits', request_method='GET', | |
|
175 | renderer='rhodecode:templates/commits/changelog.mako') | |
|
176 | @view_config( | |
|
177 | route_name='repo_commits_file', request_method='GET', | |
|
178 | renderer='rhodecode:templates/commits/changelog.mako') | |
|
179 | # old routes for backward compat | |
|
180 | @view_config( | |
|
174 | 181 | route_name='repo_changelog', request_method='GET', |
|
175 | renderer='rhodecode:templates/c
|
182 | renderer='rhodecode:templates/commits/changelog.mako') | |
|
176 | 183 | @view_config( |
|
177 | 184 | route_name='repo_changelog_file', request_method='GET', |
|
178 | renderer='rhodecode:templates/c
|
185 | renderer='rhodecode:templates/commits/changelog.mako') | |
|
179 | 186 | def repo_changelog(self): |
|
180 | 187 | c = self.load_default_context() |
|
181 | 188 | |
@@ -224,7 +231,7 b' class RepoChangelogView(RepoAppView):' | |||
|
224 | 231 | except RepositoryError as e: |
|
225 | 232 | h.flash(safe_str(e), category='warning') |
|
226 | 233 | redirect_url = h.route_path( |
|
227 | 'repo_c
|
234 | 'repo_commits', repo_name=self.db_repo_name) | |
|
228 | 235 | raise HTTPFound(redirect_url) |
|
229 | 236 | collection = list(reversed(collection)) |
|
230 | 237 | else: |
@@ -246,14 +253,14 b' class RepoChangelogView(RepoAppView):' | |||
|
246 | 253 | log.exception(safe_str(e)) |
|
247 | 254 | h.flash(safe_str(h.escape(e)), category='error') |
|
248 | 255 | raise HTTPFound( |
|
249 | h.route_path('repo_c
|
256 | h.route_path('repo_commits', repo_name=self.db_repo_name)) | |
|
250 | 257 | |
|
251 | 258 | if partial_xhr or self.request.environ.get('HTTP_X_PJAX'): |
|
252 | 259 | # case when loading dynamic file history in file view |
|
253 | 260 | # loading from ajax, we don't want the first result, it's popped |
|
254 | 261 | # in the code above |
|
255 | 262 | html = render( |
|
256 | 'rhodecode:templates/c
|
263 | 'rhodecode:templates/commits/changelog_file_history.mako', | |
|
257 | 264 | self._get_template_context(c), self.request) |
|
258 | 265 | return Response(html) |
|
259 | 266 | |
@@ -271,14 +278,14 b' class RepoChangelogView(RepoAppView):' | |||
|
271 | 278 | @HasRepoPermissionAnyDecorator( |
|
272 | 279 | 'repository.read', 'repository.write', 'repository.admin') |
|
273 | 280 | @view_config( |
|
274 | route_name='repo_c
275 | renderer='rhodecode:templates/c
281 | route_name='repo_commits_elements', request_method=('GET', 'POST'), | |
|
282 | renderer='rhodecode:templates/commits/changelog_elements.mako', | |
|
276 | 283 | xhr=True) |
|
277 | 284 | @view_config( |
|
278 | route_name='repo_c
279 | renderer='rhodecode:templates/c
285 | route_name='repo_commits_elements_file', request_method=('GET', 'POST'), | |
|
286 | renderer='rhodecode:templates/commits/changelog_elements.mako', | |
|
280 | 287 | xhr=True) |
|
281 | def repo_c
|
288 | def repo_commits_elements(self): | |
|
282 | 289 | c = self.load_default_context() |
|
283 | 290 | commit_id = self.request.matchdict.get('commit_id') |
|
284 | 291 | f_path = self._get_f_path(self.request.matchdict) |
@@ -312,7 +319,7 b' class RepoChangelogView(RepoAppView):' | |||
|
312 | 319 | except (RepositoryError, CommitDoesNotExistError, Exception) as e: |
|
313 | 320 | log.exception(safe_str(e)) |
|
314 | 321 | raise HTTPFound( |
|
315 | h.route_path('repo_c
|
322 | h.route_path('repo_commits', repo_name=self.db_repo_name)) | |
|
316 | 323 | |
|
317 | 324 | collection = base_commit.get_path_history( |
|
318 | 325 | f_path, limit=hist_limit, pre_load=pre_load) |
@@ -105,10 +105,9 b' class RepoCommitsView(RepoAppView):' | |||
|
105 | 105 | |
|
106 | 106 | c.commit_ranges = commits |
|
107 | 107 | if not c.commit_ranges: |
|
108 | raise RepositoryError( | |
|
109 | 'The commit range returned an empty result') | |
|
110 | except CommitDoesNotExistError: | |
|
111 | msg = _('No such commit exists for this repository') | |
|
108 | raise RepositoryError('The commit range returned an empty result') | |
|
109 | except CommitDoesNotExistError as e: | |
|
110 | msg = _('No such commit exists. Org exception: `{}`').format(e) | |
|
112 | 111 | h.flash(msg, category='error') |
|
113 | 112 | raise HTTPNotFound() |
|
114 | 113 | except Exception: |
@@ -214,29 +214,23 b' class RepoCompareView(RepoAppView):' | |||
|
214 | 214 | pre_load = ["author", "branch", "date", "message"] |
|
215 | 215 | c.ancestor = None |
|
216 | 216 | |
|
217 | if c.file_path: | |
|
218 | if source_commit == target_commit: | |
|
219 | c.commit_ranges = [] | |
|
220 | else: | |
|
221 | c.commit_ranges = [target_commit] | |
|
222 | else: | |
|
223 | try: | |
|
224 | c.commit_ranges = source_scm.compare( | |
|
225 | source_commit.raw_id, target_commit.raw_id, | |
|
226 | target_scm, merge, pre_load=pre_load) | |
|
227 | if merge: | |
|
228 | c.ancestor = source_scm.get_common_ancestor( | |
|
229 | source_commit.raw_id, target_commit.raw_id, target_scm) | |
|
230 | except RepositoryRequirementError: | |
|
231 | msg = _('Could not compare repos with different ' | |
|
232 | 'large file settings') | |
|
233 | log.error(msg) | |
|
234 | if partial: | |
|
235 | return Response(msg) | |
|
236 | h.flash(msg, category='error') | |
|
237 | raise HTTPFound( | |
|
238 | h.route_path('repo_compare_select', | |
|
239 | repo_name=self.db_repo_name)) | |
|
217 | try: | |
|
218 | c.commit_ranges = source_scm.compare( | |
|
219 | source_commit.raw_id, target_commit.raw_id, | |
|
220 | target_scm, merge, pre_load=pre_load) or [] | |
|
221 | if merge: | |
|
222 | c.ancestor = source_scm.get_common_ancestor( | |
|
223 | source_commit.raw_id, target_commit.raw_id, target_scm) | |
|
224 | except RepositoryRequirementError: | |
|
225 | msg = _('Could not compare repos with different ' | |
|
226 | 'large file settings') | |
|
227 | log.error(msg) | |
|
228 | if partial: | |
|
229 | return Response(msg) | |
|
230 | h.flash(msg, category='error') | |
|
231 | raise HTTPFound( | |
|
232 | h.route_path('repo_compare_select', | |
|
233 | repo_name=self.db_repo_name)) | |
|
240 | 234 | |
|
241 | 235 | c.statuses = self.db_repo.statuses( |
|
242 | 236 | [x.raw_id for x in c.commit_ranges]) |
This diff has been collapsed as it changes many lines (503 lines changed).
@@ -25,6 +25,7 b' import shutil' | |||
|
25 | 25 | import tempfile |
|
26 | 26 | import collections |
|
27 | 27 | import urllib |
|
28 | import pathlib2 | |
|
28 | 29 | |
|
29 | 30 | from pyramid.httpexceptions import HTTPNotFound, HTTPBadRequest, HTTPFound |
|
30 | 31 | from pyramid.view import view_config |
@@ -42,7 +43,7 b' from rhodecode.lib.exceptions import Non' | |||
|
42 | 43 | from rhodecode.lib.codeblocks import ( |
|
43 | 44 | filenode_as_lines_tokens, filenode_as_annotated_lines_tokens) |
|
44 | 45 | from rhodecode.lib.utils2 import ( |
|
45 | convert_line_endings, detect_mode, safe_str, str2bool, safe_int) | |
|
46 | convert_line_endings, detect_mode, safe_str, str2bool, safe_int, sha1, safe_unicode) | |
|
46 | 47 | from rhodecode.lib.auth import ( |
|
47 | 48 | LoginRequired, HasRepoPermissionAnyDecorator, CSRFRequired) |
|
48 | 49 | from rhodecode.lib.vcs import path as vcspath |
@@ -87,7 +88,7 b' class RepoFilesView(RepoAppView):' | |||
|
87 | 88 | c.enable_downloads = self.db_repo.enable_downloads |
|
88 | 89 | return c |
|
89 | 90 | |
|
90 | def _ensure_not_locked(self): | |
|
91 | def _ensure_not_locked(self, commit_id='tip'): | |
|
91 | 92 | _ = self.request.translate |
|
92 | 93 | |
|
93 | 94 | repo = self.db_repo |
@@ -98,21 +99,41 b' class RepoFilesView(RepoAppView):' | |||
|
98 | 99 | 'warning') |
|
99 | 100 | files_url = h.route_path( |
|
100 | 101 | 'repo_files:default_path', |
|
101 | repo_name=self.db_repo_name, commit_id=
|
102 | repo_name=self.db_repo_name, commit_id=commit_id) | |
|
102 | 103 | raise HTTPFound(files_url) |
|
103 | 104 | |
|
104 | def check_branch_permission(self, branch_name): | |
|
105 | def forbid_non_head(self, is_head, f_path, commit_id='tip', json_mode=False): | |
|
106 | _ = self.request.translate | |
|
107 | ||
|
108 | if not is_head: | |
|
109 | message = _('Cannot modify file. ' | |
|
110 | 'Given commit `{}` is not head of a branch.').format(commit_id) | |
|
111 | h.flash(message, category='warning') | |
|
112 | ||
|
113 | if json_mode: | |
|
114 | return message | |
|
115 | ||
|
116 | files_url = h.route_path( | |
|
117 | 'repo_files', repo_name=self.db_repo_name, commit_id=commit_id, | |
|
118 | f_path=f_path) | |
|
119 | raise HTTPFound(files_url) | |
|
120 | ||
|
121 | def check_branch_permission(self, branch_name, commit_id='tip', json_mode=False): | |
|
105 | 122 | _ = self.request.translate |
|
106 | 123 | |
|
107 | 124 | rule, branch_perm = self._rhodecode_user.get_rule_and_branch_permission( |
|
108 | 125 | self.db_repo_name, branch_name) |
|
109 | 126 | if branch_perm and branch_perm not in ['branch.push', 'branch.push_force']: |
|
110 | h.flash( | |
|
111 | _('Branch `{}` changes forbidden by rule {}.').format(branch_name, rule), | |
|
112 |
|
127 | message = _('Branch `{}` changes forbidden by rule {}.').format( | |
|
128 | branch_name, rule) | |
|
129 | h.flash(message, 'warning') | |
|
130 | ||
|
131 | if json_mode: | |
|
132 | return message | |
|
133 | ||
|
113 | 134 | files_url = h.route_path( |
|
114 | 'repo_files:default_path', | |
|
115 | repo_name=self.db_repo_name, commit_id='tip') | |
|
135 | 'repo_files:default_path', repo_name=self.db_repo_name, commit_id=commit_id) | |
|
136 | ||
|
116 | 137 | raise HTTPFound(files_url) |
|
117 | 138 | |
|
118 | 139 | def _get_commit_and_path(self): |
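The new `forbid_non_head` guard added above can be sketched in isolation. This is a minimal sketch only: `RedirectToFiles` is a hypothetical stand-in for the `HTTPFound` redirect the real view raises, and the message text is the one shown in the diff.

```python
class RedirectToFiles(Exception):
    # Hypothetical stand-in for the HTTPFound redirect used by the real view.
    def __init__(self, commit_id, f_path):
        Exception.__init__(self, commit_id, f_path)


def forbid_non_head(is_head, f_path, commit_id='tip', json_mode=False):
    # Allow the operation only when the given commit is the head of a branch.
    if is_head:
        return None
    message = ('Cannot modify file. '
               'Given commit `{}` is not head of a branch.').format(commit_id)
    if json_mode:
        # API-style callers get the message back instead of a redirect
        return message
    raise RedirectToFiles(commit_id, f_path)
```

The `json_mode` flag mirrors how the view serves both HTML (flash + redirect) and JSON callers from one guard.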
@@ -146,8 +167,7 b' class RepoFilesView(RepoAppView):' | |||
|
146 | 167 | |
|
147 | 168 | _url = h.route_path( |
|
148 | 169 | 'repo_files_add_file', |
|
149 | repo_name=self.db_repo_name, commit_id=0, f_path=''
|
150 | _anchor='edit') | |
|
170 | repo_name=self.db_repo_name, commit_id=0, f_path='') | |
|
151 | 171 | |
|
152 | 172 | if h.HasRepoPermissionAny( |
|
153 | 173 | 'repository.write', 'repository.admin')(self.db_repo_name): |
@@ -185,8 +205,7 b' class RepoFilesView(RepoAppView):' | |||
|
185 | 205 | h.flash(_('No such commit exists for this repository'), category='error') |
|
186 | 206 | raise HTTPNotFound() |
|
187 | 207 | except RepositoryError as e: |
|
188 | log.warning('Repository error while fetching ' | |
|
189 | 'filenode `%s`. Err:%s', path, e) | |
|
208 | log.warning('Repository error while fetching filenode `%s`. Err:%s', path, e) | |
|
190 | 209 | h.flash(safe_str(h.escape(e)), category='error') |
|
191 | 210 | raise HTTPNotFound() |
|
192 | 211 | |
@@ -195,12 +214,7 b' class RepoFilesView(RepoAppView):' | |||
|
195 | 214 | def _is_valid_head(self, commit_id, repo): |
|
196 | 215 | branch_name = sha_commit_id = '' |
|
197 | 216 | is_head = False |
|
198 | ||
|
199 | if h.is_svn(repo) and not repo.is_empty(): | |
|
200 | # Note: Subversion only has one head. | |
|
201 | if commit_id == repo.get_commit(commit_idx=-1).raw_id: | |
|
202 | is_head = True | |
|
203 | return branch_name, sha_commit_id, is_head | |
|
217 | log.debug('Checking if commit_id `%s` is a head for %s.', commit_id, repo) | |
|
204 | 218 | |
|
205 | 219 | for _branch_name, branch_commit_id in repo.branches.items(): |
|
206 | 220 | # simple case we pass in branch name, it's a HEAD |
@@ -216,8 +230,14 b' class RepoFilesView(RepoAppView):' | |||
|
216 | 230 | sha_commit_id = branch_commit_id |
|
217 | 231 | break |
|
218 | 232 | |
|
233 | if h.is_svn(repo) and not repo.is_empty(): | |
|
234 | # Note: Subversion only has one head. | |
|
235 | if commit_id == repo.get_commit(commit_idx=-1).raw_id: | |
|
236 | is_head = True | |
|
237 | return branch_name, sha_commit_id, is_head | |
|
238 | ||
|
219 | 239 | # checked branches, means we only need to try to get the branch/commit_sha |
|
220 | if not repo.is_empty: | |
|
240 | if not repo.is_empty(): | |
|
221 | 241 | commit = repo.get_commit(commit_id=commit_id) |
|
222 | 242 | if commit: |
|
223 | 243 | branch_name = commit.branch |
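The head-detection logic above (branch loop first, SVN special case moved after it) can be sketched as a standalone function. Assumption: `branches` is a mapping of branch name to head commit id, which is what the VCS backends expose via `repo.branches`.

```python
def is_valid_head(commit_id, branches, is_svn=False, svn_tip=None):
    # Returns (branch_name, sha_commit_id, is_head), like the view helper.
    branch_name = sha_commit_id = ''
    is_head = False
    for name, head_id in branches.items():
        if commit_id == name:
            # caller passed a branch name -> always a head
            branch_name, sha_commit_id, is_head = name, head_id, True
            break
        elif commit_id == head_id:
            # caller passed the raw sha of a branch head
            branch_name, sha_commit_id, is_head = name, head_id, True
            break
    if is_svn and commit_id == svn_tip:
        # Note: Subversion only has one head, the latest revision
        is_head = True
    return branch_name, sha_commit_id, is_head
```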
@@ -225,8 +245,7 b' class RepoFilesView(RepoAppView):' | |||
|
225 | 245 | |
|
226 | 246 | return branch_name, sha_commit_id, is_head |
|
227 | 247 | |
|
228 | def _get_tree_at_commit( | |
|
229 | self, c, commit_id, f_path, full_load=False): | |
|
248 | def _get_tree_at_commit(self, c, commit_id, f_path, full_load=False): | |
|
230 | 249 | |
|
231 | 250 | repo_id = self.db_repo.repo_id |
|
232 | 251 | force_recache = self.get_recache_flag() |
@@ -244,16 +263,16 b' class RepoFilesView(RepoAppView):' | |||
|
244 | 263 | |
|
245 | 264 | @region.conditional_cache_on_arguments(namespace=cache_namespace_uid, |
|
246 | 265 | condition=cache_on) |
|
247 | def compute_file_tree(repo_id, commit_id, f_path, full_load): | |
|
248 | log.debug('Generating cached file tree for repo_id: %s, %s, %s', | |
|
249 | repo_id, commit_id, f_path) | |
|
266 | def compute_file_tree(ver, repo_id, commit_id, f_path, full_load): | |
|
267 | log.debug('Generating cached file tree at ver:%s for repo_id: %s, %s, %s', | |
|
268 | ver, repo_id, commit_id, f_path) | |
|
250 | 269 | |
|
251 | 270 | c.full_load = full_load |
|
252 | 271 | return render( |
|
253 | 272 | 'rhodecode:templates/files/files_browser_tree.mako', |
|
254 | 273 | self._get_template_context(c), self.request) |
|
255 | 274 | |
|
256 | return compute_file_tree(self.db_repo.repo_id, commit_id, f_path, full_load) | |
|
275 | return compute_file_tree('v1', self.db_repo.repo_id, commit_id, f_path, full_load) | |
|
257 | 276 | |
|
258 | 277 | def _get_archive_spec(self, fname): |
|
259 | 278 | log.debug('Detecting archive spec for: `%s`', fname) |
@@ -261,8 +280,7 b' class RepoFilesView(RepoAppView):' | |||
|
261 | 280 | fileformat = None |
|
262 | 281 | ext = None |
|
263 | 282 | content_type = None |
|
264 | for a_type,
|
265 | content_type, extension = ext_data | |
|
283 | for a_type, content_type, extension in settings.ARCHIVE_SPECS: | |
|
266 | 284 | |
|
267 | 285 | if fname.endswith(extension): |
|
268 | 286 | fileformat = a_type |
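The loop above now unpacks three-tuples directly from `settings.ARCHIVE_SPECS`. The actual entries of that setting are an assumption here; only the `(archive_type, content_type, extension)` shape is implied by the diff:

```python
# Assumed entries -- only the (a_type, content_type, extension) shape is from the diff
ARCHIVE_SPECS = [
    ('tbz2', 'application/x-bzip2', '.tar.bz2'),
    ('tgz', 'application/x-gzip', '.tar.gz'),
    ('zip', 'application/zip', '.zip'),
]

def get_archive_spec(fname):
    # Match on the filename suffix, mirroring _get_archive_spec above
    for a_type, content_type, extension in ARCHIVE_SPECS:
        if fname.endswith(extension):
            return a_type, extension, content_type
    raise ValueError('Unknown archive type for %s' % fname)
```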
@@ -278,6 +296,15 b' class RepoFilesView(RepoAppView):' | |||
|
278 | 296 | |
|
279 | 297 | return commit_id, ext, fileformat, content_type |
|
280 | 298 | |
|
299 | def create_pure_path(self, *parts): | |
|
300 | # Split paths and sanitize them, removing any ../ etc | |
|
301 | sanitized_path = [ | |
|
302 | x for x in pathlib2.PurePath(*parts).parts | |
|
303 | if x not in ['.', '..']] | |
|
304 | ||
|
305 | pure_path = pathlib2.PurePath(*sanitized_path) | |
|
306 | return pure_path | |
|
307 | ||
|
281 | 308 | @LoginRequired() |
|
282 | 309 | @HasRepoPermissionAnyDecorator( |
|
283 | 310 | 'repository.read', 'repository.write', 'repository.admin') |
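The new `create_pure_path` helper neutralizes path traversal by dropping `.` and `..` segments before rejoining. A sketch with the stdlib `pathlib` (the diff imports the `pathlib2` backport for Python 2); the root-marker filter is an addition here so absolute inputs collapse to repo-relative paths, which is what the updated bad-filename tests expect:

```python
from pathlib import PurePosixPath

def create_pure_path(*parts):
    # Drop '.', '..' and the root marker so '../rel/foo' or '/abs/foo'
    # can never escape, or re-anchor outside, the repository root.
    sanitized = [p for p in PurePosixPath(*parts).parts
                 if p not in ('.', '..', '/')]
    return PurePosixPath(*sanitized)
```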
@@ -289,9 +316,10 b' class RepoFilesView(RepoAppView):' | |||
|
289 | 316 | from rhodecode import CONFIG |
|
290 | 317 | _ = self.request.translate |
|
291 | 318 | self.load_default_context() |
|
292 | ||
|
319 | default_at_path = '/' | |
|
293 | 320 | fname = self.request.matchdict['fname'] |
|
294 | 321 | subrepos = self.request.GET.get('subrepos') == 'true' |
|
322 | at_path = self.request.GET.get('at_path') or default_at_path | |
|
295 | 323 | |
|
296 | 324 | if not self.db_repo.enable_downloads: |
|
297 | 325 | return Response(_('Downloads disabled')) |
@@ -311,10 +339,31 b' class RepoFilesView(RepoAppView):' | |||
|
311 | 339 | except EmptyRepositoryError: |
|
312 | 340 | return Response(_('Empty repository')) |
|
313 | 341 | |
|
314 | archive_name = '%s-%s%s%s' % ( | |
|
315 | safe_str(self.db_repo_name.replace('/', '_')), | |
|
316 | '-sub' if subrepos else '', | |
|
317 | safe_str(commit.short_id), ext) | |
|
342 | try: | |
|
343 | at_path = commit.get_node(at_path).path or default_at_path | |
|
344 | except Exception: | |
|
345 | return Response(_('No node at path {} for this repository').format(at_path)) | |
|
346 | ||
|
347 | path_sha = sha1(at_path)[:8] | |
|
348 | ||
|
349 | # original backward compat name of archive | |
|
350 | clean_name = safe_str(self.db_repo_name.replace('/', '_')) | |
|
351 | short_sha = safe_str(commit.short_id) | |
|
352 | ||
|
353 | if at_path == default_at_path: | |
|
354 | archive_name = '{}-{}{}{}'.format( | |
|
355 | clean_name, | |
|
356 | '-sub' if subrepos else '', | |
|
357 | short_sha, | |
|
358 | ext) | |
|
359 | # custom path and new name | |
|
360 | else: | |
|
361 | archive_name = '{}-{}{}-{}{}'.format( | |
|
362 | clean_name, | |
|
363 | '-sub' if subrepos else '', | |
|
364 | short_sha, | |
|
365 | path_sha, | |
|
366 | ext) | |
|
318 | 367 | |
|
319 | 368 | use_cached_archive = False |
|
320 | 369 | archive_cache_enabled = CONFIG.get( |
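The archive naming above keeps the old `<repo>-<sha><ext>` form for root archives and mixes an 8-character hash of the sub-path into the name otherwise, so cached archives of different sub-trees cannot collide. A sketch, assuming the `sha1` helper imported above is a plain `hashlib.sha1` hex digest:

```python
import hashlib

def archive_name(repo_name, short_sha, ext, at_path='/', subrepos=False):
    clean_name = repo_name.replace('/', '_')
    sub = '-sub' if subrepos else ''
    if at_path == '/':
        # original backward-compatible name
        return '{}-{}{}{}'.format(clean_name, sub, short_sha, ext)
    # custom path: append a short hash of the sub-path
    path_sha = hashlib.sha1(at_path.encode('utf-8')).hexdigest()[:8]
    return '{}-{}{}-{}{}'.format(clean_name, sub, short_sha, path_sha, ext)
```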
@@ -339,7 +388,8 b' class RepoFilesView(RepoAppView):' | |||
|
339 | 388 | fd, archive = tempfile.mkstemp() |
|
340 | 389 | log.debug('Creating new temp archive in %s', archive) |
|
341 | 390 | try: |
|
342 | commit.archive_repo(archive, kind=fileformat, subrepos=subrepos
|
391 | commit.archive_repo(archive, kind=fileformat, subrepos=subrepos, | |
|
392 | archive_at_path=at_path) | |
|
343 | 393 | except ImproperArchiveTypeError: |
|
344 | 394 | return _('Unknown archive type') |
|
345 | 395 | if archive_cache_enabled: |
@@ -632,8 +682,7 b' class RepoFilesView(RepoAppView):' | |||
|
632 | 682 | c.authors = [] |
|
633 | 683 | # this loads a simple tree without metadata to speed things up |
|
634 | 684 | # later via ajax we call repo_nodetree_full and fetch whole |
|
635 | c.file_tree = self._get_tree_at_commit( | |
|
636 | c, c.commit.raw_id, f_path) | |
|
685 | c.file_tree = self._get_tree_at_commit(c, c.commit.raw_id, f_path) | |
|
637 | 686 | |
|
638 | 687 | except RepositoryError as e: |
|
639 | 688 | h.flash(safe_str(h.escape(e)), category='error') |
@@ -875,18 +924,17 b' class RepoFilesView(RepoAppView):' | |||
|
875 | 924 | self.db_repo_name, self.db_repo.repo_id, commit.raw_id, f_path) |
|
876 | 925 | return {'nodes': metadata} |
|
877 | 926 | |
|
878 | def _create_references( | |
|
879 | self, branches_or_tags, symbolic_reference, f_path): | |
|
927 | def _create_references(self, branches_or_tags, symbolic_reference, f_path, ref_type): | |
|
880 | 928 | items = [] |
|
881 | 929 | for name, commit_id in branches_or_tags.items(): |
|
882 | sym_ref = symbolic_reference(commit_id, name, f_path) | |
|
883 | items.append((sym_ref, name)) | |
|
930 | sym_ref = symbolic_reference(commit_id, name, f_path, ref_type) | |
|
931 | items.append((sym_ref, name, ref_type)) | |
|
884 | 932 | return items |
|
885 | 933 | |
|
886 | def _symbolic_reference(self, commit_id, name, f_path): | |
|
934 | def _symbolic_reference(self, commit_id, name, f_path, ref_type): | |
|
887 | 935 | return commit_id |
|
888 | 936 | |
|
889 | def _symbolic_reference_svn(self, commit_id, name, f_path): | |
|
937 | def _symbolic_reference_svn(self, commit_id, name, f_path, ref_type): | |
|
890 | 938 | new_f_path = vcspath.join(name, f_path) |
|
891 | 939 | return u'%s@%s' % (new_f_path, commit_id) |
|
892 | 940 | |
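The two `_symbolic_reference*` helpers above differ only for Subversion, where branches and tags are plain directories inside the repository. A standalone sketch, with stdlib `posixpath` standing in for the `vcspath` module used in the view:

```python
import posixpath

def symbolic_reference(commit_id, name, f_path, ref_type):
    # default backends: the commit id itself identifies the reference
    return commit_id

def symbolic_reference_svn(commit_id, name, f_path, ref_type):
    # svn keeps refs as directories, so fold the ref name into the path
    # and pin the revision with '@': '<name>/<f_path>@<commit_id>'
    new_f_path = posixpath.join(name, f_path)
    return u'%s@%s' % (new_f_path, commit_id)

print(symbolic_reference_svn('1024', 'branches/stable', 'setup.py', 'branch'))
# -> branches/stable/setup.py@1024
```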
@@ -916,7 +964,7 b' class RepoFilesView(RepoAppView):' | |||
|
916 | 964 | for commit in commits: |
|
917 | 965 | branch = ' (%s)' % commit.branch if commit.branch else '' |
|
918 | 966 | n_desc = 'r%s:%s%s' % (commit.idx, commit.short_id, branch) |
|
919 | commits_group[0].append((commit.raw_id, n_desc,)) | |
|
967 | commits_group[0].append((commit.raw_id, n_desc, 'sha')) | |
|
920 | 968 | history.append(commits_group) |
|
921 | 969 | |
|
922 | 970 | symbolic_reference = self._symbolic_reference |
@@ -932,11 +980,11 b' class RepoFilesView(RepoAppView):' | |||
|
932 | 980 | symbolic_reference = self._symbolic_reference_svn |
|
933 | 981 | |
|
934 | 982 | branches = self._create_references( |
|
935 | self.rhodecode_vcs_repo.branches, symbolic_reference, f_path) | |
|
983 | self.rhodecode_vcs_repo.branches, symbolic_reference, f_path, 'branch') | |
|
936 | 984 | branches_group = (branches, _("Branches")) |
|
937 | 985 | |
|
938 | 986 | tags = self._create_references( |
|
939 | self.rhodecode_vcs_repo.tags, symbolic_reference, f_path) | |
|
987 | self.rhodecode_vcs_repo.tags, symbolic_reference, f_path, 'tag') | |
|
940 | 988 | tags_group = (tags, _("Tags")) |
|
941 | 989 | |
|
942 | 990 | history.append(branches_group) |
@@ -964,7 +1012,7 b' class RepoFilesView(RepoAppView):' | |||
|
964 | 1012 | for obj in file_history: |
|
965 | 1013 | res.append({ |
|
966 | 1014 | 'text': obj[1], |
|
967 | 'children': [{'id': o[0], 'text': o[1]} for o in obj[0]] | |
|
1015 | 'children': [{'id': o[0], 'text': o[1], 'type': o[2]} for o in obj[0]] | |
|
968 | 1016 | }) |
|
969 | 1017 | |
|
970 | 1018 | data = { |
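With `ref_type` threaded through, each history item becomes an `(id, text, type)` triple and the view serializes the groups into a select2-style payload. A sketch of that shape (the helper name is hypothetical):

```python
def to_select2(file_history):
    # each group is (items, label); each item is (id, text, ref_type),
    # matching the triples produced by _create_references() above
    res = []
    for items, group_label in file_history:
        res.append({
            'text': group_label,
            'children': [{'id': i, 'text': t, 'type': rt}
                         for i, t, rt in items],
        })
    return res

history = [
    ([('abc123', 'r1:abc123 (default)', 'sha')], 'Changesets'),
    ([('def456', 'stable', 'branch')], 'Branches'),
]
print(to_select2(history)[1]['children'])
```

Carrying the `type` field to the client lets the frontend render commits, branches and tags differently inside one dropdown.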
@@ -1035,15 +1083,9 b' class RepoFilesView(RepoAppView):' | |||
|
1035 | 1083 | _branch_name, _sha_commit_id, is_head = \ |
|
1036 | 1084 | self._is_valid_head(commit_id, self.rhodecode_vcs_repo) |
|
1037 | 1085 | |
|
1038 | if not is_head: | |
|
1039 | h.flash(_('You can only delete files with commit ' | |
|
1040 | 'being a valid branch head.'), category='warning') | |
|
1041 | raise HTTPFound( | |
|
1042 | h.route_path('repo_files', | |
|
1043 | repo_name=self.db_repo_name, commit_id='tip', | |
|
1044 | f_path=f_path)) | |
|
1086 | self.forbid_non_head(is_head, f_path) | |
|
1087 | self.check_branch_permission(_branch_name) | |
|
1045 | 1088 | |
|
1046 | self.check_branch_permission(_branch_name) | |
|
1047 | 1089 | c.commit = self._get_commit_or_redirect(commit_id) |
|
1048 | 1090 | c.file = self._get_filenode_or_redirect(c.commit, f_path) |
|
1049 | 1091 | |
@@ -1069,13 +1111,7 b' class RepoFilesView(RepoAppView):' | |||
|
1069 | 1111 | _branch_name, _sha_commit_id, is_head = \ |
|
1070 | 1112 | self._is_valid_head(commit_id, self.rhodecode_vcs_repo) |
|
1071 | 1113 | |
|
1072 | if not is_head: | |
|
1073 | h.flash(_('You can only delete files with commit ' | |
|
1074 | 'being a valid branch head.'), category='warning') | |
|
1075 | raise HTTPFound( | |
|
1076 | h.route_path('repo_files', | |
|
1077 | repo_name=self.db_repo_name, commit_id='tip', | |
|
1078 | f_path=f_path)) | |
|
1114 | self.forbid_non_head(is_head, f_path) | |
|
1079 | 1115 | self.check_branch_permission(_branch_name) |
|
1080 | 1116 | |
|
1081 | 1117 | c.commit = self._get_commit_or_redirect(commit_id) |
@@ -1125,14 +1161,8 b' class RepoFilesView(RepoAppView):' | |||
|
1125 | 1161 | _branch_name, _sha_commit_id, is_head = \ |
|
1126 | 1162 | self._is_valid_head(commit_id, self.rhodecode_vcs_repo) |
|
1127 | 1163 | |
|
1128 | if not is_head: | |
|
1129 | h.flash(_('You can only edit files with commit ' | |
|
1130 | 'being a valid branch head.'), category='warning') | |
|
1131 | raise HTTPFound( | |
|
1132 | h.route_path('repo_files', | |
|
1133 | repo_name=self.db_repo_name, commit_id='tip', | |
|
1134 | f_path=f_path)) | |
|
1135 | self.check_branch_permission(_branch_name) | |
|
1164 | self.forbid_non_head(is_head, f_path, commit_id=commit_id) | |
|
1165 | self.check_branch_permission(_branch_name, commit_id=commit_id) | |
|
1136 | 1166 | |
|
1137 | 1167 | c.commit = self._get_commit_or_redirect(commit_id) |
|
1138 | 1168 | c.file = self._get_filenode_or_redirect(c.commit, f_path) |
@@ -1144,8 +1174,7 b' class RepoFilesView(RepoAppView):' | |||
|
1144 | 1174 | commit_id=c.commit.raw_id, f_path=f_path) |
|
1145 | 1175 | raise HTTPFound(files_url) |
|
1146 | 1176 | |
|
1147 | c.default_message = _( | |
|
1148 | 'Edited file {} via RhodeCode Enterprise').format(f_path) | |
|
1177 | c.default_message = _('Edited file {} via RhodeCode Enterprise').format(f_path) | |
|
1149 | 1178 | c.f_path = f_path |
|
1150 | 1179 | |
|
1151 | 1180 | return self._get_template_context(c) |
@@ -1162,32 +1191,23 b' class RepoFilesView(RepoAppView):' | |||
|
1162 | 1191 | commit_id, f_path = self._get_commit_and_path() |
|
1163 | 1192 | |
|
1164 | 1193 | self._ensure_not_locked() |
|
1165 | _branch_name, _sha_commit_id, is_head = \ | |
|
1166 | self._is_valid_head(commit_id, self.rhodecode_vcs_repo) | |
|
1167 | ||
|
1168 | if not is_head: | |
|
1169 | h.flash(_('You can only edit files with commit ' | |
|
1170 | 'being a valid branch head.'), category='warning') | |
|
1171 | raise HTTPFound( | |
|
1172 | h.route_path('repo_files', | |
|
1173 | repo_name=self.db_repo_name, commit_id='tip', | |
|
1174 | f_path=f_path)) | |
|
1175 | ||
|
1176 | self.check_branch_permission(_branch_name) | |
|
1177 | 1194 | |
|
1178 | 1195 | c.commit = self._get_commit_or_redirect(commit_id) |
|
1179 | 1196 | c.file = self._get_filenode_or_redirect(c.commit, f_path) |
|
1180 | 1197 | |
|
1181 | 1198 | if c.file.is_binary: |
|
1182 | raise HTTPFound( | |
|
1183 | h.route_path('repo_files', | |
|
1184 | repo_name=self.db_repo_name, | |
|
1185 | commit_id=c.commit.raw_id, | |
|
1186 | f_path=f_path)) | |
|
1199 | raise HTTPFound(h.route_path('repo_files', repo_name=self.db_repo_name, | |
|
1200 | commit_id=c.commit.raw_id, f_path=f_path)) | |
|
1201 | ||
|
1202 | _branch_name, _sha_commit_id, is_head = \ | |
|
1203 | self._is_valid_head(commit_id, self.rhodecode_vcs_repo) | |
|
1187 | 1204 | |
|
1188 | c.default_message = _( | |
|
1189 | 'Edited file {} via RhodeCode Enterprise').format(f_path) | |
|
1205 | self.forbid_non_head(is_head, f_path, commit_id=commit_id) | |
|
1206 | self.check_branch_permission(_branch_name, commit_id=commit_id) | |
|
1207 | ||
|
1208 | c.default_message = _('Edited file {} via RhodeCode Enterprise').format(f_path) | |
|
1190 | 1209 | c.f_path = f_path |
|
1210 | ||
|
1191 | 1211 | old_content = c.file.content |
|
1192 | 1212 | sl = old_content.splitlines(1) |
|
1193 | 1213 | first_line = sl[0] if sl else '' |
@@ -1198,20 +1218,25 b' class RepoFilesView(RepoAppView):' | |||
|
1198 | 1218 | content = convert_line_endings(r_post.get('content', ''), line_ending_mode) |
|
1199 | 1219 | |
|
1200 | 1220 | message = r_post.get('message') or c.default_message |
|
1201 | org_f_path = c.file.unicode_path |

1221 | org_node_path = c.file.unicode_path | |
|
1202 | 1222 | filename = r_post['filename'] |
|
1203 | org_filename = c.file.name | |
|
1223 | ||
|
1224 | root_path = c.file.dir_path | |
|
1225 | pure_path = self.create_pure_path(root_path, filename) | |
|
1226 | node_path = safe_unicode(bytes(pure_path)) | |
|
1204 | 1227 | |
|
1205 | if content == old_content and filename == org_filename: | |
|
1206 | h.flash(_('No changes'), category='warning') | |
|
1207 | raise HTTPFound( | |
|
1208 | h.route_path('repo_commit', repo_name=self.db_repo_name, | |
|
1209 | commit_id='tip')) |

1228 | default_redirect_url = h.route_path('repo_commit', repo_name=self.db_repo_name, | |
|
1229 | commit_id=commit_id) | |
|
1230 | if content == old_content and node_path == org_node_path: | |
|
1231 | h.flash(_('No changes detected on {}').format(org_node_path), | |
|
1232 | category='warning') | |
|
1233 | raise HTTPFound(default_redirect_url) | |
|
1234 | ||
|
1210 | 1235 | try: |
|
1211 | 1236 | mapping = { |
|
1212 | org_f_path: { |

1213 | 'org_filename': org_filename, |

1214 | 'filename': filename, |

1237 | org_node_path: { | |
|
1238 | 'org_filename': org_node_path, | |
|
1239 | 'filename': node_path, | |
|
1215 | 1240 | 'content': content, |
|
1216 | 1241 | 'lexer': '', |
|
1217 | 1242 | 'op': 'mod', |
@@ -1219,7 +1244,7 b' class RepoFilesView(RepoAppView):' | |||
|
1219 | 1244 | } |
|
1220 | 1245 | } |
|
1221 | 1246 | |
|
1222 | ScmModel().update_nodes( | |
|
1247 | commit = ScmModel().update_nodes( | |
|
1223 | 1248 | user=self._rhodecode_db_user.user_id, |
|
1224 | 1249 | repo=self.db_repo, |
|
1225 | 1250 | message=message, |
@@ -1227,21 +1252,25 b' class RepoFilesView(RepoAppView):' | |||
|
1227 | 1252 | parent_commit=c.commit, |
|
1228 | 1253 | ) |
|
1229 | 1254 | |
|
1230 | h.flash( | |
|
1231 | _('Successfully committed changes to file `{}`').format( | |
|
1255 | h.flash(_('Successfully committed changes to file `{}`').format( | |
|
1232 | 1256 | h.escape(f_path)), category='success') |
|
1257 | default_redirect_url = h.route_path( | |
|
1258 | 'repo_commit', repo_name=self.db_repo_name, commit_id=commit.raw_id) | |
|
1259 | ||
|
1233 | 1260 | except Exception: |
|
1234 | 1261 | log.exception('Error occurred during commit') |
|
1235 | 1262 | h.flash(_('Error occurred during commit'), category='error') |
|
1236 | raise HTTPFound( | |
|
1237 | h.route_path('repo_commit', repo_name=self.db_repo_name, | |
|
1238 | commit_id='tip')) | |
|
1263 | ||
|
1264 | raise HTTPFound(default_redirect_url) | |
|
1239 | 1265 | |
|
1240 | 1266 | @LoginRequired() |
|
1241 | 1267 | @HasRepoPermissionAnyDecorator('repository.write', 'repository.admin') |
|
1242 | 1268 | @view_config( |
|
1243 | 1269 | route_name='repo_files_add_file', request_method='GET', |
|
1244 | 1270 | renderer='rhodecode:templates/files/files_add.mako') |
|
1271 | @view_config( | |
|
1272 | route_name='repo_files_upload_file', request_method='GET', | |
|
1273 | renderer='rhodecode:templates/files/files_upload.mako') | |
|
1245 | 1274 | def repo_files_add_file(self): |
|
1246 | 1275 | _ = self.request.translate |
|
1247 | 1276 | c = self.load_default_context() |
@@ -1252,27 +1281,20 b' class RepoFilesView(RepoAppView):' | |||
|
1252 | 1281 | c.commit = self._get_commit_or_redirect(commit_id, redirect_after=False) |
|
1253 | 1282 | if c.commit is None: |
|
1254 | 1283 | c.commit = EmptyCommit(alias=self.rhodecode_vcs_repo.alias) |
|
1255 | c.default_message = (_('Added file via RhodeCode Enterprise')) | |
|
1256 | c.f_path = f_path.lstrip('/') # ensure not relative path | |
|
1257 | 1284 | |
|
1258 | if self.rhodecode_vcs_repo.is_empty: | |
|
1285 | if self.rhodecode_vcs_repo.is_empty(): | |
|
1259 | 1286 | # for empty repository we cannot check for current branch, we rely on |
|
1260 | 1287 | # c.commit.branch instead |
|
1261 | _branch_name = c.commit.branch | |
|
1262 | is_head = True | |
|
1288 | _branch_name, _sha_commit_id, is_head = c.commit.branch, '', True | |
|
1263 | 1289 | else: |
|
1264 | 1290 | _branch_name, _sha_commit_id, is_head = \ |
|
1265 | 1291 | self._is_valid_head(commit_id, self.rhodecode_vcs_repo) |
|
1266 | 1292 | |
|
1267 | if not is_head: | |
|
1268 | h.flash(_('You can only add files with commit ' | |
|
1269 | 'being a valid branch head.'), category='warning') | |
|
1270 | raise HTTPFound( | |
|
1271 | h.route_path('repo_files', | |
|
1272 | repo_name=self.db_repo_name, commit_id='tip', | |
|
1273 | f_path=f_path)) | |
|
1293 | self.forbid_non_head(is_head, f_path, commit_id=commit_id) | |
|
1294 | self.check_branch_permission(_branch_name, commit_id=commit_id) | |
|
1274 | 1295 | |
|
1275 | self.check_branch_permission(_branch_name) | |
|
1296 | c.default_message = (_('Added file via RhodeCode Enterprise')) | |
|
1297 | c.f_path = f_path.lstrip('/') # ensure not relative path | |
|
1276 | 1298 | |
|
1277 | 1299 | return self._get_template_context(c) |
|
1278 | 1300 | |
@@ -1289,86 +1311,62 b' class RepoFilesView(RepoAppView):' | |||
|
1289 | 1311 | |
|
1290 | 1312 | self._ensure_not_locked() |
|
1291 | 1313 | |
|
1292 | r_post = self.request.POST | |
|
1293 | ||
|
1294 | c.commit = self._get_commit_or_redirect( | |
|
1295 | commit_id, redirect_after=False) | |
|
1314 | c.commit = self._get_commit_or_redirect(commit_id, redirect_after=False) | |
|
1296 | 1315 | if c.commit is None: |
|
1297 | 1316 | c.commit = EmptyCommit(alias=self.rhodecode_vcs_repo.alias) |
|
1298 | 1317 | |
|
1299 | if self.rhodecode_vcs_repo.is_empty: | |
|
1300 | # for empty repository we cannot check for current branch, we rely on | |
|
1301 | # c.commit.branch instead | |
|
1302 | _branch_name = c.commit.branch | |
|
1303 | is_head = True | |
|
1304 | else: | |
|
1305 | _branch_name, _sha_commit_id, is_head = \ | |
|
1306 | self._is_valid_head(commit_id, self.rhodecode_vcs_repo) | |
|
1307 | ||
|
1308 | if not is_head: | |
|
1309 | h.flash(_('You can only add files with commit ' | |
|
1310 | 'being a valid branch head.'), category='warning') | |
|
1311 | raise HTTPFound( | |
|
1312 | h.route_path('repo_files', | |
|
1313 | repo_name=self.db_repo_name, commit_id='tip', | |
|
1314 | f_path=f_path)) | |
|
1315 | ||
|
1316 | self.check_branch_permission(_branch_name) | |
|
1317 | ||
|
1318 | c.default_message = (_('Added file via RhodeCode Enterprise')) | |
|
1319 | c.f_path = f_path | |
|
1320 | unix_mode = 0 | |
|
1321 | content = convert_line_endings(r_post.get('content', ''), unix_mode) | |
|
1322 | ||
|
1323 | message = r_post.get('message') or c.default_message | |
|
1324 | filename = r_post.get('filename') | |
|
1325 | location = r_post.get('location', '') # dir location | |
|
1326 | file_obj = r_post.get('upload_file', None) | |
|
1327 | ||
|
1328 | if file_obj is not None and hasattr(file_obj, 'filename'): | |
|
1329 | filename = r_post.get('filename_upload') | |
|
1330 | content = file_obj.file | |
|
1331 | ||
|
1332 | if hasattr(content, 'file'): | |
|
1333 | # non posix systems store real file under file attr | |
|
1334 | content = content.file | |
|
1335 | ||
|
1336 | if self.rhodecode_vcs_repo.is_empty: | |
|
1318 | # calculate redirect URL | |
|
1319 | if self.rhodecode_vcs_repo.is_empty(): | |
|
1337 | 1320 | default_redirect_url = h.route_path( |
|
1338 | 1321 | 'repo_summary', repo_name=self.db_repo_name) |
|
1339 | 1322 | else: |
|
1340 | 1323 | default_redirect_url = h.route_path( |
|
1341 | 1324 | 'repo_commit', repo_name=self.db_repo_name, commit_id='tip') |
|
1342 | 1325 | |
|
1343 | # If there's no commit, redirect to repo summary | |
|
1344 | if type(c.commit) is EmptyCommit: | |
|
1345 | redirect_url = h.route_path( | |
|
1346 | 'repo_summary', repo_name=self.db_repo_name) | |
|
1326 | if self.rhodecode_vcs_repo.is_empty(): | |
|
1327 | # for empty repository we cannot check for current branch, we rely on | |
|
1328 | # c.commit.branch instead | |
|
1329 | _branch_name, _sha_commit_id, is_head = c.commit.branch, '', True | |
|
1347 | 1330 | else: |
|
1348 | redirect_url = default_redirect_url | |
|
1331 | _branch_name, _sha_commit_id, is_head = \ | |
|
1332 | self._is_valid_head(commit_id, self.rhodecode_vcs_repo) | |
|
1333 | ||
|
1334 | self.forbid_non_head(is_head, f_path, commit_id=commit_id) | |
|
1335 | self.check_branch_permission(_branch_name, commit_id=commit_id) | |
|
1336 | ||
|
1337 | c.default_message = (_('Added file via RhodeCode Enterprise')) | |
|
1338 | c.f_path = f_path | |
|
1339 | ||
|
1340 | r_post = self.request.POST | |
|
1341 | message = r_post.get('message') or c.default_message | |
|
1342 | filename = r_post.get('filename') | |
|
1343 | unix_mode = 0 | |
|
1344 | content = convert_line_endings(r_post.get('content', ''), unix_mode) | |
|
1349 | 1345 | |
|
1350 | 1346 | if not filename: |
|
1351 | h.flash(_('No filename'), category='warning') | |
|
1347 | # If there's no commit, redirect to repo summary | |
|
1348 | if type(c.commit) is EmptyCommit: | |
|
1349 | redirect_url = h.route_path( | |
|
1350 | 'repo_summary', repo_name=self.db_repo_name) | |
|
1351 | else: | |
|
1352 | redirect_url = default_redirect_url | |
|
1353 | h.flash(_('No filename specified'), category='warning') | |
|
1352 | 1354 | raise HTTPFound(redirect_url) |
|
1353 | 1355 | |
|
1354 | # extract the location from filename, | |
|
1355 | # allows using foo/bar.txt syntax to create subdirectories | |
|
1356 | subdir_loc = filename.rsplit('/', 1) | |
|
1357 | if len(subdir_loc) == 2: | |
|
1358 | location = os.path.join(location, subdir_loc[0]) | |
|
1356 | root_path = f_path | |
|
1357 | pure_path = self.create_pure_path(root_path, filename) | |
|
1358 | node_path = safe_unicode(bytes(pure_path).lstrip('/')) | |
|
1359 | 1359 | |
|
1360 | # strip all crap out of file, just leave the basename | |
|
1361 | filename = os.path.basename(filename) | |
|
1362 | node_path = os.path.join(location, filename) | |
|
1363 | 1360 | author = self._rhodecode_db_user.full_contact |
|
1361 | nodes = { | |
|
1362 | node_path: { | |
|
1363 | 'content': content | |
|
1364 | } | |
|
1365 | } | |
|
1364 | 1366 | |
|
1365 | 1367 | try: |
|
1366 | nodes = { | |
|
1367 | node_path: { | |
|
1368 | 'content': content | |
|
1369 | } | |
|
1370 | } | |
|
1371 | ScmModel().create_nodes( | |
|
1368 | ||
|
1369 | commit = ScmModel().create_nodes( | |
|
1372 | 1370 | user=self._rhodecode_db_user.user_id, |
|
1373 | 1371 | repo=self.db_repo, |
|
1374 | 1372 | message=message, |
@@ -1377,14 +1375,16 b' class RepoFilesView(RepoAppView):' | |||
|
1377 | 1375 | author=author, |
|
1378 | 1376 | ) |
|
1379 | 1377 | |
|
1380 | h.flash( | |
|
1381 | _('Successfully committed new file `{}`').format( | |
|
1378 | h.flash(_('Successfully committed new file `{}`').format( | |
|
1382 | 1379 | h.escape(node_path)), category='success') |
|
1380 | ||
|
1381 | default_redirect_url = h.route_path( | |
|
1382 | 'repo_commit', repo_name=self.db_repo_name, commit_id=commit.raw_id) | |
|
1383 | ||
|
1383 | 1384 | except NonRelativePathError: |
|
1384 | 1385 | log.exception('Non Relative path found') |
|
1385 | h.flash(_( | |
|
1386 | 'The location specified must be a relative path and must not ' | |
|
1387 | 'contain .. in the path'), category='warning') | |
|
1386 | h.flash(_('The location specified must be a relative path and must not ' | |
|
1387 | 'contain .. in the path'), category='warning') | |
|
1388 | 1388 | raise HTTPFound(default_redirect_url) |
|
1389 | 1389 | except (NodeError, NodeAlreadyExistsError) as e: |
|
1390 | 1390 | h.flash(_(h.escape(e)), category='error') |
@@ -1393,3 +1393,134 b' class RepoFilesView(RepoAppView):' | |||
|
1393 | 1393 | h.flash(_('Error occurred during commit'), category='error') |
|
1394 | 1394 | |
|
1395 | 1395 | raise HTTPFound(default_redirect_url) |
|
1396 | ||
|
1397 | @LoginRequired() | |
|
1398 | @HasRepoPermissionAnyDecorator('repository.write', 'repository.admin') | |
|
1399 | @CSRFRequired() | |
|
1400 | @view_config( | |
|
1401 | route_name='repo_files_upload_file', request_method='POST', | |
|
1402 | renderer='json_ext') | |
|
1403 | def repo_files_upload_file(self): | |
|
1404 | _ = self.request.translate | |
|
1405 | c = self.load_default_context() | |
|
1406 | commit_id, f_path = self._get_commit_and_path() | |
|
1407 | ||
|
1408 | self._ensure_not_locked() | |
|
1409 | ||
|
1410 | c.commit = self._get_commit_or_redirect(commit_id, redirect_after=False) | |
|
1411 | if c.commit is None: | |
|
1412 | c.commit = EmptyCommit(alias=self.rhodecode_vcs_repo.alias) | |
|
1413 | ||
|
1414 | # calculate redirect URL | |
|
1415 | if self.rhodecode_vcs_repo.is_empty(): | |
|
1416 | default_redirect_url = h.route_path( | |
|
1417 | 'repo_summary', repo_name=self.db_repo_name) | |
|
1418 | else: | |
|
1419 | default_redirect_url = h.route_path( | |
|
1420 | 'repo_commit', repo_name=self.db_repo_name, commit_id='tip') | |
|
1421 | ||
|
1422 | if self.rhodecode_vcs_repo.is_empty(): | |
|
1423 | # for empty repository we cannot check for current branch, we rely on | |
|
1424 | # c.commit.branch instead | |
|
1425 | _branch_name, _sha_commit_id, is_head = c.commit.branch, '', True | |
|
1426 | else: | |
|
1427 | _branch_name, _sha_commit_id, is_head = \ | |
|
1428 | self._is_valid_head(commit_id, self.rhodecode_vcs_repo) | |
|
1429 | ||
|
1430 | error = self.forbid_non_head(is_head, f_path, json_mode=True) | |
|
1431 | if error: | |
|
1432 | return { | |
|
1433 | 'error': error, | |
|
1434 | 'redirect_url': default_redirect_url | |
|
1435 | } | |
|
1436 | error = self.check_branch_permission(_branch_name, json_mode=True) | |
|
1437 | if error: | |
|
1438 | return { | |
|
1439 | 'error': error, | |
|
1440 | 'redirect_url': default_redirect_url | |
|
1441 | } | |
|
1442 | ||
|
1443 | c.default_message = (_('Uploaded file via RhodeCode Enterprise')) | |
|
1444 | c.f_path = f_path | |
|
1445 | ||
|
1446 | r_post = self.request.POST | |
|
1447 | ||
|
1448 | message = c.default_message | |
|
1449 | user_message = r_post.getall('message') | |
|
1450 | if isinstance(user_message, list) and user_message: | |
|
1451 | # we take the first from duplicated results if it's not empty | |
|
1452 | message = user_message[0] if user_message[0] else message | |
|
1453 | ||
|
1454 | nodes = {} | |
|
1455 | ||
|
1456 | for file_obj in r_post.getall('files_upload') or []: | |
|
1457 | content = file_obj.file | |
|
1458 | filename = file_obj.filename | |
|
1459 | ||
|
1460 | root_path = f_path | |
|
1461 | pure_path = self.create_pure_path(root_path, filename) | |
|
1462 | node_path = safe_unicode(bytes(pure_path).lstrip('/')) | |
|
1463 | ||
|
1464 | nodes[node_path] = { | |
|
1465 | 'content': content | |
|
1466 | } | |
|
1467 | ||
|
1468 | if not nodes: | |
|
1469 | error = 'missing files' | |
|
1470 | return { | |
|
1471 | 'error': error, | |
|
1472 | 'redirect_url': default_redirect_url | |
|
1473 | } | |
|
1474 | ||
|
1475 | author = self._rhodecode_db_user.full_contact | |
|
1476 | ||
|
1477 | try: | |
|
1478 | commit = ScmModel().create_nodes( | |
|
1479 | user=self._rhodecode_db_user.user_id, | |
|
1480 | repo=self.db_repo, | |
|
1481 | message=message, | |
|
1482 | nodes=nodes, | |
|
1483 | parent_commit=c.commit, | |
|
1484 | author=author, | |
|
1485 | ) | |
|
1486 | if len(nodes) == 1: |

1487 | flash_message = _('Successfully committed 1 new file') |

1488 | else: |

1489 | flash_message = _('Successfully committed {} new files').format(len(nodes)) |
|
1490 | ||
|
1491 | h.flash(flash_message, category='success') | |
|
1492 | ||
|
1493 | default_redirect_url = h.route_path( | |
|
1494 | 'repo_commit', repo_name=self.db_repo_name, commit_id=commit.raw_id) | |
|
1495 | ||
|
1496 | except NonRelativePathError: | |
|
1497 | log.exception('Non Relative path found') | |
|
1498 | error = _('The location specified must be a relative path and must not ' | |
|
1499 | 'contain .. in the path') | |
|
1500 | h.flash(error, category='warning') | |
|
1501 | ||
|
1502 | return { | |
|
1503 | 'error': error, | |
|
1504 | 'redirect_url': default_redirect_url | |
|
1505 | } | |
|
1506 | except (NodeError, NodeAlreadyExistsError) as e: | |
|
1507 | error = h.escape(e) | |
|
1508 | h.flash(error, category='error') | |
|
1509 | ||
|
1510 | return { | |
|
1511 | 'error': error, | |
|
1512 | 'redirect_url': default_redirect_url | |
|
1513 | } | |
|
1514 | except Exception: | |
|
1515 | log.exception('Error occurred during commit') | |
|
1516 | error = _('Error occurred during commit') | |
|
1517 | h.flash(error, category='error') | |
|
1518 | return { | |
|
1519 | 'error': error, | |
|
1520 | 'redirect_url': default_redirect_url | |
|
1521 | } | |
|
1522 | ||
|
1523 | return { | |
|
1524 | 'error': None, | |
|
1525 | 'redirect_url': default_redirect_url | |
|
1526 | } |
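Both the edit and upload paths above build the final node path via `create_pure_path()` and then strip the leading slash. A rough equivalent with stdlib `pathlib`, assuming the helper rejects `..` traversal (the real code raises `NonRelativePathError`; a plain `ValueError` is used here):

```python
from pathlib import PurePosixPath

def build_node_path(root_path, filename):
    # join the target directory and the (possibly nested) filename
    pure_path = PurePosixPath(root_path, filename)
    if '..' in pure_path.parts:
        # the view maps this to NonRelativePathError and flashes a warning
        raise ValueError('path must be relative and must not contain ..')
    # stored node paths are repo-relative: drop any leading '/'
    return str(pure_path).lstrip('/')

print(build_node_path('/docs', 'images/logo.png'))
# -> docs/images/logo.png
```

Using a pure path keeps the check platform-independent: repository node paths are always POSIX-style, regardless of the server OS.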
@@ -72,6 +72,7 b' class RepoSummaryView(RepoAppView):' | |||
|
72 | 72 | log.debug("Searching for a README file.") |
|
73 | 73 | readme_node = ReadmeFinder(_renderer_type).search(commit) |
|
74 | 74 | if readme_node: |
|
75 | log.debug('Found README node: %s', readme_node) | |
|
75 | 76 | relative_urls = { |
|
76 | 77 | 'raw': h.route_path( |
|
77 | 78 | 'repo_file_raw', repo_name=_repo_name, |
@@ -82,7 +83,8 b' class RepoSummaryView(RepoAppView):' | |||
|
82 | 83 | } |
|
83 | 84 | readme_data = self._render_readme_or_none( |
|
84 | 85 | commit, readme_node, relative_urls) |
|
85 | readme_filename = readme_node.path | |
|
86 | readme_filename = readme_node.unicode_path | |
|
87 | ||
|
86 | 88 | return readme_data, readme_filename |
|
87 | 89 | |
|
88 | 90 | inv_context_manager = rc_cache.InvalidationContext( |
@@ -152,6 +154,26 b' class RepoSummaryView(RepoAppView):' | |||
|
152 | 154 | c.comments = self.db_repo.get_comments(page_ids) |
|
153 | 155 | c.statuses = self.db_repo.statuses(page_ids) |
|
154 | 156 | |
|
157 | def _prepare_and_set_clone_url(self, c): | |
|
158 | username = '' | |
|
159 | if self._rhodecode_user.username != User.DEFAULT_USER: | |
|
160 | username = safe_str(self._rhodecode_user.username) | |
|
161 | ||
|
162 | _def_clone_uri = _def_clone_uri_id = c.clone_uri_tmpl | |
|
163 | _def_clone_uri_ssh = c.clone_uri_ssh_tmpl | |
|
164 | ||
|
165 | if '{repo}' in _def_clone_uri: | |
|
166 | _def_clone_uri_id = _def_clone_uri.replace('{repo}', '_{repoid}') | |
|
167 | elif '{repoid}' in _def_clone_uri: | |
|
168 | _def_clone_uri_id = _def_clone_uri.replace('_{repoid}', '{repo}') | |
|
169 | ||
|
170 | c.clone_repo_url = self.db_repo.clone_url( | |
|
171 | user=username, uri_tmpl=_def_clone_uri) | |
|
172 | c.clone_repo_url_id = self.db_repo.clone_url( | |
|
173 | user=username, uri_tmpl=_def_clone_uri_id) | |
|
174 | c.clone_repo_url_ssh = self.db_repo.clone_url( | |
|
175 | uri_tmpl=_def_clone_uri_ssh, ssh=True) | |
|
176 | ||
|
155 | 177 | @LoginRequired() |
|
156 | 178 | @HasRepoPermissionAnyDecorator( |
|
157 | 179 | 'repository.read', 'repository.write', 'repository.admin') |
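`_prepare_and_set_clone_url()` above derives a by-id clone URL from the same template by swapping the `{repo}` placeholder for `_{repoid}`. A minimal sketch of that substitution (the real `clone_url()` also injects user credentials, scheme and host; the function name here is made up):

```python
def prepare_clone_urls(clone_uri_tmpl, repo_name, repo_id):
    # derive the by-id template from the by-name one
    by_name_tmpl = clone_uri_tmpl
    by_id_tmpl = by_name_tmpl.replace('{repo}', '_{repoid}')

    def fill(tmpl):
        return (tmpl.replace('{repo}', repo_name)
                    .replace('{repoid}', str(repo_id)))

    return fill(by_name_tmpl), fill(by_id_tmpl)

print(prepare_clone_urls('https://server/{repo}', 'group/repo', 42))
# -> ('https://server/group/repo', 'https://server/_42')
```

The by-id URL stays valid even if the repository is later renamed, which is why both variants are offered in the summary view.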
@@ -160,6 +182,7 b' class RepoSummaryView(RepoAppView):' | |||
|
160 | 182 | renderer='rhodecode:templates/summary/summary_commits.mako') |
|
161 | 183 | def summary_commits(self): |
|
162 | 184 | c = self.load_default_context() |
|
185 | self._prepare_and_set_clone_url(c) | |
|
163 | 186 | self._load_commits_context(c) |
|
164 | 187 | return self._get_template_context(c) |
|
165 | 188 | |
@@ -179,26 +202,11 b' class RepoSummaryView(RepoAppView):' | |||
|
179 | 202 | c = self.load_default_context() |
|
180 | 203 | |
|
181 | 204 | # Prepare the clone URL |
|
182 | username = '' | |
|
183 | if self._rhodecode_user.username != User.DEFAULT_USER: | |
|
184 | username = safe_str(self._rhodecode_user.username) | |
|
185 | ||
|
186 | _def_clone_uri = _def_clone_uri_id = c.clone_uri_tmpl | |
|
187 | _def_clone_uri_ssh = c.clone_uri_ssh_tmpl | |
|
205 | self._prepare_and_set_clone_url(c) | |
|
188 | 206 | |
|
189 | if '{repo}' in _def_clone_uri: | |
|
190 | _def_clone_uri_id = _def_clone_uri.replace( | |
|
191 | '{repo}', '_{repoid}') | |
|
192 | elif '{repoid}' in _def_clone_uri: | |
|
193 | _def_clone_uri_id = _def_clone_uri.replace( | |
|
194 | '_{repoid}', '{repo}') | |
|
195 | ||
|
196 | c.clone_repo_url = self.db_repo.clone_url( | |
|
197 | user=username, uri_tmpl=_def_clone_uri) | |
|
198 | c.clone_repo_url_id = self.db_repo.clone_url( | |
|
199 | user=username, uri_tmpl=_def_clone_uri_id) | |
|
200 | c.clone_repo_url_ssh = self.db_repo.clone_url( | |
|
201 | uri_tmpl=_def_clone_uri_ssh, ssh=True) | |
|
207 | # update every 5 min | |
|
208 | if self.db_repo.last_commit_cache_update_diff > 60 * 5: | |
|
209 | self.db_repo.update_commit_cache() | |
|
202 | 210 | |
|
203 | 211 | # If enabled, get statistics data |
|
204 | 212 | |
@@ -231,8 +239,6 b' class RepoSummaryView(RepoAppView):' | |||
|
231 | 239 | c.enable_downloads = self.db_repo.enable_downloads |
|
232 | 240 | c.repository_followers = scm_model.get_followers(self.db_repo) |
|
233 | 241 | c.repository_forks = scm_model.get_forks(self.db_repo) |
|
234 | c.repository_is_user_following = scm_model.is_following_repo( | |
|
235 | self.db_repo_name, self._rhodecode_user.user_id) | |
|
236 | 242 | |
|
237 | 243 | # first interaction with the VCS instance after here... |
|
238 | 244 | if c.repository_requirements_missing: |
@@ -317,8 +323,7 b' class RepoSummaryView(RepoAppView):' | |||
|
317 | 323 | (_("Tag"), repo.tags, 'tag'), |
|
318 | 324 | (_("Bookmark"), repo.bookmarks, 'book'), |
|
319 | 325 | ] |
|
320 | res = self._create_reference_data( | |
|
321 | repo, self.db_repo_name, refs_to_create) | |
|
326 | res = self._create_reference_data(repo, self.db_repo_name, refs_to_create) | |
|
322 | 327 | data = { |
|
323 | 328 | 'more': False, |
|
324 | 329 | 'results': res |
@@ -365,8 +370,7 b' class RepoSummaryView(RepoAppView):' | |||
|
365 | 370 | }) |
|
366 | 371 | return result |
|
367 | 372 | |
|
368 | def _create_reference_items(self, repo, full_repo_name, refs, ref_type, | |
|
369 | format_ref_id): | |
|
373 | def _create_reference_items(self, repo, full_repo_name, refs, ref_type, format_ref_id): | |
|
370 | 374 | result = [] |
|
371 | 375 | is_svn = h.is_svn(repo) |
|
372 | 376 | for ref_name, raw_id in refs.iteritems(): |
@@ -378,6 +382,7 b' class RepoSummaryView(RepoAppView):' | |||
|
378 | 382 | 'raw_id': raw_id, |
|
379 | 383 | 'type': ref_type, |
|
380 | 384 | 'files_url': files_url, |
|
385 | 'idx': 0, | |
|
381 | 386 | }) |
|
382 | 387 | return result |
|
383 | 388 |
@@ -20,12 +20,13 b'' | |||
|
20 | 20 | |
|
21 | 21 | import os |
|
22 | 22 | import sys |
|
23 | import shutil | |
|
24 | 23 | import logging |
|
25 | 24 | import tempfile |
|
26 | 25 | import textwrap |
|
27 | ||
|
26 | import collections | |
|
28 | 27 | from .base import VcsServer |
|
28 | from rhodecode.model.db import RhodeCodeUi | |
|
29 | from rhodecode.model.settings import VcsSettingsModel | |
|
29 | 30 | |
|
30 | 31 | log = logging.getLogger(__name__) |
|
31 | 32 | |
@@ -37,62 +38,46 b' class MercurialTunnelWrapper(object):' | |||
|
37 | 38 | self.server = server |
|
38 | 39 | self.stdin = sys.stdin |
|
39 | 40 | self.stdout = sys.stdout |
|
40 | self. |

41 | self.hooks_env_fd, self.hooks_env_path = tempfile.mkstemp() | |
|
41 | self.hooks_env_fd, self.hooks_env_path = tempfile.mkstemp(prefix='hgrc_rhodecode_') | |
|
42 | 42 | |
|
43 | 43 | def create_hooks_env(self): |
|
44 | repo_name = self.server.repo_name | |
|
45 | hg_flags = self.server.config_to_hgrc(repo_name) | |
|
44 | 46 | |
|
45 | 47 | content = textwrap.dedent( |
|
46 | 48 | ''' |
|
47 | # SSH hooks version= |

48 |
|
|
|
49 | pretxnchangegroup.ssh_auth=python:vcsserver.hooks.pre_push_ssh_auth | |
|
50 | pretxnchangegroup.ssh=python:vcsserver.hooks.pre_push_ssh | |
|
51 | changegroup.ssh=python:vcsserver.hooks.post_push_ssh | |
|
52 | ||
|
53 | preoutgoing.ssh=python:vcsserver.hooks.pre_pull_ssh | |
|
54 | outgoing.ssh=python:vcsserver.hooks.post_pull_ssh | |
|
49 | # RhodeCode SSH hooks version=2.0.0 | |
|
50 | {custom} | |
|
51 | ''' | |
|
52 | ).format(custom='\n'.join(hg_flags)) | |
|
55 | 53 | |
|
56 | ''' | |
|
57 | ) | |
|
54 | root = self.server.get_root_store() | |
|
55 | hgrc_custom = os.path.join(root, repo_name, '.hg', 'hgrc_rhodecode') | |
|
56 | hgrc_main = os.path.join(root, repo_name, '.hg', 'hgrc') | |
|
58 | 57 | |
|
58 | # cleanup custom hgrc file | |
|
59 | if os.path.isfile(hgrc_custom): | |
|
60 | with open(hgrc_custom, 'wb') as f: | |
|
61 | f.write('') | |
|
62 | log.debug('Cleanup custom hgrc file under %s', hgrc_custom) | |
|
63 | ||
|
64 | # write temp | |
|
59 | 65 | with os.fdopen(self.hooks_env_fd, 'w') as hooks_env_file: |
|
60 | 66 | hooks_env_file.write(content) |
|
61 | root = self.server.get_root_store() | |
|
62 | 67 | |
|
63 | hgrc_custom = os.path.join( | |
|
64 | root, self.server.repo_name, '.hg', 'hgrc_rhodecode') | |
|
65 | log.debug('Wrote custom hgrc file under %s', hgrc_custom) | |
|
66 | shutil.move( | |
|
67 | self.hooks_env_path, hgrc_custom) | |
|
68 | ||
|
69 | hgrc_main = os.path.join( | |
|
70 | root, self.server.repo_name, '.hg', 'hgrc') | |
|
71 | include_marker = '%include hgrc_rhodecode' | |
|
68 | return self.hooks_env_path | |
|
72 | 69 | |
|
73 | if not os.path.isfile(hgrc_main): | |
|
74 | os.mknod(hgrc_main) | |
|
75 | ||
|
76 | with open(hgrc_main, 'rb') as f: | |
|
77 | data = f.read() | |
|
78 | has_marker = include_marker in data | |
|
70 | def remove_configs(self): | |
|
71 | os.remove(self.hooks_env_path) | |
|
79 | 72 | |
|
80 | if not has_marker: | |
|
81 | log.debug('Adding include marker for hooks') | |
|
82 | with open(hgrc_main, 'wa') as f: | |
|
83 | f.write(textwrap.dedent(''' | |
|
84 | # added by RhodeCode | |
|
85 | {} | |
|
86 | '''.format(include_marker))) | |
|
87 | ||
|
88 | def command(self): | |
|
73 | def command(self, hgrc_path): | |
|
89 | 74 | root = self.server.get_root_store() |
|
90 | 75 | |
|
91 | 76 | command = ( |
|
92 | "cd {root}; {hg_path} -R {root}{repo_name} " | |
|
77 | "cd {root}; HGRCPATH={hgrc} {hg_path} -R {root}{repo_name} " | |
|
93 | 78 | "serve --stdio".format( |
|
94 | 79 | root=root, hg_path=self.server.hg_path, |
|
95 | repo_name=self.server.repo_name)) | |
|
80 | repo_name=self.server.repo_name, hgrc=hgrc_path)) | |
|
96 | 81 | log.debug("Final CMD: %s", command) |
|
97 | 82 | return command |
|
98 | 83 | |
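The rewritten `command()` hands the generated hgrc to Mercurial through the `HGRCPATH` environment variable instead of splicing an `%include` marker into the repository's own `.hg/hgrc`. A minimal sketch of that flow, assuming made-up paths and repo names (the hook entry is taken from the diff):

```python
import os
import tempfile
import textwrap

def build_serve_command(root, hg_path, repo_name, hgrc_path):
    # Same string shape as command() above: HGRCPATH points hg at the
    # per-connection hgrc, so hooks apply without touching .hg/hgrc.
    return (
        "cd {root}; HGRCPATH={hgrc} {hg_path} -R {root}{repo_name} "
        "serve --stdio".format(
            root=root, hg_path=hg_path, repo_name=repo_name, hgrc=hgrc_path))

# Write a throwaway hgrc holding the SSH hooks, as create_hooks_env() does.
fd, hgrc_path = tempfile.mkstemp(prefix='hgrc_rhodecode_')
with os.fdopen(fd, 'w') as f:
    f.write(textwrap.dedent('''
        # RhodeCode SSH hooks version=2.0.0
        [hooks]
        pretxnchangegroup.ssh=python:vcsserver.hooks.pre_push_ssh
    '''))

command = build_serve_command('/srv/repos/', '/usr/bin/hg', 'demo-repo', hgrc_path)
os.remove(hgrc_path)  # mirror remove_configs(): always clean up the temp file
```

Because `HGRCPATH` overrides the usual config search path for the spawned `hg serve`, the per-connection file is the only extra configuration the process sees.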
@@ -102,22 +87,61 b' class MercurialTunnelWrapper(object):' | |||
|
102 | 87 | action = '?' |
|
103 | 88 | # permissions are checked via `pre_push_ssh_auth` hook 
|
104 | 89 | self.server.update_environment(action=action, extras=extras) |
|
105 | self.create_hooks_env() | |
|
106 | return os.system(self.command()) | |
|
90 | custom_hgrc_file = self.create_hooks_env() | |
|
91 | ||
|
92 | try: | |
|
93 | return os.system(self.command(custom_hgrc_file)) | |
|
94 | finally: | |
|
95 | self.remove_configs() | |
|
107 | 96 | |
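The `run()` change wraps the served command in `try/finally` so the temporary hgrc is removed even when the push or pull fails. A sketch of that cleanup pattern under the same assumptions (the `serve` callback stands in for `os.system(self.command(...))`):

```python
import os
import tempfile

def run_with_temp_hgrc(serve):
    # Create the per-connection hgrc, hand its path to the served command,
    # and remove it in a finally block so a failure cannot leak temp files.
    fd, hgrc_path = tempfile.mkstemp(prefix='hgrc_rhodecode_')
    os.close(fd)
    try:
        return serve(hgrc_path)
    finally:
        os.remove(hgrc_path)

seen = []
exit_code = run_with_temp_hgrc(lambda path: seen.append(path) or 0)
```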
|
108 | 97 | |
|
109 | 98 | class MercurialServer(VcsServer): |
|
110 | 99 | backend = 'hg' |
|
100 | cli_flags = ['phases', 'largefiles', 'extensions', 'experimental', 'hooks'] | |
|
111 | 101 | |
|
112 | def __init__(self, store, ini_path, repo_name, | |
|
113 | user, user_permissions, config, env): | |
|
114 | super(MercurialServer, self).\ | |
|
115 | __init__(user, user_permissions, config, env) | |
|
102 | def __init__(self, store, ini_path, repo_name, user, user_permissions, config, env): | |
|
103 | super(MercurialServer, self).__init__(user, user_permissions, config, env) | |
|
116 | 104 | |
|
117 | 105 | self.store = store |
|
118 | 106 | self.ini_path = ini_path |
|
119 | 107 | self.repo_name = repo_name |
|
120 | self._path = self.hg_path = config.get( | |
|
121 | 'app:main', 'ssh.executable.hg') | |
|
108 | self._path = self.hg_path = config.get('app:main', 'ssh.executable.hg') | |
|
109 | self.tunnel = MercurialTunnelWrapper(server=self) | |
|
110 | ||
|
111 | def config_to_hgrc(self, repo_name): | |
|
112 | ui_sections = collections.defaultdict(list) | |
|
113 | ui = VcsSettingsModel(repo=repo_name).get_ui_settings(section=None, key=None) | |
|
114 | ||
|
115 | # write default hooks | |
|
116 | default_hooks = [ | |
|
117 | ('pretxnchangegroup.ssh_auth', 'python:vcsserver.hooks.pre_push_ssh_auth'), | |
|
118 | ('pretxnchangegroup.ssh', 'python:vcsserver.hooks.pre_push_ssh'), | |
|
119 | ('changegroup.ssh', 'python:vcsserver.hooks.post_push_ssh'), | |
|
120 | ||
|
121 | ('preoutgoing.ssh', 'python:vcsserver.hooks.pre_pull_ssh'), | |
|
122 | ('outgoing.ssh', 'python:vcsserver.hooks.post_pull_ssh'), | |
|
123 | ] | |
|
124 | ||
|
125 | for k, v in default_hooks: | |
|
126 | ui_sections['hooks'].append((k, v)) | |
|
122 | 127 | |
|
123 | self.tunnel = MercurialTunnelWrapper(server=self) | |
|
128 | for entry in ui: | |
|
129 | if not entry.active: | |
|
130 | continue | |
|
131 | sec = entry.section | |
|
132 | key = entry.key | |
|
133 | ||
|
134 | if sec in self.cli_flags: | |
|
135 | # we want only custom hooks, so we skip builtins | |
|
136 | if sec == 'hooks' and key in RhodeCodeUi.HOOKS_BUILTIN: | |
|
137 | continue | |
|
138 | ||
|
139 | ui_sections[sec].append([key, entry.value]) | |
|
140 | ||
|
141 | flags = [] | |
|
142 | for _sec, key_val in ui_sections.items(): | |
|
143 | flags.append(' ') | |
|
144 | flags.append('[{}]'.format(_sec)) | |
|
145 | for key, val in key_val: | |
|
146 | flags.append('{}= {}'.format(key, val)) | |
|
147 | return flags |
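`config_to_hgrc` above groups the active UI settings by section and renders them as hgrc lines, skipping builtin hooks. A standalone sketch of that grouping, with plain tuples standing in for the `VcsSettingsModel` entries and `skip_hooks` for `RhodeCodeUi.HOOKS_BUILTIN`:

```python
import collections

def settings_to_hgrc_flags(settings, allowed_sections, skip_hooks=()):
    # Group (section, key, value, active) tuples into hgrc-style lines,
    # mirroring the defaultdict(list) grouping in config_to_hgrc.
    sections = collections.defaultdict(list)
    for sec, key, val, active in settings:
        if not active or sec not in allowed_sections:
            continue
        if sec == 'hooks' and key in skip_hooks:
            continue  # builtin hooks are skipped; only custom ones pass
        sections[sec].append((key, val))
    flags = []
    for sec, pairs in sections.items():
        flags.append('[{}]'.format(sec))
        for key, val in pairs:
            flags.append('{} = {}'.format(key, val))
    return flags

flags = settings_to_hgrc_flags(
    [('hooks', 'changegroup.notify', 'python:mod.fn', True),
     ('phases', 'publish', 'False', True),
     ('web', 'allow_push', '*', True)],
    allowed_sections=('hooks', 'phases'))
```

Sections outside the whitelist (here `web`) never reach the generated file, which keeps the SSH-side hgrc limited to the flags the server intends to forward.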
@@ -18,6 +18,7 b'' | |||
|
18 | 18 | # RhodeCode Enterprise Edition, including its added features, Support services, |
|
19 | 19 | # and proprietary license terms, please see https://rhodecode.com/licenses/ |
|
20 | 20 | |
|
21 | import os | |
|
21 | 22 | import mock |
|
22 | 23 | import pytest |
|
23 | 24 | |
@@ -68,14 +69,16 b' def hg_server(app):' | |||
|
68 | 69 | |
|
69 | 70 | class TestMercurialServer(object): |
|
70 | 71 | |
|
71 | def test_command(self, hg_server): | |
|
72 | def test_command(self, hg_server, tmpdir): | |
|
72 | 73 | server = hg_server.create() |
|
74 | custom_hgrc = os.path.join(str(tmpdir), 'hgrc') | |
|
73 | 75 | expected_command = ( |
|
74 | 'cd {root}; {hg_path} -R {root}{repo_name} serve --stdio'.format( | |
|
75 | root=hg_server.root, hg_path=hg_server.hg_path, | |
|
76 | 'cd {root}; HGRCPATH={custom_hgrc} {hg_path} -R {root}{repo_name} serve --stdio'.format( | |
|
77 | root=hg_server.root, custom_hgrc=custom_hgrc, hg_path=hg_server.hg_path, | |
|
76 | 78 | repo_name=hg_server.repo_name) |
|
77 | 79 | ) |
|
78 | assert expected_command == server.tunnel.command() | |
|
80 | server_command = server.tunnel.command(custom_hgrc) | |
|
81 | assert expected_command == server_command | |
|
79 | 82 | |
|
80 | 83 | @pytest.mark.parametrize('permissions, action, code', [ |
|
81 | 84 | ({}, 'pull', -2), |
This diff has been collapsed as it changes many lines (914 lines changed).
@@ -28,23 +28,16 b'' | |||
|
28 | 28 | { |
|
29 | 29 | "license": [ |
|
30 | 30 | { |
|
31 | "fullName": "BSD 4-clause \"Original\" or \"Old\" License", | |
|
32 | "shortName": "bsdOriginal", | |
|
33 | "spdxId": "BSD-4-Clause", | |
|
34 | "url": "http://spdx.org/licenses/BSD-4-Clause.html" | |
|
35 | } | |
|
36 | ], | |
|
37 | "name": "python2.7-coverage-3.7.1" | |
|
38 | }, | |
|
39 | { | |
|
40 | "license": [ | |
|
41 | { | |
|
42 | 31 | "fullName": "MIT License", |
|
43 | 32 | "shortName": "mit", |
|
44 | 33 | "spdxId": "MIT", |
|
45 | 34 | "url": "http://spdx.org/licenses/MIT.html" |
|
46 | 35 | } |
|
47 | 36 | ], |
|
37 | "name": "python2.7-beautifulsoup4-4.6.3" | |
|
38 | }, | |
|
39 | { | |
|
40 | "license": "UNKNOWN", | |
|
48 | 41 | "name": "python2.7-bootstrapped-pip-9.0.1" |
|
49 | 42 | }, |
|
50 | 43 | { |
@@ -56,29 +49,7 b'' | |||
|
56 | 49 | "url": "http://spdx.org/licenses/MIT.html" |
|
57 | 50 | } |
|
58 | 51 | ], |
|
59 | "name": "python2.7- | |
|
60 | }, | |
|
61 | { | |
|
62 | "license": [ | |
|
63 | { | |
|
64 | "fullName": "MIT License", | |
|
65 | "shortName": "mit", | |
|
66 | "spdxId": "MIT", | |
|
67 | "url": "http://spdx.org/licenses/MIT.html" | |
|
68 | } | |
|
69 | ], | |
|
70 | "name": "python2.7-webtest-2.0.29" | |
|
71 | }, | |
|
72 | { | |
|
73 | "license": [ | |
|
74 | { | |
|
75 | "fullName": "MIT License", | |
|
76 | "shortName": "mit", | |
|
77 | "spdxId": "MIT", | |
|
78 | "url": "http://spdx.org/licenses/MIT.html" | |
|
79 | } | |
|
80 | ], | |
|
81 | "name": "python2.7-beautifulsoup4-4.6.3" | |
|
52 | "name": "python2.7-webtest-2.0.32" | |
|
82 | 53 | }, |
|
83 | 54 | { |
|
84 | 55 | "license": [ |
@@ -100,7 +71,7 b'' | |||
|
100 | 71 | "url": "http://spdx.org/licenses/MIT.html" |
|
101 | 72 | } |
|
102 | 73 | ], |
|
103 | "name": "python2.7-webob-1. | |
|
74 | "name": "python2.7-webob-1.8.4" | |
|
104 | 75 | }, |
|
105 | 76 | { |
|
106 | 77 | "license": [ |
@@ -116,6 +87,28 b'' | |||
|
116 | 87 | { |
|
117 | 88 | "license": [ |
|
118 | 89 | { |
|
90 | "fullName": "Apache License 2.0", | |
|
91 | "shortName": "asl20", | |
|
92 | "spdxId": "Apache-2.0", | |
|
93 | "url": "http://spdx.org/licenses/Apache-2.0.html" | |
|
94 | } | |
|
95 | ], | |
|
96 | "name": "python2.7-coverage-4.5.1" | |
|
97 | }, | |
|
98 | { | |
|
99 | "license": [ | |
|
100 | { | |
|
101 | "fullName": "MIT License", | |
|
102 | "shortName": "mit", | |
|
103 | "spdxId": "MIT", | |
|
104 | "url": "http://spdx.org/licenses/MIT.html" | |
|
105 | } | |
|
106 | ], | |
|
107 | "name": "python2.7-cov-core-1.15.0" | |
|
108 | }, | |
|
109 | { | |
|
110 | "license": [ | |
|
111 | { | |
|
119 | 112 | "fullName": "BSD 4-clause \"Original\" or \"Old\" License", |
|
120 | 113 | "shortName": "bsdOriginal", |
|
121 | 114 | "spdxId": "BSD-4-Clause", |
@@ -127,6 +120,17 b'' | |||
|
127 | 120 | { |
|
128 | 121 | "license": [ |
|
129 | 122 | { |
|
123 | "fullName": "GNU Lesser General Public License v3 or later (LGPLv3+)" | |
|
124 | }, | |
|
125 | { | |
|
126 | "fullName": "LGPL" | |
|
127 | } | |
|
128 | ], | |
|
129 | "name": "python2.7-gprof2dot-2017.9.19" | |
|
130 | }, | |
|
131 | { | |
|
132 | "license": [ | |
|
133 | { | |
|
130 | 134 | "fullName": "MIT License", |
|
131 | 135 | "shortName": "mit", |
|
132 | 136 | "spdxId": "MIT", |
@@ -136,7 +140,7 b'' | |||
|
136 | 140 | "fullName": "DFSG approved" |
|
137 | 141 | } |
|
138 | 142 | ], |
|
139 | "name": "python2.7-pytest-timeout-1. | |
|
143 | "name": "python2.7-pytest-timeout-1.3.2" | |
|
140 | 144 | }, |
|
141 | 145 | { |
|
142 | 146 | "license": [ |
@@ -147,7 +151,32 b'' | |||
|
147 | 151 | "url": "http://spdx.org/licenses/MIT.html" |
|
148 | 152 | } |
|
149 | 153 | ], |
|
150 | "name": "python2.7-pytest-3. | |
|
154 | "name": "python2.7-pytest-3.8.2" | |
|
155 | }, | |
|
156 | { | |
|
157 | "license": [ | |
|
158 | { | |
|
159 | "fullName": "MIT License", | |
|
160 | "shortName": "mit", | |
|
161 | "spdxId": "MIT", | |
|
162 | "url": "http://spdx.org/licenses/MIT.html" | |
|
163 | } | |
|
164 | ], | |
|
165 | "name": "python2.7-pathlib2-2.3.3" | |
|
166 | }, | |
|
167 | { | |
|
168 | "license": [ | |
|
169 | { | |
|
170 | "fullName": "BSD 4-clause \"Original\" or \"Old\" License", | |
|
171 | "shortName": "bsdOriginal", | |
|
172 | "spdxId": "BSD-4-Clause", | |
|
173 | "url": "http://spdx.org/licenses/BSD-4-Clause.html" | |
|
174 | }, | |
|
175 | { | |
|
176 | "fullName": "New BSD License" | |
|
177 | } | |
|
178 | ], | |
|
179 | "name": "python2.7-scandir-1.9.0" | |
|
151 | 180 | }, |
|
152 | 181 | { |
|
153 | 182 | "license": [ |
@@ -172,7 +201,7 b'' | |||
|
172 | 201 | "url": "http://spdx.org/licenses/MIT.html" |
|
173 | 202 | } |
|
174 | 203 | ], |
|
175 | "name": "python2.7-pluggy-0. | |
|
204 | "name": "python2.7-pluggy-0.9.0" | |
|
176 | 205 | }, |
|
177 | 206 | { |
|
178 | 207 | "license": [ |
@@ -183,7 +212,7 b'' | |||
|
183 | 212 | "url": "http://spdx.org/licenses/MIT.html" |
|
184 | 213 | } |
|
185 | 214 | ], |
|
186 | "name": "python2.7-atomicwrites-1. | |
|
215 | "name": "python2.7-atomicwrites-1.2.1" | |
|
187 | 216 | }, |
|
188 | 217 | { |
|
189 | 218 | "license": [ |
@@ -194,7 +223,7 b'' | |||
|
194 | 223 | "url": "http://spdx.org/licenses/MIT.html" |
|
195 | 224 | } |
|
196 | 225 | ], |
|
197 | "name": "python2.7-more-itertools- | |
|
226 | "name": "python2.7-more-itertools-5.0.0" | |
|
198 | 227 | }, |
|
199 | 228 | { |
|
200 | 229 | "license": [ |
@@ -205,7 +234,7 b'' | |||
|
205 | 234 | "url": "http://spdx.org/licenses/MIT.html" |
|
206 | 235 | } |
|
207 | 236 | ], |
|
208 | "name": "python2.7-attrs-18. | |
|
237 | "name": "python2.7-attrs-18.2.0" | |
|
209 | 238 | }, |
|
210 | 239 | { |
|
211 | 240 | "license": [ |
@@ -216,18 +245,7 b'' | |||
|
216 | 245 | "url": "http://spdx.org/licenses/MIT.html" |
|
217 | 246 | } |
|
218 | 247 | ], |
|
219 | "name": "python2.7-py-1. | |
|
220 | }, | |
|
221 | { | |
|
222 | "license": [ | |
|
223 | { | |
|
224 | "fullName": "GNU Lesser General Public License v3 or later (LGPLv3+)" | |
|
225 | }, | |
|
226 | { | |
|
227 | "fullName": "LGPL" | |
|
228 | } | |
|
229 | ], | |
|
230 | "name": "python2.7-gprof2dot-2017.9.19" | |
|
248 | "name": "python2.7-py-1.6.0" | |
|
231 | 249 | }, |
|
232 | 250 | { |
|
233 | 251 | "license": [ |
@@ -254,17 +272,6 b'' | |||
|
254 | 272 | { |
|
255 | 273 | "license": [ |
|
256 | 274 | { |
|
257 | "fullName": "MIT License", | |
|
258 | "shortName": "mit", | |
|
259 | "spdxId": "MIT", | |
|
260 | "url": "http://spdx.org/licenses/MIT.html" | |
|
261 | } | |
|
262 | ], | |
|
263 | "name": "python2.7-setuptools-scm-2.1.0" | |
|
264 | }, | |
|
265 | { | |
|
266 | "license": [ | |
|
267 | { | |
|
268 | 275 | "fullName": "BSD 4-clause \"Original\" or \"Old\" License", |
|
269 | 276 | "shortName": "bsdOriginal", |
|
270 | 277 | "spdxId": "BSD-4-Clause", |
@@ -299,7 +306,7 b'' | |||
|
299 | 306 | "url": "http://spdx.org/licenses/MIT.html" |
|
300 | 307 | } |
|
301 | 308 | ], |
|
302 | "name": "python2.7-pytest-cov-2. | |
|
309 | "name": "python2.7-pytest-cov-2.6.0" | |
|
303 | 310 | }, |
|
304 | 311 | { |
|
305 | 312 | "license": [ |
@@ -310,7 +317,7 b'' | |||
|
310 | 317 | "url": "http://spdx.org/licenses/BSD-4-Clause.html" |
|
311 | 318 | } |
|
312 | 319 | ], |
|
313 | "name": "python2.7-appenlight-client-0.6.2 | |
|
320 | "name": "python2.7-appenlight-client-0.6.26" | |
|
314 | 321 | }, |
|
315 | 322 | { |
|
316 | 323 | "license": [ |
@@ -326,10 +333,107 b'' | |||
|
326 | 333 | { |
|
327 | 334 | "license": [ |
|
328 | 335 | { |
|
329 | "fullName": "A | |
|
336 | "fullName": "Apache 2.0 and Proprietary" | |
|
337 | } | |
|
338 | ], | |
|
339 | "name": "python2.7-rhodecode-tools-1.2.1" | |
|
340 | }, | |
|
341 | { | |
|
342 | "license": [ | |
|
343 | { | |
|
344 | "fullName": "Apache License 2.0", | |
|
345 | "shortName": "asl20", | |
|
346 | "spdxId": "Apache-2.0", | |
|
347 | "url": "http://spdx.org/licenses/Apache-2.0.html" | |
|
348 | } | |
|
349 | ], | |
|
350 | "name": "python2.7-elasticsearch1-dsl-0.0.12" | |
|
351 | }, | |
|
352 | { | |
|
353 | "license": [ | |
|
354 | { | |
|
355 | "fullName": "Apache License 2.0", | |
|
356 | "shortName": "asl20", | |
|
357 | "spdxId": "Apache-2.0", | |
|
358 | "url": "http://spdx.org/licenses/Apache-2.0.html" | |
|
359 | } | |
|
360 | ], | |
|
361 | "name": "python2.7-elasticsearch1-1.10.0" | |
|
362 | }, | |
|
363 | { | |
|
364 | "license": [ | |
|
365 | { | |
|
366 | "fullName": "MIT License", | |
|
367 | "shortName": "mit", | |
|
368 | "spdxId": "MIT", | |
|
369 | "url": "http://spdx.org/licenses/MIT.html" | |
|
330 | 370 | } |
|
331 | 371 | ], |
|
332 | "name": "python2.7- | |
|
372 | "name": "python2.7-urllib3-1.24.1" | |
|
373 | }, | |
|
374 | { | |
|
375 | "license": [ | |
|
376 | { | |
|
377 | "fullName": "BSD 4-clause \"Original\" or \"Old\" License", | |
|
378 | "shortName": "bsdOriginal", | |
|
379 | "spdxId": "BSD-4-Clause", | |
|
380 | "url": "http://spdx.org/licenses/BSD-4-Clause.html" | |
|
381 | }, | |
|
382 | { | |
|
383 | "fullName": "Apache License 2.0", | |
|
384 | "shortName": "asl20", | |
|
385 | "spdxId": "Apache-2.0", | |
|
386 | "url": "http://spdx.org/licenses/Apache-2.0.html" | |
|
387 | }, | |
|
388 | { | |
|
389 | "fullName": "Dual License" | |
|
390 | } | |
|
391 | ], | |
|
392 | "name": "python2.7-python-dateutil-2.8.0" | |
|
393 | }, | |
|
394 | { | |
|
395 | "license": [ | |
|
396 | { | |
|
397 | "fullName": "Apache License 2.0", | |
|
398 | "shortName": "asl20", | |
|
399 | "spdxId": "Apache-2.0", | |
|
400 | "url": "http://spdx.org/licenses/Apache-2.0.html" | |
|
401 | } | |
|
402 | ], | |
|
403 | "name": "python2.7-elasticsearch2-2.5.0" | |
|
404 | }, | |
|
405 | { | |
|
406 | "license": [ | |
|
407 | { | |
|
408 | "fullName": "Apache License 2.0", | |
|
409 | "shortName": "asl20", | |
|
410 | "spdxId": "Apache-2.0", | |
|
411 | "url": "http://spdx.org/licenses/Apache-2.0.html" | |
|
412 | } | |
|
413 | ], | |
|
414 | "name": "python2.7-elasticsearch-dsl-6.3.1" | |
|
415 | }, | |
|
416 | { | |
|
417 | "license": [ | |
|
418 | { | |
|
419 | "fullName": "Python Software Foundation License version 2", | |
|
420 | "shortName": "psfl", | |
|
421 | "spdxId": "Python-2.0", | |
|
422 | "url": "http://spdx.org/licenses/Python-2.0.html" | |
|
423 | } | |
|
424 | ], | |
|
425 | "name": "python2.7-ipaddress-1.0.22" | |
|
426 | }, | |
|
427 | { | |
|
428 | "license": [ | |
|
429 | { | |
|
430 | "fullName": "Apache License 2.0", | |
|
431 | "shortName": "asl20", | |
|
432 | "spdxId": "Apache-2.0", | |
|
433 | "url": "http://spdx.org/licenses/Apache-2.0.html" | |
|
434 | } | |
|
435 | ], | |
|
436 | "name": "python2.7-elasticsearch-6.3.1" | |
|
333 | 437 | }, |
|
334 | 438 | { |
|
335 | 439 | "license": [ |
@@ -351,66 +455,13 b'' | |||
|
351 | 455 | { |
|
352 | 456 | "license": [ |
|
353 | 457 | { |
|
354 | "fullName": "MIT License", | |
|
355 | "shortName": "mit", | |
|
356 | "spdxId": "MIT", | |
|
357 | "url": "http://spdx.org/licenses/MIT.html" | |
|
358 | } | |
|
359 | ], | |
|
360 | "name": "python2.7-urllib3-1.21" | |
|
361 | }, | |
|
362 | { | |
|
363 | "license": [ | |
|
364 | { | |
|
365 | "fullName": "Apache License 2.0", | |
|
366 | "shortName": "asl20", | |
|
367 | "spdxId": "Apache-2.0", | |
|
368 | "url": "http://spdx.org/licenses/Apache-2.0.html" | |
|
369 | } | |
|
370 | ], | |
|
371 | "name": "python2.7-elasticsearch-dsl-2.2.0" | |
|
372 | }, | |
|
373 | { | |
|
374 | "license": [ | |
|
375 | { | |
|
376 | "fullName": "Apache License 2.0", | |
|
377 | "shortName": "asl20", | |
|
378 | "spdxId": "Apache-2.0", | |
|
379 | "url": "http://spdx.org/licenses/Apache-2.0.html" | |
|
380 | } | |
|
381 | ], | |
|
382 | "name": "python2.7-elasticsearch-2.3.0" | |
|
383 | }, | |
|
384 | { | |
|
385 | "license": [ | |
|
386 | { | |
|
387 | "fullName": "BSD 4-clause \"Original\" or \"Old\" License", | |
|
388 | "shortName": "bsdOriginal", | |
|
389 | "spdxId": "BSD-4-Clause", | |
|
390 | "url": "http://spdx.org/licenses/BSD-4-Clause.html" | |
|
391 | }, | |
|
392 | { | |
|
393 | "fullName": "Apache License 2.0", | |
|
394 | "shortName": "asl20", | |
|
395 | "spdxId": "Apache-2.0", | |
|
396 | "url": "http://spdx.org/licenses/Apache-2.0.html" | |
|
397 | }, | |
|
398 | { | |
|
399 | "fullName": "Dual License" | |
|
400 | } | |
|
401 | ], | |
|
402 | "name": "python2.7-python-dateutil-2.7.3" | |
|
403 | }, | |
|
404 | { | |
|
405 | "license": [ | |
|
406 | { | |
|
407 | 458 | "fullName": "BSD 4-clause \"Original\" or \"Old\" License", |
|
408 | 459 | "shortName": "bsdOriginal", |
|
409 | 460 | "spdxId": "BSD-4-Clause", |
|
410 | 461 | "url": "http://spdx.org/licenses/BSD-4-Clause.html" |
|
411 | 462 | } |
|
412 | 463 | ], |
|
413 | "name": "python2.7-markupsafe-1.0" | |
|
464 | "name": "python2.7-markupsafe-1.1.0" | |
|
414 | 465 | }, |
|
415 | 466 | { |
|
416 | 467 | "license": [ |
@@ -443,29 +494,7 b'' | |||
|
443 | 494 | "url": "http://spdx.org/licenses/BSD-4-Clause.html" |
|
444 | 495 | } |
|
445 | 496 | ], |
|
446 | "name": "python2.7-click- | |
|
447 | }, | |
|
448 | { | |
|
449 | "license": [ | |
|
450 | { | |
|
451 | "fullName": "MIT License", | |
|
452 | "shortName": "mit", | |
|
453 | "spdxId": "MIT", | |
|
454 | "url": "http://spdx.org/licenses/MIT.html" | |
|
455 | } | |
|
456 | ], | |
|
457 | "name": "python2.7-bottle-0.12.13" | |
|
458 | }, | |
|
459 | { | |
|
460 | "license": [ | |
|
461 | { | |
|
462 | "fullName": "MIT License", | |
|
463 | "shortName": "mit", | |
|
464 | "spdxId": "MIT", | |
|
465 | "url": "http://spdx.org/licenses/MIT.html" | |
|
466 | } | |
|
467 | ], | |
|
468 | "name": "python2.7-cprofilev-1.0.7" | |
|
497 | "name": "python2.7-click-7.0" | |
|
469 | 498 | }, |
|
470 | 499 | { |
|
471 | 500 | "license": [ |
@@ -519,31 +548,6 b'' | |||
|
519 | 548 | "url": "http://spdx.org/licenses/MIT.html" |
|
520 | 549 | } |
|
521 | 550 | ], |
|
522 | "name": "python2.7-pathlib2-2.3.0" | |
|
523 | }, | |
|
524 | { | |
|
525 | "license": [ | |
|
526 | { | |
|
527 | "fullName": "BSD 4-clause \"Original\" or \"Old\" License", | |
|
528 | "shortName": "bsdOriginal", | |
|
529 | "spdxId": "BSD-4-Clause", | |
|
530 | "url": "http://spdx.org/licenses/BSD-4-Clause.html" | |
|
531 | }, | |
|
532 | { | |
|
533 | "fullName": "New BSD License" | |
|
534 | } | |
|
535 | ], | |
|
536 | "name": "python2.7-scandir-1.9.0" | |
|
537 | }, | |
|
538 | { | |
|
539 | "license": [ | |
|
540 | { | |
|
541 | "fullName": "MIT License", | |
|
542 | "shortName": "mit", | |
|
543 | "spdxId": "MIT", | |
|
544 | "url": "http://spdx.org/licenses/MIT.html" | |
|
545 | } | |
|
546 | ], | |
|
547 | 551 | "name": "python2.7-backports.shutil-get-terminal-size-1.0.0" |
|
548 | 552 | }, |
|
549 | 553 | { |
@@ -555,7 +559,7 b'' | |||
|
555 | 559 | "url": "http://spdx.org/licenses/BSD-4-Clause.html" |
|
556 | 560 | } |
|
557 | 561 | ], |
|
558 | "name": "python2.7-pygments-2. | |
|
562 | "name": "python2.7-pygments-2.3.1" | |
|
559 | 563 | }, |
|
560 | 564 | { |
|
561 | 565 | "license": [ |
@@ -646,7 +650,7 b'' | |||
|
646 | 650 | "url": "http://spdx.org/licenses/MIT.html" |
|
647 | 651 | } |
|
648 | 652 | ], |
|
649 | "name": "python2.7-pickleshare-0.7. | |
|
653 | "name": "python2.7-pickleshare-0.7.5" | |
|
650 | 654 | }, |
|
651 | 655 | { |
|
652 | 656 | "license": [ |
@@ -690,7 +694,7 b'' | |||
|
690 | 694 | "url": "http://spdx.org/licenses/MIT.html" |
|
691 | 695 | } |
|
692 | 696 | ], |
|
693 | "name": "python2.7-greenlet-0.4.1 | |
|
697 | "name": "python2.7-greenlet-0.4.15" | |
|
694 | 698 | }, |
|
695 | 699 | { |
|
696 | 700 | "license": [ |
@@ -701,7 +705,7 b'' | |||
|
701 | 705 | "url": "http://spdx.org/licenses/MIT.html" |
|
702 | 706 | } |
|
703 | 707 | ], |
|
704 | "name": "python2.7-gevent-1. | |
|
708 | "name": "python2.7-gevent-1.4.0" | |
|
705 | 709 | }, |
|
706 | 710 | { |
|
707 | 711 | "license": [ |
@@ -712,7 +716,7 b'' | |||
|
712 | 716 | "url": "http://spdx.org/licenses/BSD-4-Clause.html" |
|
713 | 717 | } |
|
714 | 718 | ], |
|
715 | "name": "python2.7-psutil-5. | |
|
719 | "name": "python2.7-psutil-5.5.1" | |
|
716 | 720 | }, |
|
717 | 721 | { |
|
718 | 722 | "license": [ |
@@ -745,7 +749,7 b'' | |||
|
745 | 749 | "url": "http://spdx.org/licenses/MIT.html" |
|
746 | 750 | } |
|
747 | 751 | ], |
|
748 | "name": "python2.7-alembic- | |
|
752 | "name": "python2.7-alembic-1.0.5" | |
|
749 | 753 | }, |
|
750 | 754 | { |
|
751 | 755 | "license": { |
@@ -754,7 +758,7 b'' | |||
|
754 | 758 | "spdxId": "Apache-2.0", |
|
755 | 759 | "url": "http://spdx.org/licenses/Apache-2.0.html" |
|
756 | 760 | }, |
|
757 | "name": "python2.7-python-editor-1.0. | |
|
761 | "name": "python2.7-python-editor-1.0.4" | |
|
758 | 762 | }, |
|
759 | 763 | { |
|
760 | 764 | "license": [ |
@@ -842,39 +846,6 b'' | |||
|
842 | 846 | { |
|
843 | 847 | "license": [ |
|
844 | 848 | { |
|
845 | "fullName": "Apache License 2.0", | |
|
846 | "shortName": "asl20", | |
|
847 | "spdxId": "Apache-2.0", | |
|
848 | "url": "http://spdx.org/licenses/Apache-2.0.html" | |
|
849 | } | |
|
850 | ], | |
|
851 | "name": "python2.7-bleach-2.1.4" | |
|
852 | }, | |
|
853 | { | |
|
854 | "license": [ | |
|
855 | { | |
|
856 | "fullName": "MIT License", | |
|
857 | "shortName": "mit", | |
|
858 | "spdxId": "MIT", | |
|
859 | "url": "http://spdx.org/licenses/MIT.html" | |
|
860 | } | |
|
861 | ], | |
|
862 | "name": "python2.7-html5lib-1.0.1" | |
|
863 | }, | |
|
864 | { | |
|
865 | "license": [ | |
|
866 | { | |
|
867 | "fullName": "BSD 4-clause \"Original\" or \"Old\" License", | |
|
868 | "shortName": "bsdOriginal", | |
|
869 | "spdxId": "BSD-4-Clause", | |
|
870 | "url": "http://spdx.org/licenses/BSD-4-Clause.html" | |
|
871 | } | |
|
872 | ], | |
|
873 | "name": "python2.7-webencodings-0.5.1" | |
|
874 | }, | |
|
875 | { | |
|
876 | "license": [ | |
|
877 | { | |
|
878 | 849 | "fullName": "BSD 4-clause \"Original\" or \"Old\" License", |
|
879 | 850 | "shortName": "bsdOriginal", |
|
880 | 851 | "spdxId": "BSD-4-Clause", |
@@ -892,7 +863,7 b'' | |||
|
892 | 863 | "url": "http://spdx.org/licenses/MIT.html" |
|
893 | 864 | } |
|
894 | 865 | ], |
|
895 | "name": "python2.7-testpath-0. | |
|
866 | "name": "python2.7-testpath-0.4.2" | |
|
896 | 867 | }, |
|
897 | 868 | { |
|
898 | 869 | "license": [ |
@@ -908,6 +879,28 b'' | |||
|
908 | 879 | { |
|
909 | 880 | "license": [ |
|
910 | 881 | { |
|
882 | "fullName": "Apache License 2.0", | |
|
883 | "shortName": "asl20", | |
|
884 | "spdxId": "Apache-2.0", | |
|
885 | "url": "http://spdx.org/licenses/Apache-2.0.html" | |
|
886 | } | |
|
887 | ], | |
|
888 | "name": "python2.7-bleach-3.1.0" | |
|
889 | }, | |
|
890 | { | |
|
891 | "license": [ | |
|
892 | { | |
|
893 | "fullName": "BSD 4-clause \"Original\" or \"Old\" License", | |
|
894 | "shortName": "bsdOriginal", | |
|
895 | "spdxId": "BSD-4-Clause", | |
|
896 | "url": "http://spdx.org/licenses/BSD-4-Clause.html" | |
|
897 | } | |
|
898 | ], | |
|
899 | "name": "python2.7-webencodings-0.5.1" | |
|
900 | }, | |
|
901 | { | |
|
902 | "license": [ | |
|
903 | { | |
|
911 | 904 | "fullName": "MIT License", |
|
912 | 905 | "shortName": "mit", |
|
913 | 906 | "spdxId": "MIT", |
@@ -925,7 +918,7 b'' | |||
|
925 | 918 | "url": "http://spdx.org/licenses/MIT.html" |
|
926 | 919 | } |
|
927 | 920 | ], |
|
928 | "name": "python2.7-configparser-3. | |
|
921 | "name": "python2.7-configparser-3.7.3" | |
|
929 | 922 | }, |
|
930 | 923 | { |
|
931 | 924 | "license": [ |
@@ -947,7 +940,73 b'' | |||
|
947 | 940 | "url": "http://spdx.org/licenses/BSD-4-Clause.html" |
|
948 | 941 | } |
|
949 | 942 | ], |
|
950 | "name": "python2.7-mistune-0.8. | |
|
943 | "name": "python2.7-mistune-0.8.4" | |
|
944 | }, | |
|
945 | { | |
|
946 | "license": { | |
|
947 | "fullName": "GNU Lesser General Public License v3.0 or later", | |
|
948 | "shortName": "lgpl3Plus", | |
|
949 | "spdxId": "LGPL-3.0+", | |
|
950 | "url": "http://spdx.org/licenses/LGPL-3.0+.html" | |
|
951 | }, | |
|
952 | "name": "python2.7-psycopg2-2.7.7" | |
|
953 | }, | |
|
954 | { | |
|
955 | "license": { | |
|
956 | "fullName": "PostgreSQL License", | |
|
957 | "shortName": "postgresql", | |
|
958 | "spdxId": "PostgreSQL", | |
|
959 | "url": "http://spdx.org/licenses/PostgreSQL.html" | |
|
960 | }, | |
|
961 | "name": "postgresql-9.6.10" | |
|
962 | }, | |
|
963 | { | |
|
964 | "license": [ | |
|
965 | { | |
|
966 | "fullName": "zlib License", | |
|
967 | "shortName": "zlib", | |
|
968 | "spdxId": "Zlib", | |
|
969 | "url": "http://spdx.org/licenses/Zlib.html" | |
|
970 | }, | |
|
971 | { | |
|
972 | "fullName": "libpng License", | |
|
973 | "shortName": "libpng", | |
|
974 | "spdxId": "Libpng", | |
|
975 | "url": "http://spdx.org/licenses/Libpng.html" | |
|
976 | } | |
|
977 | ], | |
|
978 | "name": "python2.7-pysqlite-2.8.3" | |
|
979 | }, | |
|
980 | { | |
|
981 | "license": [ | |
|
982 | { | |
|
983 | "fullName": "MIT License", | |
|
984 | "shortName": "mit", | |
|
985 | "spdxId": "MIT", | |
|
986 | "url": "http://spdx.org/licenses/MIT.html" | |
|
987 | } | |
|
988 | ], | |
|
989 | "name": "python2.7-pymysql-0.8.1" | |
|
990 | }, | |
|
991 | { | |
|
992 | "license": [ | |
|
993 | { | |
|
994 | "fullName": "GNU General Public License v1.0 only", | |
|
995 | "shortName": "gpl1", | |
|
996 | "spdxId": "GPL-1.0", | |
|
997 | "url": "http://spdx.org/licenses/GPL-1.0.html" | |
|
998 | } | |
|
999 | ], | |
|
1000 | "name": "python2.7-mysql-python-1.2.5" | |
|
1001 | }, | |
|
1002 | { | |
|
1003 | "license": { | |
|
1004 | "fullName": "GNU Library General Public License v2.1 only", | |
|
1005 | "shortName": "lgpl21", | |
|
1006 | "spdxId": "LGPL-2.1", | |
|
1007 | "url": "http://spdx.org/licenses/LGPL-2.1.html" | |
|
1008 | }, | |
|
1009 | "name": "mariadb-connector-c-2.3.4" | |
|
951 | 1010 | }, |
|
952 | 1011 | { |
|
953 | 1012 | "license": [ |
@@ -958,7 +1017,7 b'' | |||
|
958 | 1017 | "url": "http://spdx.org/licenses/ZPL-2.1.html" |
|
959 | 1018 | } |
|
960 | 1019 | ], |
|
961 | "name": "python2.7-zope.interface-4. | |
|
1020 | "name": "python2.7-zope.interface-4.6.0" | |
|
962 | 1021 | }, |
|
963 | 1022 | { |
|
964 | 1023 | "license": [ |
@@ -969,7 +1028,7 b'' | |||
|
969 | 1028 | "url": "http://spdx.org/licenses/ZPL-2.1.html" |
|
970 | 1029 | } |
|
971 | 1030 | ], |
|
972 | "name": "python2.7-zope.event-4. | |
|
1031 | "name": "python2.7-zope.event-4.4" | |
|
973 | 1032 | }, |
|
974 | 1033 | { |
|
975 | 1034 | "license": [ |
@@ -980,7 +1039,7 b'' | |||
|
980 | 1039 | "url": "http://spdx.org/licenses/ZPL-2.1.html" |
|
981 | 1040 | } |
|
982 | 1041 | ], |
|
983 | "name": "python2.7-zope.deprecation-4. | |
|
1042 | "name": "python2.7-zope.deprecation-4.4.0" | |
|
984 | 1043 | }, |
|
985 | 1044 | { |
|
986 | 1045 | "license": [ |
@@ -1043,7 +1102,7 b'' | |||
|
1043 | 1102 | "url": "http://spdx.org/licenses/MIT.html" |
|
1044 | 1103 | } |
|
1045 | 1104 | ], |
|
1046 | "name": "python2.7-paste- | |
|
1105 | "name": "python2.7-paste-3.0.5" | |
|
1047 | 1106 | }, |
|
1048 | 1107 | { |
|
1049 | 1108 | "license": [ |
@@ -1061,7 +1120,7 b'' | |||
|
1061 | 1120 | "fullName": "Repoze License", |
|
1062 | 1121 | "url": "http://www.repoze.org/LICENSE.txt" |
|
1063 | 1122 | }, |
|
1064 | "name": "python2.7-venusian-1. | |
|
1123 | "name": "python2.7-venusian-1.2.0" | |
|
1065 | 1124 | }, |
|
1066 | 1125 | { |
|
1067 | 1126 | "license": { |
@@ -1072,17 +1131,6 b'' | |||
|
1072 | 1131 | "name": "python2.7-urlobject-2.4.3" |
|
1073 | 1132 | }, |
|
1074 | 1133 | { |
|
1075 | "license": [ | |
|
1076 | { | |
|
1077 | "fullName": "Apache License 2.0", | |
|
1078 | "shortName": "asl20", | |
|
1079 | "spdxId": "Apache-2.0", | |
|
1080 | "url": "http://spdx.org/licenses/Apache-2.0.html" | |
|
1081 | } | |
|
1082 | ], | |
|
1083 | "name": "python2.7-trollius-1.0.4" | |
|
1084 | }, | |
|
1085 | { | |
|
1086 | 1134 | "license": { |
|
1087 | 1135 | "fullName": "Repoze License", |
|
1088 | 1136 | "url": "http://www.repoze.org/LICENSE.txt" |
@@ -1095,7 +1143,7 b'' | |||
|
1095 | 1143 | "fullName": "BSD-derived (http://www.repoze.org/LICENSE.txt)" |
|
1096 | 1144 | } |
|
1097 | 1145 | ], |
|
1098 | "name": "python2.7-supervisor-3.3. | |
|
1146 | "name": "python2.7-supervisor-3.3.5" | |
|
1099 | 1147 | }, |
|
1100 | 1148 | { |
|
1101 | 1149 | "license": [ |
@@ -1114,7 +1162,7 b'' | |||
|
1114 | 1162 | "url": "http://spdx.org/licenses/Python-2.0.html" |
|
1115 | 1163 | } |
|
1116 | 1164 | ], |
|
1117 | "name": "python2.7-subprocess32-3.5. | |
|
1165 | "name": "python2.7-subprocess32-3.5.3" | |
|
1118 | 1166 | }, |
|
1119 | 1167 | { |
|
1120 | 1168 | "license": [ |
@@ -1125,7 +1173,7 b'' | |||
|
1125 | 1173 | "url": "http://spdx.org/licenses/BSD-4-Clause.html" |
|
1126 | 1174 | } |
|
1127 | 1175 | ], |
|
1128 | "name": "python2.7-sshpubkeys- | |
|
1176 | "name": "python2.7-sshpubkeys-3.1.0" | |
|
1129 | 1177 | }, |
|
1130 | 1178 | { |
|
1131 | 1179 | "license": [ |
@@ -1141,11 +1189,55 b'' | |||
|
1141 | 1189 | { |
|
1142 | 1190 | "license": [ |
|
1143 | 1191 | { |
|
1144 | "fullName": " |

1145 | "shortName": " |

1192 | "fullName": "BSD 4-clause \"Original\" or \"Old\" License", | |
|
1193 | "shortName": "bsdOriginal", | |
|
1194 | "spdxId": "BSD-4-Clause", | |
|
1195 | "url": "http://spdx.org/licenses/BSD-4-Clause.html" | |
|
1196 | }, | |
|
1197 | { | |
|
1198 | "fullName": "BSD or Apache License, Version 2.0" | |
|
1199 | }, | |
|
1200 | { | |
|
1201 | "fullName": "Apache License 2.0", | |
|
1202 | "shortName": "asl20", | |
|
1203 | "spdxId": "Apache-2.0", | |
|
1204 | "url": "http://spdx.org/licenses/Apache-2.0.html" | |
|
1146 | 1205 | } |
|
1147 | 1206 | ], |
|
1148 | "name": "python2.7- |

1207 | "name": "python2.7-cryptography-2.5" | |
|
1208 | }, | |
|
1209 | { | |
|
1210 | "license": [ | |
|
1211 | { | |
|
1212 | "fullName": "MIT License", | |
|
1213 | "shortName": "mit", | |
|
1214 | "spdxId": "MIT", | |
|
1215 | "url": "http://spdx.org/licenses/MIT.html" | |
|
1216 | } | |
|
1217 | ], | |
|
1218 | "name": "python2.7-cffi-1.12.1" | |
|
1219 | }, | |
|
1220 | { | |
|
1221 | "license": [ | |
|
1222 | { | |
|
1223 | "fullName": "BSD 4-clause \"Original\" or \"Old\" License", | |
|
1224 | "shortName": "bsdOriginal", | |
|
1225 | "spdxId": "BSD-4-Clause", | |
|
1226 | "url": "http://spdx.org/licenses/BSD-4-Clause.html" | |
|
1227 | } | |
|
1228 | ], | |
|
1229 | "name": "python2.7-pycparser-2.19" | |
|
1230 | }, | |
|
1231 | { | |
|
1232 | "license": [ | |
|
1233 | { | |
|
1234 | "fullName": "MIT License", | |
|
1235 | "shortName": "mit", | |
|
1236 | "spdxId": "MIT", | |
|
1237 | "url": "http://spdx.org/licenses/MIT.html" | |
|
1238 | } | |
|
1239 | ], | |
|
1240 | "name": "python2.7-asn1crypto-0.24.0" | |
|
1149 | 1241 | }, |
|
1150 | 1242 | { |
|
1151 | 1243 | "license": [ |
@@ -1159,18 +1251,7 b'' | |||
|
1159 | 1251 | "url": "http://spdx.org/licenses/MIT.html" |
|
1160 | 1252 | } |
|
1161 | 1253 | ], |
|
1162 | "name": "python2.7-simplejson-3.1 |

1163 | }, | |
|
1164 | { | |
|
1165 | "license": [ | |
|
1166 | { | |
|
1167 | "fullName": "BSD 4-clause \"Original\" or \"Old\" License", | |
|
1168 | "shortName": "bsdOriginal", | |
|
1169 | "spdxId": "BSD-4-Clause", | |
|
1170 | "url": "http://spdx.org/licenses/BSD-4-Clause.html" | |
|
1171 | } | |
|
1172 | ], | |
|
1173 | "name": "python2.7-setproctitle-1.1.10" | |
|
1254 | "name": "python2.7-simplejson-3.16.0" | |
|
1174 | 1255 | }, |
|
1175 | 1256 | { |
|
1176 | 1257 | "license": [ |
@@ -1210,7 +1291,7 b'' | |||
|
1210 | 1291 | "url": "http://spdx.org/licenses/BSD-4-Clause.html" |
|
1211 | 1292 | } |
|
1212 | 1293 | ], |
|
1213 | "name": "python2.7-py-gfm-0.1. |

1294 | "name": "python2.7-py-gfm-0.1.4" | |
|
1214 | 1295 | }, |
|
1215 | 1296 | { |
|
1216 | 1297 | "license": [ |
@@ -1248,6 +1329,72 b'' | |||
|
1248 | 1329 | { |
|
1249 | 1330 | "license": [ |
|
1250 | 1331 | { |
|
1332 | "fullName": "MIT License", | |
|
1333 | "shortName": "mit", | |
|
1334 | "spdxId": "MIT", | |
|
1335 | "url": "http://spdx.org/licenses/MIT.html" | |
|
1336 | } | |
|
1337 | ], | |
|
1338 | "name": "python2.7-python-saml-2.4.2" | |
|
1339 | }, | |
|
1340 | { | |
|
1341 | "license": [ | |
|
1342 | { | |
|
1343 | "fullName": "Python Software Foundation License version 2", | |
|
1344 | "shortName": "psfl", | |
|
1345 | "spdxId": "Python-2.0", | |
|
1346 | "url": "http://spdx.org/licenses/Python-2.0.html" | |
|
1347 | } | |
|
1348 | ], | |
|
1349 | "name": "python2.7-defusedxml-0.5.0" | |
|
1350 | }, | |
|
1351 | { | |
|
1352 | "license": [ | |
|
1353 | { | |
|
1354 | "fullName": "BSD 4-clause \"Original\" or \"Old\" License", | |
|
1355 | "shortName": "bsdOriginal", | |
|
1356 | "spdxId": "BSD-4-Clause", | |
|
1357 | "url": "http://spdx.org/licenses/BSD-4-Clause.html" | |
|
1358 | } | |
|
1359 | ], | |
|
1360 | "name": "python2.7-isodate-0.6.0" | |
|
1361 | }, | |
|
1362 | { | |
|
1363 | "license": [ | |
|
1364 | { | |
|
1365 | "fullName": "BSD 4-clause \"Original\" or \"Old\" License", | |
|
1366 | "shortName": "bsdOriginal", | |
|
1367 | "spdxId": "BSD-4-Clause", | |
|
1368 | "url": "http://spdx.org/licenses/BSD-4-Clause.html" | |
|
1369 | } | |
|
1370 | ], | |
|
1371 | "name": "python2.7-dm.xmlsec.binding-1.3.7" | |
|
1372 | }, | |
|
1373 | { | |
|
1374 | "license": [ | |
|
1375 | { | |
|
1376 | "fullName": "BSD 4-clause \"Original\" or \"Old\" License", | |
|
1377 | "shortName": "bsdOriginal", | |
|
1378 | "spdxId": "BSD-4-Clause", | |
|
1379 | "url": "http://spdx.org/licenses/BSD-4-Clause.html" | |
|
1380 | } | |
|
1381 | ], | |
|
1382 | "name": "python2.7-lxml-4.2.5" | |
|
1383 | }, | |
|
1384 | { | |
|
1385 | "license": [ | |
|
1386 | { | |
|
1387 | "fullName": "MIT License", | |
|
1388 | "shortName": "mit", | |
|
1389 | "spdxId": "MIT", | |
|
1390 | "url": "http://spdx.org/licenses/MIT.html" | |
|
1391 | } | |
|
1392 | ], | |
|
1393 | "name": "python2.7-wheel-0.30.0" | |
|
1394 | }, | |
|
1395 | { | |
|
1396 | "license": [ | |
|
1397 | { | |
|
1251 | 1398 | "fullName": "License :: OSI Approved :: MIT License" |
|
1252 | 1399 | }, |
|
1253 | 1400 | { |
@@ -1260,14 +1407,7 b'' | |||
|
1260 | 1407 | "name": "python2.7-python-pam-1.8.4" |
|
1261 | 1408 | }, |
|
1262 | 1409 | { |
|
1263 | "license": |

1264 | { | |
|
1265 | "fullName": "GNU General Public License v1.0 only", | |
|
1266 | "shortName": "gpl1", | |
|
1267 | "spdxId": "GPL-1.0", | |
|
1268 | "url": "http://spdx.org/licenses/GPL-1.0.html" | |
|
1269 | } | |
|
1270 | ], | |
|
1410 | "license": "UNKNOWN", | |
|
1271 | 1411 | "name": "linux-pam-1.3.0" |
|
1272 | 1412 | }, |
|
1273 | 1413 | { |
@@ -1302,9 +1442,7 b'' | |||
|
1302 | 1442 | "name": "libkrb5-1.15.2" |
|
1303 | 1443 | }, |
|
1304 | 1444 | { |
|
1305 | "license": |

1306 | "fullName": "BSD-derived (https://www.openldap.org/software/release/license.html)" | |
|
1307 | }, | |
|
1445 | "license": "UNKNOWN", | |
|
1308 | 1446 | "name": "openldap-2.4.45" |
|
1309 | 1447 | }, |
|
1310 | 1448 | { |
@@ -1316,7 +1454,18 b'' | |||
|
1316 | 1454 | "url": "http://spdx.org/licenses/BSD-4-Clause.html" |
|
1317 | 1455 | } |
|
1318 | 1456 | ], |
|
1319 | "name": "python2.7-pyasn1-modules-0.2. |

1457 | "name": "python2.7-pyasn1-modules-0.2.4" | |
|
1458 | }, | |
|
1459 | { | |
|
1460 | "license": [ | |
|
1461 | { | |
|
1462 | "fullName": "BSD 4-clause \"Original\" or \"Old\" License", | |
|
1463 | "shortName": "bsdOriginal", | |
|
1464 | "spdxId": "BSD-4-Clause", | |
|
1465 | "url": "http://spdx.org/licenses/BSD-4-Clause.html" | |
|
1466 | } | |
|
1467 | ], | |
|
1468 | "name": "python2.7-pyasn1-0.4.5" | |
|
1320 | 1469 | }, |
|
1321 | 1470 | { |
|
1322 | 1471 | "license": [ |
@@ -1327,42 +1476,36 b'' | |||
|
1327 | 1476 | "url": "http://spdx.org/licenses/BSD-4-Clause.html" |
|
1328 | 1477 | } |
|
1329 | 1478 | ], |
|
1330 | "name": "python2.7-py |

1479 | "name": "python2.7-pyramid-mailer-0.15.1" | |
|
1331 | 1480 | }, |
|
1332 | 1481 | { |
|
1333 | 1482 | "license": [ |
|
1334 | 1483 | { |
|
1335 | "fullName": " |

1336 | "shortName": "z |

1337 | "spdxId": "Z |

1338 | "url": "http://spdx.org/licenses/Z |

1339 | }, | |
|
1340 | { | |
|
1341 | "fullName": "libpng License", | |
|
1342 | "shortName": "libpng", | |
|
1343 | "spdxId": "Libpng", | |
|
1344 | "url": "http://spdx.org/licenses/Libpng.html" | |
|
1484 | "fullName": "Zope Public License 2.1", | |
|
1485 | "shortName": "zpl21", | |
|
1486 | "spdxId": "ZPL-2.1", | |
|
1487 | "url": "http://spdx.org/licenses/ZPL-2.1.html" | |
|
1345 | 1488 | } |
|
1346 | 1489 | ], |
|
1347 | "name": "python2.7- |

1490 | "name": "python2.7-transaction-2.4.0" | |
|
1491 | }, | |
|
1492 | { | |
|
1493 | "license": [ | |
|
1494 | { | |
|
1495 | "fullName": "Zope Public License 2.1", | |
|
1496 | "shortName": "zpl21", | |
|
1497 | "spdxId": "ZPL-2.1", | |
|
1498 | "url": "http://spdx.org/licenses/ZPL-2.1.html" | |
|
1499 | } | |
|
1500 | ], | |
|
1501 | "name": "python2.7-repoze.sendmail-4.4.1" | |
|
1348 | 1502 | }, |
|
1349 | 1503 | { |
|
1350 | 1504 | "license": { |
|
1351 | 1505 | "fullName": "Repoze License", |
|
1352 | 1506 | "url": "http://www.repoze.org/LICENSE.txt" |
|
1353 | 1507 | }, |
|
1354 | "name": "python2.7-pyramid-1. |

1355 | }, | |
|
1356 | { | |
|
1357 | "license": [ | |
|
1358 | { | |
|
1359 | "fullName": "MIT License", | |
|
1360 | "shortName": "mit", | |
|
1361 | "spdxId": "MIT", | |
|
1362 | "url": "http://spdx.org/licenses/MIT.html" | |
|
1363 | } | |
|
1364 | ], | |
|
1365 | "name": "python2.7-hupper-1.3" | |
|
1508 | "name": "python2.7-pyramid-1.10.1" | |
|
1366 | 1509 | }, |
|
1367 | 1510 | { |
|
1368 | 1511 | "license": [ |
@@ -1395,7 +1538,18 b'' | |||
|
1395 | 1538 | "url": "http://spdx.org/licenses/MIT.html" |
|
1396 | 1539 | } |
|
1397 | 1540 | ], |
|
1398 | "name": "python2.7-pastedeploy- |

1541 | "name": "python2.7-pastedeploy-2.0.1" | |
|
1542 | }, | |
|
1543 | { | |
|
1544 | "license": [ | |
|
1545 | { | |
|
1546 | "fullName": "MIT License", | |
|
1547 | "shortName": "mit", | |
|
1548 | "spdxId": "MIT", | |
|
1549 | "url": "http://spdx.org/licenses/MIT.html" | |
|
1550 | } | |
|
1551 | ], | |
|
1552 | "name": "python2.7-hupper-1.5" | |
|
1399 | 1553 | }, |
|
1400 | 1554 | { |
|
1401 | 1555 | "license": { |
@@ -1428,18 +1582,7 b'' | |||
|
1428 | 1582 | "url": "http://www.repoze.org/LICENSE.txt" |
|
1429 | 1583 | } |
|
1430 | 1584 | ], |
|
1431 | "name": "python2.7-pyramid-debugtoolbar-4. |

1432 | }, | |
|
1433 | { | |
|
1434 | "license": [ | |
|
1435 | { | |
|
1436 | "fullName": "Python Software Foundation License version 2", | |
|
1437 | "shortName": "psfl", | |
|
1438 | "spdxId": "Python-2.0", | |
|
1439 | "url": "http://spdx.org/licenses/Python-2.0.html" | |
|
1440 | } | |
|
1441 | ], | |
|
1442 | "name": "python2.7-ipaddress-1.0.22" | |
|
1585 | "name": "python2.7-pyramid-debugtoolbar-4.5" | |
|
1443 | 1586 | }, |
|
1444 | 1587 | { |
|
1445 | 1588 | "license": { |
@@ -1468,29 +1611,16 b'' | |||
|
1468 | 1611 | "url": "http://spdx.org/licenses/MIT.html" |
|
1469 | 1612 | } |
|
1470 | 1613 | ], |
|
1471 | "name": "python2.7-pyparsing- |

1614 | "name": "python2.7-pyparsing-2.3.0" | |
|
1472 | 1615 | }, |
|
1473 | 1616 | { |
|
1474 | 1617 | "license": [ |
|
1475 | 1618 | { |
|
1476 | "fullName": " |

1477 | "shortName": " |

1478 | "spdxId": "Apache-2.0", | |
|
1479 | "url": "http://spdx.org/licenses/Apache-2.0.html" | |
|
1619 | "fullName": "Public Domain", | |
|
1620 | "shortName": "publicDomain" | |
|
1480 | 1621 | } |
|
1481 | 1622 | ], |
|
1482 | "name": "python2.7-py |

1483 | }, | |
|
1484 | { | |
|
1485 | "license": [ | |
|
1486 | { | |
|
1487 | "fullName": "MIT License", | |
|
1488 | "shortName": "mit", | |
|
1489 | "spdxId": "MIT", | |
|
1490 | "url": "http://spdx.org/licenses/MIT.html" | |
|
1491 | } | |
|
1492 | ], | |
|
1493 | "name": "python2.7-pyflakes-0.8.1" | |
|
1623 | "name": "python2.7-pycrypto-2.6.1" | |
|
1494 | 1624 | }, |
|
1495 | 1625 | { |
|
1496 | 1626 | "license": { |
@@ -1513,30 +1643,12 b'' | |||
|
1513 | 1643 | "name": "python2.7-py-bcrypt-0.4" |
|
1514 | 1644 | }, |
|
1515 | 1645 | { |
|
1516 | "license": { | |
|
1517 | "fullName": "GNU Lesser General Public License v3.0 or later", | |
|
1518 | "shortName": "lgpl3Plus", | |
|
1519 | "spdxId": "LGPL-3.0+", | |
|
1520 | "url": "http://spdx.org/licenses/LGPL-3.0+.html" | |
|
1521 | }, | |
|
1522 | "name": "python2.7-psycopg2-2.7.4" | |
|
1523 | }, | |
|
1524 | { | |
|
1525 | "license": { | |
|
1526 | "fullName": "PostgreSQL License", | |
|
1527 | "shortName": "postgresql", | |
|
1528 | "spdxId": "PostgreSQL", | |
|
1529 | "url": "http://spdx.org/licenses/PostgreSQL.html" | |
|
1530 | }, | |
|
1531 | "name": "postgresql-9.6.10" | |
|
1532 | }, | |
|
1533 | { | |
|
1534 | 1646 | "license": [ |
|
1535 | 1647 | { |
|
1536 | 1648 | "fullName": "BSD-derived (http://www.repoze.org/LICENSE.txt)" |
|
1537 | 1649 | } |
|
1538 | 1650 | ], |
|
1539 | "name": "python2.7-peppercorn-0. |

1651 | "name": "python2.7-peppercorn-0.6" | |
|
1540 | 1652 | }, |
|
1541 | 1653 | { |
|
1542 | 1654 | "license": [ |
@@ -1547,7 +1659,7 b'' | |||
|
1547 | 1659 | "url": "http://spdx.org/licenses/MIT.html" |
|
1548 | 1660 | } |
|
1549 | 1661 | ], |
|
1550 | "name": "python2.7-pastescript- |

1662 | "name": "python2.7-pastescript-3.0.0" | |
|
1551 | 1663 | }, |
|
1552 | 1664 | { |
|
1553 | 1665 | "license": [ |
@@ -1569,60 +1681,7 b'' | |||
|
1569 | 1681 | "url": "http://spdx.org/licenses/MIT.html" |
|
1570 | 1682 | } |
|
1571 | 1683 | ], |
|
1572 | "name": "python2.7- |

1573 | }, | |
|
1574 | { | |
|
1575 | "license": [ | |
|
1576 | { | |
|
1577 | "fullName": "MIT License", | |
|
1578 | "shortName": "mit", | |
|
1579 | "spdxId": "MIT", | |
|
1580 | "url": "http://spdx.org/licenses/MIT.html" | |
|
1581 | } | |
|
1582 | ], | |
|
1583 | "name": "python2.7-graphviz-0.9" | |
|
1584 | }, | |
|
1585 | { | |
|
1586 | "license": [ | |
|
1587 | { | |
|
1588 | "fullName": "BSD 4-clause \"Original\" or \"Old\" License", | |
|
1589 | "shortName": "bsdOriginal", | |
|
1590 | "spdxId": "BSD-4-Clause", | |
|
1591 | "url": "http://spdx.org/licenses/BSD-4-Clause.html" | |
|
1592 | } | |
|
1593 | ], | |
|
1594 | "name": "python2.7-pyotp-2.2.6" | |
|
1595 | }, | |
|
1596 | { | |
|
1597 | "license": [ | |
|
1598 | { | |
|
1599 | "fullName": "MIT License", | |
|
1600 | "shortName": "mit", | |
|
1601 | "spdxId": "MIT", | |
|
1602 | "url": "http://spdx.org/licenses/MIT.html" | |
|
1603 | } | |
|
1604 | ], | |
|
1605 | "name": "python2.7-pymysql-0.8.1" | |
|
1606 | }, | |
|
1607 | { | |
|
1608 | "license": [ | |
|
1609 | { | |
|
1610 | "fullName": "GNU General Public License v1.0 only", | |
|
1611 | "shortName": "gpl1", | |
|
1612 | "spdxId": "GPL-1.0", | |
|
1613 | "url": "http://spdx.org/licenses/GPL-1.0.html" | |
|
1614 | } | |
|
1615 | ], | |
|
1616 | "name": "python2.7-mysql-python-1.2.5" | |
|
1617 | }, | |
|
1618 | { | |
|
1619 | "license": { | |
|
1620 | "fullName": "GNU Library General Public License v2.1 only", | |
|
1621 | "shortName": "lgpl21", | |
|
1622 | "spdxId": "LGPL-2.1", | |
|
1623 | "url": "http://spdx.org/licenses/LGPL-2.1.html" | |
|
1624 | }, | |
|
1625 | "name": "mariadb-connector-c-2.3.4" | |
|
1684 | "name": "python2.7-pyotp-2.2.7" | |
|
1626 | 1685 | }, |
|
1627 | 1686 | { |
|
1628 | 1687 | "license": [ |
@@ -1644,29 +1703,7 b'' | |||
|
1644 | 1703 | "url": "http://spdx.org/licenses/BSD-4-Clause.html" |
|
1645 | 1704 | } |
|
1646 | 1705 | ], |
|
1647 | "name": "python2.7- |

1648 | }, | |
|
1649 | { | |
|
1650 | "license": [ | |
|
1651 | { | |
|
1652 | "fullName": "MIT License", | |
|
1653 | "shortName": "mit", | |
|
1654 | "spdxId": "MIT", | |
|
1655 | "url": "http://spdx.org/licenses/MIT.html" | |
|
1656 | } | |
|
1657 | ], | |
|
1658 | "name": "python2.7-wheel-0.30.0" | |
|
1659 | }, | |
|
1660 | { | |
|
1661 | "license": [ | |
|
1662 | { | |
|
1663 | "fullName": "BSD 4-clause \"Original\" or \"Old\" License", | |
|
1664 | "shortName": "bsdOriginal", | |
|
1665 | "spdxId": "BSD-4-Clause", | |
|
1666 | "url": "http://spdx.org/licenses/BSD-4-Clause.html" | |
|
1667 | } | |
|
1668 | ], | |
|
1669 | "name": "python2.7-kombu-4.2.0" | |
|
1706 | "name": "python2.7-kombu-4.2.1" | |
|
1670 | 1707 | }, |
|
1671 | 1708 | { |
|
1672 | 1709 | "license": [ |
@@ -1688,18 +1725,7 b'' | |||
|
1688 | 1725 | "url": "http://spdx.org/licenses/BSD-4-Clause.html" |
|
1689 | 1726 | } |
|
1690 | 1727 | ], |
|
1691 | "name": "python2.7-vine-1. |

1692 | }, | |
|
1693 | { | |
|
1694 | "license": [ | |
|
1695 | { | |
|
1696 | "fullName": "BSD 4-clause \"Original\" or \"Old\" License", | |
|
1697 | "shortName": "bsdOriginal", | |
|
1698 | "spdxId": "BSD-4-Clause", | |
|
1699 | "url": "http://spdx.org/licenses/BSD-4-Clause.html" | |
|
1700 | } | |
|
1701 | ], | |
|
1702 | "name": "python2.7-billiard-3.5.0.3" | |
|
1728 | "name": "python2.7-vine-1.2.0" | |
|
1703 | 1729 | }, |
|
1704 | 1730 | { |
|
1705 | 1731 | "license": [ |
@@ -1721,7 +1747,7 b'' | |||
|
1721 | 1747 | "url": "http://spdx.org/licenses/MIT.html" |
|
1722 | 1748 | } |
|
1723 | 1749 | ], |
|
1724 | "name": "python2.7-iso8601-0.1.1 |

1750 | "name": "python2.7-iso8601-0.1.12" | |
|
1725 | 1751 | }, |
|
1726 | 1752 | { |
|
1727 | 1753 | "license": [ |
@@ -1765,7 +1791,7 b'' | |||
|
1765 | 1791 | "url": "http://spdx.org/licenses/BSD-4-Clause.html" |
|
1766 | 1792 | } |
|
1767 | 1793 | ], |
|
1768 | "name": "python2.7-dogpile.cache-0. |

1794 | "name": "python2.7-dogpile.cache-0.7.1" | |
|
1769 | 1795 | }, |
|
1770 | 1796 | { |
|
1771 | 1797 | "license": { |
@@ -1779,17 +1805,20 b'' | |||
|
1779 | 1805 | { |
|
1780 | 1806 | "license": [ |
|
1781 | 1807 | { |
|
1808 | "fullName": "Repoze Public License" | |
|
1809 | }, | |
|
1810 | { | |
|
1782 | 1811 | "fullName": "BSD-derived (http://www.repoze.org/LICENSE.txt)" |
|
1783 | 1812 | } |
|
1784 | 1813 | ], |
|
1785 | "name": "python2.7-deform-2.0. |

1814 | "name": "python2.7-deform-2.0.7" | |
|
1786 | 1815 | }, |
|
1787 | 1816 | { |
|
1788 | 1817 | "license": { |
|
1789 | 1818 | "fullName": "Repoze License", |
|
1790 | 1819 | "url": "http://www.repoze.org/LICENSE.txt" |
|
1791 | 1820 | }, |
|
1792 | "name": "python2.7-colander-1. |

1821 | "name": "python2.7-colander-1.7.0" | |
|
1793 | 1822 | }, |
|
1794 | 1823 | { |
|
1795 | 1824 | "license": [ |
@@ -1863,6 +1892,17 b'' | |||
|
1863 | 1892 | "url": "http://spdx.org/licenses/BSD-4-Clause.html" |
|
1864 | 1893 | } |
|
1865 | 1894 | ], |
|
1895 | "name": "python2.7-billiard-3.5.0.3" | |
|
1896 | }, | |
|
1897 | { | |
|
1898 | "license": [ | |
|
1899 | { | |
|
1900 | "fullName": "BSD 4-clause \"Original\" or \"Old\" License", | |
|
1901 | "shortName": "bsdOriginal", | |
|
1902 | "spdxId": "BSD-4-Clause", | |
|
1903 | "url": "http://spdx.org/licenses/BSD-4-Clause.html" | |
|
1904 | } | |
|
1905 | ], | |
|
1866 | 1906 | "name": "python2.7-babel-1.3" |
|
1867 | 1907 | }, |
|
1868 | 1908 | { |
@@ -1877,25 +1917,7 b'' | |||
|
1877 | 1917 | "name": "python2.7-authomatic-0.1.0.post1" |
|
1878 | 1918 | }, |
|
1879 | 1919 | { |
|
1880 | "license": |

1881 | { | |
|
1882 | "fullName": "MIT License", | |
|
1883 | "shortName": "mit", | |
|
1884 | "spdxId": "MIT", | |
|
1885 | "url": "http://spdx.org/licenses/MIT.html" | |
|
1886 | } | |
|
1887 | ], | |
|
1888 | "name": "node-grunt-cli-1.2.0" | |
|
1889 | }, | |
|
1890 | { | |
|
1891 | "license": [ | |
|
1892 | { | |
|
1893 | "fullName": "MIT License", | |
|
1894 | "shortName": "mit", | |
|
1895 | "spdxId": "MIT", | |
|
1896 | "url": "http://spdx.org/licenses/MIT.html" | |
|
1897 | } | |
|
1898 | ], | |
|
1920 | "license": "UNKNOWN", | |
|
1899 | 1921 | "name": "python2.7-rhodecode-testdata-0.10.0" |
|
1900 | 1922 | } |
|
1901 | 1923 | ] |
@@ -250,7 +250,7 b' def includeme(config):' | |||
|
250 | 250 | |
|
251 | 251 | # Includes which are required. The application would fail without them. |
|
252 | 252 | config.include('pyramid_mako') |
|
253 | config.include(' |

253 | config.include('rhodecode.lib.rc_beaker') | |
|
254 | 254 | config.include('rhodecode.lib.rc_cache') |
|
255 | 255 | |
|
256 | 256 | config.include('rhodecode.apps._base.navigation') |
@@ -28,8 +28,8 b' def _pre_push_hook(*args, **kwargs):' | |||
|
28 | 28 | [{u'hg_env|git_env': ..., |
|
29 | 29 | u'multiple_heads': [], |
|
30 | 30 | u'name': u'default', |
|
31 | u'new_rev': u'd0b |

32 | u'old_rev': u'd0b |

31 | u'new_rev': u'd0b2ae0692e722e01d5677f27a104631cf798b69', | |
|
32 | u'old_rev': u'd0b1ae0692e722e01d5677f27a104631cf798b69', | |
|
33 | 33 | u'ref': u'', |
|
34 | 34 | u'total_commits': 2, |
|
35 | 35 | u'type': u'branch'}] |
@@ -47,13 +47,17 b' def _pre_push_hook(*args, **kwargs):' | |||
|
47 | 47 | forbid_files = repo_extra_fields.get('forbid_files_glob', {}).get('field_value') |
|
48 | 48 | forbid_files = aslist(forbid_files) |
|
49 | 49 | |
|
50 | # forbid_files = ['*'] # example pattern | |
|
51 | ||
|
50 | 52 | # optionally get bytes limit for a single file, e.g 1024 for 1KB |
|
51 | 53 | forbid_size_over = repo_extra_fields.get('forbid_size_over', {}).get('field_value') |
|
52 | 54 | forbid_size_over = int(forbid_size_over or 0) |
|
53 | 55 | |
|
56 | # forbid_size_over = 1024 # example 1024 | |
|
57 | ||
|
54 | 58 | def validate_file_name_and_size(file_data, forbidden_files=None, size_limit=None): |
|
55 | 59 | """ |
|
56 | This function validates com |

60 | This function validates comited files against some sort of rules. | |
|
57 | 61 | It should return a valid boolean, and a reason for failure |
|
58 | 62 | |
|
59 | 63 | file_data =[ |
@@ -87,7 +91,10 b' def _pre_push_hook(*args, **kwargs):' | |||
|
87 | 91 | |
|
88 | 92 | # validate A(dded) files and size |
|
89 | 93 | if size_limit and operation == 'A': |
|
90 | |

94 | if 'file_size' in file_data: | |
|
95 | size = file_data['file_size'] | |
|
96 | else: | |
|
97 | size = len(file_data['raw_diff']) | |
|
91 | 98 | |
|
92 | 99 | reason = 'File {} size of {} bytes exceeds limit {}'.format( |
|
93 | 100 | file_name, format_byte_size_binary(size), |
@@ -34,12 +34,51 b' from rhodecode.lib.vcs.backends.hg.diff ' | |||
|
34 | 34 | from rhodecode.lib.vcs.backends.git.diff import GitDiff |
|
35 | 35 | |
|
36 | 36 | |
|
37 | def get_ |

37 | def get_svn_files(repo, vcs_repo, refs): | |
|
38 | txn_id = refs[0] | |
|
39 | files = [] | |
|
40 | stdout, stderr = vcs_repo.run_svn_command( | |
|
41 | ['svnlook', 'changed', repo.repo_full_path, '--transaction', txn_id]) | |
|
42 | ||
|
43 | svn_op_to_rc_op = { | |
|
44 | 'A': 'A', | |
|
45 | 'U': 'M', | |
|
46 | 'D': 'D', | |
|
47 | } | |
|
48 | ||
|
49 | for entry in stdout.splitlines(): | |
|
50 | parsed_entry = { | |
|
51 | 'raw_diff': '', | |
|
52 | 'filename': '', | |
|
53 | 'chunks': [], | |
|
54 | 'ops': {}, | |
|
55 | 'file_size': 0 | |
|
56 | } | |
|
57 | ||
|
58 | op = entry[0] | |
|
59 | path = entry[1:].strip() | |
|
60 | ||
|
61 | rc_op = svn_op_to_rc_op.get(op) or '?' | |
|
62 | parsed_entry['filename'] = path | |
|
63 | parsed_entry['operation'] = rc_op | |
|
64 | ||
|
65 | if rc_op in ['A', 'M']: | |
|
66 | stdout, stderr = vcs_repo.run_svn_command( | |
|
67 | ['svnlook', 'filesize', repo.repo_full_path, path, '--transaction', txn_id]) | |
|
68 | file_size = int(stdout.strip()) | |
|
69 | parsed_entry['file_size'] = file_size | |
|
70 | ||
|
71 | files.append(parsed_entry) | |
|
72 | ||
|
73 | return files | |
|
74 | ||
|
75 | ||
|
76 | def get_hg_files(repo, vcs_repo, refs): | |
|
38 | 77 | files = [] |
|
39 | 78 | return files |
|
40 | 79 | |
|
41 | 80 | |
|
42 | def get_git_files(repo, refs): | |
|
81 | def get_git_files(repo, vcs_repo, refs): | |
|
43 | 82 | files = [] |
|
44 | 83 | |
|
45 | 84 | for data in refs: |
@@ -57,7 +96,7 b' def get_git_files(repo, refs):' | |||
|
57 | 96 | 'diff', old_rev, new_rev |
|
58 | 97 | ] |
|
59 | 98 | |
|
60 | stdout, stderr = repo.run_git_command(cmd, extra_env=git_env) | |
|
99 | stdout, stderr = vcs_repo.run_git_command(cmd, extra_env=git_env) | |
|
61 | 100 | vcs_diff = GitDiff(stdout) |
|
62 | 101 | |
|
63 | 102 | diff_processor = diffs.DiffProcessor(vcs_diff, format='newdiff') |
@@ -86,11 +125,14 b' def run(*args, **kwargs):' | |||
|
86 | 125 | if vcs_type == 'git': |
|
87 | 126 | for rev_data in kwargs['commit_ids']: |
|
88 | 127 | new_environ = dict((k, v) for k, v in rev_data['git_env']) |
|
89 | files = get_git_files(vcs_repo, kwargs['commit_ids']) | |
|
128 | files = get_git_files(repo, vcs_repo, kwargs['commit_ids']) | |
|
90 | 129 | |
|
91 | 130 | if vcs_type == 'hg': |
|
92 | 131 | for rev_data in kwargs['commit_ids']: |
|
93 | 132 | new_environ = dict((k, v) for k, v in rev_data['hg_env']) |
|
94 | files = get_hg_files(vcs_repo, kwargs['commit_ids']) | |
|
133 | files = get_hg_files(repo, vcs_repo, kwargs['commit_ids']) | |
|
134 | ||
|
135 | if vcs_type == 'svn': | |
|
136 | files = get_svn_files(repo, vcs_repo, kwargs['commit_ids']) | |
|
95 | 137 | |
|
96 | 138 | return files |
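
The parsing step inside `get_svn_files` above can be shown without an actual `svnlook` binary: each output line of `svnlook changed` starts with a status letter followed by a path, and the letters are mapped onto the A/M/D operations the hook expects ('?' for anything unrecognized). A sketch of just that mapping, with the subprocess call left out:

```python
# svnlook status letter -> operation code used by the hook
SVN_OP_TO_RC_OP = {'A': 'A', 'U': 'M', 'D': 'D'}

def parse_svnlook_changed(stdout):
    """Parse `svnlook changed` output into {filename, operation} entries."""
    files = []
    for entry in stdout.splitlines():
        if not entry:
            continue
        op = entry[0]
        path = entry[1:].strip()
        files.append({
            'filename': path,
            'operation': SVN_OP_TO_RC_OP.get(op) or '?',
        })
    return files

parsed = parse_svnlook_changed('A   docs/readme.rst\nU   setup.py\nD   old.txt\n')
```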
@@ -402,7 +402,8 b' class RepoIntegrationsView(IntegrationSe' | |||
|
402 | 402 | c.rhodecode_db_repo = self.repo |
|
403 | 403 | c.repo_name = self.db_repo.repo_name |
|
404 | 404 | c.repository_pull_requests = ScmModel().get_pull_requests(self.repo) |
|
405 | ||
|
405 | c.repository_is_user_following = ScmModel().is_following_repo( | |
|
406 | c.repo_name, self._rhodecode_user.user_id) | |
|
406 | 407 | c.has_origin_repo_read_perm = False |
|
407 | 408 | if self.db_repo.fork: |
|
408 | 409 | c.has_origin_repo_read_perm = h.HasRepoPermissionAny( |
@@ -92,6 +92,9 b' ACTIONS_V1 = {' | |||
|
92 | 92 | 'repo.commit.comment.delete': {'data': {}}, |
|
93 | 93 | 'repo.commit.vote': '', |
|
94 | 94 | |
|
95 | 'repo.artifact.add': '', | |
|
96 | 'repo.artifact.delete': '', | |
|
97 | ||
|
95 | 98 | 'repo_group.create': {'data': {}}, |
|
96 | 99 | 'repo_group.edit': {'old_data': {}}, |
|
97 | 100 | 'repo_group.edit.permissions': {}, |
@@ -2078,8 +2078,7 b' class HasRepoPermissionAny(PermsFunction' | |||
|
2078 | 2078 | class HasRepoGroupPermissionAny(PermsFunction): |
|
2079 | 2079 | def __call__(self, group_name=None, check_location='', user=None): |
|
2080 | 2080 | self.repo_group_name = group_name |
|
2081 | return super(HasRepoGroupPermissionAny, self).__call__( | |
|
2082 | check_location, user) | |
|
2081 | return super(HasRepoGroupPermissionAny, self).__call__(check_location, user) | |
|
2083 | 2082 | |
|
2084 | 2083 | def check_permissions(self, user): |
|
2085 | 2084 | perms = user.permissions |
@@ -2095,8 +2094,7 b' class HasRepoGroupPermissionAny(PermsFun' | |||
|
2095 | 2094 | class HasRepoGroupPermissionAll(PermsFunction): |
|
2096 | 2095 | def __call__(self, group_name=None, check_location='', user=None): |
|
2097 | 2096 | self.repo_group_name = group_name |
|
2098 | return super(HasRepoGroupPermissionAll, self).__call__( | |
|
2099 | check_location, user) | |
|
2097 | return super(HasRepoGroupPermissionAll, self).__call__(check_location, user) | |
|
2100 | 2098 | |
|
2101 | 2099 | def check_permissions(self, user): |
|
2102 | 2100 | perms = user.permissions |
@@ -2112,8 +2110,7 b' class HasRepoGroupPermissionAll(PermsFun' | |||
|
2112 | 2110 | class HasUserGroupPermissionAny(PermsFunction): |
|
2113 | 2111 | def __call__(self, user_group_name=None, check_location='', user=None): |
|
2114 | 2112 | self.user_group_name = user_group_name |
|
2115 | return super(HasUserGroupPermissionAny, self).__call__( | |
|
2116 | check_location, user) | |
|
2113 | return super(HasUserGroupPermissionAny, self).__call__(check_location, user) | |
|
2117 | 2114 | |
|
2118 | 2115 | def check_permissions(self, user): |
|
2119 | 2116 | perms = user.permissions |
@@ -2129,8 +2126,7 b' class HasUserGroupPermissionAny(PermsFun' | |||
|
2129 | 2126 | class HasUserGroupPermissionAll(PermsFunction): |
|
2130 | 2127 | def __call__(self, user_group_name=None, check_location='', user=None): |
|
2131 | 2128 | self.user_group_name = user_group_name |
|
2132 | return super(HasUserGroupPermissionAll, self).__call__( | |
|
2133 | check_location, user) | |
|
2129 | return super(HasUserGroupPermissionAll, self).__call__(check_location, user) | |
|
2134 | 2130 | |
|
2135 | 2131 | def check_permissions(self, user): |
|
2136 | 2132 | perms = user.permissions |
@@ -288,7 +288,6 b' def attach_context_attributes(context, r' | |||
|
288 | 288 | """ |
|
289 | 289 | config = request.registry.settings |
|
290 | 290 | |
|
291 | ||
|
292 | 291 | rc_config = SettingsModel().get_all_settings(cache=True) |
|
293 | 292 | |
|
294 | 293 | context.rhodecode_version = rhodecode.__version__ |
@@ -375,20 +374,25 b' def attach_context_attributes(context, r' | |||
|
375 | 374 | "sideside": "sideside" |
|
376 | 375 | }.get(request.GET.get('diffmode')) |
|
377 | 376 | |
|
378 | if diffmode and diffmode != request.session.get('rc_user_session_attr.diffmode'): | |
|
379 | request.session['rc_user_session_attr.diffmode'] = diffmode | |
|
380 | ||
|
381 | # session settings per user | |
|
377 | is_api = hasattr(request, 'rpc_user') | |
|
382 | 378 | session_attrs = { |
|
383 | 379 | # defaults |
|
384 | 380 | "clone_url_format": "http", |
|
385 | 381 | "diffmode": "sideside" |
|
386 | 382 | } |
|
387 | for k, v in request.session.items(): | |
|
388 | pref = 'rc_user_session_attr.' | |
|
389 | if k and k.startswith(pref): | |
|
390 | k = k[len(pref):] | |
|
391 | session_attrs[k] = v | |
|
383 | ||
|
384 | if not is_api: | |
|
385 | # don't access pyramid session for API calls | |
|
386 | if diffmode and diffmode != request.session.get('rc_user_session_attr.diffmode'): | |
|
387 | request.session['rc_user_session_attr.diffmode'] = diffmode | |
|
388 | ||
|
389 | # session settings per user | |
|
390 | ||
|
391 | for k, v in request.session.items(): | |
|
392 | pref = 'rc_user_session_attr.' | |
|
393 | if k and k.startswith(pref): | |
|
394 | k = k[len(pref):] | |
|
395 | session_attrs[k] = v | |
|
392 | 396 | |
|
393 | 397 | context.user_session_attrs = session_attrs |
|
394 | 398 | |
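
The session handling in this hunk strips the `rc_user_session_attr.` prefix from matching session keys and overlays the values on a defaults dict. The same merge can be exercised with a plain dict standing in for `request.session`; this is only a sketch of the shape, not the real view code.

```python
def merge_session_attrs(session, defaults):
    """Overlay 'rc_user_session_attr.*' session keys on top of defaults."""
    pref = 'rc_user_session_attr.'
    attrs = dict(defaults)
    for k, v in session.items():
        if k and k.startswith(pref):
            attrs[k[len(pref):]] = v  # strip the prefix, keep the value
    return attrs

attrs = merge_session_attrs(
    {'rc_user_session_attr.diffmode': 'unified', 'unrelated': 1},
    {'clone_url_format': 'http', 'diffmode': 'sideside'})
```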
@@ -420,8 +424,12 b' def attach_context_attributes(context, r' | |||
|
420 | 424 | 'extra': {'plugins': {}} |
|
421 | 425 | } |
|
422 | 426 | # END CONFIG VARS |
|
427 | if is_api: | |
|
428 | csrf_token = None | |
|
429 | else: | |
|
430 | csrf_token = auth.get_csrf_token(session=request.session) | |
|
423 | 431 | |
|
424 | context.csrf_token = |

432 | context.csrf_token = csrf_token | |
|
425 | 433 | context.backends = rhodecode.BACKENDS.keys() |
|
426 | 434 | context.backends.sort() |
|
427 | 435 | unread_count = 0 |
@@ -537,7 +545,7 b' def bootstrap_config(request):' | |||
|
537 | 545 | |
|
538 | 546 | # allow pyramid lookup in testing |
|
539 | 547 | config.include('pyramid_mako') |
|
540 |
config.include(' |
|
|
548 | config.include('rhodecode.lib.rc_beaker') | |
|
541 | 549 | config.include('rhodecode.lib.rc_cache') |
|
542 | 550 | |
|
543 | 551 | add_events_routes(config) |
@@ -375,6 +375,27 b' class DbManage(object):' | |||
|
375 | 375 | hgevolve.ui_active = False |
|
376 | 376 | self.sa.add(hgevolve) |
|
377 | 377 | |
|
378 | hgevolve = RhodeCodeUi() | |
|
379 | hgevolve.ui_section = 'experimental' | |
|
380 | hgevolve.ui_key = 'evolution' | |
|
381 | hgevolve.ui_value = '' | |
|
382 | hgevolve.ui_active = False | |
|
383 | self.sa.add(hgevolve) | |
|
384 | ||
|
385 | hgevolve = RhodeCodeUi() | |
|
386 | hgevolve.ui_section = 'experimental' | |
|
387 | hgevolve.ui_key = 'evolution.exchange' | |
|
388 | hgevolve.ui_value = '' | |
|
389 | hgevolve.ui_active = False | |
|
390 | self.sa.add(hgevolve) | |
|
391 | ||
|
392 | hgevolve = RhodeCodeUi() | |
|
393 | hgevolve.ui_section = 'extensions' | |
|
394 | hgevolve.ui_key = 'topic' | |
|
395 | hgevolve.ui_value = '' | |
|
396 | hgevolve.ui_active = False | |
|
397 | self.sa.add(hgevolve) | |
|
398 | ||
|
378 | 399 | # enable hggit disabled by default |
|
379 | 400 | hggit = RhodeCodeUi() |
|
380 | 401 | hggit.ui_section = 'extensions' |
@@ -464,9 +485,6 b' class DbManage(object):' | |||
|
464 | 485 | self.populate_default_permissions() |
|
465 | 486 | return fixed |
|
466 | 487 | |
|
467 | def update_repo_info(self): | |
|
468 | RepoModel.update_repoinfo() | |
|
469 | ||
|
470 | 488 | def config_prompt(self, test_repo_path='', retries=3): |
|
471 | 489 | defaults = self.cli_args |
|
472 | 490 | _path = defaults.get('repos_location') |
@@ -614,7 +614,7 b' class Repository(Base, BaseModel):' | |||
|
614 | 614 | if (cs_cache != self.changeset_cache or not self.changeset_cache): |
|
615 | 615 | _default = datetime.datetime.fromtimestamp(0) |
|
616 | 616 | last_change = cs_cache.get('date') or _default |
|
617 | log.debug('updated repo %s with new c |

617 | log.debug('updated repo %s with new commit cache %s', self.repo_name, cs_cache) | |
|
618 | 618 | self.updated_on = last_change |
|
619 | 619 | self.changeset_cache = cs_cache |
|
620 | 620 | Session().add(self) |
@@ -38,8 +38,6 b' from sqlalchemy.orm import relationship,' | |||
|
38 | 38 | from sqlalchemy.exc import OperationalError |
|
39 | 39 | from beaker.cache import cache_region, region_invalidate |
|
40 | 40 | from webob.exc import HTTPNotFound |
|
41 | from Crypto.Cipher import AES | |
|
42 | from Crypto import Random | |
|
43 | 41 | from zope.cachedescriptors.property import Lazy as LazyProperty |
|
44 | 42 | |
|
45 | 43 | from rhodecode.translation import _ |
@@ -2164,7 +2164,7 b' class Repository(Base, BaseModel):' | |||
|
2164 | 2164 | if is_outdated(cs_cache) or not self.changeset_cache: |
|
2165 | 2165 | _default = datetime.datetime.fromtimestamp(0) |
|
2166 | 2166 | last_change = cs_cache.get('date') or _default |
|
2167 | log.debug('updated repo %s with new c | |
|
2167 | log.debug('updated repo %s with new commit cache %s', | |
|
2168 | 2168 | self.repo_name, cs_cache) |
|
2169 | 2169 | self.updated_on = last_change |
|
2170 | 2170 | self.changeset_cache = cs_cache |
@@ -2230,7 +2230,7 b' class Repository(Base, BaseModel):' | |||
|
2230 | 2230 | if is_outdated(cs_cache) or not self.changeset_cache: |
|
2231 | 2231 | _default = datetime.datetime.fromtimestamp(0) |
|
2232 | 2232 | last_change = cs_cache.get('date') or _default |
|
2233 | log.debug('updated repo %s with new c | |
|
2233 | log.debug('updated repo %s with new commit cache %s', | |
|
2234 | 2234 | self.repo_name, cs_cache) |
|
2235 | 2235 | self.updated_on = last_change |
|
2236 | 2236 | self.changeset_cache = cs_cache |
@@ -2278,7 +2278,7 b' class Repository(Base, BaseModel):' | |||
|
2278 | 2278 | # if yes, we use the current timestamp instead. Imagine you get |
|
2279 | 2279 | # old commit pushed 1y ago, we'd set last update 1y to ago. |
|
2280 | 2280 | last_change = _default |
|
2281 | log.debug('updated repo %s with new c | |
|
2281 | log.debug('updated repo %s with new commit cache %s', | |
|
2282 | 2282 | self.repo_name, cs_cache) |
|
2283 | 2283 | self.updated_on = last_change |
|
2284 | 2284 | self.changeset_cache = cs_cache |
@@ -2301,7 +2301,7 b' class Repository(Base, BaseModel):' | |||
|
2301 | 2301 | # if yes, we use the current timestamp instead. Imagine you get |
|
2302 | 2302 | # old commit pushed 1y ago, we'd set last update 1y to ago. |
|
2303 | 2303 | last_change = _default |
|
2304 | log.debug('updated repo %s with new c | |
|
2304 | log.debug('updated repo %s with new commit cache %s', | |
|
2305 | 2305 | self.repo_name, cs_cache) |
|
2306 | 2306 | self.updated_on = last_change |
|
2307 | 2307 | self.changeset_cache = cs_cache |
@@ -1865,7 +1865,7 b' class Repository(Base, BaseModel):' | |||
|
1865 | 1865 | if is_outdated(cs_cache) or not self.changeset_cache: |
|
1866 | 1866 | _default = datetime.datetime.fromtimestamp(0) |
|
1867 | 1867 | last_change = cs_cache.get('date') or _default |
|
1868 | log.debug('updated repo %s with new c | |
|
1868 | log.debug('updated repo %s with new commit cache %s', | |
|
1869 | 1869 | self.repo_name, cs_cache) |
|
1870 | 1870 | self.updated_on = last_change |
|
1871 | 1871 | self.changeset_cache = cs_cache |
@@ -1868,7 +1868,7 b' class Repository(Base, BaseModel):' | |||
|
1868 | 1868 | if is_outdated(cs_cache) or not self.changeset_cache: |
|
1869 | 1869 | _default = datetime.datetime.fromtimestamp(0) |
|
1870 | 1870 | last_change = cs_cache.get('date') or _default |
|
1871 | log.debug('updated repo %s with new c | |
|
1871 | log.debug('updated repo %s with new commit cache %s', | |
|
1872 | 1872 | self.repo_name, cs_cache) |
|
1873 | 1873 | self.updated_on = last_change |
|
1874 | 1874 | self.changeset_cache = cs_cache |
@@ -1867,7 +1867,7 b' class Repository(Base, BaseModel):' | |||
|
1867 | 1867 | if is_outdated(cs_cache) or not self.changeset_cache: |
|
1868 | 1868 | _default = datetime.datetime.fromtimestamp(0) |
|
1869 | 1869 | last_change = cs_cache.get('date') or _default |
|
1870 | log.debug('updated repo %s with new c | |
|
1870 | log.debug('updated repo %s with new commit cache %s', | |
|
1871 | 1871 | self.repo_name, cs_cache) |
|
1872 | 1872 | self.updated_on = last_change |
|
1873 | 1873 | self.changeset_cache = cs_cache |
@@ -1869,7 +1869,7 b' class Repository(Base, BaseModel):' | |||
|
1869 | 1869 | if is_outdated(cs_cache) or not self.changeset_cache: |
|
1870 | 1870 | _default = datetime.datetime.fromtimestamp(0) |
|
1871 | 1871 | last_change = cs_cache.get('date') or _default |
|
1872 | log.debug('updated repo %s with new c | |
|
1872 | log.debug('updated repo %s with new commit cache %s', | |
|
1873 | 1873 | self.repo_name, cs_cache) |
|
1874 | 1874 | self.updated_on = last_change |
|
1875 | 1875 | self.changeset_cache = cs_cache |
@@ -1912,7 +1912,7 b' class Repository(Base, BaseModel):' | |||
|
1912 | 1912 | if is_outdated(cs_cache) or not self.changeset_cache: |
|
1913 | 1913 | _default = datetime.datetime.fromtimestamp(0) |
|
1914 | 1914 | last_change = cs_cache.get('date') or _default |
|
1915 | log.debug('updated repo %s with new c | |
|
1915 | log.debug('updated repo %s with new commit cache %s', | |
|
1916 | 1916 | self.repo_name, cs_cache) |
|
1917 | 1917 | self.updated_on = last_change |
|
1918 | 1918 | self.changeset_cache = cs_cache |
@@ -1913,7 +1913,7 b' class Repository(Base, BaseModel):' | |||
|
1913 | 1913 | if is_outdated(cs_cache) or not self.changeset_cache: |
|
1914 | 1914 | _default = datetime.datetime.fromtimestamp(0) |
|
1915 | 1915 | last_change = cs_cache.get('date') or _default |
|
1916 | log.debug('updated repo %s with new c | |
|
1916 | log.debug('updated repo %s with new commit cache %s', | |
|
1917 | 1917 | self.repo_name, cs_cache) |
|
1918 | 1918 | self.updated_on = last_change |
|
1919 | 1919 | self.changeset_cache = cs_cache |
@@ -2100,7 +2100,7 b' class Repository(Base, BaseModel):' | |||
|
2100 | 2100 | if is_outdated(cs_cache) or not self.changeset_cache: |
|
2101 | 2101 | _default = datetime.datetime.fromtimestamp(0) |
|
2102 | 2102 | last_change = cs_cache.get('date') or _default |
|
2103 | log.debug('updated repo %s with new c | |
|
2103 | log.debug('updated repo %s with new commit cache %s', | |
|
2104 | 2104 | self.repo_name, cs_cache) |
|
2105 | 2105 | self.updated_on = last_change |
|
2106 | 2106 | self.changeset_cache = cs_cache |
@@ -111,4 +111,28 b' class AESCipher(object):' | |||
|
111 | 111 | |
|
112 | 112 | @staticmethod |
|
113 | 113 | def _unpad(s): |
|
114 | return s[:-ord(s[len(s)-1:])] No newline at end of file | |
|
114 | return s[:-ord(s[len(s)-1:])] | |
|
115 | ||
|
116 | ||
|
117 | def validate_and_get_enc_data(enc_data, enc_key, enc_strict_mode): | |
|
118 | parts = enc_data.split('$', 3) | |
|
119 | if not len(parts) == 3: | |
|
120 | # probably not encrypted values | |
|
121 | return enc_data | |
|
122 | else: | |
|
123 | if parts[0] != 'enc': | |
|
124 | # parts ok but without our header ? | |
|
125 | return enc_data | |
|
126 | ||
|
127 | # at that stage we know it's our encryption | |
|
128 | if parts[1] == 'aes': | |
|
129 | decrypted_data = AESCipher(enc_key).decrypt(parts[2]) | |
|
130 | elif parts[1] == 'aes_hmac': | |
|
131 | decrypted_data = AESCipher( | |
|
132 | enc_key, hmac=True, | |
|
133 | strict_verification=enc_strict_mode).decrypt(parts[2]) | |
|
134 | else: | |
|
135 | raise ValueError( | |
|
136 | 'Encryption type part is wrong, must be `aes` ' | |
|
137 | 'or `aes_hmac`, got `%s` instead' % (parts[1])) | |
|
138 | return decrypted_data |
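The `validate_and_get_enc_data` helper added in this hunk dispatches on an `enc$<type>$<payload>` prefix before decrypting. A minimal sketch of just that prefix handling, with the AES decryption left out so it runs standalone (`parse_enc_data` is an illustrative name, not part of the codebase):

```python
def parse_enc_data(enc_data):
    """Mirror the prefix dispatch of validate_and_get_enc_data.

    Returns ('plain', value) for anything without the
    'enc$<type>$<payload>' header, otherwise (<type>, <payload>).
    """
    parts = enc_data.split('$', 3)
    if len(parts) != 3 or parts[0] != 'enc':
        # probably not an encrypted value -- pass it through untouched
        return 'plain', enc_data
    enc_type, payload = parts[1], parts[2]
    if enc_type not in ('aes', 'aes_hmac'):
        raise ValueError(
            'Encryption type part is wrong, must be `aes` '
            'or `aes_hmac`, got `%s` instead' % enc_type)
    return enc_type, payload
```

Anything that lacks the `enc$` header falls through unchanged, which is what keeps plaintext config values working alongside encrypted ones.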
@@ -199,6 +199,7 b' class _GetError(object):' | |||
|
199 | 199 | if form_errors and field_name in form_errors: |
|
200 | 200 | return literal(tmpl % form_errors.get(field_name)) |
|
201 | 201 | |
|
202 | ||
|
202 | 203 | get_error = _GetError() |
|
203 | 204 | |
|
204 | 205 | |
@@ -214,38 +215,45 b' class _ToolTip(object):' | |||
|
214 | 215 | tooltip_title = escape(tooltip_title) |
|
215 | 216 | tooltip_title = tooltip_title.replace('<', '<').replace('>', '>') |
|
216 | 217 | return tooltip_title |
|
218 | ||
|
219 | ||
|
217 | 220 | tooltip = _ToolTip() |
|
218 | 221 | |
|
222 | files_icon = u'<i class="file-breadcrumb-copy tooltip icon-clipboard clipboard-action" data-clipboard-text="{}" title="Copy the full path"></i>' | |
|
219 | 223 | |
|
220 | def files_breadcrumbs(repo_name, commit_id, file_path): | |
|
224 | ||
|
225 | def files_breadcrumbs(repo_name, commit_id, file_path, at_ref=None, limit_items=False, linkify_last_item=False): | |
|
221 | 226 | if isinstance(file_path, str): |
|
222 | 227 | file_path = safe_unicode(file_path) |
|
223 | 228 | |
|
224 | # TODO: johbo: Is this always a url like path, or is this operating | |
|
225 | # system dependent? | |
|
226 | path_segments = file_path.split('/') | |
|
229 | route_qry = {'at': at_ref} if at_ref else None | |
|
227 | 230 | |
|
228 | repo_name_html = escape(repo_name) | |
|
229 | if len(path_segments) == 1 and path_segments[0] == '': | |
|
230 | url_segments = [repo_name_html] | |
|
231 | else: | |
|
232 | url_segments = [ | |
|
233 | link_to( | |
|
234 | repo_name_html, | |
|
235 | route_path( | |
|
236 | 'repo_files', | |
|
237 | repo_name=repo_name, | |
|
238 | commit_id=commit_id, | |
|
239 | f_path=''), | |
|
240 | class_='pjax-link')] | |
|
231 | # first segment is a `..` link to repo files | |
|
232 | root_name = literal(u'<i class="icon-home"></i>') | |
|
233 | url_segments = [ | |
|
234 | link_to( | |
|
235 | root_name, | |
|
236 | route_path( | |
|
237 | 'repo_files', | |
|
238 | repo_name=repo_name, | |
|
239 | commit_id=commit_id, | |
|
240 | f_path='', | |
|
241 | _query=route_qry), | |
|
242 | )] | |
|
241 | 243 | |
|
244 | path_segments = file_path.split('/') | |
|
242 | 245 | last_cnt = len(path_segments) - 1 |
|
243 | 246 | for cnt, segment in enumerate(path_segments): |
|
244 | 247 | if not segment: |
|
245 | 248 | continue |
|
246 | 249 | segment_html = escape(segment) |
|
247 | 250 | |
|
248 |
|
|
|
251 | last_item = cnt == last_cnt | |
|
252 | ||
|
253 | if last_item and linkify_last_item is False: | |
|
254 | # plain version | |
|
255 | url_segments.append(segment_html) | |
|
256 | else: | |
|
249 | 257 | url_segments.append( |
|
250 | 258 | link_to( |
|
251 | 259 | segment_html, |
@@ -253,12 +261,32 b' def files_breadcrumbs(repo_name, commit_' | |||
|
253 | 261 | 'repo_files', |
|
254 | 262 | repo_name=repo_name, |
|
255 | 263 | commit_id=commit_id, |
|
256 | f_path='/'.join(path_segments[:cnt + 1])), | |
|
257 | class_='pjax-link')) | |
|
258 | else: | |
|
259 | url_segments.append(segment_html) | |
|
264 | f_path='/'.join(path_segments[:cnt + 1]), | |
|
265 | _query=route_qry), | |
|
266 | )) | |
|
267 | ||
|
268 | limited_url_segments = url_segments[:1] + ['...'] + url_segments[-5:] | |
|
269 | if limit_items and len(limited_url_segments) < len(url_segments): | |
|
270 | url_segments = limited_url_segments | |
|
260 | 271 | |
|
261 | return literal('/'.join(url_segments)) | |
|
272 | full_path = file_path | |
|
273 | icon = files_icon.format(escape(full_path)) | |
|
274 | if file_path == '': | |
|
275 | return root_name | |
|
276 | else: | |
|
277 | return literal(' / '.join(url_segments) + icon) | |
|
278 | ||
|
279 | ||
|
280 | def files_url_data(request): | |
|
281 | matchdict = request.matchdict | |
|
282 | ||
|
283 | if 'f_path' not in matchdict: | |
|
284 | matchdict['f_path'] = '' | |
|
285 | ||
|
286 | if 'commit_id' not in matchdict: | |
|
287 | matchdict['commit_id'] = 'tip' | |
|
288 | ||
|
289 | return json.dumps(matchdict) | |
|
262 | 290 | |
|
263 | 291 | |
|
264 | 292 | def code_highlight(code, lexer, formatter, use_hl_filter=False): |
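The `limit_items` branch added to `files_breadcrumbs` keeps the first segment, an ellipsis, and the last five segments, but only when that actually shortens the list. The slicing can be exercised in isolation (`limit_segments` is a hypothetical helper operating on plain strings instead of rendered links):

```python
def limit_segments(segments, limit_items=False):
    """Hypothetical helper mirroring the truncation in files_breadcrumbs:
    keep the first segment, an ellipsis marker, and the last five --
    but only when that is actually shorter than the full list."""
    limited = segments[:1] + ['...'] + segments[-5:]
    if limit_items and len(limited) < len(segments):
        return limited
    return segments
```

Short paths come back untouched, because inserting the `'...'` marker would otherwise make them longer, not shorter.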
@@ -1196,7 +1224,7 b' class InitialsGravatar(object):' | |||
|
1196 | 1224 | </text> |
|
1197 | 1225 | </svg>""".format( |
|
1198 | 1226 | size=self.size, |
|
1199 | f_size=self.size/ | |
|
1227 | f_size=self.size/2.05, # scale the text inside the box nicely | |
|
1200 | 1228 | background=self.background, |
|
1201 | 1229 | text_color=self.text_color, |
|
1202 | 1230 | text=initials.upper(), |
@@ -1486,10 +1514,12 b' def breadcrumb_repo_link(repo):' | |||
|
1486 | 1514 | """ |
|
1487 | 1515 | |
|
1488 | 1516 | path = [ |
|
1489 | link_to(group.name, route_path('repo_group_home', repo_group_name=group.group_name)) | |
|
1517 | link_to(group.name, route_path('repo_group_home', repo_group_name=group.group_name), | |
|
1518 | title='last change:{}'.format(format_date(group.last_commit_change))) | |
|
1490 | 1519 | for group in repo.groups_with_parents |
|
1491 | 1520 | ] + [ |
|
1492 | link_to(repo.just_name, route_path('repo_summary', repo_name=repo.repo_name)) | |
|
1521 | link_to(repo.just_name, route_path('repo_summary', repo_name=repo.repo_name), | |
|
1522 | title='last change:{}'.format(format_date(repo.last_commit_change))) | |
|
1493 | 1523 | ] |
|
1494 | 1524 | |
|
1495 | 1525 | return literal(' » '.join(path)) |
@@ -1507,11 +1537,13 b' def breadcrumb_repo_group_link(repo_grou' | |||
|
1507 | 1537 | |
|
1508 | 1538 | path = [ |
|
1509 | 1539 | link_to(group.name, |
|
1510 | route_path('repo_group_home', repo_group_name=group.group_name)) | |
|
1540 | route_path('repo_group_home', repo_group_name=group.group_name), | |
|
1541 | title='last change:{}'.format(format_date(group.last_commit_change))) | |
|
1511 | 1542 | for group in repo_group.parents |
|
1512 | 1543 | ] + [ |
|
1513 | 1544 | link_to(repo_group.name, |
|
1514 | route_path('repo_group_home', repo_group_name=repo_group.group_name)) | |
|
1545 | route_path('repo_group_home', repo_group_name=repo_group.group_name), | |
|
1546 | title='last change:{}'.format(format_date(repo_group.last_commit_change))) | |
|
1515 | 1547 | ] |
|
1516 | 1548 | |
|
1517 | 1549 | return literal(' » '.join(path)) |
@@ -1907,23 +1939,26 b' def secure_form(form_url, method="POST",' | |||
|
1907 | 1939 | |
|
1908 | 1940 | def dropdownmenu(name, selected, options, enable_filter=False, **attrs): |
|
1909 | 1941 | select_html = select(name, selected, options, **attrs) |
|
1942 | ||
|
1910 | 1943 | select2 = """ |
|
1911 | 1944 | <script> |
|
1912 | 1945 | $(document).ready(function() { |
|
1913 | 1946 | $('#%s').select2({ |
|
1914 | containerCssClass: 'drop-menu', | |
|
1947 | containerCssClass: 'drop-menu %s', | |
|
1915 | 1948 | dropdownCssClass: 'drop-menu-dropdown', |
|
1916 | 1949 | dropdownAutoWidth: true%s |
|
1917 | 1950 | }); |
|
1918 | 1951 | }); |
|
1919 | 1952 | </script> |
|
1920 | 1953 | """ |
|
1954 | ||
|
1921 | 1955 | filter_option = """, |
|
1922 | 1956 | minimumResultsForSearch: -1 |
|
1923 | 1957 | """ |
|
1924 | 1958 | input_id = attrs.get('id') or name |
|
1959 | extra_classes = ' '.join(attrs.pop('extra_classes', [])) | |
|
1925 | 1960 | filter_enabled = "" if enable_filter else filter_option |
|
1926 | select_script = literal(select2 % (input_id, filter_enabled)) | |
|
1961 | select_script = literal(select2 % (input_id, extra_classes, filter_enabled)) | |
|
1927 | 1962 | |
|
1928 | 1963 | return literal(select_html+select_script) |
|
1929 | 1964 | |
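The `dropdownmenu` change above threads an optional `extra_classes` list into the select2 template as a second `%s` placeholder. The string assembly alone, stripped of the webhelpers parts, looks roughly like this (`build_select2_script` is an illustrative stand-in, not the real helper):

```python
SELECT2_TMPL = """
$('#%s').select2({
    containerCssClass: 'drop-menu %s',
    dropdownCssClass: 'drop-menu-dropdown',
    dropdownAutoWidth: true%s
});
"""

FILTER_OPTION = """,
    minimumResultsForSearch: -1
"""


def build_select2_script(name, attrs, enable_filter=False):
    # mirror the hunk: pop extra_classes so it never leaks into the
    # <select> tag attributes, then join it into the container class
    input_id = attrs.get('id') or name
    extra_classes = ' '.join(attrs.pop('extra_classes', []))
    filter_enabled = "" if enable_filter else FILTER_OPTION
    return SELECT2_TMPL % (input_id, extra_classes, filter_enabled)
```

Popping (rather than reading) `extra_classes` matters: the remaining `attrs` are later splatted into the `<select>` markup, and an unknown `extra_classes` attribute would end up in the HTML.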
@@ -1944,7 +1979,7 b' def get_visual_attr(tmpl_context_var, at' | |||
|
1944 | 1979 | |
|
1945 | 1980 | def get_last_path_part(file_node): |
|
1946 | 1981 | if not file_node.path: |
|
1947 | return u'' | |
|
1982 | return u'/' | |
|
1948 | 1983 | |
|
1949 | 1984 | path = safe_unicode(file_node.path.split('/')[-1]) |
|
1950 | 1985 | return u'../' + path |
@@ -2033,10 +2068,11 b' def reviewer_as_json(*args, **kwargs):' | |||
|
2033 | 2068 | def get_repo_view_type(request): |
|
2034 | 2069 | route_name = request.matched_route.name |
|
2035 | 2070 | route_to_view_type = { |
|
2036 | 'repo_changelog': 'changelog', | |
|
2071 | 'repo_changelog': 'commits', | |
|
2072 | 'repo_commits': 'commits', | |
|
2037 | 2073 | 'repo_files': 'files', |
|
2038 | 2074 | 'repo_summary': 'summary', |
|
2039 | 2075 | 'repo_commit': 'commit' |
|
2076 | } | |
|
2040 | 2077 | |
|
2041 | } | |
|
2042 | 2078 | return route_to_view_type.get(route_name) |
@@ -80,7 +80,7 b' def trigger_log_create_pull_request_hook' | |||
|
80 | 80 | extras = _get_rc_scm_extras(username, repo_name, repo_alias, |
|
81 | 81 | 'create_pull_request') |
|
82 | 82 | events.trigger(events.PullRequestCreateEvent(pull_request)) |
|
83 | extras.update(pull_request.get_api_data()) | |
|
83 | extras.update(pull_request.get_api_data(with_merge_state=False)) | |
|
84 | 84 | hooks_base.log_create_pull_request(**extras) |
|
85 | 85 | |
|
86 | 86 |
@@ -80,6 +80,13 b' class BaseSearcher(object):' | |||
|
80 | 80 | def extract_search_tags(query): |
|
81 | 81 | return [] |
|
82 | 82 | |
|
83 | @staticmethod | |
|
84 | def escape_specials(val): | |
|
85 | """ | |
|
86 | Handle and escape reserved chars for search | |
|
87 | """ | |
|
88 | return val | |
|
89 | ||
|
83 | 90 | |
|
84 | 91 | def search_config(config, prefix='search.'): |
|
85 | 92 | _config = {} |
@@ -20,7 +20,55 b'' | |||
|
20 | 20 | |
|
21 | 21 | import markdown |
|
22 | 22 | |
|
23 | from mdx_gfm import GithubFlavoredMarkdownExtension # pragma: no cover | |
|
23 | from markdown.extensions import Extension | |
|
24 | from markdown.extensions.fenced_code import FencedCodeExtension | |
|
25 | from markdown.extensions.smart_strong import SmartEmphasisExtension | |
|
26 | from markdown.extensions.tables import TableExtension | |
|
27 | from markdown.extensions.nl2br import Nl2BrExtension | |
|
28 | ||
|
29 | import gfm | |
|
30 | ||
|
31 | ||
|
32 | class GithubFlavoredMarkdownExtension(Extension): | |
|
33 | """ | |
|
34 | An extension that is as compatible as possible with GitHub-flavored | |
|
35 | Markdown (GFM). | |
|
36 | ||
|
37 | This extension aims to be compatible with the variant of GFM that GitHub | |
|
38 | uses for Markdown-formatted gists and files (including READMEs). This | |
|
39 | variant seems to have all the extensions described in the `GFM | |
|
40 | documentation`_, except: | |
|
41 | ||
|
42 | - Newlines in paragraphs are not transformed into ``br`` tags. | |
|
43 | - Intra-GitHub links to commits, repositories, and issues are not | |
|
44 | supported. | |
|
45 | ||
|
46 | If you need support for features specific to GitHub comments and issues, | |
|
47 | please use :class:`mdx_gfm.GithubFlavoredMarkdownExtension`. | |
|
48 | ||
|
49 | .. _GFM documentation: https://guides.github.com/features/mastering-markdown/ | |
|
50 | """ | |
|
51 | ||
|
52 | def extendMarkdown(self, md, md_globals): | |
|
53 | # Built-in extensions | |
|
54 | FencedCodeExtension().extendMarkdown(md, md_globals) | |
|
55 | SmartEmphasisExtension().extendMarkdown(md, md_globals) | |
|
56 | TableExtension().extendMarkdown(md, md_globals) | |
|
57 | ||
|
58 | # Custom extensions | |
|
59 | gfm.AutolinkExtension().extendMarkdown(md, md_globals) | |
|
60 | gfm.AutomailExtension().extendMarkdown(md, md_globals) | |
|
61 | gfm.HiddenHiliteExtension([ | |
|
62 | ('guess_lang', 'False'), | |
|
63 | ('css_class', 'highlight') | |
|
64 | ]).extendMarkdown(md, md_globals) | |
|
65 | gfm.SemiSaneListExtension().extendMarkdown(md, md_globals) | |
|
66 | gfm.SpacedLinkExtension().extendMarkdown(md, md_globals) | |
|
67 | gfm.StrikethroughExtension().extendMarkdown(md, md_globals) | |
|
68 | gfm.TaskListExtension([ | |
|
69 | ('list_attrs', {'class': 'checkbox'}) | |
|
70 | ]).extendMarkdown(md, md_globals) | |
|
71 | Nl2BrExtension().extendMarkdown(md, md_globals) | |
|
24 | 72 | |
|
25 | 73 | |
|
26 | 74 | # Global Vars |
@@ -500,38 +500,38 b' class MarkupRenderer(object):' | |||
|
500 | 500 | (body, resources) = html_exporter.from_notebook_node(notebook) |
|
501 | 501 | header = '<!-- ## IPYTHON NOTEBOOK RENDERING ## -->' |
|
502 | 502 | js = MakoTemplate(r''' |
|
503 | <!-- | |
|
504 | <!-- MathJax configuration --> | |
|
505 | <script type="text/x-mathjax-config"> | |
|
506 | MathJax.Hub.Config({ | |
|
507 | jax: ["input/TeX","output/HTML-CSS", "output/PreviewHTML"], | |
|
508 | extensions: ["tex2jax.js","MathMenu.js","MathZoom.js", "fast-preview.js", "AssistiveMML.js", "[Contrib]/a11y/accessibility-menu.js"], | |
|
509 | TeX: { | |
|
510 | extensions: ["AMSmath.js","AMSsymbols.js","noErrors.js","noUndefined.js"] | |
|
511 | }, | |
|
512 | tex2jax: { | |
|
513 | inlineMath: [ ['$','$'], ["\\(","\\)"] ], | |
|
514 | displayMath: [ ['$$','$$'], ["\\[","\\]"] ], | |
|
515 | processEscapes: true, | |
|
516 | processEnvironments: true | |
|
517 | }, | |
|
518 | // Center justify equations in code and markdown cells. Elsewhere | |
|
519 | // we use CSS to left justify single line equations in code cells. | |
|
520 | displayAlign: 'center', | |
|
521 | "HTML-CSS": { | |
|
522 | styles: {'.MathJax_Display': {"margin": 0}}, | |
|
523 | linebreaks: { automatic: true }, | |
|
524 | availableFonts: ["STIX", "TeX"] | |
|
525 | }, | |
|
526 | showMathMenu: false | |
|
527 | }); | |
|
528 | </script> | |
|
529 | <!-- End of mathjax configuration --> | |
|
530 | <script src="${h.asset('js/src/math_jax/MathJax.js')}"></script> | |
|
503 | <!-- MathJax configuration --> | |
|
504 | <script type="text/x-mathjax-config"> | |
|
505 | MathJax.Hub.Config({ | |
|
506 | jax: ["input/TeX","output/HTML-CSS", "output/PreviewHTML"], | |
|
507 | extensions: ["tex2jax.js","MathMenu.js","MathZoom.js", "fast-preview.js", "AssistiveMML.js", "[Contrib]/a11y/accessibility-menu.js"], | |
|
508 | TeX: { | |
|
509 | extensions: ["AMSmath.js","AMSsymbols.js","noErrors.js","noUndefined.js"] | |
|
510 | }, | |
|
511 | tex2jax: { | |
|
512 | inlineMath: [ ['$','$'], ["\\(","\\)"] ], | |
|
513 | displayMath: [ ['$$','$$'], ["\\[","\\]"] ], | |
|
514 | processEscapes: true, | |
|
515 | processEnvironments: true | |
|
516 | }, | |
|
517 | // Center justify equations in code and markdown cells. Elsewhere | |
|
518 | // we use CSS to left justify single line equations in code cells. | |
|
519 | displayAlign: 'center', | |
|
520 | "HTML-CSS": { | |
|
521 | styles: {'.MathJax_Display': {"margin": 0}}, | |
|
522 | linebreaks: { automatic: true }, | |
|
523 | availableFonts: ["STIX", "TeX"] | |
|
524 | }, | |
|
525 | showMathMenu: false | |
|
526 | }); | |
|
527 | </script> | |
|
528 | <!-- End of MathJax configuration --> | |
|
529 | <script src="${h.asset('js/src/math_jax/MathJax.js')}"></script> | |
|
531 | 530 | ''').render(h=helpers) |
|
532 | 531 | |
|
533 | css = '<style>{}</style>'.format( | |
|
534 | ''.join(_sanitize_resources(resources['inlining']['css']))) | |
|
532 | css = MakoTemplate(r''' | |
|
533 | <link rel="stylesheet" type="text/css" href="${h.asset('css/style-ipython.css', ver=ver)}" media="screen"/> | |
|
534 | ''').render(h=helpers, ver='ver1') | |
|
535 | 535 | |
|
536 | 536 | body = '\n'.join([header, css, js, body]) |
|
537 | 537 | return body, resources |
@@ -142,7 +142,7 b' class SimpleGit(simplevcs.SimpleVCS):' | |||
|
142 | 142 | return self.scm_app.create_git_wsgi_app( |
|
143 | 143 | repo_path, repo_name, config) |
|
144 | 144 | |
|
145 | def _create_config(self, extras, repo_name): | |
|
145 | def _create_config(self, extras, repo_name, scheme='http'): | |
|
146 | 146 | extras['git_update_server_info'] = utils2.str2bool( |
|
147 | 147 | rhodecode.CONFIG.get('git_update_server_info')) |
|
148 | 148 | |
@@ -152,4 +152,5 b' class SimpleGit(simplevcs.SimpleVCS):' | |||
|
152 | 152 | extras['git_lfs_enabled'] = utils2.str2bool( |
|
153 | 153 | config.get('vcs_git_lfs', 'enabled')) |
|
154 | 154 | extras['git_lfs_store_path'] = custom_store or default_lfs_store() |
|
155 | extras['git_lfs_http_scheme'] = scheme | |
|
155 | 156 | return extras |
@@ -153,7 +153,7 b' class SimpleHg(simplevcs.SimpleVCS):' | |||
|
153 | 153 | def _create_wsgi_app(self, repo_path, repo_name, config): |
|
154 | 154 | return self.scm_app.create_hg_wsgi_app(repo_path, repo_name, config) |
|
155 | 155 | |
|
156 | def _create_config(self, extras, repo_name): | |
|
156 | def _create_config(self, extras, repo_name, scheme='http'): | |
|
157 | 157 | config = utils.make_db_config(repo=repo_name) |
|
158 | 158 | config.set('rhodecode', 'RC_SCM_DATA', json.dumps(extras)) |
|
159 | 159 |
@@ -87,20 +87,22 b' class SimpleSvnApp(object):' | |||
|
87 | 87 | stream = True |
|
88 | 88 | |
|
89 | 89 | stream = stream |
|
90 | log.debug( | |
|
91 | 'Calling SVN PROXY: method:%s via `%s`, Stream: %s', | |
|
92 | req_method, path_info, stream) | |
|
93 | response = requests.request( | |
|
94 | req_method, path_info, | |
|
95 | data=data, headers=request_headers, stream=stream) | |
|
90 | log.debug('Calling SVN PROXY at `%s`, using method:%s. Stream: %s', | |
|
91 | path_info, req_method, stream) | |
|
92 | try: | |
|
93 | response = requests.request( | |
|
94 | req_method, path_info, | |
|
95 | data=data, headers=request_headers, stream=stream) | |
|
96 | except requests.ConnectionError: | |
|
97 | log.exception('ConnectionError occurred for endpoint %s', path_info) | |
|
98 | raise | |
|
96 | 99 | |
|
97 | 100 | if response.status_code not in [200, 401]: |
|
101 | text = '\n{}'.format(response.text) if response.text else '' | |
|
98 | 102 | if response.status_code >= 500: |
|
99 | log.error('Got SVN response:%s with text: | |
|
100 | response, response.text) | |
|
103 | log.error('Got SVN response:%s with text:`%s`', response, text) | |
|
101 | 104 | else: |
|
102 | log.debug('Got SVN response:%s with text: | |
|
103 | response, response.text) | |
|
105 | log.debug('Got SVN response:%s with text:`%s`', response, text) | |
|
104 | 106 | else: |
|
105 | 107 | log.debug('got response code: %s', response.status_code) |
|
106 | 108 | |
@@ -217,7 +219,7 b' class SimpleSvn(simplevcs.SimpleVCS):' | |||
|
217 | 219 | conf = self.repo_vcs_config |
|
218 | 220 | return str2bool(conf.get('vcs_svn_proxy', 'http_requests_enabled')) |
|
219 | 221 | |
|
220 | def _create_config(self, extras, repo_name): | |
|
222 | def _create_config(self, extras, repo_name, scheme='http'): | |
|
221 | 223 | conf = self.repo_vcs_config |
|
222 | 224 | server_url = conf.get('vcs_svn_proxy', 'http_server_url') |
|
223 | 225 | server_url = server_url or self.DEFAULT_HTTP_SERVER |
@@ -360,6 +360,13 b' class SimpleVCS(object):' | |||
|
360 | 360 | |
|
361 | 361 | return perm_result |
|
362 | 362 | |
|
363 | def _get_http_scheme(self, environ): | |
|
364 | try: | |
|
365 | return environ['wsgi.url_scheme'] | |
|
366 | except Exception: | |
|
367 | log.exception('Failed to read http scheme') | |
|
368 | return 'http' | |
|
369 | ||
|
363 | 370 | def _check_ssl(self, environ, start_response): |
|
364 | 371 | """ |
|
365 | 372 | Checks the SSL check flag and returns False if SSL is not present |
@@ -597,7 +604,9 b' class SimpleVCS(object):' | |||
|
597 | 604 | extras, environ, action, txn_id=txn_id) |
|
598 | 605 | log.debug('HOOKS extras is %s', extras) |
|
599 | 606 | |
|
600 | config = self._create_config(extras, self.acl_repo_name) | |
|
607 | http_scheme = self._get_http_scheme(environ) | |
|
608 | ||
|
609 | config = self._create_config(extras, self.acl_repo_name, scheme=http_scheme) | |
|
601 | 610 | app = self._create_wsgi_app(repo_path, self.url_repo_name, config) |
|
602 | 611 | with callback_daemon: |
|
603 | 612 | app.rc_extras = extras |
@@ -643,7 +652,7 b' class SimpleVCS(object):' | |||
|
643 | 652 | """Return the WSGI app that will finally handle the request.""" |
|
644 | 653 | raise NotImplementedError() |
|
645 | 654 | |
|
646 | def _create_config(self, extras, repo_name): | |
|
655 | def _create_config(self, extras, repo_name, scheme='http'): | |
|
647 | 656 | """Create a safe config representation.""" |
|
648 | 657 | raise NotImplementedError() |
|
649 | 658 |
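The new `_get_http_scheme` reads the request scheme out of the WSGI environ and threads it into `_create_config` as the `scheme` keyword, so backends such as Git LFS can build URLs with the right prefix behind a proxy. Stripped of logging, the fallback behaves like this (function name shortened for the sketch):

```python
def get_http_scheme(environ):
    # WSGI servers expose the scheme under 'wsgi.url_scheme' (PEP 3333);
    # fall back to plain 'http' if the key is missing or unreadable
    try:
        return environ['wsgi.url_scheme']
    except Exception:
        return 'http'
```

Catching the broad `Exception` (as the hunk does) means even a malformed environ degrades to `'http'` rather than failing the whole request.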
@@ -23,6 +23,7 b' import re' | |||
|
23 | 23 | import time |
|
24 | 24 | import datetime |
|
25 | 25 | import dateutil |
|
26 | import pickle | |
|
26 | 27 | |
|
27 | 28 | from rhodecode.model.db import DbSession, Session |
|
28 | 29 | |
@@ -142,7 +143,6 b' class FileAuthSessions(BaseAuthSessions)' | |||
|
142 | 143 | return stats['callbacks'] |
|
143 | 144 | |
|
144 | 145 | |
|
145 | ||
|
146 | 146 | class MemcachedAuthSessions(BaseAuthSessions): |
|
147 | 147 | SESSION_TYPE = 'ext:memcached' |
|
148 | 148 | _key_regex = re.compile(r'ITEM (.*_session) \[(.*); (.*)\]') |
@@ -195,6 +195,43 b' class MemcachedAuthSessions(BaseAuthSess' | |||
|
195 | 195 | raise CleanupCommand('Cleanup for this session type not yet available') |
|
196 | 196 | |
|
197 | 197 | |
|
198 | class RedisAuthSessions(BaseAuthSessions): | |
|
199 | SESSION_TYPE = 'ext:redis' | |
|
200 | ||
|
201 | def _get_client(self): | |
|
202 | import redis | |
|
203 | args = { | |
|
204 | 'socket_timeout': 60, | |
|
205 | 'url': self.config.get('beaker.session.url') | |
|
206 | } | |
|
207 | ||
|
208 | client = redis.StrictRedis.from_url(**args) | |
|
209 | return client | |
|
210 | ||
|
211 | def get_count(self): | |
|
212 | client = self._get_client() | |
|
213 | return len(client.keys('beaker_cache:*')) | |
|
214 | ||
|
215 | def get_expired_count(self, older_than_seconds=None): | |
|
216 | expiry_date = self._seconds_to_date(older_than_seconds) | |
|
217 | return self.NOT_AVAILABLE | |
|
218 | ||
|
219 | def clean_sessions(self, older_than_seconds=None): | |
|
220 | client = self._get_client() | |
|
221 | expiry_time = time.time() - older_than_seconds | |
|
222 | deleted_keys = 0 | |
|
223 | for key in client.keys('beaker_cache:*'): | |
|
224 | data = client.get(key) | |
|
225 | if data: | |
|
226 | json_data = pickle.loads(data) | |
|
227 | accessed_time = json_data['_accessed_time'] | |
|
228 | if accessed_time < expiry_time: | |
|
229 | client.delete(key) | |
|
230 | deleted_keys += 1 | |
|
231 | ||
|
232 | return deleted_keys | |
|
233 | ||
|
234 | ||
|
198 | 235 | class MemoryAuthSessions(BaseAuthSessions): |
|
199 | 236 | SESSION_TYPE = 'memory' |
|
200 | 237 | |
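`RedisAuthSessions.clean_sessions` above walks every `beaker_cache:*` key, unpickles the stored session, and deletes entries whose `_accessed_time` is older than the cutoff. The expiry logic can be exercised with a plain dict standing in for the `StrictRedis` client (a sketch; `clean_expired` and the dict store are illustrative, not RhodeCode API):

```python
import pickle
import time


def clean_expired(store, older_than_seconds, now=None):
    """Sketch of RedisAuthSessions.clean_sessions with a plain dict
    standing in for the StrictRedis client: keys map to pickled
    session dicts carrying an '_accessed_time' timestamp."""
    now = time.time() if now is None else now
    expiry_time = now - older_than_seconds
    deleted_keys = 0
    for key in list(store):  # list() so we can delete while iterating
        data = store[key]
        if data and pickle.loads(data)['_accessed_time'] < expiry_time:
            del store[key]
            deleted_keys += 1
    return deleted_keys
```

The cutoff is a single subtraction: anything last touched before `now - older_than_seconds` is considered expired.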
@@ -212,6 +249,7 b' def get_session_handler(session_type):' | |||
|
212 | 249 | types = { |
|
213 | 250 | 'file': FileAuthSessions, |
|
214 | 251 | 'ext:memcached': MemcachedAuthSessions, |
|
252 | 'ext:redis': RedisAuthSessions, | |
|
215 | 253 | 'ext:database': DbAuthSessions, |
|
216 | 254 | 'memory': MemoryAuthSessions |
|
217 | 255 | } |
@@ -371,7 +371,8 b' def config_data_from_db(clear_session=Tr' | |||
|
371 | 371 | config.append(( |
|
372 | 372 | safe_str(setting.section), safe_str(setting.key), False)) |
|
373 | 373 | log.debug( |
|
374 | 'settings ui from db: %s', | |
|
374 | 'settings ui from db@repo[%s]: %s', | |
|
375 | repo, | |
|
375 | 376 | ','.join(map(lambda s: '[{}] {}={}'.format(*s), ui_data))) |
|
376 | 377 | if clear_session: |
|
377 | 378 | meta.Session.remove() |
@@ -33,6 +33,8 b' import collections' | |||
|
33 | 33 | import warnings |
|
34 | 34 | |
|
35 | 35 | from zope.cachedescriptors.property import Lazy as LazyProperty |
|
36 | from zope.cachedescriptors.property import CachedProperty | |
|
37 | ||
|
36 | 38 | from pyramid import compat |
|
37 | 39 | |
|
38 | 40 | from rhodecode.translation import lazy_ugettext |
@@ -53,6 +55,7 b' log = logging.getLogger(__name__)' | |||
|
53 | 55 | |
|
54 | 56 | FILEMODE_DEFAULT = 0o100644 |
|
55 | 57 | FILEMODE_EXECUTABLE = 0o100755 |
|
58 | EMPTY_COMMIT_ID = '0' * 40 | |
|
56 | 59 | |
|
57 | 60 | Reference = collections.namedtuple('Reference', ('type', 'name', 'commit_id')) |
|
58 | 61 | |
@@ -161,7 +164,7 b' class MergeResponse(object):' | |||
|
161 | 164 | u'This pull request cannot be merged because the source contains ' |
|
162 | 165 | u'more branches than the target.'), |
|
163 | 166 | MergeFailureReason.HG_TARGET_HAS_MULTIPLE_HEADS: lazy_ugettext( |
|
164 | u'This pull request cannot be merged because the target ' | |
|
167 | u'This pull request cannot be merged because the target `{target_ref.name}` ' | |
|
165 | 168 | u'has multiple heads: `{heads}`.'), |
|
166 | 169 | MergeFailureReason.TARGET_IS_LOCKED: lazy_ugettext( |
|
167 | 170 | u'This pull request cannot be merged because the target repository is ' |
@@ -261,6 +264,7 b' class BaseRepository(object):' | |||
|
261 | 264 | EMPTY_COMMIT_ID = '0' * 40 |
|
262 | 265 | |
|
263 | 266 | path = None |
|
267 | _commit_ids_ver = 0 | |
|
264 | 268 | |
|
265 | 269 | def __init__(self, repo_path, config=None, create=False, **kwargs): |
|
266 | 270 | """ |
@@ -309,6 +313,9 b' class BaseRepository(object):' | |||
|
309 | 313 | def _remote(self): |
|
310 | 314 | raise NotImplementedError |
|
311 | 315 | |
|
316 | def _heads(self, branch=None): | |
|
317 | return [] | |
|
318 | ||
|
312 | 319 | @LazyProperty |
|
313 | 320 | def EMPTY_COMMIT(self): |
|
314 | 321 | return EmptyCommit(self.EMPTY_COMMIT_ID) |
@@ -380,7 +387,7 b' class BaseRepository(object):' | |||
|
380 | 387 | return commit.size |
|
381 | 388 | |
|
382 | 389 | def is_empty(self): |
|
383 |
return |
|
|
390 | return self._remote.is_empty() | |
|
384 | 391 | |
|
385 | 392 | @staticmethod |
|
386 | 393 | def check_url(url, config): |
@@ -401,6 +408,15 b' class BaseRepository(object):' | |||
|
401 | 408 | # COMMITS |
|
402 | 409 | # ========================================================================== |
|
403 | 410 | |
|
411 | @CachedProperty('_commit_ids_ver') | |
|
412 | def commit_ids(self): | |
|
413 | raise NotImplementedError | |
|
414 | ||
|
415 | def append_commit_id(self, commit_id): | |
|
416 | if commit_id not in self.commit_ids: | |
|
417 | self._rebuild_cache(self.commit_ids + [commit_id]) | |
|
418 | self._commit_ids_ver = time.time() | |
|
419 | ||
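The new `append_commit_id` helper pairs a `CachedProperty` keyed on `_commit_ids_ver` with a version bump (`time.time()`) to invalidate the cached `commit_ids` list. A rough pure-Python sketch of that invalidation pattern — `versioned_cached_property` below is a stand-in for illustration, not zope's `CachedProperty`:

```python
import time

class versioned_cached_property(object):
    """Cache a property's value until the named version attribute changes."""

    def __init__(self, version_attr):
        self.version_attr = version_attr

    def __call__(self, func):
        self.func = func
        return self

    def __get__(self, obj, owner):
        if obj is None:
            return self
        version = getattr(obj, self.version_attr)
        key = '_cache_' + self.func.__name__
        cached_version, cached_value = obj.__dict__.get(key, (None, None))
        if cached_version != version:
            # version token changed (or first access): re-compute and store
            cached_value = self.func(obj)
            obj.__dict__[key] = (version, cached_value)
        return cached_value

class Repo(object):
    _commit_ids_ver = 0

    def __init__(self):
        self._ids = ['a' * 40]
        self.computed = 0

    @versioned_cached_property('_commit_ids_ver')
    def commit_ids(self):
        self.computed += 1
        return list(self._ids)

    def append_commit_id(self, commit_id):
        if commit_id not in self.commit_ids:
            self._ids.append(commit_id)
            # bumping the version makes the next access re-compute
            self._commit_ids_ver = time.time()
```

Using a monotonic-ish timestamp as the version token means callers never have to clear the cache explicitly; they just bump the token.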
|
404 | 420 | def get_commit(self, commit_id=None, commit_idx=None, pre_load=None, translate_tag=None): |
|
405 | 421 | """ |
|
406 | 422 | Returns instance of `BaseCommit` class. If `commit_id` and `commit_idx` |
@@ -1091,12 +1107,12 b' class BaseCommit(object):' | |||
|
1091 | 1107 | """ |
|
1092 | 1108 | return None |
|
1093 | 1109 | |
|
1094 | def archive_repo(self, |
|
1095 | prefix=None, write_metadata=False, mtime=None): |
|
1110 | def archive_repo(self, archive_dest_path, kind='tgz', subrepos=None, |
|
1111 | prefix=None, write_metadata=False, mtime=None, archive_at_path='/'): |
|
1096 | 1112 | """ |
|
1097 | 1113 | Creates an archive containing the contents of the repository. |
|
1098 | 1114 | |
|
1099 | :param |
|
1115 | :param archive_dest_path: path to the file in which to create the archive. |
|
1100 | 1116 | :param kind: one of following: ``"tbz2"``, ``"tgz"``, ``"zip"``. |
|
1101 | 1117 | :param prefix: name of root directory in archive. |
|
1102 | 1118 | Default is repository name and commit's short_id joined with dash: |
@@ -1104,10 +1120,11 b' class BaseCommit(object):' | |||
|
1104 | 1120 | :param write_metadata: write a metadata file into archive. |
|
1105 | 1121 | :param mtime: custom modification time for archive creation, defaults |
|
1106 | 1122 | to time.time() if not given. |
|
1123 | :param archive_at_path: pack files at this path (default '/') | |
|
1107 | 1124 | |
|
1108 | 1125 | :raise VCSError: If prefix has a problem. |
|
1109 | 1126 | """ |
|
1110 | allowed_kinds = settings.ARCHIVE_SPECS |
|
1127 | allowed_kinds = [x[0] for x in settings.ARCHIVE_SPECS] |
|
1111 | 1128 | if kind not in allowed_kinds: |
|
1112 | 1129 | raise ImproperArchiveTypeError( |
|
1113 | 1130 | 'Archive kind (%s) not supported use one of %s' % |
@@ -1115,11 +1132,11 b' class BaseCommit(object):' | |||
|
1115 | 1132 | |
|
1116 | 1133 | prefix = self._validate_archive_prefix(prefix) |
|
1117 | 1134 | |
|
1118 | mtime = mtime or time.mktime(self.date.timetuple()) | |
|
1135 | mtime = mtime if mtime is not None else time.mktime(self.date.timetuple()) |
|
1119 | 1136 | |
|
1120 | 1137 | file_info = [] |
|
1121 | 1138 | cur_rev = self.repository.get_commit(commit_id=self.raw_id) |
|
1122 | for _r, _d, files in cur_rev.walk( |
|
1139 | for _r, _d, files in cur_rev.walk(archive_at_path): |
|
1123 | 1140 | for f in files: |
|
1124 | 1141 | f_path = os.path.join(prefix, f.path) |
|
1125 | 1142 | file_info.append( |
@@ -1128,15 +1145,15 b' class BaseCommit(object):' | |||
|
1128 | 1145 | if write_metadata: |
|
1129 | 1146 | metadata = [ |
|
1130 | 1147 | ('repo_name', self.repository.name), |
|
1131 |
(' |
|
|
1132 |
(' |
|
|
1148 | ('commit_id', self.raw_id), | |
|
1149 | ('mtime', mtime), | |
|
1133 | 1150 | ('branch', self.branch), |
|
1134 | 1151 | ('tags', ','.join(self.tags)), |
|
1135 | 1152 | ] |
|
1136 | 1153 | meta = ["%s:%s" % (f_name, value) for f_name, value in metadata] |
|
1137 | 1154 | file_info.append(('.archival.txt', 0o644, False, '\n'.join(meta))) |
|
1138 | 1155 | |
|
1139 | connection.Hg.archive_repo( |
|
1156 | connection.Hg.archive_repo(archive_dest_path, mtime, file_info, kind) |
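The archive call above passes `mtime` through to the backend, and its default is easy to get wrong: `mtime or fallback` drops a legitimate timestamp of `0`, while `mtime is not None or fallback` evaluates to a boolean rather than selecting a value. A small illustration of the three variants (plain Python, independent of the RhodeCode code):

```python
import time

def default_mtime_or(mtime, fallback):
    # `or` replaces any falsy value, so a legitimate mtime of 0 is lost
    return mtime or fallback

def default_mtime_boolean(mtime, fallback):
    # `is not None or ...` is a boolean expression, not a selection
    return mtime is not None or fallback

def default_mtime_conditional(mtime, fallback):
    # conditional expression keeps 0 and only falls back on None
    return mtime if mtime is not None else fallback
```

The conditional-expression form is the one that both preserves an explicit `0` and substitutes the fallback only for `None`.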
|
1140 | 1157 | |
|
1141 | 1158 | def _validate_archive_prefix(self, prefix): |
|
1142 | 1159 | if prefix is None: |
@@ -1505,9 +1522,7 b' class BaseInMemoryCommit(object):' | |||
|
1505 | 1522 | "Cannot remove node at %s from " |
|
1506 | 1523 | "following parents: %s" % (not_removed, parents)) |
|
1507 | 1524 | |
|
1508 | def commit( | |
|
1509 | self, message, author, parents=None, branch=None, date=None, | |
|
1510 | **kwargs): | |
|
1525 | def commit(self, message, author, parents=None, branch=None, date=None, **kwargs): | |
|
1511 | 1526 | """ |
|
1512 | 1527 | Performs in-memory commit (doesn't check workdir in any way) and |
|
1513 | 1528 | returns newly created :class:`BaseCommit`. Updates repository's |
@@ -1555,7 +1570,7 b' class EmptyCommit(BaseCommit):' | |||
|
1555 | 1570 | """ |
|
1556 | 1571 | |
|
1557 | 1572 | def __init__( |
|
1558 | self, commit_id= |
|
1573 | self, commit_id=EMPTY_COMMIT_ID, repo=None, alias=None, idx=-1, |
|
1559 | 1574 | message='', author='', date=None): |
|
1560 | 1575 | self._empty_commit_id = commit_id |
|
1561 | 1576 | # TODO: johbo: Solve idx parameter, default value does not make |
@@ -1615,7 +1630,7 b' class EmptyChangeset(EmptyCommit):' | |||
|
1615 | 1630 | "Use EmptyCommit instead of EmptyChangeset", DeprecationWarning) |
|
1616 | 1631 | return super(EmptyCommit, cls).__new__(cls, *args, **kwargs) |
|
1617 | 1632 | |
|
1618 | def __init__(self, cs= |
|
1633 | def __init__(self, cs=EMPTY_COMMIT_ID, repo=None, requested_revision=None, |
|
1619 | 1634 | alias=None, revision=-1, message='', author='', date=None): |
|
1620 | 1635 | if requested_revision is not None: |
|
1621 | 1636 | warnings.warn( |
@@ -234,8 +234,7 b' class GitCommit(base.BaseCommit):' | |||
|
234 | 234 | path = self._fix_path(path) |
|
235 | 235 | if self._get_kind(path) != NodeKind.FILE: |
|
236 | 236 | raise CommitError( |
|
237 | "File does not exist for commit %s at '%s'" % | |
|
238 | (self.raw_id, path)) | |
|
237 | "File does not exist for commit %s at '%s'" % (self.raw_id, path)) | |
|
239 | 238 | return path |
|
240 | 239 | |
|
241 | 240 | def _get_file_nodes(self): |
@@ -353,8 +352,7 b' class GitCommit(base.BaseCommit):' | |||
|
353 | 352 | def get_nodes(self, path): |
|
354 | 353 | if self._get_kind(path) != NodeKind.DIR: |
|
355 | 354 | raise CommitError( |
|
356 | "Directory does not exist for commit %s at " | |
|
357 | " '%s'" % (self.raw_id, path)) | |
|
355 | "Directory does not exist for commit %s at '%s'" % (self.raw_id, path)) | |
|
358 | 356 | path = self._fix_path(path) |
|
359 | 357 | id_, _ = self._get_id_for_path(path) |
|
360 | 358 | tree_id = self._remote[id_]['id'] |
@@ -29,8 +29,7 b' from rhodecode.lib.vcs.backends import b' | |||
|
29 | 29 | |
|
30 | 30 | class GitInMemoryCommit(base.BaseInMemoryCommit): |
|
31 | 31 | |
|
32 | def commit(self, message, author, parents=None, branch=None, date=None, | |
|
33 | **kwargs): | |
|
32 | def commit(self, message, author, parents=None, branch=None, date=None, **kwargs): | |
|
34 | 33 | """ |
|
35 | 34 | Performs in-memory commit (doesn't check workdir in any way) and |
|
36 | 35 | returns newly created `GitCommit`. Updates repository's |
@@ -94,12 +93,12 b' class GitInMemoryCommit(base.BaseInMemor' | |||
|
94 | 93 | commit_data, branch, commit_tree, updated, removed) |
|
95 | 94 | |
|
96 | 95 | # Update vcs repository object |
|
97 | self.repository.commit_id |
|
98 | self.repository._rebuild_cache(self.repository.commit_ids) |
|
96 | self.repository.append_commit_id(commit_id) | |
|
99 | 97 | |
|
100 | 98 | # invalidate parsed refs after commit |
|
101 | 99 | self.repository._refs = self.repository._get_refs() |
|
102 | 100 | self.repository.branches = self.repository._get_branches() |
|
103 | tip = self.repository.get_commit() | |
|
101 | tip = self.repository.get_commit(commit_id) | |
|
102 | ||
|
104 | 103 | self.reset() |
|
105 | 104 | return tip |
@@ -25,8 +25,10 b' GIT repository module' | |||
|
25 | 25 | import logging |
|
26 | 26 | import os |
|
27 | 27 | import re |
|
28 | import time | |
|
28 | 29 | |
|
29 | 30 | from zope.cachedescriptors.property import Lazy as LazyProperty |
|
31 | from zope.cachedescriptors.property import CachedProperty | |
|
30 | 32 | |
|
31 | 33 | from rhodecode.lib.compat import OrderedDict |
|
32 | 34 | from rhodecode.lib.datelib import ( |
@@ -69,6 +71,9 b' class GitRepository(BaseRepository):' | |||
|
69 | 71 | # caches |
|
70 | 72 | self._commit_ids = {} |
|
71 | 73 | |
|
74 | # dependency token that triggers re-computation of commit_ids |
|
75 | self._commit_ids_ver = 0 | |
|
76 | ||
|
72 | 77 | @LazyProperty |
|
73 | 78 | def _remote(self): |
|
74 | 79 | return connection.Git(self.path, self.config, with_wire=self.with_wire) |
@@ -81,7 +86,7 b' class GitRepository(BaseRepository):' | |||
|
81 | 86 | def head(self): |
|
82 | 87 | return self._remote.head() |
|
83 | 88 | |
|
84 | @LazyProperty | |
|
89 | @CachedProperty('_commit_ids_ver') | |
|
85 | 90 | def commit_ids(self): |
|
86 | 91 | """ |
|
87 | 92 | Returns list of commit ids, in ascending order. Being lazy |
@@ -222,13 +227,10 b' class GitRepository(BaseRepository):' | |||
|
222 | 227 | return [] |
|
223 | 228 | return output.splitlines() |
|
224 | 229 | |
|
225 | def _ |
|
230 | def _lookup_commit(self, commit_id_or_idx, translate_tag=True): |
|
226 | 231 | def is_null(value): |
|
227 | 232 | return len(value) == commit_id_or_idx.count('0') |
|
228 | 233 | |
|
229 | if self.is_empty(): | |
|
230 | raise EmptyRepositoryError("There are no commits yet") | |
|
231 | ||
|
232 | 234 | if commit_id_or_idx in (None, '', 'tip', 'HEAD', 'head', -1): |
|
233 | 235 | return self.commit_ids[-1] |
|
234 | 236 | |
@@ -238,8 +240,7 b' class GitRepository(BaseRepository):' | |||
|
238 | 240 | try: |
|
239 | 241 | commit_id_or_idx = self.commit_ids[int(commit_id_or_idx)] |
|
240 | 242 | except Exception: |
|
241 |
msg = "Commit |
|
|
242 | commit_id_or_idx, self) | |
|
243 | msg = "Commit {} does not exist for `{}`".format(commit_id_or_idx, self.name) | |
|
243 | 244 | raise CommitDoesNotExistError(msg) |
|
244 | 245 | |
|
245 | 246 | elif is_bstr: |
@@ -261,8 +262,7 b' class GitRepository(BaseRepository):' | |||
|
261 | 262 | |
|
262 | 263 | if (not SHA_PATTERN.match(commit_id_or_idx) or |
|
263 | 264 | commit_id_or_idx not in self.commit_ids): |
|
264 |
msg = "Commit |
|
|
265 | commit_id_or_idx, self) | |
|
265 | msg = "Commit {} does not exist for `{}`".format(commit_id_or_idx, self.name) | |
|
266 | 266 | raise CommitDoesNotExistError(msg) |
|
267 | 267 | |
|
268 | 268 | # Ensure we return full id |
@@ -431,19 +431,42 b' class GitRepository(BaseRepository):' | |||
|
431 | 431 | Returns `GitCommit` object representing commit from git repository |
|
432 | 432 | at the given `commit_id` or head (most recent commit) if None given. |
|
433 | 433 | """ |
|
434 | if self.is_empty(): | |
|
435 | raise EmptyRepositoryError("There are no commits yet") | |
|
436 | ||
|
434 | 437 | if commit_id is not None: |
|
435 | 438 | self._validate_commit_id(commit_id) |
|
439 | try: | |
|
440 | # we have cached idx, use it without contacting the remote | |
|
441 | idx = self._commit_ids[commit_id] | |
|
442 | return GitCommit(self, commit_id, idx, pre_load=pre_load) | |
|
443 | except KeyError: | |
|
444 | pass | |
|
445 | ||
|
436 | 446 | elif commit_idx is not None: |
|
437 | 447 | self._validate_commit_idx(commit_idx) |
|
438 | commit_id = commit_idx | |
|
439 | commit_id = self. |
|
448 | try: |
|
449 | _commit_id = self.commit_ids[commit_idx] | |
|
450 | if commit_idx < 0: | |
|
451 | commit_idx = self.commit_ids.index(_commit_id) | |
|
452 | return GitCommit(self, _commit_id, commit_idx, pre_load=pre_load) | |
|
453 | except IndexError: | |
|
454 | commit_id = commit_idx | |
|
455 | else: | |
|
456 | commit_id = "tip" | |
|
457 | ||
|
458 | commit_id = self._lookup_commit(commit_id) | |
|
459 | remote_idx = None | |
|
460 | if translate_tag: | |
|
461 | # Need to call remote to translate id for tagging scenario | |
|
462 | remote_data = self._remote.get_object(commit_id) | |
|
463 | commit_id = remote_data["commit_id"] | |
|
464 | remote_idx = remote_data["idx"] | |
|
465 | ||
|
440 | 466 | try: |
|
441 | if translate_tag: | |
|
442 | # Need to call remote to translate id for tagging scenario | |
|
443 | commit_id = self._remote.get_object(commit_id)["commit_id"] | |
|
444 | 467 | idx = self._commit_ids[commit_id] |
|
445 | 468 | except KeyError: |
|
446 | raise RepositoryError("Cannot get object with id %s" % commit_id) | |
|
469 | idx = remote_idx or 0 | |
|
447 | 470 | |
|
448 | 471 | return GitCommit(self, commit_id, idx, pre_load=pre_load) |
|
449 | 472 | |
@@ -472,6 +495,7 b' class GitRepository(BaseRepository):' | |||
|
472 | 495 | """ |
|
473 | 496 | if self.is_empty(): |
|
474 | 497 | raise EmptyRepositoryError("There are no commits yet") |
|
498 | ||
|
475 | 499 | self._validate_branch_name(branch_name) |
|
476 | 500 | |
|
477 | 501 | if start_id is not None: |
@@ -479,9 +503,9 b' class GitRepository(BaseRepository):' | |||
|
479 | 503 | if end_id is not None: |
|
480 | 504 | self._validate_commit_id(end_id) |
|
481 | 505 | |
|
482 |
start_raw_id = self._ |
|
|
506 | start_raw_id = self._lookup_commit(start_id) | |
|
483 | 507 | start_pos = self._commit_ids[start_raw_id] if start_id else None |
|
484 | end_raw_id = self._ |
|
508 | end_raw_id = self._lookup_commit(end_id) |
|
485 | 509 | end_pos = max(0, self._commit_ids[end_raw_id]) if end_id else None |
|
486 | 510 | |
|
487 | 511 | if None not in [start_id, end_id] and start_pos > end_pos: |
@@ -589,8 +613,9 b' class GitRepository(BaseRepository):' | |||
|
589 | 613 | commit = commit.parents[0] |
|
590 | 614 | self._remote.set_refs('refs/heads/%s' % branch_name, commit.raw_id) |
|
591 | 615 | |
|
592 | self.commit_ids = self._get_all_commit_ids() | |
|
593 | self._rebuild_cache(self.commit_ids) | |
|
616 | self._commit_ids_ver = time.time() | |
|
617 | # we updated _commit_ids_ver so accessing self.commit_ids will re-compute it | |
|
618 | return len(self.commit_ids) | |
|
594 | 619 | |
|
595 | 620 | def get_common_ancestor(self, commit_id1, commit_id2, repo2): |
|
596 | 621 | if commit_id1 == commit_id2: |
@@ -193,13 +193,6 b' class MercurialCommit(base.BaseCommit):' | |||
|
193 | 193 | children = self._remote.ctx_children(self.idx) |
|
194 | 194 | return self._make_commits(children) |
|
195 | 195 | |
|
196 | def diff(self, ignore_whitespace=True, context=3): | |
|
197 | result = self._remote.ctx_diff( | |
|
198 | self.idx, | |
|
199 | git=True, ignore_whitespace=ignore_whitespace, context=context) | |
|
200 | diff = ''.join(result) | |
|
201 | return MercurialDiff(diff) | |
|
202 | ||
|
203 | 196 | def _fix_path(self, path): |
|
204 | 197 | """ |
|
205 | 198 | Mercurial keeps filenodes as str so we need to encode from unicode |
@@ -30,8 +30,7 b' from rhodecode.lib.vcs.exceptions import' | |||
|
30 | 30 | |
|
31 | 31 | class MercurialInMemoryCommit(BaseInMemoryCommit): |
|
32 | 32 | |
|
33 | def commit(self, message, author, parents=None, branch=None, date=None, | |
|
34 | **kwargs): | |
|
33 | def commit(self, message, author, parents=None, branch=None, date=None, **kwargs): | |
|
35 | 34 | """ |
|
36 | 35 | Performs in-memory commit (doesn't check workdir in any way) and |
|
37 | 36 | returns newly created `MercurialCommit`. Updates repository's |
@@ -83,15 +82,14 b' class MercurialInMemoryCommit(BaseInMemo' | |||
|
83 | 82 | |
|
84 | 83 | date, tz = date_to_timestamp_plus_offset(date) |
|
85 | 84 | |
|
86 |
|
85 | commit_id = self.repository._remote.commitctx( |
|
87 | 86 | message=message, parents=parent_ids, |
|
88 | 87 | commit_time=date, commit_timezone=tz, user=author, |
|
89 | 88 | files=self.get_paths(), extra=kwargs, removed=removed, |
|
90 | 89 | updated=updated) |
|
90 | self.repository.append_commit_id(commit_id) | |
|
91 | 91 | |
|
92 | self.repository.commit_ids.append(new_id) | |
|
93 | self.repository._rebuild_cache(self.repository.commit_ids) | |
|
94 | 92 | self.repository.branches = self.repository._get_branches() |
|
95 | tip = self.repository.get_commit() | |
|
93 | tip = self.repository.get_commit(commit_id) | |
|
96 | 94 | self.reset() |
|
97 | 95 | return tip |
@@ -24,9 +24,11 b' HG repository module' | |||
|
24 | 24 | import os |
|
25 | 25 | import logging |
|
26 | 26 | import binascii |
|
27 | import time | |
|
27 | 28 | import urllib |
|
28 | 29 | |
|
29 | 30 | from zope.cachedescriptors.property import Lazy as LazyProperty |
|
31 | from zope.cachedescriptors.property import CachedProperty | |
|
30 | 32 | |
|
31 | 33 | from rhodecode.lib.compat import OrderedDict |
|
32 | 34 | from rhodecode.lib.datelib import ( |
@@ -85,11 +87,14 b' class MercurialRepository(BaseRepository' | |||
|
85 | 87 | # caches |
|
86 | 88 | self._commit_ids = {} |
|
87 | 89 | |
|
90 | # dependency token that triggers re-computation of commit_ids |
|
91 | self._commit_ids_ver = 0 | |
|
92 | ||
|
88 | 93 | @LazyProperty |
|
89 | 94 | def _remote(self): |
|
90 | 95 | return connection.Hg(self.path, self.config, with_wire=self.with_wire) |
|
91 | 96 | |
|
92 | @LazyProperty | |
|
97 | @CachedProperty('_commit_ids_ver') | |
|
93 | 98 | def commit_ids(self): |
|
94 | 99 | """ |
|
95 | 100 | Returns list of commit ids, in ascending order. Being lazy |
@@ -157,8 +162,7 b' class MercurialRepository(BaseRepository' | |||
|
157 | 162 | |
|
158 | 163 | return OrderedDict(sorted(_tags, key=get_name, reverse=True)) |
|
159 | 164 | |
|
160 | def tag(self, name, user, commit_id=None, message=None, date=None, | |
|
161 | **kwargs): | |
|
165 | def tag(self, name, user, commit_id=None, message=None, date=None, **kwargs): | |
|
162 | 166 | """ |
|
163 | 167 | Creates and returns a tag for the given ``commit_id``. |
|
164 | 168 | |
@@ -172,6 +176,7 b' class MercurialRepository(BaseRepository' | |||
|
172 | 176 | """ |
|
173 | 177 | if name in self.tags: |
|
174 | 178 | raise TagAlreadyExistError("Tag %s already exists" % name) |
|
179 | ||
|
175 | 180 | commit = self.get_commit(commit_id=commit_id) |
|
176 | 181 | local = kwargs.setdefault('local', False) |
|
177 | 182 | |
@@ -180,8 +185,7 b' class MercurialRepository(BaseRepository' | |||
|
180 | 185 | |
|
181 | 186 | date, tz = date_to_timestamp_plus_offset(date) |
|
182 | 187 | |
|
183 | self._remote.tag( | |
|
184 | name, commit.raw_id, message, local, user, date, tz) | |
|
188 | self._remote.tag(name, commit.raw_id, message, local, user, date, tz) | |
|
185 | 189 | self._remote.invalidate_vcs_cache() |
|
186 | 190 | |
|
187 | 191 | # Reinitialize tags |
@@ -203,6 +207,7 b' class MercurialRepository(BaseRepository' | |||
|
203 | 207 | """ |
|
204 | 208 | if name not in self.tags: |
|
205 | 209 | raise TagDoesNotExistError("Tag %s does not exist" % name) |
|
210 | ||
|
206 | 211 | if message is None: |
|
207 | 212 | message = "Removed tag %s" % name |
|
208 | 213 | local = False |
@@ -271,8 +276,9 b' class MercurialRepository(BaseRepository' | |||
|
271 | 276 | self._remote.strip(commit_id, update=False, backup="none") |
|
272 | 277 | |
|
273 | 278 | self._remote.invalidate_vcs_cache() |
|
274 | self.commit_ids = self._get_all_commit_ids() | |
|
275 | self._rebuild_cache(self.commit_ids) | |
|
279 | self._commit_ids_ver = time.time() | |
|
280 | # we updated _commit_ids_ver so accessing self.commit_ids will re-compute it | |
|
281 | return len(self.commit_ids) | |
|
276 | 282 | |
|
277 | 283 | def verify(self): |
|
278 | 284 | verify = self._remote.verify() |
@@ -425,18 +431,20 b' class MercurialRepository(BaseRepository' | |||
|
425 | 431 | if commit_id is not None: |
|
426 | 432 | self._validate_commit_id(commit_id) |
|
427 | 433 | try: |
|
434 | # we have cached idx, use it without contacting the remote | |
|
428 | 435 | idx = self._commit_ids[commit_id] |
|
429 | 436 | return MercurialCommit(self, commit_id, idx, pre_load=pre_load) |
|
430 | 437 | except KeyError: |
|
431 | 438 | pass |
|
439 | ||
|
432 | 440 | elif commit_idx is not None: |
|
433 | 441 | self._validate_commit_idx(commit_idx) |
|
434 | 442 | try: |
|
435 |
|
443 | _commit_id = self.commit_ids[commit_idx] |
|
436 | 444 | if commit_idx < 0: |
|
437 | commit_idx |
|
438 | return MercurialCommit( |
|
439 |
|
445 | commit_idx = self.commit_ids.index(_commit_id) |
|
446 | ||
|
447 | return MercurialCommit(self, _commit_id, commit_idx, pre_load=pre_load) | |
|
440 | 448 | except IndexError: |
|
441 | 449 | commit_id = commit_idx |
|
442 | 450 | else: |
@@ -448,8 +456,8 b' class MercurialRepository(BaseRepository' | |||
|
448 | 456 | try: |
|
449 | 457 | raw_id, idx = self._remote.lookup(commit_id, both=True) |
|
450 | 458 | except CommitDoesNotExistError: |
|
451 | msg = "Commit |
|
452 | commit_id, self) |
|
459 | msg = "Commit {} does not exist for `{}`".format( | |
|
460 | *map(safe_str, [commit_id, self.name])) | |
|
453 | 461 | raise CommitDoesNotExistError(msg) |
|
454 | 462 | |
|
455 | 463 | return MercurialCommit(self, raw_id, idx, pre_load=pre_load) |
@@ -477,11 +485,11 b' class MercurialRepository(BaseRepository' | |||
|
477 | 485 | ``end`` could not be found. |
|
478 | 486 | """ |
|
479 | 487 | # actually we should check now if it's not an empty repo |
|
480 | branch_ancestors = False | |
|
481 | 488 | if self.is_empty(): |
|
482 | 489 | raise EmptyRepositoryError("There are no commits yet") |
|
483 | 490 | self._validate_branch_name(branch_name) |
|
484 | 491 | |
|
492 | branch_ancestors = False | |
|
485 | 493 | if start_id is not None: |
|
486 | 494 | self._validate_commit_id(start_id) |
|
487 | 495 | c_start = self.get_commit(commit_id=start_id) |
@@ -715,11 +723,16 b' class MercurialRepository(BaseRepository' | |||
|
715 | 723 | |
|
716 | 724 | try: |
|
717 | 725 | if target_ref.type == 'branch' and len(self._heads(target_ref.name)) != 1: |
|
718 | heads = ','.join(self._heads(target_ref.name)) | |
|
726 | heads = '\n,'.join(self._heads(target_ref.name)) | |
|
727 | metadata = { | |
|
728 | 'target_ref': target_ref, | |
|
729 | 'source_ref': source_ref, | |
|
730 | 'heads': heads | |
|
731 | } | |
|
719 | 732 | return MergeResponse( |
|
720 | 733 | False, False, None, |
|
721 | 734 | MergeFailureReason.HG_TARGET_HAS_MULTIPLE_HEADS, |
|
722 | metadata= |
|
735 | metadata=metadata) |
|
723 | 736 | except CommitDoesNotExistError: |
|
724 | 737 | log.exception('Failure when looking up branch heads on hg target') |
|
725 | 738 | return MergeResponse( |
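The new `metadata` dict built in this hunk feeds the `HG_TARGET_HAS_MULTIPLE_HEADS` message template, which relies on `str.format` attribute access such as `{target_ref.name}`. A small sketch of how that substitution behaves (the `Reference` namedtuple mirrors the one defined in `base.py`):

```python
import collections

Reference = collections.namedtuple('Reference', ('type', 'name', 'commit_id'))

template = (
    u'This pull request cannot be merged because the target `{target_ref.name}` '
    u'has multiple heads: `{heads}`.')

metadata = {
    'target_ref': Reference('branch', 'default', 'deadbeef' * 5),
    'heads': '\n,'.join(['aaa111', 'bbb222']),
}

# attribute access inside the placeholder pulls .name off the namedtuple
message = template.format(**metadata)
```

Because the template only names `target_ref` and `heads`, extra keys in `metadata` (like `source_ref` above in the diff) are simply ignored by `format`.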
@@ -892,11 +905,15 b' class MercurialRepository(BaseRepository' | |||
|
892 | 905 | |
|
893 | 906 | def read_patterns(suffix): |
|
894 | 907 | svalue = None |
|
895 | try: | |
|
896 |
|
897 | except configparser.NoOptionError: |
|
908 | for section, option in [ | |
|
909 | ('narrowacl', username + suffix), | |
|
910 | ('narrowacl', 'default' + suffix), | |
|
911 | ('narrowhgacl', username + suffix), | |
|
912 | ('narrowhgacl', 'default' + suffix) | |
|
913 | ]: | |
|
898 | 914 | try: |
|
899 | svalue = hgacl.get( |
|
915 | svalue = hgacl.get(section, option) |
|
916 | break # stop at the first value we find | |
|
900 | 917 | except configparser.NoOptionError: |
|
901 | 918 | pass |
|
902 | 919 | if not svalue: |
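The rewritten `read_patterns` walks an ordered list of `(section, option)` candidates and keeps the first value found. The same first-match lookup can be sketched with `configparser` directly — the section and option names below are illustrative, and this sketch also tolerates missing sections, which the original loop does not:

```python
import configparser

def first_match(parser, candidates):
    # return the first (section, option) value that exists, else None
    for section, option in candidates:
        try:
            return parser.get(section, option)
        except (configparser.NoSectionError, configparser.NoOptionError):
            pass
    return None

hgacl = configparser.ConfigParser()
hgacl.read_string(u"""
[narrowhgacl]
default.includes = repo/*
""")

username = 'bob'
value = first_match(hgacl, [
    ('narrowacl', username + '.includes'),
    ('narrowacl', 'default.includes'),
    ('narrowhgacl', username + '.includes'),
    ('narrowhgacl', 'default.includes'),
])
```

Ordering user-specific candidates before the `default` ones gives per-user overrides for free.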
@@ -30,8 +30,7 b' from rhodecode.lib.vcs.backends import b' | |||
|
30 | 30 | |
|
31 | 31 | class SubversionInMemoryCommit(base.BaseInMemoryCommit): |
|
32 | 32 | |
|
33 | def commit(self, message, author, parents=None, branch=None, date=None, | |
|
34 | **kwargs): | |
|
33 | def commit(self, message, author, parents=None, branch=None, date=None, **kwargs): | |
|
35 | 34 | if branch not in (None, self.repository.DEFAULT_BRANCH_NAME): |
|
36 | 35 | raise NotImplementedError("Branches are not yet supported") |
|
37 | 36 | |
@@ -74,8 +73,7 b' class SubversionInMemoryCommit(base.Base' | |||
|
74 | 73 | # we should not add the commit_id, if it is already evaluated, it |
|
75 | 74 | # will not be evaluated again. |
|
76 | 75 | commit_id = str(svn_rev) |
|
77 |
|
|
|
78 | self.repository.commit_ids.append(commit_id) | |
|
76 | self.repository.append_commit_id(commit_id) | |
|
79 | 77 | tip = self.repository.get_commit() |
|
80 | 78 | self.reset() |
|
81 | 79 | return tip |
@@ -27,6 +27,7 b' import os' | |||
|
27 | 27 | import urllib |
|
28 | 28 | |
|
29 | 29 | from zope.cachedescriptors.property import Lazy as LazyProperty |
|
30 | from zope.cachedescriptors.property import CachedProperty | |
|
30 | 31 | |
|
31 | 32 | from rhodecode.lib.compat import OrderedDict |
|
32 | 33 | from rhodecode.lib.datelib import date_astimestamp |
@@ -75,6 +76,9 b' class SubversionRepository(base.BaseRepo' | |||
|
75 | 76 | |
|
76 | 77 | self._init_repo(create, src_url) |
|
77 | 78 | |
|
79 | # dependency token that triggers re-computation of commit_ids |
|
80 | self._commit_ids_ver = 0 | |
|
81 | ||
|
78 | 82 | @LazyProperty |
|
79 | 83 | def _remote(self): |
|
80 | 84 | return connection.Svn(self.path, self.config) |
@@ -93,11 +97,31 b' class SubversionRepository(base.BaseRepo' | |||
|
93 | 97 | else: |
|
94 | 98 | self._check_path() |
|
95 | 99 | |
|
96 | @LazyProperty | |
|
100 | @CachedProperty('_commit_ids_ver') | |
|
97 | 101 | def commit_ids(self): |
|
98 | 102 | head = self._remote.lookup(None) |
|
99 | 103 | return [str(r) for r in xrange(1, head + 1)] |
|
100 | 104 | |
|
105 | def _rebuild_cache(self, commit_ids): | |
|
106 | pass | |
|
107 | ||
|
108 | def run_svn_command(self, cmd, **opts): | |
|
109 | """ | |
|
110 | Runs given ``cmd`` as svn command and returns tuple | |
|
111 | (stdout, stderr). | |
|
112 | ||
|
113 | :param cmd: full svn command to be executed | |
|
114 | :param opts: env options to pass into Subprocess command | |
|
115 | """ | |
|
116 | if not isinstance(cmd, list): | |
|
117 | raise ValueError('cmd must be a list, got %s instead' % type(cmd)) | |
|
118 | ||
|
119 | skip_stderr_log = opts.pop('skip_stderr_log', False) | |
|
120 | out, err = self._remote.run_svn_command(cmd, **opts) | |
|
121 | if err and not skip_stderr_log: | |
|
122 | log.debug('Stderr output of svn command "%s":\n%s', cmd, err) | |
|
123 | return out, err | |
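`run_svn_command` validates the argument type, delegates to the remote, and optionally logs stderr. A hedged stand-alone sketch of the same shape using `subprocess` — in the real method the call goes over RPC to the VCS server, not to a local process:

```python
import subprocess

def run_command(cmd, skip_stderr_log=False):
    # mirror the validation: a string argument would be ambiguous to split
    if not isinstance(cmd, list):
        raise ValueError('cmd must be a list, got %s instead' % type(cmd))

    proc = subprocess.Popen(
        cmd, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
    out, err = proc.communicate()
    if err and not skip_stderr_log:
        # in RhodeCode this goes through log.debug instead of print
        print('Stderr output of command "%s":\n%s' % (cmd, err))
    return out, err
```

Returning the `(stdout, stderr)` tuple unconditionally, while only *logging* stderr, lets callers decide whether stderr output is actually an error.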
|
124 | ||
|
101 | 125 | @LazyProperty |
|
102 | 126 | def branches(self): |
|
103 | 127 | return self._tags_or_branches('vcs_svn_branch') |
@@ -260,7 +284,7 b' class SubversionRepository(base.BaseRepo' | |||
|
260 | 284 | try: |
|
261 | 285 | commit_id = self.commit_ids[commit_idx] |
|
262 | 286 | except IndexError: |
|
263 | raise CommitDoesNotExistError | |
|
287 | raise CommitDoesNotExistError('No commit with idx: {}'.format(commit_idx)) | |
|
264 | 288 | |
|
265 | 289 | commit_id = self._sanitize_commit_id(commit_id) |
|
266 | 290 | commit = SubversionCommit(repository=self, commit_id=commit_id) |
@@ -42,12 +42,16 b' BACKENDS = {' | |||
|
42 | 42 | 'svn': 'rhodecode.lib.vcs.backends.svn.SubversionRepository', |
|
43 | 43 | } |
|
44 | 44 | |
|
45 | # TODO: Remove once controllers/files.py is adjusted | |
|
46 | ARCHIVE_SPECS = |
|
47 | 'tbz2' |
|
48 |
|
49 | 'zip': ('application/zip', '.zip'), |
|
50 | } |
|
45 | ||
|
46 | ARCHIVE_SPECS = [ | |
|
47 | ('tbz2', 'application/x-bzip2', '.tbz2'), |
|
48 | ('tbz2', 'application/x-bzip2', '.tar.bz2'), | |
|
49 | ||
|
50 | ('tgz', 'application/x-gzip', '.tgz'), | |
|
51 | ('tgz', 'application/x-gzip', '.tar.gz'), | |
|
52 | ||
|
53 | ('zip', 'application/zip', '.zip'), | |
|
54 | ] | |
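`ARCHIVE_SPECS` changes from a dict to a list of `(kind, content_type, extension)` tuples, so one kind can map to several extensions, and the allowed kinds are derived with `[x[0] for x in ARCHIVE_SPECS]` as seen in the `archive_repo` hunk above. A quick sketch of both lookups:

```python
ARCHIVE_SPECS = [
    ('tbz2', 'application/x-bzip2', '.tbz2'),
    ('tbz2', 'application/x-bzip2', '.tar.bz2'),
    ('tgz', 'application/x-gzip', '.tgz'),
    ('tgz', 'application/x-gzip', '.tar.gz'),
    ('zip', 'application/zip', '.zip'),
]

def allowed_kinds():
    # duplicates are preserved, but membership tests still work
    return [x[0] for x in ARCHIVE_SPECS]

def content_type_for(kind):
    # first matching row wins
    for row_kind, content_type, _ext in ARCHIVE_SPECS:
        if row_kind == kind:
            return content_type
    raise KeyError(kind)
```

A dict keyed by kind could not express two extensions per kind without nesting, which is presumably why the commit flattens it into rows.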
|
51 | 55 | |
|
52 | 56 | HOOKS_PROTOCOL = None |
|
53 | 57 | HOOKS_DIRECT_CALLS = False |
@@ -101,23 +101,35 b' def parse_datetime(text):' | |||
|
101 | 101 | :param text: string of desired date/datetime or something more verbose, |
|
102 | 102 | like *yesterday*, *2weeks 3days*, etc. |
|
103 | 103 | """ |
|
104 | if not text: | |
|
105 | raise ValueError('Wrong date: "%s"' % text) | |
|
104 | 106 | |
|
105 | text = text.strip().lower() | |
|
107 | if isinstance(text, datetime.datetime): | |
|
108 | return text | |
|
106 | 109 | |
|
107 | INPUT_FORMATS = ( | |
|
110 | # limit the format to not include microseconds, e.g. 2017-10-17t17:48:23.XXXX |
|
111 | text = text.strip().lower()[:19] | |
|
112 | ||
|
113 | input_formats = ( | |
|
108 | 114 | '%Y-%m-%d %H:%M:%S', |
|
115 | '%Y-%m-%dt%H:%M:%S', | |
|
109 | 116 | '%Y-%m-%d %H:%M', |
|
117 | '%Y-%m-%dt%H:%M', | |
|
110 | 118 | '%Y-%m-%d', |
|
111 | 119 | '%m/%d/%Y %H:%M:%S', |
|
120 | '%m/%d/%Yt%H:%M:%S', | |
|
112 | 121 | '%m/%d/%Y %H:%M', |
|
122 | '%m/%d/%Yt%H:%M', | |
|
113 | 123 | '%m/%d/%Y', |
|
114 | 124 | '%m/%d/%y %H:%M:%S', |
|
125 | '%m/%d/%yt%H:%M:%S', | |
|
115 | 126 | '%m/%d/%y %H:%M', |
|
127 | '%m/%d/%yt%H:%M', | |
|
116 | 128 | '%m/%d/%y', |
|
117 | 129 | ) |
|
118 | for format in INPUT_FORMATS: | |
|
130 | for format_def in input_formats: | |
|
119 | 131 | try: |
|
120 | return datetime.datetime(*time.strptime(text, format)[:6]) | |
|
132 | return datetime.datetime(*time.strptime(text, format_def)[:6]) | |
|
121 | 133 | except ValueError: |
|
122 | 134 | pass |
|
123 | 135 |
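The widened `input_formats` tuple is tried in order until `time.strptime` succeeds. A condensed sketch of that fallback loop, with the format list reduced to a few of the entries added above:

```python
import time
import datetime

INPUT_FORMATS = (
    '%Y-%m-%d %H:%M:%S',
    '%Y-%m-%dt%H:%M:%S',
    '%Y-%m-%d',
)

def parse_datetime_simple(text):
    # strip microseconds the same way the patch does: keep at most 19 chars
    text = text.strip().lower()[:19]
    for format_def in INPUT_FORMATS:
        try:
            return datetime.datetime(*time.strptime(text, format_def)[:6])
        except ValueError:
            pass
    raise ValueError('Given date: %s is not valid' % text)
```

Lower-casing the input is what lets a single `t`-separated format match ISO-8601 strings written with either `T` or `t`.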
@@ -25,6 +25,7 b' Database Models for RhodeCode Enterprise' | |||
|
25 | 25 | import re |
|
26 | 26 | import os |
|
27 | 27 | import time |
|
28 | import string | |
|
28 | 29 | import hashlib |
|
29 | 30 | import logging |
|
30 | 31 | import datetime |
@@ -39,7 +40,7 b' from sqlalchemy import (' | |||
|
39 | 40 | Index, Sequence, UniqueConstraint, ForeignKey, CheckConstraint, Column, |
|
40 | 41 | Boolean, String, Unicode, UnicodeText, DateTime, Integer, LargeBinary, |
|
41 | 42 | Text, Float, PickleType) |
|
42 | from sqlalchemy.sql.expression import true, false | |
|
43 | from sqlalchemy.sql.expression import true, false, case | |
|
43 | 44 | from sqlalchemy.sql.functions import coalesce, count # pragma: no cover |
|
44 | 45 | from sqlalchemy.orm import ( |
|
45 | 46 | relationship, joinedload, class_mapper, validates, aliased) |
@@ -50,6 +51,7 b' from sqlalchemy.dialects.mysql import LO' | |||
|
50 | 51 | from zope.cachedescriptors.property import Lazy as LazyProperty |
|
51 | 52 | from pyramid import compat |
|
52 | 53 | from pyramid.threadlocal import get_current_request |
|
54 | from webhelpers.text import collapse, remove_formatting | |
|
53 | 55 | |
|
54 | 56 | from rhodecode.translation import _ |
|
55 | 57 | from rhodecode.lib.vcs import get_vcs_instance |
@@ -57,13 +59,13 b' from rhodecode.lib.vcs.backends.base imp' | |||
|
57 | 59 | from rhodecode.lib.utils2 import ( |
|
58 | 60 | str2bool, safe_str, get_commit_safe, safe_unicode, sha1_safe, |
|
59 | 61 | time_to_datetime, aslist, Optional, safe_int, get_clone_url, AttributeDict, |
|
60 | glob2re, StrictAttributeDict, cleaned_uri) | |
|
62 | glob2re, StrictAttributeDict, cleaned_uri, datetime_to_time, OrderedDefaultDict) | |
|
61 | 63 | from rhodecode.lib.jsonalchemy import MutationObj, MutationList, JsonType, \ |
|
62 | 64 | JsonRaw |
|
63 | 65 | from rhodecode.lib.ext_json import json |
|
64 | 66 | from rhodecode.lib.caching_query import FromCache |
|
65 | from rhodecode.lib.encrypt import AESCipher | |
|
66 | ||
|
67 | from rhodecode.lib.encrypt import AESCipher, validate_and_get_enc_data | |
|
68 | from rhodecode.lib.encrypt2 import Encryptor | |
|
67 | 69 | from rhodecode.model.meta import Base, Session |
|
68 | 70 | |
|
69 | 71 | URL_SEP = '/' |
@@ -159,44 +161,46 b' class EncryptedTextValue(TypeDecorator):' | |||
|
159 | 161 | impl = Text |
|
160 | 162 | |
|
161 | 163 | def process_bind_param(self, value, dialect): |
|
162 | if not value: | |
|
163 | return value |
|
164 | if value.startswith('enc$aes$') or value.startswith('enc$aes_hmac$'): | |
|
165 | # protect against double encrypting if someone manually starts | |
|
166 | # doing | |
|
167 | raise ValueError('value needs to be in unencrypted format, ie. ' | |
|
168 | 'not starting with enc$aes') | |
|
169 | return 'enc$aes_hmac$%s' % AESCipher( | |
|
170 | ENCRYPTION_KEY, hmac=True).encrypt(value) | |
|
171 | ||
|
172 | def process_result_value(self, value, dialect): | |
|
164 | """ | |
|
165 | Setter for storing value | |
|
166 | """ | |
|
173 | 167 | import rhodecode |
|
174 | ||
|
175 | 168 | if not value: |
|
176 | 169 | return value |
|
177 | 170 | |
|
178 | parts = value.split('$', 3) | |
|
179 | if not len(parts) == 3: | |
|
180 | # probably not encrypted values | |
|
181 | return value | |
|
171 | # protect against double encrypting if values is already encrypted | |
|
172 | if value.startswith('enc$aes$') \ | |
|
173 | or value.startswith('enc$aes_hmac$') \ | |
|
174 | or value.startswith('enc2$'): | |
|
175 | raise ValueError('value needs to be in unencrypted format, ' | |
|
176 | 'ie. not starting with enc$ or enc2$') | |
|
177 | ||
|
178 | algo = rhodecode.CONFIG.get('rhodecode.encrypted_values.algorithm') or 'aes' | |
|
179 | if algo == 'aes': | |
|
180 | return 'enc$aes_hmac$%s' % AESCipher(ENCRYPTION_KEY, hmac=True).encrypt(value) | |
|
181 | elif algo == 'fernet': | |
|
182 | return Encryptor(ENCRYPTION_KEY).encrypt(value) | |
|
182 | 183 | else: |
|
183 | if parts[0] != 'enc': | |
|
184 | # parts ok but without our header ? | |
|
185 | return value | |
|
186 | enc_strict_mode = str2bool(rhodecode.CONFIG.get( | |
|
187 | 'rhodecode.encrypted_values.strict') or True) | |
|
188 | # at that stage we know it's our encryption | |
|
189 | if parts[1] == 'aes': | |
|
190 | decrypted_data = AESCipher(ENCRYPTION_KEY).decrypt(parts[2]) | |
|
191 | elif parts[1] == 'aes_hmac': | |
|
192 | decrypted_data = AESCipher( | |
|
193 | ENCRYPTION_KEY, hmac=True, | |
|
194 | strict_verification=enc_strict_mode).decrypt(parts[2]) | |
|
195 | else: | |
|
196 | raise ValueError( | |
|
197 | 'Encryption type part is wrong, must be `aes` ' | |
|
198 | 'or `aes_hmac`, got `%s` instead' % (parts[1])) | |
|
199 | return decrypted_data | |
|
184 | ValueError('Bad encryption algorithm, should be fernet or aes, got: {}'.format(algo)) | |
|
185 | ||
|
186 | def process_result_value(self, value, dialect): | |
|
187 | """ | |
|
188 | Getter for retrieving value | |
|
189 | """ | |
|
190 | ||
|
191 | import rhodecode | |
|
192 | if not value: | |
|
193 | return value | |
|
194 | ||
|
195 | algo = rhodecode.CONFIG.get('rhodecode.encrypted_values.algorithm') or 'aes' | |
|
196 | enc_strict_mode = str2bool(rhodecode.CONFIG.get('rhodecode.encrypted_values.strict') or True) | |
|
197 | if algo == 'aes': | |
|
198 | decrypted_data = validate_and_get_enc_data(value, ENCRYPTION_KEY, enc_strict_mode) | |
|
199 | elif algo == 'fernet': | |
|
200 | return Encryptor(ENCRYPTION_KEY).decrypt(value) | |
|
201 | else: | |
|
202 | ValueError('Bad encryption algorithm, should be fernet or aes, got: {}'.format(algo)) | |
|
203 | return decrypted_data | |
|
200 | 204 | |
|
201 | 205 | |
|
202 | 206 | class BaseModel(object): |
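The hunk above makes the encrypted-column codec algorithm-switchable: values are tagged with a prefix (`enc$aes_hmac$` for AES+HMAC, `enc2$` for Fernet), a guard rejects already-encrypted input, and the configured algorithm decides the branch. A minimal stand-alone sketch of that prefix-dispatch pattern is below; it only signs and base64-encodes (no real cipher, and `KEY` is a hypothetical stand-in for `ENCRYPTION_KEY`), so it shows the dispatch and double-encryption guard rather than RhodeCode's actual crypto.

```python
import base64
import hashlib
import hmac

KEY = b'secret-key'  # hypothetical key; the real code derives this from ENCRYPTION_KEY


def encode(value, algo='aes'):
    """Prefix-dispatched encoder, mirroring process_bind_param in the diff."""
    # guard against double encryption, as the new process_bind_param does
    if value.startswith(('enc$aes$', 'enc$aes_hmac$', 'enc2$')):
        raise ValueError('value needs to be in unencrypted format')
    payload = base64.b64encode(value.encode('utf8')).decode('ascii')
    if algo == 'aes':
        sig = hmac.new(KEY, payload.encode('ascii'), hashlib.sha256).hexdigest()
        return 'enc$aes_hmac$%s$%s' % (sig, payload)
    elif algo == 'fernet':
        return 'enc2$%s' % payload
    raise ValueError('Bad encryption algorithm, got: {}'.format(algo))


def decode(value):
    """Prefix-dispatched decoder, mirroring process_result_value."""
    if value.startswith('enc$aes_hmac$'):
        _, _, sig, payload = value.split('$', 3)
        expected = hmac.new(KEY, payload.encode('ascii'), hashlib.sha256).hexdigest()
        if not hmac.compare_digest(sig, expected):
            raise ValueError('signature mismatch')
        return base64.b64decode(payload).decode('utf8')
    if value.startswith('enc2$'):
        return base64.b64decode(value[len('enc2$'):]).decode('utf8')
    return value  # plain/legacy value passes through unchanged
```

Note the strict-mode idea in the diff corresponds to the signature check here: with `rhodecode.encrypted_values.strict` off, a mismatch is tolerated instead of raising.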
@@ -407,6 +411,15 b' class RhodeCodeUi(Base, BaseModel):' | |||
|
407 | 411 | HOOK_PUSH = 'changegroup.push_logger' |
|
408 | 412 | HOOK_PUSH_KEY = 'pushkey.key_push' |
|
409 | 413 | |
|
414 | HOOKS_BUILTIN = [ | |
|
415 | HOOK_PRE_PULL, | |
|
416 | HOOK_PULL, | |
|
417 | HOOK_PRE_PUSH, | |
|
418 | HOOK_PRETX_PUSH, | |
|
419 | HOOK_PUSH, | |
|
420 | HOOK_PUSH_KEY, | |
|
421 | ] | |
|
422 | ||
|
410 | 423 | # TODO: johbo: Unify way how hooks are configured for git and hg, |
|
411 | 424 | # git part is currently hardcoded. |
|
412 | 425 | |
@@ -1665,11 +1678,12 b' class Repository(Base, BaseModel):' | |||
|
1665 | 1678 | cascade="all, delete, delete-orphan") |
|
1666 | 1679 | ui = relationship('RepoRhodeCodeUi', cascade="all") |
|
1667 | 1680 | settings = relationship('RepoRhodeCodeSetting', cascade="all") |
|
1668 | integrations = relationship('Integration', | |
|
1669 | cascade="all, delete, delete-orphan") | |
|
1681 | integrations = relationship('Integration', cascade="all, delete, delete-orphan") | |
|
1670 | 1682 | |
|
1671 | 1683 | scoped_tokens = relationship('UserApiKeys', cascade="all") |
|
1672 | 1684 | |
|
1685 | artifacts = relationship('FileStore', cascade="all") | |
|
1686 | ||
|
1673 | 1687 | def __unicode__(self): |
|
1674 | 1688 | return u"<%s('%s:%s')>" % (self.__class__.__name__, self.repo_id, |
|
1675 | 1689 | safe_unicode(self.repo_name)) |
@@ -1717,7 +1731,9 b' class Repository(Base, BaseModel):' | |||
|
1717 | 1731 | from rhodecode.lib.vcs.backends.base import EmptyCommit |
|
1718 | 1732 | dummy = EmptyCommit().__json__() |
|
1719 | 1733 | if not self._changeset_cache: |
|
1720 | return dummy | |
|
1734 | dummy['source_repo_id'] = self.repo_id | |
|
1735 | return json.loads(json.dumps(dummy)) | |
|
1736 | ||
|
1721 | 1737 | try: |
|
1722 | 1738 | return json.loads(self._changeset_cache) |
|
1723 | 1739 | except TypeError: |
@@ -2170,6 +2186,20 b' class Repository(Base, BaseModel):' | |||
|
2170 | 2186 | return make_lock, currently_locked, lock_info |
|
2171 | 2187 | |
|
2172 | 2188 | @property |
|
2189 | def last_commit_cache_update_diff(self): | |
|
2190 | return time.time() - (safe_int(self.changeset_cache.get('updated_on')) or 0) | |
|
2191 | ||
|
2192 | @property | |
|
2193 | def last_commit_change(self): | |
|
2194 | from rhodecode.lib.vcs.utils.helpers import parse_datetime | |
|
2195 | empty_date = datetime.datetime.fromtimestamp(0) | |
|
2196 | date_latest = self.changeset_cache.get('date', empty_date) | |
|
2197 | try: | |
|
2198 | return parse_datetime(date_latest) | |
|
2199 | except Exception: | |
|
2200 | return empty_date | |
|
2201 | ||
|
2202 | @property | |
|
2173 | 2203 | def last_db_change(self): |
|
2174 | 2204 | return self.updated_on |
|
2175 | 2205 | |
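The new `last_commit_change` property above parses the cached commit date and degrades to the Unix epoch on any failure, so callers always get a comparable datetime. A small sketch of that fallback pattern (using stdlib `strptime` as a stand-in for rhodecode's more lenient `parse_datetime` helper):

```python
import datetime


def last_commit_change(changeset_cache):
    """Parse the cached commit 'date'; fall back to the epoch on any failure.

    `changeset_cache` is a dict like the property reads from `self.changeset_cache`.
    """
    empty_date = datetime.datetime.fromtimestamp(0)
    date_latest = changeset_cache.get('date', empty_date)
    try:
        return datetime.datetime.strptime(date_latest, '%Y-%m-%d %H:%M:%S')
    except (TypeError, ValueError):
        # missing, malformed, or already-a-datetime value: degrade gracefully
        return empty_date
```

The epoch fallback keeps sorting by "last change" total even for repositories whose cache was never populated.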
@@ -2262,6 +2292,7 b' class Repository(Base, BaseModel):' | |||
|
2262 | 2292 | """ |
|
2263 | 2293 | Update cache of last changeset for repository, keys should be:: |
|
2264 | 2294 | |
|
2295 | source_repo_id | |
|
2265 | 2296 | short_id |
|
2266 | 2297 | raw_id |
|
2267 | 2298 | revision |
@@ -2269,15 +2300,15 b' class Repository(Base, BaseModel):' | |||
|
2269 | 2300 | message |
|
2270 | 2301 | date |
|
2271 | 2302 | author |
|
2272 | ||
|
2273 | :param cs_cache: | |
|
2303 | updated_on | |
|
2304 | ||
|
2274 | 2305 | """ |
|
2275 | 2306 | from rhodecode.lib.vcs.backends.base import BaseChangeset |
|
2276 | 2307 | if cs_cache is None: |
|
2277 | 2308 | # use no-cache version here |
|
2278 | 2309 | scm_repo = self.scm_instance(cache=False, config=config) |
|
2279 | 2310 | |
|
2280 | empty = |
|
2311 | empty = scm_repo is None or scm_repo.is_empty() | |
|
2281 | 2312 | if not empty: |
|
2282 | 2313 | cs_cache = scm_repo.get_commit( |
|
2283 | 2314 | pre_load=["author", "date", "message", "parents"]) |
@@ -2297,18 +2328,28 b' class Repository(Base, BaseModel):' | |||
|
2297 | 2328 | if is_outdated(cs_cache) or not self.changeset_cache: |
|
2298 | 2329 | _default = datetime.datetime.utcnow() |
|
2299 | 2330 | last_change = cs_cache.get('date') or _default |
|
2300 | if self.updated_on and self.updated_on > last_change: | |
|
2301 | # we check if last update is newer than the new value | |
|
2302 | # if yes, we use the current timestamp instead. Imagine you get | |
|
2303 | # old commit pushed 1y ago, we'd set last update 1y to ago. | |
|
2304 | last_change = _default | |
|
2305 | log.debug('updated repo %s with new cs cache %s', | |
|
2306 | self.repo_name, cs_cache) | |
|
2307 | self.updated_on = last_change | |
|
2331 | # we check if last update is newer than the new value | |
|
2332 | # if yes, we use the current timestamp instead. Imagine you get | |
|
2333 | # old commit pushed 1y ago, we'd set last update 1y to ago. | |
|
2334 | last_change_timestamp = datetime_to_time(last_change) | |
|
2335 | current_timestamp = datetime_to_time(last_change) | |
|
2336 | if last_change_timestamp > current_timestamp: | |
|
2337 | cs_cache['date'] = _default | |
|
2338 | ||
|
2339 | cs_cache['updated_on'] = time.time() | |
|
2308 | 2340 | self.changeset_cache = cs_cache |
|
2309 | 2341 | Session().add(self) |
|
2310 | 2342 | Session().commit() |
|
2343 | ||
|
2344 | log.debug('updated repo %s with new commit cache %s', | |
|
2345 | self.repo_name, cs_cache) | |
|
2311 | 2346 | else: |
|
2347 | cs_cache = self.changeset_cache | |
|
2348 | cs_cache['updated_on'] = time.time() | |
|
2349 | self.changeset_cache = cs_cache | |
|
2350 | Session().add(self) | |
|
2351 | Session().commit() | |
|
2352 | ||
|
2312 | 2353 | log.debug('Skipping update_commit_cache for repo:`%s` ' |
|
2313 | 2354 | 'commit already with latest changes', self.repo_name) |
|
2314 | 2355 | |
@@ -2397,6 +2438,7 b' class Repository(Base, BaseModel):' | |||
|
2397 | 2438 | # control over cache behaviour |
|
2398 | 2439 | if cache is None and full_cache and not config: |
|
2399 | 2440 | return self._get_instance_cached() |
|
2441 | # cache here is sent to the "vcs server" | |
|
2400 | 2442 | return self._get_instance(cache=bool(cache), config=config) |
|
2401 | 2443 | |
|
2402 | 2444 | def _get_instance_cached(self): |
@@ -2425,8 +2467,7 b' class Repository(Base, BaseModel):' | |||
|
2425 | 2467 | else: |
|
2426 | 2468 | instance = get_instance_cached(*args) |
|
2427 | 2469 | |
|
2428 | log.debug( | |
|
2429 | 'Repo instance fetched in %.3fs', inv_context_manager.compute_time) | |
|
2470 | log.debug('Repo instance fetched in %.3fs', inv_context_manager.compute_time) | |
|
2430 | 2471 | return instance |
|
2431 | 2472 | |
|
2432 | 2473 | def _get_instance(self, cache=True, config=None): |
@@ -2440,7 +2481,8 b' class Repository(Base, BaseModel):' | |||
|
2440 | 2481 | with_wire=custom_wire, |
|
2441 | 2482 | create=False, |
|
2442 | 2483 | _vcs_alias=self.repo_type) |
|
2443 | ||
|
2484 | if repo is not None: | |
|
2485 | repo.count() # cache rebuild | |
|
2444 | 2486 | return repo |
|
2445 | 2487 | |
|
2446 | 2488 | def __json__(self): |
@@ -2467,7 +2509,8 b' class RepoGroup(Base, BaseModel):' | |||
|
2467 | 2509 | CHOICES_SEPARATOR = '/' # used to generate select2 choices for nested groups |
|
2468 | 2510 | |
|
2469 | 2511 | group_id = Column("group_id", Integer(), nullable=False, unique=True, default=None, primary_key=True) |
|
2470 | group_name = Column("group_name", String(255), nullable=False, unique=True, default=None) | |
|
2512 | _group_name = Column("group_name", String(255), nullable=False, unique=True, default=None) | |
|
2513 | group_name_hash = Column("repo_group_name_hash", String(1024), nullable=False, unique=False) | |
|
2471 | 2514 | group_parent_id = Column("group_parent_id", Integer(), ForeignKey('groups.group_id'), nullable=True, unique=None, default=None) |
|
2472 | 2515 | group_description = Column("group_description", String(10000), nullable=True, unique=None, default=None) |
|
2473 | 2516 | enable_locking = Column("enable_locking", Boolean(), nullable=False, unique=None, default=False) |
@@ -2475,6 +2518,8 b' class RepoGroup(Base, BaseModel):' | |||
|
2475 | 2518 | created_on = Column('created_on', DateTime(timezone=False), nullable=False, default=datetime.datetime.now) |
|
2476 | 2519 | updated_on = Column('updated_on', DateTime(timezone=False), nullable=True, unique=None, default=datetime.datetime.now) |
|
2477 | 2520 | personal = Column('personal', Boolean(), nullable=True, unique=None, default=None) |
|
2521 | _changeset_cache = Column( | |
|
2522 | "changeset_cache", LargeBinary(), nullable=True) # JSON data | |
|
2478 | 2523 | |
|
2479 | 2524 | repo_group_to_perm = relationship('UserRepoGroupToPerm', cascade='all', order_by='UserRepoGroupToPerm.group_to_perm_id') |
|
2480 | 2525 | users_group_to_perm = relationship('UserGroupRepoGroupToPerm', cascade='all') |
@@ -2490,6 +2535,38 b' class RepoGroup(Base, BaseModel):' | |||
|
2490 | 2535 | return u"<%s('id:%s:%s')>" % ( |
|
2491 | 2536 | self.__class__.__name__, self.group_id, self.group_name) |
|
2492 | 2537 | |
|
2538 | @hybrid_property | |
|
2539 | def group_name(self): | |
|
2540 | return self._group_name | |
|
2541 | ||
|
2542 | @group_name.setter | |
|
2543 | def group_name(self, value): | |
|
2544 | self._group_name = value | |
|
2545 | self.group_name_hash = self.hash_repo_group_name(value) | |
|
2546 | ||
|
2547 | @hybrid_property | |
|
2548 | def changeset_cache(self): | |
|
2549 | from rhodecode.lib.vcs.backends.base import EmptyCommit | |
|
2550 | dummy = EmptyCommit().__json__() | |
|
2551 | if not self._changeset_cache: | |
|
2552 | dummy['source_repo_id'] = '' | |
|
2553 | return json.loads(json.dumps(dummy)) | |
|
2554 | ||
|
2555 | try: | |
|
2556 | return json.loads(self._changeset_cache) | |
|
2557 | except TypeError: | |
|
2558 | return dummy | |
|
2559 | except Exception: | |
|
2560 | log.error(traceback.format_exc()) | |
|
2561 | return dummy | |
|
2562 | ||
|
2563 | @changeset_cache.setter | |
|
2564 | def changeset_cache(self, val): | |
|
2565 | try: | |
|
2566 | self._changeset_cache = json.dumps(val) | |
|
2567 | except Exception: | |
|
2568 | log.error(traceback.format_exc()) | |
|
2569 | ||
|
2493 | 2570 | @validates('group_parent_id') |
|
2494 | 2571 | def validate_group_parent_id(self, key, val): |
|
2495 | 2572 | """ |
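The `changeset_cache` hybrid property added above stores the last-commit cache as a JSON blob in a `LargeBinary` column and falls back to an `EmptyCommit`-shaped dummy whenever the column is empty or unparsable. A plain-Python sketch of that JSON-backed property (the `EMPTY_COMMIT` dict is a simplified stand-in for `EmptyCommit().__json__()`):

```python
import json
import logging

log = logging.getLogger(__name__)

# simplified stand-in for EmptyCommit().__json__()
EMPTY_COMMIT = {'short_id': '', 'raw_id': '', 'revision': -1,
                'message': '', 'date': None, 'author': ''}


class CachedGroup(object):
    """Sketch of the JSON-backed `changeset_cache` hybrid property."""

    def __init__(self):
        self._changeset_cache = None  # stands in for the LargeBinary column

    @property
    def changeset_cache(self):
        dummy = dict(EMPTY_COMMIT, source_repo_id='')
        if not self._changeset_cache:
            return dummy
        try:
            return json.loads(self._changeset_cache)
        except (TypeError, ValueError):
            # corrupt or non-JSON payload: degrade to the dummy entry
            return dummy

    @changeset_cache.setter
    def changeset_cache(self, val):
        try:
            self._changeset_cache = json.dumps(val)
        except (TypeError, ValueError):
            log.error('could not serialize cache %r', val)
```

Keeping the getter exception-safe matters here because the column is written by background cache updates and read on every summary page render.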
@@ -2506,6 +2583,18 b' class RepoGroup(Base, BaseModel):' | |||
|
2506 | 2583 | return h.escape(self.group_description) |
|
2507 | 2584 | |
|
2508 | 2585 | @classmethod |
|
2586 | def hash_repo_group_name(cls, repo_group_name): | |
|
2587 | val = remove_formatting(repo_group_name) | |
|
2588 | val = safe_str(val).lower() | |
|
2589 | chars = [] | |
|
2590 | for c in val: | |
|
2591 | if c not in string.ascii_letters: | |
|
2592 | c = str(ord(c)) | |
|
2593 | chars.append(c) | |
|
2594 | ||
|
2595 | return ''.join(chars) | |
|
2596 | ||
|
2597 | @classmethod | |
|
2509 | 2598 | def _generate_choice(cls, repo_group): |
|
2510 | 2599 | from webhelpers.html import literal as _literal |
|
2511 | 2600 | _name = lambda k: _literal(cls.CHOICES_SEPARATOR.join(k)) |
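The new `hash_repo_group_name` classmethod above derives a lookup-safe token from a group name: lowercase it, then replace every character that is not an ASCII letter with its decimal code point. A stand-alone sketch (omitting the initial `remove_formatting` markup-stripping step from webhelpers):

```python
import string


def hash_repo_group_name(repo_group_name):
    """Lowercase the name, then replace each non-ASCII-letter character
    with its decimal code point, as the new classmethod does.
    (The original first strips markup via webhelpers' remove_formatting;
    that step is omitted in this sketch.)"""
    val = repo_group_name.lower()
    chars = []
    for c in val:
        if c not in string.ascii_letters:
            c = str(ord(c))
        chars.append(c)
    return ''.join(chars)
```

For example, `/` becomes `47` and a space becomes `32`, so nested group paths still map to a flat, collision-resistant-enough string for the new `repo_group_name_hash` column.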
@@ -2573,8 +2662,7 b' class RepoGroup(Base, BaseModel):' | |||
|
2573 | 2662 | return q.all() |
|
2574 | 2663 | |
|
2575 | 2664 | @property |
|
2576 | def parents(self): | |
|
2577 | parents_recursion_limit = 10 | |
|
2665 | def parents(self, parents_recursion_limit = 10): | |
|
2578 | 2666 | groups = [] |
|
2579 | 2667 | if self.parent_group is None: |
|
2580 | 2668 | return groups |
@@ -2597,6 +2685,20 b' class RepoGroup(Base, BaseModel):' | |||
|
2597 | 2685 | return groups |
|
2598 | 2686 | |
|
2599 | 2687 | @property |
|
2688 | def last_commit_cache_update_diff(self): | |
|
2689 | return time.time() - (safe_int(self.changeset_cache.get('updated_on')) or 0) | |
|
2690 | ||
|
2691 | @property | |
|
2692 | def last_commit_change(self): | |
|
2693 | from rhodecode.lib.vcs.utils.helpers import parse_datetime | |
|
2694 | empty_date = datetime.datetime.fromtimestamp(0) | |
|
2695 | date_latest = self.changeset_cache.get('date', empty_date) | |
|
2696 | try: | |
|
2697 | return parse_datetime(date_latest) | |
|
2698 | except Exception: | |
|
2699 | return empty_date | |
|
2700 | ||
|
2701 | @property | |
|
2600 | 2702 | def last_db_change(self): |
|
2601 | 2703 | return self.updated_on |
|
2602 | 2704 | |
@@ -2635,7 +2737,7 b' class RepoGroup(Base, BaseModel):' | |||
|
2635 | 2737 | |
|
2636 | 2738 | return cnt + children_count(self) |
|
2637 | 2739 | |
|
2638 | def _recursive_objects(self, include_repos=True): | |
|
2740 | def _recursive_objects(self, include_repos=True, include_groups=True): | |
|
2639 | 2741 | all_ = [] |
|
2640 | 2742 | |
|
2641 | 2743 | def _get_members(root_gr): |
@@ -2645,11 +2747,16 b' class RepoGroup(Base, BaseModel):' | |||
|
2645 | 2747 | childs = root_gr.children.all() |
|
2646 | 2748 | if childs: |
|
2647 | 2749 | for gr in childs: |
|
2648 | all_.append(gr) |
|
2750 | if include_groups: | |
|
2751 | all_.append(gr) | |
|
2649 | 2752 | _get_members(gr) |
|
2650 | 2753 | |
|
2754 | root_group = [] | |
|
2755 | if include_groups: | |
|
2756 | root_group = [self] | |
|
2757 | ||
|
2651 | 2758 | _get_members(self) |
|
2652 | return [self] + all_ |
|
2759 | return root_group + all_ | |
|
2653 | 2760 | |
|
2654 | 2761 | def recursive_groups_and_repos(self): |
|
2655 | 2762 | """ |
@@ -2663,6 +2770,12 b' class RepoGroup(Base, BaseModel):' | |||
|
2663 | 2770 | """ |
|
2664 | 2771 | return self._recursive_objects(include_repos=False) |
|
2665 | 2772 | |
|
2773 | def recursive_repos(self): | |
|
2774 | """ | |
|
2775 | Returns all children repositories for this group | |
|
2776 | """ | |
|
2777 | return self._recursive_objects(include_groups=False) | |
|
2778 | ||
|
2666 | 2779 | def get_new_name(self, group_name): |
|
2667 | 2780 | """ |
|
2668 | 2781 | returns new full group name based on parent and new name |
@@ -2673,6 +2786,63 b' class RepoGroup(Base, BaseModel):' | |||
|
2673 | 2786 | self.parent_group else []) |
|
2674 | 2787 | return RepoGroup.url_sep().join(path_prefix + [group_name]) |
|
2675 | 2788 | |
|
2789 | def update_commit_cache(self, config=None): | |
|
2790 | """ | |
|
2791 | Update cache of last changeset for newest repository inside this group, keys should be:: | |
|
2792 | ||
|
2793 | source_repo_id | |
|
2794 | short_id | |
|
2795 | raw_id | |
|
2796 | revision | |
|
2797 | parents | |
|
2798 | message | |
|
2799 | date | |
|
2800 | author | |
|
2801 | ||
|
2802 | """ | |
|
2803 | from rhodecode.lib.vcs.utils.helpers import parse_datetime | |
|
2804 | ||
|
2805 | def repo_groups_and_repos(): | |
|
2806 | all_entries = OrderedDefaultDict(list) | |
|
2807 | ||
|
2808 | def _get_members(root_gr, pos=0): | |
|
2809 | ||
|
2810 | for repo in root_gr.repositories: | |
|
2811 | all_entries[root_gr].append(repo) | |
|
2812 | ||
|
2813 | # fill in all parent positions | |
|
2814 | for parent_group in root_gr.parents: | |
|
2815 | all_entries[parent_group].extend(all_entries[root_gr]) | |
|
2816 | ||
|
2817 | children_groups = root_gr.children.all() | |
|
2818 | if children_groups: | |
|
2819 | for cnt, gr in enumerate(children_groups, 1): | |
|
2820 | _get_members(gr, pos=pos+cnt) | |
|
2821 | ||
|
2822 | _get_members(root_gr=self) | |
|
2823 | return all_entries | |
|
2824 | ||
|
2825 | empty_date = datetime.datetime.fromtimestamp(0) | |
|
2826 | for repo_group, repos in repo_groups_and_repos().items(): | |
|
2827 | ||
|
2828 | latest_repo_cs_cache = {} | |
|
2829 | for repo in repos: | |
|
2830 | repo_cs_cache = repo.changeset_cache | |
|
2831 | date_latest = latest_repo_cs_cache.get('date', empty_date) | |
|
2832 | date_current = repo_cs_cache.get('date', empty_date) | |
|
2833 | current_timestamp = datetime_to_time(parse_datetime(date_latest)) | |
|
2834 | if current_timestamp < datetime_to_time(parse_datetime(date_current)): | |
|
2835 | latest_repo_cs_cache = repo_cs_cache | |
|
2836 | latest_repo_cs_cache['source_repo_id'] = repo.repo_id | |
|
2837 | ||
|
2838 | latest_repo_cs_cache['updated_on'] = time.time() | |
|
2839 | repo_group.changeset_cache = latest_repo_cs_cache | |
|
2840 | Session().add(repo_group) | |
|
2841 | Session().commit() | |
|
2842 | ||
|
2843 | log.debug('updated repo group %s with new commit cache %s', | |
|
2844 | repo_group.group_name, latest_repo_cs_cache) | |
|
2845 | ||
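The core of the new `RepoGroup.update_commit_cache` above is the selection step: for each group, walk the repositories beneath it and keep the changeset cache with the newest parseable commit date, tagging it with `source_repo_id`. A sketch of that selection under simplifying assumptions (`strptime` stands in for rhodecode's `parse_datetime`, and a plain dict stands in for the group/repo traversal):

```python
import datetime

EMPTY_DATE = datetime.datetime.fromtimestamp(0)


def pick_latest_cache(repo_caches):
    """Return (repo_id, cache) for the newest commit cache among `repo_caches`,
    a mapping of repo id -> changeset_cache dict, as the group update loop does.
    Unparseable dates compare as the epoch, so they never win."""

    def parse(value):
        if isinstance(value, datetime.datetime):
            return value
        try:
            return datetime.datetime.strptime(value, '%Y-%m-%d %H:%M:%S')
        except (TypeError, ValueError):
            return EMPTY_DATE

    latest_id, latest_cache = None, {}
    for repo_id, cache in repo_caches.items():
        date_current = parse(cache.get('date', EMPTY_DATE))
        date_latest = parse(latest_cache.get('date', EMPTY_DATE))
        if date_current > date_latest:
            # remember which repo supplied the winning commit
            latest_id, latest_cache = repo_id, dict(cache, source_repo_id=repo_id)
    return latest_id, latest_cache
```

The real method additionally propagates each group's winner up through its parent groups and stamps `updated_on` with the current wall-clock time before committing.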
|
2676 | 2846 | def permissions(self, with_admins=True, with_owner=True, |
|
2677 | 2847 | expand_from_user_groups=False): |
|
2678 | 2848 | """ |
@@ -2768,6 +2938,13 b' class RepoGroup(Base, BaseModel):' | |||
|
2768 | 2938 | } |
|
2769 | 2939 | return data |
|
2770 | 2940 | |
|
2941 | def get_dict(self): | |
|
2942 | # Since we transformed `group_name` to a hybrid property, we need to | |
|
2943 | # keep compatibility with the code which uses `group_name` field. | |
|
2944 | result = super(RepoGroup, self).get_dict() | |
|
2945 | result['group_name'] = result.pop('_group_name', None) | |
|
2946 | return result | |
|
2947 | ||
|
2771 | 2948 | |
|
2772 | 2949 | class Permission(Base, BaseModel): |
|
2773 | 2950 | __tablename__ = 'permissions' |
@@ -4258,9 +4435,16 b' class Gist(Base, BaseModel):' | |||
|
4258 | 4435 | # SCM functions |
|
4259 | 4436 | |
|
4260 | 4437 | def scm_instance(self, **kwargs): |
|
4438 | """ | |
|
4439 | Get an instance of VCS Repository | |
|
4440 | ||
|
4441 | :param kwargs: | |
|
4442 | """ | |
|
4443 | from rhodecode.model.gist import GistModel | |
|
4261 | 4444 | full_repo_path = os.path.join(self.base_path(), self.gist_access_id) |
|
4262 | 4445 | return get_vcs_instance( |
|
4263 | repo_path=safe_str(full_repo_path), create=False) |
|
4446 | repo_path=safe_str(full_repo_path), create=False, | |
|
4447 | _vcs_alias=GistModel.vcs_backend) | |
|
4264 | 4448 | |
|
4265 | 4449 | |
|
4266 | 4450 | class ExternalIdentity(Base, BaseModel): |
@@ -4904,8 +5088,8 b' class FileStore(Base, BaseModel):' | |||
|
4904 | 5088 | |
|
4905 | 5089 | @classmethod |
|
4906 | 5090 | def create(cls, file_uid, filename, file_hash, file_size, file_display_name='', |
|
4907 | file_description='', enabled=True, check_acl=True, | |
|
4908 | user_id=None, scope_repo_id=None, scope_repo_group_id=None): | |
|
5091 | file_description='', enabled=True, check_acl=True, user_id=None, | |
|
5092 | scope_user_id=None, scope_repo_id=None, scope_repo_group_id=None): | |
|
4909 | 5093 | |
|
4910 | 5094 | store_entry = FileStore() |
|
4911 | 5095 | store_entry.file_uid = file_uid |
@@ -4919,6 +5103,7 b' class FileStore(Base, BaseModel):' | |||
|
4919 | 5103 | store_entry.enabled = enabled |
|
4920 | 5104 | |
|
4921 | 5105 | store_entry.user_id = user_id |
|
5106 | store_entry.scope_user_id = scope_user_id | |
|
4922 | 5107 | store_entry.scope_repo_id = scope_repo_id |
|
4923 | 5108 | store_entry.scope_repo_group_id = scope_repo_group_id |
|
4924 | 5109 | return store_entry |
@@ -618,7 +618,7 b' def PullRequestForm(localizer, repo_id):' | |||
|
618 | 618 | revisions = All(#v.NotReviewedRevisions(localizer, repo_id)(), |
|
619 | 619 | v.UniqueList(localizer)(not_empty=True)) |
|
620 | 620 | review_members = formencode.ForEach(ReviewerForm()) |
|
621 | pullrequest_title = v.UnicodeString(strip=True, required=True, min= |
|
621 | pullrequest_title = v.UnicodeString(strip=True, required=True, min=1, max=255) | |
|
622 | 622 | pullrequest_desc = v.UnicodeString(strip=True, required=False) |
|
623 | 623 | description_renderer = v.UnicodeString(strip=True, required=False) |
|
624 | 624 |
@@ -47,6 +47,7 b" GIST_METADATA_FILE = '.rc_gist_metadata'" | |||
|
47 | 47 | |
|
48 | 48 | class GistModel(BaseModel): |
|
49 | 49 | cls = Gist |
|
50 | vcs_backend = 'hg' | |
|
50 | 51 | |
|
51 | 52 | def _get_gist(self, gist): |
|
52 | 53 | """ |
@@ -145,7 +146,7 b' class GistModel(BaseModel):' | |||
|
145 | 146 | gist_repo_path = os.path.join(GIST_STORE_LOC, gist_id) |
|
146 | 147 | log.debug('Creating new %s GIST repo in %s', gist_type, gist_repo_path) |
|
147 | 148 | repo = RepoModel()._create_filesystem_repo( |
|
148 | repo_name=gist_id, repo_type='hg', repo_group=GIST_STORE_LOC, |
|
149 | repo_name=gist_id, repo_type=self.vcs_backend, repo_group=GIST_STORE_LOC, | |
|
149 | 150 | use_global_config=True) |
|
150 | 151 | |
|
151 | 152 | # now create single multifile commit |
@@ -683,6 +683,7 b' class PullRequestModel(BaseModel):' | |||
|
683 | 683 | |
|
684 | 684 | # source repo |
|
685 | 685 | source_repo = pull_request.source_repo.scm_instance() |
|
686 | ||
|
686 | 687 | try: |
|
687 | 688 | source_commit = source_repo.get_commit(commit_id=source_ref_name) |
|
688 | 689 | except CommitDoesNotExistError: |
@@ -696,6 +697,7 b' class PullRequestModel(BaseModel):' | |||
|
696 | 697 | |
|
697 | 698 | # target repo |
|
698 | 699 | target_repo = pull_request.target_repo.scm_instance() |
|
700 | ||
|
699 | 701 | try: |
|
700 | 702 | target_commit = target_repo.get_commit(commit_id=target_ref_name) |
|
701 | 703 | except CommitDoesNotExistError: |
@@ -752,8 +754,8 b' class PullRequestModel(BaseModel):' | |||
|
752 | 754 | target_commit.raw_id, source_commit.raw_id, source_repo, merge=True, |
|
753 | 755 | pre_load=pre_load) |
|
754 | 756 | |
|
755 | ancestor = source_repo.get_common_ancestor( |
|
756 | target_commit.raw_id, source_commit.raw_id, source_repo) |
|
757 | ancestor = source_repo.get_common_ancestor( | |
|
758 | source_commit.raw_id, target_commit.raw_id, target_repo) | |
|
757 | 759 | |
|
758 | 760 | pull_request.source_ref = '%s:%s:%s' % ( |
|
759 | 761 | source_ref_type, source_ref_name, source_commit.raw_id) |
@@ -1314,10 +1316,21 b' class PullRequestModel(BaseModel):' | |||
|
1314 | 1316 | merge_state = self._refresh_merge_state( |
|
1315 | 1317 | pull_request, target_vcs, target_ref) |
|
1316 | 1318 | else: |
|
1317 | possible = pull_request.\ |
|
1318 | last_merge_status == MergeFailureReason.NONE | |
|
1319 | possible = pull_request.last_merge_status == MergeFailureReason.NONE | |
|
1320 | metadata = { | |
|
1321 | 'target_ref': pull_request.target_ref_parts, | |
|
1322 | 'source_ref': pull_request.source_ref_parts, | |
|
1323 | } | |
|
1324 | if not possible and target_ref.type == 'branch': | |
|
1325 | # NOTE(marcink): case for mercurial multiple heads on branch | |
|
1326 | heads = target_vcs._heads(target_ref.name) | |
|
1327 | if len(heads) != 1: | |
|
1328 | heads = '\n,'.join(target_vcs._heads(target_ref.name)) | |
|
1329 | metadata.update({ | |
|
1330 | 'heads': heads | |
|
1331 | }) | |
|
1319 | 1332 | merge_state = MergeResponse( |
|
1320 | possible, False, None, pull_request.last_merge_status) | |
|
1333 | possible, False, None, pull_request.last_merge_status, metadata=metadata) | |
|
1321 | 1334 | |
|
1322 | 1335 | return merge_state |
|
1323 | 1336 | |
@@ -1326,6 +1339,7 b' class PullRequestModel(BaseModel):' | |||
|
1326 | 1339 | name_or_id = reference.name |
|
1327 | 1340 | else: |
|
1328 | 1341 | name_or_id = reference.commit_id |
|
1342 | ||
|
1329 | 1343 | refreshed_commit = vcs_repository.get_commit(name_or_id) |
|
1330 | 1344 | refreshed_reference = Reference( |
|
1331 | 1345 | reference.type, reference.name, refreshed_commit.raw_id) |
@@ -192,14 +192,14 b' class RepoModel(BaseModel):' | |||
|
192 | 192 | return repo_log |
|
193 | 193 | |
|
194 | 194 | @classmethod |
|
195 | def update_repoinfo(cls, repositories=None): |
|
195 | def update_commit_cache(cls, repositories=None): | |
|
196 | 196 | if not repositories: |
|
197 | 197 | repositories = Repository.getAll() |
|
198 | 198 | for repo in repositories: |
|
199 | 199 | repo.update_commit_cache() |
|
200 | 200 | |
|
201 | 201 | def get_repos_as_dict(self, repo_list=None, admin=False, |
|
202 | super_user_actions=False): | |
|
202 | super_user_actions=False, short_name=None): | |
|
203 | 203 | _render = get_current_request().get_partial_renderer( |
|
204 | 204 | 'rhodecode:templates/data_table/_dt_elements.mako') |
|
205 | 205 | c = _render.get_call_context() |
@@ -208,8 +208,12 b' class RepoModel(BaseModel):' | |||
|
208 | 208 | return _render('quick_menu', repo_name) |
|
209 | 209 | |
|
210 | 210 | def repo_lnk(name, rtype, rstate, private, archived, fork_of): |
|
211 | if short_name is not None: | |
|
212 | short_name_var = short_name | |
|
213 | else: | |
|
214 | short_name_var = not admin | |
|
211 | 215 | return _render('repo_name', name, rtype, rstate, private, archived, fork_of, |
|
212 | short_name=not admin, admin=False) |
|
216 | short_name=short_name_var, admin=False) | |
|
213 | 217 | |
|
214 | 218 | def last_change(last_change): |
|
215 | 219 | if admin and isinstance(last_change, datetime.datetime) and not last_change.tzinfo: |
@@ -250,8 +254,8 b' class RepoModel(BaseModel):' | |||
|
250 | 254 | repo.private, repo.archived, repo.fork), |
|
251 | 255 | "name_raw": repo.repo_name.lower(), |
|
252 | 256 | |
|
253 | "last_change": last_change(repo.last_change), |
|
254 | "last_change_raw": datetime_to_time(repo.last_change), |
|
257 | "last_change": last_change(repo.last_commit_change), | |
|
258 | "last_change_raw": datetime_to_time(repo.last_commit_change), | |
|
255 | 259 | |
|
256 | 260 | "last_changeset": last_rev(repo.repo_name, cs_cache), |
|
257 | 261 | "last_changeset_raw": cs_cache.get('revision'), |
@@ -97,6 +97,8 b' class RepoGroupModel(BaseModel):' | |||
|
97 | 97 | return string.Template(template).safe_substitute( |
|
98 | 98 | username=user.username, |
|
99 | 99 | user_id=user.user_id, |
|
100 | first_name=user.first_name, | |
|
101 | last_name=user.last_name, | |
|
100 | 102 | ) |
|
101 | 103 | |
|
102 | 104 | def create_personal_repo_group(self, user, commit_early=True): |
@@ -307,6 +309,10 b' class RepoGroupModel(BaseModel):' | |||
|
307 | 309 | # trigger the post hook |
|
308 | 310 | from rhodecode.lib.hooks_base import log_create_repository_group |
|
309 | 311 | repo_group = RepoGroup.get_by_group_name(group_name) |
|
312 | ||
|
313 | # update repo group commit caches initially | |
|
314 | repo_group.update_commit_cache() | |
|
315 | ||
|
310 | 316 | log_create_repository_group( |
|
311 | 317 | created_by=user.username, **repo_group.get_dict()) |
|
312 | 318 | |
@@ -684,6 +690,13 b' class RepoGroupModel(BaseModel):' | |||
|
684 | 690 | 'revoked permission from usergroup: {} on repogroup: {}'.format( |
|
685 | 691 | group_name, repo_group), namespace='security.repogroup') |
|
686 | 692 | |
|
693 | @classmethod | |
|
694 | def update_commit_cache(cls, repo_groups=None): | |
|
695 | if not repo_groups: | |
|
696 | repo_groups = RepoGroup.getAll() | |
|
697 | for repo_group in repo_groups: | |
|
698 | repo_group.update_commit_cache() | |
|
699 | ||
|
687 | 700 | def get_repo_groups_as_dict(self, repo_group_list=None, admin=False, |
|
688 | 701 | super_user_actions=False): |
|
689 | 702 | |
@@ -705,6 +718,11 b' class RepoGroupModel(BaseModel):' | |||
|
705 | 718 | (datetime.datetime.now() - datetime.datetime.utcnow()).seconds) |
|
706 | 719 | return _render("last_change", last_change) |
|
707 | 720 | |
|
721 | def last_rev(repo_name, cs_cache): | |
|
722 | return _render('revision', repo_name, cs_cache.get('revision'), | |
|
723 | cs_cache.get('raw_id'), cs_cache.get('author'), | |
|
724 | cs_cache.get('message'), cs_cache.get('date')) | |
|
725 | ||
|
708 | 726 | def desc(desc, personal): |
|
709 | 727 | return _render( |
|
710 | 728 | 'repo_group_desc', desc, personal, c.visual.stylify_metatags) |
@@ -721,13 +739,19 b' class RepoGroupModel(BaseModel):' | |||
|
721 | 739 | |
|
722 | 740 | repo_group_data = [] |
|
723 | 741 | for group in repo_group_list: |
|
742 | cs_cache = group.changeset_cache | |
|
743 | last_repo_name = cs_cache.get('source_repo_name') | |
|
724 | 744 | |
|
725 | 745 | row = { |
|
726 | 746 | "menu": quick_menu(group.group_name), |
|
727 | 747 | "name": repo_group_lnk(group.group_name), |
|
728 | 748 | "name_raw": group.group_name, |
|
729 | "last_change": last_change(group.last_change), |
|
730 | "last_change_raw": datetime_to_time(group.last_change), |
|
749 | "last_change": last_change(group.last_commit_change), | |
|
750 | "last_change_raw": datetime_to_time(group.last_commit_change), | |
|
751 | ||
|
752 | "last_changeset": "", | |
|
753 | "last_changeset_raw": "", | |
|
754 | ||
|
731 | 755 | "desc": desc(group.description_safe, group.personal), |
|
732 | 756 | "top_level_repos": 0, |
|
733 | 757 | "owner": user_profile(group.user.username) |
@@ -818,6 +818,8 b' class ScmModel(BaseModel):' | |||
|
818 | 818 | repo_name=repo.repo_name, repo_alias=scm_instance.alias, |
|
819 | 819 | commit_ids=[tip.raw_id]) |
|
820 | 820 | |
|
821 | return tip | |
|
822 | ||
|
821 | 823 | def delete_nodes(self, user, repo, message, nodes, parent_commit=None, |
|
822 | 824 | author=None, trigger_push_hook=True): |
|
823 | 825 | """ |
@@ -119,6 +119,7 b' class SettingsModel(BaseModel):' | |||
|
119 | 119 | new_ui.ui_value = val |
|
120 | 120 | new_ui.ui_active = active |
|
121 | 121 | |
|
122 | repository_id = '' | |
|
122 | 123 | if self.repo: |
|
123 | 124 | repo = self._get_repo(self.repo) |
|
124 | 125 | repository_id = repo.repo_id |
@@ -440,26 +441,39 b' class VcsSettingsModel(object):' | |||
|
440 | 441 | HOOKS_SETTINGS = ( |
|
441 | 442 | ('hooks', 'changegroup.repo_size'), |
|
442 | 443 | ('hooks', 'changegroup.push_logger'), |
|
443 | ('hooks', 'outgoing.pull_logger'), |
|
444 | ('hooks', 'outgoing.pull_logger'), | |
|
445 | ) | |
|
444 | 446 | HG_SETTINGS = ( |
|
445 | 447 | ('extensions', 'largefiles'), |
|
446 | 448 | ('phases', 'publish'), |
|
447 | ('extensions', 'evolve'), |
|
449 | ('extensions', 'evolve'), | |
|
450 | ('extensions', 'topic'), | |
|
451 | ('experimental', 'evolution'), | |
|
452 | ('experimental', 'evolution.exchange'), | |
|
453 | ) | |
|
448 | 454 | GIT_SETTINGS = ( |
|
449 | ('vcs_git_lfs', 'enabled'), |
|
455 | ('vcs_git_lfs', 'enabled'), | |
|
456 | ) | |
|
450 | 457 | GLOBAL_HG_SETTINGS = ( |
|
451 | 458 | ('extensions', 'largefiles'), |
|
452 | 459 | ('largefiles', 'usercache'), |
|
453 | 460 | ('phases', 'publish'), |
|
454 | 461 | ('extensions', 'hgsubversion'), |
|
455 | ('extensions', 'evolve'), |
|
462 | ('extensions', 'evolve'), | |
|
463 | ('extensions', 'topic'), | |
|
464 | ('experimental', 'evolution'), | |
|
465 | ('experimental', 'evolution.exchange'), | |
|
466 | ) | |
|
467 | ||
|
456 | 468 | GLOBAL_GIT_SETTINGS = ( |
|
457 | 469 | ('vcs_git_lfs', 'enabled'), |
|
458 | ('vcs_git_lfs', 'store_location') |
|
470 | ('vcs_git_lfs', 'store_location') | |
|
471 | ) | |
|
459 | 472 | |
|
460 | 473 | GLOBAL_SVN_SETTINGS = ( |
|
461 | 474 | ('vcs_svn_proxy', 'http_requests_enabled'), |
|
462 | ('vcs_svn_proxy', 'http_server_url') |
|
475 | ('vcs_svn_proxy', 'http_server_url') | |
|
476 | ) | |
|
463 | 477 | |
|
464 | 478 | SVN_BRANCH_SECTION = 'vcs_svn_branch' |
|
465 | 479 | SVN_TAG_SECTION = 'vcs_svn_tag' |
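The settings tuples above now carry more entries than some unpack sites consume, which is why later hunks in this commit switch to slicing (`self.HG_SETTINGS[:3]`, `self.GLOBAL_HG_SETTINGS[:5]`). A minimal sketch of the reason, using the tuple values visible in this diff:

```python
# HG_SETTINGS as extended in this diff; the tuple now has six entries.
HG_SETTINGS = (
    ('extensions', 'largefiles'),
    ('phases', 'publish'),
    ('extensions', 'evolve'),
    ('extensions', 'topic'),
    ('experimental', 'evolution'),
    ('experimental', 'evolution.exchange'),
)

# The old three-name unpack would now raise "too many values to unpack",
# so the updated code slices off the first three entries before unpacking.
largefiles, phases, evolve = HG_SETTINGS[:3]
```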
@@ -574,12 +588,38 b' class VcsSettingsModel(object):' | |||
|
574 | 588 | def create_repo_svn_settings(self, data): |
|
575 | 589 | return self._create_svn_settings(self.repo_settings, data) |
|
576 | 590 | |
|
591 | def _set_evolution(self, settings, is_enabled): | |
|
592 | if is_enabled: | |
|
593 | # if evolve is active set evolution=all | |
|
594 | ||
|
595 | self._create_or_update_ui( | |
|
596 | settings, *('experimental', 'evolution'), value='all', | |
|
597 | active=True) | |
|
598 | self._create_or_update_ui( | |
|
599 | settings, *('experimental', 'evolution.exchange'), value='yes', | |
|
600 | active=True) | |
|
601 | # if evolve is active set topics server support | |
|
602 | self._create_or_update_ui( | |
|
603 | settings, *('extensions', 'topic'), value='', | |
|
604 | active=True) | |
|
605 | ||
|
606 | else: | |
|
607 | self._create_or_update_ui( | |
|
608 | settings, *('experimental', 'evolution'), value='', | |
|
609 | active=False) | |
|
610 | self._create_or_update_ui( | |
|
611 | settings, *('experimental', 'evolution.exchange'), value='no', | |
|
612 | active=False) | |
|
613 | self._create_or_update_ui( | |
|
614 | settings, *('extensions', 'topic'), value='', | |
|
615 | active=False) | |
|
616 | ||
|
577 | 617 | @assert_repo_settings |
|
578 | 618 | def create_or_update_repo_hg_settings(self, data): |
|
579 | 619 | largefiles, phases, evolve = \ |
|
580 | self.HG_SETTINGS | |
|
620 | self.HG_SETTINGS[:3] | |
|
581 | 621 | largefiles_key, phases_key, evolve_key = \ |
|
582 | self._get_settings_keys(self.HG_SETTINGS, data) | |
|
622 | self._get_settings_keys(self.HG_SETTINGS[:3], data) | |
|
583 | 623 | |
|
584 | 624 | self._create_or_update_ui( |
|
585 | 625 | self.repo_settings, *largefiles, value='', |
@@ -587,21 +627,22 b' class VcsSettingsModel(object):' | |||
|
587 | 627 | self._create_or_update_ui( |
|
588 | 628 | self.repo_settings, *evolve, value='', |
|
589 | 629 | active=data[evolve_key]) |
|
630 | self._set_evolution(self.repo_settings, is_enabled=data[evolve_key]) | |
|
631 | ||
|
590 | 632 | self._create_or_update_ui( |
|
591 | 633 | self.repo_settings, *phases, value=safe_str(data[phases_key])) |
|
592 | 634 | |
|
593 | 635 | def create_or_update_global_hg_settings(self, data): |
|
594 | 636 | largefiles, largefiles_store, phases, hgsubversion, evolve \ |
|
595 | = self.GLOBAL_HG_SETTINGS | |
|
637 | = self.GLOBAL_HG_SETTINGS[:5] | |
|
596 | 638 | largefiles_key, largefiles_store_key, phases_key, subversion_key, evolve_key \ |
|
597 | = self._get_settings_keys(self.GLOBAL_HG_SETTINGS, data) | |
|
639 | = self._get_settings_keys(self.GLOBAL_HG_SETTINGS[:5], data) | |
|
598 | 640 | |
|
599 | 641 | self._create_or_update_ui( |
|
600 | 642 | self.global_settings, *largefiles, value='', |
|
601 | 643 | active=data[largefiles_key]) |
|
602 | 644 | self._create_or_update_ui( |
|
603 | self.global_settings, *largefiles_store, | |
|
604 | value=data[largefiles_store_key]) | |
|
645 | self.global_settings, *largefiles_store, value=data[largefiles_store_key]) | |
|
605 | 646 | self._create_or_update_ui( |
|
606 | 647 | self.global_settings, *phases, value=safe_str(data[phases_key])) |
|
607 | 648 | self._create_or_update_ui( |
@@ -609,9 +650,10 b' class VcsSettingsModel(object):' | |||
|
609 | 650 | self._create_or_update_ui( |
|
610 | 651 | self.global_settings, *evolve, value='', |
|
611 | 652 | active=data[evolve_key]) |
|
653 | self._set_evolution(self.global_settings, is_enabled=data[evolve_key]) | |
|
612 | 654 | |
|
613 | 655 | def create_or_update_repo_git_settings(self, data): |
|
614 | # NOTE(marcink): # comma make unpack work properly | |
|
656 | # NOTE(marcink): # comma makes unpack work properly | |
|
615 | 657 | lfs_enabled, \ |
|
616 | 658 | = self.GIT_SETTINGS |
|
617 | 659 | |
@@ -675,6 +717,7 b' class VcsSettingsModel(object):' | |||
|
675 | 717 | def get_repo_ui_settings(self, section=None, key=None): |
|
676 | 718 | global_uis = self.global_settings.get_ui(section, key) |
|
677 | 719 | repo_uis = self.repo_settings.get_ui(section, key) |
|
720 | ||
|
678 | 721 | filtered_repo_uis = self._filter_ui_settings(repo_uis) |
|
679 | 722 | filtered_repo_uis_keys = [ |
|
680 | 723 | (s.section, s.key) for s in filtered_repo_uis] |
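The `_set_evolution` helper added above toggles three related Mercurial settings as one unit: the `experimental.evolution` flags, marker exchange, and server-side `topic` support. A standalone sketch of that mapping (hypothetical function name; the real method writes each pair through `_create_or_update_ui` instead of returning a dict):

```python
def evolution_settings(is_enabled):
    """Map the evolve toggle to the (section, key) -> (value, active)
    pairs written by _set_evolution in the diff above."""
    if is_enabled:
        # evolve on: enable all evolution features, allow marker
        # exchange, and activate server-side topic support
        return {
            ('experimental', 'evolution'): ('all', True),
            ('experimental', 'evolution.exchange'): ('yes', True),
            ('extensions', 'topic'): ('', True),
        }
    # evolve off: deactivate the same three entries
    return {
        ('experimental', 'evolution'): ('', False),
        ('experimental', 'evolution.exchange'): ('no', False),
        ('extensions', 'topic'): ('', False),
    }
```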
@@ -24,10 +24,15 b' import traceback' | |||
|
24 | 24 | import sshpubkeys |
|
25 | 25 | import sshpubkeys.exceptions |
|
26 | 26 | |
|
27 | from cryptography.hazmat.primitives.asymmetric import rsa | |
|
28 | from cryptography.hazmat.primitives import serialization as crypto_serialization | |
|
29 | from cryptography.hazmat.backends import default_backend as crypto_default_backend | |
|
30 | ||
|
27 | 31 | from rhodecode.model import BaseModel |
|
28 | 32 | from rhodecode.model.db import UserSshKeys |
|
29 | 33 | from rhodecode.model.meta import Session |
|
30 | 34 | |
|
35 | ||
|
31 | 36 | log = logging.getLogger(__name__) |
|
32 | 37 | |
|
33 | 38 | |
@@ -62,16 +67,24 b' class SshKeyModel(BaseModel):' | |||
|
62 | 67 | raise |
|
63 | 68 | |
|
64 | 69 | def generate_keypair(self, comment=None): |
|
65 | from Crypto.PublicKey import RSA | |
|
66 | ||
|
67 | key = RSA.generate(2048) | |
|
68 | private = key.exportKey('PEM') | |
|
69 | 70 | |
|
70 | pubkey = key.publickey() | |
|
71 | public = pubkey.exportKey('OpenSSH') | |
|
71 | key = rsa.generate_private_key( | |
|
72 | backend=crypto_default_backend(), | |
|
73 | public_exponent=65537, | |
|
74 | key_size=2048 | |
|
75 | ) | |
|
76 | private_key = key.private_bytes( | |
|
77 | crypto_serialization.Encoding.PEM, | |
|
78 | crypto_serialization.PrivateFormat.PKCS8, | |
|
79 | crypto_serialization.NoEncryption()) | |
|
80 | public_key = key.public_key().public_bytes( | |
|
81 | crypto_serialization.Encoding.OpenSSH, | |
|
82 | crypto_serialization.PublicFormat.OpenSSH | |
|
83 | ) | |
|
84 | ||
|
72 | 85 | if comment: |
|
73 | public = public + " " + comment | |
|
74 | return private, public | |
|
86 | public_key = public_key + " " + comment | |
|
87 | return private_key, public_key | |
|
75 | 88 | |
|
76 | 89 | def create(self, user, fingerprint, key_data, description): |
|
77 | 90 | """ |
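The `generate_keypair` rewrite above drops PyCrypto (`RSA.generate`) in favour of the `cryptography` package. A minimal standalone sketch of the same call sequence; unlike the hunk above it decodes both keys to text, and the function name is illustrative, not the model's API:

```python
from cryptography.hazmat.backends import default_backend
from cryptography.hazmat.primitives import serialization
from cryptography.hazmat.primitives.asymmetric import rsa


def generate_ssh_keypair(comment=None, key_size=2048):
    """Generate an RSA keypair; returns (private_pem, public_openssh) strings."""
    key = rsa.generate_private_key(
        backend=default_backend(),
        public_exponent=65537,
        key_size=key_size,
    )
    # PKCS#8 PEM for the private half, OpenSSH one-liner for the public half
    private_pem = key.private_bytes(
        serialization.Encoding.PEM,
        serialization.PrivateFormat.PKCS8,
        serialization.NoEncryption(),
    ).decode('ascii')
    public_openssh = key.public_key().public_bytes(
        serialization.Encoding.OpenSSH,
        serialization.PublicFormat.OpenSSH,
    ).decode('ascii')
    if comment:
        public_openssh = public_openssh + " " + comment
    return private_pem, public_openssh
```

Note the deliberate `NoEncryption()`: like the code in the diff, the private key is emitted unencrypted so it can be handed straight to an SSH client.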
@@ -14,11 +14,11 b' input[type="button"] {' | |||
|
14 | 14 | font-family: @text-light; |
|
15 | 15 | text-decoration: none; |
|
16 | 16 | text-shadow: none; |
|
17 | color: @grey |
|
17 | color: @grey2; | |
|
18 | 18 | background-color: white; |
|
19 | 19 | background-image: none; |
|
20 | 20 | border: none; |
|
21 | .border ( @border-thickness-buttons, @grey |
|
21 | .border ( @border-thickness-buttons, @grey5 ); | |
|
22 | 22 | .border-radius (@border-radius); |
|
23 | 23 | cursor: pointer; |
|
24 | 24 | white-space: nowrap; |
@@ -26,6 +26,10 b' input[type="button"] {' | |||
|
26 | 26 | -moz-transition: background .3s,color .3s; |
|
27 | 27 | -o-transition: background .3s,color .3s; |
|
28 | 28 | transition: background .3s,color .3s; |
|
29 | box-shadow: @button-shadow; | |
|
30 | -webkit-box-shadow: @button-shadow; | |
|
31 | ||
|
32 | ||
|
29 | 33 | |
|
30 | 34 | a { |
|
31 | 35 | display: block; |
@@ -44,8 +48,9 b' input[type="button"] {' | |||
|
44 | 48 | outline:none; |
|
45 | 49 | } |
|
46 | 50 | &:hover { |
|
47 | color: |
|
48 | background-color: @ |
|
51 | color: @rcdarkblue; | |
|
52 | background-color: @white; | |
|
53 | .border ( @border-thickness, @grey4 ); | |
|
49 | 54 | } |
|
50 | 55 | |
|
51 | 56 | .icon-remove-sign { |
@@ -70,26 +75,26 b' input[type="button"] {' | |||
|
70 | 75 | |
|
71 | 76 | |
|
72 | 77 | .btn-default { |
|
73 | |
|
78 | border: @border-thickness solid @grey5; | |
|
74 | 79 | background-image: none; |
|
75 | color: @r |
|
80 | color: @grey2; | |
|
76 | 81 | |
|
77 | 82 | a { |
|
78 | color: @r |
|
83 | color: @grey2; | |
|
79 | 84 | } |
|
80 | 85 | |
|
81 | 86 | &:hover, |
|
82 | 87 | &.active { |
|
83 | color: |
|
84 | background-color: @ |
|
85 | .border ( @border-thickness, @r |
|
88 | color: @rcdarkblue; | |
|
89 | background-color: @white; | |
|
90 | .border ( @border-thickness, @grey4 ); | |
|
86 | 91 | |
|
87 | 92 | a { |
|
88 | color: |
|
93 | color: @grey2; | |
|
89 | 94 | } |
|
90 | 95 | } |
|
91 | 96 | &:disabled { |
|
92 | .border ( @border-thickness-buttons, @grey |
|
97 | .border ( @border-thickness-buttons, @grey5 ); | |
|
93 | 98 | background-color: transparent; |
|
94 | 99 | } |
|
95 | 100 | } |
@@ -326,6 +331,7 b' input[type="submit"] {' | |||
|
326 | 331 | .border ( @border-thickness-buttons, @rcblue ); |
|
327 | 332 | background-color: @rcblue; |
|
328 | 333 | color: white; |
|
334 | opacity: 0.5; | |
|
329 | 335 | } |
|
330 | 336 | } |
|
331 | 337 | |
@@ -416,3 +422,23 b' input[type="reset"] {' | |||
|
416 | 422 | } |
|
417 | 423 | } |
|
418 | 424 | |
|
425 | ||
|
426 | .button-links { | |
|
427 | float: left; | |
|
428 | display: inline; | |
|
429 | margin: 0; | |
|
430 | padding-left: 0; | |
|
431 | list-style: none; | |
|
432 | text-align: right; | |
|
433 | ||
|
434 | li { | |
|
435 | ||
|
436 | ||
|
437 | } | |
|
438 | ||
|
439 | li.active { | |
|
440 | background-color: @grey6; | |
|
441 | .border ( @border-thickness, @grey4 ); | |
|
442 | } | |
|
443 | ||
|
444 | } |
@@ -404,12 +404,9 b' div.codeblock {' | |||
|
404 | 404 | |
|
405 | 405 | // TODO: johbo: Added interim to get rid of the margin around |
|
406 | 406 | // Select2 widgets. This needs further cleanup. |
|
407 | margin-top: @padding; | |
|
408 | ||
|
409 | 407 | overflow: auto; |
|
410 | 408 | padding: 0px; |
|
411 | border: @border-thickness solid @grey |
|
412 | background: @grey6; | |
|
409 | border: @border-thickness solid @grey6; | |
|
413 | 410 | .border-radius(@border-radius); |
|
414 | 411 | |
|
415 | 412 | #remove_gist { |
@@ -479,7 +476,7 b' div.codeblock {' | |||
|
479 | 476 | } |
|
480 | 477 | |
|
481 | 478 | .code-body { |
|
482 | padding: |
|
479 | padding: 0.8em 1em; | |
|
483 | 480 | background-color: #ffffff; |
|
484 | 481 | min-width: 100%; |
|
485 | 482 | box-sizing: border-box; |
@@ -492,6 +489,21 b' div.codeblock {' | |||
|
492 | 489 | height: auto; |
|
493 | 490 | width: 100%; |
|
494 | 491 | } |
|
492 | ||
|
493 | .markdown-block { | |
|
494 | padding: 1em 0; | |
|
495 | } | |
|
496 | } | |
|
497 | ||
|
498 | .codeblock-header { | |
|
499 | background: @grey7; | |
|
500 | height: 36px; | |
|
501 | } | |
|
502 | ||
|
503 | .path { | |
|
504 | border-bottom: 1px solid @grey6; | |
|
505 | padding: .65em 1em; | |
|
506 | height: 18px; | |
|
495 | 507 | } |
|
496 | 508 | } |
|
497 | 509 |
@@ -27,7 +27,7 b'' | |||
|
27 | 27 | |
|
28 | 28 | .CodeMirror-gutters { |
|
29 | 29 | border-right: 1px solid #ddd; |
|
30 | background-color: |
|
30 | background-color: white; | |
|
31 | 31 | white-space: nowrap; |
|
32 | 32 | } |
|
33 | 33 | .CodeMirror-linenumbers {} |
@@ -14,7 +14,7 b' tr.inline-comments div {' | |||
|
14 | 14 | max-width: 100%; |
|
15 | 15 | |
|
16 | 16 | p { |
|
17 | white-space: normal; |
|
17 | white-space: normal; | |
|
18 | 18 | } |
|
19 | 19 | |
|
20 | 20 | code, pre, .code, dd { |
@@ -227,7 +227,7 b' tr.inline-comments div {' | |||
|
227 | 227 | .delete-comment { |
|
228 | 228 | display: inline-block; |
|
229 | 229 | color: @rcblue; |
|
230 | ||
|
230 | ||
|
231 | 231 | &:hover { |
|
232 | 232 | cursor: pointer; |
|
233 | 233 | } |
@@ -377,13 +377,13 b' form.comment-form {' | |||
|
377 | 377 | position: relative; |
|
378 | 378 | width: 100%; |
|
379 | 379 | min-height: 42px; |
|
380 | ||
|
380 | ||
|
381 | 381 | .status_box, |
|
382 | 382 | .cancel-button { |
|
383 | 383 | float: left; |
|
384 | 384 | display: inline-block; |
|
385 | 385 | } |
|
386 | ||
|
386 | ||
|
387 | 387 | .action-buttons { |
|
388 | 388 | float: right; |
|
389 | 389 | display: inline-block; |
@@ -426,10 +426,10 b' form.comment-form {' | |||
|
426 | 426 | |
|
427 | 427 | .comment-form-login { |
|
428 | 428 | .comment-help { |
|
429 | padding: 0. |
|
429 | padding: 0.7em; //same as the button | |
|
430 | 430 | } |
|
431 | 431 | |
|
432 | div.clearfix { |
|
432 | div.clearfix { | |
|
433 | 433 | clear: both; |
|
434 | 434 | width: 100%; |
|
435 | 435 | display: block; |
@@ -38,7 +38,7 b'' | |||
|
38 | 38 | |
|
39 | 39 | .form-control { |
|
40 | 40 | width: 100%; |
|
41 | padding: 0. |
|
41 | padding: 0.7em; | |
|
42 | 42 | border: 1px solid #979797; |
|
43 | 43 | border-radius: 2px; |
|
44 | 44 | } |
@@ -60,13 +60,13 b' form.rcform {' | |||
|
60 | 60 | max-width: 500px; |
|
61 | 61 | margin: 0 0 @padding -@legend-width; |
|
62 | 62 | padding: 0 0 0 @legend-width; |
|
63 | ||
|
63 | ||
|
64 | 64 | .btn { |
|
65 | 65 | display: inline-block; |
|
66 | 66 | margin: 0 1em @padding 0; |
|
67 | 67 | } |
|
68 | 68 | } |
|
69 | ||
|
69 | ||
|
70 | 70 | input, |
|
71 | 71 | textarea { |
|
72 | 72 | float: left; |
@@ -113,7 +113,7 b' form.rcform {' | |||
|
113 | 113 | opacity: 0.5; |
|
114 | 114 | } |
|
115 | 115 | } |
|
116 | ||
|
116 | ||
|
117 | 117 | input[type="radio"]:not(#ie), |
|
118 | 118 | input[type="checkbox"]:not(#ie) { |
|
119 | 119 | // Hide the input, but have it still be clickable |
@@ -187,13 +187,13 b' form.rcform {' | |||
|
187 | 187 | filter: progid:DXImageTransform.Microsoft.Matrix(sizingMethod='auto expand', M11=0.7071067811865476, M12=-0.7071067811865475, M21=0.7071067811865475, M22=0.7071067811865476); /* IE6,IE7 */ |
|
188 | 188 | |
|
189 | 189 | -ms-filter: "progid:DXImageTransform.Microsoft.Matrix(SizingMethod='auto expand', M11=0.7071067811865476, M12=-0.7071067811865475, M21=0.7071067811865475, M22=0.7071067811865476)"; /* IE8 */ } |
|
190 | ||
|
190 | ||
|
191 | 191 | & + .label { |
|
192 | 192 | float: left; |
|
193 | 193 | margin-top: 5px |
|
194 | 194 | } |
|
195 | 195 | } |
|
196 | ||
|
196 | ||
|
197 | 197 | input[type=checkbox]:not(#ie) { |
|
198 | 198 | visibility: hidden; |
|
199 | 199 | &:checked + label:after { |
@@ -231,6 +231,11 b' form.rcform {' | |||
|
231 | 231 | |
|
232 | 232 | .drop-menu { |
|
233 | 233 | float: left; |
|
234 | ||
|
235 | & + .last-item { | |
|
236 | margin: 0; | |
|
237 | } | |
|
238 | ||
|
234 | 239 | margin: 0 @input-padding 0 0; |
|
235 | 240 | } |
|
236 | 241 | |
@@ -244,7 +249,7 b' form.rcform {' | |||
|
244 | 249 | .error-message { |
|
245 | 250 | margin-top: 5px; |
|
246 | 251 | } |
|
247 | ||
|
252 | ||
|
248 | 253 | input[type=submit] { |
|
249 | 254 | &:extend(.btn-primary); |
|
250 | 255 | |
@@ -271,14 +276,15 b' form.rcform {' | |||
|
271 | 276 | .badged-field { |
|
272 | 277 | .user-badge { |
|
273 | 278 | line-height: 25px; |
|
274 | padding: |
|
279 | padding: .4em; | |
|
275 | 280 | border-radius: @border-radius; |
|
276 | border-top: 1px solid @ |
|
277 | border-left: 1px solid @ |
|
278 | border-bottom: 1px solid @ |
|
281 | border-top: 1px solid @grey4; | |
|
282 | border-left: 1px solid @grey4; | |
|
283 | border-bottom: 1px solid @grey4; | |
|
279 | 284 | font-size: 14px; |
|
280 | 285 | font-style: normal; |
|
281 | 286 | color: @text-light; |
|
287 | background: @grey7; | |
|
282 | 288 | display: inline-block; |
|
283 | 289 | vertical-align: top; |
|
284 | 290 | cursor: default; |
|
1 | NO CONTENT: modified file | |
The requested commit or file is too big and content was truncated. Show full diff |
|
1 | NO CONTENT: modified file, binary diff hidden |
|
1 | NO CONTENT: file renamed from rhodecode/public/images/rhodecode-logo-white-216x60.png to rhodecode/public/images/rhodecode-logo-white-60x60.png, binary diff hidden |
|
1 | NO CONTENT: file renamed from rhodecode/templates/changelog/changelog.mako to rhodecode/templates/commits/changelog.mako |
|
1 | NO CONTENT: file renamed from rhodecode/templates/changelog/changelog_elements.mako to rhodecode/templates/commits/changelog_elements.mako |
|
1 | NO CONTENT: file renamed from rhodecode/templates/changelog/changelog_file_history.mako to rhodecode/templates/commits/changelog_file_history.mako |
|
1 | NO CONTENT: file renamed from rhodecode/templates/files/files_detail.mako to rhodecode/templates/files/files_source_header.mako |
|
1 | NO CONTENT: file renamed from rhodecode/templates/files/file_tree_detail.mako to rhodecode/templates/files/files_tree_header.mako |
|
1 | NO CONTENT: file was removed |
|
1 | NO CONTENT: file was removed, binary diff hidden |
|