@@ -0,0 +1,26 @@

.. _adjust-rhodecode-mem:

RhodeCode Memory Usage
----------------------

Starting from version 4.18.X RhodeCode has a built-in memory monitor for Gunicorn workers.
Enabling it limits the maximum amount of memory a worker can use. Each RhodeCode
worker is monitored independently.
To enable memory management, make sure the following settings are present in the
`[server:main]` section of the :file:`home/{user}/.rccontrol/{instance-id}/rhodecode.ini` file.


; Maximum memory usage that each worker can use before it will receive a
; graceful restart signal 0 = memory monitoring is disabled
; Examples: 268435456 (256MB), 536870912 (512MB)
; 1073741824 (1GB), 2147483648 (2GB), 4294967296 (4GB)
memory_max_usage = 1073741824

; How often in seconds to check for memory usage for each gunicorn worker
memory_usage_check_interval = 60

; Threshold value for which we don't recycle worker if GarbageCollection
; frees up enough resources. Before each restart we try to run GC on worker
; in case we get enough free memory after that, restart will not happen.
memory_usage_recovery_threshold = 0.8
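
As a rough sizing sketch (illustrative values only, not an official recommendation): the limit
applies per worker, so the practical worst case is roughly the number of gunicorn ``workers``
multiplied by ``memory_max_usage``, plus the master process. For example::

    ; assumed example values, adjust to your deployment
    workers = 4
    memory_max_usage = 1073741824
    ; worst-case worker memory: 4 * 1GB = ~4GB before graceful restarts kick in
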
@@ -0,0 +1,230 @@

|RCE| 4.18.0 |RNS|
------------------

Release Date
^^^^^^^^^^^^

- 2020-01-05


New Features
^^^^^^^^^^^^

- Artifacts: no longer in BETA. A new info page is available for uploaded artifacts
  which exposes useful information such as sha256 and the various access urls, and also
  allows deleting artifacts and updating their description.
- Artifacts: support a new download url for accessing artifacts using the new auth-token types.
- Artifacts: added the ability to store artifacts using the API, and an internal cli upload.
  This allows efficient uploading of artifacts that can be 100s of GBs in size.
- Artifacts: added metadata logic to store various extra custom data for artifacts.
- Comments: added support for adding comment attachments using the artifacts logic.
  Logged-in users can now pick or drag and drop attachments into comment forms.
- Comments: enable linkification of certain patterns in comments in repo/pull request scopes.
  This now renders active links to commits and pull-requests mentioned in the comment body.
- Jira: newly updated integration plugin.
  The plugin now fetches possible transitions from tickets and shows them to users in the interface.
  It also allows sending extra attributes during a transition, such as a `resolution` message.
- Navigation: added a new consistent and contextual way of creating new objects
  like gists, repositories, and repository groups using a dedicated action (with a `+` sign)
  available in the top navigation.
- Hovercards: added new tooltips and hovercards to expose certain information for objects shown in the UI.
  RhodeCode usernames, issues, and pull-requests have active hovercard logic that
  loads extra information about them and exposes it to users.
- Files: all readme files found in the repository file browser are now rendered, allowing a readme per directory.
- Search: expose line counts in search file information.
- Audit-logs: expose download of user audit logs as a JSON file.
- Users: added a description field for users.
  Allows users to write a short BIO, or a description of their role in the organization.
- Users: allow super-admins to change the bound authentication type for users.
  E.g. internal rhodecode accounts can easily be changed to ldap from the user settings page.
- Pull requests: simplified the UI of the display view; hide less important information and expose the most important.
- Pull requests: add a merge check that detects a WIP marker in the title.
  WIP in the title usually means an unfinished task that still needs work; such a marker prevents accidental merges.
- Pull requests: TODO comments now have a dedicated box below reviewers to keep track
  of important TODOs that still need attention before the review process is finalized.
- Pull requests: participants of a pull request will receive an email about updates to the
  pull request with a small summary of the changes made.
- Pull requests: changed the naming from #NUM into !NUM.
  The !NUM format is now parsed and linkified in comments and commit messages (see the short example after this list).
- Pull requests: pull requests whose state is changing can now be viewed with a limited view.
- Pull requests: re-organized merge/close buttons and merge checks according to the new UI.
- Pull requests: the update commits button now allows a force-refresh update using a dropdown option.
- Pull requests: added a quick filter to the grid view to filter/search pull requests in a repository.
- Pull requests: closing a pull-request without a merge now requires additional confirmation.
- Pull requests: merge checks now show which files caused conflicts and are blocking the merge.
- Emails: updated the design of all generated emails and cleaned up the data fields they expose.
  a) More consistent UI for all types of emails. b) Improved formatting of plaintext emails.
  c) Added a reply link to comment-type emails for quicker response action.

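For illustration, a comment or commit message using the new reference syntax (a sketch; the
referenced numbers are hypothetical)::

    Fixes !123, follow-up to the discussion in !120.

Both references are rendered as active links to the corresponding pull requests.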

General
^^^^^^^

- Artifacts: don't show hidden artifacts; allow showing them via a GET ?hidden=1 flag.
  Hidden artifacts are, for example, comment attachments.
- UI: new commits page, following the new design effort started in the 4.17.X release line.
- UI: use explicitly named actions like "create user" instead of a generic "save", which is bad UX.
- UI: fixed problems with generating the last change in repository groups.
  There's now new logic that checks all objects inside a group for the latest update time.
- API: added artifact `get_info` and `store_metadata` methods.
- API: allow specifying extra recipients for pr/commit comment api methods.
- Vcsserver: set a file based cache as the default for vcsserver, which can be shared
  across multiple workers, saving memory usage.
- Vcsserver: added redis as a possible cache backend for even greater performance.
- Dependencies: bumped GIT version to 2.23.0
- Dependencies: bumped SVN version to 1.12.2
- Dependencies: bumped Mercurial version to 5.1.1 and hg-evolve to 9.1.0
- Search: added logic for sorting ElasticSearch6 backend search results.
- User bookmarks: make it easier to re-organize existing entries.
- Data grids: hide pagination for single pages in grids.
- Gists: UX, removed the private/public gist buttons and replaced them with a radio group.
- Gunicorn: moved all configuration of gunicorn workers to .ini files.
- Gunicorn: added worker memory management, allowing a maximum per-worker memory usage to be set.
- Automation: moved the update groups task into a celery task.
- Cache commits: add an option to refresh caches manually from the advanced pages.
- Pull requests: add an indication of state change in the list of pull-requests and actually show them in the list.
- Cache keys: register and self-cleanup cache keys used for invalidation, to prevent leaking a lot of them into the DB on worker recycle.
- Repo groups: removed the locking inheritance flag from repo-groups. We'll deprecate this soon, as it only brings in confusion.
- System snapshot: improved formatting for better readability.
- System info: expose data about vcsserver.
- Packages: updated celery to 4.3.0 and switched the default backend to redis instead of RabbitMQ.
  Redis is stable enough and easier to install. Having Redis simplifies the stack as it's used in other parts of RhodeCode.
- Dependencies: bumped alembic to 1.2.1
- Dependencies: bumped amqp==2.5.2 and kombu==4.6.6
- Dependencies: bumped atomicwrites==1.3.0
- Dependencies: bumped cffi==1.12.3
- Dependencies: bumped configparser==4.0.2
- Dependencies: bumped deform==2.0.8
- Dependencies: bumped dogpile.cache==0.9.0
- Dependencies: bumped hupper==1.8.1
- Dependencies: bumped mako to 1.1.0
- Dependencies: bumped markupsafe to 1.1.1
- Dependencies: bumped packaging==19.2
- Dependencies: bumped paste==3.2.1
- Dependencies: bumped pastescript==3.2.0
- Dependencies: bumped pathlib2 to 2.3.4
- Dependencies: bumped pluggy==0.13.0
- Dependencies: bumped psutil to 5.6.3
- Dependencies: bumped psutil==5.6.5
- Dependencies: bumped psycopg2==2.8.4
- Dependencies: bumped pycurl to 7.43.0.3
- Dependencies: bumped pyotp==2.3.0
- Dependencies: bumped pyparsing to 2.4.2
- Dependencies: bumped pyramid-debugtoolbar==4.5.1
- Dependencies: bumped pyramid-mako to 1.1.0
- Dependencies: bumped redis to 3.3.8
- Dependencies: bumped sqlalchemy to 1.3.8
- Dependencies: bumped sqlalchemy==1.3.11
- Dependencies: bumped test libraries.
- Dependencies: freeze alembic==1.3.1
- Dependencies: freeze python-dateutil
- Dependencies: freeze redis==3.3.11
- Dependencies: freeze supervisor==4.1.0


Security
^^^^^^^^

- Security: fixed an issue with exposing the wrong http status (403), which indicated that a repository with
  the given name exists even though we don't have permission to it. This was exposed in the redirection
  logic of the global pull-request page. In case of a redirection we also exposed the
  repository name in the URL.


Performance
^^^^^^^^^^^

- Core: many small improvements and optimizations to make rhodecode faster than before.
- VCSServer: new cache implementation for remote functions.
  Single-worker shared caches that can use redis/file-cache.
  This greatly improves performance on larger instances, and doesn't trigger cache
  re-calculation on worker restarts.
- GIT: switched internal git operations from Dulwich to libgit2 in order to obtain better performance and scalability.
- SSH: skip loading unneeded application parts for SSH to make execution of ssh commands faster.
- Main page: the main page now loads repositories and repository groups using partial DB calls instead of big JSON files.
  In case of many repositories in the root, the old behaviour could lead to very slow page rendering.
- Admin pages: made all grids use the same DB-based partial loading logic. We no longer fetch
  all objects into JSON for display purposes. This significantly improves the speed of those pages when
  many objects are shown in them.
- Summary page: use a non-memory cache for the readme, and clean up the cache for repo stats.
  This cache doesn't need re-caching after worker restarts and can be shared across all workers.
- Files: only check for git_lfs/hg_largefiles if they are enabled.
  This speeds up fetching of files if they are not LF and very big.
- Vcsserver: added support for streaming data from the remote methods. This allows
  streaming very large files without taking up memory, mostly for usage in SVN when
  downloading large binaries from the vcs system.
- Files: added streaming remote attributes for vcsserver.
  This change enables streaming raw content or raw downloads of large files without
  transferring them over to enterprise for pack & repack using msgpack.
  Msgpack has a limit of 2gb and generally pack+repack for ~2gb is very slow.
- Files: ensure files over the size limit never do any content fetching when viewing such files.
- VCSServer: skip host verification to speed up pycurl calls.
- User-bookmarks: cache fetching of bookmarks, since this is quite an expensive query to
  make with joinedload on repos/repo groups.
- Goto-switcher: reduce query data to only the required attributes for speedups.
- My account: owner/watched repos are now loaded using only DB queries.


Fixes
^^^^^

- Mercurial: moved imports from top-level to prevent loading mercurial code on hook execution for svn/git.
- GIT: limit sync-fetch logic to only retrieve tags/ and heads/ with default execution arguments.
- GIT: fixed an issue with git submodules detection.
- SVN: fixed the checkout url for the ssh+svn backend missing its special prefix, which resulted in an incorrect command being shown.
- SVN: fixed a problem with showing empty directories.
- OAuth: use a vendored version of the `authomatic` library, and switch Bitbucket authentication to use oauth2.
- Diffs: handle paths with quotes in diffs.
- Diffs: fixed outdated files in pull-requests re-using the filediff raw_id for anchor generation. Fixes #5567
- Diffs: fixed a race condition in the sticky vs wide-diff-mode toggle that caused some display problems on larger diffs.
- Pull requests: handle exceptions in state change and improve logging.
- Pull requests: fixed title/description generation for single commits which are numbers.
- Pull requests: changed the source of changes to use the shadow repo if it exists.
  In case of `git push -f` and rebase we lost commits in the repo, resulting in
  problems with displaying versions of pull-requests.
- Pull requests: handle the case of removing existing files from a repository in the compare-versions diff.
- Files: don't expose the copy content helper in case of binary files.
- Registration: properly expose first_name/last_name in the email sent on user registration.
- Markup renderers: fixed broken code highlight for rst files.
- UI: name super-admin consistently across the ui.
- Audit logs: fixed search cases with special chars such as `-`.


Upgrade notes
^^^^^^^^^^^^^

- New automation task. We've changed the logic for updating the latest change inside a repository group.
  The new logic includes scanning for changes in all nested objects. Since this is a heavy task,
  a new dedicated scheduler task has been created to update it automatically on a scheduled basis.
  Please review `admin > settings > automation` to enable this task.

- New safer encryption algorithm. Some setting values are encrypted before being stored inside the database.
  To keep full backward compatibility the old AES algorithm is used.
  If you wish to enable a safer option, set fernet encryption instead inside rhodecode.ini:
  `rhodecode.encrypted_values.algorithm = fernet`

- Pull requests UI changes. We've simplified the UI on the pull requests page.
  Please review the new UI to prevent surprises. All actions from the old UI should still be possible with the new one.

- Redis is now the default recommended backend for Celery and replaces the previously used RabbitMQ.
  Redis is generally easier to manage and install, and it's also very stable for usage
  in the scheduler/celery async tasks. Since we also recommend Redis for caches, the application
  stack can be simplified by removing rabbitmq and replacing it with a single Redis instance.
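
  A minimal sketch of the corresponding `[app:main]` settings, reusing the example broker
  url from the shipped development config (adjust host/port/db to your Redis instance)::

    use_celery = true
    ; connection url to the message broker (default redis)
    celery.broker_url = redis://localhost:6379/8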

- Recommendation to use Redis as the new cache backend on vcsserver.
  Since version 4.18.0, VCSServer has a new cache implementation for VCS data.
  By default, for simplicity, the cache type is file based. We strongly recommend using
  Redis instead for better performance and scalability.
  Please review the vcsserver.ini settings under:
  `rc_cache.repo_object.backend = dogpile.cache.rc.redis_msgpack`
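
  A sketch of what the relevant `vcsserver.ini` block might look like (the backend key is
  taken from above; the `rc_cache.repo_object.arguments.*` connection keys are an assumption,
  verify the exact key names against the vcsserver.ini template shipped with your installation)::

    rc_cache.repo_object.backend = dogpile.cache.rc.redis_msgpack
    ; assumed connection settings, verify key names in your vcsserver.ini template
    rc_cache.repo_object.arguments.host = localhost
    rc_cache.repo_object.arguments.port = 6379
    rc_cache.repo_object.arguments.db = 5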

- New memory monitoring for Gunicorn workers. Starting from the 4.18 release, an option was added
  to limit the maximum amount of memory used by a worker.
  Please review the new settings in the `[server:main]` section for memory management in both
  rhodecode.ini and vcsserver.ini::

    ; Maximum memory usage that each worker can use before it will receive a
    ; graceful restart signal 0 = memory monitoring is disabled
    ; Examples: 268435456 (256MB), 536870912 (512MB)
    ; 1073741824 (1GB), 2147483648 (2GB), 4294967296 (4GB)
    memory_max_usage = 0
@@ -0,0 +1,19 @@

# -*- coding: utf-8 -*-

# Copyright (C) 2016-2019 RhodeCode GmbH
#
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU Affero General Public License, version 3
# (only), as published by the Free Software Foundation.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU Affero General Public License
# along with this program. If not, see <http://www.gnu.org/licenses/>.
#
# This program is dual-licensed. If you wish to learn more about the
# RhodeCode Enterprise Edition, including its added features, Support services,
# and proprietary license terms, please see https://rhodecode.com/licenses/
@@ -0,0 +1,41 @@

# -*- coding: utf-8 -*-

# Copyright (C) 2018-2019 RhodeCode GmbH
#
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU Affero General Public License, version 3
# (only), as published by the Free Software Foundation.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU Affero General Public License
# along with this program. If not, see <http://www.gnu.org/licenses/>.
#
# This program is dual-licensed. If you wish to learn more about the
# RhodeCode Enterprise Edition, including its added features, Support services,
# and proprietary license terms, please see https://rhodecode.com/licenses/


def includeme(config):

    config.add_route(
        name='hovercard_user',
        pattern='/_hovercard/user/{user_id}')

    config.add_route(
        name='hovercard_user_group',
        pattern='/_hovercard/user_group/{user_group_id}')

    config.add_route(
        name='hovercard_pull_request',
        pattern='/_hovercard/pull_request/{pull_request_id}')

    config.add_route(
        name='hovercard_repo_commit',
        pattern='/_hovercard/commit/{repo_name:.*?[^/]}/{commit_id}', repo_route=True)

    # Scan module for configuration decorators.
    config.scan('.views', ignore='.tests')
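
These routes back the hovercards feature described in the release notes above: the UI fetches
them with authenticated XHR GET requests (for example `/_hovercard/user/{user_id}` or
`/_hovercard/commit/{repo_name}/{commit_id}`), and each view in `views.py` below returns the
small rendered mako fragment named in its `renderer` argument.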
@@ -0,0 +1,110 @@

# -*- coding: utf-8 -*-

# Copyright (C) 2016-2019 RhodeCode GmbH
#
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU Affero General Public License, version 3
# (only), as published by the Free Software Foundation.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU Affero General Public License
# along with this program. If not, see <http://www.gnu.org/licenses/>.
#
# This program is dual-licensed. If you wish to learn more about the
# RhodeCode Enterprise Edition, including its added features, Support services,
# and proprietary license terms, please see https://rhodecode.com/licenses/

import re
import logging
import collections

from pyramid.httpexceptions import HTTPNotFound
from pyramid.view import view_config

from rhodecode.apps._base import BaseAppView, RepoAppView
from rhodecode.lib import helpers as h
from rhodecode.lib.auth import (
    LoginRequired, NotAnonymous, HasRepoGroupPermissionAnyDecorator, CSRFRequired,
    HasRepoPermissionAnyDecorator)
from rhodecode.lib.codeblocks import filenode_as_lines_tokens
from rhodecode.lib.index import searcher_from_config
from rhodecode.lib.utils2 import safe_unicode, str2bool, safe_int
from rhodecode.lib.ext_json import json
from rhodecode.lib.vcs.exceptions import CommitDoesNotExistError, EmptyRepositoryError
from rhodecode.lib.vcs.nodes import FileNode
from rhodecode.model.db import (
    func, true, or_, case, in_filter_generator, Repository, RepoGroup, User, UserGroup, PullRequest)
from rhodecode.model.repo import RepoModel
from rhodecode.model.repo_group import RepoGroupModel
from rhodecode.model.scm import RepoGroupList, RepoList
from rhodecode.model.user import UserModel
from rhodecode.model.user_group import UserGroupModel

log = logging.getLogger(__name__)


class HoverCardsView(BaseAppView):

    def load_default_context(self):
        c = self._get_local_tmpl_context()
        return c

    @LoginRequired()
    @view_config(
        route_name='hovercard_user', request_method='GET', xhr=True,
        renderer='rhodecode:templates/hovercards/hovercard_user.mako')
    def hovercard_user(self):
        c = self.load_default_context()
        user_id = self.request.matchdict['user_id']
        c.user = User.get_or_404(user_id)
        return self._get_template_context(c)

    @LoginRequired()
    @view_config(
        route_name='hovercard_user_group', request_method='GET', xhr=True,
        renderer='rhodecode:templates/hovercards/hovercard_user_group.mako')
    def hovercard_user_group(self):
        c = self.load_default_context()
        user_group_id = self.request.matchdict['user_group_id']
        c.user_group = UserGroup.get_or_404(user_group_id)
        return self._get_template_context(c)

    @LoginRequired()
    @view_config(
        route_name='hovercard_pull_request', request_method='GET', xhr=True,
        renderer='rhodecode:templates/hovercards/hovercard_pull_request.mako')
    def hovercard_pull_request(self):
        c = self.load_default_context()
        c.pull_request = PullRequest.get_or_404(
            self.request.matchdict['pull_request_id'])
        perms = ['repository.read', 'repository.write', 'repository.admin']
        c.can_view_pr = h.HasRepoPermissionAny(*perms)(
            c.pull_request.target_repo.repo_name)
        return self._get_template_context(c)


class HoverCardsRepoView(RepoAppView):
    def load_default_context(self):
        c = self._get_local_tmpl_context()
        return c

    @LoginRequired()
    @HasRepoPermissionAnyDecorator('repository.read', 'repository.write', 'repository.admin')
    @view_config(
        route_name='hovercard_repo_commit', request_method='GET', xhr=True,
        renderer='rhodecode:templates/hovercards/hovercard_repo_commit.mako')
    def hovercard_repo_commit(self):
        c = self.load_default_context()
        commit_id = self.request.matchdict['commit_id']
        pre_load = ['author', 'branch', 'date', 'message']
        try:
            c.commit = self.rhodecode_vcs_repo.get_commit(
                commit_id=commit_id, pre_load=pre_load)
        except (CommitDoesNotExistError, EmptyRepositoryError):
            raise HTTPNotFound()

        return self._get_template_context(c)
@@ -1,5 +1,5 b'' | |||
|
1 | 1 | [bumpversion] |
|
2 |
current_version = 4.1 |
|
|
2 | current_version = 4.18.0 | |
|
3 | 3 | message = release: Bump version {current_version} to {new_version} |
|
4 | 4 | |
|
5 | 5 | [bumpversion:file:rhodecode/VERSION] |
@@ -8,6 +8,7 b' include =' | |||
|
8 | 8 | omit = |
|
9 | 9 | rhodecode/lib/dbmigrate/* |
|
10 | 10 | rhodecode/lib/paster_commands/* |
|
11 | rhodecode/lib/_vendor/* | |
|
11 | 12 | |
|
12 | 13 | [report] |
|
13 | 14 |
@@ -45,6 +45,7 b' syntax: regexp' | |||
|
45 | 45 | ^rhodecode/public/js/rhodecode-components.html$ |
|
46 | 46 | ^rhodecode/public/js/rhodecode-components.js$ |
|
47 | 47 | ^rhodecode/public/js/scripts.js$ |
|
48 | ^rhodecode/public/js/scripts.min.js$ | |
|
48 | 49 | ^rhodecode/public/js/src/components/root-styles.gen.html$ |
|
49 | 50 | ^rhodecode/public/js/vendors/webcomponentsjs/ |
|
50 | 51 | ^rhodecode\.db$ |
@@ -5,25 +5,20 b' done = false' | |||
|
5 | 5 | done = true |
|
6 | 6 | |
|
7 | 7 | [task:rc_tools_pinned] |
|
8 | done = true | |
|
9 | 8 | |
|
10 | 9 | [task:fixes_on_stable] |
|
11 | done = true | |
|
12 | 10 | |
|
13 | 11 | [task:pip2nix_generated] |
|
14 | done = true | |
|
15 | 12 | |
|
16 | 13 | [task:changelog_updated] |
|
17 | done = true | |
|
18 | 14 | |
|
19 | 15 | [task:generate_api_docs] |
|
20 | done = true | |
|
16 | ||
|
17 | [task:updated_translation] | |
|
21 | 18 | |
|
22 | 19 | [release] |
|
23 |
state = |
|
|
24 |
version = 4.1 |
|
|
25 | ||
|
26 | [task:updated_translation] | |
|
20 | state = in_progress | |
|
21 | version = 4.18.0 | |
|
27 | 22 | |
|
28 | 23 | [task:generate_js_routes] |
|
29 | 24 |
@@ -13,9 +13,10 b' module.exports = function(grunt) {' | |||
|
13 | 13 | |
|
14 | 14 | grunt.loadNpmTasks('grunt-contrib-less'); |
|
15 | 15 | grunt.loadNpmTasks('grunt-contrib-concat'); |
|
16 | grunt.loadNpmTasks('grunt-contrib-uglify'); | |
|
16 | 17 | grunt.loadNpmTasks('grunt-contrib-watch'); |
|
17 | 18 | grunt.loadNpmTasks('grunt-contrib-jshint'); |
|
18 | 19 | grunt.loadNpmTasks('grunt-contrib-copy'); |
|
19 | 20 | grunt.loadNpmTasks('grunt-webpack'); |
|
20 | grunt.registerTask('default', ['less:production', 'less:components', 'copy', 'webpack', 'concat:dist']); | |
|
21 | grunt.registerTask('default', ['less:production', 'less:components', 'copy', 'webpack', 'concat:dist', 'uglify:dist']); | |
|
21 | 22 | }; |
@@ -17,6 +17,7 b' test:' | |||
|
17 | 17 | test-clean: |
|
18 | 18 | rm -rf coverage.xml htmlcov junit.xml pylint.log result |
|
19 | 19 | find . -type d -name "__pycache__" -prune -exec rm -rf '{}' ';' |
|
20 | find . -type f \( -iname '.coverage.*' \) -exec rm '{}' ';' | |
|
20 | 21 | |
|
21 | 22 | test-only: |
|
22 | 23 | PYTHONHASHSEED=random \ |
@@ -28,7 +29,7 b' test-only-mysql:' | |||
|
28 | 29 | PYTHONHASHSEED=random \ |
|
29 | 30 | py.test -x -vv -r xw -p no:sugar --cov=rhodecode \ |
|
30 | 31 | --cov-report=term-missing --cov-report=html \ |
|
31 | --ini-config-override='{"app:main": {"sqlalchemy.db1.url": "mysql://root:qweqwe@localhost/rhodecode_test"}}' \ | |
|
32 | --ini-config-override='{"app:main": {"sqlalchemy.db1.url": "mysql://root:qweqwe@localhost/rhodecode_test?charset=utf8"}}' \ | |
|
32 | 33 | rhodecode |
|
33 | 34 | |
|
34 | 35 | test-only-postgres: |
@@ -5,20 +5,24 b' RhodeCode' | |||
|
5 | 5 | About |
|
6 | 6 | ----- |
|
7 | 7 | |
|
8 |
``RhodeCode`` is a fast and powerful management tool for |
|
|
9 | and Subversion_ with a built in push/pull server, full text search, | |
|
10 | pull requests and powerful code-review system. It works on http/https, SSH and | |
|
11 | has a few unique features like: | |
|
8 | ``RhodeCode`` is a fast and powerful source code management tool for | |
|
9 | Mercurial_, GIT_ and Subversion_. Its main features are: | |
|
10 | ||
|
12 | 11 |
|
|
13 | - plugable architecture from Pyramid web-framework. | |
|
14 | - advanced permission system with IP restrictions, inheritation, and user-groups. | |
|
12 | - built in push/pull server | |
|
13 | - SSH with key management support | |
|
14 | - full text search. | |
|
15 | - plugable authentication. | |
|
16 | - pull requests and powerful code-review system. | |
|
17 | - advanced permission system with IP restrictions, permission inheritation, and user-groups. | |
|
15 | 18 | - rich set of authentication plugins including LDAP, ActiveDirectory, SAML 2.0, |
|
16 | 19 | Atlassian Crowd, Http-Headers, Pam, Token-Auth, OAuth. |
|
17 | 20 | - live code-review chat, and reviewer rules. |
|
18 | 21 | - full web based file editing. |
|
19 | 22 | - unified multi vcs support. |
|
20 | 23 | - snippets (gist) system. |
|
21 | - integration framework for Slack, CI systems, Webhooks. | |
|
24 | - artfacts store for binaries. | |
|
25 | - integration framework for Slack, CI systems, Webhooks, Jira, Redmine etc. | |
|
22 | 26 | - integration with all 3rd party issue trackers. |
|
23 | 27 | |
|
24 | 28 | |
@@ -41,7 +45,8 b' Source code' | |||
|
41 | 45 | ----------- |
|
42 | 46 | |
|
43 | 47 | The latest sources can be obtained from official RhodeCode instance |
|
44 | https://code.rhodecode.com | |
|
48 | https://code.rhodecode.com/rhodecode-enterprise-ce | |
|
49 | https://code.rhodecode.com/rhodecode-vcsserver | |
|
45 | 50 | |
|
46 | 51 | |
|
47 | 52 | Contributions |
@@ -1,24 +1,22 b'' | |||
|
1 | ||
|
1 | ## -*- coding: utf-8 -*- | |
|
2 | 2 | |
|
3 |
|
|
|
4 |
|
|
|
5 |
|
|
|
3 | ; ######################################### | |
|
4 | ; RHODECODE COMMUNITY EDITION CONFIGURATION | |
|
5 | ; ######################################### | |
|
6 | 6 | |
|
7 | 7 | [DEFAULT] |
|
8 |
|
|
|
8 | ; Debug flag sets all loggers to debug, and enables request tracking | |
|
9 | 9 | debug = true |
|
10 | 10 | |
|
11 |
|
|
|
12 | ## EMAIL CONFIGURATION ## | |
|
13 | ## Uncomment and replace with the email address which should receive ## | |
|
14 | ## any error reports after an application crash ## | |
|
15 | ## Additionally these settings will be used by the RhodeCode mailing system ## | |
|
16 | ################################################################################ | |
|
11 | ; ######################################################################## | |
|
12 | ; EMAIL CONFIGURATION | |
|
13 | ; These settings will be used by the RhodeCode mailing system | |
|
14 | ; ######################################################################## | |
|
17 | 15 | |
|
18 |
|
|
|
16 | ; prefix all emails subjects with given prefix, helps filtering out emails | |
|
19 | 17 | #email_prefix = [RhodeCode] |
|
20 | 18 | |
|
21 |
|
|
|
19 | ; email FROM address all mails will be sent | |
|
22 | 20 | #app_email_from = rhodecode-noreply@localhost |
|
23 | 21 | |
|
24 | 22 | #smtp_server = mail.server.com |
@@ -29,82 +27,139 b' debug = true' | |||
|
29 | 27 | #smtp_use_ssl = true |
|
30 | 28 | |
|
31 | 29 | [server:main] |
|
32 | ## COMMON ## | |
|
30 | ; COMMON HOST/IP CONFIG | |
|
33 | 31 | host = 127.0.0.1 |
|
34 | 32 | port = 5000 |
|
35 | 33 | |
|
36 |
|
|
|
37 |
|
|
|
38 |
|
|
|
34 | ; ################################################## | |
|
35 | ; WAITRESS WSGI SERVER - Recommended for Development | |
|
36 | ; ################################################## | |
|
39 | 37 | |
|
38 | ; use server type | |
|
40 | 39 | use = egg:waitress#main |
|
41 | ## number of worker threads | |
|
40 | ||
|
41 | ; number of worker threads | |
|
42 | 42 | threads = 5 |
|
43 | ## MAX BODY SIZE 100GB | |
|
43 | ||
|
44 | ; MAX BODY SIZE 100GB | |
|
44 | 45 | max_request_body_size = 107374182400 |
|
45 | ## Use poll instead of select, fixes file descriptors limits problems. | |
|
46 | ## May not work on old windows systems. | |
|
46 | ||
|
47 | ; Use poll instead of select, fixes file descriptors limits problems. | |
|
48 | ; May not work on old windows systems. | |
|
47 | 49 | asyncore_use_poll = true |
|
48 | 50 | |
|
49 | 51 | |
|
50 | ########################## | |
|
51 |
|
|
|
52 | ########################## | |
|
53 | ## run with gunicorn --log-config rhodecode.ini --paste rhodecode.ini | |
|
52 | ; ########################### | |
|
53 | ; GUNICORN APPLICATION SERVER | |
|
54 | ; ########################### | |
|
54 | 55 | |
|
56 | ; run with gunicorn --log-config rhodecode.ini --paste rhodecode.ini | |
|
57 | ||
|
58 | ; Module to use, this setting shouldn't be changed | |
|
55 | 59 | #use = egg:gunicorn#main |
|
56 | ## Sets the number of process workers. More workers means more concurrent connections | |
|
57 | ## RhodeCode can handle at the same time. Each additional worker also it increases | |
|
58 | ## memory usage as each has it's own set of caches. | |
|
59 | ## Recommended value is (2 * NUMBER_OF_CPUS + 1), eg 2CPU = 5 workers, but no more | |
|
60 | ## than 8-10 unless for really big deployments .e.g 700-1000 users. | |
|
61 | ## `instance_id = *` must be set in the [app:main] section below (which is the default) | |
|
62 | ## when using more than 1 worker. | |
|
60 | ||
|
61 | ; Sets the number of process workers. More workers means more concurrent connections | |
|
62 | ; RhodeCode can handle at the same time. Each additional worker also it increases | |
|
63 | ; memory usage as each has it's own set of caches. | |
|
64 | ; Recommended value is (2 * NUMBER_OF_CPUS + 1), eg 2CPU = 5 workers, but no more | |
|
65 | ; than 8-10 unless for really big deployments .e.g 700-1000 users. | |
|
66 | ; `instance_id = *` must be set in the [app:main] section below (which is the default) | |
|
67 | ; when using more than 1 worker. | |
|
63 | 68 | #workers = 2 |
|
64 | ## process name visible in process list | |
|
69 | ||
|
70 | ; Gunicorn access log level | |
|
71 | #loglevel = info | |
|
72 | ||
|
73 | ; Process name visible in process list | |
|
65 | 74 | #proc_name = rhodecode |
|
66 | ## type of worker class, one of sync, gevent | |
|
67 | ## recommended for bigger setup is using of of other than sync one | |
|
75 | ||
|
76 | ; Type of worker class, one of `sync`, `gevent` | |
|
77 | ; Recommended type is `gevent` | |
|
68 | 78 | #worker_class = gevent |
|
69 | ## The maximum number of simultaneous clients. Valid only for Gevent | |
|
79 | ||
|
80 | ; The maximum number of simultaneous clients. Valid only for gevent | |
|
70 | 81 | #worker_connections = 10 |
|
71 | ## max number of requests that worker will handle before being gracefully | |
|
72 | ## restarted, could prevent memory leaks | |
|
82 | ||
|
83 | ; Max number of requests that worker will handle before being gracefully restarted. | |
|
84 | ; Prevents memory leaks, jitter adds variability so not all workers are restarted at once. | |
|
73 | 85 | #max_requests = 1000 |
|
74 | 86 | #max_requests_jitter = 30 |
|
75 | ## amount of time a worker can spend with handling a request before it | |
|
76 | ## gets killed and restarted. Set to 6hrs | |
|
87 | ||
|
88 | ; Amount of time a worker can spend with handling a request before it | |
|
89 | ; gets killed and restarted. By default set to 21600 (6hrs) | |
|
90 | ; Examples: 1800 (30min), 3600 (1hr), 7200 (2hr), 43200 (12h) | |
|
77 | 91 | #timeout = 21600 |
|
78 | 92 | |
|
93 | ; The maximum size of HTTP request line in bytes. | |
|
94 | ; 0 for unlimited | |
|
95 | #limit_request_line = 0 | |
|
96 | ||
|
97 | ; Limit the number of HTTP headers fields in a request. | |
|
98 | ; By default this value is 100 and can't be larger than 32768. | |
|
99 | #limit_request_fields = 32768 | |
|
100 | ||
|
101 | ; Limit the allowed size of an HTTP request header field. | |
|
102 | ; Value is a positive number or 0. | |
|
103 | ; Setting it to 0 will allow unlimited header field sizes. | |
|
104 | #limit_request_field_size = 0 | |
|
105 | ||
|
106 | ; Timeout for graceful workers restart. | |
|
107 | ; After receiving a restart signal, workers have this much time to finish | |
|
108 | ; serving requests. Workers still alive after the timeout (starting from the | |
|
109 | ; receipt of the restart signal) are force killed. | |
|
110 | ; Examples: 1800 (30min), 3600 (1hr), 7200 (2hr), 43200 (12h) | |
|
111 | #graceful_timeout = 3600 | |
|
112 | ||
|
113 | # The number of seconds to wait for requests on a Keep-Alive connection. | |
|
114 | # Generally set in the 1-5 seconds range. | |
|
115 | #keepalive = 2 | |
|
116 | ||
|
117 | ; Maximum memory usage that each worker can use before it will receive a | |
|
118 | ; graceful restart signal 0 = memory monitoring is disabled | |
|
119 | ; Examples: 268435456 (256MB), 536870912 (512MB) | |
|
120 | ; 1073741824 (1GB), 2147483648 (2GB), 4294967296 (4GB) | |
|
121 | #memory_max_usage = 0 | |
|
122 | ||
|
123 | ; How often in seconds to check for memory usage for each gunicorn worker | |
|
124 | #memory_usage_check_interval = 60 | |
|
125 | ||
|
126 | ; Threshold value for which we don't recycle worker if GarbageCollection | |
|
127 | ; frees up enough resources. Before each restart we try to run GC on worker | |
|
128 | ; in case we get enough free memory after that, restart will not happen. | |
|
129 | #memory_usage_recovery_threshold = 0.8 | |
|
130 | ||
|
79 | 131 | |
|
80 |
|
|
|
81 |
|
|
|
82 |
|
|
|
83 |
|
|
|
84 |
|
|
|
85 |
|
|
|
86 |
|
|
|
132 | ; Prefix middleware for RhodeCode. | |
|
133 | ; recommended when using proxy setup. | |
|
134 | ; allows to set RhodeCode under a prefix in server. | |
|
135 | ; eg https://server.com/custom_prefix. Enable `filter-with =` option below as well. | |
|
136 | ; And set your prefix like: `prefix = /custom_prefix` | |
|
137 | ; be sure to also set beaker.session.cookie_path = /custom_prefix if you need | |
|
138 | ; to make your cookies only work on prefix url | |
|
87 | 139 | [filter:proxy-prefix] |
|
88 | 140 | use = egg:PasteDeploy#prefix |
|
89 | 141 | prefix = / |
|
90 | 142 | |
|
91 | 143 | [app:main] |
|
92 |
|
|
|
93 |
|
|
|
94 |
|
|
|
95 |
|
|
|
144 | ; The %(here)s variable will be replaced with the absolute path of parent directory | |
|
145 | ; of this file | |
|
146 | ; In addition ENVIRONMENT variables usage is possible, e.g | |
|
147 | ; sqlalchemy.db1.url = {ENV_RC_DB_URL} | |
|
96 | 148 | |
|
97 | 149 | use = egg:rhodecode-enterprise-ce |
|
98 | 150 | |
|
99 |
|
|
|
151 | ; enable proxy prefix middleware, defined above | |
|
100 | 152 | #filter-with = proxy-prefix |
|
101 | 153 | |
|
154 | ; ############# | |
|
155 | ; DEBUG OPTIONS | |
|
156 | ; ############# | |
|
157 | ||
|
158 | pyramid.reload_templates = true | |
|
159 | ||
|
102 | 160 | # During development the we want to have the debug toolbar enabled |
|
103 | 161 | pyramid.includes = |
|
104 | 162 | pyramid_debugtoolbar |
|
105 | rhodecode.lib.middleware.request_wrapper | |
|
106 | ||
|
107 | pyramid.reload_templates = true | |
|
108 | 163 | |
|
109 | 164 | debugtoolbar.hosts = 0.0.0.0/0 |
|
110 | 165 | debugtoolbar.exclude_prefixes = |
@@ -121,101 +176,100 b' rhodecode.includes =' | |||
|
121 | 176 | # api prefix url |
|
122 | 177 | rhodecode.api.url = /_admin/api |
|
123 | 178 | |
|
124 | ||
|
125 | ## END RHODECODE PLUGINS ## | |
|
179 | ; enable debug style page | |
|
180 | debug_style = true | |
|
126 | 181 | |
|
127 | ## encryption key used to encrypt social plugin tokens, | |
|
128 | ## remote_urls with credentials etc, if not set it defaults to | |
|
129 | ## `beaker.session.secret` | |
|
182 | ; ################# | |
|
183 | ; END DEBUG OPTIONS | |
|
184 | ; ################# | |
|
185 | ||
|
186 | ; encryption key used to encrypt social plugin tokens, | |
|
187 | ; remote_urls with credentials etc, if not set it defaults to | |
|
188 | ; `beaker.session.secret` | |
|
130 | 189 | #rhodecode.encrypted_values.secret = |
|
131 | 190 | |
|
132 |
|
|
|
133 |
|
|
|
191 | ; decryption strict mode (enabled by default). It controls if decryption raises | |
|
192 | ; `SignatureVerificationError` in case of wrong key, or damaged encryption data. | |
|
134 | 193 | #rhodecode.encrypted_values.strict = false |
|
135 | 194 | |
|
136 |
|
|
|
137 |
|
|
|
138 |
|
|
|
195 | ; Pick algorithm for encryption. Either fernet (more secure) or aes (default) | |
|
196 | ; fernet is safer, and we strongly recommend switching to it. | |
|
197 | ; Due to backward compatibility aes is used as default. | |
|
139 | 198 | #rhodecode.encrypted_values.algorithm = fernet |
|
140 | 199 | |
|
141 |
|
|
|
200 | ; Return gzipped responses from RhodeCode (static files/application) | |
|
142 | 201 | gzip_responses = false |
|
143 | 202 | |
|
144 |
|
|
|
203 | ; Auto-generate javascript routes file on startup | |
|
145 | 204 | generate_js_files = false |
|
146 | 205 | |
|
147 |
|
|
|
148 |
|
|
|
206 | ; System global default language. | |
|
207 | ; All available languages: en (default), be, de, es, fr, it, ja, pl, pt, ru, zh | |
|
149 | 208 | lang = en |
|
150 | 209 | |
|
151 |
|
|
|
152 |
|
|
|
210 | ; Perform a full repository scan and import on each server start. | |
|
211 | ; Settings this to true could lead to very long startup time. | |
|
153 | 212 | startup.import_repos = false |
|
154 | 213 | |
|
155 |
|
|
|
156 |
|
|
|
157 |
|
|
|
158 |
|
|
|
214 | ; Uncomment and set this path to use archive download cache. | |
|
215 | ; Once enabled, generated archives will be cached at this location | |
|
216 | ; and served from the cache during subsequent requests for the same archive of | |
|
217 | ; the repository. | |
|
159 | 218 | #archive_cache_dir = /tmp/tarballcache |
|
160 | 219 | |
|
161 |
|
|
|
162 |
|
|
|
163 |
|
|
|
220 | ; URL at which the application is running. This is used for Bootstrapping | |
|
221 | ; requests in context when no web request is available. Used in ishell, or | |
|
222 | ; SSH calls. Set this for events to receive proper url for SSH calls. | |
|
164 | 223 | app.base_url = http://rhodecode.local |
|
165 | 224 | |
|
166 |
|
|
|
225 | ; Unique application ID. Should be a random unique string for security. | |
|
167 | 226 | app_instance_uuid = rc-production |
|
168 | 227 | |
|
169 |
|
|
|
170 |
|
|
|
171 |
|
|
|
228 | ; Cut off limit for large diffs (size in bytes). If overall diff size on | |
|
229 | ; commit, or pull request exceeds this limit this diff will be displayed | |
|
230 | ; partially. E.g 512000 == 512Kb | |
|
172 | 231 | cut_off_limit_diff = 512000 |
|
173 | 232 | |
|
174 |
|
|
|
175 |
|
|
|
176 |
|
|
|
233 | ; Cut off limit for large files inside diffs (size in bytes). Each individual | |
|
234 | ; file inside diff which exceeds this limit will be displayed partially. | |
|
235 | ; E.g 128000 == 128Kb | |
|
177 | 236 | cut_off_limit_file = 128000 |
|
178 | 237 | |
|
179 |
|
|
|
238 | ; Use cached version of vcs repositories everywhere. Recommended to be `true` | |
|
180 | 239 | vcs_full_cache = true |
|
181 | 240 | |
|
182 |
|
|
|
183 |
|
|
|
241 | ; Force https in RhodeCode, fixes https redirects, assumes it's always https. | |
|
242 | ; Normally this is controlled by proper flags sent from http server such as Nginx or Apache | |
|
184 | 243 | force_https = false |
|
185 | 244 | |
|
186 |
|
|
|
245 | ; use Strict-Transport-Security headers | |
|
187 | 246 | use_htsts = false |
|
188 | 247 | |
|
189 | ## git rev filter option, --all is the default filter, if you need to | |
|
190 | ## hide all refs in changelog switch this to --branches --tags | |
|
191 | git_rev_filter = --branches --tags | |
|
192 | ||
|
193 | # Set to true if your repos are exposed using the dumb protocol | |
|
248 | ; Set to true if your repos are exposed using the dumb protocol | |
|
194 | 249 | git_update_server_info = false |
|
195 | 250 | |
|
196 |
|
|
|
251 | ; RSS/ATOM feed options | |
|
197 | 252 | rss_cut_off_limit = 256000 |
|
198 | 253 | rss_items_per_page = 10 |
|
199 | 254 | rss_include_diff = false |
|
200 | 255 | |
|
201 |
|
|
|
202 |
|
|
|
203 |
|
|
|
204 |
|
|
|
256 | ; gist URL alias, used to create nicer urls for gist. This should be an | |
|
257 | ; url that does rewrites to _admin/gists/{gistid}. | |
|
258 | ; example: http://gist.rhodecode.org/{gistid}. Empty means use the internal | |
|
259 | ; RhodeCode url, ie. http[s]://rhodecode.server/_admin/gists/{gistid} | |
|
205 | 260 | gist_alias_url = |
|
206 | 261 | |
|
207 |
|
|
|
208 |
|
|
|
209 |
|
|
|
210 |
|
|
|
211 |
|
|
|
212 |
|
|
|
213 |
|
|
|
214 | ## | |
|
215 | ## list of all views can be found under `/_admin/permissions/auth_token_access` | |
|
216 | ## The list should be "," separated and on a single line. | |
|
217 | ## | |
|
218 | ## Most common views to enable: | |
|
262 | ; List of views (using glob pattern syntax) that AUTH TOKENS could be | |
|
263 | ; used for access. | |
|
264 | ; Adding ?auth_token=TOKEN_HASH to the url authenticates this request as if it | |
|
265 | ; came from the the logged in user who own this authentication token. | |
|
266 | ; Additionally @TOKEN syntax can be used to bound the view to specific | |
|
267 | ; authentication token. Such view would be only accessible when used together | |
|
268 | ; with this authentication token | |
|
269 | ; list of all views can be found under `/_admin/permissions/auth_token_access` | |
|
270 | ; The list should be "," separated and on a single line. | |
|
271 | ; Most common views to enable: | |
|
272 | ||
|
219 | 273 | # RepoCommitsView:repo_commit_download |
|
220 | 274 | # RepoCommitsView:repo_commit_patch |
|
221 | 275 | # RepoCommitsView:repo_commit_raw |
@@ -226,164 +280,194 b' gist_alias_url =' | |||
|
226 | 280 | # GistView:* |
|
227 | 281 | api_access_controllers_whitelist = |
|
228 | 282 | |
|
229 |
|
|
|
230 |
|
|
|
283 | ; Default encoding used to convert from and to unicode | |
|
284 | ; can be also a comma separated list of encoding in case of mixed encodings | |
|
231 | 285 | default_encoding = UTF-8 |
|
232 | 286 | |
|
233 |
|
|
|
234 |
|
|
|
235 |
|
|
|
236 |
|
|
|
287 | ; instance-id prefix | |
|
288 | ; a prefix key for this instance used for cache invalidation when running | |
|
289 | ; multiple instances of RhodeCode, make sure it's globally unique for | |
|
290 | ; all running RhodeCode instances. Leave empty if you don't use it | |
|
237 | 291 | instance_id = |
|
238 | 292 | |
|
239 |
|
|
|
240 |
|
|
|
241 |
|
|
|
242 |
|
|
|
243 |
|
|
|
244 | ## | |
|
245 | ## Available builtin plugin IDs (hash is part of the ID): | |
|
246 |
|
|
|
247 |
|
|
|
248 |
|
|
|
249 |
|
|
|
250 |
|
|
|
251 | ## egg:rhodecode-enterprise-ce#crowd | |
|
293 | ; Fallback authentication plugin. Set this to a plugin ID to force the usage | |
|
294 | ; of an authentication plugin also if it is disabled by it's settings. | |
|
295 | ; This could be useful if you are unable to log in to the system due to broken | |
|
296 | ; authentication settings. Then you can enable e.g. the internal RhodeCode auth | |
|
297 | ; module to log in again and fix the settings. | |
|
298 | ; Available builtin plugin IDs (hash is part of the ID): | |
|
299 | ; egg:rhodecode-enterprise-ce#rhodecode | |
|
300 | ; egg:rhodecode-enterprise-ce#pam | |
|
301 | ; egg:rhodecode-enterprise-ce#ldap | |
|
302 | ; egg:rhodecode-enterprise-ce#jasig_cas | |
|
303 | ; egg:rhodecode-enterprise-ce#headers | |
|
304 | ; egg:rhodecode-enterprise-ce#crowd | |
|
305 | ||
|
252 | 306 | #rhodecode.auth_plugin_fallback = egg:rhodecode-enterprise-ce#rhodecode |
|
253 | 307 | |
|
254 | ## alternative return HTTP header for failed authentication. Default HTTP | |
|
255 | ## response is 401 HTTPUnauthorized. Currently HG clients have troubles with | |
|
256 | ## handling that causing a series of failed authentication calls. | |
|
257 | ## Set this variable to 403 to return HTTPForbidden, or any other HTTP code | |
|
258 | ## This will be served instead of default 401 on bad authentication | |
|
308 | ; Flag to control loading of legacy plugins in py:/path format | |
|
309 | auth_plugin.import_legacy_plugins = true | |
|
310 | ||
|
311 | ; alternative return HTTP header for failed authentication. Default HTTP | |
|
312 | ; response is 401 HTTPUnauthorized. Currently HG clients have troubles with | |
|
313 | ; handling that causing a series of failed authentication calls. | |
|
314 | ; Set this variable to 403 to return HTTPForbidden, or any other HTTP code | |
|
315 | ; This will be served instead of default 401 on bad authentication | |
|
259 | 316 | auth_ret_code = |
|
260 | 317 | |
|
261 |
|
|
|
262 |
|
|
|
263 |
|
|
|
318 | ; use special detection method when serving auth_ret_code, instead of serving | |
|
319 | ; ret_code directly, use 401 initially (Which triggers credentials prompt) | |
|
320 | ; and then serve auth_ret_code to clients | |
|
264 | 321 | auth_ret_code_detection = false |
|
265 | 322 | |
|
266 |
|
|
|
267 |
|
|
|
323 | ; locking return code. When repository is locked return this HTTP code. 2XX | |
|
324 | ; codes don't break the transactions while 4XX codes do | |
|
268 | 325 | lock_ret_code = 423 |
|
269 | 326 | |
|
270 |
|
|
|
327 | ; allows to change the repository location in settings page | |
|
271 | 328 | allow_repo_location_change = true |
|
272 | 329 | |
|
273 |
|
|
|
330 | ; allows to setup custom hooks in settings page | |
|
274 | 331 | allow_custom_hooks_settings = true |
|
275 | 332 | |
|
276 |
|
|
|
277 |
|
|
|
333 | ; Generated license token required for EE edition license. | |
|
334 | ; New generated token value can be found in Admin > settings > license page. | |
|
278 | 335 | license_token = |
|
279 | 336 | |
|
280 | ## supervisor connection uri, for managing supervisor and logs. | |
|
337 | ; This flag hides sensitive information on the license page, such as the token and license data | 
|
338 | license.hide_license_info = false | |
|
339 | ||
|
340 | ; supervisor connection uri, for managing supervisor and logs. | |
|
281 | 341 | supervisor.uri = |
|
282 | ## supervisord group name/id we only want this RC instance to handle | |
|
342 | ||
|
343 | ; supervisord group name/id we only want this RC instance to handle | |
|
283 | 344 | supervisor.group_id = dev |
|
284 | 345 | |
|
285 |
|
|
|
346 | ; Display extended labs settings | |
|
286 | 347 | labs_settings_active = true |
|
287 | 348 | |
|
288 |
|
|
|
289 |
|
|
|
349 | ; Custom exception store path, defaults to TMPDIR | |
|
350 | ; This is used to store exceptions from RhodeCode in a shared directory | 
|
290 | 351 | #exception_tracker.store_path = |
|
291 | 352 | |
|
292 |
|
|
|
353 | ; File store configuration. This is used to store and serve uploaded files | |
|
293 | 354 | file_store.enabled = true |
|
294 | ## Storage backend, available options are: local | |
|
355 | ||
|
356 | ; Storage backend, available options are: local | |
|
295 | 357 | file_store.backend = local |
|
296 | ## path to store the uploaded binaries | |
|
358 | ||
|
359 | ; path to store the uploaded binaries | |
|
297 | 360 | file_store.storage_path = %(here)s/data/file_store |
|
298 | 361 | |
|
299 | 362 | |
|
300 | #################################### | |
|
301 | ### CELERY CONFIG #### | |
|
302 | #################################### | |
|
303 | ## run: /path/to/celery worker \ | |
|
304 | ## -E --beat --app rhodecode.lib.celerylib.loader \ | |
|
305 | ## --scheduler rhodecode.lib.celerylib.scheduler.RcScheduler \ | |
|
306 | ## --loglevel DEBUG --ini /path/to/rhodecode.ini | |
|
363 | ; ############# | |
|
364 | ; CELERY CONFIG | |
|
365 | ; ############# | |
|
366 | ||
|
367 | ; manually run celery: /path/to/celery worker -E --beat --app rhodecode.lib.celerylib.loader --scheduler rhodecode.lib.celerylib.scheduler.RcScheduler --loglevel DEBUG --ini /path/to/rhodecode.ini | |
|
307 | 368 | |
|
308 | 369 | use_celery = false |
|
309 | 370 | |
|
310 |
|
|
|
311 | celery.broker_url =
|
|
371 | ; connection url to the message broker (default redis) | |
|
372 | celery.broker_url = redis://localhost:6379/8 | |
|
312 | 373 | |
|
313 | ## maximum tasks to execute before worker restart | |
|
374 | ; rabbitmq example | |
|
375 | #celery.broker_url = amqp://rabbitmq:qweqwe@localhost:5672/rabbitmqhost | |
|
376 | ||
|
377 | ; maximum tasks to execute before worker restart | |
|
314 | 378 | celery.max_tasks_per_child = 100 |
|
315 | 379 | |
|
316 |
|
|
|
380 | ; tasks will never be sent to the queue, but executed locally instead. | |
|
317 | 381 | celery.task_always_eager = false |
|
318 | 382 | |
|
319 | ##################################### | |
|
320 | ### DOGPILE CACHE #### | |
|
321 | ##################################### | |
|
322 | ## Default cache dir for caches. Putting this into a ramdisk | |
|
323 | ## can boost performance, eg. /tmpfs/data_ramdisk, however this directory might require | |
|
324 | ## large amount of space | |
|
383 | ; ############# | |
|
384 | ; DOGPILE CACHE | |
|
385 | ; ############# | |
|
386 | ||
|
387 | ; Default cache dir for caches. Putting this into a ramdisk can boost performance. | |
|
388 | ; e.g. /tmpfs/data_ramdisk, however this directory might require a large amount of space | 
|
325 | 389 | cache_dir = %(here)s/data |
|
326 | 390 | |
|
327 | ## `cache_perms` cache settings for permission tree, auth TTL. | |
|
391 | ; ********************************************* | |
|
392 | ; `sql_cache_short` cache for heavy SQL queries | |
|
393 | ; Only supported backend is `memory_lru` | |
|
394 | ; ********************************************* | |
|
395 | rc_cache.sql_cache_short.backend = dogpile.cache.rc.memory_lru | |
|
396 | rc_cache.sql_cache_short.expiration_time = 30 | |
|
397 | ||
|
398 | ||
|
399 | ; ***************************************************** | |
|
400 | ; `cache_repo_longterm` cache for repo object instances | |
|
401 | ; Only supported backend is `memory_lru` | |
|
402 | ; ***************************************************** | |
|
403 | rc_cache.cache_repo_longterm.backend = dogpile.cache.rc.memory_lru | |
|
404 | ; by default we use 30 Days, cache is still invalidated on push | |
|
405 | rc_cache.cache_repo_longterm.expiration_time = 2592000 | |
|
406 | ; max items in LRU cache, set to smaller number to save memory, and expire last used caches | |
|
407 | rc_cache.cache_repo_longterm.max_size = 10000 | |
|
408 | ||
|
409 | ||
|
410 | ; ************************************************* | |
|
411 | ; `cache_perms` cache for permission tree, auth TTL | |
|
412 | ; ************************************************* | |
|
328 | 413 | rc_cache.cache_perms.backend = dogpile.cache.rc.file_namespace |
|
329 | 414 | rc_cache.cache_perms.expiration_time = 300 |
|
415 | ; file cache store path. Defaults to `cache_dir =` value or tempdir if both values are not set | |
|
416 | #rc_cache.cache_perms.arguments.filename = /tmp/cache_perms.db | |
|
330 | 417 | |
|
331 |
|
|
|
418 | ; alternative `cache_perms` redis backend with distributed lock | |
|
332 | 419 | #rc_cache.cache_perms.backend = dogpile.cache.rc.redis |
|
333 | 420 | #rc_cache.cache_perms.expiration_time = 300 |
|
334 | ## redis_expiration_time needs to be greater then expiration_time | |
|
421 | ||
|
422 | ; redis_expiration_time needs to be greater than expiration_time | 
|
335 | 423 | #rc_cache.cache_perms.arguments.redis_expiration_time = 7200 |
|
336 | #rc_cache.cache_perms.arguments.socket_timeout = 30 | |
|
424 | ||
|
337 | 425 | #rc_cache.cache_perms.arguments.host = localhost |
|
338 | 426 | #rc_cache.cache_perms.arguments.port = 6379 |
|
339 | 427 | #rc_cache.cache_perms.arguments.db = 0 |
|
340 | ## more Redis options: https://dogpilecache.sqlalchemy.org/en/latest/api.html#redis-backends | |
|
428 | #rc_cache.cache_perms.arguments.socket_timeout = 30 | |
|
429 | ; more Redis options: https://dogpilecache.sqlalchemy.org/en/latest/api.html#redis-backends | |
|
341 | 430 | #rc_cache.cache_perms.arguments.distributed_lock = true |
|
342 | 431 | |
|
343 | ## `cache_repo` cache settings for FileTree, Readme, RSS FEEDS | |
|
432 | ||
|
433 | ; *************************************************** | |
|
434 | ; `cache_repo` cache for file tree, Readme, RSS FEEDS | |
|
435 | ; *************************************************** | |
|
344 | 436 | rc_cache.cache_repo.backend = dogpile.cache.rc.file_namespace |
|
345 | 437 | rc_cache.cache_repo.expiration_time = 2592000 |
|
438 | ; file cache store path. Defaults to `cache_dir =` value or tempdir if both values are not set | |
|
439 | #rc_cache.cache_repo.arguments.filename = /tmp/cache_repo.db | |
|
346 | 440 | |
|
347 |
|
|
|
441 | ; alternative `cache_repo` redis backend with distributed lock | |
|
348 | 442 | #rc_cache.cache_repo.backend = dogpile.cache.rc.redis |
|
349 | 443 | #rc_cache.cache_repo.expiration_time = 2592000 |
|
350 | ## redis_expiration_time needs to be greater then expiration_time | |
|
444 | ||
|
445 | ; redis_expiration_time needs to be greater than expiration_time | 
|
351 | 446 | #rc_cache.cache_repo.arguments.redis_expiration_time = 2678400 |
|
352 | #rc_cache.cache_repo.arguments.socket_timeout = 30 | |
|
447 | ||
|
353 | 448 | #rc_cache.cache_repo.arguments.host = localhost |
|
354 | 449 | #rc_cache.cache_repo.arguments.port = 6379 |
|
355 | 450 | #rc_cache.cache_repo.arguments.db = 1 |
|
356 | ## more Redis options: https://dogpilecache.sqlalchemy.org/en/latest/api.html#redis-backends | |
|
451 | #rc_cache.cache_repo.arguments.socket_timeout = 30 | |
|
452 | ; more Redis options: https://dogpilecache.sqlalchemy.org/en/latest/api.html#redis-backends | |
|
357 | 453 | #rc_cache.cache_repo.arguments.distributed_lock = true |
|
358 | 454 | |
|
359 | ## cache settings for SQL queries, this needs to use memory type backend | |
|
360 | rc_cache.sql_cache_short.backend = dogpile.cache.rc.memory_lru | |
|
361 | rc_cache.sql_cache_short.expiration_time = 30 | |
|
362 | 455 | |
|
363 | ## `cache_repo_longterm` cache for repo object instances, this needs to use memory | |
|
364 | ## type backend as the objects kept are not pickle serializable | |
|
365 | rc_cache.cache_repo_longterm.backend = dogpile.cache.rc.memory_lru | |
|
366 | ## by default we use 96H, this is using invalidation on push anyway | |
|
367 | rc_cache.cache_repo_longterm.expiration_time = 345600 | |
|
368 | ## max items in LRU cache, reduce this number to save memory, and expire last used | |
|
369 | ## cached objects | |
|
370 | rc_cache.cache_repo_longterm.max_size = 10000 | |
|
456 | ; ############## | |
|
457 | ; BEAKER SESSION | |
|
458 | ; ############## | |
|
371 | 459 | |
|
372 | ||
|
373 | #################################### | |
|
374 | ### BEAKER SESSION #### | |
|
375 | #################################### | |
|
376 | ||
|
377 | ## .session.type is type of storage options for the session, current allowed | |
|
378 | ## types are file, ext:memcached, ext:redis, ext:database, and memory (default). | |
|
460 | ; beaker.session.type is type of storage options for the logged users sessions. Current allowed | |
|
461 | ; types are file, ext:redis, ext:database, ext:memcached, and memory (default if not specified). | |
|
462 | ; Fastest ones are Redis and ext:database | |
|
379 | 463 | beaker.session.type = file |
|
380 | 464 | beaker.session.data_dir = %(here)s/data/sessions |
|
381 | 465 | |
|
382 |
|
|
|
466 | ; Redis based sessions | |
|
383 | 467 | #beaker.session.type = ext:redis |
|
384 | 468 | #beaker.session.url = redis://127.0.0.1:6379/2 |
|
385 | 469 | |
|
386 |
|
|
|
470 | ; DB based session, fast, and allows easy management of logged in users | 
|
387 | 471 | #beaker.session.type = ext:database |
|
388 | 472 | #beaker.session.table_name = db_session |
|
389 | 473 | #beaker.session.sa.url = postgresql://postgres:secret@localhost/rhodecode |
@@ -395,267 +479,275 b' beaker.session.key = rhodecode' | |||
|
395 | 479 | beaker.session.secret = develop-rc-uytcxaz |
|
396 | 480 | beaker.session.lock_dir = %(here)s/data/sessions/lock |
|
397 | 481 | |
|
398 |
|
|
|
399 |
|
|
|
482 | ; Secure encrypted cookie. Requires AES and AES python libraries | |
|
483 | ; you must disable beaker.session.secret to use this | |
|
400 | 484 | #beaker.session.encrypt_key = key_for_encryption |
|
401 | 485 | #beaker.session.validate_key = validation_key |
|
402 | 486 | |
|
403 |
|
|
|
404 |
|
|
|
487 | ; Sets session as invalid (also logging out the user) if it has not been | 
|
488 | ; accessed for a given amount of time in seconds | 
|
405 | 489 | beaker.session.timeout = 2592000 |
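The timeout above is a plain number of seconds; 2592000 works out to 30 days, which a quick check confirms:

    # 2592000 seconds == 30 days
    session_timeout = 30 * 24 * 60 * 60
    assert session_timeout == 2592000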
|
406 | 490 | beaker.session.httponly = true |
|
407 | ## Path to use for the cookie. Set to prefix if you use prefix middleware | |
|
491 | ||
|
492 | ; Path to use for the cookie. Set to prefix if you use prefix middleware | |
|
408 | 493 | #beaker.session.cookie_path = /custom_prefix |
|
409 | 494 | |
|
410 |
|
|
|
495 | ; Set https secure cookie | |
|
411 | 496 | beaker.session.secure = false |
|
412 | 497 | |
|
413 | ## auto save the session to not to use .save() | |
|
414 | beaker.session.auto = false | |
|
415 | ||
|
416 | ## default cookie expiration time in seconds, set to `true` to set expire | |
|
417 | ## at browser close | |
|
498 | ; default cookie expiration time in seconds, set to `true` to set expire | |
|
499 | ; at browser close | |
|
418 | 500 | #beaker.session.cookie_expires = 3600 |
|
419 | 501 | |
|
420 |
|
|
|
421 |
|
|
|
422 |
|
|
|
423 | ## Full text search indexer is available in rhodecode-tools under | |
|
424 | ## `rhodecode-tools index` command | |
|
502 | ; ############################# | |
|
503 | ; SEARCH INDEXING CONFIGURATION | |
|
504 | ; ############################# | |
|
425 | 505 | |
|
426 | ## WHOOSH Backend, doesn't require additional services to run | |
|
427 | ## it works good with few dozen repos | |
|
506 | ; Full text search indexer is available in rhodecode-tools under | |
|
507 | ; `rhodecode-tools index` command | |
|
508 | ||
|
509 | ; WHOOSH Backend, doesn't require additional services to run | |
|
510 | ; it works well with a few dozen repos | 
|
428 | 511 | search.module = rhodecode.lib.index.whoosh |
|
429 | 512 | search.location = %(here)s/data/index |
|
430 | 513 | |
|
431 |
|
|
|
432 |
|
|
|
433 |
|
|
|
434 | ## channelstream enables persistent connections and live notification | |
|
435 | ## in the system. It's also used by the chat system | |
|
514 | ; #################### | |
|
515 | ; CHANNELSTREAM CONFIG | |
|
516 | ; #################### | |
|
517 | ||
|
518 | ; channelstream enables persistent connections and live notification | |
|
519 | ; in the system. It's also used by the chat system | |
|
436 | 520 | |
|
437 | 521 | channelstream.enabled = false |
|
438 | 522 | |
|
439 |
|
|
|
523 | ; server address for channelstream server on the backend | |
|
440 | 524 | channelstream.server = 127.0.0.1:9800 |
|
441 | 525 | |
|
442 |
|
|
|
443 |
|
|
|
444 |
|
|
|
445 |
|
|
|
526 | ; location of the channelstream server from outside world | |
|
527 | ; use ws:// for http or wss:// for https. This address needs to be handled | |
|
528 | ; by external HTTP server such as Nginx or Apache | |
|
529 | ; see Nginx/Apache configuration examples in our docs | |
|
446 | 530 | channelstream.ws_url = ws://rhodecode.yourserver.com/_channelstream |
|
447 | 531 | channelstream.secret = secret |
|
448 | 532 | channelstream.history.location = %(here)s/channelstream_history |
|
449 | 533 | |
|
450 |
|
|
|
451 |
|
|
|
534 | ; Internal application path that Javascript uses to connect into. | |
|
535 | ; If you use proxy-prefix the prefix should be added before /_channelstream | |
|
452 | 536 | channelstream.proxy_path = /_channelstream |
|
453 | 537 | |
|
454 | 538 | |
|
455 |
|
|
|
456 | ## APPENLIGHT CONFIG ## | |
|
457 |
|
|
|
539 | ; ############################## | |
|
540 | ; MAIN RHODECODE DATABASE CONFIG | |
|
541 | ; ############################## | |
|
542 | ||
|
543 | #sqlalchemy.db1.url = sqlite:///%(here)s/rhodecode.db?timeout=30 | |
|
544 | #sqlalchemy.db1.url = postgresql://postgres:qweqwe@localhost/rhodecode | |
|
545 | #sqlalchemy.db1.url = mysql://root:qweqwe@localhost/rhodecode?charset=utf8 | |
|
546 | ; pymysql is an alternative driver for MySQL, use in case of problems with default one | |
|
547 | #sqlalchemy.db1.url = mysql+pymysql://root:qweqwe@localhost/rhodecode | |
|
548 | ||
|
549 | sqlalchemy.db1.url = sqlite:///%(here)s/rhodecode.db?timeout=30 | |
|
550 | ||
|
551 | ; see sqlalchemy docs for other advanced settings | |
|
552 | ; print the sql statements to output | |
|
553 | sqlalchemy.db1.echo = false | |
|
554 | ||
|
555 | ; recycle the connections after this amount of seconds | |
|
556 | sqlalchemy.db1.pool_recycle = 3600 | |
|
557 | sqlalchemy.db1.convert_unicode = true | |
|
558 | ||
|
559 | ; the number of connections to keep open inside the connection pool. | |
|
560 | ; 0 indicates no limit | |
|
561 | #sqlalchemy.db1.pool_size = 5 | |
|
562 | ||
|
563 | ; The number of connections to allow in connection pool "overflow", that is | |
|
564 | ; connections that can be opened above and beyond the pool_size setting, | |
|
565 | ; which defaults to five. | |
|
566 | #sqlalchemy.db1.max_overflow = 10 | |
|
567 | ||
|
568 | ; Connection check ping, used to detect broken database connections | |
|
569 | ; could be enabled to better handle cases of MySQL "has gone away" errors | 
|
570 | #sqlalchemy.db1.ping_connection = true | |
|
571 | ||
|
572 | ; ########## | |
|
573 | ; VCS CONFIG | |
|
574 | ; ########## | |
|
575 | vcs.server.enable = true | |
|
576 | vcs.server = localhost:9900 | |
|
577 | ||
|
578 | ; Web server connectivity protocol, responsible for web based VCS operations | |
|
579 | ; Available protocols are: | |
|
580 | ; `http` - use http-rpc backend (default) | |
|
581 | vcs.server.protocol = http | |
|
582 | ||
|
583 | ; Push/Pull operations protocol, available options are: | |
|
584 | ; `http` - use http-rpc backend (default) | |
|
585 | vcs.scm_app_implementation = http | |
|
586 | ||
|
587 | ; Push/Pull operations hooks protocol, available options are: | |
|
588 | ; `http` - use http-rpc backend (default) | |
|
589 | vcs.hooks.protocol = http | |
|
590 | ||
|
591 | ; Host on which this instance is listening for hooks. If vcsserver is in another location | 
|
592 | ; this should be adjusted. | |
|
593 | vcs.hooks.host = 127.0.0.1 | |
|
594 | ||
|
595 | ; Start VCSServer with this instance as a subprocess, useful for development | |
|
596 | vcs.start_server = false | |
|
597 | ||
|
598 | ; List of enabled VCS backends, available options are: | |
|
599 | ; `hg` - mercurial | |
|
600 | ; `git` - git | |
|
601 | ; `svn` - subversion | |
|
602 | vcs.backends = hg, git, svn | |
|
603 | ||
|
604 | ; Wait this number of seconds before killing connection to the vcsserver | |
|
605 | vcs.connection_timeout = 3600 | |
|
606 | ||
|
607 | ; Compatibility version when creating SVN repositories. Defaults to newest version when commented out. | |
|
608 | ; Available options are: pre-1.4-compatible, pre-1.5-compatible, pre-1.6-compatible, pre-1.8-compatible, pre-1.9-compatible | |
|
609 | #vcs.svn.compatible_version = pre-1.8-compatible | |
|
610 | ||
|
458 | 611 | |
|
459 | ## Appenlight is tailored to work with RhodeCode, see | |
|
460 | ## http://appenlight.com for details how to obtain an account | |
|
612 | ; #################################################### | |
|
613 | ; Subversion proxy support (mod_dav_svn) | |
|
614 | ; Maps RhodeCode repo groups into SVN paths for Apache | |
|
615 | ; #################################################### | |
|
616 | ||
|
617 | ; Enable or disable the config file generation. | |
|
618 | svn.proxy.generate_config = false | |
|
619 | ||
|
620 | ; Generate config file with `SVNListParentPath` set to `On`. | |
|
621 | svn.proxy.list_parent_path = true | |
|
622 | ||
|
623 | ; Set location and file name of generated config file. | |
|
624 | svn.proxy.config_file_path = %(here)s/mod_dav_svn.conf | |
|
625 | ||
|
626 | ; alternative mod_dav config template. This needs to be a valid mako template | |
|
627 | ; Example template can be found in the source code: | |
|
628 | ; rhodecode/apps/svn_support/templates/mod-dav-svn.conf.mako | |
|
629 | #svn.proxy.config_template = ~/.rccontrol/enterprise-1/custom_svn_conf.mako | |
|
630 | ||
|
631 | ; Used as a prefix to the `Location` block in the generated config file. | |
|
632 | ; In most cases it should be set to `/`. | |
|
633 | svn.proxy.location_root = / | |
|
634 | ||
|
635 | ; Command to reload the mod dav svn configuration on change. | |
|
636 | ; Example: `/etc/init.d/apache2 reload` or /home/USER/apache_reload.sh | |
|
637 | ; Make sure user who runs RhodeCode process is allowed to reload Apache | |
|
638 | #svn.proxy.reload_cmd = /etc/init.d/apache2 reload | |
|
639 | ||
|
640 | ; If the timeout expires before the reload command finishes, the command will | |
|
641 | ; be killed. Setting it to zero means no timeout. Defaults to 10 seconds. | |
|
642 | #svn.proxy.reload_timeout = 10 | |
|
643 | ||
|
644 | ; #################### | |
|
645 | ; SSH Support Settings | |
|
646 | ; #################### | |
|
461 | 647 | |
|
462 | ## Appenlight integration enabled | |
|
648 | ; Defines if a custom authorized_keys file should be created and written on | |
|
649 | ; any change of user ssh keys. Setting this to false also disables the possibility | 
|
650 | ; of adding SSH keys by users from web interface. Super admins can still | |
|
651 | ; manage SSH Keys. | |
|
652 | ssh.generate_authorized_keyfile = false | |
|
653 | ||
|
654 | ; Options for ssh, default is `no-pty,no-port-forwarding,no-X11-forwarding,no-agent-forwarding` | |
|
655 | # ssh.authorized_keys_ssh_opts = | |
|
656 | ||
|
657 | ; Path to the authorized_keys file where the generated entries are placed. | 
|
658 | ; It is possible to have multiple key files specified in `sshd_config` e.g. | |
|
659 | ; AuthorizedKeysFile %h/.ssh/authorized_keys %h/.ssh/authorized_keys_rhodecode | |
|
660 | ssh.authorized_keys_file_path = ~/.ssh/authorized_keys_rhodecode | |
|
661 | ||
|
662 | ; Command to execute the SSH wrapper. The binary is available in the | |
|
663 | ; RhodeCode installation directory. | |
|
664 | ; e.g ~/.rccontrol/community-1/profile/bin/rc-ssh-wrapper | |
|
665 | ssh.wrapper_cmd = ~/.rccontrol/community-1/rc-ssh-wrapper | |
|
666 | ||
|
667 | ; Allow shell when executing the ssh-wrapper command | |
|
668 | ssh.wrapper_cmd_allow_shell = false | |
|
669 | ||
|
670 | ; Enables logging, and detailed output sent back to the client during SSH | 
|
671 | ; operations. Useful for debugging, shouldn't be used in production. | |
|
672 | ssh.enable_debug_logging = true | |
|
673 | ||
|
674 | ; Paths to binary executable, by default they are the names, but we can | |
|
675 | ; override them if we want to use a custom one | |
|
676 | ssh.executable.hg = ~/.rccontrol/vcsserver-1/profile/bin/hg | |
|
677 | ssh.executable.git = ~/.rccontrol/vcsserver-1/profile/bin/git | |
|
678 | ssh.executable.svn = ~/.rccontrol/vcsserver-1/profile/bin/svnserve | |
|
679 | ||
|
680 | ; Enables SSH key generator web interface. Disabling this still allows users | |
|
681 | ; to add their own keys. | |
|
682 | ssh.enable_ui_key_generator = true | |
|
683 | ||
|
684 | ||
|
685 | ; ################# | |
|
686 | ; APPENLIGHT CONFIG | |
|
687 | ; ################# | |
|
688 | ||
|
689 | ; Appenlight is tailored to work with RhodeCode, see | |
|
690 | ; http://appenlight.rhodecode.com for details how to obtain an account | |
|
691 | ||
|
692 | ; Appenlight integration enabled | |
|
463 | 693 | appenlight = false |
|
464 | 694 | |
|
465 | 695 | appenlight.server_url = https://api.appenlight.com |
|
466 | 696 | appenlight.api_key = YOUR_API_KEY |
|
467 | 697 | #appenlight.transport_config = https://api.appenlight.com?threaded=1&timeout=5 |
|
468 | 698 | |
|
469 |
|
|
|
699 | ; used for JS client | |
|
470 | 700 | appenlight.api_public_key = YOUR_API_PUBLIC_KEY |
|
471 | 701 | |
|
472 |
|
|
|
702 | ; TWEAK AMOUNT OF INFO SENT HERE | |
|
473 | 703 | |
|
474 |
|
|
|
704 | ; enables 404 error logging (default False) | |
|
475 | 705 | appenlight.report_404 = false |
|
476 | 706 | |
|
477 |
|
|
|
707 | ; time in seconds after which a request is considered slow (default 1) | 
|
478 | 708 | appenlight.slow_request_time = 1 |
|
479 | 709 | |
|
480 |
|
|
|
481 |
|
|
|
710 | ; record slow requests in application | |
|
711 | ; (needs to be enabled for slow datastore recording and time tracking) | |
|
482 | 712 | appenlight.slow_requests = true |
|
483 | 713 | |
|
484 |
|
|
|
714 | ; enable hooking to application loggers | |
|
485 | 715 | appenlight.logging = true |
|
486 | 716 | |
|
487 |
|
|
|
717 | ; minimum log level for log capture | |
|
488 | 718 | appenlight.logging.level = WARNING |
|
489 | 719 | |
|
490 |
|
|
|
491 |
|
|
|
720 | ; send logs only from erroneous/slow requests | |
|
721 | ; (saves API quota for intensive logging) | |
|
492 | 722 | appenlight.logging_on_error = false |
|
493 | 723 | |
|
494 |
|
|
|
495 |
|
|
|
496 |
|
|
|
497 |
|
|
|
498 |
|
|
|
724 | ; list of additional keywords that should be grabbed from environ object | |
|
725 | ; can be string with comma separated list of words in lowercase | |
|
726 | ; (by default client will always send following info: | |
|
727 | ; 'REMOTE_USER', 'REMOTE_ADDR', 'SERVER_NAME', 'CONTENT_TYPE' + all keys that | |
|
728 | ; start with HTTP*); this list can be extended with additional keywords here | 
|
499 | 729 | appenlight.environ_keys_whitelist = |
|
500 | 730 | |
|
501 |
|
|
|
502 |
|
|
|
503 |
|
|
|
504 |
|
|
|
505 |
|
|
|
731 | ; list of keywords that should be blanked from request object | |
|
732 | ; can be string with comma separated list of words in lowercase | |
|
733 | ; (by default client will always blank keys that contain following words | |
|
734 | ; 'password', 'passwd', 'pwd', 'auth_tkt', 'secret', 'csrf' | |
|
735 | ; this list can be extended with additional keywords set here | 
|
506 | 736 | appenlight.request_keys_blacklist = |
|
507 | 737 | |
|
508 |
|
|
|
509 |
|
|
|
510 |
|
|
|
738 | ; list of namespaces that should be ignored when gathering log entries | 
|
739 | ; can be string with comma separated list of namespaces | |
|
740 | ; (by default the client ignores own entries: appenlight_client.client) | |
|
511 | 741 | appenlight.log_namespace_blacklist = |
|
512 | 742 | |
|
513 | # enable debug style page | |
|
514 | debug_style = true | |
|
515 | ||
|
516 | ########################################### | |
|
517 | ### MAIN RHODECODE DATABASE CONFIG ### | |
|
518 | ########################################### | |
|
519 | #sqlalchemy.db1.url = sqlite:///%(here)s/rhodecode.db?timeout=30 | |
|
520 | #sqlalchemy.db1.url = postgresql://postgres:qweqwe@localhost/rhodecode | |
|
521 | #sqlalchemy.db1.url = mysql://root:qweqwe@localhost/rhodecode?charset=utf8 | |
|
522 | # pymysql is an alternative driver for MySQL, use in case of problems with default one | |
|
523 | #sqlalchemy.db1.url = mysql+pymysql://root:qweqwe@localhost/rhodecode | |
|
524 | ||
|
525 | sqlalchemy.db1.url = sqlite:///%(here)s/rhodecode.db?timeout=30 | |
|
526 | ||
|
527 | # see sqlalchemy docs for other advanced settings | |
|
528 | ||
|
529 | ## print the sql statements to output | |
|
530 | sqlalchemy.db1.echo = false | |
|
531 | ## recycle the connections after this amount of seconds | |
|
532 | sqlalchemy.db1.pool_recycle = 3600 | |
|
533 | sqlalchemy.db1.convert_unicode = true | |
|
534 | ||
|
535 | ## the number of connections to keep open inside the connection pool. | |
|
536 | ## 0 indicates no limit | |
|
537 | #sqlalchemy.db1.pool_size = 5 | |
|
538 | ||
|
539 | ## the number of connections to allow in connection pool "overflow", that is | |
|
540 | ## connections that can be opened above and beyond the pool_size setting, | |
|
541 | ## which defaults to five. | |
|
542 | #sqlalchemy.db1.max_overflow = 10 | |
|
543 | ||
|
544 | ## Connection check ping, used to detect broken database connections | |
|
545 | ## could be enabled to better handle cases if MySQL has gone away errors | |
|
546 | #sqlalchemy.db1.ping_connection = true | |
|
547 | ||
|
548 | ################## | |
|
549 | ### VCS CONFIG ### | |
|
550 | ################## | |
|
551 | vcs.server.enable = true | |
|
552 | vcs.server = localhost:9900 | |
|
553 | ||
|
554 | ## Web server connectivity protocol, responsible for web based VCS operations | |
|
555 | ## Available protocols are: | |
|
556 | ## `http` - use http-rpc backend (default) | |
|
557 | vcs.server.protocol = http | |
|
558 | ||
|
559 | ## Push/Pull operations protocol, available options are: | |
|
560 | ## `http` - use http-rpc backend (default) | |
|
561 | vcs.scm_app_implementation = http | |
|
562 | ||
|
563 | ## Push/Pull operations hooks protocol, available options are: | |
|
564 | ## `http` - use http-rpc backend (default) | |
|
565 | vcs.hooks.protocol = http | |
|
566 | ||
|
567 | ## Host on which this instance is listening for hooks. If vcsserver is in other location | |
|
568 | ## this should be adjusted. | |
|
569 | vcs.hooks.host = 127.0.0.1 | |
|
570 | ||
|
571 | vcs.server.log_level = debug | |
|
572 | ## Start VCSServer with this instance as a subprocess, useful for development | |
|
573 | vcs.start_server = false | |
|
574 | ||
|
575 | ## List of enabled VCS backends, available options are: | |
|
576 | ## `hg` - mercurial | |
|
577 | ## `git` - git | |
|
578 | ## `svn` - subversion | |
|
579 | vcs.backends = hg, git, svn | |
|
580 | ||
|
581 | vcs.connection_timeout = 3600 | |
|
582 | ## Compatibility version when creating SVN repositories. Defaults to newest version when commented out. | |
|
583 | ## Available options are: pre-1.4-compatible, pre-1.5-compatible, pre-1.6-compatible, pre-1.8-compatible, pre-1.9-compatible | |
|
584 | #vcs.svn.compatible_version = pre-1.8-compatible | |
|
585 | ||
|
586 | ||
|
587 | ############################################################ | |
|
588 | ### Subversion proxy support (mod_dav_svn) ### | |
|
589 | ### Maps RhodeCode repo groups into SVN paths for Apache ### | |
|
590 | ############################################################ | |
|
591 | ## Enable or disable the config file generation. | |
|
592 | svn.proxy.generate_config = false | |
|
593 | ## Generate config file with `SVNListParentPath` set to `On`. | |
|
594 | svn.proxy.list_parent_path = true | |
|
595 | ## Set location and file name of generated config file. | |
|
596 | svn.proxy.config_file_path = %(here)s/mod_dav_svn.conf | |
|
597 | ## alternative mod_dav config template. This needs to be a mako template | |
|
598 | #svn.proxy.config_template = ~/.rccontrol/enterprise-1/custom_svn_conf.mako | |
|
599 | ## Used as a prefix to the `Location` block in the generated config file. | |
|
600 | ## In most cases it should be set to `/`. | |
|
601 | svn.proxy.location_root = / | |
|
602 | ## Command to reload the mod dav svn configuration on change. | |
|
603 | ## Example: `/etc/init.d/apache2 reload` or /home/USER/apache_reload.sh | |
|
604 | ## Make sure user who runs RhodeCode process is allowed to reload Apache | |
|
605 | #svn.proxy.reload_cmd = /etc/init.d/apache2 reload | |
|
606 | ## If the timeout expires before the reload command finishes, the command will | |
|
607 | ## be killed. Setting it to zero means no timeout. Defaults to 10 seconds. | |
|
608 | #svn.proxy.reload_timeout = 10 | |
|
609 | ||
|
610 | ############################################################ | |
|
611 | ### SSH Support Settings ### | |
|
612 | ############################################################ | |
|
613 | ||
|
614 | ## Defines if a custom authorized_keys file should be created and written on | |
|
615 | ## any change user ssh keys. Setting this to false also disables possibility | |
|
616 | ## of adding SSH keys by users from web interface. Super admins can still | |
|
617 | ## manage SSH Keys. | |
|
618 | ssh.generate_authorized_keyfile = false | |
|
619 | ||
|
620 | ## Options for ssh, default is `no-pty,no-port-forwarding,no-X11-forwarding,no-agent-forwarding` | |
|
621 | # ssh.authorized_keys_ssh_opts = | |
|
622 | ||
|
623 | ## Path to the authorized_keys file where the generate entries are placed. | |
|
624 | ## It is possible to have multiple key files specified in `sshd_config` e.g. | |
|
625 | ## AuthorizedKeysFile %h/.ssh/authorized_keys %h/.ssh/authorized_keys_rhodecode | |
|
626 | ssh.authorized_keys_file_path = ~/.ssh/authorized_keys_rhodecode | |
|
627 | ||
|
628 | ## Command to execute the SSH wrapper. The binary is available in the | |
|
629 | ## RhodeCode installation directory. | |
|
630 | ## e.g ~/.rccontrol/community-1/profile/bin/rc-ssh-wrapper | |
|
631 | ssh.wrapper_cmd = ~/.rccontrol/community-1/rc-ssh-wrapper | |
|
632 | ||
|
633 | ## Allow shell when executing the ssh-wrapper command | |
|
634 | ssh.wrapper_cmd_allow_shell = false | |
|
635 | ||
|
636 | ## Enables logging, and detailed output send back to the client during SSH | |
|
637 | ## operations. Useful for debugging, shouldn't be used in production. | |
|
638 | ssh.enable_debug_logging = true | |
|
639 | ||
|
640 | ## Paths to binary executable, by default they are the names, but we can | |
|
641 | ## override them if we want to use a custom one | |
|
642 | ssh.executable.hg = ~/.rccontrol/vcsserver-1/profile/bin/hg | |
|
643 | ssh.executable.git = ~/.rccontrol/vcsserver-1/profile/bin/git | |
|
644 | ssh.executable.svn = ~/.rccontrol/vcsserver-1/profile/bin/svnserve | |
|
645 | ||
|
646 | ## Enables SSH key generator web interface. Disabling this still allows users | |
|
647 | ## to add their own keys. | |
|
648 | ssh.enable_ui_key_generator = true | |
|
649 | ||
|
650 | ||
|
651 | ## Dummy marker to add new entries after. | |
|
652 | ## Add any custom entries below. Please don't remove. | |
|
743 | ; Dummy marker to add new entries after. | |
|
744 | ; Add any custom entries below. Please don't remove this marker. | |
|
653 | 745 | custom.conf = 1 |
|
654 | 746 | |
|
655 | 747 | |
|
656 |
|
|
|
657 |
|
|
|
658 |
|
|
|
748 | ; ##################### | |
|
749 | ; LOGGING CONFIGURATION | |
|
750 | ; ##################### | |
|
659 | 751 | [loggers] |
|
660 | 752 | keys = root, sqlalchemy, beaker, celery, rhodecode, ssh_wrapper |
|
661 | 753 | |
@@ -665,9 +757,9 b' keys = console, console_sql' | |||
|
665 | 757 | [formatters] |
|
666 | 758 | keys = generic, color_formatter, color_formatter_sql |
|
667 | 759 | |
|
668 |
|
|
|
669 |
|
|
|
670 |
|
|
|
760 | ; ####### | |
|
761 | ; LOGGERS | |
|
762 | ; ####### | |
|
671 | 763 | [logger_root] |
|
672 | 764 | level = NOTSET |
|
673 | 765 | handlers = console |
@@ -702,9 +794,9 b' handlers =' | |||
|
702 | 794 | qualname = celery |
|
703 | 795 | |
|
704 | 796 | |
|
705 |
|
|
|
706 |
|
|
|
707 |
|
|
|
797 | ; ######## | |
|
798 | ; HANDLERS | |
|
799 | ; ######## | |
|
708 | 800 | |
|
709 | 801 | [handler_console] |
|
710 | 802 | class = StreamHandler |
@@ -713,17 +805,17 b' level = DEBUG' | |||
|
713 | 805 | formatter = color_formatter |
|
714 | 806 | |
|
715 | 807 | [handler_console_sql] |
|
716 |
|
|
|
717 |
|
|
|
718 |
|
|
|
808 | ; "level = DEBUG" logs SQL queries and results. | |
|
809 | ; "level = INFO" logs SQL queries. | |
|
810 | ; "level = WARN" logs neither. (Recommended for production systems.) | |
|
719 | 811 | class = StreamHandler |
|
720 | 812 | args = (sys.stderr, ) |
|
721 | 813 | level = WARN |
|
722 | 814 | formatter = color_formatter_sql |
|
723 | 815 | |
|
724 |
|
|
|
725 |
|
|
|
726 |
|
|
|
816 | ; ########## | |
|
817 | ; FORMATTERS | |
|
818 | ; ########## | |
|
727 | 819 | |
|
728 | 820 | [formatter_generic] |
|
729 | 821 | class = rhodecode.lib.logging_formatter.ExceptionAwareFormatter |
@@ -1,58 +1,26 b'' | |||
|
1 | 1 | """ |
|
2 | gunicorn config extension and hooks. Sets additional configuration that is | |
|
3 | available post the .ini config. | |
|
4 | ||
|
5 | - workers = ${cpu_number} | |
|
6 | - threads = 1 | |
|
7 | - proc_name = ${gunicorn_proc_name} | |
|
8 | - worker_class = sync | |
|
9 | - worker_connections = 10 | |
|
10 | - max_requests = 1000 | |
|
11 | - max_requests_jitter = 30 | |
|
12 | - timeout = 21600 | |
|
13 | ||
|
2 | Gunicorn config extension and hooks. This config file adds some extra settings and memory management. | |
|
3 | Gunicorn configuration should be managed by .ini files entries of RhodeCode or VCSServer | |
|
14 | 4 | """ |
|
15 | 5 | |
|
16 | import multiprocessing | |
|
6 | import gc | |
|
7 | import os | |
|
17 | 8 | import sys |
|
9 | import math | |
|
18 | 10 | import time |
|
19 | import datetime | |
|
20 | 11 | import threading |
|
21 | 12 | import traceback |
|
13 | import random | |
|
22 | 14 | from gunicorn.glogging import Logger |
|
23 | 15 | |
|
24 | 16 | |
|
17 | def get_workers(): | |
|
18 | import multiprocessing | |
|
19 | return multiprocessing.cpu_count() * 2 + 1 | |
|
20 | ||
|
25 | 21 | # GLOBAL |
|
26 | 22 | errorlog = '-' |
|
27 | 23 | accesslog = '-' |
|
28 | loglevel = 'debug' | |
|
29 | ||
|
30 | # SECURITY | |
|
31 | ||
|
32 | # The maximum size of HTTP request line in bytes. | |
|
33 | # 0 for unlimited | |
|
34 | limit_request_line = 0 | |
|
35 | ||
|
36 | # Limit the number of HTTP headers fields in a request. | |
|
37 | # By default this value is 100 and can't be larger than 32768. | |
|
38 | limit_request_fields = 10240 | |
|
39 | ||
|
40 | # Limit the allowed size of an HTTP request header field. | |
|
41 | # Value is a positive number or 0. | |
|
42 | # Setting it to 0 will allow unlimited header field sizes. | |
|
43 | limit_request_field_size = 0 | |
|
44 | ||
|
45 | ||
|
46 | # Timeout for graceful workers restart. | |
|
47 | # After receiving a restart signal, workers have this much time to finish | |
|
48 | # serving requests. Workers still alive after the timeout (starting from the | |
|
49 | # receipt of the restart signal) are force killed. | |
|
50 | graceful_timeout = 30 | |
|
51 | ||
|
52 | ||
|
53 | # The number of seconds to wait for requests on a Keep-Alive connection. | |
|
54 | # Generally set in the 1-5 seconds range. | |
|
55 | keepalive = 2 | |
|
56 | 24 | |
|
57 | 25 | |
|
58 | 26 | # SERVER MECHANICS |
@@ -63,38 +31,178 b' tmp_upload_dir = None' | |||
|
63 | 31 | |
|
64 | 32 | # Custom log format |
|
65 | 33 | access_log_format = ( |
|
66 | '%(t)s
|
|
34 | '%(t)s %(p)s INFO [GNCRN] %(h)-15s rqt:%(L)s %(s)s %(b)-6s "%(m)s:%(U)s %(q)s" usr:%(u)s "%(f)s" "%(a)s"') | |
|
67 | 35 | |
|
68 | 36 | # self adjust workers based on CPU count |
|
69 | # workers = multiprocessing.cpu_count() * 2 + 1 | |
|
37 | # workers = get_workers() | |
|
38 | ||
|
39 | ||
|
40 | def _get_process_rss(pid=None): | |
|
41 | try: | |
|
42 | import psutil | |
|
43 | if pid: | |
|
44 | proc = psutil.Process(pid) | |
|
45 | else: | |
|
46 | proc = psutil.Process() | |
|
47 | return proc.memory_info().rss | |
|
48 | except Exception: | |
|
49 | return None | |
|
70 | 50 | |
|
71 | 51 | |
|
72 | def post_fork(server, worker): | |
|
73 | server.log.info("[<%-10s>] WORKER spawned", worker.pid) | |
|
52 | def _get_config(ini_path): | |
|
53 | ||
|
54 | try: | |
|
55 | import configparser | |
|
56 | except ImportError: | |
|
57 | import ConfigParser as configparser | |
|
58 | try: | |
|
59 | config = configparser.RawConfigParser() | |
|
60 | config.read(ini_path) | |
|
61 | return config | |
|
62 | except Exception: | |
|
63 | return None | |
|
64 | ||
|
65 | ||
|
66 | def _time_with_offset(memory_usage_check_interval): | |
|
67 | return time.time() - random.randint(0, memory_usage_check_interval/2.0) | |
|
74 | 68 | |
|
75 | 69 | |
|
76 | 70 | def pre_fork(server, worker): |
|
77 | 71 | pass |
|
78 | 72 | |
|
79 | 73 | |
|
74 | def post_fork(server, worker): | |
|
75 | ||
|
76 | # memory spec defaults | |
|
77 | _memory_max_usage = 0 | |
|
78 | _memory_usage_check_interval = 60 | |
|
79 | _memory_usage_recovery_threshold = 0.8 | |
|
80 | ||
|
81 | ini_path = os.path.abspath(server.cfg.paste) | |
|
82 | conf = _get_config(ini_path) | |
|
83 | ||
|
84 | section = 'server:main' | |
|
85 | if conf and conf.has_section(section): | |
|
86 | ||
|
87 | if conf.has_option(section, 'memory_max_usage'): | |
|
88 | _memory_max_usage = conf.getint(section, 'memory_max_usage') | |
|
89 | ||
|
90 | if conf.has_option(section, 'memory_usage_check_interval'): | |
|
91 | _memory_usage_check_interval = conf.getint(section, 'memory_usage_check_interval') | |
|
92 | ||
|
93 | if conf.has_option(section, 'memory_usage_recovery_threshold'): | |
|
94 | _memory_usage_recovery_threshold = conf.getfloat(section, 'memory_usage_recovery_threshold') | |
|
95 | ||
|
96 | worker._memory_max_usage = _memory_max_usage | |
|
97 | worker._memory_usage_check_interval = _memory_usage_check_interval | |
|
98 | worker._memory_usage_recovery_threshold = _memory_usage_recovery_threshold | |
|
99 | ||
|
100 | # register memory last check time, with some random offset so we don't recycle all | |
|
101 | # at once | |
|
102 | worker._last_memory_check_time = _time_with_offset(_memory_usage_check_interval) | |
|
103 | ||
|
104 | if _memory_max_usage: | |
|
105 | server.log.info("[%-10s] WORKER spawned with max memory set at %s", worker.pid, | |
|
106 | _format_data_size(_memory_max_usage)) | |
|
107 | else: | |
|
108 | server.log.info("[%-10s] WORKER spawned", worker.pid) | |
|
109 | ||
|
110 | ||
|
80 | 111 | def pre_exec(server): |
|
81 | 112 | server.log.info("Forked child, re-executing.") |
|
82 | 113 | |
|
83 | 114 | |
|
84 | 115 | def on_starting(server): |
|
85 | server.log.info("Server is starting.") | |
|
116 | server_lbl = '{} {}'.format(server.proc_name, server.address) | |
|
117 | server.log.info("Server %s is starting.", server_lbl) | |
|
86 | 118 | |
|
87 | 119 | |
|
88 | 120 | def when_ready(server): |
|
89 | server.log.info("Server is ready. Spawning workers") | |
|
121 | server.log.info("Server %s is ready. Spawning workers", server) | |
|
90 | 122 | |
|
91 | 123 | |
|
92 | 124 | def on_reload(server): |
|
93 | 125 | pass |
|
94 | 126 | |
|
95 | 127 | |
|
128 | def _format_data_size(size, unit="B", precision=1, binary=True): | |
|
129 | """Format a number using SI units (kilo, mega, etc.). | |
|
130 | ||
|
131 | ``size``: The number as a float or int. | |
|
132 | ||
|
133 | ``unit``: The unit name in plural form. Examples: "bytes", "B". | |
|
134 | ||
|
135 | ``precision``: How many digits to the right of the decimal point. Default | |
|
136 | is 1. 0 suppresses the decimal point. | |
|
137 | ||
|
138 | ``binary``: If false, use base-10 decimal prefixes (kilo = K = 1000). | |
|
139 | If true, use base-2 binary prefixes (kibi = Ki = 1024). | |
|
140 | ||
|
141 | ``full_name``: If false (default), use the prefix abbreviation ("k" or | |
|
142 | "Ki"). If true, use the full prefix ("kilo" or "kibi"). If false, | |
|
143 | use abbreviation ("k" or "Ki"). | |
|
144 | ||
|
145 | """ | |
|
146 | ||
|
147 | if not binary: | |
|
148 | base = 1000 | |
|
149 | multiples = ('', 'k', 'M', 'G', 'T', 'P', 'E', 'Z', 'Y') | |
|
150 | else: | |
|
151 | base = 1024 | |
|
152 | multiples = ('', 'Ki', 'Mi', 'Gi', 'Ti', 'Pi', 'Ei', 'Zi', 'Yi') | |
|
153 | ||
|
154 | sign = "" | |
|
155 | if size > 0: | |
|
156 | m = int(math.log(size, base)) | |
|
157 | elif size < 0: | |
|
158 | sign = "-" | |
|
159 | size = -size | |
|
160 | m = int(math.log(size, base)) | |
|
161 | else: | |
|
162 | m = 0 | |
|
163 | if m > 8: | |
|
164 | m = 8 | |
|
165 | ||
|
166 | if m == 0: | |
|
167 | precision = '%.0f' | |
|
168 | else: | |
|
169 | precision = '%%.%df' % precision | |
|
170 | ||
|
171 | size = precision % (size / math.pow(base, m)) | |
|
172 | ||
|
173 | return '%s%s %s%s' % (sign, size.strip(), multiples[m], unit) | |
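A quick sanity check of the helper above (values follow the binary, base-2 branch), assuming `_format_data_size` from the hook file is in scope:

    # illustrative usage of the _format_data_size helper defined above
    assert _format_data_size(0) == '0 B'
    assert _format_data_size(2048) == '2.0 KiB'
    assert _format_data_size(536870912) == '512.0 MiB'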
|
174 | ||
|
175 | ||
|
176 | def _check_memory_usage(worker): | |
|
177 | memory_max_usage = worker._memory_max_usage | |
|
178 | if not memory_max_usage: | |
|
179 | return | |
|
180 | ||
|
181 | memory_usage_check_interval = worker._memory_usage_check_interval | |
|
182 | memory_usage_recovery_threshold = memory_max_usage * worker._memory_usage_recovery_threshold | |
|
183 | ||
|
184 | elapsed = time.time() - worker._last_memory_check_time | |
|
185 | if elapsed > memory_usage_check_interval: | |
|
186 | mem_usage = _get_process_rss() | |
|
187 | if mem_usage and mem_usage > memory_max_usage: | |
|
188 | worker.log.info( | |
|
189 | "memory usage %s > %s, forcing gc", | |
|
190 | _format_data_size(mem_usage), _format_data_size(memory_max_usage)) | |
|
191 | # Try to clean it up by forcing a full collection. | |
|
192 | gc.collect() | |
|
193 | mem_usage = _get_process_rss() | |
|
194 | if mem_usage > memory_usage_recovery_threshold: | |
|
195 | # Didn't clean up enough, we'll have to terminate. | |
|
196 | worker.log.warning( | |
|
197 | "memory usage %s > %s after gc, quitting", | |
|
198 | _format_data_size(mem_usage), _format_data_size(memory_max_usage)) | |
|
199 | # This will cause worker to auto-restart itself | |
|
200 | worker.alive = False | |
|
201 | worker._last_memory_check_time = time.time() | |
|
202 | ||
|
203 | ||
|
96 | 204 | def worker_int(worker): |
|
97 | worker.log.info("[
|
|
205 | worker.log.info("[%-10s] worker received INT or QUIT signal", worker.pid) | |
|
98 | 206 | |
|
99 | 207 | # get traceback info, on worker crash |
|
100 | 208 | id2name = dict([(th.ident, th.name) for th in threading.enumerate()]) |
@@ -110,15 +218,15 b' def worker_int(worker):' | |||
|
110 | 218 | |
|
111 | 219 | |
|
112 | 220 | def worker_abort(worker): |
|
113 | worker.log.info("[
|
|
221 | worker.log.info("[%-10s] worker received SIGABRT signal", worker.pid) | |
|
114 | 222 | |
|
115 | 223 | |
|
116 | 224 | def worker_exit(server, worker): |
|
117 | worker.log.info("[
|
|
225 | worker.log.info("[%-10s] worker exit", worker.pid) | |
|
118 | 226 | |
|
119 | 227 | |
|
120 | 228 | def child_exit(server, worker): |
|
121 | worker.log.info("[
|
|
229 | worker.log.info("[%-10s] worker child exit", worker.pid) | |
|
122 | 230 | |
|
123 | 231 | |
|
124 | 232 | def pre_request(worker, req): |
@@ -129,9 +237,12 b' def pre_request(worker, req):' | |||
|
129 | 237 | |
|
130 | 238 | def post_request(worker, req, environ, resp): |
|
131 | 239 | total_time = time.time() - worker.start_time |
|
240 | # Gunicorn sometimes has problems with reading the status_code | |
|
241 | status_code = getattr(resp, 'status_code', '') | |
|
132 | 242 | worker.log.debug( |
|
133 | "GNCRN POST WORKER [cnt:%s]: %s %s resp: %s, Load Time: %.
|
|
134 | worker.nr, req.method, req.path,
|
|
243 | "GNCRN POST WORKER [cnt:%s]: %s %s resp: %s, Load Time: %.4fs", | |
|
244 | worker.nr, req.method, req.path, status_code, total_time) | |
|
245 | _check_memory_usage(worker) | |
|
135 | 246 | |
|
136 | 247 | |
|
137 | 248 | class RhodeCodeLogger(Logger): |
This diff has been collapsed as it changes many lines, (912 lines changed) |
@@ -1,24 +1,22 b'' | |||
|
1 | ||
|
1 | ## -*- coding: utf-8 -*- | |
|
2 | 2 | |
|
3 |
|
|
|
4 |
|
|
|
5 |
|
|
|
3 | ; ######################################### | |
|
4 | ; RHODECODE COMMUNITY EDITION CONFIGURATION | |
|
5 | ; ######################################### | |
|
6 | 6 | |
|
7 | 7 | [DEFAULT] |
|
8 |
|
|
|
8 | ; Debug flag sets all loggers to debug, and enables request tracking | |
|
9 | 9 | debug = false |
|
10 | 10 | |
|
11 |
|
|
|
12 | ## EMAIL CONFIGURATION ## | |
|
13 | ## Uncomment and replace with the email address which should receive ## | |
|
14 | ## any error reports after an application crash ## | |
|
15 | ## Additionally these settings will be used by the RhodeCode mailing system ## | |
|
16 | ################################################################################ | |
|
11 | ; ######################################################################## | |
|
12 | ; EMAIL CONFIGURATION | |
|
13 | ; These settings will be used by the RhodeCode mailing system | |
|
14 | ; ######################################################################## | |
|
17 | 15 | |
|
18 |
|
|
|
16 | ; prefix all emails subjects with given prefix, helps filtering out emails | |
|
19 | 17 | #email_prefix = [RhodeCode] |
|
20 | 18 | |
|
21 |
|
|
|
19 | ; email FROM address all mails will be sent | |
|
22 | 20 | #app_email_from = rhodecode-noreply@localhost |
|
23 | 21 | |
|
24 | 22 | #smtp_server = mail.server.com |
@@ -29,168 +27,200 b' debug = false' | |||
|
29 | 27 | #smtp_use_ssl = true |
|
30 | 28 | |
|
31 | 29 | [server:main] |
|
32 | ## COMMON ## | |
|
30 | ; COMMON HOST/IP CONFIG | |
|
33 | 31 | host = 127.0.0.1 |
|
34 | 32 | port = 5000 |
|
35 | 33 | |
|
36 | ########################################################### | |
|
37 | ## WAITRESS WSGI SERVER - Recommended for Development #### | |
|
38 | ########################################################### | |
|
34 | ||
|
35 | ; ########################### | |
|
36 | ; GUNICORN APPLICATION SERVER | |
|
37 | ; ########################### | |
|
38 | ||
|
39 | ; run with gunicorn --log-config rhodecode.ini --paste rhodecode.ini | |
|
40 | ||
|
41 | ; Module to use, this setting shouldn't be changed | |
|
42 | use = egg:gunicorn#main | |
|
43 | ||
|
44 | ; Sets the number of process workers. More workers means more concurrent connections | |
|
45 | ; RhodeCode can handle at the same time. Each additional worker also increases | 
|
46 | ; memory usage, as each has its own set of caches. | 
|
47 | ; Recommended value is (2 * NUMBER_OF_CPUS + 1), e.g. 2 CPUs = 5 workers, but no more | 
|
48 | ; than 8-10, unless for really big deployments, e.g. 700-1000 users. | 
|
49 | ; `instance_id = *` must be set in the [app:main] section below (which is the default) | |
|
50 | ; when using more than 1 worker. | |
|
51 | workers = 2 | |
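A small helper along the lines of the (2 * NUMBER_OF_CPUS + 1) rule above, similar in spirit to the get_workers() function in the gunicorn config shown earlier; the cap of 10 reflects the "no more than 8-10" guidance and is only a suggestion:

    import multiprocessing

    def suggested_workers(cap=10):
        # 2 * CPUs + 1, but no more than `cap` for typical deployments
        return min(multiprocessing.cpu_count() * 2 + 1, cap)

    # e.g. on a 2-CPU machine this yields 5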
|
52 | ||
|
53 | ; Gunicorn access log level | |
|
54 | loglevel = info | |
|
55 | ||
|
56 | ; Process name visible in process list | |
|
57 | proc_name = rhodecode | |
|
58 | ||
|
59 | ; Type of worker class, one of `sync`, `gevent` | |
|
60 | ; Recommended type is `gevent` | |
|
61 | worker_class = gevent | |
|
62 | ||
|
63 | ; The maximum number of simultaneous clients per worker. Valid only for gevent | |
|
64 | worker_connections = 10 | |
|
65 | ||
|
66 | ; Max number of requests that worker will handle before being gracefully restarted. | |
|
67 | ; Prevents memory leaks, jitter adds variability so not all workers are restarted at once. | |
|
68 | max_requests = 1000 | |
|
69 | max_requests_jitter = 30 | |
|
39 | 70 | |
|
40 | #use = egg:waitress#main | |
|
41 | ## number of worker threads | |
|
42 | #threads = 5 | |
|
43 | ## MAX BODY SIZE 100GB | |
|
44 | #max_request_body_size = 107374182400 | |
|
45 | ## Use poll instead of select, fixes file descriptors limits problems. | |
|
46 | ## May not work on old windows systems. | |
|
47 | #asyncore_use_poll = true | |
|
71 | ; Amount of time a worker can spend handling a request before it | 
|
72 | ; gets killed and restarted. By default set to 21600 (6hrs) | |
|
73 | ; Examples: 1800 (30min), 3600 (1hr), 7200 (2hr), 43200 (12h) | |
|
74 | timeout = 21600 | |
|
75 | ||
|
76 | ; The maximum size of HTTP request line in bytes. | |
|
77 | ; 0 for unlimited | |
|
78 | limit_request_line = 0 | |
|
79 | ||
|
80 | ; Limit the number of HTTP headers fields in a request. | |
|
81 | ; By default this value is 100 and can't be larger than 32768. | |
|
82 | limit_request_fields = 32768 | |
|
83 | ||
|
84 | ; Limit the allowed size of an HTTP request header field. | |
|
85 | ; Value is a positive number or 0. | |
|
86 | ; Setting it to 0 will allow unlimited header field sizes. | |
|
87 | limit_request_field_size = 0 | |
|
88 | ||
|
89 | ; Timeout for graceful workers restart. | |
|
90 | ; After receiving a restart signal, workers have this much time to finish | |
|
91 | ; serving requests. Workers still alive after the timeout (starting from the | |
|
92 | ; receipt of the restart signal) are force killed. | |
|
93 | ; Examples: 1800 (30min), 3600 (1hr), 7200 (2hr), 43200 (12h) | |
|
94 | graceful_timeout = 3600 | |
|
95 | ||
|
96 | # The number of seconds to wait for requests on a Keep-Alive connection. | |
|
97 | # Generally set in the 1-5 seconds range. | |
|
98 | keepalive = 2 | |
|
99 | ||
|
100 | ; Maximum memory usage that each worker can use before it will receive a | |
|
101 | ; graceful restart signal 0 = memory monitoring is disabled | |
|
102 | ; Examples: 268435456 (256MB), 536870912 (512MB) | |
|
103 | ; 1073741824 (1GB), 2147483648 (2GB), 4294967296 (4GB) | |
|
104 | memory_max_usage = 0 | |
|
105 | ||
|
106 | ; How often in seconds to check for memory usage for each gunicorn worker | |
|
107 | memory_usage_check_interval = 60 | |
|
108 | ||
|
109 | ; Threshold value for which we don't recycle worker if GarbageCollection | |
|
110 | ; frees up enough resources. Before each restart we try to run GC on worker | |
|
111 | ; in case we get enough free memory after that, restart will not happen. | |
|
112 | memory_usage_recovery_threshold = 0.8 | |
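To make the interplay of the three settings concrete, here is a minimal sketch of the recycle decision, mirroring the _check_memory_usage hook shown earlier in this changeset; the 1 GB value is just an example:

    memory_max_usage = 1073741824            # example: 1 GB budget per worker
    memory_usage_recovery_threshold = 0.8    # GC must bring usage below 80% of the budget

    recovery_cutoff = memory_max_usage * memory_usage_recovery_threshold  # about 859 MB

    def should_recycle(rss_before_gc, rss_after_gc):
        # evaluated at most once every memory_usage_check_interval seconds
        if rss_before_gc <= memory_max_usage:
            return False                       # under budget, nothing to do
        return rss_after_gc > recovery_cutoff  # GC did not free enough: restart the worker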
|
48 | 113 | |
|
49 | 114 | |
|
50 | ########################## | |
|
51 | ## GUNICORN WSGI SERVER ## | |
|
52 | ########################## | |
|
53 | ## run with gunicorn --log-config rhodecode.ini --paste rhodecode.ini | |
|
54 | ||
|
55 | use = egg:gunicorn#main | |
|
56 | ## Sets the number of process workers. More workers means more concurrent connections | |
|
57 | ## RhodeCode can handle at the same time. Each additional worker also it increases | |
|
58 | ## memory usage as each has it's own set of caches. | |
|
59 | ## Recommended value is (2 * NUMBER_OF_CPUS + 1), eg 2CPU = 5 workers, but no more | |
|
60 | ## than 8-10 unless for really big deployments .e.g 700-1000 users. | |
|
61 | ## `instance_id = *` must be set in the [app:main] section below (which is the default) | |
|
62 | ## when using more than 1 worker. | |
|
63 | workers = 2 | |
|
64 | ## process name visible in process list | |
|
65 | proc_name = rhodecode | |
|
66 | ## type of worker class, one of sync, gevent | |
|
67 | ## recommended for bigger setup is using of of other than sync one | |
|
68 | worker_class = gevent | |
|
69 | ## The maximum number of simultaneous clients. Valid only for Gevent | |
|
70 | worker_connections = 10 | |
|
71 | ## max number of requests that worker will handle before being gracefully | |
|
72 | ## restarted, could prevent memory leaks | |
|
73 | max_requests = 1000 | |
|
74 | max_requests_jitter = 30 | |
|
75 | ## amount of time a worker can spend with handling a request before it | |
|
76 | ## gets killed and restarted. Set to 6hrs | |
|
77 | timeout = 21600 | |
|
78 | ||
|
79 | ||
|
80 | ## prefix middleware for RhodeCode. | |
|
81 | ## recommended when using proxy setup. | |
|
82 | ## allows to set RhodeCode under a prefix in server. | |
|
83 | ## eg https://server.com/custom_prefix. Enable `filter-with =` option below as well. | |
|
84 | ## And set your prefix like: `prefix = /custom_prefix` | |
|
85 | ## be sure to also set beaker.session.cookie_path = /custom_prefix if you need | |
|
86 | ## to make your cookies only work on prefix url | |
|
115 | ; Prefix middleware for RhodeCode. | |
|
116 | ; recommended when using proxy setup. | |
|
117 | ; allows to set RhodeCode under a prefix in server. | |
|
118 | ; eg https://server.com/custom_prefix. Enable `filter-with =` option below as well. | |
|
119 | ; And set your prefix like: `prefix = /custom_prefix` | |
|
120 | ; be sure to also set beaker.session.cookie_path = /custom_prefix if you need | |
|
121 | ; to make your cookies only work on prefix url | |
|
87 | 122 | [filter:proxy-prefix] |
|
88 | 123 | use = egg:PasteDeploy#prefix |
|
89 | 124 | prefix = / |
|
90 | 125 | |
|
91 | 126 | [app:main] |
|
92 |
|
|
|
93 |
|
|
|
94 |
|
|
|
95 |
|
|
|
127 | ; The %(here)s variable will be replaced with the absolute path of parent directory | |
|
128 | ; of this file | |
|
129 | ; In addition ENVIRONMENT variables usage is possible, e.g | |
|
130 | ; sqlalchemy.db1.url = {ENV_RC_DB_URL} | |
|
96 | 131 | |
|
97 | 132 | use = egg:rhodecode-enterprise-ce |
|
98 | 133 | |
|
99 |
|
|
|
134 | ; enable proxy prefix middleware, defined above | |
|
100 | 135 | #filter-with = proxy-prefix |
|
101 | 136 | |
|
102 |
|
|
|
103 |
|
|
|
104 |
|
|
|
137 | ; encryption key used to encrypt social plugin tokens, | |
|
138 | ; remote_urls with credentials etc, if not set it defaults to | |
|
139 | ; `beaker.session.secret` | |
|
105 | 140 | #rhodecode.encrypted_values.secret = |
|
106 | 141 | |
|
107 |
|
|
|
108 |
|
|
|
142 | ; decryption strict mode (enabled by default). It controls if decryption raises | |
|
143 | ; `SignatureVerificationError` in case of wrong key, or damaged encryption data. | |
|
109 | 144 | #rhodecode.encrypted_values.strict = false |
|
110 | 145 | |
|
111 |
|
|
|
112 |
|
|
|
113 |
|
|
|
146 | ; Pick algorithm for encryption. Either fernet (more secure) or aes (default) | |
|
147 | ; fernet is safer, and we strongly recommend switching to it. | |
|
148 | ; Due to backward compatibility aes is used as default. | |
|
114 | 149 | #rhodecode.encrypted_values.algorithm = fernet |
|
115 | 150 | |
|
116 |
|
|
|
151 | ; Return gzipped responses from RhodeCode (static files/application) | |
|
117 | 152 | gzip_responses = false |
|
118 | 153 | |
|
119 |
|
|
|
154 | ; Auto-generate javascript routes file on startup | |
|
120 | 155 | generate_js_files = false |
|
121 | 156 | |
|
122 |
|
|
|
123 |
|
|
|
157 | ; System global default language. | |
|
158 | ; All available languages: en (default), be, de, es, fr, it, ja, pl, pt, ru, zh | |
|
124 | 159 | lang = en |
|
125 | 160 | |
|
126 |
|
|
|
127 |
|
|
|
161 | ; Perform a full repository scan and import on each server start. | |
|
162 | ; Setting this to true could lead to a very long startup time. | 
|
128 | 163 | startup.import_repos = false |
|
129 | 164 | |
|
130 |
|
|
|
131 |
|
|
|
132 |
|
|
|
133 |
|
|
|
165 | ; Uncomment and set this path to use archive download cache. | |
|
166 | ; Once enabled, generated archives will be cached at this location | |
|
167 | ; and served from the cache during subsequent requests for the same archive of | |
|
168 | ; the repository. | |
|
134 | 169 | #archive_cache_dir = /tmp/tarballcache |
|
135 | 170 | |
|
136 |
|
|
|
137 |
|
|
|
138 |
|
|
|
171 | ; URL at which the application is running. This is used for Bootstrapping | |
|
172 | ; requests in context when no web request is available. Used in ishell, or | |
|
173 | ; SSH calls. Set this for events to receive proper url for SSH calls. | |
|
139 | 174 | app.base_url = http://rhodecode.local |
|
140 | 175 | |
|
141 |
|
|
|
176 | ; Unique application ID. Should be a random unique string for security. | |
|
142 | 177 | app_instance_uuid = rc-production |
|
143 | 178 | |
|
144 |
|
|
|
145 |
|
|
|
146 |
|
|
|
179 | ; Cut off limit for large diffs (size in bytes). If overall diff size on | |
|
180 | ; commit, or pull request exceeds this limit this diff will be displayed | |
|
181 | ; partially. E.g 512000 == 512Kb | |
|
147 | 182 | cut_off_limit_diff = 512000 |
|
148 | 183 | |
|
149 |
|
|
|
150 |
|
|
|
151 |
|
|
|
184 | ; Cut off limit for large files inside diffs (size in bytes). Each individual | |
|
185 | ; file inside diff which exceeds this limit will be displayed partially. | |
|
186 | ; E.g 128000 == 128Kb | |
|
152 | 187 | cut_off_limit_file = 128000 |
|
153 | 188 | |
|
154 |
|
|
|
189 | ; Use cached version of vcs repositories everywhere. Recommended to be `true` | |
|
155 | 190 | vcs_full_cache = true |
|
156 | 191 | |
|
157 |
|
|
|
158 |
|
|
|
192 | ; Force https in RhodeCode, fixes https redirects, assumes it's always https. | |
|
193 | ; Normally this is controlled by proper flags sent from http server such as Nginx or Apache | |
|
159 | 194 | force_https = false |
|
160 | 195 | |
|
161 |
|
|
|
196 | ; use Strict-Transport-Security headers | |
|
162 | 197 | use_htsts = false |
|
163 | 198 | |
|
164 | ## git rev filter option, --all is the default filter, if you need to | |
|
165 | ## hide all refs in changelog switch this to --branches --tags | |
|
166 | git_rev_filter = --branches --tags | |
|
167 | ||
|
168 | # Set to true if your repos are exposed using the dumb protocol | |
|
199 | ; Set to true if your repos are exposed using the dumb protocol | |
|
169 | 200 | git_update_server_info = false |
|
170 | 201 | |
|
171 |
|
|
|
202 | ; RSS/ATOM feed options | |
|
172 | 203 | rss_cut_off_limit = 256000 |
|
173 | 204 | rss_items_per_page = 10 |
|
174 | 205 | rss_include_diff = false |
|
175 | 206 | |
|
176 |
|
|
|
177 |
|
|
|
178 |
|
|
|
179 |
|
|
|
207 | ; gist URL alias, used to create nicer urls for gist. This should be an | |
|
208 | ; url that does rewrites to _admin/gists/{gistid}. | |
|
209 | ; example: http://gist.rhodecode.org/{gistid}. Empty means use the internal | |
|
210 | ; RhodeCode url, ie. http[s]://rhodecode.server/_admin/gists/{gistid} | |
|
180 | 211 | gist_alias_url = |
|
181 | 212 | |
|
182 |
|
|
|
183 |
|
|
|
184 |
|
|
|
185 |
|
|
|
186 |
|
|
|
187 |
|
|
|
188 |
|
|
|
189 | ## | |
|
190 | ## list of all views can be found under `/_admin/permissions/auth_token_access` | |
|
191 | ## The list should be "," separated and on a single line. | |
|
192 | ## | |
|
193 | ## Most common views to enable: | |
|
213 | ; List of views (using glob pattern syntax) that AUTH TOKENS could be | |
|
214 | ; used for access. | |
|
215 | ; Adding ?auth_token=TOKEN_HASH to the url authenticates this request as if it | |
|
216 | ; came from the logged in user who owns this authentication token. | 
|
217 | ; Additionally @TOKEN syntax can be used to bind the view to a specific | 
|
218 | ; authentication token. Such a view would only be accessible when used together | 
|
219 | ; with this authentication token | |
|
220 | ; list of all views can be found under `/_admin/permissions/auth_token_access` | |
|
221 | ; The list should be "," separated and on a single line. | |
|
222 | ; Most common views to enable: | |
|
223 | ||
|
194 | 224 | # RepoCommitsView:repo_commit_download |
|
195 | 225 | # RepoCommitsView:repo_commit_patch |
|
196 | 226 | # RepoCommitsView:repo_commit_raw |
@@ -201,164 +231,194 b' gist_alias_url =' | |||
|
201 | 231 | # GistView:* |
|
202 | 232 | api_access_controllers_whitelist = |
|
203 | 233 | |
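; An illustrative value (kept commented out, not part of the shipped template):
; allow raw commit access with any of the owner's tokens, and patch downloads
; only with one specific token, where TOKEN_HASH is a placeholder for a real
; authentication token value
#api_access_controllers_whitelist = RepoCommitsView:repo_commit_raw, RepoCommitsView:repo_commit_patch@TOKEN_HASH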
|
204 |
|
|
|
205 |
|
|
|
234 | ; Default encoding used to convert from and to unicode | |
|
235 | ; can be also a comma separated list of encoding in case of mixed encodings | |
|
206 | 236 | default_encoding = UTF-8 |
|
207 | 237 | |
|
208 |
|
|
|
209 |
|
|
|
210 |
|
|
|
211 |
|
|
|
238 | ; instance-id prefix | |
|
239 | ; a prefix key for this instance used for cache invalidation when running | |
|
240 | ; multiple instances of RhodeCode, make sure it's globally unique for | |
|
241 | ; all running RhodeCode instances. Leave empty if you don't use it | |
|
212 | 242 | instance_id = |
|
213 | 243 | |
|
214 |
|
|
|
215 |
|
|
|
216 |
|
|
|
217 |
|
|
|
218 |
|
|
|
219 | ## | |
|
220 | ## Available builtin plugin IDs (hash is part of the ID): | |
|
221 |
|
|
|
222 |
|
|
|
223 |
|
|
|
224 |
|
|
|
225 |
|
|
|
226 | ## egg:rhodecode-enterprise-ce#crowd | |
|
244 | ; Fallback authentication plugin. Set this to a plugin ID to force the usage | |
|
245 | ; of an authentication plugin even if it is disabled by its settings. | 
|
246 | ; This could be useful if you are unable to log in to the system due to broken | |
|
247 | ; authentication settings. Then you can enable e.g. the internal RhodeCode auth | |
|
248 | ; module to log in again and fix the settings. | |
|
249 | ; Available builtin plugin IDs (hash is part of the ID): | |
|
250 | ; egg:rhodecode-enterprise-ce#rhodecode | |
|
251 | ; egg:rhodecode-enterprise-ce#pam | |
|
252 | ; egg:rhodecode-enterprise-ce#ldap | |
|
253 | ; egg:rhodecode-enterprise-ce#jasig_cas | |
|
254 | ; egg:rhodecode-enterprise-ce#headers | |
|
255 | ; egg:rhodecode-enterprise-ce#crowd | |
|
256 | ||
|
227 | 257 | #rhodecode.auth_plugin_fallback = egg:rhodecode-enterprise-ce#rhodecode |
|
228 | 258 | |
|
229 | ## alternative return HTTP header for failed authentication. Default HTTP | |
|
230 | ## response is 401 HTTPUnauthorized. Currently HG clients have troubles with | |
|
231 | ## handling that causing a series of failed authentication calls. | |
|
232 | ## Set this variable to 403 to return HTTPForbidden, or any other HTTP code | |
|
233 | ## This will be served instead of default 401 on bad authentication | |
|
259 | ; Flag to control loading of legacy plugins in py:/path format | |
|
260 | auth_plugin.import_legacy_plugins = true | |
|
261 | ||
|
262 | ; alternative return HTTP header for failed authentication. Default HTTP | |
|
263 | ; response is 401 HTTPUnauthorized. Currently HG clients have trouble with | 
|
264 | ; handling that, causing a series of failed authentication calls. | 
|
265 | ; Set this variable to 403 to return HTTPForbidden, or any other HTTP code | |
|
266 | ; This will be served instead of default 401 on bad authentication | |
|
234 | 267 | auth_ret_code = |
|
235 | 268 | |
|
236 |
|
|
|
237 |
|
|
|
238 |
|
|
|
269 | ; use special detection method when serving auth_ret_code, instead of serving | |
|
270 | ; ret_code directly, use 401 initially (which triggers a credentials prompt) | 
|
271 | ; and then serve auth_ret_code to clients | |
|
239 | 272 | auth_ret_code_detection = false |
|
240 | 273 | |
|
241 |
|
|
|
242 |
|
|
|
274 | ; locking return code. When repository is locked return this HTTP code. 2XX | |
|
275 | ; codes don't break the transactions while 4XX codes do | |
|
243 | 276 | lock_ret_code = 423 |
|
244 | 277 | |
|
245 |
|
|
|
278 | ; allows to change the repository location in settings page | |
|
246 | 279 | allow_repo_location_change = true |
|
247 | 280 | |
|
248 |
|
|
|
281 | ; allows to setup custom hooks in settings page | |
|
249 | 282 | allow_custom_hooks_settings = true |
|
250 | 283 | |
|
251 |
|
|
|
252 |
|
|
|
284 | ; Generated license token required for EE edition license. | |
|
285 | ; New generated token value can be found in Admin > settings > license page. | |
|
253 | 286 | license_token = |
|
254 | 287 | |
|
255 | ## supervisor connection uri, for managing supervisor and logs. | |
|
288 | ; This flag hides sensitive information on the license page such as token, and license data | |
|
289 | license.hide_license_info = false | |
|
290 | ||
|
291 | ; supervisor connection uri, for managing supervisor and logs. | |
|
256 | 292 | supervisor.uri = |
|
257 | ## supervisord group name/id we only want this RC instance to handle | |
|
293 | ||
|
294 | ; supervisord group name/id we only want this RC instance to handle | |
|
258 | 295 | supervisor.group_id = prod |
|
259 | 296 | |
|
260 |
|
|
|
297 | ; Display extended labs settings | |
|
261 | 298 | labs_settings_active = true |
|
262 | 299 | |
|
263 |
|
|
|
264 |
|
|
|
300 | ; Custom exception store path, defaults to TMPDIR | |
|
301 | ; This is used to store exceptions from RhodeCode in a shared directory | 
|
265 | 302 | #exception_tracker.store_path = |
|
266 | 303 | |
|
267 |
|
|
|
304 | ; File store configuration. This is used to store and serve uploaded files | |
|
268 | 305 | file_store.enabled = true |
|
269 | ## Storage backend, available options are: local | |
|
306 | ||
|
307 | ; Storage backend, available options are: local | |
|
270 | 308 | file_store.backend = local |
|
271 | ## path to store the uploaded binaries | |
|
309 | ||
|
310 | ; path to store the uploaded binaries | |
|
272 | 311 | file_store.storage_path = %(here)s/data/file_store |
|
273 | 312 | |
|
274 | 313 | |
|
275 | #################################### | |
|
276 | ### CELERY CONFIG #### | |
|
277 | #################################### | |
|
278 | ## run: /path/to/celery worker \ | |
|
279 | ## -E --beat --app rhodecode.lib.celerylib.loader \ | |
|
280 | ## --scheduler rhodecode.lib.celerylib.scheduler.RcScheduler \ | |
|
281 | ## --loglevel DEBUG --ini /path/to/rhodecode.ini | |
|
314 | ; ############# | |
|
315 | ; CELERY CONFIG | |
|
316 | ; ############# | |
|
317 | ||
|
318 | ; manually run celery: /path/to/celery worker -E --beat --app rhodecode.lib.celerylib.loader --scheduler rhodecode.lib.celerylib.scheduler.RcScheduler --loglevel DEBUG --ini /path/to/rhodecode.ini | |
|
282 | 319 | |
|
283 | 320 | use_celery = false |
|
284 | 321 | |
|
285 |
|
|
|
286 |
celery.broker_url = |
|
|
322 | ; connection url to the message broker (default redis) | |
|
323 | celery.broker_url = redis://localhost:6379/8 | |
|
287 | 324 | |
|
288 | ## maximum tasks to execute before worker restart | |
|
325 | ; rabbitmq example | |
|
326 | #celery.broker_url = amqp://rabbitmq:qweqwe@localhost:5672/rabbitmqhost | |
|
327 | ||
|
328 | ; maximum tasks to execute before worker restart | |
|
289 | 329 | celery.max_tasks_per_child = 100 |
|
290 | 330 | |
|
291 |
|
|
|
331 | ; tasks will never be sent to the queue, but executed locally instead. | |
|
292 | 332 | celery.task_always_eager = false |
|
293 | 333 | |
|
294 | ##################################### | |
|
295 | ### DOGPILE CACHE #### | |
|
296 | ##################################### | |
|
297 | ## Default cache dir for caches. Putting this into a ramdisk | |
|
298 | ## can boost performance, eg. /tmpfs/data_ramdisk, however this directory might require | |
|
299 | ## large amount of space | |
|
334 | ; ############# | |
|
335 | ; DOGPILE CACHE | |
|
336 | ; ############# | |
|
337 | ||
|
338 | ; Default cache dir for caches. Putting this into a ramdisk can boost performance. | |
|
339 | ; eg. /tmpfs/data_ramdisk, however this directory might require a large amount of space | 
|
300 | 340 | cache_dir = %(here)s/data |
|
301 | 341 | |
|
302 | ## `cache_perms` cache settings for permission tree, auth TTL. | |
|
342 | ; ********************************************* | |
|
343 | ; `sql_cache_short` cache for heavy SQL queries | |
|
344 | ; Only supported backend is `memory_lru` | |
|
345 | ; ********************************************* | |
|
346 | rc_cache.sql_cache_short.backend = dogpile.cache.rc.memory_lru | |
|
347 | rc_cache.sql_cache_short.expiration_time = 30 | |
|
348 | ||
|
349 | ||
|
350 | ; ***************************************************** | |
|
351 | ; `cache_repo_longterm` cache for repo object instances | |
|
352 | ; Only supported backend is `memory_lru` | |
|
353 | ; ***************************************************** | |
|
354 | rc_cache.cache_repo_longterm.backend = dogpile.cache.rc.memory_lru | |
|
355 | ; by default we use 30 Days, cache is still invalidated on push | |
|
356 | rc_cache.cache_repo_longterm.expiration_time = 2592000 | |
|
357 | ; max items in LRU cache, set to smaller number to save memory, and expire last used caches | |
|
358 | rc_cache.cache_repo_longterm.max_size = 10000 | |
|
359 | ||
|
360 | ||
|
361 | ; ************************************************* | |
|
362 | ; `cache_perms` cache for permission tree, auth TTL | |
|
363 | ; ************************************************* | |
|
303 | 364 | rc_cache.cache_perms.backend = dogpile.cache.rc.file_namespace |
|
304 | 365 | rc_cache.cache_perms.expiration_time = 300 |
|
366 | ; file cache store path. Defaults to `cache_dir =` value or tempdir if both values are not set | |
|
367 | #rc_cache.cache_perms.arguments.filename = /tmp/cache_perms.db | |
|
305 | 368 | |
|
306 |
|
|
|
369 | ; alternative `cache_perms` redis backend with distributed lock | |
|
307 | 370 | #rc_cache.cache_perms.backend = dogpile.cache.rc.redis |
|
308 | 371 | #rc_cache.cache_perms.expiration_time = 300 |
|
309 | ## redis_expiration_time needs to be greater then expiration_time | |
|
372 | ||
|
373 | ; redis_expiration_time needs to be greater than expiration_time | 
|
310 | 374 | #rc_cache.cache_perms.arguments.redis_expiration_time = 7200 |
|
311 | #rc_cache.cache_perms.arguments.socket_timeout = 30 | |
|
375 | ||
|
312 | 376 | #rc_cache.cache_perms.arguments.host = localhost |
|
313 | 377 | #rc_cache.cache_perms.arguments.port = 6379 |
|
314 | 378 | #rc_cache.cache_perms.arguments.db = 0 |
|
315 | ## more Redis options: https://dogpilecache.sqlalchemy.org/en/latest/api.html#redis-backends | |
|
379 | #rc_cache.cache_perms.arguments.socket_timeout = 30 | |
|
380 | ; more Redis options: https://dogpilecache.sqlalchemy.org/en/latest/api.html#redis-backends | |
|
316 | 381 | #rc_cache.cache_perms.arguments.distributed_lock = true |
|
317 | 382 | |
|
318 | ## `cache_repo` cache settings for FileTree, Readme, RSS FEEDS | |
|
383 | ||
|
384 | ; *************************************************** | |
|
385 | ; `cache_repo` cache for file tree, Readme, RSS FEEDS | |
|
386 | ; *************************************************** | |
|
319 | 387 | rc_cache.cache_repo.backend = dogpile.cache.rc.file_namespace |
|
320 | 388 | rc_cache.cache_repo.expiration_time = 2592000 |
|
389 | ; file cache store path. Defaults to `cache_dir =` value or tempdir if both values are not set | |
|
390 | #rc_cache.cache_repo.arguments.filename = /tmp/cache_repo.db | |
|
321 | 391 | |
|
322 |
|
|
|
392 | ; alternative `cache_repo` redis backend with distributed lock | |
|
323 | 393 | #rc_cache.cache_repo.backend = dogpile.cache.rc.redis |
|
324 | 394 | #rc_cache.cache_repo.expiration_time = 2592000 |
|
325 | ## redis_expiration_time needs to be greater then expiration_time | |
|
395 | ||
|
396 | ; redis_expiration_time needs to be greater than expiration_time | 
|
326 | 397 | #rc_cache.cache_repo.arguments.redis_expiration_time = 2678400 |
|
327 | #rc_cache.cache_repo.arguments.socket_timeout = 30 | |
|
398 | ||
|
328 | 399 | #rc_cache.cache_repo.arguments.host = localhost |
|
329 | 400 | #rc_cache.cache_repo.arguments.port = 6379 |
|
330 | 401 | #rc_cache.cache_repo.arguments.db = 1 |
|
331 | ## more Redis options: https://dogpilecache.sqlalchemy.org/en/latest/api.html#redis-backends | |
|
402 | #rc_cache.cache_repo.arguments.socket_timeout = 30 | |
|
403 | ; more Redis options: https://dogpilecache.sqlalchemy.org/en/latest/api.html#redis-backends | |
|
332 | 404 | #rc_cache.cache_repo.arguments.distributed_lock = true |
|
333 | 405 | |
|
334 | ## cache settings for SQL queries, this needs to use memory type backend | |
|
335 | rc_cache.sql_cache_short.backend = dogpile.cache.rc.memory_lru | |
|
336 | rc_cache.sql_cache_short.expiration_time = 30 | |
|
337 | 406 | |
|
338 | ## `cache_repo_longterm` cache for repo object instances, this needs to use memory | |
|
339 | ## type backend as the objects kept are not pickle serializable | |
|
340 | rc_cache.cache_repo_longterm.backend = dogpile.cache.rc.memory_lru | |
|
341 | ## by default we use 96H, this is using invalidation on push anyway | |
|
342 | rc_cache.cache_repo_longterm.expiration_time = 345600 | |
|
343 | ## max items in LRU cache, reduce this number to save memory, and expire last used | |
|
344 | ## cached objects | |
|
345 | rc_cache.cache_repo_longterm.max_size = 10000 | |
|
407 | ; ############## | |
|
408 | ; BEAKER SESSION | |
|
409 | ; ############## | |
|
346 | 410 | |
|
347 | ||
|
348 | #################################### | |
|
349 | ### BEAKER SESSION #### | |
|
350 | #################################### | |
|
351 | ||
|
352 | ## .session.type is type of storage options for the session, current allowed | |
|
353 | ## types are file, ext:memcached, ext:redis, ext:database, and memory (default). | |
|
411 | ; beaker.session.type is the type of storage used for the logged-in users' sessions. Current allowed | 
|
412 | ; types are file, ext:redis, ext:database, ext:memcached, and memory (default if not specified). | |
|
413 | ; Fastest ones are Redis and ext:database | |
|
354 | 414 | beaker.session.type = file |
|
355 | 415 | beaker.session.data_dir = %(here)s/data/sessions |
|
356 | 416 | |
|
357 |
|
|
|
417 | ; Redis based sessions | |
|
358 | 418 | #beaker.session.type = ext:redis |
|
359 | 419 | #beaker.session.url = redis://127.0.0.1:6379/2 |
|
360 | 420 | |
|
361 |
|
|
|
421 | ; DB based session, fast, and allows easy management over logged in users | |
|
362 | 422 | #beaker.session.type = ext:database |
|
363 | 423 | #beaker.session.table_name = db_session |
|
364 | 424 | #beaker.session.sa.url = postgresql://postgres:secret@localhost/rhodecode |
@@ -370,265 +430,275 b' beaker.session.key = rhodecode' | |||
|
370 | 430 | beaker.session.secret = production-rc-uytcxaz |
|
371 | 431 | beaker.session.lock_dir = %(here)s/data/sessions/lock |
|
372 | 432 | |
|
373 |
|
|
|
374 |
|
|
|
433 | ; Secure encrypted cookie. Requires AES and AES python libraries | |
|
434 | ; you must disable beaker.session.secret to use this | |
|
375 | 435 | #beaker.session.encrypt_key = key_for_encryption |
|
376 | 436 | #beaker.session.validate_key = validation_key |
|
377 | 437 | |
|
378 |
|
|
|
379 |
|
|
|
438 | ; Sets session as invalid (also logging out the user) if it has not been | 
|
439 | ; accessed for a given amount of time in seconds | 
|
380 | 440 | beaker.session.timeout = 2592000 |
|
381 | 441 | beaker.session.httponly = true |
|
382 | ## Path to use for the cookie. Set to prefix if you use prefix middleware | |
|
442 | ||
|
443 | ; Path to use for the cookie. Set to prefix if you use prefix middleware | |
|
383 | 444 | #beaker.session.cookie_path = /custom_prefix |
|
384 | 445 | |
|
385 |
|
|
|
446 | ; Set https secure cookie | |
|
386 | 447 | beaker.session.secure = false |
|
387 | 448 | |
|
388 | ## auto save the session to not to use .save() | |
|
389 | beaker.session.auto = false | |
|
390 | ||
|
391 | ## default cookie expiration time in seconds, set to `true` to set expire | |
|
392 | ## at browser close | |
|
449 | ; default cookie expiration time in seconds, set to `true` to set expire | |
|
450 | ; at browser close | |
|
393 | 451 | #beaker.session.cookie_expires = 3600 |
|
394 | 452 | |
|
395 |
|
|
|
396 |
|
|
|
397 |
|
|
|
398 | ## Full text search indexer is available in rhodecode-tools under | |
|
399 | ## `rhodecode-tools index` command | |
|
453 | ; ############################# | |
|
454 | ; SEARCH INDEXING CONFIGURATION | |
|
455 | ; ############################# | |
|
400 | 456 | |
|
401 | ## WHOOSH Backend, doesn't require additional services to run | |
|
402 | ## it works good with few dozen repos | |
|
457 | ; Full text search indexer is available in rhodecode-tools under | |
|
458 | ; `rhodecode-tools index` command | |
|
459 | ||
|
460 | ; WHOOSH Backend, doesn't require additional services to run | |
|
461 | ; it works well with a few dozen repos | 
|
403 | 462 | search.module = rhodecode.lib.index.whoosh |
|
404 | 463 | search.location = %(here)s/data/index |
|
405 | 464 | |
|
406 |
|
|
|
407 |
|
|
|
408 |
|
|
|
409 | ## channelstream enables persistent connections and live notification | |
|
410 | ## in the system. It's also used by the chat system | |
|
465 | ; #################### | |
|
466 | ; CHANNELSTREAM CONFIG | |
|
467 | ; #################### | |
|
468 | ||
|
469 | ; channelstream enables persistent connections and live notification | |
|
470 | ; in the system. It's also used by the chat system | |
|
411 | 471 | |
|
412 | 472 | channelstream.enabled = false |
|
413 | 473 | |
|
414 |
|
|
|
474 | ; server address for channelstream server on the backend | |
|
415 | 475 | channelstream.server = 127.0.0.1:9800 |
|
416 | 476 | |
|
417 |
|
|
|
418 |
|
|
|
419 |
|
|
|
420 |
|
|
|
477 | ; location of the channelstream server from outside world | |
|
478 | ; use ws:// for http or wss:// for https. This address needs to be handled | |
|
479 | ; by external HTTP server such as Nginx or Apache | |
|
480 | ; see Nginx/Apache configuration examples in our docs | |
|
421 | 481 | channelstream.ws_url = ws://rhodecode.yourserver.com/_channelstream |
|
422 | 482 | channelstream.secret = secret |
|
423 | 483 | channelstream.history.location = %(here)s/channelstream_history |
|
424 | 484 | |
|
425 |
|
|
|
426 |
|
|
|
485 | ; Internal application path that Javascript uses to connect to. | 
|
486 | ; If you use proxy-prefix the prefix should be added before /_channelstream | |
|
427 | 487 | channelstream.proxy_path = /_channelstream |
|
428 | 488 | |
|
429 | 489 | |
|
430 |
|
|
|
431 | ## APPENLIGHT CONFIG ## | |
|
432 |
|
|
|
490 | ; ############################## | |
|
491 | ; MAIN RHODECODE DATABASE CONFIG | |
|
492 | ; ############################## | |
|
493 | ||
|
494 | #sqlalchemy.db1.url = sqlite:///%(here)s/rhodecode.db?timeout=30 | |
|
495 | #sqlalchemy.db1.url = postgresql://postgres:qweqwe@localhost/rhodecode | |
|
496 | #sqlalchemy.db1.url = mysql://root:qweqwe@localhost/rhodecode?charset=utf8 | |
|
497 | ; pymysql is an alternative driver for MySQL, use in case of problems with default one | |
|
498 | #sqlalchemy.db1.url = mysql+pymysql://root:qweqwe@localhost/rhodecode | |
|
499 | ||
|
500 | sqlalchemy.db1.url = postgresql://postgres:qweqwe@localhost/rhodecode | |
|
501 | ||
|
502 | ; see sqlalchemy docs for other advanced settings | |
|
503 | ; print the sql statements to output | |
|
504 | sqlalchemy.db1.echo = false | |
|
505 | ||
|
506 | ; recycle the connections after this amount of seconds | |
|
507 | sqlalchemy.db1.pool_recycle = 3600 | |
|
508 | sqlalchemy.db1.convert_unicode = true | |
|
509 | ||
|
510 | ; the number of connections to keep open inside the connection pool. | |
|
511 | ; 0 indicates no limit | |
|
512 | #sqlalchemy.db1.pool_size = 5 | |
|
513 | ||
|
514 | ; The number of connections to allow in connection pool "overflow", that is | |
|
515 | ; connections that can be opened above and beyond the pool_size setting, | |
|
516 | ; which defaults to five. | |
|
517 | #sqlalchemy.db1.max_overflow = 10 | |
|
518 | ||
|
519 | ; Connection check ping, used to detect broken database connections | |
|
520 | ; could be enabled to better handle cases if MySQL has gone away errors | |
|
521 | #sqlalchemy.db1.ping_connection = true | |
|
522 | ||
|
523 | ; ########## | |
|
524 | ; VCS CONFIG | |
|
525 | ; ########## | |
|
526 | vcs.server.enable = true | |
|
527 | vcs.server = localhost:9900 | |
|
528 | ||
|
529 | ; Web server connectivity protocol, responsible for web based VCS operations | |
|
530 | ; Available protocols are: | |
|
531 | ; `http` - use http-rpc backend (default) | |
|
532 | vcs.server.protocol = http | |
|
533 | ||
|
534 | ; Push/Pull operations protocol, available options are: | |
|
535 | ; `http` - use http-rpc backend (default) | |
|
536 | vcs.scm_app_implementation = http | |
|
537 | ||
|
538 | ; Push/Pull operations hooks protocol, available options are: | |
|
539 | ; `http` - use http-rpc backend (default) | |
|
540 | vcs.hooks.protocol = http | |
|
541 | ||
|
542 | ; Host on which this instance is listening for hooks. If vcsserver is in another location | 
|
543 | ; this should be adjusted. | |
|
544 | vcs.hooks.host = 127.0.0.1 | |
|
545 | ||
|
546 | ; Start VCSServer with this instance as a subprocess, useful for development | |
|
547 | vcs.start_server = false | |
|
548 | ||
|
549 | ; List of enabled VCS backends, available options are: | |
|
550 | ; `hg` - mercurial | |
|
551 | ; `git` - git | |
|
552 | ; `svn` - subversion | |
|
553 | vcs.backends = hg, git, svn | |
|
554 | ||
|
555 | ; Wait this number of seconds before killing connection to the vcsserver | |
|
556 | vcs.connection_timeout = 3600 | |
|
557 | ||
|
558 | ; Compatibility version when creating SVN repositories. Defaults to newest version when commented out. | |
|
559 | ; Available options are: pre-1.4-compatible, pre-1.5-compatible, pre-1.6-compatible, pre-1.8-compatible, pre-1.9-compatible | |
|
560 | #vcs.svn.compatible_version = pre-1.8-compatible | |
|
561 | ||
|
433 | 562 | |
|
434 | ## Appenlight is tailored to work with RhodeCode, see | |
|
435 | ## http://appenlight.com for details how to obtain an account | |
|
563 | ; #################################################### | |
|
564 | ; Subversion proxy support (mod_dav_svn) | |
|
565 | ; Maps RhodeCode repo groups into SVN paths for Apache | |
|
566 | ; #################################################### | |
|
567 | ||
|
568 | ; Enable or disable the config file generation. | |
|
569 | svn.proxy.generate_config = false | |
|
570 | ||
|
571 | ; Generate config file with `SVNListParentPath` set to `On`. | |
|
572 | svn.proxy.list_parent_path = true | |
|
573 | ||
|
574 | ; Set location and file name of generated config file. | |
|
575 | svn.proxy.config_file_path = %(here)s/mod_dav_svn.conf | |
|
576 | ||
|
577 | ; alternative mod_dav config template. This needs to be a valid mako template | |
|
578 | ; Example template can be found in the source code: | |
|
579 | ; rhodecode/apps/svn_support/templates/mod-dav-svn.conf.mako | |
|
580 | #svn.proxy.config_template = ~/.rccontrol/enterprise-1/custom_svn_conf.mako | |
|
581 | ||
|
582 | ; Used as a prefix to the `Location` block in the generated config file. | |
|
583 | ; In most cases it should be set to `/`. | |
|
584 | svn.proxy.location_root = / | |
|
585 | ||
|
586 | ; Command to reload the mod dav svn configuration on change. | |
|
587 | ; Example: `/etc/init.d/apache2 reload` or /home/USER/apache_reload.sh | |
|
588 | ; Make sure the user who runs the RhodeCode process is allowed to reload Apache | 
|
589 | #svn.proxy.reload_cmd = /etc/init.d/apache2 reload | |
|
590 | ||
|
591 | ; If the timeout expires before the reload command finishes, the command will | |
|
592 | ; be killed. Setting it to zero means no timeout. Defaults to 10 seconds. | |
|
593 | #svn.proxy.reload_timeout = 10 | |
|
594 | ||
|
595 | ; #################### | |
|
596 | ; SSH Support Settings | |
|
597 | ; #################### | |
|
436 | 598 | |
|
437 | ## Appenlight integration enabled | |
|
599 | ; Defines if a custom authorized_keys file should be created and written on | |
|
600 | ; any change of user ssh keys. Setting this to false also disables the possibility | 
|
601 | ; of adding SSH keys by users from web interface. Super admins can still | |
|
602 | ; manage SSH Keys. | |
|
603 | ssh.generate_authorized_keyfile = false | |
|
604 | ||
|
605 | ; Options for ssh, default is `no-pty,no-port-forwarding,no-X11-forwarding,no-agent-forwarding` | |
|
606 | # ssh.authorized_keys_ssh_opts = | |
|
607 | ||
|
608 | ; Path to the authorized_keys file where the generated entries are placed. | 
|
609 | ; It is possible to have multiple key files specified in `sshd_config` e.g. | |
|
610 | ; AuthorizedKeysFile %h/.ssh/authorized_keys %h/.ssh/authorized_keys_rhodecode | |
|
611 | ssh.authorized_keys_file_path = ~/.ssh/authorized_keys_rhodecode | |
|
612 | ||
|
613 | ; Command to execute the SSH wrapper. The binary is available in the | |
|
614 | ; RhodeCode installation directory. | |
|
615 | ; e.g ~/.rccontrol/community-1/profile/bin/rc-ssh-wrapper | |
|
616 | ssh.wrapper_cmd = ~/.rccontrol/community-1/rc-ssh-wrapper | |
|
617 | ||
|
618 | ; Allow shell when executing the ssh-wrapper command | |
|
619 | ssh.wrapper_cmd_allow_shell = false | |
|
620 | ||
|
621 | ; Enables logging, and detailed output sent back to the client during SSH | 
|
622 | ; operations. Useful for debugging, shouldn't be used in production. | |
|
623 | ssh.enable_debug_logging = false | |
|
624 | ||
|
625 | ; Paths to binary executables, by default they are just the names, but we can | 
|
626 | ; override them if we want to use a custom one | |
|
627 | ssh.executable.hg = ~/.rccontrol/vcsserver-1/profile/bin/hg | |
|
628 | ssh.executable.git = ~/.rccontrol/vcsserver-1/profile/bin/git | |
|
629 | ssh.executable.svn = ~/.rccontrol/vcsserver-1/profile/bin/svnserve | |
|
630 | ||
|
631 | ; Enables SSH key generator web interface. Disabling this still allows users | |
|
632 | ; to add their own keys. | |
|
633 | ssh.enable_ui_key_generator = true | |
|
634 | ||
|
635 | ||
|
636 | ; ################# | |
|
637 | ; APPENLIGHT CONFIG | |
|
638 | ; ################# | |
|
639 | ||
|
640 | ; Appenlight is tailored to work with RhodeCode, see | |
|
641 | ; http://appenlight.rhodecode.com for details how to obtain an account | |
|
642 | ||
|
643 | ; Appenlight integration enabled | |
|
438 | 644 | appenlight = false |
|
439 | 645 | |
|
440 | 646 | appenlight.server_url = https://api.appenlight.com |
|
441 | 647 | appenlight.api_key = YOUR_API_KEY |
|
442 | 648 | #appenlight.transport_config = https://api.appenlight.com?threaded=1&timeout=5 |
|
443 | 649 | |
|
444 |
|
|
|
650 | ; used for JS client | |
|
445 | 651 | appenlight.api_public_key = YOUR_API_PUBLIC_KEY |
|
446 | 652 | |
|
447 |
|
|
|
653 | ; TWEAK AMOUNT OF INFO SENT HERE | |
|
448 | 654 | |
|
449 |
|
|
|
655 | ; enables 404 error logging (default False) | |
|
450 | 656 | appenlight.report_404 = false |
|
451 | 657 | |
|
452 |
|
|
|
658 | ; time in seconds after which a request is considered slow (default 1) | 
|
453 | 659 | appenlight.slow_request_time = 1 |
|
454 | 660 | |
|
455 |
|
|
|
456 |
|
|
|
661 | ; record slow requests in application | |
|
662 | ; (needs to be enabled for slow datastore recording and time tracking) | |
|
457 | 663 | appenlight.slow_requests = true |
|
458 | 664 | |
|
459 |
|
|
|
665 | ; enable hooking to application loggers | |
|
460 | 666 | appenlight.logging = true |
|
461 | 667 | |
|
462 |
|
|
|
668 | ; minimum log level for log capture | |
|
463 | 669 | appenlight.logging.level = WARNING |
|
464 | 670 | |
|
465 |
|
|
|
466 |
|
|
|
671 | ; send logs only from erroneous/slow requests | |
|
672 | ; (saves API quota for intensive logging) | |
|
467 | 673 | appenlight.logging_on_error = false |
|
468 | 674 | |
|
469 |
|
|
|
470 |
|
|
|
471 |
|
|
|
472 |
|
|
|
473 |
|
|
|
675 | ; list of additional keywords that should be grabbed from environ object | |
|
676 | ; can be string with comma separated list of words in lowercase | |
|
677 | ; (by default client will always send following info: | |
|
678 | ; 'REMOTE_USER', 'REMOTE_ADDR', 'SERVER_NAME', 'CONTENT_TYPE' + all keys that | |
|
679 | ; start with HTTP*; this list can be extended with additional keywords here | 
|
474 | 680 | appenlight.environ_keys_whitelist = |
|
475 | 681 | |
|
476 |
|
|
|
477 |
|
|
|
478 |
|
|
|
479 |
|
|
|
480 |
|
|
|
682 | ; list of keywords that should be blanked from request object | |
|
683 | ; can be string with comma separated list of words in lowercase | |
|
684 | ; (by default client will always blank keys that contain following words | |
|
685 | ; 'password', 'passwd', 'pwd', 'auth_tkt', 'secret', 'csrf' | |
|
686 | ; this list can be extended with additional keywords set here | 
|
481 | 687 | appenlight.request_keys_blacklist = |
|
482 | 688 | |
|
483 |
|
|
|
484 |
|
|
|
485 |
|
|
|
689 | ; list of namespaces that should be ignored when gathering log entries | 
|
690 | ; can be string with comma separated list of namespaces | |
|
691 | ; (by default the client ignores own entries: appenlight_client.client) | |
|
486 | 692 | appenlight.log_namespace_blacklist = |
|
487 | 693 | |
|
488 | ||
|
489 | ########################################### | |
|
490 | ### MAIN RHODECODE DATABASE CONFIG ### | |
|
491 | ########################################### | |
|
492 | #sqlalchemy.db1.url = sqlite:///%(here)s/rhodecode.db?timeout=30 | |
|
493 | #sqlalchemy.db1.url = postgresql://postgres:qweqwe@localhost/rhodecode | |
|
494 | #sqlalchemy.db1.url = mysql://root:qweqwe@localhost/rhodecode?charset=utf8 | |
|
495 | # pymysql is an alternative driver for MySQL, use in case of problems with default one | |
|
496 | #sqlalchemy.db1.url = mysql+pymysql://root:qweqwe@localhost/rhodecode | |
|
497 | ||
|
498 | sqlalchemy.db1.url = postgresql://postgres:qweqwe@localhost/rhodecode | |
|
499 | ||
|
500 | # see sqlalchemy docs for other advanced settings | |
|
501 | ||
|
502 | ## print the sql statements to output | |
|
503 | sqlalchemy.db1.echo = false | |
|
504 | ## recycle the connections after this amount of seconds | |
|
505 | sqlalchemy.db1.pool_recycle = 3600 | |
|
506 | sqlalchemy.db1.convert_unicode = true | |
|
507 | ||
|
508 | ## the number of connections to keep open inside the connection pool. | |
|
509 | ## 0 indicates no limit | |
|
510 | #sqlalchemy.db1.pool_size = 5 | |
|
511 | ||
|
512 | ## the number of connections to allow in connection pool "overflow", that is | |
|
513 | ## connections that can be opened above and beyond the pool_size setting, | |
|
514 | ## which defaults to five. | |
|
515 | #sqlalchemy.db1.max_overflow = 10 | |
|
516 | ||
|
517 | ## Connection check ping, used to detect broken database connections | |
|
518 | ## could be enabled to better handle cases if MySQL has gone away errors | |
|
519 | #sqlalchemy.db1.ping_connection = true | |
|
520 | ||
|
521 | ################## | |
|
522 | ### VCS CONFIG ### | |
|
523 | ################## | |
|
524 | vcs.server.enable = true | |
|
525 | vcs.server = localhost:9900 | |
|
526 | ||
|
527 | ## Web server connectivity protocol, responsible for web based VCS operations | |
|
528 | ## Available protocols are: | |
|
529 | ## `http` - use http-rpc backend (default) | |
|
530 | vcs.server.protocol = http | |
|
531 | ||
|
532 | ## Push/Pull operations protocol, available options are: | |
|
533 | ## `http` - use http-rpc backend (default) | |
|
534 | vcs.scm_app_implementation = http | |
|
535 | ||
|
536 | ## Push/Pull operations hooks protocol, available options are: | |
|
537 | ## `http` - use http-rpc backend (default) | |
|
538 | vcs.hooks.protocol = http | |
|
539 | ||
|
540 | ## Host on which this instance is listening for hooks. If vcsserver is in other location | |
|
541 | ## this should be adjusted. | |
|
542 | vcs.hooks.host = 127.0.0.1 | |
|
543 | ||
|
544 | vcs.server.log_level = info | |
|
545 | ## Start VCSServer with this instance as a subprocess, useful for development | |
|
546 | vcs.start_server = false | |
|
547 | ||
|
548 | ## List of enabled VCS backends, available options are: | |
|
549 | ## `hg` - mercurial | |
|
550 | ## `git` - git | |
|
551 | ## `svn` - subversion | |
|
552 | vcs.backends = hg, git, svn | |
|
553 | ||
|
554 | vcs.connection_timeout = 3600 | |
|
555 | ## Compatibility version when creating SVN repositories. Defaults to newest version when commented out. | |
|
556 | ## Available options are: pre-1.4-compatible, pre-1.5-compatible, pre-1.6-compatible, pre-1.8-compatible, pre-1.9-compatible | |
|
557 | #vcs.svn.compatible_version = pre-1.8-compatible | |
|
558 | ||
|
559 | ||
|
560 | ############################################################ | |
|
561 | ### Subversion proxy support (mod_dav_svn) ### | |
|
562 | ### Maps RhodeCode repo groups into SVN paths for Apache ### | |
|
563 | ############################################################ | |
|
564 | ## Enable or disable the config file generation. | |
|
565 | svn.proxy.generate_config = false | |
|
566 | ## Generate config file with `SVNListParentPath` set to `On`. | |
|
567 | svn.proxy.list_parent_path = true | |
|
568 | ## Set location and file name of generated config file. | |
|
569 | svn.proxy.config_file_path = %(here)s/mod_dav_svn.conf | |
|
570 | ## alternative mod_dav config template. This needs to be a mako template | |
|
571 | #svn.proxy.config_template = ~/.rccontrol/enterprise-1/custom_svn_conf.mako | |
|
572 | ## Used as a prefix to the `Location` block in the generated config file. | |
|
573 | ## In most cases it should be set to `/`. | |
|
574 | svn.proxy.location_root = / | |
|
575 | ## Command to reload the mod dav svn configuration on change. | |
|
576 | ## Example: `/etc/init.d/apache2 reload` or /home/USER/apache_reload.sh | |
|
577 | ## Make sure user who runs RhodeCode process is allowed to reload Apache | |
|
578 | #svn.proxy.reload_cmd = /etc/init.d/apache2 reload | |
|
579 | ## If the timeout expires before the reload command finishes, the command will | |
|
580 | ## be killed. Setting it to zero means no timeout. Defaults to 10 seconds. | |
|
581 | #svn.proxy.reload_timeout = 10 | |
|
582 | ||
|
583 | ############################################################ | |
|
584 | ### SSH Support Settings ### | |
|
585 | ############################################################ | |
|
586 | ||
|
587 | ## Defines if a custom authorized_keys file should be created and written on | |
|
588 | ## any change user ssh keys. Setting this to false also disables possibility | |
|
589 | ## of adding SSH keys by users from web interface. Super admins can still | |
|
590 | ## manage SSH Keys. | |
|
591 | ssh.generate_authorized_keyfile = false | |
|
592 | ||
|
593 | ## Options for ssh, default is `no-pty,no-port-forwarding,no-X11-forwarding,no-agent-forwarding` | |
|
594 | # ssh.authorized_keys_ssh_opts = | |
|
595 | ||
|
596 | ## Path to the authorized_keys file where the generate entries are placed. | |
|
597 | ## It is possible to have multiple key files specified in `sshd_config` e.g. | |
|
598 | ## AuthorizedKeysFile %h/.ssh/authorized_keys %h/.ssh/authorized_keys_rhodecode | |
|
599 | ssh.authorized_keys_file_path = ~/.ssh/authorized_keys_rhodecode | |
|
600 | ||
|
601 | ## Command to execute the SSH wrapper. The binary is available in the | |
|
602 | ## RhodeCode installation directory. | |
|
603 | ## e.g ~/.rccontrol/community-1/profile/bin/rc-ssh-wrapper | |
|
604 | ssh.wrapper_cmd = ~/.rccontrol/community-1/rc-ssh-wrapper | |
|
605 | ||
|
606 | ## Allow shell when executing the ssh-wrapper command | |
|
607 | ssh.wrapper_cmd_allow_shell = false | |
|
608 | ||
|
609 | ## Enables logging, and detailed output send back to the client during SSH | |
|
610 | ## operations. Useful for debugging, shouldn't be used in production. | |
|
611 | ssh.enable_debug_logging = false | |
|
612 | ||
|
613 | ## Paths to binary executable, by default they are the names, but we can | |
|
614 | ## override them if we want to use a custom one | |
|
615 | ssh.executable.hg = ~/.rccontrol/vcsserver-1/profile/bin/hg | |
|
616 | ssh.executable.git = ~/.rccontrol/vcsserver-1/profile/bin/git | |
|
617 | ssh.executable.svn = ~/.rccontrol/vcsserver-1/profile/bin/svnserve | |
|
618 | ||
|
619 | ## Enables SSH key generator web interface. Disabling this still allows users | |
|
620 | ## to add their own keys. | |
|
621 | ssh.enable_ui_key_generator = true | |
|
622 | ||
|
623 | ||
|
624 | ## Dummy marker to add new entries after. | |
|
625 | ## Add any custom entries below. Please don't remove. | |
|
694 | ; Dummy marker to add new entries after. | |
|
695 | ; Add any custom entries below. Please don't remove this marker. | |
|
626 | 696 | custom.conf = 1 |
|
627 | 697 | |
|
628 | 698 | |
|
629 |
|
|
|
630 |
|
|
|
631 |
|
|
|
699 | ; ##################### | |
|
700 | ; LOGGING CONFIGURATION | |
|
701 | ; ##################### | |
|
632 | 702 | [loggers] |
|
633 | 703 | keys = root, sqlalchemy, beaker, celery, rhodecode, ssh_wrapper |
|
634 | 704 | |
@@ -638,9 +708,9 b' keys = console, console_sql' | |||
|
638 | 708 | [formatters] |
|
639 | 709 | keys = generic, color_formatter, color_formatter_sql |
|
640 | 710 | |
|
641 |
|
|
|
642 |
|
|
|
643 |
|
|
|
711 | ; ####### | |
|
712 | ; LOGGERS | |
|
713 | ; ####### | |
|
644 | 714 | [logger_root] |
|
645 | 715 | level = NOTSET |
|
646 | 716 | handlers = console |
@@ -675,9 +745,9 b' handlers =' | |||
|
675 | 745 | qualname = celery |
|
676 | 746 | |
|
677 | 747 | |
|
678 |
|
|
|
679 |
|
|
|
680 |
|
|
|
748 | ; ######## | |
|
749 | ; HANDLERS | |
|
750 | ; ######## | |
|
681 | 751 | |
|
682 | 752 | [handler_console] |
|
683 | 753 | class = StreamHandler |
@@ -686,17 +756,17 b' level = INFO' | |||
|
686 | 756 | formatter = generic |
|
687 | 757 | |
|
688 | 758 | [handler_console_sql] |
|
689 |
|
|
|
690 |
|
|
|
691 |
|
|
|
759 | ; "level = DEBUG" logs SQL queries and results. | |
|
760 | ; "level = INFO" logs SQL queries. | |
|
761 | ; "level = WARN" logs neither. (Recommended for production systems.) | |
|
692 | 762 | class = StreamHandler |
|
693 | 763 | args = (sys.stderr, ) |
|
694 | 764 | level = WARN |
|
695 | 765 | formatter = generic |
|
696 | 766 | |
|
697 |
|
|
|
698 |
|
|
|
699 |
|
|
|
767 | ; ########## | |
|
768 | ; FORMATTERS | |
|
769 | ; ########## | |
|
700 | 770 | |
|
701 | 771 | [formatter_generic] |
|
702 | 772 | class = rhodecode.lib.logging_formatter.ExceptionAwareFormatter |
@@ -190,7 +190,7 b' let' | |||
|
190 | 190 | postInstall = '' |
|
191 | 191 | # check required files |
|
192 | 192 | STATIC_CHECK="/robots.txt /502.html |
|
193 | /js/scripts.js /js/rhodecode-components.js | |
|
193 | /js/scripts.min.js /js/rhodecode-components.js | |
|
194 | 194 | /css/style.css /css/style-polymer.css /css/style-ipython.css" |
|
195 | 195 | |
|
196 | 196 | for file in $STATIC_CHECK; |
@@ -12,13 +12,50 b' Here is how to force delete a repository' | |||
|
12 | 12 | |
|
13 | 13 | |
|
14 | 14 | .. code-block:: bash |
|
15 | :dedent: 1 | |
|
15 | 16 | |
|
16 | 17 | # starts the ishell interactive prompt |
|
17 | 18 | $ rccontrol ishell enterprise-1 |
|
18 | 19 | |
|
19 | 20 | .. code-block:: python |
|
21 | :dedent: 1 | |
|
20 | 22 | |
|
21 | 23 | In [4]: from rhodecode.model.repo import RepoModel |
|
22 | 24 | In [3]: repo = Repository.get_by_repo_name('test_repos/repo_with_prs') |
|
23 | 25 | In [5]: RepoModel().delete(repo, forks='detach', pull_requests='delete') |
|
24 | 26 | In [6]: Session().commit() |
|
27 | ||
|
28 | ||
|
29 | Below is a fully automated example to force delete repositories reading from a | |
|
30 | file where each line is a repository name. This can be executed via a simple CLI command | 
|
31 | without entering the interactive shell. | |
|
32 | ||
|
33 | Save the below content as a file named `repo_delete_task.py` | |
|
34 | ||
|
35 | ||
|
36 | .. code-block:: python | |
|
37 | :dedent: 1 | |
|
38 | ||
|
39 | from rhodecode.model.db import * | |
|
40 | from rhodecode.model.repo import RepoModel | |
|
41 | with open('delete_repos.txt', 'rb') as f: | |
|
42 | # read all lines from file | |
|
43 | repos = f.readlines() | |
|
44 | for repo_name in repos: | |
|
45 | repo_name = repo_name.strip() # cleanup the name just in case | |
|
46 | repo = Repository.get_by_repo_name(repo_name) | |
|
47 | if not repo: | |
|
48 | raise Exception('Repo with name {} not found'.format(repo_name)) | |
|
49 | RepoModel().delete(repo, forks='detach', pull_requests='delete') | |
|
50 | Session().commit() | |
|
51 | print('Removed repository {}'.format(repo_name)) | |
|
52 | ||
|
53 | ||
|
54 | The code above will read the names of repositories from a file called `delete_repos.txt`. | 
|
55 | Each line should represent a single name, e.g `repo_name_1` or `repo_group/repo_name_2`. | 
|
56 | ||
|
57 | Run this line from CLI to execute the code from the `repo_delete_task.py` file and | |
|
58 | exit the ishell after the execution:: | |
|
59 | ||
|
60 | echo "%run repo_delete_task.py" | rccontrol ishell Enterprise-1 | |
|
61 |
@@ -110,6 +110,7 b' Use the following example to configure N' | |||
|
110 | 110 | # gzip_types text/css text/javascript text/xml text/plain text/x-component application/javascript application/json application/xml application/rss+xml font/truetype font/opentype application/vnd.ms-fontobject image/svg+xml; |
|
111 | 111 | # gzip_vary on; |
|
112 | 112 | # gzip_disable "msie6"; |
|
113 | # expires 60d; | |
|
113 | 114 | # alias /path/to/.rccontrol/community-1/static; |
|
114 | 115 | # alias /path/to/.rccontrol/enterprise-1/static; |
|
115 | 116 | # } |
@@ -9,7 +9,8 b' may find some of the following methods u' | |||
|
9 | 9 | .. toctree:: |
|
10 | 10 | |
|
11 | 11 | tuning/tuning-gunicorn |
|
12 | tuning/tuning-vcs-memory-usage | 
|
12 | tuning/tuning-vcs-server-memory-usage | |
|
13 | tuning/tuning-rhodecode-memory-usage | |
|
13 | 14 | tuning/tuning-user-sessions-performance |
|
14 | 15 | tuning/tuning-increase-db-performance |
|
15 | 16 | tuning/tuning-scale-horizontally-cluster |
@@ -25,26 +25,22 b' 2. In the ``[server:main]`` section, cha' | |||
|
25 | 25 | |
|
26 | 26 | .. code-block:: ini |
|
27 | 27 | |
|
28 | use = egg:gunicorn#main | |
|
29 | ## Sets the number of process workers. You must set `instance_id = *` | |
|
30 | ## when this option is set to more than one worker, recommended | |
|
31 |
|
|
|
32 | ## The `instance_id = *` must be set in the [app:main] section below | |
|
33 | workers = 4 | |
|
34 | ## process name | |
|
35 | proc_name = rhodecode | |
|
36 | ## type of worker class, one of sync, gevent | |
|
37 | ## recommended for bigger setup is using of of other than sync one | |
|
38 | worker_class = sync | |
|
39 | ## The maximum number of simultaneous clients. Valid only for Gevent | |
|
40 | #worker_connections = 10 | |
|
41 | ## max number of requests that worker will handle before being gracefully | |
|
42 | ## restarted, could prevent memory leaks | |
|
43 | max_requests = 1000 | |
|
44 | max_requests_jitter = 30 | |
|
45 | ## amount of time a worker can spend with handling a request before it | 
|
46 | ## gets killed and restarted. Set to 6hrs | |
|
47 | timeout = 21600 | |
|
28 | ; Sets the number of process workers. More workers means more concurrent connections | |
|
29 | ; RhodeCode can handle at the same time. Each additional worker also increases | 
|
30 | ; memory usage, as each has its own set of caches. | 
|
31 | ; Recommended value is (2 * NUMBER_OF_CPUS + 1), eg 2CPU = 5 workers, but no more | |
|
32 | ; than 8-10 unless for really big deployments, e.g. 700-1000 users. | 
|
33 | ; `instance_id = *` must be set in the [app:main] section below (which is the default) | |
|
34 | ; when using more than 1 worker. | |
|
35 | workers = 6 | |
|
36 | ||
|
37 | ; Type of worker class, one of `sync`, `gevent` | |
|
38 | ; Use `gevent` for rhodecode | |
|
39 | worker_class = gevent | |
|
40 | ||
|
41 | ; The maximum number of simultaneous clients per worker. Valid only for gevent | |
|
42 | worker_connections = 10 | |
|
43 | ||
|
48 | 44 | |
|
49 | 45 | 3. In the ``[app:main]`` section, set the ``instance_id`` property to ``*``. |
|
50 | 46 | |
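   For reference, a minimal ``[app:main]`` entry matching this step looks like:

   .. code-block:: ini

       [app:main]
       instance_id = *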
@@ -63,24 +59,19 b' 5. In the ``[server:main]`` section, inc' | |||
|
63 | 59 | |
|
64 | 60 | .. code-block:: ini |
|
65 | 61 | |
|
66 | ## run with gunicorn --log-config vcsserver.ini --paste vcsserver.ini | |
|
67 | use = egg:gunicorn#main | |
|
68 | ## Sets the number of process workers. Recommended | |
|
69 |
|
|
|
70 | workers = 4 | |
|
71 | ## process name | |
|
72 | proc_name = rhodecode_vcsserver | |
|
73 | ## type of worker class, currently `sync` is the only option allowed. | |
|
62 | ; Sets the number of process workers. More workers means more concurrent connections | |
|
63 | ; RhodeCode can handle at the same time. Each additional worker also increases | 
|
64 | ; memory usage, as each has its own set of caches. | 
|
65 | ; Recommended value is (2 * NUMBER_OF_CPUS + 1), eg 2CPU = 5 workers, but no more | |
|
66 | ; than 8-10 unless for really big deployments, e.g. 700-1000 users. | 
|
67 | ; `instance_id = *` must be set in the [app:main] section below (which is the default) | |
|
68 | ; when using more than 1 worker. | |
|
69 | workers = 8 | |
|
70 | ||
|
71 | ; Type of worker class, one of `sync`, `gevent` | |
|
72 | ; Use `sync` for vcsserver | |
|
74 | 73 | worker_class = sync |
|
75 | ## The maximum number of simultaneous clients. Valid only for Gevent | |
|
76 | #worker_connections = 10 | |
|
77 | ## max number of requests that worker will handle before being gracefully | |
|
78 | ## restarted, could prevent memory leaks | |
|
79 | max_requests = 1000 | |
|
80 | max_requests_jitter = 30 | |
|
81 | ## amount of time a worker can spend with handling a request before it | |
|
82 | ## gets killed and restarted. Set to 6hrs | |
|
83 | timeout = 21600 | |
|
74 | ||
|
84 | 75 | |
|
85 | 76 | 6. Save your changes. |
|
86 | 77 | 7. Restart your |RCE| instances, using the following command: |
@@ -109,17 +100,18 b' 2. In the ``[server:main]`` section, cha' | |||
|
109 | 100 | |
|
110 | 101 | .. code-block:: ini |
|
111 | 102 | |
|
112 |
|
|
|
113 | ## recommended for bigger setup is using of of other than sync one | |
|
103 | ; Type of worker class, one of `sync`, `gevent` | |
|
104 | ; Use `gevent` for rhodecode | |
|
114 | 105 | worker_class = gevent |
|
115 | ## The maximum number of simultaneous clients. Valid only for Gevent | |
|
106 | ||
|
107 | ; The maximum number of simultaneous clients per worker. Valid only for gevent | |
|
116 | 108 | worker_connections = 30 |
|
117 | 109 | |
|
118 | 110 | |
|
119 | 111 | .. note:: |
|
120 | 112 | |
|
121 | 113 | `Gevent` is currently only supported for Enterprise/Community instances. |
|
122 |
VCSServer doesn't |
|
|
114 | VCSServer doesn't support gevent. | |
|
123 | 115 | |
|
124 | 116 | |
|
125 | 117 |
@@ -57,7 +57,7 b" Here's an overview what components shoul" | |||
|
57 | 57 | - `nginx` acting as a load-balancer. |
|
58 | 58 | - `postgresql-server` used for database and sessions. |
|
59 | 59 | - `redis-server` used for storing shared caches. |
|
60 | - optionally `rabbitmq-server` for `Celery` if used. | |
|
60 | - optionally `rabbitmq-server` or `redis` for `Celery` if used. | |
|
61 | 61 | - optionally if `Celery` is used Enterprise/Community instance + VCSServer. |
|
62 | 62 | - optionally mailserver that can be shared by other instances. |
|
63 | 63 | - optionally channelstream server to handle live communication for all instances. |
@@ -263,6 +263,7 b' 6) Configure `Nginx`_ as reverse proxy o' | |||
|
263 | 263 | gzip_types text/css text/javascript text/xml text/plain text/x-component application/javascript application/json application/xml application/rss+xml font/truetype font/opentype application/vnd.ms-fontobject image/svg+xml; |
|
264 | 264 | gzip_vary on; |
|
265 | 265 | gzip_disable "msie6"; |
|
266 | expires 60d; | |
|
266 | 267 | #alias /home/rcdev/.rccontrol/community-1/static; |
|
267 | 268 | alias /home/rcdev/.rccontrol/enterprise-1/static; |
|
268 | 269 | } |
@@ -372,16 +373,16 b' Using Celery with cluster' | |||
|
372 | 373 | |
|
373 | 374 | |
|
374 | 375 | If `Celery` is used we recommend setting also an instance of Enterprise/Community+VCSserver |
|
375 | on the node that is running `RabbitMQ`_. Those instances will be used to | 
|
376 | tasks on the `rc-node-1`. This is the most efficient setup. | 
|
377 | handles tasks such as sending emails, forking repositories, importing | |
|
376 | on the node that is running `RabbitMQ`_ or `Redis`_. Those instances will be used to | |
|
377 | execute async tasks on the `rc-node-1`. This is the most efficient setup. | 
|
378 | `Celery` usually handles tasks such as sending emails, forking repositories, importing | |
|
378 | 379 | repositories from external location etc. Using workers on instance that has |
|
379 | 380 | the direct access to disks used by NFS as well as email server gives noticeable |
|
380 | 381 | performance boost. Running local workers to the NFS storage results in faster |
|
381 | 382 | execution of forking large repositories or sending lots of emails. |
|
382 | 383 | |
|
383 | 384 | Those instances need to be configured in the same way as for other nodes. |
|
384 | The instance in rc-node-1 can be added to the cluser, but we don't recommend doing it. | |
|
385 | The instance in rc-node-1 can be added to the cluster, but we don't recommend doing it. | |
|
385 | 386 | For best results let it be isolated to only executing `Celery` tasks in the cluster setup. |
|
386 | 387 | |
|
387 | 388 |
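A minimal sketch of the relevant settings on such a worker instance, assuming the
broker runs on `rc-node-1` with the default ports used in the shipped rhodecode.ini,
could look like:

.. code-block:: ini

    use_celery = true
    ; Redis broker running on the shared node
    celery.broker_url = redis://rc-node-1:6379/8
    ; or, when RabbitMQ is used instead
    #celery.broker_url = amqp://rabbitmq:qweqwe@rc-node-1:5672/rabbitmqhost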
@@ -1,8 +1,26 b'' | |||
|
1 | .. _adjust-vcs-mem: | 
|
1 | .. _adjust-vcs-server-mem: | |
|
2 | 2 | |
|
3 | VCSServer Memory | 
|
3 | VCSServer Memory Usage | |
|
4 | 4 | ---------------------- |
|
5 | 5 | |
|
6 | The VCS Server mamory cache can be adjusted to work best with the resources | |
|
7 | available to your |RCE| instance. If you find that memory resources are under | |
|
8 | pressure, see the :ref:`vcs-server-maintain` section for details. | |
|
6 | Starting from Version 4.18.X RhodeCode has a builtin memory monitor for gunicorn workers. | |
|
7 | Enabling this can limit the maximum amount of memory the system can use. Each worker | 
|
8 | for VCS Server is monitored independently. | |
|
9 | To enable memory management make sure to have the following settings inside the `[app:main]` section of | 
|
10 | :file:`home/{user}/.rccontrol/{instance-id}/vcsserver.ini` file. | |
|
11 | ||
|
12 | ||
|
13 | ||
|
14 | ; Maximum memory usage that each worker can use before it will receive a | |
|
15 | ; graceful restart signal 0 = memory monitoring is disabled | |
|
16 | ; Examples: 268435456 (256MB), 536870912 (512MB) | |
|
17 | ; 1073741824 (1GB), 2147483648 (2GB), 4294967296 (4GB) | |
|
18 | memory_max_usage = 1073741824 | |
|
19 | ||
|
20 | ; How often in seconds to check for memory usage for each gunicorn worker | |
|
21 | memory_usage_check_interval = 60 | |
|
22 | ||
|
23 | ; Threshold value for which we don't recycle worker if GarbageCollection | |
|
24 | ; frees up enough resources. Before each restart we try to run GC on worker | |
|
25 | ; in case we get enough free memory after that, restart will not happen. | |
|
26 | memory_usage_recovery_threshold = 0.8 |
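In other words, with the example values above a worker that grows past 1GB
(1073741824 bytes) is first garbage collected; only if its usage stays above
roughly 80% of that limit (about 859MB) after the GC run does the graceful
restart actually happen.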
@@ -110,35 +110,39 b' match, for example:' | |||
|
110 | 110 | |
|
111 | 111 | .. _vcs-server-maintain: |
|
112 | 112 | |
|
113 | VCS Server | 
|
114 | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ | 
|
113 | VCS Server Cache Optimization | |
|
114 | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ | |
|
115 | 115 | |
|
116 | To optimize the VCS server to manage the cache and memory usage efficiently, | 
|
117 | configure the following options in the | |
|
118 | :file:`/home/{user}/.rccontrol/{vcsserver-id}/vcsserver.ini` file. Once | |
|
119 | configured, restart the VCS Server. By default we use an optimal settings, but in certain | |
|
120 | conditions tunning expiration_time and max_size can affect memory usage and performance | |
|
116 | To optimize the VCS server to manage the cache and memory usage efficiently, it's recommended to | |
|
117 | configure the Redis backend for VCSServer caches. | |
|
118 | Once configured, restart the VCS Server. | |
|
119 | ||
|
120 | Make sure Redis is installed and running. | |
|
121 | Open :file:`/home/{user}/.rccontrol/{vcsserver-id}/vcsserver.ini` | |
|
122 | file and ensure the below settings for `repo_object` type cache are set: | |
|
121 | 123 | |
|
122 | 124 | .. code-block:: ini |
|
123 | 125 | |
|
124 | ## cache region for storing repo_objects cache | |
|
125 | rc_cache.repo_object.backend = dogpile.cache.rc. | 
|
126 | ; ensure the default file based cache is *commented out* | |
|
127 | ##rc_cache.repo_object.backend = dogpile.cache.rc.file_namespace | |
|
128 | ##rc_cache.repo_object.expiration_time = 2592000 | |
|
126 | 129 | |
|
127 | ## cache auto-expires after N seconds, setting this to 0 disabled cache | |
|
128 | rc_cache.repo_object.expiration_time = 300 | |
|
130 | ; `repo_object` cache settings for vcs methods for repositories | |
|
131 | rc_cache.repo_object.backend = dogpile.cache.rc.redis_msgpack | |
|
129 | 132 | |
|
130 | ## max size of LRU, old values will be discarded if the size of cache reaches max_size | |
|
131 | ## Sets the maximum number of items stored in the cache, before the cache | |
|
132 | ## starts to be cleared. | |
|
133 | ; cache auto-expires after N seconds | |
|
134 | ; Examples: 86400 (1Day), 604800 (7Days), 1209600 (14Days), 2592000 (30days), 7776000 (90Days) | |
|
135 | rc_cache.repo_object.expiration_time = 2592000 | |
|
136 | ||
|
137 | ; redis_expiration_time needs to be greater than expiration_time | |
|
138 | rc_cache.repo_object.arguments.redis_expiration_time = 3592000 | |
|
133 | 139 | |
|
134 | ## As a general rule of thumb, running this value at 120 resulted in a | |
|
135 | ## 5GB cache. Running it at 240 resulted in a 9GB cache. Your results | |
|
136 | ## will differ based on usage patterns and |repo| sizes. | |
|
137 | ||
|
138 | ## Tweaking this value to run at a fairly constant memory load on your | |
|
139 | ## server will help performance. | |
|
140 | ||
|
141 | rc_cache.repo_object.max_size = 120 | |
|
140 | rc_cache.repo_object.arguments.host = localhost | |
|
141 | rc_cache.repo_object.arguments.port = 6379 | |
|
142 | rc_cache.repo_object.arguments.db = 5 | |
|
143 | rc_cache.repo_object.arguments.socket_timeout = 30 | |
|
144 | ; more Redis options: https://dogpilecache.sqlalchemy.org/en/latest/api.html#redis-backends | |
|
145 | rc_cache.repo_object.arguments.distributed_lock = true | |
|
142 | 146 | |
|
143 | 147 | |
|
144 | 148 | To clear the cache completely, you can restart the VCS Server. |
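As a quick sanity check (not part of the upstream text), you can confirm the VCS Server is
actually writing to the configured Redis database. A minimal sketch, assuming the example
values above (localhost, port 6379, database 5):

.. code-block:: bash

    # should answer PONG if Redis is reachable
    redis-cli -h localhost -p 6379 -n 5 ping

    # number of keys in the cache database; it grows as repositories are accessed
    redis-cli -h localhost -p 6379 -n 5 dbsize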
@@ -190,25 +194,6 b' For a more detailed explanation of the l' | |||
|
190 | 194 | \port <int> |
|
191 | 195 | Set the port number on which the VCS Server will be available. |
|
192 | 196 | |
|
193 | \locale <locale_utf> | |
|
194 | Set the locale the VCS Server expects. | |
|
195 | ||
|
196 | \workers <int> | |
|
197 | Set the number of process workers.Recommended | |
|
198 | value is (2 * NUMBER_OF_CPUS + 1), eg 2CPU = 5 workers | |
|
199 | ||
|
200 | \max_requests <int> | |
|
201 | The maximum number of requests a worker will process before restarting. | |
|
202 | Any value greater than zero will limit the number of requests a work | |
|
203 | will process before automatically restarting. This is a simple method | |
|
204 | to help limit the damage of memory leaks. | |
|
205 | ||
|
206 | \max_requests_jitter <int> | |
|
207 | The maximum jitter to add to the max_requests setting. | |
|
208 | The jitter causes the restart per worker to be randomized by | |
|
209 | randint(0, max_requests_jitter). This is intended to stagger worker | |
|
210 | restarts to avoid all workers restarting at the same time. | |
|
211 | ||
|
212 | 197 | |
|
213 | 198 | .. note:: |
|
214 | 199 | |
@@ -216,63 +201,139 b' For a more detailed explanation of the l' | |||
|
216 | 201 | |
|
217 | 202 | .. code-block:: ini |
|
218 | 203 | |
|
219 |
|
|
|
220 | # RhodeCode VCSServer with HTTP Backend - configuration # | |
|
221 | # # | |
|
222 | ################################################################################ | |
|
223 | ||
|
204 | ; ################################# | |
|
205 | ; RHODECODE VCSSERVER CONFIGURATION | |
|
206 | ; ################################# | |
|
224 | 207 | |
|
225 | 208 | [server:main] |
|
226 | ## COMMON ## | |
|
209 | ; COMMON HOST/IP CONFIG | |
|
227 | 210 | host = 127.0.0.1 |
|
228 | 211 | port = 10002 |
|
229 | 212 | |
|
230 | ########################## | |
|
231 |
|
|
|
232 | ########################## | |
|
233 | ## run with gunicorn --log-config vcsserver.ini --paste vcsserver.ini | |
|
213 | ; ########################### | |
|
214 | ; GUNICORN APPLICATION SERVER | |
|
215 | ; ########################### | |
|
216 | ||
|
217 | ; run with gunicorn --log-config rhodecode.ini --paste rhodecode.ini | |
|
218 | ||
|
219 | ; Module to use, this setting shouldn't be changed | |
|
234 | 220 | use = egg:gunicorn#main |
|
235 | ## Sets the number of process workers. Recommended | |
|
236 | ## value is (2 * NUMBER_OF_CPUS + 1), eg 2CPU = 5 workers | |
|
237 | workers = 3 | |
|
238 | ## process name | |
|
221 | ||
|
222 | ; Sets the number of process workers. More workers means more concurrent connections | |
|
223 | ; RhodeCode can handle at the same time. Each additional worker also increases | |
|
224 | ; memory usage as each has its own set of caches. | |
|
225 | ; Recommended value is (2 * NUMBER_OF_CPUS + 1), e.g. 2 CPUs = 5 workers, but no more | |
|
226 | ; than 8-10 unless for really big deployments, e.g. 700-1000 users. | |
|
227 | ; `instance_id = *` must be set in the [app:main] section below (which is the default) | |
|
228 | ; when using more than 1 worker. | |
|
229 | workers = 6 | |
|
230 | ||
|
231 | ; Gunicorn access log level | |
|
232 | loglevel = info | |
|
233 | ||
|
234 | ; Process name visible in process list | |
|
239 | 235 | proc_name = rhodecode_vcsserver |
|
240 | ## type of worker class, one of sync, gevent | |
|
241 | ## recommended for bigger setup is using of of other than sync one | |
|
236 | ||
|
237 | ; Type of worker class, one of sync, gevent | |
|
238 | ; currently `sync` is the only option allowed. | |
|
242 | 239 | worker_class = sync |
|
243 | ## The maximum number of simultaneous clients. Valid only for Gevent | |
|
244 | #worker_connections = 10 | |
|
245 | ## max number of requests that worker will handle before being gracefully | |
|
246 | ## restarted, could prevent memory leaks | |
|
240 | ||
|
241 | ; The maximum number of simultaneous clients. Valid only for gevent | |
|
242 | worker_connections = 10 | |
|
243 | ||
|
244 | ; Max number of requests that worker will handle before being gracefully restarted. | |
|
245 | ; Prevents memory leaks, jitter adds variability so not all workers are restarted at once. | |
|
247 | 246 | max_requests = 1000 |
|
248 | 247 | max_requests_jitter = 30 |
|
249 | ## amount of time a worker can spend with handling a request before it | |
|
250 | ## gets killed and restarted. Set to 6hrs | |
|
248 | ||
|
249 | ; Amount of time a worker can spend with handling a request before it | |
|
250 | ; gets killed and restarted. By default set to 21600 (6hrs) | |
|
251 | ; Examples: 1800 (30min), 3600 (1hr), 7200 (2hr), 43200 (12h) | |
|
251 | 252 | timeout = 21600 |
|
252 | 253 | |
|
254 | ; The maximum size of HTTP request line in bytes. | |
|
255 | ; 0 for unlimited | |
|
256 | limit_request_line = 0 | |
|
257 | ||
|
258 | ; Limit the number of HTTP headers fields in a request. | |
|
259 | ; By default this value is 100 and can't be larger than 32768. | |
|
260 | limit_request_fields = 32768 | |
|
261 | ||
|
262 | ; Limit the allowed size of an HTTP request header field. | |
|
263 | ; Value is a positive number or 0. | |
|
264 | ; Setting it to 0 will allow unlimited header field sizes. | |
|
265 | limit_request_field_size = 0 | |
|
266 | ||
|
267 | ; Timeout for graceful workers restart. | |
|
268 | ; After receiving a restart signal, workers have this much time to finish | |
|
269 | ; serving requests. Workers still alive after the timeout (starting from the | |
|
270 | ; receipt of the restart signal) are force killed. | |
|
271 | ; Examples: 1800 (30min), 3600 (1hr), 7200 (2hr), 43200 (12h) | |
|
272 | graceful_timeout = 3600 | |
|
273 | ||
|
274 | # The number of seconds to wait for requests on a Keep-Alive connection. | |
|
275 | # Generally set in the 1-5 seconds range. | |
|
276 | keepalive = 2 | |
|
277 | ||
|
278 | ; Maximum memory usage that each worker can use before it will receive a | |
|
279 | ; graceful restart signal 0 = memory monitoring is disabled | |
|
280 | ; Examples: 268435456 (256MB), 536870912 (512MB) | |
|
281 | ; 1073741824 (1GB), 2147483648 (2GB), 4294967296 (4GB) | |
|
282 | memory_max_usage = 1073741824 | |
|
283 | ||
|
284 | ; How often in seconds to check for memory usage for each gunicorn worker | |
|
285 | memory_usage_check_interval = 60 | |
|
286 | ||
|
287 | ; Threshold value for which we don't recycle worker if GarbageCollection | |
|
288 | ; frees up enough resources. Before each restart we try to run GC on worker | |
|
289 | ; in case we get enough free memory after that, restart will not happen. | |
|
290 | memory_usage_recovery_threshold = 0.8 | |
|
291 | ||
|
292 | ||
|
253 | 293 | [app:main] |
|
254 | 294 | use = egg:rhodecode-vcsserver |
|
255 | 295 | |
|
256 | 296 | pyramid.default_locale_name = en |
|
257 | 297 | pyramid.includes = |
|
258 | 298 | |
|
259 | # default locale used by VCS systems | |
|
299 | ; default locale used by VCS systems | |
|
260 | 300 | locale = en_US.UTF-8 |
|
261 | 301 | |
|
262 | # cache regions, please don't change | |
|
263 | beaker.cache.regions = repo_object | |
|
264 | beaker.cache.repo_object.type = memorylru | |
|
265 | beaker.cache.repo_object.max_items = 100 | |
|
266 | # cache auto-expires after N seconds | |
|
267 | beaker.cache.repo_object.expire = 300 | |
|
268 | beaker.cache.repo_object.enabled = true | |
|
302 | ; ############# | |
|
303 | ; DOGPILE CACHE | |
|
304 | ; ############# | |
|
305 | ||
|
306 | ; Default cache dir for caches. Putting this into a ramdisk can boost performance. | |
|
307 | ; eg. /tmpfs/data_ramdisk, however this directory might require large amount of space | |
|
308 | cache_dir = %(here)s/data | |
|
309 | ||
|
310 | ; ********************************************************** | |
|
311 | ; `repo_object` cache with redis backend | |
|
312 | ; recommended for larger instance, or for better performance | |
|
313 | ; ********************************************************** | |
|
314 | ||
|
315 | ; `repo_object` cache settings for vcs methods for repositories | |
|
316 | rc_cache.repo_object.backend = dogpile.cache.rc.redis_msgpack | |
|
269 | 317 | |
|
318 | ; cache auto-expires after N seconds | |
|
319 | ; Examples: 86400 (1Day), 604800 (7Days), 1209600 (14Days), 2592000 (30days), 7776000 (90Days) | |
|
320 | rc_cache.repo_object.expiration_time = 2592000 | |
|
270 | 321 | |
|
271 | ################################ | |
|
272 | ### LOGGING CONFIGURATION #### | |
|
273 | ################################ | |
|
322 | ; redis_expiration_time needs to be greater than expiration_time | |
|
323 | rc_cache.repo_object.arguments.redis_expiration_time = 3592000 | |
|
324 | ||
|
325 | rc_cache.repo_object.arguments.host = localhost | |
|
326 | rc_cache.repo_object.arguments.port = 6379 | |
|
327 | rc_cache.repo_object.arguments.db = 5 | |
|
328 | rc_cache.repo_object.arguments.socket_timeout = 30 | |
|
329 | ; more Redis options: https://dogpilecache.sqlalchemy.org/en/latest/api.html#redis-backends | |
|
330 | rc_cache.repo_object.arguments.distributed_lock = true | |
|
331 | ||
|
332 | ; ##################### | |
|
333 | ; LOGGING CONFIGURATION | |
|
334 | ; ##################### | |
|
274 | 335 | [loggers] |
|
275 | keys = root, vcsserver | |
|
336 | keys = root, vcsserver | |
|
276 | 337 | |
|
277 | 338 | [handlers] |
|
278 | 339 | keys = console |
@@ -280,9 +341,9 b' For a more detailed explanation of the l' | |||
|
280 | 341 | [formatters] |
|
281 | 342 | keys = generic |
|
282 | 343 | |
|
283 |
|
|
|
284 |
|
|
|
285 |
|
|
|
344 | ; ####### | |
|
345 | ; LOGGERS | |
|
346 | ; ####### | |
|
286 | 347 | [logger_root] |
|
287 | 348 | level = NOTSET |
|
288 | 349 | handlers = console |
@@ -293,29 +354,23 b' For a more detailed explanation of the l' | |||
|
293 | 354 | qualname = vcsserver |
|
294 | 355 | propagate = 1 |
|
295 | 356 | |
|
296 | [logger_beaker] | |
|
297 | level = DEBUG | |
|
298 | handlers = | |
|
299 | qualname = beaker | |
|
300 | propagate = 1 | |
|
301 | 357 | |
|
302 | ||
|
303 | ############## | |
|
304 | ## HANDLERS ## | |
|
305 | ############## | |
|
358 | ; ######## | |
|
359 | ; HANDLERS | |
|
360 | ; ######## | |
|
306 | 361 | |
|
307 | 362 | [handler_console] |
|
308 | 363 | class = StreamHandler |
|
309 | args = (sys.stderr,) | |
|
310 |
level = |
|
|
364 | args = (sys.stderr, ) | |
|
365 | level = INFO | |
|
311 | 366 | formatter = generic |
|
312 | 367 | |
|
313 |
|
|
|
314 |
|
|
|
315 |
|
|
|
368 | ; ########## | |
|
369 | ; FORMATTERS | |
|
370 | ; ########## | |
|
316 | 371 | |
|
317 | 372 | [formatter_generic] |
|
318 | format = %(asctime)s.%(msecs)03d %(levelname)-5.5s [%(name)s] %(message)s | |
|
373 | format = %(asctime)s.%(msecs)03d [%(process)d] %(levelname)-5.5s [%(name)s] %(message)s | |
|
319 | 374 | datefmt = %Y-%m-%d %H:%M:%S |
|
320 | 375 | |
|
321 | 376 |
@@ -39,7 +39,7 b' close_pull_request' | |||
|
39 | 39 | comment_pull_request |
|
40 | 40 | -------------------- |
|
41 | 41 | |
|
42 | .. py:function:: comment_pull_request(apiuser, pullrequestid, repoid=<Optional:None>, message=<Optional:None>, commit_id=<Optional:None>, status=<Optional:None>, comment_type=<Optional:u'note'>, resolves_comment_id=<Optional:None>, userid=<Optional:<OptionalAttr:apiuser>>) | |
|
42 | .. py:function:: comment_pull_request(apiuser, pullrequestid, repoid=<Optional:None>, message=<Optional:None>, commit_id=<Optional:None>, status=<Optional:None>, comment_type=<Optional:u'note'>, resolves_comment_id=<Optional:None>, extra_recipients=<Optional:[]>, userid=<Optional:<OptionalAttr:apiuser>>) | |
|
43 | 43 | |
|
44 | 44 | Comment on the pull request specified with the `pullrequestid`, |
|
45 | 45 | in the |repo| specified by the `repoid`, and optionally change the |
@@ -63,6 +63,11 b' comment_pull_request' | |||
|
63 | 63 | :type status: str |
|
64 | 64 | :param comment_type: Comment type, one of: 'note', 'todo' |
|
65 | 65 | :type comment_type: Optional(str), default: 'note' |
|
66 | :param resolves_comment_id: id of comment which this one will resolve | |
|
67 | :type resolves_comment_id: Optional(int) | |
|
68 | :param extra_recipients: list of user ids or usernames to add | |
|
69 | notifications for this comment. Acts like a CC for notifications | |
|
70 | :type extra_recipients: Optional(list) | |
|
66 | 71 | :param userid: Comment on the pull request as this user |
|
67 | 72 | :type userid: Optional(str or int) |
|
68 | 73 | |
@@ -126,7 +131,7 b' create_pull_request' | |||
|
126 | 131 | get_pull_request |
|
127 | 132 | ---------------- |
|
128 | 133 | |
|
129 | .. py:function:: get_pull_request(apiuser, pullrequestid, repoid=<Optional:None>) | |
|
134 | .. py:function:: get_pull_request(apiuser, pullrequestid, repoid=<Optional:None>, merge_state=<Optional:False>) | |
|
130 | 135 | |
|
131 | 136 | Get a pull request based on the given ID. |
|
132 | 137 | |
@@ -137,6 +142,9 b' get_pull_request' | |||
|
137 | 142 | :type repoid: str or int |
|
138 | 143 | :param pullrequestid: ID of the requested pull request. |
|
139 | 144 | :type pullrequestid: int |
|
145 | :param merge_state: Optionally calculate the merge state for each repository. | |
|
146 | This could result in a longer time to fetch the data. | |
|
147 | :type merge_state: bool | |
|
140 | 148 | |
|
141 | 149 | Example output: |
|
142 | 150 | |
@@ -250,7 +258,7 b' get_pull_request_comments' | |||
|
250 | 258 | get_pull_requests |
|
251 | 259 | ----------------- |
|
252 | 260 | |
|
253 |
.. py:function:: get_pull_requests(apiuser, repoid, status=<Optional:'new'>, merge_state=<Optional: |
|
|
261 | .. py:function:: get_pull_requests(apiuser, repoid, status=<Optional:'new'>, merge_state=<Optional:False>) | |
|
254 | 262 | |
|
255 | 263 | Get all pull requests from the repository specified in `repoid`. |
|
256 | 264 |
@@ -28,7 +28,7 b' add_field_to_repo' | |||
|
28 | 28 | comment_commit |
|
29 | 29 | -------------- |
|
30 | 30 | |
|
31 | .. py:function:: comment_commit(apiuser, repoid, commit_id, message, status=<Optional:None>, comment_type=<Optional:u'note'>, resolves_comment_id=<Optional:None>, userid=<Optional:<OptionalAttr:apiuser>>) | |
|
31 | .. py:function:: comment_commit(apiuser, repoid, commit_id, message, status=<Optional:None>, comment_type=<Optional:u'note'>, resolves_comment_id=<Optional:None>, extra_recipients=<Optional:[]>, userid=<Optional:<OptionalAttr:apiuser>>) | |
|
32 | 32 | |
|
33 | 33 | Set a commit comment, and optionally change the status of the commit. |
|
34 | 34 | |
@@ -45,6 +45,11 b' comment_commit' | |||
|
45 | 45 | :type status: str |
|
46 | 46 | :param comment_type: Comment type, one of: 'note', 'todo' |
|
47 | 47 | :type comment_type: Optional(str), default: 'note' |
|
48 | :param resolves_comment_id: id of comment which this one will resolve | |
|
49 | :type resolves_comment_id: Optional(int) | |
|
50 | :param extra_recipients: list of user ids or usernames to add | |
|
51 | notifications for this comment. Acts like a CC for notifications | |
|
52 | :type extra_recipients: Optional(list) | |
|
48 | 53 | :param userid: Set the user name of the comment creator. |
|
49 | 54 | :type userid: Optional(str or int) |
|
50 | 55 | |
@@ -66,7 +71,7 b' comment_commit' | |||
|
66 | 71 | create_repo |
|
67 | 72 | ----------- |
|
68 | 73 | |
|
69 |
.. py:function:: create_repo(apiuser, repo_name, repo_type, owner=<Optional:<OptionalAttr:apiuser>>, description=<Optional:''>, private=<Optional:False>, clone_uri=<Optional:None>, push_uri=<Optional:None>, landing_rev=<Optional: |
|
|
74 | .. py:function:: create_repo(apiuser, repo_name, repo_type, owner=<Optional:<OptionalAttr:apiuser>>, description=<Optional:''>, private=<Optional:False>, clone_uri=<Optional:None>, push_uri=<Optional:None>, landing_rev=<Optional:None>, enable_statistics=<Optional:False>, enable_locking=<Optional:False>, enable_downloads=<Optional:False>, copy_permissions=<Optional:False>) | |
|
70 | 75 | |
|
71 | 76 | Creates a repository. |
|
72 | 77 | |
@@ -97,7 +102,7 b' create_repo' | |||
|
97 | 102 | :type clone_uri: str |
|
98 | 103 | :param push_uri: set push_uri |
|
99 | 104 | :type push_uri: str |
|
100 | :param landing_rev: <rev_type>:<rev> | |
|
105 | :param landing_rev: <rev_type>:<rev>, e.g branch:default, book:dev, rev:abcd | |
|
101 | 106 | :type landing_rev: str |
|
102 | 107 | :param enable_locking: |
|
103 | 108 | :type enable_locking: bool |
@@ -169,7 +174,7 b' delete_repo' | |||
|
169 | 174 | fork_repo |
|
170 | 175 | --------- |
|
171 | 176 | |
|
172 |
.. py:function:: fork_repo(apiuser, repoid, fork_name, owner=<Optional:<OptionalAttr:apiuser>>, description=<Optional:''>, private=<Optional:False>, clone_uri=<Optional:None>, landing_rev=<Optional: |
|
|
177 | .. py:function:: fork_repo(apiuser, repoid, fork_name, owner=<Optional:<OptionalAttr:apiuser>>, description=<Optional:''>, private=<Optional:False>, clone_uri=<Optional:None>, landing_rev=<Optional:None>, copy_permissions=<Optional:False>) | |
|
173 | 178 | |
|
174 | 179 | Creates a fork of the specified |repo|. |
|
175 | 180 | |
@@ -198,7 +203,7 b' fork_repo' | |||
|
198 | 203 | :type copy_permissions: bool |
|
199 | 204 | :param private: Make the fork private. The default is False. |
|
200 | 205 | :type private: bool |
|
201 |
:param landing_rev: Set the landing revision. |
|
|
206 | :param landing_rev: Set the landing revision. E.g branch:default, book:dev, rev:abcd | |
|
202 | 207 | |
|
203 | 208 | Example output: |
|
204 | 209 | |
@@ -1085,7 +1090,7 b' strip' | |||
|
1085 | 1090 | update_repo |
|
1086 | 1091 | ----------- |
|
1087 | 1092 | |
|
1088 |
.. py:function:: update_repo(apiuser, repoid, repo_name=<Optional:None>, owner=<Optional:<OptionalAttr:apiuser>>, description=<Optional:''>, private=<Optional:False>, clone_uri=<Optional:None>, push_uri=<Optional:None>, landing_rev=<Optional: |
|
|
1093 | .. py:function:: update_repo(apiuser, repoid, repo_name=<Optional:None>, owner=<Optional:<OptionalAttr:apiuser>>, description=<Optional:''>, private=<Optional:False>, clone_uri=<Optional:None>, push_uri=<Optional:None>, landing_rev=<Optional:None>, fork_of=<Optional:None>, enable_statistics=<Optional:False>, enable_locking=<Optional:False>, enable_downloads=<Optional:False>, fields=<Optional:''>) | |
|
1089 | 1094 | |
|
1090 | 1095 | Updates a repository with the given information. |
|
1091 | 1096 | |
@@ -1117,7 +1122,7 b' update_repo' | |||
|
1117 | 1122 | :type private: bool |
|
1118 | 1123 | :param clone_uri: Update the |repo| clone URI. |
|
1119 | 1124 | :type clone_uri: str |
|
1120 |
:param landing_rev: Set the |repo| landing revision. |
|
|
1125 | :param landing_rev: Set the |repo| landing revision. e.g branch:default, book:dev, rev:abcd | |
|
1121 | 1126 | :type landing_rev: str |
|
1122 | 1127 | :param enable_statistics: Enable statistics on the |repo|, (True | False). |
|
1123 | 1128 | :type enable_statistics: bool |
@@ -6,7 +6,7 b' search methods' | |||
|
6 | 6 | search |
|
7 | 7 | ------ |
|
8 | 8 | |
|
9 |
.. py:function:: search(apiuser, search_query, search_type, page_limit=<Optional:10>, page=<Optional:1>, search_sort=<Optional:' |
|
|
9 | .. py:function:: search(apiuser, search_query, search_type, page_limit=<Optional:10>, page=<Optional:1>, search_sort=<Optional:'desc:date'>, repo_name=<Optional:None>, repo_group_name=<Optional:None>) | |
|
10 | 10 | |
|
11 | 11 | Fetch Full Text Search results using API. |
|
12 | 12 | |
@@ -23,9 +23,15 b' search' | |||
|
23 | 23 | :type page_limit: Optional(int) |
|
24 | 24 | :param page: Page number. Default first page. |
|
25 | 25 | :type page: Optional(int) |
|
26 | :param search_sort: Search sort order. Default newfirst. The following are valid options: | |
|
27 | * newfirst | |
|
28 | * oldfirst | |
|
26 | :param search_sort: Search sort order. Must start with asc: or desc:. Default: desc:date. | |
|
27 | The following are valid options: | |
|
28 | * asc|desc:message.raw | |
|
29 | * asc|desc:date | |
|
30 | * asc|desc:author.email.raw | |
|
31 | * asc|desc:message.raw | |
|
32 | * newfirst (old legacy equal to desc:date) | |
|
33 | * oldfirst (old legacy equal to asc:date) | |
|
34 | ||
|
29 | 35 | :type search_sort: Optional(str) |
|
30 | 36 | :param repo_name: Filter by one repo. Default is all. |
|
31 | 37 | :type repo_name: Optional(str) |
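For illustration only (not part of the upstream docs), a JSON-RPC call using the ``search_sort``
options listed above might look like the sketch below. It assumes the standard ``/_admin/api``
endpoint; the host name and auth token are placeholders:

.. code-block:: bash

    # hypothetical full-text search sorted newest-first by date
    curl -X POST https://rhodecode.example.com/_admin/api \
        -H "Content-Type: application/json" \
        -d '{"id": 1, "auth_token": "<SECRET>", "method": "search",
             "args": {"search_query": "def main", "search_type": "content",
                      "search_sort": "desc:date"}}'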
@@ -6,7 +6,7 b' store methods' | |||
|
6 | 6 | file_store_add (EE only) |
|
7 | 7 | ------------------------ |
|
8 | 8 | |
|
9 | .. py:function:: file_store_add(apiuser, filename, content) | |
|
9 | .. py:function:: file_store_add(apiuser, filename, content, description=<Optional:''>) | |
|
10 | 10 | |
|
11 | 11 | Upload API for the file_store |
|
12 | 12 | |
@@ -19,6 +19,8 b' file_store_add (EE only)' | |||
|
19 | 19 | :type apiuser: AuthUser |
|
20 | 20 | :param filename: name of the file uploaded |
|
21 | 21 | :type filename: str |
|
22 | :param description: Optional description for added file | |
|
23 | :type description: str | |
|
22 | 24 | :param content: base64 encoded content of the uploaded file |
|
23 | 25 | :type content: str |
|
24 | 26 | |
@@ -35,3 +37,148 b' file_store_add (EE only)' | |||
|
35 | 37 | error : null |
|
36 | 38 | |
|
37 | 39 | |
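For illustration only (not part of the upstream docs), an upload that sets the new optional
``description`` parameter could follow the same CLI style (including its quoting) as the
``file_store_add_with_acl`` example below, assuming the method is invoked under the name
used in this section's heading:

.. code-block:: bash

    # hypothetical example; the instance name is a placeholder
    rhodecode-api --instance-name=enterprise-1 file_store_add "{"content": "$(cat image.jpg | base64)", "filename":"image.jpg", "description":"my upload"}"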
|
40 | file_store_add_with_acl (EE only) | |
|
41 | --------------------------------- | |
|
42 | ||
|
43 | .. py:function:: file_store_add_with_acl(apiuser, filename, content, description=<Optional:''>, scope_user_id=<Optional:None>, scope_repo_id=<Optional:None>, scope_repo_group_id=<Optional:None>) | |
|
44 | ||
|
45 | Upload API for the file_store | |
|
46 | ||
|
47 | Example usage from CLI:: | |
|
48 | rhodecode-api --instance-name=enterprise-1 upload_file "{"content": "$(cat image.jpg | base64)", "filename":"image.jpg", "scope_repo_id":101}" | |
|
49 | ||
|
50 | This command takes the following options: | |
|
51 | ||
|
52 | :param apiuser: This is filled automatically from the |authtoken|. | |
|
53 | :type apiuser: AuthUser | |
|
54 | :param filename: name of the file uploaded | |
|
55 | :type filename: str | |
|
56 | :param description: Optional description for added file | |
|
57 | :type description: str | |
|
58 | :param content: base64 encoded content of the uploaded file | |
|
59 | :type content: str | |
|
60 | ||
|
61 | :param scope_user_id: Optionally bind this file to a user. | |
|
62 | This will check the ACL so that only this user can access the file. | |
|
63 | :type scope_user_id: int | |
|
64 | :param scope_repo_id: Optionally bind this file to a repository. | |
|
65 | This will check the ACL so that only a user with proper access to that | |
|
66 | repository can access the file. | |
|
67 | :type scope_repo_id: int | |
|
68 | :param scope_repo_group_id: Optionally bind this file to a repository group. | |
|
69 | This will check the ACL so that only a user with proper access to that | |
|
70 | repository group can access the file. | |
|
71 | :type scope_repo_group_id: int | |
|
72 | ||
|
73 | Example output: | |
|
74 | ||
|
75 | .. code-block:: bash | |
|
76 | ||
|
77 | id : <id_given_in_input> | |
|
78 | result: { | |
|
79 | "access_path": "/_file_store/download/84d156f7-8323-4ad3-9fce-4a8e88e1deaf-0.jpg", | |
|
80 | "access_path_fqn": "http://server.domain.com/_file_store/download/84d156f7-8323-4ad3-9fce-4a8e88e1deaf-0.jpg", | |
|
81 | "store_fid": "84d156f7-8323-4ad3-9fce-4a8e88e1deaf-0.jpg" | |
|
82 | } | |
|
83 | error : null | |
|
84 | ||
|
85 | ||
|
86 | file_store_get_info (EE only) | |
|
87 | ----------------------------- | |
|
88 | ||
|
89 | .. py:function:: file_store_get_info(apiuser, store_fid) | |
|
90 | ||
|
91 | Get artifact data. | |
|
92 | ||
|
93 | Example output: | |
|
94 | ||
|
95 | .. code-block:: bash | |
|
96 | ||
|
97 | id : <id_given_in_input> | |
|
98 | result: { | |
|
99 | "artifact": { | |
|
100 | "access_path_fqn": "https://rhodecode.example.com/_file_store/download/0-031c2aa0-0d56-49a7-9ba3-b570bdd342ab.jpg", | |
|
101 | "created_on": "2019-10-15T16:25:35.491", | |
|
102 | "description": "my upload", | |
|
103 | "downloaded_times": 1, | |
|
104 | "file_uid": "0-031c2aa0-0d56-49a7-9ba3-b570bdd342ab.jpg", | |
|
105 | "filename": "example.jpg", | |
|
106 | "filename_org": "0-031c2aa0-0d56-49a7-9ba3-b570bdd342ab.jpg", | |
|
107 | "hidden": false, | |
|
108 | "metadata": [ | |
|
109 | { | |
|
110 | "artifact": "0-031c2aa0-0d56-49a7-9ba3-b570bdd342ab.jpg", | |
|
111 | "key": "yellow", | |
|
112 | "section": "tags", | |
|
113 | "value": "bar" | |
|
114 | } | |
|
115 | ], | |
|
116 | "sha256": "818dff0f44574dfb6814d38e6bf3c60c5943d1d13653398ecddaedf2f6a5b04d", | |
|
117 | "size": 18599, | |
|
118 | "uploaded_by": { | |
|
119 | "email": "admin@rhodecode.com", | |
|
120 | "emails": [ | |
|
121 | "admin@rhodecode.com" | |
|
122 | ], | |
|
123 | "firstname": "Admin", | |
|
124 | "lastname": "LastName", | |
|
125 | "user_id": 2, | |
|
126 | "username": "admin" | |
|
127 | } | |
|
128 | } | |
|
129 | } | |
|
130 | error : null | |
|
131 | ||
|
132 | ||
|
133 | file_store_add_metadata (EE only) | |
|
134 | --------------------------------- | |
|
135 | ||
|
136 | .. py:function:: file_store_add_metadata(apiuser, store_fid, section, key, value, value_type=<Optional:'unicode'>) | |
|
137 | ||
|
138 | Add metadata into an artifact. The metadata consists of section, key, value, e.g. | |
|
139 | section='tags', 'key'='tag_name', value='1' | |
|
140 | ||
|
141 | :param apiuser: This is filled automatically from the |authtoken|. | |
|
142 | :type apiuser: AuthUser | |
|
143 | ||
|
144 | :param store_fid: file uid, e.g 0-d054cb71-91ab-44e2-9e4b-23fe14b4d74a.mp4 | |
|
145 | :type store_fid: str | |
|
146 | ||
|
147 | :param section: Section name to add metadata | |
|
148 | :type section: str | |
|
149 | ||
|
150 | :param key: Key to add as metadata | |
|
151 | :type key: str | |
|
152 | ||
|
153 | :param value: Value to add as metadata | |
|
154 | :type value: str | |
|
155 | ||
|
156 | :param value_type: Optional type, default is 'unicode' other types are: | |
|
157 | int, list, bool, unicode, str | |
|
158 | ||
|
159 | :type value_type: str | |
|
160 | ||
|
161 | Example output: | |
|
162 | ||
|
163 | .. code-block:: bash | |
|
164 | ||
|
165 | id : <id_given_in_input> | |
|
166 | result: { | |
|
167 | "metadata": [ | |
|
168 | { | |
|
169 | "artifact": "0-d054cb71-91ab-44e2-9e4b-23fe14b4d74a.mp4", | |
|
170 | "key": "secret", | |
|
171 | "section": "tags", | |
|
172 | "value": "1" | |
|
173 | }, | |
|
174 | { | |
|
175 | "artifact": "0-d054cb71-91ab-44e2-9e4b-23fe14b4d74a.mp4", | |
|
176 | "key": "video", | |
|
177 | "section": "tags", | |
|
178 | "value": "1" | |
|
179 | } | |
|
180 | ] | |
|
181 | } | |
|
182 | error : null | |
|
183 | ||
|
184 |
@@ -72,7 +72,9 b' create_user_group' | |||
|
72 | 72 | :param active: Set this group as active. |
|
73 | 73 | :type active: Optional(``True`` | ``False``) |
|
74 | 74 | :param sync: Set enabled or disabled the automatically sync from |
|
75 | external authentication types like ldap. | |
|
75 | external authentication types like ldap. If the user group is named like | |
|
76 | one from e.g. ldap and the sync flag is enabled, members will be synced automatically. | |
|
77 | Sync type when enabled via API is set to `manual_api` | |
|
76 | 78 | :type sync: Optional(``True`` | ``False``) |
|
77 | 79 | |
|
78 | 80 | Example output: |
@@ -391,7 +393,9 b' update_user_group' | |||
|
391 | 393 | :param active: Set the group as active. |
|
392 | 394 | :type active: Optional(``True`` | ``False``) |
|
393 | 395 | :param sync: Set enabled or disabled the automatically sync from |
|
394 | external authentication types like ldap. | |
|
396 | external authentication types like ldap. If the user group is named like | |
|
397 | one from e.g. ldap and the sync flag is enabled, members will be synced automatically. | |
|
398 | Sync type when enabled via API is set to `manual_api` | |
|
395 | 399 | :type sync: Optional(``True`` | ``False``) |
|
396 | 400 | |
|
397 | 401 | Example output: |
@@ -6,7 +6,7 b' user methods' | |||
|
6 | 6 | create_user |
|
7 | 7 | ----------- |
|
8 | 8 | |
|
9 | .. py:function:: create_user(apiuser, username, email, password=<Optional:''>, firstname=<Optional:''>, lastname=<Optional:''>, active=<Optional:True>, admin=<Optional:False>, extern_name=<Optional:'rhodecode'>, extern_type=<Optional:'rhodecode'>, force_password_change=<Optional:False>, create_personal_repo_group=<Optional:None>) | |
|
9 | .. py:function:: create_user(apiuser, username, email, password=<Optional:''>, firstname=<Optional:''>, lastname=<Optional:''>, description=<Optional:''>, active=<Optional:True>, admin=<Optional:False>, extern_name=<Optional:'rhodecode'>, extern_type=<Optional:'rhodecode'>, force_password_change=<Optional:False>, create_personal_repo_group=<Optional:None>) | |
|
10 | 10 | |
|
11 | 11 | Creates a new user and returns the new user object. |
|
12 | 12 | |
@@ -27,6 +27,8 b' create_user' | |||
|
27 | 27 | :type firstname: Optional(str) |
|
28 | 28 | :param lastname: Set the new user surname. |
|
29 | 29 | :type lastname: Optional(str) |
|
30 | :param description: Set user description, or short bio. Metatags are allowed. | |
|
31 | :type description: Optional(str) | |
|
30 | 32 | :param active: Set the user as active. |
|
31 | 33 | :type active: Optional(``True`` | ``False``) |
|
32 | 34 | :param admin: Give the new user admin rights. |
@@ -155,6 +157,7 b' get_user' | |||
|
155 | 157 | "extern_name": "rhodecode", |
|
156 | 158 | "extern_type": "rhodecode", |
|
157 | 159 | "firstname": "username", |
|
160 | "description": "user description", | |
|
158 | 161 | "ip_addresses": [], |
|
159 | 162 | "language": null, |
|
160 | 163 | "last_login": "Timestamp", |
@@ -268,7 +271,7 b' get_users' | |||
|
268 | 271 | update_user |
|
269 | 272 | ----------- |
|
270 | 273 | |
|
271 | .. py:function:: update_user(apiuser, userid, username=<Optional:None>, email=<Optional:None>, password=<Optional:None>, firstname=<Optional:None>, lastname=<Optional:None>, active=<Optional:None>, admin=<Optional:None>, extern_type=<Optional:None>, extern_name=<Optional:None>) | |
|
274 | .. py:function:: update_user(apiuser, userid, username=<Optional:None>, email=<Optional:None>, password=<Optional:None>, firstname=<Optional:None>, lastname=<Optional:None>, description=<Optional:None>, active=<Optional:None>, admin=<Optional:None>, extern_type=<Optional:None>, extern_name=<Optional:None>) | |
|
272 | 275 | |
|
273 | 276 | Updates the details for the specified user, if that user exists. |
|
274 | 277 | |
@@ -291,6 +294,8 b' update_user' | |||
|
291 | 294 | :type firstname: Optional(str) |
|
292 | 295 | :param lastname: Set the new surname. |
|
293 | 296 | :type lastname: Optional(str) |
|
297 | :param description: Set user description, or short bio. Metatags are allowed. | |
|
298 | :type description: Optional(str) | |
|
294 | 299 | :param active: Set the new user as active. |
|
295 | 300 | :type active: Optional(``True`` | ``False``) |
|
296 | 301 | :param admin: Give the user admin rights. |
@@ -19,7 +19,7 b' Setup Nix Package Manager' | |||
|
19 | 19 | |
|
20 | 20 | To install the Nix Package Manager, please run:: |
|
21 | 21 | |
|
22 | $ curl https://nixos.org/nix/install | sh | |
|
22 | $ curl https://nixos.org/releases/nix/nix-2.0.4/install | sh | |
|
23 | 23 | |
|
24 | 24 | or go to https://nixos.org/nix/ and follow the installation instructions. |
|
25 | 25 | Once this is correctly set up on your system, you should be able to use the |
@@ -79,6 +79,12 b' and commit files and |repos| while manag' | |||
|
79 | 79 | contributing/contributing |
|
80 | 80 | |
|
81 | 81 | .. toctree:: |
|
82 | :maxdepth: 2 | |
|
83 | :caption: RhodeCode Control Documentation | |
|
84 | ||
|
85 | RhodeCode Installer <https://docs.rhodecode.com/RhodeCode-Control/> | |
|
86 | ||
|
87 | .. toctree:: | |
|
82 | 88 | :maxdepth: 1 |
|
83 | 89 | :caption: About |
|
84 | 90 |
@@ -11,16 +11,20 b' and import repositories in async way. It' | |||
|
11 | 11 | repository sync in scheduler. |
|
12 | 12 | |
|
13 | 13 | If you decide to use Celery you also need a working message queue. |
|
14 |
The |
|
|
14 | There are two fully supported message brokers: rabbitmq_ and redis_ (recommended). | |
|
15 | ||
|
16 | Since release 4.18.X we recommend using redis_ as the backend, since it's generally | |
|
17 | easier to work with and results in a simpler stack, as redis is already recommended | |
|
18 | for caching purposes. | |
|
15 | 19 | |
|
16 | 20 | |
|
17 | 21 | In order to install and configure Celery, follow these steps: |
|
18 | 22 | |
|
19 | 1. Install RabbitMQ, see the documentation on the Celery website for | |
|
20 | `rabbitmq installation`_, or `rabbitmq website installation`_ | |
|
23 | 1. Install RabbitMQ or Redis as the message queue; see the documentation on the Celery website for | |
|
24 | `redis installation`_ or `rabbitmq installation`_ | |
|
21 | 25 | |
|
22 | 26 | |
|
23 |
1a. |
|
|
27 | 1a. If you choose RabbitMQ, an example configuration after installation would look like this:: | |
|
24 | 28 | |
|
25 | 29 | sudo rabbitmqctl add_user rcuser secret_password |
|
26 | 30 | sudo rabbitmqctl add_vhost rhodevhost |
@@ -45,6 +49,10 b' 3. Configure Celery in the' | |||
|
45 | 49 | Set the broker_url as minimal settings required to enable operation. |
|
46 | 50 | If used our example data from pt 1a, here is how the broker url should look like:: |
|
47 | 51 | |
|
52 | # for Redis | |
|
53 | celery.broker_url = redis://localhost:6379/8 | |
|
54 | ||
|
55 | # for RabbitMQ | |
|
48 | 56 | celery.broker_url = amqp://rcuser:secret_password@localhost:5672/rhodevhost |
|
49 | 57 |
|
|
50 | 58 | Full configuration example is below: |
@@ -57,7 +65,7 b' 3. Configure Celery in the' | |||
|
57 | 65 | #################################### |
|
58 | 66 | |
|
59 | 67 | use_celery = true |
|
60 |
celery.broker_url = |
|
|
68 | celery.broker_url = redis://localhost:6379/8 | |
|
61 | 69 | |
|
62 | 70 | # maximum tasks to execute before worker restart |
|
63 | 71 | celery.max_tasks_per_child = 100 |
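As a quick check (not part of the upstream text), you can verify that the Redis database used
by the example ``celery.broker_url`` above is reachable before enabling Celery. A minimal
sketch, assuming the example URL ``redis://localhost:6379/8``:

.. code-block:: bash

    # database 8 comes from the example broker url above; expected reply is PONG
    redis-cli -h localhost -p 6379 -n 8 ping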
@@ -69,6 +77,8 b' 3. Configure Celery in the' | |||
|
69 | 77 | .. _python: http://www.python.org/ |
|
70 | 78 | .. _mercurial: http://mercurial.selenic.com/ |
|
71 | 79 | .. _celery: http://celeryproject.org/ |
|
80 | .. _redis: http://redis.io | |
|
81 | .. _redis installation: https://redis.io/topics/quickstart | |
|
72 | 82 | .. _rabbitmq: http://www.rabbitmq.com/ |
|
73 | 83 | .. _rabbitmq installation: http://docs.celeryproject.org/en/latest/getting-started/brokers/rabbitmq.html |
|
74 | 84 | .. _rabbitmq website installation: http://www.rabbitmq.com/download.html |
@@ -35,4 +35,3 b' 2. When you open the file, find the data' | |||
|
35 | 35 | # see sqlalchemy docs for other advanced settings |
|
36 | 36 | sqlalchemy.db1.echo = false |
|
37 | 37 | sqlalchemy.db1.pool_recycle = 3600 |
|
38 | sqlalchemy.db1.convert_unicode = true |
@@ -38,44 +38,39 b' default one. See the instructions in :re' | |||
|
38 | 38 | |
|
39 | 39 | .. _issue-tr-eg-ref: |
|
40 | 40 | |
|
41 | ||
|
41 | 42 | Jira Integration |
|
42 | 43 | ---------------- |
|
43 | 44 | |
|
44 | * Regex = ``(?:^#|\s#)(\w+-\d+)`` | |
|
45 | * URL = ``https://myissueserver.com/browse/${id}`` | |
|
46 | * Issue Prefix = ``#`` | |
|
45 | Please check the examples in the view for configuring the issue trackers. | |
|
46 | ||
|
47 | 47 | |
|
48 | 48 | Confluence (Wiki) |
|
49 | 49 | ----------------- |
|
50 | 50 | |
|
51 | * Regex = ``(?:conf-)([A-Z0-9]+)`` | |
|
52 | * URL = ``https://example.atlassian.net/display/wiki/${id}/${repo_name}`` | |
|
53 | * issue prefix = ``CONF-`` | |
|
51 | Please check the examples in the view for configuring the issue trackers. | |
|
52 | ||
|
54 | 53 | |
|
55 | 54 | Redmine Integration |
|
56 | 55 | ------------------- |
|
57 | 56 | |
|
58 | * Regex = ``(issue-+\d+)`` | |
|
59 | * URL = ``https://myissueserver.com/redmine/issue/${id}`` | |
|
60 | * Issue Prefix = ``issue-`` | |
|
57 | Please check the examples in the view for configuring the issue trackers. | |
|
58 | ||
|
61 | 59 | |
|
62 |
Redmine |
|
|
63 | -------------- | |
|
60 | Redmine wiki Integration | |
|
61 | ------------------------ | |
|
64 | 62 | |
|
65 | * Regex = ``(?:wiki-)([a-zA-Z0-9]+)`` | |
|
66 | * URL = ``https://example.com/redmine/projects/wiki/${repo_name}`` | |
|
67 | * Issue prefix = ``Issue-`` | |
|
63 | Please check the examples in the view for configuring the issue trackers. | |
|
64 | ||
|
68 | 65 | |
|
69 | 66 | Pivotal Tracker |
|
70 | 67 | --------------- |
|
71 | 68 | |
|
72 | * Regex = ``(?:pivot-)(?<project_id>\d+)-(?<story>\d+)`` | |
|
73 | * URL = ``https://www.pivotaltracker.com/s/projects/${project_id}/stories/${story}`` | |
|
74 | * Issue prefix = ``Piv-`` | |
|
69 | Please check the examples in the view for configuring the issue trackers. | |
|
70 | ||
|
75 | 71 | |
|
76 | 72 | Trello |
|
77 | 73 | ------ |
|
78 | 74 | |
|
79 | * Regex = ``(?:trello-)(?<card_id>[a-zA-Z0-9]+)`` | |
|
80 | * URL = ``https://trello.com/example.com/${card_id}`` | |
|
81 | * Issue prefix = ``Trello-`` | |
|
75 | Please check the examples in the view for configuring the issue trackers. | |
|
76 |
@@ -62,7 +62,7 b' Performance' | |||
|
62 | 62 | Fixes |
|
63 | 63 | ^^^^^ |
|
64 | 64 | |
|
65 |
- |
|
|
65 | - Hooks: fixed more unicode problems with new pull-request link generator. | |
|
66 | 66 | - Mercurial: fix ssh-server support for mercurial custom options. |
|
67 | 67 | - Pull requests: updated metadata information for failed merges with multiple heads. |
|
68 | 68 | - Pull requests: calculate ancestor in the same way as creation mode. |
@@ -9,6 +9,7 b' Release Notes' | |||
|
9 | 9 | .. toctree:: |
|
10 | 10 | :maxdepth: 1 |
|
11 | 11 | |
|
12 | release-notes-4.18.0.rst | |
|
12 | 13 | release-notes-4.17.4.rst |
|
13 | 14 | release-notes-4.17.3.rst |
|
14 | 15 | release-notes-4.17.2.rst |
@@ -57,9 +57,12 b' To install |RCT|, use the following step' | |||
|
57 | 57 | |
|
58 | 58 | 1. Set up a ``virtualenv`` on your local machine, see virtualenv_ instructions |
|
59 | 59 | here. |
|
60 | 2. Install |RCT| using pip. Full url with token is available at https://rhodecode.com/u/#rhodecode-tools | |
|
61 |
` |
|
|
60 | 2. Install |RCT| using pip. All downloadable versions of |RCT| are available at: | |
|
61 | `https://code.rhodecode.com/rhodecode-tools-ce/artifacts` | |
|
62 | 62 | |
|
63 | Example installation:: | |
|
64 | ||
|
65 | pip install -I https://code.rhodecode.com/rhodecode-tools-ce/artifacts/download/0-10ac93f4-bb7d-4b97-baea-68110743dd5a.tar.gz | |
|
63 | 66 | |
|
64 | 67 | Once |RCT| is installed using these steps there are a few extra |
|
65 | 68 | configuration changes you can make. These are explained in more detail in the |
@@ -56,4 +56,4 b' Repository Shortcuts' | |||
|
56 | 56 | Go to the repository settings page. |
|
57 | 57 | |
|
58 | 58 | \--:kbd:`gO` |
|
59 | Go to the repository permissions settings. | |
|
59 | Go to the repository access permissions settings. |
@@ -99,6 +99,12 b'' | |||
|
99 | 99 | "nonull": true |
|
100 | 100 | } |
|
101 | 101 | }, |
|
102 | "uglify": { | |
|
103 | "dist": { | |
|
104 | "src": "<%= dirs.js.dest %>/scripts.js", | |
|
105 | "dest": "<%= dirs.js.dest %>/scripts.min.js" | |
|
106 | } | |
|
107 | }, | |
|
102 | 108 | "less": { |
|
103 | 109 | "development": { |
|
104 | 110 | "options": { |
@@ -22,6 +22,7 b'' | |||
|
22 | 22 | "grunt-contrib-less": "^1.1.0", |
|
23 | 23 | "grunt-contrib-watch": "^0.6.1", |
|
24 | 24 | "grunt-webpack": "^3.1.3", |
|
25 | "grunt-contrib-uglify": "^4.0.1", | |
|
25 | 26 | "jquery": "1.11.3", |
|
26 | 27 | "mark.js": "8.11.1", |
|
27 | 28 | "jshint": "^2.9.1-rc3", |
This diff has been collapsed as it changes many lines (1005 lines changed). |
@@ -211,13 +211,13 b' let' | |||
|
211 | 211 | sha512 = "yiUk09opTEnE1lK+tb501ENb+yQBi4p++Ep0eGJAHesVYKVMPNgPphVKkIizkDaU+n0SE+zXfTsRbYyOMDYXSg=="; |
|
212 | 212 | }; |
|
213 | 213 | }; |
|
214 |
"@polymer/polymer-3. |
|
|
214 | "@polymer/polymer-3.3.0" = { | |
|
215 | 215 | name = "_at_polymer_slash_polymer"; |
|
216 | 216 | packageName = "@polymer/polymer"; |
|
217 |
version = "3. |
|
|
218 | src = fetchurl { | |
|
219 |
url = "https://registry.npmjs.org/@polymer/polymer/-/polymer-3. |
|
|
220 | sha512 = "L6uV1oM6T6xbwbVx6t3biG5T2VSSB03LxnIrUd9M2pr6RkHVPFHJ37pC5MUwBAEhkGFJif7eks7fdMMSGZTeEQ=="; | |
|
217 | version = "3.3.0"; | |
|
218 | src = fetchurl { | |
|
219 | url = "https://registry.npmjs.org/@polymer/polymer/-/polymer-3.3.0.tgz"; | |
|
220 | sha512 = "rij7suomS7DxdBamnwr/Xa0V5hpypf7I9oYKseF2FWz5Xh2a3wJNpVjgJy1adXVCxqIyPhghsrthnfCt7EblsQ=="; | |
|
221 | 221 | }; |
|
222 | 222 | }; |
|
223 | 223 | "@types/clone-0.1.30" = { |
@@ -229,13 +229,13 b' let' | |||
|
229 | 229 | sha1 = "e7365648c1b42136a59c7d5040637b3b5c83b614"; |
|
230 | 230 | }; |
|
231 | 231 | }; |
|
232 |
"@types/node-6.14. |
|
|
232 | "@types/node-6.14.9" = { | |
|
233 | 233 | name = "_at_types_slash_node"; |
|
234 | 234 | packageName = "@types/node"; |
|
235 |
version = "6.14. |
|
|
236 | src = fetchurl { | |
|
237 |
url = "https://registry.npmjs.org/@types/node/-/node-6.14. |
|
|
238 | sha512 = "rFs9zCFtSHuseiNXxYxFlun8ibu+jtZPgRM+2ILCmeLiGeGLiIGxuOzD+cNyHegI1GD+da3R/cIbs9+xCLp13w=="; | |
|
235 | version = "6.14.9"; | |
|
236 | src = fetchurl { | |
|
237 | url = "https://registry.npmjs.org/@types/node/-/node-6.14.9.tgz"; | |
|
238 | sha512 = "leP/gxHunuazPdZaCvsCefPQxinqUDsCxCR5xaDUrY2MkYxQRFZZwU5e7GojyYsGB7QVtCi7iVEl/hoFXQYc+w=="; | |
|
239 | 239 | }; |
|
240 | 240 | }; |
|
241 | 241 | "@types/parse5-2.2.34" = { |
@@ -409,22 +409,22 b' let' | |||
|
409 | 409 | sha512 = "mJ3QKWtCchL1vhU/kZlJnLPuQZnlDOdZsyP0bbLWPGdYsQDnSBvyTLhzwBA3QAMlzEL9V4JHygEmK6/OTEyytA=="; |
|
410 | 410 | }; |
|
411 | 411 | }; |
|
412 |
"@webcomponents/shadycss-1.9. |
|
|
412 | "@webcomponents/shadycss-1.9.2" = { | |
|
413 | 413 | name = "_at_webcomponents_slash_shadycss"; |
|
414 | 414 | packageName = "@webcomponents/shadycss"; |
|
415 |
version = "1.9. |
|
|
416 | src = fetchurl { | |
|
417 |
url = "https://registry.npmjs.org/@webcomponents/shadycss/-/shadycss-1.9. |
|
|
418 | sha512 = "IaZOnWOKXHghqk/WfPNDRIgDBi3RsVPY2IFAw6tYiL9UBGvQRy5R6uC+Fk7qTZsReTJ0xh5MTT8yAcb3MUR4mQ=="; | |
|
419 | }; | |
|
420 | }; | |
|
421 |
"@webcomponents/webcomponentsjs-2. |
|
|
415 | version = "1.9.2"; | |
|
416 | src = fetchurl { | |
|
417 | url = "https://registry.npmjs.org/@webcomponents/shadycss/-/shadycss-1.9.2.tgz"; | |
|
418 | sha512 = "GsD7RpDVrVdgC6e+D8zQia8RGNmEGQ9/qotnVPQYPrIXhGS5xSt6ZED9YmuHz3HbLqY+E54tE1EK3tjLzSCGrw=="; | |
|
419 | }; | |
|
420 | }; | |
|
421 | "@webcomponents/webcomponentsjs-2.3.0" = { | |
|
422 | 422 | name = "_at_webcomponents_slash_webcomponentsjs"; |
|
423 | 423 | packageName = "@webcomponents/webcomponentsjs"; |
|
424 |
version = "2. |
|
|
425 | src = fetchurl { | |
|
426 |
url = "https://registry.npmjs.org/@webcomponents/webcomponentsjs/-/webcomponentsjs-2. |
|
|
427 | sha512 = "5dzhUhP+h0qMiK0IWb7VNb0OGBoXO3AuI6Qi8t9PoKT50s5L1jv0xnwnLq+cFgPuTB8FLTNP8xIDmyoOsKBy9Q=="; | |
|
424 | version = "2.3.0"; | |
|
425 | src = fetchurl { | |
|
426 | url = "https://registry.npmjs.org/@webcomponents/webcomponentsjs/-/webcomponentsjs-2.3.0.tgz"; | |
|
427 | sha512 = "sR6FOrNnnncRuoJDqq9QxtRsJMbIvASw4vnJwIYKVlKO3AMc+NAr/bIQNnUiTTE9pBDTJkFpVaUdjJaRdsjmyA=="; | |
|
428 | 428 | }; |
|
429 | 429 | }; |
|
430 | 430 | "@xtuc/ieee754-1.2.0" = { |
@@ -499,22 +499,22 b' let' | |||
|
499 | 499 | sha1 = "82ffb02b29e662ae53bdc20af15947706739c536"; |
|
500 | 500 | }; |
|
501 | 501 | }; |
|
502 |
"ajv-6.10. |
|
|
502 | "ajv-6.10.2" = { | |
|
503 | 503 | name = "ajv"; |
|
504 | 504 | packageName = "ajv"; |
|
505 |
version = "6.10. |
|
|
506 | src = fetchurl { | |
|
507 |
url = "https://registry.npmjs.org/ajv/-/ajv-6.10. |
|
|
508 | sha512 = "nffhOpkymDECQyR0mnsUtoCE8RlX38G0rYP+wgLWFyZuUyuuojSSvi/+euOiQBIn63whYwYVIIH1TvE3tu4OEg=="; | |
|
509 | }; | |
|
510 | }; | |
|
511 |
"ajv-keywords-3.4. |
|
|
505 | version = "6.10.2"; | |
|
506 | src = fetchurl { | |
|
507 | url = "https://registry.npmjs.org/ajv/-/ajv-6.10.2.tgz"; | |
|
508 | sha512 = "TXtUUEYHuaTEbLZWIKUr5pmBuhDLy+8KYtPYdcV8qC+pOZL+NKqYwvWSRrVXHn+ZmRRAu8vJTAznH7Oag6RVRw=="; | |
|
509 | }; | |
|
510 | }; | |
|
511 | "ajv-keywords-3.4.1" = { | |
|
512 | 512 | name = "ajv-keywords"; |
|
513 | 513 | packageName = "ajv-keywords"; |
|
514 |
version = "3.4. |
|
|
515 | src = fetchurl { | |
|
516 |
url = "https://registry.npmjs.org/ajv-keywords/-/ajv-keywords-3.4. |
|
|
517 | sha512 = "aUjdRFISbuFOl0EIZc+9e4FfZp0bDZgAdOOf30bJmw8VM9v84SHyVyxDfbWxpGYbdZD/9XoKxfHVNmxPkhwyGw=="; | |
|
514 | version = "3.4.1"; | |
|
515 | src = fetchurl { | |
|
516 | url = "https://registry.npmjs.org/ajv-keywords/-/ajv-keywords-3.4.1.tgz"; | |
|
517 | sha512 = "RO1ibKvd27e6FEShVFfPALuHI3WjSVNeK5FIsmme/LYRNxjKuNj+Dt7bucLa6NdSv3JcVTyMlm9kGR84z1XpaQ=="; | |
|
518 | 518 | }; |
|
519 | 519 | }; |
|
520 | 520 | "align-text-0.1.4" = { |
@@ -806,13 +806,13 b' let' | |||
|
806 | 806 | sha1 = "b6bbe0b0674b9d719708ca38de8c237cb526c3d1"; |
|
807 | 807 | }; |
|
808 | 808 | }; |
|
809 |
"async-2.6. |
|
|
809 | "async-2.6.3" = { | |
|
810 | 810 | name = "async"; |
|
811 | 811 | packageName = "async"; |
|
812 |
version = "2.6. |
|
|
813 | src = fetchurl { | |
|
814 |
url = "https://registry.npmjs.org/async/-/async-2.6. |
|
|
815 | sha512 = "H1qVYh1MYhEEFLsP97cVKqCGo7KfCyTt6uEWqsTBr9SO84oK9Uwbyd/yCW+6rKJLHksBNUVWZDAjfS+Ccx0Bbg=="; | |
|
812 | version = "2.6.3"; | |
|
813 | src = fetchurl { | |
|
814 | url = "https://registry.npmjs.org/async/-/async-2.6.3.tgz"; | |
|
815 | sha512 = "zflvls11DCy+dQWzTW2dzuilv8Z5X/pjfmZOWba6TNIVDm+2UDaJmXSOXlasHKfNBs8oo3M0aT50fDEWfKZjXg=="; | |
|
816 | 816 | }; |
|
817 | 817 | }; |
|
818 | 818 | "async-each-1.0.3" = { |
@@ -1400,13 +1400,13 b' let' | |||
|
1400 | 1400 | sha512 = "5T6P4xPgpp0YDFvSWwEZ4NoE3aM4QBQXDzmVbraCkFj8zHM+mba8SyqB5DbZWyR7mYHo6Y7BdQo3MoA4m0TeQg=="; |
|
1401 | 1401 | }; |
|
1402 | 1402 | }; |
|
1403 |
"base64-js-1.3. |
|
|
1403 | "base64-js-1.3.1" = { | |
|
1404 | 1404 | name = "base64-js"; |
|
1405 | 1405 | packageName = "base64-js"; |
|
1406 |
version = "1.3. |
|
|
1407 | src = fetchurl { | |
|
1408 |
url = "https://registry.npmjs.org/base64-js/-/base64-js-1.3. |
|
|
1409 | sha512 = "ccav/yGvoa80BQDljCxsmmQ3Xvx60/UpBIij5QN21W3wBi/hhIC9OoO+KLpu9IJTS9j4DRVJ3aDDF9cMSoa2lw=="; | |
|
1406 | version = "1.3.1"; | |
|
1407 | src = fetchurl { | |
|
1408 | url = "https://registry.npmjs.org/base64-js/-/base64-js-1.3.1.tgz"; | |
|
1409 | sha512 = "mLQ4i2QO1ytvGWFWmcngKO//JXAQueZvwEKtjgQFM4jIK0kU+ytMfplL8j+n5mspOfjHwoAg+9yhb7BwAHm36g=="; | |
|
1410 | 1410 | }; |
|
1411 | 1411 | }; |
|
1412 | 1412 | "bcrypt-pbkdf-1.0.2" = { |
@@ -1445,13 +1445,13 b' let' | |||
|
1445 | 1445 | sha512 = "Un7MIEDdUC5gNpcGDV97op1Ywk748MpHcFTHoYs6qnj1Z3j7I53VG3nwZhKzoBZmbdRNnb6WRdFlwl7tSDuZGw=="; |
|
1446 | 1446 | }; |
|
1447 | 1447 | }; |
|
1448 |
"bluebird-3. |
|
|
1448 | "bluebird-3.7.1" = { | |
|
1449 | 1449 | name = "bluebird"; |
|
1450 | 1450 | packageName = "bluebird"; |
|
1451 |
version = "3. |
|
|
1452 | src = fetchurl { | |
|
1453 |
url = "https://registry.npmjs.org/bluebird/-/bluebird-3. |
|
|
1454 | sha512 = "FG+nFEZChJrbQ9tIccIfZJBz3J7mLrAhxakAbnrJWn8d7aKOC+LWifa0G+p4ZqKp4y13T7juYvdhq9NzKdsrjw=="; | |
|
1451 | version = "3.7.1"; | |
|
1452 | src = fetchurl { | |
|
1453 | url = "https://registry.npmjs.org/bluebird/-/bluebird-3.7.1.tgz"; | |
|
1454 | sha512 = "DdmyoGCleJnkbp3nkbxTLJ18rjDsE4yCggEwKNXkeV123sPNfOCYeDoeuOY+F2FrSjO1YXcTU+dsy96KMy+gcg=="; | |
|
1455 | 1455 | }; |
|
1456 | 1456 | }; |
|
1457 | 1457 | "bn.js-4.11.8" = { |
@@ -1670,22 +1670,22 b' let' | |||
|
1670 | 1670 | sha1 = "b534e7c734c4f81ec5fbe8aca2ad24354b962c6c"; |
|
1671 | 1671 | }; |
|
1672 | 1672 | }; |
|
1673 |
"caniuse-db-1.0.30000 |
|
|
1673 | "caniuse-db-1.0.30001006" = { | |
|
1674 | 1674 | name = "caniuse-db"; |
|
1675 | 1675 | packageName = "caniuse-db"; |
|
1676 |
version = "1.0.30000 |
|
|
1677 | src = fetchurl { | |
|
1678 |
url = "https://registry.npmjs.org/caniuse-db/-/caniuse-db-1.0.30000 |
|
|
1679 | sha512 = "70gk6cLSD5rItxnZ7WUxyCpM9LAjEb1tVzlENQfXQXZS/IiGnfAC6u32G5cZFlDBKjNPBIta/QSx5CZLZepxRA=="; | |
|
1680 | }; | |
|
1681 | }; | |
|
1682 |
"caniuse-lite-1.0.30000 |
|
|
1676 | version = "1.0.30001006"; | |
|
1677 | src = fetchurl { | |
|
1678 | url = "https://registry.npmjs.org/caniuse-db/-/caniuse-db-1.0.30001006.tgz"; | |
|
1679 | sha512 = "Xn25grc0GXATFnnEX+KP3IwEv6ZdHs4CALyLKvK8pBeeBe+hSpqy3/GyKBgEp4hn6o+bI+GNeNeQBf9PBOK0EQ=="; | |
|
1680 | }; | |
|
1681 | }; | |
|
1682 | "caniuse-lite-1.0.30001006" = { | |
|
1683 | 1683 | name = "caniuse-lite"; |
|
1684 | 1684 | packageName = "caniuse-lite"; |
|
1685 |
version = "1.0.30000 |
|
|
1686 | src = fetchurl { | |
|
1687 |
url = "https://registry.npmjs.org/caniuse-lite/-/caniuse-lite-1.0.30000 |
|
|
1688 | sha512 = "rUBIbap+VJfxTzrM4akJ00lkvVb5/n5v3EGXfWzSH5zT8aJmGzjA8HWhJ4U6kCpzxozUSnB+yvAYDRPY6mRpgQ=="; | |
|
1685 | version = "1.0.30001006"; | |
|
1686 | src = fetchurl { | |
|
1687 | url = "https://registry.npmjs.org/caniuse-lite/-/caniuse-lite-1.0.30001006.tgz"; | |
|
1688 | sha512 = "MXnUVX27aGs/QINz+QG1sWSLDr3P1A3Hq5EUWoIt0T7K24DuvMxZEnh3Y5aHlJW6Bz2aApJdSewdYLd8zQnUuw=="; | |
|
1689 | 1689 | }; |
|
1690 | 1690 | }; |
|
1691 | 1691 | "caseless-0.12.0" = { |
@@ -1733,31 +1733,31 b' let' | |||
|
1733 | 1733 | sha512 = "Mti+f9lpJNcwF4tWV8/OrTTtF1gZi+f8FqlyAdouralcFWFQWF2+NgCHShjkCb+IFBLq9buZwE1xckQU4peSuQ=="; |
|
1734 | 1734 | }; |
|
1735 | 1735 | }; |
|
1736 |
"chokidar-2.1. |
|
|
1736 | "chokidar-2.1.8" = { | |
|
1737 | 1737 | name = "chokidar"; |
|
1738 | 1738 | packageName = "chokidar"; |
|
1739 |
version = "2.1. |
|
|
1740 | src = fetchurl { | |
|
1741 |
url = "https://registry.npmjs.org/chokidar/-/chokidar-2.1. |
|
|
1742 | sha512 = "i0TprVWp+Kj4WRPtInjexJ8Q+BqTE909VpH8xVhXrJkoc5QC8VO9TryGOqTr+2hljzc1sC62t22h5tZePodM/A=="; | |
|
1743 | }; | |
|
1744 | }; | |
|
1745 |
"chownr-1.1. |
|
|
1739 | version = "2.1.8"; | |
|
1740 | src = fetchurl { | |
|
1741 | url = "https://registry.npmjs.org/chokidar/-/chokidar-2.1.8.tgz"; | |
|
1742 | sha512 = "ZmZUazfOzf0Nve7duiCKD23PFSCs4JPoYyccjUFF3aQkQadqBhfzhjkwBH2mNOG9cTBwhamM37EIsIkZw3nRgg=="; | |
|
1743 | }; | |
|
1744 | }; | |
|
1745 | "chownr-1.1.3" = { | |
|
1746 | 1746 | name = "chownr"; |
|
1747 | 1747 | packageName = "chownr"; |
|
1748 |
version = "1.1. |
|
|
1749 | src = fetchurl { | |
|
1750 |
url = "https://registry.npmjs.org/chownr/-/chownr-1.1. |
|
|
1751 | sha512 = "j38EvO5+LHX84jlo6h4UzmOwi0UgW61WRyPtJz4qaadK5eY3BTS5TY/S1Stc3Uk2lIM6TPevAlULiEJwie860g=="; | |
|
1752 | }; | |
|
1753 | }; | |
|
1754 |
"chrome-trace-event-1.0. |
|
|
1748 | version = "1.1.3"; | |
|
1749 | src = fetchurl { | |
|
1750 | url = "https://registry.npmjs.org/chownr/-/chownr-1.1.3.tgz"; | |
|
1751 | sha512 = "i70fVHhmV3DtTl6nqvZOnIjbY0Pe4kAUjwHj8z0zAdgBtYrJyYwLKCCuRBQ5ppkyL0AkN7HKRnETdmdp1zqNXw=="; | |
|
1752 | }; | |
|
1753 | }; | |
|
1754 | "chrome-trace-event-1.0.2" = { | |
|
1755 | 1755 | name = "chrome-trace-event"; |
|
1756 | 1756 | packageName = "chrome-trace-event"; |
|
1757 |
version = "1.0. |
|
|
1758 | src = fetchurl { | |
|
1759 |
url = "https://registry.npmjs.org/chrome-trace-event/-/chrome-trace-event-1.0. |
|
|
1760 | sha512 = "xDbVgyfDTT2piup/h8dK/y4QZfJRSa73bw1WZ8b4XM1o7fsFubUVGYcE+1ANtOzJJELGpYoG2961z0Z6OAld9A=="; | |
|
1757 | version = "1.0.2"; | |
|
1758 | src = fetchurl { | |
|
1759 | url = "https://registry.npmjs.org/chrome-trace-event/-/chrome-trace-event-1.0.2.tgz"; | |
|
1760 | sha512 = "9e/zx1jw7B4CO+c/RXoCsfg/x1AfUBioy4owYH0bJprEYAx5hRFLRhWBqHAG57D0ZM4H7vxbP7bPe0VwhQRYDQ=="; | |
|
1761 | 1761 | }; |
|
1762 | 1762 | }; |
|
1763 | 1763 | "cipher-base-1.0.4" = { |
@@ -1958,13 +1958,13 b' let' | |||
|
1958 | 1958 | sha1 = "168a4701756b6a7f51a12ce0c97bfa28c084ed63"; |
|
1959 | 1959 | }; |
|
1960 | 1960 | }; |
|
1961 |
"colors-1. |
|
|
1961 | "colors-1.4.0" = { | |
|
1962 | 1962 | name = "colors"; |
|
1963 | 1963 | packageName = "colors"; |
|
1964 |
version = "1. |
|
|
1965 | src = fetchurl { | |
|
1966 |
url = "https://registry.npmjs.org/colors/-/colors-1. |
|
|
1967 | sha512 = "mmGt/1pZqYRjMxB1axhTo16/snVZ5krrKkcmMeVKxzECMMXoCgnvTPp10QgHfcbQZw8Dq2jMNG6je4JlWU0gWg=="; | |
|
1964 | version = "1.4.0"; | |
|
1965 | src = fetchurl { | |
|
1966 | url = "https://registry.npmjs.org/colors/-/colors-1.4.0.tgz"; | |
|
1967 | sha512 = "a+UqTh4kgZg/SlGvfbzDHpgRu7AAQOmmqRHJnxhRZICKFUT91brVhNNt58CMWU9PsBbv3PDCZUHbVxuDiH2mtA=="; | |
|
1968 | 1968 | }; |
|
1969 | 1969 | }; |
|
1970 | 1970 | "combined-stream-1.0.8" = { |
@@ -2003,6 +2003,15 b' let' | |||
|
2003 | 2003 | sha512 = "6tvAOO+D6OENvRAh524Dh9jcfKTYDQAqvqezbCW82xj5X0pSrcpxtvRKHLG0yBY6SD7PSDrJaj+0AiOcKVd1Xg=="; |
|
2004 | 2004 | }; |
|
2005 | 2005 | }; |
|
2006 | "commander-2.20.3" = { | |
|
2007 | name = "commander"; | |
|
2008 | packageName = "commander"; | |
|
2009 | version = "2.20.3"; | |
|
2010 | src = fetchurl { | |
|
2011 | url = "https://registry.npmjs.org/commander/-/commander-2.20.3.tgz"; | |
|
2012 | sha512 = "GpVkmM8vF2vQUkj2LvZmD35JxeJOLCwJ9cUkugyk2nuhbv3+mJvpLYYt+0+USMxE+oj+ey/lJEnhZw75x/OMcQ=="; | |
|
2013 | }; | |
|
2014 | }; | |
|
2006 | 2015 | "commondir-1.0.1" = { |
|
2007 | 2016 | name = "commondir"; |
|
2008 | 2017 | packageName = "commondir"; |
@@ -2093,13 +2102,13 b' let' | |||
|
2093 | 2102 | sha512 = "Y+SQCF+0NoWQryez2zXn5J5knmr9z/9qSQt7fbL78u83rxmigOy8X5+BFn8CFSuX+nKT8gpYwJX68ekqtQt6ZA=="; |
|
2094 | 2103 | }; |
|
2095 | 2104 | }; |
|
2096 |
"core-js-2.6. |
|
|
2105 | "core-js-2.6.10" = { | |
|
2097 | 2106 | name = "core-js"; |
|
2098 | 2107 | packageName = "core-js"; |
|
2099 |
version = "2.6. |
|
|
2100 | src = fetchurl { | |
|
2101 |
url = "https://registry.npmjs.org/core-js/-/core-js-2.6. |
|
|
2102 | sha512 = "klh/kDpwX8hryYL14M9w/xei6vrv6sE8gTHDG7/T/+SEovB/G4ejwcfE/CBzO6Edsu+OETZMZ3wcX/EjUkrl5A=="; | |
|
2108 | version = "2.6.10"; | |
|
2109 | src = fetchurl { | |
|
2110 | url = "https://registry.npmjs.org/core-js/-/core-js-2.6.10.tgz"; | |
|
2111 | sha512 = "I39t74+4t+zau64EN1fE5v2W31Adtc/REhzWN+gWRRXg6WH5qAsZm62DHpQ1+Yhe4047T55jvzz7MUqF/dBBlA=="; | |
|
2103 | 2112 | }; |
|
2104 | 2113 | }; |
|
2105 | 2114 | "core-util-is-1.0.2" = { |
@@ -2237,13 +2246,13 b' let' | |||
|
2237 | 2246 | sha1 = "ddd52c587033f49e94b71fc55569f252e8ff5f85"; |
|
2238 | 2247 | }; |
|
2239 | 2248 | }; |
|
2240 |
"cyclist-0. |
|
|
2249 | "cyclist-1.0.1" = { | |
|
2241 | 2250 | name = "cyclist"; |
|
2242 | 2251 | packageName = "cyclist"; |
|
2243 |
version = "0. |
|
|
2244 | src = fetchurl { | |
|
2245 |
url = "https://registry.npmjs.org/cyclist/-/cyclist-0. |
|
|
2246 | sha1 = "1b33792e11e914a2fd6d6ed6447464444e5fa640"; | |
|
2252 | version = "1.0.1"; | |
|
2253 | src = fetchurl { | |
|
2254 | url = "https://registry.npmjs.org/cyclist/-/cyclist-1.0.1.tgz"; | |
|
2255 | sha1 = "596e9698fd0c80e12038c2b82d6eb1b35b6224d9"; | |
|
2247 | 2256 | }; |
|
2248 | 2257 | }; |
|
2249 | 2258 | "dashdash-1.14.1" = { |
@@ -2435,13 +2444,13 b' let' | |||
|
2435 | 2444 | sha512 = "gd3ypIPfOMr9h5jIKq8E3sHOTCjeirnl0WK5ZdS1AW0Odt0b1PaWaHdJ4Qk4klv+YB9aJBS7mESXjFoDQPu6DA=="; |
|
2436 | 2445 | }; |
|
2437 | 2446 | }; |
|
2438 |
"dom-serializer-0. |
|
|
2447 | "dom-serializer-0.2.1" = { | |
|
2439 | 2448 | name = "dom-serializer"; |
|
2440 | 2449 | packageName = "dom-serializer"; |
|
2441 |
version = "0. |
|
|
2442 | src = fetchurl { | |
|
2443 |
url = "https://registry.npmjs.org/dom-serializer/-/dom-serializer-0. |
|
|
2444 | sha512 = "l0IU0pPzLWSHBcieZbpOKgkIn3ts3vAh7ZuFyXNwJxJXk/c4Gwj9xaTJwIDVQCXawWD0qb3IzMGH5rglQaO0XA=="; | |
|
2450 | version = "0.2.1"; | |
|
2451 | src = fetchurl { | |
|
2452 | url = "https://registry.npmjs.org/dom-serializer/-/dom-serializer-0.2.1.tgz"; | |
|
2453 | sha512 = "sK3ujri04WyjwQXVoK4PU3y8ula1stq10GJZpqHIUgoGZdsGzAGu65BnU3d08aTVSvO7mGPZUc0wTEDL+qGE0Q=="; | |
|
2445 | 2454 | }; |
|
2446 | 2455 | }; |
|
2447 | 2456 | "dom5-2.3.0" = { |
@@ -2471,6 +2480,15 b' let' | |||
|
2471 | 2480 | sha512 = "BSKB+TSpMpFI/HOxCNr1O8aMOTZ8hT3pM3GQ0w/mWRmkhEDSFJkkyzz4XQsBV44BChwGkrDfMyjVD0eA2aFV3w=="; |
|
2472 | 2481 | }; |
|
2473 | 2482 | }; |
|
2483 | "domelementtype-2.0.1" = { | |
|
2484 | name = "domelementtype"; | |
|
2485 | packageName = "domelementtype"; | |
|
2486 | version = "2.0.1"; | |
|
2487 | src = fetchurl { | |
|
2488 | url = "https://registry.npmjs.org/domelementtype/-/domelementtype-2.0.1.tgz"; | |
|
2489 | sha512 = "5HOHUDsYZWV8FGWN0Njbr/Rn7f/eWSQi1v7+HsUVwXgn8nWWlL64zKDkS0n8ZmQ3mlWOMuXOnR+7Nx/5tMO5AQ=="; | |
|
2490 | }; | |
|
2491 | }; | |
|
2474 | 2492 | "domhandler-2.3.0" = { |
|
2475 | 2493 | name = "domhandler"; |
|
2476 | 2494 | packageName = "domhandler"; |
@@ -2498,6 +2516,15 b' let' | |||
|
2498 | 2516 | sha512 = "3VduRWLxx9hbVr42QieQN25mx/I61/mRdUSuxAmDGdDqZIN8qtP7tcKMa3KfpJjuGjOJGYYUzzeq6eGDnkzesA=="; |
|
2499 | 2517 | }; |
|
2500 | 2518 | }; |
|
2519 | "duplexer-0.1.1" = { | |
|
2520 | name = "duplexer"; | |
|
2521 | packageName = "duplexer"; | |
|
2522 | version = "0.1.1"; | |
|
2523 | src = fetchurl { | |
|
2524 | url = "https://registry.npmjs.org/duplexer/-/duplexer-0.1.1.tgz"; | |
|
2525 | sha1 = "ace6ff808c1ce66b57d1ebf97977acb02334cfc1"; | |
|
2526 | }; | |
|
2527 | }; | |
|
2501 | 2528 | "duplexify-3.7.1" = { |
|
2502 | 2529 | name = "duplexify"; |
|
2503 | 2530 | packageName = "duplexify"; |
@@ -2516,22 +2543,22 b' let' | |||
|
2516 | 2543 | sha1 = "3a83a904e54353287874c564b7549386849a98c9"; |
|
2517 | 2544 | }; |
|
2518 | 2545 | }; |
|
2519 | "electron-to-chromium-1.3. |
|
|
2546 | "electron-to-chromium-1.3.302" = { | |
|
2520 | 2547 | name = "electron-to-chromium"; |
|
2521 | 2548 | packageName = "electron-to-chromium"; |
|
2522 | version = "1.3. |
|
|
2523 | src = fetchurl { | |
|
2524 | url = "https://registry.npmjs.org/electron-to-chromium/-/electron-to-chromium-1.3. |
|
|
2525 | sha512 = "lyoC8aoqbbDqsprb6aPdt9n3DpOZZzdz/T4IZKsR0/dkZIxnJVUjjcpOSwA66jPRIOyDAamCTAUqweU05kKNSg=="; | |
|
2526 | }; | |
|
2527 | }; | |
|
2528 | "elliptic-6. |
|
|
2549 | version = "1.3.302"; | |
|
2550 | src = fetchurl { | |
|
2551 | url = "https://registry.npmjs.org/electron-to-chromium/-/electron-to-chromium-1.3.302.tgz"; | |
|
2552 | sha512 = "1qConyiVEbj4xZRBXqtGR003+9tV0rJF0PS6aeO0Ln/UL637js9hdwweCl07meh/kJoI2N4W8q3R3g3F5z46ww=="; | |
|
2553 | }; | |
|
2554 | }; | |
|
2555 | "elliptic-6.5.1" = { | |
|
2529 | 2556 | name = "elliptic"; |
|
2530 | 2557 | packageName = "elliptic"; |
|
2531 | version = "6. |
|
|
2532 | src = fetchurl { | |
|
2533 | url = "https://registry.npmjs.org/elliptic/-/elliptic-6. |
|
|
2534 | sha512 = "BsXLz5sqX8OHcsh7CqBMztyXARmGQ3LWPtGjJi6DiJHq5C/qvi9P3OqgswKSDftbu8+IoI/QDTAm2fFnQ9SZSQ=="; | |
|
2558 | version = "6.5.1"; | |
|
2559 | src = fetchurl { | |
|
2560 | url = "https://registry.npmjs.org/elliptic/-/elliptic-6.5.1.tgz"; | |
|
2561 | sha512 = "xvJINNLbTeWQjrl6X+7eQCrIy/YPv5XCpKW6kB5mKvtnGILoLDcySuwomfdzt0BMdLNVnuRNTuzKNHj0bva1Cg=="; | |
|
2535 | 2562 | }; |
|
2536 | 2563 | }; |
|
2537 | 2564 | "emojis-list-2.1.0" = { |
@@ -2543,13 +2570,13 b' let' | |||
|
2543 | 2570 | sha1 = "4daa4d9db00f9819880c79fa457ae5b09a1fd389"; |
|
2544 | 2571 | }; |
|
2545 | 2572 | }; |
|
2546 | "end-of-stream-1.4. |
|
|
2573 | "end-of-stream-1.4.4" = { | |
|
2547 | 2574 | name = "end-of-stream"; |
|
2548 | 2575 | packageName = "end-of-stream"; |
|
2549 | version = "1.4. |
|
|
2550 | src = fetchurl { | |
|
2551 | url = "https://registry.npmjs.org/end-of-stream/-/end-of-stream-1.4. |
|
|
2552 | sha512 = "1MkrZNvWTKCaigbn+W15elq2BB/L22nqrSY5DKlo3X6+vclJm8Bb5djXJBmEX6fS3+zCh/F4VBK5Z2KxJt4s2Q=="; | |
|
2576 | version = "1.4.4"; | |
|
2577 | src = fetchurl { | |
|
2578 | url = "https://registry.npmjs.org/end-of-stream/-/end-of-stream-1.4.4.tgz"; | |
|
2579 | sha512 = "+uw1inIHVPQoaVuHzRyXd21icM+cnt4CzD5rW+NC1wjOUSTOs+Te7FOv7AhN7vS9x/oIyhLP5PR1H+phQAHu5Q=="; | |
|
2553 | 2580 | }; |
|
2554 | 2581 | }; |
|
2555 | 2582 | "enhanced-resolve-3.4.1" = { |
@@ -2561,13 +2588,13 b' let' | |||
|
2561 | 2588 | sha1 = "0421e339fd71419b3da13d129b3979040230476e"; |
|
2562 | 2589 | }; |
|
2563 | 2590 | }; |
|
2564 | "enhanced-resolve-4.1. |
|
|
2591 | "enhanced-resolve-4.1.1" = { | |
|
2565 | 2592 | name = "enhanced-resolve"; |
|
2566 | 2593 | packageName = "enhanced-resolve"; |
|
2567 | version = "4.1. |
|
|
2568 | src = fetchurl { | |
|
2569 | url = "https://registry.npmjs.org/enhanced-resolve/-/enhanced-resolve-4.1. |
|
|
2570 | sha512 = "F/7vkyTtyc/llOIn8oWclcB25KdRaiPBpZYDgJHgh/UHtpgT2p2eldQgtQnLtUvfMKPKxbRaQM/hHkvLHt1Vng=="; | |
|
2594 | version = "4.1.1"; | |
|
2595 | src = fetchurl { | |
|
2596 | url = "https://registry.npmjs.org/enhanced-resolve/-/enhanced-resolve-4.1.1.tgz"; | |
|
2597 | sha512 = "98p2zE+rL7/g/DzMHMTF4zZlCgeVdJ7yr6xzEpJRYwFYrGi9ANdn5DnJURg6RpBkyk60XYDnWIv51VfIhfNGuA=="; | |
|
2571 | 2598 | }; |
|
2572 | 2599 | }; |
|
2573 | 2600 | "entities-1.0.0" = { |
@@ -2579,13 +2606,13 b' let' | |||
|
2579 | 2606 | sha1 = "b2987aa3821347fcde642b24fdfc9e4fb712bf26"; |
|
2580 | 2607 | }; |
|
2581 | 2608 | }; |
|
2582 | "entities- |
|
|
2609 | "entities-2.0.0" = { | |
|
2583 | 2610 | name = "entities"; |
|
2584 | 2611 | packageName = "entities"; |
|
2585 | version = " |
|
|
2586 | src = fetchurl { | |
|
2587 | url = "https://registry.npmjs.org/entities/-/entities- |
|
|
2588 | sha512 = "f2LZMYl1Fzu7YSBKg+RoROelpOaNrcGmE9AZubeDfrCEia483oW4MI4VyFd5VNHIgQ/7qm1I0wUHK1eJnn2y2w=="; | |
|
2612 | version = "2.0.0"; | |
|
2613 | src = fetchurl { | |
|
2614 | url = "https://registry.npmjs.org/entities/-/entities-2.0.0.tgz"; | |
|
2615 | sha512 = "D9f7V0JSRwIxlRI2mjMqufDrRDnx8p+eEOz7aUM9SuvF8gsBzra0/6tbjl1m8eQHrZlYj6PxqE00hZ1SAIKPLw=="; | |
|
2589 | 2616 | }; |
|
2590 | 2617 | }; |
|
2591 | 2618 | "errno-0.1.7" = { |
@@ -2597,13 +2624,13 b' let' | |||
|
2597 | 2624 | sha512 = "MfrRBDWzIWifgq6tJj60gkAwtLNb6sQPlcFrSOflcP1aFmmruKQ2wRnze/8V6kgyz7H3FF8Npzv78mZ7XLLflg=="; |
|
2598 | 2625 | }; |
|
2599 | 2626 | }; |
|
2600 | "es-abstract-1.1 |
|
|
2627 | "es-abstract-1.16.0" = { | |
|
2601 | 2628 | name = "es-abstract"; |
|
2602 | 2629 | packageName = "es-abstract"; |
|
2603 | version = "1.1 |
|
|
2604 | src = fetchurl { | |
|
2605 | url = "https://registry.npmjs.org/es-abstract/-/es-abstract-1.1 |
|
|
2606 | sha512 = "vDZfg/ykNxQVwup/8E1BZhVzFfBxs9NqMzGcvIJrqg5k2/5Za2bWo40dK2J1pgLngZ7c+Shh8lwYtLGyrwPutg=="; | |
|
2630 | version = "1.16.0"; | |
|
2631 | src = fetchurl { | |
|
2632 | url = "https://registry.npmjs.org/es-abstract/-/es-abstract-1.16.0.tgz"; | |
|
2633 | sha512 = "xdQnfykZ9JMEiasTAJZJdMWCQ1Vm00NBw79/AWi7ELfZuuPCSOMDZbT9mkOfSctVtfhb+sAAzrm+j//GjjLHLg=="; | |
|
2607 | 2634 | }; |
|
2608 | 2635 | }; |
|
2609 | 2636 | "es-to-primitive-1.2.0" = { |
@@ -2687,22 +2714,22 b' let' | |||
|
2687 | 2714 | sha512 = "64RBB++fIOAXPw3P9cy89qfMlvZEXZkqqJkjqqXIvzP5ezRZjW+lPWjw35UX/3EhUPFYbg5ER4JYgDw4007/DQ=="; |
|
2688 | 2715 | }; |
|
2689 | 2716 | }; |
|
2690 | "estraverse-4. |
|
|
2717 | "estraverse-4.3.0" = { | |
|
2691 | 2718 | name = "estraverse"; |
|
2692 | 2719 | packageName = "estraverse"; |
|
2693 | version = "4. |
|
|
2694 | src = fetchurl { | |
|
2695 | url = "https://registry.npmjs.org/estraverse/-/estraverse-4. |
|
|
2696 | sha1 = "0dee3fed31fcd469618ce7342099fc1afa0bdb13"; | |
|
2697 | }; | |
|
2698 | }; | |
|
2699 | "esutils-2.0. |
|
|
2720 | version = "4.3.0"; | |
|
2721 | src = fetchurl { | |
|
2722 | url = "https://registry.npmjs.org/estraverse/-/estraverse-4.3.0.tgz"; | |
|
2723 | sha512 = "39nnKffWz8xN1BU/2c79n9nB9HDzo0niYUqx6xyqUnyoAnQyyWpOTdZEeiCch8BBu515t4wp9ZmgVfVhn9EBpw=="; | |
|
2724 | }; | |
|
2725 | }; | |
|
2726 | "esutils-2.0.3" = { | |
|
2700 | 2727 | name = "esutils"; |
|
2701 | 2728 | packageName = "esutils"; |
|
2702 | version = "2.0. |
|
|
2703 | src = fetchurl { | |
|
2704 | url = "https://registry.npmjs.org/esutils/-/esutils-2.0. |
|
|
2705 | sha1 = "0abf4f1caa5bcb1f7a9d8acc6dea4faaa04bac9b"; | |
|
2729 | version = "2.0.3"; | |
|
2730 | src = fetchurl { | |
|
2731 | url = "https://registry.npmjs.org/esutils/-/esutils-2.0.3.tgz"; | |
|
2732 | sha512 = "kVscqXk4OCp68SZ0dkgEKVi6/8ij300KBWTJq32P/dYeWTSwK41WyTxalN1eRmA5Z9UU/LX9D7FWSmV9SAYx6g=="; | |
|
2706 | 2733 | }; |
|
2707 | 2734 | }; |
|
2708 | 2735 | "eventemitter2-0.4.14" = { |
@@ -2867,6 +2894,15 b' let' | |||
|
2867 | 2894 | sha1 = "c14c5b3bf14d7417ffbfd990c0a7495cd9f337bc"; |
|
2868 | 2895 | }; |
|
2869 | 2896 | }; |
|
2897 | "figures-1.7.0" = { | |
|
2898 | name = "figures"; | |
|
2899 | packageName = "figures"; | |
|
2900 | version = "1.7.0"; | |
|
2901 | src = fetchurl { | |
|
2902 | url = "https://registry.npmjs.org/figures/-/figures-1.7.0.tgz"; | |
|
2903 | sha1 = "cbe1e3affcf1cd44b80cadfed28dc793a9701d2e"; | |
|
2904 | }; | |
|
2905 | }; | |
|
2870 | 2906 | "file-sync-cmp-0.1.1" = { |
|
2871 | 2907 | name = "file-sync-cmp"; |
|
2872 | 2908 | packageName = "file-sync-cmp"; |
@@ -2948,13 +2984,13 b' let' | |||
|
2948 | 2984 | sha512 = "lNaHNVymajmk0OJMBn8fVUAU1BtDeKIqKoVhk4xAALB57aALg6b4W0MfJ/cUE0g9YBXy5XhSlPIpYIJ7HaY/3Q=="; |
|
2949 | 2985 | }; |
|
2950 | 2986 | }; |
|
2951 | "flatten-1.0. |
|
|
2987 | "flatten-1.0.3" = { | |
|
2952 | 2988 | name = "flatten"; |
|
2953 | 2989 | packageName = "flatten"; |
|
2954 | version = "1.0. |
|
|
2955 | src = fetchurl { | |
|
2956 | url = "https://registry.npmjs.org/flatten/-/flatten-1.0. |
|
|
2957 | sha1 = "dae46a9d78fbe25292258cc1e780a41d95c03782"; | |
|
2990 | version = "1.0.3"; | |
|
2991 | src = fetchurl { | |
|
2992 | url = "https://registry.npmjs.org/flatten/-/flatten-1.0.3.tgz"; | |
|
2993 | sha512 = "dVsPA/UwQ8+2uoFe5GHtiBMu48dWLTdsuEd7CKGlZlD78r1TTWBvDuFaFGKCo/ZfEr95Uk56vZoX86OsHkUeIg=="; | |
|
2958 | 2994 | }; |
|
2959 | 2995 | }; |
|
2960 | 2996 | "flush-write-stream-1.1.1" = { |
@@ -3128,13 +3164,13 b' let' | |||
|
3128 | 3164 | sha1 = "4a973f635b9190f715d10987d5c00fd2815ebe3d"; |
|
3129 | 3165 | }; |
|
3130 | 3166 | }; |
|
3131 | "glob-7.1. |
|
|
3167 | "glob-7.1.5" = { | |
|
3132 | 3168 | name = "glob"; |
|
3133 | 3169 | packageName = "glob"; |
|
3134 | version = "7.1. |
|
|
3135 | src = fetchurl { | |
|
3136 | url = "https://registry.npmjs.org/glob/-/glob-7.1. |
|
|
3137 | sha512 = "hkLPepehmnKk41pUGm3sYxoFs/umurYfYJCerbXEyFIWcAzvpipAgVkBqqT9RBKMGjnq6kMuyYwha6csxbiM1A=="; | |
|
3170 | version = "7.1.5"; | |
|
3171 | src = fetchurl { | |
|
3172 | url = "https://registry.npmjs.org/glob/-/glob-7.1.5.tgz"; | |
|
3173 | sha512 = "J9dlskqUXK1OeTOYBEn5s8aMukWMwWfs+rPTn/jn50Ux4MNXVhubL1wu/j2t+H4NVI+cXEcCaYellqaPVGXNqQ=="; | |
|
3138 | 3174 | }; |
|
3139 | 3175 | }; |
|
3140 | 3176 | "glob-parent-3.1.0" = { |
@@ -3218,13 +3254,13 b' let' | |||
|
3218 | 3254 | sha1 = "15a4806a57547cb2d2dbf27f42e89a8c3451b364"; |
|
3219 | 3255 | }; |
|
3220 | 3256 | }; |
|
3221 | "graceful-fs-4. |
|
|
3257 | "graceful-fs-4.2.3" = { | |
|
3222 | 3258 | name = "graceful-fs"; |
|
3223 | 3259 | packageName = "graceful-fs"; |
|
3224 | version = "4. |
|
|
3225 | src = fetchurl { | |
|
3226 | url = "https://registry.npmjs.org/graceful-fs/-/graceful-fs-4. |
|
|
3227 | sha512 = "6uHUhOPEBgQ24HM+r6b/QwWfZq+yiFcipKFrOFiBEnWdy5sdzYoi+pJeQaPI5qOLRFqWmAXUPQNsielzdLoecA=="; | |
|
3260 | version = "4.2.3"; | |
|
3261 | src = fetchurl { | |
|
3262 | url = "https://registry.npmjs.org/graceful-fs/-/graceful-fs-4.2.3.tgz"; | |
|
3263 | sha512 = "a30VEBm4PEdx1dRB7MFK7BejejvCvBronbLjht+sHuGYj8PHs7M/5Z+rt5lw551vZ7yfTCj4Vuyy3mSJytDWRQ=="; | |
|
3228 | 3264 | }; |
|
3229 | 3265 | }; |
|
3230 | 3266 | "grunt-0.4.5" = { |
@@ -3281,6 +3317,15 b' let' | |||
|
3281 | 3317 | sha1 = "3bbdec0b75d12ceaa55d62943625c0b0861cdf6f"; |
|
3282 | 3318 | }; |
|
3283 | 3319 | }; |
|
3320 | "grunt-contrib-uglify-4.0.1" = { | |
|
3321 | name = "grunt-contrib-uglify"; | |
|
3322 | packageName = "grunt-contrib-uglify"; | |
|
3323 | version = "4.0.1"; | |
|
3324 | src = fetchurl { | |
|
3325 | url = "https://registry.npmjs.org/grunt-contrib-uglify/-/grunt-contrib-uglify-4.0.1.tgz"; | |
|
3326 | sha512 = "dwf8/+4uW1+7pH72WButOEnzErPGmtUvc8p08B0eQS/6ON0WdeQu0+WFeafaPTbbY1GqtS25lsHWaDeiTQNWPg=="; | |
|
3327 | }; | |
|
3328 | }; | |
|
3284 | 3329 | "grunt-contrib-watch-0.6.1" = { |
|
3285 | 3330 | name = "grunt-contrib-watch"; |
|
3286 | 3331 | packageName = "grunt-contrib-watch"; |
@@ -3335,6 +3380,15 b' let' | |||
|
3335 | 3380 | sha512 = "SaZ8K8lG4iTxs7ClZxOWCf3kxqS2y+Eel8SbaEGgBKwhAp6e45beIu+vhBZRLX3vonKML2kjemKsQ21REaqNFQ=="; |
|
3336 | 3381 | }; |
|
3337 | 3382 | }; |
|
3383 | "gzip-size-3.0.0" = { | |
|
3384 | name = "gzip-size"; | |
|
3385 | packageName = "gzip-size"; | |
|
3386 | version = "3.0.0"; | |
|
3387 | src = fetchurl { | |
|
3388 | url = "https://registry.npmjs.org/gzip-size/-/gzip-size-3.0.0.tgz"; | |
|
3389 | sha1 = "546188e9bdc337f673772f81660464b389dce520"; | |
|
3390 | }; | |
|
3391 | }; | |
|
3338 | 3392 | "har-schema-1.0.5" = { |
|
3339 | 3393 | name = "har-schema"; |
|
3340 | 3394 | packageName = "har-schema"; |
@@ -3695,15 +3749,6 b' let' | |||
|
3695 | 3749 | sha1 = "f30f716c8e2bd346c7b67d3df3915566a7c05607"; |
|
3696 | 3750 | }; |
|
3697 | 3751 | }; |
|
3698 | "indexof-0.0.1" = { | |
|
3699 | name = "indexof"; | |
|
3700 | packageName = "indexof"; | |
|
3701 | version = "0.0.1"; | |
|
3702 | src = fetchurl { | |
|
3703 | url = "https://registry.npmjs.org/indexof/-/indexof-0.0.1.tgz"; | |
|
3704 | sha1 = "82dc336d232b9062179d05ab3293a66059fd435d"; | |
|
3705 | }; | |
|
3706 | }; | |
|
3707 | 3752 | "inflight-1.0.6" = { |
|
3708 | 3753 | name = "inflight"; |
|
3709 | 3754 | packageName = "inflight"; |
@@ -3740,6 +3785,15 b' let' | |||
|
3740 | 3785 | sha1 = "633c2c83e3da42a502f52466022480f4208261de"; |
|
3741 | 3786 | }; |
|
3742 | 3787 | }; |
|
3788 | "inherits-2.0.4" = { | |
|
3789 | name = "inherits"; | |
|
3790 | packageName = "inherits"; | |
|
3791 | version = "2.0.4"; | |
|
3792 | src = fetchurl { | |
|
3793 | url = "https://registry.npmjs.org/inherits/-/inherits-2.0.4.tgz"; | |
|
3794 | sha512 = "k/vGaX4/Yla3WzyMCvTQOXYeIHvqOKtnqBduzTHpzpQZzAskKMhZ2K+EnBiSM9zGSoIFeMpXKxa4dYeZIQqewQ=="; | |
|
3795 | }; | |
|
3796 | }; | |
|
3743 | 3797 | "ini-1.3.5" = { |
|
3744 | 3798 | name = "ini"; |
|
3745 | 3799 | packageName = "ini"; |
@@ -4424,13 +4478,13 b' let' | |||
|
4424 | 4478 | sha1 = "fadd834b9683073da179b3eae6d9c0d15053f73e"; |
|
4425 | 4479 | }; |
|
4426 | 4480 | }; |
|
4427 | "lodash-4.17.1 |
|
|
4481 | "lodash-4.17.15" = { | |
|
4428 | 4482 | name = "lodash"; |
|
4429 | 4483 | packageName = "lodash"; |
|
4430 | version = "4.17.1 |
|
|
4431 | src = fetchurl { | |
|
4432 | url = "https://registry.npmjs.org/lodash/-/lodash-4.17.1 |
|
|
4433 | sha512 = "cQKh8igo5QUhZ7lg38DYWAxMvjSAKG0A8wGSVimP07SIUEK2UO+arSRKbRZWtelMtN5V0Hkwh5ryOto/SshYIg=="; | |
|
4484 | version = "4.17.15"; | |
|
4485 | src = fetchurl { | |
|
4486 | url = "https://registry.npmjs.org/lodash/-/lodash-4.17.15.tgz"; | |
|
4487 | sha512 = "8xOcRHvCjnocdS5cpwXQXVzmmh5e5+saE2QGoeQmbKmRS6J3VQppPOIt0MnmE+4xlZoumy0GPG0D0MVIQbNA1A=="; | |
|
4434 | 4488 | }; |
|
4435 | 4489 | }; |
|
4436 | 4490 | "lodash.camelcase-4.3.0" = { |
@@ -4577,6 +4631,15 b' let' | |||
|
4577 | 4631 | sha1 = "de819fdbcd84dccd8fae59c6aeb79615b9d266ac"; |
|
4578 | 4632 | }; |
|
4579 | 4633 | }; |
|
4634 | "maxmin-2.1.0" = { | |
|
4635 | name = "maxmin"; | |
|
4636 | packageName = "maxmin"; | |
|
4637 | version = "2.1.0"; | |
|
4638 | src = fetchurl { | |
|
4639 | url = "https://registry.npmjs.org/maxmin/-/maxmin-2.1.0.tgz"; | |
|
4640 | sha1 = "4d3b220903d95eee7eb7ac7fa864e72dc09a3166"; | |
|
4641 | }; | |
|
4642 | }; | |
|
4580 | 4643 | "md5.js-1.3.5" = { |
|
4581 | 4644 | name = "md5.js"; |
|
4582 | 4645 | packageName = "md5.js"; |
@@ -4604,6 +4667,15 b' let' | |||
|
4604 | 4667 | sha1 = "3a9a20b8462523e447cfbc7e8bb80ed667bfc552"; |
|
4605 | 4668 | }; |
|
4606 | 4669 | }; |
|
4670 | "memory-fs-0.5.0" = { | |
|
4671 | name = "memory-fs"; | |
|
4672 | packageName = "memory-fs"; | |
|
4673 | version = "0.5.0"; | |
|
4674 | src = fetchurl { | |
|
4675 | url = "https://registry.npmjs.org/memory-fs/-/memory-fs-0.5.0.tgz"; | |
|
4676 | sha512 = "jA0rdU5KoQMC0e6ppoNRtpp6vjFq6+NY7r8hywnC7V+1Xj/MtHwGIbB1QaK/dunyjWteJzmkpd7ooeWg10T7GA=="; | |
|
4677 | }; | |
|
4678 | }; | |
|
4607 | 4679 | "micromatch-3.1.10" = { |
|
4608 | 4680 | name = "micromatch"; |
|
4609 | 4681 | packageName = "micromatch"; |
@@ -4730,13 +4802,13 b' let' | |||
|
4730 | 4802 | sha512 = "zHo8v+otD1J10j/tC+VNoGK9keCuByhKovAvdn74dmxJl9+mWHnx6EMsDN4lgRoMI/eYo2nchAxniIbUPb5onw=="; |
|
4731 | 4803 | }; |
|
4732 | 4804 | }; |
|
4733 | "mixin-deep-1.3. |
|
|
4805 | "mixin-deep-1.3.2" = { | |
|
4734 | 4806 | name = "mixin-deep"; |
|
4735 | 4807 | packageName = "mixin-deep"; |
|
4736 | version = "1.3. |
|
|
4737 | src = fetchurl { | |
|
4738 | url = "https://registry.npmjs.org/mixin-deep/-/mixin-deep-1.3. |
|
|
4739 | sha512 = "8ZItLHeEgaqEvd5lYBXfm4EZSFCX29Jb9K+lAHhDKzReKBQKj3R+7NOF6tjqYi9t4oI8VUfaWITJQm86wnXGNQ=="; | |
|
4808 | version = "1.3.2"; | |
|
4809 | src = fetchurl { | |
|
4810 | url = "https://registry.npmjs.org/mixin-deep/-/mixin-deep-1.3.2.tgz"; | |
|
4811 | sha512 = "WRoDn//mXBiJ1H40rqa3vH0toePwSsGb45iInWlTySa+Uu4k3tYUSxa2v1KqAiLtvlrSzaExqS1gtk96A9zvEA=="; | |
|
4740 | 4812 | }; |
|
4741 | 4813 | }; |
|
4742 | 4814 | "mkdirp-0.5.1" = { |
@@ -4784,13 +4856,13 b' let' | |||
|
4784 | 4856 | sha1 = "5608aeadfc00be6c2901df5f9861788de0d597c8"; |
|
4785 | 4857 | }; |
|
4786 | 4858 | }; |
|
4787 | "nan-2.1 |
|
|
4859 | "nan-2.14.0" = { | |
|
4788 | 4860 | name = "nan"; |
|
4789 | 4861 | packageName = "nan"; |
|
4790 | version = "2.1 |
|
|
4791 | src = fetchurl { | |
|
4792 | url = "https://registry.npmjs.org/nan/-/nan-2.1 |
|
|
4793 | sha512 = "TghvYc72wlMGMVMluVo9WRJc0mB8KxxF/gZ4YYFy7V2ZQX9l7rgbPg7vjS9mt6U5HXODVFVI2bOduCzwOMv/lw=="; | |
|
4862 | version = "2.14.0"; | |
|
4863 | src = fetchurl { | |
|
4864 | url = "https://registry.npmjs.org/nan/-/nan-2.14.0.tgz"; | |
|
4865 | sha512 = "INOFj37C7k3AfaNTtX8RhsTw7qRy7eLET14cROi9+5HAVbbHuIWUHEauBv5qT4Av2tWasiTY1Jw6puUNqRJXQg=="; | |
|
4794 | 4866 | }; |
|
4795 | 4867 | }; |
|
4796 | 4868 | "nanomatch-1.2.13" = { |
@@ -4829,13 +4901,13 b' let' | |||
|
4829 | 4901 | sha512 = "rmTZ9kz+f3rCvK2TD1Ue/oZlns7OGoIWP4fc3llxxRXlOkHKoWPPWJOfFYpITabSow43QJbRIoHQXtt10VldyQ=="; |
|
4830 | 4902 | }; |
|
4831 | 4903 | }; |
|
4832 | "node-libs-browser-2.2. |
|
|
4904 | "node-libs-browser-2.2.1" = { | |
|
4833 | 4905 | name = "node-libs-browser"; |
|
4834 | 4906 | packageName = "node-libs-browser"; |
|
4835 | version = "2.2. |
|
|
4836 | src = fetchurl { | |
|
4837 | url = "https://registry.npmjs.org/node-libs-browser/-/node-libs-browser-2.2. |
|
|
4838 | sha512 = "5MQunG/oyOaBdttrL40dA7bUfPORLRWMUJLQtMg7nluxUvk5XwnLdL9twQHFAjRx/y7mIMkLKT9++qPbbk6BZA=="; | |
|
4907 | version = "2.2.1"; | |
|
4908 | src = fetchurl { | |
|
4909 | url = "https://registry.npmjs.org/node-libs-browser/-/node-libs-browser-2.2.1.tgz"; | |
|
4910 | sha512 = "h/zcD8H9kaDZ9ALUWwlBUDo6TKF8a7qBSCSEGfjTVIYeqsioSKaAX+BN7NgiMGp6iSIXZ3PxgCu8KS3b71YK5Q=="; | |
|
4839 | 4911 | }; |
|
4840 | 4912 | }; |
|
4841 | 4913 | "nopt-1.0.10" = { |
@@ -4973,6 +5045,15 b' let' | |||
|
4973 | 5045 | sha1 = "7e7d858b781bd7c991a41ba975ed3812754e998c"; |
|
4974 | 5046 | }; |
|
4975 | 5047 | }; |
|
5048 | "object-inspect-1.6.0" = { | |
|
5049 | name = "object-inspect"; | |
|
5050 | packageName = "object-inspect"; | |
|
5051 | version = "1.6.0"; | |
|
5052 | src = fetchurl { | |
|
5053 | url = "https://registry.npmjs.org/object-inspect/-/object-inspect-1.6.0.tgz"; | |
|
5054 | sha512 = "GJzfBZ6DgDAmnuaM3104jR4s1Myxr3Y3zfIyN4z3UdqN69oSRacNK8UhnobDdC+7J2AHCjGwxQubNJfE70SXXQ=="; | |
|
5055 | }; | |
|
5056 | }; | |
|
4976 | 5057 | "object-keys-1.1.1" = { |
|
4977 | 5058 | name = "object-keys"; |
|
4978 | 5059 | packageName = "object-keys"; |
@@ -5117,13 +5198,13 b' let' | |||
|
5117 | 5198 | sha512 = "vvcXsLAJ9Dr5rQOPk7toZQZJApBl2K4J6dANSsEuh6QI41JYcsS/qhTGa9ErIUUgK3WNQoJYvylxvjqmiqEA9Q=="; |
|
5118 | 5199 | }; |
|
5119 | 5200 | }; |
|
5120 | "p-limit-2.2. |
|
|
5201 | "p-limit-2.2.1" = { | |
|
5121 | 5202 | name = "p-limit"; |
|
5122 | 5203 | packageName = "p-limit"; |
|
5123 | version = "2.2. |
|
|
5124 | src = fetchurl { | |
|
5125 | url = "https://registry.npmjs.org/p-limit/-/p-limit-2.2. |
|
|
5126 | sha512 = "pZbTJpoUsCzV48Mc9Nh51VbwO0X9cuPFE8gYwx9BTCt9SF8/b7Zljd2fVgOxhIF/HDTKgpVzs+GPhyKfjLLFRQ=="; | |
|
5204 | version = "2.2.1"; | |
|
5205 | src = fetchurl { | |
|
5206 | url = "https://registry.npmjs.org/p-limit/-/p-limit-2.2.1.tgz"; | |
|
5207 | sha512 = "85Tk+90UCVWvbDavCLKPOLC9vvY8OwEX/RtKF+/1OADJMVlFfEHOiMTPVyxg7mk/dKa+ipdHm0OUkTvCpMTuwg=="; | |
|
5127 | 5208 | }; |
|
5128 | 5209 | }; |
|
5129 | 5210 | "p-locate-2.0.0" = { |
@@ -5171,13 +5252,13 b' let' | |||
|
5171 | 5252 | sha512 = "0DTvPVU3ed8+HNXOu5Bs+o//Mbdj9VNQMUOe9oKCwh8l0GNwpTDMKCWbRjgtD291AWnkAgkqA/LOnQS8AmS1tw=="; |
|
5172 | 5253 | }; |
|
5173 | 5254 | }; |
|
5174 | "parallel-transform-1. |
|
|
5255 | "parallel-transform-1.2.0" = { | |
|
5175 | 5256 | name = "parallel-transform"; |
|
5176 | 5257 | packageName = "parallel-transform"; |
|
5177 | version = "1. |
|
|
5178 | src = fetchurl { | |
|
5179 | url = "https://registry.npmjs.org/parallel-transform/-/parallel-transform-1. |
|
|
5180 | sha1 = "d410f065b05da23081fcd10f28854c29bda33b06"; | |
|
5258 | version = "1.2.0"; | |
|
5259 | src = fetchurl { | |
|
5260 | url = "https://registry.npmjs.org/parallel-transform/-/parallel-transform-1.2.0.tgz"; | |
|
5261 | sha512 = "P2vSmIu38uIlvdcU7fDkyrxj33gTUy/ABO5ZUbGowxNCopBq/OoD42bP4UmMrJoPyk4Uqf0mu3mtWBhHCZD8yg=="; | |
|
5181 | 5262 | }; |
|
5182 | 5263 | }; |
|
5183 | 5264 | "param-case-2.1.1" = { |
@@ -5189,13 +5270,13 b' let' | |||
|
5189 | 5270 | sha1 = "df94fd8cf6531ecf75e6bef9a0858fbc72be2247"; |
|
5190 | 5271 | }; |
|
5191 | 5272 | }; |
|
5192 | "parse-asn1-5.1. |
|
|
5273 | "parse-asn1-5.1.5" = { | |
|
5193 | 5274 | name = "parse-asn1"; |
|
5194 | 5275 | packageName = "parse-asn1"; |
|
5195 | version = "5.1. |
|
|
5196 | src = fetchurl { | |
|
5197 | url = "https://registry.npmjs.org/parse-asn1/-/parse-asn1-5.1. |
|
|
5198 | sha512 = "Qs5duJcuvNExRfFZ99HDD3z4mAi3r9Wl/FOjEOijlxwCZs7E7mW2vjTpgQ4J8LpTF8x5v+1Vn5UQFejmWT11aw=="; | |
|
5276 | version = "5.1.5"; | |
|
5277 | src = fetchurl { | |
|
5278 | url = "https://registry.npmjs.org/parse-asn1/-/parse-asn1-5.1.5.tgz"; | |
|
5279 | sha512 = "jkMYn1dcJqF6d5CpU689bq7w/b5ALS9ROVSpQDPrZsqqesUJii9qutvoT5ltGedNXMO2e16YUWIghG9KxaViTQ=="; | |
|
5199 | 5280 | }; |
|
5200 | 5281 | }; |
|
5201 | 5282 | "parse-filepath-1.0.2" = { |
@@ -5252,13 +5333,13 b' let' | |||
|
5252 | 5333 | sha1 = "b363e55e8006ca6fe21784d2db22bd15d7917f14"; |
|
5253 | 5334 | }; |
|
5254 | 5335 | }; |
|
5255 | "path-browserify-0.0. |
|
|
5336 | "path-browserify-0.0.1" = { | |
|
5256 | 5337 | name = "path-browserify"; |
|
5257 | 5338 | packageName = "path-browserify"; |
|
5258 | version = "0.0. |
|
|
5259 | src = fetchurl { | |
|
5260 | url = "https://registry.npmjs.org/path-browserify/-/path-browserify-0.0. |
|
|
5261 | sha1 = "a0b870729aae214005b7d5032ec2cbbb0fb4451a"; | |
|
5339 | version = "0.0.1"; | |
|
5340 | src = fetchurl { | |
|
5341 | url = "https://registry.npmjs.org/path-browserify/-/path-browserify-0.0.1.tgz"; | |
|
5342 | sha512 = "BapA40NHICOS+USX9SN4tyhq+A2RrN/Ws5F0Z5aMHDp98Fl86lX8Oti8B7uN93L4Ifv4fHOEA+pQw87gmMO/lQ=="; | |
|
5262 | 5343 | }; |
|
5263 | 5344 | }; |
|
5264 | 5345 | "path-dirname-1.0.2" = { |
@@ -5711,6 +5792,15 b' let' | |||
|
5711 | 5792 | sha1 = "d4f4562b0ce3696e41ac52d0e002e57a635dc6dc"; |
|
5712 | 5793 | }; |
|
5713 | 5794 | }; |
|
5795 | "pretty-bytes-3.0.1" = { | |
|
5796 | name = "pretty-bytes"; | |
|
5797 | packageName = "pretty-bytes"; | |
|
5798 | version = "3.0.1"; | |
|
5799 | src = fetchurl { | |
|
5800 | url = "https://registry.npmjs.org/pretty-bytes/-/pretty-bytes-3.0.1.tgz"; | |
|
5801 | sha1 = "27d0008d778063a0b4811bb35c79f1bd5d5fbccf"; | |
|
5802 | }; | |
|
5803 | }; | |
|
5714 | 5804 | "pretty-error-2.1.1" = { |
|
5715 | 5805 | name = "pretty-error"; |
|
5716 | 5806 | packageName = "pretty-error"; |
@@ -5738,13 +5828,13 b' let' | |||
|
5738 | 5828 | sha1 = "7332300e840161bda3e69a1d1d91a7d4bc16f182"; |
|
5739 | 5829 | }; |
|
5740 | 5830 | }; |
|
5741 | "process-nextick-args-2.0. |
|
|
5831 | "process-nextick-args-2.0.1" = { | |
|
5742 | 5832 | name = "process-nextick-args"; |
|
5743 | 5833 | packageName = "process-nextick-args"; |
|
5744 | version = "2.0. |
|
|
5745 | src = fetchurl { | |
|
5746 | url = "https://registry.npmjs.org/process-nextick-args/-/process-nextick-args-2.0. |
|
|
5747 | sha512 = "MtEC1TqN0EU5nephaJ4rAtThHtC86dNN9qCuEhtshvpVBkAW5ZO7BASN9REnF9eoXGcRub+pFuKEpOHE+HbEMw=="; | |
|
5834 | version = "2.0.1"; | |
|
5835 | src = fetchurl { | |
|
5836 | url = "https://registry.npmjs.org/process-nextick-args/-/process-nextick-args-2.0.1.tgz"; | |
|
5837 | sha512 = "3ouUOpQhtgrbOa17J7+uxOTpITYWaGP7/AhoR3+A+/1e9skrzelGi/dXzEYyvbxubEF6Wn2ypscTKiKJFFn1ag=="; | |
|
5748 | 5838 | }; |
|
5749 | 5839 | }; |
|
5750 | 5840 | "promise-7.3.1" = { |
@@ -5990,13 +6080,13 b' let' | |||
|
5990 | 6080 | sha1 = "747c914e049614a4c9cfbba629871ad1d2927716"; |
|
5991 | 6081 | }; |
|
5992 | 6082 | }; |
|
5993 | "reduce-function-call-1.0. |
|
|
6083 | "reduce-function-call-1.0.3" = { | |
|
5994 | 6084 | name = "reduce-function-call"; |
|
5995 | 6085 | packageName = "reduce-function-call"; |
|
5996 | version = "1.0. |
|
|
5997 | src = fetchurl { | |
|
5998 | url = "https://registry.npmjs.org/reduce-function-call/-/reduce-function-call-1.0. |
|
|
5999 | sha1 = "5a200bf92e0e37751752fe45b0ab330fd4b6be99"; | |
|
6086 | version = "1.0.3"; | |
|
6087 | src = fetchurl { | |
|
6088 | url = "https://registry.npmjs.org/reduce-function-call/-/reduce-function-call-1.0.3.tgz"; | |
|
6089 | sha512 = "Hl/tuV2VDgWgCSEeWMLwxLZqX7OK59eU1guxXsRKTAyeYimivsKdtcV4fu3r710tpG5GmDKDhQ0HSZLExnNmyQ=="; | |
|
6000 | 6090 | }; |
|
6001 | 6091 | }; |
|
6002 | 6092 | "regenerate-1.4.0" = { |
@@ -6152,13 +6242,13 b' let' | |||
|
6152 | 6242 | sha1 = "97f717b69d48784f5f526a6c5aa8ffdda055a4d1"; |
|
6153 | 6243 | }; |
|
6154 | 6244 | }; |
|
6155 | "resolve-1.10 |
|
|
6245 | "resolve-1.12.0" = { | |
|
6156 | 6246 | name = "resolve"; |
|
6157 | 6247 | packageName = "resolve"; |
|
6158 | version = "1.10 |
|
|
6159 | src = fetchurl { | |
|
6160 | url = "https://registry.npmjs.org/resolve/-/resolve-1.10 |
|
|
6161 | sha512 = "KuIe4mf++td/eFb6wkaPbMDnP6kObCaEtIDuHOUED6MNUo4K670KZUHuuvYPZDxNF0WVLw49n06M2m2dXphEzA=="; | |
|
6248 | version = "1.12.0"; | |
|
6249 | src = fetchurl { | |
|
6250 | url = "https://registry.npmjs.org/resolve/-/resolve-1.12.0.tgz"; | |
|
6251 | sha512 = "B/dOmuoAik5bKcD6s6nXDCjzUKnaDvdkRyAk6rsmsKLipWj4797iothd7jmmUhWTfinVMU+wc56rYKsit2Qy4w=="; | |
|
6162 | 6252 | }; |
|
6163 | 6253 | }; |
|
6164 | 6254 | "resolve-cwd-2.0.0" = { |
@@ -6224,13 +6314,13 b' let' | |||
|
6224 | 6314 | sha1 = "e439be2aaee327321952730f99a8929e4fc50582"; |
|
6225 | 6315 | }; |
|
6226 | 6316 | }; |
|
6227 | "rimraf-2. |
|
|
6317 | "rimraf-2.7.1" = { | |
|
6228 | 6318 | name = "rimraf"; |
|
6229 | 6319 | packageName = "rimraf"; |
|
6230 | version = "2. |
|
|
6231 | src = fetchurl { | |
|
6232 | url = "https://registry.npmjs.org/rimraf/-/rimraf-2. |
|
|
6233 | sha512 = "mwqeW5XsA2qAejG46gYdENaxXjx9onRNCfn7L0duuP4hCuTIi/QO7PDK07KJfp1d+izWPrzEJDcSqBa0OZQriA=="; | |
|
6320 | version = "2.7.1"; | |
|
6321 | src = fetchurl { | |
|
6322 | url = "https://registry.npmjs.org/rimraf/-/rimraf-2.7.1.tgz"; | |
|
6323 | sha512 = "uWjbaKIK3T1OSVptzX7Nl6PvQ3qAGtKEtVRjRuazjfL3Bx5eI409VZSqgND+4UNnmzLVdPj9FqFJNPqBZFve4w=="; | |
|
6234 | 6324 | }; |
|
6235 | 6325 | }; |
|
6236 | 6326 | "ripemd160-2.0.2" = { |
@@ -6260,6 +6350,15 b' let' | |||
|
6260 | 6350 | sha512 = "Gd2UZBJDkXlY7GbJxfsE8/nvKkUEU1G38c1siN6QP6a9PT9MmHB8GnpscSmMJSoF8LOIrt8ud/wPtojys4G6+g=="; |
|
6261 | 6351 | }; |
|
6262 | 6352 | }; |
|
6353 | "safe-buffer-5.2.0" = { | |
|
6354 | name = "safe-buffer"; | |
|
6355 | packageName = "safe-buffer"; | |
|
6356 | version = "5.2.0"; | |
|
6357 | src = fetchurl { | |
|
6358 | url = "https://registry.npmjs.org/safe-buffer/-/safe-buffer-5.2.0.tgz"; | |
|
6359 | sha512 = "fZEwUGbVl7kouZs1jCdMLdt95hdIv0ZeHg6L7qPeciMZhZ+/gdesW4wgTARkrFWEpspjEATAzUGPG8N2jJiwbg=="; | |
|
6360 | }; | |
|
6361 | }; | |
|
6263 | 6362 | "safe-regex-1.1.0" = { |
|
6264 | 6363 | name = "safe-regex"; |
|
6265 | 6364 | packageName = "safe-regex"; |
@@ -6305,22 +6404,22 b' let' | |||
|
6305 | 6404 | sha1 = "0e7350acdec80b1108528786ec1d4418d11b396d"; |
|
6306 | 6405 | }; |
|
6307 | 6406 | }; |
|
6308 | "semver-5.7. |
|
|
6407 | "semver-5.7.1" = { | |
|
6309 | 6408 | name = "semver"; |
|
6310 | 6409 | packageName = "semver"; |
|
6311 | version = "5.7. |
|
|
6312 | src = fetchurl { | |
|
6313 | url = "https://registry.npmjs.org/semver/-/semver-5.7. |
|
|
6314 | sha512 = "Ya52jSX2u7QKghxeoFGpLwCtGlt7j0oY9DYb5apt9nPlJ42ID+ulTXESnt/qAQcoSERyZ5sl3LDIOw0nAn/5DA=="; | |
|
6315 | }; | |
|
6316 | }; | |
|
6317 | "serialize-javascript-1. |
|
|
6410 | version = "5.7.1"; | |
|
6411 | src = fetchurl { | |
|
6412 | url = "https://registry.npmjs.org/semver/-/semver-5.7.1.tgz"; | |
|
6413 | sha512 = "sauaDf/PZdVgrLTNYHRtpXa1iRiKcaebiKQ1BJdpQlWH2lCvexQdX55snPFyK7QzpudqbCI0qXFfOasHdyNDGQ=="; | |
|
6414 | }; | |
|
6415 | }; | |
|
6416 | "serialize-javascript-1.9.1" = { | |
|
6318 | 6417 | name = "serialize-javascript"; |
|
6319 | 6418 | packageName = "serialize-javascript"; |
|
6320 | version = "1. |
|
|
6321 | src = fetchurl { | |
|
6322 | url = "https://registry.npmjs.org/serialize-javascript/-/serialize-javascript-1. |
|
|
6323 | sha512 = "ke8UG8ulpFOxO8f8gRYabHQe/ZntKlcig2Mp+8+URDP1D8vJZ0KUt7LYo07q25Z/+JVSgpr/cui9PIp5H6/+nA=="; | |
|
6419 | version = "1.9.1"; | |
|
6420 | src = fetchurl { | |
|
6421 | url = "https://registry.npmjs.org/serialize-javascript/-/serialize-javascript-1.9.1.tgz"; | |
|
6422 | sha512 = "0Vb/54WJ6k5v8sSWN09S0ora+Hnr+cX40r9F170nT+mSkaxltoE/7R3OrIdBSUv1OoiobH1QoWQbCnAO+e8J1A=="; | |
|
6324 | 6423 | }; |
|
6325 | 6424 | }; |
|
6326 | 6425 | "set-blocking-2.0.0" = { |
@@ -6332,22 +6431,13 b' let' | |||
|
6332 | 6431 | sha1 = "045f9782d011ae9a6803ddd382b24392b3d890f7"; |
|
6333 | 6432 | }; |
|
6334 | 6433 | }; |
|
6335 | "set-value-0. |
|
|
6434 | "set-value-2.0.1" = { | |
|
6336 | 6435 | name = "set-value"; |
|
6337 | 6436 | packageName = "set-value"; |
|
6338 | version = "0. |
|
|
6339 | src = fetchurl { | |
|
6340 | url = "https://registry.npmjs.org/set-value/-/set-value-0. |
|
|
6341 | sha1 = "7db08f9d3d22dc7f78e53af3c3bf4666ecdfccf1"; | |
|
6342 | }; | |
|
6343 | }; | |
|
6344 | "set-value-2.0.0" = { | |
|
6345 | name = "set-value"; | |
|
6346 | packageName = "set-value"; | |
|
6347 | version = "2.0.0"; | |
|
6348 | src = fetchurl { | |
|
6349 | url = "https://registry.npmjs.org/set-value/-/set-value-2.0.0.tgz"; | |
|
6350 | sha512 = "hw0yxk9GT/Hr5yJEYnHNKYXkIA8mVJgd9ditYZCe16ZczcaELYYcfvaXesNACk2O8O0nTiPQcQhGUQj8JLzeeg=="; | |
|
6437 | version = "2.0.1"; | |
|
6438 | src = fetchurl { | |
|
6439 | url = "https://registry.npmjs.org/set-value/-/set-value-2.0.1.tgz"; | |
|
6440 | sha512 = "JxHc1weCN68wRY0fhCoXpyK55m/XPHafOmK4UWD7m2CI14GMcFypt4w/0+NV5f/ZMby2F6S2wwA7fgynh9gWSw=="; | |
|
6351 | 6441 | }; |
|
6352 | 6442 | }; |
|
6353 | 6443 | "setimmediate-1.0.5" = { |
@@ -6665,6 +6755,24 b' let' | |||
|
6665 | 6755 | sha512 = "nOqH59deCq9SRHlxq1Aw85Jnt4w6KvLKqWVik6oA9ZklXLNIOlqg4F2yrT1MVaTjAqvVwdfeZ7w7aCvJD7ugkw=="; |
|
6666 | 6756 | }; |
|
6667 | 6757 | }; |
|
6758 | "string.prototype.trimleft-2.1.0" = { | |
|
6759 | name = "string.prototype.trimleft"; | |
|
6760 | packageName = "string.prototype.trimleft"; | |
|
6761 | version = "2.1.0"; | |
|
6762 | src = fetchurl { | |
|
6763 | url = "https://registry.npmjs.org/string.prototype.trimleft/-/string.prototype.trimleft-2.1.0.tgz"; | |
|
6764 | sha512 = "FJ6b7EgdKxxbDxc79cOlok6Afd++TTs5szo+zJTUyow3ycrRfJVE2pq3vcN53XexvKZu/DJMDfeI/qMiZTrjTw=="; | |
|
6765 | }; | |
|
6766 | }; | |
|
6767 | "string.prototype.trimright-2.1.0" = { | |
|
6768 | name = "string.prototype.trimright"; | |
|
6769 | packageName = "string.prototype.trimright"; | |
|
6770 | version = "2.1.0"; | |
|
6771 | src = fetchurl { | |
|
6772 | url = "https://registry.npmjs.org/string.prototype.trimright/-/string.prototype.trimright-2.1.0.tgz"; | |
|
6773 | sha512 = "fXZTSV55dNBwv16uw+hh5jkghxSnc5oHq+5K/gXgizHwAvMetdAJlHqqoFC1FSDVPYWLkAKl2cxpUT41sV7nSg=="; | |
|
6774 | }; | |
|
6775 | }; | |
|
6668 | 6776 | "string_decoder-0.10.31" = { |
|
6669 | 6777 | name = "string_decoder"; |
|
6670 | 6778 | packageName = "string_decoder"; |
@@ -6683,13 +6791,13 b' let' | |||
|
6683 | 6791 | sha512 = "n/ShnvDi6FHbbVfviro+WojiFzv+s8MPMHBczVePfUpDJLwoLT0ht1l4YwBCbi8pJAveEEdnkHyPyTP/mzRfwg=="; |
|
6684 | 6792 | }; |
|
6685 | 6793 | }; |
|
6686 | "string_decoder-1. |
|
|
6794 | "string_decoder-1.3.0" = { | |
|
6687 | 6795 | name = "string_decoder"; |
|
6688 | 6796 | packageName = "string_decoder"; |
|
6689 | version = "1. |
|
|
6690 | src = fetchurl { | |
|
6691 | url = "https://registry.npmjs.org/string_decoder/-/string_decoder-1. |
|
|
6692 | sha512 = "6YqyX6ZWEYguAxgZzHGL7SsCeGx3V2TtOTqZz1xSTSWnqsbWwbptafNyvf/ACquZUXV3DANr5BDIwNYe1mN42w=="; | |
|
6797 | version = "1.3.0"; | |
|
6798 | src = fetchurl { | |
|
6799 | url = "https://registry.npmjs.org/string_decoder/-/string_decoder-1.3.0.tgz"; | |
|
6800 | sha512 = "hkRX8U1WjJFd8LsDJ2yQ/wWWxaopEsABU1XfkM8A+j0+85JAGppt16cr1Whg6KIbb4okU6Mql6BOj+uup/wKeA=="; | |
|
6693 | 6801 | }; |
|
6694 | 6802 | }; |
|
6695 | 6803 | "stringstream-0.0.6" = { |
@@ -6836,13 +6944,13 b' let' | |||
|
6836 | 6944 | sha512 = "/mrRod8xqpA+IHSLyGCQ2s8SPHiCDEeQJSep1jqLYeEUClOFG2Qsh+4FU6G9VeqpZnGW/Su8LQGc4YKni5rYSQ=="; |
|
6837 | 6945 | }; |
|
6838 | 6946 | }; |
|
6839 | "timers-browserify-2.0.1 |
|
|
6947 | "timers-browserify-2.0.11" = { | |
|
6840 | 6948 | name = "timers-browserify"; |
|
6841 | 6949 | packageName = "timers-browserify"; |
|
6842 | version = "2.0.1 |
|
|
6843 | src = fetchurl { | |
|
6844 | url = "https://registry.npmjs.org/timers-browserify/-/timers-browserify-2.0.1 |
|
|
6845 | sha512 = "YvC1SV1XdOUaL6gx5CoGroT3Gu49pK9+TZ38ErPldOWW4j49GI1HKs9DV+KGq/w6y+LZ72W1c8cKz2vzY+qpzg=="; | |
|
6950 | version = "2.0.11"; | |
|
6951 | src = fetchurl { | |
|
6952 | url = "https://registry.npmjs.org/timers-browserify/-/timers-browserify-2.0.11.tgz"; | |
|
6953 | sha512 = "60aV6sgJ5YEbzUdn9c8kYGIqOubPoUdqQCul3SBAsRCZ40s6Y5cMcrW4dt3/k/EsbLVJNl9n6Vz3fTc+k2GeKQ=="; | |
|
6846 | 6954 | }; |
|
6847 | 6955 | }; |
|
6848 | 6956 | "tiny-emitter-2.1.0" = { |
@@ -6944,13 +7052,13 b' let' | |||
|
6944 | 7052 | sha1 = "30c6203e1e66b841a88701ed8858f1725d94b026"; |
|
6945 | 7053 | }; |
|
6946 | 7054 | }; |
|
6947 | "tslib-1. |
|
|
7055 | "tslib-1.10.0" = { | |
|
6948 | 7056 | name = "tslib"; |
|
6949 | 7057 | packageName = "tslib"; |
|
6950 | version = "1. |
|
|
6951 | src = fetchurl { | |
|
6952 | url = "https://registry.npmjs.org/tslib/-/tslib-1. |
|
|
6953 | sha512 = "4krF8scpejhaOgqzBEcGM7yDIEfi0/8+8zDRZhNZZ2kjmHJ4hv3zCbQWxoJGz1iw5U0Jl0nma13xzHXcncMavQ=="; | |
|
7058 | version = "1.10.0"; | |
|
7059 | src = fetchurl { | |
|
7060 | url = "https://registry.npmjs.org/tslib/-/tslib-1.10.0.tgz"; | |
|
7061 | sha512 = "qOebF53frne81cf0S9B41ByenJ3/IuH8yJKngAX35CmiZySA0khhkovshKK+jGCaMnVomla7gVlIcc3EvKPbTQ=="; | |
|
6954 | 7062 | }; |
|
6955 | 7063 | }; |
|
6956 | 7064 | "tty-browserify-0.0.0" = { |
@@ -7016,6 +7124,15 b' let' | |||
|
7016 | 7124 | sha512 = "Y2VsbPVs0FIshJztycsO2SfPk7/KAF/T72qzv9u5EpQ4kB2hQoHlhNQTsNyy6ul7lQtqJN/AoWeS23OzEiEFxw=="; |
|
7017 | 7125 | }; |
|
7018 | 7126 | }; |
|
7127 | "uglify-js-3.6.7" = { | |
|
7128 | name = "uglify-js"; | |
|
7129 | packageName = "uglify-js"; | |
|
7130 | version = "3.6.7"; | |
|
7131 | src = fetchurl { | |
|
7132 | url = "https://registry.npmjs.org/uglify-js/-/uglify-js-3.6.7.tgz"; | |
|
7133 | sha512 = "4sXQDzmdnoXiO+xvmTzQsfIiwrjUCSA95rSP4SEd8tDb51W2TiDOlL76Hl+Kw0Ie42PSItCW8/t6pBNCF2R48A=="; | |
|
7134 | }; | |
|
7135 | }; | |
|
7019 | 7136 | "uglify-to-browserify-1.0.2" = { |
|
7020 | 7137 | name = "uglify-to-browserify"; |
|
7021 | 7138 | packageName = "uglify-to-browserify"; |
@@ -7079,13 +7196,13 b' let' | |||
|
7079 | 7196 | sha1 = "8cdd8fbac4e2d2ea1e7e2e8097c42f442280f85b"; |
|
7080 | 7197 | }; |
|
7081 | 7198 | }; |
|
7082 | "union-value-1.0. |
|
|
7199 | "union-value-1.0.1" = { | |
|
7083 | 7200 | name = "union-value"; |
|
7084 | 7201 | packageName = "union-value"; |
|
7085 | version = "1.0. |
|
|
7086 | src = fetchurl { | |
|
7087 | url = "https://registry.npmjs.org/union-value/-/union-value-1.0. |
|
|
7088 | sha1 = "5c71c34cb5bad5dcebe3ea0cd08207ba5aa1aea4"; | |
|
7202 | version = "1.0.1"; | |
|
7203 | src = fetchurl { | |
|
7204 | url = "https://registry.npmjs.org/union-value/-/union-value-1.0.1.tgz"; | |
|
7205 | sha512 = "tJfXmxMeWYnczCVs7XAEvIV7ieppALdyepWMkHkwciRpZraG/xwT+s2JN8+pr1+8jCRf80FFzvr+MpQeeoF4Xg=="; | |
|
7089 | 7206 | }; |
|
7090 | 7207 | }; |
|
7091 | 7208 | "uniq-1.0.1" = { |
@@ -7115,13 +7232,13 b' let' | |||
|
7115 | 7232 | sha512 = "Vmp0jIp2ln35UTXuryvjzkjGdRyf9b2lTXuSYUiPmzRcl3FDtYqAwOnTJkAngD9SWhnoJzDbTKwaOrZ+STtxNQ=="; |
|
7116 | 7233 | }; |
|
7117 | 7234 | }; |
|
7118 | "unique-slug-2.0. |
|
|
7235 | "unique-slug-2.0.2" = { | |
|
7119 | 7236 | name = "unique-slug"; |
|
7120 | 7237 | packageName = "unique-slug"; |
|
7121 | version = "2.0. |
|
|
7122 | src = fetchurl { | |
|
7123 | url = "https://registry.npmjs.org/unique-slug/-/unique-slug-2.0. |
|
|
7124 | sha512 = "n9cU6+gITaVu7VGj1Z8feKMmfAjEAQGhwD9fE3zvpRRa0wEIx8ODYkVGfSc94M2OX00tUFV8wH3zYbm1I8mxFg=="; | |
|
7238 | version = "2.0.2"; | |
|
7239 | src = fetchurl { | |
|
7240 | url = "https://registry.npmjs.org/unique-slug/-/unique-slug-2.0.2.tgz"; | |
|
7241 | sha512 = "zoWr9ObaxALD3DOPfjPSqxt4fnZiWblxHIgeWqW8x7UqDzEtHEQLzji2cuJYQFCU6KmoJikOYAZlrTHHebjx2w=="; | |
|
7125 | 7242 | }; |
|
7126 | 7243 | }; |
|
7127 | 7244 | "unset-value-1.0.0" = { |
@@ -7133,13 +7250,13 b' let' | |||
|
7133 | 7250 | sha1 = "8376873f7d2335179ffb1e6fc3a8ed0dfc8ab559"; |
|
7134 | 7251 | }; |
|
7135 | 7252 | }; |
|
7136 | "upath-1. |
|
|
7253 | "upath-1.2.0" = { | |
|
7137 | 7254 | name = "upath"; |
|
7138 | 7255 | packageName = "upath"; |
|
7139 | version = "1. |
|
|
7140 | src = fetchurl { | |
|
7141 | url = "https://registry.npmjs.org/upath/-/upath-1. |
|
|
7142 | sha512 = "kXpym8nmDmlCBr7nKdIx8P2jNBa+pBpIUFRnKJ4dr8htyYGJFokkr2ZvERRtUN+9SY+JqXouNgUPtv6JQva/2Q=="; | |
|
7256 | version = "1.2.0"; | |
|
7257 | src = fetchurl { | |
|
7258 | url = "https://registry.npmjs.org/upath/-/upath-1.2.0.tgz"; | |
|
7259 | sha512 = "aZwGpamFO61g3OlfT7OQCHqhGnW43ieH9WZeP7QxN/G/jS4jfqUkZxoryvJgVPEcrl5NL/ggHsSmLMHuH64Lhg=="; | |
|
7143 | 7260 | }; |
|
7144 | 7261 | }; |
|
7145 | 7262 | "upper-case-1.1.3" = { |
@@ -7160,6 +7277,15 b' let' | |||
|
7160 | 7277 | sha512 = "KY9Frmirql91X2Qgjry0Wd4Y+YTdrdZheS8TFwvkbLWf/G5KNJDCh6pKL5OZctEW4+0Baa5idK2ZQuELRwPznQ=="; |
|
7161 | 7278 | }; |
|
7162 | 7279 | }; |
|
7280 | "uri-path-1.0.0" = { | |
|
7281 | name = "uri-path"; | |
|
7282 | packageName = "uri-path"; | |
|
7283 | version = "1.0.0"; | |
|
7284 | src = fetchurl { | |
|
7285 | url = "https://registry.npmjs.org/uri-path/-/uri-path-1.0.0.tgz"; | |
|
7286 | sha1 = "9747f018358933c31de0fccfd82d138e67262e32"; | |
|
7287 | }; | |
|
7288 | }; | |
|
7163 | 7289 | "urix-0.1.0" = { |
|
7164 | 7290 | name = "urix"; |
|
7165 | 7291 | packageName = "urix"; |
@@ -7232,22 +7358,22 b' let' | |||
|
7232 | 7358 | sha1 = "8a16a05d445657a3aea5eecc5b12a4fa5379772c"; |
|
7233 | 7359 | }; |
|
7234 | 7360 | }; |
|
7235 | "uuid-3.3. |
|
|
7361 | "uuid-3.3.3" = { | |
|
7236 | 7362 | name = "uuid"; |
|
7237 | 7363 | packageName = "uuid"; |
|
7238 | version = "3.3. |
|
|
7239 | src = fetchurl { | |
|
7240 | url = "https://registry.npmjs.org/uuid/-/uuid-3.3. |
|
|
7241 | sha512 = "yXJmeNaw3DnnKAOKJE51sL/ZaYfWJRl1pK9dr19YFCu0ObS231AB1/LbqTKRAQ5kw8A90rA6fr4riOUpTZvQZA=="; | |
|
7242 | }; | |
|
7243 | }; | |
|
7244 | "v8-compile-cache-2.0 |
|
|
7364 | version = "3.3.3"; | |
|
7365 | src = fetchurl { | |
|
7366 | url = "https://registry.npmjs.org/uuid/-/uuid-3.3.3.tgz"; | |
|
7367 | sha512 = "pW0No1RGHgzlpHJO1nsVrHKpOEIxkGg1xB+v0ZmdNH5OAeAwzAVrCnI2/6Mtx+Uys6iaylxa+D3g4j63IKKjSQ=="; | |
|
7368 | }; | |
|
7369 | }; | |
|
7370 | "v8-compile-cache-2.1.0" = { | |
|
7245 | 7371 | name = "v8-compile-cache"; |
|
7246 | 7372 | packageName = "v8-compile-cache"; |
|
7247 | version = "2.0 |
|
|
7248 | src = fetchurl { | |
|
7249 | url = "https://registry.npmjs.org/v8-compile-cache/-/v8-compile-cache-2.0 |
|
|
7250 | sha512 = "CNmdbwQMBjwr9Gsmohvm0pbL954tJrNzf6gWL3K+QMQf00PF7ERGrEiLgjuU3mKreLC2MeGhUsNV9ybTbLgd3w=="; | |
|
7373 | version = "2.1.0"; | |
|
7374 | src = fetchurl { | |
|
7375 | url = "https://registry.npmjs.org/v8-compile-cache/-/v8-compile-cache-2.1.0.tgz"; | |
|
7376 | sha512 = "usZBT3PW+LOjM25wbqIlZwPeJV+3OSz3M1k1Ws8snlW39dZyYL9lOGC5FgPVHfk0jKmjiDV8Z0mIbVQPiwFs7g=="; | |
|
7251 | 7377 | }; |
|
7252 | 7378 | }; |
|
7253 | 7379 | "v8flags-3.1.3" = { |
@@ -7277,13 +7403,13 b' let' | |||
|
7277 | 7403 | sha1 = "3a105ca17053af55d6e270c1f8288682e18da400"; |
|
7278 | 7404 | }; |
|
7279 | 7405 | }; |
|
7280 | "vm-browserify- |
|
|
7406 | "vm-browserify-1.1.0" = { | |
|
7281 | 7407 | name = "vm-browserify"; |
|
7282 | 7408 | packageName = "vm-browserify"; |
|
7283 | version = " |
|
|
7284 | src = fetchurl { | |
|
7285 | url = "https://registry.npmjs.org/vm-browserify/-/vm-browserify- |
|
|
7286 | sha1 = "5d7ea45bbef9e4a6ff65f95438e0a87c357d5a73"; | |
|
7409 | version = "1.1.0"; | |
|
7410 | src = fetchurl { | |
|
7411 | url = "https://registry.npmjs.org/vm-browserify/-/vm-browserify-1.1.0.tgz"; | |
|
7412 | sha512 = "iq+S7vZJE60yejDYM0ek6zg308+UZsdtPExWP9VZoCFCz1zkJoXFnAX7aZfd/ZwrkidzdUZL0C/ryW+JwAiIGw=="; | |
|
7287 | 7413 | }; |
|
7288 | 7414 | }; |
|
7289 | 7415 | "watchpack-1.6.0" = { |
@@ -7331,13 +7457,13 b' let' | |||
|
7331 | 7457 | sha1 = "fc571588c8558da77be9efb6debdc5a3b172bdc2"; |
|
7332 | 7458 | }; |
|
7333 | 7459 | }; |
|
7334 | "webpack-sources-1.3 |
|
|
7460 | "webpack-sources-1.4.3" = { | |
|
7335 | 7461 | name = "webpack-sources"; |
|
7336 | 7462 | packageName = "webpack-sources"; |
|
7337 | version = "1.3 |
|
|
7338 | src = fetchurl { | |
|
7339 | url = "https://registry.npmjs.org/webpack-sources/-/webpack-sources-1.3 |
|
|
7340 | sha512 = "OiVgSrbGu7NEnEvQJJgdSFPl2qWKkWq5lHMhgiToIiN9w34EBnjYzSYs+VbL5KoYiLNtFFa7BZIKxRED3I32pA=="; | |
|
7463 | version = "1.4.3"; | |
|
7464 | src = fetchurl { | |
|
7465 | url = "https://registry.npmjs.org/webpack-sources/-/webpack-sources-1.4.3.tgz"; | |
|
7466 | sha512 = "lgTS3Xhv1lCOKo7SA5TjKXMjpSM4sBjNV5+q2bqesbSPs5FjGmU6jjtBSkX9b4qW87vDIsCIlUPOEhbZrMdjeQ=="; | |
|
7341 | 7467 | }; |
|
7342 | 7468 | }; |
|
7343 | 7469 | "webpack-uglify-js-plugin-1.1.9" = { |
@@ -7430,13 +7556,13 b' let' | |||
|
7430 | 7556 | sha1 = "b5243d8f3ec1aa35f1364605bc0d1036e30ab69f"; |
|
7431 | 7557 | }; |
|
7432 | 7558 | }; |
|
7433 | "xtend-4.0. |
|
|
7559 | "xtend-4.0.2" = { | |
|
7434 | 7560 | name = "xtend"; |
|
7435 | 7561 | packageName = "xtend"; |
|
7436 | version = "4.0. |
|
|
7437 | src = fetchurl { | |
|
7438 | url = "https://registry.npmjs.org/xtend/-/xtend-4.0. |
|
|
7439 | sha1 = "a5c6d532be656e23db820efb943a1f04998d63af"; | |
|
7562 | version = "4.0.2"; | |
|
7563 | src = fetchurl { | |
|
7564 | url = "https://registry.npmjs.org/xtend/-/xtend-4.0.2.tgz"; | |
|
7565 | sha512 = "LKYU1iAXJXUgAXn9URjiu+MWhyUXHsvfp7mcuYm9dSUKK0/CjtrUwFAxD82/mCWbtLsGjFIad0wIsod4zrTAEQ=="; | |
|
7440 | 7566 | }; |
|
7441 | 7567 | }; |
|
7442 | 7568 | "y18n-4.0.0" = { |
@@ -7514,9 +7640,9 b' let' | |||
|
7514 | 7640 | sources."@polymer/paper-toast-3.0.1" |
|
7515 | 7641 | sources."@polymer/paper-toggle-button-3.0.1" |
|
7516 | 7642 | sources."@polymer/paper-tooltip-3.0.1" |
|
7517 | sources."@polymer/polymer-3. |
|
|
7643 | sources."@polymer/polymer-3.3.0" | |
|
7518 | 7644 | sources."@types/clone-0.1.30" |
|
7519 | sources."@types/node-6.14. |
|
|
7645 | sources."@types/node-6.14.9" | |
|
7520 | 7646 | sources."@types/parse5-2.2.34" |
|
7521 | 7647 | sources."@webassemblyjs/ast-1.7.10" |
|
7522 | 7648 | sources."@webassemblyjs/floating-point-hex-parser-1.7.10" |
@@ -7536,8 +7662,8 b' let' | |||
|
7536 | 7662 | sources."@webassemblyjs/wasm-parser-1.7.10" |
|
7537 | 7663 | sources."@webassemblyjs/wast-parser-1.7.10" |
|
7538 | 7664 | sources."@webassemblyjs/wast-printer-1.7.10" |
|
7539 | sources."@webcomponents/shadycss-1.9. |
|
|
7540 | sources."@webcomponents/webcomponentsjs-2. |
|
|
7665 | sources."@webcomponents/shadycss-1.9.2" | |
|
7666 | sources."@webcomponents/webcomponentsjs-2.3.0" | |
|
7541 | 7667 | sources."@xtuc/ieee754-1.2.0" |
|
7542 | 7668 | sources."@xtuc/long-4.2.1" |
|
7543 | 7669 | sources."abbrev-1.1.1" |
@@ -7549,7 +7675,7 b' let' | |||
|
7549 | 7675 | ]; |
|
7550 | 7676 | }) |
|
7551 | 7677 | sources."ajv-4.11.8" |
|
7552 | sources."ajv-keywords-3.4. |
|
|
7678 | sources."ajv-keywords-3.4.1" | |
|
7553 | 7679 | (sources."align-text-0.1.4" // { |
|
7554 | 7680 | dependencies = [ |
|
7555 | 7681 | sources."kind-of-3.2.2" |
@@ -7615,20 +7741,20 b' let' | |||
|
7615 | 7741 | (sources."babel-core-6.26.3" // { |
|
7616 | 7742 | dependencies = [ |
|
7617 | 7743 | sources."json5-0.5.1" |
|
7618 | sources."lodash-4.17.1 |
|
|
7744 | sources."lodash-4.17.15" | |
|
7619 | 7745 | sources."minimatch-3.0.4" |
|
7620 | 7746 | ]; |
|
7621 | 7747 | }) |
|
7622 | 7748 | (sources."babel-generator-6.26.1" // { |
|
7623 | 7749 | dependencies = [ |
|
7624 | sources."lodash-4.17.1 |
|
|
7750 | sources."lodash-4.17.15" | |
|
7625 | 7751 | ]; |
|
7626 | 7752 | }) |
|
7627 | 7753 | sources."babel-helper-builder-binary-assignment-operator-visitor-6.24.1" |
|
7628 | 7754 | sources."babel-helper-call-delegate-6.24.1" |
|
7629 | 7755 | (sources."babel-helper-define-map-6.26.0" // { |
|
7630 | 7756 | dependencies = [ |
|
7631 | sources."lodash-4.17.1 |
|
|
7757 | sources."lodash-4.17.15" | |
|
7632 | 7758 | ]; |
|
7633 | 7759 | }) |
|
7634 | 7760 | sources."babel-helper-explode-assignable-expression-6.24.1" |
@@ -7638,7 +7764,7 b' let' | |||
|
7638 | 7764 | sources."babel-helper-optimise-call-expression-6.24.1" |
|
7639 | 7765 | (sources."babel-helper-regex-6.26.0" // { |
|
7640 | 7766 | dependencies = [ |
|
7641 | sources."lodash-4.17.1 |
|
|
7767 | sources."lodash-4.17.15" | |
|
7642 | 7768 | ]; |
|
7643 | 7769 | }) |
|
7644 | 7770 | sources."babel-helper-remap-async-to-generator-6.24.1" |
@@ -7656,7 +7782,7 b' let' | |||
|
7656 | 7782 | sources."babel-plugin-transform-es2015-block-scoped-functions-6.22.0" |
|
7657 | 7783 | (sources."babel-plugin-transform-es2015-block-scoping-6.26.0" // { |
|
7658 | 7784 | dependencies = [ |
|
7659 | sources."lodash-4.17.1 |
|
|
7785 | sources."lodash-4.17.15" | |
|
7660 | 7786 | ]; |
|
7661 | 7787 | }) |
|
7662 | 7788 | sources."babel-plugin-transform-es2015-classes-6.24.1" |
@@ -7685,23 +7811,23 b' let' | |||
|
7685 | 7811 | sources."babel-preset-env-1.7.0" |
|
7686 | 7812 | (sources."babel-register-6.26.0" // { |
|
7687 | 7813 | dependencies = [ |
|
7688 | sources."lodash-4.17.1 |
|
|
7814 | sources."lodash-4.17.15" | |
|
7689 | 7815 | ]; |
|
7690 | 7816 | }) |
|
7691 | 7817 | sources."babel-runtime-6.26.0" |
|
7692 | 7818 | (sources."babel-template-6.26.0" // { |
|
7693 | 7819 | dependencies = [ |
|
7694 | sources."lodash-4.17.1 |
|
|
7820 | sources."lodash-4.17.15" | |
|
7695 | 7821 | ]; |
|
7696 | 7822 | }) |
|
7697 | 7823 | (sources."babel-traverse-6.26.0" // { |
|
7698 | 7824 | dependencies = [ |
|
7699 | sources."lodash-4.17.1 |
|
|
7825 | sources."lodash-4.17.15" | |
|
7700 | 7826 | ]; |
|
7701 | 7827 | }) |
|
7702 | 7828 | (sources."babel-types-6.26.0" // { |
|
7703 | 7829 | dependencies = [ |
|
7704 | sources."lodash-4.17.1 |
|
|
7830 | sources."lodash-4.17.15" | |
|
7705 | 7831 | ]; |
|
7706 | 7832 | }) |
|
7707 | 7833 | sources."babylon-6.18.0" |
@@ -7711,11 +7837,11 b' let' | |||
|
7711 | 7837 | sources."define-property-1.0.0" |
|
7712 | 7838 | ]; |
|
7713 | 7839 | }) |
|
7714 | sources."base64-js-1.3. |
|
|
7840 | sources."base64-js-1.3.1" | |
|
7715 | 7841 | sources."bcrypt-pbkdf-1.0.2" |
|
7716 | 7842 | sources."big.js-5.2.2" |
|
7717 | 7843 | sources."binary-extensions-1.13.1" |
|
7718 | sources."bluebird-3. |
|
|
7844 | sources."bluebird-3.7.1" | |
|
7719 | 7845 | sources."bn.js-4.11.8" |
|
7720 | 7846 | sources."boolbase-1.0.0" |
|
7721 | 7847 | sources."boom-2.10.1" |
@@ -7739,11 +7865,11 b' let' | |||
|
7739 | 7865 | sources."builtin-status-codes-3.0.0" |
|
7740 | 7866 | (sources."cacache-10.0.4" // { |
|
7741 | 7867 | dependencies = [ |
|
7742 | sources."glob-7.1. |
|
|
7743 | sources."graceful-fs-4. |
|
|
7868 | sources."glob-7.1.5" | |
|
7869 | sources."graceful-fs-4.2.3" | |
|
7744 | 7870 | sources."lru-cache-4.1.5" |
|
7745 | 7871 | sources."minimatch-3.0.4" |
|
7746 | sources."rimraf-2. |
|
|
7872 | sources."rimraf-2.7.1" | |
|
7747 | 7873 | ]; |
|
7748 | 7874 | }) |
|
7749 | 7875 | sources."cache-base-1.0.1" |
@@ -7754,18 +7880,18 b' let' | |||
|
7754 | 7880 | sources."browserslist-1.7.7" |
|
7755 | 7881 | ]; |
|
7756 | 7882 | }) |
|
7757 | sources."caniuse-db-1.0.30000 |
|
|
7758 | sources."caniuse-lite-1.0.30000 |
|
|
7883 | sources."caniuse-db-1.0.30001006" | |
|
7884 | sources."caniuse-lite-1.0.30001006" | |
|
7759 | 7885 | sources."caseless-0.12.0" |
|
7760 | 7886 | sources."center-align-0.1.3" |
|
7761 | 7887 | sources."chalk-0.5.1" |
|
7762 | (sources."chokidar-2.1. |
|
|
7888 | (sources."chokidar-2.1.8" // { | |
|
7763 | 7889 | dependencies = [ |
|
7764 | 7890 | sources."is-glob-4.0.1" |
|
7765 | 7891 | ]; |
|
7766 | 7892 | }) |
|
7767 | sources."chownr-1.1. |
|
|
7768 | sources."chrome-trace-event-1.0. |
|
|
7893 | sources."chownr-1.1.3" | |
|
7894 | sources."chrome-trace-event-1.0.2" | |
|
7769 | 7895 | sources."cipher-base-1.0.4" |
|
7770 | 7896 | (sources."clap-1.2.3" // { |
|
7771 | 7897 | dependencies = [ |
@@ -7801,7 +7927,7 b' let' | |||
|
7801 | 7927 | }) |
|
7802 | 7928 | (sources."cli-1.0.1" // { |
|
7803 | 7929 | dependencies = [ |
|
7804 |
sources."glob-7.1. |
|
|
7930 | sources."glob-7.1.5" | |
|
7805 | 7931 | sources."minimatch-3.0.4" |
|
7806 | 7932 | ]; |
|
7807 | 7933 | }) |
@@ -7825,24 +7951,29 b' let' | |||
|
7825 | 7951 | sources."colormin-1.1.2" |
|
7826 | 7952 | sources."colors-0.6.2" |
|
7827 | 7953 | sources."combined-stream-1.0.8" |
|
7828 | sources."commander-2. |
|
|
7954 | sources."commander-2.20.3" | |
|
7829 | 7955 | sources."commondir-1.0.1" |
|
7830 | 7956 | sources."component-emitter-1.3.0" |
|
7831 | 7957 | sources."concat-map-0.0.1" |
|
7832 | 7958 | (sources."concat-stream-1.6.2" // { |
|
7833 | 7959 | dependencies = [ |
|
7834 | 7960 | sources."readable-stream-2.3.6" |
|
7961 | sources."safe-buffer-5.1.2" | |
|
7835 | 7962 | sources."string_decoder-1.1.1" |
|
7836 | 7963 | ]; |
|
7837 | 7964 | }) |
|
7838 | 7965 | sources."console-browserify-1.1.0" |
|
7839 | 7966 | sources."constants-browserify-1.0.0" |
|
7840 | sources."convert-source-map-1.6.0" | |
|
7967 | (sources."convert-source-map-1.6.0" // { | |
|
7968 | dependencies = [ | |
|
7969 | sources."safe-buffer-5.1.2" | |
|
7970 | ]; | |
|
7971 | }) | |
|
7841 | 7972 | (sources."copy-concurrently-1.0.5" // { |
|
7842 | 7973 | dependencies = [ |
|
7843 |
sources."glob-7.1. |
|
|
7974 | sources."glob-7.1.5" | |
|
7844 | 7975 | sources."minimatch-3.0.4" |
|
7845 |
sources."rimraf-2. |
|
|
7976 | sources."rimraf-2.7.1" | |
|
7846 | 7977 | ]; |
|
7847 | 7978 | }) |
|
7848 | 7979 | sources."copy-descriptor-0.1.1" |
@@ -7852,7 +7983,7 b' let' | |||
|
7852 | 7983 | sources."minimatch-3.0.4" |
|
7853 | 7984 | ]; |
|
7854 | 7985 | }) |
|
7855 | sources."core-js-2.6. |
|
|
7986 | sources."core-js-2.6.10" | |
|
7856 | 7987 | sources."core-util-is-1.0.2" |
|
7857 | 7988 | sources."create-ecdh-4.0.3" |
|
7858 | 7989 | sources."create-hash-1.2.0" |
@@ -7876,7 +8007,7 b' let' | |||
|
7876 | 8007 | sources."cssesc-0.1.0" |
|
7877 | 8008 | sources."cssnano-3.10.0" |
|
7878 | 8009 | sources."csso-2.3.2" |
|
7879 | sources."cyclist-0. |
|
|
8010 | sources."cyclist-1.0.1" | |
|
7880 | 8011 | (sources."dashdash-1.14.1" // { |
|
7881 | 8012 | dependencies = [ |
|
7882 | 8013 | sources."assert-plus-1.0.0" |
@@ -7899,9 +8030,10 b' let' | |||
|
7899 | 8030 | sources."diffie-hellman-5.0.3" |
|
7900 | 8031 | sources."dir-glob-2.2.2" |
|
7901 | 8032 | sources."dom-converter-0.2.0" |
|
7902 | (sources."dom-serializer-0. |
|
|
7903 | dependencies = [ | |
|
7904 |
sources."ent |
|
|
8033 | (sources."dom-serializer-0.2.1" // { | |
|
8034 | dependencies = [ | |
|
8035 | sources."domelementtype-2.0.1" | |
|
8036 | sources."entities-2.0.0" | |
|
7905 | 8037 | ]; |
|
7906 | 8038 | }) |
|
7907 | 8039 | (sources."dom5-2.3.0" // { |
@@ -7915,25 +8047,31 b' let' | |||
|
7915 | 8047 | sources."domhandler-2.3.0" |
|
7916 | 8048 | sources."domutils-1.5.1" |
|
7917 | 8049 | sources."dropzone-5.5.1" |
|
8050 | sources."duplexer-0.1.1" | |
|
7918 | 8051 | (sources."duplexify-3.7.1" // { |
|
7919 | 8052 | dependencies = [ |
|
7920 | 8053 | sources."readable-stream-2.3.6" |
|
8054 | sources."safe-buffer-5.1.2" | |
|
7921 | 8055 | sources."string_decoder-1.1.1" |
|
7922 | 8056 | ]; |
|
7923 | 8057 | }) |
|
7924 | 8058 | sources."ecc-jsbn-0.1.2" |
|
7925 | sources."electron-to-chromium-1.3. |
|
|
7926 | sources."elliptic-6. |
|
|
8059 | sources."electron-to-chromium-1.3.302" | |
|
8060 | sources."elliptic-6.5.1" | |
|
7927 | 8061 | sources."emojis-list-2.1.0" |
|
7928 | sources."end-of-stream-1.4. |
|
|
7929 | (sources."enhanced-resolve-4.1. |
|
|
7930 | dependencies = [ | |
|
7931 | sources."graceful-fs-4. |
|
|
8062 | sources."end-of-stream-1.4.4" | |
|
8063 | (sources."enhanced-resolve-4.1.1" // { | |
|
8064 | dependencies = [ | |
|
8065 | sources."graceful-fs-4.2.3" | |
|
8066 | sources."memory-fs-0.5.0" | |
|
8067 | sources."readable-stream-2.3.6" | |
|
8068 | sources."safe-buffer-5.1.2" | |
|
8069 | sources."string_decoder-1.1.1" | |
|
7932 | 8070 | ]; |
|
7933 | 8071 | }) |
|
7934 | 8072 | sources."entities-1.0.0" |
|
7935 | 8073 | sources."errno-0.1.7" |
|
7936 | sources."es-abstract-1.1 |
|
|
8074 | sources."es-abstract-1.16.0" | |
|
7937 | 8075 | sources."es-to-primitive-1.2.0" |
|
7938 | 8076 | sources."es6-templates-0.2.3" |
|
7939 | 8077 | sources."escape-string-regexp-1.0.5" |
@@ -7941,8 +8079,8 b' let' | |||
|
7941 | 8079 | sources."espree-3.5.4" |
|
7942 | 8080 | sources."esprima-1.0.4" |
|
7943 | 8081 | sources."esrecurse-4.2.1" |
|
7944 |
sources."estraverse-4. |
|
|
7945 |
sources."esutils-2.0. |
|
|
8082 | sources."estraverse-4.3.0" | |
|
8083 | sources."esutils-2.0.3" | |
|
7946 | 8084 | sources."eventemitter2-0.4.14" |
|
7947 | 8085 | sources."events-3.0.0" |
|
7948 | 8086 | sources."evp_bytestokey-1.0.3" |
@@ -7986,6 +8124,7 b' let' | |||
|
7986 | 8124 | sources."fastparse-1.1.2" |
|
7987 | 8125 | sources."favico.js-0.3.10" |
|
7988 | 8126 | sources."faye-websocket-0.4.4" |
|
8127 | sources."figures-1.7.0" | |
|
7989 | 8128 | sources."file-sync-cmp-0.1.1" |
|
7990 | 8129 | (sources."fill-range-4.0.0" // { |
|
7991 | 8130 | dependencies = [ |
@@ -8003,10 +8142,11 b' let' | |||
|
8003 | 8142 | }) |
|
8004 | 8143 | sources."fined-1.2.0" |
|
8005 | 8144 | sources."flagged-respawn-1.0.1" |
|
8006 |
sources."flatten-1.0. |
|
|
8145 | sources."flatten-1.0.3" | |
|
8007 | 8146 | (sources."flush-write-stream-1.1.1" // { |
|
8008 | 8147 | dependencies = [ |
|
8009 | 8148 | sources."readable-stream-2.3.6" |
|
8149 | sources."safe-buffer-5.1.2" | |
|
8010 | 8150 | sources."string_decoder-1.1.1" |
|
8011 | 8151 | ]; |
|
8012 | 8152 | }) |
@@ -8018,12 +8158,13 b' let' | |||
|
8018 | 8158 | (sources."from2-2.3.0" // { |
|
8019 | 8159 | dependencies = [ |
|
8020 | 8160 | sources."readable-stream-2.3.6" |
|
8161 | sources."safe-buffer-5.1.2" | |
|
8021 | 8162 | sources."string_decoder-1.1.1" |
|
8022 | 8163 | ]; |
|
8023 | 8164 | }) |
|
8024 | 8165 | (sources."fs-write-stream-atomic-1.0.10" // { |
|
8025 | 8166 | dependencies = [ |
|
8026 |
sources."graceful-fs-4. |
|
|
8167 | sources."graceful-fs-4.2.3" | |
|
8027 | 8168 | ]; |
|
8028 | 8169 | }) |
|
8029 | 8170 | sources."fs.realpath-1.0.0" |
@@ -8059,7 +8200,7 b' let' | |||
|
8059 | 8200 | sources."globals-9.18.0" |
|
8060 | 8201 | (sources."globby-7.1.1" // { |
|
8061 | 8202 | dependencies = [ |
|
8062 |
sources."glob-7.1. |
|
|
8203 | sources."glob-7.1.5" | |
|
8063 | 8204 | sources."minimatch-3.0.4" |
|
8064 | 8205 | ]; |
|
8065 | 8206 | }) |
@@ -8094,7 +8235,7 b' let' | |||
|
8094 | 8235 | (sources."grunt-contrib-jshint-0.12.0" // { |
|
8095 | 8236 | dependencies = [ |
|
8096 | 8237 | sources."jshint-2.9.7" |
|
8097 |
sources."lodash-4.17.1 |
|
|
8238 | sources."lodash-4.17.15" | |
|
8098 | 8239 | sources."minimatch-3.0.4" |
|
8099 | 8240 | ]; |
|
8100 | 8241 | }) |
@@ -8102,14 +8243,21 b' let' | |||
|
8102 | 8243 | dependencies = [ |
|
8103 | 8244 | sources."ansi-regex-2.1.1" |
|
8104 | 8245 | sources."ansi-styles-2.2.1" |
|
8105 |
sources."async-2.6. |
|
|
8246 | sources."async-2.6.3" | |
|
8106 | 8247 | sources."chalk-1.1.3" |
|
8107 | 8248 | sources."has-ansi-2.0.0" |
|
8108 |
sources."lodash-4.17.1 |
|
|
8249 | sources."lodash-4.17.15" | |
|
8109 | 8250 | sources."strip-ansi-3.0.1" |
|
8110 | 8251 | sources."supports-color-2.0.0" |
|
8111 | 8252 | ]; |
|
8112 | 8253 | }) |
|
8254 | (sources."grunt-contrib-uglify-4.0.1" // { | |
|
8255 | dependencies = [ | |
|
8256 | sources."ansi-styles-3.2.1" | |
|
8257 | sources."chalk-2.4.2" | |
|
8258 | sources."supports-color-5.5.0" | |
|
8259 | ]; | |
|
8260 | }) | |
|
8113 | 8261 | (sources."grunt-contrib-watch-0.6.1" // { |
|
8114 | 8262 | dependencies = [ |
|
8115 | 8263 | sources."async-0.2.10" |
@@ -8132,9 +8280,10 b' let' | |||
|
8132 | 8280 | sources."grunt-legacy-util-0.2.0" |
|
8133 | 8281 | (sources."grunt-webpack-3.1.3" // { |
|
8134 | 8282 | dependencies = [ |
|
8135 |
sources."lodash-4.17.1 |
|
|
8136 | ]; | |
|
8137 | }) | |
|
8283 | sources."lodash-4.17.15" | |
|
8284 | ]; | |
|
8285 | }) | |
|
8286 | sources."gzip-size-3.0.0" | |
|
8138 | 8287 | sources."har-schema-1.0.5" |
|
8139 | 8288 | sources."har-validator-4.2.1" |
|
8140 | 8289 | sources."has-1.0.3" |
@@ -8161,6 +8310,12 @@ let
 (sources."html-minifier-3.5.21" // {
 dependencies = [
 sources."commander-2.17.1"
+sources."source-map-0.6.1"
+(sources."uglify-js-3.4.10" // {
+dependencies = [
+sources."commander-2.19.0"
+];
+})
 ];
 })
 (sources."html-webpack-plugin-3.2.0" // {
@@ -8168,7 +8323,7 @@ let
 sources."big.js-3.2.0"
 sources."json5-0.5.1"
 sources."loader-utils-0.2.17"
-sources."lodash-4.17.1…
+sources."lodash-4.17.15"
 ];
 })
 sources."htmlparser2-3.8.3"
@@ -8193,7 +8348,7 @@ let
 dependencies = [
 sources."find-up-3.0.0"
 sources."locate-path-3.0.0"
-sources."p-limit-2.2.…
+sources."p-limit-2.2.1"
 sources."p-locate-3.0.0"
 sources."p-try-2.2.0"
 sources."pkg-dir-3.0.0"
@@ -8202,9 +8357,8 @@ let
 sources."imports-loader-0.7.1"
 sources."imurmurhash-0.1.4"
 sources."indexes-of-1.0.1"
-sources."indexof-0.0.1"
 sources."inflight-1.0.6"
-sources."inherits-2.0.…
+sources."inherits-2.0.4"
 sources."ini-1.3.5"
 sources."interpret-1.1.0"
 sources."invariant-2.2.4"
@@ -8250,7 +8404,7 @@ let
 sources."jsesc-1.3.0"
 (sources."jshint-2.10.2" // {
 dependencies = [
-sources."lodash-4.17.1…
+sources."lodash-4.17.15"
 sources."minimatch-3.0.4"
 ];
 })
@@ -8271,7 +8425,7 @@ let
 sources."lcid-2.0.0"
 (sources."less-2.7.3" // {
 dependencies = [
-sources."graceful-fs-4.…
+sources."graceful-fs-4.2.3"
 ];
 })
 (sources."liftoff-2.5.0" // {
@@ -8298,11 +8452,22 @@ let
 sources."map-visit-1.0.0"
 sources."mark.js-8.11.1"
 sources."math-expression-evaluator-1.2.17"
+(sources."maxmin-2.1.0" // {
+dependencies = [
+sources."ansi-regex-2.1.1"
+sources."ansi-styles-2.2.1"
+sources."chalk-1.1.3"
+sources."has-ansi-2.0.0"
+sources."strip-ansi-3.0.1"
+sources."supports-color-2.0.0"
+];
+})
 sources."md5.js-1.3.5"
 sources."mem-4.3.0"
 (sources."memory-fs-0.4.1" // {
 dependencies = [
 sources."readable-stream-2.3.6"
+sources."safe-buffer-5.1.2"
 sources."string_decoder-1.1.1"
 ];
 })
@@ -8317,7 +8482,7 @@ let
 sources."minimatch-0.2.14"
 sources."minimist-1.2.0"
 sources."mississippi-2.0.0"
-(sources."mixin-deep-1.3.…
+(sources."mixin-deep-1.3.2" // {
 dependencies = [
 sources."is-extendable-1.0.1"
 ];
@@ -8331,25 +8496,30 @@ let
 sources."mousetrap-1.6.3"
 (sources."move-concurrently-1.0.1" // {
 dependencies = [
-sources."glob-7.1.…
+sources."glob-7.1.5"
 sources."minimatch-3.0.4"
-sources."rimraf-2.…
+sources."rimraf-2.7.1"
 ];
 })
 sources."ms-2.0.0"
-sources."nan-2.1…
+sources."nan-2.14.0"
 sources."nanomatch-1.2.13"
 sources."neo-async-2.6.1"
 sources."nice-try-1.0.5"
 sources."no-case-2.3.2"
-(sources."node-libs-browser-2.2.…
+(sources."node-libs-browser-2.2.1" // {
 dependencies = [
 (sources."readable-stream-2.3.6" // {
 dependencies = [
 sources."string_decoder-1.1.1"
 ];
 })
-sources."s…
+sources."safe-buffer-5.1.2"
+(sources."string_decoder-1.3.0" // {
+dependencies = [
+sources."safe-buffer-5.2.0"
+];
+})
 ];
 })
 sources."nopt-1.0.10"
@@ -8380,6 +8550,7 @@ let
 sources."kind-of-3.2.2"
 ];
 })
+sources."object-inspect-1.6.0"
 sources."object-keys-1.1.1"
 sources."object-visit-1.0.1"
 sources."object.defaults-1.1.0"
@@ -8399,14 +8570,15 @@ let
 sources."p-locate-2.0.0"
 sources."p-try-1.0.0"
 sources."pako-1.0.10"
-(sources."parallel-transform-1.…
+(sources."parallel-transform-1.2.0" // {
 dependencies = [
 sources."readable-stream-2.3.6"
+sources."safe-buffer-5.1.2"
 sources."string_decoder-1.1.1"
 ];
 })
 sources."param-case-2.1.1"
-sources."parse-asn1-5.1.…
+sources."parse-asn1-5.1.5"
 sources."parse-filepath-1.0.2"
 sources."parse-passwd-1.0.0"
 sources."parse5-3.0.3"
@@ -8416,7 +8588,7 @@ let
 ];
 })
 sources."pascalcase-0.1.1"
-sources."path-browserify-0.0.…
+sources."path-browserify-0.0.1"
 sources."path-dirname-1.0.2"
 sources."path-exists-3.0.0"
 sources."path-is-absolute-1.0.1"
@@ -8527,10 +8699,11 @@ let
 sources."postcss-value-parser-3.3.1"
 sources."postcss-zindex-2.2.0"
 sources."prepend-http-1.0.4"
+sources."pretty-bytes-3.0.1"
 sources."pretty-error-2.1.1"
 sources."private-0.1.8"
 sources."process-0.11.10"
-sources."process-nextick-args-2.0.…
+sources."process-nextick-args-2.0.1"
 sources."promise-7.3.1"
 sources."promise-inflight-1.0.1"
 sources."prr-1.0.1"
@@ -8555,8 +8728,9 @@ let
 })
 (sources."readdirp-2.2.1" // {
 dependencies = [
-sources."graceful-fs-4.…
+sources."graceful-fs-4.2.3"
 sources."readable-stream-2.3.6"
+sources."safe-buffer-5.1.2"
 sources."string_decoder-1.1.1"
 ];
 })
@@ -8571,11 +8745,7 @@ let
 sources."balanced-match-0.4.2"
 ];
 })
-…
-dependencies = [
-sources."balanced-match-0.4.2"
-];
-})
+sources."reduce-function-call-1.0.3"
 sources."regenerate-1.4.0"
 sources."regenerator-runtime-0.11.1"
 sources."regenerator-transform-0.10.1"
@@ -8601,7 +8771,7 @@ let
 sources."request-2.81.0"
 sources."require-directory-2.1.1"
 sources."require-main-filename-1.0.1"
-sources."resolve-1.10…
+sources."resolve-1.12.0"
 sources."resolve-cwd-2.0.0"
 sources."resolve-dir-1.0.1"
 sources."resolve-from-3.0.0"
@@ -8611,20 +8781,20 @@ let
 sources."rimraf-2.2.8"
 sources."ripemd160-2.0.2"
 sources."run-queue-1.0.3"
-sources."safe-buffer-5.…
+sources."safe-buffer-5.2.0"
 sources."safe-regex-1.1.0"
 sources."safer-buffer-2.1.2"
 sources."sax-1.2.4"
 (sources."schema-utils-0.4.7" // {
 dependencies = [
-sources."ajv-6.10.…
+sources."ajv-6.10.2"
 ];
 })
 sources."select-1.1.2"
-sources."semver-5.7.…
-sources."serialize-javascript-1.…
+sources."semver-5.7.1"
+sources."serialize-javascript-1.9.1"
 sources."set-blocking-2.0.0"
-(sources."set-value-2.0.…
+(sources."set-value-2.0.1" // {
 dependencies = [
 sources."extend-shallow-2.0.1"
 ];
@@ -8701,6 +8871,7 @@ let
 (sources."stream-browserify-2.0.2" // {
 dependencies = [
 sources."readable-stream-2.3.6"
+sources."safe-buffer-5.1.2"
 sources."string_decoder-1.1.1"
 ];
 })
@@ -8708,6 +8879,7 @@ let
 (sources."stream-http-2.8.3" // {
 dependencies = [
 sources."readable-stream-2.3.6"
+sources."safe-buffer-5.1.2"
 sources."string_decoder-1.1.1"
 ];
 })
@@ -8720,6 +8892,8 @@ let
 sources."strip-ansi-4.0.0"
 ];
 })
+sources."string.prototype.trimleft-2.1.0"
+sources."string.prototype.trimright-2.1.0"
 sources."string_decoder-0.10.31"
 sources."stringstream-0.0.6"
 sources."strip-ansi-0.3.0"
@@ -8740,10 +8914,11 @@ let
 (sources."through2-2.0.5" // {
 dependencies = [
 sources."readable-stream-2.3.6"
+sources."safe-buffer-5.1.2"
 sources."string_decoder-1.1.1"
 ];
 })
-sources."timers-browserify-2.0.1…
+sources."timers-browserify-2.0.11"
 sources."tiny-emitter-2.1.0"
 (sources."tiny-lr-fork-0.0.5" // {
 dependencies = [
@@ -8766,27 +8941,27 @@ let
 (sources."ts-loader-1.3.3" // {
 dependencies = [
 sources."big.js-3.2.0"
-sources."colors-1.…
+sources."colors-1.4.0"
 sources."enhanced-resolve-3.4.1"
-sources."graceful-fs-4.…
+sources."graceful-fs-4.2.3"
 sources."json5-0.5.1"
 sources."loader-utils-0.2.17"
 sources."tapable-0.2.9"
 ];
 })
-sources."tslib-1.…
+sources."tslib-1.10.0"
 sources."tty-browserify-0.0.0"
 sources."tunnel-agent-0.6.0"
 sources."tweetnacl-0.14.5"
 sources."typedarray-0.0.6"
 (sources."uglify-es-3.3.10" // {
 dependencies = [
+sources."commander-2.14.1"
 sources."source-map-0.6.1"
 ];
 })
-(sources."uglify-js-3.…
-dependencies = [
-sources."commander-2.19.0"
+(sources."uglify-js-3.6.7" // {
+dependencies = [
 sources."source-map-0.6.1"
 ];
 })
@@ -8799,16 +8974,11 @@ let
 sources."unc-path-regex-0.1.2"
 sources."underscore-1.7.0"
 sources."underscore.string-2.2.1"
-…
-dependencies = [
-sources."extend-shallow-2.0.1"
-sources."set-value-0.4.3"
-];
-})
+sources."union-value-1.0.1"
 sources."uniq-1.0.1"
 sources."uniqs-2.0.0"
 sources."unique-filename-1.1.1"
-sources."unique-slug-2.0.…
+sources."unique-slug-2.0.2"
 (sources."unset-value-1.0.0" // {
 dependencies = [
 (sources."has-value-0.3.1" // {
@@ -8819,13 +8989,14 @@ let
 sources."has-values-0.1.4"
 ];
 })
-sources."upath-1.…
+sources."upath-1.2.0"
 sources."upper-case-1.1.3"
 (sources."uri-js-4.2.2" // {
 dependencies = [
 sources."punycode-2.1.1"
 ];
 })
+sources."uri-path-1.0.0"
 sources."urix-0.1.0"
 (sources."url-0.11.0" // {
 dependencies = [
@@ -8833,12 +9004,16 @@ let
 ];
 })
 sources."use-3.1.1"
-sources."util-0.11.1"
+(sources."util-0.11.1" // {
+dependencies = [
+sources."inherits-2.0.3"
+];
+})
 sources."util-deprecate-1.0.2"
 sources."util.promisify-1.0.0"
 sources."utila-0.4.0"
-sources."uuid-3.3.…
-sources."v8-compile-cache-2.0…
+sources."uuid-3.3.3"
+sources."v8-compile-cache-2.1.0"
 sources."v8flags-3.1.3"
 sources."vendors-1.0.3"
 (sources."verror-1.10.0" // {
@@ -8846,16 +9021,16 @@ let
 sources."assert-plus-1.0.0"
 ];
 })
-sources."vm-browserify-…
+sources."vm-browserify-1.1.0"
 (sources."watchpack-1.6.0" // {
 dependencies = [
-sources."graceful-fs-4.…
+sources."graceful-fs-4.2.3"
 ];
 })
 sources."waypoints-4.0.1"
 (sources."webpack-4.23.1" // {
 dependencies = [
-sources."ajv-6.10.…
+sources."ajv-6.10.2"
 ];
 })
 (sources."webpack-cli-3.1.2" // {
@@ -8871,7 +9046,7 @@ let
 sources."source-map-0.4.4"
 ];
 })
-(sources."webpack-sources-1.3…
+(sources."webpack-sources-1.4.3" // {
 dependencies = [
 sources."source-map-0.6.1"
 ];
@@ -8904,14 +9079,14 @@ let
 ];
 })
 sources."wrappy-1.0.2"
-sources."xtend-4.0.…
+sources."xtend-4.0.2"
 sources."y18n-4.0.0"
 sources."yallist-2.1.2"
 (sources."yargs-12.0.5" // {
 dependencies = [
 sources."find-up-3.0.0"
 sources."locate-path-3.0.0"
-sources."p-limit-2.2.…
+sources."p-limit-2.2.1"
 sources."p-locate-3.0.0"
 sources."p-try-2.2.0"
 ];
@@ -5,7 +5,7 @@
 
 self: super: {
 "alembic" = super.buildPythonPackage {
-name = "alembic-1.…
+name = "alembic-1.3.1";
 doCheck = false;
 propagatedBuildInputs = [
 self."sqlalchemy"
@@ -14,22 +14,22 @@ self: super: {
 self."python-dateutil"
 ];
 src = fetchurl {
-url = "https://files.pythonhosted.org/packages/6e…
-sha256 = "1dwl0264r6ri2jyrjr68am04x538ab26xwy4crqjnnhm4alwm3c2";
+url = "https://files.pythonhosted.org/packages/84/64/493c45119dce700a4b9eeecc436ef9e8835ab67bae6414f040cdc7b58f4b/alembic-1.3.1.tar.gz";
+sha256 = "1cl2chk5jx0rf4hmsd5lljic7iifw17yv3y5xawvp4i14jvpn9s9";
 };
 meta = {
 license = [ pkgs.lib.licenses.mit ];
 };
 };
 "amqp" = super.buildPythonPackage {
-name = "amqp-2.…
+name = "amqp-2.5.2";
 doCheck = false;
 propagatedBuildInputs = [
 self."vine"
 ];
 src = fetchurl {
-url = "https://files.pythonhosted.org/packages/…
-sha256 = "0wlfnvhmfrn7c8qif2jyvsm63ibdxp02ss564qwrvqfhz0di72s0";
+url = "https://files.pythonhosted.org/packages/92/1d/433541994a5a69f4ad2fff39746ddbb0bdedb0ea0d85673eb0db68a7edd9/amqp-2.5.2.tar.gz";
+sha256 = "13dhhfxjrqcjybnq4zahg92mydhpg2l76nxcmq7d560687wsxwbp";
 };
 meta = {
 license = [ pkgs.lib.licenses.bsdOriginal ];
@@ -63,33 +63,22 @@ self: super: {
 };
 };
 "atomicwrites" = super.buildPythonPackage {
-name = "atomicwrites-1.…
+name = "atomicwrites-1.3.0";
 doCheck = false;
 src = fetchurl {
-url = "https://files.pythonhosted.org/packages/ac…
-sha256 = "1vmkbw9j0qammwxbxycrs39gvdg4lc2d4lk98kwf8ag2manyi6pc";
+url = "https://files.pythonhosted.org/packages/ec/0f/cd484ac8820fed363b374af30049adc8fd13065720fd4f4c6be8a2309da7/atomicwrites-1.3.0.tar.gz";
+sha256 = "19ngcscdf3jsqmpcxn6zl5b6anmsajb6izp1smcd1n02midl9abm";
 };
 meta = {
 license = [ pkgs.lib.licenses.mit ];
 };
 };
 "attrs" = super.buildPythonPackage {
-name = "attrs-1…
+name = "attrs-19.3.0";
 doCheck = false;
 src = fetchurl {
-url = "https://files.pythonhosted.org/packages/…
-sha256 = "0s9ydh058wmmf5v391pym877x4ahxg45dw6a0w4c7s5wgpigdjqh";
-};
-meta = {
-license = [ pkgs.lib.licenses.mit ];
-};
-};
-"authomatic" = super.buildPythonPackage {
-name = "authomatic-0.1.0.post1";
-doCheck = false;
-src = fetchurl {
-url = "https://code.rhodecode.com/upstream/authomatic/artifacts/download/0-4fe9c041-a567-4f84-be4c-7efa2a606d3c.tar.gz?md5=f6bdc3c769688212db68233e8d2b0383";
-sha256 = "0pc716mva0ym6xd8jwzjbjp8dqxy9069wwwv2aqwb8lyhl4757ab";
+url = "https://files.pythonhosted.org/packages/98/c3/2c227e66b5e896e15ccdae2e00bbc69aa46e9a8ce8869cc5fa96310bf612/attrs-19.3.0.tar.gz";
+sha256 = "0wky4h28n7xnr6xv69p9z6kv8bzn50d10c3drmd9ds8gawbcxdzp";
 };
 meta = {
 license = [ pkgs.lib.licenses.mit ];
@@ -146,11 +135,11 @@ self: super: {
 };
 };
 "billiard" = super.buildPythonPackage {
-name = "billiard-3.…
+name = "billiard-3.6.1.0";
 doCheck = false;
 src = fetchurl {
-url = "https://files.pythonhosted.org/packages/…
-sha256 = "1riwiiwgb141151md4ykx49qrz749akj5k8g290ji9bsqjyj4yqx";
+url = "https://files.pythonhosted.org/packages/68/1d/2aea8fbb0b1e1260a8a2e77352de2983d36d7ac01207cf14c2b9c6cc860e/billiard-3.6.1.0.tar.gz";
+sha256 = "09hzy3aqi7visy4vmf4xiish61n0rq5nd3iwjydydps8yrs9r05q";
 };
 meta = {
 license = [ pkgs.lib.licenses.bsdOriginal ];
@@ -183,30 +172,42 @@ self: super: {
 };
 };
 "celery" = super.buildPythonPackage {
-name = "celery-4.…
+name = "celery-4.3.0";
 doCheck = false;
 propagatedBuildInputs = [
 self."pytz"
 self."billiard"
 self."kombu"
+self."vine"
 ];
 src = fetchurl {
-url = "https://files.pythonhosted.org/packages/e9…
-sha256 = "1xbir4vw42n2ir9lanhwl7w69zpmj7lbi66fxm2b7pyvkcss7wni";
+url = "https://files.pythonhosted.org/packages/a2/4b/d020836f751617e907e84753a41c92231cd4b673ff991b8ee9da52361323/celery-4.3.0.tar.gz";
+sha256 = "1y8y0gbgkwimpxqnxq2rm5qz2vy01fvjiybnpm00y5rzd2m34iac";
 };
 meta = {
 license = [ pkgs.lib.licenses.bsdOriginal ];
 };
 };
+"certifi" = super.buildPythonPackage {
+name = "certifi-2019.11.28";
+doCheck = false;
+src = fetchurl {
+url = "https://files.pythonhosted.org/packages/41/bf/9d214a5af07debc6acf7f3f257265618f1db242a3f8e49a9b516f24523a6/certifi-2019.11.28.tar.gz";
+sha256 = "07qg6864bk4qxa8akr967amlmsq9v310hp039mcpjx6dliylrdi5";
+};
+meta = {
+license = [ pkgs.lib.licenses.mpl20 { fullName = "Mozilla Public License 2.0 (MPL 2.0)"; } ];
+};
+};
 "cffi" = super.buildPythonPackage {
-name = "cffi-1.12.…
+name = "cffi-1.12.3";
 doCheck = false;
 propagatedBuildInputs = [
 self."pycparser"
 ];
 src = fetchurl {
-url = "https://files.pythonhosted.org/packages/64/7c/27367b38e6cc3e1f49f193deb761fe75cda9f95da37b67b422e62281fcac/cffi-1.12.2.tar.gz";
-sha256 = "19qfks2djya8vix95bmg3xzipjb8w9b8mbj4j5k2hqkc8j58f4z1";
+url = "https://files.pythonhosted.org/packages/93/1a/ab8c62b5838722f29f3daffcc8d4bd61844aa9b5f437341cc890ceee483b/cffi-1.12.3.tar.gz";
+sha256 = "0x075521fxwv0mfp4cqzk7lvmw4n94bjw601qkcv314z5s182704";
 };
 meta = {
 license = [ pkgs.lib.licenses.mit ];
@@ -243,6 +244,17 @@ self: super: {
 license = [ pkgs.lib.licenses.bsdOriginal ];
 };
 };
+"chardet" = super.buildPythonPackage {
+name = "chardet-3.0.4";
+doCheck = false;
+src = fetchurl {
+url = "https://files.pythonhosted.org/packages/fc/bb/a5768c230f9ddb03acc9ef3f0d4a3cf93462473795d18e9535498c8f929d/chardet-3.0.4.tar.gz";
+sha256 = "1bpalpia6r5x1kknbk11p1fzph56fmmnp405ds8icksd3knr5aw4";
+};
+meta = {
+license = [ { fullName = "LGPL"; } { fullName = "GNU Library or Lesser General Public License (LGPL)"; } ];
+};
+};
 "click" = super.buildPythonPackage {
 name = "click-7.0";
 doCheck = false;
@@ -285,16 +297,27 @@ self: super: {
 };
 };
 "configparser" = super.buildPythonPackage {
-name = "configparser-…
+name = "configparser-4.0.2";
 doCheck = false;
 src = fetchurl {
-url = "https://files.pythonhosted.org/packages/…
-sha256 = "0xac32886ihs2xg7w1gppcq2sgin5qsm8lqwijs5xifq9w0x0q6s";
+url = "https://files.pythonhosted.org/packages/16/4f/48975536bd488d3a272549eb795ac4a13a5f7fcdc8995def77fbef3532ee/configparser-4.0.2.tar.gz";
+sha256 = "1priacxym85yjcf68hh38w55nqswaxp71ryjyfdk222kg9l85ln7";
 };
 meta = {
 license = [ pkgs.lib.licenses.mit ];
 };
 };
+"contextlib2" = super.buildPythonPackage {
+name = "contextlib2-0.6.0.post1";
+doCheck = false;
+src = fetchurl {
+url = "https://files.pythonhosted.org/packages/02/54/669207eb72e3d8ae8b38aa1f0703ee87a0e9f88f30d3c0a47bebdb6de242/contextlib2-0.6.0.post1.tar.gz";
+sha256 = "0bhnr2ac7wy5l85ji909gyljyk85n92w8pdvslmrvc8qih4r1x01";
+};
+meta = {
+license = [ pkgs.lib.licenses.psfl ];
+};
+};
 "cov-core" = super.buildPythonPackage {
 name = "cov-core-1.15.0";
 doCheck = false;
@@ -310,11 +333,11 @@ self: super: {
 };
 };
 "coverage" = super.buildPythonPackage {
-name = "coverage-4.5.…
+name = "coverage-4.5.4";
 doCheck = false;
 src = fetchurl {
-url = "https://files.pythonhosted.org/packages/8…
-sha256 = "02f6m073qdispn96rc616hg0rnmw1pgqzw3bgxwiwza4zf9hirlx";
+url = "https://files.pythonhosted.org/packages/85/d5/818d0e603685c4a613d56f065a721013e942088047ff1027a632948bdae6/coverage-4.5.4.tar.gz";
+sha256 = "0p0j4di6h8k6ica7jwwj09azdcg4ycxq60i9qsskmsg94cd9yzg0";
 };
 meta = {
 license = [ pkgs.lib.licenses.asl20 ];
@@ -361,7 +384,7 @@ self: super: {
 };
 };
 "deform" = super.buildPythonPackage {
-name = "deform-2.0.…
+name = "deform-2.0.8";
 doCheck = false;
 propagatedBuildInputs = [
 self."chameleon"
@@ -372,8 +395,8 @@ self: super: {
 self."zope.deprecation"
 ];
 src = fetchurl {
-url = "https://files.pythonhosted.org/packages/cf/a1/bc234527b8f181de9acd80e796483c00007658d1e32b7de78f1c2e004d9a/deform-2.0.7.tar.gz";
-sha256 = "0jnpi0zr2hjvbmiz6nm33yqv976dn9lf51vhlzqc0i75xcr9rwig";
+url = "https://files.pythonhosted.org/packages/21/d0/45fdf891a82722c02fc2da319cf2d1ae6b5abf9e470ad3762135a895a868/deform-2.0.8.tar.gz";
+sha256 = "0wbjv98sib96649aqaygzxnrkclyy50qij2rha6fn1i4c86bfdl9";
 };
 meta = {
 license = [ { fullName = "Repoze Public License"; } { fullName = "BSD-derived (http://www.repoze.org/LICENSE.txt)"; } ];
@@ -417,14 +440,14 @@ self: super: {
 };
 };
 "dogpile.cache" = super.buildPythonPackage {
-name = "dogpile.cache-0.…
+name = "dogpile.cache-0.9.0";
 doCheck = false;
 propagatedBuildInputs = [
 self."decorator"
 ];
 src = fetchurl {
-url = "https://files.pythonhosted.org/packages/84…
-sha256 = "0caazmrzhnfqb5yrp8myhw61ny637jj69wcngrpbvi31jlcpy6v9";
+url = "https://files.pythonhosted.org/packages/ac/6a/9ac405686a94b7f009a20a50070a5786b0e1aedc707b88d40d0c4b51a82e/dogpile.cache-0.9.0.tar.gz";
+sha256 = "0sr1fn6b4k5bh0cscd9yi8csqxvj4ngzildav58x5p694mc86j5k";
 };
 meta = {
 license = [ pkgs.lib.licenses.bsdOriginal ];
@@ -514,14 +537,14 @@ self: super: {
 };
 };
 "elasticsearch2" = super.buildPythonPackage {
-name = "elasticsearch2-2.5.…
+name = "elasticsearch2-2.5.1";
 doCheck = false;
 propagatedBuildInputs = [
 self."urllib3"
 ];
 src = fetchurl {
-url = "https://files.pythonhosted.org/packages/…
-sha256 = "0ky0q16lbvz022yv6q3pix7aamf026p1y994537ccjf0p0dxnbxr";
+url = "https://files.pythonhosted.org/packages/f6/09/f9b24aa6b1120bea371cd57ef6f57c7694cf16660469456a8be6c2bdbe22/elasticsearch2-2.5.1.tar.gz";
+sha256 = "19k2znpjfyp0hrq73cz7pjyj289040xpsxsm0xhh4jfh6y551g7k";
 };
 meta = {
 license = [ pkgs.lib.licenses.asl20 ];
@@ -666,16 +689,44 @@ self: super: {
 };
 };
 "hupper" = super.buildPythonPackage {
-name = "hupper-1.…
+name = "hupper-1.9.1";
 doCheck = false;
 src = fetchurl {
-url = "https://files.pythonhosted.org/packages/…
-sha256 = "0d3cvkc8ssgwk54wvhbifj56ry97qi10pfzwfk8vwzzcikbfp3zy";
+url = "https://files.pythonhosted.org/packages/09/3a/4f215659f31eeffe364a984dba486bfa3907bfcc54b7013bdfe825cebb5f/hupper-1.9.1.tar.gz";
+sha256 = "0pyg879fv9mbwlnbzw2a3234qqycqs9l97h5mpkmk0bvxhi2471v";
 };
 meta = {
 license = [ pkgs.lib.licenses.mit ];
 };
 };
+"idna" = super.buildPythonPackage {
+name = "idna-2.8";
+doCheck = false;
+src = fetchurl {
+url = "https://files.pythonhosted.org/packages/ad/13/eb56951b6f7950cadb579ca166e448ba77f9d24efc03edd7e55fa57d04b7/idna-2.8.tar.gz";
+sha256 = "01rlkigdxg17sf9yar1jl8n18ls59367wqh59hnawlyg53vb6my3";
+};
+meta = {
+license = [ pkgs.lib.licenses.bsdOriginal { fullName = "BSD-like"; } ];
+};
+};
+"importlib-metadata" = super.buildPythonPackage {
+name = "importlib-metadata-0.23";
+doCheck = false;
+propagatedBuildInputs = [
+self."zipp"
+self."contextlib2"
+self."configparser"
+self."pathlib2"
+];
+src = fetchurl {
+url = "https://files.pythonhosted.org/packages/5d/44/636bcd15697791943e2dedda0dbe098d8530a38d113b202817133e0b06c0/importlib_metadata-0.23.tar.gz";
+sha256 = "09mdqdfv5rdrwz80jh9m379gxmvk2vhjfz0fg53hid00icvxf65a";
+};
+meta = {
+license = [ pkgs.lib.licenses.asl20 ];
+};
+};
 "infrae.cache" = super.buildPythonPackage {
 name = "infrae.cache-1.0.1";
 doCheck = false;
@@ -703,11 +754,11 @@ self: super: {
 };
 };
 "ipaddress" = super.buildPythonPackage {
-name = "ipaddress-1.0.2…
+name = "ipaddress-1.0.23";
 doCheck = false;
 src = fetchurl {
-url = "https://files.pythonhosted.org/packages/97/8d/77b8cedcfbf93676148518036c6b1ce7f8e14bf07e95d7fd4ddcb8cc052f/ipaddress-1.0.22.tar.gz";
-sha256 = "0b570bm6xqpjwqis15pvdy6lyvvzfndjvkynilcddjj5x98wfimi";
+url = "https://files.pythonhosted.org/packages/b9/9a/3e9da40ea28b8210dd6504d3fe9fe7e013b62bf45902b458d1cdc3c34ed9/ipaddress-1.0.23.tar.gz";
+sha256 = "1qp743h30s04m3cg3yk3fycad930jv17q7dsslj4mfw0jlvf1y5p";
 };
 meta = {
 license = [ pkgs.lib.licenses.psfl ];
@@ -859,14 +910,15 @@ self: super: {
 };
 };
 "kombu" = super.buildPythonPackage {
-name = "kombu-4.…
+name = "kombu-4.6.6";
 doCheck = false;
 propagatedBuildInputs = [
 self."amqp"
+self."importlib-metadata"
 ];
 src = fetchurl {
-url = "https://files.pythonhosted.org/packages/…
-sha256 = "10lh3hncvw67fz0k5vgbx3yh9gjfpqdlia1f13i28cgnc1nfrbc6";
+url = "https://files.pythonhosted.org/packages/20/e6/bc2d9affba6138a1dc143f77fef253e9e08e238fa7c0688d917c09005e96/kombu-4.6.6.tar.gz";
+sha256 = "11mxpcy8mg1l35bgbhba70v29bydr2hrhdbdlb4lg98m3m5vaq0p";
 };
 meta = {
 license = [ pkgs.lib.licenses.bsdOriginal ];
@@ -884,14 +936,14 @@ self: super: {
 };
 };
 "mako" = super.buildPythonPackage {
-name = "mako-1.0…
+name = "mako-1.1.0";
 doCheck = false;
 propagatedBuildInputs = [
 self."markupsafe"
 ];
 src = fetchurl {
-url = "https://files.pythonhosted.org/packages/e…
-sha256 = "1bi5gnr8r8dva06qpyx4kgjc6spm2k1y908183nbbaylggjzs0jf";
+url = "https://files.pythonhosted.org/packages/b0/3c/8dcd6883d009f7cae0f3157fb53e9afb05a0d3d33b3db1268ec2e6f4a56b/Mako-1.1.0.tar.gz";
+sha256 = "0jqa3qfpykyn4fmkn0kh6043sfls7br8i2bsdbccazcvk9cijsd3";
 };
 meta = {
 license = [ pkgs.lib.licenses.mit ];
@@ -909,25 +961,14 @@ self: super: {
 };
 };
 "markupsafe" = super.buildPythonPackage {
-name = "markupsafe-1.1.…
+name = "markupsafe-1.1.1";
 doCheck = false;
 src = fetchurl {
-url = "https://files.pythonhosted.org/packages/…
-sha256 = "1lxirjypbdd3l9jl4vliilhfnhy7c7f2vlldqg1b0i74khn375sf";
+url = "https://files.pythonhosted.org/packages/b9/2e/64db92e53b86efccfaea71321f597fa2e1b2bd3853d8ce658568f7a13094/MarkupSafe-1.1.1.tar.gz";
+sha256 = "0sqipg4fk7xbixqd8kq6rlkxj664d157bdwbh93farcphf92x1r9";
 };
 meta = {
-license = [ pkgs.lib.licenses.bsdOriginal ];
-};
-};
-"meld3" = super.buildPythonPackage {
-name = "meld3-1.0.2";
-doCheck = false;
-src = fetchurl {
-url = "https://files.pythonhosted.org/packages/45/a0/317c6422b26c12fe0161e936fc35f36552069ba8e6f7ecbd99bbffe32a5f/meld3-1.0.2.tar.gz";
-sha256 = "0n4mkwlpsqnmn0dm0wm5hn9nkda0nafl0jdy5sdl5977znh59dzp";
-};
-meta = {
-license = [ { fullName = "BSD-derived (http://www.repoze.org/LICENSE.txt)"; } ];
+license = [ pkgs.lib.licenses.bsdOriginal pkgs.lib.licenses.bsd3 ];
 };
 };
 "mistune" = super.buildPythonPackage {
@@ -942,14 +983,18 @@ self: super: {
 };
 };
 "mock" = super.buildPythonPackage {
-name = "mock-…
+name = "mock-3.0.5";
 doCheck = false;
+propagatedBuildInputs = [
+self."six"
+self."funcsigs"
+];
 src = fetchurl {
-url = "https://files.pythonhosted.org/packages/a2/52/7edcd94f0afb721a2d559a5b9aae8af4f8f2c79bc63fdbe8a8a6c9b23bbe/mock-1.0.1.tar.gz";
-sha256 = "0kzlsbki6q0awf89rc287f3aj8x431lrajf160a70z0ikhnxsfdq";
+url = "https://files.pythonhosted.org/packages/2e/ab/4fe657d78b270aa6a32f027849513b829b41b0f28d9d8d7f8c3d29ea559a/mock-3.0.5.tar.gz";
+sha256 = "1hrp6j0yrx2xzylfv02qa8kph661m6yq4p0mc8fnimch9j4psrc3";
 };
 meta = {
-license = [ pkgs.lib.licenses.bsdOriginal ];
+license = [ pkgs.lib.licenses.bsdOriginal { fullName = "OSI Approved :: BSD License"; } ];
 };
 };
 "more-itertools" = super.buildPythonPackage {
@@ -1029,14 +1074,18 @@ self: super: {
 };
 };
 "packaging" = super.buildPythonPackage {
-name = "packaging-1…
+name = "packaging-19.2";
 doCheck = false;
+propagatedBuildInputs = [
+self."pyparsing"
+self."six"
+];
 src = fetchurl {
-url = "https://files.pythonhosted.org/packages/24…
-sha256 = "1zn60w84bxvw6wypffka18ca66pa1k2cfrq3cq8fnsfja5m3k4ng";
+url = "https://files.pythonhosted.org/packages/5a/2f/449ded84226d0e2fda8da9252e5ee7731bdf14cd338f622dfcd9934e0377/packaging-19.2.tar.gz";
+sha256 = "0izwlz9h0bw171a1chr311g2y7n657zjaf4mq4rgm8pp9lbj9f98";
 };
 meta = {
-license = [ pkgs.lib.licenses.asl20 ];
+license = [ pkgs.lib.licenses.bsdOriginal { fullName = "BSD or Apache License, Version 2.0"; } pkgs.lib.licenses.asl20 ];
 };
 };
 "pandocfilters" = super.buildPythonPackage {
@@ -1051,14 +1100,14 @@ self: super: {
 };
 };
 "paste" = super.buildPythonPackage {
-name = "paste-3.…
+name = "paste-3.2.1";
 doCheck = false;
 propagatedBuildInputs = [
 self."six"
 ];
 src = fetchurl {
-url = "https://files.pythonhosted.org/packages/…
-sha256 = "05w1sh6ky4d7pmdb8nv82n13w22jcn3qsagg5ih3hjmbws9kkwf4";
+url = "https://files.pythonhosted.org/packages/0d/86/7008b5563594e8a63763f05212a3eb84c85f0b2eff834e5697716e56bca9/Paste-3.2.1.tar.gz";
+sha256 = "1vjxr8n1p31c9x9rh8g0f34yisa9028cxpvn36q7g1s0m2b9x71x";
 };
 meta = {
 license = [ pkgs.lib.licenses.mit ];
@@ -1076,7 +1125,7 @@ self: super: {
 };
 };
 "pastescript" = super.buildPythonPackage {
-name = "pastescript-3.…
+name = "pastescript-3.2.0";
 doCheck = false;
 propagatedBuildInputs = [
 self."paste"
@@ -1084,23 +1133,23 @@ self: super: {
 self."six"
 ];
 src = fetchurl {
-url = "https://files.pythonhosted.org/packages/9…
-sha256 = "02qcxjjr32ks7a6d4f533wl34ysc7yhwlrfcyqwqbzr52250v4fs";
+url = "https://files.pythonhosted.org/packages/ff/47/45c6f5a3cb8f5abf786fea98dbb8d02400a55768a9b623afb7df12346c61/PasteScript-3.2.0.tar.gz";
+sha256 = "1b3jq7xh383nvrrlblk05m37345bv97xrhx77wshllba3h7mq3wv";
 };
 meta = {
 license = [ pkgs.lib.licenses.mit ];
 };
 };
 "pathlib2" = super.buildPythonPackage {
-name = "pathlib2-2.3.…
+name = "pathlib2-2.3.5";
 doCheck = false;
 propagatedBuildInputs = [
 self."six"
 self."scandir"
 ];
 src = fetchurl {
-url = "https://files.pythonhosted.org/packages/…
-sha256 = "1y0f9rkm1924zrc5dn4bwxlhgdkbml82lkcc28l5rgmr7d918q24";
+url = "https://files.pythonhosted.org/packages/94/d8/65c86584e7e97ef824a1845c72bbe95d79f5b306364fa778a3c3e401b309/pathlib2-2.3.5.tar.gz";
+sha256 = "0s4qa8c082fdkb17izh4mfgwrjd1n5pya18wvrbwqdvvb5xs9nbc";
 };
 meta = {
 license = [ pkgs.lib.licenses.mit ];
@@ -1175,48 +1224,51 @@ self: super: {
 };
 };
 "pluggy" = super.buildPythonPackage {
-name = "pluggy-0.11…
+name = "pluggy-0.13.1";
 doCheck = false;
+propagatedBuildInputs = [
+self."importlib-metadata"
+];
 src = fetchurl {
-url = "https://files.pythonhosted.org/packages/0…
-sha256 = "10511a54dvafw1jrk75mrhml53c7b7w4yaw7241696lc2hfvr895";
+url = "https://files.pythonhosted.org/packages/f8/04/7a8542bed4b16a65c2714bf76cf5a0b026157da7f75e87cc88774aa10b14/pluggy-0.13.1.tar.gz";
+sha256 = "1c35qyhvy27q9ih9n899f3h4sdnpgq027dbiilly2qb5cvgarchm";
 };
 meta = {
 license = [ pkgs.lib.licenses.mit ];
 };
 };
 "prompt-toolkit" = super.buildPythonPackage {
-name = "prompt-toolkit-1.0.1…
+name = "prompt-toolkit-1.0.18";
 doCheck = false;
 propagatedBuildInputs = [
 self."six"
 self."wcwidth"
 ];
 src = fetchurl {
-url = "https://files.pythonhosted.org/packages/f…
-sha256 = "1d65hm6nf0cbq0q0121m60zzy4s1fpg9fn761s1yxf08dridvkn1";
+url = "https://files.pythonhosted.org/packages/c5/64/c170e5b1913b540bf0c8ab7676b21fdd1d25b65ddeb10025c6ca43cccd4c/prompt_toolkit-1.0.18.tar.gz";
+sha256 = "09h1153wgr5x2ny7ds0w2m81n3bb9j8hjb8sjfnrg506r01clkyx";
 };
 meta = {
 license = [ pkgs.lib.licenses.bsdOriginal ];
 };
 };
 "psutil" = super.buildPythonPackage {
-name = "psutil-5.5…
+name = "psutil-5.6.5";
 doCheck = false;
 src = fetchurl {
-url = "https://files.pythonhosted.org/packages/c…
-sha256 = "045qaqvn6k90bj5bcy259yrwcd2afgznaav3sfhphy9b8ambzkkj";
+url = "https://files.pythonhosted.org/packages/03/9a/95c4b3d0424426e5fd94b5302ff74cea44d5d4f53466e1228ac8e73e14b4/psutil-5.6.5.tar.gz";
+sha256 = "0isil5jxwwd8awz54qk28rpgjg43i5l6yl70g40vxwa4r4m56lfh";
 };
 meta = {
 license = [ pkgs.lib.licenses.bsdOriginal ];
 };
 };
 "psycopg2" = super.buildPythonPackage {
-name = "psycopg2-2.8.…
+name = "psycopg2-2.8.4";
 doCheck = false;
 src = fetchurl {
-url = "https://files.pythonhosted.org/packages/…
-sha256 = "0ms4kx0p5n281l89awccix4d05ybmdngnjjpi9jbzd0rhf1nwyl9";
+url = "https://files.pythonhosted.org/packages/84/d7/6a93c99b5ba4d4d22daa3928b983cec66df4536ca50b22ce5dcac65e4e71/psycopg2-2.8.4.tar.gz";
+sha256 = "1djvh98pi4hjd8rxbq8qzc63bg8v78k33yg6pl99wak61b6fb67q";
 };
 meta = {
 license = [ pkgs.lib.licenses.zpl21 { fullName = "GNU Library or Lesser General Public License (LGPL)"; } { fullName = "LGPL with exceptions or ZPL"; } ];
@@ -1234,11 +1286,11 @@ self: super: {
 };
 };
 "py" = super.buildPythonPackage {
-name = "py-1.…
+name = "py-1.8.0";
 doCheck = false;
 src = fetchurl {
-url = "https://files.pythonhosted.org/packages/4f/38/5f427d1eedae73063ce4da680d2bae72014995f9fdeaa57809df61c968cd/py-1.6.0.tar.gz";
-sha256 = "1wcs3zv9wl5m5x7p16avqj2gsrviyb23yvc3pr330isqs0sh98q6";
+url = "https://files.pythonhosted.org/packages/f1/5a/87ca5909f400a2de1561f1648883af74345fe96349f34f737cdfc94eba8c/py-1.8.0.tar.gz";
+sha256 = "0lsy1gajva083pzc7csj1cvbmminb7b4l6a0prdzyb3fd829nqyw";
 };
 meta = {
 license = [ pkgs.lib.licenses.mit ];
@@ -1271,28 +1323,28 @@ self: super: {
 };
 };
 "pyasn1" = super.buildPythonPackage {
-name = "pyasn1-0.4.…
+name = "pyasn1-0.4.8";
 doCheck = false;
 src = fetchurl {
-url = "https://files.pythonhosted.org/packages/46…
-sha256 = "1xqh3jh2nfi2bflk5a0vn59y3pp1vn54f3ksx652sid92gz2096s";
+url = "https://files.pythonhosted.org/packages/a4/db/fffec68299e6d7bad3d504147f9094830b704527a7fc098b721d38cc7fa7/pyasn1-0.4.8.tar.gz";
+sha256 = "1fnhbi3rmk47l9851gbik0flfr64vs5j0hbqx24cafjap6gprxxf";
 };
 meta = {
 license = [ pkgs.lib.licenses.bsdOriginal ];
 };
 };
 "pyasn1-modules" = super.buildPythonPackage {
-name = "pyasn1-modules-0.2.…
+name = "pyasn1-modules-0.2.6";
 doCheck = false;
 propagatedBuildInputs = [
 self."pyasn1"
 ];
 src = fetchurl {
-url = "https://files.pythonhosted.org/packages/ec…
-sha256 = "15nvfx0vnl8akdlv3k6s0n80vqvryj82bm040jdsn7wmyxl1ywpg";
+url = "https://files.pythonhosted.org/packages/f1/a9/a1ef72a0e43feff643cf0130a08123dea76205e7a0dda37e3efb5f054a31/pyasn1-modules-0.2.6.tar.gz";
+sha256 = "08hph9j1r018drnrny29l7dl2q0cin78csswrhwrh8jmq61pmha3";
 };
 meta = {
-license = [ pkgs.lib.licenses.bsdOriginal ];
+license = [ pkgs.lib.licenses.bsdOriginal pkgs.lib.licenses.bsd2 ];
 };
 };
 "pycparser" = super.buildPythonPackage {
@@ -1318,11 +1370,11 @@ self: super: {
 };
 };
 "pycurl" = super.buildPythonPackage {
-name = "pycurl-7.43.0.…
+name = "pycurl-7.43.0.3";
 doCheck = false;
 src = fetchurl {
-url = "https://files.pythonhosted.org/packages/…
-sha256 = "1915kb04k1j4y6k1dx1sgnbddxrl9r1n4q928if2lkrdm73xy30g";
+url = "https://files.pythonhosted.org/packages/ac/b3/0f3979633b7890bab6098d84c84467030b807a1e2b31f5d30103af5a71ca/pycurl-7.43.0.3.tar.gz";
+sha256 = "13nsvqhvnmnvfk75s8iynqsgszyv06cjp4drd3psi7zpbh63623g";
 };
 meta = {
 license = [ pkgs.lib.licenses.mit { fullName = "LGPL/MIT"; } { fullName = "GNU Library or Lesser General Public License (LGPL)"; } ];
@@ -1351,22 +1403,22 @@ self: super: {
 };
 };
 "pyotp" = super.buildPythonPackage {
-name = "pyotp-2.…
+name = "pyotp-2.3.0";
 doCheck = false;
 src = fetchurl {
-url = "https://files.pythonhosted.org/packages/b1…
-sha256 = "00p69nw431f0s2ilg0hnd77p1l22m06p9rq4f8zfapmavnmzw3xy";
+url = "https://files.pythonhosted.org/packages/f7/15/395c4945ea6bc37e8811280bb675615cb4c2b2c1cd70bdc43329da91a386/pyotp-2.3.0.tar.gz";
+sha256 = "18d13ikra1iq0xyfqfm72zhgwxi2qi9ps6z1a6zmqp4qrn57wlzw";
 };
 meta = {
 license = [ pkgs.lib.licenses.mit ];
 };
 };
 "pyparsing" = super.buildPythonPackage {
-name = "pyparsing-2.…
+name = "pyparsing-2.4.5";
 doCheck = false;
 src = fetchurl {
-url = "https://files.pythonhosted.org/packages/…
-sha256 = "14k5v7n3xqw8kzf42x06bzp184spnlkya2dpjyflax6l3yrallzk";
+url = "https://files.pythonhosted.org/packages/00/32/8076fa13e832bb4dcff379f18f228e5a53412be0631808b9ca2610c0f566/pyparsing-2.4.5.tar.gz";
+sha256 = "0fk8gsybiw1gm146mkjdjvaajwh20xwvpv4j7syh2zrnpq0j19jc";
 };
 meta = {
 license = [ pkgs.lib.licenses.mit ];
@@ -1396,7 +1448,7 @@ self: super: {
 };
 };
 "pyramid-debugtoolbar" = super.buildPythonPackage {
-name = "pyramid-debugtoolbar-4.5";
+name = "pyramid-debugtoolbar-4.5.1";
 doCheck = false;
 propagatedBuildInputs = [
 self."pyramid"
@@ -1406,8 +1458,8 @@ self: super: {
 self."ipaddress"
 ];
 src = fetchurl {
-url = "https://files.pythonhosted.org/packages/14…
-sha256 = "0x2p3409pnx66n6dx5vc0mk2r1cp1ydr8mp120w44r9pwcngbibl";
+url = "https://files.pythonhosted.org/packages/88/21/74e7fa52edc74667e29403bd0cb4f2bb74dc4014711de313868001bf639f/pyramid_debugtoolbar-4.5.1.tar.gz";
+sha256 = "0hgf6i1fzvq43m9vjdmb24nnv8fwp7sdzrx9bcwrgpy24n07am9a";
 };
 meta = {
 license = [ { fullName = "Repoze Public License"; } pkgs.lib.licenses.bsdOriginal ];
@@ -1447,15 +1499,15 @@ self: super: {
 };
 };
 "pyramid-mako" = super.buildPythonPackage {
-name = "pyramid-mako-1.0…
+name = "pyramid-mako-1.1.0";
 doCheck = false;
 propagatedBuildInputs = [
 self."pyramid"
 self."mako"
 ];
 src = fetchurl {
-url = "https://files.pythonhosted.org/packages/f1…
-sha256 = "18gk2vliq8z4acblsl6yzgbvnr9rlxjlcqir47km7kvlk1xri83d";
+url = "https://files.pythonhosted.org/packages/63/7b/5e2af68f675071a6bad148c1c393928f0ef5fcd94e95cbf53b89d6471a83/pyramid_mako-1.1.0.tar.gz";
+sha256 = "1qj0m091mnii86j2q1d82yir22nha361rvhclvg3s70z8iiwhrh0";
 };
 meta = {
 license = [ { fullName = "Repoze Public License"; } { fullName = "BSD-derived (http://www.repoze.org/LICENSE.txt)"; } ];
@@ -1473,44 +1525,46 @@ self: super: {
 };
 };
 "pytest" = super.buildPythonPackage {
-name = "pytest-…
+name = "pytest-4.6.5";
 doCheck = false;
 propagatedBuildInputs = [
 self."py"
 self."six"
-self."…
+self."packaging"
 self."attrs"
-self."more-itertools"
 self."atomicwrites"
 self."pluggy"
+self."importlib-metadata"
+self."wcwidth"
 self."funcsigs"
 self."pathlib2"
+self."more-itertools"
 ];
 src = fetchurl {
-url = "https://files.pythonhosted.org/packages/5f…
-sha256 = "18nrwzn61kph2y6gxwfz9ms68rfvr9d4vcffsxng9p7jk9z18clk";
+url = "https://files.pythonhosted.org/packages/2a/c6/1d1f32f6a5009900521b12e6560fb6b7245b0d4bc3fb771acd63d10e30e1/pytest-4.6.5.tar.gz";
+sha256 = "0iykwwfp4h181nd7rsihh2120b0rkawlw7rvbl19sgfspncr3hwg";
 };
 meta = {
 license = [ pkgs.lib.licenses.mit ];
 };
 };
 "pytest-cov" = super.buildPythonPackage {
-name = "pytest-cov-2.…
+name = "pytest-cov-2.7.1";
 doCheck = false;
 propagatedBuildInputs = [
 self."pytest"
 self."coverage"
 ];
 src = fetchurl {
-url = "https://files.pythonhosted.org/packages/d9/e2/58f90a316fbd94dd50bf5c826a23f3f5d079fb3cc448c1e9f0e3c33a3d2a/pytest-cov-2.6.0.tar.gz";
-sha256 = "0qnpp9y3ygx4jk4pf5ad71fh2skbvnr6gl54m7rg5qysnx4g0q73";
+url = "https://files.pythonhosted.org/packages/bb/0f/3db7ff86801883b21d5353b258c994b1b8e2abbc804e2273b8d0fd19004b/pytest-cov-2.7.1.tar.gz";
+sha256 = "0filvmmyqm715azsl09ql8hy2x7h286n6d8z5x42a1wpvvys83p0";
 };
 meta = {
 license = [ pkgs.lib.licenses.bsdOriginal pkgs.lib.licenses.mit ];
 };
 };
 "pytest-profiling" = super.buildPythonPackage {
-name = "pytest-profiling-1.…
+name = "pytest-profiling-1.7.0";
 doCheck = false;
 propagatedBuildInputs = [
 self."six"
|
1518 | 1572 | self."gprof2dot" |
|
1519 | 1573 | ]; |
|
1520 | 1574 | src = fetchurl { |
|
1521 |
url = "https://files.pythonhosted.org/packages/f |
|
|
1522 | sha256 = "08r5afx5z22yvpmsnl91l4amsy1yxn8qsmm61mhp06mz8zjs51kb"; | |
|
1575 | url = "https://files.pythonhosted.org/packages/39/70/22a4b33739f07f1732a63e33bbfbf68e0fa58cfba9d200e76d01921eddbf/pytest-profiling-1.7.0.tar.gz"; | |
|
1576 | sha256 = "0abz9gi26jpcfdzgsvwad91555lpgdc8kbymicmms8k2fqa8z4wk"; | |
|
1523 | 1577 | }; |
|
1524 | 1578 | meta = { |
|
1525 | 1579 | license = [ pkgs.lib.licenses.mit ]; |
|
1526 | 1580 | }; |
|
1527 | 1581 | }; |
|
1528 | 1582 | "pytest-runner" = super.buildPythonPackage { |
|
1529 |
name = "pytest-runner- |
|
|
1583 | name = "pytest-runner-5.1"; | |
|
1530 | 1584 | doCheck = false; |
|
1531 | 1585 | src = fetchurl { |
|
1532 |
url = "https://files.pythonhosted.org/packages/9e |
|
|
1533 | sha256 = "1gkpyphawxz38ni1gdq1fmwyqcg02m7ypzqvv46z06crwdxi2gyj"; | |
|
1586 | url = "https://files.pythonhosted.org/packages/d9/6d/4b41a74b31720e25abd4799be72d54811da4b4d0233e38b75864dcc1f7ad/pytest-runner-5.1.tar.gz"; | |
|
1587 | sha256 = "0ykfcnpp8c22winj63qzc07l5axwlc9ikl8vn05sc32gv3417815"; | |
|
1534 | 1588 | }; |
|
1535 | 1589 | meta = { |
|
1536 | 1590 | license = [ pkgs.lib.licenses.mit ]; |
|
1537 | 1591 | }; |
|
1538 | 1592 | }; |
|
1539 | 1593 | "pytest-sugar" = super.buildPythonPackage { |
|
1540 |
name = "pytest-sugar-0.9. |
|
|
1594 | name = "pytest-sugar-0.9.2"; | |
|
1541 | 1595 | doCheck = false; |
|
1542 | 1596 | propagatedBuildInputs = [ |
|
1543 | 1597 | self."pytest" |
|
1544 | 1598 | self."termcolor" |
|
1599 | self."packaging" | |
|
1545 | 1600 | ]; |
|
1546 | 1601 | src = fetchurl { |
|
1547 | url = "https://files.pythonhosted.org/packages/3e/6a/a3f909083079d03bde11d06ab23088886bbe25f2c97fbe4bb865e2bf05bc/pytest-sugar-0.9.1.tar.gz"; | |
|
1548 | sha256 = "0b4av40dv30727m54v211r0nzwjp2ajkjgxix6j484qjmwpw935b"; | |
|
1602 | url = "https://files.pythonhosted.org/packages/55/59/f02f78d1c80f7e03e23177f60624c8106d4f23d124c921df103f65692464/pytest-sugar-0.9.2.tar.gz"; | |
|
1603 | sha256 = "1asq7yc4g8bx2sn7yy974mhc9ywvaihasjab4inkirdwn9s7mn7w"; | |
|
1549 | 1604 | }; |
|
1550 | 1605 | meta = { |
|
1551 | 1606 | license = [ pkgs.lib.licenses.bsdOriginal ]; |
|
1552 | 1607 | }; |
|
1553 | 1608 | }; |
|
1554 | 1609 | "pytest-timeout" = super.buildPythonPackage { |
|
1555 |
name = "pytest-timeout-1.3. |
|
|
1610 | name = "pytest-timeout-1.3.3"; | |
|
1556 | 1611 | doCheck = false; |
|
1557 | 1612 | propagatedBuildInputs = [ |
|
1558 | 1613 | self."pytest" |
|
1559 | 1614 | ]; |
|
1560 | 1615 | src = fetchurl { |
|
1561 |
url = "https://files.pythonhosted.org/packages/8c |
|
|
1562 | sha256 = "09wnmzvnls2mnsdz7x3c3sk2zdp6jl4dryvyj5i8hqz16q2zq5qi"; | |
|
1616 | url = "https://files.pythonhosted.org/packages/13/48/7a166eaa29c1dca6cc253e3ba5773ff2e4aa4f567c1ea3905808e95ac5c1/pytest-timeout-1.3.3.tar.gz"; | |
|
1617 | sha256 = "1cczcjhw4xx5sjkhxlhc5c1bkr7x6fcyx12wrnvwfckshdvblc2a"; | |
|
1563 | 1618 | }; |
|
1564 | 1619 | meta = { |
|
1565 | 1620 | license = [ pkgs.lib.licenses.mit { fullName = "DFSG approved"; } ]; |
|
1566 | 1621 | }; |
|
1567 | 1622 | }; |
|
1568 | 1623 | "python-dateutil" = super.buildPythonPackage { |
|
1569 |
name = "python-dateutil-2.8. |
|
|
1624 | name = "python-dateutil-2.8.1"; | |
|
1570 | 1625 | doCheck = false; |
|
1571 | 1626 | propagatedBuildInputs = [ |
|
1572 | 1627 | self."six" |
|
1573 | 1628 | ]; |
|
1574 | 1629 | src = fetchurl { |
|
1575 |
url = "https://files.pythonhosted.org/packages/ |
|
|
1576 | sha256 = "17nsfhy4xdz1khrfxa61vd7pmvd5z0wa3zb6v4gb4kfnykv0b668"; | |
|
1630 | url = "https://files.pythonhosted.org/packages/be/ed/5bbc91f03fa4c839c4c7360375da77f9659af5f7086b7a7bdda65771c8e0/python-dateutil-2.8.1.tar.gz"; | |
|
1631 | sha256 = "0g42w7k5007iv9dam6gnja2ry8ydwirh99mgdll35s12pyfzxsvk"; | |
|
1577 | 1632 | }; |
|
1578 | 1633 | meta = { |
|
1579 | 1634 | license = [ pkgs.lib.licenses.bsdOriginal pkgs.lib.licenses.asl20 { fullName = "Dual License"; } ]; |
@@ -1647,11 +1702,11 b' self: super: {' | |||
|
1647 | 1702 | }; |
|
1648 | 1703 | }; |
|
1649 | 1704 | "pytz" = super.buildPythonPackage { |
|
1650 | name = "pytz-201 |
|
1705 | name = "pytz-2019.2"; | |
|
1651 | 1706 | doCheck = false; |
|
1652 | 1707 | src = fetchurl { |
|
1653 | url = "https://files.pythonhosted.org/packages/ |
|
1654 | sha256 = "0jgpqx3kk2rhv81j1izjxvmx8d0x7hzs1857pgqnixic5wq2ar60"; | |
|
1708 | url = "https://files.pythonhosted.org/packages/27/c0/fbd352ca76050952a03db776d241959d5a2ee1abddfeb9e2a53fdb489be4/pytz-2019.2.tar.gz"; | |
|
1709 | sha256 = "0ckb27hhjc8i8gcdvk4d9avld62b7k52yjijc60s2m3y8cpb7h16"; | |
|
1655 | 1710 | }; |
|
1656 | 1711 | meta = { |
|
1657 | 1712 | license = [ pkgs.lib.licenses.mit ]; |
@@ -1669,11 +1724,11 b' self: super: {' | |||
|
1669 | 1724 | }; |
|
1670 | 1725 | }; |
|
1671 | 1726 | "redis" = super.buildPythonPackage { |
|
1672 | name = "redis- |
|
1727 | name = "redis-3.3.11"; | |
|
1673 | 1728 | doCheck = false; |
|
1674 | 1729 | src = fetchurl { |
|
1675 | url = "https://files.pythonhosted.org/packages/09 |
|
1676 | sha256 = "03vcgklykny0g0wpvqmy8p6azi2s078317wgb2xjv5m2rs9sjb52"; | |
|
1730 | url = "https://files.pythonhosted.org/packages/06/ca/00557c74279d2f256d3c42cabf237631355f3a132e4c74c2000e6647ad98/redis-3.3.11.tar.gz"; | |
|
1731 | sha256 = "1hicqbi5xl92hhml82awrr2rxl9jar5fp8nbcycj9qgmsdwc43wd"; | |
|
1677 | 1732 | }; |
|
1678 | 1733 | meta = { |
|
1679 | 1734 | license = [ pkgs.lib.licenses.mit ]; |
@@ -1707,18 +1762,24 b' self: super: {' | |||
|
1707 | 1762 | }; |
|
1708 | 1763 | }; |
|
1709 | 1764 | "requests" = super.buildPythonPackage { |
|
1710 | name = "requests-2. |
|
1765 | name = "requests-2.22.0"; | |
|
1711 | 1766 | doCheck = false; |
|
1767 | propagatedBuildInputs = [ | |
|
1768 | self."chardet" | |
|
1769 | self."idna" | |
|
1770 | self."urllib3" | |
|
1771 | self."certifi" | |
|
1772 | ]; | |
|
1712 | 1773 | src = fetchurl { |
|
1713 | url = "https://files.pythonhosted.org/packages/ |
|
1714 | sha256 = "0zsqrzlybf25xscgi7ja4s48y2abf9wvjkn47wh984qgs1fq2xy5"; | |
|
1774 | url = "https://files.pythonhosted.org/packages/01/62/ddcf76d1d19885e8579acb1b1df26a852b03472c0e46d2b959a714c90608/requests-2.22.0.tar.gz"; | |
|
1775 | sha256 = "1d5ybh11jr5sm7xp6mz8fyc7vrp4syifds91m7sj60xalal0gq0i"; | |
|
1715 | 1776 | }; |
|
1716 | 1777 | meta = { |
|
1717 | 1778 | license = [ pkgs.lib.licenses.asl20 ]; |
|
1718 | 1779 | }; |
|
1719 | 1780 | }; |
|
1720 | 1781 | "rhodecode-enterprise-ce" = super.buildPythonPackage { |
|
1721 | name = "rhodecode-enterprise-ce-4.1 |
|
1782 | name = "rhodecode-enterprise-ce-4.18.0"; | |
|
1722 | 1783 | buildInputs = [ |
|
1723 | 1784 | self."pytest" |
|
1724 | 1785 | self."py" |
@@ -1738,7 +1799,6 b' self: super: {' | |||
|
1738 | 1799 | doCheck = true; |
|
1739 | 1800 | propagatedBuildInputs = [ |
|
1740 | 1801 | self."amqp" |
|
1741 | self."authomatic" | |
|
1742 | 1802 | self."babel" |
|
1743 | 1803 | self."beaker" |
|
1744 | 1804 | self."bleach" |
@@ -1808,7 +1868,6 b' self: super: {' | |||
|
1808 | 1868 | self."venusian" |
|
1809 | 1869 | self."weberror" |
|
1810 | 1870 | self."webhelpers2" |
|
1811 | self."webhelpers" | |
|
1812 | 1871 | self."webob" |
|
1813 | 1872 | self."whoosh" |
|
1814 | 1873 | self."wsgiref" |
@@ -1823,6 +1882,7 b' self: super: {' | |||
|
1823 | 1882 | self."nbconvert" |
|
1824 | 1883 | self."nbformat" |
|
1825 | 1884 | self."jupyter-client" |
|
1885 | self."jupyter-core" | |
|
1826 | 1886 | self."alembic" |
|
1827 | 1887 | self."invoke" |
|
1828 | 1888 | self."bumpversion" |
@@ -1854,7 +1914,7 b' self: super: {' | |||
|
1854 | 1914 | }; |
|
1855 | 1915 | }; |
|
1856 | 1916 | "rhodecode-tools" = super.buildPythonPackage { |
|
1857 | name = "rhodecode-tools-1. |
|
1917 | name = "rhodecode-tools-1.4.0"; | |
|
1858 | 1918 | doCheck = false; |
|
1859 | 1919 | propagatedBuildInputs = [ |
|
1860 | 1920 | self."click" |
@@ -1871,8 +1931,8 b' self: super: {' | |||
|
1871 | 1931 | self."elasticsearch1-dsl" |
|
1872 | 1932 | ]; |
|
1873 | 1933 | src = fetchurl { |
|
1874 | url = "https://code.rhodecode.com/rhodecode-tools-ce/artifacts/download/0- |
|
1875 | sha256 = "1vfhgf46inbx7jvlfx4fdzh3vz7lh37r291gzb5hx447pfm3qllg"; | |
|
1934 | url = "https://code.rhodecode.com/rhodecode-tools-ce/artifacts/download/0-ed54e749-2ef5-4bc7-ae7f-7900e3c2aa15.tar.gz?sha256=76f024bad3a1e55fdb3d64f13f5b77ff21a12fee699918de2110fe21effd5a3a"; | |
|
1935 | sha256 = "0fjszppj3zhh47g1i6b9xqps28gzfxdkzwb47pdmzrd1sfx29w3n"; | |
|
1876 | 1936 | }; |
|
1877 | 1937 | meta = { |
|
1878 | 1938 | license = [ { fullName = "Apache 2.0 and Proprietary"; } ]; |
@@ -1916,11 +1976,11 b' self: super: {' | |||
|
1916 | 1976 | }; |
|
1917 | 1977 | }; |
|
1918 | 1978 | "setuptools" = super.buildPythonPackage { |
|
1919 | name = "setuptools-4 |
|
1979 | name = "setuptools-44.0.0"; | |
|
1920 | 1980 | doCheck = false; |
|
1921 | 1981 | src = fetchurl { |
|
1922 | url = "https://files.pythonhosted.org/packages/ |
|
1923 | sha256 = "04sns22y2hhsrwfy1mha2lgslvpjsjsz8xws7h2rh5a7ylkd28m2"; | |
|
1982 | url = "https://files.pythonhosted.org/packages/b0/f3/44da7482ac6da3f36f68e253cb04de37365b3dba9036a3c70773b778b485/setuptools-44.0.0.zip"; | |
|
1983 | sha256 = "025h5cnxcmda1893l6i12hrwdvs1n8r31qs6q4pkif2v7rrggfp5"; | |
|
1924 | 1984 | }; |
|
1925 | 1985 | meta = { |
|
1926 | 1986 | license = [ pkgs.lib.licenses.mit ]; |
@@ -1960,11 +2020,11 b' self: super: {' | |||
|
1960 | 2020 | }; |
|
1961 | 2021 | }; |
|
1962 | 2022 | "sqlalchemy" = super.buildPythonPackage { |
|
1963 | name = "sqlalchemy-1. |
|
2023 | name = "sqlalchemy-1.3.11"; | |
|
1964 | 2024 | doCheck = false; |
|
1965 | 2025 | src = fetchurl { |
|
1966 | url = "https://files.pythonhosted.org/packages/cc |
|
1967 | sha256 = "1ab4ysip6irajfbxl9wy27kv76miaz8h6759hfx92499z4dcf3lb"; | |
|
2026 | url = "https://files.pythonhosted.org/packages/34/5c/0e1d7ad0ca52544bb12f9cb8d5cc454af45821c92160ffedd38db0a317f6/SQLAlchemy-1.3.11.tar.gz"; | |
|
2027 | sha256 = "12izpqqgy738ndn7qqn962qxi8qw2xb9vg2i880x12paklg599dg"; | |
|
1968 | 2028 | }; |
|
1969 | 2029 | meta = { |
|
1970 | 2030 | license = [ pkgs.lib.licenses.mit ]; |
@@ -1997,14 +2057,11 b' self: super: {' | |||
|
1997 | 2057 | }; |
|
1998 | 2058 | }; |
|
1999 | 2059 | "supervisor" = super.buildPythonPackage { |
|
2000 | name = "supervisor-4.0 |
|
2060 | name = "supervisor-4.1.0"; | |
|
2001 | 2061 | doCheck = false; |
|
2002 | propagatedBuildInputs = [ | |
|
2003 | self."meld3" | |
|
2004 | ]; | |
|
2005 | 2062 | src = fetchurl { |
|
2006 | url = "https://files.pythonhosted.org/packages/ |
|
2007 | sha256 = "17hla7mx6w5m5jzkkjxgqa8wpswqmfhbhf49f692hw78fg0ans7p"; | |
|
2063 | url = "https://files.pythonhosted.org/packages/de/87/ee1ad8fa533a4b5f2c7623f4a2b585d3c1947af7bed8e65bc7772274320e/supervisor-4.1.0.tar.gz"; | |
|
2064 | sha256 = "10q36sa1jqljyyyl7cif52akpygl5kmlqq9x91hmx53f8zh6zj1d"; | |
|
2008 | 2065 | }; |
|
2009 | 2066 | meta = { |
|
2010 | 2067 | license = [ { fullName = "BSD-derived (http://www.repoze.org/LICENSE.txt)"; } ]; |
@@ -2033,18 +2090,18 b' self: super: {' | |||
|
2033 | 2090 | }; |
|
2034 | 2091 | }; |
|
2035 | 2092 | "testpath" = super.buildPythonPackage { |
|
2036 | name = "testpath-0.4. |
|
2093 | name = "testpath-0.4.4"; | |
|
2037 | 2094 | doCheck = false; |
|
2038 | 2095 | src = fetchurl { |
|
2039 | url = "https://files.pythonhosted.org/packages/0 |
|
2040 | sha256 = "1y40hywscnnyb734pnzm55nd8r8kp1072bjxbil83gcd53cv755n"; | |
|
2096 | url = "https://files.pythonhosted.org/packages/2c/b3/5d57205e896d8998d77ad12aa42ebce75cd97d8b9a97d00ba078c4c9ffeb/testpath-0.4.4.tar.gz"; | |
|
2097 | sha256 = "0zpcmq22dz79ipvvsfnw1ykpjcaj6xyzy7ws77s5b5ql3hka7q30"; | |
|
2041 | 2098 | }; |
|
2042 | 2099 | meta = { |
|
2043 | 2100 | license = [ ]; |
|
2044 | 2101 | }; |
|
2045 | 2102 | }; |
|
2046 | 2103 | "traitlets" = super.buildPythonPackage { |
|
2047 | name = "traitlets-4.3. |
|
2104 | name = "traitlets-4.3.3"; | |
|
2048 | 2105 | doCheck = false; |
|
2049 | 2106 | propagatedBuildInputs = [ |
|
2050 | 2107 | self."ipython-genutils" |
@@ -2053,8 +2110,8 b' self: super: {' | |||
|
2053 | 2110 | self."enum34" |
|
2054 | 2111 | ]; |
|
2055 | 2112 | src = fetchurl { |
|
2056 | url = "https://files.pythonhosted.org/packages/ |
|
2057 | sha256 = "0dbq7sx26xqz5ixs711k5nc88p8a0nqyz6162pwks5dpcz9d4jww"; | |
|
2113 | url = "https://files.pythonhosted.org/packages/75/b0/43deb021bc943f18f07cbe3dac1d681626a48997b7ffa1e7fb14ef922b21/traitlets-4.3.3.tar.gz"; | |
|
2114 | sha256 = "1xsrwgivpkxlbr4dfndfsi098s29yqgswgjc1qqn69yxklvfw8yh"; | |
|
2058 | 2115 | }; |
|
2059 | 2116 | meta = { |
|
2060 | 2117 | license = [ pkgs.lib.licenses.bsdOriginal ]; |
@@ -2100,11 +2157,11 b' self: super: {' | |||
|
2100 | 2157 | }; |
|
2101 | 2158 | }; |
|
2102 | 2159 | "urllib3" = super.buildPythonPackage { |
|
2103 | name = "urllib3-1.2 |
|
2160 | name = "urllib3-1.25.2"; | |
|
2104 | 2161 | doCheck = false; |
|
2105 | 2162 | src = fetchurl { |
|
2106 | url = "https://files.pythonhosted.org/packages/b |
|
2107 | sha256 = "08lwd9f3hqznyf32vnzwvp87pchx062nkbgyrf67rwlkgj0jk5fy"; | |
|
2163 | url = "https://files.pythonhosted.org/packages/9a/8b/ea6d2beb2da6e331e9857d0a60b79ed4f72dcbc4e2c7f2d2521b0480fda2/urllib3-1.25.2.tar.gz"; | |
|
2164 | sha256 = "1nq2k4pss1ihsjh02r41sqpjpm5rfqkjfysyq7g7n2i1p7c66c55"; | |
|
2108 | 2165 | }; |
|
2109 | 2166 | meta = { |
|
2110 | 2167 | license = [ pkgs.lib.licenses.mit ]; |
@@ -2144,11 +2201,11 b' self: super: {' | |||
|
2144 | 2201 | }; |
|
2145 | 2202 | }; |
|
2146 | 2203 | "waitress" = super.buildPythonPackage { |
|
2147 | name = "waitress-1.3. |
|
2204 | name = "waitress-1.3.1"; | |
|
2148 | 2205 | doCheck = false; |
|
2149 | 2206 | src = fetchurl { |
|
2150 | url = "https://files.pythonhosted.org/packages/ |
|
2151 | sha256 = "09j5dzbbcxib7vdskhx39s1qsydlr4n2p2png71d7mjnr9pnwajf"; | |
|
2207 | url = "https://files.pythonhosted.org/packages/a6/e6/708da7bba65898e5d759ade8391b1077e49d07be0b0223c39f5be04def56/waitress-1.3.1.tar.gz"; | |
|
2208 | sha256 = "1iysl8ka3l4cdrr0r19fh1cv28q41mwpvgsb81ji7k4shkb0k3i7"; | |
|
2152 | 2209 | }; |
|
2153 | 2210 | meta = { |
|
2154 | 2211 | license = [ pkgs.lib.licenses.zpl21 ]; |
@@ -2193,20 +2250,6 b' self: super: {' | |||
|
2193 | 2250 | license = [ pkgs.lib.licenses.mit ]; |
|
2194 | 2251 | }; |
|
2195 | 2252 | }; |
|
2196 | "webhelpers" = super.buildPythonPackage { | |
|
2197 | name = "webhelpers-1.3"; | |
|
2198 | doCheck = false; | |
|
2199 | propagatedBuildInputs = [ | |
|
2200 | self."markupsafe" | |
|
2201 | ]; | |
|
2202 | src = fetchurl { | |
|
2203 | url = "https://files.pythonhosted.org/packages/ee/68/4d07672821d514184357f1552f2dad923324f597e722de3b016ca4f7844f/WebHelpers-1.3.tar.gz"; | |
|
2204 | sha256 = "10x5i82qdkrvyw18gsybwggfhfpl869siaab89vnndi9x62g51pa"; | |
|
2205 | }; | |
|
2206 | meta = { | |
|
2207 | license = [ pkgs.lib.licenses.bsdOriginal ]; | |
|
2208 | }; | |
|
2209 | }; | |
|
2210 | 2253 | "webhelpers2" = super.buildPythonPackage { |
|
2211 | 2254 | name = "webhelpers2-2.0"; |
|
2212 | 2255 | doCheck = false; |
@@ -2283,6 +2326,20 b' self: super: {' | |||
|
2283 | 2326 | license = [ { fullName = "PSF or ZPL"; } ]; |
|
2284 | 2327 | }; |
|
2285 | 2328 | }; |
|
2329 | "zipp" = super.buildPythonPackage { | |
|
2330 | name = "zipp-0.6.0"; | |
|
2331 | doCheck = false; | |
|
2332 | propagatedBuildInputs = [ | |
|
2333 | self."more-itertools" | |
|
2334 | ]; | |
|
2335 | src = fetchurl { | |
|
2336 | url = "https://files.pythonhosted.org/packages/57/dd/585d728479d97d25aeeb9aa470d36a4ad8d0ba5610f84e14770128ce6ff7/zipp-0.6.0.tar.gz"; | |
|
2337 | sha256 = "13ndkf7vklw978a4gdl1yfvn8hch28429a0iam67sg4nrp5v261p"; | |
|
2338 | }; | |
|
2339 | meta = { | |
|
2340 | license = [ pkgs.lib.licenses.mit ]; | |
|
2341 | }; | |
|
2342 | }; | |
|
2286 | 2343 | "zope.cachedescriptors" = super.buildPythonPackage { |
|
2287 | 2344 | name = "zope.cachedescriptors-4.3.1"; |
|
2288 | 2345 | doCheck = false; |
@@ -9,8 +9,11 b' vcsserver_config_http = rhodecode/tests/' | |||
|
9 | 9 | |
|
10 | 10 | addopts = |
|
11 | 11 | --pdbcls=IPython.terminal.debugger:TerminalPdb |
|
12 | --strict-markers | |
|
12 | 13 | |
|
13 | 14 | markers = |
|
14 | 15 | vcs_operations: Mark tests depending on a running RhodeCode instance. |
|
15 | 16 | xfail_backends: Mark tests as xfail for given backends. |
|
16 | 17 | skip_backends: Mark tests as skipped for given backends. |
|
18 | backends: Mark backends | |
|
19 | dbs: database markers for running tests for given DB |
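Note on the pytest.ini hunk above: with ``--strict-markers`` pytest refuses to run tests that use a marker not declared under ``markers =``, which is why the new ``backends`` and ``dbs`` entries are registered here. A minimal sketch of how such a registered marker is used in a test (the test body and the ``dbs`` value are illustrative, not taken from the repository):

    import pytest

    @pytest.mark.backends("git", "hg")   # registered above, accepted under --strict-markers
    @pytest.mark.dbs("postgres")         # hypothetical database marker value
    def test_example_backend_operation(backend):
        assert backend is not None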
@@ -1,13 +1,10 b'' | |||
|
1 | 1 | ## dependencies |
|
2 | 2 | |
|
3 | amqp==2. |
|
4 | # not released authomatic that has updated some oauth providers | |
|
5 | https://code.rhodecode.com/upstream/authomatic/artifacts/download/0-4fe9c041-a567-4f84-be4c-7efa2a606d3c.tar.gz?md5=f6bdc3c769688212db68233e8d2b0383#egg=authomatic==0.1.0.post1 | |
|
6 | ||
|
3 | amqp==2.5.2 | |
|
7 | 4 | babel==1.3 |
|
8 | 5 | beaker==1.9.1 |
|
9 | 6 | bleach==3.1.0 |
|
10 | celery==4. |
|
7 | celery==4.3.0 | |
|
11 | 8 | channelstream==0.5.2 |
|
12 | 9 | click==7.0 |
|
13 | 10 | colander==1.7.0 |
@@ -16,9 +13,9 b' https://code.rhodecode.com/upstream/conf' | |||
|
16 | 13 | cssselect==1.0.3 |
|
17 | 14 | cryptography==2.6.1 |
|
18 | 15 | decorator==4.1.2 |
|
19 | deform==2.0. |
|
16 | deform==2.0.8 | |
|
20 | 17 | docutils==0.14.0 |
|
21 | dogpile.cache==0. |
|
18 | dogpile.cache==0.9.0 | |
|
22 | 19 | dogpile.core==0.4.1 |
|
23 | 20 | formencode==1.2.4 |
|
24 | 21 | future==0.14.3 |
@@ -26,55 +23,54 b' futures==3.0.2' | |||
|
26 | 23 | infrae.cache==1.0.1 |
|
27 | 24 | iso8601==0.1.12 |
|
28 | 25 | itsdangerous==0.24 |
|
29 | kombu==4. |
|
26 | kombu==4.6.6 | |
|
30 | 27 | lxml==4.2.5 |
|
31 | mako==1. |
|
28 | mako==1.1.0 | |
|
32 | 29 | markdown==2.6.11 |
|
33 | markupsafe==1.1. |
|
30 | markupsafe==1.1.1 | |
|
34 | 31 | msgpack-python==0.5.6 |
|
35 | pyotp==2. |
|
36 | packaging==1 |
|
37 | pathlib2==2.3. |
|
38 | paste==3. |
|
32 | pyotp==2.3.0 | |
|
33 | packaging==19.2 | |
|
34 | pathlib2==2.3.5 | |
|
35 | paste==3.2.1 | |
|
39 | 36 | pastedeploy==2.0.1 |
|
40 | pastescript==3. |
|
37 | pastescript==3.2.0 | |
|
41 | 38 | peppercorn==0.6 |
|
42 | psutil==5. |
|
39 | psutil==5.6.5 | |
|
43 | 40 | py-bcrypt==0.4 |
|
44 | pycurl==7.43.0. |
|
41 | pycurl==7.43.0.3 | |
|
45 | 42 | pycrypto==2.6.1 |
|
46 | 43 | pygments==2.4.2 |
|
47 | pyparsing==2. |
|
48 | pyramid-debugtoolbar==4.5. |
|
49 | pyramid-mako==1. |
|
44 | pyparsing==2.4.5 | |
|
45 | pyramid-debugtoolbar==4.5.1 | |
|
46 | pyramid-mako==1.1.0 | |
|
50 | 47 | pyramid==1.10.4 |
|
51 | 48 | pyramid_mailer==0.15.1 |
|
52 | python-dateutil | |
|
49 | python-dateutil==2.8.1 | |
|
53 | 50 | python-ldap==3.1.0 |
|
54 | 51 | python-memcached==1.59 |
|
55 | 52 | python-pam==1.8.4 |
|
56 | 53 | python-saml==2.4.2 |
|
57 | pytz==201 |
|
54 | pytz==2019.2 | |
|
58 | 55 | tzlocal==1.5.1 |
|
59 | 56 | pyzmq==14.6.0 |
|
60 | 57 | py-gfm==0.1.4 |
|
61 | redis== |
|
58 | redis==3.3.11 | |
|
62 | 59 | repoze.lru==0.7 |
|
63 | requests==2. |
|
60 | requests==2.22.0 | |
|
64 | 61 | routes==2.4.1 |
|
65 | 62 | simplejson==3.16.0 |
|
66 | 63 | six==1.11.0 |
|
67 | sqlalchemy==1. |
|
64 | sqlalchemy==1.3.11 | |
|
68 | 65 | sshpubkeys==3.1.0 |
|
69 | 66 | subprocess32==3.5.4 |
|
70 | supervisor==4. |
|
67 | supervisor==4.1.0 | |
|
71 | 68 | translationstring==1.3 |
|
72 | urllib3==1.2 |
|
69 | urllib3==1.25.2 | |
|
73 | 70 | urlobject==2.4.3 |
|
74 | 71 | venusian==1.2.0 |
|
75 | 72 | weberror==0.10.3 |
|
76 | 73 | webhelpers2==2.0 |
|
77 | webhelpers==1.3 | |
|
78 | 74 | webob==1.8.5 |
|
79 | 75 | whoosh==2.7.4 |
|
80 | 76 | wsgiref==0.1.2 |
@@ -87,17 +83,18 b' zope.interface==4.6.0' | |||
|
87 | 83 | mysql-python==1.2.5 |
|
88 | 84 | pymysql==0.8.1 |
|
89 | 85 | pysqlite==2.8.3 |
|
90 | psycopg2==2.8. |
|
86 | psycopg2==2.8.4 | |
|
91 | 87 | |
|
92 | 88 | # IPYTHON RENDERING |
|
93 | 89 | # entrypoints backport, pypi version doesn't support egg installs |
|
94 | 90 | https://code.rhodecode.com/upstream/entrypoints/artifacts/download/0-8e9ee9e4-c4db-409c-b07e-81568fd1832d.tar.gz?md5=3a027b8ff1d257b91fe257de6c43357d#egg=entrypoints==0.2.2.rhodecode-upstream1 |
|
95 | 91 | nbconvert==5.3.1 |
|
96 | 92 | nbformat==4.4.0 |
|
97 | jupyter |
|
93 | jupyter-client==5.0.0 | |
|
94 | jupyter-core==4.5.0 | |
|
98 | 95 | |
|
99 | 96 | ## cli tools |
|
100 | alembic==1. |
|
97 | alembic==1.3.1 | |
|
101 | 98 | invoke==0.13.0 |
|
102 | 99 | bumpversion==0.5.3 |
|
103 | 100 | |
@@ -105,14 +102,15 b' bumpversion==0.5.3' | |||
|
105 | 102 | gevent==1.4.0 |
|
106 | 103 | greenlet==0.4.15 |
|
107 | 104 | gunicorn==19.9.0 |
|
108 | waitress==1.3. |
|
105 | waitress==1.3.1 | |
|
109 | 106 | |
|
110 | 107 | ## debug |
|
111 | 108 | ipdb==0.12.0 |
|
112 | 109 | ipython==5.1.0 |
|
113 | 110 | |
|
114 | ## rhodecode-tools, special case | |
|
115 | https://code.rhodecode.com/rhodecode-tools-ce/artifacts/download/0-10ac93f4-bb7d-4b97-baea-68110743dd5a.tar.gz?md5=962dc77c06aceee62282b98d33149661#egg=rhodecode-tools==1.2.1 | |
|
111 | ## rhodecode-tools, special case, use file://PATH.tar.gz#egg=rhodecode-tools==X.Y.Z, to test local version | |
|
112 | https://code.rhodecode.com/rhodecode-tools-ce/artifacts/download/0-ed54e749-2ef5-4bc7-ae7f-7900e3c2aa15.tar.gz?sha256=76f024bad3a1e55fdb3d64f13f5b77ff21a12fee699918de2110fe21effd5a3a#egg=rhodecode-tools==1.4.0 | |
|
113 | ||
|
116 | 114 | |
|
117 | 115 | ## appenlight |
|
118 | 116 | appenlight-client==0.6.26 |
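The updated comment in this hunk documents how to point the requirement at a locally built rhodecode-tools archive instead of the hosted artifact. A hypothetical example of such a local override line (path and version are placeholders):

    file:///home/user/build/rhodecode-tools-1.4.0.tar.gz#egg=rhodecode-tools==1.4.0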
@@ -1,19 +1,27 b'' | |||
|
1 | 1 | # contains not directly required libraries we want to pin the version. |
|
2 | 2 | |
|
3 | atomicwrites==1. |
|
4 | attrs==1 |
|
5 | billiard==3.5.0.3 | |
|
3 | atomicwrites==1.3.0 | |
|
4 | attrs==19.3.0 | |
|
5 | asn1crypto==0.24.0 | |
|
6 | billiard==3.6.1.0 | |
|
7 | cffi==1.12.3 | |
|
6 | 8 | chameleon==2.24 |
|
7 | cffi==1.12.2 | |
|
9 | configparser==4.0.2 | |
|
10 | contextlib2==0.6.0.post1 | |
|
8 | 11 | ecdsa==0.13.2 |
|
9 | hupper==1.6.1 | |
|
10 | 12 | gnureadline==6.3.8 |
|
13 | hupper==1.9.1 | |
|
14 | ipaddress==1.0.23 | |
|
15 | importlib-metadata==0.23 | |
|
11 | 16 | jinja2==2.9.6 |
|
12 | 17 | jsonschema==2.6.0 |
|
18 | pluggy==0.13.1 | |
|
19 | pyasn1-modules==0.2.6 | |
|
13 | 20 | pyramid-jinja2==2.7 |
|
14 | pluggy==0.11.0 | |
|
15 | setproctitle==1.1.10 | |
|
16 | 21 | scandir==1.10.0 |
|
22 | setproctitle==1.1.10 | |
|
17 | 23 | tempita==0.5.2 |
|
24 | testpath==0.4.4 | |
|
25 | transaction==2.4.0 | |
|
18 | 26 | vine==1.3.0 |
|
19 | configparser==3.7.4 | |
|
27 | wcwidth==0.1.7 |
@@ -1,16 +1,16 b'' | |||
|
1 | 1 | # test related requirements |
|
2 | pytest== |
|
3 | py==1. |
|
4 | pytest-cov==2. |
|
5 | pytest-sugar==0.9. |
|
6 | pytest-runner== |
|
7 | pytest-profiling==1. |
|
8 | pytest-timeout==1.3. |
|
2 | pytest==4.6.5 | |
|
3 | py==1.8.0 | |
|
4 | pytest-cov==2.7.1 | |
|
5 | pytest-sugar==0.9.2 | |
|
6 | pytest-runner==5.1.0 | |
|
7 | pytest-profiling==1.7.0 | |
|
8 | pytest-timeout==1.3.3 | |
|
9 | 9 | gprof2dot==2017.9.19 |
|
10 | 10 | |
|
11 | mock== |
|
11 | mock==3.0.5 | |
|
12 | 12 | cov-core==1.15.0 |
|
13 | coverage==4.5. |
|
13 | coverage==4.5.4 | |
|
14 | 14 | |
|
15 | 15 | webtest==2.0.33 |
|
16 | 16 | beautifulsoup4==4.6.3 |
@@ -45,7 +45,7 b' PYRAMID_SETTINGS = {}' | |||
|
45 | 45 | EXTENSIONS = {} |
|
46 | 46 | |
|
47 | 47 | __version__ = ('.'.join((str(each) for each in VERSION[:3]))) |
|
48 | __dbversion__ = |
|
48 | __dbversion__ = 103 # defines current db version for migrations | |
|
49 | 49 | __platform__ = platform.system() |
|
50 | 50 | __license__ = 'AGPLv3, and Commercial License' |
|
51 | 51 | __author__ = 'RhodeCode GmbH' |
@@ -122,7 +122,7 b' def jsonrpc_response(request, result):' | |||
|
122 | 122 | return response |
|
123 | 123 | |
|
124 | 124 | |
|
125 | def jsonrpc_error(request, message, retid=None, code=None): | |
|
125 | def jsonrpc_error(request, message, retid=None, code=None, headers=None): | |
|
126 | 126 | """ |
|
127 | 127 | Generate a Response object with a JSON-RPC error body |
|
128 | 128 | |
@@ -132,10 +132,12 b' def jsonrpc_error(request, message, reti' | |||
|
132 | 132 | """ |
|
133 | 133 | err_dict = {'id': retid, 'result': None, 'error': message} |
|
134 | 134 | body = render(DEFAULT_RENDERER, err_dict, request=request).encode('utf-8') |
|
135 | ||
|
135 | 136 | return Response( |
|
136 | 137 | body=body, |
|
137 | 138 | status=code, |
|
138 | content_type='application/json' | |
|
139 | content_type='application/json', | |
|
140 | headerlist=headers | |
|
139 | 141 | ) |
|
140 | 142 | |
|
141 | 143 | |
@@ -287,8 +289,7 b' def request_view(request):' | |||
|
287 | 289 | }) |
|
288 | 290 | |
|
289 | 291 | # register some common functions for usage |
|
290 | attach_context_attributes( | |
|
291 | TemplateArgs(), request, request.rpc_user.user_id) | |
|
292 | attach_context_attributes(TemplateArgs(), request, request.rpc_user.user_id) | |
|
292 | 293 | |
|
293 | 294 | try: |
|
294 | 295 | ret_value = func(**call_params) |
@@ -298,9 +299,13 b' def request_view(request):' | |||
|
298 | 299 | except Exception: |
|
299 | 300 | log.exception('Unhandled exception occurred on api call: %s', func) |
|
300 | 301 | exc_info = sys.exc_info() |
|
301 | store_exception(id(exc_info), exc_info, prefix='rhodecode-api') | |
|
302 | exc_id, exc_type_name = store_exception( | |
|
303 | id(exc_info), exc_info, prefix='rhodecode-api') | |
|
304 | error_headers = [('RhodeCode-Exception-Id', str(exc_id)), | |
|
305 | ('RhodeCode-Exception-Type', str(exc_type_name))] | |
|
302 | 306 | return jsonrpc_error( |
|
303 | request, retid=request.rpc_id, message='Internal server error') |
|
307 | request, retid=request.rpc_id, message='Internal server error', | |
|
308 | headers=error_headers) | |
|
304 | 309 | |
|
305 | 310 | |
|
306 | 311 | def setup_request(request): |
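With the change above, an unhandled exception during an API call is stored server-side and the JSON-RPC error response now carries ``RhodeCode-Exception-Id`` and ``RhodeCode-Exception-Type`` headers, so the stored traceback can be located later. A minimal client-side sketch of reading those headers (host and token are placeholders):

    import requests

    payload = {'id': 1, 'auth_token': 'SECRET_TOKEN',
               'method': 'get_repo', 'args': {'repoid': 'my-repo'}}
    response = requests.post('https://rhodecode.example.com/_admin/api', json=payload)
    data = response.json()
    if data.get('error'):
        # hand these values to an administrator to locate the stored exception
        print(data['error'],
              response.headers.get('RhodeCode-Exception-Id'),
              response.headers.get('RhodeCode-Exception-Type'))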
@@ -333,6 +338,7 b' def setup_request(request):' | |||
|
333 | 338 | raise JSONRPCError("Content-Length is 0") |
|
334 | 339 | |
|
335 | 340 | raw_body = request.body |
|
341 | log.debug("Loading JSON body now") | |
|
336 | 342 | try: |
|
337 | 343 | json_body = json.loads(raw_body) |
|
338 | 344 | except ValueError as e: |
@@ -359,7 +365,7 b' def setup_request(request):' | |||
|
359 | 365 | request.rpc_params = json_body['args'] \ |
|
360 | 366 | if isinstance(json_body['args'], dict) else {} |
|
361 | 367 | |
|
362 | log.debug('method: %s, params: % |
|
368 | log.debug('method: %s, params: %.10240r', request.rpc_method, request.rpc_params) | |
|
363 | 369 | except KeyError as e: |
|
364 | 370 | raise JSONRPCError('Incorrect JSON data. Missing %s' % e) |
|
365 | 371 |
@@ -49,7 +49,7 b' class TestClosePullRequest(object):' | |||
|
49 | 49 | assert_ok(id_, expected, response.body) |
|
50 | 50 | journal = UserLog.query()\ |
|
51 | 51 | .filter(UserLog.user_id == author) \ |
|
52 | .order_by( |
|
52 | .order_by(UserLog.user_log_id.asc()) \ | |
|
53 | 53 | .filter(UserLog.repository_id == repo)\ |
|
54 | 54 | .all() |
|
55 | 55 | assert journal[-1].action == 'repo.pull_request.close' |
@@ -20,7 +20,7 b'' | |||
|
20 | 20 | |
|
21 | 21 | import pytest |
|
22 | 22 | |
|
23 | from rhodecode.model.db import ChangesetStatus | |
|
23 | from rhodecode.model.db import ChangesetStatus, User | |
|
24 | 24 | from rhodecode.api.tests.utils import ( |
|
25 | 25 | build_data, api_call, assert_error, assert_ok) |
|
26 | 26 | |
@@ -79,3 +79,38 b' class TestCommentCommit(object):' | |||
|
79 | 79 | 'success': True |
|
80 | 80 | } |
|
81 | 81 | assert_ok(id_, expected, given=response.body) |
|
82 | ||
|
83 | def test_api_comment_commit_with_extra_recipients(self, backend, user_util): | |
|
84 | ||
|
85 | commit_id = backend.repo.scm_instance().get_commit('tip').raw_id | |
|
86 | ||
|
87 | user1 = user_util.create_user() | |
|
88 | user1_id = user1.user_id | |
|
89 | user2 = user_util.create_user() | |
|
90 | user2_id = user2.user_id | |
|
91 | ||
|
92 | id_, params = build_data( | |
|
93 | self.apikey, 'comment_commit', repoid=backend.repo_name, | |
|
94 | commit_id=commit_id, | |
|
95 | message='abracadabra', | |
|
96 | extra_recipients=[user1.user_id, user2.username]) | |
|
97 | ||
|
98 | response = api_call(self.app, params) | |
|
99 | repo = backend.repo.scm_instance() | |
|
100 | ||
|
101 | expected = { | |
|
102 | 'msg': 'Commented on commit `%s` for repository `%s`' % ( | |
|
103 | repo.get_commit().raw_id, backend.repo_name), | |
|
104 | 'status_change': None, | |
|
105 | 'success': True | |
|
106 | } | |
|
107 | ||
|
108 | assert_ok(id_, expected, given=response.body) | |
|
109 | # check user1/user2 inbox for notification | |
|
110 | user1 = User.get(user1_id) | |
|
111 | assert 1 == len(user1.notifications) | |
|
112 | assert 'abracadabra' in user1.notifications[0].notification.body | |
|
113 | ||
|
114 | user2 = User.get(user2_id) | |
|
115 | assert 1 == len(user2.notifications) | |
|
116 | assert 'abracadabra' in user2.notifications[0].notification.body |
@@ -21,7 +21,7 b'' | |||
|
21 | 21 | import pytest |
|
22 | 22 | |
|
23 | 23 | from rhodecode.model.comment import CommentsModel |
|
24 | from rhodecode.model.db import UserLog | |
|
24 | from rhodecode.model.db import UserLog, User | |
|
25 | 25 | from rhodecode.model.pull_request import PullRequestModel |
|
26 | 26 | from rhodecode.tests import TEST_USER_ADMIN_LOGIN |
|
27 | 27 | from rhodecode.api.tests.utils import ( |
@@ -65,11 +65,48 b' class TestCommentPullRequest(object):' | |||
|
65 | 65 | journal = UserLog.query()\ |
|
66 | 66 | .filter(UserLog.user_id == author)\ |
|
67 | 67 | .filter(UserLog.repository_id == repo) \ |
|
68 |
.order_by( |
|
|
68 | .order_by(UserLog.user_log_id.asc()) \ | |
|
69 | 69 | .all() |
|
70 | 70 | assert journal[-1].action == 'repo.pull_request.comment.create' |
|
71 | 71 | |
|
72 | 72 | @pytest.mark.backends("git", "hg") |
|
73 | def test_api_comment_pull_request_with_extra_recipients(self, pr_util, user_util): | |
|
74 | pull_request = pr_util.create_pull_request() | |
|
75 | ||
|
76 | user1 = user_util.create_user() | |
|
77 | user1_id = user1.user_id | |
|
78 | user2 = user_util.create_user() | |
|
79 | user2_id = user2.user_id | |
|
80 | ||
|
81 | id_, params = build_data( | |
|
82 | self.apikey, 'comment_pull_request', | |
|
83 | repoid=pull_request.target_repo.repo_name, | |
|
84 | pullrequestid=pull_request.pull_request_id, | |
|
85 | message='test message', | |
|
86 | extra_recipients=[user1.user_id, user2.username] | |
|
87 | ) | |
|
88 | response = api_call(self.app, params) | |
|
89 | pull_request = PullRequestModel().get(pull_request.pull_request_id) | |
|
90 | ||
|
91 | comments = CommentsModel().get_comments( | |
|
92 | pull_request.target_repo.repo_id, pull_request=pull_request) | |
|
93 | ||
|
94 | expected = { | |
|
95 | 'pull_request_id': pull_request.pull_request_id, | |
|
96 | 'comment_id': comments[-1].comment_id, | |
|
97 | 'status': {'given': None, 'was_changed': None} | |
|
98 | } | |
|
99 | assert_ok(id_, expected, response.body) | |
|
100 | # check user1/user2 inbox for notification | |
|
101 | user1 = User.get(user1_id) | |
|
102 | assert 1 == len(user1.notifications) | |
|
103 | assert 'test message' in user1.notifications[0].notification.body | |
|
104 | ||
|
105 | user2 = User.get(user2_id) | |
|
106 | assert 1 == len(user2.notifications) | |
|
107 | assert 'test message' in user2.notifications[0].notification.body | |
|
108 | ||
|
109 | @pytest.mark.backends("git", "hg") | |
|
73 | 110 | def test_api_comment_pull_request_change_status( |
|
74 | 111 | self, pr_util, no_notifications): |
|
75 | 112 | pull_request = pr_util.create_pull_request() |
@@ -82,6 +82,7 b' class TestCreateUser(object):' | |||
|
82 | 82 | self.apikey, 'create_user', |
|
83 | 83 | username=username, |
|
84 | 84 | email=email, |
|
85 | description='CTO of Things', | |
|
85 | 86 | password='example') |
|
86 | 87 | response = api_call(self.app, params) |
|
87 | 88 |
@@ -27,6 +27,16 b' from rhodecode.api.tests.utils import (' | |||
|
27 | 27 | @pytest.mark.usefixtures("testuser_api", "app") |
|
28 | 28 | class TestApiSearch(object): |
|
29 | 29 | |
|
30 | @pytest.mark.parametrize("sort_dir", [ | |
|
31 | "asc", | |
|
32 | "desc", | |
|
33 | ]) | |
|
34 | @pytest.mark.parametrize("sort", [ | |
|
35 | "xxx", | |
|
36 | "author_email", | |
|
37 | "date", | |
|
38 | "message", | |
|
39 | ]) | |
|
30 | 40 | @pytest.mark.parametrize("query, expected_hits, expected_paths", [ |
|
31 | 41 | ('todo', 23, [ |
|
32 | 42 | 'vcs/backends/hg/inmemory.py', |
@@ -55,10 +65,11 b' class TestApiSearch(object):' | |||
|
55 | 65 | 'vcs/tests/test_cli.py']), |
|
56 | 66 | ('owner:michał test', 0, []), |
|
57 | 67 | ]) |
|
58 | def test_search_content_results(self, query, expected_hits, expected_paths): | |
|
68 | def test_search_content_results(self, sort_dir, sort, query, expected_hits, expected_paths): | |
|
59 | 69 | id_, params = build_data( |
|
60 | 70 | self.apikey_regular, 'search', |
|
61 | 71 | search_query=query, |
|
72 | search_sort='{}:{}'.format(sort_dir, sort), | |
|
62 | 73 | search_type='content') |
|
63 | 74 | |
|
64 | 75 | response = api_call(self.app, params) |
@@ -70,6 +81,16 b' class TestApiSearch(object):' | |||
|
70 | 81 | for expected_path in expected_paths: |
|
71 | 82 | assert expected_path in paths |
|
72 | 83 | |
|
84 | @pytest.mark.parametrize("sort_dir", [ | |
|
85 | "asc", | |
|
86 | "desc", | |
|
87 | ]) | |
|
88 | @pytest.mark.parametrize("sort", [ | |
|
89 | "xxx", | |
|
90 | "date", | |
|
91 | "file", | |
|
92 | "size", | |
|
93 | ]) | |
|
73 | 94 | @pytest.mark.parametrize("query, expected_hits, expected_paths", [ |
|
74 | 95 | ('readme.rst', 3, []), |
|
75 | 96 | ('test*', 75, []), |
@@ -77,10 +98,11 b' class TestApiSearch(object):' | |||
|
77 | 98 | ('extension:rst', 48, []), |
|
78 | 99 | ('extension:rst api', 24, []), |
|
79 | 100 | ]) |
|
80 | def test_search_file_paths(self, query, expected_hits, expected_paths): | |
|
101 | def test_search_file_paths(self, sort_dir, sort, query, expected_hits, expected_paths): | |
|
81 | 102 | id_, params = build_data( |
|
82 | 103 | self.apikey_regular, 'search', |
|
83 | 104 | search_query=query, |
|
105 | search_sort='{}:{}'.format(sort_dir, sort), | |
|
84 | 106 | search_type='path') |
|
85 | 107 | |
|
86 | 108 | response = api_call(self.app, params) |
@@ -50,6 +50,7 b' class TestGetMethod(object):' | |||
|
50 | 50 | {'apiuser': '<RequiredType>', |
|
51 | 51 | 'comment_type': "<Optional:u'note'>", |
|
52 | 52 | 'commit_id': '<RequiredType>', |
|
53 | 'extra_recipients': '<Optional:[]>', | |
|
53 | 54 | 'message': '<RequiredType>', |
|
54 | 55 | 'repoid': '<RequiredType>', |
|
55 | 56 | 'request': '<RequiredType>', |
@@ -54,7 +54,7 b' class TestGetRepoChangeset(object):' | |||
|
54 | 54 | details=details, |
|
55 | 55 | ) |
|
56 | 56 | response = api_call(self.app, params) |
|
57 | expected = ' |
|
57 | expected = "commit_id must be a string value got <type 'int'> instead" | |
|
58 | 58 | assert_error(id_, expected, given=response.body) |
|
59 | 59 | |
|
60 | 60 | @pytest.mark.parametrize("details", ['basic', 'extended', 'full']) |
@@ -137,5 +137,5 b' class TestGetRepoChangeset(object):' | |||
|
137 | 137 | details=details, |
|
138 | 138 | ) |
|
139 | 139 | response = api_call(self.app, params) |
|
140 | expected = ' |
|
140 | expected = "commit_id must be a string value got <type 'int'> instead" | |
|
141 | 141 | assert_error(id_, expected, given=response.body) |
@@ -31,36 +31,38 b' from rhodecode.api.tests.utils import (' | |||
|
31 | 31 | @pytest.fixture() |
|
32 | 32 | def make_repo_comments_factory(request): |
|
33 | 33 | |
|
34 | def maker(repo): | |
|
35 | user = User.get_first_super_admin() | |
|
36 | commit = repo.scm_instance()[0] | |
|
34 | class Make(object): | |
|
35 | ||
|
36 | def make_comments(self, repo): | |
|
37 | user = User.get_first_super_admin() | |
|
38 | commit = repo.scm_instance()[0] | |
|
37 | 39 | |
|
38 | commit_id = commit.raw_id | |
|
39 | file_0 = commit.affected_files[0] | |
|
40 | comments = [] | |
|
40 | commit_id = commit.raw_id | |
|
41 | file_0 = commit.affected_files[0] | |
|
42 | comments = [] | |
|
41 | 43 | |
|
42 | # general | |
|
43 | CommentsModel().create( | |
|
44 | text='General Comment', repo=repo, user=user, commit_id=commit_id, | |
|
45 | comment_type=ChangesetComment.COMMENT_TYPE_NOTE, send_email=False) | |
|
44 | # general | |
|
45 | CommentsModel().create( | |
|
46 | text='General Comment', repo=repo, user=user, commit_id=commit_id, | |
|
47 | comment_type=ChangesetComment.COMMENT_TYPE_NOTE, send_email=False) | |
|
46 | 48 | |
|
47 | # inline | |
|
48 | CommentsModel().create( | |
|
49 | text='Inline Comment', repo=repo, user=user, commit_id=commit_id, | |
|
50 | f_path=file_0, line_no='n1', | |
|
51 | comment_type=ChangesetComment.COMMENT_TYPE_NOTE, send_email=False) | |
|
49 | # inline | |
|
50 | CommentsModel().create( | |
|
51 | text='Inline Comment', repo=repo, user=user, commit_id=commit_id, | |
|
52 | f_path=file_0, line_no='n1', | |
|
53 | comment_type=ChangesetComment.COMMENT_TYPE_NOTE, send_email=False) | |
|
52 | 54 | |
|
53 | # todo | |
|
54 | CommentsModel().create( | |
|
55 | text='INLINE TODO Comment', repo=repo, user=user, commit_id=commit_id, | |
|
56 | f_path=file_0, line_no='n1', | |
|
57 | comment_type=ChangesetComment.COMMENT_TYPE_TODO, send_email=False) | |
|
55 | # todo | |
|
56 | CommentsModel().create( | |
|
57 | text='INLINE TODO Comment', repo=repo, user=user, commit_id=commit_id, | |
|
58 | f_path=file_0, line_no='n1', | |
|
59 | comment_type=ChangesetComment.COMMENT_TYPE_TODO, send_email=False) | |
|
58 | 60 | |
|
59 | @request.addfinalizer | |
|
60 | def cleanup(): | |
|
61 | for comment in comments: | |
|
62 | Session().delete(comment) | |
|
63 | return maker |
|
61 | @request.addfinalizer | |
|
62 | def cleanup(): | |
|
63 | for comment in comments: | |
|
64 | Session().delete(comment) | |
|
65 | return Make() | |
|
64 | 66 | |
|
65 | 67 | |
|
66 | 68 | @pytest.mark.usefixtures("testuser_api", "app") |
@@ -76,7 +78,7 b' class TestGetRepo(object):' | |||
|
76 | 78 | make_repo_comments_factory, filters, expected_count): |
|
77 | 79 | commits = [{'message': 'A'}, {'message': 'B'}] |
|
78 | 80 | repo = backend.create_repo(commits=commits) |
|
79 | make_repo_comments_factory(repo) | |
|
81 | make_repo_comments_factory.make_comments(repo) | |
|
80 | 82 | |
|
81 | 83 | api_call_params = {'repoid': repo.repo_name,} |
|
82 | 84 | api_call_params.update(filters) |
@@ -92,12 +94,13 b' class TestGetRepo(object):' | |||
|
92 | 94 | |
|
93 | 95 | assert len(result) == expected_count |
|
94 | 96 | |
|
95 | def test_api_get_repo_comments_wrong_comment_typ( |
|
97 | def test_api_get_repo_comments_wrong_comment_type( | |
|
98 | self, make_repo_comments_factory, backend_hg): | |
|
99 | commits = [{'message': 'A'}, {'message': 'B'}] | |
|
100 | repo = backend_hg.create_repo(commits=commits) | |
|
101 | make_repo_comments_factory.make_comments(repo) | |
|
96 | 102 | |
|
97 | repo = backend_hg.create_repo() | |
|
98 | make_repo_comments_factory(repo) | |
|
99 | ||
|
100 | api_call_params = {'repoid': repo.repo_name,} | |
|
103 | api_call_params = {'repoid': repo.repo_name} | |
|
101 | 104 | api_call_params.update({'comment_type': 'bogus'}) |
|
102 | 105 | |
|
103 | 106 | expected = 'comment_type must be one of `{}` got {}'.format( |
@@ -25,7 +25,7 b' from rhodecode.model.scm import ScmModel' | |||
|
25 | 25 | from rhodecode.api.tests.utils import build_data, api_call, assert_ok |
|
26 | 26 | |
|
27 | 27 | |
|
28 | @pytest.fixture | |
|
28 | @pytest.fixture() | |
|
29 | 29 | def http_host_stub(): |
|
30 | 30 | """ |
|
31 | 31 | To ensure that we can get an IP address, this test shall run with a |
@@ -120,7 +120,7 b' class TestMergePullRequest(object):' | |||
|
120 | 120 | journal = UserLog.query()\ |
|
121 | 121 | .filter(UserLog.user_id == author)\ |
|
122 | 122 | .filter(UserLog.repository_id == repo) \ |
|
123 | .order_by( |
|
123 | .order_by(UserLog.user_log_id.asc()) \ | |
|
124 | 124 | .all() |
|
125 | 125 | assert journal[-2].action == 'repo.pull_request.merge' |
|
126 | 126 | assert journal[-1].action == 'repo.pull_request.close' |
@@ -221,7 +221,7 b' class TestMergePullRequest(object):' | |||
|
221 | 221 | journal = UserLog.query() \ |
|
222 | 222 | .filter(UserLog.user_id == merge_user_id) \ |
|
223 | 223 | .filter(UserLog.repository_id == repo) \ |
|
224 | .order_by( |
|
224 | .order_by(UserLog.user_log_id.asc()) \ | |
|
225 | 225 | .all() |
|
226 | 226 | assert journal[-2].action == 'repo.pull_request.merge' |
|
227 | 227 | assert journal[-1].action == 'repo.pull_request.close' |
@@ -42,7 +42,8 b' class TestUpdateUser(object):' | |||
|
42 | 42 | ('extern_name', None), |
|
43 | 43 | ('active', False), |
|
44 | 44 | ('active', True), |
|
45 | ('password', 'newpass') | |
|
45 | ('password', 'newpass'), | |
|
46 | ('description', 'CTO 4 Life') | |
|
46 | 47 | ]) |
|
47 | 48 | def test_api_update_user(self, name, expected, user_util): |
|
48 | 49 | usr = user_util.create_user() |
@@ -20,11 +20,16 b'' | |||
|
20 | 20 | |
|
21 | 21 | |
|
22 | 22 | import random |
|
23 | import pytest | |
|
23 | 24 | |
|
24 | 25 | from rhodecode.api.utils import get_origin |
|
25 | 26 | from rhodecode.lib.ext_json import json |
|
26 | 27 | |
|
27 | 28 | |
|
29 | def jsonify(obj): | |
|
30 | return json.loads(json.dumps(obj)) | |
|
31 | ||
|
32 | ||
|
28 | 33 | API_URL = '/_admin/api' |
|
29 | 34 | |
|
30 | 35 | |
@@ -42,12 +47,16 b' def assert_call_ok(id_, given):' | |||
|
42 | 47 | |
|
43 | 48 | |
|
44 | 49 | def assert_ok(id_, expected, given): |
|
50 | given = json.loads(given) | |
|
51 | if given.get('error'): | |
|
52 | pytest.fail("Unexpected ERROR in success response: {}".format(given['error'])) | |
|
53 | ||
|
45 | 54 | expected = jsonify({ |
|
46 | 55 | 'id': id_, |
|
47 | 56 | 'error': None, |
|
48 | 57 | 'result': expected |
|
49 | 58 | }) |
|
50 | given = json.loads(given) | |
|
59 | ||
|
51 | 60 | assert expected == given |
|
52 | 61 | |
|
53 | 62 | |
@@ -61,10 +70,6 b' def assert_error(id_, expected, given):' | |||
|
61 | 70 | assert expected == given |
|
62 | 71 | |
|
63 | 72 | |
|
64 | def jsonify(obj): | |
|
65 | return json.loads(json.dumps(obj)) | |
|
66 | ||
|
67 | ||
|
68 | 73 | def build_data(apikey, method, **kw): |
|
69 | 74 | """ |
|
70 | 75 | Builds API data with given random ID |
@@ -451,7 +451,7 b' def comment_pull_request(' | |||
|
451 | 451 | request, apiuser, pullrequestid, repoid=Optional(None), |
|
452 | 452 | message=Optional(None), commit_id=Optional(None), status=Optional(None), |
|
453 | 453 | comment_type=Optional(ChangesetComment.COMMENT_TYPE_NOTE), |
|
454 | resolves_comment_id=Optional(None), | |
|
454 | resolves_comment_id=Optional(None), extra_recipients=Optional([]), | |
|
455 | 455 | userid=Optional(OAttr('apiuser'))): |
|
456 | 456 | """ |
|
457 | 457 | Comment on the pull request specified with the `pullrequestid`, |
@@ -476,6 +476,11 b' def comment_pull_request(' | |||
|
476 | 476 | :type status: str |
|
477 | 477 | :param comment_type: Comment type, one of: 'note', 'todo' |
|
478 | 478 | :type comment_type: Optional(str), default: 'note' |
|
479 | :param resolves_comment_id: id of comment which this one will resolve | |
|
480 | :type resolves_comment_id: Optional(int) | |
|
481 | :param extra_recipients: list of user ids or usernames to add | |
|
482 | notifications for this comment. Acts like a CC for notification | |
|
483 | :type extra_recipients: Optional(list) | |
|
479 | 484 | :param userid: Comment on the pull request as this user |
|
480 | 485 | :type userid: Optional(str or int) |
|
481 | 486 | |
@@ -521,6 +526,7 b' def comment_pull_request(' | |||
|
521 | 526 | commit_id = Optional.extract(commit_id) |
|
522 | 527 | comment_type = Optional.extract(comment_type) |
|
523 | 528 | resolves_comment_id = Optional.extract(resolves_comment_id) |
|
529 | extra_recipients = Optional.extract(extra_recipients) | |
|
524 | 530 | |
|
525 | 531 | if not message and not status: |
|
526 | 532 | raise JSONRPCError( |
@@ -580,7 +586,8 b' def comment_pull_request(' | |||
|
580 | 586 | renderer=renderer, |
|
581 | 587 | comment_type=comment_type, |
|
582 | 588 | resolves_comment_id=resolves_comment_id, |
|
583 | auth_user=auth_user | |
|
589 | auth_user=auth_user, | |
|
590 | extra_recipients=extra_recipients | |
|
584 | 591 | ) |
|
585 | 592 | |
|
586 | 593 | if allowed_to_change_status and status: |
@@ -888,7 +895,9 b' def update_pull_request(' | |||
|
888 | 895 | |
|
889 | 896 | with pull_request.set_state(PullRequest.STATE_UPDATING): |
|
890 | 897 | if PullRequestModel().has_valid_update_type(pull_request): |
|
891 | update_response = PullRequestModel().update_commits(pull_request) | |
|
898 | db_user = apiuser.get_instance() | |
|
899 | update_response = PullRequestModel().update_commits( | |
|
900 | pull_request, db_user) | |
|
892 | 901 | commit_changes = update_response.changes or commit_changes |
|
893 | 902 | Session().commit() |
|
894 | 903 |
@@ -617,9 +617,7 b' def get_repo_fts_tree(request, apiuser, ' | |||
|
617 | 617 | cache_namespace_uid = 'cache_repo.{}'.format(repo_id) |
|
618 | 618 | region = rc_cache.get_or_create_region('cache_repo', cache_namespace_uid) |
|
619 | 619 | |
|
620 | @region.conditional_cache_on_arguments(namespace=cache_namespace_uid, | |
|
621 | condition=cache_on) | |
|
622 | def compute_fts_tree(repo_id, commit_id, root_path, cache_ver): | |
|
620 | def compute_fts_tree(cache_ver, repo_id, commit_id, root_path): | |
|
623 | 621 | return ScmModel().get_fts_data(repo_id, commit_id, root_path) |
|
624 | 622 | |
|
625 | 623 | try: |
@@ -640,7 +638,7 b' def get_repo_fts_tree(request, apiuser, ' | |||
|
640 | 638 | 'with caching: %s[TTL: %ss]' % ( |
|
641 | 639 | repo_id, commit_id, cache_on, cache_seconds or 0)) |
|
642 | 640 | |
|
643 | tree_files = compute_fts_tree(repo_id, commit_id, root_path, cache_ver) |
|
641 | tree_files = compute_fts_tree(rc_cache.FILE_TREE_CACHE_VER, repo_id, commit_id, root_path) | |
|
644 | 642 | return tree_files |
|
645 | 643 | |
|
646 | 644 | except Exception: |
@@ -714,7 +712,7 b' def create_repo(' | |||
|
714 | 712 | private=Optional(False), |
|
715 | 713 | clone_uri=Optional(None), |
|
716 | 714 | push_uri=Optional(None), |
|
717 | landing_rev=Optional( |
|
715 | landing_rev=Optional(None), | |
|
718 | 716 | enable_statistics=Optional(False), |
|
719 | 717 | enable_locking=Optional(False), |
|
720 | 718 | enable_downloads=Optional(False), |
@@ -749,7 +747,7 b' def create_repo(' | |||
|
749 | 747 | :type clone_uri: str |
|
750 | 748 | :param push_uri: set push_uri |
|
751 | 749 | :type push_uri: str |
|
752 | :param landing_rev: <rev_type>:<rev> | |
|
750 | :param landing_rev: <rev_type>:<rev>, e.g branch:default, book:dev, rev:abcd | |
|
753 | 751 | :type landing_rev: str |
|
754 | 752 | :param enable_locking: |
|
755 | 753 | :type enable_locking: bool |
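As the updated docstring notes, ``landing_rev`` takes a ``<rev_type>:<rev>`` value and can now be omitted so the backend default landing ref is used. Illustrative argument values only (not taken from the repository):

    # hypothetical create_repo API arguments
    args = {'repo_name': 'my-group/my-repo',
            'repo_type': 'hg',
            'landing_rev': 'branch:default'}  # or 'book:dev', 'rev:abcd'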
@@ -793,7 +791,6 b' def create_repo(' | |||
|
793 | 791 | copy_permissions = Optional.extract(copy_permissions) |
|
794 | 792 | clone_uri = Optional.extract(clone_uri) |
|
795 | 793 | push_uri = Optional.extract(push_uri) |
|
796 | landing_commit_ref = Optional.extract(landing_rev) | |
|
797 | 794 | |
|
798 | 795 | defs = SettingsModel().get_default_repo_settings(strip_prefix=True) |
|
799 | 796 | if isinstance(private, Optional): |
@@ -807,8 +804,15 b' def create_repo(' | |||
|
807 | 804 | if isinstance(enable_downloads, Optional): |
|
808 | 805 | enable_downloads = defs.get('repo_enable_downloads') |
|
809 | 806 | |
|
807 | landing_ref, _label = ScmModel.backend_landing_ref(repo_type) | |
|
808 | ref_choices, _labels = ScmModel().get_repo_landing_revs(request.translate) | |
|
809 | ref_choices = list(set(ref_choices + [landing_ref])) | |
|
810 | ||
|
811 | landing_commit_ref = Optional.extract(landing_rev) or landing_ref | |
|
812 | ||
|
810 | 813 | schema = repo_schema.RepoSchema().bind( |
|
811 | 814 | repo_type_options=rhodecode.BACKENDS.keys(), |
|
815 | repo_ref_options=ref_choices, | |
|
812 | 816 | repo_type=repo_type, |
|
813 | 817 | # user caller |
|
814 | 818 | user=apiuser) |
@@ -958,7 +962,7 b' def update_repo(' | |||
|
958 | 962 | owner=Optional(OAttr('apiuser')), description=Optional(''), |
|
959 | 963 | private=Optional(False), |
|
960 | 964 | clone_uri=Optional(None), push_uri=Optional(None), |
|
961 |
landing_rev=Optional( |
|
|
965 | landing_rev=Optional(None), fork_of=Optional(None), | |
|
962 | 966 | enable_statistics=Optional(False), |
|
963 | 967 | enable_locking=Optional(False), |
|
964 | 968 | enable_downloads=Optional(False), fields=Optional('')): |
@@ -993,7 +997,7 b' def update_repo(' | |||
|
993 | 997 | :type private: bool |
|
994 | 998 | :param clone_uri: Update the |repo| clone URI. |
|
995 | 999 | :type clone_uri: str |
|
996 | :param landing_rev: Set the |repo| landing revision. |
|
1000 | :param landing_rev: Set the |repo| landing revision. e.g branch:default, book:dev, rev:abcd | |
|
997 | 1001 | :type landing_rev: str |
|
998 | 1002 | :param enable_statistics: Enable statistics on the |repo|, (True | False). |
|
999 | 1003 | :type enable_statistics: bool |
@@ -1049,8 +1053,10 b' def update_repo(' | |||
|
1049 | 1053 | repo_enable_downloads=enable_downloads |
|
1050 | 1054 | if not isinstance(enable_downloads, Optional) else repo.enable_downloads) |
|
1051 | 1055 | |
|
1056 | landing_ref, _label = ScmModel.backend_landing_ref(repo.repo_type) | |
|
1052 | 1057 | ref_choices, _labels = ScmModel().get_repo_landing_revs( |
|
1053 | 1058 | request.translate, repo=repo) |
|
1059 | ref_choices = list(set(ref_choices + [landing_ref])) | |
|
1054 | 1060 | |
|
1055 | 1061 | old_values = repo.get_api_data() |
|
1056 | 1062 | repo_type = repo.repo_type |
@@ -1128,7 +1134,7 b' def fork_repo(request, apiuser, repoid, ' | |||
|
1128 | 1134 | description=Optional(''), |
|
1129 | 1135 | private=Optional(False), |
|
1130 | 1136 | clone_uri=Optional(None), |
|
1131 | landing_rev=Optional( |
|
1137 | landing_rev=Optional(None), | |
|
1132 | 1138 | copy_permissions=Optional(False)): |
|
1133 | 1139 | """ |
|
1134 | 1140 | Creates a fork of the specified |repo|. |
@@ -1158,7 +1164,7 b' def fork_repo(request, apiuser, repoid, ' | |||
|
1158 | 1164 | :type copy_permissions: bool |
|
1159 | 1165 | :param private: Make the fork private. The default is False. |
|
1160 | 1166 | :type private: bool |
|
1161 | :param landing_rev: Set the landing revision. |
|
1167 | :param landing_rev: Set the landing revision. E.g branch:default, book:dev, rev:abcd | |
|
1162 | 1168 | |
|
1163 | 1169 | Example output: |
|
1164 | 1170 | |
@@ -1210,11 +1216,17 b' def fork_repo(request, apiuser, repoid, ' | |||
|
1210 | 1216 | description = Optional.extract(description) |
|
1211 | 1217 | copy_permissions = Optional.extract(copy_permissions) |
|
1212 | 1218 | clone_uri = Optional.extract(clone_uri) |
|
1213 | landing_commit_ref = Optional.extract(landing_rev) | |
|
1219 | ||
|
1220 | landing_ref, _label = ScmModel.backend_landing_ref(repo.repo_type) | |
|
1221 | ref_choices, _labels = ScmModel().get_repo_landing_revs(request.translate) | |
|
1222 | ref_choices = list(set(ref_choices + [landing_ref])) | |
|
1223 | landing_commit_ref = Optional.extract(landing_rev) or landing_ref | |
|
1224 | ||
|
1214 | 1225 | private = Optional.extract(private) |
|
1215 | 1226 | |
|
1216 | 1227 | schema = repo_schema.RepoSchema().bind( |
|
1217 | 1228 | repo_type_options=rhodecode.BACKENDS.keys(), |
|
1229 | repo_ref_options=ref_choices, | |
|
1218 | 1230 | repo_type=repo.repo_type, |
|
1219 | 1231 | # user caller |
|
1220 | 1232 | user=apiuser) |
@@ -1538,7 +1550,7 b' def lock(request, apiuser, repoid, locke' | |||
|
1538 | 1550 | def comment_commit( |
|
1539 | 1551 | request, apiuser, repoid, commit_id, message, status=Optional(None), |
|
1540 | 1552 | comment_type=Optional(ChangesetComment.COMMENT_TYPE_NOTE), |
|
1541 | resolves_comment_id=Optional(None), | |
|
1553 | resolves_comment_id=Optional(None), extra_recipients=Optional([]), | |
|
1542 | 1554 | userid=Optional(OAttr('apiuser'))): |
|
1543 | 1555 | """ |
|
1544 | 1556 | Set a commit comment, and optionally change the status of the commit. |
@@ -1556,6 +1568,11 b' def comment_commit(' | |||
|
1556 | 1568 | :type status: str |
|
1557 | 1569 | :param comment_type: Comment type, one of: 'note', 'todo' |
|
1558 | 1570 | :type comment_type: Optional(str), default: 'note' |
|
1571 | :param resolves_comment_id: id of comment which this one will resolve | |
|
1572 | :type resolves_comment_id: Optional(int) | |
|
1573 | :param extra_recipients: list of user ids or usernames to add | |
|
1574 | notifications for this comment. Acts like a CC for notification | |
|
1575 | :type extra_recipients: Optional(list) | |
|
1559 | 1576 | :param userid: Set the user name of the comment creator. |
|
1560 | 1577 | :type userid: Optional(str or int) |
|
1561 | 1578 | |
@@ -1592,6 +1609,7 b' def comment_commit(' | |||
|
1592 | 1609 | status = Optional.extract(status) |
|
1593 | 1610 | comment_type = Optional.extract(comment_type) |
|
1594 | 1611 | resolves_comment_id = Optional.extract(resolves_comment_id) |
|
1612 | extra_recipients = Optional.extract(extra_recipients) | |
|
1595 | 1613 | |
|
1596 | 1614 | allowed_statuses = [x[0] for x in ChangesetStatus.STATUSES] |
|
1597 | 1615 | if status and status not in allowed_statuses: |
@@ -1620,7 +1638,8 b' def comment_commit(' | |||
|
1620 | 1638 | renderer=renderer, |
|
1621 | 1639 | comment_type=comment_type, |
|
1622 | 1640 | resolves_comment_id=resolves_comment_id, |
|
1623 | auth_user=apiuser | |
|
1641 | auth_user=apiuser, | |
|
1642 | extra_recipients=extra_recipients | |
|
1624 | 1643 | ) |
|
1625 | 1644 | if status: |
|
1626 | 1645 | # also do a status change |
@@ -33,7 +33,7 b' log = logging.getLogger(__name__)' | |||
|
33 | 33 | |
|
34 | 34 | @jsonrpc_method() |
|
35 | 35 | def search(request, apiuser, search_query, search_type, page_limit=Optional(10), |
|
36 |
page=Optional(1), search_sort=Optional(' |
|
|
36 | page=Optional(1), search_sort=Optional('desc:date'), | |
|
37 | 37 | repo_name=Optional(None), repo_group_name=Optional(None)): |
|
38 | 38 | """ |
|
39 | 39 | Fetch Full Text Search results using API. |
@@ -51,9 +51,15 b' def search(request, apiuser, search_quer' | |||
|
51 | 51 | :type page_limit: Optional(int) |
|
52 | 52 | :param page: Page number. Default first page. |
|
53 | 53 | :type page: Optional(int) |
|
54 | :param search_sort: Search sort order. |
|
55 | * newfirst | |
|
56 | * oldfirst | |
|
54 | :param search_sort: Search sort order.Must start with asc: or desc: Default desc:date. | |
|
55 | The following are valid options: | |
|
56 | * asc|desc:message.raw | |
|
57 | * asc|desc:date | |
|
58 | * asc|desc:author.email.raw | |
|
59 | * asc|desc:message.raw | |
|
60 | * newfirst (old legacy equal to desc:date) | |
|
61 | * oldfirst (old legacy equal to asc:date) | |
|
62 | ||
|
57 | 63 | :type search_sort: Optional(str) |
|
58 | 64 | :param repo_name: Filter by one repo. Default is all. |
|
59 | 65 | :type repo_name: Optional(str) |
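A hedged example of the ``search_sort`` values accepted after this change (only the relevant arguments are shown):

    # hypothetical search API arguments
    args = {'search_query': 'todo',
            'search_type': 'content',
            'search_sort': 'desc:date'}  # or e.g. 'asc:author.email.raw', legacy 'newfirst'/'oldfirst'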
@@ -101,7 +107,7 b' def search(request, apiuser, search_quer' | |||
|
101 | 107 | searcher.cleanup() |
|
102 | 108 | |
|
103 | 109 | if not search_result['error']: |
|
104 |
data['execution_time'] = '%s results (%. |
|
|
110 | data['execution_time'] = '%s results (%.4f seconds)' % ( | |
|
105 | 111 | search_result['count'], |
|
106 | 112 | search_result['runtime']) |
|
107 | 113 | else: |
@@ -75,6 +75,7 b' def get_user(request, apiuser, userid=Op' | |||
|
75 | 75 | "extern_name": "rhodecode", |
|
76 | 76 | "extern_type": "rhodecode", |
|
77 | 77 | "firstname": "username", |
|
78 | "description": "user description", | |
|
78 | 79 | "ip_addresses": [], |
|
79 | 80 | "language": null, |
|
80 | 81 | "last_login": "Timestamp", |
@@ -159,7 +160,7 b' def get_users(request, apiuser):' | |||
|
159 | 160 | |
|
160 | 161 | @jsonrpc_method() |
|
161 | 162 | def create_user(request, apiuser, username, email, password=Optional(''), |
|
162 | firstname=Optional(''), lastname=Optional(''), | |
|
163 | firstname=Optional(''), lastname=Optional(''), description=Optional(''), | |
|
163 | 164 | active=Optional(True), admin=Optional(False), |
|
164 | 165 | extern_name=Optional('rhodecode'), |
|
165 | 166 | extern_type=Optional('rhodecode'), |
@@ -185,6 +186,8 b' def create_user(request, apiuser, userna' | |||
|
185 | 186 | :type firstname: Optional(str) |
|
186 | 187 | :param lastname: Set the new user surname. |
|
187 | 188 | :type lastname: Optional(str) |
|
189 | :param description: Set user description, or short bio. Metatags are allowed. | |
|
190 | :type description: Optional(str) | |
|
188 | 191 | :param active: Set the user as active. |
|
189 | 192 | :type active: Optional(``True`` | ``False``) |
|
190 | 193 | :param admin: Give the new user admin rights. |
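A short sketch of the new ``description`` field in a create_user call (values are placeholders; 'CTO of Things' mirrors the updated API test above):

    # hypothetical create_user API arguments
    args = {'username': 'new-user',
            'email': 'new-user@example.com',
            'password': 'example',
            'description': 'CTO of Things'}  # new optional field, metatags allowed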
@@ -250,6 +253,7 b' def create_user(request, apiuser, userna' | |||
|
250 | 253 | email = Optional.extract(email) |
|
251 | 254 | first_name = Optional.extract(firstname) |
|
252 | 255 | last_name = Optional.extract(lastname) |
|
256 | description = Optional.extract(description) | |
|
253 | 257 | active = Optional.extract(active) |
|
254 | 258 | admin = Optional.extract(admin) |
|
255 | 259 | extern_type = Optional.extract(extern_type) |
@@ -267,6 +271,7 b' def create_user(request, apiuser, userna' | |||
|
267 | 271 | last_name=last_name, |
|
268 | 272 | active=active, |
|
269 | 273 | admin=admin, |
|
274 | description=description, | |
|
270 | 275 | extern_type=extern_type, |
|
271 | 276 | extern_name=extern_name, |
|
272 | 277 | )) |
@@ -280,6 +285,7 b' def create_user(request, apiuser, userna' | |||
|
280 | 285 | email=schema_data['email'], |
|
281 | 286 | firstname=schema_data['first_name'], |
|
282 | 287 | lastname=schema_data['last_name'], |
|
288 | description=schema_data['description'], | |
|
283 | 289 | active=schema_data['active'], |
|
284 | 290 | admin=schema_data['admin'], |
|
285 | 291 | extern_type=schema_data['extern_type'], |
@@ -307,7 +313,7 b' def create_user(request, apiuser, userna' | |||
|
307 | 313 | def update_user(request, apiuser, userid, username=Optional(None), |
|
308 | 314 | email=Optional(None), password=Optional(None), |
|
309 | 315 | firstname=Optional(None), lastname=Optional(None), |
|
310 | active=Optional(None), admin=Optional(None), | |
|
316 | description=Optional(None), active=Optional(None), admin=Optional(None), | |
|
311 | 317 | extern_type=Optional(None), extern_name=Optional(None), ): |
|
312 | 318 | """ |
|
313 | 319 | Updates the details for the specified user, if that user exists. |
@@ -331,6 +337,8 b' def update_user(request, apiuser, userid' | |||
|
331 | 337 | :type firstname: Optional(str) |
|
332 | 338 | :param lastname: Set the new surname. |
|
333 | 339 | :type lastname: Optional(str) |
|
340 | :param description: Set user description, or short bio. Metatags are allowed. | |
|
341 | :type description: Optional(str) | |
|
334 | 342 | :param active: Set the new user as active. |
|
335 | 343 | :type active: Optional(``True`` | ``False``) |
|
336 | 344 | :param admin: Give the user admin rights. |
@@ -379,6 +387,7 b' def update_user(request, apiuser, userid' | |||
|
379 | 387 | store_update(updates, email, 'email') |
|
380 | 388 | store_update(updates, firstname, 'name') |
|
381 | 389 | store_update(updates, lastname, 'lastname') |
|
390 | store_update(updates, description, 'description') | |
|
382 | 391 | store_update(updates, active, 'active') |
|
383 | 392 | store_update(updates, admin, 'admin') |
|
384 | 393 | store_update(updates, extern_name, 'extern_name') |
@@ -191,7 +191,9 b' def create_user_group(' | |||
|
191 | 191 | :param active: Set this group as active. |
|
192 | 192 | :type active: Optional(``True`` | ``False``) |
|
193 | 193 | :param sync: Set enabled or disabled the automatically sync from |
|
194 | external authentication types like ldap. | |
|
194 | external authentication types like ldap. If User Group will be named like | |
|
195 | one from e.g ldap and sync flag is enabled members will be synced automatically. | |
|
196 | Sync type when enabled via API is set to `manual_api` | |
|
195 | 197 | :type sync: Optional(``True`` | ``False``) |
|
196 | 198 | |
|
197 | 199 | Example output: |
@@ -303,7 +305,9 b' def update_user_group(request, apiuser, ' | |||
|
303 | 305 | :param active: Set the group as active. |
|
304 | 306 | :type active: Optional(``True`` | ``False``) |
|
305 | 307 | :param sync: Set enabled or disabled the automatically sync from |
|
306 | external authentication types like ldap. | |
|
308 | external authentication types like ldap. If User Group will be named like | |
|
309 | one from e.g ldap and sync flag is enabled members will be synced automatically. | |
|
310 | Sync type when enabled via API is set to `manual_api` | |
|
307 | 311 | :type sync: Optional(``True`` | ``False``) |
|
308 | 312 | |
|
309 | 313 | Example output: |
@@ -25,9 +25,11 b' import operator' | |||
|
25 | 25 | from pyramid import compat |
|
26 | 26 | from pyramid.httpexceptions import HTTPFound, HTTPForbidden, HTTPBadRequest |
|
27 | 27 | |
|
28 | from rhodecode.lib import helpers as h, diffs | |
|
28 | from rhodecode.lib import helpers as h, diffs, rc_cache | |
|
29 | 29 | from rhodecode.lib.utils2 import ( |
|
30 | 30 | StrictAttributeDict, str2bool, safe_int, datetime_to_time, safe_unicode) |
|
31 | from rhodecode.lib.markup_renderer import MarkupRenderer, relative_links | |
|
32 | from rhodecode.lib.vcs.backends.base import EmptyCommit | |
|
31 | 33 | from rhodecode.lib.vcs.exceptions import RepositoryRequirementError |
|
32 | 34 | from rhodecode.model import repo |
|
33 | 35 | from rhodecode.model import repo_group |
@@ -36,6 +38,7 b' from rhodecode.model import user' | |||
|
36 | 38 | from rhodecode.model.db import User |
|
37 | 39 | from rhodecode.model.scm import ScmModel |
|
38 | 40 | from rhodecode.model.settings import VcsSettingsModel |
|
41 | from rhodecode.model.repo import ReadmeFinder | |
|
39 | 42 | |
|
40 | 43 | log = logging.getLogger(__name__) |
|
41 | 44 | |
@@ -222,6 +225,7 b' class RepoAppView(BaseAppView):' | |||
|
222 | 225 | self.db_repo = request.db_repo |
|
223 | 226 | self.db_repo_name = self.db_repo.repo_name |
|
224 | 227 | self.db_repo_pull_requests = ScmModel().get_pull_requests(self.db_repo) |
|
228 | self.db_repo_artifacts = ScmModel().get_artifacts(self.db_repo) | |
|
225 | 229 | |
|
226 | 230 | def _handle_missing_requirements(self, error): |
|
227 | 231 | log.error( |
@@ -237,6 +241,7 b' class RepoAppView(BaseAppView):' | |||
|
237 | 241 | c.rhodecode_db_repo = self.db_repo |
|
238 | 242 | c.repo_name = self.db_repo_name |
|
239 | 243 | c.repository_pull_requests = self.db_repo_pull_requests |
|
244 | c.repository_artifacts = self.db_repo_artifacts | |
|
240 | 245 | c.repository_is_user_following = ScmModel().is_following_repo( |
|
241 | 246 | self.db_repo_name, self._rhodecode_user.user_id) |
|
242 | 247 | self.path_filter = PathFilter(None) |
@@ -305,6 +310,69 b' class RepoAppView(BaseAppView):' | |||
|
305 | 310 | settings = settings_model.get_general_settings() |
|
306 | 311 | return settings.get(settings_key, default) |
|
307 | 312 | |
|
313 | def _get_repo_setting(self, target_repo, settings_key, default=False): | |
|
314 | settings_model = VcsSettingsModel(repo=target_repo) | |
|
315 | settings = settings_model.get_repo_settings_inherited() | |
|
316 | return settings.get(settings_key, default) | |
|
317 | ||
|
318 | def _get_readme_data(self, db_repo, renderer_type, commit_id=None, path='/'): | |
|
319 | log.debug('Looking for README file at path %s', path) | |
|
320 | if commit_id: | |
|
321 | landing_commit_id = commit_id | |
|
322 | else: | |
|
323 | landing_commit = db_repo.get_landing_commit() | |
|
324 | if isinstance(landing_commit, EmptyCommit): | |
|
325 | return None, None | |
|
326 | landing_commit_id = landing_commit.raw_id | |
|
327 | ||
|
328 | cache_namespace_uid = 'cache_repo.{}'.format(db_repo.repo_id) | |
|
329 | region = rc_cache.get_or_create_region('cache_repo', cache_namespace_uid) | |
|
330 | start = time.time() | |
|
331 | ||
|
332 | @region.conditional_cache_on_arguments(namespace=cache_namespace_uid) | |
|
333 | def generate_repo_readme(repo_id, _commit_id, _repo_name, _readme_search_path, _renderer_type): | |
|
334 | readme_data = None | |
|
335 | readme_filename = None | |
|
336 | ||
|
337 | commit = db_repo.get_commit(_commit_id) | |
|
338 | log.debug("Searching for a README file at commit %s.", _commit_id) | |
|
339 | readme_node = ReadmeFinder(_renderer_type).search(commit, path=_readme_search_path) | |
|
340 | ||
|
341 | if readme_node: | |
|
342 | log.debug('Found README node: %s', readme_node) | |
|
343 | relative_urls = { | |
|
344 | 'raw': h.route_path( | |
|
345 | 'repo_file_raw', repo_name=_repo_name, | |
|
346 | commit_id=commit.raw_id, f_path=readme_node.path), | |
|
347 | 'standard': h.route_path( | |
|
348 | 'repo_files', repo_name=_repo_name, | |
|
349 | commit_id=commit.raw_id, f_path=readme_node.path), | |
|
350 | } | |
|
351 | readme_data = self._render_readme_or_none(commit, readme_node, relative_urls) | |
|
352 | readme_filename = readme_node.unicode_path | |
|
353 | ||
|
354 | return readme_data, readme_filename | |
|
355 | ||
|
356 | readme_data, readme_filename = generate_repo_readme( | |
|
357 | db_repo.repo_id, landing_commit_id, db_repo.repo_name, path, renderer_type,) | |
|
358 | compute_time = time.time() - start | |
|
359 | log.debug('Repo README for path %s generated and computed in %.4fs', | |
|
360 | path, compute_time) | |
|
361 | return readme_data, readme_filename | |
|
362 | ||
|
363 | def _render_readme_or_none(self, commit, readme_node, relative_urls): | |
|
364 | log.debug('Found README file `%s` rendering...', readme_node.path) | |
|
365 | renderer = MarkupRenderer() | |
|
366 | try: | |
|
367 | html_source = renderer.render( | |
|
368 | readme_node.content, filename=readme_node.path) | |
|
369 | if relative_urls: | |
|
370 | return relative_links(html_source, relative_urls) | |
|
371 | return html_source | |
|
372 | except Exception: | |
|
373 | log.exception( | |
|
374 | "Exception while trying to render the README") | |
|
375 | ||
|
308 | 376 | def get_recache_flag(self): |
|
309 | 377 | for flag_name in ['force_recache', 'force-recache', 'no-cache']: |
|
310 | 378 | flag_val = self.request.GET.get(flag_name) |
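
The README lookup above is memoized per repository: `generate_repo_readme` is wrapped in a region-level cache keyed on its arguments, so repeated requests for the same commit and path skip the tree walk and markup rendering. A minimal sketch of that memoize-by-arguments pattern with plain `dogpile.cache` (not RhodeCode's `rc_cache` wrapper; the backend and namespace here are illustrative):

```python
from dogpile.cache import make_region

# in-memory backend purely for illustration
region = make_region().configure('dogpile.cache.memory')

@region.cache_on_arguments(namespace='cache_repo.42')
def generate_repo_readme(repo_id, commit_id, repo_name, search_path, renderer_type):
    # stand-in for the expensive part: locate the README node and render it
    print('cache miss, rendering README for %s@%s' % (repo_name, commit_id))
    return '<h1>readme</h1>', 'README.md'

# the first call computes; the second call with identical arguments is served from cache
generate_repo_readme(42, 'abc123', 'my-repo', '/', 'rst')
generate_repo_readme(42, 'abc123', 'my-repo', '/', 'rst')
```
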
@@ -464,7 +532,7 b' class BaseReferencesView(RepoAppView):' | |||
|
464 | 532 | |
|
465 | 533 | def load_refs_context(self, ref_items, partials_template): |
|
466 | 534 | _render = self.request.get_partial_renderer(partials_template) |
|
467 | pre_load = ["author", "date", "message"] | |
|
535 | pre_load = ["author", "date", "message", "parents"] | |
|
468 | 536 | |
|
469 | 537 | is_svn = h.is_svn(self.rhodecode_vcs_repo) |
|
470 | 538 | is_hg = h.is_hg(self.rhodecode_vcs_repo) |
@@ -140,7 +140,6 b' def admin_routes(config):' | |||
|
140 | 140 | name='admin_settings_visual_update', |
|
141 | 141 | pattern='/settings/visual/update') |
|
142 | 142 | |
|
143 | ||
|
144 | 143 | config.add_route( |
|
145 | 144 | name='admin_settings_issuetracker', |
|
146 | 145 | pattern='/settings/issue-tracker') |
@@ -378,6 +377,10 b' def admin_routes(config):' | |||
|
378 | 377 | name='edit_user_audit_logs', |
|
379 | 378 | pattern='/users/{user_id:\d+}/edit/audit', user_route=True) |
|
380 | 379 | |
|
380 | config.add_route( | |
|
381 | name='edit_user_audit_logs_download', | |
|
382 | pattern='/users/{user_id:\d+}/edit/audit/download', user_route=True) | |
|
383 | ||
|
381 | 384 | # user caches |
|
382 | 385 | config.add_route( |
|
383 | 386 | name='edit_user_caches', |
@@ -411,6 +414,10 b' def admin_routes(config):' | |||
|
411 | 414 | pattern='/repos') |
|
412 | 415 | |
|
413 | 416 | config.add_route( |
|
417 | name='repos_data', | |
|
418 | pattern='/repos_data') | |
|
419 | ||
|
420 | config.add_route( | |
|
414 | 421 | name='repo_new', |
|
415 | 422 | pattern='/repos/new') |
|
416 | 423 |
@@ -155,7 +155,7 b' class TestAuthSettingsView(object):' | |||
|
155 | 155 | response = self._post_ldap_settings(params, override={ |
|
156 | 156 | 'port': invalid_port_value, |
|
157 | 157 | }) |
|
158 | assertr = | |
|
158 | assertr = response.assert_response() | |
|
159 | 159 | assertr.element_contains( |
|
160 | 160 | '.form .field #port ~ .error-message', |
|
161 | 161 | invalid_port_value) |
@@ -47,6 +47,7 b' def route_path(name, params=None, **kwar' | |||
|
47 | 47 | |
|
48 | 48 | base_url = { |
|
49 | 49 | 'repos': ADMIN_PREFIX + '/repos', |
|
50 | 'repos_data': ADMIN_PREFIX + '/repos_data', | |
|
50 | 51 | 'repo_new': ADMIN_PREFIX + '/repos/new', |
|
51 | 52 | 'repo_create': ADMIN_PREFIX + '/repos/create', |
|
52 | 53 | |
@@ -70,24 +71,25 b' def _get_permission_for_user(user, repo)' | |||
|
70 | 71 | @pytest.mark.usefixtures("app") |
|
71 | 72 | class TestAdminRepos(object): |
|
72 | 73 | |
|
73 | def test_repo_list(self, autologin_user, user_util): | |
|
74 | def test_repo_list(self, autologin_user, user_util, xhr_header): | |
|
74 | 75 | repo = user_util.create_repo() |
|
75 | 76 | repo_name = repo.repo_name |
|
76 | 77 | response = self.app.get( |
|
77 | route_path('repos'), status=200 | |
|
78 | route_path('repos_data'), status=200, | |
|
79 | extra_environ=xhr_header) | |
|
78 | 80 | |
|
79 | 81 | response.mustcontain(repo_name) |
|
80 | 82 | |
|
81 | 83 | def test_create_page_restricted_to_single_backend(self, autologin_user, backend): |
|
82 | 84 | with mock.patch('rhodecode.BACKENDS', {'git': 'git'}): |
|
83 | 85 | response = self.app.get(route_path('repo_new'), status=200) |
|
84 | assert_response = | |
|
86 | assert_response = response.assert_response() | |
|
85 | 87 | element = assert_response.get_element('#repo_type') |
|
86 | 88 | assert element.text_content() == '\ngit\n' |
|
87 | 89 | |
|
88 | 90 | def test_create_page_non_restricted_backends(self, autologin_user, backend): |
|
89 | 91 | response = self.app.get(route_path('repo_new'), status=200) |
|
90 | assert_response = | |
|
92 | assert_response = response.assert_response() | |
|
91 | 93 | assert_response.element_contains('#repo_type', 'git') |
|
92 | 94 | assert_response.element_contains('#repo_type', 'svn') |
|
93 | 95 | assert_response.element_contains('#repo_type', 'hg') |
@@ -84,7 +84,7 b' class TestAdminRepositoryGroups(object):' | |||
|
84 | 84 | fixture.create_repo_group('test_repo_group') |
|
85 | 85 | response = self.app.get(route_path( |
|
86 | 86 | 'repo_groups_data'), extra_environ=xhr_header) |
|
87 | response.mustcontain('" | |
|
87 | response.mustcontain('<a href=\\"/{}/_edit\\" title=\\"Edit\\">Edit</a>'.format('test_repo_group')) | |
|
88 | 88 | fixture.destroy_repo_group('test_repo_group') |
|
89 | 89 | |
|
90 | 90 | def test_new(self, autologin_user): |
@@ -367,7 +367,7 b' class TestAdminSettingsVcs(object):' | |||
|
367 | 367 | |
|
368 | 368 | def test_has_an_input_for_invalidation_of_inline_comments(self): |
|
369 | 369 | response = self.app.get(route_path('admin_settings_vcs')) |
|
370 | assert_response = | |
|
370 | assert_response = response.assert_response() | |
|
371 | 371 | assert_response.one_element_exists( |
|
372 | 372 | '[name=rhodecode_use_outdated_comments]') |
|
373 | 373 | |
@@ -412,14 +412,14 b' class TestAdminSettingsVcs(object):' | |||
|
412 | 412 | setting = SettingsModel().get_setting_by_name(setting_key) |
|
413 | 413 | assert setting.app_settings_value is new_value |
|
414 | 414 | |
|
415 | @pytest.fixture | |
|
415 | @pytest.fixture() | |
|
416 | 416 | def disable_sql_cache(self, request): |
|
417 | 417 | patcher = mock.patch( |
|
418 | 418 | 'rhodecode.lib.caching_query.FromCache.process_query') |
|
419 | 419 | request.addfinalizer(patcher.stop) |
|
420 | 420 | patcher.start() |
|
421 | 421 | |
|
422 | @pytest.fixture | |
|
422 | @pytest.fixture() | |
|
423 | 423 | def form_defaults(self): |
|
424 | 424 | from rhodecode.apps.admin.views.settings import AdminSettingsView |
|
425 | 425 | return AdminSettingsView._form_defaults() |
@@ -518,7 +518,7 b' class TestOpenSourceLicenses(object):' | |||
|
518 | 518 | response = self.app.get( |
|
519 | 519 | route_path('admin_settings_open_source'), status=200) |
|
520 | 520 | |
|
521 | assert_response = | |
|
521 | assert_response = response.assert_response() | |
|
522 | 522 | assert_response.element_contains( |
|
523 | 523 | '.panel-heading', 'Licenses of Third Party Packages') |
|
524 | 524 | for license_data in sample_licenses: |
@@ -528,7 +528,7 b' class TestOpenSourceLicenses(object):' | |||
|
528 | 528 | def test_records_can_be_read(self, autologin_user): |
|
529 | 529 | response = self.app.get( |
|
530 | 530 | route_path('admin_settings_open_source'), status=200) |
|
531 | assert_response = | |
|
531 | assert_response = response.assert_response() | |
|
532 | 532 | assert_response.element_contains( |
|
533 | 533 | '.panel-heading', 'Licenses of Third Party Packages') |
|
534 | 534 | |
@@ -726,7 +726,7 b' class TestAdminSettingsIssueTracker(obje' | |||
|
726 | 726 | IssueTrackerSettingsModel().delete_entries(self.uid) |
|
727 | 727 | |
|
728 | 728 | def test_delete_issuetracker_pattern( |
|
729 | self, autologin_user, backend, csrf_token, settings_util): | |
|
729 | self, autologin_user, backend, csrf_token, settings_util, xhr_header): | |
|
730 | 730 | pattern = 'issuetracker_pat' |
|
731 | 731 | uid = md5(pattern) |
|
732 | 732 | settings_util.create_rhodecode_setting( |
@@ -734,10 +734,9 b' class TestAdminSettingsIssueTracker(obje' | |||
|
734 | 734 | |
|
735 | 735 | post_url = route_path('admin_settings_issuetracker_delete') |
|
736 | 736 | post_data = { |
|
737 | '_method': 'delete', | |
|
738 | 737 | 'uid': uid, |
|
739 | 738 | 'csrf_token': csrf_token |
|
740 | 739 | } |
|
741 | self.app.post(post_url, post_data, status= | |
|
740 | self.app.post(post_url, post_data, extra_environ=xhr_header, status=200) | |
|
742 | 741 | settings = SettingsModel().get_all_settings() |
|
743 | 742 | assert 'rhodecode_%s%s' % (self.SHORT_PATTERN_KEY, uid) not in settings |
@@ -91,6 +91,9 b' def route_path(name, params=None, **kwar' | |||
|
91 | 91 | 'edit_user_audit_logs': |
|
92 | 92 | ADMIN_PREFIX + '/users/{user_id}/edit/audit', |
|
93 | 93 | |
|
94 | 'edit_user_audit_logs_download': | |
|
95 | ADMIN_PREFIX + '/users/{user_id}/edit/audit/download', | |
|
96 | ||
|
94 | 97 | }[name].format(**kwargs) |
|
95 | 98 | |
|
96 | 99 | if params: |
@@ -318,7 +321,6 b' class TestAdminUsersView(TestController)' | |||
|
318 | 321 | route_path('edit_user_emails', user_id=user_id)) |
|
319 | 322 | response.mustcontain(no=['example@rhodecode.com']) |
|
320 | 323 | |
|
321 | ||
|
322 | 324 | def test_create(self, request, xhr_header): |
|
323 | 325 | self.log_user() |
|
324 | 326 | username = 'newtestuser' |
@@ -333,6 +335,7 b' class TestAdminUsersView(TestController)' | |||
|
333 | 335 | response = self.app.post(route_path('users_create'), params={ |
|
334 | 336 | 'username': username, |
|
335 | 337 | 'password': password, |
|
338 | 'description': 'mr CTO', | |
|
336 | 339 | 'password_confirmation': password_confirmation, |
|
337 | 340 | 'firstname': name, |
|
338 | 341 | 'active': True, |
@@ -381,6 +384,7 b' class TestAdminUsersView(TestController)' | |||
|
381 | 384 | 'name': name, |
|
382 | 385 | 'active': False, |
|
383 | 386 | 'lastname': lastname, |
|
387 | 'description': 'mr CTO', | |
|
384 | 388 | 'email': email, |
|
385 | 389 | 'csrf_token': self.csrf_token, |
|
386 | 390 | }) |
@@ -418,6 +422,7 b' class TestAdminUsersView(TestController)' | |||
|
418 | 422 | ('email', {'email': 'some@email.com'}), |
|
419 | 423 | ('language', {'language': 'de'}), |
|
420 | 424 | ('language', {'language': 'en'}), |
|
425 | ('description', {'description': 'hello CTO'}), | |
|
421 | 426 | # ('new_password', {'new_password': 'foobar123', |
|
422 | 427 | # 'password_confirmation': 'foobar123'}) |
|
423 | 428 | ]) |
@@ -515,7 +520,7 b' class TestAdminUsersView(TestController)' | |||
|
515 | 520 | route_path('user_delete', user_id=new_user.user_id), |
|
516 | 521 | params={'csrf_token': self.csrf_token}) |
|
517 | 522 | |
|
518 | assert_session_flash(response, 'Successfully deleted user') | |
|
523 | assert_session_flash(response, 'Successfully deleted user `{}`'.format(username)) | |
|
519 | 524 | |
|
520 | 525 | def test_delete_owner_of_repository(self, request, user_util): |
|
521 | 526 | self.log_user() |
@@ -531,8 +536,7 b' class TestAdminUsersView(TestController)' | |||
|
531 | 536 | params={'csrf_token': self.csrf_token}) |
|
532 | 537 | |
|
533 | 538 | msg = 'user "%s" still owns 1 repositories and cannot be removed. ' \ |
|
534 | 'Switch owners or remove those repositories:%s' % (username, | |
|
535 | obj_name) | |
|
539 | 'Switch owners or remove those repositories:%s' % (username, obj_name) | |
|
536 | 540 | assert_session_flash(response, msg) |
|
537 | 541 | fixture.destroy_repo(obj_name) |
|
538 | 542 | |
@@ -542,6 +546,7 b' class TestAdminUsersView(TestController)' | |||
|
542 | 546 | usr = user_util.create_user(auto_cleanup=False) |
|
543 | 547 | username = usr.username |
|
544 | 548 | fixture.create_repo(obj_name, cur_user=usr.username) |
|
549 | Session().commit() | |
|
545 | 550 | |
|
546 | 551 | new_user = Session().query(User)\ |
|
547 | 552 | .filter(User.username == username).one() |
@@ -583,8 +588,7 b' class TestAdminUsersView(TestController)' | |||
|
583 | 588 | params={'csrf_token': self.csrf_token}) |
|
584 | 589 | |
|
585 | 590 | msg = 'user "%s" still owns 1 repository groups and cannot be removed. ' \ |
|
586 | 'Switch owners or remove those repository groups:%s' % (username, | |
|
587 | obj_name) | |
|
591 | 'Switch owners or remove those repository groups:%s' % (username, obj_name) | |
|
588 | 592 | assert_session_flash(response, msg) |
|
589 | 593 | fixture.destroy_repo_group(obj_name) |
|
590 | 594 | |
@@ -635,8 +639,7 b' class TestAdminUsersView(TestController)' | |||
|
635 | 639 | params={'csrf_token': self.csrf_token}) |
|
636 | 640 | |
|
637 | 641 | msg = 'user "%s" still owns 1 user groups and cannot be removed. ' \ |
|
638 | 'Switch owners or remove those user groups:%s' % (username, | |
|
639 | obj_name) | |
|
642 | 'Switch owners or remove those user groups:%s' % (username, obj_name) | |
|
640 | 643 | assert_session_flash(response, msg) |
|
641 | 644 | fixture.destroy_user_group(obj_name) |
|
642 | 645 | |
@@ -779,3 +782,13 b' class TestAdminUsersView(TestController)' | |||
|
779 | 782 | user = self.log_user() |
|
780 | 783 | self.app.get( |
|
781 | 784 | route_path('edit_user_audit_logs', user_id=user['user_id'])) |
|
785 | ||
|
786 | def test_audit_log_page_download(self): | |
|
787 | user = self.log_user() | |
|
788 | user_id = user['user_id'] | |
|
789 | response = self.app.get( | |
|
790 | route_path('edit_user_audit_logs_download', user_id=user_id)) | |
|
791 | ||
|
792 | assert response.content_disposition == \ | |
|
793 | 'attachment; filename=user_{}_audit_logs.json'.format(user_id) | |
|
794 | assert response.content_type == "application/json" |
@@ -28,7 +28,7 b' from rhodecode.model.db import joinedloa' | |||
|
28 | 28 | from rhodecode.lib.user_log_filter import user_log_filter |
|
29 | 29 | from rhodecode.lib.auth import LoginRequired, HasPermissionAllDecorator |
|
30 | 30 | from rhodecode.lib.utils2 import safe_int |
|
31 | from rhodecode.lib.helpers import Page | |
|
31 | from rhodecode.lib.helpers import SqlPage | |
|
32 | 32 | |
|
33 | 33 | log = logging.getLogger(__name__) |
|
34 | 34 | |
@@ -62,13 +62,16 b' class AdminAuditLogsView(BaseAppView):' | |||
|
62 | 62 | |
|
63 | 63 | p = safe_int(self.request.GET.get('page', 1), 1) |
|
64 | 64 | |
|
65 | def url_generator( | |
|
65 | def url_generator(page_num): | |
|
66 | query_params = { | |
|
67 | 'page': page_num | |
|
68 | } | |
|
66 | 69 | if c.search_term: |
|
67 | | |

68 | return self.request.current_route_path(_query= | |
|
70 | query_params['filter'] = c.search_term | |
|
71 | return self.request.current_route_path(_query=query_params) | |
|
69 | 72 | |
|
70 | c.audit_logs = Page(users_log, page=p, items_per_page=10, | |
|
71 | url=url_generator) | |
|
73 | c.audit_logs = SqlPage(users_log, page=p, items_per_page=10, | |
|
74 | url_maker=url_generator) | |
|
72 | 75 | return self._get_template_context(c) |
|
73 | 76 | |
|
74 | 77 | @LoginRequired() |
@@ -25,7 +25,7 b' from pyramid.view import view_config' | |||
|
25 | 25 | |
|
26 | 26 | from rhodecode.apps._base import BaseAppView |
|
27 | 27 | from rhodecode.lib import helpers as h |
|
28 | from rhodecode.lib.auth import (LoginRequired, NotAnonymous) | |
|
28 | from rhodecode.lib.auth import (LoginRequired, NotAnonymous, HasRepoPermissionAny) | |
|
29 | 29 | from rhodecode.model.db import PullRequest |
|
30 | 30 | |
|
31 | 31 | |
@@ -66,6 +66,13 b' class AdminMainView(BaseAppView):' | |||
|
66 | 66 | pull_request_id = pull_request.pull_request_id |
|
67 | 67 | |
|
68 | 68 | repo_name = pull_request.target_repo.repo_name |
|
69 | # NOTE(marcink): | |
|
70 | # check permissions so we don't redirect to repo that we don't have access to | |
|
71 | # exposing its name | |
|
72 | target_repo_perm = HasRepoPermissionAny( | |
|
73 | 'repository.read', 'repository.write', 'repository.admin')(repo_name) | |
|
74 | if not target_repo_perm: | |
|
75 | raise HTTPNotFound() | |
|
69 | 76 | |
|
70 | 77 | raise HTTPFound( |
|
71 | 78 | h.route_path('pullrequest_show', repo_name=repo_name, |
@@ -68,7 +68,8 b' class AdminPermissionsView(BaseAppView, ' | |||
|
68 | 68 | |
|
69 | 69 | c.user = User.get_default_user(refresh=True) |
|
70 | 70 | |
|
71 | app_settings = SettingsModel().get_all_settings() | |
|
71 | app_settings = c.rc_config | |
|
72 | ||
|
72 | 73 | defaults = { |
|
73 | 74 | 'anonymous': c.user.active, |
|
74 | 75 | 'default_register_message': app_settings.get( |
@@ -47,7 +47,7 b' class AdminProcessManagementView(BaseApp' | |||
|
47 | 47 | 'name': proc.name(), |
|
48 | 48 | 'mem_rss': mem.rss, |
|
49 | 49 | 'mem_vms': mem.vms, |
|
50 | 'cpu_percent': proc.cpu_percent(), | |
|
50 | 'cpu_percent': proc.cpu_percent(interval=0.1), | |
|
51 | 51 | 'create_time': proc.create_time(), |
|
52 | 52 | 'cmd': ' '.join(proc.cmdline()), |
|
53 | 53 | }) |
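
The switch to `cpu_percent(interval=0.1)` trades a short blocking sample for a meaningful number: when called without an interval, `psutil` compares CPU times against the previous call, so the very first call in a request returns 0.0. A small illustration:

```python
import psutil

proc = psutil.Process()

# non-blocking form compares with the previous call; the first call yields 0.0
print(proc.cpu_percent(interval=None))

# blocking form samples CPU usage over the given window and returns a real value
print(proc.cpu_percent(interval=0.1))
```
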
@@ -19,6 +19,8 b'' | |||
|
19 | 19 | # and proprietary license terms, please see https://rhodecode.com/licenses/ |
|
20 | 20 | import datetime |
|
21 | 21 | import logging |
|
22 | import time | |
|
23 | ||
|
22 | 24 | import formencode |
|
23 | 25 | import formencode.htmlfill |
|
24 | 26 | |
@@ -63,7 +65,7 b' class AdminRepoGroupsView(BaseAppView, D' | |||
|
63 | 65 | # and display only those we have ADMIN right |
|
64 | 66 | groups_with_admin_rights = RepoGroupList( |
|
65 | 67 | RepoGroup.query().all(), |
|
66 | perm_set=['group.admin']) | |
|
68 | perm_set=['group.admin'], extra_kwargs=dict(user=self._rhodecode_user)) | |
|
67 | 69 | c.repo_groups = RepoGroup.groups_choices( |
|
68 | 70 | groups=groups_with_admin_rights, |
|
69 | 71 | show_empty_group=allow_empty_group) |
@@ -109,9 +111,9 b' class AdminRepoGroupsView(BaseAppView, D' | |||
|
109 | 111 | def repo_group_list_data(self): |
|
110 | 112 | self.load_default_context() |
|
111 | 113 | column_map = { |
|
112 | 'name | |
|
114 | 'name': 'group_name_hash', | |
|
113 | 115 | 'desc': 'group_description', |
|
114 | 'last_change | |
|
116 | 'last_change': 'updated_on', | |
|
115 | 117 | 'top_level_repos': 'repos_total', |
|
116 | 118 | 'owner': 'user_username', |
|
117 | 119 | } |
@@ -131,9 +133,10 b' class AdminRepoGroupsView(BaseAppView, D' | |||
|
131 | 133 | |
|
132 | 134 | def last_change(last_change): |
|
133 | 135 | if isinstance(last_change, datetime.datetime) and not last_change.tzinfo: |
|
134 | | |

135 | | |
|
136 | last_change = last_change + delta | |
|
136 | ts = time.time() | |
|
137 | utc_offset = (datetime.datetime.fromtimestamp(ts) | |
|
138 | - datetime.datetime.utcfromtimestamp(ts)).total_seconds() | |
|
139 | last_change = last_change + datetime.timedelta(seconds=utc_offset) | |
|
137 | 140 | return _render("last_change", last_change) |
|
138 | 141 | |
|
139 | 142 | def desc(desc, personal): |
@@ -147,12 +150,8 b' class AdminRepoGroupsView(BaseAppView, D' | |||
|
147 | 150 | def user_profile(username): |
|
148 | 151 | return _render('user_profile', username) |
|
149 | 152 | |
|
150 | auth_repo_group_list = RepoGroupList( | |
|
151 | RepoGroup.query().all(), perm_set=['group.admin']) | |
|
152 | ||
|
153 | allowed_ids = [-1] | |
|
154 | for repo_group in auth_repo_group_list: | |
|
155 | allowed_ids.append(repo_group.group_id) | |
|
153 | _perms = ['group.admin'] | |
|
154 | allowed_ids = [-1] + self._rhodecode_user.repo_group_acl_ids_from_stack(_perms) | |
|
156 | 155 | |
|
157 | 156 | repo_groups_data_total_count = RepoGroup.query()\ |
|
158 | 157 | .filter(or_( |
@@ -180,7 +179,7 b' class AdminRepoGroupsView(BaseAppView, D' | |||
|
180 | 179 | # generate multiple IN to fix limitation problems |
|
181 | 180 | *in_filter_generator(RepoGroup.group_id, allowed_ids) |
|
182 | 181 | )) \ |
|
183 | .outerjoin(Repository) \ | |
|
182 | .outerjoin(Repository, Repository.group_id == RepoGroup.group_id) \ | |
|
184 | 183 | .join(User, User.user_id == RepoGroup.user_id) \ |
|
185 | 184 | .group_by(RepoGroup, User) |
|
186 | 185 | |
@@ -224,9 +223,8 b' class AdminRepoGroupsView(BaseAppView, D' | |||
|
224 | 223 | row = { |
|
225 | 224 | "menu": quick_menu(repo_gr.group_name), |
|
226 | 225 | "name": repo_group_lnk(repo_gr.group_name), |
|
227 | "name_raw": repo_gr.group_name, | |
|
226 | ||
|
228 | 227 | "last_change": last_change(repo_gr.updated_on), |
|
229 | "last_change_raw": datetime_to_time(repo_gr.updated_on), | |
|
230 | 228 | |
|
231 | 229 | "last_changeset": "", |
|
232 | 230 | "last_changeset_raw": "", |
@@ -31,7 +31,6 b' from rhodecode import events' | |||
|
31 | 31 | from rhodecode.apps._base import BaseAppView, DataGridAppView |
|
32 | 32 | from rhodecode.lib.celerylib.utils import get_task_id |
|
33 | 33 | |
|
34 | from rhodecode.lib.ext_json import json | |
|
35 | 34 | from rhodecode.lib.auth import ( |
|
36 | 35 | LoginRequired, CSRFRequired, NotAnonymous, |
|
37 | 36 | HasPermissionAny, HasRepoGroupPermissionAny) |
@@ -43,7 +42,8 b' from rhodecode.model.permission import P' | |||
|
43 | 42 | from rhodecode.model.repo import RepoModel |
|
44 | 43 | from rhodecode.model.scm import RepoList, RepoGroupList, ScmModel |
|
45 | 44 | from rhodecode.model.settings import SettingsModel |
|
46 | from rhodecode.model.db import | |
|
45 | from rhodecode.model.db import ( | |
|
46 | in_filter_generator, or_, func, Session, Repository, RepoGroup, User) | |
|
47 | 47 | |
|
48 | 48 | log = logging.getLogger(__name__) |
|
49 | 49 | |
@@ -60,8 +60,6 b' class AdminReposView(BaseAppView, DataGr' | |||
|
60 | 60 | perm_set=['group.write', 'group.admin']) |
|
61 | 61 | c.repo_groups = RepoGroup.groups_choices(groups=acl_groups) |
|
62 | 62 | c.repo_groups_choices = map(lambda k: safe_unicode(k[0]), c.repo_groups) |
|
63 | c.landing_revs_choices, c.landing_revs = \ | |
|
64 | ScmModel().get_repo_landing_revs(self.request.translate) | |
|
65 | 63 | c.personal_repo_group = self._rhodecode_user.personal_repo_group |
|
66 | 64 | |
|
67 | 65 | @LoginRequired() |
@@ -72,15 +70,94 b' class AdminReposView(BaseAppView, DataGr' | |||
|
72 | 70 | renderer='rhodecode:templates/admin/repos/repos.mako') |
|
73 | 71 | def repository_list(self): |
|
74 | 72 | c = self.load_default_context() |
|
73 | return self._get_template_context(c) | |
|
75 | 74 | |
|
76 | repo_list = Repository.get_all_repos() | |
|
77 | c.repo_list = RepoList(repo_list, perm_set=['repository.admin']) | |
|
75 | @LoginRequired() | |
|
76 | @NotAnonymous() | |
|
77 | # perms check inside | |
|
78 | @view_config( | |
|
79 | route_name='repos_data', request_method='GET', | |
|
80 | renderer='json_ext', xhr=True) | |
|
81 | def repository_list_data(self): | |
|
82 | self.load_default_context() | |
|
83 | column_map = { | |
|
84 | 'name': 'repo_name', | |
|
85 | 'desc': 'description', | |
|
86 | 'last_change': 'updated_on', | |
|
87 | 'owner': 'user_username', | |
|
88 | } | |
|
89 | draw, start, limit = self._extract_chunk(self.request) | |
|
90 | search_q, order_by, order_dir = self._extract_ordering( | |
|
91 | self.request, column_map=column_map) | |
|
92 | ||
|
93 | _perms = ['repository.admin'] | |
|
94 | allowed_ids = [-1] + self._rhodecode_user.repo_acl_ids_from_stack(_perms) | |
|
95 | ||
|
96 | repos_data_total_count = Repository.query() \ | |
|
97 | .filter(or_( | |
|
98 | # generate multiple IN to fix limitation problems | |
|
99 | *in_filter_generator(Repository.repo_id, allowed_ids)) | |
|
100 | ) \ | |
|
101 | .count() | |
|
102 | ||
|
103 | base_q = Session.query( | |
|
104 | Repository.repo_id, | |
|
105 | Repository.repo_name, | |
|
106 | Repository.description, | |
|
107 | Repository.repo_type, | |
|
108 | Repository.repo_state, | |
|
109 | Repository.private, | |
|
110 | Repository.archived, | |
|
111 | Repository.fork, | |
|
112 | Repository.updated_on, | |
|
113 | Repository._changeset_cache, | |
|
114 | User, | |
|
115 | ) \ | |
|
116 | .filter(or_( | |
|
117 | # generate multiple IN to fix limitation problems | |
|
118 | *in_filter_generator(Repository.repo_id, allowed_ids)) | |
|
119 | ) \ | |
|
120 | .join(User, User.user_id == Repository.user_id) \ | |
|
121 | .group_by(Repository, User) | |
|
122 | ||
|
123 | if search_q: | |
|
124 | like_expression = u'%{}%'.format(safe_unicode(search_q)) | |
|
125 | base_q = base_q.filter(or_( | |
|
126 | Repository.repo_name.ilike(like_expression), | |
|
127 | )) | |
|
128 | ||
|
129 | repos_data_total_filtered_count = base_q.count() | |
|
130 | ||
|
131 | sort_defined = False | |
|
132 | if order_by == 'repo_name': | |
|
133 | sort_col = func.lower(Repository.repo_name) | |
|
134 | sort_defined = True | |
|
135 | elif order_by == 'user_username': | |
|
136 | sort_col = User.username | |
|
137 | else: | |
|
138 | sort_col = getattr(Repository, order_by, None) | |
|
139 | ||
|
140 | if sort_defined or sort_col: | |
|
141 | if order_dir == 'asc': | |
|
142 | sort_col = sort_col.asc() | |
|
143 | else: | |
|
144 | sort_col = sort_col.desc() | |
|
145 | ||
|
146 | base_q = base_q.order_by(sort_col) | |
|
147 | base_q = base_q.offset(start).limit(limit) | |
|
148 | ||
|
149 | repos_list = base_q.all() | |
|
150 | ||
|
78 | 151 | repos_data = RepoModel().get_repos_as_dict( |
|
79 | repo_list= | |
|
80 | # json used to render the grid | |
|
81 | c.data = json.dumps(repos_data) | |
|
152 | repo_list=repos_list, admin=True, super_user_actions=True) | |
|
82 | 153 | |
|
83 | return self._get_template_context(c) | |
|
154 | data = ({ | |
|
155 | 'draw': draw, | |
|
156 | 'data': repos_data, | |
|
157 | 'recordsTotal': repos_data_total_count, | |
|
158 | 'recordsFiltered': repos_data_total_filtered_count, | |
|
159 | }) | |
|
160 | return data | |
|
84 | 161 | |
|
85 | 162 | @LoginRequired() |
|
86 | 163 | @NotAnonymous() |
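
The admin repositories grid is now filled from the new `repos_data` XHR endpoint, which answers in the DataTables server-side format assembled at the end of `repository_list_data` above. Purely as an illustration of the request/response shape; the host, the authenticated session and the exact query parameter names are assumptions, not something fixed by this changeset:

```python
import requests

session = requests.Session()                              # authenticated admin session assumed
session.headers['X-Requested-With'] = 'XMLHttpRequest'    # the route is registered with xhr=True

resp = session.get(
    'https://rhodecode.example.com/_admin/repos_data',    # placeholder instance URL
    params={'draw': 1, 'start': 0, 'length': 25})

grid = resp.json()
# keys produced by repository_list_data()
print(grid['recordsTotal'], grid['recordsFiltered'])
for row in grid['data']:
    print(row)
```
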
@@ -151,8 +228,7 b' class AdminReposView(BaseAppView, DataGr' | |||
|
151 | 228 | try: |
|
152 | 229 | # CanWriteToGroup validators checks permissions of this POST |
|
153 | 230 | form = RepoForm( |
|
154 | self.request.translate, repo_groups=c.repo_groups_choices | |
|
155 | landing_revs=c.landing_revs_choices)() | |
|
231 | self.request.translate, repo_groups=c.repo_groups_choices)() | |
|
156 | 232 | form_result = form.to_python(dict(self.request.POST)) |
|
157 | 233 | copy_permissions = form_result.get('repo_copy_permissions') |
|
158 | 234 | # create is done sometimes async on celery, db transaction |
@@ -445,7 +445,7 b' class AdminSettingsView(BaseAppView):' | |||
|
445 | 445 | def settings_issuetracker(self): |
|
446 | 446 | c = self.load_default_context() |
|
447 | 447 | c.active = 'issuetracker' |
|
448 | defaults = SettingsModel().get_all_settings() | |
|
448 | defaults = c.rc_config | |
|
449 | 449 | |
|
450 | 450 | entry_key = 'rhodecode_issuetracker_pat_' |
|
451 | 451 | |
@@ -518,7 +518,7 b' class AdminSettingsView(BaseAppView):' | |||
|
518 | 518 | @CSRFRequired() |
|
519 | 519 | @view_config( |
|
520 | 520 | route_name='admin_settings_issuetracker_delete', request_method='POST', |
|
521 | renderer='rhodecode:templates/admin/settings/settings.mako') | |
|
521 | renderer='json_ext', xhr=True) | |
|
522 | 522 | def settings_issuetracker_delete(self): |
|
523 | 523 | _ = self.request.translate |
|
524 | 524 | self.load_default_context() |
@@ -528,8 +528,11 b' class AdminSettingsView(BaseAppView):' | |||
|
528 | 528 | except Exception: |
|
529 | 529 | log.exception('Failed to delete issue tracker setting %s', uid) |
|
530 | 530 | raise HTTPNotFound() |
|
531 | h.flash(_('Removed issue tracker entry'), category='success') | |
|
532 | raise HTTPFound(h.route_path('admin_settings_issuetracker')) | |
|
531 | ||
|
532 | SettingsModel().invalidate_settings_cache() | |
|
533 | h.flash(_('Removed issue tracker entry.'), category='success') | |
|
534 | ||
|
535 | return {'deleted': uid} | |
|
533 | 536 | |
|
534 | 537 | @LoginRequired() |
|
535 | 538 | @HasPermissionAllDecorator('hg.admin') |
@@ -570,8 +573,7 b' class AdminSettingsView(BaseAppView):' | |||
|
570 | 573 | |
|
571 | 574 | email_kwargs = { |
|
572 | 575 | 'date': datetime.datetime.now(), |
|
573 | 'user': c.rhodecode_user | |
|
574 | 'rhodecode_version': c.rhodecode_version | |
|
576 | 'user': c.rhodecode_user | |
|
575 | 577 | } |
|
576 | 578 | |
|
577 | 579 | (subject, headers, email_body, |
@@ -155,6 +155,10 b' class AdminSystemInfoSettingsView(BaseAp' | |||
|
155 | 155 | |
|
156 | 156 | ] |
|
157 | 157 | |
|
158 | c.vcsserver_data_items = [ | |
|
159 | (k, v) for k,v in (val('vcs_server_config') or {}).items() | |
|
160 | ] | |
|
161 | ||
|
158 | 162 | if snapshot: |
|
159 | 163 | if c.allowed_to_snapshot: |
|
160 | 164 | c.data_items.pop(0) # remove server info |
@@ -99,12 +99,8 b' class AdminUserGroupsView(BaseAppView, D' | |||
|
99 | 99 | def user_profile(username): |
|
100 | 100 | return _render('user_profile', username) |
|
101 | 101 | |
|
102 | auth_user_group_list = UserGroupList( | |
|
103 | UserGroup.query().all(), perm_set=['usergroup.admin']) | |
|
104 | ||
|
105 | allowed_ids = [-1] | |
|
106 | for user_group in auth_user_group_list: | |
|
107 | allowed_ids.append(user_group.users_group_id) | |
|
102 | _perms = ['usergroup.admin'] | |
|
103 | allowed_ids = [-1] + self._rhodecode_user.user_group_acl_ids_from_stack(_perms) | |
|
108 | 104 | |
|
109 | 105 | user_groups_data_total_count = UserGroup.query()\ |
|
110 | 106 | .filter(or_( |
@@ -134,7 +130,7 b' class AdminUserGroupsView(BaseAppView, D' | |||
|
134 | 130 | # generate multiple IN to fix limitation problems |
|
135 | 131 | *in_filter_generator(UserGroup.users_group_id, allowed_ids) |
|
136 | 132 | )) \ |
|
137 | .outerjoin(UserGroupMember) \ | |
|
133 | .outerjoin(UserGroupMember, UserGroupMember.users_group_id == UserGroup.users_group_id) \ | |
|
138 | 134 | .join(User, User.user_id == UserGroup.user_id) \ |
|
139 | 135 | .group_by(UserGroup, User) |
|
140 | 136 | |
@@ -175,7 +171,6 b' class AdminUserGroupsView(BaseAppView, D' | |||
|
175 | 171 | for user_gr in auth_user_group_list: |
|
176 | 172 | row = { |
|
177 | 173 | "users_group_name": user_group_name(user_gr.users_group_name), |
|
178 | "name_raw": h.escape(user_gr.users_group_name), | |
|
179 | 174 | "description": h.escape(user_gr.user_group_description), |
|
180 | 175 | "members": user_gr.member_count, |
|
181 | 176 | # NOTE(marcink): because of advanced query we |
@@ -31,6 +31,7 b' from pyramid.response import Response' | |||
|
31 | 31 | from rhodecode import events |
|
32 | 32 | from rhodecode.apps._base import BaseAppView, DataGridAppView, UserAppView |
|
33 | 33 | from rhodecode.apps.ssh_support import SshKeyFileChangeEvent |
|
34 | from rhodecode.authentication.base import get_authn_registry, RhodeCodeExternalAuthPlugin | |
|
34 | 35 | from rhodecode.authentication.plugins import auth_rhodecode |
|
35 | 36 | from rhodecode.events import trigger |
|
36 | 37 | from rhodecode.model.db import true |
@@ -43,6 +44,7 b' from rhodecode.lib.ext_json import json' | |||
|
43 | 44 | from rhodecode.lib.auth import ( |
|
44 | 45 | LoginRequired, HasPermissionAllDecorator, CSRFRequired) |
|
45 | 46 | from rhodecode.lib import helpers as h |
|
47 | from rhodecode.lib.helpers import SqlPage | |
|
46 | 48 | from rhodecode.lib.utils2 import safe_int, safe_unicode, AttributeDict |
|
47 | 49 | from rhodecode.model.auth_token import AuthTokenModel |
|
48 | 50 | from rhodecode.model.forms import ( |
@@ -249,7 +251,32 b' class UsersView(UserAppView):' | |||
|
249 | 251 | in there as well. |
|
250 | 252 | """ |
|
251 | 253 | |
|
254 | def get_auth_plugins(self): | |
|
255 | valid_plugins = [] | |
|
256 | authn_registry = get_authn_registry(self.request.registry) | |
|
257 | for plugin in authn_registry.get_plugins_for_authentication(): | |
|
258 | if isinstance(plugin, RhodeCodeExternalAuthPlugin): | |
|
259 | valid_plugins.append(plugin) | |
|
260 | elif plugin.name == 'rhodecode': | |
|
261 | valid_plugins.append(plugin) | |
|
262 | ||
|
263 | # extend our choices if user has set a bound plugin which isn't enabled at the | |
|
264 | # moment | |
|
265 | extern_type = self.db_user.extern_type | |
|
266 | if extern_type not in [x.uid for x in valid_plugins]: | |
|
267 | try: | |
|
268 | plugin = authn_registry.get_plugin_by_uid(extern_type) | |
|
269 | if plugin: | |
|
270 | valid_plugins.append(plugin) | |
|
271 | ||
|
272 | except Exception: | |
|
273 | log.exception( | |
|
274 | 'Could not extend user plugins with `{}`'.format(extern_type)) | |
|
275 | return valid_plugins | |
|
276 | ||
|
252 | 277 | def load_default_context(self): |
|
278 | req = self.request | |
|
279 | ||
|
253 | 280 | c = self._get_local_tmpl_context() |
|
254 | 281 | c.allow_scoped_tokens = self.ALLOW_SCOPED_TOKENS |
|
255 | 282 | c.allowed_languages = [ |
@@ -263,7 +290,10 b' class UsersView(UserAppView):' | |||
|
263 | 290 | ('ru', 'Russian (ru)'), |
|
264 | 291 | ('zh', 'Chinese (zh)'), |
|
265 | 292 | ] |
|
266 | req = self.request | |
|
293 | ||
|
294 | c.allowed_extern_types = [ | |
|
295 | (x.uid, x.get_display_name()) for x in self.get_auth_plugins() | |
|
296 | ] | |
|
267 | 297 | |
|
268 | 298 | c.available_permissions = req.registry.settings['available_permissions'] |
|
269 | 299 | PermissionModel().set_global_permission_choices( |
@@ -297,7 +327,7 b' class UsersView(UserAppView):' | |||
|
297 | 327 | old_values = c.user.get_api_data() |
|
298 | 328 | try: |
|
299 | 329 | form_result = _form.to_python(dict(self.request.POST)) |
|
300 | skip_attrs = [' | |
|
330 | skip_attrs = ['extern_name'] | |
|
301 | 331 | # TODO: plugin should define if username can be updated |
|
302 | 332 | if c.extern_type != "rhodecode": |
|
303 | 333 | # forbid updating username for external accounts |
@@ -347,59 +377,69 b' class UsersView(UserAppView):' | |||
|
347 | 377 | _repos = c.user.repositories |
|
348 | 378 | _repo_groups = c.user.repository_groups |
|
349 | 379 | _user_groups = c.user.user_groups |
|
380 | _artifacts = c.user.artifacts | |
|
350 | 381 | |
|
351 | 382 | handle_repos = None |
|
352 | 383 | handle_repo_groups = None |
|
353 | 384 | handle_user_groups = None |
|
354 | # dummy call for flash of handle | |
|
355 | set_handle_flash_repos = lambda: None | |
|
356 | set_handle_flash_repo_groups = lambda: None | |
|
357 | | |
|
385 | handle_artifacts = None | |
|
386 | ||
|
387 | # calls for flash of handle based on handle case detach or delete | |
|
388 | def set_handle_flash_repos(): | |
|
389 | handle = handle_repos | |
|
390 | if handle == 'detach': | |
|
391 | h.flash(_('Detached %s repositories') % len(_repos), | |
|
392 | category='success') | |
|
393 | elif handle == 'delete': | |
|
394 | h.flash(_('Deleted %s repositories') % len(_repos), | |
|
395 | category='success') | |
|
396 | ||
|
397 | def set_handle_flash_repo_groups(): | |
|
398 | handle = handle_repo_groups | |
|
399 | if handle == 'detach': | |
|
400 | h.flash(_('Detached %s repository groups') % len(_repo_groups), | |
|
401 | category='success') | |
|
402 | elif handle == 'delete': | |
|
403 | h.flash(_('Deleted %s repository groups') % len(_repo_groups), | |
|
404 | category='success') | |
|
405 | ||
|
406 | def set_handle_flash_user_groups(): | |
|
407 | handle = handle_user_groups | |
|
408 | if handle == 'detach': | |
|
409 | h.flash(_('Detached %s user groups') % len(_user_groups), | |
|
410 | category='success') | |
|
411 | elif handle == 'delete': | |
|
412 | h.flash(_('Deleted %s user groups') % len(_user_groups), | |
|
413 | category='success') | |
|
414 | ||
|
415 | def set_handle_flash_artifacts(): | |
|
416 | handle = handle_artifacts | |
|
417 | if handle == 'detach': | |
|
418 | h.flash(_('Detached %s artifacts') % len(_artifacts), | |
|
419 | category='success') | |
|
420 | elif handle == 'delete': | |
|
421 | h.flash(_('Deleted %s artifacts') % len(_artifacts), | |
|
422 | category='success') | |
|
358 | 423 | |
|
359 | 424 | if _repos and self.request.POST.get('user_repos'): |
|
360 | do = self.request.POST['user_repos'] | |
|
361 | if do == 'detach': | |
|
362 | handle_repos = 'detach' | |
|
363 | set_handle_flash_repos = lambda: h.flash( | |
|
364 | _('Detached %s repositories') % len(_repos), | |
|
365 | category='success') | |
|
366 | elif do == 'delete': | |
|
367 | handle_repos = 'delete' | |
|
368 | set_handle_flash_repos = lambda: h.flash( | |
|
369 | _('Deleted %s repositories') % len(_repos), | |
|
370 | category='success') | |
|
425 | handle_repos = self.request.POST['user_repos'] | |
|
371 | 426 | |
|
372 | 427 | if _repo_groups and self.request.POST.get('user_repo_groups'): |
|
373 | do = self.request.POST['user_repo_groups'] | |
|
374 | if do == 'detach': | |
|
375 | handle_repo_groups = 'detach' | |
|
376 | set_handle_flash_repo_groups = lambda: h.flash( | |
|
377 | _('Detached %s repository groups') % len(_repo_groups), | |
|
378 | category='success') | |
|
379 | elif do == 'delete': | |
|
380 | handle_repo_groups = 'delete' | |
|
381 | set_handle_flash_repo_groups = lambda: h.flash( | |
|
382 | _('Deleted %s repository groups') % len(_repo_groups), | |
|
383 | category='success') | |
|
428 | handle_repo_groups = self.request.POST['user_repo_groups'] | |
|
384 | 429 | |
|
385 | 430 | if _user_groups and self.request.POST.get('user_user_groups'): |
|
386 | do = self.request.POST['user_user_groups'] | |
|
387 | if do == 'detach': | |
|
388 | handle_user_groups = 'detach' | |
|
389 | set_handle_flash_user_groups = lambda: h.flash( | |
|
390 | _('Detached %s user groups') % len(_user_groups), | |
|
391 | category='success') | |
|
392 | elif do == 'delete': | |
|
393 | handle_user_groups = 'delete' | |
|
394 | set_handle_flash_user_groups = lambda: h.flash( | |
|
395 | _('Deleted %s user groups') % len(_user_groups), | |
|
396 | category='success') | |
|
431 | handle_user_groups = self.request.POST['user_user_groups'] | |
|
432 | ||
|
433 | if _artifacts and self.request.POST.get('user_artifacts'): | |
|
434 | handle_artifacts = self.request.POST['user_artifacts'] | |
|
397 | 435 | |
|
398 | 436 | old_values = c.user.get_api_data() |
|
437 | ||
|
399 | 438 | try: |
|
400 | 439 | UserModel().delete(c.user, handle_repos=handle_repos, |
|
401 | 440 | handle_repo_groups=handle_repo_groups, |
|
402 | handle_user_groups=handle_user_groups | |
|
441 | handle_user_groups=handle_user_groups, | |
|
442 | handle_artifacts=handle_artifacts) | |
|
403 | 443 | |
|
404 | 444 | audit_logger.store_web( |
|
405 | 445 | 'user.delete', action_data={'old_data': old_values}, |
@@ -409,7 +449,9 b' class UsersView(UserAppView):' | |||
|
409 | 449 | set_handle_flash_repos() |
|
410 | 450 | set_handle_flash_repo_groups() |
|
411 | 451 | set_handle_flash_user_groups() |
|
412 | h.flash(_('Successfully deleted user'), category='success') | |
|
452 | set_handle_flash_artifacts() | |
|
453 | username = h.escape(old_values['username']) | |
|
454 | h.flash(_('Successfully deleted user `{}`').format(username), category='success') | |
|
413 | 455 | except (UserOwnsReposException, UserOwnsRepoGroupsException, |
|
414 | 456 | UserOwnsUserGroupsException, DefaultUserException) as e: |
|
415 | 457 | h.flash(e, category='warning') |
@@ -1187,19 +1229,45 b' class UsersView(UserAppView):' | |||
|
1187 | 1229 | filter_term = self.request.GET.get('filter') |
|
1188 | 1230 | user_log = UserModel().get_user_log(c.user, filter_term) |
|
1189 | 1231 | |
|
1190 | def url_generator( | |
|
1232 | def url_generator(page_num): | |
|
1233 | query_params = { | |
|
1234 | 'page': page_num | |
|
1235 | } | |
|
1191 | 1236 | if filter_term: |
|
1192 | | |

1193 | return self.request.current_route_path(_query= | |
|
1237 | query_params['filter'] = filter_term | |
|
1238 | return self.request.current_route_path(_query=query_params) | |
|
1194 | 1239 | |
|
1195 | c.audit_logs = | |
|
1196 | user_log, page=p, items_per_page=10, url=url_generator) | |
|
1240 | c.audit_logs = SqlPage( | |
|
1241 | user_log, page=p, items_per_page=10, url_maker=url_generator) | |
|
1197 | 1242 | c.filter_term = filter_term |
|
1198 | 1243 | return self._get_template_context(c) |
|
1199 | 1244 | |
|
1200 | 1245 | @LoginRequired() |
|
1201 | 1246 | @HasPermissionAllDecorator('hg.admin') |
|
1202 | 1247 | @view_config( |
|
1248 | route_name='edit_user_audit_logs_download', request_method='GET', | |
|
1249 | renderer='string') | |
|
1250 | def user_audit_logs_download(self): | |
|
1251 | _ = self.request.translate | |
|
1252 | c = self.load_default_context() | |
|
1253 | c.user = self.db_user | |
|
1254 | ||
|
1255 | user_log = UserModel().get_user_log(c.user, filter_term=None) | |
|
1256 | ||
|
1257 | audit_log_data = {} | |
|
1258 | for entry in user_log: | |
|
1259 | audit_log_data[entry.user_log_id] = entry.get_dict() | |
|
1260 | ||
|
1261 | response = Response(json.dumps(audit_log_data, indent=4)) | |
|
1262 | response.content_disposition = str( | |
|
1263 | 'attachment; filename=%s' % 'user_{}_audit_logs.json'.format(c.user.user_id)) | |
|
1264 | response.content_type = 'application/json' | |
|
1265 | ||
|
1266 | return response | |
|
1267 | ||
|
1268 | @LoginRequired() | |
|
1269 | @HasPermissionAllDecorator('hg.admin') | |
|
1270 | @view_config( | |
|
1203 | 1271 | route_name='edit_user_perms_summary', request_method='GET', |
|
1204 | 1272 | renderer='rhodecode:templates/admin/users/user_edit.mako') |
|
1205 | 1273 | def user_perms_summary(self): |
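
The new `edit_user_audit_logs_download` view serialises a user's complete audit trail and serves it as a JSON attachment named `user_<id>_audit_logs.json`. A rough sketch of fetching it; admin rights are required, and the host, session handling and user id are placeholders:

```python
import requests

session = requests.Session()   # an already authenticated super-admin session is assumed
user_id = 42                   # placeholder user id

resp = session.get(
    'https://rhodecode.example.com/_admin/users/{}/edit/audit/download'.format(user_id))

assert resp.headers['Content-Type'].startswith('application/json')
with open('user_{}_audit_logs.json'.format(user_id), 'wb') as out:
    out.write(resp.content)
```
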
@@ -43,6 +43,14 b' def includeme(config):' | |||
|
43 | 43 | pattern=ADMIN_PREFIX + '/debug_style', |
|
44 | 44 | debug_style=True) |
|
45 | 45 | config.add_route( |
|
46 | name='debug_style_email', | |
|
47 | pattern=ADMIN_PREFIX + '/debug_style/email/{email_id}', | |
|
48 | debug_style=True) | |
|
49 | config.add_route( | |
|
50 | name='debug_style_email_plain_rendered', | |
|
51 | pattern=ADMIN_PREFIX + '/debug_style/email-rendered/{email_id}', | |
|
52 | debug_style=True) | |
|
53 | config.add_route( | |
|
46 | 54 | name='debug_style_template', |
|
47 | 55 | pattern=ADMIN_PREFIX + '/debug_style/t/{t_path}', |
|
48 | 56 | debug_style=True) |
@@ -20,10 +20,15 b'' | |||
|
20 | 20 | |
|
21 | 21 | import os |
|
22 | 22 | import logging |
|
23 | import datetime | |
|
23 | 24 | |
|
24 | 25 | from pyramid.view import view_config |
|
25 | 26 | from pyramid.renderers import render_to_response |
|
26 | 27 | from rhodecode.apps._base import BaseAppView |
|
28 | from rhodecode.lib.celerylib import run_task, tasks | |
|
29 | from rhodecode.lib.utils2 import AttributeDict | |
|
30 | from rhodecode.model.db import User | |
|
31 | from rhodecode.model.notification import EmailNotificationModel | |
|
27 | 32 | |
|
28 | 33 | log = logging.getLogger(__name__) |
|
29 | 34 | |
@@ -46,6 +51,317 b' class DebugStyleView(BaseAppView):' | |||
|
46 | 51 | request=self.request) |
|
47 | 52 | |
|
48 | 53 | @view_config( |
|
54 | route_name='debug_style_email', request_method='GET', | |
|
55 | renderer=None) | |
|
56 | @view_config( | |
|
57 | route_name='debug_style_email_plain_rendered', request_method='GET', | |
|
58 | renderer=None) | |
|
59 | def render_email(self): | |
|
60 | c = self.load_default_context() | |
|
61 | email_id = self.request.matchdict['email_id'] | |
|
62 | c.active = 'emails' | |
|
63 | ||
|
64 | pr = AttributeDict( | |
|
65 | pull_request_id=123, | |
|
66 | title='digital_ocean: fix redis, elastic search start on boot, ' | |
|
67 | 'fix fd limits on supervisor, set postgres 11 version', | |
|
68 | description=''' | |
|
69 | Check if we should use full-topic or mini-topic. | |
|
70 | ||
|
71 | - full topic produces some problems with merge states etc | |
|
72 | - server-mini-topic needs probably tweeks. | |
|
73 | ''', | |
|
74 | repo_name='foobar', | |
|
75 | source_ref_parts=AttributeDict(type='branch', name='fix-ticket-2000'), | |
|
76 | target_ref_parts=AttributeDict(type='branch', name='master'), | |
|
77 | ) | |
|
78 | target_repo = AttributeDict(repo_name='repo_group/target_repo') | |
|
79 | source_repo = AttributeDict(repo_name='repo_group/source_repo') | |
|
80 | user = User.get_by_username(self.request.GET.get('user')) or self._rhodecode_db_user | |
|
81 | # file/commit changes for PR update | |
|
82 | commit_changes = AttributeDict({ | |
|
83 | 'added': ['aaaaaaabbbbb', 'cccccccddddddd'], | |
|
84 | 'removed': ['eeeeeeeeeee'], | |
|
85 | }) | |
|
86 | file_changes = AttributeDict({ | |
|
87 | 'added': ['a/file1.md', 'file2.py'], | |
|
88 | 'modified': ['b/modified_file.rst'], | |
|
89 | 'removed': ['.idea'], | |
|
90 | }) | |
|
91 | email_kwargs = { | |
|
92 | 'test': {}, | |
|
93 | 'message': { | |
|
94 | 'body': 'message body !' | |
|
95 | }, | |
|
96 | 'email_test': { | |
|
97 | 'user': user, | |
|
98 | 'date': datetime.datetime.now(), | |
|
99 | }, | |
|
100 | 'password_reset': { | |
|
101 | 'password_reset_url': 'http://example.com/reset-rhodecode-password/token', | |
|
102 | ||
|
103 | 'user': user, | |
|
104 | 'date': datetime.datetime.now(), | |
|
105 | 'email': 'test@rhodecode.com', | |
|
106 | 'first_admin_email': User.get_first_super_admin().email | |
|
107 | }, | |
|
108 | 'password_reset_confirmation': { | |
|
109 | 'new_password': 'new-password-example', | |
|
110 | 'user': user, | |
|
111 | 'date': datetime.datetime.now(), | |
|
112 | 'email': 'test@rhodecode.com', | |
|
113 | 'first_admin_email': User.get_first_super_admin().email | |
|
114 | }, | |
|
115 | 'registration': { | |
|
116 | 'user': user, | |
|
117 | 'date': datetime.datetime.now(), | |
|
118 | }, | |
|
119 | ||
|
120 | 'pull_request_comment': { | |
|
121 | 'user': user, | |
|
122 | ||
|
123 | 'status_change': None, | |
|
124 | 'status_change_type': None, | |
|
125 | ||
|
126 | 'pull_request': pr, | |
|
127 | 'pull_request_commits': [], | |
|
128 | ||
|
129 | 'pull_request_target_repo': target_repo, | |
|
130 | 'pull_request_target_repo_url': 'http://target-repo/url', | |
|
131 | ||
|
132 | 'pull_request_source_repo': source_repo, | |
|
133 | 'pull_request_source_repo_url': 'http://source-repo/url', | |
|
134 | ||
|
135 | 'pull_request_url': 'http://localhost/pr1', | |
|
136 | 'pr_comment_url': 'http://comment-url', | |
|
137 | 'pr_comment_reply_url': 'http://comment-url#reply', | |
|
138 | ||
|
139 | 'comment_file': None, | |
|
140 | 'comment_line': None, | |
|
141 | 'comment_type': 'note', | |
|
142 | 'comment_body': 'This is my comment body. *I like !*', | |
|
143 | 'comment_id': 2048, | |
|
144 | 'renderer_type': 'markdown', | |
|
145 | 'mention': True, | |
|
146 | ||
|
147 | }, | |
|
148 | 'pull_request_comment+status': { | |
|
149 | 'user': user, | |
|
150 | ||
|
151 | 'status_change': 'approved', | |
|
152 | 'status_change_type': 'approved', | |
|
153 | ||
|
154 | 'pull_request': pr, | |
|
155 | 'pull_request_commits': [], | |
|
156 | ||
|
157 | 'pull_request_target_repo': target_repo, | |
|
158 | 'pull_request_target_repo_url': 'http://target-repo/url', | |
|
159 | ||
|
160 | 'pull_request_source_repo': source_repo, | |
|
161 | 'pull_request_source_repo_url': 'http://source-repo/url', | |
|
162 | ||
|
163 | 'pull_request_url': 'http://localhost/pr1', | |
|
164 | 'pr_comment_url': 'http://comment-url', | |
|
165 | 'pr_comment_reply_url': 'http://comment-url#reply', | |
|
166 | ||
|
167 | 'comment_type': 'todo', | |
|
168 | 'comment_file': None, | |
|
169 | 'comment_line': None, | |
|
170 | 'comment_body': ''' | |
|
171 | I think something like this would be better | |
|
172 | ||
|
173 | ```py | |
|
174 | ||
|
175 | def db(): | |
|
176 | global connection | |
|
177 | return connection | |
|
178 | ||
|
179 | ``` | |
|
180 | ||
|
181 | ''', | |
|
182 | 'comment_id': 2048, | |
|
183 | 'renderer_type': 'markdown', | |
|
184 | 'mention': True, | |
|
185 | ||
|
186 | }, | |
|
187 | 'pull_request_comment+file': { | |
|
188 | 'user': user, | |
|
189 | ||
|
190 | 'status_change': None, | |
|
191 | 'status_change_type': None, | |
|
192 | ||
|
193 | 'pull_request': pr, | |
|
194 | 'pull_request_commits': [], | |
|
195 | ||
|
196 | 'pull_request_target_repo': target_repo, | |
|
197 | 'pull_request_target_repo_url': 'http://target-repo/url', | |
|
198 | ||
|
199 | 'pull_request_source_repo': source_repo, | |
|
200 | 'pull_request_source_repo_url': 'http://source-repo/url', | |
|
201 | ||
|
202 | 'pull_request_url': 'http://localhost/pr1', | |
|
203 | ||
|
204 | 'pr_comment_url': 'http://comment-url', | |
|
205 | 'pr_comment_reply_url': 'http://comment-url#reply', | |
|
206 | ||
|
207 | 'comment_file': 'rhodecode/model/db.py', | |
|
208 | 'comment_line': 'o1210', | |
|
209 | 'comment_type': 'todo', | |
|
210 | 'comment_body': ''' | |
|
211 | I like this ! | |
|
212 | ||
|
213 | But please check this code:: | |
|
214 | ||
|
215 | def main(): | |
|
216 | print 'ok' | |
|
217 | ||
|
218 | This should work better ! | |
|
219 | ''', | |
|
220 | 'comment_id': 2048, | |
|
221 | 'renderer_type': 'rst', | |
|
222 | 'mention': True, | |
|
223 | ||
|
224 | }, | |
|
225 | ||
|
226 | 'pull_request_update': { | |
|
227 | 'updating_user': user, | |
|
228 | ||
|
229 | 'status_change': None, | |
|
230 | 'status_change_type': None, | |
|
231 | ||
|
232 | 'pull_request': pr, | |
|
233 | 'pull_request_commits': [], | |
|
234 | ||
|
235 | 'pull_request_target_repo': target_repo, | |
|
236 | 'pull_request_target_repo_url': 'http://target-repo/url', | |
|
237 | ||
|
238 | 'pull_request_source_repo': source_repo, | |
|
239 | 'pull_request_source_repo_url': 'http://source-repo/url', | |
|
240 | ||
|
241 | 'pull_request_url': 'http://localhost/pr1', | |
|
242 | ||
|
243 | # update comment links | |
|
244 | 'pr_comment_url': 'http://comment-url', | |
|
245 | 'pr_comment_reply_url': 'http://comment-url#reply', | |
|
246 | 'ancestor_commit_id': 'f39bd443', | |
|
247 | 'added_commits': commit_changes.added, | |
|
248 | 'removed_commits': commit_changes.removed, | |
|
249 | 'changed_files': (file_changes.added + file_changes.modified + file_changes.removed), | |
|
250 | 'added_files': file_changes.added, | |
|
251 | 'modified_files': file_changes.modified, | |
|
252 | 'removed_files': file_changes.removed, | |
|
253 | }, | |
|
254 | ||
|
255 | 'cs_comment': { | |
|
256 | 'user': user, | |
|
257 | 'commit': AttributeDict(idx=123, raw_id='a'*40, message='Commit message'), | |
|
258 | 'status_change': None, | |
|
259 | 'status_change_type': None, | |
|
260 | ||
|
261 | 'commit_target_repo_url': 'http://foo.example.com/#comment1', | |
|
262 | 'repo_name': 'test-repo', | |
|
263 | 'comment_type': 'note', | |
|
264 | 'comment_file': None, | |
|
265 | 'comment_line': None, | |
|
266 | 'commit_comment_url': 'http://comment-url', | |
|
267 | 'commit_comment_reply_url': 'http://comment-url#reply', | |
|
268 | 'comment_body': 'This is my comment body. *I like !*', | |
|
269 | 'comment_id': 2048, | |
|
270 | 'renderer_type': 'markdown', | |
|
271 | 'mention': True, | |
|
272 | }, | |
|
273 | 'cs_comment+status': { | |
|
274 | 'user': user, | |
|
275 | 'commit': AttributeDict(idx=123, raw_id='a' * 40, message='Commit message'), | |
|
276 | 'status_change': 'approved', | |
|
277 | 'status_change_type': 'approved', | |
|
278 | ||
|
279 | 'commit_target_repo_url': 'http://foo.example.com/#comment1', | |
|
280 | 'repo_name': 'test-repo', | |
|
281 | 'comment_type': 'note', | |
|
282 | 'comment_file': None, | |
|
283 | 'comment_line': None, | |
|
284 | 'commit_comment_url': 'http://comment-url', | |
|
285 | 'commit_comment_reply_url': 'http://comment-url#reply', | |
|
286 | 'comment_body': ''' | |
|
287 | Hello **world** | |
|
288 | ||
|
289 | This is a multiline comment :) | |
|
290 | ||
|
291 | - list | |
|
292 | - list2 | |
|
293 | ''', | |
|
294 | 'comment_id': 2048, | |
|
295 | 'renderer_type': 'markdown', | |
|
296 | 'mention': True, | |
|
297 | }, | |
|
298 | 'cs_comment+file': { | |
|
299 | 'user': user, | |
|
300 | 'commit': AttributeDict(idx=123, raw_id='a' * 40, message='Commit message'), | |
|
301 | 'status_change': None, | |
|
302 | 'status_change_type': None, | |
|
303 | ||
|
304 | 'commit_target_repo_url': 'http://foo.example.com/#comment1', | |
|
305 | 'repo_name': 'test-repo', | |
|
306 | ||
|
307 | 'comment_type': 'note', | |
|
308 | 'comment_file': 'test-file.py', | |
|
309 | 'comment_line': 'n100', | |
|
310 | ||
|
311 | 'commit_comment_url': 'http://comment-url', | |
|
312 | 'commit_comment_reply_url': 'http://comment-url#reply', | |
|
313 | 'comment_body': 'This is my comment body. *I like !*', | |
|
314 | 'comment_id': 2048, | |
|
315 | 'renderer_type': 'markdown', | |
|
316 | 'mention': True, | |
|
317 | }, | |
|
318 | ||
|
319 | 'pull_request': { | |
|
320 | 'user': user, | |
|
321 | 'pull_request': pr, | |
|
322 | 'pull_request_commits': [ | |
|
323 | ('472d1df03bf7206e278fcedc6ac92b46b01c4e21', '''\ | |
|
324 | my-account: moved email closer to profile as it's similar data just moved outside. | |
|
325 | '''), | |
|
326 | ('cbfa3061b6de2696c7161ed15ba5c6a0045f90a7', '''\ | |
|
327 | users: description edit fixes | |
|
328 | ||
|
329 | - tests | |
|
330 | - added metatags info | |
|
331 | '''), | |
|
332 | ], | |
|
333 | ||
|
334 | 'pull_request_target_repo': target_repo, | |
|
335 | 'pull_request_target_repo_url': 'http://target-repo/url', | |
|
336 | ||
|
337 | 'pull_request_source_repo': source_repo, | |
|
338 | 'pull_request_source_repo_url': 'http://source-repo/url', | |
|
339 | ||
|
340 | 'pull_request_url': 'http://code.rhodecode.com/_pull-request/123', | |
|
341 | } | |
|
342 | ||
|
343 | } | |
|
344 | ||
|
345 | template_type = email_id.split('+')[0] | |
|
346 | (c.subject, c.headers, c.email_body, | |
|
347 | c.email_body_plaintext) = EmailNotificationModel().render_email( | |
|
348 | template_type, **email_kwargs.get(email_id, {})) | |
|
349 | ||
|
350 | test_email = self.request.GET.get('email') | |
|
351 | if test_email: | |
|
352 | recipients = [test_email] | |
|
353 | run_task(tasks.send_email, recipients, c.subject, | |
|
354 | c.email_body_plaintext, c.email_body) | |
|
355 | ||
|
356 | if self.request.matched_route.name == 'debug_style_email_plain_rendered': | |
|
357 | template = 'debug_style/email_plain_rendered.mako' | |
|
358 | else: | |
|
359 | template = 'debug_style/email.mako' | |
|
360 | return render_to_response( | |
|
361 | template, self._get_template_context(c), | |
|
362 | request=self.request) | |
|
363 | ||
|
364 | @view_config( | |
|
49 | 365 | route_name='debug_style_template', request_method='GET', |
|
50 | 366 | renderer=None) |
|
51 | 367 | def template(self): |
@@ -53,7 +369,18 b' class DebugStyleView(BaseAppView):' | |||
|
53 | 369 | c = self.load_default_context() |
|
54 | 370 | c.active = os.path.splitext(t_path)[0] |
|
55 | 371 | c.came_from = '' |
|
372 | c.email_types = { | |
|
373 | 'cs_comment+file': {}, | |
|
374 | 'cs_comment+status': {}, | |
|
375 | ||
|
376 | 'pull_request_comment+file': {}, | |
|
377 | 'pull_request_comment+status': {}, | |
|
378 | ||
|
379 | 'pull_request_update': {}, | |
|
380 | } | |
|
381 | c.email_types.update(EmailNotificationModel.email_types) | |
|
56 | 382 | |
|
57 | 383 | return render_to_response( |
|
58 | 384 | 'debug_style/' + t_path, self._get_template_context(c), |
|
59 | request=self.request) |
\ No newline at end of file
|
385 | request=self.request) | |
|
386 |
@@ -44,6 +44,9 b' def includeme(config):' | |||
|
44 | 44 | config.add_route( |
|
45 | 45 | name='download_file', |
|
46 | 46 | pattern='/_file_store/download/{fid}') |
|
47 | config.add_route( | |
|
48 | name='download_file_by_token', | |
|
49 | pattern='/_file_store/token-download/{_auth_token}/{fid}') | |
|
47 | 50 | |
|
48 | 51 | # Scan module for configuration decorators. |
|
49 | 52 | config.scan('.views', ignore='.tests') |
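The route registered above exposes artifact downloads at `/_file_store/token-download/{_auth_token}/{fid}`. A minimal sketch of composing such a link outside of RhodeCode; the host, token and file id below are placeholders, not values from this changeset::

    def artifact_token_download_url(base_url, auth_token, fid):
        # mirrors the '/_file_store/token-download/{_auth_token}/{fid}'
        # pattern registered above; all argument values are placeholders
        return '{}/_file_store/token-download/{}/{}'.format(
            base_url.rstrip('/'), auth_token, fid)

    print(artifact_token_download_url(
        'https://code.example.com', 'SECRET-TOKEN', 'example.txt'))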
@@ -26,7 +26,8 b' import hashlib' | |||
|
26 | 26 | from rhodecode.lib.ext_json import json |
|
27 | 27 | from rhodecode.apps.file_store import utils |
|
28 | 28 | from rhodecode.apps.file_store.extensions import resolve_extensions |
|
29 | from rhodecode.apps.file_store.exceptions import |
|
|
29 | from rhodecode.apps.file_store.exceptions import ( | |
|
30 | FileNotAllowedException, FileOverSizeException) | |
|
30 | 31 | |
|
31 | 32 | METADATA_VER = 'v1' |
|
32 | 33 | |
@@ -91,6 +92,9 b' class LocalFileStorage(object):' | |||
|
91 | 92 | self.base_path = base_path |
|
92 | 93 | self.extensions = resolve_extensions([], groups=extension_groups) |
|
93 | 94 | |
|
95 | def __repr__(self): | |
|
96 | return '{}@{}'.format(self.__class__, self.base_path) | |
|
97 | ||
|
94 | 98 | def store_path(self, filename): |
|
95 | 99 | """ |
|
96 | 100 | Returns absolute file path of the filename, joined to the |
@@ -140,16 +144,21 b' class LocalFileStorage(object):' | |||
|
140 | 144 | :param ext: extension to check |
|
141 | 145 | :param extensions: iterable of extensions to validate against (or self.extensions) |
|
142 | 146 | """ |
|
147 | def normalize_ext(_ext): | |
|
148 | if _ext.startswith('.'): | |
|
149 | _ext = _ext[1:] | |
|
150 | return _ext.lower() | |
|
143 | 151 | |
|
144 | 152 | extensions = extensions or self.extensions |
|
145 | 153 | if not extensions: |
|
146 | 154 | return True |
|
147 | if ext.startswith('.'): | |
|
148 | ext = ext[1:] | |
|
149 | return ext.lower() in extensions | |
|
155 | ||
|
156 | ext = normalize_ext(ext) | |
|
157 | ||
|
158 | return ext in [normalize_ext(x) for x in extensions] | |
|
150 | 159 | |
|
151 | 160 | def save_file(self, file_obj, filename, directory=None, extensions=None, |
|
152 | extra_metadata=None, **kwargs): | |
|
161 | extra_metadata=None, max_filesize=None, **kwargs): | |
|
153 | 162 | """ |
|
154 | 163 | Saves a file object to the uploads location. |
|
155 | 164 | Returns the resolved filename, i.e. the directory + |
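The `normalize_ext` helper introduced in this hunk makes the allow-list check insensitive to case and to a leading dot on either side of the comparison. A standalone sketch of the same logic; the assertions are added here only for illustration::

    def normalize_ext(ext):
        # strip a leading dot and lower-case, as extension_allowed() now does
        return ext[1:].lower() if ext.startswith('.') else ext.lower()

    def extension_allowed(ext, extensions=None):
        # an empty allow-list means every extension is accepted
        if not extensions:
            return True
        return normalize_ext(ext) in [normalize_ext(x) for x in extensions]

    assert extension_allowed('.TXT', ['txt'])
    assert extension_allowed('txt', ['.TXT'])
    assert extension_allowed('.exe', [])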
@@ -159,7 +168,9 b' class LocalFileStorage(object):' | |||
|
159 | 168 | :param filename: original filename |
|
160 | 169 | :param directory: relative path of sub-directory |
|
161 | 170 | :param extensions: iterable of allowed extensions, if not default |
|
171 | :param max_filesize: maximum size of file that should be allowed | |
|
162 | 172 | :param extra_metadata: extra JSON metadata to store next to the file with .meta suffix |
|
173 | ||
|
163 | 174 | """ |
|
164 | 175 | |
|
165 | 176 | extensions = extensions or self.extensions |
@@ -191,6 +202,12 b' class LocalFileStorage(object):' | |||
|
191 | 202 | metadata = extra_metadata |
|
192 | 203 | |
|
193 | 204 | size = os.stat(path).st_size |
|
205 | ||
|
206 | if max_filesize and size > max_filesize: | |
|
207 | # free up the copied file, and raise exc | |
|
208 | os.remove(path) | |
|
209 | raise FileOverSizeException() | |
|
210 | ||
|
194 | 211 | file_hash = self.calculate_path_hash(path) |
|
195 | 212 | |
|
196 | 213 | metadata.update( |
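With the new `max_filesize` argument, `save_file` stats the copied file and, when the limit is exceeded, removes the copy before raising `FileOverSizeException`. A hedged usage sketch; the store path is a placeholder that must exist and be writable::

    import io

    from rhodecode.apps.file_store.backends.local_store import LocalFileStorage
    from rhodecode.apps.file_store.exceptions import FileOverSizeException

    store = LocalFileStorage(base_path='/tmp/artifact-store')  # placeholder path

    try:
        store_uid, metadata = store.save_file(
            io.BytesIO(b'x' * 2048), 'report.bin', max_filesize=1024)
    except FileOverSizeException:
        # the oversized copy is deleted from disk before the exception propagates
        print('artifact rejected: larger than max_filesize')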
@@ -21,7 +21,8 b' import os' | |||
|
21 | 21 | import pytest |
|
22 | 22 | |
|
23 | 23 | from rhodecode.lib.ext_json import json |
|
24 | from rhodecode.model. |
|
|
24 | from rhodecode.model.auth_token import AuthTokenModel | |
|
25 | from rhodecode.model.db import Session, FileStore, Repository, User | |
|
25 | 26 | from rhodecode.tests import TestController |
|
26 | 27 | from rhodecode.apps.file_store import utils, config_keys |
|
27 | 28 | |
@@ -32,6 +33,7 b' def route_path(name, params=None, **kwar' | |||
|
32 | 33 | base_url = { |
|
33 | 34 | 'upload_file': '/_file_store/upload', |
|
34 | 35 | 'download_file': '/_file_store/download/{fid}', |
|
36 | 'download_file_by_token': '/_file_store/token-download/{_auth_token}/{fid}' | |
|
35 | 37 | |
|
36 | 38 | }[name].format(**kwargs) |
|
37 | 39 | |
@@ -124,3 +126,136 b' class TestFileStoreViews(TestController)' | |||
|
124 | 126 | status=200) |
|
125 | 127 | |
|
126 | 128 | assert response.json['store_fid'] |
|
129 | ||
|
130 | @pytest.fixture() | |
|
131 | def create_artifact_factory(self, tmpdir): | |
|
132 | def factory(user_id, content): | |
|
133 | store_path = self.app._pyramid_settings[config_keys.store_path] | |
|
134 | store = utils.get_file_storage({config_keys.store_path: store_path}) | |
|
135 | fid = 'example.txt' | |
|
136 | ||
|
137 | filesystem_file = os.path.join(str(tmpdir), fid) | |
|
138 | with open(filesystem_file, 'wb') as f: | |
|
139 | f.write(content) | |
|
140 | ||
|
141 | with open(filesystem_file, 'rb') as f: | |
|
142 | store_uid, metadata = store.save_file(f, fid, extra_metadata={'filename': fid}) | |
|
143 | ||
|
144 | entry = FileStore.create( | |
|
145 | file_uid=store_uid, filename=metadata["filename"], | |
|
146 | file_hash=metadata["sha256"], file_size=metadata["size"], | |
|
147 | file_display_name='file_display_name', | |
|
148 | file_description='repo artifact `{}`'.format(metadata["filename"]), | |
|
149 | check_acl=True, user_id=user_id, | |
|
150 | ) | |
|
151 | Session().add(entry) | |
|
152 | Session().commit() | |
|
153 | return entry | |
|
154 | return factory | |
|
155 | ||
|
156 | def test_download_file_non_scoped(self, user_util, create_artifact_factory): | |
|
157 | user = self.log_user() | |
|
158 | user_id = user['user_id'] | |
|
159 | content = 'HELLO MY NAME IS ARTIFACT !' | |
|
160 | ||
|
161 | artifact = create_artifact_factory(user_id, content) | |
|
162 | file_uid = artifact.file_uid | |
|
163 | response = self.app.get(route_path('download_file', fid=file_uid), status=200) | |
|
164 | assert response.text == content | |
|
165 | ||
|
166 | # log-in to new user and test download again | |
|
167 | user = user_util.create_user(password='qweqwe') | |
|
168 | self.log_user(user.username, 'qweqwe') | |
|
169 | response = self.app.get(route_path('download_file', fid=file_uid), status=200) | |
|
170 | assert response.text == content | |
|
171 | ||
|
172 | def test_download_file_scoped_to_repo(self, user_util, create_artifact_factory): | |
|
173 | user = self.log_user() | |
|
174 | user_id = user['user_id'] | |
|
175 | content = 'HELLO MY NAME IS ARTIFACT !' | |
|
176 | ||
|
177 | artifact = create_artifact_factory(user_id, content) | |
|
178 | # bind to repo | |
|
179 | repo = user_util.create_repo() | |
|
180 | repo_id = repo.repo_id | |
|
181 | artifact.scope_repo_id = repo_id | |
|
182 | Session().add(artifact) | |
|
183 | Session().commit() | |
|
184 | ||
|
185 | file_uid = artifact.file_uid | |
|
186 | response = self.app.get(route_path('download_file', fid=file_uid), status=200) | |
|
187 | assert response.text == content | |
|
188 | ||
|
189 | # log-in to new user and test download again | |
|
190 | user = user_util.create_user(password='qweqwe') | |
|
191 | self.log_user(user.username, 'qweqwe') | |
|
192 | response = self.app.get(route_path('download_file', fid=file_uid), status=200) | |
|
193 | assert response.text == content | |
|
194 | ||
|
195 | # forbid user the rights to repo | |
|
196 | repo = Repository.get(repo_id) | |
|
197 | user_util.grant_user_permission_to_repo(repo, user, 'repository.none') | |
|
198 | self.app.get(route_path('download_file', fid=file_uid), status=404) | |
|
199 | ||
|
200 | def test_download_file_scoped_to_user(self, user_util, create_artifact_factory): | |
|
201 | user = self.log_user() | |
|
202 | user_id = user['user_id'] | |
|
203 | content = 'HELLO MY NAME IS ARTIFACT !' | |
|
204 | ||
|
205 | artifact = create_artifact_factory(user_id, content) | |
|
206 | # bind to user | |
|
207 | user = user_util.create_user(password='qweqwe') | |
|
208 | ||
|
209 | artifact.scope_user_id = user.user_id | |
|
210 | Session().add(artifact) | |
|
211 | Session().commit() | |
|
212 | ||
|
213 | # artifact creator doesn't have access since it's bind to another user | |
|
214 | file_uid = artifact.file_uid | |
|
215 | self.app.get(route_path('download_file', fid=file_uid), status=404) | |
|
216 | ||
|
217 | # log-in to new user and test download again, should be ok since we're bind to this artifact | |
|
218 | self.log_user(user.username, 'qweqwe') | |
|
219 | response = self.app.get(route_path('download_file', fid=file_uid), status=200) | |
|
220 | assert response.text == content | |
|
221 | ||
|
222 | def test_download_file_scoped_to_repo_with_bad_token(self, user_util, create_artifact_factory): | |
|
223 | user_id = User.get_first_super_admin().user_id | |
|
224 | content = 'HELLO MY NAME IS ARTIFACT !' | |
|
225 | ||
|
226 | artifact = create_artifact_factory(user_id, content) | |
|
227 | # bind to repo | |
|
228 | repo = user_util.create_repo() | |
|
229 | repo_id = repo.repo_id | |
|
230 | artifact.scope_repo_id = repo_id | |
|
231 | Session().add(artifact) | |
|
232 | Session().commit() | |
|
233 | ||
|
234 | file_uid = artifact.file_uid | |
|
235 | self.app.get(route_path('download_file_by_token', | |
|
236 | _auth_token='bogus', fid=file_uid), status=302) | |
|
237 | ||
|
238 | def test_download_file_scoped_to_repo_with_token(self, user_util, create_artifact_factory): | |
|
239 | user = User.get_first_super_admin() | |
|
240 | AuthTokenModel().create(user, 'test artifact token', | |
|
241 | role=AuthTokenModel.cls.ROLE_ARTIFACT_DOWNLOAD) | |
|
242 | ||
|
243 | user = User.get_first_super_admin() | |
|
244 | artifact_token = user.artifact_token | |
|
245 | ||
|
246 | user_id = User.get_first_super_admin().user_id | |
|
247 | content = 'HELLO MY NAME IS ARTIFACT !' | |
|
248 | ||
|
249 | artifact = create_artifact_factory(user_id, content) | |
|
250 | # bind to repo | |
|
251 | repo = user_util.create_repo() | |
|
252 | repo_id = repo.repo_id | |
|
253 | artifact.scope_repo_id = repo_id | |
|
254 | Session().add(artifact) | |
|
255 | Session().commit() | |
|
256 | ||
|
257 | file_uid = artifact.file_uid | |
|
258 | response = self.app.get( | |
|
259 | route_path('download_file_by_token', | |
|
260 | _auth_token=artifact_token, fid=file_uid), status=200) | |
|
261 | assert response.text == content |
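Outside the test harness, the artifact-download token exercised above can be provisioned with the same model call. A sketch, assuming an existing user and that the session still needs an explicit commit::

    from rhodecode.model.auth_token import AuthTokenModel
    from rhodecode.model.db import Session, User

    user = User.get_first_super_admin()
    AuthTokenModel().create(
        user, 'artifact download token',
        role=AuthTokenModel.cls.ROLE_ARTIFACT_DOWNLOAD)
    Session().commit()

    # user.artifact_token now resolves to a token usable in
    # /_file_store/token-download/{token}/{fid} URLs
    print(user.artifact_token)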
@@ -25,7 +25,7 b' import pathlib2' | |||
|
25 | 25 | |
|
26 | 26 | |
|
27 | 27 | def get_file_storage(settings): |
|
28 | from rhodecode.apps.file_store.local_store import LocalFileStorage | |
|
28 | from rhodecode.apps.file_store.backends.local_store import LocalFileStorage | |
|
29 | 29 | from rhodecode.apps.file_store import config_keys |
|
30 | 30 | store_path = settings.get(config_keys.store_path) |
|
31 | 31 | return LocalFileStorage(base_path=store_path) |
@@ -30,8 +30,10 b' from rhodecode.apps.file_store.exception' | |||
|
30 | 30 | |
|
31 | 31 | from rhodecode.lib import helpers as h |
|
32 | 32 | from rhodecode.lib import audit_logger |
|
33 | from rhodecode.lib.auth import (CSRFRequired, NotAnonymous, HasRepoPermissionAny, HasRepoGroupPermissionAny) | |
|
34 | from rhodecode.model.db import Session, FileStore | |
|
33 | from rhodecode.lib.auth import ( | |
|
34 | CSRFRequired, NotAnonymous, HasRepoPermissionAny, HasRepoGroupPermissionAny, | |
|
35 | LoginRequired) | |
|
36 | from rhodecode.model.db import Session, FileStore, UserApiKeys | |
|
35 | 37 | |
|
36 | 38 | log = logging.getLogger(__name__) |
|
37 | 39 | |
@@ -44,6 +46,55 b' class FileStoreView(BaseAppView):' | |||
|
44 | 46 | self.storage = utils.get_file_storage(self.request.registry.settings) |
|
45 | 47 | return c |
|
46 | 48 | |
|
49 | def _serve_file(self, file_uid): | |
|
50 | ||
|
51 | if not self.storage.exists(file_uid): | |
|
52 | store_path = self.storage.store_path(file_uid) | |
|
53 | log.debug('File with FID:%s not found in the store under `%s`', | |
|
54 | file_uid, store_path) | |
|
55 | raise HTTPNotFound() | |
|
56 | ||
|
57 | db_obj = FileStore().query().filter(FileStore.file_uid == file_uid).scalar() | |
|
58 | if not db_obj: | |
|
59 | raise HTTPNotFound() | |
|
60 | ||
|
61 | # private upload for user | |
|
62 | if db_obj.check_acl and db_obj.scope_user_id: | |
|
63 | log.debug('Artifact: checking scope access for bound artifact user: `%s`', | |
|
64 | db_obj.scope_user_id) | |
|
65 | user = db_obj.user | |
|
66 | if self._rhodecode_db_user.user_id != user.user_id: | |
|
67 | log.warning('Access to file store object forbidden') | |
|
68 | raise HTTPNotFound() | |
|
69 | ||
|
70 | # scoped to repository permissions | |
|
71 | if db_obj.check_acl and db_obj.scope_repo_id: | |
|
72 | log.debug('Artifact: checking scope access for bound artifact repo: `%s`', | |
|
73 | db_obj.scope_repo_id) | |
|
74 | repo = db_obj.repo | |
|
75 | perm_set = ['repository.read', 'repository.write', 'repository.admin'] | |
|
76 | has_perm = HasRepoPermissionAny(*perm_set)(repo.repo_name, 'FileStore check') | |
|
77 | if not has_perm: | |
|
78 | log.warning('Access to file store object `%s` forbidden', file_uid) | |
|
79 | raise HTTPNotFound() | |
|
80 | ||
|
81 | # scoped to repository group permissions | |
|
82 | if db_obj.check_acl and db_obj.scope_repo_group_id: | |
|
83 | log.debug('Artifact: checking scope access for bound artifact repo group: `%s`', | |
|
84 | db_obj.scope_repo_group_id) | |
|
85 | repo_group = db_obj.repo_group | |
|
86 | perm_set = ['group.read', 'group.write', 'group.admin'] | |
|
87 | has_perm = HasRepoGroupPermissionAny(*perm_set)(repo_group.group_name, 'FileStore check') | |
|
88 | if not has_perm: | |
|
89 | log.warning('Access to file store object `%s` forbidden', file_uid) | |
|
90 | raise HTTPNotFound() | |
|
91 | ||
|
92 | FileStore.bump_access_counter(file_uid) | |
|
93 | ||
|
94 | file_path = self.storage.store_path(file_uid) | |
|
95 | return FileResponse(file_path) | |
|
96 | ||
|
97 | @LoginRequired() | |
|
47 | 98 | @NotAnonymous() |
|
48 | 99 | @CSRFRequired() |
|
49 | 100 | @view_config(route_name='upload_file', request_method='POST', renderer='json_ext') |
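The `_serve_file` helper added above walks three optional scopes before streaming the file: user-bound, repository-bound and repository-group-bound artifacts. A simplified mirror of that decision, with the permission lookups reduced to plain booleans purely for illustration::

    def artifact_visible(db_obj, current_user_id,
                         has_repo_read=False, has_repo_group_read=False):
        # user-scoped artifacts are served only to the bound user
        if db_obj.check_acl and db_obj.scope_user_id:
            return current_user_id == db_obj.scope_user_id
        # repository-scoped artifacts need at least repository.read
        if db_obj.check_acl and db_obj.scope_repo_id:
            return has_repo_read
        # repository-group-scoped artifacts need at least group.read
        if db_obj.check_acl and db_obj.scope_repo_group_id:
            return has_repo_group_read
        # unscoped artifacts (or check_acl=False) are served to everyone
        return True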
@@ -84,7 +135,7 b' class FileStoreView(BaseAppView):' | |||
|
84 | 135 | entry = FileStore.create( |
|
85 | 136 | file_uid=store_uid, filename=metadata["filename"], |
|
86 | 137 | file_hash=metadata["sha256"], file_size=metadata["size"], |
|
87 | file_description='upload attachment', | |
|
138 | file_description=u'upload attachment', | |
|
88 | 139 | check_acl=False, user_id=self._rhodecode_user.user_id |
|
89 | 140 | ) |
|
90 | 141 | Session().add(entry) |
@@ -99,46 +150,25 b' class FileStoreView(BaseAppView):' | |||
|
99 | 150 | return {'store_fid': store_uid, |
|
100 | 151 | 'access_path': h.route_path('download_file', fid=store_uid)} |
|
101 | 152 | |
|
153 | # ACL is checked by scopes, if no scope the file is accessible to all | |
|
102 | 154 | @view_config(route_name='download_file') |
|
103 | 155 | def download_file(self): |
|
104 | 156 | self.load_default_context() |
|
105 | 157 | file_uid = self.request.matchdict['fid'] |
|
106 | 158 | log.debug('Requesting FID:%s from store %s', file_uid, self.storage) |
|
107 | ||
|
108 | if not self.storage.exists(file_uid): | |
|
109 | log.debug('File with FID:%s not found in the store', file_uid) | |
|
110 | raise HTTPNotFound() | |
|
111 | ||
|
112 | db_obj = FileStore().query().filter(FileStore.file_uid == file_uid).scalar() | |
|
113 | if not db_obj: | |
|
114 | raise HTTPNotFound() | |
|
115 | ||
|
116 | # private upload for user | |
|
117 | if db_obj.check_acl and db_obj.scope_user_id: | |
|
118 | user = db_obj.user | |
|
119 | if self._rhodecode_db_user.user_id != user.user_id: | |
|
120 | log.warning('Access to file store object forbidden') | |
|
121 | raise HTTPNotFound() | |
|
159 | return self._serve_file(file_uid) | |
|
122 | 160 | |
|
123 | # scoped to repository permissions | |
|
124 | if db_obj.check_acl and db_obj.scope_repo_id: | |
|
125 | repo = db_obj.repo | |
|
126 | perm_set = ['repository.read', 'repository.write', 'repository.admin'] | |
|
127 | has_perm = HasRepoPermissionAny(*perm_set)(repo.repo_name, 'FileStore check') | |
|
128 | if not has_perm: | |
|
129 | log.warning('Access to file store object forbidden') | |
|
130 | raise HTTPNotFound() | |
|
161 | # in addition to @LoginRequired ACL is checked by scopes | |
|
162 | @LoginRequired(auth_token_access=[UserApiKeys.ROLE_ARTIFACT_DOWNLOAD]) | |
|
163 | @NotAnonymous() | |
|
164 | @view_config(route_name='download_file_by_token') | |
|
165 | def download_file_by_token(self): | |
|
166 | """ | |
|
167 | Special view that allows to access the download file by special URL that | |
|
168 | is stored inside the URL. | |
|
131 | 169 |
|
|
132 | # scoped to repository group permissions | |
|
133 | if db_obj.check_acl and db_obj.scope_repo_group_id: | |
|
134 | repo_group = db_obj.repo_group | |
|
135 | perm_set = ['group.read', 'group.write', 'group.admin'] | |
|
136 | has_perm = HasRepoGroupPermissionAny(*perm_set)(repo_group.group_name, 'FileStore check') | |
|
137 | if not has_perm: | |
|
138 | log.warning('Access to file store object forbidden') | |
|
139 | raise HTTPNotFound() | |
|
140 | ||
|
141 | FileStore.bump_access_counter(file_uid) | |
|
142 | ||
|
143 | file_path = self.storage.store_path(file_uid) | |
|
144 | return FileResponse(file_path) | |
|
170 | http://example.com/_file_store/token-download/TOKEN/FILE_UID | |
|
171 | """ | |
|
172 | self.load_default_context() | |
|
173 | file_uid = self.request.matchdict['fid'] | |
|
174 | return self._serve_file(file_uid) |
@@ -83,7 +83,7 b' class GistUtility(object):' | |||
|
83 | 83 | Session().commit() |
|
84 | 84 | |
|
85 | 85 | |
|
86 | @pytest.fixture | |
|
86 | @pytest.fixture() | |
|
87 | 87 | def create_gist(request): |
|
88 | 88 | gist_utility = GistUtility() |
|
89 | 89 | request.addfinalizer(gist_utility.cleanup) |
@@ -159,7 +159,7 b' class TestGistsController(TestController' | |||
|
159 | 159 | params={'lifetime': -1, |
|
160 | 160 | 'content': 'gist test', |
|
161 | 161 | 'filename': 'foo', |
|
162 | ' |
|
|
162 | 'gist_type': 'public', | |
|
163 | 163 | 'gist_acl_level': Gist.ACL_LEVEL_PUBLIC, |
|
164 | 164 | 'csrf_token': self.csrf_token}, |
|
165 | 165 | status=302) |
@@ -174,7 +174,7 b' class TestGistsController(TestController' | |||
|
174 | 174 | params={'lifetime': -1, |
|
175 | 175 | 'content': 'gist test', |
|
176 | 176 | 'filename': '/home/foo', |
|
177 | ' |
|
|
177 | 'gist_type': 'public', | |
|
178 | 178 | 'gist_acl_level': Gist.ACL_LEVEL_PUBLIC, |
|
179 | 179 | 'csrf_token': self.csrf_token}, |
|
180 | 180 | status=200) |
@@ -197,7 +197,7 b' class TestGistsController(TestController' | |||
|
197 | 197 | params={'lifetime': -1, |
|
198 | 198 | 'content': 'private gist test', |
|
199 | 199 | 'filename': 'private-foo', |
|
200 | ' |
|
|
200 | 'gist_type': 'private', | |
|
201 | 201 | 'gist_acl_level': Gist.ACL_LEVEL_PUBLIC, |
|
202 | 202 | 'csrf_token': self.csrf_token}, |
|
203 | 203 | status=302) |
@@ -216,7 +216,7 b' class TestGistsController(TestController' | |||
|
216 | 216 | params={'lifetime': -1, |
|
217 | 217 | 'content': 'private gist test', |
|
218 | 218 | 'filename': 'private-foo', |
|
219 | ' |
|
|
219 | 'gist_type': 'private', | |
|
220 | 220 | 'gist_acl_level': Gist.ACL_LEVEL_PRIVATE, |
|
221 | 221 | 'csrf_token': self.csrf_token}, |
|
222 | 222 | status=302) |
@@ -236,7 +236,7 b' class TestGistsController(TestController' | |||
|
236 | 236 | 'content': 'gist test', |
|
237 | 237 | 'filename': 'foo-desc', |
|
238 | 238 | 'description': 'gist-desc', |
|
239 | ' |
|
|
239 | 'gist_type': 'public', | |
|
240 | 240 | 'gist_acl_level': Gist.ACL_LEVEL_PUBLIC, |
|
241 | 241 | 'csrf_token': self.csrf_token}, |
|
242 | 242 | status=302) |
@@ -252,7 +252,7 b' class TestGistsController(TestController' | |||
|
252 | 252 | 'content': 'gist test', |
|
253 | 253 | 'filename': 'foo-desc', |
|
254 | 254 | 'description': 'gist-desc', |
|
255 | ' |
|
|
255 | 'gist_type': 'public', | |
|
256 | 256 | 'gist_acl_level': Gist.ACL_LEVEL_PUBLIC, |
|
257 | 257 | 'csrf_token': self.csrf_token |
|
258 | 258 | } |
@@ -72,7 +72,7 b' class GistView(BaseAppView):' | |||
|
72 | 72 | @LoginRequired() |
|
73 | 73 | @view_config( |
|
74 | 74 | route_name='gists_show', request_method='GET', |
|
75 | renderer='rhodecode:templates/admin/gists/index.mako') | |
|
75 | renderer='rhodecode:templates/admin/gists/gist_index.mako') | |
|
76 | 76 | def gist_show_all(self): |
|
77 | 77 | c = self.load_default_context() |
|
78 | 78 | |
@@ -136,7 +136,7 b' class GistView(BaseAppView):' | |||
|
136 | 136 | @NotAnonymous() |
|
137 | 137 | @view_config( |
|
138 | 138 | route_name='gists_new', request_method='GET', |
|
139 | renderer='rhodecode:templates/admin/gists/new.mako') | |
|
139 | renderer='rhodecode:templates/admin/gists/gist_new.mako') | |
|
140 | 140 | def gist_new(self): |
|
141 | 141 | c = self.load_default_context() |
|
142 | 142 | return self._get_template_context(c) |
@@ -146,21 +146,26 b' class GistView(BaseAppView):' | |||
|
146 | 146 | @CSRFRequired() |
|
147 | 147 | @view_config( |
|
148 | 148 | route_name='gists_create', request_method='POST', |
|
149 | renderer='rhodecode:templates/admin/gists/new.mako') | |
|
149 | renderer='rhodecode:templates/admin/gists/gist_new.mako') | |
|
150 | 150 | def gist_create(self): |
|
151 | 151 | _ = self.request.translate |
|
152 | 152 | c = self.load_default_context() |
|
153 | 153 | |
|
154 | 154 | data = dict(self.request.POST) |
|
155 | 155 | data['filename'] = data.get('filename') or Gist.DEFAULT_FILENAME |
|
156 | ||
|
156 | 157 | data['nodes'] = [{ |
|
157 | 158 | 'filename': data['filename'], |
|
158 | 159 | 'content': data.get('content'), |
|
159 | 160 | 'mimetype': data.get('mimetype') # None is autodetect |
|
160 | 161 | }] |
|
162 | gist_type = { | |
|
163 | 'public': Gist.GIST_PUBLIC, | |
|
164 | 'private': Gist.GIST_PRIVATE | |
|
165 | }.get(data.get('gist_type')) or Gist.GIST_PRIVATE | |
|
161 | 166 | |
|
162 | data['gist_type'] = |
|
|
163 | Gist.GIST_PUBLIC if data.get('public') else Gist.GIST_PRIVATE) | |
|
167 | data['gist_type'] = gist_type | |
|
168 | ||
|
164 | 169 | data['gist_acl_level'] = ( |
|
165 | 170 | data.get('gist_acl_level') or Gist.ACL_LEVEL_PRIVATE) |
|
166 | 171 | |
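The create view now maps the submitted `gist_type` form value onto the model constants, falling back to a private gist for anything unexpected. A standalone sketch of that mapping; the constant values are stand-ins assumed here only to keep the example self-contained::

    GIST_PUBLIC = 'public'    # assumed stand-ins for the Gist.GIST_* constants
    GIST_PRIVATE = 'private'

    def resolve_gist_type(form_value):
        # unknown, missing or empty values all fall back to a private gist
        return {
            'public': GIST_PUBLIC,
            'private': GIST_PRIVATE,
        }.get(form_value) or GIST_PRIVATE

    assert resolve_gist_type('public') == GIST_PUBLIC
    assert resolve_gist_type(None) == GIST_PRIVATE
    assert resolve_gist_type('bogus') == GIST_PRIVATE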
@@ -196,7 +201,7 b' class GistView(BaseAppView):' | |||
|
196 | 201 | errors['filename'] = errors['nodes.0.filename'] |
|
197 | 202 | del errors['nodes.0.filename'] |
|
198 | 203 | |
|
199 | data = render('rhodecode:templates/admin/gists/new.mako', | |
|
204 | data = render('rhodecode:templates/admin/gists/gist_new.mako', | |
|
200 | 205 | self._get_template_context(c), self.request) |
|
201 | 206 | html = formencode.htmlfill.render( |
|
202 | 207 | data, |
@@ -260,10 +265,10 b' class GistView(BaseAppView):' | |||
|
260 | 265 | @LoginRequired() |
|
261 | 266 | @view_config( |
|
262 | 267 | route_name='gist_show', request_method='GET', |
|
263 | renderer='rhodecode:templates/admin/gists/show.mako') | |
|
268 | renderer='rhodecode:templates/admin/gists/gist_show.mako') | |
|
264 | 269 | @view_config( |
|
265 | 270 | route_name='gist_show_rev', request_method='GET', |
|
266 | renderer='rhodecode:templates/admin/gists/show.mako') | |
|
271 | renderer='rhodecode:templates/admin/gists/gist_show.mako') | |
|
267 | 272 | @view_config( |
|
268 | 273 | route_name='gist_show_formatted', request_method='GET', |
|
269 | 274 | renderer=None) |
@@ -304,7 +309,7 b' class GistView(BaseAppView):' | |||
|
304 | 309 | @NotAnonymous() |
|
305 | 310 | @view_config( |
|
306 | 311 | route_name='gist_edit', request_method='GET', |
|
307 | renderer='rhodecode:templates/admin/gists/edit.mako') | |
|
312 | renderer='rhodecode:templates/admin/gists/gist_edit.mako') | |
|
308 | 313 | def gist_edit(self): |
|
309 | 314 | _ = self.request.translate |
|
310 | 315 | gist_id = self.request.matchdict['gist_id'] |
@@ -338,7 +343,7 b' class GistView(BaseAppView):' | |||
|
338 | 343 | @CSRFRequired() |
|
339 | 344 | @view_config( |
|
340 | 345 | route_name='gist_update', request_method='POST', |
|
341 | renderer='rhodecode:templates/admin/gists/edit.mako') | |
|
346 | renderer='rhodecode:templates/admin/gists/gist_edit.mako') | |
|
342 | 347 | def gist_update(self): |
|
343 | 348 | _ = self.request.translate |
|
344 | 349 | gist_id = self.request.matchdict['gist_id'] |
@@ -44,6 +44,14 b' def includeme(config):' | |||
|
44 | 44 | pattern='/') |
|
45 | 45 | |
|
46 | 46 | config.add_route( |
|
47 | name='main_page_repos_data', | |
|
48 | pattern='/_home_repos') | |
|
49 | ||
|
50 | config.add_route( | |
|
51 | name='main_page_repo_groups_data', | |
|
52 | pattern='/_home_repo_groups') | |
|
53 | ||
|
54 | config.add_route( | |
|
47 | 55 | name='user_autocomplete_data', |
|
48 | 56 | pattern='/_users') |
|
49 | 57 |
@@ -22,7 +22,7 b'' | |||
|
22 | 22 | import pytest |
|
23 | 23 | |
|
24 | 24 | import rhodecode |
|
25 | from rhodecode.model.db import Repository | |
|
25 | from rhodecode.model.db import Repository, RepoGroup, User | |
|
26 | 26 | from rhodecode.model.meta import Session |
|
27 | 27 | from rhodecode.model.repo import RepoModel |
|
28 | 28 | from rhodecode.model.repo_group import RepoGroupModel |
@@ -37,6 +37,8 b' fixture = Fixture()' | |||
|
37 | 37 | def route_path(name, **kwargs): |
|
38 | 38 | return { |
|
39 | 39 | 'home': '/', |
|
40 | 'main_page_repos_data': '/_home_repos', | |
|
41 | 'main_page_repo_groups_data': '/_home_repo_groups', | |
|
40 | 42 | 'repo_group_home': '/{repo_group_name}' |
|
41 | 43 | }[name].format(**kwargs) |
|
42 | 44 | |
@@ -47,11 +49,42 b' class TestHomeController(TestController)' | |||
|
47 | 49 | self.log_user() |
|
48 | 50 | response = self.app.get(route_path('home')) |
|
49 | 51 | # if global permission is set |
|
50 | response.mustcontain(' |
|
|
52 | response.mustcontain('New Repository') | |
|
53 | ||
|
54 | def test_index_grid_repos(self, xhr_header): | |
|
55 | self.log_user() | |
|
56 | response = self.app.get(route_path('main_page_repos_data'), extra_environ=xhr_header) | |
|
57 | # search for objects inside the JavaScript JSON | |
|
58 | for obj in Repository.getAll(): | |
|
59 | response.mustcontain('<a href=\\"/{}\\">'.format(obj.repo_name)) | |
|
60 | ||
|
61 | def test_index_grid_repo_groups(self, xhr_header): | |
|
62 | self.log_user() | |
|
63 | response = self.app.get(route_path('main_page_repo_groups_data'), | |
|
64 | extra_environ=xhr_header,) | |
|
51 | 65 | |
|
52 | 66 | # search for objects inside the JavaScript JSON |
|
53 | for |
|
|
54 | response.mustcontain('" |
|
|
67 | for obj in RepoGroup.getAll(): | |
|
68 | response.mustcontain('<a href=\\"/{}\\">'.format(obj.group_name)) | |
|
69 | ||
|
70 | def test_index_grid_repo_groups_without_access(self, xhr_header, user_util): | |
|
71 | user = user_util.create_user(password='qweqwe') | |
|
72 | group_ok = user_util.create_repo_group(owner=user) | |
|
73 | group_id_ok = group_ok.group_id | |
|
74 | ||
|
75 | group_forbidden = user_util.create_repo_group(owner=User.get_first_super_admin()) | |
|
76 | group_id_forbidden = group_forbidden.group_id | |
|
77 | ||
|
78 | user_util.grant_user_permission_to_repo_group(group_forbidden, user, 'group.none') | |
|
79 | self.log_user(user.username, 'qweqwe') | |
|
80 | ||
|
81 | self.app.get(route_path('main_page_repo_groups_data'), | |
|
82 | extra_environ=xhr_header, | |
|
83 | params={'repo_group_id': group_id_ok}, status=200) | |
|
84 | ||
|
85 | self.app.get(route_path('main_page_repo_groups_data'), | |
|
86 | extra_environ=xhr_header, | |
|
87 | params={'repo_group_id': group_id_forbidden}, status=404) | |
|
55 | 88 | |
|
56 | 89 | def test_index_contains_statics_with_ver(self): |
|
57 | 90 | from rhodecode.lib.base import calculate_version_hash |
@@ -62,11 +95,11 b' class TestHomeController(TestController)' | |||
|
62 | 95 | rhodecode_version_hash = calculate_version_hash( |
|
63 | 96 | {'beaker.session.secret': 'test-rc-uytcxaz'}) |
|
64 | 97 | response.mustcontain('style.css?ver={0}'.format(rhodecode_version_hash)) |
|
65 | response.mustcontain('scripts.js?ver={0}'.format(rhodecode_version_hash)) | |
|
98 | response.mustcontain('scripts.min.js?ver={0}'.format(rhodecode_version_hash)) | |
|
66 | 99 | |
|
67 | def test_index_contains_backend_specific_details(self, backend): | |
|
100 | def test_index_contains_backend_specific_details(self, backend, xhr_header): | |
|
68 | 101 | self.log_user() |
|
69 | response = self.app.get(route_path(' |
|
|
102 | response = self.app.get(route_path('main_page_repos_data'), extra_environ=xhr_header) | |
|
70 | 103 | tip = backend.repo.get_commit().raw_id |
|
71 | 104 | |
|
72 | 105 | # html in javascript variable: |
@@ -81,39 +114,44 b' class TestHomeController(TestController)' | |||
|
81 | 114 | response = self.app.get(route_path('home'), status=302) |
|
82 | 115 | assert 'login' in response.location |
|
83 | 116 | |
|
84 | def test_index_page_on_groups(self, autologin_user, |
|
|
85 | response = self.app.get(route_path('repo_group_home', repo_group_name='gr1')) | |
|
86 | response.mustcontain("gr1/repo_in_group") | |
|
117 | def test_index_page_on_groups_with_wrong_group_id(self, autologin_user, xhr_header): | |
|
118 | group_id = 918123 | |
|
119 | self.app.get( | |
|
120 | route_path('main_page_repo_groups_data'), | |
|
121 | params={'repo_group_id': group_id}, | |
|
122 | status=404, extra_environ=xhr_header) | |
|
87 | 123 | |
|
88 | def test_index_page_on_group_with_trailing_slash( | |
|
89 | self, autologin_user, repo_group): | |
|
90 | response = self.app.get(route_path('repo_group_home', repo_group_name='gr1') + '/') | |
|
91 | response.mustcontain("gr1/repo_in_group") | |
|
124 | def test_index_page_on_groups(self, autologin_user, user_util, xhr_header): | |
|
125 | gr = user_util.create_repo_group() | |
|
126 | repo = user_util.create_repo(parent=gr) | |
|
127 | repo_name = repo.repo_name | |
|
128 | group_id = gr.group_id | |
|
92 | 129 | |
|
93 | @pytest.fixture(scope='class') | |
|
94 | def repo_group(self, request): | |
|
95 | gr = fixture.create_repo_group('gr1') | |
|
96 | fixture.create_repo(name='gr1/repo_in_group', repo_group=gr) | |
|
130 | response = self.app.get(route_path( | |
|
131 | 'repo_group_home', repo_group_name=gr.group_name)) | |
|
132 | response.mustcontain('d.repo_group_id = {}'.format(group_id)) | |
|
97 | 133 | |
|
98 | @request.addfinalizer | |
|
99 | def cleanup(): | |
|
100 | RepoModel().delete('gr1/repo_in_group') | |
|
101 | RepoGroupModel().delete(repo_group='gr1', force_delete=True) | |
|
102 | Session().commit() | |
|
134 | response = self.app.get( | |
|
135 | route_path('main_page_repos_data'), | |
|
136 | params={'repo_group_id': group_id}, | |
|
137 | extra_environ=xhr_header,) | |
|
138 | response.mustcontain(repo_name) | |
|
103 | 139 | |
|
104 | def test_index_ |
|
|
105 |
|
|
|
106 | username = user.username | |
|
107 | user.name = '<img src="/image1" onload="alert(\'Hello, World!\');">' | |
|
108 | user.lastname = '#"><img src=x onerror=prompt(document.cookie);>' | |
|
140 | def test_index_page_on_group_with_trailing_slash(self, autologin_user, user_util, xhr_header): | |
|
141 | gr = user_util.create_repo_group() | |
|
142 | repo = user_util.create_repo(parent=gr) | |
|
143 | repo_name = repo.repo_name | |
|
144 | group_id = gr.group_id | |
|
109 | 145 | |
|
110 | Session().add(user) | |
|
111 | Session().commit() | |
|
112 | user_util.create_repo(owner=username) | |
|
146 | response = self.app.get(route_path( | |
|
147 | 'repo_group_home', repo_group_name=gr.group_name+'/')) | |
|
148 | response.mustcontain('d.repo_group_id = {}'.format(group_id)) | |
|
113 | 149 | |
|
114 | response = self.app.get( |
|
|
115 | response.mustcontain(h.html_escape(user.first_name)) | |
|
116 | response.mustcontain(h.html_escape(user.last_name)) | |
|
150 | response = self.app.get( | |
|
151 | route_path('main_page_repos_data'), | |
|
152 | params={'repo_group_id': group_id}, | |
|
153 | extra_environ=xhr_header, ) | |
|
154 | response.mustcontain(repo_name) | |
|
117 | 155 | |
|
118 | 156 | @pytest.mark.parametrize("name, state", [ |
|
119 | 157 | ('Disabled', False), |
@@ -137,5 +175,5 b' class TestHomeController(TestController)' | |||
|
137 | 175 | def test_logout_form_contains_csrf(self, autologin_user, csrf_token): |
|
138 | 176 | response = self.app.get(route_path('home')) |
|
139 | 177 | assert_response = response.assert_response() |
|
140 | element = assert_response.get_element('.logout |
|
|
178 | element = assert_response.get_element('.logout [name=csrf_token]') | |
|
141 | 179 | assert element.value == csrf_token |
@@ -22,29 +22,30 b' import re' | |||
|
22 | 22 | import logging |
|
23 | 23 | import collections |
|
24 | 24 | |
|
25 | from pyramid.httpexceptions import HTTPNotFound | |
|
25 | 26 | from pyramid.view import view_config |
|
26 | 27 | |
|
27 | from rhodecode.apps._base import BaseAppView | |
|
28 | from rhodecode.apps._base import BaseAppView, DataGridAppView | |
|
28 | 29 | from rhodecode.lib import helpers as h |
|
29 | 30 | from rhodecode.lib.auth import ( |
|
30 | LoginRequired, NotAnonymous, HasRepoGroupPermissionAnyDecorator, CSRFRequired |
|
|
31 | LoginRequired, NotAnonymous, HasRepoGroupPermissionAnyDecorator, CSRFRequired, | |
|
32 | HasRepoGroupPermissionAny, AuthUser) | |
|
31 | 33 | from rhodecode.lib.codeblocks import filenode_as_lines_tokens |
|
32 | 34 | from rhodecode.lib.index import searcher_from_config |
|
33 | 35 | from rhodecode.lib.utils2 import safe_unicode, str2bool, safe_int |
|
34 | from rhodecode.lib.ext_json import json | |
|
35 | 36 | from rhodecode.lib.vcs.nodes import FileNode |
|
36 | 37 | from rhodecode.model.db import ( |
|
37 | func, true, or_, case, in_filter_generator, |
|
|
38 | func, true, or_, case, in_filter_generator, Session, | |
|
39 | Repository, RepoGroup, User, UserGroup) | |
|
38 | 40 | from rhodecode.model.repo import RepoModel |
|
39 | 41 | from rhodecode.model.repo_group import RepoGroupModel |
|
40 | from rhodecode.model.scm import RepoGroupList, RepoList | |
|
41 | 42 | from rhodecode.model.user import UserModel |
|
42 | 43 | from rhodecode.model.user_group import UserGroupModel |
|
43 | 44 | |
|
44 | 45 | log = logging.getLogger(__name__) |
|
45 | 46 | |
|
46 | 47 | |
|
47 | class HomeView(BaseAppView): | |
|
48 | class HomeView(BaseAppView, DataGridAppView): | |
|
48 | 49 | |
|
49 | 50 | def load_default_context(self): |
|
50 | 51 | c = self._get_local_tmpl_context() |
@@ -112,7 +113,12 b' class HomeView(BaseAppView):' | |||
|
112 | 113 | ['repository.read', 'repository.write', 'repository.admin'], |
|
113 | 114 | cache=False, name_filter=name_contains) or [-1] |
|
114 | 115 | |
|
115 | query = |
|
|
116 | query = Session().query( | |
|
117 | Repository.repo_name, | |
|
118 | Repository.repo_id, | |
|
119 | Repository.repo_type, | |
|
120 | Repository.private, | |
|
121 | )\ | |
|
116 | 122 | .filter(Repository.archived.isnot(true()))\ |
|
117 | 123 | .filter(or_( |
|
118 | 124 | # generate multiple IN to fix limitation problems |
@@ -158,7 +164,10 b' class HomeView(BaseAppView):' | |||
|
158 | 164 | ['group.read', 'group.write', 'group.admin'], |
|
159 | 165 | cache=False, name_filter=name_contains) or [-1] |
|
160 | 166 | |
|
161 | query = |
|
|
167 | query = Session().query( | |
|
168 | RepoGroup.group_id, | |
|
169 | RepoGroup.group_name, | |
|
170 | )\ | |
|
162 | 171 | .filter(or_( |
|
163 | 172 | # generate multiple IN to fix limitation problems |
|
164 | 173 | *in_filter_generator(RepoGroup.group_id, allowed_ids) |
@@ -449,6 +458,7 b' class HomeView(BaseAppView):' | |||
|
449 | 458 | 'id': -10, |
|
450 | 459 | 'value': query, |
|
451 | 460 | 'value_display': label, |
|
461 | 'value_icon': '<i class="icon-code"></i>', | |
|
452 | 462 | 'type': 'search', |
|
453 | 463 | 'subtype': 'repo', |
|
454 | 464 | 'url': h.route_path('search_repo', |
@@ -466,6 +476,7 b' class HomeView(BaseAppView):' | |||
|
466 | 476 | 'id': -20, |
|
467 | 477 | 'value': query, |
|
468 | 478 | 'value_display': label, |
|
479 | 'value_icon': '<i class="icon-history"></i>', | |
|
469 | 480 | 'type': 'search', |
|
470 | 481 | 'subtype': 'repo', |
|
471 | 482 | 'url': h.route_path('search_repo', |
@@ -491,6 +502,7 b' class HomeView(BaseAppView):' | |||
|
491 | 502 | 'id': -30, |
|
492 | 503 | 'value': query, |
|
493 | 504 | 'value_display': label, |
|
505 | 'value_icon': '<i class="icon-code"></i>', | |
|
494 | 506 | 'type': 'search', |
|
495 | 507 | 'subtype': 'repo_group', |
|
496 | 508 | 'url': h.route_path('search_repo_group', |
@@ -508,6 +520,7 b' class HomeView(BaseAppView):' | |||
|
508 | 520 | 'id': -40, |
|
509 | 521 | 'value': query, |
|
510 | 522 | 'value_display': label, |
|
523 | 'value_icon': '<i class="icon-history"></i>', | |
|
511 | 524 | 'type': 'search', |
|
512 | 525 | 'subtype': 'repo_group', |
|
513 | 526 | 'url': h.route_path('search_repo_group', |
@@ -529,6 +542,7 b' class HomeView(BaseAppView):' | |||
|
529 | 542 | 'id': -1, |
|
530 | 543 | 'value': query, |
|
531 | 544 | 'value_display': u'File search for: `{}`'.format(query), |
|
545 | 'value_icon': '<i class="icon-code"></i>', | |
|
532 | 546 | 'type': 'search', |
|
533 | 547 | 'subtype': 'global', |
|
534 | 548 | 'url': h.route_path('search', |
@@ -539,6 +553,7 b' class HomeView(BaseAppView):' | |||
|
539 | 553 | 'id': -2, |
|
540 | 554 | 'value': query, |
|
541 | 555 | 'value_display': u'Commit search for: `{}`'.format(query), |
|
556 | 'value_icon': '<i class="icon-history"></i>', | |
|
542 | 557 | 'type': 'search', |
|
543 | 558 | 'subtype': 'global', |
|
544 | 559 | 'url': h.route_path('search', |
@@ -667,23 +682,6 b' class HomeView(BaseAppView):' | |||
|
667 | 682 | |
|
668 | 683 | return {'suggestions': res} |
|
669 | 684 | |
|
670 | def _get_groups_and_repos(self, repo_group_id=None): | |
|
671 | # repo groups groups | |
|
672 | repo_group_list = RepoGroup.get_all_repo_groups(group_id=repo_group_id) | |
|
673 | _perms = ['group.read', 'group.write', 'group.admin'] | |
|
674 | repo_group_list_acl = RepoGroupList(repo_group_list, perm_set=_perms) | |
|
675 | repo_group_data = RepoGroupModel().get_repo_groups_as_dict( | |
|
676 | repo_group_list=repo_group_list_acl, admin=False) | |
|
677 | ||
|
678 | # repositories | |
|
679 | repo_list = Repository.get_all_repos(group_id=repo_group_id) | |
|
680 | _perms = ['repository.read', 'repository.write', 'repository.admin'] | |
|
681 | repo_list_acl = RepoList(repo_list, perm_set=_perms) | |
|
682 | repo_data = RepoModel().get_repos_as_dict( | |
|
683 | repo_list=repo_list_acl, admin=False) | |
|
684 | ||
|
685 | return repo_data, repo_group_data | |
|
686 | ||
|
687 | 685 | @LoginRequired() |
|
688 | 686 | @view_config( |
|
689 | 687 | route_name='home', request_method='GET', |
@@ -691,17 +689,74 b' class HomeView(BaseAppView):' | |||
|
691 | 689 | def main_page(self): |
|
692 | 690 | c = self.load_default_context() |
|
693 | 691 | c.repo_group = None |
|
694 | ||
|
695 | repo_data, repo_group_data = self._get_groups_and_repos() | |
|
696 | # json used to render the grids | |
|
697 | c.repos_data = json.dumps(repo_data) | |
|
698 | c.repo_groups_data = json.dumps(repo_group_data) | |
|
699 | ||
|
700 | 692 | return self._get_template_context(c) |
|
701 | 693 | |
|
694 | def _main_page_repo_groups_data(self, repo_group_id): | |
|
695 | column_map = { | |
|
696 | 'name': 'group_name_hash', | |
|
697 | 'desc': 'group_description', | |
|
698 | 'last_change': 'updated_on', | |
|
699 | 'owner': 'user_username', | |
|
700 | } | |
|
701 | draw, start, limit = self._extract_chunk(self.request) | |
|
702 | search_q, order_by, order_dir = self._extract_ordering( | |
|
703 | self.request, column_map=column_map) | |
|
704 | return RepoGroupModel().get_repo_groups_data_table( | |
|
705 | draw, start, limit, | |
|
706 | search_q, order_by, order_dir, | |
|
707 | self._rhodecode_user, repo_group_id) | |
|
708 | ||
|
709 | def _main_page_repos_data(self, repo_group_id): | |
|
710 | column_map = { | |
|
711 | 'name': 'repo_name', | |
|
712 | 'desc': 'description', | |
|
713 | 'last_change': 'updated_on', | |
|
714 | 'owner': 'user_username', | |
|
715 | } | |
|
716 | draw, start, limit = self._extract_chunk(self.request) | |
|
717 | search_q, order_by, order_dir = self._extract_ordering( | |
|
718 | self.request, column_map=column_map) | |
|
719 | return RepoModel().get_repos_data_table( | |
|
720 | draw, start, limit, | |
|
721 | search_q, order_by, order_dir, | |
|
722 | self._rhodecode_user, repo_group_id) | |
|
723 | ||
|
702 | 724 | @LoginRequired() |
|
703 | @HasRepoGroupPermissionAnyDecorator( | |
|
704 | 'group.read', 'group.write', 'group.admin') | |
|
725 | @view_config( | |
|
726 | route_name='main_page_repo_groups_data', | |
|
727 | request_method='GET', renderer='json_ext', xhr=True) | |
|
728 | def main_page_repo_groups_data(self): | |
|
729 | self.load_default_context() | |
|
730 | repo_group_id = safe_int(self.request.GET.get('repo_group_id')) | |
|
731 | ||
|
732 | if repo_group_id: | |
|
733 | group = RepoGroup.get_or_404(repo_group_id) | |
|
734 | _perms = AuthUser.repo_group_read_perms | |
|
735 | if not HasRepoGroupPermissionAny(*_perms)( | |
|
736 | group.group_name, 'user is allowed to list repo group children'): | |
|
737 | raise HTTPNotFound() | |
|
738 | ||
|
739 | return self._main_page_repo_groups_data(repo_group_id) | |
|
740 | ||
|
741 | @LoginRequired() | |
|
742 | @view_config( | |
|
743 | route_name='main_page_repos_data', | |
|
744 | request_method='GET', renderer='json_ext', xhr=True) | |
|
745 | def main_page_repos_data(self): | |
|
746 | self.load_default_context() | |
|
747 | repo_group_id = safe_int(self.request.GET.get('repo_group_id')) | |
|
748 | ||
|
749 | if repo_group_id: | |
|
750 | group = RepoGroup.get_or_404(repo_group_id) | |
|
751 | _perms = AuthUser.repo_group_read_perms | |
|
752 | if not HasRepoGroupPermissionAny(*_perms)( | |
|
753 | group.group_name, 'user is allowed to list repo group children'): | |
|
754 | raise HTTPNotFound() | |
|
755 | ||
|
756 | return self._main_page_repos_data(repo_group_id) | |
|
757 | ||
|
758 | @LoginRequired() | |
|
759 | @HasRepoGroupPermissionAnyDecorator(*AuthUser.repo_group_read_perms) | |
|
705 | 760 | @view_config( |
|
706 | 761 | route_name='repo_group_home', request_method='GET', |
|
707 | 762 | renderer='rhodecode:templates/index_repo_group.mako') |
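The main page now loads its repository and repository-group grids from the new XHR-only endpoints `/_home_repos` and `/_home_repo_groups`, optionally scoped with a `repo_group_id` parameter. A rough Python 3 stdlib sketch of fetching one of them; the host and group id are placeholders, and the logged-in session that the real endpoint requires is ignored here::

    import json
    import urllib.request

    req = urllib.request.Request(
        'https://code.example.com/_home_repos?repo_group_id=5',
        # the views are registered with xhr=True, so mark the request as AJAX
        headers={'X-Requested-With': 'XMLHttpRequest'})

    with urllib.request.urlopen(req) as resp:
        grid_payload = json.load(resp)
    print(grid_payload)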
@@ -711,16 +766,6 b' class HomeView(BaseAppView):' | |||
|
711 | 766 | def repo_group_main_page(self): |
|
712 | 767 | c = self.load_default_context() |
|
713 | 768 | c.repo_group = self.request.db_repo_group |
|
714 | repo_data, repo_group_data = self._get_groups_and_repos(c.repo_group.group_id) | |
|
715 | ||
|
716 | # update every 5 min | |
|
717 | if self.request.db_repo_group.last_commit_cache_update_diff > 60 * 5: | |
|
718 | self.request.db_repo_group.update_commit_cache() | |
|
719 | ||
|
720 | # json used to render the grids | |
|
721 | c.repos_data = json.dumps(repo_data) | |
|
722 | c.repo_groups_data = json.dumps(repo_group_data) | |
|
723 | ||
|
724 | 769 | return self._get_template_context(c) |
|
725 | 770 | |
|
726 | 771 | @LoginRequired() |
@@ -22,7 +22,7 b'' | |||
|
22 | 22 | import logging |
|
23 | 23 | import itertools |
|
24 | 24 | |
|
25 | from webhelpers.feedgenerator import Atom1Feed, Rss201rev2Feed | |
|
25 | ||
|
26 | 26 | |
|
27 | 27 | from pyramid.view import view_config |
|
28 | 28 | from pyramid.httpexceptions import HTTPBadRequest |
@@ -34,10 +34,11 b' from rhodecode.model.db import (' | |||
|
34 | 34 | or_, joinedload, Repository, UserLog, UserFollowing, User, UserApiKeys) |
|
35 | 35 | from rhodecode.model.meta import Session |
|
36 | 36 | import rhodecode.lib.helpers as h |
|
37 | from rhodecode.lib.helpers import Page | |
|
37 | from rhodecode.lib.helpers import SqlPage | |
|
38 | 38 | from rhodecode.lib.user_log_filter import user_log_filter |
|
39 | 39 | from rhodecode.lib.auth import LoginRequired, NotAnonymous, CSRFRequired, HasRepoPermissionAny |
|
40 | 40 | from rhodecode.lib.utils2 import safe_int, AttributeDict, md5_safe |
|
41 | from rhodecode.lib.feedgenerator.feedgenerator import Atom1Feed, Rss201rev2Feed | |
|
41 | 42 | from rhodecode.model.scm import ScmModel |
|
42 | 43 | |
|
43 | 44 | log = logging.getLogger(__name__) |
@@ -166,7 +167,7 b' class JournalView(BaseAppView):' | |||
|
166 | 167 | description=desc) |
|
167 | 168 | |
|
168 | 169 | response = Response(feed.writeString('utf-8')) |
|
169 | response.content_type = feed. |
|
|
170 | response.content_type = feed.content_type | |
|
170 | 171 | return response |
|
171 | 172 | |
|
172 | 173 | def _rss_feed(self, repos, search_term, public=True): |
@@ -212,7 +213,7 b' class JournalView(BaseAppView):' | |||
|
212 | 213 | description=desc) |
|
213 | 214 | |
|
214 | 215 | response = Response(feed.writeString('utf-8')) |
|
215 | response.content_type = feed. |
|
|
216 | response.content_type = feed.content_type | |
|
216 | 217 | return response |
|
217 | 218 | |
|
218 | 219 | @LoginRequired() |
@@ -232,15 +233,15 b' class JournalView(BaseAppView):' | |||
|
232 | 233 | |
|
233 | 234 | journal = self._get_journal_data(following, c.search_term) |
|
234 | 235 | |
|
235 | def url_generator( |
|
|
236 | def url_generator(page_num): | |
|
236 | 237 | query_params = { |
|
238 | 'page': page_num, | |
|
237 | 239 | 'filter': c.search_term |
|
238 | 240 | } |
|
239 | query_params.update(kw) | |
|
240 | 241 | return self.request.current_route_path(_query=query_params) |
|
241 | 242 | |
|
242 | c.journal_pager = Page( | |
|
243 | journal, page=p, items_per_page=20, url=url_generator) | |
|
243 | c.journal_pager = SqlPage( | |
|
244 | journal, page=p, items_per_page=20, url_maker=url_generator) | |
|
244 | 245 | c.journal_day_aggreagate = self._get_daily_aggregate(c.journal_pager) |
|
245 | 246 | |
|
246 | 247 | c.journal_data = render( |
@@ -333,13 +334,14 b' class JournalView(BaseAppView):' | |||
|
333 | 334 | |
|
334 | 335 | journal = self._get_journal_data(c.following, c.search_term) |
|
335 | 336 | |
|
336 | def url_generator( |
|
|
337 | query_params = { |
|
|
338 | query_params.update(kw) | |
|
337 | def url_generator(page_num): | |
|
338 | query_params = { | |
|
339 | 'page': page_num | |
|
340 | } | |
|
339 | 341 | return self.request.current_route_path(_query=query_params) |
|
340 | 342 | |
|
341 | c.journal_pager = Page( | |
|
342 | journal, page=p, items_per_page=20, url=url_generator) | |
|
343 | c.journal_pager = SqlPage( | |
|
344 | journal, page=p, items_per_page=20, url_maker=url_generator) | |
|
343 | 345 | c.journal_day_aggreagate = self._get_daily_aggregate(c.journal_pager) |
|
344 | 346 | |
|
345 | 347 | c.journal_data = render( |
@@ -93,7 +93,7 b' class TestLoginController(object):' | |||
|
93 | 93 | session = response.get_session_from_response() |
|
94 | 94 | username = session['rhodecode_user'].get('username') |
|
95 | 95 | assert username == 'test_admin' |
|
96 | response.mustcontain(' |
|
|
96 | response.mustcontain('logout') | |
|
97 | 97 | |
|
98 | 98 | def test_login_regular_ok(self): |
|
99 | 99 | response = self.app.post(route_path('login'), |
@@ -104,8 +104,7 b' class TestLoginController(object):' | |||
|
104 | 104 | session = response.get_session_from_response() |
|
105 | 105 | username = session['rhodecode_user'].get('username') |
|
106 | 106 | assert username == 'test_regular' |
|
107 | ||
|
108 | response.mustcontain('/%s' % HG_REPO) | |
|
107 | response.mustcontain('logout') | |
|
109 | 108 | |
|
110 | 109 | def test_login_regular_forbidden_when_super_admin_restriction(self): |
|
111 | 110 | from rhodecode.authentication.plugins.auth_rhodecode import RhodeCodeAuthPlugin |
@@ -225,7 +224,7 b' class TestLoginController(object):' | |||
|
225 | 224 | session = response.get_session_from_response() |
|
226 | 225 | username = session['rhodecode_user'].get('username') |
|
227 | 226 | assert username == temp_user |
|
228 | response.mustcontain(' |
|
|
227 | response.mustcontain('logout') | |
|
229 | 228 | |
|
230 | 229 | # new password should be bcrypted, after log-in and transfer |
|
231 | 230 | user = User.get_by_username(temp_user) |
@@ -401,7 +400,7 b' class TestLoginController(object):' | |||
|
401 | 400 | ) # This should be overridden |
|
402 | 401 | |
|
403 | 402 | assert_session_flash( |
|
404 | response, 'You have successfully registered with RhodeCode') | |
|
403 | response, 'You have successfully registered with RhodeCode. You can log-in now.') | |
|
405 | 404 | |
|
406 | 405 | ret = Session().query(User).filter( |
|
407 | 406 | User.username == 'test_regular4').one() |
@@ -88,7 +88,7 b' class TestPasswordReset(TestController):' | |||
|
88 | 88 | |
|
89 | 89 | response = self.app.get(route_path('reset_password')) |
|
90 | 90 | |
|
91 | assert_response = |
|
|
91 | assert_response = response.assert_response() | |
|
92 | 92 | if show_reset: |
|
93 | 93 | response.mustcontain('Send password reset email') |
|
94 | 94 | assert_response.one_element_exists('#email') |
@@ -90,7 +90,7 b' class TestRegisterCaptcha(object):' | |||
|
90 | 90 | |
|
91 | 91 | response = app.get(ADMIN_PREFIX + '/register') |
|
92 | 92 | |
|
93 | assertr = |
|
|
93 | assertr = response.assert_response() | |
|
94 | 94 | if active: |
|
95 | 95 | assertr.one_element_exists('#recaptcha_field') |
|
96 | 96 | else: |
@@ -128,6 +128,6 b' class TestRegisterCaptcha(object):' | |||
|
128 | 128 | else: |
|
129 | 129 | # If captche input is invalid we expect to stay on the registration |
|
130 | 130 | # page with an error message displayed. |
|
131 | assertr = |
|
|
131 | assertr = response.assert_response() | |
|
132 | 132 | assert response.status_int == 200 |
|
133 | 133 | assertr.one_element_exists('#recaptcha_field ~ span.error-message') |
@@ -312,8 +312,6 b' class LoginView(BaseAppView):' | |||
|
312 | 312 | action_data = {'data': new_user.get_api_data(), |
|
313 | 313 | 'user_agent': self.request.user_agent} |
|
314 | 314 | |
|
315 | ||
|
316 | ||
|
317 | 315 | if external_identity: |
|
318 | 316 | action_data['external_identity'] = external_identity |
|
319 | 317 | |
@@ -329,8 +327,13 b' class LoginView(BaseAppView):' | |||
|
329 | 327 | event = UserRegistered(user=new_user, session=self.session) |
|
330 | 328 | trigger(event) |
|
331 | 329 | h.flash( |
|
332 | _('You have successfully registered with RhodeCode'), | |
|
330 | _('You have successfully registered with RhodeCode. You can log-in now.'), | |
|
333 | 331 | category='success') |
|
332 | if external_identity: | |
|
333 | h.flash( | |
|
334 | _('Please use the {identity} button to log-in').format( | |
|
335 | identity=external_identity), | |
|
336 | category='success') | |
|
334 | 337 | Session().commit() |
|
335 | 338 | |
|
336 | 339 | redirect_ro = self.request.route_path('login') |
@@ -33,22 +33,21 b' from rhodecode import forms' | |||
|
33 | 33 | from rhodecode.lib import helpers as h |
|
34 | 34 | from rhodecode.lib import audit_logger |
|
35 | 35 | from rhodecode.lib.ext_json import json |
|
36 | from rhodecode.lib.auth import LoginRequired, NotAnonymous, CSRFRequired, \ | |
|
37 | HasRepoPermissionAny, HasRepoGroupPermissionAny | |
|
36 | from rhodecode.lib.auth import ( | |
|
37 | LoginRequired, NotAnonymous, CSRFRequired, | |
|
38 | HasRepoPermissionAny, HasRepoGroupPermissionAny, AuthUser) | |
|
38 | 39 | from rhodecode.lib.channelstream import ( |
|
39 | 40 | channelstream_request, ChannelstreamException) |
|
40 | 41 | from rhodecode.lib.utils2 import safe_int, md5, str2bool |
|
41 | 42 | from rhodecode.model.auth_token import AuthTokenModel |
|
42 | 43 | from rhodecode.model.comment import CommentsModel |
|
43 | 44 | from rhodecode.model.db import ( |
|
44 | IntegrityError, |
|
|
45 | IntegrityError, or_, in_filter_generator, | |
|
45 | 46 | Repository, UserEmailMap, UserApiKeys, UserFollowing, |
|
46 | 47 | PullRequest, UserBookmark, RepoGroup) |
|
47 | 48 | from rhodecode.model.meta import Session |
|
48 | 49 | from rhodecode.model.pull_request import PullRequestModel |
|
49 | from rhodecode.model.scm import RepoList | |
|
50 | 50 | from rhodecode.model.user import UserModel |
|
51 | from rhodecode.model.repo import RepoModel | |
|
52 | 51 | from rhodecode.model.user_group import UserGroupModel |
|
53 | 52 | from rhodecode.model.validation_schema.schemas import user_schema |
|
54 | 53 | |
@@ -345,22 +344,59 b' class MyAccountView(BaseAppView, DataGri' | |||
|
345 | 344 | 'You should see a new live message now.'} |
|
346 | 345 | |
|
347 | 346 | def _load_my_repos_data(self, watched=False): |
|
347 | ||
|
348 | allowed_ids = [-1] + self._rhodecode_user.repo_acl_ids_from_stack(AuthUser.repo_read_perms) | |
|
349 | ||
|
348 | 350 | if watched: |
|
349 | admin = False | |
|
350 |
|
|
|
351 | .filter(UserFollowing.user_id == self._rhodecode_user.user_id)\ | |
|
352 | .options(joinedload(UserFollowing.follows_repository))\ | |
|
351 | # repos user watch | |
|
352 | repo_list = Session().query( | |
|
353 | Repository | |
|
354 | ) \ | |
|
355 | .join( | |
|
356 | (UserFollowing, UserFollowing.follows_repo_id == Repository.repo_id) | |
|
357 | ) \ | |
|
358 | .filter( | |
|
359 | UserFollowing.user_id == self._rhodecode_user.user_id | |
|
360 | ) \ | |
|
361 | .filter(or_( | |
|
362 | # generate multiple IN to fix limitation problems | |
|
363 | *in_filter_generator(Repository.repo_id, allowed_ids)) | |
|
364 | ) \ | |
|
365 | .order_by(Repository.repo_name) \ | |
|
353 | 366 | .all() |
|
354 | repo_list = [x.follows_repository for x in follows_repos] | |
|
367 | ||
|
355 | 368 | else: |
|
356 | admin = True | |
|
357 |
repo_list = |
|
|
358 | user_id=self._rhodecode_user.user_id) | |
|
359 | repo_list = RepoList(repo_list, perm_set=[ | |
|
360 | 'repository.read', 'repository.write', 'repository.admin']) | |
|
369 | # repos user is owner of | |
|
370 | repo_list = Session().query( | |
|
371 | Repository | |
|
372 | ) \ | |
|
373 | .filter( | |
|
374 | Repository.user_id == self._rhodecode_user.user_id | |
|
375 | ) \ | |
|
376 | .filter(or_( | |
|
377 | # generate multiple IN to fix limitation problems | |
|
378 | *in_filter_generator(Repository.repo_id, allowed_ids)) | |
|
379 | ) \ | |
|
380 | .order_by(Repository.repo_name) \ | |
|
381 | .all() | |
|
361 | 382 | |
|
362 | repos_data = RepoModel().get_repos_as_dict( | |
|
363 | repo_list=repo_list, admin=admin, short_name=False) | |
|
383 | _render = self.request.get_partial_renderer( | |
|
384 | 'rhodecode:templates/data_table/_dt_elements.mako') | |
|
385 | ||
|
386 | def repo_lnk(name, rtype, rstate, private, archived, fork_of): | |
|
387 | return _render('repo_name', name, rtype, rstate, private, archived, fork_of, | |
|
388 | short_name=False, admin=False) | |
|
389 | ||
|
390 | repos_data = [] | |
|
391 | for repo in repo_list: | |
|
392 | row = { | |
|
393 | "name": repo_lnk(repo.repo_name, repo.repo_type, repo.repo_state, | |
|
394 | repo.private, repo.archived, repo.fork), | |
|
395 | "name_raw": repo.repo_name.lower(), | |
|
396 | } | |
|
397 | ||
|
398 | repos_data.append(row) | |
|
399 | ||
|
364 | 400 | # json used to render the grid |
|
365 | 401 | return json.dumps(repos_data) |
|
366 | 402 | |
@@ -398,16 +434,19 b' class MyAccountView(BaseAppView, DataGri' | |||
|
398 | 434 | def my_account_bookmarks(self): |
|
399 | 435 | c = self.load_default_context() |
|
400 | 436 | c.active = 'bookmarks' |
|
437 | c.bookmark_items = UserBookmark.get_bookmarks_for_user( | |
|
438 | self._rhodecode_db_user.user_id, cache=False) | |
|
401 | 439 | return self._get_template_context(c) |
|
402 | 440 | |
|
403 | def _process_entry(self, entry, user_id): | |
|
441 | def _process_bookmark_entry(self, entry, user_id): | |
|
404 | 442 | position = safe_int(entry.get('position')) |
|
443 | cur_position = safe_int(entry.get('cur_position')) | |
|
405 | 444 | if position is None: |
|
406 | 445 | return |
|
407 | 446 | |
|
408 | 447 | # check if this is an existing entry |
|
409 | 448 | is_new = False |
|
410 | db_entry = UserBookmark().get_by_position_for_user(position, user_id) | |
|
449 | db_entry = UserBookmark().get_by_position_for_user(cur_position, user_id) | |
|
411 | 450 | |
|
412 | 451 | if db_entry and str2bool(entry.get('remove')): |
|
413 | 452 | log.debug('Marked bookmark %s for deletion', db_entry) |
@@ -446,12 +485,12 b' class MyAccountView(BaseAppView, DataGri' | |||
|
446 | 485 | should_save = True |
|
447 | 486 | |
|
448 | 487 | if should_save: |
|
449 | log.debug('Saving bookmark %s, new:%s', db_entry, is_new) | |
|
450 | 488 | # mark user and position |
|
451 | 489 | db_entry.user_id = user_id |
|
452 | 490 | db_entry.position = position |
|
453 | 491 | db_entry.title = entry.get('title') |
|
454 | 492 | db_entry.redirect_url = entry.get('redirect_url') or default_redirect_url |
|
493 | log.debug('Saving bookmark %s, new:%s', db_entry, is_new) | |
|
455 | 494 | |
|
456 | 495 | Session().add(db_entry) |
|
457 | 496 | |
@@ -468,15 +507,31 b' class MyAccountView(BaseAppView, DataGri' | |||
|
468 | 507 | controls = peppercorn.parse(self.request.POST.items()) |
|
469 | 508 | user_id = c.user.user_id |
|
470 | 509 | |
|
510 | # validate positions | |
|
511 | positions = {} | |
|
512 | for entry in controls.get('bookmarks', []): | |
|
513 | position = safe_int(entry['position']) | |
|
514 | if position is None: | |
|
515 | continue | |
|
516 | ||
|
517 | if position in positions: | |
|
518 | h.flash(_("Position {} is defined twice. " | |
|
519 | "Please correct this error.").format(position), category='error') | |
|
520 | return HTTPFound(h.route_path('my_account_bookmarks')) | |
|
521 | ||
|
522 | entry['position'] = position | |
|
523 | entry['cur_position'] = safe_int(entry.get('cur_position')) | |
|
524 | positions[position] = entry | |
|
525 | ||
|
471 | 526 | try: |
|
472 | for entry in

473 | self._process_entry(entry, user_id) | |
|
527 | for entry in positions.values(): | |
|
528 | self._process_bookmark_entry(entry, user_id) | |
|
474 | 529 | |
|
475 | 530 | Session().commit() |
|
476 | 531 | h.flash(_("Update Bookmarks"), category='success') |
|
477 | 532 | except IntegrityError: |
|
478 | 533 | h.flash(_("Failed to update bookmarks. " |
|
479 | "Make sure an unique position is used"), category='error') | |
|
534 | "Make sure an unique position is used."), category='error') | |
|
480 | 535 | |
|
481 | 536 | return HTTPFound(h.route_path('my_account_bookmarks')) |
|
482 | 537 | |
@@ -582,6 +637,7 b' class MyAccountView(BaseAppView, DataGri' | |||
|
582 | 637 | 'email': c.user.email, |
|
583 | 638 | 'firstname': c.user.firstname, |
|
584 | 639 | 'lastname': c.user.lastname, |
|
640 | 'description': c.user.description, | |
|
585 | 641 | } |
|
586 | 642 | c.form = forms.RcForm( |
|
587 | 643 | schema, appstruct=appstruct, |
@@ -664,7 +720,8 b' class MyAccountView(BaseAppView, DataGri' | |||
|
664 | 720 | 'target_repo': _render('pullrequest_target_repo', |
|
665 | 721 | pr.target_repo.repo_name), |
|
666 | 722 | 'name': _render('pullrequest_name', |
|
667 | pr.pull_request_id, pr.

723 | pr.pull_request_id, pr.pull_request_state, | |
|
724 | pr.work_in_progress, pr.target_repo.repo_name, | |
|
668 | 725 | short=True), |
|
669 | 726 | 'name_raw': pr.pull_request_id, |
|
670 | 727 | 'status': _render('pullrequest_status', |
@@ -28,7 +28,7 b' from rhodecode.apps._base import BaseApp' | |||
|
28 | 28 | from rhodecode.lib.auth import LoginRequired, NotAnonymous, CSRFRequired |
|
29 | 29 | |
|
30 | 30 | from rhodecode.lib import helpers as h |
|
31 | from rhodecode.lib.helpers import Page | |
|
31 | from rhodecode.lib.helpers import SqlPage | |
|
32 | 32 | from rhodecode.lib.utils2 import safe_int |
|
33 | 33 | from rhodecode.model.db import Notification |
|
34 | 34 | from rhodecode.model.notification import NotificationModel |
@@ -74,13 +74,16 b' class MyAccountNotificationsView(BaseApp' | |||
|
74 | 74 | |
|
75 | 75 | p = safe_int(self.request.GET.get('page', 1), 1) |
|
76 | 76 | |
|
77 | def url_generator(

77 | def url_generator(page_num): | |
|
78 | query_params = { | |
|
79 | 'page': page_num | |
|
80 | } | |
|
78 | 81 | _query = self.request.GET.mixed() |
|
79 |
|
|
|
80 | return self.request.current_route_path(_query=

82 | query_params.update(_query) | |
|
83 | return self.request.current_route_path(_query=query_params) | |
|
81 | 84 | |
|
82 | c.notifications = Page(notifications, page=p, items_per_page=10, | |
|
83 | url=url_generator) | |
|
85 | c.notifications = SqlPage(notifications, page=p, items_per_page=10, | |
|
86 | url_maker=url_generator) | |
|
84 | 87 | |
|
85 | 88 | c.unread_type = 'unread' |
|
86 | 89 | c.all_type = 'all' |
@@ -46,9 +46,16 b' class RepoGroupSettingsView(RepoGroupApp' | |||
|
46 | 46 | route_name='edit_repo_group_advanced', request_method='GET', |
|
47 | 47 | renderer='rhodecode:templates/admin/repo_groups/repo_group_edit.mako') |
|
48 | 48 | def edit_repo_group_advanced(self): |
|
49 | _ = self.request.translate | |
|
49 | 50 | c = self.load_default_context() |
|
50 | 51 | c.active = 'advanced' |
|
51 | 52 | c.repo_group = self.db_repo_group |
|
53 | ||
|
54 | # update commit cache if GET flag is present | |
|
55 | if self.request.GET.get('update_commit_cache'): | |
|
56 | self.db_repo_group.update_commit_cache() | |
|
57 | h.flash(_('updated commit cache'), category='success') | |
|
58 | ||
|
52 | 59 | return self._get_template_context(c) |
|
53 | 60 | |
|
54 | 61 | @LoginRequired() |
@@ -79,6 +79,10 b' def includeme(config):' | |||
|
79 | 79 | pattern='/{repo_name:.*?[^/]}/changeset/{commit_id}/comment/preview', repo_route=True) |
|
80 | 80 | |
|
81 | 81 | config.add_route( |
|
82 | name='repo_commit_comment_attachment_upload', | |
|
83 | pattern='/{repo_name:.*?[^/]}/changeset/{commit_id}/comment/attachment_upload', repo_route=True) | |
|
84 | ||
|
85 | config.add_route( | |
|
82 | 86 | name='repo_commit_comment_delete', |
|
83 | 87 | pattern='/{repo_name:.*?[^/]}/changeset/{commit_id}/comment/{comment_id}/delete', repo_route=True) |
|
84 | 88 |
@@ -98,7 +98,7 b' class TestRepoCommitCommentsView(TestCon' | |||
|
98 | 98 | assert notification.type_ == Notification.TYPE_CHANGESET_COMMENT |
|
99 | 99 | |
|
100 | 100 | author = notification.created_by_user.username_and_name |
|
101 | sbj = '{0} left a {1} on commit `{2}` in the {3} repository'.format( | |
|
101 | sbj = '@{0} left a {1} on commit `{2}` in the `{3}` repository'.format( | |
|
102 | 102 | author, comment_type, h.show_id(commit), backend.repo_name) |
|
103 | 103 | assert sbj == notification.subject |
|
104 | 104 | |
@@ -159,7 +159,7 b' class TestRepoCommitCommentsView(TestCon' | |||
|
159 | 159 | assert comment.revision == commit_id |
|
160 | 160 | |
|
161 | 161 | author = notification.created_by_user.username_and_name |
|
162 | sbj = '{0} left a {1} on file `{2}` in commit `{3}` in the {4} repository'.format( | |
|
162 | sbj = '@{0} left a {1} on file `{2}` in commit `{3}` in the `{4}` repository'.format( | |
|
163 | 163 | author, comment_type, f_path, h.show_id(commit), backend.repo_name) |
|
164 | 164 | |
|
165 | 165 | assert sbj == notification.subject |
@@ -230,7 +230,7 b' class TestRepoCommitCommentsView(TestCon' | |||
|
230 | 230 | assert notification.type_ == Notification.TYPE_CHANGESET_COMMENT |
|
231 | 231 | |
|
232 | 232 | author = notification.created_by_user.username_and_name |
|
233 | sbj = '[status: Approved] {0} left a note on commit `{1}` in the {2} repository'.format( | |
|
233 | sbj = '[status: Approved] @{0} left a note on commit `{1}` in the `{2}` repository'.format( | |
|
234 | 234 | author, h.show_id(commit), backend.repo_name) |
|
235 | 235 | assert sbj == notification.subject |
|
236 | 236 | |
@@ -299,14 +299,14 b' class TestRepoCommitCommentsView(TestCon' | |||
|
299 | 299 | |
|
300 | 300 | def assert_comment_links(response, comments, inline_comments): |
|
301 | 301 | if comments == 1: |
|
302 | comments_text = "%d

302 | comments_text = "%d General" % comments | |
|
303 | 303 | else: |
|
304 | comments_text = "%d

304 | comments_text = "%d General" % comments | |
|
305 | 305 | |
|
306 | 306 | if inline_comments == 1: |
|
307 | inline_comments_text = "%d Inline

307 | inline_comments_text = "%d Inline" % inline_comments | |
|
308 | 308 | else: |
|
309 | inline_comments_text = "%d Inline

309 | inline_comments_text = "%d Inline" % inline_comments | |
|
310 | 310 | |
|
311 | 311 | if comments: |
|
312 | 312 | response.mustcontain('<a href="#comments">%s</a>,' % comments_text) |
@@ -315,6 +315,6 b' def assert_comment_links(response, comme' | |||
|
315 | 315 | |
|
316 | 316 | if inline_comments: |
|
317 | 317 | response.mustcontain( |
|
318 | 'id="inline-comments-counter">%s

318 | 'id="inline-comments-counter">%s' % inline_comments_text) | |
|
319 | 319 | else: |
|
320 | 320 | response.mustcontain(inline_comments_text) |
@@ -20,6 +20,7 b'' | |||
|
20 | 20 | |
|
21 | 21 | import pytest |
|
22 | 22 | |
|
23 | from rhodecode.apps.repository.tests.test_repo_compare import ComparePage | |
|
23 | 24 | from rhodecode.lib.helpers import _shorten_commit_id |
|
24 | 25 | |
|
25 | 26 | |
@@ -79,13 +80,22 b' class TestRepoCommitView(object):' | |||
|
79 | 80 | 'git': '03fa803d7e9fb14daa9a3089e0d1494eda75d986', |
|
80 | 81 | 'svn': '337', |
|
81 | 82 | } |
|
83 | diff_stat = { | |
|
84 | 'hg': (21, 943, 288), | |
|
85 | 'git': (20, 941, 286), | |
|
86 | 'svn': (21, 943, 288), | |
|
87 | } | |
|
88 | ||
|
82 | 89 | commit_id = commit_id[backend.alias] |
|
83 | 90 | response = self.app.get(route_path( |
|
84 | 91 | 'repo_commit', |
|
85 | 92 | repo_name=backend.repo_name, commit_id=commit_id)) |
|
86 | 93 | |
|
87 | 94 | response.mustcontain(_shorten_commit_id(commit_id)) |
|
88 | response.mustcontain('21 files changed: 943 inserted, 288 deleted') | |
|
95 | ||
|
96 | compare_page = ComparePage(response) | |
|
97 | file_changes = diff_stat[backend.alias] | |
|
98 | compare_page.contains_change_summary(*file_changes) | |
|
89 | 99 | |
|
90 | 100 | # files op files |
|
91 | 101 | response.mustcontain('File not present at commit: %s' % |
@@ -121,20 +131,24 b' class TestRepoCommitView(object):' | |||
|
121 | 131 | response.mustcontain(_shorten_commit_id(commit_ids[0])) |
|
122 | 132 | response.mustcontain(_shorten_commit_id(commit_ids[1])) |
|
123 | 133 | |
|
134 | compare_page = ComparePage(response) | |
|
135 | ||
|
124 | 136 | # svn is special |
|
125 | 137 | if backend.alias == 'svn': |
|
126 | 138 | response.mustcontain('new file 10644') |
|
127 | response.mustcontain('1 file changed: 5 inserted, 1 deleted') | |
|
128 | response.mustcontain('12 files changed: 236 inserted, 22 deleted') | |
|
129 | response.mustcontain('21 files changed: 943 inserted, 288 deleted') | |
|
139 | for file_changes in [(1, 5, 1), (12, 236, 22), (21, 943, 288)]: | |
|
140 | compare_page.contains_change_summary(*file_changes) | |
|
141 | elif backend.alias == 'git': | |
|
142 | response.mustcontain('new file 100644') | |
|
143 | for file_changes in [(12, 222, 20), (20, 941, 286)]: | |
|
144 | compare_page.contains_change_summary(*file_changes) | |
|
130 | 145 | else: |
|
131 | 146 | response.mustcontain('new file 100644') |
|
132 | response.mustcontain('12 files changed: 222 inserted, 20 deleted') | |
|
133 | response.mustcontain('21 files changed: 943 inserted, 288 deleted') | |
|
147 | for file_changes in [(12, 222, 20), (21, 943, 288)]: | |
|
148 | compare_page.contains_change_summary(*file_changes) | |
|
134 | 149 | |
|
135 | 150 | # files op files |
|
136 | response.mustcontain('File not present at commit: %s' % | |
|
137 | _shorten_commit_id(commit_ids[1])) | |
|
151 | response.mustcontain('File not present at commit: %s' % _shorten_commit_id(commit_ids[1])) | |
|
138 | 152 | response.mustcontain('Added docstrings to vcs.cli') # commit msg |
|
139 | 153 | response.mustcontain('Changed theme to ADC theme') # commit msg |
|
140 | 154 | |
@@ -166,13 +180,21 b' class TestRepoCommitView(object):' | |||
|
166 | 180 | response.mustcontain('File not present at commit: %s' % |
|
167 | 181 | _shorten_commit_id(commit_ids[1])) |
|
168 | 182 | |
|
183 | compare_page = ComparePage(response) | |
|
184 | ||
|
169 | 185 | # svn is special |
|
170 | 186 | if backend.alias == 'svn': |
|
171 | 187 | response.mustcontain('new file 10644') |
|
172 | response.mustcontain('32 files changed: 1179 inserted, 310 deleted') | |
|
188 | file_changes = (32, 1179, 310) | |
|
189 | compare_page.contains_change_summary(*file_changes) | |
|
190 | elif backend.alias == 'git': | |
|
191 | response.mustcontain('new file 100644') | |
|
192 | file_changes = (31, 1163, 306) | |
|
193 | compare_page.contains_change_summary(*file_changes) | |
|
173 | 194 | else: |
|
174 | 195 | response.mustcontain('new file 100644') |
|
175 | response.mustcontain('32 files changed: 1165 inserted, 308 deleted') | |
|
196 | file_changes = (32, 1165, 308) | |
|
197 | compare_page.contains_change_summary(*file_changes) | |
|
176 | 198 | |
|
177 | 199 | response.mustcontain('Added docstrings to vcs.cli') # commit msg |
|
178 | 200 | response.mustcontain('Changed theme to ADC theme') # commit msg |
@@ -246,7 +268,7 b' new file mode 120000' | |||
|
246 | 268 | """, |
|
247 | 269 | 'git': r"""diff --git a/README b/README |
|
248 | 270 | new file mode 120000 |
|
249 | index 0000000000000000000000000000000000000000..92cacd285355271487b7e379dba6ca60f9a554a4 | |
|
271 | index 0000000..92cacd2 | |
|
250 | 272 | --- /dev/null |
|
251 | 273 | +++ b/README |
|
252 | 274 | @@ -0,0 +1 @@ |
@@ -300,6 +322,6 b' Added a symlink' | |||
|
300 | 322 | |
|
301 | 323 | # right pane diff menus |
|
302 | 324 | if right_menu: |
|
303 | for elem in ['Hide whitespace changes', 'Toggle

325 | for elem in ['Hide whitespace changes', 'Toggle wide diff', | |
|
304 | 326 | 'Show full context diff']: |
|
305 | 327 | response.mustcontain(elem) |
@@ -623,8 +623,8 b' class ComparePage(AssertResponse):' | |||
|
623 | 623 | |
|
624 | 624 | def contains_change_summary(self, files_changed, inserted, deleted): |
|
625 | 625 | template = ( |
|
626 |
|
|
|
627 | "{inserted} inserted, {deleted} deleted

626 | '{files_changed} file{plural} changed: ' | |
|
627 | '<span class="op-added">{inserted} inserted</span>, <span class="op-deleted">{deleted} deleted</span>') | |
|
628 | 628 | self.response.mustcontain(template.format( |
|
629 | 629 | files_changed=files_changed, |
|
630 | 630 | plural="s" if files_changed > 1 else "", |
@@ -102,7 +102,7 b' class TestCompareView(object):' | |||
|
102 | 102 | 'git': { |
|
103 | 103 | 'tag': 'v0.2.2', |
|
104 | 104 | 'branch': 'master', |
|
105 | 'response': (7

105 | 'response': (70, 1855, 3002) | |
|
106 | 106 | }, |
|
107 | 107 | } |
|
108 | 108 |
@@ -20,6 +20,7 b'' | |||
|
20 | 20 | |
|
21 | 21 | import pytest |
|
22 | 22 | |
|
23 | from rhodecode.apps.repository.tests.test_repo_compare import ComparePage | |
|
23 | 24 | from rhodecode.lib.vcs import nodes |
|
24 | 25 | from rhodecode.lib.vcs.backends.base import EmptyCommit |
|
25 | 26 | from rhodecode.tests.fixture import Fixture |
@@ -49,18 +50,18 b' class TestSideBySideDiff(object):' | |||
|
49 | 50 | 'hg': { |
|
50 | 51 | 'commits': ['25d7e49c18b159446cadfa506a5cf8ad1cb04067', |
|
51 | 52 | '603d6c72c46d953420c89d36372f08d9f305f5dd'], |
|
52 | 'changes':

53 | 'changes': (21, 943, 288), | |
|
53 | 54 | }, |
|
54 | 55 | 'git': { |
|
55 | 56 | 'commits': ['6fc9270775aaf5544c1deb014f4ddd60c952fcbb', |
|
56 | 57 | '03fa803d7e9fb14daa9a3089e0d1494eda75d986'], |
|
57 | 'changes':

58 | 'changes': (20, 941, 286), | |
|
58 | 59 | }, |
|
59 | 60 | |
|
60 | 61 | 'svn': { |
|
61 | 62 | 'commits': ['336', |
|
62 | 63 | '337'], |
|
63 | 'changes':

64 | 'changes': (21, 943, 288), | |
|
64 | 65 | }, |
|
65 | 66 | } |
|
66 | 67 | |
@@ -79,26 +80,27 b' class TestSideBySideDiff(object):' | |||
|
79 | 80 | params=dict(target_repo=backend.repo_name, diffmode='sidebyside') |
|
80 | 81 | )) |
|
81 | 82 | |
|
82 | response.mustcontain(file_changes) | |
|
83 | response.mustcontain('Expand 1 commit') | |
|
83 | compare_page = ComparePage(response) | |
|
84 | compare_page.contains_change_summary(*file_changes) | |
|
85 | response.mustcontain('Collapse 1 commit') | |
|
84 | 86 | |
|
85 | 87 | def test_diff_sidebyside_two_commits(self, app, backend): |
|
86 | 88 | commit_id_range = { |
|
87 | 89 | 'hg': { |
|
88 | 90 | 'commits': ['4fdd71e9427417b2e904e0464c634fdee85ec5a7', |
|
89 | 91 | '603d6c72c46d953420c89d36372f08d9f305f5dd'], |
|
90 | 'changes':

92 | 'changes': (32, 1165, 308), | |
|
91 | 93 | }, |
|
92 | 94 | 'git': { |
|
93 | 95 | 'commits': ['f5fbf9cfd5f1f1be146f6d3b38bcd791a7480c13', |
|
94 | 96 | '03fa803d7e9fb14daa9a3089e0d1494eda75d986'], |
|
95 | 'changes':

97 | 'changes': (31, 1163, 306), | |
|
96 | 98 | }, |
|
97 | 99 | |
|
98 | 100 | 'svn': { |
|
99 | 101 | 'commits': ['335', |
|
100 | 102 | '337'], |
|
101 | 'changes':

103 | 'changes': (32, 1179, 310), | |
|
102 | 104 | }, |
|
103 | 105 | } |
|
104 | 106 | |
@@ -117,8 +119,36 b' class TestSideBySideDiff(object):' | |||
|
117 | 119 | params=dict(target_repo=backend.repo_name, diffmode='sidebyside') |
|
118 | 120 | )) |
|
119 | 121 | |
|
120 | response.mustcontain(file_changes) | |
|
121 | response.mustcontain('Expand 2 commits') | |
|
122 | compare_page = ComparePage(response) | |
|
123 | compare_page.contains_change_summary(*file_changes) | |
|
124 | ||
|
125 | response.mustcontain('Collapse 2 commits') | |
|
126 | ||
|
127 | def test_diff_sidebyside_collapsed_commits(self, app, backend_svn): | |
|
128 | commit_id_range = { | |
|
129 | ||
|
130 | 'svn': { | |
|
131 | 'commits': ['330', | |
|
132 | '337'], | |
|
133 | ||
|
134 | }, | |
|
135 | } | |
|
136 | ||
|
137 | commit_info = commit_id_range['svn'] | |
|
138 | commit2, commit1 = commit_info['commits'] | |
|
139 | ||
|
140 | response = self.app.get(route_path( | |
|
141 | 'repo_compare', | |
|
142 | repo_name=backend_svn.repo_name, | |
|
143 | source_ref_type='rev', | |
|
144 | source_ref=commit2, | |
|
145 | target_repo=backend_svn.repo_name, | |
|
146 | target_ref_type='rev', | |
|
147 | target_ref=commit1, | |
|
148 | params=dict(target_repo=backend_svn.repo_name, diffmode='sidebyside') | |
|
149 | )) | |
|
150 | ||
|
151 | response.mustcontain('Expand 7 commits') | |
|
122 | 152 | |
|
123 | 153 | @pytest.mark.xfail(reason='GIT does not handle empty commit compare correct (missing 1 commit)') |
|
124 | 154 | def test_diff_side_by_side_from_0_commit(self, app, backend, backend_stub): |
@@ -145,14 +175,14 b' class TestSideBySideDiff(object):' | |||
|
145 | 175 | params=dict(diffmode='sidebyside') |
|
146 | 176 | )) |
|
147 | 177 | |
|
148 | response.mustcontain('

178 | response.mustcontain('Collapse 2 commits') | |
|
149 | 179 | response.mustcontain('123 file changed') |
|
150 | 180 | |
|
151 | 181 | response.mustcontain( |
|
152 | 182 | 'r%s:%s...r%s:%s' % ( |
|
153 | 183 | commit1.idx, commit1.short_id, commit2.idx, commit2.short_id)) |
|
154 | 184 | |
|
155 | response.mustcontain(

185 | response.mustcontain(f_path) | |
|
156 | 186 | |
|
157 | 187 | @pytest.mark.xfail(reason='GIT does not handle empty commit compare correct (missing 1 commit)') |
|
158 | 188 | def test_diff_side_by_side_from_0_commit_with_file_filter(self, app, backend, backend_stub): |
@@ -179,14 +209,14 b' class TestSideBySideDiff(object):' | |||
|
179 | 209 | params=dict(f_path=f_path, target_repo=repo.repo_name, diffmode='sidebyside') |
|
180 | 210 | )) |
|
181 | 211 | |
|
182 |
response.mustcontain(' |
|
|
212 | response.mustcontain('Collapse 2 commits') | |
|
183 | 213 | response.mustcontain('1 file changed') |
|
184 | 214 | |
|
185 | 215 | response.mustcontain( |
|
186 | 216 | 'r%s:%s...r%s:%s' % ( |
|
187 | 217 | commit1.idx, commit1.short_id, commit2.idx, commit2.short_id)) |
|
188 | 218 | |
|
189 |
response.mustcontain( |
|
|
219 | response.mustcontain(f_path) | |
|
190 | 220 | |
|
191 | 221 | def test_diff_side_by_side_with_empty_file(self, app, backend, backend_stub): |
|
192 | 222 | commits = [ |
@@ -211,32 +241,32 b' class TestSideBySideDiff(object):' | |||
|
211 | 241 | params=dict(f_path=f_path, target_repo=repo.repo_name, diffmode='sidebyside') |
|
212 | 242 | )) |
|
213 | 243 | |
|
214 |
response.mustcontain(' |
|
|
244 | response.mustcontain('Collapse 2 commits') | |
|
215 | 245 | response.mustcontain('1 file changed') |
|
216 | 246 | |
|
217 | 247 | response.mustcontain( |
|
218 | 248 | 'r%s:%s...r%s:%s' % ( |
|
219 | 249 | commit2.idx, commit2.short_id, commit3.idx, commit3.short_id)) |
|
220 | 250 | |
|
221 |
response.mustcontain( |
|
|
251 | response.mustcontain(f_path) | |
|
222 | 252 | |
|
223 | 253 | def test_diff_sidebyside_two_commits_with_file_filter(self, app, backend): |
|
224 | 254 | commit_id_range = { |
|
225 | 255 | 'hg': { |
|
226 | 256 | 'commits': ['4fdd71e9427417b2e904e0464c634fdee85ec5a7', |
|
227 | 257 | '603d6c72c46d953420c89d36372f08d9f305f5dd'], |
|
228 |
'changes': |
|
|
258 | 'changes': (1, 3, 3) | |
|
229 | 259 | }, |
|
230 | 260 | 'git': { |
|
231 | 261 | 'commits': ['f5fbf9cfd5f1f1be146f6d3b38bcd791a7480c13', |
|
232 | 262 | '03fa803d7e9fb14daa9a3089e0d1494eda75d986'], |
|
233 |
'changes': |
|
|
263 | 'changes': (1, 3, 3) | |
|
234 | 264 | }, |
|
235 | 265 | |
|
236 | 266 | 'svn': { |
|
237 | 267 | 'commits': ['335', |
|
238 | 268 | '337'], |
|
239 |
'changes': |
|
|
269 | 'changes': (1, 3, 3) | |
|
240 | 270 | }, |
|
241 | 271 | } |
|
242 | 272 | f_path = 'docs/conf.py' |
@@ -255,5 +285,7 b' class TestSideBySideDiff(object):' | |||
|
255 | 285 | params=dict(f_path=f_path, target_repo=backend.repo_name, diffmode='sidebyside') |
|
256 | 286 | )) |
|
257 | 287 | |
|
258 |
response.mustcontain(' |
|
|
259 | response.mustcontain(file_changes) | |
|
288 | response.mustcontain('Collapse 2 commits') | |
|
289 | ||
|
290 | compare_page = ComparePage(response) | |
|
291 | compare_page.contains_change_summary(*file_changes) |
@@ -41,7 +41,7 b' def route_path(name, params=None, **kwar' | |||
|
41 | 41 | class TestFeedView(TestController): |
|
42 | 42 | |
|
43 | 43 | @pytest.mark.parametrize("feed_type,response_types,content_type",[ |
|
44 | ('rss', ['<rss version="2.0"

44 | ('rss', ['<rss version="2.0"'], | |
|
45 | 45 | "application/rss+xml"), |
|
46 | 46 | ('atom', ['xmlns="http://www.w3.org/2005/Atom"', 'xml:lang="en-us"'], |
|
47 | 47 | "application/atom+xml"), |
@@ -116,7 +116,7 b' class TestFeedView(TestController):' | |||
|
116 | 116 | self, backend, user_util, feed_type): |
|
117 | 117 | user = user_util.create_user() |
|
118 | 118 | auth_token = AuthTokenModel().create( |
|
119 | user.user_id, 'test-token', -1, AuthTokenModel.cls.ROLE_API) | |
|
119 | user.user_id, u'test-token', -1, AuthTokenModel.cls.ROLE_API) | |
|
120 | 120 | auth_token = auth_token.api_key |
|
121 | 121 | |
|
122 | 122 | self.app.get( |
@@ -127,7 +127,7 b' class TestFeedView(TestController):' | |||
|
127 | 127 | status=302) |
|
128 | 128 | |
|
129 | 129 | auth_token = AuthTokenModel().create( |
|
130 | user.user_id, 'test-token', -1, AuthTokenModel.cls.ROLE_FEED) | |
|
130 | user.user_id, u'test-token', -1, AuthTokenModel.cls.ROLE_FEED) | |
|
131 | 131 | auth_token = auth_token.api_key |
|
132 | 132 | self.app.get( |
|
133 | 133 | route_path( |
@@ -23,6 +23,7 b' import os' | |||
|
23 | 23 | import mock |
|
24 | 24 | import pytest |
|
25 | 25 | |
|
26 | from rhodecode.apps.repository.tests.test_repo_compare import ComparePage | |
|
26 | 27 | from rhodecode.apps.repository.views.repo_files import RepoFilesView |
|
27 | 28 | from rhodecode.lib import helpers as h |
|
28 | 29 | from rhodecode.lib.compat import OrderedDict |
@@ -616,8 +617,11 b' class TestFilesDiff(object):' | |||
|
616 | 617 | }) |
|
617 | 618 | # use redirect since this is OLD view redirecting to compare page |
|
618 | 619 | response = response.follow() |
|
619 | response.mustcontain('

620 | response.mustcontain('1 file changed: 0 inserted, 0 deleted') | |
|
620 | response.mustcontain('Collapse 1 commit') | |
|
621 | file_changes = (1, 0, 0) | |
|
622 | ||
|
623 | compare_page = ComparePage(response) | |
|
624 | compare_page.contains_change_summary(*file_changes) | |
|
621 | 625 | |
|
622 | 626 | if backend.alias == 'svn': |
|
623 | 627 | response.mustcontain('new file 10644') |
@@ -130,7 +130,6 b' class TestRepoForkViewTests(TestControll' | |||
|
130 | 130 | 'repo_type': backend.alias, |
|
131 | 131 | 'description': description, |
|
132 | 132 | 'private': 'False', |
|
133 | 'landing_rev': 'rev:tip', | |
|
134 | 133 | 'csrf_token': csrf_token, |
|
135 | 134 | } |
|
136 | 135 | |
@@ -159,7 +158,6 b' class TestRepoForkViewTests(TestControll' | |||
|
159 | 158 | 'repo_type': backend.alias, |
|
160 | 159 | 'description': description, |
|
161 | 160 | 'private': 'False', |
|
162 | 'landing_rev': 'rev:tip', | |
|
163 | 161 | 'csrf_token': csrf_token, |
|
164 | 162 | } |
|
165 | 163 | self.app.post( |
@@ -172,8 +170,8 b' class TestRepoForkViewTests(TestControll' | |||
|
172 | 170 | route_path('repo_creating_check', repo_name=fork_name)) |
|
173 | 171 | # test if we have a message that fork is ok |
|
174 | 172 | assert_session_flash(response, |
|
175 | 'Forked repository %s as <a href="/%s">%s</a>' | |
|
176 |
|
|
|
173 | 'Forked repository %s as <a href="/%s">%s</a>' % ( | |
|
174 | repo_name, fork_name, fork_name)) | |
|
177 | 175 | |
|
178 | 176 | # test if the fork was created in the database |
|
179 | 177 | fork_repo = Session().query(Repository)\ |
@@ -205,7 +203,6 b' class TestRepoForkViewTests(TestControll' | |||
|
205 | 203 | 'repo_type': backend.alias, |
|
206 | 204 | 'description': description, |
|
207 | 205 | 'private': 'False', |
|
208 | 'landing_rev': 'rev:tip', | |
|
209 | 206 | 'csrf_token': csrf_token, |
|
210 | 207 | } |
|
211 | 208 | self.app.post( |
@@ -218,8 +215,8 b' class TestRepoForkViewTests(TestControll' | |||
|
218 | 215 | route_path('repo_creating_check', repo_name=fork_name_full)) |
|
219 | 216 | # test if we have a message that fork is ok |
|
220 | 217 | assert_session_flash(response, |
|
221 | 'Forked repository %s as <a href="/%s">%s</a>' | |
|
222 |
|
|
|
218 | 'Forked repository %s as <a href="/%s">%s</a>' % ( | |
|
219 | repo_name, fork_name_full, fork_name_full)) | |
|
223 | 220 | |
|
224 | 221 | # test if the fork was created in the database |
|
225 | 222 | fork_repo = Session().query(Repository)\ |
@@ -84,7 +84,7 b' class TestRepoIssueTracker(object):' | |||
|
84 | 84 | extra_environ=xhr_header, params=data) |
|
85 | 85 | |
|
86 | 86 | assert response.body == \ |
|
87 | 'example of <a class="issue-tracker-link" href="http://url">prefix</a> replacement' | |
|
87 | 'example of <a class="tooltip issue-tracker-link" href="http://url" title="description">prefix</a> replacement' | |
|
88 | 88 | |
|
89 | 89 | @request.addfinalizer |
|
90 | 90 | def cleanup(): |
@@ -125,7 +125,7 b' class TestRepoIssueTracker(object):' | |||
|
125 | 125 | self.settings_model.delete_entries(self.uid) |
|
126 | 126 | |
|
127 | 127 | def test_delete_issuetracker_pattern( |
|
128 | self, autologin_user, backend, csrf_token, settings_util): | |
|
128 | self, autologin_user, backend, csrf_token, settings_util, xhr_header): | |
|
129 | 129 | repo = backend.create_repo() |
|
130 | 130 | repo_name = repo.repo_name |
|
131 | 131 | entry_key = 'issuetracker_pat_' |
@@ -141,8 +141,9 b' class TestRepoIssueTracker(object):' | |||
|
141 | 141 | repo_name=backend.repo.repo_name), |
|
142 | 142 | { |
|
143 | 143 | 'uid': uid, |
|
144 | 'csrf_token': csrf_token | |
|
145 | }, status=302) | |
|
144 | 'csrf_token': csrf_token, | |
|
145 | '': '' | |
|
146 | }, extra_environ=xhr_header, status=200) | |
|
146 | 147 | settings = IssueTrackerSettingsModel( |
|
147 | 148 | repo=Repository.get_by_repo_name(repo_name)).get_repo_settings() |
|
148 | 149 | assert 'rhodecode_%s%s' % (entry_key, uid) not in settings |
@@ -62,7 +62,7 b' class TestRepoPermissionsView(object):' | |||
|
62 | 62 | route_path('edit_repo_perms', |
|
63 | 63 | repo_name=repo_name), form_data).follow() |
|
64 | 64 | |
|
65 | assert 'Repository permissions updated' in response | |
|
65 | assert 'Repository access permissions updated' in response | |
|
66 | 66 | |
|
67 | 67 | # revoke given |
|
68 | 68 | form_data = permission_update_data_generator( |
@@ -74,4 +74,4 b' class TestRepoPermissionsView(object):' | |||
|
74 | 74 | route_path('edit_repo_perms', |
|
75 | 75 | repo_name=repo_name), form_data).follow() |
|
76 | 76 | |
|
77 | assert 'Repository permissions updated' in response | |
|
77 | assert 'Repository access permissions updated' in response |
@@ -101,12 +101,11 b' class TestPullrequestsView(object):' | |||
|
101 | 101 | for commit_id in pull_request.revisions: |
|
102 | 102 | response.mustcontain(commit_id) |
|
103 | 103 | |
|
104 |
|
|
|
105 |
|
|
|
106 | target_clone_url = pull_request.target_repo.clone_url() | |
|
107 | assert target_clone_url in response | |
|
104 | response.mustcontain(pull_request.target_ref_parts.type) | |
|
105 | response.mustcontain(pull_request.target_ref_parts.name) | |
|
108 | 106 | |
|
109 |
|
|
|
107 | response.mustcontain('class="pull-request-merge"') | |
|
108 | ||
|
110 | 109 | if pr_merge_enabled: |
|
111 | 110 | response.mustcontain('Pull request reviewer approval is pending') |
|
112 | 111 | else: |
@@ -261,7 +260,7 b' class TestPullrequestsView(object):' | |||
|
261 | 260 | True, True, '', MergeFailureReason.MISSING_TARGET_REF, |
|
262 | 261 | metadata={'target_ref': PullRequest.unicode_to_reference(unicode_reference)}) |
|
263 | 262 | response.assert_response().element_contains( |
|
264 |
' |
|
|
263 | 'div[data-role="merge-message"]', merge_resp.merge_status_message) | |
|
265 | 264 | |
|
266 | 265 | def test_comment_and_close_pull_request_custom_message_approved( |
|
267 | 266 | self, pr_util, csrf_token, xhr_header): |
@@ -284,7 +283,7 b' class TestPullrequestsView(object):' | |||
|
284 | 283 | journal = UserLog.query()\ |
|
285 | 284 | .filter(UserLog.user_id == author)\ |
|
286 | 285 | .filter(UserLog.repository_id == repo) \ |
|
287 |
.order_by( |
|
|
286 | .order_by(UserLog.user_log_id.asc()) \ | |
|
288 | 287 | .all() |
|
289 | 288 | assert journal[-1].action == 'repo.pull_request.close' |
|
290 | 289 | |
@@ -323,7 +322,7 b' class TestPullrequestsView(object):' | |||
|
323 | 322 | |
|
324 | 323 | journal = UserLog.query()\ |
|
325 | 324 | .filter(UserLog.user_id == author, UserLog.repository_id == repo) \ |
|
326 |
.order_by( |
|
|
325 | .order_by(UserLog.user_log_id.asc()) \ | |
|
327 | 326 | .all() |
|
328 | 327 | assert journal[-1].action == 'repo.pull_request.close' |
|
329 | 328 | |
@@ -467,7 +466,7 b' class TestPullrequestsView(object):' | |||
|
467 | 466 | .filter(Notification.created_by == pull_request.author.user_id, |
|
468 | 467 | Notification.type_ == Notification.TYPE_PULL_REQUEST, |
|
469 | 468 | Notification.subject.contains( |
|
470 |
" |
|
|
469 | "requested a pull request review. !%s" % pull_request_id)) | |
|
471 | 470 | assert len(notifications.all()) == 1 |
|
472 | 471 | |
|
473 | 472 | # Change reviewers and check that a notification was made |
@@ -536,9 +535,9 b' class TestPullrequestsView(object):' | |||
|
536 | 535 | |
|
537 | 536 | # Check generated diff contents |
|
538 | 537 | response = response.follow() |
|
539 | assert 'content_of_ancestor' not in response.body | |
|
540 |
|
|
|
541 |
|
|
|
538 | response.mustcontain(no=['content_of_ancestor']) | |
|
539 | response.mustcontain(no=['content_of_ancestor-child']) | |
|
540 | response.mustcontain('content_of_change') | |
|
542 | 541 | |
|
543 | 542 | def test_merge_pull_request_enabled(self, pr_util, csrf_token): |
|
544 | 543 | # Clear any previous calls to rcextensions |
@@ -549,11 +548,10 b' class TestPullrequestsView(object):' | |||
|
549 | 548 | pull_request_id = pull_request.pull_request_id |
|
550 | 549 | repo_name = pull_request.target_repo.scm_instance().name, |
|
551 | 550 | |
|
552 | response = self.app.post( | |
|
553 | route_path('pullrequest_merge', | |
|
554 | repo_name=str(repo_name[0]), | |
|
555 | pull_request_id=pull_request_id), | |
|
556 | params={'csrf_token': csrf_token}).follow() | |
|
551 | url = route_path('pullrequest_merge', | |
|
552 | repo_name=str(repo_name[0]), | |
|
553 | pull_request_id=pull_request_id) | |
|
554 | response = self.app.post(url, params={'csrf_token': csrf_token}).follow() | |
|
557 | 555 | |
|
558 | 556 | pull_request = PullRequest.get(pull_request_id) |
|
559 | 557 | |
@@ -563,7 +561,7 b' class TestPullrequestsView(object):' | |||
|
563 | 561 | pull_request, ChangesetStatus.STATUS_APPROVED) |
|
564 | 562 | |
|
565 | 563 | # Check the relevant log entries were added |
|
566 |
user_logs = UserLog.query().order_by( |
|
|
564 | user_logs = UserLog.query().order_by(UserLog.user_log_id.desc()).limit(3) | |
|
567 | 565 | actions = [log.action for log in user_logs] |
|
568 | 566 | pr_commit_ids = PullRequestModel()._get_commit_ids(pull_request) |
|
569 | 567 | expected_actions = [ |
@@ -573,7 +571,7 b' class TestPullrequestsView(object):' | |||
|
573 | 571 | ] |
|
574 | 572 | assert actions == expected_actions |
|
575 | 573 | |
|
576 |
user_logs = UserLog.query().order_by( |
|
|
574 | user_logs = UserLog.query().order_by(UserLog.user_log_id.desc()).limit(4) | |
|
577 | 575 | actions = [log for log in user_logs] |
|
578 | 576 | assert actions[-1].action == 'user.push' |
|
579 | 577 | assert actions[-1].action_data['commit_ids'] == pr_commit_ids |
@@ -690,8 +688,8 b' class TestPullrequestsView(object):' | |||
|
690 | 688 | pull_request_id=pull_request.pull_request_id)) |
|
691 | 689 | |
|
692 | 690 | assert response.status_int == 200 |
|
693 |
|
|
|
694 |
|
|
|
691 | response.mustcontain('Pull request updated to') | |
|
692 | response.mustcontain('with 1 added, 0 removed commits.') | |
|
695 | 693 | |
|
696 | 694 | # check that we have now both revisions |
|
697 | 695 | pull_request = PullRequest.get(pull_request_id) |
@@ -735,12 +733,12 b' class TestPullrequestsView(object):' | |||
|
735 | 733 | backend.pull_heads(source, heads=['change-rebased']) |
|
736 | 734 | |
|
737 | 735 | # update PR |
|
738 | self.app.post( | |
|
739 | route_path('pullrequest_update', | |
|
740 | repo_name=target.repo_name, | |
|
741 | pull_request_id=pull_request_id), | |
|
742 | params={'update_commits': 'true', 'csrf_token': csrf_token}, | |
|
743 | status=200) | |
|
736 | url = route_path('pullrequest_update', | |
|
737 | repo_name=target.repo_name, | |
|
738 | pull_request_id=pull_request_id) | |
|
739 | self.app.post(url, | |
|
740 | params={'update_commits': 'true', 'csrf_token': csrf_token}, | |
|
741 | status=200) | |
|
744 | 742 | |
|
745 | 743 | # check that we have now both revisions |
|
746 | 744 | pull_request = PullRequest.get(pull_request_id) |
@@ -753,8 +751,8 b' class TestPullrequestsView(object):' | |||
|
753 | 751 | repo_name=target.repo_name, |
|
754 | 752 | pull_request_id=pull_request.pull_request_id)) |
|
755 | 753 | assert response.status_int == 200 |
|
756 |
|
|
|
757 |
|
|
|
754 | response.mustcontain('Pull request updated to') | |
|
755 | response.mustcontain('with 1 added, 1 removed commits.') | |
|
758 | 756 | |
|
759 | 757 | def test_update_target_revision_with_removal_of_1_commit_git(self, backend_git, csrf_token): |
|
760 | 758 | backend = backend_git |
@@ -801,12 +799,12 b' class TestPullrequestsView(object):' | |||
|
801 | 799 | vcsrepo.run_git_command(['reset', '--soft', 'HEAD~2']) |
|
802 | 800 | |
|
803 | 801 | # update PR |
|
804 | self.app.post( | |
|
805 | route_path('pullrequest_update', | |
|
806 | repo_name=target.repo_name, | |
|
807 | pull_request_id=pull_request_id), | |
|
808 | params={'update_commits': 'true', 'csrf_token': csrf_token}, | |
|
809 | status=200) | |
|
802 | url = route_path('pullrequest_update', | |
|
803 | repo_name=target.repo_name, | |
|
804 | pull_request_id=pull_request_id) | |
|
805 | self.app.post(url, | |
|
806 | params={'update_commits': 'true', 'csrf_token': csrf_token}, | |
|
807 | status=200) | |
|
810 | 808 | |
|
811 | 809 | response = self.app.get(route_path('pullrequest_new', repo_name=target.repo_name)) |
|
812 | 810 | assert response.status_int == 200 |
@@ -871,6 +869,7 b' class TestPullrequestsView(object):' | |||
|
871 | 869 | {'message': 'new-feature', 'branch': branch_name}, |
|
872 | 870 | ] |
|
873 | 871 | repo = backend_git.create_repo(commits) |
|
872 | repo_name = repo.repo_name | |
|
874 | 873 | commit_ids = backend_git.commit_ids |
|
875 | 874 | |
|
876 | 875 | pull_request = PullRequest() |
@@ -888,13 +887,15 b' class TestPullrequestsView(object):' | |||
|
888 | 887 | Session().add(pull_request) |
|
889 | 888 | Session().commit() |
|
890 | 889 | |
|
890 | pull_request_id = pull_request.pull_request_id | |
|
891 | ||
|
891 | 892 | vcs = repo.scm_instance() |
|
892 | 893 | vcs.remove_ref('refs/heads/{}'.format(branch_name)) |
|
893 | 894 | |
|
894 | 895 | response = self.app.get(route_path( |
|
895 | 896 | 'pullrequest_show', |
|
896 |
repo_name= |
|
|
897 |
pull_request_id= |
|
|
897 | repo_name=repo_name, | |
|
898 | pull_request_id=pull_request_id)) | |
|
898 | 899 | |
|
899 | 900 | assert response.status_int == 200 |
|
900 | 901 | |
@@ -958,15 +959,15 b' class TestPullrequestsView(object):' | |||
|
958 | 959 | else: |
|
959 | 960 | vcs.strip(pr_util.commit_ids['new-feature']) |
|
960 | 961 | |
|
961 | response = self.app.post( | |
|
962 | route_path('pullrequest_update', | |
|
963 | repo_name=pull_request.target_repo.repo_name, | |
|
964 | pull_request_id=pull_request.pull_request_id), | |
|
965 | params={'update_commits': 'true', | |
|
966 | 'csrf_token': csrf_token}) | |
|
962 | url = route_path('pullrequest_update', | |
|
963 | repo_name=pull_request.target_repo.repo_name, | |
|
964 | pull_request_id=pull_request.pull_request_id) | |
|
965 | response = self.app.post(url, | |
|
966 | params={'update_commits': 'true', | |
|
967 | 'csrf_token': csrf_token}) | |
|
967 | 968 | |
|
968 | 969 | assert response.status_int == 200 |
|
969 | assert response.body == 'true' | |
|
970 | assert response.body == '{"response": true, "redirect_url": null}' | |
|
970 | 971 | |
|
971 | 972 | # Make sure that after update, it won't raise 500 errors |
|
972 | 973 | response = self.app.get(route_path( |
@@ -992,12 +993,13 b' class TestPullrequestsView(object):' | |||
|
992 | 993 | pull_request_id=pull_request.pull_request_id)) |
|
993 | 994 | assert response.status_int == 200 |
|
994 | 995 | |
|
995 |
|
|
|
996 |
|
|
|
997 |
assert len( |
|
|
998 | target = response.assert_response().get_element('.pr-targetinfo .tag') | |
|
999 | target_children = target.getchildren() | |
|
1000 | assert len(target_children) == 1 | |
|
996 | source = response.assert_response().get_element('.pr-source-info') | |
|
997 | source_parent = source.getparent() | |
|
998 | assert len(source_parent) == 1 | |
|
999 | ||
|
1000 | target = response.assert_response().get_element('.pr-target-info') | |
|
1001 | target_parent = target.getparent() | |
|
1002 | assert len(target_parent) == 1 | |
|
1001 | 1003 | |
|
1002 | 1004 | expected_origin_link = route_path( |
|
1003 | 1005 | 'repo_commits', |
@@ -1007,10 +1009,8 b' class TestPullrequestsView(object):' | |||
|
1007 | 1009 | 'repo_commits', |
|
1008 | 1010 | repo_name=pull_request.target_repo.scm_instance().name, |
|
1009 | 1011 | params=dict(branch='target')) |
|
1010 |
assert |
|
|
1011 | assert origin_children[0].text == 'branch: origin' | |
|
1012 | assert target_children[0].attrib['href'] == expected_target_link | |
|
1013 | assert target_children[0].text == 'branch: target' | |
|
1012 | assert source_parent.attrib['href'] == expected_origin_link | |
|
1013 | assert target_parent.attrib['href'] == expected_target_link | |
|
1014 | 1014 | |
|
1015 | 1015 | def test_bookmark_is_not_a_link(self, pr_util): |
|
1016 | 1016 | pull_request = pr_util.create_pull_request() |
@@ -1025,13 +1025,13 b' class TestPullrequestsView(object):' | |||
|
1025 | 1025 | pull_request_id=pull_request.pull_request_id)) |
|
1026 | 1026 | assert response.status_int == 200 |
|
1027 | 1027 | |
|
1028 |
|
|
|
1029 |
assert |
|
|
1030 | assert origin.getchildren() == [] | |
|
1028 | source = response.assert_response().get_element('.pr-source-info') | |
|
1029 | assert source.text.strip() == 'bookmark:origin' | |
|
1030 | assert source.getparent().attrib.get('href') is None | |
|
1031 | 1031 | |
|
1032 |
target = response.assert_response().get_element('.pr-targetinfo |
|
|
1033 |
assert target.text.strip() == 'bookmark: |
|
|
1034 |
assert target.get |
|
|
1032 | target = response.assert_response().get_element('.pr-target-info') | |
|
1033 | assert target.text.strip() == 'bookmark:target' | |
|
1034 | assert target.getparent().attrib.get('href') is None | |
|
1035 | 1035 | |
|
1036 | 1036 | def test_tag_is_not_a_link(self, pr_util): |
|
1037 | 1037 | pull_request = pr_util.create_pull_request() |
@@ -1046,13 +1046,13 b' class TestPullrequestsView(object):' | |||
|
1046 | 1046 | pull_request_id=pull_request.pull_request_id)) |
|
1047 | 1047 | assert response.status_int == 200 |
|
1048 | 1048 | |
|
1049 |
|
|
|
1050 |
assert |
|
|
1051 | assert origin.getchildren() == [] | |
|
1049 | source = response.assert_response().get_element('.pr-source-info') | |
|
1050 | assert source.text.strip() == 'tag:origin' | |
|
1051 | assert source.getparent().attrib.get('href') is None | |
|
1052 | 1052 | |
|
1053 |
target = response.assert_response().get_element('.pr-targetinfo |
|
|
1054 |
assert target.text.strip() == 'tag: |
|
|
1055 |
assert target.get |
|
|
1053 | target = response.assert_response().get_element('.pr-target-info') | |
|
1054 | assert target.text.strip() == 'tag:target' | |
|
1055 | assert target.getparent().attrib.get('href') is None | |
|
1056 | 1056 | |
|
1057 | 1057 | @pytest.mark.parametrize('mergeable', [True, False]) |
|
1058 | 1058 | def test_shadow_repository_link( |
@@ -1205,14 +1205,11 b' class TestPullrequestsControllerDelete(o' | |||
|
1205 | 1205 | |
|
1206 | 1206 | |
|
1207 | 1207 | def assert_pull_request_status(pull_request, expected_status): |
|
1208 | status = ChangesetStatusModel().calculated_review_status( | |
|
1209 | pull_request=pull_request) | |
|
1208 | status = ChangesetStatusModel().calculated_review_status(pull_request=pull_request) | |
|
1210 | 1209 | assert status == expected_status |
|
1211 | 1210 | |
|
1212 | 1211 | |
|
1213 | 1212 | @pytest.mark.parametrize('route', ['pullrequest_new', 'pullrequest_create']) |
|
1214 | 1213 | @pytest.mark.usefixtures("autologin_user") |
|
1215 | 1214 | def test_forbidde_to_repo_summary_for_svn_repositories(backend_svn, app, route): |
|
1216 | response = app.get( | |
|
1217 | route_path(route, repo_name=backend_svn.repo_name), status=404) | |
|
1218 | ||
|
1215 | app.get(route_path(route, repo_name=backend_svn.repo_name), status=404) |
@@ -236,7 +236,7 b' class TestSummaryView(object):' | |||
|
236 | 236 | with scm_patcher: |
|
237 | 237 | response = self.app.get( |
|
238 | 238 | route_path('repo_summary', repo_name=repo_name)) |
|
239 |
assert_response = |
|
|
239 | assert_response = response.assert_response() | |
|
240 | 240 | assert_response.element_contains( |
|
241 | 241 | '.main .alert-warning strong', 'Missing requirements') |
|
242 | 242 | assert_response.element_contains( |
@@ -327,7 +327,7 b' def summary_view(context_stub, request_s' | |||
|
327 | 327 | @pytest.mark.usefixtures('app') |
|
328 | 328 | class TestCreateReferenceData(object): |
|
329 | 329 | |
|
330 | @pytest.fixture | |
|
330 | @pytest.fixture() | |
|
331 | 331 | def example_refs(self): |
|
332 | 332 | section_1_refs = OrderedDict((('a', 'a_id'), ('b', 'b_id'))) |
|
333 | 333 | example_refs = [ |
|
1 | NO CONTENT: modified file (remaining truncated file entries omitted) |
The requested commit or file is too big and content was truncated. Show full diff |
|
1 | NO CONTENT: modified file | |
The requested commit or file is too big and content was truncated. Show full diff |
|
1 | NO CONTENT: modified file | |
The requested commit or file is too big and content was truncated. Show full diff |
|
1 | NO CONTENT: modified file | |
The requested commit or file is too big and content was truncated. Show full diff |
|
1 | NO CONTENT: modified file | |
The requested commit or file is too big and content was truncated. Show full diff |
|
1 | NO CONTENT: modified file | |
The requested commit or file is too big and content was truncated. Show full diff |
|
1 | NO CONTENT: modified file | |
The requested commit or file is too big and content was truncated. Show full diff |
|
1 | NO CONTENT: modified file | |
The requested commit or file is too big and content was truncated. Show full diff |
|
1 | NO CONTENT: modified file | |
The requested commit or file is too big and content was truncated. Show full diff |
|
1 | NO CONTENT: modified file | |
The requested commit or file is too big and content was truncated. Show full diff |
|
1 | NO CONTENT: modified file | |
The requested commit or file is too big and content was truncated. Show full diff |
|
1 | NO CONTENT: modified file | |
The requested commit or file is too big and content was truncated. Show full diff |
|
1 | NO CONTENT: modified file | |
The requested commit or file is too big and content was truncated. Show full diff |
|
1 | NO CONTENT: modified file | |
The requested commit or file is too big and content was truncated. Show full diff |
|
1 | NO CONTENT: modified file | |
The requested commit or file is too big and content was truncated. Show full diff |
|
1 | NO CONTENT: modified file | |
The requested commit or file is too big and content was truncated. Show full diff |
|
1 | NO CONTENT: modified file | |
The requested commit or file is too big and content was truncated. Show full diff |
|
1 | NO CONTENT: modified file | |
The requested commit or file is too big and content was truncated. Show full diff |
|
1 | NO CONTENT: modified file | |
The requested commit or file is too big and content was truncated. Show full diff |
|
1 | NO CONTENT: modified file | |
The requested commit or file is too big and content was truncated. Show full diff |
|
1 | NO CONTENT: modified file | |
The requested commit or file is too big and content was truncated. Show full diff |
|
1 | NO CONTENT: modified file | |
The requested commit or file is too big and content was truncated. Show full diff |
|
1 | NO CONTENT: modified file | |
The requested commit or file is too big and content was truncated. Show full diff |
|
1 | NO CONTENT: modified file | |
The requested commit or file is too big and content was truncated. Show full diff |
|
1 | NO CONTENT: modified file | |
The requested commit or file is too big and content was truncated. Show full diff |
|
1 | NO CONTENT: modified file | |
The requested commit or file is too big and content was truncated. Show full diff |
|
1 | NO CONTENT: modified file | |
The requested commit or file is too big and content was truncated. Show full diff |
|
1 | NO CONTENT: modified file | |
The requested commit or file is too big and content was truncated. Show full diff |
|
1 | NO CONTENT: modified file | |
The requested commit or file is too big and content was truncated. Show full diff |
|
1 | NO CONTENT: modified file | |
The requested commit or file is too big and content was truncated. Show full diff |
|
1 | NO CONTENT: modified file | |
The requested commit or file is too big and content was truncated. Show full diff |
|
1 | NO CONTENT: modified file | |
The requested commit or file is too big and content was truncated. Show full diff |
|
1 | NO CONTENT: modified file | |
The requested commit or file is too big and content was truncated. Show full diff |
|
1 | NO CONTENT: modified file | |
The requested commit or file is too big and content was truncated. Show full diff |
|
1 | NO CONTENT: modified file | |
The requested commit or file is too big and content was truncated. Show full diff |
|
1 | NO CONTENT: modified file | |
The requested commit or file is too big and content was truncated. Show full diff |
|
1 | NO CONTENT: modified file | |
The requested commit or file is too big and content was truncated. Show full diff |
|
1 | NO CONTENT: modified file | |
The requested commit or file is too big and content was truncated. Show full diff |
|
1 | NO CONTENT: modified file | |
The requested commit or file is too big and content was truncated. Show full diff |
|
1 | NO CONTENT: modified file | |
The requested commit or file is too big and content was truncated. Show full diff |
|
1 | NO CONTENT: modified file | |
The requested commit or file is too big and content was truncated. Show full diff |
|
1 | NO CONTENT: modified file | |
The requested commit or file is too big and content was truncated. Show full diff |
|
1 | NO CONTENT: modified file | |
The requested commit or file is too big and content was truncated. Show full diff |
|
1 | NO CONTENT: modified file | |
The requested commit or file is too big and content was truncated. Show full diff |
|
1 | NO CONTENT: modified file | |
The requested commit or file is too big and content was truncated. Show full diff |
|
1 | NO CONTENT: modified file | |
The requested commit or file is too big and content was truncated. Show full diff |
|
1 | NO CONTENT: modified file | |
The requested commit or file is too big and content was truncated. Show full diff |
|
1 | NO CONTENT: modified file | |
The requested commit or file is too big and content was truncated. Show full diff |
|
1 | NO CONTENT: modified file | |
The requested commit or file is too big and content was truncated. Show full diff |
|
1 | NO CONTENT: modified file | |
The requested commit or file is too big and content was truncated. Show full diff |
|
1 | NO CONTENT: modified file | |
The requested commit or file is too big and content was truncated. Show full diff |
|
1 | NO CONTENT: modified file | |
The requested commit or file is too big and content was truncated. Show full diff |
|
1 | NO CONTENT: modified file | |
The requested commit or file is too big and content was truncated. Show full diff |
|
1 | NO CONTENT: modified file | |
The requested commit or file is too big and content was truncated. Show full diff |
|
1 | NO CONTENT: modified file | |
The requested commit or file is too big and content was truncated. Show full diff |
|
1 | NO CONTENT: modified file | |
The requested commit or file is too big and content was truncated. Show full diff |
|
1 | NO CONTENT: modified file | |
The requested commit or file is too big and content was truncated. Show full diff |
|
1 | NO CONTENT: modified file | |
The requested commit or file is too big and content was truncated. Show full diff |
|
1 | NO CONTENT: modified file | |
The requested commit or file is too big and content was truncated. Show full diff |
|
1 | NO CONTENT: modified file | |
The requested commit or file is too big and content was truncated. Show full diff |
|
1 | NO CONTENT: modified file | |
The requested commit or file is too big and content was truncated. Show full diff |
|
1 | NO CONTENT: modified file | |
The requested commit or file is too big and content was truncated. Show full diff |
|
1 | NO CONTENT: modified file | |
The requested commit or file is too big and content was truncated. Show full diff |
|
1 | NO CONTENT: modified file | |
The requested commit or file is too big and content was truncated. Show full diff |
|
1 | NO CONTENT: modified file | |
The requested commit or file is too big and content was truncated. Show full diff |
|
1 | NO CONTENT: modified file | |
The requested commit or file is too big and content was truncated. Show full diff |
|
1 | NO CONTENT: modified file | |
The requested commit or file is too big and content was truncated. Show full diff |
|
1 | NO CONTENT: modified file | |
The requested commit or file is too big and content was truncated. Show full diff |
|
1 | NO CONTENT: modified file | |
The requested commit or file is too big and content was truncated. Show full diff |
|
1 | NO CONTENT: modified file | |
The requested commit or file is too big and content was truncated. Show full diff |
|
1 | NO CONTENT: modified file | |
The requested commit or file is too big and content was truncated. Show full diff |
|
1 | NO CONTENT: modified file | |
The requested commit or file is too big and content was truncated. Show full diff |
|
1 | NO CONTENT: modified file | |
The requested commit or file is too big and content was truncated. Show full diff |
|
1 | NO CONTENT: modified file | |
The requested commit or file is too big and content was truncated. Show full diff |
|
1 | NO CONTENT: modified file | |
The requested commit or file is too big and content was truncated. Show full diff |
|
1 | NO CONTENT: modified file | |
The requested commit or file is too big and content was truncated. Show full diff |
|
1 | NO CONTENT: modified file | |
The requested commit or file is too big and content was truncated. Show full diff |
|
1 | NO CONTENT: modified file | |
The requested commit or file is too big and content was truncated. Show full diff |
|
1 | NO CONTENT: modified file | |
The requested commit or file is too big and content was truncated. Show full diff |
|
1 | NO CONTENT: modified file | |
The requested commit or file is too big and content was truncated. Show full diff |
|
1 | NO CONTENT: modified file | |
The requested commit or file is too big and content was truncated. Show full diff |
|
1 | NO CONTENT: modified file | |
The requested commit or file is too big and content was truncated. Show full diff |
|
1 | NO CONTENT: modified file | |
The requested commit or file is too big and content was truncated. Show full diff |
|
1 | NO CONTENT: modified file | |
The requested commit or file is too big and content was truncated. Show full diff |
|
1 | NO CONTENT: modified file | |
The requested commit or file is too big and content was truncated. Show full diff |
|
1 | NO CONTENT: modified file | |
The requested commit or file is too big and content was truncated. Show full diff |
|
1 | NO CONTENT: modified file | |
The requested commit or file is too big and content was truncated. Show full diff |
|
1 | NO CONTENT: modified file | |
The requested commit or file is too big and content was truncated. Show full diff |
|
1 | NO CONTENT: modified file | |
The requested commit or file is too big and content was truncated. Show full diff |
|
1 | NO CONTENT: modified file | |
The requested commit or file is too big and content was truncated. Show full diff |
|
1 | NO CONTENT: modified file | |
The requested commit or file is too big and content was truncated. Show full diff |
|
1 | NO CONTENT: modified file | |
The requested commit or file is too big and content was truncated. Show full diff |
|
1 | NO CONTENT: modified file | |
The requested commit or file is too big and content was truncated. Show full diff |
|
1 | NO CONTENT: modified file | |
The requested commit or file is too big and content was truncated. Show full diff |
|
1 | NO CONTENT: modified file | |
The requested commit or file is too big and content was truncated. Show full diff |
|
1 | NO CONTENT: modified file | |
The requested commit or file is too big and content was truncated. Show full diff |
|
1 | NO CONTENT: modified file | |
The requested commit or file is too big and content was truncated. Show full diff |
|
1 | NO CONTENT: modified file | |
The requested commit or file is too big and content was truncated. Show full diff |
|
1 | NO CONTENT: modified file | |
The requested commit or file is too big and content was truncated. Show full diff |
|
1 | NO CONTENT: modified file | |
The requested commit or file is too big and content was truncated. Show full diff |
|
1 | NO CONTENT: modified file | |
The requested commit or file is too big and content was truncated. Show full diff |
|
1 | NO CONTENT: modified file | |
The requested commit or file is too big and content was truncated. Show full diff |
|
1 | NO CONTENT: modified file | |
The requested commit or file is too big and content was truncated. Show full diff |
|
1 | NO CONTENT: modified file | |
The requested commit or file is too big and content was truncated. Show full diff |
|
1 | NO CONTENT: modified file | |
The requested commit or file is too big and content was truncated. Show full diff |
|
1 | NO CONTENT: modified file | |
The requested commit or file is too big and content was truncated. Show full diff |
|
1 | NO CONTENT: modified file | |
The requested commit or file is too big and content was truncated. Show full diff |
|
1 | NO CONTENT: modified file | |
The requested commit or file is too big and content was truncated. Show full diff |
|
1 | NO CONTENT: modified file | |
The requested commit or file is too big and content was truncated. Show full diff |
|
1 | NO CONTENT: modified file | |
The requested commit or file is too big and content was truncated. Show full diff |
|
1 | NO CONTENT: modified file | |
The requested commit or file is too big and content was truncated. Show full diff |
|
1 | NO CONTENT: modified file | |
The requested commit or file is too big and content was truncated. Show full diff |
|
1 | NO CONTENT: modified file | |
The requested commit or file is too big and content was truncated. Show full diff |
|
1 | NO CONTENT: modified file | |
The requested commit or file is too big and content was truncated. Show full diff |
|
1 | NO CONTENT: modified file | |
The requested commit or file is too big and content was truncated. Show full diff |
|
1 | NO CONTENT: modified file | |
The requested commit or file is too big and content was truncated. Show full diff |
|
1 | NO CONTENT: modified file | |
The requested commit or file is too big and content was truncated. Show full diff |
|
1 | NO CONTENT: modified file | |
The requested commit or file is too big and content was truncated. Show full diff |
|
1 | NO CONTENT: modified file | |
The requested commit or file is too big and content was truncated. Show full diff |
|
1 | NO CONTENT: modified file | |
The requested commit or file is too big and content was truncated. Show full diff |
|
1 | NO CONTENT: modified file | |
The requested commit or file is too big and content was truncated. Show full diff |
|
1 | NO CONTENT: modified file | |
The requested commit or file is too big and content was truncated. Show full diff |
|
1 | NO CONTENT: modified file | |
The requested commit or file is too big and content was truncated. Show full diff |
|
1 | NO CONTENT: modified file | |
The requested commit or file is too big and content was truncated. Show full diff |
|
1 | NO CONTENT: modified file | |
The requested commit or file is too big and content was truncated. Show full diff |
|
1 | NO CONTENT: modified file | |
The requested commit or file is too big and content was truncated. Show full diff |
|
File renamed: rhodecode/templates/admin/gists/edit.mako to rhodecode/templates/admin/gists/gist_edit.mako
File renamed: rhodecode/templates/admin/gists/index.mako to rhodecode/templates/admin/gists/gist_index.mako
File renamed: rhodecode/templates/admin/gists/new.mako to rhodecode/templates/admin/gists/gist_new.mako
File renamed: rhodecode/templates/admin/gists/show.mako to rhodecode/templates/admin/gists/gist_show.mako
|