@@ -0,0 +1,26 @@
.. _adjust-rhodecode-mem:

RhodeCode Memory Usage
----------------------

Starting from version 4.18.X RhodeCode has a built-in memory monitor for Gunicorn workers.
Enabling it can limit the maximum amount of memory the system can use. Each RhodeCode
worker is monitored independently.
To enable memory management, make sure the following settings are present in the
`[server:main]` section of the :file:`/home/{user}/.rccontrol/{instance-id}/rhodecode.ini` file::


    ; Maximum memory usage that each worker can use before it will receive a
    ; graceful restart signal 0 = memory monitoring is disabled
    ; Examples: 268435456 (256MB), 536870912 (512MB)
    ; 1073741824 (1GB), 2147483648 (2GB), 4294967296 (4GB)
    memory_max_usage = 1073741824

    ; How often in seconds to check for memory usage for each gunicorn worker
    memory_usage_check_interval = 60

    ; Threshold value for which we don't recycle worker if GarbageCollection
    ; frees up enough resources. Before each restart we try to run GC on worker
    ; in case we get enough free memory after that, restart will not happen.
    memory_usage_recovery_threshold = 0.8
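
For reference, the recycle decision described above can be pictured with a short sketch.
This is not the code RhodeCode ships; it is a minimal illustration that only assumes the
`psutil` library and mirrors the meaning of the three settings::

    import gc
    import psutil

    def worker_should_restart(pid, memory_max_usage, recovery_threshold):
        # memory_max_usage = 0 disables memory monitoring entirely
        if not memory_max_usage:
            return False

        rss = psutil.Process(pid).memory_info().rss
        if rss <= memory_max_usage:
            return False

        # before recycling, try to reclaim memory with a garbage-collection pass
        gc.collect()
        rss = psutil.Process(pid).memory_info().rss

        # recycle only if GC could not bring usage back under the threshold
        return rss > memory_max_usage * recovery_threshold

In this reading, a worker above the limit is only restarted if, after garbage collection, it
still uses more than ``memory_max_usage * memory_usage_recovery_threshold`` bytes; the check
runs every ``memory_usage_check_interval`` seconds.
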
@@ -0,0 +1,230 @@
|RCE| 4.18.0 |RNS|
------------------

Release Date
^^^^^^^^^^^^

- 2020-01-05


New Features
^^^^^^^^^^^^

- Artifacts: are no longer in BETA. A new info page is available for uploaded artifacts
  which exposes useful information like sha256, various access urls etc., and also
  allows deletion of artifacts and updating their description.
- Artifacts: support a new download url based on accessing artifacts using the new auth-token types.
- Artifacts: added the ability to store artifacts using the API, and an internal cli upload.
  This allows efficient uploading of artifacts that can be 100s of GBs in size.
- Artifacts: added metadata logic to store various extra custom data for artifacts.
- Comments: added support for adding comment attachments using the artifacts logic.
  Logged in users can now pick or drag and drop attachments into comment forms.
- Comments: enable linkification of certain patterns on comments in repo/pull request scopes.
  This now renders active links to commits and pull-requests mentioned in the comment body.
- Jira: new update integration plugin.
  The plugin now fetches possible transitions from tickets and shows them to users in the interface.
  It allows sending extra attributes during a transition, like a `resolution` message.
- Navigation: added a new consistent and contextual way of creating new objects
  like gists, repositories, and repository groups using a dedicated action (with a `+` sign)
  available in the top navigation.
- Hovercards: added new tooltips and hovercards to expose certain information for objects shown in the UI.
  RhodeCode usernames, issues, and pull-requests have active hovercard logic that
  loads extra information about them and exposes it to users.
- Files: all readme files found in the repository file browser are now rendered, allowing a readme per directory.
- Search: expose line counts in search files information.
- Audit-logs: expose download of user audit logs as a JSON file.
- Users: added a description field for users.
  Allows users to write a short BIO, or a description of their role in the organization.
- Users: allow super-admins to change the bound authentication type for users.
  E.g. internal rhodecode accounts can be changed to ldap easily from the user settings page.
- Pull requests: simplified the UI for the display view; hide less important information and expose the most important.
- Pull requests: add a merge check that detects a WIP marker in the title.
  WIP in the title usually means an unfinished task that still needs some work; such a marker will prevent accidental merges.
- Pull requests: TODO comments now have a dedicated box below reviewers to keep track
  of important TODOs that still need attention before the review process is finalized.
- Pull requests: participants of a pull request will receive an email about updates to a
  pull request with a small summary of the changes made.
- Pull requests: change the naming from #NUM into !NUM.
  The !NUM format is now parsed and linkified in comments and commit messages.
- Pull requests: pull requests whose state is changing can now be viewed with a limited view.
- Pull requests: re-organize merge/close buttons and merge checks according to the new UI.
- Pull requests: the update commits button now allows a force-refresh update using a dropdown option.
- Pull requests: added a quick filter to the grid view to filter/search pull requests in a repository.
- Pull requests: closing a pull-request without a merge now requires additional confirmation.
- Pull requests: merge checks will now show which files caused conflicts and are blocking the merge.
- Emails: updated the design of all generated emails and cleaned up the data fields they expose.
  a) More consistent UI for all types of emails. b) Improved formatting of plaintext emails.
  c) Added a reply link to comment type emails for quicker response action.


General
^^^^^^^

- Artifacts: don't show hidden artifacts, allow showing them via a GET ?hidden=1 flag.
  Hidden artifacts are for example comment attachments.
- UI: new commits page, according to the new design which started in the 4.17.X release line.
- UI: use explicitly named actions like "create user" instead of a generic "save", which is bad UX.
- UI: fixed problems with generating last change in repository groups.
  There's now new logic that checks all objects inside a group for the latest update time.
- API: add artifact `get_info`, and `store_metadata` methods.
- API: allowed to specify extra recipients for pr/commit comments api methods.
- Vcsserver: set file based cache as the default for vcsserver; it can be shared
  across multiple workers, saving memory usage.
- Vcsserver: added redis as a possible cache backend for even greater performance.
- Dependencies: bumped GIT version to 2.23.0
- Dependencies: bumped SVN version to 1.12.2
- Dependencies: bumped Mercurial version to 5.1.1 and hg-evolve to 9.1.0
- Search: added logic for sorting ElasticSearch6 backend search results.
- User bookmarks: make it easier to re-organize existing entries.
- Data grids: hide pagination for single pages in grids.
- Gists: UX, removed private/public gist buttons and replaced them with a radio group.
- Gunicorn: moved all configuration of gunicorn workers to .ini files.
- Gunicorn: added worker memory management allowing setting maximum per-worker memory usage.
- Automation: moved the update groups task into a celery task.
- Cache commits: add option to refresh caches manually from advanced pages.
- Pull requests: add indication of state change in the list of pull-requests and actually show them in the list.
- Cache keys: register and self-cleanup cache keys used for invalidation, to prevent leaking a lot of them into the DB on worker recycle.
- Repo groups: removed the locking inheritance flag from repo-groups. We'll deprecate this soon, and it only brings in confusion.
- System snapshot: improved formatting for better readability.
- System info: expose data about vcsserver.
- Packages: updated celery to 4.3.0 and switched the default backend to redis instead of RabbitMQ.
  Redis is stable enough and easier to install. Having Redis simplifies the stack as it's used in other parts of RhodeCode.
- Dependencies: bumped alembic to 1.2.1
- Dependencies: bumped amqp==2.5.2 and kombu==4.6.6
- Dependencies: bumped atomicwrites==1.3.0
- Dependencies: bumped cffi==1.12.3
- Dependencies: bumped configparser==4.0.2
- Dependencies: bumped deform==2.0.8
- Dependencies: bumped dogpile.cache==0.9.0
- Dependencies: bumped hupper==1.8.1
- Dependencies: bumped mako to 1.1.0
- Dependencies: bumped markupsafe to 1.1.1
- Dependencies: bumped packaging==19.2
- Dependencies: bumped paste==3.2.1
- Dependencies: bumped pastescript==3.2.0
- Dependencies: bumped pathlib2 to 2.3.4
- Dependencies: bumped pluggy==0.13.0
- Dependencies: bumped psutil to 5.6.3
- Dependencies: bumped psutil==5.6.5
- Dependencies: bumped psycopg2==2.8.4
- Dependencies: bumped pycurl to 7.43.0.3
- Dependencies: bumped pyotp==2.3.0
- Dependencies: bumped pyparsing to 2.4.2
- Dependencies: bumped pyramid-debugtoolbar==4.5.1
- Dependencies: bumped pyramid-mako to 1.1.0
- Dependencies: bumped redis to 3.3.8
- Dependencies: bumped sqlalchemy to 1.3.8
- Dependencies: bumped sqlalchemy==1.3.11
- Dependencies: bumped test libraries.
- Dependencies: freeze alembic==1.3.1
- Dependencies: freeze python-dateutil
- Dependencies: freeze redis==3.3.11
- Dependencies: freeze supervisor==4.1.0


Security
^^^^^^^^

- Security: fixed issues with exposing a wrong http status (403) indicating that a repository with
  the given name exists even though we don't have permissions to it. This was exposed in the redirection
  logic of the global pull-request page. In case of redirection we also exposed the
  repository name in the URL.


Performance
^^^^^^^^^^^

- Core: many small improvements and optimizations to make rhodecode faster than before.
- VCSServer: new cache implementation for remote functions.
  Single worker shared caches that can use redis/file-cache.
  This greatly improves performance on larger instances, and doesn't trigger cache
  re-calculation on worker restarts.
- GIT: switched internal git operations from Dulwich to libgit2 in order to obtain better performance and scalability.
- SSH: skip loading unneeded application parts for SSH to make execution of ssh commands faster.
- Main page: the main page will now load repositories and repository groups using partial DB calls instead of big JSON files.
  In case of many repositories in root this could lead to very slow page rendering.
- Admin pages: made all grids use the same DB based partial loading logic. We'll no longer fetch
  all objects into JSON for display purposes. This significantly improves the speed of those pages in case
  of many objects shown in them.
- Summary page: use non-memory cache for readme, and cleanup cache for repo stats.
  This change won't re-cache after worker restarts and can be shared across all workers.
- Files: only check for git_lfs/hg_largefiles if they are enabled.
  This speeds up fetching of files if they are not LF and very big.
- Vcsserver: added support for streaming data from the remote methods. This allows
  streaming very large files without taking up memory, mostly for usage in SVN when
  downloading large binaries from the vcs system.
- Files: added streaming remote attributes for vcsserver.
  This change enables streaming raw content or raw downloads of large files without
  transferring them over to enterprise for pack & repack using msgpack.
  Msgpack has a limit of 2gb and generally pack+repack for ~2gb is very slow.
- Files: ensure files over the size limit never do any content fetching when viewing such files.
- VCSServer: skip host verification to speed up pycurl calls.
- User-bookmarks: cache fetching of bookmarks since this is a quite expensive query to
  make with joinedload on repos/repo groups.
- Goto-switcher: reduce query data to only the required attributes for speedups.
- My account: owner/watched repos are now loaded only using DB queries.


Fixes
^^^^^

- Mercurial: moved imports away from the top-level to prevent loading mercurial code on hook execution for svn/git.
- GIT: limit sync-fetch logic to only retrieve tags/ and heads/ with default execution arguments.
- GIT: fixed issue with git submodules detection.
- SVN: fix checkout url for the ssh+svn backend not having the special prefix, resulting in an incorrect command shown.
- SVN: fixed problem with showing empty directories.
- OAuth: use a vendored version of the `authomatic` library, and switch Bitbucket authentication to use oauth2.
- Diffs: handle paths with quotes in diffs.
- Diffs: fixed outdated files in pull-requests re-using the filediff raw_id for anchor generation. Fixes #5567
- Diffs: fixed a race condition on the sticky vs wide-diff-mode toggle that caused some display problems on larger diffs.
- Pull requests: handle exceptions in state change and improve logging.
- Pull requests: fixed title/description generation for single commits which are numbers.
- Pull requests: changed the source of changes to use the shadow repo if it exists.
  In case of `git push -f` and rebase we lost commits in the repo, resulting in
  problems with displaying versions of pull-requests.
- Pull requests: handle the case of removing existing files from a repository in the compare versions diff.
- Files: don't expose the copy content helper in case of binary files.
- Registration: properly expose first_name/last_name into the email on user registration.
- Markup renderers: fixed broken code highlight for rst files.
- Ui: make super admin be named consistently across the ui.
- Audit logs: fixed search cases with special chars such as `-`.


Upgrade notes
^^^^^^^^^^^^^

- New Automation task. We've changed the logic for updating the latest change inside a repository group.
  The new logic includes scanning for changes in all nested objects. Since this is a heavy task,
  a new dedicated scheduler task has been created to update it automatically on a scheduled basis.
  Please review `admin > settings > automation` to enable this task.

- New safer encryption algorithm. Some setting values are encrypted before being stored inside the database.
  To keep full backward compatibility the old AES algorithm is used.
  If you wish to enable a safer option, set fernet encryption instead inside rhodecode.ini:
  `rhodecode.encrypted_values.algorithm = fernet`

- Pull requests UI changes. We've simplified the UI on the pull requests page.
  Please review the new UI to prevent surprises. All actions from the old UI should still be possible with the new one.

- Redis is now the default recommended backend for Celery and replaces the previous rabbitmq.
  Redis is generally easier to manage and install, and it's also very stable for usage
  in the scheduler/celery async tasks. Since we also recommend Redis for caches, the application
  stack can be simplified by removing rabbitmq and replacing it with a single Redis instance.
  An illustrative configuration sketch follows after these notes.

- Recommendation for using Redis as the new cache backend on vcsserver.
  Since version 4.18.0 VCSServer has a new cache implementation for VCS data.
  By default, for simplicity, the cache type is file based. We strongly recommend using
  Redis instead for better performance and scalability.
  Please review the vcsserver.ini settings under
  `rc_cache.repo_object.backend = dogpile.cache.rc.redis_msgpack`.
  An illustrative configuration sketch follows after these notes.

- New memory monitoring for Gunicorn workers. Starting from the 4.18 release an option was added
  to limit the maximum amount of memory used by a worker.
  Please review the new settings in the `[server:main]` section for memory management in both
  rhodecode.ini and vcsserver.ini::

    ; Maximum memory usage that each worker can use before it will receive a
    ; graceful restart signal 0 = memory monitoring is disabled
    ; Examples: 268435456 (256MB), 536870912 (512MB)
    ; 1073741824 (1GB), 2147483648 (2GB), 4294967296 (4GB)
    memory_max_usage = 0
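
A hedged illustration for the two Redis-related notes above (this is an editor's sketch, not
part of the shipped configuration templates; the `celery.broker_url` key and the
`rc_cache.repo_object.arguments.*` key names are assumptions, so verify them against the
rhodecode.ini and vcsserver.ini templates shipped with your version)::

    ; rhodecode.ini - point Celery at a local Redis instance (assumed key name)
    use_celery = true
    celery.broker_url = redis://localhost:6379/8

    ; vcsserver.ini - switch the repo_object cache to the Redis backend
    rc_cache.repo_object.backend = dogpile.cache.rc.redis_msgpack
    ; connection arguments for the Redis backend (key names assumed)
    rc_cache.repo_object.arguments.host = localhost
    rc_cache.repo_object.arguments.port = 6379
    rc_cache.repo_object.arguments.db = 5
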
@@ -0,0 +1,19 @@
# -*- coding: utf-8 -*-

# Copyright (C) 2016-2019 RhodeCode GmbH
#
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU Affero General Public License, version 3
# (only), as published by the Free Software Foundation.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU Affero General Public License
# along with this program. If not, see <http://www.gnu.org/licenses/>.
#
# This program is dual-licensed. If you wish to learn more about the
# RhodeCode Enterprise Edition, including its added features, Support services,
# and proprietary license terms, please see https://rhodecode.com/licenses/
@@ -0,0 +1,41 @@
# -*- coding: utf-8 -*-

# Copyright (C) 2018-2019 RhodeCode GmbH
#
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU Affero General Public License, version 3
# (only), as published by the Free Software Foundation.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU Affero General Public License
# along with this program. If not, see <http://www.gnu.org/licenses/>.
#
# This program is dual-licensed. If you wish to learn more about the
# RhodeCode Enterprise Edition, including its added features, Support services,
# and proprietary license terms, please see https://rhodecode.com/licenses/


def includeme(config):

    config.add_route(
        name='hovercard_user',
        pattern='/_hovercard/user/{user_id}')

    config.add_route(
        name='hovercard_user_group',
        pattern='/_hovercard/user_group/{user_group_id}')

    config.add_route(
        name='hovercard_pull_request',
        pattern='/_hovercard/pull_request/{pull_request_id}')

    config.add_route(
        name='hovercard_repo_commit',
        pattern='/_hovercard/commit/{repo_name:.*?[^/]}/{commit_id}', repo_route=True)

    # Scan module for configuration decorators.
    config.scan('.views', ignore='.tests')
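
The routes above are served by XHR-only, login-protected views (see the views module in the
next hunk). As a rough usage sketch, such an endpoint could be called from a script like the
one below; the host, the user id, and the session cookie name (taken from the default
`beaker.session.key = rhodecode`) are placeholders/assumptions:

    import requests

    # Illustrative only: host, user id and cookie value are placeholders.
    resp = requests.get(
        'https://rhodecode.local/_hovercard/user/2',
        # the hovercard views are registered with xhr=True, so this header is required
        headers={'X-Requested-With': 'XMLHttpRequest'},
        # @LoginRequired() means an authenticated session cookie is needed
        cookies={'rhodecode': '<session cookie of a logged-in user>'},
    )
    print(resp.status_code)
    print(resp.text[:200])
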
@@ -0,0 +1,110 @@
# -*- coding: utf-8 -*-

# Copyright (C) 2016-2019 RhodeCode GmbH
#
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU Affero General Public License, version 3
# (only), as published by the Free Software Foundation.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU Affero General Public License
# along with this program. If not, see <http://www.gnu.org/licenses/>.
#
# This program is dual-licensed. If you wish to learn more about the
# RhodeCode Enterprise Edition, including its added features, Support services,
# and proprietary license terms, please see https://rhodecode.com/licenses/

import re
import logging
import collections

from pyramid.httpexceptions import HTTPNotFound
from pyramid.view import view_config

from rhodecode.apps._base import BaseAppView, RepoAppView
from rhodecode.lib import helpers as h
from rhodecode.lib.auth import (
    LoginRequired, NotAnonymous, HasRepoGroupPermissionAnyDecorator, CSRFRequired,
    HasRepoPermissionAnyDecorator)
from rhodecode.lib.codeblocks import filenode_as_lines_tokens
from rhodecode.lib.index import searcher_from_config
from rhodecode.lib.utils2 import safe_unicode, str2bool, safe_int
from rhodecode.lib.ext_json import json
from rhodecode.lib.vcs.exceptions import CommitDoesNotExistError, EmptyRepositoryError
from rhodecode.lib.vcs.nodes import FileNode
from rhodecode.model.db import (
    func, true, or_, case, in_filter_generator, Repository, RepoGroup, User, UserGroup, PullRequest)
from rhodecode.model.repo import RepoModel
from rhodecode.model.repo_group import RepoGroupModel
from rhodecode.model.scm import RepoGroupList, RepoList
from rhodecode.model.user import UserModel
from rhodecode.model.user_group import UserGroupModel

log = logging.getLogger(__name__)


class HoverCardsView(BaseAppView):

    def load_default_context(self):
        c = self._get_local_tmpl_context()
        return c

    @LoginRequired()
    @view_config(
        route_name='hovercard_user', request_method='GET', xhr=True,
        renderer='rhodecode:templates/hovercards/hovercard_user.mako')
    def hovercard_user(self):
        c = self.load_default_context()
        user_id = self.request.matchdict['user_id']
        c.user = User.get_or_404(user_id)
        return self._get_template_context(c)

    @LoginRequired()
    @view_config(
        route_name='hovercard_user_group', request_method='GET', xhr=True,
        renderer='rhodecode:templates/hovercards/hovercard_user_group.mako')
    def hovercard_user_group(self):
        c = self.load_default_context()
        user_group_id = self.request.matchdict['user_group_id']
        c.user_group = UserGroup.get_or_404(user_group_id)
        return self._get_template_context(c)

    @LoginRequired()
    @view_config(
        route_name='hovercard_pull_request', request_method='GET', xhr=True,
        renderer='rhodecode:templates/hovercards/hovercard_pull_request.mako')
    def hovercard_pull_request(self):
        c = self.load_default_context()
        c.pull_request = PullRequest.get_or_404(
            self.request.matchdict['pull_request_id'])
        perms = ['repository.read', 'repository.write', 'repository.admin']
        c.can_view_pr = h.HasRepoPermissionAny(*perms)(
            c.pull_request.target_repo.repo_name)
        return self._get_template_context(c)


class HoverCardsRepoView(RepoAppView):
    def load_default_context(self):
        c = self._get_local_tmpl_context()
        return c

    @LoginRequired()
    @HasRepoPermissionAnyDecorator('repository.read', 'repository.write', 'repository.admin')
    @view_config(
        route_name='hovercard_repo_commit', request_method='GET', xhr=True,
        renderer='rhodecode:templates/hovercards/hovercard_repo_commit.mako')
    def hovercard_repo_commit(self):
        c = self.load_default_context()
        commit_id = self.request.matchdict['commit_id']
        pre_load = ['author', 'branch', 'date', 'message']
        try:
            c.commit = self.rhodecode_vcs_repo.get_commit(
                commit_id=commit_id, pre_load=pre_load)
        except (CommitDoesNotExistError, EmptyRepositoryError):
            raise HTTPNotFound()

        return self._get_template_context(c)
[A number of additional new files in this commit were truncated by the diff viewer; their content is not shown.]
@@ -1,5 +1,5 @@
 [bumpversion]
-current_version = 4.1
+current_version = 4.18.0
 message = release: Bump version {current_version} to {new_version}
 
 [bumpversion:file:rhodecode/VERSION]
@@ -8,6 +8,7 @@ include =
 omit =
     rhodecode/lib/dbmigrate/*
     rhodecode/lib/paster_commands/*
+    rhodecode/lib/_vendor/*
 
 [report]
 
@@ -45,6 +45,7 @@ syntax: regexp
 ^rhodecode/public/js/rhodecode-components.html$
 ^rhodecode/public/js/rhodecode-components.js$
 ^rhodecode/public/js/scripts.js$
+^rhodecode/public/js/scripts.min.js$
 ^rhodecode/public/js/src/components/root-styles.gen.html$
 ^rhodecode/public/js/vendors/webcomponentsjs/
 ^rhodecode\.db$
@@ -5,25 +5,20 @@ done = false
 done = true
 
 [task:rc_tools_pinned]
-done = true
 
 [task:fixes_on_stable]
-done = true
 
 [task:pip2nix_generated]
-done = true
 
 [task:changelog_updated]
-done = true
 
 [task:generate_api_docs]
-done = true
 
+[task:updated_translation]
+
 [release]
-state = 
-version = 4.1
+state = in_progress
+version = 4.18.0
 
-[task:updated_translation]
-
 [task:generate_js_routes]
 
@@ -13,9 +13,10 @@ module.exports = function(grunt) {
 
     grunt.loadNpmTasks('grunt-contrib-less');
     grunt.loadNpmTasks('grunt-contrib-concat');
+    grunt.loadNpmTasks('grunt-contrib-uglify');
     grunt.loadNpmTasks('grunt-contrib-watch');
     grunt.loadNpmTasks('grunt-contrib-jshint');
     grunt.loadNpmTasks('grunt-contrib-copy');
     grunt.loadNpmTasks('grunt-webpack');
-    grunt.registerTask('default', ['less:production', 'less:components', 'copy', 'webpack', 'concat:dist']);
+    grunt.registerTask('default', ['less:production', 'less:components', 'copy', 'webpack', 'concat:dist', 'uglify:dist']);
 };
@@ -17,6 +17,7 @@ test:
 test-clean:
 	rm -rf coverage.xml htmlcov junit.xml pylint.log result
 	find . -type d -name "__pycache__" -prune -exec rm -rf '{}' ';'
+	find . -type f \( -iname '.coverage.*' \) -exec rm '{}' ';'
 
 test-only:
 	PYTHONHASHSEED=random \
@@ -28,7 +29,7 @@ test-only-mysql:
 	PYTHONHASHSEED=random \
 	py.test -x -vv -r xw -p no:sugar --cov=rhodecode \
 	--cov-report=term-missing --cov-report=html \
-	--ini-config-override='{"app:main": {"sqlalchemy.db1.url": "mysql://root:qweqwe@localhost/rhodecode_test"}}' \
+	--ini-config-override='{"app:main": {"sqlalchemy.db1.url": "mysql://root:qweqwe@localhost/rhodecode_test?charset=utf8"}}' \
 	rhodecode
 
 test-only-postgres:
@@ -5,20 +5,24 @@ RhodeCode
 About
 -----
 
-``RhodeCode`` is a fast and powerful management tool for
-and Subversion_ with a built in push/pull server, full text search,
-pull requests and powerful code-review system. It works on http/https, SSH and
-has a few unique features like:
+``RhodeCode`` is a fast and powerful source code management tool for
+Mercurial_, GIT_ and Subversion_. It's main features are:
+
 
-- plugable architecture from Pyramid web-framework.
-- advanced permission system with IP restrictions, inheritation, and user-groups.
+- built in push/pull server
+- SSH with key management support
+- full text search.
+- plugable authentication.
+- pull requests and powerful code-review system.
+- advanced permission system with IP restrictions, permission inheritation, and user-groups.
 - rich set of authentication plugins including LDAP, ActiveDirectory, SAML 2.0,
   Atlassian Crowd, Http-Headers, Pam, Token-Auth, OAuth.
 - live code-review chat, and reviewer rules.
 - full web based file editing.
 - unified multi vcs support.
 - snippets (gist) system.
-- integration framework for Slack, CI systems, Webhooks.
+- artfacts store for binaries.
+- integration framework for Slack, CI systems, Webhooks, Jira, Redmine etc.
 - integration with all 3rd party issue trackers.
 
 
@@ -41,7 +45,8 @@ Source code
 -----------
 
 The latest sources can be obtained from official RhodeCode instance
-https://code.rhodecode.com
+https://code.rhodecode.com/rhodecode-enterprise-ce
+https://code.rhodecode.com/rhodecode-vcsserver
 
 
 Contributions
This diff has been collapsed as it changes many lines (918 lines changed); only the updated content of the rendered sections is reproduced below.
@@ -1,24 +1,22 b'' | |||||
1 |
|
1 | ## -*- coding: utf-8 -*- | ||
2 |
|
2 | |||
3 |
|
|
3 | ; ######################################### | |
4 |
|
|
4 | ; RHODECODE COMMUNITY EDITION CONFIGURATION | |
5 |
|
|
5 | ; ######################################### | |
6 |
|
6 | |||
7 | [DEFAULT] |
|
7 | [DEFAULT] | |
8 |
|
|
8 | ; Debug flag sets all loggers to debug, and enables request tracking | |
9 | debug = true |
|
9 | debug = true | |
10 |
|
10 | |||
11 |
|
|
11 | ; ######################################################################## | |
12 | ## EMAIL CONFIGURATION ## |
|
12 | ; EMAIL CONFIGURATION | |
13 | ## Uncomment and replace with the email address which should receive ## |
|
13 | ; These settings will be used by the RhodeCode mailing system | |
14 | ## any error reports after an application crash ## |
|
14 | ; ######################################################################## | |
15 | ## Additionally these settings will be used by the RhodeCode mailing system ## |
|
|||
16 | ################################################################################ |
|
|||
17 |
|
15 | |||
18 |
|
|
16 | ; prefix all emails subjects with given prefix, helps filtering out emails | |
19 | #email_prefix = [RhodeCode] |
|
17 | #email_prefix = [RhodeCode] | |
20 |
|
18 | |||
21 |
|
|
19 | ; email FROM address all mails will be sent | |
22 | #app_email_from = rhodecode-noreply@localhost |
|
20 | #app_email_from = rhodecode-noreply@localhost | |
23 |
|
21 | |||
24 | #smtp_server = mail.server.com |
|
22 | #smtp_server = mail.server.com | |
@@ -29,82 +27,139 b' debug = true' | |||||
29 | #smtp_use_ssl = true |
|
27 | #smtp_use_ssl = true | |
30 |
|
28 | |||
31 | [server:main] |
|
29 | [server:main] | |
32 | ## COMMON ## |
|
30 | ; COMMON HOST/IP CONFIG | |
33 | host = 127.0.0.1 |
|
31 | host = 127.0.0.1 | |
34 | port = 5000 |
|
32 | port = 5000 | |
35 |
|
33 | |||
36 |
|
|
34 | ; ################################################## | |
37 |
|
|
35 | ; WAITRESS WSGI SERVER - Recommended for Development | |
38 |
|
|
36 | ; ################################################## | |
39 |
|
37 | |||
|
38 | ; use server type | |||
40 | use = egg:waitress#main |
|
39 | use = egg:waitress#main | |
41 | ## number of worker threads |
|
40 | ||
|
41 | ; number of worker threads | |||
42 | threads = 5 |
|
42 | threads = 5 | |
43 | ## MAX BODY SIZE 100GB |
|
43 | ||
|
44 | ; MAX BODY SIZE 100GB | |||
44 | max_request_body_size = 107374182400 |
|
45 | max_request_body_size = 107374182400 | |
45 | ## Use poll instead of select, fixes file descriptors limits problems. |
|
46 | ||
46 | ## May not work on old windows systems. |
|
47 | ; Use poll instead of select, fixes file descriptors limits problems. | |
|
48 | ; May not work on old windows systems. | |||
47 | asyncore_use_poll = true |
|
49 | asyncore_use_poll = true | |
48 |
|
50 | |||
49 |
|
51 | |||
50 | ########################## |
|
52 | ; ########################### | |
51 |
|
|
53 | ; GUNICORN APPLICATION SERVER | |
52 | ########################## |
|
54 | ; ########################### | |
53 | ## run with gunicorn --log-config rhodecode.ini --paste rhodecode.ini |
|
|||
54 |
|
55 | |||
|
56 | ; run with gunicorn --log-config rhodecode.ini --paste rhodecode.ini | |||
|
57 | ||||
|
58 | ; Module to use, this setting shouldn't be changed | |||
55 | #use = egg:gunicorn#main |
|
59 | #use = egg:gunicorn#main | |
56 | ## Sets the number of process workers. More workers means more concurrent connections |
|
60 | ||
57 | ## RhodeCode can handle at the same time. Each additional worker also it increases |
|
61 | ; Sets the number of process workers. More workers means more concurrent connections | |
58 | ## memory usage as each has it's own set of caches. |
|
62 | ; RhodeCode can handle at the same time. Each additional worker also it increases | |
59 | ## Recommended value is (2 * NUMBER_OF_CPUS + 1), eg 2CPU = 5 workers, but no more |
|
63 | ; memory usage as each has it's own set of caches. | |
60 | ## than 8-10 unless for really big deployments .e.g 700-1000 users. |
|
64 | ; Recommended value is (2 * NUMBER_OF_CPUS + 1), eg 2CPU = 5 workers, but no more | |
61 | ## `instance_id = *` must be set in the [app:main] section below (which is the default) |
|
65 | ; than 8-10 unless for really big deployments .e.g 700-1000 users. | |
62 | ## when using more than 1 worker. |
|
66 | ; `instance_id = *` must be set in the [app:main] section below (which is the default) | |
|
67 | ; when using more than 1 worker. | |||
63 | #workers = 2 |
|
68 | #workers = 2 | |
64 | ## process name visible in process list |
|
69 | ||
|
70 | ; Gunicorn access log level | |||
|
71 | #loglevel = info | |||
|
72 | ||||
|
73 | ; Process name visible in process list | |||
65 | #proc_name = rhodecode |
|
74 | #proc_name = rhodecode | |
66 | ## type of worker class, one of sync, gevent |
|
75 | ||
67 | ## recommended for bigger setup is using of of other than sync one |
|
76 | ; Type of worker class, one of `sync`, `gevent` | |
|
77 | ; Recommended type is `gevent` | |||
68 | #worker_class = gevent |
|
78 | #worker_class = gevent | |
69 | ## The maximum number of simultaneous clients. Valid only for Gevent |
|
79 | ||
|
80 | ; The maximum number of simultaneous clients. Valid only for gevent | |||
70 | #worker_connections = 10 |
|
81 | #worker_connections = 10 | |
71 | ## max number of requests that worker will handle before being gracefully |
|
82 | ||
72 | ## restarted, could prevent memory leaks |
|
83 | ; Max number of requests that worker will handle before being gracefully restarted. | |
|
84 | ; Prevents memory leaks, jitter adds variability so not all workers are restarted at once. | |||
73 | #max_requests = 1000 |
|
85 | #max_requests = 1000 | |
74 | #max_requests_jitter = 30 |
|
86 | #max_requests_jitter = 30 | |
75 | ## amount of time a worker can spend with handling a request before it |
|
87 | ||
76 | ## gets killed and restarted. Set to 6hrs |
|
88 | ; Amount of time a worker can spend with handling a request before it | |
|
89 | ; gets killed and restarted. By default set to 21600 (6hrs) | |||
|
90 | ; Examples: 1800 (30min), 3600 (1hr), 7200 (2hr), 43200 (12h) | |||
77 | #timeout = 21600 |
|
91 | #timeout = 21600 | |
78 |
|
92 | |||
|
93 | ; The maximum size of HTTP request line in bytes. | |||
|
94 | ; 0 for unlimited | |||
|
95 | #limit_request_line = 0 | |||
|
96 | ||||
|
97 | ; Limit the number of HTTP headers fields in a request. | |||
|
98 | ; By default this value is 100 and can't be larger than 32768. | |||
|
99 | #limit_request_fields = 32768 | |||
|
100 | ||||
|
101 | ; Limit the allowed size of an HTTP request header field. | |||
|
102 | ; Value is a positive number or 0. | |||
|
103 | ; Setting it to 0 will allow unlimited header field sizes. | |||
|
104 | #limit_request_field_size = 0 | |||
|
105 | ||||
|
106 | ; Timeout for graceful workers restart. | |||
|
107 | ; After receiving a restart signal, workers have this much time to finish | |||
|
108 | ; serving requests. Workers still alive after the timeout (starting from the | |||
|
109 | ; receipt of the restart signal) are force killed. | |||
|
110 | ; Examples: 1800 (30min), 3600 (1hr), 7200 (2hr), 43200 (12h) | |||
|
111 | #graceful_timeout = 3600 | |||
|
112 | ||||
|
113 | # The number of seconds to wait for requests on a Keep-Alive connection. | |||
|
114 | # Generally set in the 1-5 seconds range. | |||
|
115 | #keepalive = 2 | |||
|
116 | ||||
|
117 | ; Maximum memory usage that each worker can use before it will receive a | |||
|
118 | ; graceful restart signal 0 = memory monitoring is disabled | |||
|
119 | ; Examples: 268435456 (256MB), 536870912 (512MB) | |||
|
120 | ; 1073741824 (1GB), 2147483648 (2GB), 4294967296 (4GB) | |||
|
121 | #memory_max_usage = 0 | |||
|
122 | ||||
|
123 | ; How often in seconds to check for memory usage for each gunicorn worker | |||
|
124 | #memory_usage_check_interval = 60 | |||
|
125 | ||||
|
126 | ; Threshold value for which we don't recycle worker if GarbageCollection | |||
|
127 | ; frees up enough resources. Before each restart we try to run GC on worker | |||
|
128 | ; in case we get enough free memory after that, restart will not happen. | |||
|
129 | #memory_usage_recovery_threshold = 0.8 | |||
|
130 | ||||
79 |
|
131 | |||
80 |
|
|
132 | ; Prefix middleware for RhodeCode. | |
81 |
|
|
133 | ; recommended when using proxy setup. | |
82 |
|
|
134 | ; allows to set RhodeCode under a prefix in server. | |
83 |
|
|
135 | ; eg https://server.com/custom_prefix. Enable `filter-with =` option below as well. | |
84 |
|
|
136 | ; And set your prefix like: `prefix = /custom_prefix` | |
85 |
|
|
137 | ; be sure to also set beaker.session.cookie_path = /custom_prefix if you need | |
86 |
|
|
138 | ; to make your cookies only work on prefix url | |
87 | [filter:proxy-prefix] |
|
139 | [filter:proxy-prefix] | |
88 | use = egg:PasteDeploy#prefix |
|
140 | use = egg:PasteDeploy#prefix | |
89 | prefix = / |
|
141 | prefix = / | |
90 |
|
142 | |||
91 | [app:main] |
|
143 | [app:main] | |
92 |
|
|
144 | ; The %(here)s variable will be replaced with the absolute path of parent directory | |
93 |
|
|
145 | ; of this file | |
94 |
|
|
146 | ; In addition ENVIRONMENT variables usage is possible, e.g | |
95 |
|
|
147 | ; sqlalchemy.db1.url = {ENV_RC_DB_URL} | |
96 |
|
148 | |||
97 | use = egg:rhodecode-enterprise-ce |
|
149 | use = egg:rhodecode-enterprise-ce | |
98 |
|
150 | |||
99 |
|
|
151 | ; enable proxy prefix middleware, defined above | |
100 | #filter-with = proxy-prefix |
|
152 | #filter-with = proxy-prefix | |
101 |
|
153 | |||
|
154 | ; ############# | |||
|
155 | ; DEBUG OPTIONS | |||
|
156 | ; ############# | |||
|
157 | ||||
|
158 | pyramid.reload_templates = true | |||
|
159 | ||||
102 | # During development the we want to have the debug toolbar enabled |
|
160 | # During development the we want to have the debug toolbar enabled | |
103 | pyramid.includes = |
|
161 | pyramid.includes = | |
104 | pyramid_debugtoolbar |
|
162 | pyramid_debugtoolbar | |
105 | rhodecode.lib.middleware.request_wrapper |
|
|||
106 |
|
||||
107 | pyramid.reload_templates = true |
|
|||
108 |
|
163 | |||
109 | debugtoolbar.hosts = 0.0.0.0/0 |
|
164 | debugtoolbar.hosts = 0.0.0.0/0 | |
110 | debugtoolbar.exclude_prefixes = |
|
165 | debugtoolbar.exclude_prefixes = | |
@@ -121,101 +176,100 b' rhodecode.includes =' | |||||
121 | # api prefix url |
|
176 | # api prefix url | |
122 | rhodecode.api.url = /_admin/api |
|
177 | rhodecode.api.url = /_admin/api | |
123 |
|
178 | |||
124 |
|
179 | ; enable debug style page | ||
125 | ## END RHODECODE PLUGINS ## |
|
180 | debug_style = true | |
126 |
|
181 | |||
127 | ## encryption key used to encrypt social plugin tokens, |
|
182 | ; ################# | |
128 | ## remote_urls with credentials etc, if not set it defaults to |
|
183 | ; END DEBUG OPTIONS | |
129 | ## `beaker.session.secret` |
|
184 | ; ################# | |
|
185 | ||||
|
186 | ; encryption key used to encrypt social plugin tokens, | |||
|
187 | ; remote_urls with credentials etc, if not set it defaults to | |||
|
188 | ; `beaker.session.secret` | |||
130 | #rhodecode.encrypted_values.secret = |
|
189 | #rhodecode.encrypted_values.secret = | |
131 |
|
190 | |||
132 |
|
|
191 | ; decryption strict mode (enabled by default). It controls if decryption raises | |
133 |
|
|
192 | ; `SignatureVerificationError` in case of wrong key, or damaged encryption data. | |
134 | #rhodecode.encrypted_values.strict = false |
|
193 | #rhodecode.encrypted_values.strict = false | |
135 |
|
194 | |||
136 |
|
|
195 | ; Pick algorithm for encryption. Either fernet (more secure) or aes (default) | |
137 |
|
|
196 | ; fernet is safer, and we strongly recommend switching to it. | |
138 |
|
|
197 | ; Due to backward compatibility aes is used as default. | |
139 | #rhodecode.encrypted_values.algorithm = fernet |
|
#rhodecode.encrypted_values.algorithm = fernet

; Return gzipped responses from RhodeCode (static files/application)
gzip_responses = false

; Auto-generate javascript routes file on startup
generate_js_files = false

; System global default language.
; All available languages: en (default), be, de, es, fr, it, ja, pl, pt, ru, zh
lang = en

; Perform a full repository scan and import on each server start.
; Settings this to true could lead to very long startup time.
startup.import_repos = false

; Uncomment and set this path to use archive download cache.
; Once enabled, generated archives will be cached at this location
; and served from the cache during subsequent requests for the same archive of
; the repository.
#archive_cache_dir = /tmp/tarballcache

; URL at which the application is running. This is used for Bootstrapping
; requests in context when no web request is available. Used in ishell, or
; SSH calls. Set this for events to receive proper url for SSH calls.
app.base_url = http://rhodecode.local

; Unique application ID. Should be a random unique string for security.
app_instance_uuid = rc-production

; Cut off limit for large diffs (size in bytes). If overall diff size on
; commit, or pull request exceeds this limit this diff will be displayed
; partially. E.g 512000 == 512Kb
cut_off_limit_diff = 512000

; Cut off limit for large files inside diffs (size in bytes). Each individual
; file inside diff which exceeds this limit will be displayed partially.
; E.g 128000 == 128Kb
cut_off_limit_file = 128000

; Use cached version of vcs repositories everywhere. Recommended to be `true`
vcs_full_cache = true

; Force https in RhodeCode, fixes https redirects, assumes it's always https.
; Normally this is controlled by proper flags sent from http server such as Nginx or Apache
force_https = false

; use Strict-Transport-Security headers
use_htsts = false

; Set to true if your repos are exposed using the dumb protocol
git_update_server_info = false

; RSS/ATOM feed options
rss_cut_off_limit = 256000
rss_items_per_page = 10
rss_include_diff = false

; gist URL alias, used to create nicer urls for gist. This should be an
; url that does rewrites to _admin/gists/{gistid}.
; example: http://gist.rhodecode.org/{gistid}. Empty means use the internal
; RhodeCode url, ie. http[s]://rhodecode.server/_admin/gists/{gistid}
gist_alias_url =

; List of views (using glob pattern syntax) that AUTH TOKENS could be
; used for access.
; Adding ?auth_token=TOKEN_HASH to the url authenticates this request as if it
; came from the the logged in user who own this authentication token.
; Additionally @TOKEN syntax can be used to bound the view to specific
; authentication token. Such view would be only accessible when used together
; with this authentication token
; list of all views can be found under `/_admin/permissions/auth_token_access`
; The list should be "," separated and on a single line.
; Most common views to enable:

# RepoCommitsView:repo_commit_download
# RepoCommitsView:repo_commit_patch
# RepoCommitsView:repo_commit_raw
@@ -226,164 +280,194 @@ gist_alias_url =
# GistView:*
api_access_controllers_whitelist =

; Default encoding used to convert from and to unicode
; can be also a comma separated list of encoding in case of mixed encodings
default_encoding = UTF-8

; instance-id prefix
; a prefix key for this instance used for cache invalidation when running
; multiple instances of RhodeCode, make sure it's globally unique for
; all running RhodeCode instances. Leave empty if you don't use it
instance_id =

; Fallback authentication plugin. Set this to a plugin ID to force the usage
; of an authentication plugin also if it is disabled by it's settings.
; This could be useful if you are unable to log in to the system due to broken
; authentication settings. Then you can enable e.g. the internal RhodeCode auth
; module to log in again and fix the settings.
; Available builtin plugin IDs (hash is part of the ID):
; egg:rhodecode-enterprise-ce#rhodecode
; egg:rhodecode-enterprise-ce#pam
; egg:rhodecode-enterprise-ce#ldap
; egg:rhodecode-enterprise-ce#jasig_cas
; egg:rhodecode-enterprise-ce#headers
; egg:rhodecode-enterprise-ce#crowd

#rhodecode.auth_plugin_fallback = egg:rhodecode-enterprise-ce#rhodecode

; Flag to control loading of legacy plugins in py:/path format
auth_plugin.import_legacy_plugins = true

; alternative return HTTP header for failed authentication. Default HTTP
; response is 401 HTTPUnauthorized. Currently HG clients have troubles with
; handling that causing a series of failed authentication calls.
; Set this variable to 403 to return HTTPForbidden, or any other HTTP code
; This will be served instead of default 401 on bad authentication
auth_ret_code =

; use special detection method when serving auth_ret_code, instead of serving
; ret_code directly, use 401 initially (Which triggers credentials prompt)
; and then serve auth_ret_code to clients
auth_ret_code_detection = false

; locking return code. When repository is locked return this HTTP code. 2XX
; codes don't break the transactions while 4XX codes do
lock_ret_code = 423

; allows to change the repository location in settings page
allow_repo_location_change = true

; allows to setup custom hooks in settings page
allow_custom_hooks_settings = true

; Generated license token required for EE edition license.
; New generated token value can be found in Admin > settings > license page.
license_token =

; This flag hides sensitive information on the license page such as token, and license data
license.hide_license_info = false

; supervisor connection uri, for managing supervisor and logs.
supervisor.uri =

; supervisord group name/id we only want this RC instance to handle
supervisor.group_id = dev

; Display extended labs settings
labs_settings_active = true

; Custom exception store path, defaults to TMPDIR
; This is used to store exception from RhodeCode in shared directory
#exception_tracker.store_path =

; File store configuration. This is used to store and serve uploaded files
file_store.enabled = true

; Storage backend, available options are: local
file_store.backend = local

; path to store the uploaded binaries
file_store.storage_path = %(here)s/data/file_store


; #############
; CELERY CONFIG
; #############

; manually run celery: /path/to/celery worker -E --beat --app rhodecode.lib.celerylib.loader --scheduler rhodecode.lib.celerylib.scheduler.RcScheduler --loglevel DEBUG --ini /path/to/rhodecode.ini

use_celery = false

; connection url to the message broker (default redis)
celery.broker_url = redis://localhost:6379/8

; rabbitmq example
#celery.broker_url = amqp://rabbitmq:qweqwe@localhost:5672/rabbitmqhost

; maximum tasks to execute before worker restart
celery.max_tasks_per_child = 100

; tasks will never be sent to the queue, but executed locally instead.
celery.task_always_eager = false

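The `celery.broker_url` above points at a local Redis database. A quick, optional way to confirm the broker is reachable before enabling `use_celery` is a short check with the redis-py client (a minimal sketch, not part of RhodeCode; it assumes the `redis` package is installed and reuses the URL from the example above)::

    import redis

    # broker URL taken from the sample config above; adjust host/port/db as needed
    broker = redis.from_url('redis://localhost:6379/8')
    print(broker.ping())  # True when the broker answers
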
; #############
; DOGPILE CACHE
; #############

; Default cache dir for caches. Putting this into a ramdisk can boost performance.
; eg. /tmpfs/data_ramdisk, however this directory might require large amount of space
cache_dir = %(here)s/data

; *********************************************
; `sql_cache_short` cache for heavy SQL queries
; Only supported backend is `memory_lru`
; *********************************************
rc_cache.sql_cache_short.backend = dogpile.cache.rc.memory_lru
rc_cache.sql_cache_short.expiration_time = 30


; *****************************************************
; `cache_repo_longterm` cache for repo object instances
; Only supported backend is `memory_lru`
; *****************************************************
rc_cache.cache_repo_longterm.backend = dogpile.cache.rc.memory_lru
; by default we use 30 Days, cache is still invalidated on push
rc_cache.cache_repo_longterm.expiration_time = 2592000
; max items in LRU cache, set to smaller number to save memory, and expire last used caches
rc_cache.cache_repo_longterm.max_size = 10000


; *************************************************
; `cache_perms` cache for permission tree, auth TTL
; *************************************************
rc_cache.cache_perms.backend = dogpile.cache.rc.file_namespace
rc_cache.cache_perms.expiration_time = 300
; file cache store path. Defaults to `cache_dir =` value or tempdir if both values are not set
#rc_cache.cache_perms.arguments.filename = /tmp/cache_perms.db

; alternative `cache_perms` redis backend with distributed lock
#rc_cache.cache_perms.backend = dogpile.cache.rc.redis
#rc_cache.cache_perms.expiration_time = 300

; redis_expiration_time needs to be greater then expiration_time
#rc_cache.cache_perms.arguments.redis_expiration_time = 7200

#rc_cache.cache_perms.arguments.host = localhost
#rc_cache.cache_perms.arguments.port = 6379
#rc_cache.cache_perms.arguments.db = 0
#rc_cache.cache_perms.arguments.socket_timeout = 30
; more Redis options: https://dogpilecache.sqlalchemy.org/en/latest/api.html#redis-backends
#rc_cache.cache_perms.arguments.distributed_lock = true


; ***************************************************
; `cache_repo` cache for file tree, Readme, RSS FEEDS
; ***************************************************
rc_cache.cache_repo.backend = dogpile.cache.rc.file_namespace
rc_cache.cache_repo.expiration_time = 2592000
; file cache store path. Defaults to `cache_dir =` value or tempdir if both values are not set
#rc_cache.cache_repo.arguments.filename = /tmp/cache_repo.db

; alternative `cache_repo` redis backend with distributed lock
#rc_cache.cache_repo.backend = dogpile.cache.rc.redis
#rc_cache.cache_repo.expiration_time = 2592000

; redis_expiration_time needs to be greater then expiration_time
#rc_cache.cache_repo.arguments.redis_expiration_time = 2678400

#rc_cache.cache_repo.arguments.host = localhost
#rc_cache.cache_repo.arguments.port = 6379
#rc_cache.cache_repo.arguments.db = 1
#rc_cache.cache_repo.arguments.socket_timeout = 30
; more Redis options: https://dogpilecache.sqlalchemy.org/en/latest/api.html#redis-backends
#rc_cache.cache_repo.arguments.distributed_lock = true

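The dogpile expiration values above are plain seconds. As a worked example, the 2592000/2678400 pair used for the `cache_repo` Redis backend corresponds to 30 and 31 days, which keeps `redis_expiration_time` greater than `expiration_time` as the comments require::

    # 30 days - matches rc_cache.cache_repo.expiration_time above
    expiration_time = 30 * 24 * 60 * 60        # 2592000
    # 31 days - matches rc_cache.cache_repo.arguments.redis_expiration_time
    redis_expiration_time = 31 * 24 * 60 * 60  # 2678400
    assert redis_expiration_time > expiration_time
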
; ##############
; BEAKER SESSION
; ##############

; beaker.session.type is type of storage options for the logged users sessions. Current allowed
; types are file, ext:redis, ext:database, ext:memcached, and memory (default if not specified).
; Fastest ones are Redis and ext:database
beaker.session.type = file
beaker.session.data_dir = %(here)s/data/sessions

; Redis based sessions
#beaker.session.type = ext:redis
#beaker.session.url = redis://127.0.0.1:6379/2

; DB based session, fast, and allows easy management over logged in users
#beaker.session.type = ext:database
#beaker.session.table_name = db_session
#beaker.session.sa.url = postgresql://postgres:secret@localhost/rhodecode
@@ -395,267 +479,275 @@ beaker.session.key = rhodecode
beaker.session.secret = develop-rc-uytcxaz
beaker.session.lock_dir = %(here)s/data/sessions/lock

; Secure encrypted cookie. Requires AES and AES python libraries
; you must disable beaker.session.secret to use this
#beaker.session.encrypt_key = key_for_encryption
#beaker.session.validate_key = validation_key

; Sets session as invalid (also logging out user) if it haven not been
; accessed for given amount of time in seconds
beaker.session.timeout = 2592000
beaker.session.httponly = true

; Path to use for the cookie. Set to prefix if you use prefix middleware
#beaker.session.cookie_path = /custom_prefix

; Set https secure cookie
beaker.session.secure = false

; default cookie expiration time in seconds, set to `true` to set expire
; at browser close
#beaker.session.cookie_expires = 3600

; #############################
; SEARCH INDEXING CONFIGURATION
; #############################

; Full text search indexer is available in rhodecode-tools under
; `rhodecode-tools index` command

; WHOOSH Backend, doesn't require additional services to run
; it works good with few dozen repos
search.module = rhodecode.lib.index.whoosh
search.location = %(here)s/data/index

; ####################
; CHANNELSTREAM CONFIG
; ####################

; channelstream enables persistent connections and live notification
; in the system. It's also used by the chat system

channelstream.enabled = false

; server address for channelstream server on the backend
channelstream.server = 127.0.0.1:9800

; location of the channelstream server from outside world
; use ws:// for http or wss:// for https. This address needs to be handled
; by external HTTP server such as Nginx or Apache
; see Nginx/Apache configuration examples in our docs
channelstream.ws_url = ws://rhodecode.yourserver.com/_channelstream
channelstream.secret = secret
channelstream.history.location = %(here)s/channelstream_history

; Internal application path that Javascript uses to connect into.
; If you use proxy-prefix the prefix should be added before /_channelstream
channelstream.proxy_path = /_channelstream


; ##############################
; MAIN RHODECODE DATABASE CONFIG
; ##############################

#sqlalchemy.db1.url = sqlite:///%(here)s/rhodecode.db?timeout=30
#sqlalchemy.db1.url = postgresql://postgres:qweqwe@localhost/rhodecode
#sqlalchemy.db1.url = mysql://root:qweqwe@localhost/rhodecode?charset=utf8
; pymysql is an alternative driver for MySQL, use in case of problems with default one
#sqlalchemy.db1.url = mysql+pymysql://root:qweqwe@localhost/rhodecode

sqlalchemy.db1.url = sqlite:///%(here)s/rhodecode.db?timeout=30

; see sqlalchemy docs for other advanced settings
; print the sql statements to output
sqlalchemy.db1.echo = false

; recycle the connections after this amount of seconds
sqlalchemy.db1.pool_recycle = 3600
sqlalchemy.db1.convert_unicode = true

; the number of connections to keep open inside the connection pool.
; 0 indicates no limit
#sqlalchemy.db1.pool_size = 5

; The number of connections to allow in connection pool "overflow", that is
; connections that can be opened above and beyond the pool_size setting,
; which defaults to five.
#sqlalchemy.db1.max_overflow = 10

; Connection check ping, used to detect broken database connections
; could be enabled to better handle cases if MySQL has gone away errors
#sqlalchemy.db1.ping_connection = true

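Before pointing `sqlalchemy.db1.url` at one of the commented-out PostgreSQL/MySQL examples, the URL can be verified outside of RhodeCode with a minimal SQLAlchemy round trip (a hedged sketch; the PostgreSQL URL below is the sample value from this file and should be replaced with your own credentials)::

    from sqlalchemy import create_engine, text

    engine = create_engine('postgresql://postgres:qweqwe@localhost/rhodecode')
    with engine.connect() as conn:
        print(conn.execute(text('SELECT 1')).scalar())  # prints 1 on success
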
; ##########
; VCS CONFIG
; ##########
vcs.server.enable = true
vcs.server = localhost:9900

; Web server connectivity protocol, responsible for web based VCS operations
; Available protocols are:
; `http` - use http-rpc backend (default)
vcs.server.protocol = http

; Push/Pull operations protocol, available options are:
; `http` - use http-rpc backend (default)
vcs.scm_app_implementation = http

; Push/Pull operations hooks protocol, available options are:
; `http` - use http-rpc backend (default)
vcs.hooks.protocol = http

; Host on which this instance is listening for hooks. If vcsserver is in other location
; this should be adjusted.
vcs.hooks.host = 127.0.0.1

; Start VCSServer with this instance as a subprocess, useful for development
vcs.start_server = false

; List of enabled VCS backends, available options are:
; `hg` - mercurial
; `git` - git
; `svn` - subversion
vcs.backends = hg, git, svn

; Wait this number of seconds before killing connection to the vcsserver
vcs.connection_timeout = 3600

; Compatibility version when creating SVN repositories. Defaults to newest version when commented out.
; Available options are: pre-1.4-compatible, pre-1.5-compatible, pre-1.6-compatible, pre-1.8-compatible, pre-1.9-compatible
#vcs.svn.compatible_version = pre-1.8-compatible


; ####################################################
; Subversion proxy support (mod_dav_svn)
; Maps RhodeCode repo groups into SVN paths for Apache
; ####################################################

; Enable or disable the config file generation.
svn.proxy.generate_config = false

; Generate config file with `SVNListParentPath` set to `On`.
svn.proxy.list_parent_path = true

; Set location and file name of generated config file.
svn.proxy.config_file_path = %(here)s/mod_dav_svn.conf

; alternative mod_dav config template. This needs to be a valid mako template
; Example template can be found in the source code:
; rhodecode/apps/svn_support/templates/mod-dav-svn.conf.mako
#svn.proxy.config_template = ~/.rccontrol/enterprise-1/custom_svn_conf.mako

; Used as a prefix to the `Location` block in the generated config file.
; In most cases it should be set to `/`.
svn.proxy.location_root = /

; Command to reload the mod dav svn configuration on change.
; Example: `/etc/init.d/apache2 reload` or /home/USER/apache_reload.sh
; Make sure user who runs RhodeCode process is allowed to reload Apache
#svn.proxy.reload_cmd = /etc/init.d/apache2 reload

; If the timeout expires before the reload command finishes, the command will
; be killed. Setting it to zero means no timeout. Defaults to 10 seconds.
#svn.proxy.reload_timeout = 10

; ####################
; SSH Support Settings
; ####################

; Defines if a custom authorized_keys file should be created and written on
; any change user ssh keys. Setting this to false also disables possibility
; of adding SSH keys by users from web interface. Super admins can still
; manage SSH Keys.
ssh.generate_authorized_keyfile = false

; Options for ssh, default is `no-pty,no-port-forwarding,no-X11-forwarding,no-agent-forwarding`
# ssh.authorized_keys_ssh_opts =

; Path to the authorized_keys file where the generate entries are placed.
; It is possible to have multiple key files specified in `sshd_config` e.g.
; AuthorizedKeysFile %h/.ssh/authorized_keys %h/.ssh/authorized_keys_rhodecode
ssh.authorized_keys_file_path = ~/.ssh/authorized_keys_rhodecode

; Command to execute the SSH wrapper. The binary is available in the
; RhodeCode installation directory.
; e.g ~/.rccontrol/community-1/profile/bin/rc-ssh-wrapper
ssh.wrapper_cmd = ~/.rccontrol/community-1/rc-ssh-wrapper

; Allow shell when executing the ssh-wrapper command
ssh.wrapper_cmd_allow_shell = false

; Enables logging, and detailed output send back to the client during SSH
; operations. Useful for debugging, shouldn't be used in production.
ssh.enable_debug_logging = true

; Paths to binary executable, by default they are the names, but we can
; override them if we want to use a custom one
ssh.executable.hg = ~/.rccontrol/vcsserver-1/profile/bin/hg
ssh.executable.git = ~/.rccontrol/vcsserver-1/profile/bin/git
ssh.executable.svn = ~/.rccontrol/vcsserver-1/profile/bin/svnserve

; Enables SSH key generator web interface. Disabling this still allows users
; to add their own keys.
ssh.enable_ui_key_generator = true


; #################
; APPENLIGHT CONFIG
; #################

; Appenlight is tailored to work with RhodeCode, see
; http://appenlight.rhodecode.com for details how to obtain an account

; Appenlight integration enabled
appenlight = false

appenlight.server_url = https://api.appenlight.com
appenlight.api_key = YOUR_API_KEY
#appenlight.transport_config = https://api.appenlight.com?threaded=1&timeout=5

; used for JS client
appenlight.api_public_key = YOUR_API_PUBLIC_KEY

; TWEAK AMOUNT OF INFO SENT HERE

; enables 404 error logging (default False)
appenlight.report_404 = false

; time in seconds after request is considered being slow (default 1)
appenlight.slow_request_time = 1

; record slow requests in application
; (needs to be enabled for slow datastore recording and time tracking)
appenlight.slow_requests = true

; enable hooking to application loggers
appenlight.logging = true

; minimum log level for log capture
appenlight.logging.level = WARNING

; send logs only from erroneous/slow requests
; (saves API quota for intensive logging)
appenlight.logging_on_error = false

; list of additional keywords that should be grabbed from environ object
; can be string with comma separated list of words in lowercase
; (by default client will always send following info:
; 'REMOTE_USER', 'REMOTE_ADDR', 'SERVER_NAME', 'CONTENT_TYPE' + all keys that
; start with HTTP* this list be extended with additional keywords here
appenlight.environ_keys_whitelist =

; list of keywords that should be blanked from request object
; can be string with comma separated list of words in lowercase
; (by default client will always blank keys that contain following words
; 'password', 'passwd', 'pwd', 'auth_tkt', 'secret', 'csrf'
; this list be extended with additional keywords set here
appenlight.request_keys_blacklist =

; list of namespaces that should be ignores when gathering log entries
; can be string with comma separated list of namespaces
; (by default the client ignores own entries: appenlight_client.client)
appenlight.log_namespace_blacklist =

; Dummy marker to add new entries after.
; Add any custom entries below. Please don't remove this marker.
custom.conf = 1


; #####################
; LOGGING CONFIGURATION
; #####################
[loggers]
keys = root, sqlalchemy, beaker, celery, rhodecode, ssh_wrapper

@@ -665,9 +757,9 @@ keys = console, console_sql
[formatters]
keys = generic, color_formatter, color_formatter_sql

; #######
; LOGGERS
; #######
[logger_root]
level = NOTSET
handlers = console
@@ -702,9 +794,9 @@ handlers =
qualname = celery


; ########
; HANDLERS
; ########

[handler_console]
class = StreamHandler
@@ -713,17 +805,17 @@ level = DEBUG
formatter = color_formatter

[handler_console_sql]
; "level = DEBUG" logs SQL queries and results.
; "level = INFO" logs SQL queries.
; "level = WARN" logs neither. (Recommended for production systems.)
class = StreamHandler
args = (sys.stderr, )
level = WARN
formatter = color_formatter_sql

; ##########
; FORMATTERS
; ##########

[formatter_generic]
class = rhodecode.lib.logging_formatter.ExceptionAwareFormatter
@@ -1,58 +1,26 @@
"""
Gunicorn config extension and hooks. This config file adds some extra settings and memory management.
Gunicorn configuration should be managed by .ini files entries of RhodeCode or VCSServer
"""

import gc
import os
import sys
import math
import time
import threading
import traceback
import random
from gunicorn.glogging import Logger


def get_workers():
    import multiprocessing
    return multiprocessing.cpu_count() * 2 + 1

# GLOBAL
errorlog = '-'
accesslog = '-'


# SERVER MECHANICS
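The `get_workers()` helper above encodes the usual `2 * CPUs + 1` gunicorn sizing rule and is only used when the commented `workers = get_workers()` line further down is enabled. A worked example of the same formula::

    import multiprocessing

    # same arithmetic as get_workers(): 2 * CPUs + 1, e.g. 9 on a 4-core host
    print(multiprocessing.cpu_count() * 2 + 1)
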
@@ -63,38 +31,178 @@ tmp_upload_dir = None

# Custom log format
access_log_format = (
    '%(t)s %(p)s INFO [GNCRN] %(h)-15s rqt:%(L)s %(s)s %(b)-6s "%(m)s:%(U)s %(q)s" usr:%(u)s "%(f)s" "%(a)s"')

# self adjust workers based on CPU count
# workers = get_workers()


def _get_process_rss(pid=None):
    try:
        import psutil
        if pid:
            proc = psutil.Process(pid)
        else:
            proc = psutil.Process()
        return proc.memory_info().rss
    except Exception:
        return None


def _get_config(ini_path):

    try:
        import configparser
    except ImportError:
        import ConfigParser as configparser
    try:
        config = configparser.RawConfigParser()
        config.read(ini_path)
        return config
    except Exception:
        return None


def _time_with_offset(memory_usage_check_interval):
    return time.time() - random.randint(0, memory_usage_check_interval/2.0)


def pre_fork(server, worker):
    pass


def post_fork(server, worker):

    # memory spec defaults
    _memory_max_usage = 0
    _memory_usage_check_interval = 60
    _memory_usage_recovery_threshold = 0.8

    ini_path = os.path.abspath(server.cfg.paste)
    conf = _get_config(ini_path)

    section = 'server:main'
    if conf and conf.has_section(section):

        if conf.has_option(section, 'memory_max_usage'):
            _memory_max_usage = conf.getint(section, 'memory_max_usage')

        if conf.has_option(section, 'memory_usage_check_interval'):
            _memory_usage_check_interval = conf.getint(section, 'memory_usage_check_interval')

        if conf.has_option(section, 'memory_usage_recovery_threshold'):
            _memory_usage_recovery_threshold = conf.getfloat(section, 'memory_usage_recovery_threshold')

    worker._memory_max_usage = _memory_max_usage
    worker._memory_usage_check_interval = _memory_usage_check_interval
    worker._memory_usage_recovery_threshold = _memory_usage_recovery_threshold

    # register memory last check time, with some random offset so we don't recycle all
    # at once
    worker._last_memory_check_time = _time_with_offset(_memory_usage_check_interval)

    if _memory_max_usage:
        server.log.info("[%-10s] WORKER spawned with max memory set at %s", worker.pid,
                        _format_data_size(_memory_max_usage))
    else:
        server.log.info("[%-10s] WORKER spawned", worker.pid)


def pre_exec(server):
    server.log.info("Forked child, re-executing.")


def on_starting(server):
    server_lbl = '{} {}'.format(server.proc_name, server.address)
    server.log.info("Server %s is starting.", server_lbl)


def when_ready(server):
    server.log.info("Server %s is ready. Spawning workers", server)


def on_reload(server):
    pass


def _format_data_size(size, unit="B", precision=1, binary=True):
    """Format a number using SI units (kilo, mega, etc.).

    ``size``: The number as a float or int.

    ``unit``: The unit name in plural form. Examples: "bytes", "B".

    ``precision``: How many digits to the right of the decimal point. Default
    is 1.  0 suppresses the decimal point.

    ``binary``: If false, use base-10 decimal prefixes (kilo = K = 1000).
    If true, use base-2 binary prefixes (kibi = Ki = 1024).

    ``full_name``: If false (default), use the prefix abbreviation ("k" or
    "Ki").  If true, use the full prefix ("kilo" or "kibi"). If false,
    use abbreviation ("k" or "Ki").

    """

    if not binary:
        base = 1000
        multiples = ('', 'k', 'M', 'G', 'T', 'P', 'E', 'Z', 'Y')
    else:
        base = 1024
        multiples = ('', 'Ki', 'Mi', 'Gi', 'Ti', 'Pi', 'Ei', 'Zi', 'Yi')

    sign = ""
    if size > 0:
        m = int(math.log(size, base))
    elif size < 0:
        sign = "-"
        size = -size
        m = int(math.log(size, base))
    else:
        m = 0
    if m > 8:
        m = 8

    if m == 0:
        precision = '%.0f'
    else:
        precision = '%%.%df' % precision

    size = precision % (size / math.pow(base, m))

    return '%s%s %s%s' % (sign, size.strip(), multiples[m], unit)


def _check_memory_usage(worker):
    memory_max_usage = worker._memory_max_usage
    if not memory_max_usage:
        return

    memory_usage_check_interval = worker._memory_usage_check_interval
    memory_usage_recovery_threshold = memory_max_usage * worker._memory_usage_recovery_threshold

    elapsed = time.time() - worker._last_memory_check_time
    if elapsed > memory_usage_check_interval:
        mem_usage = _get_process_rss()
        if mem_usage and mem_usage > memory_max_usage:
            worker.log.info(
                "memory usage %s > %s, forcing gc",
                _format_data_size(mem_usage), _format_data_size(memory_max_usage))
            # Try to clean it up by forcing a full collection.
            gc.collect()
            mem_usage = _get_process_rss()
            if mem_usage > memory_usage_recovery_threshold:
                # Didn't clean up enough, we'll have to terminate.
                worker.log.warning(
                    "memory usage %s > %s after gc, quitting",
                    _format_data_size(mem_usage), _format_data_size(memory_max_usage))
                # This will cause worker to auto-restart itself
                worker.alive = False
        worker._last_memory_check_time = time.time()


def worker_int(worker):
    worker.log.info("[%-10s] worker received INT or QUIT signal", worker.pid)

    # get traceback info, on worker crash
    id2name = dict([(th.ident, th.name) for th in threading.enumerate()])
@@ -110,15 +218,15 @@ def worker_int(worker):


def worker_abort(worker):
    worker.log.info("[%-10s] worker received SIGABRT signal", worker.pid)


def worker_exit(server, worker):
    worker.log.info("[%-10s] worker exit", worker.pid)


def child_exit(server, worker):
    worker.log.info("[%-10s] worker child exit", worker.pid)


def pre_request(worker, req):
@@ -129,9 +237,12 @@ def pre_request(worker, req):

def post_request(worker, req, environ, resp):
    total_time = time.time() - worker.start_time
    # Gunicorn sometimes has problems with reading the status_code
    status_code = getattr(resp, 'status_code', '')
    worker.log.debug(
        "GNCRN POST WORKER [cnt:%s]: %s %s resp: %s, Load Time: %.4fs",
        worker.nr, req.method, req.path, status_code, total_time)
    _check_memory_usage(worker)


class RhodeCodeLogger(Logger):
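The thresholds used by `_check_memory_usage` above are easiest to read with concrete numbers. In the sketch below the 1 GiB cap is an illustrative assumption (the hook reads the real value from `memory_max_usage` in the ini), while 0.8 is the default recovery threshold set in `post_fork`: a worker over the cap first gets a forced `gc.collect()`, and is recycled only if its RSS stays above `memory_max_usage * threshold` afterwards::

    memory_max_usage = 1024 ** 3           # assumed example cap: 1 GiB
    recovery_threshold = 0.8               # default from post_fork() above
    recycle_above = memory_max_usage * recovery_threshold
    print(recycle_above)                   # 858993459.2 bytes, roughly 819 MiB
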
This diff has been collapsed as it changes many lines (912 lines changed).
@@ -1,24 +1,22 @@
## -*- coding: utf-8 -*-

; #########################################
; RHODECODE COMMUNITY EDITION CONFIGURATION
; #########################################

[DEFAULT]
; Debug flag sets all loggers to debug, and enables request tracking
debug = false

; ########################################################################
; EMAIL CONFIGURATION
; These settings will be used by the RhodeCode mailing system
; ########################################################################

; prefix all emails subjects with given prefix, helps filtering out emails
#email_prefix = [RhodeCode]

; email FROM address all mails will be sent
#app_email_from = rhodecode-noreply@localhost

#smtp_server = mail.server.com
@@ -29,168 +27,200 b' debug = false' | |||||
29 | #smtp_use_ssl = true |
|
27 | #smtp_use_ssl = true | |
30 |
|
28 | |||
31 | [server:main] |
|
29 | [server:main] | |
32 | ## COMMON ## |
|
30 | ; COMMON HOST/IP CONFIG | |
33 | host = 127.0.0.1 |
|
31 | host = 127.0.0.1 | |
34 | port = 5000 |
|
32 | port = 5000 | |
35 |
|
33 | |||
36 | ########################################################### |
|
34 | ||
37 | ## WAITRESS WSGI SERVER - Recommended for Development #### |
|
35 | ; ########################### | |
38 | ########################################################### |
|
36 | ; GUNICORN APPLICATION SERVER | |
|
37 | ; ########################### | |||
|
38 | ||||
|
39 | ; run with gunicorn --log-config rhodecode.ini --paste rhodecode.ini | |||
|
40 | ||||
|
41 | ; Module to use, this setting shouldn't be changed | |||
|
42 | use = egg:gunicorn#main | |||
|
43 | ||||
|
44 | ; Sets the number of process workers. More workers means more concurrent connections | |||
|
45 | ; RhodeCode can handle at the same time. Each additional worker also it increases | |||
|
46 | ; memory usage as each has it's own set of caches. | |||
|
47 | ; Recommended value is (2 * NUMBER_OF_CPUS + 1), eg 2CPU = 5 workers, but no more | |||
|
48 | ; than 8-10 unless for really big deployments .e.g 700-1000 users. | |||
|
49 | ; `instance_id = *` must be set in the [app:main] section below (which is the default) | |||
|
50 | ; when using more than 1 worker. | |||
|
51 | workers = 2 | |||
|
52 | ||||
|
53 | ; Gunicorn access log level | |||
|
54 | loglevel = info | |||
|
55 | ||||
|
56 | ; Process name visible in process list | |||
|
57 | proc_name = rhodecode | |||
|
58 | ||||
|
59 | ; Type of worker class, one of `sync`, `gevent` | |||
|
60 | ; Recommended type is `gevent` | |||
|
61 | worker_class = gevent | |||
|
62 | ||||
|
63 | ; The maximum number of simultaneous clients per worker. Valid only for gevent | |||
|
64 | worker_connections = 10 | |||
|
65 | ||||
|
66 | ; Max number of requests that worker will handle before being gracefully restarted. | |||
|
67 | ; Prevents memory leaks, jitter adds variability so not all workers are restarted at once. | |||
|
68 | max_requests = 1000 | |||
|
69 | max_requests_jitter = 30 | |||
39 |
|
70 | |||
40 | #use = egg:waitress#main |
|
71 | ; Amount of time a worker can spend with handling a request before it | |
41 | ## number of worker threads |
|
72 | ; gets killed and restarted. By default set to 21600 (6hrs) | |
42 | #threads = 5 |
|
73 | ; Examples: 1800 (30min), 3600 (1hr), 7200 (2hr), 43200 (12h) | |
43 | ## MAX BODY SIZE 100GB |
|
74 | timeout = 21600 | |
44 | #max_request_body_size = 107374182400 |
|
75 | ||
45 | ## Use poll instead of select, fixes file descriptors limits problems. |
|
76 | ; The maximum size of HTTP request line in bytes. | |
46 | ## May not work on old windows systems. |
|
77 | ; 0 for unlimited | |
47 | #asyncore_use_poll = true |
|
78 | limit_request_line = 0 | |
|
79 | ||||
|
80 | ; Limit the number of HTTP headers fields in a request. | |||
|
81 | ; By default this value is 100 and can't be larger than 32768. | |||
|
82 | limit_request_fields = 32768 | |||
|
83 | ||||
|
84 | ; Limit the allowed size of an HTTP request header field. | |||
|
85 | ; Value is a positive number or 0. | |||
|
86 | ; Setting it to 0 will allow unlimited header field sizes. | |||
|
87 | limit_request_field_size = 0 | |||
|
88 | ||||
|
89 | ; Timeout for graceful workers restart. | |||
|
90 | ; After receiving a restart signal, workers have this much time to finish | |||
|
91 | ; serving requests. Workers still alive after the timeout (starting from the | |||
|
92 | ; receipt of the restart signal) are force killed. | |||
|
93 | ; Examples: 1800 (30min), 3600 (1hr), 7200 (2hr), 43200 (12h) | |||
|
94 | graceful_timeout = 3600 | |||
|
95 | ||||
|
96 | # The number of seconds to wait for requests on a Keep-Alive connection. | |||
|
97 | # Generally set in the 1-5 seconds range. | |||
|
98 | keepalive = 2 | |||
|
99 | ||||
|
100 | ; Maximum memory usage that each worker can use before it will receive a | |||
|
101 | ; graceful restart signal 0 = memory monitoring is disabled | |||
|
102 | ; Examples: 268435456 (256MB), 536870912 (512MB) | |||
|
103 | ; 1073741824 (1GB), 2147483648 (2GB), 4294967296 (4GB) | |||
|
104 | memory_max_usage = 0 | |||
|
105 | ||||
|
106 | ; How often in seconds to check for memory usage for each gunicorn worker | |||
|
107 | memory_usage_check_interval = 60 | |||
|
108 | ||||
|
109 | ; Threshold value for which we don't recycle worker if GarbageCollection | |||
|
110 | ; frees up enough resources. Before each restart we try to run GC on worker | |||
|
111 | ; in case we get enough free memory after that, restart will not happen. | |||
|
112 | memory_usage_recovery_threshold = 0.8 | |||
48 |
|
113 | |||
49 |
|
114 | |||
50 | ########################## |
|
115 | ; Prefix middleware for RhodeCode. | |
51 | ## GUNICORN WSGI SERVER ## |
|
116 | ; recommended when using proxy setup. | |
52 | ########################## |
|
117 | ; allows to set RhodeCode under a prefix in server. | |
53 | ## run with gunicorn --log-config rhodecode.ini --paste rhodecode.ini |
|
118 | ; eg https://server.com/custom_prefix. Enable `filter-with =` option below as well. | |
54 |
|
119 | ; And set your prefix like: `prefix = /custom_prefix` | ||
55 | use = egg:gunicorn#main |
|
120 | ; be sure to also set beaker.session.cookie_path = /custom_prefix if you need | |
56 | ## Sets the number of process workers. More workers means more concurrent connections |
|
121 | ; to make your cookies only work on prefix url | |
57 | ## RhodeCode can handle at the same time. Each additional worker also it increases |
|
|||
58 | ## memory usage as each has it's own set of caches. |
|
|||
59 | ## Recommended value is (2 * NUMBER_OF_CPUS + 1), eg 2CPU = 5 workers, but no more |
|
|||
60 | ## than 8-10 unless for really big deployments .e.g 700-1000 users. |
|
|||
61 | ## `instance_id = *` must be set in the [app:main] section below (which is the default) |
|
|||
62 | ## when using more than 1 worker. |
|
|||
63 | workers = 2 |
|
|||
64 | ## process name visible in process list |
|
|||
65 | proc_name = rhodecode |
|
|||
66 | ## type of worker class, one of sync, gevent |
|
|||
67 | ## recommended for bigger setup is using of of other than sync one |
|
|||
68 | worker_class = gevent |
|
|||
69 | ## The maximum number of simultaneous clients. Valid only for Gevent |
|
|||
70 | worker_connections = 10 |
|
|||
71 | ## max number of requests that worker will handle before being gracefully |
|
|||
72 | ## restarted, could prevent memory leaks |
|
|||
73 | max_requests = 1000 |
|
|||
74 | max_requests_jitter = 30 |
|
|||
75 | ## amount of time a worker can spend with handling a request before it |
|
|||
76 | ## gets killed and restarted. Set to 6hrs |
|
|||
77 | timeout = 21600 |
|
|||
78 |
|
||||
79 |
|
||||
80 | ## prefix middleware for RhodeCode. |
|
|||
81 | ## recommended when using proxy setup. |
|
|||
82 | ## allows to set RhodeCode under a prefix in server. |
|
|||
83 | ## eg https://server.com/custom_prefix. Enable `filter-with =` option below as well. |
|
|||
84 | ## And set your prefix like: `prefix = /custom_prefix` |
|
|||
85 | ## be sure to also set beaker.session.cookie_path = /custom_prefix if you need |
|
|||
86 | ## to make your cookies only work on prefix url |
|
|||
87 | [filter:proxy-prefix] |
|
122 | [filter:proxy-prefix] | |
88 | use = egg:PasteDeploy#prefix |
|
123 | use = egg:PasteDeploy#prefix | |
89 | prefix = / |
|
124 | prefix = / | |
90 |
|
125 | |||
91 | [app:main] |
|
126 | [app:main] | |
92 |
|
|
127 | ; The %(here)s variable will be replaced with the absolute path of parent directory | |
93 |
|
|
128 | ; of this file | |
94 |
|
|
129 | ; In addition ENVIRONMENT variables usage is possible, e.g | |
95 |
|
|
130 | ; sqlalchemy.db1.url = {ENV_RC_DB_URL} | |
96 |
|
131 | |||
97 | use = egg:rhodecode-enterprise-ce |
|
132 | use = egg:rhodecode-enterprise-ce | |
98 |
|
133 | |||
99 |
|
|
134 | ; enable proxy prefix middleware, defined above | |
100 | #filter-with = proxy-prefix |
|
135 | #filter-with = proxy-prefix | |
101 |
|
136 | |||
102 |
|
|
137 | ; encryption key used to encrypt social plugin tokens, | |
103 |
|
|
138 | ; remote_urls with credentials etc, if not set it defaults to | |
104 |
|
|
139 | ; `beaker.session.secret` | |
105 | #rhodecode.encrypted_values.secret = |
|
140 | #rhodecode.encrypted_values.secret = | |
106 |
|
141 | |||
107 |
|
|
142 | ; decryption strict mode (enabled by default). It controls if decryption raises | |
108 |
|
|
143 | ; `SignatureVerificationError` in case of wrong key, or damaged encryption data. | |
109 | #rhodecode.encrypted_values.strict = false |
|
144 | #rhodecode.encrypted_values.strict = false | |
110 |
|
145 | |||
111 |
|
|
146 | ; Pick algorithm for encryption. Either fernet (more secure) or aes (default) | |
112 |
|
|
147 | ; fernet is safer, and we strongly recommend switching to it. | |
113 |
|
|
148 | ; Due to backward compatibility aes is used as default. | |
114 | #rhodecode.encrypted_values.algorithm = fernet |
|
149 | #rhodecode.encrypted_values.algorithm = fernet | |
115 |
|
150 | |||
116 |
|
|
151 | ; Return gzipped responses from RhodeCode (static files/application) | |
117 | gzip_responses = false |
|
152 | gzip_responses = false | |
118 |
|
153 | |||
119 |
|
|
154 | ; Auto-generate javascript routes file on startup | |
120 | generate_js_files = false |
|
155 | generate_js_files = false | |
121 |
|
156 | |||
122 |
|
|
157 | ; System global default language. | |
123 |
|
|
158 | ; All available languages: en (default), be, de, es, fr, it, ja, pl, pt, ru, zh | |
124 | lang = en |
|
159 | lang = en | |
125 |
|
160 | |||
126 |
|
|
161 | ; Perform a full repository scan and import on each server start. | |
127 |
|
|
162 | ; Settings this to true could lead to very long startup time. | |
128 | startup.import_repos = false |
|
163 | startup.import_repos = false | |
129 |
|
164 | |||
130 |
|
|
165 | ; Uncomment and set this path to use archive download cache. | |
131 |
|
|
166 | ; Once enabled, generated archives will be cached at this location | |
132 |
|
|
167 | ; and served from the cache during subsequent requests for the same archive of | |
133 |
|
|
168 | ; the repository. | |
134 | #archive_cache_dir = /tmp/tarballcache |
|
169 | #archive_cache_dir = /tmp/tarballcache | |
135 |
|
170 | |||
136 |
|
|
171 | ; URL at which the application is running. This is used for Bootstrapping | |
137 |
|
|
172 | ; requests in context when no web request is available. Used in ishell, or | |
138 |
|
|
173 | ; SSH calls. Set this for events to receive proper url for SSH calls. | |
139 | app.base_url = http://rhodecode.local |
|
174 | app.base_url = http://rhodecode.local | |
140 |
|
175 | |||
141 |
|
|
176 | ; Unique application ID. Should be a random unique string for security. | |
142 | app_instance_uuid = rc-production |
|
177 | app_instance_uuid = rc-production | |
143 |
|
178 | |||
144 |
|
|
179 | ; Cut off limit for large diffs (size in bytes). If overall diff size on | |
145 |
|
|
180 | ; commit, or pull request exceeds this limit this diff will be displayed | |
146 |
|
|
181 | ; partially. E.g 512000 == 512Kb | |
147 | cut_off_limit_diff = 512000 |
|
182 | cut_off_limit_diff = 512000 | |
148 |
|
183 | |||
149 |
|
|
184 | ; Cut off limit for large files inside diffs (size in bytes). Each individual | |
150 |
|
|
185 | ; file inside diff which exceeds this limit will be displayed partially. | |
151 |
|
|
186 | ; E.g 128000 == 128Kb | |
152 | cut_off_limit_file = 128000 |
|
187 | cut_off_limit_file = 128000 | |
153 |
|
188 | |||
154 |
|
|
189 | ; Use cached version of vcs repositories everywhere. Recommended to be `true` | |
155 | vcs_full_cache = true |
|
190 | vcs_full_cache = true | |
156 |
|
191 | |||
157 |
|
|
192 | ; Force https in RhodeCode, fixes https redirects, assumes it's always https. | |
158 |
|
|
193 | ; Normally this is controlled by proper flags sent from http server such as Nginx or Apache | |
159 | force_https = false |
|
194 | force_https = false | |
160 |
|
195 | |||
161 |
|
|
196 | ; use Strict-Transport-Security headers | |
162 | use_htsts = false |
|
197 | use_htsts = false | |
163 |
|
198 | |||
164 | ## git rev filter option, --all is the default filter, if you need to |
|
199 | ; Set to true if your repos are exposed using the dumb protocol | |
165 | ## hide all refs in changelog switch this to --branches --tags |
|
|||
166 | git_rev_filter = --branches --tags |
|
|||
167 |
|
||||
168 | # Set to true if your repos are exposed using the dumb protocol |
|
|||
169 | git_update_server_info = false |
|
200 | git_update_server_info = false | |
170 |
|
201 | |||
171 |
|
|
202 | ; RSS/ATOM feed options | |
172 | rss_cut_off_limit = 256000 |
|
203 | rss_cut_off_limit = 256000 | |
173 | rss_items_per_page = 10 |
|
204 | rss_items_per_page = 10 | |
174 | rss_include_diff = false |
|
205 | rss_include_diff = false | |
175 |
|
206 | |||
176 |
|
|
207 | ; gist URL alias, used to create nicer urls for gist. This should be an | |
177 |
|
|
208 | ; url that does rewrites to _admin/gists/{gistid}. | |
178 |
|
|
209 | ; example: http://gist.rhodecode.org/{gistid}. Empty means use the internal | |
179 |
|
|
210 | ; RhodeCode url, ie. http[s]://rhodecode.server/_admin/gists/{gistid} | |
180 | gist_alias_url = |
|
211 | gist_alias_url = | |
181 |
|
212 | |||
182 |
|
|
213 | ; List of views (using glob pattern syntax) that AUTH TOKENS could be | |
183 |
|
|
214 | ; used for access. | |
184 |
|
|
215 | ; Adding ?auth_token=TOKEN_HASH to the url authenticates this request as if it | |
185 |
|
|
216 | ; came from the the logged in user who own this authentication token. | |
186 |
|
|
217 | ; Additionally @TOKEN syntax can be used to bound the view to specific | |
187 |
|
|
218 | ; authentication token. Such view would be only accessible when used together | |
188 |
|
|
219 | ; with this authentication token | |
189 | ## |
|
220 | ; list of all views can be found under `/_admin/permissions/auth_token_access` | |
190 | ## list of all views can be found under `/_admin/permissions/auth_token_access` |
|
221 | ; The list should be "," separated and on a single line. | |
191 | ## The list should be "," separated and on a single line. |
|
222 | ; Most common views to enable: | |
192 | ## |
|
223 | ||
193 | ## Most common views to enable: |
|
|||
194 | # RepoCommitsView:repo_commit_download |
|
224 | # RepoCommitsView:repo_commit_download | |
195 | # RepoCommitsView:repo_commit_patch |
|
225 | # RepoCommitsView:repo_commit_patch | |
196 | # RepoCommitsView:repo_commit_raw |
|
226 | # RepoCommitsView:repo_commit_raw | |
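The `(2 * NUMBER_OF_CPUS + 1)` guideline from the `[server:main]` comments above can be computed when provisioning a host. A small sketch; the 8-10 cap mirrors the comment in the config and is an assumption here, it is not enforced by Gunicorn itself:

.. code-block:: python

    import multiprocessing

    def suggested_workers(upper_cap=10):
        # Worker count suggested by the config comment: 2 * CPUs + 1,
        # but no more than ~8-10 except for very large deployments.
        cpus = multiprocessing.cpu_count()
        return min(2 * cpus + 1, upper_cap)

    print(suggested_workers())  # e.g. 5 on a 2-CPU machine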
@@ -201,164 +231,194 @@ gist_alias_url =
# GistView:*
api_access_controllers_whitelist =

+; Default encoding used to convert from and to unicode
+; can be also a comma separated list of encoding in case of mixed encodings
default_encoding = UTF-8

+; instance-id prefix
+; a prefix key for this instance used for cache invalidation when running
+; multiple instances of RhodeCode, make sure it's globally unique for
+; all running RhodeCode instances. Leave empty if you don't use it
instance_id =

+; Fallback authentication plugin. Set this to a plugin ID to force the usage
+; of an authentication plugin also if it is disabled by it's settings.
+; This could be useful if you are unable to log in to the system due to broken
+; authentication settings. Then you can enable e.g. the internal RhodeCode auth
+; module to log in again and fix the settings.
+; Available builtin plugin IDs (hash is part of the ID):
+; egg:rhodecode-enterprise-ce#rhodecode
+; egg:rhodecode-enterprise-ce#pam
+; egg:rhodecode-enterprise-ce#ldap
+; egg:rhodecode-enterprise-ce#jasig_cas
+; egg:rhodecode-enterprise-ce#headers
+; egg:rhodecode-enterprise-ce#crowd

#rhodecode.auth_plugin_fallback = egg:rhodecode-enterprise-ce#rhodecode

+; Flag to control loading of legacy plugins in py:/path format
+auth_plugin.import_legacy_plugins = true
+
+; alternative return HTTP header for failed authentication. Default HTTP
+; response is 401 HTTPUnauthorized. Currently HG clients have troubles with
+; handling that causing a series of failed authentication calls.
+; Set this variable to 403 to return HTTPForbidden, or any other HTTP code
+; This will be served instead of default 401 on bad authentication
auth_ret_code =

+; use special detection method when serving auth_ret_code, instead of serving
+; ret_code directly, use 401 initially (Which triggers credentials prompt)
+; and then serve auth_ret_code to clients
auth_ret_code_detection = false

+; locking return code. When repository is locked return this HTTP code. 2XX
+; codes don't break the transactions while 4XX codes do
lock_ret_code = 423

+; allows to change the repository location in settings page
allow_repo_location_change = true

+; allows to setup custom hooks in settings page
allow_custom_hooks_settings = true

+; Generated license token required for EE edition license.
+; New generated token value can be found in Admin > settings > license page.
license_token =

+; This flag hides sensitive information on the license page such as token, and license data
+license.hide_license_info = false
+
+; supervisor connection uri, for managing supervisor and logs.
supervisor.uri =

+; supervisord group name/id we only want this RC instance to handle
supervisor.group_id = prod

+; Display extended labs settings
labs_settings_active = true

+; Custom exception store path, defaults to TMPDIR
+; This is used to store exception from RhodeCode in shared directory
#exception_tracker.store_path =

+; File store configuration. This is used to store and serve uploaded files
file_store.enabled = true

+; Storage backend, available options are: local
file_store.backend = local

+; path to store the uploaded binaries
file_store.storage_path = %(here)s/data/file_store


+; #############
+; CELERY CONFIG
+; #############
+
+; manually run celery: /path/to/celery worker -E --beat --app rhodecode.lib.celerylib.loader --scheduler rhodecode.lib.celerylib.scheduler.RcScheduler --loglevel DEBUG --ini /path/to/rhodecode.ini

use_celery = false

+; connection url to the message broker (default redis)
+celery.broker_url = redis://localhost:6379/8
+
+; rabbitmq example
+#celery.broker_url = amqp://rabbitmq:qweqwe@localhost:5672/rabbitmqhost
+
+; maximum tasks to execute before worker restart
celery.max_tasks_per_child = 100

+; tasks will never be sent to the queue, but executed locally instead.
celery.task_always_eager = false

+; #############
+; DOGPILE CACHE
+; #############
+
+; Default cache dir for caches. Putting this into a ramdisk can boost performance.
+; eg. /tmpfs/data_ramdisk, however this directory might require large amount of space
cache_dir = %(here)s/data

+; *********************************************
+; `sql_cache_short` cache for heavy SQL queries
+; Only supported backend is `memory_lru`
+; *********************************************
+rc_cache.sql_cache_short.backend = dogpile.cache.rc.memory_lru
+rc_cache.sql_cache_short.expiration_time = 30
+
+
+; *****************************************************
+; `cache_repo_longterm` cache for repo object instances
+; Only supported backend is `memory_lru`
+; *****************************************************
+rc_cache.cache_repo_longterm.backend = dogpile.cache.rc.memory_lru
+; by default we use 30 Days, cache is still invalidated on push
+rc_cache.cache_repo_longterm.expiration_time = 2592000
+; max items in LRU cache, set to smaller number to save memory, and expire last used caches
+rc_cache.cache_repo_longterm.max_size = 10000
+
+
+; *************************************************
+; `cache_perms` cache for permission tree, auth TTL
+; *************************************************
rc_cache.cache_perms.backend = dogpile.cache.rc.file_namespace
rc_cache.cache_perms.expiration_time = 300
+; file cache store path. Defaults to `cache_dir =` value or tempdir if both values are not set
+#rc_cache.cache_perms.arguments.filename = /tmp/cache_perms.db

+; alternative `cache_perms` redis backend with distributed lock
#rc_cache.cache_perms.backend = dogpile.cache.rc.redis
#rc_cache.cache_perms.expiration_time = 300

+; redis_expiration_time needs to be greater then expiration_time
#rc_cache.cache_perms.arguments.redis_expiration_time = 7200

#rc_cache.cache_perms.arguments.host = localhost
#rc_cache.cache_perms.arguments.port = 6379
#rc_cache.cache_perms.arguments.db = 0
+#rc_cache.cache_perms.arguments.socket_timeout = 30
+; more Redis options: https://dogpilecache.sqlalchemy.org/en/latest/api.html#redis-backends
#rc_cache.cache_perms.arguments.distributed_lock = true


+; ***************************************************
+; `cache_repo` cache for file tree, Readme, RSS FEEDS
+; ***************************************************
rc_cache.cache_repo.backend = dogpile.cache.rc.file_namespace
rc_cache.cache_repo.expiration_time = 2592000
+; file cache store path. Defaults to `cache_dir =` value or tempdir if both values are not set
+#rc_cache.cache_repo.arguments.filename = /tmp/cache_repo.db

+; alternative `cache_repo` redis backend with distributed lock
#rc_cache.cache_repo.backend = dogpile.cache.rc.redis
#rc_cache.cache_repo.expiration_time = 2592000

+; redis_expiration_time needs to be greater then expiration_time
#rc_cache.cache_repo.arguments.redis_expiration_time = 2678400

#rc_cache.cache_repo.arguments.host = localhost
#rc_cache.cache_repo.arguments.port = 6379
#rc_cache.cache_repo.arguments.db = 1
+#rc_cache.cache_repo.arguments.socket_timeout = 30
+; more Redis options: https://dogpilecache.sqlalchemy.org/en/latest/api.html#redis-backends
#rc_cache.cache_repo.arguments.distributed_lock = true

-## cache settings for SQL queries, this needs to use memory type backend
-rc_cache.sql_cache_short.backend = dogpile.cache.rc.memory_lru
-rc_cache.sql_cache_short.expiration_time = 30
-
-## `cache_repo_longterm` cache for repo object instances, this needs to use memory
-## type backend as the objects kept are not pickle serializable
-rc_cache.cache_repo_longterm.backend = dogpile.cache.rc.memory_lru
-## by default we use 96H, this is using invalidation on push anyway
-rc_cache.cache_repo_longterm.expiration_time = 345600
-## max items in LRU cache, reduce this number to save memory, and expire last used
-## cached objects
-rc_cache.cache_repo_longterm.max_size = 10000

+; ##############
+; BEAKER SESSION
+; ##############

+; beaker.session.type is type of storage options for the logged users sessions. Current allowed
+; types are file, ext:redis, ext:database, ext:memcached, and memory (default if not specified).
+; Fastest ones are Redis and ext:database
beaker.session.type = file
beaker.session.data_dir = %(here)s/data/sessions

+; Redis based sessions
#beaker.session.type = ext:redis
#beaker.session.url = redis://127.0.0.1:6379/2

+; DB based session, fast, and allows easy management over logged in users
#beaker.session.type = ext:database
#beaker.session.table_name = db_session
#beaker.session.sa.url = postgresql://postgres:secret@localhost/rhodecode
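The Redis-backed `rc_cache.*` options above map onto dogpile.cache region arguments. The `dogpile.cache.rc.*` backends are RhodeCode extensions, so the sketch below uses the stock `dogpile.cache.redis` backend purely to illustrate how the same arguments (host, port, db, `redis_expiration_time`, `distributed_lock`) fit together; it is not RhodeCode's own wiring:

.. code-block:: python

    from dogpile.cache import make_region

    # Illustrative region using the stock Redis backend of dogpile.cache.
    region = make_region().configure(
        'dogpile.cache.redis',
        expiration_time=300,
        arguments={
            'host': 'localhost',
            'port': 6379,
            'db': 0,
            'redis_expiration_time': 7200,  # must be greater than expiration_time
            'distributed_lock': True,
            'socket_timeout': 30,
        },
    )

    @region.cache_on_arguments()
    def expensive_lookup(key):
        ...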
@@ -370,265 +430,275 @@ beaker.session.key = rhodecode
beaker.session.secret = production-rc-uytcxaz
beaker.session.lock_dir = %(here)s/data/sessions/lock

+; Secure encrypted cookie. Requires AES and AES python libraries
+; you must disable beaker.session.secret to use this
#beaker.session.encrypt_key = key_for_encryption
#beaker.session.validate_key = validation_key

+; Sets session as invalid (also logging out user) if it haven not been
+; accessed for given amount of time in seconds
beaker.session.timeout = 2592000
beaker.session.httponly = true

+; Path to use for the cookie. Set to prefix if you use prefix middleware
#beaker.session.cookie_path = /custom_prefix

+; Set https secure cookie
beaker.session.secure = false

-## auto save the session to not to use .save()
-beaker.session.auto = false
-
+; default cookie expiration time in seconds, set to `true` to set expire
+; at browser close
#beaker.session.cookie_expires = 3600

+; #############################
+; SEARCH INDEXING CONFIGURATION
+; #############################

+; Full text search indexer is available in rhodecode-tools under
+; `rhodecode-tools index` command
+
+; WHOOSH Backend, doesn't require additional services to run
+; it works good with few dozen repos
search.module = rhodecode.lib.index.whoosh
search.location = %(here)s/data/index

+; ####################
+; CHANNELSTREAM CONFIG
+; ####################
+
+; channelstream enables persistent connections and live notification
+; in the system. It's also used by the chat system

channelstream.enabled = false

+; server address for channelstream server on the backend
channelstream.server = 127.0.0.1:9800

+; location of the channelstream server from outside world
+; use ws:// for http or wss:// for https. This address needs to be handled
+; by external HTTP server such as Nginx or Apache
+; see Nginx/Apache configuration examples in our docs
channelstream.ws_url = ws://rhodecode.yourserver.com/_channelstream
channelstream.secret = secret
channelstream.history.location = %(here)s/channelstream_history

+; Internal application path that Javascript uses to connect into.
+; If you use proxy-prefix the prefix should be added before /_channelstream
channelstream.proxy_path = /_channelstream


+; ##############################
+; MAIN RHODECODE DATABASE CONFIG
+; ##############################
+
+#sqlalchemy.db1.url = sqlite:///%(here)s/rhodecode.db?timeout=30
+#sqlalchemy.db1.url = postgresql://postgres:qweqwe@localhost/rhodecode
+#sqlalchemy.db1.url = mysql://root:qweqwe@localhost/rhodecode?charset=utf8
+; pymysql is an alternative driver for MySQL, use in case of problems with default one
+#sqlalchemy.db1.url = mysql+pymysql://root:qweqwe@localhost/rhodecode
+
+sqlalchemy.db1.url = postgresql://postgres:qweqwe@localhost/rhodecode
+
+; see sqlalchemy docs for other advanced settings
+; print the sql statements to output
+sqlalchemy.db1.echo = false
+
+; recycle the connections after this amount of seconds
+sqlalchemy.db1.pool_recycle = 3600
+sqlalchemy.db1.convert_unicode = true
+
+; the number of connections to keep open inside the connection pool.
+; 0 indicates no limit
+#sqlalchemy.db1.pool_size = 5
+
+; The number of connections to allow in connection pool "overflow", that is
+; connections that can be opened above and beyond the pool_size setting,
+; which defaults to five.
+#sqlalchemy.db1.max_overflow = 10
+
+; Connection check ping, used to detect broken database connections
+; could be enabled to better handle cases if MySQL has gone away errors
+#sqlalchemy.db1.ping_connection = true
+
+; ##########
+; VCS CONFIG
+; ##########
+vcs.server.enable = true
+vcs.server = localhost:9900
+
+; Web server connectivity protocol, responsible for web based VCS operations
+; Available protocols are:
+; `http` - use http-rpc backend (default)
+vcs.server.protocol = http
+
+; Push/Pull operations protocol, available options are:
+; `http` - use http-rpc backend (default)
+vcs.scm_app_implementation = http
+
+; Push/Pull operations hooks protocol, available options are:
+; `http` - use http-rpc backend (default)
+vcs.hooks.protocol = http
+
+; Host on which this instance is listening for hooks. If vcsserver is in other location
+; this should be adjusted.
+vcs.hooks.host = 127.0.0.1
+
+; Start VCSServer with this instance as a subprocess, useful for development
+vcs.start_server = false
+
+; List of enabled VCS backends, available options are:
+; `hg` - mercurial
+; `git` - git
+; `svn` - subversion
+vcs.backends = hg, git, svn
+
+; Wait this number of seconds before killing connection to the vcsserver
+vcs.connection_timeout = 3600
+
+; Compatibility version when creating SVN repositories. Defaults to newest version when commented out.
+; Available options are: pre-1.4-compatible, pre-1.5-compatible, pre-1.6-compatible, pre-1.8-compatible, pre-1.9-compatible
+#vcs.svn.compatible_version = pre-1.8-compatible


+; ####################################################
+; Subversion proxy support (mod_dav_svn)
+; Maps RhodeCode repo groups into SVN paths for Apache
+; ####################################################
+
+; Enable or disable the config file generation.
+svn.proxy.generate_config = false
+
+; Generate config file with `SVNListParentPath` set to `On`.
+svn.proxy.list_parent_path = true
+
+; Set location and file name of generated config file.
+svn.proxy.config_file_path = %(here)s/mod_dav_svn.conf
+
+; alternative mod_dav config template. This needs to be a valid mako template
+; Example template can be found in the source code:
+; rhodecode/apps/svn_support/templates/mod-dav-svn.conf.mako
+#svn.proxy.config_template = ~/.rccontrol/enterprise-1/custom_svn_conf.mako
+
+; Used as a prefix to the `Location` block in the generated config file.
+; In most cases it should be set to `/`.
+svn.proxy.location_root = /
+
+; Command to reload the mod dav svn configuration on change.
+; Example: `/etc/init.d/apache2 reload` or /home/USER/apache_reload.sh
+; Make sure user who runs RhodeCode process is allowed to reload Apache
+#svn.proxy.reload_cmd = /etc/init.d/apache2 reload
+
+; If the timeout expires before the reload command finishes, the command will
+; be killed. Setting it to zero means no timeout. Defaults to 10 seconds.
+#svn.proxy.reload_timeout = 10
+
+; ####################
+; SSH Support Settings
+; ####################

+; Defines if a custom authorized_keys file should be created and written on
+; any change user ssh keys. Setting this to false also disables possibility
+; of adding SSH keys by users from web interface. Super admins can still
+; manage SSH Keys.
+ssh.generate_authorized_keyfile = false
+
+; Options for ssh, default is `no-pty,no-port-forwarding,no-X11-forwarding,no-agent-forwarding`
+# ssh.authorized_keys_ssh_opts =
+
+; Path to the authorized_keys file where the generate entries are placed.
+; It is possible to have multiple key files specified in `sshd_config` e.g.
+; AuthorizedKeysFile %h/.ssh/authorized_keys %h/.ssh/authorized_keys_rhodecode
+ssh.authorized_keys_file_path = ~/.ssh/authorized_keys_rhodecode
+
+; Command to execute the SSH wrapper. The binary is available in the
+; RhodeCode installation directory.
+; e.g ~/.rccontrol/community-1/profile/bin/rc-ssh-wrapper
+ssh.wrapper_cmd = ~/.rccontrol/community-1/rc-ssh-wrapper
+
+; Allow shell when executing the ssh-wrapper command
+ssh.wrapper_cmd_allow_shell = false
+
+; Enables logging, and detailed output send back to the client during SSH
+; operations. Useful for debugging, shouldn't be used in production.
+ssh.enable_debug_logging = false
+
+; Paths to binary executable, by default they are the names, but we can
+; override them if we want to use a custom one
+ssh.executable.hg = ~/.rccontrol/vcsserver-1/profile/bin/hg
+ssh.executable.git = ~/.rccontrol/vcsserver-1/profile/bin/git
+ssh.executable.svn = ~/.rccontrol/vcsserver-1/profile/bin/svnserve
+
+; Enables SSH key generator web interface. Disabling this still allows users
+; to add their own keys.
+ssh.enable_ui_key_generator = true
+
+
+; #################
+; APPENLIGHT CONFIG
+; #################
+
+; Appenlight is tailored to work with RhodeCode, see
+; http://appenlight.rhodecode.com for details how to obtain an account
+
+; Appenlight integration enabled
appenlight = false

appenlight.server_url = https://api.appenlight.com
appenlight.api_key = YOUR_API_KEY
#appenlight.transport_config = https://api.appenlight.com?threaded=1&timeout=5

+; used for JS client
appenlight.api_public_key = YOUR_API_PUBLIC_KEY

+; TWEAK AMOUNT OF INFO SENT HERE

+; enables 404 error logging (default False)
appenlight.report_404 = false

+; time in seconds after request is considered being slow (default 1)
appenlight.slow_request_time = 1

+; record slow requests in application
+; (needs to be enabled for slow datastore recording and time tracking)
appenlight.slow_requests = true

+; enable hooking to application loggers
appenlight.logging = true

+; minimum log level for log capture
appenlight.logging.level = WARNING

+; send logs only from erroneous/slow requests
+; (saves API quota for intensive logging)
appenlight.logging_on_error = false

+; list of additional keywords that should be grabbed from environ object
+; can be string with comma separated list of words in lowercase
+; (by default client will always send following info:
+; 'REMOTE_USER', 'REMOTE_ADDR', 'SERVER_NAME', 'CONTENT_TYPE' + all keys that
+; start with HTTP* this list be extended with additional keywords here
appenlight.environ_keys_whitelist =

+; list of keywords that should be blanked from request object
+; can be string with comma separated list of words in lowercase
+; (by default client will always blank keys that contain following words
+; 'password', 'passwd', 'pwd', 'auth_tkt', 'secret', 'csrf'
+; this list be extended with additional keywords set here
appenlight.request_keys_blacklist =

+; list of namespaces that should be ignores when gathering log entries
+; can be string with comma separated list of namespaces
+; (by default the client ignores own entries: appenlight_client.client)
appenlight.log_namespace_blacklist =

+; Dummy marker to add new entries after.
+; Add any custom entries below. Please don't remove this marker.
custom.conf = 1


+; #####################
+; LOGGING CONFIGURATION
+; #####################
[loggers]
keys = root, sqlalchemy, beaker, celery, rhodecode, ssh_wrapper

@@ -638,9 +708,9 @@ keys = console, console_sql
[formatters]
keys = generic, color_formatter, color_formatter_sql

+; #######
+; LOGGERS
+; #######
[logger_root]
level = NOTSET
handlers = console

@@ -675,9 +745,9 @@ handlers =
qualname = celery


+; ########
+; HANDLERS
+; ########

[handler_console]
class = StreamHandler

@@ -686,17 +756,17 @@ level = INFO
formatter = generic

[handler_console_sql]
+; "level = DEBUG" logs SQL queries and results.
+; "level = INFO" logs SQL queries.
+; "level = WARN" logs neither. (Recommended for production systems.)
class = StreamHandler
args = (sys.stderr, )
level = WARN
formatter = generic

+; ##########
+; FORMATTERS
+; ##########

[formatter_generic]
class = rhodecode.lib.logging_formatter.ExceptionAwareFormatter
@@ -190,7 +190,7 b' let'
 190|190 | postInstall = ''
 191|191 |   # check required files
 192|192 |   STATIC_CHECK="/robots.txt /502.html
-193 |   /js/scripts.js /js/rhodecode-components.js
+193 |   /js/scripts.min.js /js/rhodecode-components.js
 194|194 |   /css/style.css /css/style-polymer.css /css/style-ipython.css"
 195|195 |
 196|196 | for file in $STATIC_CHECK;
@@ -12,13 +12,50 b' Here is how to force delete a repository'
 12|12 |
 13|13 |
 14|14 | .. code-block:: bash
+15 |    :dedent: 1
 15|16 |
 16|17 |    # starts the ishell interactive prompt
 17|18 |    $ rccontrol ishell enterprise-1
 18|19 |
 19|20 | .. code-block:: python
+21 |    :dedent: 1
 20|22 |
 21|23 |    In [4]: from rhodecode.model.repo import RepoModel
 22|24 |    In [3]: repo = Repository.get_by_repo_name('test_repos/repo_with_prs')
 23|25 |    In [5]: RepoModel().delete(repo, forks='detach', pull_requests='delete')
 24|26 |    In [6]: Session().commit()
+27 |
+28 |
+29 | Below is a fully automated example to force delete repositories read from a
+30 | file where each line is a repository name. This can be executed via a simple CLI
+31 | command, without entering the interactive shell.
+32 |
+33 | Save the below content as a file named `repo_delete_task.py`
+34 |
+35 |
+36 | .. code-block:: python
+37 |     :dedent: 1
+38 |
+39 |     from rhodecode.model.db import *
+40 |     from rhodecode.model.repo import RepoModel
+41 |     with open('delete_repos.txt', 'rb') as f:
+42 |         # read all lines from file
+43 |         repos = f.readlines()
+44 |     for repo_name in repos:
+45 |         repo_name = repo_name.strip()  # cleanup the name just in case
+46 |         repo = Repository.get_by_repo_name(repo_name)
+47 |         if not repo:
+48 |             raise Exception('Repo with name {} not found'.format(repo_name))
+49 |         RepoModel().delete(repo, forks='detach', pull_requests='delete')
+50 |         Session().commit()
+51 |         print('Removed repository {}'.format(repo_name))
+52 |
+53 |
+54 | The code above will read the names of repositories from a file called `delete_repos.txt`.
+55 | Each line should represent a single name, e.g `repo_name_1` or `repo_group/repo_name_2`.
+56 |
+57 | Run this line from CLI to execute the code from the `repo_delete_task.py` file and
+58 | exit the ishell after the execution::
+59 |
+60 |     echo "%run repo_delete_task.py" | rccontrol ishell Enterprise-1
+61 |
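
A cautious way to use the same ishell API is to do a dry run first. The sketch below is
illustrative only (it is not part of the changeset above); it reuses the same `Repository`,
`RepoModel` and `Session` objects that the ishell snippet already imports, and only reports
what would be deleted until `DRY_RUN` is flipped to ``False``.

.. code-block:: python

    # repo_delete_dry_run.py - hypothetical helper, run the same way as above:
    #   echo "%run repo_delete_dry_run.py" | rccontrol ishell enterprise-1
    from rhodecode.model.db import *
    from rhodecode.model.repo import RepoModel

    DRY_RUN = True  # flip to False to actually delete

    with open('delete_repos.txt') as f:
        repo_names = [line.strip() for line in f if line.strip()]

    # refuse to continue if any name in the list does not resolve to a repository
    missing = [name for name in repo_names if not Repository.get_by_repo_name(name)]
    if missing:
        raise Exception('Repositories not found: {}'.format(', '.join(missing)))

    for name in repo_names:
        repo = Repository.get_by_repo_name(name)
        if DRY_RUN:
            print('Would delete {} (forks detached, pull requests removed)'.format(name))
            continue
        RepoModel().delete(repo, forks='detach', pull_requests='delete')
        Session().commit()
        print('Removed repository {}'.format(name))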
@@ -110,6 +110,7 b' Use the following example to configure N'
 110|110 |   # gzip_types text/css text/javascript text/xml text/plain text/x-component application/javascript application/json application/xml application/rss+xml font/truetype font/opentype application/vnd.ms-fontobject image/svg+xml;
 111|111 |   # gzip_vary on;
 112|112 |   # gzip_disable "msie6";
+113 |   # expires 60d;
 113|114 |   # alias /path/to/.rccontrol/community-1/static;
 114|115 |   # alias /path/to/.rccontrol/enterprise-1/static;
 115|116 |   # }
@@ -9,7 +9,8 b' may find some of the following methods u'
 9|9   | .. toctree::
 10|10 |
 11|11 |    tuning/tuning-gunicorn
-12 |    tuning/tuning-vcs-memory-…
+12 |    tuning/tuning-vcs-server-memory-usage
+13 |    tuning/tuning-rhodecode-memory-usage
 13|14 |    tuning/tuning-user-sessions-performance
 14|15 |    tuning/tuning-increase-db-performance
 15|16 |    tuning/tuning-scale-horizontally-cluster
@@ -25,26 +25,22 b' 2. In the ``[server:main]`` section, cha'
 25|25 |
 26|26 | .. code-block:: ini
 27|27 |
-28 |     use = egg:gunicorn#main
-29 |     ## Sets the number of process workers. You must set `instance_id = *`
-30 |     ## when this option is set to more than one worker, recommended
-32 |     ## The `instance_id = *` must be set in the [app:main] section below
-33 |     workers = 4
-34 |     ## process name
-35 |     proc_name = rhodecode
-36 |     ## type of worker class, one of sync, gevent
-37 |     ## recommended for bigger setup is using of of other than sync one
-38 |     worker_class = sync
-39 |     ## The maximum number of simultaneous clients. Valid only for Gevent
-40 |     #worker_connections = 10
-41 |     ## max number of requests that worker will handle before being gracefully
-42 |     ## restarted, could prevent memory leaks
-43 |     max_requests = 1000
-44 |     max_requests_jitter = 30
-45 |     ## amount of time a worker can spend with handling a request before it
-46 |     ## gets killed and restarted. Set to 6hrs
-47 |     timeout = 21600
+28 |     ; Sets the number of process workers. More workers means more concurrent connections
+29 |     ; RhodeCode can handle at the same time. Each additional worker also increases
+30 |     ; memory usage, as each has its own set of caches.
+31 |     ; Recommended value is (2 * NUMBER_OF_CPUS + 1), eg 2CPU = 5 workers, but no more
+32 |     ; than 8-10 unless for really big deployments e.g. 700-1000 users.
+33 |     ; `instance_id = *` must be set in the [app:main] section below (which is the default)
+34 |     ; when using more than 1 worker.
+35 |     workers = 6
+36 |
+37 |     ; Type of worker class, one of `sync`, `gevent`
+38 |     ; Use `gevent` for rhodecode
+39 |     worker_class = gevent
+40 |
+41 |     ; The maximum number of simultaneous clients per worker. Valid only for gevent
+42 |     worker_connections = 10
+43 |
 48|44 |
 49|45 | 3. In the ``[app:main]`` section, set the ``instance_id`` property to ``*``.
 50|46 |
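
The worker formula quoted in the new comments is easy to compute up front. The helper below
is purely illustrative (it is not shipped with RhodeCode) and just applies the
(2 * NUMBER_OF_CPUS + 1) rule of thumb with the suggested cap:

.. code-block:: python

    # Illustrative helper: suggest a gunicorn `workers` value for this machine,
    # following the (2 * NUMBER_OF_CPUS + 1) rule of thumb, capped at 10 workers.
    import multiprocessing

    def suggested_workers(cap=10):
        cpus = multiprocessing.cpu_count()
        return min(2 * cpus + 1, cap)

    if __name__ == '__main__':
        print('workers = {}'.format(suggested_workers()))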
@@ -63,24 +59,19 b' 5. In the ``[server:main]`` section, inc'
 63|59 |
 64|60 | .. code-block:: ini
 65|61 |
-66 |     ## run with gunicorn --log-config vcsserver.ini --paste vcsserver.ini
-67 |     use = egg:gunicorn#main
-68 |     ## Sets the number of process workers. Recommended
-70 |     workers = 4
-71 |     ## process name
-72 |     proc_name = rhodecode_vcsserver
-73 |     ## type of worker class, currently `sync` is the only option allowed.
+62 |     ; Sets the number of process workers. More workers means more concurrent connections
+63 |     ; RhodeCode can handle at the same time. Each additional worker also increases
+64 |     ; memory usage, as each has its own set of caches.
+65 |     ; Recommended value is (2 * NUMBER_OF_CPUS + 1), eg 2CPU = 5 workers, but no more
+66 |     ; than 8-10 unless for really big deployments e.g. 700-1000 users.
+67 |     ; `instance_id = *` must be set in the [app:main] section below (which is the default)
+68 |     ; when using more than 1 worker.
+69 |     workers = 8
+70 |
+71 |     ; Type of worker class, one of `sync`, `gevent`
+72 |     ; Use `sync` for vcsserver
 74|73 |     worker_class = sync
-75 |     ## The maximum number of simultaneous clients. Valid only for Gevent
-76 |     #worker_connections = 10
-77 |     ## max number of requests that worker will handle before being gracefully
-78 |     ## restarted, could prevent memory leaks
-79 |     max_requests = 1000
-80 |     max_requests_jitter = 30
-81 |     ## amount of time a worker can spend with handling a request before it
-82 |     ## gets killed and restarted. Set to 6hrs
-83 |     timeout = 21600
+74 |
 84|75 |
 85|76 | 6. Save your changes.
 86|77 | 7. Restart your |RCE| instances, using the following command:
@@ -109,17 +100,18 b' 2. In the ``[server:main]`` section, cha'
 109|100 |
 110|101 | .. code-block:: ini
 111|102 |
+103 |     ; Type of worker class, one of `sync`, `gevent`
-113 |     ## recommended for bigger setup is using of of other than sync one
+104 |     ; Use `gevent` for rhodecode
 114|105 |     worker_class = gevent
-115 |     ## The maximum number of simultaneous clients. Valid only for Gevent
+106 |
+107 |     ; The maximum number of simultaneous clients per worker. Valid only for gevent
 116|108 |     worker_connections = 30
 117|109 |
 118|110 |
 119|111 | .. note::
 120|112 |
 121|113 |     `Gevent` is currently only supported for Enterprise/Community instances.
-122 |     VCSServer doesn't …
+114 |     VCSServer doesn't support gevent.
 123|115 |
 124|116 |
 125|117 |
@@ -57,7 +57,7 b" Here's an overview what components shoul"
 57|57 | - `nginx` acting as a load-balancer.
 58|58 | - `postgresql-server` used for database and sessions.
 59|59 | - `redis-server` used for storing shared caches.
-60 | - optionally `rabbitmq-server` for `Celery` if used.
+60 | - optionally `rabbitmq-server` or `redis` for `Celery` if used.
 61|61 | - optionally if `Celery` is used Enterprise/Community instance + VCSServer.
 62|62 | - optionally mailserver that can be shared by other instances.
 63|63 | - optionally channelstream server to handle live communication for all instances.
@@ -263,6 +263,7 b' 6) Configure `Nginx`_ as reverse proxy o'
 263|263 |     gzip_types text/css text/javascript text/xml text/plain text/x-component application/javascript application/json application/xml application/rss+xml font/truetype font/opentype application/vnd.ms-fontobject image/svg+xml;
 264|264 |     gzip_vary on;
 265|265 |     gzip_disable "msie6";
+266 |     expires 60d;
 266|267 |     #alias /home/rcdev/.rccontrol/community-1/static;
 267|268 |     alias /home/rcdev/.rccontrol/enterprise-1/static;
 268|269 |     }
@@ -372,16 +373,16 b' Using Celery with cluster'
 372|373 |
 373|374 |
 374|375 | If `Celery` is used we recommend setting also an instance of Enterprise/Community+VCSserver
-375 | on the node that is running `RabbitMQ`_. Those instances will be used to
+376 | on the node that is running `RabbitMQ`_ or `Redis`_. Those instances will be used to
-376 | … tasks on the `rc-node-1`. This is the most efficient setup.
+377 | execute async tasks on the `rc-node-1`. This is the most efficient setup.
-377 | … handles tasks such as sending emails, forking repositories, importing
+378 | `Celery` usually handles tasks such as sending emails, forking repositories, importing
 378|379 | repositories from external location etc. Using workers on instance that has
 379|380 | the direct access to disks used by NFS as well as email server gives noticeable
 380|381 | performance boost. Running local workers to the NFS storage results in faster
 381|382 | execution of forking large repositories or sending lots of emails.
 382|383 |
 383|384 | Those instances need to be configured in the same way as for other nodes.
-384 | The instance in rc-node-1 can be added to the cluser, but we don't recommend doing it.
+385 | The instance in rc-node-1 can be added to the cluster, but we don't recommend doing it.
 385|386 | For best results let it be isolated to only executing `Celery` tasks in the cluster setup.
 386|387 |
 387|388 |
@@ -1,8 +1,26 b''
-1 | .. _adjust-vcs-mem…
+1 | .. _adjust-vcs-server-mem:
 2|2 |
-3 | VCSServer Memory…
+3 | VCSServer Memory Usage
 4|4 | ----------------------
 5|5 |
-6 | The VCS Server mamory cache can be adjusted to work best with the resources
-7 | available to your |RCE| instance. If you find that memory resources are under
-8 | pressure, see the :ref:`vcs-server-maintain` section for details.
+6 | Starting from Version 4.18.X RhodeCode has a builtin memory monitor for gunicorn workers.
+7 | Enabling this can limit the maximum amount of memory system can use. Each worker
+8 | for VCS Server is monitored independently.
+9 | To enable Memory management make sure to have following settings inside `[app:main] section` of
+10 | :file:`home/{user}/.rccontrol/{instance-id}/vcsserver.ini` file.
+11 |
+12 |
+13 |
+14 | ; Maximum memory usage that each worker can use before it will receive a
+15 | ; graceful restart signal 0 = memory monitoring is disabled
+16 | ; Examples: 268435456 (256MB), 536870912 (512MB)
+17 | ;           1073741824 (1GB), 2147483648 (2GB), 4294967296 (4GB)
+18 | memory_max_usage = 1073741824
+19 |
+20 | ; How often in seconds to check for memory usage for each gunicorn worker
+21 | memory_usage_check_interval = 60
+22 |
+23 | ; Threshold value for which we don't recycle worker if GarbageCollection
+24 | ; frees up enough resources. Before each restart we try to run GC on worker
+25 | ; in case we get enough free memory after that, restart will not happen.
+26 | memory_usage_recovery_threshold = 0.8
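
The `memory_max_usage` limits above are plain byte values, so it can be handy to compute
them rather than remember them. A tiny illustrative helper (not part of RhodeCode):

.. code-block:: python

    # Illustrative helper: convert a human-friendly limit in megabytes into the
    # byte value expected by `memory_max_usage`.
    def memory_max_usage_bytes(megabytes):
        return megabytes * 1024 * 1024

    # 1 GB limit, matching the example configuration above
    print('memory_max_usage = {}'.format(memory_max_usage_bytes(1024)))  # 1073741824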
@@ -110,35 +110,39 b' match, for example:'
 110|110 |
 111|111 | .. _vcs-server-maintain:
 112|112 |
-113 | VCS Server …
+113 | VCS Server Cache Optimization
 114|114 | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
 115|115 |
-116 | To optimize the VCS server to manage the cache and memory usage efficiently,
-117 | configure the following options in the
-118 | :file:`/home/{user}/.rccontrol/{vcsserver-id}/vcsserver.ini` file. Once
-119 | configured, restart the VCS Server. By default we use an optimal settings, but in certain
-120 | conditions tunning expiration_time and max_size can affect memory usage and performance
+116 | To optimize the VCS server to manage the cache and memory usage efficiently, it's recommended to
+117 | configure the Redis backend for VCSServer caches.
+118 | Once configured, restart the VCS Server.
+119 |
+120 | Make sure Redis is installed and running.
+121 | Open :file:`/home/{user}/.rccontrol/{vcsserver-id}/vcsserver.ini`
+122 | file and ensure the below settings for `repo_object` type cache are set:
 121|123 |
 122|124 | .. code-block:: ini
 123|125 |
-124 |     ## cache region for storing repo_objects cache
-125 |     rc_cache.repo_object.backend = dogpile.cache.rc.…
+126 |     ; ensure the default file based cache is *commented out*
+127 |     ##rc_cache.repo_object.backend = dogpile.cache.rc.file_namespace
+128 |     ##rc_cache.repo_object.expiration_time = 2592000
 126|129 |
-127 |     ## cache auto-expires after N seconds, setting this to 0 disabled cache
-128 |     rc_cache.repo_object.expiration_time = 300
+130 |     ; `repo_object` cache settings for vcs methods for repositories
+131 |     rc_cache.repo_object.backend = dogpile.cache.rc.redis_msgpack
 129|132 |
-130 |     ## max size of LRU, old values will be discarded if the size of cache reaches max_size
-131 |     ## Sets the maximum number of items stored in the cache, before the cache
-132 |     ## starts to be cleared.
+133 |     ; cache auto-expires after N seconds
+134 |     ; Examples: 86400 (1Day), 604800 (7Days), 1209600 (14Days), 2592000 (30days), 7776000 (90Days)
+135 |     rc_cache.repo_object.expiration_time = 2592000
+136 |
+137 |     ; redis_expiration_time needs to be greater than expiration_time
+138 |     rc_cache.repo_object.arguments.redis_expiration_time = 3592000
 133|139 |
-134 |     ## As a general rule of thumb, running this value at 120 resulted in a
-135 |     ## 5GB cache. Running it at 240 resulted in a 9GB cache. Your results
-136 |     ## will differ based on usage patterns and |repo| sizes.
-138 |     ## Tweaking this value to run at a fairly constant memory load on your
-139 |     ## server will help performance.
-140 |
-141 |     rc_cache.repo_object.max_size = 120
+140 |     rc_cache.repo_object.arguments.host = localhost
+141 |     rc_cache.repo_object.arguments.port = 6379
+142 |     rc_cache.repo_object.arguments.db = 5
+143 |     rc_cache.repo_object.arguments.socket_timeout = 30
+144 |     ; more Redis options: https://dogpilecache.sqlalchemy.org/en/latest/api.html#redis-backends
+145 |     rc_cache.repo_object.arguments.distributed_lock = true
 142|146 |
 143|147 |
 144|148 | To clear the cache completely, you can restart the VCS Server.
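
Before switching the `repo_object` backend over, it can help to confirm that the Redis
instance referenced by the `rc_cache.repo_object.arguments.*` values is actually reachable.
The check below is an illustrative sketch that assumes the `redis` Python package is
installed and reuses the host, port and db from the example configuration:

.. code-block:: python

    # Illustrative connectivity check for the Redis cache backend settings above.
    import redis

    client = redis.StrictRedis(host='localhost', port=6379, db=5, socket_timeout=30)
    try:
        client.ping()
        print('Redis is reachable, the redis_msgpack backend can be enabled')
    except redis.ConnectionError as exc:
        print('Redis is not reachable: {}'.format(exc))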
@@ -190,25 +194,6 b' For a more detailed explanation of the l'
 190|194 | \port <int>
 191|195 |     Set the port number on which the VCS Server will be available.
 192|196 |
-193 | \locale <locale_utf>
-194 |     Set the locale the VCS Server expects.
-195 |
-196 | \workers <int>
-197 |     Set the number of process workers.Recommended
-198 |     value is (2 * NUMBER_OF_CPUS + 1), eg 2CPU = 5 workers
-199 |
-200 | \max_requests <int>
-201 |     The maximum number of requests a worker will process before restarting.
-202 |     Any value greater than zero will limit the number of requests a work
-203 |     will process before automatically restarting. This is a simple method
-204 |     to help limit the damage of memory leaks.
-205 |
-206 | \max_requests_jitter <int>
-207 |     The maximum jitter to add to the max_requests setting.
-208 |     The jitter causes the restart per worker to be randomized by
-209 |     randint(0, max_requests_jitter). This is intended to stagger worker
-210 |     restarts to avoid all workers restarting at the same time.
-211 |
 212|197 |
 213|198 | .. note::
 214|199 |
@@ -216,63 +201,139 b' For a more detailed explanation of the l' | |||||
216 |
|
201 | |||
217 | .. code-block:: ini |
|
202 | .. code-block:: ini | |
218 |
|
203 | |||
219 |
|
|
204 | ; ################################# | |
220 | # RhodeCode VCSServer with HTTP Backend - configuration # |
|
205 | ; RHODECODE VCSSERVER CONFIGURATION | |
221 | # # |
|
206 | ; ################################# | |
222 | ################################################################################ |
|
|||
223 |
|
||||
224 |
|
207 | |||
225 | [server:main] |
|
208 | [server:main] | |
226 | ## COMMON ## |
|
209 | ; COMMON HOST/IP CONFIG | |
227 | host = 127.0.0.1 |
|
210 | host = 127.0.0.1 | |
228 | port = 10002 |
|
211 | port = 10002 | |
229 |
|
212 | |||
230 | ########################## |
|
213 | ; ########################### | |
231 |
|
|
214 | ; GUNICORN APPLICATION SERVER | |
232 | ########################## |
|
215 | ; ########################### | |
233 | ## run with gunicorn --log-config vcsserver.ini --paste vcsserver.ini |
|
216 | ||
|
217 | ; run with gunicorn --log-config rhodecode.ini --paste rhodecode.ini | |||
|
218 | ||||
|
219 | ; Module to use, this setting shouldn't be changed | |||
234 | use = egg:gunicorn#main |
|
220 | use = egg:gunicorn#main | |
235 | ## Sets the number of process workers. Recommended |
|
221 | ||
236 | ## value is (2 * NUMBER_OF_CPUS + 1), eg 2CPU = 5 workers |
|
222 | ; Sets the number of process workers. More workers means more concurrent connections | |
237 | workers = 3 |
|
223 | ; RhodeCode can handle at the same time. Each additional worker also it increases | |
238 | ## process name |
|
224 | ; memory usage as each has it's own set of caches. | |
|
225 | ; Recommended value is (2 * NUMBER_OF_CPUS + 1), eg 2CPU = 5 workers, but no more | |||
|
226 | ; than 8-10 unless for really big deployments .e.g 700-1000 users. | |||
|
227 | ; `instance_id = *` must be set in the [app:main] section below (which is the default) | |||
|
228 | ; when using more than 1 worker. | |||
|
229 | workers = 6 | |||
|
230 | ||||
|
231 | ; Gunicorn access log level | |||
|
232 | loglevel = info | |||
|
233 | ||||
|
234 | ; Process name visible in process list | |||
239 | proc_name = rhodecode_vcsserver |
|
235 | proc_name = rhodecode_vcsserver | |
240 | ## type of worker class, one of sync, gevent |
|
236 | ||
241 | ## recommended for bigger setup is using of of other than sync one |
|
237 | ; Type of worker class, one of sync, gevent | |
|
238 | ; currently `sync` is the only option allowed. | |||
242 | worker_class = sync |
|
239 | worker_class = sync | |
243 | ## The maximum number of simultaneous clients. Valid only for Gevent |
|
240 | ||
244 | #worker_connections = 10 |
|
241 | ; The maximum number of simultaneous clients. Valid only for gevent | |
245 | ## max number of requests that worker will handle before being gracefully |
|
242 | worker_connections = 10 | |
246 | ## restarted, could prevent memory leaks |
|
243 | ||
|
244 | ; Max number of requests that worker will handle before being gracefully restarted. | |||
|
245 | ; Prevents memory leaks, jitter adds variability so not all workers are restarted at once. | |||
247 | max_requests = 1000 |
|
246 | max_requests = 1000 | |
248 | max_requests_jitter = 30 |
|
247 | max_requests_jitter = 30 | |
249 | ## amount of time a worker can spend with handling a request before it |
|
248 | ||
250 | ## gets killed and restarted. Set to 6hrs |
|
249 | ; Amount of time a worker can spend with handling a request before it | |
|
250 | ; gets killed and restarted. By default set to 21600 (6hrs) | |||
|
251 | ; Examples: 1800 (30min), 3600 (1hr), 7200 (2hr), 43200 (12h) | |||
251 | timeout = 21600 |
|
252 | timeout = 21600 | |
252 |
|
253 | |||
|
254 | ; The maximum size of HTTP request line in bytes. | |||
|
255 | ; 0 for unlimited | |||
|
256 | limit_request_line = 0 | |||
|
257 | ||||
|
258 | ; Limit the number of HTTP headers fields in a request. | |||
|
259 | ; By default this value is 100 and can't be larger than 32768. | |||
|
260 | limit_request_fields = 32768 | |||
|
261 | ||||
|
262 | ; Limit the allowed size of an HTTP request header field. | |||
|
263 | ; Value is a positive number or 0. | |||
|
264 | ; Setting it to 0 will allow unlimited header field sizes. | |||
|
265 | limit_request_field_size = 0 | |||
|
266 | ||||
|
267 | ; Timeout for graceful workers restart. | |||
|
268 | ; After receiving a restart signal, workers have this much time to finish | |||
|
269 | ; serving requests. Workers still alive after the timeout (starting from the | |||
|
270 | ; receipt of the restart signal) are force killed. | |||
|
271 | ; Examples: 1800 (30min), 3600 (1hr), 7200 (2hr), 43200 (12h) | |||
|
272 | graceful_timeout = 3600 | |||
|
273 | ||||
|
274 | # The number of seconds to wait for requests on a Keep-Alive connection. | |||
|
275 | # Generally set in the 1-5 seconds range. | |||
|
276 | keepalive = 2 | |||
|
277 | ||||
|
278 | ; Maximum memory usage that each worker can use before it will receive a | |||
|
279 | ; graceful restart signal 0 = memory monitoring is disabled | |||
|
280 | ; Examples: 268435456 (256MB), 536870912 (512MB) | |||
|
281 | ; 1073741824 (1GB), 2147483648 (2GB), 4294967296 (4GB) | |||
|
282 | memory_max_usage = 1073741824 | |||
|
283 | ||||
|
284 | ; How often in seconds to check for memory usage for each gunicorn worker | |||
|
285 | memory_usage_check_interval = 60 | |||
|
286 | ||||
|
287 | ; Threshold value for which we don't recycle worker if GarbageCollection | |||
|
288 | ; frees up enough resources. Before each restart we try to run GC on worker | |||
|
289 | ; in case we get enough free memory after that, restart will not happen. | |||
|
290 | memory_usage_recovery_threshold = 0.8 | |||
|
291 | ||||
|
292 | ||||
253 | [app:main] |
|
293 | [app:main] | |
254 | use = egg:rhodecode-vcsserver |
|
294 | use = egg:rhodecode-vcsserver | |
255 |
|
295 | |||
256 | pyramid.default_locale_name = en |
|
296 | pyramid.default_locale_name = en | |
257 | pyramid.includes = |
|
297 | pyramid.includes = | |
258 |
|
298 | |||
259 |
|
|
299 | ; default locale used by VCS systems | |
260 | locale = en_US.UTF-8 |
|
300 | locale = en_US.UTF-8 | |
261 |
|
301 | |||
262 | # cache regions, please don't change |
|
302 | ; ############# | |
263 | beaker.cache.regions = repo_object |
|
303 | ; DOGPILE CACHE | |
264 | beaker.cache.repo_object.type = memorylru |
|
304 | ; ############# | |
265 | beaker.cache.repo_object.max_items = 100 |
|
305 | ||
266 | # cache auto-expires after N seconds |
|
306 | ; Default cache dir for caches. Putting this into a ramdisk can boost performance. | |
267 | beaker.cache.repo_object.expire = 300 |
|
307 | ; eg. /tmpfs/data_ramdisk, however this directory might require large amount of space | |
268 | beaker.cache.repo_object.enabled = true |
|
308 | cache_dir = %(here)s/data | |
|
309 | ||||
|
310 | ; ********************************************************** | |||
|
311 | ; `repo_object` cache with redis backend | |||
|
312 | ; recommended for larger instance, or for better performance | |||
|
313 | ; ********************************************************** | |||
|
314 | ||||
|
315 | ; `repo_object` cache settings for vcs methods for repositories | |||
|
316 | rc_cache.repo_object.backend = dogpile.cache.rc.redis_msgpack | |||
269 |
|
317 | |||
|
318 | ; cache auto-expires after N seconds | |||
|
319 | ; Examples: 86400 (1Day), 604800 (7Days), 1209600 (14Days), 2592000 (30days), 7776000 (90Days) | |||
|
320 | rc_cache.repo_object.expiration_time = 2592000 | |||
270 |
|
321 | |||
271 | ################################ |
|
322 | ; redis_expiration_time needs to be greater then expiration_time | |
272 | ### LOGGING CONFIGURATION #### |
|
323 | rc_cache.repo_object.arguments.redis_expiration_time = 3592000 | |
273 | ################################ |
|
324 | ||
|
325 | rc_cache.repo_object.arguments.host = localhost | |||
|
326 | rc_cache.repo_object.arguments.port = 6379 | |||
|
327 | rc_cache.repo_object.arguments.db = 5 | |||
|
328 | rc_cache.repo_object.arguments.socket_timeout = 30 | |||
|
329 | ; more Redis options: https://dogpilecache.sqlalchemy.org/en/latest/api.html#redis-backends | |||
|
330 | rc_cache.repo_object.arguments.distributed_lock = true | |||
|
331 | ||||
|
332 | ; ##################### | |||
|
333 | ; LOGGING CONFIGURATION | |||
|
334 | ; ##################### | |||
274 | [loggers] |
|
335 | [loggers] | |
275 |
keys = root, vcsserver |
|
336 | keys = root, vcsserver | |
276 |
|
337 | |||
277 | [handlers] |
|
338 | [handlers] | |
278 | keys = console |
|
339 | keys = console | |
@@ -280,9 +341,9 b' For a more detailed explanation of the l' | |||||
280 | [formatters] |
|
341 | [formatters] | |
281 | keys = generic |
|
342 | keys = generic | |
282 |
|
343 | |||
283 |
|
|
344 | ; ####### | |
284 |
|
|
345 | ; LOGGERS | |
285 |
|
|
346 | ; ####### | |
286 | [logger_root] |
|
347 | [logger_root] | |
287 | level = NOTSET |
|
348 | level = NOTSET | |
288 | handlers = console |
|
349 | handlers = console | |
@@ -293,29 +354,23 b' For a more detailed explanation of the l' | |||||
293 | qualname = vcsserver |
|
354 | qualname = vcsserver | |
294 | propagate = 1 |
|
355 | propagate = 1 | |
295 |
|
356 | |||
296 | [logger_beaker] |
|
|||
297 | level = DEBUG |
|
|||
298 | handlers = |
|
|||
299 | qualname = beaker |
|
|||
300 | propagate = 1 |
|
|||
301 |
|
357 | |||
302 |
|
358 | ; ######## | ||
303 | ############## |
|
359 | ; HANDLERS | |
304 | ## HANDLERS ## |
|
360 | ; ######## | |
305 | ############## |
|
|||
306 |
|
361 | |||
307 | [handler_console] |
|
362 | [handler_console] | |
308 | class = StreamHandler |
|
363 | class = StreamHandler | |
309 | args = (sys.stderr,) |
|
364 | args = (sys.stderr, ) | |
310 |
level = |
|
365 | level = INFO | |
311 | formatter = generic |
|
366 | formatter = generic | |
312 |
|
367 | |||
313 |
|
|
368 | ; ########## | |
314 |
|
|
369 | ; FORMATTERS | |
315 |
|
|
370 | ; ########## | |
316 |
|
371 | |||
317 | [formatter_generic] |
|
372 | [formatter_generic] | |
318 | format = %(asctime)s.%(msecs)03d %(levelname)-5.5s [%(name)s] %(message)s |
|
373 | format = %(asctime)s.%(msecs)03d [%(process)d] %(levelname)-5.5s [%(name)s] %(message)s | |
319 | datefmt = %Y-%m-%d %H:%M:%S |
|
374 | datefmt = %Y-%m-%d %H:%M:%S | |
320 |
|
375 | |||
321 |
|
376 |
@@ -39,7 +39,7 b' close_pull_request' | |||||
39 | comment_pull_request |
|
39 | comment_pull_request | |
40 | -------------------- |
|
40 | -------------------- | |
41 |
|
41 | |||
42 | .. py:function:: comment_pull_request(apiuser, pullrequestid, repoid=<Optional:None>, message=<Optional:None>, commit_id=<Optional:None>, status=<Optional:None>, comment_type=<Optional:u'note'>, resolves_comment_id=<Optional:None>, userid=<Optional:<OptionalAttr:apiuser>>) |
|
42 | .. py:function:: comment_pull_request(apiuser, pullrequestid, repoid=<Optional:None>, message=<Optional:None>, commit_id=<Optional:None>, status=<Optional:None>, comment_type=<Optional:u'note'>, resolves_comment_id=<Optional:None>, extra_recipients=<Optional:[]>, userid=<Optional:<OptionalAttr:apiuser>>) | |
43 |
|
43 | |||
44 | Comment on the pull request specified with the `pullrequestid`, |
|
44 | Comment on the pull request specified with the `pullrequestid`, | |
45 | in the |repo| specified by the `repoid`, and optionally change the |
|
45 | in the |repo| specified by the `repoid`, and optionally change the | |
@@ -63,6 +63,11 b' comment_pull_request' | |||||
63 | :type status: str |
|
63 | :type status: str | |
64 | :param comment_type: Comment type, one of: 'note', 'todo' |
|
64 | :param comment_type: Comment type, one of: 'note', 'todo' | |
65 | :type comment_type: Optional(str), default: 'note' |
|
65 | :type comment_type: Optional(str), default: 'note' | |
|
66 | :param resolves_comment_id: id of comment which this one will resolve | |||
|
67 | :type resolves_comment_id: Optional(int) | |||
|
68 | :param extra_recipients: list of user ids or usernames to add | |||
|
69 | notifications for this comment. Acts like a CC for notification | |||
|
70 | :type extra_recipients: Optional(list) | |||
66 | :param userid: Comment on the pull request as this user |
|
71 | :param userid: Comment on the pull request as this user | |
67 | :type userid: Optional(str or int) |
|
72 | :type userid: Optional(str or int) | |
68 |
|
73 | |||
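
The new `extra_recipients` parameter can be exercised through the regular JSON-RPC endpoint.
The snippet below is a hedged sketch: the server URL, auth token, repository id, pull request
id and usernames are placeholders, and it assumes the standard `/_admin/api` endpoint and the
`requests` package.

.. code-block:: python

    # Hypothetical example: comment on pull request #7 of repo id 101 and CC two users.
    import requests

    payload = {
        'id': 1,
        'auth_token': 'SECRET_AUTH_TOKEN',
        'method': 'comment_pull_request',
        'args': {
            'repoid': 101,
            'pullrequestid': 7,
            'message': 'Please have a look as well',
            'comment_type': 'note',
            'extra_recipients': ['reviewer_one', 'reviewer_two'],
        },
    }

    response = requests.post('https://rhodecode.example.com/_admin/api', json=payload)
    print(response.json())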
@@ -126,7 +131,7 b' create_pull_request' | |||||
126 | get_pull_request |
|
131 | get_pull_request | |
127 | ---------------- |
|
132 | ---------------- | |
128 |
|
133 | |||
129 | .. py:function:: get_pull_request(apiuser, pullrequestid, repoid=<Optional:None>) |
|
134 | .. py:function:: get_pull_request(apiuser, pullrequestid, repoid=<Optional:None>, merge_state=<Optional:False>) | |
130 |
|
135 | |||
131 | Get a pull request based on the given ID. |
|
136 | Get a pull request based on the given ID. | |
132 |
|
137 | |||
@@ -137,6 +142,9 b' get_pull_request' | |||||
137 | :type repoid: str or int |
|
142 | :type repoid: str or int | |
138 | :param pullrequestid: ID of the requested pull request. |
|
143 | :param pullrequestid: ID of the requested pull request. | |
139 | :type pullrequestid: int |
|
144 | :type pullrequestid: int | |
|
145 | :param merge_state: Optional calculate merge state for each repository. | |||
|
146 | This could result in longer time to fetch the data | |||
|
147 | :type merge_state: bool | |||
140 |
|
148 | |||
141 | Example output: |
|
149 | Example output: | |
142 |
|
150 | |||
@@ -250,7 +258,7 b' get_pull_request_comments' | |||||
250 | get_pull_requests |
|
258 | get_pull_requests | |
251 | ----------------- |
|
259 | ----------------- | |
252 |
|
260 | |||
253 |
.. py:function:: get_pull_requests(apiuser, repoid, status=<Optional:'new'>, merge_state=<Optional: |
|
261 | .. py:function:: get_pull_requests(apiuser, repoid, status=<Optional:'new'>, merge_state=<Optional:False>) | |
254 |
|
262 | |||
255 | Get all pull requests from the repository specified in `repoid`. |
|
263 | Get all pull requests from the repository specified in `repoid`. | |
256 |
|
264 |
@@ -28,7 +28,7 b' add_field_to_repo' | |||||
28 | comment_commit |
|
28 | comment_commit | |
29 | -------------- |
|
29 | -------------- | |
30 |
|
30 | |||
31 | .. py:function:: comment_commit(apiuser, repoid, commit_id, message, status=<Optional:None>, comment_type=<Optional:u'note'>, resolves_comment_id=<Optional:None>, userid=<Optional:<OptionalAttr:apiuser>>) |
|
31 | .. py:function:: comment_commit(apiuser, repoid, commit_id, message, status=<Optional:None>, comment_type=<Optional:u'note'>, resolves_comment_id=<Optional:None>, extra_recipients=<Optional:[]>, userid=<Optional:<OptionalAttr:apiuser>>) | |
32 |
|
32 | |||
33 | Set a commit comment, and optionally change the status of the commit. |
|
33 | Set a commit comment, and optionally change the status of the commit. | |
34 |
|
34 | |||
@@ -45,6 +45,11 b' comment_commit' | |||||
45 | :type status: str |
|
45 | :type status: str | |
46 | :param comment_type: Comment type, one of: 'note', 'todo' |
|
46 | :param comment_type: Comment type, one of: 'note', 'todo' | |
47 | :type comment_type: Optional(str), default: 'note' |
|
47 | :type comment_type: Optional(str), default: 'note' | |
|
48 | :param resolves_comment_id: id of comment which this one will resolve | |||
|
49 | :type resolves_comment_id: Optional(int) | |||
|
50 | :param extra_recipients: list of user ids or usernames to add | |||
|
51 | notifications for this comment. Acts like a CC for notification | |||
|
52 | :type extra_recipients: Optional(list) | |||
48 | :param userid: Set the user name of the comment creator. |
|
53 | :param userid: Set the user name of the comment creator. | |
49 | :type userid: Optional(str or int) |
|
54 | :type userid: Optional(str or int) | |
50 |
|
55 | |||
@@ -66,7 +71,7 b' comment_commit' | |||||
66 | create_repo |
|
71 | create_repo | |
67 | ----------- |
|
72 | ----------- | |
68 |
|
73 | |||
69 |
.. py:function:: create_repo(apiuser, repo_name, repo_type, owner=<Optional:<OptionalAttr:apiuser>>, description=<Optional:''>, private=<Optional:False>, clone_uri=<Optional:None>, push_uri=<Optional:None>, landing_rev=<Optional: |
|
74 | .. py:function:: create_repo(apiuser, repo_name, repo_type, owner=<Optional:<OptionalAttr:apiuser>>, description=<Optional:''>, private=<Optional:False>, clone_uri=<Optional:None>, push_uri=<Optional:None>, landing_rev=<Optional:None>, enable_statistics=<Optional:False>, enable_locking=<Optional:False>, enable_downloads=<Optional:False>, copy_permissions=<Optional:False>) | |
70 |
|
75 | |||
71 | Creates a repository. |
|
76 | Creates a repository. | |
72 |
|
77 | |||
@@ -97,7 +102,7 b' create_repo' | |||||
97 | :type clone_uri: str |
|
102 | :type clone_uri: str | |
98 | :param push_uri: set push_uri |
|
103 | :param push_uri: set push_uri | |
99 | :type push_uri: str |
|
104 | :type push_uri: str | |
100 | :param landing_rev: <rev_type>:<rev> |
|
105 | :param landing_rev: <rev_type>:<rev>, e.g branch:default, book:dev, rev:abcd | |
101 | :type landing_rev: str |
|
106 | :type landing_rev: str | |
102 | :param enable_locking: |
|
107 | :param enable_locking: | |
103 | :type enable_locking: bool |
|
108 | :type enable_locking: bool | |
@@ -169,7 +174,7 b' delete_repo' | |||||
169 | fork_repo |
|
174 | fork_repo | |
170 | --------- |
|
175 | --------- | |
171 |
|
176 | |||
172 |
.. py:function:: fork_repo(apiuser, repoid, fork_name, owner=<Optional:<OptionalAttr:apiuser>>, description=<Optional:''>, private=<Optional:False>, clone_uri=<Optional:None>, landing_rev=<Optional: |
|
177 | .. py:function:: fork_repo(apiuser, repoid, fork_name, owner=<Optional:<OptionalAttr:apiuser>>, description=<Optional:''>, private=<Optional:False>, clone_uri=<Optional:None>, landing_rev=<Optional:None>, copy_permissions=<Optional:False>) | |
173 |
|
178 | |||
174 | Creates a fork of the specified |repo|. |
|
179 | Creates a fork of the specified |repo|. | |
175 |
|
180 | |||
@@ -198,7 +203,7 b' fork_repo' | |||||
198 | :type copy_permissions: bool |
|
203 | :type copy_permissions: bool | |
199 | :param private: Make the fork private. The default is False. |
|
204 | :param private: Make the fork private. The default is False. | |
200 | :type private: bool |
|
205 | :type private: bool | |
201 |
:param landing_rev: Set the landing revision. |
|
206 | :param landing_rev: Set the landing revision. E.g branch:default, book:dev, rev:abcd | |
202 |
|
207 | |||
203 | Example output: |
|
208 | Example output: | |
204 |
|
209 | |||
@@ -1085,7 +1090,7 b' strip' | |||||
1085 | update_repo |
|
1090 | update_repo | |
1086 | ----------- |
|
1091 | ----------- | |
1087 |
|
1092 | |||
1088 |
.. py:function:: update_repo(apiuser, repoid, repo_name=<Optional:None>, owner=<Optional:<OptionalAttr:apiuser>>, description=<Optional:''>, private=<Optional:False>, clone_uri=<Optional:None>, push_uri=<Optional:None>, landing_rev=<Optional: |
|
1093 | .. py:function:: update_repo(apiuser, repoid, repo_name=<Optional:None>, owner=<Optional:<OptionalAttr:apiuser>>, description=<Optional:''>, private=<Optional:False>, clone_uri=<Optional:None>, push_uri=<Optional:None>, landing_rev=<Optional:None>, fork_of=<Optional:None>, enable_statistics=<Optional:False>, enable_locking=<Optional:False>, enable_downloads=<Optional:False>, fields=<Optional:''>) | |
1089 |
|
1094 | |||
1090 | Updates a repository with the given information. |
|
1095 | Updates a repository with the given information. | |
1091 |
|
1096 | |||
@@ -1117,7 +1122,7 b' update_repo' | |||||
1117 | :type private: bool |
|
1122 | :type private: bool | |
1118 | :param clone_uri: Update the |repo| clone URI. |
|
1123 | :param clone_uri: Update the |repo| clone URI. | |
1119 | :type clone_uri: str |
|
1124 | :type clone_uri: str | |
1120 |
:param landing_rev: Set the |repo| landing revision. |
|
1125 | :param landing_rev: Set the |repo| landing revision. e.g branch:default, book:dev, rev:abcd | |
1121 | :type landing_rev: str |
|
1126 | :type landing_rev: str | |
1122 | :param enable_statistics: Enable statistics on the |repo|, (True | False). |
|
1127 | :param enable_statistics: Enable statistics on the |repo|, (True | False). | |
1123 | :type enable_statistics: bool |
|
1128 | :type enable_statistics: bool |
@@ -6,7 +6,7 b' search methods'
 6|6 | search
 7|7 | ------
 8|8 |
-9 | .. py:function:: search(apiuser, search_query, search_type, page_limit=<Optional:10>, page=<Optional:1>, search_sort=<Optional:'…
+9 | .. py:function:: search(apiuser, search_query, search_type, page_limit=<Optional:10>, page=<Optional:1>, search_sort=<Optional:'desc:date'>, repo_name=<Optional:None>, repo_group_name=<Optional:None>)
 10|10 |
 11|11 | Fetch Full Text Search results using API.
 12|12 |

@@ -23,9 +23,15 b' search'
 23|23 | :type page_limit: Optional(int)
 24|24 | :param page: Page number. Default first page.
 25|25 | :type page: Optional(int)
-26 | :param search_sort: Search sort order. Default newfirst. The following are valid options:
+26 | :param search_sort: Search sort order. Must start with asc: or desc:. Default desc:date.
-27 |     * newfirst
+27 |     The following are valid options:
-28 |     * oldfirst
+28 |     * asc|desc:message.raw
+29 |     * asc|desc:date
+30 |     * asc|desc:author.email.raw
+31 |     * asc|desc:message.raw
+32 |     * newfirst (old legacy equal to desc:date)
+33 |     * oldfirst (old legacy equal to asc:date)
+34 |
 29|35 | :type search_sort: Optional(str)
 30|36 | :param repo_name: Filter by one repo. Default is all.
 31|37 | :type repo_name: Optional(str)
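
As a rough illustration of the new `search_sort` values, a JSON-RPC call might look like the
sketch below. The endpoint, token and the `search_type` value are assumptions here; adjust
them to the search types your instance actually exposes.

.. code-block:: python

    # Hypothetical example: full text search, sorted oldest-first by date.
    import requests

    payload = {
        'id': 2,
        'auth_token': 'SECRET_AUTH_TOKEN',
        'method': 'search',
        'args': {
            'search_query': 'fixes #123',
            'search_type': 'commit',   # assumed value, check your instance's search types
            'search_sort': 'asc:date',
            'page_limit': 10,
            'page': 1,
        },
    }

    print(requests.post('https://rhodecode.example.com/_admin/api', json=payload).json())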
@@ -6,7 +6,7 b' store methods' | |||||
6 | file_store_add (EE only) |
|
6 | file_store_add (EE only) | |
7 | ------------------------ |
|
7 | ------------------------ | |
8 |
|
8 | |||
9 | .. py:function:: file_store_add(apiuser, filename, content) |
|
9 | .. py:function:: file_store_add(apiuser, filename, content, description=<Optional:''>) | |
10 |
|
10 | |||
11 | Upload API for the file_store |
|
11 | Upload API for the file_store | |
12 |
|
12 | |||
@@ -19,6 +19,8 b' file_store_add (EE only)' | |||||
19 | :type apiuser: AuthUser |
|
19 | :type apiuser: AuthUser | |
20 | :param filename: name of the file uploaded |
|
20 | :param filename: name of the file uploaded | |
21 | :type filename: str |
|
21 | :type filename: str | |
|
22 | :param description: Optional description for added file | |||
|
23 | :type description: str | |||
22 | :param content: base64 encoded content of the uploaded file |
|
24 | :param content: base64 encoded content of the uploaded file | |
23 | :type content: str |
|
25 | :type content: str | |
24 |
|
26 | |||
@@ -35,3 +37,148 b' file_store_add (EE only)' | |||||
35 | error : null |
|
37 | error : null | |
36 |
|
38 | |||
37 |
|
39 | |||
|
40 | file_store_add_with_acl (EE only) | |||
|
41 | --------------------------------- | |||
|
42 | ||||
|
43 | .. py:function:: file_store_add_with_acl(apiuser, filename, content, description=<Optional:''>, scope_user_id=<Optional:None>, scope_repo_id=<Optional:None>, scope_repo_group_id=<Optional:None>) | |||
|
44 | ||||
|
45 | Upload API for the file_store | |||
|
46 | ||||
|
47 | Example usage from CLI:: | |||
|
48 | rhodecode-api --instance-name=enterprise-1 upload_file "{"content": "$(cat image.jpg | base64)", "filename":"image.jpg", "scope_repo_id":101}" | |||
|
49 | ||||
|
50 | This command takes the following options: | |||
|
51 | ||||
|
52 | :param apiuser: This is filled automatically from the |authtoken|. | |||
|
53 | :type apiuser: AuthUser | |||
|
54 | :param filename: name of the file uploaded | |||
|
55 | :type filename: str | |||
|
56 | :param description: Optional description for added file | |||
|
57 | :type description: str | |||
|
58 | :param content: base64 encoded content of the uploaded file | |||
|
59 | :type content: str | |||
|
60 | ||||
|
61 | :param scope_user_id: Optionally bind this file to user. | |||
|
62 | This will check ACL in such way only this user can access the file. | |||
|
63 | :type scope_user_id: int | |||
|
64 | :param scope_repo_id: Optionally bind this file to repository. | |||
|
65 | This will check ACL in such way only user with proper access to such | |||
|
66 | repository can access the file. | |||
|
67 | :type scope_repo_id: int | |||
|
68 | :param scope_repo_group_id: Optionally bind this file to repository group. | |||
|
69 | This will check ACL in such way only user with proper access to such | |||
|
70 | repository group can access the file. | |||
|
71 | :type scope_repo_group_id: int | |||
|
72 | ||||
|
73 | Example output: | |||
|
74 | ||||
|
75 | .. code-block:: bash | |||
|
76 | ||||
|
77 | id : <id_given_in_input> | |||
|
78 | result: { | |||
|
79 | "access_path": "/_file_store/download/84d156f7-8323-4ad3-9fce-4a8e88e1deaf-0.jpg", | |||
|
80 | "access_path_fqn": "http://server.domain.com/_file_store/download/84d156f7-8323-4ad3-9fce-4a8e88e1deaf-0.jpg", | |||
|
81 | "store_fid": "84d156f7-8323-4ad3-9fce-4a8e88e1deaf-0.jpg" | |||
|
82 | } | |||
|
83 | error : null | |||
|
84 | ||||
|
85 | ||||
|
86 | file_store_get_info (EE only) | |||
|
87 | ----------------------------- | |||
|
88 | ||||
|
89 | .. py:function:: file_store_get_info(apiuser, store_fid) | |||
|
90 | ||||
|
91 | Get artifact data. | |||
|
92 | ||||
|
93 | Example output: | |||
|
94 | ||||
|
95 | .. code-block:: bash | |||
|
96 | ||||
|
97 | id : <id_given_in_input> | |||
|
98 | result: { | |||
|
99 | "artifact": { | |||
|
100 | "access_path_fqn": "https://rhodecode.example.com/_file_store/download/0-031c2aa0-0d56-49a7-9ba3-b570bdd342ab.jpg", | |||
|
101 | "created_on": "2019-10-15T16:25:35.491", | |||
|
102 | "description": "my upload", | |||
|
103 | "downloaded_times": 1, | |||
|
104 | "file_uid": "0-031c2aa0-0d56-49a7-9ba3-b570bdd342ab.jpg", | |||
|
105 | "filename": "example.jpg", | |||
|
106 | "filename_org": "0-031c2aa0-0d56-49a7-9ba3-b570bdd342ab.jpg", | |||
|
107 | "hidden": false, | |||
|
108 | "metadata": [ | |||
|
109 | { | |||
|
110 | "artifact": "0-031c2aa0-0d56-49a7-9ba3-b570bdd342ab.jpg", | |||
|
111 | "key": "yellow", | |||
|
112 | "section": "tags", | |||
|
113 | "value": "bar" | |||
|
114 | } | |||
|
115 | ], | |||
|
116 | "sha256": "818dff0f44574dfb6814d38e6bf3c60c5943d1d13653398ecddaedf2f6a5b04d", | |||
|
117 | "size": 18599, | |||
|
118 | "uploaded_by": { | |||
|
119 | "email": "admin@rhodecode.com", | |||
|
120 | "emails": [ | |||
|
121 | "admin@rhodecode.com" | |||
|
122 | ], | |||
|
123 | "firstname": "Admin", | |||
|
124 | "lastname": "LastName", | |||
|
125 | "user_id": 2, | |||
|
126 | "username": "admin" | |||
|
127 | } | |||
|
128 | } | |||
|
129 | } | |||
|
130 | error : null | |||
|
131 | ||||
|
132 | ||||
|
133 | file_store_add_metadata (EE only) | |||
|
134 | --------------------------------- | |||
|
135 | ||||
|
136 | .. py:function:: file_store_add_metadata(apiuser, store_fid, section, key, value, value_type=<Optional:'unicode'>) | |||
|
137 | ||||
|
138 | Add metadata into artifact. The metadata consist of section, key, value. eg. | |||
|
139 | section='tags', 'key'='tag_name', value='1' | |||
|
140 | ||||
|
141 | :param apiuser: This is filled automatically from the |authtoken|. | |||
|
142 | :type apiuser: AuthUser | |||
|
143 | ||||
|
144 | :param store_fid: file uid, e.g 0-d054cb71-91ab-44e2-9e4b-23fe14b4d74a.mp4 | |||
|
145 | :type store_fid: str | |||
|
146 | ||||
|
147 | :param section: Section name to add metadata | |||
|
148 | :type section: str | |||
|
149 | ||||
|
150 | :param key: Key to add as metadata | |||
|
151 | :type key: str | |||
|
152 | ||||
|
153 | :param value: Value to add as metadata | |||
|
154 | :type value: str | |||
|
155 | ||||
|
156 | :param value_type: Optional type, default is 'unicode' other types are: | |||
|
157 | int, list, bool, unicode, str | |||
|
158 | ||||
|
159 | :type value_type: str | |||
|
160 | ||||
|
161 | Example output: | |||
|
162 | ||||
|
163 | .. code-block:: bash | |||
|
164 | ||||
|
165 | id : <id_given_in_input> | |||
|
166 | result: { | |||
|
167 | "metadata": [ | |||
|
168 | { | |||
|
169 | "artifact": "0-d054cb71-91ab-44e2-9e4b-23fe14b4d74a.mp4", | |||
|
170 | "key": "secret", | |||
|
171 | "section": "tags", | |||
|
172 | "value": "1" | |||
|
173 | }, | |||
|
174 | { | |||
|
175 | "artifact": "0-d054cb71-91ab-44e2-9e4b-23fe14b4d74a.mp4", | |||
|
176 | "key": "video", | |||
|
177 | "section": "tags", | |||
|
178 | "value": "1" | |||
|
179 | } | |||
|
180 | ] | |||
|
181 | } | |||
|
182 | error : null | |||
|
183 | ||||
|
184 |
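
For completeness, an upload through the JSON-RPC endpoint could look roughly like the sketch
below. The endpoint, token and file name are placeholders, and the content is base64 encoded
as the `file_store_add` parameters above require.

.. code-block:: python

    # Hypothetical example: upload a file to the file_store with a description.
    import base64
    import requests

    with open('build-artifact.tar.gz', 'rb') as f:
        encoded = base64.b64encode(f.read()).decode('ascii')

    payload = {
        'id': 3,
        'auth_token': 'SECRET_AUTH_TOKEN',
        'method': 'file_store_add',
        'args': {
            'filename': 'build-artifact.tar.gz',
            'description': 'nightly build artifact',
            'content': encoded,
        },
    }

    print(requests.post('https://rhodecode.example.com/_admin/api', json=payload).json())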
@@ -72,7 +72,9 b' create_user_group' | |||||
72 | :param active: Set this group as active. |
|
72 | :param active: Set this group as active. | |
73 | :type active: Optional(``True`` | ``False``) |
|
73 | :type active: Optional(``True`` | ``False``) | |
74 | :param sync: Set enabled or disabled the automatically sync from |
|
74 | :param sync: Set enabled or disabled the automatically sync from | |
75 | external authentication types like ldap. |
|
75 | external authentication types like ldap. If User Group will be named like | |
|
76 | one from e.g ldap and sync flag is enabled members will be synced automatically. | |||
|
77 | Sync type when enabled via API is set to `manual_api` | |||
76 | :type sync: Optional(``True`` | ``False``) |
|
78 | :type sync: Optional(``True`` | ``False``) | |
77 |
|
79 | |||
78 | Example output: |
|
80 | Example output: | |
@@ -391,7 +393,9 b' update_user_group' | |||||
391 | :param active: Set the group as active. |
|
393 | :param active: Set the group as active. | |
392 | :type active: Optional(``True`` | ``False``) |
|
394 | :type active: Optional(``True`` | ``False``) | |
393 | :param sync: Set enabled or disabled the automatically sync from |
|
395 | :param sync: Set enabled or disabled the automatically sync from | |
394 | external authentication types like ldap. |
|
396 | external authentication types like ldap. If User Group will be named like | |
|
397 | one from e.g ldap and sync flag is enabled members will be synced automatically. | |||
|
398 | Sync type when enabled via API is set to `manual_api` | |||
395 | :type sync: Optional(``True`` | ``False``) |
|
399 | :type sync: Optional(``True`` | ``False``) | |
396 |
|
400 | |||
397 | Example output: |
|
401 | Example output: |
@@ -6,7 +6,7 b' user methods' | |||||
6 | create_user |
|
6 | create_user | |
7 | ----------- |
|
7 | ----------- | |
8 |
|
8 | |||
9 | .. py:function:: create_user(apiuser, username, email, password=<Optional:''>, firstname=<Optional:''>, lastname=<Optional:''>, active=<Optional:True>, admin=<Optional:False>, extern_name=<Optional:'rhodecode'>, extern_type=<Optional:'rhodecode'>, force_password_change=<Optional:False>, create_personal_repo_group=<Optional:None>) |
|
9 | .. py:function:: create_user(apiuser, username, email, password=<Optional:''>, firstname=<Optional:''>, lastname=<Optional:''>, description=<Optional:''>, active=<Optional:True>, admin=<Optional:False>, extern_name=<Optional:'rhodecode'>, extern_type=<Optional:'rhodecode'>, force_password_change=<Optional:False>, create_personal_repo_group=<Optional:None>) | |
10 |
|
10 | |||
11 | Creates a new user and returns the new user object. |
|
11 | Creates a new user and returns the new user object. | |
12 |
|
12 | |||
@@ -27,6 +27,8 b' create_user' | |||||
27 | :type firstname: Optional(str) |
|
27 | :type firstname: Optional(str) | |
28 | :param lastname: Set the new user surname. |
|
28 | :param lastname: Set the new user surname. | |
29 | :type lastname: Optional(str) |
|
29 | :type lastname: Optional(str) | |
|
30 | :param description: Set user description, or short bio. Metatags are allowed. | |||
|
31 | :type description: Optional(str) | |||
30 | :param active: Set the user as active. |
|
32 | :param active: Set the user as active. | |
31 | :type active: Optional(``True`` | ``False``) |
|
33 | :type active: Optional(``True`` | ``False``) | |
32 | :param admin: Give the new user admin rights. |
|
34 | :param admin: Give the new user admin rights. | |
@@ -155,6 +157,7 b' get_user' | |||||
155 | "extern_name": "rhodecode", |
|
157 | "extern_name": "rhodecode", | |
156 | "extern_type": "rhodecode", |
|
158 | "extern_type": "rhodecode", | |
157 | "firstname": "username", |
|
159 | "firstname": "username", | |
|
160 | "description": "user description", | |||
158 | "ip_addresses": [], |
|
161 | "ip_addresses": [], | |
159 | "language": null, |
|
162 | "language": null, | |
160 | "last_login": "Timestamp", |
|
163 | "last_login": "Timestamp", | |
@@ -268,7 +271,7 b' get_users' | |||||
268 | update_user |
|
271 | update_user | |
269 | ----------- |
|
272 | ----------- | |
270 |
|
273 | |||
271 | .. py:function:: update_user(apiuser, userid, username=<Optional:None>, email=<Optional:None>, password=<Optional:None>, firstname=<Optional:None>, lastname=<Optional:None>, active=<Optional:None>, admin=<Optional:None>, extern_type=<Optional:None>, extern_name=<Optional:None>) |
|
274 | .. py:function:: update_user(apiuser, userid, username=<Optional:None>, email=<Optional:None>, password=<Optional:None>, firstname=<Optional:None>, lastname=<Optional:None>, description=<Optional:None>, active=<Optional:None>, admin=<Optional:None>, extern_type=<Optional:None>, extern_name=<Optional:None>) | |
272 |
|
275 | |||
273 | Updates the details for the specified user, if that user exists. |
|
276 | Updates the details for the specified user, if that user exists. | |
274 |
|
277 | |||
@@ -291,6 +294,8 b' update_user' | |||||
291 | :type firstname: Optional(str) |
|
294 | :type firstname: Optional(str) | |
292 | :param lastname: Set the new surname. |
|
295 | :param lastname: Set the new surname. | |
293 | :type lastname: Optional(str) |
|
296 | :type lastname: Optional(str) | |
|
297 | :param description: Set user description, or short bio. Metatags are allowed. | |||
|
298 | :type description: Optional(str) | |||
294 | :param active: Set the new user as active. |
|
299 | :param active: Set the new user as active. | |
295 | :type active: Optional(``True`` | ``False``) |
|
300 | :type active: Optional(``True`` | ``False``) | |
296 | :param admin: Give the user admin rights. |
|
301 | :param admin: Give the user admin rights. |
@@ -19,7 +19,7 b' Setup Nix Package Manager' | |||||
19 |
|
19 | |||
20 | To install the Nix Package Manager, please run:: |
|
20 | To install the Nix Package Manager, please run:: | |
21 |
|
21 | |||
22 | $ curl https://nixos.org/nix/install | sh |
|
22 | $ curl https://nixos.org/releases/nix/nix-2.0.4/install | sh | |
23 |
|
23 | |||
24 | or go to https://nixos.org/nix/ and follow the installation instructions. |
|
24 | or go to https://nixos.org/nix/ and follow the installation instructions. | |
25 | Once this is correctly set up on your system, you should be able to use the |
|
25 | Once this is correctly set up on your system, you should be able to use the |
@@ -79,6 +79,12 b' and commit files and |repos| while manag' | |||||
79 | contributing/contributing |
|
79 | contributing/contributing | |
80 |
|
80 | |||
81 | .. toctree:: |
|
81 | .. toctree:: | |
|
82 | :maxdepth: 2 | |||
|
83 | :caption: RhodeCode Control Documentation | |||
|
84 | ||||
|
85 | RhodeCode Installer <https://docs.rhodecode.com/RhodeCode-Control/> | |||
|
86 | ||||
|
87 | .. toctree:: | |||
82 | :maxdepth: 1 |
|
88 | :maxdepth: 1 | |
83 | :caption: About |
|
89 | :caption: About | |
84 |
|
90 |
@@ -11,16 +11,20 b' and import repositories in async way. It' | |||||
11 | repository sync in scheduler. |
|
11 | repository sync in scheduler. | |
12 |
|
12 | |||
13 | If you decide to use Celery you also need a working message queue. |
|
13 | If you decide to use Celery you also need a working message queue. | |
14 |
The |
|
14 | There are two fully supported message brokers: rabbitmq_ and redis_ (recommended). | |
|
15 | ||||
|
16 | Since release 4.18.X we recommend using redis_ as a backend since it's generally | |||
|
17 | easier to work with, and results in a simpler stack, as redis is generally recommended | |||
|
18 | for caching purposes. | |||
15 |
|
19 | |||
16 |
|
20 | |||
17 | In order to install and configure Celery, follow these steps: |
|
21 | In order to install and configure Celery, follow these steps: | |
18 |
|
22 | |||
19 | 1. Install RabbitMQ, see the documentation on the Celery website for |
|
23 | 1. Install RabbitMQ or Redis for a message queue; see the documentation on the Celery website for | |
20 | `rabbitmq installation`_, or `rabbitmq website installation`_ |
|
24 | `redis installation`_ or `rabbitmq installation`_ | |
21 |
|
25 | |||
22 |
|
26 | |||
23 |
1a. |
|
27 | 1a. If you choose RabbitMQ, an example configuration after installation would look like this:: | |
24 |
|
28 | |||
25 | sudo rabbitmqctl add_user rcuser secret_password |
|
29 | sudo rabbitmqctl add_user rcuser secret_password | |
26 | sudo rabbitmqctl add_vhost rhodevhost |
|
30 | sudo rabbitmqctl add_vhost rhodevhost | |
@@ -45,6 +49,10 b' 3. Configure Celery in the' | |||||
45 | Set the broker_url as minimal settings required to enable operation. |
|
49 | Set the broker_url as minimal settings required to enable operation. | |
46 | If used our example data from pt 1a, here is how the broker url should look like:: |
|
50 | If you used our example data from pt 1a, here is how the broker url should look:: | |
47 |
|
51 | |||
|
52 | # for Redis | |||
|
53 | celery.broker_url = redis://localhost:6379/8 | |||
|
54 | ||||
|
55 | # for RabbitMQ | |||
48 | celery.broker_url = amqp://rcuser:secret_password@localhost:5672/rhodevhost |
|
56 | celery.broker_url = amqp://rcuser:secret_password@localhost:5672/rhodevhost | |
49 |
|
|
57 | ||
50 | Full configuration example is below: |
|
58 | Full configuration example is below: | |
@@ -57,7 +65,7 b' 3. Configure Celery in the' | |||||
57 | #################################### |
|
65 | #################################### | |
58 |
|
66 | |||
59 | use_celery = true |
|
67 | use_celery = true | |
60 |
celery.broker_url = |
|
68 | celery.broker_url = redis://localhost:6379/8 | |
61 |
|
69 | |||
62 | # maximum tasks to execute before worker restart |
|
70 | # maximum tasks to execute before worker restart | |
63 | celery.max_tasks_per_child = 100 |
|
71 | celery.max_tasks_per_child = 100 | |
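Before starting the workers it can help to confirm that the Redis broker from the configuration example above actually answers. A minimal sketch, assuming the ``redis`` Python client is installed and the broker URL matches the one shown::

    import redis

    # Same URL as celery.broker_url above; adjust host, port and db if needed.
    broker = redis.from_url("redis://localhost:6379/8")

    # ping() returns True when the server responds; it raises
    # redis.exceptions.ConnectionError if the broker is unreachable.
    print(broker.ping())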
@@ -69,6 +77,8 b' 3. Configure Celery in the' | |||||
69 | .. _python: http://www.python.org/ |
|
77 | .. _python: http://www.python.org/ | |
70 | .. _mercurial: http://mercurial.selenic.com/ |
|
78 | .. _mercurial: http://mercurial.selenic.com/ | |
71 | .. _celery: http://celeryproject.org/ |
|
79 | .. _celery: http://celeryproject.org/ | |
|
80 | .. _redis: http://redis.io | |||
|
81 | .. _redis installation: https://redis.io/topics/quickstart | |||
72 | .. _rabbitmq: http://www.rabbitmq.com/ |
|
82 | .. _rabbitmq: http://www.rabbitmq.com/ | |
73 | .. _rabbitmq installation: http://docs.celeryproject.org/en/latest/getting-started/brokers/rabbitmq.html |
|
83 | .. _rabbitmq installation: http://docs.celeryproject.org/en/latest/getting-started/brokers/rabbitmq.html | |
74 | .. _rabbitmq website installation: http://www.rabbitmq.com/download.html |
|
84 | .. _rabbitmq website installation: http://www.rabbitmq.com/download.html |
@@ -35,4 +35,3 b' 2. When you open the file, find the data' | |||||
35 | # see sqlalchemy docs for other advanced settings |
|
35 | # see sqlalchemy docs for other advanced settings | |
36 | sqlalchemy.db1.echo = false |
|
36 | sqlalchemy.db1.echo = false | |
37 | sqlalchemy.db1.pool_recycle = 3600 |
|
37 | sqlalchemy.db1.pool_recycle = 3600 | |
38 | sqlalchemy.db1.convert_unicode = true |
|
@@ -38,44 +38,39 b' default one. See the instructions in :re' | |||||
38 |
|
38 | |||
39 | .. _issue-tr-eg-ref: |
|
39 | .. _issue-tr-eg-ref: | |
40 |
|
40 | |||
|
41 | ||||
41 | Jira Integration |
|
42 | Jira Integration | |
42 | ---------------- |
|
43 | ---------------- | |
43 |
|
44 | |||
44 | * Regex = ``(?:^#|\s#)(\w+-\d+)`` |
|
45 | Please check the examples in the view for configuring the issue trackers. | |
45 | * URL = ``https://myissueserver.com/browse/${id}`` |
|
46 | ||
46 | * Issue Prefix = ``#`` |
|
|||
47 |
|
47 | |||
48 | Confluence (Wiki) |
|
48 | Confluence (Wiki) | |
49 | ----------------- |
|
49 | ----------------- | |
50 |
|
50 | |||
51 | * Regex = ``(?:conf-)([A-Z0-9]+)`` |
|
51 | Please check the examples in the view for configuring the issue trackers. | |
52 | * URL = ``https://example.atlassian.net/display/wiki/${id}/${repo_name}`` |
|
52 | ||
53 | * issue prefix = ``CONF-`` |
|
|||
54 |
|
53 | |||
55 | Redmine Integration |
|
54 | Redmine Integration | |
56 | ------------------- |
|
55 | ------------------- | |
57 |
|
56 | |||
58 | * Regex = ``(issue-+\d+)`` |
|
57 | Please check the examples in the view for configuring the issue trackers. | |
59 | * URL = ``https://myissueserver.com/redmine/issue/${id}`` |
|
58 | ||
60 | * Issue Prefix = ``issue-`` |
|
|||
61 |
|
59 | |||
62 |
Redmine |
|
60 | Redmine Wiki Integration | |
63 | -------------- |
|
61 | ------------------------ | |
64 |
|
62 | |||
65 | * Regex = ``(?:wiki-)([a-zA-Z0-9]+)`` |
|
63 | Please check the examples in the view for configuring the issue trackers. | |
66 | * URL = ``https://example.com/redmine/projects/wiki/${repo_name}`` |
|
64 | ||
67 | * Issue prefix = ``Issue-`` |
|
|||
68 |
|
65 | |||
69 | Pivotal Tracker |
|
66 | Pivotal Tracker | |
70 | --------------- |
|
67 | --------------- | |
71 |
|
68 | |||
72 | * Regex = ``(?:pivot-)(?<project_id>\d+)-(?<story>\d+)`` |
|
69 | Please check the examples in the view for configuring the issue trackers. | |
73 | * URL = ``https://www.pivotaltracker.com/s/projects/${project_id}/stories/${story}`` |
|
70 | ||
74 | * Issue prefix = ``Piv-`` |
|
|||
75 |
|
71 | |||
76 | Trello |
|
72 | Trello | |
77 | ------ |
|
73 | ------ | |
78 |
|
74 | |||
79 | * Regex = ``(?:trello-)(?<card_id>[a-zA-Z0-9]+)`` |
|
75 | Please check the examples in the view for configuring the issue trackers. | |
80 | * URL = ``https://trello.com/example.com/${card_id}`` |
|
76 | ||
81 | * Issue prefix = ``Trello-`` |
|
@@ -62,7 +62,7 b' Performance' | |||||
62 | Fixes |
|
62 | Fixes | |
63 | ^^^^^ |
|
63 | ^^^^^ | |
64 |
|
64 | |||
65 |
- |
|
65 | - Hooks: fixed more unicode problems with new pull-request link generator. | |
66 | - Mercurial: fix ssh-server support for mercurial custom options. |
|
66 | - Mercurial: fix ssh-server support for mercurial custom options. | |
67 | - Pull requests: updated metadata information for failed merges with multiple heads. |
|
67 | - Pull requests: updated metadata information for failed merges with multiple heads. | |
68 | - Pull requests: calculate ancestor in the same way as creation mode. |
|
68 | - Pull requests: calculate ancestor in the same way as creation mode. |
@@ -9,6 +9,7 b' Release Notes' | |||||
9 | .. toctree:: |
|
9 | .. toctree:: | |
10 | :maxdepth: 1 |
|
10 | :maxdepth: 1 | |
11 |
|
11 | |||
|
12 | release-notes-4.18.0.rst | |||
12 | release-notes-4.17.4.rst |
|
13 | release-notes-4.17.4.rst | |
13 | release-notes-4.17.3.rst |
|
14 | release-notes-4.17.3.rst | |
14 | release-notes-4.17.2.rst |
|
15 | release-notes-4.17.2.rst |
@@ -57,9 +57,12 b' To install |RCT|, use the following step' | |||||
57 |
|
57 | |||
58 | 1. Set up a ``virtualenv`` on your local machine, see virtualenv_ instructions |
|
58 | 1. Set up a ``virtualenv`` on your local machine, see virtualenv_ instructions | |
59 | here. |
|
59 | here. | |
60 | 2. Install |RCT| using pip. Full url with token is available at https://rhodecode.com/u/#rhodecode-tools |
|
60 | 2. Install |RCT| using pip. All downloadable versions of |RCT| are available at: | |
61 |
` |
|
61 | `https://code.rhodecode.com/rhodecode-tools-ce/artifacts` | |
62 |
|
62 | |||
|
63 | Example installation:: | |||
|
64 | ||||
|
65 | pip install -I https://code.rhodecode.com/rhodecode-tools-ce/artifacts/download/0-10ac93f4-bb7d-4b97-baea-68110743dd5a.tar.gz | |||
63 |
|
66 | |||
64 | Once |RCT| is installed using these steps there are a few extra |
|
67 | Once |RCT| is installed using these steps there are a few extra | |
65 | configuration changes you can make. These are explained in more detail in the |
|
68 | configuration changes you can make. These are explained in more detail in the |
@@ -56,4 +56,4 b' Repository Shortcuts' | |||||
56 | Go to the repository settings page. |
|
56 | Go to the repository settings page. | |
57 |
|
57 | |||
58 | \--:kbd:`gO` |
|
58 | \--:kbd:`gO` | |
59 | Go to the repository permissions settings. |
|
59 | Go to the repository access permissions settings. |
@@ -99,6 +99,12 b'' | |||||
99 | "nonull": true |
|
99 | "nonull": true | |
100 | } |
|
100 | } | |
101 | }, |
|
101 | }, | |
|
102 | "uglify": { | |||
|
103 | "dist": { | |||
|
104 | "src": "<%= dirs.js.dest %>/scripts.js", | |||
|
105 | "dest": "<%= dirs.js.dest %>/scripts.min.js" | |||
|
106 | } | |||
|
107 | }, | |||
102 | "less": { |
|
108 | "less": { | |
103 | "development": { |
|
109 | "development": { | |
104 | "options": { |
|
110 | "options": { |
@@ -22,6 +22,7 b'' | |||||
22 | "grunt-contrib-less": "^1.1.0", |
|
22 | "grunt-contrib-less": "^1.1.0", | |
23 | "grunt-contrib-watch": "^0.6.1", |
|
23 | "grunt-contrib-watch": "^0.6.1", | |
24 | "grunt-webpack": "^3.1.3", |
|
24 | "grunt-webpack": "^3.1.3", | |
|
25 | "grunt-contrib-uglify": "^4.0.1", | |||
25 | "jquery": "1.11.3", |
|
26 | "jquery": "1.11.3", | |
26 | "mark.js": "8.11.1", |
|
27 | "mark.js": "8.11.1", | |
27 | "jshint": "^2.9.1-rc3", |
|
28 | "jshint": "^2.9.1-rc3", |
@@ -211,13 +211,13 b' let' | |||||
211 | sha512 = "yiUk09opTEnE1lK+tb501ENb+yQBi4p++Ep0eGJAHesVYKVMPNgPphVKkIizkDaU+n0SE+zXfTsRbYyOMDYXSg=="; |
|
211 | sha512 = "yiUk09opTEnE1lK+tb501ENb+yQBi4p++Ep0eGJAHesVYKVMPNgPphVKkIizkDaU+n0SE+zXfTsRbYyOMDYXSg=="; | |
212 | }; |
|
212 | }; | |
213 | }; |
|
213 | }; | |
214 |
"@polymer/polymer-3. |
|
214 | "@polymer/polymer-3.3.0" = { | |
215 | name = "_at_polymer_slash_polymer"; |
|
215 | name = "_at_polymer_slash_polymer"; | |
216 | packageName = "@polymer/polymer"; |
|
216 | packageName = "@polymer/polymer"; | |
217 |
version = "3. |
|
217 | version = "3.3.0"; | |
218 | src = fetchurl { |
|
218 | src = fetchurl { | |
219 |
url = "https://registry.npmjs.org/@polymer/polymer/-/polymer-3. |
|
219 | url = "https://registry.npmjs.org/@polymer/polymer/-/polymer-3.3.0.tgz"; | |
220 | sha512 = "L6uV1oM6T6xbwbVx6t3biG5T2VSSB03LxnIrUd9M2pr6RkHVPFHJ37pC5MUwBAEhkGFJif7eks7fdMMSGZTeEQ=="; |
|
220 | sha512 = "rij7suomS7DxdBamnwr/Xa0V5hpypf7I9oYKseF2FWz5Xh2a3wJNpVjgJy1adXVCxqIyPhghsrthnfCt7EblsQ=="; | |
221 | }; |
|
221 | }; | |
222 | }; |
|
222 | }; | |
223 | "@types/clone-0.1.30" = { |
|
223 | "@types/clone-0.1.30" = { | |
@@ -229,13 +229,13 b' let' | |||||
229 | sha1 = "e7365648c1b42136a59c7d5040637b3b5c83b614"; |
|
229 | sha1 = "e7365648c1b42136a59c7d5040637b3b5c83b614"; | |
230 | }; |
|
230 | }; | |
231 | }; |
|
231 | }; | |
232 |
"@types/node-6.14. |
|
232 | "@types/node-6.14.9" = { | |
233 | name = "_at_types_slash_node"; |
|
233 | name = "_at_types_slash_node"; | |
234 | packageName = "@types/node"; |
|
234 | packageName = "@types/node"; | |
235 |
version = "6.14. |
|
235 | version = "6.14.9"; | |
236 | src = fetchurl { |
|
236 | src = fetchurl { | |
237 |
url = "https://registry.npmjs.org/@types/node/-/node-6.14. |
|
237 | url = "https://registry.npmjs.org/@types/node/-/node-6.14.9.tgz"; | |
238 | sha512 = "rFs9zCFtSHuseiNXxYxFlun8ibu+jtZPgRM+2ILCmeLiGeGLiIGxuOzD+cNyHegI1GD+da3R/cIbs9+xCLp13w=="; |
|
238 | sha512 = "leP/gxHunuazPdZaCvsCefPQxinqUDsCxCR5xaDUrY2MkYxQRFZZwU5e7GojyYsGB7QVtCi7iVEl/hoFXQYc+w=="; | |
239 | }; |
|
239 | }; | |
240 | }; |
|
240 | }; | |
241 | "@types/parse5-2.2.34" = { |
|
241 | "@types/parse5-2.2.34" = { | |
@@ -409,22 +409,22 b' let' | |||||
409 | sha512 = "mJ3QKWtCchL1vhU/kZlJnLPuQZnlDOdZsyP0bbLWPGdYsQDnSBvyTLhzwBA3QAMlzEL9V4JHygEmK6/OTEyytA=="; |
|
409 | sha512 = "mJ3QKWtCchL1vhU/kZlJnLPuQZnlDOdZsyP0bbLWPGdYsQDnSBvyTLhzwBA3QAMlzEL9V4JHygEmK6/OTEyytA=="; | |
410 | }; |
|
410 | }; | |
411 | }; |
|
411 | }; | |
412 |
"@webcomponents/shadycss-1.9. |
|
412 | "@webcomponents/shadycss-1.9.2" = { | |
413 | name = "_at_webcomponents_slash_shadycss"; |
|
413 | name = "_at_webcomponents_slash_shadycss"; | |
414 | packageName = "@webcomponents/shadycss"; |
|
414 | packageName = "@webcomponents/shadycss"; | |
415 |
version = "1.9. |
|
415 | version = "1.9.2"; | |
416 | src = fetchurl { |
|
416 | src = fetchurl { | |
417 |
url = "https://registry.npmjs.org/@webcomponents/shadycss/-/shadycss-1.9. |
|
417 | url = "https://registry.npmjs.org/@webcomponents/shadycss/-/shadycss-1.9.2.tgz"; | |
418 | sha512 = "IaZOnWOKXHghqk/WfPNDRIgDBi3RsVPY2IFAw6tYiL9UBGvQRy5R6uC+Fk7qTZsReTJ0xh5MTT8yAcb3MUR4mQ=="; |
|
418 | sha512 = "GsD7RpDVrVdgC6e+D8zQia8RGNmEGQ9/qotnVPQYPrIXhGS5xSt6ZED9YmuHz3HbLqY+E54tE1EK3tjLzSCGrw=="; | |
419 | }; |
|
419 | }; | |
420 | }; |
|
420 | }; | |
421 |
"@webcomponents/webcomponentsjs-2. |
|
421 | "@webcomponents/webcomponentsjs-2.3.0" = { | |
422 | name = "_at_webcomponents_slash_webcomponentsjs"; |
|
422 | name = "_at_webcomponents_slash_webcomponentsjs"; | |
423 | packageName = "@webcomponents/webcomponentsjs"; |
|
423 | packageName = "@webcomponents/webcomponentsjs"; | |
424 |
version = "2. |
|
424 | version = "2.3.0"; | |
425 | src = fetchurl { |
|
425 | src = fetchurl { | |
426 |
url = "https://registry.npmjs.org/@webcomponents/webcomponentsjs/-/webcomponentsjs-2. |
|
426 | url = "https://registry.npmjs.org/@webcomponents/webcomponentsjs/-/webcomponentsjs-2.3.0.tgz"; | |
427 | sha512 = "5dzhUhP+h0qMiK0IWb7VNb0OGBoXO3AuI6Qi8t9PoKT50s5L1jv0xnwnLq+cFgPuTB8FLTNP8xIDmyoOsKBy9Q=="; |
|
427 | sha512 = "sR6FOrNnnncRuoJDqq9QxtRsJMbIvASw4vnJwIYKVlKO3AMc+NAr/bIQNnUiTTE9pBDTJkFpVaUdjJaRdsjmyA=="; | |
428 | }; |
|
428 | }; | |
429 | }; |
|
429 | }; | |
430 | "@xtuc/ieee754-1.2.0" = { |
|
430 | "@xtuc/ieee754-1.2.0" = { | |
@@ -499,22 +499,22 b' let' | |||||
499 | sha1 = "82ffb02b29e662ae53bdc20af15947706739c536"; |
|
499 | sha1 = "82ffb02b29e662ae53bdc20af15947706739c536"; | |
500 | }; |
|
500 | }; | |
501 | }; |
|
501 | }; | |
502 |
"ajv-6.10. |
|
502 | "ajv-6.10.2" = { | |
503 | name = "ajv"; |
|
503 | name = "ajv"; | |
504 | packageName = "ajv"; |
|
504 | packageName = "ajv"; | |
505 |
version = "6.10. |
|
505 | version = "6.10.2"; | |
506 | src = fetchurl { |
|
506 | src = fetchurl { | |
507 |
url = "https://registry.npmjs.org/ajv/-/ajv-6.10. |
|
507 | url = "https://registry.npmjs.org/ajv/-/ajv-6.10.2.tgz"; | |
508 | sha512 = "nffhOpkymDECQyR0mnsUtoCE8RlX38G0rYP+wgLWFyZuUyuuojSSvi/+euOiQBIn63whYwYVIIH1TvE3tu4OEg=="; |
|
508 | sha512 = "TXtUUEYHuaTEbLZWIKUr5pmBuhDLy+8KYtPYdcV8qC+pOZL+NKqYwvWSRrVXHn+ZmRRAu8vJTAznH7Oag6RVRw=="; | |
509 | }; |
|
509 | }; | |
510 | }; |
|
510 | }; | |
511 |
"ajv-keywords-3.4. |
|
511 | "ajv-keywords-3.4.1" = { | |
512 | name = "ajv-keywords"; |
|
512 | name = "ajv-keywords"; | |
513 | packageName = "ajv-keywords"; |
|
513 | packageName = "ajv-keywords"; | |
514 |
version = "3.4. |
|
514 | version = "3.4.1"; | |
515 | src = fetchurl { |
|
515 | src = fetchurl { | |
516 |
url = "https://registry.npmjs.org/ajv-keywords/-/ajv-keywords-3.4. |
|
516 | url = "https://registry.npmjs.org/ajv-keywords/-/ajv-keywords-3.4.1.tgz"; | |
517 | sha512 = "aUjdRFISbuFOl0EIZc+9e4FfZp0bDZgAdOOf30bJmw8VM9v84SHyVyxDfbWxpGYbdZD/9XoKxfHVNmxPkhwyGw=="; |
|
517 | sha512 = "RO1ibKvd27e6FEShVFfPALuHI3WjSVNeK5FIsmme/LYRNxjKuNj+Dt7bucLa6NdSv3JcVTyMlm9kGR84z1XpaQ=="; | |
518 | }; |
|
518 | }; | |
519 | }; |
|
519 | }; | |
520 | "align-text-0.1.4" = { |
|
520 | "align-text-0.1.4" = { | |
@@ -806,13 +806,13 b' let' | |||||
806 | sha1 = "b6bbe0b0674b9d719708ca38de8c237cb526c3d1"; |
|
806 | sha1 = "b6bbe0b0674b9d719708ca38de8c237cb526c3d1"; | |
807 | }; |
|
807 | }; | |
808 | }; |
|
808 | }; | |
809 |
"async-2.6. |
|
809 | "async-2.6.3" = { | |
810 | name = "async"; |
|
810 | name = "async"; | |
811 | packageName = "async"; |
|
811 | packageName = "async"; | |
812 |
version = "2.6. |
|
812 | version = "2.6.3"; | |
813 | src = fetchurl { |
|
813 | src = fetchurl { | |
814 |
url = "https://registry.npmjs.org/async/-/async-2.6. |
|
814 | url = "https://registry.npmjs.org/async/-/async-2.6.3.tgz"; | |
815 | sha512 = "H1qVYh1MYhEEFLsP97cVKqCGo7KfCyTt6uEWqsTBr9SO84oK9Uwbyd/yCW+6rKJLHksBNUVWZDAjfS+Ccx0Bbg=="; |
|
815 | sha512 = "zflvls11DCy+dQWzTW2dzuilv8Z5X/pjfmZOWba6TNIVDm+2UDaJmXSOXlasHKfNBs8oo3M0aT50fDEWfKZjXg=="; | |
816 | }; |
|
816 | }; | |
817 | }; |
|
817 | }; | |
818 | "async-each-1.0.3" = { |
|
818 | "async-each-1.0.3" = { | |
@@ -1400,13 +1400,13 b' let' | |||||
1400 | sha512 = "5T6P4xPgpp0YDFvSWwEZ4NoE3aM4QBQXDzmVbraCkFj8zHM+mba8SyqB5DbZWyR7mYHo6Y7BdQo3MoA4m0TeQg=="; |
|
1400 | sha512 = "5T6P4xPgpp0YDFvSWwEZ4NoE3aM4QBQXDzmVbraCkFj8zHM+mba8SyqB5DbZWyR7mYHo6Y7BdQo3MoA4m0TeQg=="; | |
1401 | }; |
|
1401 | }; | |
1402 | }; |
|
1402 | }; | |
1403 |
"base64-js-1.3. |
|
1403 | "base64-js-1.3.1" = { | |
1404 | name = "base64-js"; |
|
1404 | name = "base64-js"; | |
1405 | packageName = "base64-js"; |
|
1405 | packageName = "base64-js"; | |
1406 |
version = "1.3. |
|
1406 | version = "1.3.1"; | |
1407 | src = fetchurl { |
|
1407 | src = fetchurl { | |
1408 |
url = "https://registry.npmjs.org/base64-js/-/base64-js-1.3. |
|
1408 | url = "https://registry.npmjs.org/base64-js/-/base64-js-1.3.1.tgz"; | |
1409 | sha512 = "ccav/yGvoa80BQDljCxsmmQ3Xvx60/UpBIij5QN21W3wBi/hhIC9OoO+KLpu9IJTS9j4DRVJ3aDDF9cMSoa2lw=="; |
|
1409 | sha512 = "mLQ4i2QO1ytvGWFWmcngKO//JXAQueZvwEKtjgQFM4jIK0kU+ytMfplL8j+n5mspOfjHwoAg+9yhb7BwAHm36g=="; | |
1410 | }; |
|
1410 | }; | |
1411 | }; |
|
1411 | }; | |
1412 | "bcrypt-pbkdf-1.0.2" = { |
|
1412 | "bcrypt-pbkdf-1.0.2" = { | |
@@ -1445,13 +1445,13 b' let' | |||||
1445 | sha512 = "Un7MIEDdUC5gNpcGDV97op1Ywk748MpHcFTHoYs6qnj1Z3j7I53VG3nwZhKzoBZmbdRNnb6WRdFlwl7tSDuZGw=="; |
|
1445 | sha512 = "Un7MIEDdUC5gNpcGDV97op1Ywk748MpHcFTHoYs6qnj1Z3j7I53VG3nwZhKzoBZmbdRNnb6WRdFlwl7tSDuZGw=="; | |
1446 | }; |
|
1446 | }; | |
1447 | }; |
|
1447 | }; | |
1448 |
"bluebird-3. |
|
1448 | "bluebird-3.7.1" = { | |
1449 | name = "bluebird"; |
|
1449 | name = "bluebird"; | |
1450 | packageName = "bluebird"; |
|
1450 | packageName = "bluebird"; | |
1451 |
version = "3. |
|
1451 | version = "3.7.1"; | |
1452 | src = fetchurl { |
|
1452 | src = fetchurl { | |
1453 |
url = "https://registry.npmjs.org/bluebird/-/bluebird-3. |
|
1453 | url = "https://registry.npmjs.org/bluebird/-/bluebird-3.7.1.tgz"; | |
1454 | sha512 = "FG+nFEZChJrbQ9tIccIfZJBz3J7mLrAhxakAbnrJWn8d7aKOC+LWifa0G+p4ZqKp4y13T7juYvdhq9NzKdsrjw=="; |
|
1454 | sha512 = "DdmyoGCleJnkbp3nkbxTLJ18rjDsE4yCggEwKNXkeV123sPNfOCYeDoeuOY+F2FrSjO1YXcTU+dsy96KMy+gcg=="; | |
1455 | }; |
|
1455 | }; | |
1456 | }; |
|
1456 | }; | |
1457 | "bn.js-4.11.8" = { |
|
1457 | "bn.js-4.11.8" = { | |
@@ -1670,22 +1670,22 b' let' | |||||
1670 | sha1 = "b534e7c734c4f81ec5fbe8aca2ad24354b962c6c"; |
|
1670 | sha1 = "b534e7c734c4f81ec5fbe8aca2ad24354b962c6c"; | |
1671 | }; |
|
1671 | }; | |
1672 | }; |
|
1672 | }; | |
1673 |
"caniuse-db-1.0.30000 |
|
1673 | "caniuse-db-1.0.30001006" = { | |
1674 | name = "caniuse-db"; |
|
1674 | name = "caniuse-db"; | |
1675 | packageName = "caniuse-db"; |
|
1675 | packageName = "caniuse-db"; | |
1676 |
version = "1.0.30000 |
|
1676 | version = "1.0.30001006"; | |
1677 | src = fetchurl { |
|
1677 | src = fetchurl { | |
1678 |
url = "https://registry.npmjs.org/caniuse-db/-/caniuse-db-1.0.30000 |
|
1678 | url = "https://registry.npmjs.org/caniuse-db/-/caniuse-db-1.0.30001006.tgz"; | |
1679 | sha512 = "70gk6cLSD5rItxnZ7WUxyCpM9LAjEb1tVzlENQfXQXZS/IiGnfAC6u32G5cZFlDBKjNPBIta/QSx5CZLZepxRA=="; |
|
1679 | sha512 = "Xn25grc0GXATFnnEX+KP3IwEv6ZdHs4CALyLKvK8pBeeBe+hSpqy3/GyKBgEp4hn6o+bI+GNeNeQBf9PBOK0EQ=="; | |
1680 | }; |
|
1680 | }; | |
1681 | }; |
|
1681 | }; | |
1682 |
"caniuse-lite-1.0.30000 |
|
1682 | "caniuse-lite-1.0.30001006" = { | |
1683 | name = "caniuse-lite"; |
|
1683 | name = "caniuse-lite"; | |
1684 | packageName = "caniuse-lite"; |
|
1684 | packageName = "caniuse-lite"; | |
1685 |
version = "1.0.30000 |
|
1685 | version = "1.0.30001006"; | |
1686 | src = fetchurl { |
|
1686 | src = fetchurl { | |
1687 |
url = "https://registry.npmjs.org/caniuse-lite/-/caniuse-lite-1.0.30000 |
|
1687 | url = "https://registry.npmjs.org/caniuse-lite/-/caniuse-lite-1.0.30001006.tgz"; | |
1688 | sha512 = "rUBIbap+VJfxTzrM4akJ00lkvVb5/n5v3EGXfWzSH5zT8aJmGzjA8HWhJ4U6kCpzxozUSnB+yvAYDRPY6mRpgQ=="; |
|
1688 | sha512 = "MXnUVX27aGs/QINz+QG1sWSLDr3P1A3Hq5EUWoIt0T7K24DuvMxZEnh3Y5aHlJW6Bz2aApJdSewdYLd8zQnUuw=="; | |
1689 | }; |
|
1689 | }; | |
1690 | }; |
|
1690 | }; | |
1691 | "caseless-0.12.0" = { |
|
1691 | "caseless-0.12.0" = { | |
@@ -1733,31 +1733,31 b' let' | |||||
1733 | sha512 = "Mti+f9lpJNcwF4tWV8/OrTTtF1gZi+f8FqlyAdouralcFWFQWF2+NgCHShjkCb+IFBLq9buZwE1xckQU4peSuQ=="; |
|
1733 | sha512 = "Mti+f9lpJNcwF4tWV8/OrTTtF1gZi+f8FqlyAdouralcFWFQWF2+NgCHShjkCb+IFBLq9buZwE1xckQU4peSuQ=="; | |
1734 | }; |
|
1734 | }; | |
1735 | }; |
|
1735 | }; | |
1736 |
"chokidar-2.1. |
|
1736 | "chokidar-2.1.8" = { | |
1737 | name = "chokidar"; |
|
1737 | name = "chokidar"; | |
1738 | packageName = "chokidar"; |
|
1738 | packageName = "chokidar"; | |
1739 |
version = "2.1. |
|
1739 | version = "2.1.8"; | |
1740 | src = fetchurl { |
|
1740 | src = fetchurl { | |
1741 |
url = "https://registry.npmjs.org/chokidar/-/chokidar-2.1. |
|
1741 | url = "https://registry.npmjs.org/chokidar/-/chokidar-2.1.8.tgz"; | |
1742 | sha512 = "i0TprVWp+Kj4WRPtInjexJ8Q+BqTE909VpH8xVhXrJkoc5QC8VO9TryGOqTr+2hljzc1sC62t22h5tZePodM/A=="; |
|
1742 | sha512 = "ZmZUazfOzf0Nve7duiCKD23PFSCs4JPoYyccjUFF3aQkQadqBhfzhjkwBH2mNOG9cTBwhamM37EIsIkZw3nRgg=="; | |
1743 | }; |
|
1743 | }; | |
1744 | }; |
|
1744 | }; | |
1745 |
"chownr-1.1. |
|
1745 | "chownr-1.1.3" = { | |
1746 | name = "chownr"; |
|
1746 | name = "chownr"; | |
1747 | packageName = "chownr"; |
|
1747 | packageName = "chownr"; | |
1748 |
version = "1.1. |
|
1748 | version = "1.1.3"; | |
1749 | src = fetchurl { |
|
1749 | src = fetchurl { | |
1750 |
url = "https://registry.npmjs.org/chownr/-/chownr-1.1. |
|
1750 | url = "https://registry.npmjs.org/chownr/-/chownr-1.1.3.tgz"; | |
1751 | sha512 = "j38EvO5+LHX84jlo6h4UzmOwi0UgW61WRyPtJz4qaadK5eY3BTS5TY/S1Stc3Uk2lIM6TPevAlULiEJwie860g=="; |
|
1751 | sha512 = "i70fVHhmV3DtTl6nqvZOnIjbY0Pe4kAUjwHj8z0zAdgBtYrJyYwLKCCuRBQ5ppkyL0AkN7HKRnETdmdp1zqNXw=="; | |
1752 | }; |
|
1752 | }; | |
1753 | }; |
|
1753 | }; | |
1754 |
"chrome-trace-event-1.0. |
|
1754 | "chrome-trace-event-1.0.2" = { | |
1755 | name = "chrome-trace-event"; |
|
1755 | name = "chrome-trace-event"; | |
1756 | packageName = "chrome-trace-event"; |
|
1756 | packageName = "chrome-trace-event"; | |
1757 |
version = "1.0. |
|
1757 | version = "1.0.2"; | |
1758 | src = fetchurl { |
|
1758 | src = fetchurl { | |
1759 |
url = "https://registry.npmjs.org/chrome-trace-event/-/chrome-trace-event-1.0. |
|
1759 | url = "https://registry.npmjs.org/chrome-trace-event/-/chrome-trace-event-1.0.2.tgz"; | |
1760 | sha512 = "xDbVgyfDTT2piup/h8dK/y4QZfJRSa73bw1WZ8b4XM1o7fsFubUVGYcE+1ANtOzJJELGpYoG2961z0Z6OAld9A=="; |
|
1760 | sha512 = "9e/zx1jw7B4CO+c/RXoCsfg/x1AfUBioy4owYH0bJprEYAx5hRFLRhWBqHAG57D0ZM4H7vxbP7bPe0VwhQRYDQ=="; | |
1761 | }; |
|
1761 | }; | |
1762 | }; |
|
1762 | }; | |
1763 | "cipher-base-1.0.4" = { |
|
1763 | "cipher-base-1.0.4" = { | |
@@ -1958,13 +1958,13 b' let' | |||||
1958 | sha1 = "168a4701756b6a7f51a12ce0c97bfa28c084ed63"; |
|
1958 | sha1 = "168a4701756b6a7f51a12ce0c97bfa28c084ed63"; | |
1959 | }; |
|
1959 | }; | |
1960 | }; |
|
1960 | }; | |
1961 |
"colors-1. |
|
1961 | "colors-1.4.0" = { | |
1962 | name = "colors"; |
|
1962 | name = "colors"; | |
1963 | packageName = "colors"; |
|
1963 | packageName = "colors"; | |
1964 |
version = "1. |
|
1964 | version = "1.4.0"; | |
1965 | src = fetchurl { |
|
1965 | src = fetchurl { | |
1966 |
url = "https://registry.npmjs.org/colors/-/colors-1. |
|
1966 | url = "https://registry.npmjs.org/colors/-/colors-1.4.0.tgz"; | |
1967 | sha512 = "mmGt/1pZqYRjMxB1axhTo16/snVZ5krrKkcmMeVKxzECMMXoCgnvTPp10QgHfcbQZw8Dq2jMNG6je4JlWU0gWg=="; |
|
1967 | sha512 = "a+UqTh4kgZg/SlGvfbzDHpgRu7AAQOmmqRHJnxhRZICKFUT91brVhNNt58CMWU9PsBbv3PDCZUHbVxuDiH2mtA=="; | |
1968 | }; |
|
1968 | }; | |
1969 | }; |
|
1969 | }; | |
1970 | "combined-stream-1.0.8" = { |
|
1970 | "combined-stream-1.0.8" = { | |
@@ -2003,6 +2003,15 b' let' | |||||
2003 | sha512 = "6tvAOO+D6OENvRAh524Dh9jcfKTYDQAqvqezbCW82xj5X0pSrcpxtvRKHLG0yBY6SD7PSDrJaj+0AiOcKVd1Xg=="; |
|
2003 | sha512 = "6tvAOO+D6OENvRAh524Dh9jcfKTYDQAqvqezbCW82xj5X0pSrcpxtvRKHLG0yBY6SD7PSDrJaj+0AiOcKVd1Xg=="; | |
2004 | }; |
|
2004 | }; | |
2005 | }; |
|
2005 | }; | |
|
2006 | "commander-2.20.3" = { | |||
|
2007 | name = "commander"; | |||
|
2008 | packageName = "commander"; | |||
|
2009 | version = "2.20.3"; | |||
|
2010 | src = fetchurl { | |||
|
2011 | url = "https://registry.npmjs.org/commander/-/commander-2.20.3.tgz"; | |||
|
2012 | sha512 = "GpVkmM8vF2vQUkj2LvZmD35JxeJOLCwJ9cUkugyk2nuhbv3+mJvpLYYt+0+USMxE+oj+ey/lJEnhZw75x/OMcQ=="; | |||
|
2013 | }; | |||
|
2014 | }; | |||
2006 | "commondir-1.0.1" = { |
|
2015 | "commondir-1.0.1" = { | |
2007 | name = "commondir"; |
|
2016 | name = "commondir"; | |
2008 | packageName = "commondir"; |
|
2017 | packageName = "commondir"; | |
@@ -2093,13 +2102,13 b' let' | |||||
2093 | sha512 = "Y+SQCF+0NoWQryez2zXn5J5knmr9z/9qSQt7fbL78u83rxmigOy8X5+BFn8CFSuX+nKT8gpYwJX68ekqtQt6ZA=="; |
|
2102 | sha512 = "Y+SQCF+0NoWQryez2zXn5J5knmr9z/9qSQt7fbL78u83rxmigOy8X5+BFn8CFSuX+nKT8gpYwJX68ekqtQt6ZA=="; | |
2094 | }; |
|
2103 | }; | |
2095 | }; |
|
2104 | }; | |
2096 |
"core-js-2.6. |
|
2105 | "core-js-2.6.10" = { | |
2097 | name = "core-js"; |
|
2106 | name = "core-js"; | |
2098 | packageName = "core-js"; |
|
2107 | packageName = "core-js"; | |
2099 |
version = "2.6. |
|
2108 | version = "2.6.10"; | |
2100 | src = fetchurl { |
|
2109 | src = fetchurl { | |
2101 |
url = "https://registry.npmjs.org/core-js/-/core-js-2.6. |
|
2110 | url = "https://registry.npmjs.org/core-js/-/core-js-2.6.10.tgz"; | |
2102 | sha512 = "klh/kDpwX8hryYL14M9w/xei6vrv6sE8gTHDG7/T/+SEovB/G4ejwcfE/CBzO6Edsu+OETZMZ3wcX/EjUkrl5A=="; |
|
2111 | sha512 = "I39t74+4t+zau64EN1fE5v2W31Adtc/REhzWN+gWRRXg6WH5qAsZm62DHpQ1+Yhe4047T55jvzz7MUqF/dBBlA=="; | |
2103 | }; |
|
2112 | }; | |
2104 | }; |
|
2113 | }; | |
2105 | "core-util-is-1.0.2" = { |
|
2114 | "core-util-is-1.0.2" = { | |
@@ -2237,13 +2246,13 b' let' | |||||
2237 | sha1 = "ddd52c587033f49e94b71fc55569f252e8ff5f85"; |
|
2246 | sha1 = "ddd52c587033f49e94b71fc55569f252e8ff5f85"; | |
2238 | }; |
|
2247 | }; | |
2239 | }; |
|
2248 | }; | |
2240 |
"cyclist-0. |
|
2249 | "cyclist-1.0.1" = { | |
2241 | name = "cyclist"; |
|
2250 | name = "cyclist"; | |
2242 | packageName = "cyclist"; |
|
2251 | packageName = "cyclist"; | |
2243 |
version = "0. |
|
2252 | version = "1.0.1"; | |
2244 | src = fetchurl { |
|
2253 | src = fetchurl { | |
2245 |
url = "https://registry.npmjs.org/cyclist/-/cyclist-0. |
|
2254 | url = "https://registry.npmjs.org/cyclist/-/cyclist-1.0.1.tgz"; | |
2246 | sha1 = "1b33792e11e914a2fd6d6ed6447464444e5fa640"; |
|
2255 | sha1 = "596e9698fd0c80e12038c2b82d6eb1b35b6224d9"; | |
2247 | }; |
|
2256 | }; | |
2248 | }; |
|
2257 | }; | |
2249 | "dashdash-1.14.1" = { |
|
2258 | "dashdash-1.14.1" = { | |
@@ -2435,13 +2444,13 b' let' | |||||
2435 | sha512 = "gd3ypIPfOMr9h5jIKq8E3sHOTCjeirnl0WK5ZdS1AW0Odt0b1PaWaHdJ4Qk4klv+YB9aJBS7mESXjFoDQPu6DA=="; |
|
2444 | sha512 = "gd3ypIPfOMr9h5jIKq8E3sHOTCjeirnl0WK5ZdS1AW0Odt0b1PaWaHdJ4Qk4klv+YB9aJBS7mESXjFoDQPu6DA=="; | |
2436 | }; |
|
2445 | }; | |
2437 | }; |
|
2446 | }; | |
2438 |
"dom-serializer-0. |
|
2447 | "dom-serializer-0.2.1" = { | |
2439 | name = "dom-serializer"; |
|
2448 | name = "dom-serializer"; | |
2440 | packageName = "dom-serializer"; |
|
2449 | packageName = "dom-serializer"; | |
2441 |
version = "0. |
|
2450 | version = "0.2.1"; | |
2442 | src = fetchurl { |
|
2451 | src = fetchurl { | |
2443 |
url = "https://registry.npmjs.org/dom-serializer/-/dom-serializer-0. |
|
2452 | url = "https://registry.npmjs.org/dom-serializer/-/dom-serializer-0.2.1.tgz"; | |
2444 | sha512 = "l0IU0pPzLWSHBcieZbpOKgkIn3ts3vAh7ZuFyXNwJxJXk/c4Gwj9xaTJwIDVQCXawWD0qb3IzMGH5rglQaO0XA=="; |
|
2453 | sha512 = "sK3ujri04WyjwQXVoK4PU3y8ula1stq10GJZpqHIUgoGZdsGzAGu65BnU3d08aTVSvO7mGPZUc0wTEDL+qGE0Q=="; | |
2445 | }; |
|
2454 | }; | |
2446 | }; |
|
2455 | }; | |
2447 | "dom5-2.3.0" = { |
|
2456 | "dom5-2.3.0" = { | |
@@ -2471,6 +2480,15 b' let' | |||||
2471 | sha512 = "BSKB+TSpMpFI/HOxCNr1O8aMOTZ8hT3pM3GQ0w/mWRmkhEDSFJkkyzz4XQsBV44BChwGkrDfMyjVD0eA2aFV3w=="; |
|
2480 | sha512 = "BSKB+TSpMpFI/HOxCNr1O8aMOTZ8hT3pM3GQ0w/mWRmkhEDSFJkkyzz4XQsBV44BChwGkrDfMyjVD0eA2aFV3w=="; | |
2472 | }; |
|
2481 | }; | |
2473 | }; |
|
2482 | }; | |
|
2483 | "domelementtype-2.0.1" = { | |||
|
2484 | name = "domelementtype"; | |||
|
2485 | packageName = "domelementtype"; | |||
|
2486 | version = "2.0.1"; | |||
|
2487 | src = fetchurl { | |||
|
2488 | url = "https://registry.npmjs.org/domelementtype/-/domelementtype-2.0.1.tgz"; | |||
|
2489 | sha512 = "5HOHUDsYZWV8FGWN0Njbr/Rn7f/eWSQi1v7+HsUVwXgn8nWWlL64zKDkS0n8ZmQ3mlWOMuXOnR+7Nx/5tMO5AQ=="; | |||
|
2490 | }; | |||
|
2491 | }; | |||
2474 | "domhandler-2.3.0" = { |
|
2492 | "domhandler-2.3.0" = { | |
2475 | name = "domhandler"; |
|
2493 | name = "domhandler"; | |
2476 | packageName = "domhandler"; |
|
2494 | packageName = "domhandler"; | |
@@ -2498,6 +2516,15 b' let' | |||||
2498 | sha512 = "3VduRWLxx9hbVr42QieQN25mx/I61/mRdUSuxAmDGdDqZIN8qtP7tcKMa3KfpJjuGjOJGYYUzzeq6eGDnkzesA=="; |
|
2516 | sha512 = "3VduRWLxx9hbVr42QieQN25mx/I61/mRdUSuxAmDGdDqZIN8qtP7tcKMa3KfpJjuGjOJGYYUzzeq6eGDnkzesA=="; | |
2499 | }; |
|
2517 | }; | |
2500 | }; |
|
2518 | }; | |
|
2519 | "duplexer-0.1.1" = { | |||
|
2520 | name = "duplexer"; | |||
|
2521 | packageName = "duplexer"; | |||
|
2522 | version = "0.1.1"; | |||
|
2523 | src = fetchurl { | |||
|
2524 | url = "https://registry.npmjs.org/duplexer/-/duplexer-0.1.1.tgz"; | |||
|
2525 | sha1 = "ace6ff808c1ce66b57d1ebf97977acb02334cfc1"; | |||
|
2526 | }; | |||
|
2527 | }; | |||
2501 | "duplexify-3.7.1" = { |
|
2528 | "duplexify-3.7.1" = { | |
2502 | name = "duplexify"; |
|
2529 | name = "duplexify"; | |
2503 | packageName = "duplexify"; |
|
2530 | packageName = "duplexify"; | |
@@ -2516,22 +2543,22 b' let' | |||||
2516 | sha1 = "3a83a904e54353287874c564b7549386849a98c9"; |
|
2543 | sha1 = "3a83a904e54353287874c564b7549386849a98c9"; | |
2517 | }; |
|
2544 | }; | |
2518 | }; |
|
2545 | }; | |
2519 |
"electron-to-chromium-1.3. |
|
2546 | "electron-to-chromium-1.3.302" = { | |
2520 | name = "electron-to-chromium"; |
|
2547 | name = "electron-to-chromium"; | |
2521 | packageName = "electron-to-chromium"; |
|
2548 | packageName = "electron-to-chromium"; | |
2522 |
version = "1.3. |
|
2549 | version = "1.3.302"; | |
2523 | src = fetchurl { |
|
2550 | src = fetchurl { | |
2524 |
url = "https://registry.npmjs.org/electron-to-chromium/-/electron-to-chromium-1.3. |
|
2551 | url = "https://registry.npmjs.org/electron-to-chromium/-/electron-to-chromium-1.3.302.tgz"; | |
2525 | sha512 = "lyoC8aoqbbDqsprb6aPdt9n3DpOZZzdz/T4IZKsR0/dkZIxnJVUjjcpOSwA66jPRIOyDAamCTAUqweU05kKNSg=="; |
|
2552 | sha512 = "1qConyiVEbj4xZRBXqtGR003+9tV0rJF0PS6aeO0Ln/UL637js9hdwweCl07meh/kJoI2N4W8q3R3g3F5z46ww=="; | |
2526 | }; |
|
2553 | }; | |
2527 | }; |
|
2554 | }; | |
2528 |
"elliptic-6. |
|
2555 | "elliptic-6.5.1" = { | |
2529 | name = "elliptic"; |
|
2556 | name = "elliptic"; | |
2530 | packageName = "elliptic"; |
|
2557 | packageName = "elliptic"; | |
2531 |
version = "6. |
|
2558 | version = "6.5.1"; | |
2532 | src = fetchurl { |
|
2559 | src = fetchurl { | |
2533 |
url = "https://registry.npmjs.org/elliptic/-/elliptic-6. |
|
2560 | url = "https://registry.npmjs.org/elliptic/-/elliptic-6.5.1.tgz"; | |
2534 | sha512 = "BsXLz5sqX8OHcsh7CqBMztyXARmGQ3LWPtGjJi6DiJHq5C/qvi9P3OqgswKSDftbu8+IoI/QDTAm2fFnQ9SZSQ=="; |
|
2561 | sha512 = "xvJINNLbTeWQjrl6X+7eQCrIy/YPv5XCpKW6kB5mKvtnGILoLDcySuwomfdzt0BMdLNVnuRNTuzKNHj0bva1Cg=="; | |
2535 | }; |
|
2562 | }; | |
2536 | }; |
|
2563 | }; | |
2537 | "emojis-list-2.1.0" = { |
|
2564 | "emojis-list-2.1.0" = { | |
@@ -2543,13 +2570,13 b' let' | |||||
2543 | sha1 = "4daa4d9db00f9819880c79fa457ae5b09a1fd389"; |
|
2570 | sha1 = "4daa4d9db00f9819880c79fa457ae5b09a1fd389"; | |
2544 | }; |
|
2571 | }; | |
2545 | }; |
|
2572 | }; | |
2546 |
"end-of-stream-1.4. |
|
2573 | "end-of-stream-1.4.4" = { | |
2547 | name = "end-of-stream"; |
|
2574 | name = "end-of-stream"; | |
2548 | packageName = "end-of-stream"; |
|
2575 | packageName = "end-of-stream"; | |
2549 |
version = "1.4. |
|
2576 | version = "1.4.4"; | |
2550 | src = fetchurl { |
|
2577 | src = fetchurl { | |
2551 |
url = "https://registry.npmjs.org/end-of-stream/-/end-of-stream-1.4. |
|
2578 | url = "https://registry.npmjs.org/end-of-stream/-/end-of-stream-1.4.4.tgz"; | |
2552 | sha512 = "1MkrZNvWTKCaigbn+W15elq2BB/L22nqrSY5DKlo3X6+vclJm8Bb5djXJBmEX6fS3+zCh/F4VBK5Z2KxJt4s2Q=="; |
|
2579 | sha512 = "+uw1inIHVPQoaVuHzRyXd21icM+cnt4CzD5rW+NC1wjOUSTOs+Te7FOv7AhN7vS9x/oIyhLP5PR1H+phQAHu5Q=="; | |
2553 | }; |
|
2580 | }; | |
2554 | }; |
|
2581 | }; | |
2555 | "enhanced-resolve-3.4.1" = { |
|
2582 | "enhanced-resolve-3.4.1" = { | |
@@ -2561,13 +2588,13 b' let' | |||||
2561 | sha1 = "0421e339fd71419b3da13d129b3979040230476e"; |
|
2588 | sha1 = "0421e339fd71419b3da13d129b3979040230476e"; | |
2562 | }; |
|
2589 | }; | |
2563 | }; |
|
2590 | }; | |
2564 |
"enhanced-resolve-4.1. |
|
2591 | "enhanced-resolve-4.1.1" = { | |
2565 | name = "enhanced-resolve"; |
|
2592 | name = "enhanced-resolve"; | |
2566 | packageName = "enhanced-resolve"; |
|
2593 | packageName = "enhanced-resolve"; | |
2567 |
version = "4.1. |
|
2594 | version = "4.1.1"; | |
2568 | src = fetchurl { |
|
2595 | src = fetchurl { | |
2569 |
url = "https://registry.npmjs.org/enhanced-resolve/-/enhanced-resolve-4.1. |
|
2596 | url = "https://registry.npmjs.org/enhanced-resolve/-/enhanced-resolve-4.1.1.tgz"; | |
2570 | sha512 = "F/7vkyTtyc/llOIn8oWclcB25KdRaiPBpZYDgJHgh/UHtpgT2p2eldQgtQnLtUvfMKPKxbRaQM/hHkvLHt1Vng=="; |
|
2597 | sha512 = "98p2zE+rL7/g/DzMHMTF4zZlCgeVdJ7yr6xzEpJRYwFYrGi9ANdn5DnJURg6RpBkyk60XYDnWIv51VfIhfNGuA=="; | |
2571 | }; |
|
2598 | }; | |
2572 | }; |
|
2599 | }; | |
2573 | "entities-1.0.0" = { |
|
2600 | "entities-1.0.0" = { | |
@@ -2579,13 +2606,13 b' let' | |||||
2579 | sha1 = "b2987aa3821347fcde642b24fdfc9e4fb712bf26"; |
|
2606 | sha1 = "b2987aa3821347fcde642b24fdfc9e4fb712bf26"; | |
2580 | }; |
|
2607 | }; | |
2581 | }; |
|
2608 | }; | |
2582 |
"entities- |
|
2609 | "entities-2.0.0" = { | |
2583 | name = "entities"; |
|
2610 | name = "entities"; | |
2584 | packageName = "entities"; |
|
2611 | packageName = "entities"; | |
2585 |
version = " |
|
2612 | version = "2.0.0"; | |
2586 | src = fetchurl { |
|
2613 | src = fetchurl { | |
2587 |
url = "https://registry.npmjs.org/entities/-/entities- |
|
2614 | url = "https://registry.npmjs.org/entities/-/entities-2.0.0.tgz"; | |
2588 | sha512 = "f2LZMYl1Fzu7YSBKg+RoROelpOaNrcGmE9AZubeDfrCEia483oW4MI4VyFd5VNHIgQ/7qm1I0wUHK1eJnn2y2w=="; |
|
2615 | sha512 = "D9f7V0JSRwIxlRI2mjMqufDrRDnx8p+eEOz7aUM9SuvF8gsBzra0/6tbjl1m8eQHrZlYj6PxqE00hZ1SAIKPLw=="; | |
2589 | }; |
|
2616 | }; | |
2590 | }; |
|
2617 | }; | |
2591 | "errno-0.1.7" = { |
|
2618 | "errno-0.1.7" = { | |
@@ -2597,13 +2624,13 b' let' | |||||
2597 | sha512 = "MfrRBDWzIWifgq6tJj60gkAwtLNb6sQPlcFrSOflcP1aFmmruKQ2wRnze/8V6kgyz7H3FF8Npzv78mZ7XLLflg=="; |
|
2624 | sha512 = "MfrRBDWzIWifgq6tJj60gkAwtLNb6sQPlcFrSOflcP1aFmmruKQ2wRnze/8V6kgyz7H3FF8Npzv78mZ7XLLflg=="; | |
2598 | }; |
|
2625 | }; | |
2599 | }; |
|
2626 | }; | |
2600 |
"es-abstract-1.1 |
|
2627 | "es-abstract-1.16.0" = { | |
2601 | name = "es-abstract"; |
|
2628 | name = "es-abstract"; | |
2602 | packageName = "es-abstract"; |
|
2629 | packageName = "es-abstract"; | |
2603 |
version = "1.1 |
|
2630 | version = "1.16.0"; | |
2604 | src = fetchurl { |
|
2631 | src = fetchurl { | |
2605 |
url = "https://registry.npmjs.org/es-abstract/-/es-abstract-1.1 |
|
2632 | url = "https://registry.npmjs.org/es-abstract/-/es-abstract-1.16.0.tgz"; | |
2606 | sha512 = "vDZfg/ykNxQVwup/8E1BZhVzFfBxs9NqMzGcvIJrqg5k2/5Za2bWo40dK2J1pgLngZ7c+Shh8lwYtLGyrwPutg=="; |
|
2633 | sha512 = "xdQnfykZ9JMEiasTAJZJdMWCQ1Vm00NBw79/AWi7ELfZuuPCSOMDZbT9mkOfSctVtfhb+sAAzrm+j//GjjLHLg=="; | |
2607 | }; |
|
2634 | }; | |
2608 | }; |
|
2635 | }; | |
2609 | "es-to-primitive-1.2.0" = { |
|
2636 | "es-to-primitive-1.2.0" = { | |
@@ -2687,22 +2714,22 b' let' | |||||
2687 | sha512 = "64RBB++fIOAXPw3P9cy89qfMlvZEXZkqqJkjqqXIvzP5ezRZjW+lPWjw35UX/3EhUPFYbg5ER4JYgDw4007/DQ=="; |
|
2714 | sha512 = "64RBB++fIOAXPw3P9cy89qfMlvZEXZkqqJkjqqXIvzP5ezRZjW+lPWjw35UX/3EhUPFYbg5ER4JYgDw4007/DQ=="; | |
2688 | }; |
|
2715 | }; | |
2689 | }; |
|
2716 | }; | |
2690 |
"estraverse-4. |
|
2717 | "estraverse-4.3.0" = { | |
2691 | name = "estraverse"; |
|
2718 | name = "estraverse"; | |
2692 | packageName = "estraverse"; |
|
2719 | packageName = "estraverse"; | |
2693 |
version = "4. |
|
2720 | version = "4.3.0"; | |
2694 | src = fetchurl { |
|
2721 | src = fetchurl { | |
2695 |
url = "https://registry.npmjs.org/estraverse/-/estraverse-4. |
|
2722 | url = "https://registry.npmjs.org/estraverse/-/estraverse-4.3.0.tgz"; | |
2696 | sha1 = "0dee3fed31fcd469618ce7342099fc1afa0bdb13"; |
|
2723 | sha512 = "39nnKffWz8xN1BU/2c79n9nB9HDzo0niYUqx6xyqUnyoAnQyyWpOTdZEeiCch8BBu515t4wp9ZmgVfVhn9EBpw=="; | |
2697 | }; |
|
2724 | }; | |
2698 | }; |
|
2725 | }; | |
2699 |
"esutils-2.0. |
|
2726 | "esutils-2.0.3" = { | |
2700 | name = "esutils"; |
|
2727 | name = "esutils"; | |
2701 | packageName = "esutils"; |
|
2728 | packageName = "esutils"; | |
2702 |
version = "2.0. |
|
2729 | version = "2.0.3"; | |
2703 | src = fetchurl { |
|
2730 | src = fetchurl { | |
2704 |
url = "https://registry.npmjs.org/esutils/-/esutils-2.0. |
|
2731 | url = "https://registry.npmjs.org/esutils/-/esutils-2.0.3.tgz"; | |
2705 | sha1 = "0abf4f1caa5bcb1f7a9d8acc6dea4faaa04bac9b"; |
|
2732 | sha512 = "kVscqXk4OCp68SZ0dkgEKVi6/8ij300KBWTJq32P/dYeWTSwK41WyTxalN1eRmA5Z9UU/LX9D7FWSmV9SAYx6g=="; | |
2706 | }; |
|
2733 | }; | |
2707 | }; |
|
2734 | }; | |
2708 | "eventemitter2-0.4.14" = { |
|
2735 | "eventemitter2-0.4.14" = { | |
@@ -2867,6 +2894,15 b' let' | |||||
2867 | sha1 = "c14c5b3bf14d7417ffbfd990c0a7495cd9f337bc"; |
|
2894 | sha1 = "c14c5b3bf14d7417ffbfd990c0a7495cd9f337bc"; | |
2868 | }; |
|
2895 | }; | |
2869 | }; |
|
2896 | }; | |
|
2897 | "figures-1.7.0" = { | |||
|
2898 | name = "figures"; | |||
|
2899 | packageName = "figures"; | |||
|
2900 | version = "1.7.0"; | |||
|
2901 | src = fetchurl { | |||
|
2902 | url = "https://registry.npmjs.org/figures/-/figures-1.7.0.tgz"; | |||
|
2903 | sha1 = "cbe1e3affcf1cd44b80cadfed28dc793a9701d2e"; | |||
|
2904 | }; | |||
|
2905 | }; | |||
2870 | "file-sync-cmp-0.1.1" = { |
|
2906 | "file-sync-cmp-0.1.1" = { | |
2871 | name = "file-sync-cmp"; |
|
2907 | name = "file-sync-cmp"; | |
2872 | packageName = "file-sync-cmp"; |
|
2908 | packageName = "file-sync-cmp"; | |
@@ -2948,13 +2984,13 b' let' | |||||
2948 | sha512 = "lNaHNVymajmk0OJMBn8fVUAU1BtDeKIqKoVhk4xAALB57aALg6b4W0MfJ/cUE0g9YBXy5XhSlPIpYIJ7HaY/3Q=="; |
|
2984 | sha512 = "lNaHNVymajmk0OJMBn8fVUAU1BtDeKIqKoVhk4xAALB57aALg6b4W0MfJ/cUE0g9YBXy5XhSlPIpYIJ7HaY/3Q=="; | |
2949 | }; |
|
2985 | }; | |
2950 | }; |
|
2986 | }; | |
2951 |
"flatten-1.0. |
|
2987 | "flatten-1.0.3" = { | |
2952 | name = "flatten"; |
|
2988 | name = "flatten"; | |
2953 | packageName = "flatten"; |
|
2989 | packageName = "flatten"; | |
2954 |
version = "1.0. |
|
2990 | version = "1.0.3"; | |
2955 | src = fetchurl { |
|
2991 | src = fetchurl { | |
2956 |
url = "https://registry.npmjs.org/flatten/-/flatten-1.0. |
|
2992 | url = "https://registry.npmjs.org/flatten/-/flatten-1.0.3.tgz"; | |
2957 | sha1 = "dae46a9d78fbe25292258cc1e780a41d95c03782"; |
|
2993 | sha512 = "dVsPA/UwQ8+2uoFe5GHtiBMu48dWLTdsuEd7CKGlZlD78r1TTWBvDuFaFGKCo/ZfEr95Uk56vZoX86OsHkUeIg=="; | |
2958 | }; |
|
2994 | }; | |
2959 | }; |
|
2995 | }; | |
2960 | "flush-write-stream-1.1.1" = { |
|
2996 | "flush-write-stream-1.1.1" = { | |
@@ -3128,13 +3164,13 b' let' | |||||
3128 | sha1 = "4a973f635b9190f715d10987d5c00fd2815ebe3d"; |
|
3164 | sha1 = "4a973f635b9190f715d10987d5c00fd2815ebe3d"; | |
3129 | }; |
|
3165 | }; | |
3130 | }; |
|
3166 | }; | |
3131 |
"glob-7.1. |
|
3167 | "glob-7.1.5" = { | |
3132 | name = "glob"; |
|
3168 | name = "glob"; | |
3133 | packageName = "glob"; |
|
3169 | packageName = "glob"; | |
3134 |
version = "7.1. |
|
3170 | version = "7.1.5"; | |
3135 | src = fetchurl { |
|
3171 | src = fetchurl { | |
3136 |
url = "https://registry.npmjs.org/glob/-/glob-7.1. |
|
3172 | url = "https://registry.npmjs.org/glob/-/glob-7.1.5.tgz"; | |
3137 | sha512 = "hkLPepehmnKk41pUGm3sYxoFs/umurYfYJCerbXEyFIWcAzvpipAgVkBqqT9RBKMGjnq6kMuyYwha6csxbiM1A=="; |
|
3173 | sha512 = "J9dlskqUXK1OeTOYBEn5s8aMukWMwWfs+rPTn/jn50Ux4MNXVhubL1wu/j2t+H4NVI+cXEcCaYellqaPVGXNqQ=="; | |
3138 | }; |
|
3174 | }; | |
3139 | }; |
|
3175 | }; | |
3140 | "glob-parent-3.1.0" = { |
|
3176 | "glob-parent-3.1.0" = { | |
@@ -3218,13 +3254,13 b' let' | |||||
3218 | sha1 = "15a4806a57547cb2d2dbf27f42e89a8c3451b364"; |
|
3254 | sha1 = "15a4806a57547cb2d2dbf27f42e89a8c3451b364"; | |
3219 | }; |
|
3255 | }; | |
3220 | }; |
|
3256 | }; | |
3221 |
"graceful-fs-4. |
|
3257 | "graceful-fs-4.2.3" = { | |
3222 | name = "graceful-fs"; |
|
3258 | name = "graceful-fs"; | |
3223 | packageName = "graceful-fs"; |
|
3259 | packageName = "graceful-fs"; | |
3224 |
version = "4. |
|
3260 | version = "4.2.3"; | |
3225 | src = fetchurl { |
|
3261 | src = fetchurl { | |
3226 |
url = "https://registry.npmjs.org/graceful-fs/-/graceful-fs-4. |
|
3262 | url = "https://registry.npmjs.org/graceful-fs/-/graceful-fs-4.2.3.tgz"; | |
3227 | sha512 = "6uHUhOPEBgQ24HM+r6b/QwWfZq+yiFcipKFrOFiBEnWdy5sdzYoi+pJeQaPI5qOLRFqWmAXUPQNsielzdLoecA=="; |
|
3263 | sha512 = "a30VEBm4PEdx1dRB7MFK7BejejvCvBronbLjht+sHuGYj8PHs7M/5Z+rt5lw551vZ7yfTCj4Vuyy3mSJytDWRQ=="; | |
3228 | }; |
|
3264 | }; | |
3229 | }; |
|
3265 | }; | |
3230 | "grunt-0.4.5" = { |
|
3266 | "grunt-0.4.5" = { | |
@@ -3281,6 +3317,15 b' let' | |||||
3281 | sha1 = "3bbdec0b75d12ceaa55d62943625c0b0861cdf6f"; |
|
3317 | sha1 = "3bbdec0b75d12ceaa55d62943625c0b0861cdf6f"; | |
3282 | }; |
|
3318 | }; | |
3283 | }; |
|
3319 | }; | |
|
3320 | "grunt-contrib-uglify-4.0.1" = { | |||
|
3321 | name = "grunt-contrib-uglify"; | |||
|
3322 | packageName = "grunt-contrib-uglify"; | |||
|
3323 | version = "4.0.1"; | |||
|
3324 | src = fetchurl { | |||
|
3325 | url = "https://registry.npmjs.org/grunt-contrib-uglify/-/grunt-contrib-uglify-4.0.1.tgz"; | |||
|
3326 | sha512 = "dwf8/+4uW1+7pH72WButOEnzErPGmtUvc8p08B0eQS/6ON0WdeQu0+WFeafaPTbbY1GqtS25lsHWaDeiTQNWPg=="; | |||
|
3327 | }; | |||
|
3328 | }; | |||
3284 | "grunt-contrib-watch-0.6.1" = { |
|
3329 | "grunt-contrib-watch-0.6.1" = { | |
3285 | name = "grunt-contrib-watch"; |
|
3330 | name = "grunt-contrib-watch"; | |
3286 | packageName = "grunt-contrib-watch"; |
|
3331 | packageName = "grunt-contrib-watch"; | |
@@ -3335,6 +3380,15 b' let' | |||||
3335 | sha512 = "SaZ8K8lG4iTxs7ClZxOWCf3kxqS2y+Eel8SbaEGgBKwhAp6e45beIu+vhBZRLX3vonKML2kjemKsQ21REaqNFQ=="; |
|
3380 | sha512 = "SaZ8K8lG4iTxs7ClZxOWCf3kxqS2y+Eel8SbaEGgBKwhAp6e45beIu+vhBZRLX3vonKML2kjemKsQ21REaqNFQ=="; | |
3336 | }; |
|
3381 | }; | |
3337 | }; |
|
3382 | }; | |
|
3383 | "gzip-size-3.0.0" = { | |||
|
3384 | name = "gzip-size"; | |||
|
3385 | packageName = "gzip-size"; | |||
|
3386 | version = "3.0.0"; | |||
|
3387 | src = fetchurl { | |||
|
3388 | url = "https://registry.npmjs.org/gzip-size/-/gzip-size-3.0.0.tgz"; | |||
|
3389 | sha1 = "546188e9bdc337f673772f81660464b389dce520"; | |||
|
3390 | }; | |||
|
3391 | }; | |||
3338 | "har-schema-1.0.5" = { |
|
3392 | "har-schema-1.0.5" = { | |
3339 | name = "har-schema"; |
|
3393 | name = "har-schema"; | |
3340 | packageName = "har-schema"; |
|
3394 | packageName = "har-schema"; | |
@@ -3695,15 +3749,6 b' let' | |||||
3695 | sha1 = "f30f716c8e2bd346c7b67d3df3915566a7c05607"; |
|
3749 | sha1 = "f30f716c8e2bd346c7b67d3df3915566a7c05607"; | |
3696 | }; |
|
3750 | }; | |
3697 | }; |
|
3751 | }; | |
3698 | "indexof-0.0.1" = { |
|
|||
3699 | name = "indexof"; |
|
|||
3700 | packageName = "indexof"; |
|
|||
3701 | version = "0.0.1"; |
|
|||
3702 | src = fetchurl { |
|
|||
3703 | url = "https://registry.npmjs.org/indexof/-/indexof-0.0.1.tgz"; |
|
|||
3704 | sha1 = "82dc336d232b9062179d05ab3293a66059fd435d"; |
|
|||
3705 | }; |
|
|||
3706 | }; |
|
|||
3707 | "inflight-1.0.6" = { |
|
3752 | "inflight-1.0.6" = { | |
3708 | name = "inflight"; |
|
3753 | name = "inflight"; | |
3709 | packageName = "inflight"; |
|
3754 | packageName = "inflight"; | |
@@ -3740,6 +3785,15 b' let' | |||||
3740 | sha1 = "633c2c83e3da42a502f52466022480f4208261de"; |
|
3785 | sha1 = "633c2c83e3da42a502f52466022480f4208261de"; | |
3741 | }; |
|
3786 | }; | |
3742 | }; |
|
3787 | }; | |
|
3788 | "inherits-2.0.4" = { | |||
|
3789 | name = "inherits"; | |||
|
3790 | packageName = "inherits"; | |||
|
3791 | version = "2.0.4"; | |||
|
3792 | src = fetchurl { | |||
|
3793 | url = "https://registry.npmjs.org/inherits/-/inherits-2.0.4.tgz"; | |||
|
3794 | sha512 = "k/vGaX4/Yla3WzyMCvTQOXYeIHvqOKtnqBduzTHpzpQZzAskKMhZ2K+EnBiSM9zGSoIFeMpXKxa4dYeZIQqewQ=="; | |||
|
3795 | }; | |||
|
3796 | }; | |||
3743 | "ini-1.3.5" = { |
|
3797 | "ini-1.3.5" = { | |
3744 | name = "ini"; |
|
3798 | name = "ini"; | |
3745 | packageName = "ini"; |
|
3799 | packageName = "ini"; | |
@@ -4424,13 +4478,13 b' let' | |||||
4424 | sha1 = "fadd834b9683073da179b3eae6d9c0d15053f73e"; |
|
4478 | sha1 = "fadd834b9683073da179b3eae6d9c0d15053f73e"; | |
4425 | }; |
|
4479 | }; | |
4426 | }; |
|
4480 | }; | |
4427 |
"lodash-4.17.1 |
|
4481 | "lodash-4.17.15" = { | |
4428 | name = "lodash"; |
|
4482 | name = "lodash"; | |
4429 | packageName = "lodash"; |
|
4483 | packageName = "lodash"; | |
4430 |
version = "4.17.1 |
|
4484 | version = "4.17.15"; | |
4431 | src = fetchurl { |
|
4485 | src = fetchurl { | |
4432 |
url = "https://registry.npmjs.org/lodash/-/lodash-4.17.1 |
|
4486 | url = "https://registry.npmjs.org/lodash/-/lodash-4.17.15.tgz"; | |
4433 | sha512 = "cQKh8igo5QUhZ7lg38DYWAxMvjSAKG0A8wGSVimP07SIUEK2UO+arSRKbRZWtelMtN5V0Hkwh5ryOto/SshYIg=="; |
|
4487 | sha512 = "8xOcRHvCjnocdS5cpwXQXVzmmh5e5+saE2QGoeQmbKmRS6J3VQppPOIt0MnmE+4xlZoumy0GPG0D0MVIQbNA1A=="; | |
4434 | }; |
|
4488 | }; | |
4435 | }; |
|
4489 | }; | |
4436 | "lodash.camelcase-4.3.0" = { |
|
4490 | "lodash.camelcase-4.3.0" = { | |
@@ -4577,6 +4631,15 b' let' | |||||
4577 | sha1 = "de819fdbcd84dccd8fae59c6aeb79615b9d266ac"; |
|
4631 | sha1 = "de819fdbcd84dccd8fae59c6aeb79615b9d266ac"; | |
4578 | }; |
|
4632 | }; | |
4579 | }; |
|
4633 | }; | |
|
4634 | "maxmin-2.1.0" = { | |||
|
4635 | name = "maxmin"; | |||
|
4636 | packageName = "maxmin"; | |||
|
4637 | version = "2.1.0"; | |||
|
4638 | src = fetchurl { | |||
|
4639 | url = "https://registry.npmjs.org/maxmin/-/maxmin-2.1.0.tgz"; | |||
|
4640 | sha1 = "4d3b220903d95eee7eb7ac7fa864e72dc09a3166"; | |||
|
4641 | }; | |||
|
4642 | }; | |||
4580 | "md5.js-1.3.5" = { |
|
4643 | "md5.js-1.3.5" = { | |
4581 | name = "md5.js"; |
|
4644 | name = "md5.js"; | |
4582 | packageName = "md5.js"; |
|
4645 | packageName = "md5.js"; | |
@@ -4604,6 +4667,15 b' let' | |||||
4604 | sha1 = "3a9a20b8462523e447cfbc7e8bb80ed667bfc552"; |
|
4667 | sha1 = "3a9a20b8462523e447cfbc7e8bb80ed667bfc552"; | |
4605 | }; |
|
4668 | }; | |
4606 | }; |
|
4669 | }; | |
|
4670 | "memory-fs-0.5.0" = { | |||
|
4671 | name = "memory-fs"; | |||
|
4672 | packageName = "memory-fs"; | |||
|
4673 | version = "0.5.0"; | |||
|
4674 | src = fetchurl { | |||
|
4675 | url = "https://registry.npmjs.org/memory-fs/-/memory-fs-0.5.0.tgz"; | |||
|
4676 | sha512 = "jA0rdU5KoQMC0e6ppoNRtpp6vjFq6+NY7r8hywnC7V+1Xj/MtHwGIbB1QaK/dunyjWteJzmkpd7ooeWg10T7GA=="; | |||
|
4677 | }; | |||
|
4678 | }; | |||
4607 | "micromatch-3.1.10" = { |
|
4679 | "micromatch-3.1.10" = { | |
4608 | name = "micromatch"; |
|
4680 | name = "micromatch"; | |
4609 | packageName = "micromatch"; |
|
4681 | packageName = "micromatch"; | |
@@ -4730,13 +4802,13 b' let' | |||||
4730 | sha512 = "zHo8v+otD1J10j/tC+VNoGK9keCuByhKovAvdn74dmxJl9+mWHnx6EMsDN4lgRoMI/eYo2nchAxniIbUPb5onw=="; |
|
4802 | sha512 = "zHo8v+otD1J10j/tC+VNoGK9keCuByhKovAvdn74dmxJl9+mWHnx6EMsDN4lgRoMI/eYo2nchAxniIbUPb5onw=="; | |
4731 | }; |
|
4803 | }; | |
4732 | }; |
|
4804 | }; | |
4733 |
"mixin-deep-1.3. |
|
4805 | "mixin-deep-1.3.2" = { | |
4734 | name = "mixin-deep"; |
|
4806 | name = "mixin-deep"; | |
4735 | packageName = "mixin-deep"; |
|
4807 | packageName = "mixin-deep"; | |
4736 |
version = "1.3. |
|
4808 | version = "1.3.2"; | |
4737 | src = fetchurl { |
|
4809 | src = fetchurl { | |
4738 |
url = "https://registry.npmjs.org/mixin-deep/-/mixin-deep-1.3. |
|
4810 | url = "https://registry.npmjs.org/mixin-deep/-/mixin-deep-1.3.2.tgz"; | |
4739 | sha512 = "8ZItLHeEgaqEvd5lYBXfm4EZSFCX29Jb9K+lAHhDKzReKBQKj3R+7NOF6tjqYi9t4oI8VUfaWITJQm86wnXGNQ=="; |
|
4811 | sha512 = "WRoDn//mXBiJ1H40rqa3vH0toePwSsGb45iInWlTySa+Uu4k3tYUSxa2v1KqAiLtvlrSzaExqS1gtk96A9zvEA=="; | |
4740 | }; |
|
4812 | }; | |
4741 | }; |
|
4813 | }; | |
4742 | "mkdirp-0.5.1" = { |
|
4814 | "mkdirp-0.5.1" = { | |
@@ -4784,13 +4856,13 b' let' | |||||
4784 | sha1 = "5608aeadfc00be6c2901df5f9861788de0d597c8"; |
|
4856 | sha1 = "5608aeadfc00be6c2901df5f9861788de0d597c8"; | |
4785 | }; |
|
4857 | }; | |
4786 | }; |
|
4858 | }; | |
4787 |
"nan-2.1 |
|
4859 | "nan-2.14.0" = { | |
4788 | name = "nan"; |
|
4860 | name = "nan"; | |
4789 | packageName = "nan"; |
|
4861 | packageName = "nan"; | |
4790 |
version = "2.1 |
|
4862 | version = "2.14.0"; | |
4791 | src = fetchurl { |
|
4863 | src = fetchurl { | |
4792 |
url = "https://registry.npmjs.org/nan/-/nan-2.1 |
|
4864 | url = "https://registry.npmjs.org/nan/-/nan-2.14.0.tgz"; | |
4793 | sha512 = "TghvYc72wlMGMVMluVo9WRJc0mB8KxxF/gZ4YYFy7V2ZQX9l7rgbPg7vjS9mt6U5HXODVFVI2bOduCzwOMv/lw=="; |
|
4865 | sha512 = "INOFj37C7k3AfaNTtX8RhsTw7qRy7eLET14cROi9+5HAVbbHuIWUHEauBv5qT4Av2tWasiTY1Jw6puUNqRJXQg=="; | |
4794 | }; |
|
4866 | }; | |
4795 | }; |
|
4867 | }; | |
4796 | "nanomatch-1.2.13" = { |
|
4868 | "nanomatch-1.2.13" = { | |
@@ -4829,13 +4901,13 b' let' | |||||
4829 | sha512 = "rmTZ9kz+f3rCvK2TD1Ue/oZlns7OGoIWP4fc3llxxRXlOkHKoWPPWJOfFYpITabSow43QJbRIoHQXtt10VldyQ=="; |
|
4901 | sha512 = "rmTZ9kz+f3rCvK2TD1Ue/oZlns7OGoIWP4fc3llxxRXlOkHKoWPPWJOfFYpITabSow43QJbRIoHQXtt10VldyQ=="; | |
4830 | }; |
|
4902 | }; | |
4831 | }; |
|
4903 | }; | |
4832 |
"node-libs-browser-2.2. |
|
4904 | "node-libs-browser-2.2.1" = { | |
4833 | name = "node-libs-browser"; |
|
4905 | name = "node-libs-browser"; | |
4834 | packageName = "node-libs-browser"; |
|
4906 | packageName = "node-libs-browser"; | |
4835 |
version = "2.2. |
|
4907 | version = "2.2.1"; | |
4836 | src = fetchurl { |
|
4908 | src = fetchurl { | |
4837 |
url = "https://registry.npmjs.org/node-libs-browser/-/node-libs-browser-2.2. |
|
4909 | url = "https://registry.npmjs.org/node-libs-browser/-/node-libs-browser-2.2.1.tgz"; | |
4838 | sha512 = "5MQunG/oyOaBdttrL40dA7bUfPORLRWMUJLQtMg7nluxUvk5XwnLdL9twQHFAjRx/y7mIMkLKT9++qPbbk6BZA=="; |
|
4910 | sha512 = "h/zcD8H9kaDZ9ALUWwlBUDo6TKF8a7qBSCSEGfjTVIYeqsioSKaAX+BN7NgiMGp6iSIXZ3PxgCu8KS3b71YK5Q=="; | |
4839 | }; |
|
4911 | }; | |
4840 | }; |
|
4912 | }; | |
4841 | "nopt-1.0.10" = { |
|
4913 | "nopt-1.0.10" = { | |
@@ -4973,6 +5045,15 b' let' | |||||
4973 | sha1 = "7e7d858b781bd7c991a41ba975ed3812754e998c"; |
|
5045 | sha1 = "7e7d858b781bd7c991a41ba975ed3812754e998c"; | |
4974 | }; |
|
5046 | }; | |
4975 | }; |
|
5047 | }; | |
|
5048 | "object-inspect-1.6.0" = { | |||
|
5049 | name = "object-inspect"; | |||
|
5050 | packageName = "object-inspect"; | |||
|
5051 | version = "1.6.0"; | |||
|
5052 | src = fetchurl { | |||
|
5053 | url = "https://registry.npmjs.org/object-inspect/-/object-inspect-1.6.0.tgz"; | |||
|
5054 | sha512 = "GJzfBZ6DgDAmnuaM3104jR4s1Myxr3Y3zfIyN4z3UdqN69oSRacNK8UhnobDdC+7J2AHCjGwxQubNJfE70SXXQ=="; | |||
|
5055 | }; | |||
|
5056 | }; | |||
4976 | "object-keys-1.1.1" = { |
|
5057 | "object-keys-1.1.1" = { | |
4977 | name = "object-keys"; |
|
5058 | name = "object-keys"; | |
4978 | packageName = "object-keys"; |
|
5059 | packageName = "object-keys"; | |
@@ -5117,13 +5198,13 b' let' | |||||
5117 | sha512 = "vvcXsLAJ9Dr5rQOPk7toZQZJApBl2K4J6dANSsEuh6QI41JYcsS/qhTGa9ErIUUgK3WNQoJYvylxvjqmiqEA9Q=="; |
|
5198 | sha512 = "vvcXsLAJ9Dr5rQOPk7toZQZJApBl2K4J6dANSsEuh6QI41JYcsS/qhTGa9ErIUUgK3WNQoJYvylxvjqmiqEA9Q=="; | |
5118 | }; |
|
5199 | }; | |
5119 | }; |
|
5200 | }; | |
5120 |
"p-limit-2.2. |
|
5201 | "p-limit-2.2.1" = { | |
5121 | name = "p-limit"; |
|
5202 | name = "p-limit"; | |
5122 | packageName = "p-limit"; |
|
5203 | packageName = "p-limit"; | |
5123 |
version = "2.2. |
|
5204 | version = "2.2.1"; | |
5124 | src = fetchurl { |
|
5205 | src = fetchurl { | |
5125 |
url = "https://registry.npmjs.org/p-limit/-/p-limit-2.2. |
|
5206 | url = "https://registry.npmjs.org/p-limit/-/p-limit-2.2.1.tgz"; | |
5126 | sha512 = "pZbTJpoUsCzV48Mc9Nh51VbwO0X9cuPFE8gYwx9BTCt9SF8/b7Zljd2fVgOxhIF/HDTKgpVzs+GPhyKfjLLFRQ=="; |
|
5207 | sha512 = "85Tk+90UCVWvbDavCLKPOLC9vvY8OwEX/RtKF+/1OADJMVlFfEHOiMTPVyxg7mk/dKa+ipdHm0OUkTvCpMTuwg=="; | |
5127 | }; |
|
5208 | }; | |
5128 | }; |
|
5209 | }; | |
5129 | "p-locate-2.0.0" = { |
|
5210 | "p-locate-2.0.0" = { | |
@@ -5171,13 +5252,13 b' let' | |||||
5171 | sha512 = "0DTvPVU3ed8+HNXOu5Bs+o//Mbdj9VNQMUOe9oKCwh8l0GNwpTDMKCWbRjgtD291AWnkAgkqA/LOnQS8AmS1tw=="; |
|
5252 | sha512 = "0DTvPVU3ed8+HNXOu5Bs+o//Mbdj9VNQMUOe9oKCwh8l0GNwpTDMKCWbRjgtD291AWnkAgkqA/LOnQS8AmS1tw=="; | |
5172 | }; |
|
5253 | }; | |
5173 | }; |
|
5254 | }; | |
5174 |
"parallel-transform-1. |
|
5255 | "parallel-transform-1.2.0" = { | |
5175 | name = "parallel-transform"; |
|
5256 | name = "parallel-transform"; | |
5176 | packageName = "parallel-transform"; |
|
5257 | packageName = "parallel-transform"; | |
5177 |
version = "1. |
|
5258 | version = "1.2.0"; | |
5178 | src = fetchurl { |
|
5259 | src = fetchurl { | |
5179 |
url = "https://registry.npmjs.org/parallel-transform/-/parallel-transform-1. |
|
5260 | url = "https://registry.npmjs.org/parallel-transform/-/parallel-transform-1.2.0.tgz"; | |
5180 | sha1 = "d410f065b05da23081fcd10f28854c29bda33b06"; |
|
5261 | sha512 = "P2vSmIu38uIlvdcU7fDkyrxj33gTUy/ABO5ZUbGowxNCopBq/OoD42bP4UmMrJoPyk4Uqf0mu3mtWBhHCZD8yg=="; | |
5181 | }; |
|
5262 | }; | |
5182 | }; |
|
5263 | }; | |
5183 | "param-case-2.1.1" = { |
|
5264 | "param-case-2.1.1" = { | |
@@ -5189,13 +5270,13 b' let' | |||||
5189 | sha1 = "df94fd8cf6531ecf75e6bef9a0858fbc72be2247"; |
|
5270 | sha1 = "df94fd8cf6531ecf75e6bef9a0858fbc72be2247"; | |
5190 | }; |
|
5271 | }; | |
5191 | }; |
|
5272 | }; | |
5192 |
"parse-asn1-5.1. |
|
5273 | "parse-asn1-5.1.5" = { | |
5193 | name = "parse-asn1"; |
|
5274 | name = "parse-asn1"; | |
5194 | packageName = "parse-asn1"; |
|
5275 | packageName = "parse-asn1"; | |
5195 |
version = "5.1. |
|
5276 | version = "5.1.5"; | |
5196 | src = fetchurl { |
|
5277 | src = fetchurl { | |
5197 |       url = "https://registry.npmjs.org/parse-asn1/-/parse-asn1-5.1.
|
5278 | url = "https://registry.npmjs.org/parse-asn1/-/parse-asn1-5.1.5.tgz"; | |
5198 | sha512 = "Qs5duJcuvNExRfFZ99HDD3z4mAi3r9Wl/FOjEOijlxwCZs7E7mW2vjTpgQ4J8LpTF8x5v+1Vn5UQFejmWT11aw=="; |
|
5279 | sha512 = "jkMYn1dcJqF6d5CpU689bq7w/b5ALS9ROVSpQDPrZsqqesUJii9qutvoT5ltGedNXMO2e16YUWIghG9KxaViTQ=="; | |
5199 | }; |
|
5280 | }; | |
5200 | }; |
|
5281 | }; | |
5201 | "parse-filepath-1.0.2" = { |
|
5282 | "parse-filepath-1.0.2" = { | |
@@ -5252,13 +5333,13 b' let' | |||||
5252 | sha1 = "b363e55e8006ca6fe21784d2db22bd15d7917f14"; |
|
5333 | sha1 = "b363e55e8006ca6fe21784d2db22bd15d7917f14"; | |
5253 | }; |
|
5334 | }; | |
5254 | }; |
|
5335 | }; | |
5255 |   "path-browserify-0.0.
|
5336 | "path-browserify-0.0.1" = { | |
5256 | name = "path-browserify"; |
|
5337 | name = "path-browserify"; | |
5257 | packageName = "path-browserify"; |
|
5338 | packageName = "path-browserify"; | |
5258 |     version = "0.0.
|
5339 | version = "0.0.1"; | |
5259 | src = fetchurl { |
|
5340 | src = fetchurl { | |
5260 |       url = "https://registry.npmjs.org/path-browserify/-/path-browserify-0.0.
|
5341 | url = "https://registry.npmjs.org/path-browserify/-/path-browserify-0.0.1.tgz"; | |
5261 | sha1 = "a0b870729aae214005b7d5032ec2cbbb0fb4451a"; |
|
5342 | sha512 = "BapA40NHICOS+USX9SN4tyhq+A2RrN/Ws5F0Z5aMHDp98Fl86lX8Oti8B7uN93L4Ifv4fHOEA+pQw87gmMO/lQ=="; | |
5262 | }; |
|
5343 | }; | |
5263 | }; |
|
5344 | }; | |
5264 | "path-dirname-1.0.2" = { |
|
5345 | "path-dirname-1.0.2" = { | |
@@ -5711,6 +5792,15 b' let' | |||||
5711 | sha1 = "d4f4562b0ce3696e41ac52d0e002e57a635dc6dc"; |
|
5792 | sha1 = "d4f4562b0ce3696e41ac52d0e002e57a635dc6dc"; | |
5712 | }; |
|
5793 | }; | |
5713 | }; |
|
5794 | }; | |
|
5795 | "pretty-bytes-3.0.1" = { | |||
|
5796 | name = "pretty-bytes"; | |||
|
5797 | packageName = "pretty-bytes"; | |||
|
5798 | version = "3.0.1"; | |||
|
5799 | src = fetchurl { | |||
|
5800 | url = "https://registry.npmjs.org/pretty-bytes/-/pretty-bytes-3.0.1.tgz"; | |||
|
5801 | sha1 = "27d0008d778063a0b4811bb35c79f1bd5d5fbccf"; | |||
|
5802 | }; | |||
|
5803 | }; | |||
5714 | "pretty-error-2.1.1" = { |
|
5804 | "pretty-error-2.1.1" = { | |
5715 | name = "pretty-error"; |
|
5805 | name = "pretty-error"; | |
5716 | packageName = "pretty-error"; |
|
5806 | packageName = "pretty-error"; | |
@@ -5738,13 +5828,13 b' let' | |||||
5738 | sha1 = "7332300e840161bda3e69a1d1d91a7d4bc16f182"; |
|
5828 | sha1 = "7332300e840161bda3e69a1d1d91a7d4bc16f182"; | |
5739 | }; |
|
5829 | }; | |
5740 | }; |
|
5830 | }; | |
5741 |   "process-nextick-args-2.0.
|
5831 | "process-nextick-args-2.0.1" = { | |
5742 | name = "process-nextick-args"; |
|
5832 | name = "process-nextick-args"; | |
5743 | packageName = "process-nextick-args"; |
|
5833 | packageName = "process-nextick-args"; | |
5744 |     version = "2.0.
|
5834 | version = "2.0.1"; | |
5745 | src = fetchurl { |
|
5835 | src = fetchurl { | |
5746 |       url = "https://registry.npmjs.org/process-nextick-args/-/process-nextick-args-2.0.
|
5836 | url = "https://registry.npmjs.org/process-nextick-args/-/process-nextick-args-2.0.1.tgz"; | |
5747 | sha512 = "MtEC1TqN0EU5nephaJ4rAtThHtC86dNN9qCuEhtshvpVBkAW5ZO7BASN9REnF9eoXGcRub+pFuKEpOHE+HbEMw=="; |
|
5837 | sha512 = "3ouUOpQhtgrbOa17J7+uxOTpITYWaGP7/AhoR3+A+/1e9skrzelGi/dXzEYyvbxubEF6Wn2ypscTKiKJFFn1ag=="; | |
5748 | }; |
|
5838 | }; | |
5749 | }; |
|
5839 | }; | |
5750 | "promise-7.3.1" = { |
|
5840 | "promise-7.3.1" = { | |
@@ -5990,13 +6080,13 b' let' | |||||
5990 | sha1 = "747c914e049614a4c9cfbba629871ad1d2927716"; |
|
6080 | sha1 = "747c914e049614a4c9cfbba629871ad1d2927716"; | |
5991 | }; |
|
6081 | }; | |
5992 | }; |
|
6082 | }; | |
5993 |   "reduce-function-call-1.0.
|
6083 | "reduce-function-call-1.0.3" = { | |
5994 | name = "reduce-function-call"; |
|
6084 | name = "reduce-function-call"; | |
5995 | packageName = "reduce-function-call"; |
|
6085 | packageName = "reduce-function-call"; | |
5996 |     version = "1.0.
|
6086 | version = "1.0.3"; | |
5997 | src = fetchurl { |
|
6087 | src = fetchurl { | |
5998 |       url = "https://registry.npmjs.org/reduce-function-call/-/reduce-function-call-1.0.
|
6088 | url = "https://registry.npmjs.org/reduce-function-call/-/reduce-function-call-1.0.3.tgz"; | |
5999 | sha1 = "5a200bf92e0e37751752fe45b0ab330fd4b6be99"; |
|
6089 | sha512 = "Hl/tuV2VDgWgCSEeWMLwxLZqX7OK59eU1guxXsRKTAyeYimivsKdtcV4fu3r710tpG5GmDKDhQ0HSZLExnNmyQ=="; | |
6000 | }; |
|
6090 | }; | |
6001 | }; |
|
6091 | }; | |
6002 | "regenerate-1.4.0" = { |
|
6092 | "regenerate-1.4.0" = { | |
@@ -6152,13 +6242,13 b' let' | |||||
6152 | sha1 = "97f717b69d48784f5f526a6c5aa8ffdda055a4d1"; |
|
6242 | sha1 = "97f717b69d48784f5f526a6c5aa8ffdda055a4d1"; | |
6153 | }; |
|
6243 | }; | |
6154 | }; |
|
6244 | }; | |
6155 |   "resolve-1.10
|
6245 | "resolve-1.12.0" = { | |
6156 | name = "resolve"; |
|
6246 | name = "resolve"; | |
6157 | packageName = "resolve"; |
|
6247 | packageName = "resolve"; | |
6158 |     version = "1.10
|
6248 | version = "1.12.0"; | |
6159 | src = fetchurl { |
|
6249 | src = fetchurl { | |
6160 |       url = "https://registry.npmjs.org/resolve/-/resolve-1.10
|
6250 | url = "https://registry.npmjs.org/resolve/-/resolve-1.12.0.tgz"; | |
6161 | sha512 = "KuIe4mf++td/eFb6wkaPbMDnP6kObCaEtIDuHOUED6MNUo4K670KZUHuuvYPZDxNF0WVLw49n06M2m2dXphEzA=="; |
|
6251 | sha512 = "B/dOmuoAik5bKcD6s6nXDCjzUKnaDvdkRyAk6rsmsKLipWj4797iothd7jmmUhWTfinVMU+wc56rYKsit2Qy4w=="; | |
6162 | }; |
|
6252 | }; | |
6163 | }; |
|
6253 | }; | |
6164 | "resolve-cwd-2.0.0" = { |
|
6254 | "resolve-cwd-2.0.0" = { | |
@@ -6224,13 +6314,13 b' let' | |||||
6224 | sha1 = "e439be2aaee327321952730f99a8929e4fc50582"; |
|
6314 | sha1 = "e439be2aaee327321952730f99a8929e4fc50582"; | |
6225 | }; |
|
6315 | }; | |
6226 | }; |
|
6316 | }; | |
6227 |   "rimraf-2.
|
6317 | "rimraf-2.7.1" = { | |
6228 | name = "rimraf"; |
|
6318 | name = "rimraf"; | |
6229 | packageName = "rimraf"; |
|
6319 | packageName = "rimraf"; | |
6230 |     version = "2.
|
6320 | version = "2.7.1"; | |
6231 | src = fetchurl { |
|
6321 | src = fetchurl { | |
6232 |       url = "https://registry.npmjs.org/rimraf/-/rimraf-2.
|
6322 | url = "https://registry.npmjs.org/rimraf/-/rimraf-2.7.1.tgz"; | |
6233 | sha512 = "mwqeW5XsA2qAejG46gYdENaxXjx9onRNCfn7L0duuP4hCuTIi/QO7PDK07KJfp1d+izWPrzEJDcSqBa0OZQriA=="; |
|
6323 | sha512 = "uWjbaKIK3T1OSVptzX7Nl6PvQ3qAGtKEtVRjRuazjfL3Bx5eI409VZSqgND+4UNnmzLVdPj9FqFJNPqBZFve4w=="; | |
6234 | }; |
|
6324 | }; | |
6235 | }; |
|
6325 | }; | |
6236 | "ripemd160-2.0.2" = { |
|
6326 | "ripemd160-2.0.2" = { | |
@@ -6260,6 +6350,15 b' let' | |||||
6260 | sha512 = "Gd2UZBJDkXlY7GbJxfsE8/nvKkUEU1G38c1siN6QP6a9PT9MmHB8GnpscSmMJSoF8LOIrt8ud/wPtojys4G6+g=="; |
|
6350 | sha512 = "Gd2UZBJDkXlY7GbJxfsE8/nvKkUEU1G38c1siN6QP6a9PT9MmHB8GnpscSmMJSoF8LOIrt8ud/wPtojys4G6+g=="; | |
6261 | }; |
|
6351 | }; | |
6262 | }; |
|
6352 | }; | |
|
6353 | "safe-buffer-5.2.0" = { | |||
|
6354 | name = "safe-buffer"; | |||
|
6355 | packageName = "safe-buffer"; | |||
|
6356 | version = "5.2.0"; | |||
|
6357 | src = fetchurl { | |||
|
6358 | url = "https://registry.npmjs.org/safe-buffer/-/safe-buffer-5.2.0.tgz"; | |||
|
6359 | sha512 = "fZEwUGbVl7kouZs1jCdMLdt95hdIv0ZeHg6L7qPeciMZhZ+/gdesW4wgTARkrFWEpspjEATAzUGPG8N2jJiwbg=="; | |||
|
6360 | }; | |||
|
6361 | }; | |||
6263 | "safe-regex-1.1.0" = { |
|
6362 | "safe-regex-1.1.0" = { | |
6264 | name = "safe-regex"; |
|
6363 | name = "safe-regex"; | |
6265 | packageName = "safe-regex"; |
|
6364 | packageName = "safe-regex"; | |
@@ -6305,22 +6404,22 b' let' | |||||
6305 | sha1 = "0e7350acdec80b1108528786ec1d4418d11b396d"; |
|
6404 | sha1 = "0e7350acdec80b1108528786ec1d4418d11b396d"; | |
6306 | }; |
|
6405 | }; | |
6307 | }; |
|
6406 | }; | |
6308 |   "semver-5.7.
|
6407 | "semver-5.7.1" = { | |
6309 | name = "semver"; |
|
6408 | name = "semver"; | |
6310 | packageName = "semver"; |
|
6409 | packageName = "semver"; | |
6311 |     version = "5.7.
|
6410 | version = "5.7.1"; | |
6312 | src = fetchurl { |
|
6411 | src = fetchurl { | |
6313 |       url = "https://registry.npmjs.org/semver/-/semver-5.7.
|
6412 | url = "https://registry.npmjs.org/semver/-/semver-5.7.1.tgz"; | |
6314 | sha512 = "Ya52jSX2u7QKghxeoFGpLwCtGlt7j0oY9DYb5apt9nPlJ42ID+ulTXESnt/qAQcoSERyZ5sl3LDIOw0nAn/5DA=="; |
|
6413 | sha512 = "sauaDf/PZdVgrLTNYHRtpXa1iRiKcaebiKQ1BJdpQlWH2lCvexQdX55snPFyK7QzpudqbCI0qXFfOasHdyNDGQ=="; | |
6315 | }; |
|
6414 | }; | |
6316 | }; |
|
6415 | }; | |
6317 |   "serialize-javascript-1.
|
6416 | "serialize-javascript-1.9.1" = { | |
6318 | name = "serialize-javascript"; |
|
6417 | name = "serialize-javascript"; | |
6319 | packageName = "serialize-javascript"; |
|
6418 | packageName = "serialize-javascript"; | |
6320 |     version = "1.
|
6419 | version = "1.9.1"; | |
6321 | src = fetchurl { |
|
6420 | src = fetchurl { | |
6322 |       url = "https://registry.npmjs.org/serialize-javascript/-/serialize-javascript-1.
|
6421 | url = "https://registry.npmjs.org/serialize-javascript/-/serialize-javascript-1.9.1.tgz"; | |
6323 | sha512 = "ke8UG8ulpFOxO8f8gRYabHQe/ZntKlcig2Mp+8+URDP1D8vJZ0KUt7LYo07q25Z/+JVSgpr/cui9PIp5H6/+nA=="; |
|
6422 | sha512 = "0Vb/54WJ6k5v8sSWN09S0ora+Hnr+cX40r9F170nT+mSkaxltoE/7R3OrIdBSUv1OoiobH1QoWQbCnAO+e8J1A=="; | |
6324 | }; |
|
6423 | }; | |
6325 | }; |
|
6424 | }; | |
6326 | "set-blocking-2.0.0" = { |
|
6425 | "set-blocking-2.0.0" = { | |
@@ -6332,22 +6431,13 b' let' | |||||
6332 | sha1 = "045f9782d011ae9a6803ddd382b24392b3d890f7"; |
|
6431 | sha1 = "045f9782d011ae9a6803ddd382b24392b3d890f7"; | |
6333 | }; |
|
6432 | }; | |
6334 | }; |
|
6433 | }; | |
6335 |   "set-value-0.
|
6434 | "set-value-2.0.1" = { | |
6336 | name = "set-value"; |
|
6435 | name = "set-value"; | |
6337 | packageName = "set-value"; |
|
6436 | packageName = "set-value"; | |
6338 |     version = "0.
|
6437 | version = "2.0.1"; | |
6339 | src = fetchurl { |
|
6438 | src = fetchurl { | |
6340 |       url = "https://registry.npmjs.org/set-value/-/set-value-0.
|
6439 | url = "https://registry.npmjs.org/set-value/-/set-value-2.0.1.tgz"; | |
6341 | sha1 = "7db08f9d3d22dc7f78e53af3c3bf4666ecdfccf1"; |
|
6440 | sha512 = "JxHc1weCN68wRY0fhCoXpyK55m/XPHafOmK4UWD7m2CI14GMcFypt4w/0+NV5f/ZMby2F6S2wwA7fgynh9gWSw=="; | |
6342 | }; |
|
|||
6343 | }; |
|
|||
6344 | "set-value-2.0.0" = { |
|
|||
6345 | name = "set-value"; |
|
|||
6346 | packageName = "set-value"; |
|
|||
6347 | version = "2.0.0"; |
|
|||
6348 | src = fetchurl { |
|
|||
6349 | url = "https://registry.npmjs.org/set-value/-/set-value-2.0.0.tgz"; |
|
|||
6350 | sha512 = "hw0yxk9GT/Hr5yJEYnHNKYXkIA8mVJgd9ditYZCe16ZczcaELYYcfvaXesNACk2O8O0nTiPQcQhGUQj8JLzeeg=="; |
|
|||
6351 | }; |
|
6441 | }; | |
6352 | }; |
|
6442 | }; | |
6353 | "setimmediate-1.0.5" = { |
|
6443 | "setimmediate-1.0.5" = { | |
@@ -6665,6 +6755,24 b' let' | |||||
6665 | sha512 = "nOqH59deCq9SRHlxq1Aw85Jnt4w6KvLKqWVik6oA9ZklXLNIOlqg4F2yrT1MVaTjAqvVwdfeZ7w7aCvJD7ugkw=="; |
|
6755 | sha512 = "nOqH59deCq9SRHlxq1Aw85Jnt4w6KvLKqWVik6oA9ZklXLNIOlqg4F2yrT1MVaTjAqvVwdfeZ7w7aCvJD7ugkw=="; | |
6666 | }; |
|
6756 | }; | |
6667 | }; |
|
6757 | }; | |
|
6758 | "string.prototype.trimleft-2.1.0" = { | |||
|
6759 | name = "string.prototype.trimleft"; | |||
|
6760 | packageName = "string.prototype.trimleft"; | |||
|
6761 | version = "2.1.0"; | |||
|
6762 | src = fetchurl { | |||
|
6763 | url = "https://registry.npmjs.org/string.prototype.trimleft/-/string.prototype.trimleft-2.1.0.tgz"; | |||
|
6764 | sha512 = "FJ6b7EgdKxxbDxc79cOlok6Afd++TTs5szo+zJTUyow3ycrRfJVE2pq3vcN53XexvKZu/DJMDfeI/qMiZTrjTw=="; | |||
|
6765 | }; | |||
|
6766 | }; | |||
|
6767 | "string.prototype.trimright-2.1.0" = { | |||
|
6768 | name = "string.prototype.trimright"; | |||
|
6769 | packageName = "string.prototype.trimright"; | |||
|
6770 | version = "2.1.0"; | |||
|
6771 | src = fetchurl { | |||
|
6772 | url = "https://registry.npmjs.org/string.prototype.trimright/-/string.prototype.trimright-2.1.0.tgz"; | |||
|
6773 | sha512 = "fXZTSV55dNBwv16uw+hh5jkghxSnc5oHq+5K/gXgizHwAvMetdAJlHqqoFC1FSDVPYWLkAKl2cxpUT41sV7nSg=="; | |||
|
6774 | }; | |||
|
6775 | }; | |||
6668 | "string_decoder-0.10.31" = { |
|
6776 | "string_decoder-0.10.31" = { | |
6669 | name = "string_decoder"; |
|
6777 | name = "string_decoder"; | |
6670 | packageName = "string_decoder"; |
|
6778 | packageName = "string_decoder"; | |
@@ -6683,13 +6791,13 b' let' | |||||
6683 | sha512 = "n/ShnvDi6FHbbVfviro+WojiFzv+s8MPMHBczVePfUpDJLwoLT0ht1l4YwBCbi8pJAveEEdnkHyPyTP/mzRfwg=="; |
|
6791 | sha512 = "n/ShnvDi6FHbbVfviro+WojiFzv+s8MPMHBczVePfUpDJLwoLT0ht1l4YwBCbi8pJAveEEdnkHyPyTP/mzRfwg=="; | |
6684 | }; |
|
6792 | }; | |
6685 | }; |
|
6793 | }; | |
6686 |   "string_decoder-1.
|
6794 | "string_decoder-1.3.0" = { | |
6687 | name = "string_decoder"; |
|
6795 | name = "string_decoder"; | |
6688 | packageName = "string_decoder"; |
|
6796 | packageName = "string_decoder"; | |
6689 |     version = "1.
|
6797 | version = "1.3.0"; | |
6690 | src = fetchurl { |
|
6798 | src = fetchurl { | |
6691 |       url = "https://registry.npmjs.org/string_decoder/-/string_decoder-1.
|
6799 | url = "https://registry.npmjs.org/string_decoder/-/string_decoder-1.3.0.tgz"; | |
6692 | sha512 = "6YqyX6ZWEYguAxgZzHGL7SsCeGx3V2TtOTqZz1xSTSWnqsbWwbptafNyvf/ACquZUXV3DANr5BDIwNYe1mN42w=="; |
|
6800 | sha512 = "hkRX8U1WjJFd8LsDJ2yQ/wWWxaopEsABU1XfkM8A+j0+85JAGppt16cr1Whg6KIbb4okU6Mql6BOj+uup/wKeA=="; | |
6693 | }; |
|
6801 | }; | |
6694 | }; |
|
6802 | }; | |
6695 | "stringstream-0.0.6" = { |
|
6803 | "stringstream-0.0.6" = { | |
@@ -6836,13 +6944,13 b' let' | |||||
6836 | sha512 = "/mrRod8xqpA+IHSLyGCQ2s8SPHiCDEeQJSep1jqLYeEUClOFG2Qsh+4FU6G9VeqpZnGW/Su8LQGc4YKni5rYSQ=="; |
|
6944 | sha512 = "/mrRod8xqpA+IHSLyGCQ2s8SPHiCDEeQJSep1jqLYeEUClOFG2Qsh+4FU6G9VeqpZnGW/Su8LQGc4YKni5rYSQ=="; | |
6837 | }; |
|
6945 | }; | |
6838 | }; |
|
6946 | }; | |
6839 |   "timers-browserify-2.0.1
|
6947 | "timers-browserify-2.0.11" = { | |
6840 | name = "timers-browserify"; |
|
6948 | name = "timers-browserify"; | |
6841 | packageName = "timers-browserify"; |
|
6949 | packageName = "timers-browserify"; | |
6842 |     version = "2.0.1
|
6950 | version = "2.0.11"; | |
6843 | src = fetchurl { |
|
6951 | src = fetchurl { | |
6844 |       url = "https://registry.npmjs.org/timers-browserify/-/timers-browserify-2.0.1
|
6952 | url = "https://registry.npmjs.org/timers-browserify/-/timers-browserify-2.0.11.tgz"; | |
6845 | sha512 = "YvC1SV1XdOUaL6gx5CoGroT3Gu49pK9+TZ38ErPldOWW4j49GI1HKs9DV+KGq/w6y+LZ72W1c8cKz2vzY+qpzg=="; |
|
6953 | sha512 = "60aV6sgJ5YEbzUdn9c8kYGIqOubPoUdqQCul3SBAsRCZ40s6Y5cMcrW4dt3/k/EsbLVJNl9n6Vz3fTc+k2GeKQ=="; | |
6846 | }; |
|
6954 | }; | |
6847 | }; |
|
6955 | }; | |
6848 | "tiny-emitter-2.1.0" = { |
|
6956 | "tiny-emitter-2.1.0" = { | |
@@ -6944,13 +7052,13 b' let' | |||||
6944 | sha1 = "30c6203e1e66b841a88701ed8858f1725d94b026"; |
|
7052 | sha1 = "30c6203e1e66b841a88701ed8858f1725d94b026"; | |
6945 | }; |
|
7053 | }; | |
6946 | }; |
|
7054 | }; | |
6947 |   "tslib-1.
|
7055 | "tslib-1.10.0" = { | |
6948 | name = "tslib"; |
|
7056 | name = "tslib"; | |
6949 | packageName = "tslib"; |
|
7057 | packageName = "tslib"; | |
6950 |     version = "1.
|
7058 | version = "1.10.0"; | |
6951 | src = fetchurl { |
|
7059 | src = fetchurl { | |
6952 |       url = "https://registry.npmjs.org/tslib/-/tslib-1.
|
7060 | url = "https://registry.npmjs.org/tslib/-/tslib-1.10.0.tgz"; | |
6953 | sha512 = "4krF8scpejhaOgqzBEcGM7yDIEfi0/8+8zDRZhNZZ2kjmHJ4hv3zCbQWxoJGz1iw5U0Jl0nma13xzHXcncMavQ=="; |
|
7061 | sha512 = "qOebF53frne81cf0S9B41ByenJ3/IuH8yJKngAX35CmiZySA0khhkovshKK+jGCaMnVomla7gVlIcc3EvKPbTQ=="; | |
6954 | }; |
|
7062 | }; | |
6955 | }; |
|
7063 | }; | |
6956 | "tty-browserify-0.0.0" = { |
|
7064 | "tty-browserify-0.0.0" = { | |
@@ -7016,6 +7124,15 b' let' | |||||
7016 | sha512 = "Y2VsbPVs0FIshJztycsO2SfPk7/KAF/T72qzv9u5EpQ4kB2hQoHlhNQTsNyy6ul7lQtqJN/AoWeS23OzEiEFxw=="; |
|
7124 | sha512 = "Y2VsbPVs0FIshJztycsO2SfPk7/KAF/T72qzv9u5EpQ4kB2hQoHlhNQTsNyy6ul7lQtqJN/AoWeS23OzEiEFxw=="; | |
7017 | }; |
|
7125 | }; | |
7018 | }; |
|
7126 | }; | |
|
7127 | "uglify-js-3.6.7" = { | |||
|
7128 | name = "uglify-js"; | |||
|
7129 | packageName = "uglify-js"; | |||
|
7130 | version = "3.6.7"; | |||
|
7131 | src = fetchurl { | |||
|
7132 | url = "https://registry.npmjs.org/uglify-js/-/uglify-js-3.6.7.tgz"; | |||
|
7133 | sha512 = "4sXQDzmdnoXiO+xvmTzQsfIiwrjUCSA95rSP4SEd8tDb51W2TiDOlL76Hl+Kw0Ie42PSItCW8/t6pBNCF2R48A=="; | |||
|
7134 | }; | |||
|
7135 | }; | |||
7019 | "uglify-to-browserify-1.0.2" = { |
|
7136 | "uglify-to-browserify-1.0.2" = { | |
7020 | name = "uglify-to-browserify"; |
|
7137 | name = "uglify-to-browserify"; | |
7021 | packageName = "uglify-to-browserify"; |
|
7138 | packageName = "uglify-to-browserify"; | |
@@ -7079,13 +7196,13 b' let' | |||||
7079 | sha1 = "8cdd8fbac4e2d2ea1e7e2e8097c42f442280f85b"; |
|
7196 | sha1 = "8cdd8fbac4e2d2ea1e7e2e8097c42f442280f85b"; | |
7080 | }; |
|
7197 | }; | |
7081 | }; |
|
7198 | }; | |
7082 |   "union-value-1.0.
|
7199 | "union-value-1.0.1" = { | |
7083 | name = "union-value"; |
|
7200 | name = "union-value"; | |
7084 | packageName = "union-value"; |
|
7201 | packageName = "union-value"; | |
7085 |     version = "1.0.
|
7202 | version = "1.0.1"; | |
7086 | src = fetchurl { |
|
7203 | src = fetchurl { | |
7087 |       url = "https://registry.npmjs.org/union-value/-/union-value-1.0.
|
7204 | url = "https://registry.npmjs.org/union-value/-/union-value-1.0.1.tgz"; | |
7088 | sha1 = "5c71c34cb5bad5dcebe3ea0cd08207ba5aa1aea4"; |
|
7205 | sha512 = "tJfXmxMeWYnczCVs7XAEvIV7ieppALdyepWMkHkwciRpZraG/xwT+s2JN8+pr1+8jCRf80FFzvr+MpQeeoF4Xg=="; | |
7089 | }; |
|
7206 | }; | |
7090 | }; |
|
7207 | }; | |
7091 | "uniq-1.0.1" = { |
|
7208 | "uniq-1.0.1" = { | |
@@ -7115,13 +7232,13 b' let' | |||||
7115 | sha512 = "Vmp0jIp2ln35UTXuryvjzkjGdRyf9b2lTXuSYUiPmzRcl3FDtYqAwOnTJkAngD9SWhnoJzDbTKwaOrZ+STtxNQ=="; |
|
7232 | sha512 = "Vmp0jIp2ln35UTXuryvjzkjGdRyf9b2lTXuSYUiPmzRcl3FDtYqAwOnTJkAngD9SWhnoJzDbTKwaOrZ+STtxNQ=="; | |
7116 | }; |
|
7233 | }; | |
7117 | }; |
|
7234 | }; | |
7118 |   "unique-slug-2.0.
|
7235 | "unique-slug-2.0.2" = { | |
7119 | name = "unique-slug"; |
|
7236 | name = "unique-slug"; | |
7120 | packageName = "unique-slug"; |
|
7237 | packageName = "unique-slug"; | |
7121 |     version = "2.0.
|
7238 | version = "2.0.2"; | |
7122 | src = fetchurl { |
|
7239 | src = fetchurl { | |
7123 |       url = "https://registry.npmjs.org/unique-slug/-/unique-slug-2.0.
|
7240 | url = "https://registry.npmjs.org/unique-slug/-/unique-slug-2.0.2.tgz"; | |
7124 | sha512 = "n9cU6+gITaVu7VGj1Z8feKMmfAjEAQGhwD9fE3zvpRRa0wEIx8ODYkVGfSc94M2OX00tUFV8wH3zYbm1I8mxFg=="; |
|
7241 | sha512 = "zoWr9ObaxALD3DOPfjPSqxt4fnZiWblxHIgeWqW8x7UqDzEtHEQLzji2cuJYQFCU6KmoJikOYAZlrTHHebjx2w=="; | |
7125 | }; |
|
7242 | }; | |
7126 | }; |
|
7243 | }; | |
7127 | "unset-value-1.0.0" = { |
|
7244 | "unset-value-1.0.0" = { | |
@@ -7133,13 +7250,13 b' let' | |||||
7133 | sha1 = "8376873f7d2335179ffb1e6fc3a8ed0dfc8ab559"; |
|
7250 | sha1 = "8376873f7d2335179ffb1e6fc3a8ed0dfc8ab559"; | |
7134 | }; |
|
7251 | }; | |
7135 | }; |
|
7252 | }; | |
7136 |   "upath-1.
|
7253 | "upath-1.2.0" = { | |
7137 | name = "upath"; |
|
7254 | name = "upath"; | |
7138 | packageName = "upath"; |
|
7255 | packageName = "upath"; | |
7139 |     version = "1.
|
7256 | version = "1.2.0"; | |
7140 | src = fetchurl { |
|
7257 | src = fetchurl { | |
7141 |       url = "https://registry.npmjs.org/upath/-/upath-1.
|
7258 | url = "https://registry.npmjs.org/upath/-/upath-1.2.0.tgz"; | |
7142 | sha512 = "kXpym8nmDmlCBr7nKdIx8P2jNBa+pBpIUFRnKJ4dr8htyYGJFokkr2ZvERRtUN+9SY+JqXouNgUPtv6JQva/2Q=="; |
|
7259 | sha512 = "aZwGpamFO61g3OlfT7OQCHqhGnW43ieH9WZeP7QxN/G/jS4jfqUkZxoryvJgVPEcrl5NL/ggHsSmLMHuH64Lhg=="; | |
7143 | }; |
|
7260 | }; | |
7144 | }; |
|
7261 | }; | |
7145 | "upper-case-1.1.3" = { |
|
7262 | "upper-case-1.1.3" = { | |
@@ -7160,6 +7277,15 b' let' | |||||
7160 | sha512 = "KY9Frmirql91X2Qgjry0Wd4Y+YTdrdZheS8TFwvkbLWf/G5KNJDCh6pKL5OZctEW4+0Baa5idK2ZQuELRwPznQ=="; |
|
7277 | sha512 = "KY9Frmirql91X2Qgjry0Wd4Y+YTdrdZheS8TFwvkbLWf/G5KNJDCh6pKL5OZctEW4+0Baa5idK2ZQuELRwPznQ=="; | |
7161 | }; |
|
7278 | }; | |
7162 | }; |
|
7279 | }; | |
|
7280 | "uri-path-1.0.0" = { | |||
|
7281 | name = "uri-path"; | |||
|
7282 | packageName = "uri-path"; | |||
|
7283 | version = "1.0.0"; | |||
|
7284 | src = fetchurl { | |||
|
7285 | url = "https://registry.npmjs.org/uri-path/-/uri-path-1.0.0.tgz"; | |||
|
7286 | sha1 = "9747f018358933c31de0fccfd82d138e67262e32"; | |||
|
7287 | }; | |||
|
7288 | }; | |||
7163 | "urix-0.1.0" = { |
|
7289 | "urix-0.1.0" = { | |
7164 | name = "urix"; |
|
7290 | name = "urix"; | |
7165 | packageName = "urix"; |
|
7291 | packageName = "urix"; | |
@@ -7232,22 +7358,22 b' let' | |||||
7232 | sha1 = "8a16a05d445657a3aea5eecc5b12a4fa5379772c"; |
|
7358 | sha1 = "8a16a05d445657a3aea5eecc5b12a4fa5379772c"; | |
7233 | }; |
|
7359 | }; | |
7234 | }; |
|
7360 | }; | |
7235 |   "uuid-3.3.
|
7361 | "uuid-3.3.3" = { | |
7236 | name = "uuid"; |
|
7362 | name = "uuid"; | |
7237 | packageName = "uuid"; |
|
7363 | packageName = "uuid"; | |
7238 |     version = "3.3.
|
7364 | version = "3.3.3"; | |
7239 | src = fetchurl { |
|
7365 | src = fetchurl { | |
7240 |       url = "https://registry.npmjs.org/uuid/-/uuid-3.3.
|
7366 | url = "https://registry.npmjs.org/uuid/-/uuid-3.3.3.tgz"; | |
7241 | sha512 = "yXJmeNaw3DnnKAOKJE51sL/ZaYfWJRl1pK9dr19YFCu0ObS231AB1/LbqTKRAQ5kw8A90rA6fr4riOUpTZvQZA=="; |
|
7367 | sha512 = "pW0No1RGHgzlpHJO1nsVrHKpOEIxkGg1xB+v0ZmdNH5OAeAwzAVrCnI2/6Mtx+Uys6iaylxa+D3g4j63IKKjSQ=="; | |
7242 | }; |
|
7368 | }; | |
7243 | }; |
|
7369 | }; | |
7244 |   "v8-compile-cache-2.0
|
7370 | "v8-compile-cache-2.1.0" = { | |
7245 | name = "v8-compile-cache"; |
|
7371 | name = "v8-compile-cache"; | |
7246 | packageName = "v8-compile-cache"; |
|
7372 | packageName = "v8-compile-cache"; | |
7247 |     version = "2.0
|
7373 | version = "2.1.0"; | |
7248 | src = fetchurl { |
|
7374 | src = fetchurl { | |
7249 |       url = "https://registry.npmjs.org/v8-compile-cache/-/v8-compile-cache-2.0
|
7375 | url = "https://registry.npmjs.org/v8-compile-cache/-/v8-compile-cache-2.1.0.tgz"; | |
7250 | sha512 = "CNmdbwQMBjwr9Gsmohvm0pbL954tJrNzf6gWL3K+QMQf00PF7ERGrEiLgjuU3mKreLC2MeGhUsNV9ybTbLgd3w=="; |
|
7376 | sha512 = "usZBT3PW+LOjM25wbqIlZwPeJV+3OSz3M1k1Ws8snlW39dZyYL9lOGC5FgPVHfk0jKmjiDV8Z0mIbVQPiwFs7g=="; | |
7251 | }; |
|
7377 | }; | |
7252 | }; |
|
7378 | }; | |
7253 | "v8flags-3.1.3" = { |
|
7379 | "v8flags-3.1.3" = { | |
@@ -7277,13 +7403,13 b' let' | |||||
7277 | sha1 = "3a105ca17053af55d6e270c1f8288682e18da400"; |
|
7403 | sha1 = "3a105ca17053af55d6e270c1f8288682e18da400"; | |
7278 | }; |
|
7404 | }; | |
7279 | }; |
|
7405 | }; | |
7280 |   "vm-browserify-
|
7406 | "vm-browserify-1.1.0" = { | |
7281 | name = "vm-browserify"; |
|
7407 | name = "vm-browserify"; | |
7282 | packageName = "vm-browserify"; |
|
7408 | packageName = "vm-browserify"; | |
7283 |     version = "
|
7409 | version = "1.1.0"; | |
7284 | src = fetchurl { |
|
7410 | src = fetchurl { | |
7285 |       url = "https://registry.npmjs.org/vm-browserify/-/vm-browserify-
|
7411 | url = "https://registry.npmjs.org/vm-browserify/-/vm-browserify-1.1.0.tgz"; | |
7286 | sha1 = "5d7ea45bbef9e4a6ff65f95438e0a87c357d5a73"; |
|
7412 | sha512 = "iq+S7vZJE60yejDYM0ek6zg308+UZsdtPExWP9VZoCFCz1zkJoXFnAX7aZfd/ZwrkidzdUZL0C/ryW+JwAiIGw=="; | |
7287 | }; |
|
7413 | }; | |
7288 | }; |
|
7414 | }; | |
7289 | "watchpack-1.6.0" = { |
|
7415 | "watchpack-1.6.0" = { | |
@@ -7331,13 +7457,13 b' let' | |||||
7331 | sha1 = "fc571588c8558da77be9efb6debdc5a3b172bdc2"; |
|
7457 | sha1 = "fc571588c8558da77be9efb6debdc5a3b172bdc2"; | |
7332 | }; |
|
7458 | }; | |
7333 | }; |
|
7459 | }; | |
7334 |   "webpack-sources-1.3
|
7460 | "webpack-sources-1.4.3" = { | |
7335 | name = "webpack-sources"; |
|
7461 | name = "webpack-sources"; | |
7336 | packageName = "webpack-sources"; |
|
7462 | packageName = "webpack-sources"; | |
7337 |     version = "1.3
|
7463 | version = "1.4.3"; | |
7338 | src = fetchurl { |
|
7464 | src = fetchurl { | |
7339 |       url = "https://registry.npmjs.org/webpack-sources/-/webpack-sources-1.3
|
7465 | url = "https://registry.npmjs.org/webpack-sources/-/webpack-sources-1.4.3.tgz"; | |
7340 | sha512 = "OiVgSrbGu7NEnEvQJJgdSFPl2qWKkWq5lHMhgiToIiN9w34EBnjYzSYs+VbL5KoYiLNtFFa7BZIKxRED3I32pA=="; |
|
7466 | sha512 = "lgTS3Xhv1lCOKo7SA5TjKXMjpSM4sBjNV5+q2bqesbSPs5FjGmU6jjtBSkX9b4qW87vDIsCIlUPOEhbZrMdjeQ=="; | |
7341 | }; |
|
7467 | }; | |
7342 | }; |
|
7468 | }; | |
7343 | "webpack-uglify-js-plugin-1.1.9" = { |
|
7469 | "webpack-uglify-js-plugin-1.1.9" = { | |
@@ -7430,13 +7556,13 b' let' | |||||
7430 | sha1 = "b5243d8f3ec1aa35f1364605bc0d1036e30ab69f"; |
|
7556 | sha1 = "b5243d8f3ec1aa35f1364605bc0d1036e30ab69f"; | |
7431 | }; |
|
7557 | }; | |
7432 | }; |
|
7558 | }; | |
7433 |   "xtend-4.0.
|
7559 | "xtend-4.0.2" = { | |
7434 | name = "xtend"; |
|
7560 | name = "xtend"; | |
7435 | packageName = "xtend"; |
|
7561 | packageName = "xtend"; | |
7436 |     version = "4.0.
|
7562 | version = "4.0.2"; | |
7437 | src = fetchurl { |
|
7563 | src = fetchurl { | |
7438 |       url = "https://registry.npmjs.org/xtend/-/xtend-4.0.
|
7564 | url = "https://registry.npmjs.org/xtend/-/xtend-4.0.2.tgz"; | |
7439 | sha1 = "a5c6d532be656e23db820efb943a1f04998d63af"; |
|
7565 | sha512 = "LKYU1iAXJXUgAXn9URjiu+MWhyUXHsvfp7mcuYm9dSUKK0/CjtrUwFAxD82/mCWbtLsGjFIad0wIsod4zrTAEQ=="; | |
7440 | }; |
|
7566 | }; | |
7441 | }; |
|
7567 | }; | |
7442 | "y18n-4.0.0" = { |
|
7568 | "y18n-4.0.0" = { | |
@@ -7514,9 +7640,9 b' let' | |||||
7514 | sources."@polymer/paper-toast-3.0.1" |
|
7640 | sources."@polymer/paper-toast-3.0.1" | |
7515 | sources."@polymer/paper-toggle-button-3.0.1" |
|
7641 | sources."@polymer/paper-toggle-button-3.0.1" | |
7516 | sources."@polymer/paper-tooltip-3.0.1" |
|
7642 | sources."@polymer/paper-tooltip-3.0.1" | |
7517 |     sources."@polymer/polymer-3.
|
7643 | sources."@polymer/polymer-3.3.0" | |
7518 | sources."@types/clone-0.1.30" |
|
7644 | sources."@types/clone-0.1.30" | |
7519 |     sources."@types/node-6.14.
|
7645 | sources."@types/node-6.14.9" | |
7520 | sources."@types/parse5-2.2.34" |
|
7646 | sources."@types/parse5-2.2.34" | |
7521 | sources."@webassemblyjs/ast-1.7.10" |
|
7647 | sources."@webassemblyjs/ast-1.7.10" | |
7522 | sources."@webassemblyjs/floating-point-hex-parser-1.7.10" |
|
7648 | sources."@webassemblyjs/floating-point-hex-parser-1.7.10" | |
@@ -7536,8 +7662,8 b' let' | |||||
7536 | sources."@webassemblyjs/wasm-parser-1.7.10" |
|
7662 | sources."@webassemblyjs/wasm-parser-1.7.10" | |
7537 | sources."@webassemblyjs/wast-parser-1.7.10" |
|
7663 | sources."@webassemblyjs/wast-parser-1.7.10" | |
7538 | sources."@webassemblyjs/wast-printer-1.7.10" |
|
7664 | sources."@webassemblyjs/wast-printer-1.7.10" | |
7539 |     sources."@webcomponents/shadycss-1.9.
|
7665 | sources."@webcomponents/shadycss-1.9.2" | |
7540 |     sources."@webcomponents/webcomponentsjs-2.
|
7666 | sources."@webcomponents/webcomponentsjs-2.3.0" | |
7541 | sources."@xtuc/ieee754-1.2.0" |
|
7667 | sources."@xtuc/ieee754-1.2.0" | |
7542 | sources."@xtuc/long-4.2.1" |
|
7668 | sources."@xtuc/long-4.2.1" | |
7543 | sources."abbrev-1.1.1" |
|
7669 | sources."abbrev-1.1.1" | |
@@ -7549,7 +7675,7 b' let' | |||||
7549 | ]; |
|
7675 | ]; | |
7550 | }) |
|
7676 | }) | |
7551 | sources."ajv-4.11.8" |
|
7677 | sources."ajv-4.11.8" | |
7552 |     sources."ajv-keywords-3.4.
|
7678 | sources."ajv-keywords-3.4.1" | |
7553 | (sources."align-text-0.1.4" // { |
|
7679 | (sources."align-text-0.1.4" // { | |
7554 | dependencies = [ |
|
7680 | dependencies = [ | |
7555 | sources."kind-of-3.2.2" |
|
7681 | sources."kind-of-3.2.2" | |
@@ -7615,20 +7741,20 b' let' | |||||
7615 | (sources."babel-core-6.26.3" // { |
|
7741 | (sources."babel-core-6.26.3" // { | |
7616 | dependencies = [ |
|
7742 | dependencies = [ | |
7617 | sources."json5-0.5.1" |
|
7743 | sources."json5-0.5.1" | |
7618 |       sources."lodash-4.17.1
|
7744 | sources."lodash-4.17.15" | |
7619 | sources."minimatch-3.0.4" |
|
7745 | sources."minimatch-3.0.4" | |
7620 | ]; |
|
7746 | ]; | |
7621 | }) |
|
7747 | }) | |
7622 | (sources."babel-generator-6.26.1" // { |
|
7748 | (sources."babel-generator-6.26.1" // { | |
7623 | dependencies = [ |
|
7749 | dependencies = [ | |
7624 |       sources."lodash-4.17.1
|
7750 | sources."lodash-4.17.15" | |
7625 | ]; |
|
7751 | ]; | |
7626 | }) |
|
7752 | }) | |
7627 | sources."babel-helper-builder-binary-assignment-operator-visitor-6.24.1" |
|
7753 | sources."babel-helper-builder-binary-assignment-operator-visitor-6.24.1" | |
7628 | sources."babel-helper-call-delegate-6.24.1" |
|
7754 | sources."babel-helper-call-delegate-6.24.1" | |
7629 | (sources."babel-helper-define-map-6.26.0" // { |
|
7755 | (sources."babel-helper-define-map-6.26.0" // { | |
7630 | dependencies = [ |
|
7756 | dependencies = [ | |
7631 |       sources."lodash-4.17.1
|
7757 | sources."lodash-4.17.15" | |
7632 | ]; |
|
7758 | ]; | |
7633 | }) |
|
7759 | }) | |
7634 | sources."babel-helper-explode-assignable-expression-6.24.1" |
|
7760 | sources."babel-helper-explode-assignable-expression-6.24.1" | |
@@ -7638,7 +7764,7 b' let' | |||||
7638 | sources."babel-helper-optimise-call-expression-6.24.1" |
|
7764 | sources."babel-helper-optimise-call-expression-6.24.1" | |
7639 | (sources."babel-helper-regex-6.26.0" // { |
|
7765 | (sources."babel-helper-regex-6.26.0" // { | |
7640 | dependencies = [ |
|
7766 | dependencies = [ | |
7641 |       sources."lodash-4.17.1
|
7767 | sources."lodash-4.17.15" | |
7642 | ]; |
|
7768 | ]; | |
7643 | }) |
|
7769 | }) | |
7644 | sources."babel-helper-remap-async-to-generator-6.24.1" |
|
7770 | sources."babel-helper-remap-async-to-generator-6.24.1" | |
@@ -7656,7 +7782,7 b' let' | |||||
7656 | sources."babel-plugin-transform-es2015-block-scoped-functions-6.22.0" |
|
7782 | sources."babel-plugin-transform-es2015-block-scoped-functions-6.22.0" | |
7657 | (sources."babel-plugin-transform-es2015-block-scoping-6.26.0" // { |
|
7783 | (sources."babel-plugin-transform-es2015-block-scoping-6.26.0" // { | |
7658 | dependencies = [ |
|
7784 | dependencies = [ | |
7659 |       sources."lodash-4.17.1
|
7785 | sources."lodash-4.17.15" | |
7660 | ]; |
|
7786 | ]; | |
7661 | }) |
|
7787 | }) | |
7662 | sources."babel-plugin-transform-es2015-classes-6.24.1" |
|
7788 | sources."babel-plugin-transform-es2015-classes-6.24.1" | |
@@ -7685,23 +7811,23 b' let' | |||||
7685 | sources."babel-preset-env-1.7.0" |
|
7811 | sources."babel-preset-env-1.7.0" | |
7686 | (sources."babel-register-6.26.0" // { |
|
7812 | (sources."babel-register-6.26.0" // { | |
7687 | dependencies = [ |
|
7813 | dependencies = [ | |
7688 |       sources."lodash-4.17.1
|
7814 | sources."lodash-4.17.15" | |
7689 | ]; |
|
7815 | ]; | |
7690 | }) |
|
7816 | }) | |
7691 | sources."babel-runtime-6.26.0" |
|
7817 | sources."babel-runtime-6.26.0" | |
7692 | (sources."babel-template-6.26.0" // { |
|
7818 | (sources."babel-template-6.26.0" // { | |
7693 | dependencies = [ |
|
7819 | dependencies = [ | |
7694 |       sources."lodash-4.17.1
|
7820 | sources."lodash-4.17.15" | |
7695 | ]; |
|
7821 | ]; | |
7696 | }) |
|
7822 | }) | |
7697 | (sources."babel-traverse-6.26.0" // { |
|
7823 | (sources."babel-traverse-6.26.0" // { | |
7698 | dependencies = [ |
|
7824 | dependencies = [ | |
7699 |       sources."lodash-4.17.1
|
7825 | sources."lodash-4.17.15" | |
7700 | ]; |
|
7826 | ]; | |
7701 | }) |
|
7827 | }) | |
7702 | (sources."babel-types-6.26.0" // { |
|
7828 | (sources."babel-types-6.26.0" // { | |
7703 | dependencies = [ |
|
7829 | dependencies = [ | |
7704 |       sources."lodash-4.17.1
|
7830 | sources."lodash-4.17.15" | |
7705 | ]; |
|
7831 | ]; | |
7706 | }) |
|
7832 | }) | |
7707 | sources."babylon-6.18.0" |
|
7833 | sources."babylon-6.18.0" | |
@@ -7711,11 +7837,11 b' let' | |||||
7711 | sources."define-property-1.0.0" |
|
7837 | sources."define-property-1.0.0" | |
7712 | ]; |
|
7838 | ]; | |
7713 | }) |
|
7839 | }) | |
7714 |     sources."base64-js-1.3.
|
7840 | sources."base64-js-1.3.1" | |
7715 | sources."bcrypt-pbkdf-1.0.2" |
|
7841 | sources."bcrypt-pbkdf-1.0.2" | |
7716 | sources."big.js-5.2.2" |
|
7842 | sources."big.js-5.2.2" | |
7717 | sources."binary-extensions-1.13.1" |
|
7843 | sources."binary-extensions-1.13.1" | |
7718 |     sources."bluebird-3.
|
7844 | sources."bluebird-3.7.1" | |
7719 | sources."bn.js-4.11.8" |
|
7845 | sources."bn.js-4.11.8" | |
7720 | sources."boolbase-1.0.0" |
|
7846 | sources."boolbase-1.0.0" | |
7721 | sources."boom-2.10.1" |
|
7847 | sources."boom-2.10.1" | |
@@ -7739,11 +7865,11 b' let' | |||||
7739 | sources."builtin-status-codes-3.0.0" |
|
7865 | sources."builtin-status-codes-3.0.0" | |
7740 | (sources."cacache-10.0.4" // { |
|
7866 | (sources."cacache-10.0.4" // { | |
7741 | dependencies = [ |
|
7867 | dependencies = [ | |
7742 |       sources."glob-7.1.
|
7868 | sources."glob-7.1.5" | |
7743 |       sources."graceful-fs-4.
|
7869 | sources."graceful-fs-4.2.3" | |
7744 | sources."lru-cache-4.1.5" |
|
7870 | sources."lru-cache-4.1.5" | |
7745 | sources."minimatch-3.0.4" |
|
7871 | sources."minimatch-3.0.4" | |
7746 |       sources."rimraf-2.
|
7872 | sources."rimraf-2.7.1" | |
7747 | ]; |
|
7873 | ]; | |
7748 | }) |
|
7874 | }) | |
7749 | sources."cache-base-1.0.1" |
|
7875 | sources."cache-base-1.0.1" | |
@@ -7754,18 +7880,18 b' let' | |||||
7754 | sources."browserslist-1.7.7" |
|
7880 | sources."browserslist-1.7.7" | |
7755 | ]; |
|
7881 | ]; | |
7756 | }) |
|
7882 | }) | |
7757 |     sources."caniuse-db-1.0.30000
|
7883 | sources."caniuse-db-1.0.30001006" | |
7758 |     sources."caniuse-lite-1.0.30000
|
7884 | sources."caniuse-lite-1.0.30001006" | |
7759 | sources."caseless-0.12.0" |
|
7885 | sources."caseless-0.12.0" | |
7760 | sources."center-align-0.1.3" |
|
7886 | sources."center-align-0.1.3" | |
7761 | sources."chalk-0.5.1" |
|
7887 | sources."chalk-0.5.1" | |
7762 |     (sources."chokidar-2.1.
|
7888 | (sources."chokidar-2.1.8" // { | |
7763 | dependencies = [ |
|
7889 | dependencies = [ | |
7764 | sources."is-glob-4.0.1" |
|
7890 | sources."is-glob-4.0.1" | |
7765 | ]; |
|
7891 | ]; | |
7766 | }) |
|
7892 | }) | |
7767 |     sources."chownr-1.1.
|
7893 | sources."chownr-1.1.3" | |
7768 |     sources."chrome-trace-event-1.0.
|
7894 | sources."chrome-trace-event-1.0.2" | |
7769 | sources."cipher-base-1.0.4" |
|
7895 | sources."cipher-base-1.0.4" | |
7770 | (sources."clap-1.2.3" // { |
|
7896 | (sources."clap-1.2.3" // { | |
7771 | dependencies = [ |
|
7897 | dependencies = [ | |
@@ -7801,7 +7927,7 b' let' | |||||
7801 | }) |
|
7927 | }) | |
7802 | (sources."cli-1.0.1" // { |
|
7928 | (sources."cli-1.0.1" // { | |
7803 | dependencies = [ |
|
7929 | dependencies = [ | |
7804 |       sources."glob-7.1.
|
7930 | sources."glob-7.1.5" | |
7805 | sources."minimatch-3.0.4" |
|
7931 | sources."minimatch-3.0.4" | |
7806 | ]; |
|
7932 | ]; | |
7807 | }) |
|
7933 | }) | |
@@ -7825,24 +7951,29 b' let' | |||||
7825 | sources."colormin-1.1.2" |
|
7951 | sources."colormin-1.1.2" | |
7826 | sources."colors-0.6.2" |
|
7952 | sources."colors-0.6.2" | |
7827 | sources."combined-stream-1.0.8" |
|
7953 | sources."combined-stream-1.0.8" | |
7828 |     sources."commander-2.
|
7954 | sources."commander-2.20.3" | |
7829 | sources."commondir-1.0.1" |
|
7955 | sources."commondir-1.0.1" | |
7830 | sources."component-emitter-1.3.0" |
|
7956 | sources."component-emitter-1.3.0" | |
7831 | sources."concat-map-0.0.1" |
|
7957 | sources."concat-map-0.0.1" | |
7832 | (sources."concat-stream-1.6.2" // { |
|
7958 | (sources."concat-stream-1.6.2" // { | |
7833 | dependencies = [ |
|
7959 | dependencies = [ | |
7834 | sources."readable-stream-2.3.6" |
|
7960 | sources."readable-stream-2.3.6" | |
|
7961 | sources."safe-buffer-5.1.2" | |||
7835 | sources."string_decoder-1.1.1" |
|
7962 | sources."string_decoder-1.1.1" | |
7836 | ]; |
|
7963 | ]; | |
7837 | }) |
|
7964 | }) | |
7838 | sources."console-browserify-1.1.0" |
|
7965 | sources."console-browserify-1.1.0" | |
7839 | sources."constants-browserify-1.0.0" |
|
7966 | sources."constants-browserify-1.0.0" | |
7840 | sources."convert-source-map-1.6.0" |
|
7967 | (sources."convert-source-map-1.6.0" // { | |
|
7968 | dependencies = [ | |||
|
7969 | sources."safe-buffer-5.1.2" | |||
|
7970 | ]; | |||
|
7971 | }) | |||
7841 | (sources."copy-concurrently-1.0.5" // { |
|
7972 | (sources."copy-concurrently-1.0.5" // { | |
7842 | dependencies = [ |
|
7973 | dependencies = [ | |
7843 |       sources."glob-7.1.
|
7974 | sources."glob-7.1.5" | |
7844 | sources."minimatch-3.0.4" |
|
7975 | sources."minimatch-3.0.4" | |
7845 |       sources."rimraf-2.
|
7976 | sources."rimraf-2.7.1" | |
7846 | ]; |
|
7977 | ]; | |
7847 | }) |
|
7978 | }) | |
7848 | sources."copy-descriptor-0.1.1" |
|
7979 | sources."copy-descriptor-0.1.1" | |
@@ -7852,7 +7983,7 b' let' | |||||
7852 | sources."minimatch-3.0.4" |
|
7983 | sources."minimatch-3.0.4" | |
7853 | ]; |
|
7984 | ]; | |
7854 | }) |
|
7985 | }) | |
7855 |     sources."core-js-2.6.
|
7986 | sources."core-js-2.6.10" | |
7856 | sources."core-util-is-1.0.2" |
|
7987 | sources."core-util-is-1.0.2" | |
7857 | sources."create-ecdh-4.0.3" |
|
7988 | sources."create-ecdh-4.0.3" | |
7858 | sources."create-hash-1.2.0" |
|
7989 | sources."create-hash-1.2.0" | |
@@ -7876,7 +8007,7 b' let' | |||||
7876 | sources."cssesc-0.1.0" |
|
8007 | sources."cssesc-0.1.0" | |
7877 | sources."cssnano-3.10.0" |
|
8008 | sources."cssnano-3.10.0" | |
7878 | sources."csso-2.3.2" |
|
8009 | sources."csso-2.3.2" | |
7879 |     sources."cyclist-0.
|
8010 | sources."cyclist-1.0.1" | |
7880 | (sources."dashdash-1.14.1" // { |
|
8011 | (sources."dashdash-1.14.1" // { | |
7881 | dependencies = [ |
|
8012 | dependencies = [ | |
7882 | sources."assert-plus-1.0.0" |
|
8013 | sources."assert-plus-1.0.0" | |
@@ -7899,9 +8030,10 b' let' | |||||
7899 | sources."diffie-hellman-5.0.3" |
|
8030 | sources."diffie-hellman-5.0.3" | |
7900 | sources."dir-glob-2.2.2" |
|
8031 | sources."dir-glob-2.2.2" | |
7901 | sources."dom-converter-0.2.0" |
|
8032 | sources."dom-converter-0.2.0" | |
7902 |     (sources."dom-serializer-0.
|
8033 | (sources."dom-serializer-0.2.1" // { | |
7903 | dependencies = [ |
|
8034 | dependencies = [ | |
7904 |       sources."ent
|
8035 | sources."domelementtype-2.0.1" | |
|
8036 | sources."entities-2.0.0" | |||
7905 | ]; |
|
8037 | ]; | |
7906 | }) |
|
8038 | }) | |
7907 | (sources."dom5-2.3.0" // { |
|
8039 | (sources."dom5-2.3.0" // { | |
@@ -7915,25 +8047,31 b' let' | |||||
7915 | sources."domhandler-2.3.0" |
|
8047 | sources."domhandler-2.3.0" | |
7916 | sources."domutils-1.5.1" |
|
8048 | sources."domutils-1.5.1" | |
7917 | sources."dropzone-5.5.1" |
|
8049 | sources."dropzone-5.5.1" | |
|
8050 | sources."duplexer-0.1.1" | |||
7918 | (sources."duplexify-3.7.1" // { |
|
8051 | (sources."duplexify-3.7.1" // { | |
7919 | dependencies = [ |
|
8052 | dependencies = [ | |
7920 | sources."readable-stream-2.3.6" |
|
8053 | sources."readable-stream-2.3.6" | |
|
8054 | sources."safe-buffer-5.1.2" | |||
7921 | sources."string_decoder-1.1.1" |
|
8055 | sources."string_decoder-1.1.1" | |
7922 | ]; |
|
8056 | ]; | |
7923 | }) |
|
8057 | }) | |
7924 | sources."ecc-jsbn-0.1.2" |
|
8058 | sources."ecc-jsbn-0.1.2" | |
7925 |     sources."electron-to-chromium-1.3.
|
8059 | sources."electron-to-chromium-1.3.302" | |
7926 |     sources."elliptic-6.
|
8060 | sources."elliptic-6.5.1" | |
7927 | sources."emojis-list-2.1.0" |
|
8061 | sources."emojis-list-2.1.0" | |
7928 |     sources."end-of-stream-1.4.
|
8062 | sources."end-of-stream-1.4.4" | |
7929 |     (sources."enhanced-resolve-4.1.
|
8063 | (sources."enhanced-resolve-4.1.1" // { | |
7930 | dependencies = [ |
|
8064 | dependencies = [ | |
7931 |       sources."graceful-fs-4.
|
8065 | sources."graceful-fs-4.2.3" | |
|
8066 | sources."memory-fs-0.5.0" | |||
|
8067 | sources."readable-stream-2.3.6" | |||
|
8068 | sources."safe-buffer-5.1.2" | |||
|
8069 | sources."string_decoder-1.1.1" | |||
7932 | ]; |
|
8070 | ]; | |
7933 | }) |
|
8071 | }) | |
7934 | sources."entities-1.0.0" |
|
8072 | sources."entities-1.0.0" | |
7935 | sources."errno-0.1.7" |
|
8073 | sources."errno-0.1.7" | |
7936 |     sources."es-abstract-1.1
|
8074 | sources."es-abstract-1.16.0" | |
7937 | sources."es-to-primitive-1.2.0" |
|
8075 | sources."es-to-primitive-1.2.0" | |
7938 | sources."es6-templates-0.2.3" |
|
8076 | sources."es6-templates-0.2.3" | |
7939 | sources."escape-string-regexp-1.0.5" |
|
8077 | sources."escape-string-regexp-1.0.5" | |
@@ -7941,8 +8079,8 b' let' | |||||
7941 | sources."espree-3.5.4" |
|
8079 | sources."espree-3.5.4" | |
7942 | sources."esprima-1.0.4" |
|
8080 | sources."esprima-1.0.4" | |
7943 | sources."esrecurse-4.2.1" |
|
8081 | sources."esrecurse-4.2.1" | |
7944 |     sources."estraverse-4.
|
8082 | sources."estraverse-4.3.0" | |
7945 |     sources."esutils-2.0.
|
8083 | sources."esutils-2.0.3" | |
7946 | sources."eventemitter2-0.4.14" |
|
8084 | sources."eventemitter2-0.4.14" | |
7947 | sources."events-3.0.0" |
|
8085 | sources."events-3.0.0" | |
7948 | sources."evp_bytestokey-1.0.3" |
|
8086 | sources."evp_bytestokey-1.0.3" | |
@@ -7986,6 +8124,7 b' let' | |||||
7986 | sources."fastparse-1.1.2" |
|
8124 | sources."fastparse-1.1.2" | |
7987 | sources."favico.js-0.3.10" |
|
8125 | sources."favico.js-0.3.10" | |
7988 | sources."faye-websocket-0.4.4" |
|
8126 | sources."faye-websocket-0.4.4" | |
|
8127 | sources."figures-1.7.0" | |||
7989 | sources."file-sync-cmp-0.1.1" |
|
8128 | sources."file-sync-cmp-0.1.1" | |
7990 | (sources."fill-range-4.0.0" // { |
|
8129 | (sources."fill-range-4.0.0" // { | |
7991 | dependencies = [ |
|
8130 | dependencies = [ | |
@@ -8003,10 +8142,11 b' let' | |||||
8003 | }) |
|
8142 | }) | |
8004 | sources."fined-1.2.0" |
|
8143 | sources."fined-1.2.0" | |
8005 | sources."flagged-respawn-1.0.1" |
|
8144 | sources."flagged-respawn-1.0.1" | |
8006 |     sources."flatten-1.0.
|
8145 | sources."flatten-1.0.3" | |
8007 | (sources."flush-write-stream-1.1.1" // { |
|
8146 | (sources."flush-write-stream-1.1.1" // { | |
8008 | dependencies = [ |
|
8147 | dependencies = [ | |
8009 | sources."readable-stream-2.3.6" |
|
8148 | sources."readable-stream-2.3.6" | |
|
8149 | sources."safe-buffer-5.1.2" | |||
8010 | sources."string_decoder-1.1.1" |
|
8150 | sources."string_decoder-1.1.1" | |
8011 | ]; |
|
8151 | ]; | |
8012 | }) |
|
8152 | }) | |
@@ -8018,12 +8158,13 b' let' | |||||
8018 | (sources."from2-2.3.0" // { |
|
8158 | (sources."from2-2.3.0" // { | |
8019 | dependencies = [ |
|
8159 | dependencies = [ | |
8020 | sources."readable-stream-2.3.6" |
|
8160 | sources."readable-stream-2.3.6" | |
|
8161 | sources."safe-buffer-5.1.2" | |||
8021 | sources."string_decoder-1.1.1" |
|
8162 | sources."string_decoder-1.1.1" | |
8022 | ]; |
|
8163 | ]; | |
8023 | }) |
|
8164 | }) | |
8024 | (sources."fs-write-stream-atomic-1.0.10" // { |
|
8165 | (sources."fs-write-stream-atomic-1.0.10" // { | |
8025 | dependencies = [ |
|
8166 | dependencies = [ | |
8026 |       sources."graceful-fs-4.
|
8167 | sources."graceful-fs-4.2.3" | |
8027 | ]; |
|
8168 | ]; | |
8028 | }) |
|
8169 | }) | |
8029 | sources."fs.realpath-1.0.0" |
|
8170 | sources."fs.realpath-1.0.0" | |
@@ -8059,7 +8200,7 b' let' | |||||
8059 | sources."globals-9.18.0" |
|
8200 | sources."globals-9.18.0" | |
8060 | (sources."globby-7.1.1" // { |
|
8201 | (sources."globby-7.1.1" // { | |
8061 | dependencies = [ |
|
8202 | dependencies = [ | |
8062 |       sources."glob-7.1.
|
8203 | sources."glob-7.1.5" | |
8063 | sources."minimatch-3.0.4" |
|
8204 | sources."minimatch-3.0.4" | |
8064 | ]; |
|
8205 | ]; | |
8065 | }) |
|
8206 | }) | |
@@ -8094,7 +8235,7 b' let' | |||||
8094 | (sources."grunt-contrib-jshint-0.12.0" // { |
|
8235 | (sources."grunt-contrib-jshint-0.12.0" // { | |
8095 | dependencies = [ |
|
8236 | dependencies = [ | |
8096 | sources."jshint-2.9.7" |
|
8237 | sources."jshint-2.9.7" | |
8097 |       sources."lodash-4.17.1
|
8238 | sources."lodash-4.17.15" | |
8098 | sources."minimatch-3.0.4" |
|
8239 | sources."minimatch-3.0.4" | |
8099 | ]; |
|
8240 | ]; | |
8100 | }) |
|
8241 | }) | |
@@ -8102,14 +8243,21 b' let' | |||||
8102 | dependencies = [ |
|
8243 | dependencies = [ | |
8103 | sources."ansi-regex-2.1.1" |
|
8244 | sources."ansi-regex-2.1.1" | |
8104 | sources."ansi-styles-2.2.1" |
|
8245 | sources."ansi-styles-2.2.1" | |
8105 |       sources."async-2.6.
|
8246 | sources."async-2.6.3" | |
8106 | sources."chalk-1.1.3" |
|
8247 | sources."chalk-1.1.3" | |
8107 | sources."has-ansi-2.0.0" |
|
8248 | sources."has-ansi-2.0.0" | |
8108 |       sources."lodash-4.17.1
|
8249 | sources."lodash-4.17.15" | |
8109 | sources."strip-ansi-3.0.1" |
|
8250 | sources."strip-ansi-3.0.1" | |
8110 | sources."supports-color-2.0.0" |
|
8251 | sources."supports-color-2.0.0" | |
8111 | ]; |
|
8252 | ]; | |
8112 | }) |
|
8253 | }) | |
|
8254 | (sources."grunt-contrib-uglify-4.0.1" // { | |||
|
8255 | dependencies = [ | |||
|
8256 | sources."ansi-styles-3.2.1" | |||
|
8257 | sources."chalk-2.4.2" | |||
|
8258 | sources."supports-color-5.5.0" | |||
|
8259 | ]; | |||
|
8260 | }) | |||
8113 | (sources."grunt-contrib-watch-0.6.1" // { |
|
8261 | (sources."grunt-contrib-watch-0.6.1" // { | |
8114 | dependencies = [ |
|
8262 | dependencies = [ | |
8115 | sources."async-0.2.10" |
|
8263 | sources."async-0.2.10" | |
@@ -8132,9 +8280,10 b' let' | |||||
8132 | sources."grunt-legacy-util-0.2.0" |
|
8280 | sources."grunt-legacy-util-0.2.0" | |
8133 | (sources."grunt-webpack-3.1.3" // { |
|
8281 | (sources."grunt-webpack-3.1.3" // { | |
8134 | dependencies = [ |
|
8282 | dependencies = [ | |
8135 |       sources."lodash-4.17.1
|
8283 | sources."lodash-4.17.15" | |
8136 | ]; |
|
8284 | ]; | |
8137 | }) |
|
8285 | }) | |
|
8286 | sources."gzip-size-3.0.0" | |||
8138 | sources."har-schema-1.0.5" |
|
8287 | sources."har-schema-1.0.5" | |
8139 | sources."har-validator-4.2.1" |
|
8288 | sources."har-validator-4.2.1" | |
8140 | sources."has-1.0.3" |
|
8289 | sources."has-1.0.3" | |
@@ -8161,6 +8310,12 b' let' | |||||
8161 | (sources."html-minifier-3.5.21" // { |
|
8310 | (sources."html-minifier-3.5.21" // { | |
8162 | dependencies = [ |
|
8311 | dependencies = [ | |
8163 | sources."commander-2.17.1" |
|
8312 | sources."commander-2.17.1" | |
|
8313 | sources."source-map-0.6.1" | |||
|
8314 | (sources."uglify-js-3.4.10" // { | |||
|
8315 | dependencies = [ | |||
|
8316 | sources."commander-2.19.0" | |||
|
8317 | ]; | |||
|
8318 | }) | |||
8164 | ]; |
|
8319 | ]; | |
8165 | }) |
|
8320 | }) | |
8166 | (sources."html-webpack-plugin-3.2.0" // { |
|
8321 | (sources."html-webpack-plugin-3.2.0" // { | |
@@ -8168,7 +8323,7 b' let' | |||||
8168 | sources."big.js-3.2.0" |
|
8323 | sources."big.js-3.2.0" | |
8169 | sources."json5-0.5.1" |
|
8324 | sources."json5-0.5.1" | |
8170 | sources."loader-utils-0.2.17" |
|
8325 | sources."loader-utils-0.2.17" | |
8171 |       sources."lodash-4.17.1
|
8326 | sources."lodash-4.17.15" | |
8172 | ]; |
|
8327 | ]; | |
8173 | }) |
|
8328 | }) | |
8174 | sources."htmlparser2-3.8.3" |
|
8329 | sources."htmlparser2-3.8.3" | |
@@ -8193,7 +8348,7 b' let' | |||||
8193 | dependencies = [ |
|
8348 | dependencies = [ | |
8194 | sources."find-up-3.0.0" |
|
8349 | sources."find-up-3.0.0" | |
8195 | sources."locate-path-3.0.0" |
|
8350 | sources."locate-path-3.0.0" | |
8196 |       sources."p-limit-2.2.
|
8351 | sources."p-limit-2.2.1" | |
8197 | sources."p-locate-3.0.0" |
|
8352 | sources."p-locate-3.0.0" | |
8198 | sources."p-try-2.2.0" |
|
8353 | sources."p-try-2.2.0" | |
8199 | sources."pkg-dir-3.0.0" |
|
8354 | sources."pkg-dir-3.0.0" | |
@@ -8202,9 +8357,8 b' let' | |||||
8202 | sources."imports-loader-0.7.1" |
|
8357 | sources."imports-loader-0.7.1" | |
8203 | sources."imurmurhash-0.1.4" |
|
8358 | sources."imurmurhash-0.1.4" | |
8204 | sources."indexes-of-1.0.1" |
|
8359 | sources."indexes-of-1.0.1" | |
8205 | sources."indexof-0.0.1" |
|
|||
8206 | sources."inflight-1.0.6" |
|
8360 | sources."inflight-1.0.6" | |
8207 |     sources."inherits-2.0.
|
8361 | sources."inherits-2.0.4" | |
8208 | sources."ini-1.3.5" |
|
8362 | sources."ini-1.3.5" | |
8209 | sources."interpret-1.1.0" |
|
8363 | sources."interpret-1.1.0" | |
8210 | sources."invariant-2.2.4" |
|
8364 | sources."invariant-2.2.4" | |
@@ -8250,7 +8404,7 b' let' | |||||
8250 | sources."jsesc-1.3.0" |
|
8404 | sources."jsesc-1.3.0" | |
8251 | (sources."jshint-2.10.2" // { |
|
8405 | (sources."jshint-2.10.2" // { | |
8252 | dependencies = [ |
|
8406 | dependencies = [ | |
8253 |       sources."lodash-4.17.1
|
8407 | sources."lodash-4.17.15" | |
8254 | sources."minimatch-3.0.4" |
|
8408 | sources."minimatch-3.0.4" | |
8255 | ]; |
|
8409 | ]; | |
8256 | }) |
|
8410 | }) | |
@@ -8271,7 +8425,7 b' let' | |||||
8271 | sources."lcid-2.0.0" |
|
8425 | sources."lcid-2.0.0" | |
8272 | (sources."less-2.7.3" // { |
|
8426 | (sources."less-2.7.3" // { | |
8273 | dependencies = [ |
|
8427 | dependencies = [ | |
8274 |       sources."graceful-fs-4.
|
8428 | sources."graceful-fs-4.2.3" | |
8275 | ]; |
|
8429 | ]; | |
8276 | }) |
|
8430 | }) | |
8277 | (sources."liftoff-2.5.0" // { |
|
8431 | (sources."liftoff-2.5.0" // { | |
@@ -8298,11 +8452,22 b' let' | |||||
8298 | sources."map-visit-1.0.0" |
|
8452 | sources."map-visit-1.0.0" | |
8299 | sources."mark.js-8.11.1" |
|
8453 | sources."mark.js-8.11.1" | |
8300 | sources."math-expression-evaluator-1.2.17" |
|
8454 | sources."math-expression-evaluator-1.2.17" | |
|
8455 | (sources."maxmin-2.1.0" // { | |||
|
8456 | dependencies = [ | |||
|
8457 | sources."ansi-regex-2.1.1" | |||
|
8458 | sources."ansi-styles-2.2.1" | |||
|
8459 | sources."chalk-1.1.3" | |||
|
8460 | sources."has-ansi-2.0.0" | |||
|
8461 | sources."strip-ansi-3.0.1" | |||
|
8462 | sources."supports-color-2.0.0" | |||
|
8463 | ]; | |||
|
8464 | }) | |||
8301 | sources."md5.js-1.3.5" |
|
8465 | sources."md5.js-1.3.5" | |
8302 | sources."mem-4.3.0" |
|
8466 | sources."mem-4.3.0" | |
8303 | (sources."memory-fs-0.4.1" // { |
|
8467 | (sources."memory-fs-0.4.1" // { | |
8304 | dependencies = [ |
|
8468 | dependencies = [ | |
8305 | sources."readable-stream-2.3.6" |
|
8469 | sources."readable-stream-2.3.6" | |
|
8470 | sources."safe-buffer-5.1.2" | |||
8306 | sources."string_decoder-1.1.1" |
|
8471 | sources."string_decoder-1.1.1" | |
8307 | ]; |
|
8472 | ]; | |
8308 | }) |
|
8473 | }) | |
@@ -8317,7 +8482,7 b' let' | |||||
8317 | sources."minimatch-0.2.14" |
|
8482 | sources."minimatch-0.2.14" | |
8318 | sources."minimist-1.2.0" |
|
8483 | sources."minimist-1.2.0" | |
8319 | sources."mississippi-2.0.0" |
|
8484 | sources."mississippi-2.0.0" | |
8320 |     (sources."mixin-deep-1.3.
|
8485 | (sources."mixin-deep-1.3.2" // { | |
8321 | dependencies = [ |
|
8486 | dependencies = [ | |
8322 | sources."is-extendable-1.0.1" |
|
8487 | sources."is-extendable-1.0.1" | |
8323 | ]; |
|
8488 | ]; | |
@@ -8331,25 +8496,30 b' let' | |||||
8331 | sources."mousetrap-1.6.3" |
|
8496 | sources."mousetrap-1.6.3" | |
8332 | (sources."move-concurrently-1.0.1" // { |
|
8497 | (sources."move-concurrently-1.0.1" // { | |
8333 | dependencies = [ |
|
8498 | dependencies = [ | |
8334 |       sources."glob-7.1.
|
8499 | sources."glob-7.1.5" | |
8335 | sources."minimatch-3.0.4" |
|
8500 | sources."minimatch-3.0.4" | |
8336 |       sources."rimraf-2.
|
8501 | sources."rimraf-2.7.1" | |
8337 | ]; |
|
8502 | ]; | |
8338 | }) |
|
8503 | }) | |
8339 | sources."ms-2.0.0" |
|
8504 | sources."ms-2.0.0" | |
8340 |     sources."nan-2.1
|
8505 | sources."nan-2.14.0" | |
8341 | sources."nanomatch-1.2.13" |
|
8506 | sources."nanomatch-1.2.13" | |
8342 | sources."neo-async-2.6.1" |
|
8507 | sources."neo-async-2.6.1" | |
8343 | sources."nice-try-1.0.5" |
|
8508 | sources."nice-try-1.0.5" | |
8344 | sources."no-case-2.3.2" |
|
8509 | sources."no-case-2.3.2" | |
8345 |     (sources."node-libs-browser-2.2.
|
8510 | (sources."node-libs-browser-2.2.1" // { | |
8346 | dependencies = [ |
|
8511 | dependencies = [ | |
8347 | (sources."readable-stream-2.3.6" // { |
|
8512 | (sources."readable-stream-2.3.6" // { | |
8348 | dependencies = [ |
|
8513 | dependencies = [ | |
8349 | sources."string_decoder-1.1.1" |
|
8514 | sources."string_decoder-1.1.1" | |
8350 | ]; |
|
8515 | ]; | |
8351 | }) |
|
8516 | }) | |
8352 |       sources."s
|
8517 | sources."safe-buffer-5.1.2" | |
|
8518 | (sources."string_decoder-1.3.0" // { | |||
|
8519 | dependencies = [ | |||
|
8520 | sources."safe-buffer-5.2.0" | |||
|
8521 | ]; | |||
|
8522 | }) | |||
8353 | ]; |
|
8523 | ]; | |
8354 | }) |
|
8524 | }) | |
8355 | sources."nopt-1.0.10" |
|
8525 | sources."nopt-1.0.10" | |
@@ -8380,6 +8550,7 b' let' | |||||
8380 | sources."kind-of-3.2.2" |
|
8550 | sources."kind-of-3.2.2" | |
8381 | ]; |
|
8551 | ]; | |
8382 | }) |
|
8552 | }) | |
|
8553 | sources."object-inspect-1.6.0" | |||
8383 | sources."object-keys-1.1.1" |
|
8554 | sources."object-keys-1.1.1" | |
8384 | sources."object-visit-1.0.1" |
|
8555 | sources."object-visit-1.0.1" | |
8385 | sources."object.defaults-1.1.0" |
|
8556 | sources."object.defaults-1.1.0" | |
@@ -8399,14 +8570,15 b' let' | |||||
8399 | sources."p-locate-2.0.0" |
|
8570 | sources."p-locate-2.0.0" | |
8400 | sources."p-try-1.0.0" |
|
8571 | sources."p-try-1.0.0" | |
8401 | sources."pako-1.0.10" |
|
8572 | sources."pako-1.0.10" | |
8402 |     (sources."parallel-transform-1.
|
8573 | (sources."parallel-transform-1.2.0" // { | |
8403 | dependencies = [ |
|
8574 | dependencies = [ | |
8404 | sources."readable-stream-2.3.6" |
|
8575 | sources."readable-stream-2.3.6" | |
|
8576 | sources."safe-buffer-5.1.2" | |||
8405 | sources."string_decoder-1.1.1" |
|
8577 | sources."string_decoder-1.1.1" | |
8406 | ]; |
|
8578 | ]; | |
8407 | }) |
|
8579 | }) | |
8408 | sources."param-case-2.1.1" |
|
8580 | sources."param-case-2.1.1" | |
8409 |     sources."parse-asn1-5.1.
|
8581 | sources."parse-asn1-5.1.5" | |
8410 | sources."parse-filepath-1.0.2" |
|
8582 | sources."parse-filepath-1.0.2" | |
8411 | sources."parse-passwd-1.0.0" |
|
8583 | sources."parse-passwd-1.0.0" | |
8412 | sources."parse5-3.0.3" |
|
8584 | sources."parse5-3.0.3" | |
@@ -8416,7 +8588,7 b' let' | |||||
8416 | ]; |
|
8588 | ]; | |
8417 | }) |
|
8589 | }) | |
8418 | sources."pascalcase-0.1.1" |
|
8590 | sources."pascalcase-0.1.1" | |
8419 |     sources."path-browserify-0.0.
|
8591 | sources."path-browserify-0.0.1" | |
8420 | sources."path-dirname-1.0.2" |
|
8592 | sources."path-dirname-1.0.2" | |
8421 | sources."path-exists-3.0.0" |
|
8593 | sources."path-exists-3.0.0" | |
8422 | sources."path-is-absolute-1.0.1" |
|
8594 | sources."path-is-absolute-1.0.1" | |
@@ -8527,10 +8699,11 b' let' | |||||
8527 | sources."postcss-value-parser-3.3.1" |
|
8699 | sources."postcss-value-parser-3.3.1" | |
8528 | sources."postcss-zindex-2.2.0" |
|
8700 | sources."postcss-zindex-2.2.0" | |
8529 | sources."prepend-http-1.0.4" |
|
8701 | sources."prepend-http-1.0.4" | |
|
8702 | sources."pretty-bytes-3.0.1" | |||
8530 | sources."pretty-error-2.1.1" |
|
8703 | sources."pretty-error-2.1.1" | |
8531 | sources."private-0.1.8" |
|
8704 | sources."private-0.1.8" | |
8532 | sources."process-0.11.10" |
|
8705 | sources."process-0.11.10" | |
8533 |     sources."process-nextick-args-2.0.
|
8706 | sources."process-nextick-args-2.0.1" | |
8534 | sources."promise-7.3.1" |
|
8707 | sources."promise-7.3.1" | |
8535 | sources."promise-inflight-1.0.1" |
|
8708 | sources."promise-inflight-1.0.1" | |
8536 | sources."prr-1.0.1" |
|
8709 | sources."prr-1.0.1" | |
@@ -8555,8 +8728,9 b' let' | |||||
8555 | }) |
|
8728 | }) | |
8556 | (sources."readdirp-2.2.1" // { |
|
8729 | (sources."readdirp-2.2.1" // { | |
8557 | dependencies = [ |
|
8730 | dependencies = [ | |
8558 |       sources."graceful-fs-4.
|
8731 | sources."graceful-fs-4.2.3" | |
8559 | sources."readable-stream-2.3.6" |
|
8732 | sources."readable-stream-2.3.6" | |
|
8733 | sources."safe-buffer-5.1.2" | |||
8560 | sources."string_decoder-1.1.1" |
|
8734 | sources."string_decoder-1.1.1" | |
8561 | ]; |
|
8735 | ]; | |
8562 | }) |
|
8736 | }) | |
@@ -8571,11 +8745,7 b' let' | |||||
8571 | sources."balanced-match-0.4.2" |
|
8745 | sources."balanced-match-0.4.2" | |
8572 | ]; |
|
8746 | ]; | |
8573 | }) |
|
8747 | }) | |
8574 |
|
|
8748 | sources."reduce-function-call-1.0.3" | |
8575 | dependencies = [ |
|
|||
8576 | sources."balanced-match-0.4.2" |
|
|||
8577 | ]; |
|
|||
8578 | }) |
|
|||
8579 | sources."regenerate-1.4.0" |
|
8749 | sources."regenerate-1.4.0" | |
8580 | sources."regenerator-runtime-0.11.1" |
|
8750 | sources."regenerator-runtime-0.11.1" | |
8581 | sources."regenerator-transform-0.10.1" |
|
8751 | sources."regenerator-transform-0.10.1" | |
@@ -8601,7 +8771,7 b' let' | |||||
8601 | sources."request-2.81.0" |
|
8771 | sources."request-2.81.0" | |
8602 | sources."require-directory-2.1.1" |
|
8772 | sources."require-directory-2.1.1" | |
8603 | sources."require-main-filename-1.0.1" |
|
8773 | sources."require-main-filename-1.0.1" | |
8604 |     sources."resolve-1.10
|
8774 | sources."resolve-1.12.0" | |
8605 | sources."resolve-cwd-2.0.0" |
|
8775 | sources."resolve-cwd-2.0.0" | |
8606 | sources."resolve-dir-1.0.1" |
|
8776 | sources."resolve-dir-1.0.1" | |
8607 | sources."resolve-from-3.0.0" |
|
8777 | sources."resolve-from-3.0.0" | |
@@ -8611,20 +8781,20 b' let' | |||||
8611 | sources."rimraf-2.2.8" |
|
8781 | sources."rimraf-2.2.8" | |
8612 | sources."ripemd160-2.0.2" |
|
8782 | sources."ripemd160-2.0.2" | |
8613 | sources."run-queue-1.0.3" |
|
8783 | sources."run-queue-1.0.3" | |
8614 |
sources."safe-buffer-5. |
|
8784 | sources."safe-buffer-5.2.0" | |
8615 | sources."safe-regex-1.1.0" |
|
8785 | sources."safe-regex-1.1.0" | |
8616 | sources."safer-buffer-2.1.2" |
|
8786 | sources."safer-buffer-2.1.2" | |
8617 | sources."sax-1.2.4" |
|
8787 | sources."sax-1.2.4" | |
8618 | (sources."schema-utils-0.4.7" // { |
|
8788 | (sources."schema-utils-0.4.7" // { | |
8619 | dependencies = [ |
|
8789 | dependencies = [ | |
8620 |
sources."ajv-6.10. |
|
8790 | sources."ajv-6.10.2" | |
8621 | ]; |
|
8791 | ]; | |
8622 | }) |
|
8792 | }) | |
8623 | sources."select-1.1.2" |
|
8793 | sources."select-1.1.2" | |
8624 |
sources."semver-5.7. |
|
8794 | sources."semver-5.7.1" | |
8625 |
sources."serialize-javascript-1. |
|
8795 | sources."serialize-javascript-1.9.1" | |
8626 | sources."set-blocking-2.0.0" |
|
8796 | sources."set-blocking-2.0.0" | |
8627 |
(sources."set-value-2.0. |
|
8797 | (sources."set-value-2.0.1" // { | |
8628 | dependencies = [ |
|
8798 | dependencies = [ | |
8629 | sources."extend-shallow-2.0.1" |
|
8799 | sources."extend-shallow-2.0.1" | |
8630 | ]; |
|
8800 | ]; | |
@@ -8701,6 +8871,7 b' let' | |||||
8701 | (sources."stream-browserify-2.0.2" // { |
|
8871 | (sources."stream-browserify-2.0.2" // { | |
8702 | dependencies = [ |
|
8872 | dependencies = [ | |
8703 | sources."readable-stream-2.3.6" |
|
8873 | sources."readable-stream-2.3.6" | |
|
8874 | sources."safe-buffer-5.1.2" | |||
8704 | sources."string_decoder-1.1.1" |
|
8875 | sources."string_decoder-1.1.1" | |
8705 | ]; |
|
8876 | ]; | |
8706 | }) |
|
8877 | }) | |
@@ -8708,6 +8879,7 b' let' | |||||
8708 | (sources."stream-http-2.8.3" // { |
|
8879 | (sources."stream-http-2.8.3" // { | |
8709 | dependencies = [ |
|
8880 | dependencies = [ | |
8710 | sources."readable-stream-2.3.6" |
|
8881 | sources."readable-stream-2.3.6" | |
|
8882 | sources."safe-buffer-5.1.2" | |||
8711 | sources."string_decoder-1.1.1" |
|
8883 | sources."string_decoder-1.1.1" | |
8712 | ]; |
|
8884 | ]; | |
8713 | }) |
|
8885 | }) | |
@@ -8720,6 +8892,8 b' let' | |||||
8720 | sources."strip-ansi-4.0.0" |
|
8892 | sources."strip-ansi-4.0.0" | |
8721 | ]; |
|
8893 | ]; | |
8722 | }) |
|
8894 | }) | |
|
8895 | sources."string.prototype.trimleft-2.1.0" | |||
|
8896 | sources."string.prototype.trimright-2.1.0" | |||
8723 | sources."string_decoder-0.10.31" |
|
8897 | sources."string_decoder-0.10.31" | |
8724 | sources."stringstream-0.0.6" |
|
8898 | sources."stringstream-0.0.6" | |
8725 | sources."strip-ansi-0.3.0" |
|
8899 | sources."strip-ansi-0.3.0" | |
@@ -8740,10 +8914,11 b' let' | |||||
8740 | (sources."through2-2.0.5" // { |
|
8914 | (sources."through2-2.0.5" // { | |
8741 | dependencies = [ |
|
8915 | dependencies = [ | |
8742 | sources."readable-stream-2.3.6" |
|
8916 | sources."readable-stream-2.3.6" | |
|
8917 | sources."safe-buffer-5.1.2" | |||
8743 | sources."string_decoder-1.1.1" |
|
8918 | sources."string_decoder-1.1.1" | |
8744 | ]; |
|
8919 | ]; | |
8745 | }) |
|
8920 | }) | |
8746 |
sources."timers-browserify-2.0.1 |
|
8921 | sources."timers-browserify-2.0.11" | |
8747 | sources."tiny-emitter-2.1.0" |
|
8922 | sources."tiny-emitter-2.1.0" | |
8748 | (sources."tiny-lr-fork-0.0.5" // { |
|
8923 | (sources."tiny-lr-fork-0.0.5" // { | |
8749 | dependencies = [ |
|
8924 | dependencies = [ | |
@@ -8766,27 +8941,27 b' let' | |||||
8766 | (sources."ts-loader-1.3.3" // { |
|
8941 | (sources."ts-loader-1.3.3" // { | |
8767 | dependencies = [ |
|
8942 | dependencies = [ | |
8768 | sources."big.js-3.2.0" |
|
8943 | sources."big.js-3.2.0" | |
8769 |
sources."colors-1. |
|
8944 | sources."colors-1.4.0" | |
8770 | sources."enhanced-resolve-3.4.1" |
|
8945 | sources."enhanced-resolve-3.4.1" | |
8771 |
sources."graceful-fs-4. |
|
8946 | sources."graceful-fs-4.2.3" | |
8772 | sources."json5-0.5.1" |
|
8947 | sources."json5-0.5.1" | |
8773 | sources."loader-utils-0.2.17" |
|
8948 | sources."loader-utils-0.2.17" | |
8774 | sources."tapable-0.2.9" |
|
8949 | sources."tapable-0.2.9" | |
8775 | ]; |
|
8950 | ]; | |
8776 | }) |
|
8951 | }) | |
8777 |
sources."tslib-1. |
|
8952 | sources."tslib-1.10.0" | |
8778 | sources."tty-browserify-0.0.0" |
|
8953 | sources."tty-browserify-0.0.0" | |
8779 | sources."tunnel-agent-0.6.0" |
|
8954 | sources."tunnel-agent-0.6.0" | |
8780 | sources."tweetnacl-0.14.5" |
|
8955 | sources."tweetnacl-0.14.5" | |
8781 | sources."typedarray-0.0.6" |
|
8956 | sources."typedarray-0.0.6" | |
8782 | (sources."uglify-es-3.3.10" // { |
|
8957 | (sources."uglify-es-3.3.10" // { | |
8783 | dependencies = [ |
|
8958 | dependencies = [ | |
|
8959 | sources."commander-2.14.1" | |||
8784 | sources."source-map-0.6.1" |
|
8960 | sources."source-map-0.6.1" | |
8785 | ]; |
|
8961 | ]; | |
8786 | }) |
|
8962 | }) | |
8787 |
(sources."uglify-js-3. |
|
8963 | (sources."uglify-js-3.6.7" // { | |
8788 | dependencies = [ |
|
8964 | dependencies = [ | |
8789 | sources."commander-2.19.0" |
|
|||
8790 | sources."source-map-0.6.1" |
|
8965 | sources."source-map-0.6.1" | |
8791 | ]; |
|
8966 | ]; | |
8792 | }) |
|
8967 | }) | |
@@ -8799,16 +8974,11 b' let' | |||||
8799 | sources."unc-path-regex-0.1.2" |
|
8974 | sources."unc-path-regex-0.1.2" | |
8800 | sources."underscore-1.7.0" |
|
8975 | sources."underscore-1.7.0" | |
8801 | sources."underscore.string-2.2.1" |
|
8976 | sources."underscore.string-2.2.1" | |
8802 |
|
|
8977 | sources."union-value-1.0.1" | |
8803 | dependencies = [ |
|
|||
8804 | sources."extend-shallow-2.0.1" |
|
|||
8805 | sources."set-value-0.4.3" |
|
|||
8806 | ]; |
|
|||
8807 | }) |
|
|||
8808 | sources."uniq-1.0.1" |
|
8978 | sources."uniq-1.0.1" | |
8809 | sources."uniqs-2.0.0" |
|
8979 | sources."uniqs-2.0.0" | |
8810 | sources."unique-filename-1.1.1" |
|
8980 | sources."unique-filename-1.1.1" | |
8811 |
sources."unique-slug-2.0. |
|
8981 | sources."unique-slug-2.0.2" | |
8812 | (sources."unset-value-1.0.0" // { |
|
8982 | (sources."unset-value-1.0.0" // { | |
8813 | dependencies = [ |
|
8983 | dependencies = [ | |
8814 | (sources."has-value-0.3.1" // { |
|
8984 | (sources."has-value-0.3.1" // { | |
@@ -8819,13 +8989,14 b' let' | |||||
8819 | sources."has-values-0.1.4" |
|
8989 | sources."has-values-0.1.4" | |
8820 | ]; |
|
8990 | ]; | |
8821 | }) |
|
8991 | }) | |
8822 |
sources."upath-1. |
|
8992 | sources."upath-1.2.0" | |
8823 | sources."upper-case-1.1.3" |
|
8993 | sources."upper-case-1.1.3" | |
8824 | (sources."uri-js-4.2.2" // { |
|
8994 | (sources."uri-js-4.2.2" // { | |
8825 | dependencies = [ |
|
8995 | dependencies = [ | |
8826 | sources."punycode-2.1.1" |
|
8996 | sources."punycode-2.1.1" | |
8827 | ]; |
|
8997 | ]; | |
8828 | }) |
|
8998 | }) | |
|
8999 | sources."uri-path-1.0.0" | |||
8829 | sources."urix-0.1.0" |
|
9000 | sources."urix-0.1.0" | |
8830 | (sources."url-0.11.0" // { |
|
9001 | (sources."url-0.11.0" // { | |
8831 | dependencies = [ |
|
9002 | dependencies = [ | |
@@ -8833,12 +9004,16 b' let' | |||||
8833 | ]; |
|
9004 | ]; | |
8834 | }) |
|
9005 | }) | |
8835 | sources."use-3.1.1" |
|
9006 | sources."use-3.1.1" | |
8836 | sources."util-0.11.1" |
|
9007 | (sources."util-0.11.1" // { | |
|
9008 | dependencies = [ | |||
|
9009 | sources."inherits-2.0.3" | |||
|
9010 | ]; | |||
|
9011 | }) | |||
8837 | sources."util-deprecate-1.0.2" |
|
9012 | sources."util-deprecate-1.0.2" | |
8838 | sources."util.promisify-1.0.0" |
|
9013 | sources."util.promisify-1.0.0" | |
8839 | sources."utila-0.4.0" |
|
9014 | sources."utila-0.4.0" | |
8840 |
sources."uuid-3.3. |
|
9015 | sources."uuid-3.3.3" | |
8841 |
sources."v8-compile-cache-2.0 |
|
9016 | sources."v8-compile-cache-2.1.0" | |
8842 | sources."v8flags-3.1.3" |
|
9017 | sources."v8flags-3.1.3" | |
8843 | sources."vendors-1.0.3" |
|
9018 | sources."vendors-1.0.3" | |
8844 | (sources."verror-1.10.0" // { |
|
9019 | (sources."verror-1.10.0" // { | |
@@ -8846,16 +9021,16 b' let' | |||||
8846 | sources."assert-plus-1.0.0" |
|
9021 | sources."assert-plus-1.0.0" | |
8847 | ]; |
|
9022 | ]; | |
8848 | }) |
|
9023 | }) | |
8849 |
sources."vm-browserify- |
|
9024 | sources."vm-browserify-1.1.0" | |
8850 | (sources."watchpack-1.6.0" // { |
|
9025 | (sources."watchpack-1.6.0" // { | |
8851 | dependencies = [ |
|
9026 | dependencies = [ | |
8852 |
sources."graceful-fs-4. |
|
9027 | sources."graceful-fs-4.2.3" | |
8853 | ]; |
|
9028 | ]; | |
8854 | }) |
|
9029 | }) | |
8855 | sources."waypoints-4.0.1" |
|
9030 | sources."waypoints-4.0.1" | |
8856 | (sources."webpack-4.23.1" // { |
|
9031 | (sources."webpack-4.23.1" // { | |
8857 | dependencies = [ |
|
9032 | dependencies = [ | |
8858 |
sources."ajv-6.10. |
|
9033 | sources."ajv-6.10.2" | |
8859 | ]; |
|
9034 | ]; | |
8860 | }) |
|
9035 | }) | |
8861 | (sources."webpack-cli-3.1.2" // { |
|
9036 | (sources."webpack-cli-3.1.2" // { | |
@@ -8871,7 +9046,7 b' let' | |||||
8871 | sources."source-map-0.4.4" |
|
9046 | sources."source-map-0.4.4" | |
8872 | ]; |
|
9047 | ]; | |
8873 | }) |
|
9048 | }) | |
8874 |
(sources."webpack-sources-1.3 |
|
9049 | (sources."webpack-sources-1.4.3" // { | |
8875 | dependencies = [ |
|
9050 | dependencies = [ | |
8876 | sources."source-map-0.6.1" |
|
9051 | sources."source-map-0.6.1" | |
8877 | ]; |
|
9052 | ]; | |
@@ -8904,14 +9079,14 b' let' | |||||
8904 | ]; |
|
9079 | ]; | |
8905 | }) |
|
9080 | }) | |
8906 | sources."wrappy-1.0.2" |
|
9081 | sources."wrappy-1.0.2" | |
8907 |
sources."xtend-4.0. |
|
9082 | sources."xtend-4.0.2" | |
8908 | sources."y18n-4.0.0" |
|
9083 | sources."y18n-4.0.0" | |
8909 | sources."yallist-2.1.2" |
|
9084 | sources."yallist-2.1.2" | |
8910 | (sources."yargs-12.0.5" // { |
|
9085 | (sources."yargs-12.0.5" // { | |
8911 | dependencies = [ |
|
9086 | dependencies = [ | |
8912 | sources."find-up-3.0.0" |
|
9087 | sources."find-up-3.0.0" | |
8913 | sources."locate-path-3.0.0" |
|
9088 | sources."locate-path-3.0.0" | |
8914 |
sources."p-limit-2.2. |
|
9089 | sources."p-limit-2.2.1" | |
8915 | sources."p-locate-3.0.0" |
|
9090 | sources."p-locate-3.0.0" | |
8916 | sources."p-try-2.2.0" |
|
9091 | sources."p-try-2.2.0" | |
8917 | ]; |
|
9092 | ]; |
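The hunks that follow update the Python package pins, which all use the same `self: super: { ... }` override shape. As a rough, hypothetical sketch only (the `myPython` name, the `python27` attribute, and the wiring via `packageOverrides` are assumptions for illustration, not RhodeCode's actual build code), such an override set is typically consumed like this, pinning a package exactly the way the entries below do::

  # Hypothetical example: pin one package with a `self: super:` override set.
  # The interpreter attribute and names here are assumptions, not taken from
  # this repository; dependency inputs are omitted for brevity.
  let
    pkgs = import <nixpkgs> { };
    myPython = pkgs.python27.override {
      packageOverrides = self: super: {
        "alembic" = super.buildPythonPackage {
          name = "alembic-1.3.1";
          doCheck = false;
          src = pkgs.fetchurl {
            url = "https://files.pythonhosted.org/packages/84/64/493c45119dce700a4b9eeecc436ef9e8835ab67bae6414f040cdc7b58f4b/alembic-1.3.1.tar.gz";
            sha256 = "1cl2chk5jx0rf4hmsd5lljic7iifw17yv3y5xawvp4i14jvpn9s9";
          };
        };
      };
    };
  in
    # Build an environment that uses the pinned package.
    myPython.withPackages (ps: [ ps.alembic ])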
@@ -5,7 +5,7 @@
 
 self: super: {
   "alembic" = super.buildPythonPackage {
-    name = "alembic-1.
+    name = "alembic-1.3.1";
     doCheck = false;
     propagatedBuildInputs = [
       self."sqlalchemy"
@@ -14,22 +14,22 @@ self: super: {
       self."python-dateutil"
     ];
     src = fetchurl {
-      url = "https://files.pythonhosted.org/packages/6e
+      url = "https://files.pythonhosted.org/packages/84/64/493c45119dce700a4b9eeecc436ef9e8835ab67bae6414f040cdc7b58f4b/alembic-1.3.1.tar.gz";
-      sha256 = "1dwl0264r6ri2jyrjr68am04x538ab26xwy4crqjnnhm4alwm3c2";
+      sha256 = "1cl2chk5jx0rf4hmsd5lljic7iifw17yv3y5xawvp4i14jvpn9s9";
     };
     meta = {
       license = [ pkgs.lib.licenses.mit ];
     };
   };
   "amqp" = super.buildPythonPackage {
-    name = "amqp-2.
+    name = "amqp-2.5.2";
     doCheck = false;
     propagatedBuildInputs = [
       self."vine"
     ];
     src = fetchurl {
-      url = "https://files.pythonhosted.org/packages/
+      url = "https://files.pythonhosted.org/packages/92/1d/433541994a5a69f4ad2fff39746ddbb0bdedb0ea0d85673eb0db68a7edd9/amqp-2.5.2.tar.gz";
-      sha256 = "0wlfnvhmfrn7c8qif2jyvsm63ibdxp02ss564qwrvqfhz0di72s0";
+      sha256 = "13dhhfxjrqcjybnq4zahg92mydhpg2l76nxcmq7d560687wsxwbp";
     };
     meta = {
       license = [ pkgs.lib.licenses.bsdOriginal ];
@@ -63,33 +63,22 @@ self: super: {
     };
   };
   "atomicwrites" = super.buildPythonPackage {
-    name = "atomicwrites-1.
+    name = "atomicwrites-1.3.0";
     doCheck = false;
     src = fetchurl {
-      url = "https://files.pythonhosted.org/packages/ac
+      url = "https://files.pythonhosted.org/packages/ec/0f/cd484ac8820fed363b374af30049adc8fd13065720fd4f4c6be8a2309da7/atomicwrites-1.3.0.tar.gz";
-      sha256 = "1vmkbw9j0qammwxbxycrs39gvdg4lc2d4lk98kwf8ag2manyi6pc";
+      sha256 = "19ngcscdf3jsqmpcxn6zl5b6anmsajb6izp1smcd1n02midl9abm";
     };
     meta = {
       license = [ pkgs.lib.licenses.mit ];
     };
   };
   "attrs" = super.buildPythonPackage {
-    name = "attrs-1
+    name = "attrs-19.3.0";
     doCheck = false;
     src = fetchurl {
-      url = "https://files.pythonhosted.org/packages/
+      url = "https://files.pythonhosted.org/packages/98/c3/2c227e66b5e896e15ccdae2e00bbc69aa46e9a8ce8869cc5fa96310bf612/attrs-19.3.0.tar.gz";
-      sha256 = "0s9ydh058wmmf5v391pym877x4ahxg45dw6a0w4c7s5wgpigdjqh";
+      sha256 = "0wky4h28n7xnr6xv69p9z6kv8bzn50d10c3drmd9ds8gawbcxdzp";
-    };
-    meta = {
-      license = [ pkgs.lib.licenses.mit ];
-    };
-  };
-  "authomatic" = super.buildPythonPackage {
-    name = "authomatic-0.1.0.post1";
-    doCheck = false;
-    src = fetchurl {
-      url = "https://code.rhodecode.com/upstream/authomatic/artifacts/download/0-4fe9c041-a567-4f84-be4c-7efa2a606d3c.tar.gz?md5=f6bdc3c769688212db68233e8d2b0383";
-      sha256 = "0pc716mva0ym6xd8jwzjbjp8dqxy9069wwwv2aqwb8lyhl4757ab";
     };
     meta = {
       license = [ pkgs.lib.licenses.mit ];
@@ -146,11 +135,11 @@ self: super: {
     };
   };
   "billiard" = super.buildPythonPackage {
-    name = "billiard-3.
+    name = "billiard-3.6.1.0";
     doCheck = false;
     src = fetchurl {
-      url = "https://files.pythonhosted.org/packages/
+      url = "https://files.pythonhosted.org/packages/68/1d/2aea8fbb0b1e1260a8a2e77352de2983d36d7ac01207cf14c2b9c6cc860e/billiard-3.6.1.0.tar.gz";
-      sha256 = "1riwiiwgb141151md4ykx49qrz749akj5k8g290ji9bsqjyj4yqx";
+      sha256 = "09hzy3aqi7visy4vmf4xiish61n0rq5nd3iwjydydps8yrs9r05q";
     };
     meta = {
       license = [ pkgs.lib.licenses.bsdOriginal ];
@@ -183,30 +172,42 @@ self: super: {
     };
   };
   "celery" = super.buildPythonPackage {
-    name = "celery-4.
+    name = "celery-4.3.0";
     doCheck = false;
     propagatedBuildInputs = [
       self."pytz"
       self."billiard"
       self."kombu"
+      self."vine"
     ];
     src = fetchurl {
-      url = "https://files.pythonhosted.org/packages/e9
+      url = "https://files.pythonhosted.org/packages/a2/4b/d020836f751617e907e84753a41c92231cd4b673ff991b8ee9da52361323/celery-4.3.0.tar.gz";
-      sha256 = "1xbir4vw42n2ir9lanhwl7w69zpmj7lbi66fxm2b7pyvkcss7wni";
+      sha256 = "1y8y0gbgkwimpxqnxq2rm5qz2vy01fvjiybnpm00y5rzd2m34iac";
     };
     meta = {
       license = [ pkgs.lib.licenses.bsdOriginal ];
     };
   };
+  "certifi" = super.buildPythonPackage {
+    name = "certifi-2019.11.28";
+    doCheck = false;
+    src = fetchurl {
+      url = "https://files.pythonhosted.org/packages/41/bf/9d214a5af07debc6acf7f3f257265618f1db242a3f8e49a9b516f24523a6/certifi-2019.11.28.tar.gz";
+      sha256 = "07qg6864bk4qxa8akr967amlmsq9v310hp039mcpjx6dliylrdi5";
+    };
+    meta = {
+      license = [ pkgs.lib.licenses.mpl20 { fullName = "Mozilla Public License 2.0 (MPL 2.0)"; } ];
+    };
+  };
   "cffi" = super.buildPythonPackage {
-    name = "cffi-1.12.
+    name = "cffi-1.12.3";
     doCheck = false;
     propagatedBuildInputs = [
       self."pycparser"
     ];
     src = fetchurl {
-      url = "https://files.pythonhosted.org/packages/64/7c/27367b38e6cc3e1f49f193deb761fe75cda9f95da37b67b422e62281fcac/cffi-1.12.2.tar.gz";
+      url = "https://files.pythonhosted.org/packages/93/1a/ab8c62b5838722f29f3daffcc8d4bd61844aa9b5f437341cc890ceee483b/cffi-1.12.3.tar.gz";
-      sha256 = "19qfks2djya8vix95bmg3xzipjb8w9b8mbj4j5k2hqkc8j58f4z1";
+      sha256 = "0x075521fxwv0mfp4cqzk7lvmw4n94bjw601qkcv314z5s182704";
     };
     meta = {
       license = [ pkgs.lib.licenses.mit ];
@@ -243,6 +244,17 @@ self: super: {
       license = [ pkgs.lib.licenses.bsdOriginal ];
     };
   };
+  "chardet" = super.buildPythonPackage {
+    name = "chardet-3.0.4";
+    doCheck = false;
+    src = fetchurl {
+      url = "https://files.pythonhosted.org/packages/fc/bb/a5768c230f9ddb03acc9ef3f0d4a3cf93462473795d18e9535498c8f929d/chardet-3.0.4.tar.gz";
+      sha256 = "1bpalpia6r5x1kknbk11p1fzph56fmmnp405ds8icksd3knr5aw4";
+    };
+    meta = {
+      license = [ { fullName = "LGPL"; } { fullName = "GNU Library or Lesser General Public License (LGPL)"; } ];
+    };
+  };
   "click" = super.buildPythonPackage {
     name = "click-7.0";
     doCheck = false;
@@ -285,16 +297,27 @@ self: super: {
     };
   };
   "configparser" = super.buildPythonPackage {
-    name = "configparser-
+    name = "configparser-4.0.2";
     doCheck = false;
     src = fetchurl {
-      url = "https://files.pythonhosted.org/packages/
+      url = "https://files.pythonhosted.org/packages/16/4f/48975536bd488d3a272549eb795ac4a13a5f7fcdc8995def77fbef3532ee/configparser-4.0.2.tar.gz";
-      sha256 = "0xac32886ihs2xg7w1gppcq2sgin5qsm8lqwijs5xifq9w0x0q6s";
+      sha256 = "1priacxym85yjcf68hh38w55nqswaxp71ryjyfdk222kg9l85ln7";
     };
     meta = {
       license = [ pkgs.lib.licenses.mit ];
     };
   };
+  "contextlib2" = super.buildPythonPackage {
+    name = "contextlib2-0.6.0.post1";
+    doCheck = false;
+    src = fetchurl {
+      url = "https://files.pythonhosted.org/packages/02/54/669207eb72e3d8ae8b38aa1f0703ee87a0e9f88f30d3c0a47bebdb6de242/contextlib2-0.6.0.post1.tar.gz";
+      sha256 = "0bhnr2ac7wy5l85ji909gyljyk85n92w8pdvslmrvc8qih4r1x01";
+    };
+    meta = {
+      license = [ pkgs.lib.licenses.psfl ];
+    };
+  };
   "cov-core" = super.buildPythonPackage {
     name = "cov-core-1.15.0";
     doCheck = false;
@@ -310,11 +333,11 @@ self: super: {
     };
   };
   "coverage" = super.buildPythonPackage {
-    name = "coverage-4.5.
+    name = "coverage-4.5.4";
     doCheck = false;
     src = fetchurl {
-      url = "https://files.pythonhosted.org/packages/8
+      url = "https://files.pythonhosted.org/packages/85/d5/818d0e603685c4a613d56f065a721013e942088047ff1027a632948bdae6/coverage-4.5.4.tar.gz";
-      sha256 = "02f6m073qdispn96rc616hg0rnmw1pgqzw3bgxwiwza4zf9hirlx";
+      sha256 = "0p0j4di6h8k6ica7jwwj09azdcg4ycxq60i9qsskmsg94cd9yzg0";
     };
     meta = {
       license = [ pkgs.lib.licenses.asl20 ];
@@ -361,7 +384,7 @@ self: super: {
     };
   };
   "deform" = super.buildPythonPackage {
-    name = "deform-2.0.
+    name = "deform-2.0.8";
     doCheck = false;
     propagatedBuildInputs = [
       self."chameleon"
@@ -372,8 +395,8 @@ self: super: {
       self."zope.deprecation"
     ];
     src = fetchurl {
-      url = "https://files.pythonhosted.org/packages/cf/a1/bc234527b8f181de9acd80e796483c00007658d1e32b7de78f1c2e004d9a/deform-2.0.7.tar.gz";
+      url = "https://files.pythonhosted.org/packages/21/d0/45fdf891a82722c02fc2da319cf2d1ae6b5abf9e470ad3762135a895a868/deform-2.0.8.tar.gz";
-      sha256 = "0jnpi0zr2hjvbmiz6nm33yqv976dn9lf51vhlzqc0i75xcr9rwig";
+      sha256 = "0wbjv98sib96649aqaygzxnrkclyy50qij2rha6fn1i4c86bfdl9";
     };
     meta = {
       license = [ { fullName = "Repoze Public License"; } { fullName = "BSD-derived (http://www.repoze.org/LICENSE.txt)"; } ];
@@ -417,14 +440,14 @@ self: super: {
     };
   };
   "dogpile.cache" = super.buildPythonPackage {
-    name = "dogpile.cache-0.
+    name = "dogpile.cache-0.9.0";
     doCheck = false;
     propagatedBuildInputs = [
       self."decorator"
     ];
     src = fetchurl {
-      url = "https://files.pythonhosted.org/packages/84
+      url = "https://files.pythonhosted.org/packages/ac/6a/9ac405686a94b7f009a20a50070a5786b0e1aedc707b88d40d0c4b51a82e/dogpile.cache-0.9.0.tar.gz";
-      sha256 = "0caazmrzhnfqb5yrp8myhw61ny637jj69wcngrpbvi31jlcpy6v9";
+      sha256 = "0sr1fn6b4k5bh0cscd9yi8csqxvj4ngzildav58x5p694mc86j5k";
     };
     meta = {
       license = [ pkgs.lib.licenses.bsdOriginal ];
@@ -514,14 +537,14 @@ self: super: {
     };
   };
   "elasticsearch2" = super.buildPythonPackage {
-    name = "elasticsearch2-2.5.
+    name = "elasticsearch2-2.5.1";
     doCheck = false;
     propagatedBuildInputs = [
       self."urllib3"
     ];
     src = fetchurl {
-      url = "https://files.pythonhosted.org/packages/
+      url = "https://files.pythonhosted.org/packages/f6/09/f9b24aa6b1120bea371cd57ef6f57c7694cf16660469456a8be6c2bdbe22/elasticsearch2-2.5.1.tar.gz";
-      sha256 = "0ky0q16lbvz022yv6q3pix7aamf026p1y994537ccjf0p0dxnbxr";
+      sha256 = "19k2znpjfyp0hrq73cz7pjyj289040xpsxsm0xhh4jfh6y551g7k";
     };
     meta = {
       license = [ pkgs.lib.licenses.asl20 ];
@@ -666,16 +689,44 @@ self: super: {
     };
   };
   "hupper" = super.buildPythonPackage {
-    name = "hupper-1.
+    name = "hupper-1.9.1";
     doCheck = false;
     src = fetchurl {
-      url = "https://files.pythonhosted.org/packages/
+      url = "https://files.pythonhosted.org/packages/09/3a/4f215659f31eeffe364a984dba486bfa3907bfcc54b7013bdfe825cebb5f/hupper-1.9.1.tar.gz";
-      sha256 = "0d3cvkc8ssgwk54wvhbifj56ry97qi10pfzwfk8vwzzcikbfp3zy";
+      sha256 = "0pyg879fv9mbwlnbzw2a3234qqycqs9l97h5mpkmk0bvxhi2471v";
     };
     meta = {
       license = [ pkgs.lib.licenses.mit ];
     };
   };
+  "idna" = super.buildPythonPackage {
+    name = "idna-2.8";
+    doCheck = false;
+    src = fetchurl {
+      url = "https://files.pythonhosted.org/packages/ad/13/eb56951b6f7950cadb579ca166e448ba77f9d24efc03edd7e55fa57d04b7/idna-2.8.tar.gz";
+      sha256 = "01rlkigdxg17sf9yar1jl8n18ls59367wqh59hnawlyg53vb6my3";
+    };
+    meta = {
+      license = [ pkgs.lib.licenses.bsdOriginal { fullName = "BSD-like"; } ];
+    };
+  };
+  "importlib-metadata" = super.buildPythonPackage {
+    name = "importlib-metadata-0.23";
+    doCheck = false;
+    propagatedBuildInputs = [
+      self."zipp"
+      self."contextlib2"
+      self."configparser"
+      self."pathlib2"
+    ];
+    src = fetchurl {
+      url = "https://files.pythonhosted.org/packages/5d/44/636bcd15697791943e2dedda0dbe098d8530a38d113b202817133e0b06c0/importlib_metadata-0.23.tar.gz";
+      sha256 = "09mdqdfv5rdrwz80jh9m379gxmvk2vhjfz0fg53hid00icvxf65a";
+    };
+    meta = {
+      license = [ pkgs.lib.licenses.asl20 ];
+    };
+  };
   "infrae.cache" = super.buildPythonPackage {
     name = "infrae.cache-1.0.1";
     doCheck = false;
@@ -703,11 +754,11 @@ self: super: {
     };
   };
   "ipaddress" = super.buildPythonPackage {
-    name = "ipaddress-1.0.2
+    name = "ipaddress-1.0.23";
     doCheck = false;
     src = fetchurl {
-      url = "https://files.pythonhosted.org/packages/97/8d/77b8cedcfbf93676148518036c6b1ce7f8e14bf07e95d7fd4ddcb8cc052f/ipaddress-1.0.22.tar.gz";
+      url = "https://files.pythonhosted.org/packages/b9/9a/3e9da40ea28b8210dd6504d3fe9fe7e013b62bf45902b458d1cdc3c34ed9/ipaddress-1.0.23.tar.gz";
-      sha256 = "0b570bm6xqpjwqis15pvdy6lyvvzfndjvkynilcddjj5x98wfimi";
+      sha256 = "1qp743h30s04m3cg3yk3fycad930jv17q7dsslj4mfw0jlvf1y5p";
     };
     meta = {
       license = [ pkgs.lib.licenses.psfl ];
@@ -859,14 +910,15 @@ self: super: {
     };
   };
   "kombu" = super.buildPythonPackage {
-    name = "kombu-4.
+    name = "kombu-4.6.6";
     doCheck = false;
     propagatedBuildInputs = [
       self."amqp"
+      self."importlib-metadata"
     ];
     src = fetchurl {
-      url = "https://files.pythonhosted.org/packages/
+      url = "https://files.pythonhosted.org/packages/20/e6/bc2d9affba6138a1dc143f77fef253e9e08e238fa7c0688d917c09005e96/kombu-4.6.6.tar.gz";
-      sha256 = "10lh3hncvw67fz0k5vgbx3yh9gjfpqdlia1f13i28cgnc1nfrbc6";
+      sha256 = "11mxpcy8mg1l35bgbhba70v29bydr2hrhdbdlb4lg98m3m5vaq0p";
    };
     meta = {
       license = [ pkgs.lib.licenses.bsdOriginal ];
@@ -884,14 +936,14 @@ self: super: {
     };
   };
   "mako" = super.buildPythonPackage {
-    name = "mako-1.0
+    name = "mako-1.1.0";
     doCheck = false;
     propagatedBuildInputs = [
       self."markupsafe"
     ];
     src = fetchurl {
-      url = "https://files.pythonhosted.org/packages/e
+      url = "https://files.pythonhosted.org/packages/b0/3c/8dcd6883d009f7cae0f3157fb53e9afb05a0d3d33b3db1268ec2e6f4a56b/Mako-1.1.0.tar.gz";
-      sha256 = "1bi5gnr8r8dva06qpyx4kgjc6spm2k1y908183nbbaylggjzs0jf";
+      sha256 = "0jqa3qfpykyn4fmkn0kh6043sfls7br8i2bsdbccazcvk9cijsd3";
     };
     meta = {
       license = [ pkgs.lib.licenses.mit ];
@@ -909,25 +961,14 @@ self: super: {
     };
   };
   "markupsafe" = super.buildPythonPackage {
-    name = "markupsafe-1.1.
+    name = "markupsafe-1.1.1";
     doCheck = false;
     src = fetchurl {
-      url = "https://files.pythonhosted.org/packages/
+      url = "https://files.pythonhosted.org/packages/b9/2e/64db92e53b86efccfaea71321f597fa2e1b2bd3853d8ce658568f7a13094/MarkupSafe-1.1.1.tar.gz";
-      sha256 = "1lxirjypbdd3l9jl4vliilhfnhy7c7f2vlldqg1b0i74khn375sf";
+      sha256 = "0sqipg4fk7xbixqd8kq6rlkxj664d157bdwbh93farcphf92x1r9";
     };
     meta = {
-      license = [ pkgs.lib.licenses.bsdOriginal ];
+      license = [ pkgs.lib.licenses.bsdOriginal pkgs.lib.licenses.bsd3 ];
-    };
-  };
-  "meld3" = super.buildPythonPackage {
-    name = "meld3-1.0.2";
-    doCheck = false;
-    src = fetchurl {
-      url = "https://files.pythonhosted.org/packages/45/a0/317c6422b26c12fe0161e936fc35f36552069ba8e6f7ecbd99bbffe32a5f/meld3-1.0.2.tar.gz";
-      sha256 = "0n4mkwlpsqnmn0dm0wm5hn9nkda0nafl0jdy5sdl5977znh59dzp";
-    };
-    meta = {
-      license = [ { fullName = "BSD-derived (http://www.repoze.org/LICENSE.txt)"; } ];
     };
   };
   "mistune" = super.buildPythonPackage {
@@ -942,14 +983,18 @@ self: super: {
     };
   };
   "mock" = super.buildPythonPackage {
-    name = "mock-
+    name = "mock-3.0.5";
     doCheck = false;
+    propagatedBuildInputs = [
+      self."six"
+      self."funcsigs"
+    ];
     src = fetchurl {
-      url = "https://files.pythonhosted.org/packages/a2/52/7edcd94f0afb721a2d559a5b9aae8af4f8f2c79bc63fdbe8a8a6c9b23bbe/mock-1.0.1.tar.gz";
+      url = "https://files.pythonhosted.org/packages/2e/ab/4fe657d78b270aa6a32f027849513b829b41b0f28d9d8d7f8c3d29ea559a/mock-3.0.5.tar.gz";
-      sha256 = "0kzlsbki6q0awf89rc287f3aj8x431lrajf160a70z0ikhnxsfdq";
+      sha256 = "1hrp6j0yrx2xzylfv02qa8kph661m6yq4p0mc8fnimch9j4psrc3";
     };
     meta = {
-      license = [ pkgs.lib.licenses.bsdOriginal ];
+      license = [ pkgs.lib.licenses.bsdOriginal { fullName = "OSI Approved :: BSD License"; } ];
     };
   };
   "more-itertools" = super.buildPythonPackage {
@@ -1029,14 +1074,18 @@ self: super: {
     };
   };
   "packaging" = super.buildPythonPackage {
-    name = "packaging-1
+    name = "packaging-19.2";
     doCheck = false;
+    propagatedBuildInputs = [
+      self."pyparsing"
+      self."six"
+    ];
     src = fetchurl {
-      url = "https://files.pythonhosted.org/packages/24
+      url = "https://files.pythonhosted.org/packages/5a/2f/449ded84226d0e2fda8da9252e5ee7731bdf14cd338f622dfcd9934e0377/packaging-19.2.tar.gz";
-      sha256 = "1zn60w84bxvw6wypffka18ca66pa1k2cfrq3cq8fnsfja5m3k4ng";
+      sha256 = "0izwlz9h0bw171a1chr311g2y7n657zjaf4mq4rgm8pp9lbj9f98";
     };
     meta = {
-      license = [ pkgs.lib.licenses.asl20 ];
+      license = [ pkgs.lib.licenses.bsdOriginal { fullName = "BSD or Apache License, Version 2.0"; } pkgs.lib.licenses.asl20 ];
     };
   };
   "pandocfilters" = super.buildPythonPackage {
@@ -1051,14 +1100,14 @@ self: super: {
     };
   };
   "paste" = super.buildPythonPackage {
-    name = "paste-3.
+    name = "paste-3.2.1";
     doCheck = false;
     propagatedBuildInputs = [
       self."six"
     ];
     src = fetchurl {
-      url = "https://files.pythonhosted.org/packages/
+      url = "https://files.pythonhosted.org/packages/0d/86/7008b5563594e8a63763f05212a3eb84c85f0b2eff834e5697716e56bca9/Paste-3.2.1.tar.gz";
-      sha256 = "05w1sh6ky4d7pmdb8nv82n13w22jcn3qsagg5ih3hjmbws9kkwf4";
+      sha256 = "1vjxr8n1p31c9x9rh8g0f34yisa9028cxpvn36q7g1s0m2b9x71x";
     };
     meta = {
       license = [ pkgs.lib.licenses.mit ];
@@ -1076,7 +1125,7 @@ self: super: {
     };
   };
   "pastescript" = super.buildPythonPackage {
-    name = "pastescript-3.
+    name = "pastescript-3.2.0";
     doCheck = false;
     propagatedBuildInputs = [
       self."paste"
@@ -1084,23 +1133,23 @@ self: super: {
       self."six"
     ];
     src = fetchurl {
-      url = "https://files.pythonhosted.org/packages/9
+      url = "https://files.pythonhosted.org/packages/ff/47/45c6f5a3cb8f5abf786fea98dbb8d02400a55768a9b623afb7df12346c61/PasteScript-3.2.0.tar.gz";
-      sha256 = "02qcxjjr32ks7a6d4f533wl34ysc7yhwlrfcyqwqbzr52250v4fs";
+      sha256 = "1b3jq7xh383nvrrlblk05m37345bv97xrhx77wshllba3h7mq3wv";
     };
     meta = {
       license = [ pkgs.lib.licenses.mit ];
     };
   };
   "pathlib2" = super.buildPythonPackage {
-    name = "pathlib2-2.3.
+    name = "pathlib2-2.3.5";
     doCheck = false;
     propagatedBuildInputs = [
       self."six"
       self."scandir"
     ];
     src = fetchurl {
-      url = "https://files.pythonhosted.org/packages/
+      url = "https://files.pythonhosted.org/packages/94/d8/65c86584e7e97ef824a1845c72bbe95d79f5b306364fa778a3c3e401b309/pathlib2-2.3.5.tar.gz";
-      sha256 = "1y0f9rkm1924zrc5dn4bwxlhgdkbml82lkcc28l5rgmr7d918q24";
+      sha256 = "0s4qa8c082fdkb17izh4mfgwrjd1n5pya18wvrbwqdvvb5xs9nbc";
     };
     meta = {
       license = [ pkgs.lib.licenses.mit ];
@@ -1175,48 +1224,51 @@ self: super: {
     };
   };
   "pluggy" = super.buildPythonPackage {
-    name = "pluggy-0.11
+    name = "pluggy-0.13.1";
     doCheck = false;
+    propagatedBuildInputs = [
+      self."importlib-metadata"
+    ];
     src = fetchurl {
-      url = "https://files.pythonhosted.org/packages/0
+      url = "https://files.pythonhosted.org/packages/f8/04/7a8542bed4b16a65c2714bf76cf5a0b026157da7f75e87cc88774aa10b14/pluggy-0.13.1.tar.gz";
-      sha256 = "10511a54dvafw1jrk75mrhml53c7b7w4yaw7241696lc2hfvr895";
+      sha256 = "1c35qyhvy27q9ih9n899f3h4sdnpgq027dbiilly2qb5cvgarchm";
     };
     meta = {
       license = [ pkgs.lib.licenses.mit ];
     };
   };
   "prompt-toolkit" = super.buildPythonPackage {
-    name = "prompt-toolkit-1.0.1
+    name = "prompt-toolkit-1.0.18";
     doCheck = false;
     propagatedBuildInputs = [
       self."six"
       self."wcwidth"
     ];
     src = fetchurl {
-      url = "https://files.pythonhosted.org/packages/f
+      url = "https://files.pythonhosted.org/packages/c5/64/c170e5b1913b540bf0c8ab7676b21fdd1d25b65ddeb10025c6ca43cccd4c/prompt_toolkit-1.0.18.tar.gz";
-      sha256 = "1d65hm6nf0cbq0q0121m60zzy4s1fpg9fn761s1yxf08dridvkn1";
+      sha256 = "09h1153wgr5x2ny7ds0w2m81n3bb9j8hjb8sjfnrg506r01clkyx";
     };
     meta = {
       license = [ pkgs.lib.licenses.bsdOriginal ];
     };
   };
   "psutil" = super.buildPythonPackage {
-    name = "psutil-5.5
+    name = "psutil-5.6.5";
     doCheck = false;
     src = fetchurl {
-      url = "https://files.pythonhosted.org/packages/c
+      url = "https://files.pythonhosted.org/packages/03/9a/95c4b3d0424426e5fd94b5302ff74cea44d5d4f53466e1228ac8e73e14b4/psutil-5.6.5.tar.gz";
-      sha256 = "045qaqvn6k90bj5bcy259yrwcd2afgznaav3sfhphy9b8ambzkkj";
+      sha256 = "0isil5jxwwd8awz54qk28rpgjg43i5l6yl70g40vxwa4r4m56lfh";
    };
     meta = {
       license = [ pkgs.lib.licenses.bsdOriginal ];
     };
   };
   "psycopg2" = super.buildPythonPackage {
-    name = "psycopg2-2.8.
+    name = "psycopg2-2.8.4";
     doCheck = false;
     src = fetchurl {
-      url = "https://files.pythonhosted.org/packages/
+      url = "https://files.pythonhosted.org/packages/84/d7/6a93c99b5ba4d4d22daa3928b983cec66df4536ca50b22ce5dcac65e4e71/psycopg2-2.8.4.tar.gz";
-      sha256 = "0ms4kx0p5n281l89awccix4d05ybmdngnjjpi9jbzd0rhf1nwyl9";
+      sha256 = "1djvh98pi4hjd8rxbq8qzc63bg8v78k33yg6pl99wak61b6fb67q";
     };
     meta = {
       license = [ pkgs.lib.licenses.zpl21 { fullName = "GNU Library or Lesser General Public License (LGPL)"; } { fullName = "LGPL with exceptions or ZPL"; } ];
@@ -1234,11 +1286,11 @@ self: super: {
     };
   };
   "py" = super.buildPythonPackage {
-    name = "py-1.
+    name = "py-1.8.0";
     doCheck = false;
     src = fetchurl {
-      url = "https://files.pythonhosted.org/packages/4f/38/5f427d1eedae73063ce4da680d2bae72014995f9fdeaa57809df61c968cd/py-1.6.0.tar.gz";
+      url = "https://files.pythonhosted.org/packages/f1/5a/87ca5909f400a2de1561f1648883af74345fe96349f34f737cdfc94eba8c/py-1.8.0.tar.gz";
-      sha256 = "1wcs3zv9wl5m5x7p16avqj2gsrviyb23yvc3pr330isqs0sh98q6";
+      sha256 = "0lsy1gajva083pzc7csj1cvbmminb7b4l6a0prdzyb3fd829nqyw";
     };
     meta = {
       license = [ pkgs.lib.licenses.mit ];
@@ -1271,28 +1323,28 @@ self: super: {
     };
   };
   "pyasn1" = super.buildPythonPackage {
-    name = "pyasn1-0.4.
+    name = "pyasn1-0.4.8";
     doCheck = false;
     src = fetchurl {
-      url = "https://files.pythonhosted.org/packages/46
+      url = "https://files.pythonhosted.org/packages/a4/db/fffec68299e6d7bad3d504147f9094830b704527a7fc098b721d38cc7fa7/pyasn1-0.4.8.tar.gz";
-      sha256 = "1xqh3jh2nfi2bflk5a0vn59y3pp1vn54f3ksx652sid92gz2096s";
+      sha256 = "1fnhbi3rmk47l9851gbik0flfr64vs5j0hbqx24cafjap6gprxxf";
     };
     meta = {
       license = [ pkgs.lib.licenses.bsdOriginal ];
     };
   };
   "pyasn1-modules" = super.buildPythonPackage {
-    name = "pyasn1-modules-0.2.
+    name = "pyasn1-modules-0.2.6";
     doCheck = false;
     propagatedBuildInputs = [
      self."pyasn1"
     ];
     src = fetchurl {
-      url = "https://files.pythonhosted.org/packages/ec
+      url = "https://files.pythonhosted.org/packages/f1/a9/a1ef72a0e43feff643cf0130a08123dea76205e7a0dda37e3efb5f054a31/pyasn1-modules-0.2.6.tar.gz";
-      sha256 = "15nvfx0vnl8akdlv3k6s0n80vqvryj82bm040jdsn7wmyxl1ywpg";
+      sha256 = "08hph9j1r018drnrny29l7dl2q0cin78csswrhwrh8jmq61pmha3";
     };
     meta = {
-      license = [ pkgs.lib.licenses.bsdOriginal ];
+      license = [ pkgs.lib.licenses.bsdOriginal pkgs.lib.licenses.bsd2 ];
     };
   };
   "pycparser" = super.buildPythonPackage {
@@ -1318,11 +1370,11 @@ self: super: {
     };
   };
   "pycurl" = super.buildPythonPackage {
-    name = "pycurl-7.43.0.
+    name = "pycurl-7.43.0.3";
     doCheck = false;
     src = fetchurl {
-      url = "https://files.pythonhosted.org/packages/
+      url = "https://files.pythonhosted.org/packages/ac/b3/0f3979633b7890bab6098d84c84467030b807a1e2b31f5d30103af5a71ca/pycurl-7.43.0.3.tar.gz";
-      sha256 = "1915kb04k1j4y6k1dx1sgnbddxrl9r1n4q928if2lkrdm73xy30g";
+      sha256 = "13nsvqhvnmnvfk75s8iynqsgszyv06cjp4drd3psi7zpbh63623g";
     };
     meta = {
       license = [ pkgs.lib.licenses.mit { fullName = "LGPL/MIT"; } { fullName = "GNU Library or Lesser General Public License (LGPL)"; } ];
@@ -1351,22 +1403,22 @@ self: super: {
     };
   };
   "pyotp" = super.buildPythonPackage {
-    name = "pyotp-2.
+    name = "pyotp-2.3.0";
     doCheck = false;
     src = fetchurl {
-      url = "https://files.pythonhosted.org/packages/b1
+      url = "https://files.pythonhosted.org/packages/f7/15/395c4945ea6bc37e8811280bb675615cb4c2b2c1cd70bdc43329da91a386/pyotp-2.3.0.tar.gz";
-      sha256 = "00p69nw431f0s2ilg0hnd77p1l22m06p9rq4f8zfapmavnmzw3xy";
+      sha256 = "18d13ikra1iq0xyfqfm72zhgwxi2qi9ps6z1a6zmqp4qrn57wlzw";
     };
     meta = {
       license = [ pkgs.lib.licenses.mit ];
     };
   };
   "pyparsing" = super.buildPythonPackage {
-    name = "pyparsing-2.
+    name = "pyparsing-2.4.5";
     doCheck = false;
     src = fetchurl {
-      url = "https://files.pythonhosted.org/packages/
+      url = "https://files.pythonhosted.org/packages/00/32/8076fa13e832bb4dcff379f18f228e5a53412be0631808b9ca2610c0f566/pyparsing-2.4.5.tar.gz";
-      sha256 = "14k5v7n3xqw8kzf42x06bzp184spnlkya2dpjyflax6l3yrallzk";
+      sha256 = "0fk8gsybiw1gm146mkjdjvaajwh20xwvpv4j7syh2zrnpq0j19jc";
     };
     meta = {
       license = [ pkgs.lib.licenses.mit ];
@@ -1396,7 +1448,7 @@ self: super: {
     };
   };
   "pyramid-debugtoolbar" = super.buildPythonPackage {
-    name = "pyramid-debugtoolbar-4.5";
+    name = "pyramid-debugtoolbar-4.5.1";
     doCheck = false;
     propagatedBuildInputs = [
       self."pyramid"
@@ -1406,8 +1458,8 @@ self: super: {
       self."ipaddress"
     ];
     src = fetchurl {
-      url = "https://files.pythonhosted.org/packages/14
+      url = "https://files.pythonhosted.org/packages/88/21/74e7fa52edc74667e29403bd0cb4f2bb74dc4014711de313868001bf639f/pyramid_debugtoolbar-4.5.1.tar.gz";
-      sha256 = "0x2p3409pnx66n6dx5vc0mk2r1cp1ydr8mp120w44r9pwcngbibl";
+      sha256 = "0hgf6i1fzvq43m9vjdmb24nnv8fwp7sdzrx9bcwrgpy24n07am9a";
     };
     meta = {
       license = [ { fullName = "Repoze Public License"; } pkgs.lib.licenses.bsdOriginal ];
@@ -1447,15 +1499,15 @@ self: super: {
     };
   };
   "pyramid-mako" = super.buildPythonPackage {
-    name = "pyramid-mako-1.0
+    name = "pyramid-mako-1.1.0";
     doCheck = false;
     propagatedBuildInputs = [
       self."pyramid"
       self."mako"
     ];
     src = fetchurl {
-      url = "https://files.pythonhosted.org/packages/f1
+      url = "https://files.pythonhosted.org/packages/63/7b/5e2af68f675071a6bad148c1c393928f0ef5fcd94e95cbf53b89d6471a83/pyramid_mako-1.1.0.tar.gz";
-      sha256 = "18gk2vliq8z4acblsl6yzgbvnr9rlxjlcqir47km7kvlk1xri83d";
+      sha256 = "1qj0m091mnii86j2q1d82yir22nha361rvhclvg3s70z8iiwhrh0";
     };
     meta = {
       license = [ { fullName = "Repoze Public License"; } { fullName = "BSD-derived (http://www.repoze.org/LICENSE.txt)"; } ];
@@ -1473,44 +1525,46 @@ self: super: {
     };
   };
   "pytest" = super.buildPythonPackage {
-    name = "pytest-
+    name = "pytest-4.6.5";
     doCheck = false;
     propagatedBuildInputs = [
       self."py"
       self."six"
-      self."
+      self."packaging"
       self."attrs"
-      self."more-itertools"
       self."atomicwrites"
       self."pluggy"
+      self."importlib-metadata"
+      self."wcwidth"
       self."funcsigs"
       self."pathlib2"
+      self."more-itertools"
     ];
     src = fetchurl {
-      url = "https://files.pythonhosted.org/packages/5f
+      url = "https://files.pythonhosted.org/packages/2a/c6/1d1f32f6a5009900521b12e6560fb6b7245b0d4bc3fb771acd63d10e30e1/pytest-4.6.5.tar.gz";
-      sha256 = "18nrwzn61kph2y6gxwfz9ms68rfvr9d4vcffsxng9p7jk9z18clk";
+      sha256 = "0iykwwfp4h181nd7rsihh2120b0rkawlw7rvbl19sgfspncr3hwg";
     };
     meta = {
       license = [ pkgs.lib.licenses.mit ];
     };
   };
   "pytest-cov" = super.buildPythonPackage {
-    name = "pytest-cov-2.
+    name = "pytest-cov-2.7.1";
    doCheck = false;
     propagatedBuildInputs = [
       self."pytest"
       self."coverage"
     ];
     src = fetchurl {
-      url = "https://files.pythonhosted.org/packages/d9/e2/58f90a316fbd94dd50bf5c826a23f3f5d079fb3cc448c1e9f0e3c33a3d2a/pytest-cov-2.6.0.tar.gz";
+      url = "https://files.pythonhosted.org/packages/bb/0f/3db7ff86801883b21d5353b258c994b1b8e2abbc804e2273b8d0fd19004b/pytest-cov-2.7.1.tar.gz";
-      sha256 = "0qnpp9y3ygx4jk4pf5ad71fh2skbvnr6gl54m7rg5qysnx4g0q73";
+      sha256 = "0filvmmyqm715azsl09ql8hy2x7h286n6d8z5x42a1wpvvys83p0";
     };
     meta = {
       license = [ pkgs.lib.licenses.bsdOriginal pkgs.lib.licenses.mit ];
     };
   };
   "pytest-profiling" = super.buildPythonPackage {
-    name = "pytest-profiling-1.
+    name = "pytest-profiling-1.7.0";
     doCheck = false;
     propagatedBuildInputs = [
       self."six"
@@ -1518,62 +1572,63 @@ self: super: {
       self."gprof2dot"
     ];
     src = fetchurl {
-      url = "https://files.pythonhosted.org/packages/f
+      url = "https://files.pythonhosted.org/packages/39/70/22a4b33739f07f1732a63e33bbfbf68e0fa58cfba9d200e76d01921eddbf/pytest-profiling-1.7.0.tar.gz";
-      sha256 = "08r5afx5z22yvpmsnl91l4amsy1yxn8qsmm61mhp06mz8zjs51kb";
+      sha256 = "0abz9gi26jpcfdzgsvwad91555lpgdc8kbymicmms8k2fqa8z4wk";
     };
     meta = {
       license = [ pkgs.lib.licenses.mit ];
     };
   };
   "pytest-runner" = super.buildPythonPackage {
-    name = "pytest-runner-
+    name = "pytest-runner-5.1";
     doCheck = false;
     src = fetchurl {
-      url = "https://files.pythonhosted.org/packages/9e
+      url = "https://files.pythonhosted.org/packages/d9/6d/4b41a74b31720e25abd4799be72d54811da4b4d0233e38b75864dcc1f7ad/pytest-runner-5.1.tar.gz";
-      sha256 = "1gkpyphawxz38ni1gdq1fmwyqcg02m7ypzqvv46z06crwdxi2gyj";
+      sha256 = "0ykfcnpp8c22winj63qzc07l5axwlc9ikl8vn05sc32gv3417815";
     };
     meta = {
       license = [ pkgs.lib.licenses.mit ];
     };
   };
   "pytest-sugar" = super.buildPythonPackage {
-    name = "pytest-sugar-0.9.
+    name = "pytest-sugar-0.9.2";
     doCheck = false;
     propagatedBuildInputs = [
       self."pytest"
       self."termcolor"
+      self."packaging"
     ];
     src = fetchurl {
-      url = "https://files.pythonhosted.org/packages/3e/6a/a3f909083079d03bde11d06ab23088886bbe25f2c97fbe4bb865e2bf05bc/pytest-sugar-0.9.1.tar.gz";
+      url = "https://files.pythonhosted.org/packages/55/59/f02f78d1c80f7e03e23177f60624c8106d4f23d124c921df103f65692464/pytest-sugar-0.9.2.tar.gz";
-      sha256 = "0b4av40dv30727m54v211r0nzwjp2ajkjgxix6j484qjmwpw935b";
+      sha256 = "1asq7yc4g8bx2sn7yy974mhc9ywvaihasjab4inkirdwn9s7mn7w";
     };
     meta = {
       license = [ pkgs.lib.licenses.bsdOriginal ];
     };
   };
   "pytest-timeout" = super.buildPythonPackage {
-    name = "pytest-timeout-1.3.
+    name = "pytest-timeout-1.3.3";
     doCheck = false;
     propagatedBuildInputs = [
       self."pytest"
     ];
     src = fetchurl {
-      url = "https://files.pythonhosted.org/packages/8c
+      url = "https://files.pythonhosted.org/packages/13/48/7a166eaa29c1dca6cc253e3ba5773ff2e4aa4f567c1ea3905808e95ac5c1/pytest-timeout-1.3.3.tar.gz";
-      sha256 = "09wnmzvnls2mnsdz7x3c3sk2zdp6jl4dryvyj5i8hqz16q2zq5qi";
+      sha256 = "1cczcjhw4xx5sjkhxlhc5c1bkr7x6fcyx12wrnvwfckshdvblc2a";
     };
     meta = {
       license = [ pkgs.lib.licenses.mit { fullName = "DFSG approved"; } ];
     };
   };
   "python-dateutil" = super.buildPythonPackage {
-    name = "python-dateutil-2.8.
+    name = "python-dateutil-2.8.1";
     doCheck = false;
     propagatedBuildInputs = [
       self."six"
     ];
     src = fetchurl {
-      url = "https://files.pythonhosted.org/packages/
+      url = "https://files.pythonhosted.org/packages/be/ed/5bbc91f03fa4c839c4c7360375da77f9659af5f7086b7a7bdda65771c8e0/python-dateutil-2.8.1.tar.gz";
-      sha256 = "17nsfhy4xdz1khrfxa61vd7pmvd5z0wa3zb6v4gb4kfnykv0b668";
+      sha256 = "0g42w7k5007iv9dam6gnja2ry8ydwirh99mgdll35s12pyfzxsvk";
     };
     meta = {
       license = [ pkgs.lib.licenses.bsdOriginal pkgs.lib.licenses.asl20 { fullName = "Dual License"; } ];
@@ -1647,11 +1702,11 @@ self: super: {
     };
   };
   "pytz" = super.buildPythonPackage {
-    name = "pytz-201
+    name = "pytz-2019.2";
     doCheck = false;
     src = fetchurl {
-      url = "https://files.pythonhosted.org/packages/
+      url = "https://files.pythonhosted.org/packages/27/c0/fbd352ca76050952a03db776d241959d5a2ee1abddfeb9e2a53fdb489be4/pytz-2019.2.tar.gz";
-      sha256 = "0jgpqx3kk2rhv81j1izjxvmx8d0x7hzs1857pgqnixic5wq2ar60";
+      sha256 = "0ckb27hhjc8i8gcdvk4d9avld62b7k52yjijc60s2m3y8cpb7h16";
     };
     meta = {
1657 | license = [ pkgs.lib.licenses.mit ]; |
|
1712 | license = [ pkgs.lib.licenses.mit ]; | |
@@ -1669,11 +1724,11 b' self: super: {' | |||||
1669 | }; |
|
1724 | }; | |
1670 | }; |
|
1725 | }; | |
1671 | "redis" = super.buildPythonPackage { |
|
1726 | "redis" = super.buildPythonPackage { | |
1672 |
name = "redis- |
|
1727 | name = "redis-3.3.11"; | |
1673 | doCheck = false; |
|
1728 | doCheck = false; | |
1674 | src = fetchurl { |
|
1729 | src = fetchurl { | |
1675 |
url = "https://files.pythonhosted.org/packages/09 |
|
1730 | url = "https://files.pythonhosted.org/packages/06/ca/00557c74279d2f256d3c42cabf237631355f3a132e4c74c2000e6647ad98/redis-3.3.11.tar.gz"; | |
1676 | sha256 = "03vcgklykny0g0wpvqmy8p6azi2s078317wgb2xjv5m2rs9sjb52"; |
|
1731 | sha256 = "1hicqbi5xl92hhml82awrr2rxl9jar5fp8nbcycj9qgmsdwc43wd"; | |
1677 | }; |
|
1732 | }; | |
1678 | meta = { |
|
1733 | meta = { | |
1679 | license = [ pkgs.lib.licenses.mit ]; |
|
1734 | license = [ pkgs.lib.licenses.mit ]; | |
@@ -1707,18 +1762,24 b' self: super: {' | |||||
1707 | }; |
|
1762 | }; | |
1708 | }; |
|
1763 | }; | |
1709 | "requests" = super.buildPythonPackage { |
|
1764 | "requests" = super.buildPythonPackage { | |
1710 |
name = "requests-2. |
|
1765 | name = "requests-2.22.0"; | |
1711 | doCheck = false; |
|
1766 | doCheck = false; | |
|
1767 | propagatedBuildInputs = [ | |||
|
1768 | self."chardet" | |||
|
1769 | self."idna" | |||
|
1770 | self."urllib3" | |||
|
1771 | self."certifi" | |||
|
1772 | ]; | |||
1712 | src = fetchurl { |
|
1773 | src = fetchurl { | |
1713 |
url = "https://files.pythonhosted.org/packages/ |
|
1774 | url = "https://files.pythonhosted.org/packages/01/62/ddcf76d1d19885e8579acb1b1df26a852b03472c0e46d2b959a714c90608/requests-2.22.0.tar.gz"; | |
1714 | sha256 = "0zsqrzlybf25xscgi7ja4s48y2abf9wvjkn47wh984qgs1fq2xy5"; |
|
1775 | sha256 = "1d5ybh11jr5sm7xp6mz8fyc7vrp4syifds91m7sj60xalal0gq0i"; | |
1715 | }; |
|
1776 | }; | |
1716 | meta = { |
|
1777 | meta = { | |
1717 | license = [ pkgs.lib.licenses.asl20 ]; |
|
1778 | license = [ pkgs.lib.licenses.asl20 ]; | |
1718 | }; |
|
1779 | }; | |
1719 | }; |
|
1780 | }; | |
1720 | "rhodecode-enterprise-ce" = super.buildPythonPackage { |
|
1781 | "rhodecode-enterprise-ce" = super.buildPythonPackage { | |
1721 |
name = "rhodecode-enterprise-ce-4.1 |
|
1782 | name = "rhodecode-enterprise-ce-4.18.0"; | |
1722 | buildInputs = [ |
|
1783 | buildInputs = [ | |
1723 | self."pytest" |
|
1784 | self."pytest" | |
1724 | self."py" |
|
1785 | self."py" | |
@@ -1738,7 +1799,6 b' self: super: {' | |||||
1738 | doCheck = true; |
|
1799 | doCheck = true; | |
1739 | propagatedBuildInputs = [ |
|
1800 | propagatedBuildInputs = [ | |
1740 | self."amqp" |
|
1801 | self."amqp" | |
1741 | self."authomatic" |
|
|||
1742 | self."babel" |
|
1802 | self."babel" | |
1743 | self."beaker" |
|
1803 | self."beaker" | |
1744 | self."bleach" |
|
1804 | self."bleach" | |
@@ -1808,7 +1868,6 b' self: super: {' | |||||
1808 | self."venusian" |
|
1868 | self."venusian" | |
1809 | self."weberror" |
|
1869 | self."weberror" | |
1810 | self."webhelpers2" |
|
1870 | self."webhelpers2" | |
1811 | self."webhelpers" |
|
|||
1812 | self."webob" |
|
1871 | self."webob" | |
1813 | self."whoosh" |
|
1872 | self."whoosh" | |
1814 | self."wsgiref" |
|
1873 | self."wsgiref" | |
@@ -1823,6 +1882,7 b' self: super: {' | |||||
1823 | self."nbconvert" |
|
1882 | self."nbconvert" | |
1824 | self."nbformat" |
|
1883 | self."nbformat" | |
1825 | self."jupyter-client" |
|
1884 | self."jupyter-client" | |
|
1885 | self."jupyter-core" | |||
1826 | self."alembic" |
|
1886 | self."alembic" | |
1827 | self."invoke" |
|
1887 | self."invoke" | |
1828 | self."bumpversion" |
|
1888 | self."bumpversion" | |
@@ -1854,7 +1914,7 b' self: super: {' | |||||
1854 | }; |
|
1914 | }; | |
1855 | }; |
|
1915 | }; | |
1856 | "rhodecode-tools" = super.buildPythonPackage { |
|
1916 | "rhodecode-tools" = super.buildPythonPackage { | |
1857 |
name = "rhodecode-tools-1. |
|
1917 | name = "rhodecode-tools-1.4.0"; | |
1858 | doCheck = false; |
|
1918 | doCheck = false; | |
1859 | propagatedBuildInputs = [ |
|
1919 | propagatedBuildInputs = [ | |
1860 | self."click" |
|
1920 | self."click" | |
@@ -1871,8 +1931,8 b' self: super: {' | |||||
1871 | self."elasticsearch1-dsl" |
|
1931 | self."elasticsearch1-dsl" | |
1872 | ]; |
|
1932 | ]; | |
1873 | src = fetchurl { |
|
1933 | src = fetchurl { | |
1874 |
url = "https://code.rhodecode.com/rhodecode-tools-ce/artifacts/download/0- |
|
1934 | url = "https://code.rhodecode.com/rhodecode-tools-ce/artifacts/download/0-ed54e749-2ef5-4bc7-ae7f-7900e3c2aa15.tar.gz?sha256=76f024bad3a1e55fdb3d64f13f5b77ff21a12fee699918de2110fe21effd5a3a"; | |
1875 | sha256 = "1vfhgf46inbx7jvlfx4fdzh3vz7lh37r291gzb5hx447pfm3qllg"; |
|
1935 | sha256 = "0fjszppj3zhh47g1i6b9xqps28gzfxdkzwb47pdmzrd1sfx29w3n"; | |
1876 | }; |
|
1936 | }; | |
1877 | meta = { |
|
1937 | meta = { | |
1878 | license = [ { fullName = "Apache 2.0 and Proprietary"; } ]; |
|
1938 | license = [ { fullName = "Apache 2.0 and Proprietary"; } ]; | |
@@ -1916,11 +1976,11 b' self: super: {' | |||||
1916 | }; |
|
1976 | }; | |
1917 | }; |
|
1977 | }; | |
1918 | "setuptools" = super.buildPythonPackage { |
|
1978 | "setuptools" = super.buildPythonPackage { | |
1919 |
name = "setuptools-4 |
|
1979 | name = "setuptools-44.0.0"; | |
1920 | doCheck = false; |
|
1980 | doCheck = false; | |
1921 | src = fetchurl { |
|
1981 | src = fetchurl { | |
1922 |
url = "https://files.pythonhosted.org/packages/ |
|
1982 | url = "https://files.pythonhosted.org/packages/b0/f3/44da7482ac6da3f36f68e253cb04de37365b3dba9036a3c70773b778b485/setuptools-44.0.0.zip"; | |
1923 | sha256 = "04sns22y2hhsrwfy1mha2lgslvpjsjsz8xws7h2rh5a7ylkd28m2"; |
|
1983 | sha256 = "025h5cnxcmda1893l6i12hrwdvs1n8r31qs6q4pkif2v7rrggfp5"; | |
1924 | }; |
|
1984 | }; | |
1925 | meta = { |
|
1985 | meta = { | |
1926 | license = [ pkgs.lib.licenses.mit ]; |
|
1986 | license = [ pkgs.lib.licenses.mit ]; | |
@@ -1960,11 +2020,11 b' self: super: {' | |||||
1960 | }; |
|
2020 | }; | |
1961 | }; |
|
2021 | }; | |
1962 | "sqlalchemy" = super.buildPythonPackage { |
|
2022 | "sqlalchemy" = super.buildPythonPackage { | |
1963 |
name = "sqlalchemy-1. |
|
2023 | name = "sqlalchemy-1.3.11"; | |
1964 | doCheck = false; |
|
2024 | doCheck = false; | |
1965 | src = fetchurl { |
|
2025 | src = fetchurl { | |
1966 |
url = "https://files.pythonhosted.org/packages/cc |
|
2026 | url = "https://files.pythonhosted.org/packages/34/5c/0e1d7ad0ca52544bb12f9cb8d5cc454af45821c92160ffedd38db0a317f6/SQLAlchemy-1.3.11.tar.gz"; | |
1967 | sha256 = "1ab4ysip6irajfbxl9wy27kv76miaz8h6759hfx92499z4dcf3lb"; |
|
2027 | sha256 = "12izpqqgy738ndn7qqn962qxi8qw2xb9vg2i880x12paklg599dg"; | |
1968 | }; |
|
2028 | }; | |
1969 | meta = { |
|
2029 | meta = { | |
1970 | license = [ pkgs.lib.licenses.mit ]; |
|
2030 | license = [ pkgs.lib.licenses.mit ]; | |
@@ -1997,14 +2057,11 b' self: super: {' | |||||
1997 | }; |
|
2057 | }; | |
1998 | }; |
|
2058 | }; | |
1999 | "supervisor" = super.buildPythonPackage { |
|
2059 | "supervisor" = super.buildPythonPackage { | |
2000 |
name = "supervisor-4.0 |
|
2060 | name = "supervisor-4.1.0"; | |
2001 | doCheck = false; |
|
2061 | doCheck = false; | |
2002 | propagatedBuildInputs = [ |
|
|||
2003 | self."meld3" |
|
|||
2004 | ]; |
|
|||
2005 | src = fetchurl { |
|
2062 | src = fetchurl { | |
2006 |
url = "https://files.pythonhosted.org/packages/ |
|
2063 | url = "https://files.pythonhosted.org/packages/de/87/ee1ad8fa533a4b5f2c7623f4a2b585d3c1947af7bed8e65bc7772274320e/supervisor-4.1.0.tar.gz"; | |
2007 | sha256 = "17hla7mx6w5m5jzkkjxgqa8wpswqmfhbhf49f692hw78fg0ans7p"; |
|
2064 | sha256 = "10q36sa1jqljyyyl7cif52akpygl5kmlqq9x91hmx53f8zh6zj1d"; | |
2008 | }; |
|
2065 | }; | |
2009 | meta = { |
|
2066 | meta = { | |
2010 | license = [ { fullName = "BSD-derived (http://www.repoze.org/LICENSE.txt)"; } ]; |
|
2067 | license = [ { fullName = "BSD-derived (http://www.repoze.org/LICENSE.txt)"; } ]; | |
@@ -2033,18 +2090,18 b' self: super: {' | |||||
2033 | }; |
|
2090 | }; | |
2034 | }; |
|
2091 | }; | |
2035 | "testpath" = super.buildPythonPackage { |
|
2092 | "testpath" = super.buildPythonPackage { | |
2036 |
name = "testpath-0.4. |
|
2093 | name = "testpath-0.4.4"; | |
2037 | doCheck = false; |
|
2094 | doCheck = false; | |
2038 | src = fetchurl { |
|
2095 | src = fetchurl { | |
2039 |
url = "https://files.pythonhosted.org/packages/0 |
|
2096 | url = "https://files.pythonhosted.org/packages/2c/b3/5d57205e896d8998d77ad12aa42ebce75cd97d8b9a97d00ba078c4c9ffeb/testpath-0.4.4.tar.gz"; | |
2040 | sha256 = "1y40hywscnnyb734pnzm55nd8r8kp1072bjxbil83gcd53cv755n"; |
|
2097 | sha256 = "0zpcmq22dz79ipvvsfnw1ykpjcaj6xyzy7ws77s5b5ql3hka7q30"; | |
2041 | }; |
|
2098 | }; | |
2042 | meta = { |
|
2099 | meta = { | |
2043 | license = [ ]; |
|
2100 | license = [ ]; | |
2044 | }; |
|
2101 | }; | |
2045 | }; |
|
2102 | }; | |
2046 | "traitlets" = super.buildPythonPackage { |
|
2103 | "traitlets" = super.buildPythonPackage { | |
2047 |
name = "traitlets-4.3. |
|
2104 | name = "traitlets-4.3.3"; | |
2048 | doCheck = false; |
|
2105 | doCheck = false; | |
2049 | propagatedBuildInputs = [ |
|
2106 | propagatedBuildInputs = [ | |
2050 | self."ipython-genutils" |
|
2107 | self."ipython-genutils" | |
@@ -2053,8 +2110,8 b' self: super: {' | |||||
2053 | self."enum34" |
|
2110 | self."enum34" | |
2054 | ]; |
|
2111 | ]; | |
2055 | src = fetchurl { |
|
2112 | src = fetchurl { | |
2056 |
url = "https://files.pythonhosted.org/packages/ |
|
2113 | url = "https://files.pythonhosted.org/packages/75/b0/43deb021bc943f18f07cbe3dac1d681626a48997b7ffa1e7fb14ef922b21/traitlets-4.3.3.tar.gz"; | |
2057 | sha256 = "0dbq7sx26xqz5ixs711k5nc88p8a0nqyz6162pwks5dpcz9d4jww"; |
|
2114 | sha256 = "1xsrwgivpkxlbr4dfndfsi098s29yqgswgjc1qqn69yxklvfw8yh"; | |
2058 | }; |
|
2115 | }; | |
2059 | meta = { |
|
2116 | meta = { | |
2060 | license = [ pkgs.lib.licenses.bsdOriginal ]; |
|
2117 | license = [ pkgs.lib.licenses.bsdOriginal ]; | |
@@ -2100,11 +2157,11 b' self: super: {' | |||||
2100 | }; |
|
2157 | }; | |
2101 | }; |
|
2158 | }; | |
2102 | "urllib3" = super.buildPythonPackage { |
|
2159 | "urllib3" = super.buildPythonPackage { | |
2103 |
name = "urllib3-1.2 |
|
2160 | name = "urllib3-1.25.2"; | |
2104 | doCheck = false; |
|
2161 | doCheck = false; | |
2105 | src = fetchurl { |
|
2162 | src = fetchurl { | |
2106 |
url = "https://files.pythonhosted.org/packages/b |
|
2163 | url = "https://files.pythonhosted.org/packages/9a/8b/ea6d2beb2da6e331e9857d0a60b79ed4f72dcbc4e2c7f2d2521b0480fda2/urllib3-1.25.2.tar.gz"; | |
2107 | sha256 = "08lwd9f3hqznyf32vnzwvp87pchx062nkbgyrf67rwlkgj0jk5fy"; |
|
2164 | sha256 = "1nq2k4pss1ihsjh02r41sqpjpm5rfqkjfysyq7g7n2i1p7c66c55"; | |
2108 | }; |
|
2165 | }; | |
2109 | meta = { |
|
2166 | meta = { | |
2110 | license = [ pkgs.lib.licenses.mit ]; |
|
2167 | license = [ pkgs.lib.licenses.mit ]; | |
@@ -2144,11 +2201,11 b' self: super: {' | |||||
2144 | }; |
|
2201 | }; | |
2145 | }; |
|
2202 | }; | |
2146 | "waitress" = super.buildPythonPackage { |
|
2203 | "waitress" = super.buildPythonPackage { | |
2147 |
name = "waitress-1.3. |
|
2204 | name = "waitress-1.3.1"; | |
2148 | doCheck = false; |
|
2205 | doCheck = false; | |
2149 | src = fetchurl { |
|
2206 | src = fetchurl { | |
2150 |
url = "https://files.pythonhosted.org/packages/ |
|
2207 | url = "https://files.pythonhosted.org/packages/a6/e6/708da7bba65898e5d759ade8391b1077e49d07be0b0223c39f5be04def56/waitress-1.3.1.tar.gz"; | |
2151 | sha256 = "09j5dzbbcxib7vdskhx39s1qsydlr4n2p2png71d7mjnr9pnwajf"; |
|
2208 | sha256 = "1iysl8ka3l4cdrr0r19fh1cv28q41mwpvgsb81ji7k4shkb0k3i7"; | |
2152 | }; |
|
2209 | }; | |
2153 | meta = { |
|
2210 | meta = { | |
2154 | license = [ pkgs.lib.licenses.zpl21 ]; |
|
2211 | license = [ pkgs.lib.licenses.zpl21 ]; | |
@@ -2193,20 +2250,6 b' self: super: {' | |||||
2193 | license = [ pkgs.lib.licenses.mit ]; |
|
2250 | license = [ pkgs.lib.licenses.mit ]; | |
2194 | }; |
|
2251 | }; | |
2195 | }; |
|
2252 | }; | |
2196 | "webhelpers" = super.buildPythonPackage { |
|
|||
2197 | name = "webhelpers-1.3"; |
|
|||
2198 | doCheck = false; |
|
|||
2199 | propagatedBuildInputs = [ |
|
|||
2200 | self."markupsafe" |
|
|||
2201 | ]; |
|
|||
2202 | src = fetchurl { |
|
|||
2203 | url = "https://files.pythonhosted.org/packages/ee/68/4d07672821d514184357f1552f2dad923324f597e722de3b016ca4f7844f/WebHelpers-1.3.tar.gz"; |
|
|||
2204 | sha256 = "10x5i82qdkrvyw18gsybwggfhfpl869siaab89vnndi9x62g51pa"; |
|
|||
2205 | }; |
|
|||
2206 | meta = { |
|
|||
2207 | license = [ pkgs.lib.licenses.bsdOriginal ]; |
|
|||
2208 | }; |
|
|||
2209 | }; |
|
|||
2210 | "webhelpers2" = super.buildPythonPackage { |
|
2253 | "webhelpers2" = super.buildPythonPackage { | |
2211 | name = "webhelpers2-2.0"; |
|
2254 | name = "webhelpers2-2.0"; | |
2212 | doCheck = false; |
|
2255 | doCheck = false; | |
@@ -2283,6 +2326,20 b' self: super: {' | |||||
2283 | license = [ { fullName = "PSF or ZPL"; } ]; |
|
2326 | license = [ { fullName = "PSF or ZPL"; } ]; | |
2284 | }; |
|
2327 | }; | |
2285 | }; |
|
2328 | }; | |
|
2329 | "zipp" = super.buildPythonPackage { | |||
|
2330 | name = "zipp-0.6.0"; | |||
|
2331 | doCheck = false; | |||
|
2332 | propagatedBuildInputs = [ | |||
|
2333 | self."more-itertools" | |||
|
2334 | ]; | |||
|
2335 | src = fetchurl { | |||
|
2336 | url = "https://files.pythonhosted.org/packages/57/dd/585d728479d97d25aeeb9aa470d36a4ad8d0ba5610f84e14770128ce6ff7/zipp-0.6.0.tar.gz"; | |||
|
2337 | sha256 = "13ndkf7vklw978a4gdl1yfvn8hch28429a0iam67sg4nrp5v261p"; | |||
|
2338 | }; | |||
|
2339 | meta = { | |||
|
2340 | license = [ pkgs.lib.licenses.mit ]; | |||
|
2341 | }; | |||
|
2342 | }; | |||
2286 | "zope.cachedescriptors" = super.buildPythonPackage { |
|
2343 | "zope.cachedescriptors" = super.buildPythonPackage { | |
2287 | name = "zope.cachedescriptors-4.3.1"; |
|
2344 | name = "zope.cachedescriptors-4.3.1"; | |
2288 | doCheck = false; |
|
2345 | doCheck = false; |
@@ -9,8 +9,11 @@ vcsserver_config_http = rhodecode/tests/
 
 addopts =
     --pdbcls=IPython.terminal.debugger:TerminalPdb
+    --strict-markers
 
 markers =
     vcs_operations: Mark tests depending on a running RhodeCode instance.
     xfail_backends: Mark tests as xfail for given backends.
    skip_backends: Mark tests as skipped for given backends.
+    backends: Mark backends
+    dbs: database markers for running tests for given DB
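
The two added option lines register the project's custom markers and enable pytest's ``--strict-markers``, which turns any marker that is not declared in this config into a collection error. A minimal illustration, not part of the diff above (the marker argument is hypothetical)::

    import pytest

    # 'dbs' is registered in the config above, so this collects cleanly under
    # --strict-markers; a misspelled marker such as @pytest.mark.db would
    # abort collection instead of silently doing nothing.
    @pytest.mark.dbs("postgres")
    def test_only_relevant_for_postgres():
        assert True
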
@@ -1,13 +1,10 @@
 ## dependencies
 
-amqp==2.
-# not released authomatic that has updated some oauth providers
-https://code.rhodecode.com/upstream/authomatic/artifacts/download/0-4fe9c041-a567-4f84-be4c-7efa2a606d3c.tar.gz?md5=f6bdc3c769688212db68233e8d2b0383#egg=authomatic==0.1.0.post1
-
+amqp==2.5.2
 babel==1.3
 beaker==1.9.1
 bleach==3.1.0
-celery==4.
+celery==4.3.0
 channelstream==0.5.2
 click==7.0
 colander==1.7.0
@@ -16,9 +13,9 @@ https://code.rhodecode.com/upstream/conf
 cssselect==1.0.3
 cryptography==2.6.1
 decorator==4.1.2
-deform==2.0.
+deform==2.0.8
 docutils==0.14.0
-dogpile.cache==0.
+dogpile.cache==0.9.0
 dogpile.core==0.4.1
 formencode==1.2.4
 future==0.14.3
@@ -26,55 +23,54 @@ futures==3.0.2
 infrae.cache==1.0.1
 iso8601==0.1.12
 itsdangerous==0.24
-kombu==4.
+kombu==4.6.6
 lxml==4.2.5
-mako==1.
+mako==1.1.0
 markdown==2.6.11
-markupsafe==1.1.
+markupsafe==1.1.1
 msgpack-python==0.5.6
-pyotp==2.
-packaging==1
-pathlib2==2.3.
-paste==3.
+pyotp==2.3.0
+packaging==19.2
+pathlib2==2.3.5
+paste==3.2.1
 pastedeploy==2.0.1
-pastescript==3.
+pastescript==3.2.0
 peppercorn==0.6
-psutil==5.
+psutil==5.6.5
 py-bcrypt==0.4
-pycurl==7.43.0.
+pycurl==7.43.0.3
 pycrypto==2.6.1
 pygments==2.4.2
-pyparsing==2.
-pyramid-debugtoolbar==4.5.
-pyramid-mako==1.
+pyparsing==2.4.5
+pyramid-debugtoolbar==4.5.1
+pyramid-mako==1.1.0
 pyramid==1.10.4
 pyramid_mailer==0.15.1
-python-dateutil
+python-dateutil==2.8.1
 python-ldap==3.1.0
 python-memcached==1.59
 python-pam==1.8.4
 python-saml==2.4.2
-pytz==201
+pytz==2019.2
 tzlocal==1.5.1
 pyzmq==14.6.0
 py-gfm==0.1.4
-redis==
+redis==3.3.11
 repoze.lru==0.7
-requests==2.
+requests==2.22.0
 routes==2.4.1
 simplejson==3.16.0
 six==1.11.0
-sqlalchemy==1.
+sqlalchemy==1.3.11
 sshpubkeys==3.1.0
 subprocess32==3.5.4
-supervisor==4.
+supervisor==4.1.0
 translationstring==1.3
-urllib3==1.2
+urllib3==1.25.2
 urlobject==2.4.3
 venusian==1.2.0
 weberror==0.10.3
 webhelpers2==2.0
-webhelpers==1.3
 webob==1.8.5
 whoosh==2.7.4
 wsgiref==0.1.2
@@ -87,17 +83,18 @@ zope.interface==4.6.0
 mysql-python==1.2.5
 pymysql==0.8.1
 pysqlite==2.8.3
-psycopg2==2.8.
+psycopg2==2.8.4
 
 # IPYTHON RENDERING
 # entrypoints backport, pypi version doesn't support egg installs
 https://code.rhodecode.com/upstream/entrypoints/artifacts/download/0-8e9ee9e4-c4db-409c-b07e-81568fd1832d.tar.gz?md5=3a027b8ff1d257b91fe257de6c43357d#egg=entrypoints==0.2.2.rhodecode-upstream1
 nbconvert==5.3.1
 nbformat==4.4.0
-jupyter
+jupyter-client==5.0.0
+jupyter-core==4.5.0
 
 ## cli tools
-alembic==1.
+alembic==1.3.1
 invoke==0.13.0
 bumpversion==0.5.3
 
@@ -105,14 +102,15 @@ bumpversion==0.5.3
 gevent==1.4.0
 greenlet==0.4.15
 gunicorn==19.9.0
-waitress==1.3.
+waitress==1.3.1
 
 ## debug
 ipdb==0.12.0
 ipython==5.1.0
 
-## rhodecode-tools, special case
-https://code.rhodecode.com/rhodecode-tools-ce/artifacts/download/0-10ac93f4-bb7d-4b97-baea-68110743dd5a.tar.gz?md5=962dc77c06aceee62282b98d33149661#egg=rhodecode-tools==1.2.1
+## rhodecode-tools, special case, use file://PATH.tar.gz#egg=rhodecode-tools==X.Y.Z, to test local version
+https://code.rhodecode.com/rhodecode-tools-ce/artifacts/download/0-ed54e749-2ef5-4bc7-ae7f-7900e3c2aa15.tar.gz?sha256=76f024bad3a1e55fdb3d64f13f5b77ff21a12fee699918de2110fe21effd5a3a#egg=rhodecode-tools==1.4.0
+
 
 ## appenlight
 appenlight-client==0.6.26
@@ -1,19 +1,27 @@
 # contains not directly required libraries we want to pin the version.
 
-atomicwrites==1.
-attrs==1
-billiard==3.5.0.3
+atomicwrites==1.3.0
+attrs==19.3.0
+asn1crypto==0.24.0
+billiard==3.6.1.0
+cffi==1.12.3
 chameleon==2.24
-cffi==1.12.2
+configparser==4.0.2
+contextlib2==0.6.0.post1
 ecdsa==0.13.2
-hupper==1.6.1
 gnureadline==6.3.8
+hupper==1.9.1
+ipaddress==1.0.23
+importlib-metadata==0.23
 jinja2==2.9.6
 jsonschema==2.6.0
+pluggy==0.13.1
+pyasn1-modules==0.2.6
 pyramid-jinja2==2.7
-pluggy==0.11.0
-setproctitle==1.1.10
 scandir==1.10.0
+setproctitle==1.1.10
 tempita==0.5.2
+testpath==0.4.4
+transaction==2.4.0
 vine==1.3.0
-configparser==3.7.4
+wcwidth==0.1.7
@@ -1,16 +1,16 @@
 # test related requirements
-pytest==
-py==1.
-pytest-cov==2.
-pytest-sugar==0.9.
-pytest-runner==
-pytest-profiling==1.
-pytest-timeout==1.3.
+pytest==4.6.5
+py==1.8.0
+pytest-cov==2.7.1
+pytest-sugar==0.9.2
+pytest-runner==5.1.0
+pytest-profiling==1.7.0
+pytest-timeout==1.3.3
 gprof2dot==2017.9.19
 
-mock==
+mock==3.0.5
 cov-core==1.15.0
-coverage==4.5.
+coverage==4.5.4
 
 webtest==2.0.33
 beautifulsoup4==4.6.3
@@ -45,7 +45,7 @@ PYRAMID_SETTINGS = {}
 EXTENSIONS = {}
 
 __version__ = ('.'.join((str(each) for each in VERSION[:3])))
-__dbversion__ =
+__dbversion__ = 103  # defines current db version for migrations
 __platform__ = platform.system()
 __license__ = 'AGPLv3, and Commercial License'
 __author__ = 'RhodeCode GmbH'
@@ -122,7 +122,7 @@ def jsonrpc_response(request, result):
     return response
 
 
-def jsonrpc_error(request, message, retid=None, code=None):
+def jsonrpc_error(request, message, retid=None, code=None, headers=None):
     """
     Generate a Response object with a JSON-RPC error body
 
@@ -132,10 +132,12 @@ def jsonrpc_error(request, message, reti
     """
     err_dict = {'id': retid, 'result': None, 'error': message}
     body = render(DEFAULT_RENDERER, err_dict, request=request).encode('utf-8')
+
     return Response(
         body=body,
         status=code,
-        content_type='application/json'
+        content_type='application/json',
+        headerlist=headers
     )
 
 
@@ -287,8 +289,7 @@ def request_view(request):
     })
 
     # register some common functions for usage
-    attach_context_attributes(
-        TemplateArgs(), request, request.rpc_user.user_id)
+    attach_context_attributes(TemplateArgs(), request, request.rpc_user.user_id)
 
     try:
         ret_value = func(**call_params)
@@ -298,9 +299,13 @@ def request_view(request):
     except Exception:
         log.exception('Unhandled exception occurred on api call: %s', func)
         exc_info = sys.exc_info()
-        store_exception(id(exc_info), exc_info, prefix='rhodecode-api')
+        exc_id, exc_type_name = store_exception(
+            id(exc_info), exc_info, prefix='rhodecode-api')
+        error_headers = [('RhodeCode-Exception-Id', str(exc_id)),
+                         ('RhodeCode-Exception-Type', str(exc_type_name))]
         return jsonrpc_error(
-            request, retid=request.rpc_id, message='Internal server error'
+            request, retid=request.rpc_id, message='Internal server error',
+            headers=error_headers)
 
 
 def setup_request(request):
@@ -333,6 +338,7 @@ def setup_request(request):
         raise JSONRPCError("Content-Length is 0")
 
     raw_body = request.body
+    log.debug("Loading JSON body now")
    try:
         json_body = json.loads(raw_body)
     except ValueError as e:
@@ -359,7 +365,7 @@ def setup_request(request):
         request.rpc_params = json_body['args'] \
             if isinstance(json_body['args'], dict) else {}
 
-        log.debug('method: %s, params: %
+        log.debug('method: %s, params: %.10240r', request.rpc_method, request.rpc_params)
     except KeyError as e:
         raise JSONRPCError('Incorrect JSON data. Missing %s' % e)
 
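
With the change above, an unhandled failure inside an API call is stored via ``store_exception()`` and its id and type are echoed back as ``RhodeCode-Exception-Id`` / ``RhodeCode-Exception-Type`` response headers on the JSON-RPC error. A rough client-side sketch, assuming a reachable instance and a valid token (host, token and repo name below are placeholders)::

    import requests

    payload = {
        'id': 1,
        'auth_token': 'SECRET_TOKEN',   # placeholder
        'method': 'get_repo',
        'args': {'repoid': 'some-repo'},
    }
    resp = requests.post('https://rhodecode.example.com/_admin/api', json=payload)
    data = resp.json()
    if data.get('error'):
        # the exception id/type can be quoted when reporting the failure
        print(data['error'])
        print(resp.headers.get('RhodeCode-Exception-Id'))
        print(resp.headers.get('RhodeCode-Exception-Type'))
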
@@ -49,7 +49,7 @@ class TestClosePullRequest(object):
         assert_ok(id_, expected, response.body)
         journal = UserLog.query()\
             .filter(UserLog.user_id == author) \
-            .order_by(
+            .order_by(UserLog.user_log_id.asc()) \
             .filter(UserLog.repository_id == repo)\
             .all()
         assert journal[-1].action == 'repo.pull_request.close'
@@ -20,7 +20,7 @@
 
 import pytest
 
-from rhodecode.model.db import ChangesetStatus
+from rhodecode.model.db import ChangesetStatus, User
 from rhodecode.api.tests.utils import (
     build_data, api_call, assert_error, assert_ok)
 
@@ -79,3 +79,38 @@ class TestCommentCommit(object):
             'success': True
         }
         assert_ok(id_, expected, given=response.body)
+
+    def test_api_comment_commit_with_extra_recipients(self, backend, user_util):
+
+        commit_id = backend.repo.scm_instance().get_commit('tip').raw_id
+
+        user1 = user_util.create_user()
+        user1_id = user1.user_id
+        user2 = user_util.create_user()
+        user2_id = user2.user_id
+
+        id_, params = build_data(
+            self.apikey, 'comment_commit', repoid=backend.repo_name,
+            commit_id=commit_id,
+            message='abracadabra',
+            extra_recipients=[user1.user_id, user2.username])
+
+        response = api_call(self.app, params)
+        repo = backend.repo.scm_instance()
+
+        expected = {
+            'msg': 'Commented on commit `%s` for repository `%s`' % (
+                repo.get_commit().raw_id, backend.repo_name),
+            'status_change': None,
+            'success': True
+        }
+
+        assert_ok(id_, expected, given=response.body)
+        # check user1/user2 inbox for notification
+        user1 = User.get(user1_id)
+        assert 1 == len(user1.notifications)
+        assert 'abracadabra' in user1.notifications[0].notification.body
+
+        user2 = User.get(user2_id)
+        assert 1 == len(user2.notifications)
+        assert 'abracadabra' in user2.notifications[0].notification.body
@@ -21,7 +21,7 @@
 import pytest
 
 from rhodecode.model.comment import CommentsModel
-from rhodecode.model.db import UserLog
+from rhodecode.model.db import UserLog, User
 from rhodecode.model.pull_request import PullRequestModel
 from rhodecode.tests import TEST_USER_ADMIN_LOGIN
 from rhodecode.api.tests.utils import (
@@ -65,11 +65,48 @@ class TestCommentPullRequest(object):
         journal = UserLog.query()\
             .filter(UserLog.user_id == author)\
             .filter(UserLog.repository_id == repo) \
-            .order_by(
+            .order_by(UserLog.user_log_id.asc()) \
             .all()
         assert journal[-1].action == 'repo.pull_request.comment.create'
 
     @pytest.mark.backends("git", "hg")
+    def test_api_comment_pull_request_with_extra_recipients(self, pr_util, user_util):
+        pull_request = pr_util.create_pull_request()
+
+        user1 = user_util.create_user()
+        user1_id = user1.user_id
+        user2 = user_util.create_user()
+        user2_id = user2.user_id
+
+        id_, params = build_data(
+            self.apikey, 'comment_pull_request',
+            repoid=pull_request.target_repo.repo_name,
+            pullrequestid=pull_request.pull_request_id,
+            message='test message',
+            extra_recipients=[user1.user_id, user2.username]
+        )
+        response = api_call(self.app, params)
+        pull_request = PullRequestModel().get(pull_request.pull_request_id)
+
+        comments = CommentsModel().get_comments(
+            pull_request.target_repo.repo_id, pull_request=pull_request)
+
+        expected = {
+            'pull_request_id': pull_request.pull_request_id,
+            'comment_id': comments[-1].comment_id,
+            'status': {'given': None, 'was_changed': None}
+        }
+        assert_ok(id_, expected, response.body)
+        # check user1/user2 inbox for notification
+        user1 = User.get(user1_id)
+        assert 1 == len(user1.notifications)
+        assert 'test message' in user1.notifications[0].notification.body
+
+        user2 = User.get(user2_id)
+        assert 1 == len(user2.notifications)
+        assert 'test message' in user2.notifications[0].notification.body
+
+    @pytest.mark.backends("git", "hg")
     def test_api_comment_pull_request_change_status(
             self, pr_util, no_notifications):
         pull_request = pr_util.create_pull_request()
@@ -82,6 +82,7 @@ class TestCreateUser(object):
             self.apikey, 'create_user',
             username=username,
             email=email,
+            description='CTO of Things',
             password='example')
         response = api_call(self.app, params)
 
@@ -27,6 +27,16 @@ from rhodecode.api.tests.utils import (
 @pytest.mark.usefixtures("testuser_api", "app")
 class TestApiSearch(object):
 
+    @pytest.mark.parametrize("sort_dir", [
+        "asc",
+        "desc",
+    ])
+    @pytest.mark.parametrize("sort", [
+        "xxx",
+        "author_email",
+        "date",
+        "message",
+    ])
     @pytest.mark.parametrize("query, expected_hits, expected_paths", [
         ('todo', 23, [
             'vcs/backends/hg/inmemory.py',
@@ -55,10 +65,11 @@ class TestApiSearch(object):
             'vcs/tests/test_cli.py']),
         ('owner:michał test', 0, []),
     ])
-    def test_search_content_results(self, query, expected_hits, expected_paths):
+    def test_search_content_results(self, sort_dir, sort, query, expected_hits, expected_paths):
         id_, params = build_data(
             self.apikey_regular, 'search',
             search_query=query,
+            search_sort='{}:{}'.format(sort_dir, sort),
             search_type='content')
 
         response = api_call(self.app, params)
@@ -70,6 +81,16 @@ class TestApiSearch(object):
         for expected_path in expected_paths:
             assert expected_path in paths
 
+    @pytest.mark.parametrize("sort_dir", [
+        "asc",
+        "desc",
+    ])
+    @pytest.mark.parametrize("sort", [
+        "xxx",
+        "date",
+        "file",
+        "size",
+    ])
     @pytest.mark.parametrize("query, expected_hits, expected_paths", [
         ('readme.rst', 3, []),
         ('test*', 75, []),
@@ -77,10 +98,11 @@ class TestApiSearch(object):
         ('extension:rst', 48, []),
         ('extension:rst api', 24, []),
     ])
-    def test_search_file_paths(self, query, expected_hits, expected_paths):
+    def test_search_file_paths(self, sort_dir, sort, query, expected_hits, expected_paths):
         id_, params = build_data(
             self.apikey_regular, 'search',
             search_query=query,
+            search_sort='{}:{}'.format(sort_dir, sort),
             search_type='path')
 
         response = api_call(self.app, params)
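
These tests exercise the new ``search_sort`` argument, passed as ``<direction>:<field>`` (for example ``desc:date``); an unknown field such as ``xxx`` is parametrized as well, presumably to confirm it falls back to a default ordering rather than erroring. A hedged usage sketch with the helpers from these tests (query and field values are just examples)::

    id_, params = build_data(
        apikey_regular, 'search',
        search_query='extension:rst api',
        search_sort='desc:date',   # '<asc|desc>:<field>'
        search_type='content')
    response = api_call(app, params)
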
@@ -50,6 +50,7 @@ class TestGetMethod(object):
             {'apiuser': '<RequiredType>',
              'comment_type': "<Optional:u'note'>",
              'commit_id': '<RequiredType>',
+             'extra_recipients': '<Optional:[]>',
              'message': '<RequiredType>',
              'repoid': '<RequiredType>',
              'request': '<RequiredType>',
@@ -54,7 +54,7 @@ class TestGetRepoChangeset(object):
             details=details,
         )
         response = api_call(self.app, params)
-        expected = '
+        expected = "commit_id must be a string value got <type 'int'> instead"
         assert_error(id_, expected, given=response.body)
 
     @pytest.mark.parametrize("details", ['basic', 'extended', 'full'])
@@ -137,5 +137,5 @@ class TestGetRepoChangeset(object):
             details=details,
         )
         response = api_call(self.app, params)
-        expected = '
+        expected = "commit_id must be a string value got <type 'int'> instead"
         assert_error(id_, expected, given=response.body)
@@ -31,7 +31,9 @@ from rhodecode.api.tests.utils import (
 @pytest.fixture()
 def make_repo_comments_factory(request):
 
-    def maker(repo):
+    class Make(object):
+
+        def make_comments(self, repo):
             user = User.get_first_super_admin()
             commit = repo.scm_instance()[0]
 
@@ -60,7 +62,7 @@ def make_repo_comments_factory(request):
     def cleanup():
         for comment in comments:
             Session().delete(comment)
-    return
+    return Make()
 
 
 @pytest.mark.usefixtures("testuser_api", "app")
@@ -76,7 +78,7 @@ class TestGetRepo(object):
             make_repo_comments_factory, filters, expected_count):
         commits = [{'message': 'A'}, {'message': 'B'}]
         repo = backend.create_repo(commits=commits)
-        make_repo_comments_factory(repo)
+        make_repo_comments_factory.make_comments(repo)
 
         api_call_params = {'repoid': repo.repo_name,}
         api_call_params.update(filters)
@@ -92,12 +94,13 @@ class TestGetRepo(object):
 
         assert len(result) == expected_count
 
-    def test_api_get_repo_comments_wrong_comment_typ(
+    def test_api_get_repo_comments_wrong_comment_type(
+            self, make_repo_comments_factory, backend_hg):
+        commits = [{'message': 'A'}, {'message': 'B'}]
+        repo = backend_hg.create_repo(commits=commits)
+        make_repo_comments_factory.make_comments(repo)
 
-        repo = backend_hg.create_repo()
-        make_repo_comments_factory(repo)
-
-        api_call_params = {'repoid': repo.repo_name,}
+        api_call_params = {'repoid': repo.repo_name}
         api_call_params.update({'comment_type': 'bogus'})
 
         expected = 'comment_type must be one of `{}` got {}'.format(
@@ -25,7 +25,7 @@ from rhodecode.model.scm import ScmModel
 from rhodecode.api.tests.utils import build_data, api_call, assert_ok
 
 
-@pytest.fixture
+@pytest.fixture()
 def http_host_stub():
     """
     To ensure that we can get an IP address, this test shall run with a
@@ -120,7 +120,7 @@ class TestMergePullRequest(object):
         journal = UserLog.query()\
             .filter(UserLog.user_id == author)\
             .filter(UserLog.repository_id == repo) \
-            .order_by(
+            .order_by(UserLog.user_log_id.asc()) \
             .all()
         assert journal[-2].action == 'repo.pull_request.merge'
         assert journal[-1].action == 'repo.pull_request.close'
@@ -221,7 +221,7 @@ class TestMergePullRequest(object):
         journal = UserLog.query() \
             .filter(UserLog.user_id == merge_user_id) \
             .filter(UserLog.repository_id == repo) \
-            .order_by(
+            .order_by(UserLog.user_log_id.asc()) \
             .all()
         assert journal[-2].action == 'repo.pull_request.merge'
         assert journal[-1].action == 'repo.pull_request.close'
@@ -42,7 +42,8 @@ class TestUpdateUser(object):
         ('extern_name', None),
         ('active', False),
         ('active', True),
-        ('password', 'newpass')
+        ('password', 'newpass'),
+        ('description', 'CTO 4 Life')
     ])
     def test_api_update_user(self, name, expected, user_util):
         usr = user_util.create_user()
@@ -20,11 +20,16 @@
 
 
 import random
+import pytest
 
 from rhodecode.api.utils import get_origin
 from rhodecode.lib.ext_json import json
 
 
+def jsonify(obj):
+    return json.loads(json.dumps(obj))
+
+
 API_URL = '/_admin/api'
 
 
@@ -42,12 +47,16 @@ def assert_call_ok(id_, given):
 
 
 def assert_ok(id_, expected, given):
+    given = json.loads(given)
+    if given.get('error'):
+        pytest.fail("Unexpected ERROR in success response: {}".format(given['error']))
+
     expected = jsonify({
         'id': id_,
         'error': None,
         'result': expected
     })
-    given = json.loads(given)
+
     assert expected == given
 
 
@@ -61,10 +70,6 @@ def assert_error(id_, expected, given):
     assert expected == given
 
 
-def jsonify(obj):
-    return json.loads(json.dumps(obj))
-
-
 def build_data(apikey, method, **kw):
     """
     Builds API data with given random ID
@@ -451,7 +451,7 @@ def comment_pull_request(
         request, apiuser, pullrequestid, repoid=Optional(None),
         message=Optional(None), commit_id=Optional(None), status=Optional(None),
         comment_type=Optional(ChangesetComment.COMMENT_TYPE_NOTE),
-        resolves_comment_id=Optional(None),
+        resolves_comment_id=Optional(None), extra_recipients=Optional([]),
         userid=Optional(OAttr('apiuser'))):
     """
     Comment on the pull request specified with the `pullrequestid`,
@@ -476,6 +476,11 @@ def comment_pull_request(
     :type status: str
     :param comment_type: Comment type, one of: 'note', 'todo'
     :type comment_type: Optional(str), default: 'note'
+    :param resolves_comment_id: id of comment which this one will resolve
+    :type resolves_comment_id: Optional(int)
+    :param extra_recipients: list of user ids or usernames to add
+        notifications for this comment. Acts like a CC for notification
+    :type extra_recipients: Optional(list)
     :param userid: Comment on the pull request as this user
     :type userid: Optional(str or int)
 
@@ -521,6 +526,7 @@ def comment_pull_request(
     commit_id = Optional.extract(commit_id)
     comment_type = Optional.extract(comment_type)
     resolves_comment_id = Optional.extract(resolves_comment_id)
+    extra_recipients = Optional.extract(extra_recipients)
 
     if not message and not status:
         raise JSONRPCError(
@@ -580,7 +586,8 @@ def comment_pull_request(
         renderer=renderer,
         comment_type=comment_type,
         resolves_comment_id=resolves_comment_id,
-        auth_user=auth_user
+        auth_user=auth_user,
+        extra_recipients=extra_recipients
     )
 
     if allowed_to_change_status and status:
@@ -888,7 +895,9 @@ def update_pull_request(
 
     with pull_request.set_state(PullRequest.STATE_UPDATING):
         if PullRequestModel().has_valid_update_type(pull_request):
-            update_response = PullRequestModel().update_commits(pull_request)
+            db_user = apiuser.get_instance()
+            update_response = PullRequestModel().update_commits(
+                pull_request, db_user)
             commit_changes = update_response.changes or commit_changes
             Session().commit()
 
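
The ``extra_recipients`` argument added above accepts user ids or usernames and acts as a CC list for the comment notification. A hedged example call through the API test helpers (repo name, pull request id and recipients are placeholders)::

    id_, params = build_data(
        apikey, 'comment_pull_request',
        repoid='group/some-repo',
        pullrequestid=7,
        message='please have a look as well',
        extra_recipients=[42, 'second-reviewer'])  # user id or username
    response = api_call(app, params)
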
@@ -617,9 +617,7 b' def get_repo_fts_tree(request, apiuser, ' | |||||
617 | cache_namespace_uid = 'cache_repo.{}'.format(repo_id) |
|
617 | cache_namespace_uid = 'cache_repo.{}'.format(repo_id) | |
618 | region = rc_cache.get_or_create_region('cache_repo', cache_namespace_uid) |
|
618 | region = rc_cache.get_or_create_region('cache_repo', cache_namespace_uid) | |
619 |
|
619 | |||
620 | @region.conditional_cache_on_arguments(namespace=cache_namespace_uid, |
|
620 | def compute_fts_tree(cache_ver, repo_id, commit_id, root_path): | |
621 | condition=cache_on) |
|
|||
622 | def compute_fts_tree(repo_id, commit_id, root_path, cache_ver): |
|
|||
623 | return ScmModel().get_fts_data(repo_id, commit_id, root_path) |
|
621 | return ScmModel().get_fts_data(repo_id, commit_id, root_path) | |
624 |
|
622 | |||
625 | try: |
|
623 | try: | |
@@ -640,7 +638,7 b' def get_repo_fts_tree(request, apiuser, ' | |||||
640 | 'with caching: %s[TTL: %ss]' % ( |
|
638 | 'with caching: %s[TTL: %ss]' % ( | |
641 | repo_id, commit_id, cache_on, cache_seconds or 0)) |
|
639 | repo_id, commit_id, cache_on, cache_seconds or 0)) | |
642 |
|
640 | |||
643 |
tree_files = compute_fts_tree(repo_id, commit_id, root_path |
|
641 | tree_files = compute_fts_tree(rc_cache.FILE_TREE_CACHE_VER, repo_id, commit_id, root_path) | |
644 | return tree_files |
|
642 | return tree_files | |
645 |
|
643 | |||
646 | except Exception: |
|
644 | except Exception: | |
@@ -714,7 +712,7 b' def create_repo(' | |||||
714 | private=Optional(False), |
|
712 | private=Optional(False), | |
715 | clone_uri=Optional(None), |
|
713 | clone_uri=Optional(None), | |
716 | push_uri=Optional(None), |
|
714 | push_uri=Optional(None), | |
717 |
landing_rev=Optional( |
|
715 | landing_rev=Optional(None), | |
718 | enable_statistics=Optional(False), |
|
716 | enable_statistics=Optional(False), | |
719 | enable_locking=Optional(False), |
|
717 | enable_locking=Optional(False), | |
720 | enable_downloads=Optional(False), |
|
718 | enable_downloads=Optional(False), | |
@@ -749,7 +747,7 b' def create_repo(' | |||||
749 | :type clone_uri: str |
|
747 | :type clone_uri: str | |
750 | :param push_uri: set push_uri |
|
748 | :param push_uri: set push_uri | |
751 | :type push_uri: str |
|
749 | :type push_uri: str | |
752 | :param landing_rev: <rev_type>:<rev> |
|
750 | :param landing_rev: <rev_type>:<rev>, e.g branch:default, book:dev, rev:abcd | |
753 | :type landing_rev: str |
|
751 | :type landing_rev: str | |
754 | :param enable_locking: |
|
752 | :param enable_locking: | |
755 | :type enable_locking: bool |
|
753 | :type enable_locking: bool | |
@@ -793,7 +791,6 b' def create_repo(' | |||||
793 | copy_permissions = Optional.extract(copy_permissions) |
|
791 | copy_permissions = Optional.extract(copy_permissions) | |
794 | clone_uri = Optional.extract(clone_uri) |
|
792 | clone_uri = Optional.extract(clone_uri) | |
795 | push_uri = Optional.extract(push_uri) |
|
793 | push_uri = Optional.extract(push_uri) | |
796 | landing_commit_ref = Optional.extract(landing_rev) |
|
|||
797 |
|
794 | |||
798 | defs = SettingsModel().get_default_repo_settings(strip_prefix=True) |
|
795 | defs = SettingsModel().get_default_repo_settings(strip_prefix=True) | |
799 | if isinstance(private, Optional): |
|
796 | if isinstance(private, Optional): | |
@@ -807,8 +804,15 b' def create_repo(' | |||||
807 | if isinstance(enable_downloads, Optional): |
|
804 | if isinstance(enable_downloads, Optional): | |
808 | enable_downloads = defs.get('repo_enable_downloads') |
|
805 | enable_downloads = defs.get('repo_enable_downloads') | |
809 |
|
806 | |||
|
807 | landing_ref, _label = ScmModel.backend_landing_ref(repo_type) | |||
|
808 | ref_choices, _labels = ScmModel().get_repo_landing_revs(request.translate) | |||
|
809 | ref_choices = list(set(ref_choices + [landing_ref])) | |||
|
810 | ||||
|
811 | landing_commit_ref = Optional.extract(landing_rev) or landing_ref | |||
|
812 | ||||
810 | schema = repo_schema.RepoSchema().bind( |
|
813 | schema = repo_schema.RepoSchema().bind( | |
811 | repo_type_options=rhodecode.BACKENDS.keys(), |
|
814 | repo_type_options=rhodecode.BACKENDS.keys(), | |
|
815 | repo_ref_options=ref_choices, | |||
812 | repo_type=repo_type, |
|
816 | repo_type=repo_type, | |
813 | # user caller |
|
817 | # user caller | |
814 | user=apiuser) |
|
818 | user=apiuser) | |
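
With the change above, ``landing_rev`` becomes optional for ``create_repo`` and, when given, must use the ``<rev_type>:<rev>`` form. A minimal, illustrative JSON-RPC call is sketched below; the endpoint URL, auth token and repository name are placeholders and not part of the change::

    import json
    import requests

    # Placeholders: adjust host, token and repo name for your instance.
    API_URL = 'https://rhodecode.example.com/_admin/api'
    payload = {
        'id': 1,
        'auth_token': 'SECRET_API_TOKEN',
        'method': 'create_repo',
        'args': {
            'repo_name': 'my-group/my-repo',
            'repo_type': 'hg',
            # optional now; explicit format is <rev_type>:<rev>
            'landing_rev': 'branch:default',
        },
    }
    response = requests.post(
        API_URL, data=json.dumps(payload),
        headers={'content-type': 'application/json'})
    print(response.json())
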
@@ -958,7 +962,7 b' def update_repo('
         owner=Optional(OAttr('apiuser')), description=Optional(''),
         private=Optional(False),
         clone_uri=Optional(None), push_uri=Optional(None),
-        landing_rev=Optional('rev:tip'), fork_of=Optional(None),
+        landing_rev=Optional(None), fork_of=Optional(None),
         enable_statistics=Optional(False),
         enable_locking=Optional(False),
         enable_downloads=Optional(False), fields=Optional('')):

@@ -993,7 +997,7 b' def update_repo('
     :type private: bool
     :param clone_uri: Update the |repo| clone URI.
     :type clone_uri: str
-    :param landing_rev: Set the |repo| landing revision.
+    :param landing_rev: Set the |repo| landing revision. e.g branch:default, book:dev, rev:abcd
     :type landing_rev: str
     :param enable_statistics: Enable statistics on the |repo|, (True | False).
     :type enable_statistics: bool

@@ -1049,8 +1053,10 b' def update_repo('
         repo_enable_downloads=enable_downloads
         if not isinstance(enable_downloads, Optional) else repo.enable_downloads)

+    landing_ref, _label = ScmModel.backend_landing_ref(repo.repo_type)
     ref_choices, _labels = ScmModel().get_repo_landing_revs(
         request.translate, repo=repo)
+    ref_choices = list(set(ref_choices + [landing_ref]))

     old_values = repo.get_api_data()
     repo_type = repo.repo_type

@@ -1128,7 +1134,7 b' def fork_repo(request, apiuser, repoid, '
         description=Optional(''),
         private=Optional(False),
         clone_uri=Optional(None),
-        landing_rev=Optional('rev:tip'),
+        landing_rev=Optional(None),
         copy_permissions=Optional(False)):
     """
     Creates a fork of the specified |repo|.

@@ -1158,7 +1164,7 b' def fork_repo(request, apiuser, repoid, '
     :type copy_permissions: bool
     :param private: Make the fork private. The default is False.
     :type private: bool
-    :param landing_rev: Set the landing revision.
+    :param landing_rev: Set the landing revision. E.g branch:default, book:dev, rev:abcd

     Example output:

@@ -1210,11 +1216,17 b' def fork_repo(request, apiuser, repoid, '
     description = Optional.extract(description)
     copy_permissions = Optional.extract(copy_permissions)
     clone_uri = Optional.extract(clone_uri)
-    landing_commit_ref = Optional.extract(landing_rev)
+
+    landing_ref, _label = ScmModel.backend_landing_ref(repo.repo_type)
+    ref_choices, _labels = ScmModel().get_repo_landing_revs(request.translate)
+    ref_choices = list(set(ref_choices + [landing_ref]))
+    landing_commit_ref = Optional.extract(landing_rev) or landing_ref
+
     private = Optional.extract(private)

     schema = repo_schema.RepoSchema().bind(
         repo_type_options=rhodecode.BACKENDS.keys(),
+        repo_ref_options=ref_choices,
         repo_type=repo.repo_type,
         # user caller
         user=apiuser)

@@ -1538,7 +1550,7 b' def lock(request, apiuser, repoid, locke'
 def comment_commit(
         request, apiuser, repoid, commit_id, message, status=Optional(None),
         comment_type=Optional(ChangesetComment.COMMENT_TYPE_NOTE),
-        resolves_comment_id=Optional(None),
+        resolves_comment_id=Optional(None), extra_recipients=Optional([]),
         userid=Optional(OAttr('apiuser'))):
     """
     Set a commit comment, and optionally change the status of the commit.

@@ -1556,6 +1568,11 b' def comment_commit('
     :type status: str
     :param comment_type: Comment type, one of: 'note', 'todo'
     :type comment_type: Optional(str), default: 'note'
+    :param resolves_comment_id: id of comment which this one will resolve
+    :type resolves_comment_id: Optional(int)
+    :param extra_recipients: list of user ids or usernames to add
+        notifications for this comment. Acts like a CC for notification
+    :type extra_recipients: Optional(list)
     :param userid: Set the user name of the comment creator.
     :type userid: Optional(str or int)

@@ -1592,6 +1609,7 b' def comment_commit('
     status = Optional.extract(status)
     comment_type = Optional.extract(comment_type)
     resolves_comment_id = Optional.extract(resolves_comment_id)
+    extra_recipients = Optional.extract(extra_recipients)

     allowed_statuses = [x[0] for x in ChangesetStatus.STATUSES]
     if status and status not in allowed_statuses:

@@ -1620,7 +1638,8 b' def comment_commit('
         renderer=renderer,
         comment_type=comment_type,
         resolves_comment_id=resolves_comment_id,
-        auth_user=apiuser
+        auth_user=apiuser,
+        extra_recipients=extra_recipients
     )
     if status:
         # also do a status change
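
``comment_commit`` gains ``extra_recipients``, which behaves like a CC list for the comment notification. Reusing the JSON-RPC call pattern from the earlier sketch, only the payload changes; the repository, commit hash and recipients below are illustrative::

    payload = {
        'id': 2,
        'auth_token': 'SECRET_API_TOKEN',   # placeholder
        'method': 'comment_commit',
        'args': {
            'repoid': 'my-group/my-repo',
            'commit_id': 'abcdef0123456789abcdef0123456789abcdef01',
            'message': 'Looks good to me',
            'status': 'approved',
            # user ids or usernames that should also be notified (CC)
            'extra_recipients': ['dev_ops', 42],
        },
    }
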
@@ -33,7 +33,7 b' log = logging.getLogger(__name__)'

 @jsonrpc_method()
 def search(request, apiuser, search_query, search_type, page_limit=Optional(10),
-           page=Optional(1), search_sort=Optional('newfirst'),
+           page=Optional(1), search_sort=Optional('desc:date'),
            repo_name=Optional(None), repo_group_name=Optional(None)):
     """
     Fetch Full Text Search results using API.

@@ -51,9 +51,15 b' def search(request, apiuser, search_quer'
     :type page_limit: Optional(int)
     :param page: Page number. Default first page.
     :type page: Optional(int)
-    :param search_sort: Search sort order.
-        * newfirst
-        * oldfirst
+    :param search_sort: Search sort order.Must start with asc: or desc: Default desc:date.
+        The following are valid options:
+        * asc|desc:message.raw
+        * asc|desc:date
+        * asc|desc:author.email.raw
+        * asc|desc:message.raw
+        * newfirst (old legacy equal to desc:date)
+        * oldfirst (old legacy equal to asc:date)
+
     :type search_sort: Optional(str)
     :param repo_name: Filter by one repo. Default is all.
     :type repo_name: Optional(str)
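
``search_sort`` now expects an ``asc:``/``desc:`` prefixed field, while the legacy ``newfirst``/``oldfirst`` values still work. An illustrative ``args`` payload for the ``search`` method, using the same call pattern as the first sketch (query and repo name are made up)::

    payload = {
        'id': 3,
        'auth_token': 'SECRET_API_TOKEN',   # placeholder
        'method': 'search',
        'args': {
            'search_query': 'def main',
            'search_type': 'content',
            'search_sort': 'asc:date',      # was 'newfirst'/'oldfirst' before
            'repo_name': 'my-group/my-repo',
        },
    }
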
@@ -101,7 +107,7 b' def search(request, apiuser, search_quer'
     searcher.cleanup()

     if not search_result['error']:
-        data['execution_time'] = '%s results (%.
+        data['execution_time'] = '%s results (%.4f seconds)' % (
             search_result['count'],
             search_result['runtime'])
     else:

@@ -75,6 +75,7 b' def get_user(request, apiuser, userid=Op'
         "extern_name": "rhodecode",
         "extern_type": "rhodecode",
         "firstname": "username",
+        "description": "user description",
         "ip_addresses": [],
         "language": null,
         "last_login": "Timestamp",

@@ -159,7 +160,7 b' def get_users(request, apiuser):'

 @jsonrpc_method()
 def create_user(request, apiuser, username, email, password=Optional(''),
-                firstname=Optional(''), lastname=Optional(''),
+                firstname=Optional(''), lastname=Optional(''), description=Optional(''),
                 active=Optional(True), admin=Optional(False),
                 extern_name=Optional('rhodecode'),
                 extern_type=Optional('rhodecode'),

@@ -185,6 +186,8 b' def create_user(request, apiuser, userna'
     :type firstname: Optional(str)
     :param lastname: Set the new user surname.
     :type lastname: Optional(str)
+    :param description: Set user description, or short bio. Metatags are allowed.
+    :type description: Optional(str)
     :param active: Set the user as active.
     :type active: Optional(``True`` | ``False``)
     :param admin: Give the new user admin rights.

@@ -250,6 +253,7 b' def create_user(request, apiuser, userna'
     email = Optional.extract(email)
     first_name = Optional.extract(firstname)
     last_name = Optional.extract(lastname)
+    description = Optional.extract(description)
     active = Optional.extract(active)
     admin = Optional.extract(admin)
     extern_type = Optional.extract(extern_type)

@@ -267,6 +271,7 b' def create_user(request, apiuser, userna'
         last_name=last_name,
         active=active,
         admin=admin,
+        description=description,
         extern_type=extern_type,
         extern_name=extern_name,
     ))

@@ -280,6 +285,7 b' def create_user(request, apiuser, userna'
         email=schema_data['email'],
         firstname=schema_data['first_name'],
         lastname=schema_data['last_name'],
+        description=schema_data['description'],
         active=schema_data['active'],
         admin=schema_data['admin'],
         extern_type=schema_data['extern_type'],

@@ -307,7 +313,7 b' def create_user(request, apiuser, userna'
 def update_user(request, apiuser, userid, username=Optional(None),
                 email=Optional(None), password=Optional(None),
                 firstname=Optional(None), lastname=Optional(None),
-                active=Optional(None), admin=Optional(None),
+                description=Optional(None), active=Optional(None), admin=Optional(None),
                 extern_type=Optional(None), extern_name=Optional(None), ):
     """
     Updates the details for the specified user, if that user exists.

@@ -331,6 +337,8 b' def update_user(request, apiuser, userid'
     :type firstname: Optional(str)
     :param lastname: Set the new surname.
     :type lastname: Optional(str)
+    :param description: Set user description, or short bio. Metatags are allowed.
+    :type description: Optional(str)
     :param active: Set the new user as active.
     :type active: Optional(``True`` | ``False``)
     :param admin: Give the user admin rights.

@@ -379,6 +387,7 b' def update_user(request, apiuser, userid'
     store_update(updates, email, 'email')
     store_update(updates, firstname, 'name')
     store_update(updates, lastname, 'lastname')
+    store_update(updates, description, 'description')
     store_update(updates, active, 'active')
     store_update(updates, admin, 'admin')
     store_update(updates, extern_name, 'extern_name')
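
Both ``create_user`` and ``update_user`` accept the new ``description`` field (metatags allowed). A sketch of the ``args`` for ``create_user``, again with placeholder values::

    payload = {
        'id': 4,
        'auth_token': 'SECRET_API_TOKEN',   # placeholder
        'method': 'create_user',
        'args': {
            'username': 'jdoe',
            'email': 'jdoe@example.com',
            'firstname': 'John',
            'lastname': 'Doe',
            # new in 4.18: short bio / description of the user
            'description': 'Backend developer in the platform team',
        },
    }
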
@@ -191,7 +191,9 b' def create_user_group('
     :param active: Set this group as active.
     :type active: Optional(``True`` | ``False``)
     :param sync: Set enabled or disabled the automatically sync from
-        external authentication types like ldap.
+        external authentication types like ldap. If User Group will be named like
+        one from e.g ldap and sync flag is enabled members will be synced automatically.
+        Sync type when enabled via API is set to `manual_api`
     :type sync: Optional(``True`` | ``False``)

     Example output:

@@ -303,7 +305,9 b' def update_user_group(request, apiuser, '
     :param active: Set the group as active.
     :type active: Optional(``True`` | ``False``)
     :param sync: Set enabled or disabled the automatically sync from
-        external authentication types like ldap.
+        external authentication types like ldap. If User Group will be named like
+        one from e.g ldap and sync flag is enabled members will be synced automatically.
+        Sync type when enabled via API is set to `manual_api`
     :type sync: Optional(``True`` | ``False``)

     Example output:
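
When ``sync`` is enabled through the API the sync type is recorded as ``manual_api``, and members are pulled automatically if the group name matches one in the external source (e.g. LDAP). Illustrative ``args`` for ``create_user_group``, values are placeholders::

    payload = {
        'id': 5,
        'auth_token': 'SECRET_API_TOKEN',   # placeholder
        'method': 'create_user_group',
        'args': {
            'group_name': 'ldap-developers',
            'description': 'Mirrored from the LDAP developers group',
            'active': True,
            'sync': True,   # recorded as sync type `manual_api`
        },
    }
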
@@ -25,9 +25,11 b' import operator'
 from pyramid import compat
 from pyramid.httpexceptions import HTTPFound, HTTPForbidden, HTTPBadRequest

-from rhodecode.lib import helpers as h, diffs
+from rhodecode.lib import helpers as h, diffs, rc_cache
 from rhodecode.lib.utils2 import (
     StrictAttributeDict, str2bool, safe_int, datetime_to_time, safe_unicode)
+from rhodecode.lib.markup_renderer import MarkupRenderer, relative_links
+from rhodecode.lib.vcs.backends.base import EmptyCommit
 from rhodecode.lib.vcs.exceptions import RepositoryRequirementError
 from rhodecode.model import repo
 from rhodecode.model import repo_group

@@ -36,6 +38,7 b' from rhodecode.model import user'
 from rhodecode.model.db import User
 from rhodecode.model.scm import ScmModel
 from rhodecode.model.settings import VcsSettingsModel
+from rhodecode.model.repo import ReadmeFinder

 log = logging.getLogger(__name__)

@@ -222,6 +225,7 b' class RepoAppView(BaseAppView):'
         self.db_repo = request.db_repo
         self.db_repo_name = self.db_repo.repo_name
         self.db_repo_pull_requests = ScmModel().get_pull_requests(self.db_repo)
+        self.db_repo_artifacts = ScmModel().get_artifacts(self.db_repo)

     def _handle_missing_requirements(self, error):
         log.error(

@@ -237,6 +241,7 b' class RepoAppView(BaseAppView):'
         c.rhodecode_db_repo = self.db_repo
         c.repo_name = self.db_repo_name
         c.repository_pull_requests = self.db_repo_pull_requests
+        c.repository_artifacts = self.db_repo_artifacts
         c.repository_is_user_following = ScmModel().is_following_repo(
             self.db_repo_name, self._rhodecode_user.user_id)
         self.path_filter = PathFilter(None)

@@ -305,6 +310,69 b' class RepoAppView(BaseAppView):'
         settings = settings_model.get_general_settings()
         return settings.get(settings_key, default)

+    def _get_repo_setting(self, target_repo, settings_key, default=False):
+        settings_model = VcsSettingsModel(repo=target_repo)
+        settings = settings_model.get_repo_settings_inherited()
+        return settings.get(settings_key, default)
+
+    def _get_readme_data(self, db_repo, renderer_type, commit_id=None, path='/'):
+        log.debug('Looking for README file at path %s', path)
+        if commit_id:
+            landing_commit_id = commit_id
+        else:
+            landing_commit = db_repo.get_landing_commit()
+            if isinstance(landing_commit, EmptyCommit):
+                return None, None
+            landing_commit_id = landing_commit.raw_id
+
+        cache_namespace_uid = 'cache_repo.{}'.format(db_repo.repo_id)
+        region = rc_cache.get_or_create_region('cache_repo', cache_namespace_uid)
+        start = time.time()
+
+        @region.conditional_cache_on_arguments(namespace=cache_namespace_uid)
+        def generate_repo_readme(repo_id, _commit_id, _repo_name, _readme_search_path, _renderer_type):
+            readme_data = None
+            readme_filename = None
+
+            commit = db_repo.get_commit(_commit_id)
+            log.debug("Searching for a README file at commit %s.", _commit_id)
+            readme_node = ReadmeFinder(_renderer_type).search(commit, path=_readme_search_path)
+
+            if readme_node:
+                log.debug('Found README node: %s', readme_node)
+                relative_urls = {
+                    'raw': h.route_path(
+                        'repo_file_raw', repo_name=_repo_name,
+                        commit_id=commit.raw_id, f_path=readme_node.path),
+                    'standard': h.route_path(
+                        'repo_files', repo_name=_repo_name,
+                        commit_id=commit.raw_id, f_path=readme_node.path),
+                }
+                readme_data = self._render_readme_or_none(commit, readme_node, relative_urls)
+                readme_filename = readme_node.unicode_path
+
+            return readme_data, readme_filename
+
+        readme_data, readme_filename = generate_repo_readme(
+            db_repo.repo_id, landing_commit_id, db_repo.repo_name, path, renderer_type,)
+        compute_time = time.time() - start
+        log.debug('Repo README for path %s generated and computed in %.4fs',
+                  path, compute_time)
+        return readme_data, readme_filename
+
+    def _render_readme_or_none(self, commit, readme_node, relative_urls):
+        log.debug('Found README file `%s` rendering...', readme_node.path)
+        renderer = MarkupRenderer()
+        try:
+            html_source = renderer.render(
+                readme_node.content, filename=readme_node.path)
+            if relative_urls:
+                return relative_links(html_source, relative_urls)
+            return html_source
+        except Exception:
+            log.exception(
+                "Exception while trying to render the README")
+
     def get_recache_flag(self):
         for flag_name in ['force_recache', 'force-recache', 'no-cache']:
             flag_val = self.request.GET.get(flag_name)

@@ -464,7 +532,7 b' class BaseReferencesView(RepoAppView):'

     def load_refs_context(self, ref_items, partials_template):
         _render = self.request.get_partial_renderer(partials_template)
-        pre_load = ["author", "date", "message"]
+        pre_load = ["author", "date", "message", "parents"]

         is_svn = h.is_svn(self.rhodecode_vcs_repo)
         is_hg = h.is_hg(self.rhodecode_vcs_repo)
@@ -140,7 +140,6 b' def admin_routes(config):'
         name='admin_settings_visual_update',
         pattern='/settings/visual/update')

-
     config.add_route(
         name='admin_settings_issuetracker',
         pattern='/settings/issue-tracker')

@@ -378,6 +377,10 b' def admin_routes(config):'
         name='edit_user_audit_logs',
         pattern='/users/{user_id:\d+}/edit/audit', user_route=True)

+    config.add_route(
+        name='edit_user_audit_logs_download',
+        pattern='/users/{user_id:\d+}/edit/audit/download', user_route=True)
+
     # user caches
     config.add_route(
         name='edit_user_caches',

@@ -411,6 +414,10 b' def admin_routes(config):'
         pattern='/repos')

     config.add_route(
+        name='repos_data',
+        pattern='/repos_data')
+
+    config.add_route(
         name='repo_new',
         pattern='/repos/new')

@@ -155,7 +155,7 b' class TestAuthSettingsView(object):'
         response = self._post_ldap_settings(params, override={
             'port': invalid_port_value,
         })
-        assertr = AssertResponse(response)
+        assertr = response.assert_response()
         assertr.element_contains(
             '.form .field #port ~ .error-message',
             invalid_port_value)

@@ -47,6 +47,7 b' def route_path(name, params=None, **kwar'

 base_url = {
     'repos': ADMIN_PREFIX + '/repos',
+    'repos_data': ADMIN_PREFIX + '/repos_data',
     'repo_new': ADMIN_PREFIX + '/repos/new',
     'repo_create': ADMIN_PREFIX + '/repos/create',

@@ -70,24 +71,25 b' def _get_permission_for_user(user, repo)'
 @pytest.mark.usefixtures("app")
 class TestAdminRepos(object):

-    def test_repo_list(self, autologin_user, user_util):
+    def test_repo_list(self, autologin_user, user_util, xhr_header):
         repo = user_util.create_repo()
         repo_name = repo.repo_name
         response = self.app.get(
-            route_path('repos'), status=200)
+            route_path('repos_data'), status=200,
+            extra_environ=xhr_header)

         response.mustcontain(repo_name)

     def test_create_page_restricted_to_single_backend(self, autologin_user, backend):
         with mock.patch('rhodecode.BACKENDS', {'git': 'git'}):
             response = self.app.get(route_path('repo_new'), status=200)
-        assert_response = AssertResponse(response)
+        assert_response = response.assert_response()
         element = assert_response.get_element('#repo_type')
         assert element.text_content() == '\ngit\n'

     def test_create_page_non_restricted_backends(self, autologin_user, backend):
         response = self.app.get(route_path('repo_new'), status=200)
-        assert_response = AssertResponse(response)
+        assert_response = response.assert_response()
         assert_response.element_contains('#repo_type', 'git')
         assert_response.element_contains('#repo_type', 'svn')
         assert_response.element_contains('#repo_type', 'hg')

@@ -84,7 +84,7 b' class TestAdminRepositoryGroups(object):'
         fixture.create_repo_group('test_repo_group')
         response = self.app.get(route_path(
             'repo_groups_data'), extra_environ=xhr_header)
-        response.mustcontain('"
+        response.mustcontain('<a href=\\"/{}/_edit\\" title=\\"Edit\\">Edit</a>'.format('test_repo_group'))
         fixture.destroy_repo_group('test_repo_group')

     def test_new(self, autologin_user):

@@ -367,7 +367,7 b' class TestAdminSettingsVcs(object):'

     def test_has_an_input_for_invalidation_of_inline_comments(self):
         response = self.app.get(route_path('admin_settings_vcs'))
-        assert_response = AssertResponse(response)
+        assert_response = response.assert_response()
         assert_response.one_element_exists(
             '[name=rhodecode_use_outdated_comments]')

@@ -412,14 +412,14 b' class TestAdminSettingsVcs(object):'
         setting = SettingsModel().get_setting_by_name(setting_key)
         assert setting.app_settings_value is new_value

-    @pytest.fixture
+    @pytest.fixture()
     def disable_sql_cache(self, request):
         patcher = mock.patch(
             'rhodecode.lib.caching_query.FromCache.process_query')
         request.addfinalizer(patcher.stop)
         patcher.start()

-    @pytest.fixture
+    @pytest.fixture()
     def form_defaults(self):
         from rhodecode.apps.admin.views.settings import AdminSettingsView
         return AdminSettingsView._form_defaults()

@@ -518,7 +518,7 b' class TestOpenSourceLicenses(object):'
         response = self.app.get(
             route_path('admin_settings_open_source'), status=200)

-        assert_response = AssertResponse(response)
+        assert_response = response.assert_response()
         assert_response.element_contains(
             '.panel-heading', 'Licenses of Third Party Packages')
         for license_data in sample_licenses:

@@ -528,7 +528,7 b' class TestOpenSourceLicenses(object):'
     def test_records_can_be_read(self, autologin_user):
         response = self.app.get(
             route_path('admin_settings_open_source'), status=200)
-        assert_response = AssertResponse(response)
+        assert_response = response.assert_response()
         assert_response.element_contains(
             '.panel-heading', 'Licenses of Third Party Packages')

@@ -726,7 +726,7 b' class TestAdminSettingsIssueTracker(obje'
         IssueTrackerSettingsModel().delete_entries(self.uid)

     def test_delete_issuetracker_pattern(
-            self, autologin_user, backend, csrf_token, settings_util):
+            self, autologin_user, backend, csrf_token, settings_util, xhr_header):
         pattern = 'issuetracker_pat'
         uid = md5(pattern)
         settings_util.create_rhodecode_setting(

@@ -734,10 +734,9 b' class TestAdminSettingsIssueTracker(obje'

         post_url = route_path('admin_settings_issuetracker_delete')
         post_data = {
-            '_method': 'delete',
             'uid': uid,
             'csrf_token': csrf_token
         }
-        self.app.post(post_url, post_data, status=
+        self.app.post(post_url, post_data, extra_environ=xhr_header, status=200)
         settings = SettingsModel().get_all_settings()
         assert 'rhodecode_%s%s' % (self.SHORT_PATTERN_KEY, uid) not in settings

@@ -91,6 +91,9 b' def route_path(name, params=None, **kwar'
     'edit_user_audit_logs':
         ADMIN_PREFIX + '/users/{user_id}/edit/audit',

+    'edit_user_audit_logs_download':
+        ADMIN_PREFIX + '/users/{user_id}/edit/audit/download',
+
 }[name].format(**kwargs)

 if params:

@@ -318,7 +321,6 b' class TestAdminUsersView(TestController)'
             route_path('edit_user_emails', user_id=user_id))
         response.mustcontain(no=['example@rhodecode.com'])

-
     def test_create(self, request, xhr_header):
         self.log_user()
         username = 'newtestuser'

@@ -333,6 +335,7 b' class TestAdminUsersView(TestController)'
         response = self.app.post(route_path('users_create'), params={
             'username': username,
             'password': password,
+            'description': 'mr CTO',
             'password_confirmation': password_confirmation,
             'firstname': name,
             'active': True,

@@ -381,6 +384,7 b' class TestAdminUsersView(TestController)'
             'name': name,
             'active': False,
             'lastname': lastname,
+            'description': 'mr CTO',
             'email': email,
             'csrf_token': self.csrf_token,
         })

@@ -418,6 +422,7 b' class TestAdminUsersView(TestController)'
         ('email', {'email': 'some@email.com'}),
         ('language', {'language': 'de'}),
         ('language', {'language': 'en'}),
+        ('description', {'description': 'hello CTO'}),
         # ('new_password', {'new_password': 'foobar123',
         #  'password_confirmation': 'foobar123'})
     ])

@@ -515,7 +520,7 b' class TestAdminUsersView(TestController)'
             route_path('user_delete', user_id=new_user.user_id),
             params={'csrf_token': self.csrf_token})

-        assert_session_flash(response, 'Successfully deleted user')
+        assert_session_flash(response, 'Successfully deleted user `{}`'.format(username))

     def test_delete_owner_of_repository(self, request, user_util):
         self.log_user()

@@ -531,8 +536,7 b' class TestAdminUsersView(TestController)'
             params={'csrf_token': self.csrf_token})

         msg = 'user "%s" still owns 1 repositories and cannot be removed. ' \
-              'Switch owners or remove those repositories:%s' % (username,
-                                                                 obj_name)
+              'Switch owners or remove those repositories:%s' % (username, obj_name)
         assert_session_flash(response, msg)
         fixture.destroy_repo(obj_name)

@@ -542,6 +546,7 b' class TestAdminUsersView(TestController)'
         usr = user_util.create_user(auto_cleanup=False)
         username = usr.username
         fixture.create_repo(obj_name, cur_user=usr.username)
+        Session().commit()

         new_user = Session().query(User)\
             .filter(User.username == username).one()

@@ -583,8 +588,7 b' class TestAdminUsersView(TestController)'
             params={'csrf_token': self.csrf_token})

         msg = 'user "%s" still owns 1 repository groups and cannot be removed. ' \
-              'Switch owners or remove those repository groups:%s' % (username,
-                                                                      obj_name)
+              'Switch owners or remove those repository groups:%s' % (username, obj_name)
         assert_session_flash(response, msg)
         fixture.destroy_repo_group(obj_name)

@@ -635,8 +639,7 b' class TestAdminUsersView(TestController)'
             params={'csrf_token': self.csrf_token})

         msg = 'user "%s" still owns 1 user groups and cannot be removed. ' \
-              'Switch owners or remove those user groups:%s' % (username,
-                                                                obj_name)
+              'Switch owners or remove those user groups:%s' % (username, obj_name)
         assert_session_flash(response, msg)
         fixture.destroy_user_group(obj_name)

@@ -779,3 +782,13 b' class TestAdminUsersView(TestController)'
         user = self.log_user()
         self.app.get(
             route_path('edit_user_audit_logs', user_id=user['user_id']))
+
+    def test_audit_log_page_download(self):
+        user = self.log_user()
+        user_id = user['user_id']
+        response = self.app.get(
+            route_path('edit_user_audit_logs_download', user_id=user_id))
+
+        assert response.content_disposition == \
+            'attachment; filename=user_{}_audit_logs.json'.format(user_id)
+        assert response.content_type == "application/json"
@@ -28,7 +28,7 b' from rhodecode.model.db import joinedloa'
 from rhodecode.lib.user_log_filter import user_log_filter
 from rhodecode.lib.auth import LoginRequired, HasPermissionAllDecorator
 from rhodecode.lib.utils2 import safe_int
-from rhodecode.lib.helpers import Page
+from rhodecode.lib.helpers import SqlPage

 log = logging.getLogger(__name__)

@@ -62,13 +62,16 b' class AdminAuditLogsView(BaseAppView):'

         p = safe_int(self.request.GET.get('page', 1), 1)

-        def url_generator(**kw):
+        def url_generator(page_num):
+            query_params = {
+                'page': page_num
+            }
             if c.search_term:
-                kw['filter'] = c.search_term
-            return self.request.current_route_path(_query=kw)
+                query_params['filter'] = c.search_term
+            return self.request.current_route_path(_query=query_params)

-        c.audit_logs = Page(users_log, page=p, items_per_page=10,
-                            url=url_generator)
+        c.audit_logs = SqlPage(users_log, page=p, items_per_page=10,
+                               url_maker=url_generator)
         return self._get_template_context(c)

     @LoginRequired()
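
The pager change above swaps ``Page(url=...)`` for ``SqlPage(url_maker=...)``: the ``url_maker`` callable receives a page number and returns the URL for that page. A standalone sketch of that contract (the route and filter value below are made up for illustration)::

    def url_generator(page_num):
        # mirror of the view code above: carry the current filter, if any,
        # and the requested page number into the generated link
        query_params = {'page': page_num, 'filter': 'push'}
        query = '&'.join('{}={}'.format(k, v) for k, v in sorted(query_params.items()))
        return '/_admin/audit_logs?' + query

    # SqlPage calls url_maker(page_num) for every pagination link it renders
    print(url_generator(2))   # -> /_admin/audit_logs?filter=push&page=2
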
@@ -25,7 +25,7 b' from pyramid.view import view_config' | |||||
25 |
|
25 | |||
26 | from rhodecode.apps._base import BaseAppView |
|
26 | from rhodecode.apps._base import BaseAppView | |
27 | from rhodecode.lib import helpers as h |
|
27 | from rhodecode.lib import helpers as h | |
28 | from rhodecode.lib.auth import (LoginRequired, NotAnonymous) |
|
28 | from rhodecode.lib.auth import (LoginRequired, NotAnonymous, HasRepoPermissionAny) | |
29 | from rhodecode.model.db import PullRequest |
|
29 | from rhodecode.model.db import PullRequest | |
30 |
|
30 | |||
31 |
|
31 | |||
@@ -66,6 +66,13 b' class AdminMainView(BaseAppView):' | |||||
66 | pull_request_id = pull_request.pull_request_id |
|
66 | pull_request_id = pull_request.pull_request_id | |
67 |
|
67 | |||
68 | repo_name = pull_request.target_repo.repo_name |
|
68 | repo_name = pull_request.target_repo.repo_name | |
|
69 | # NOTE(marcink): | |||
|
70 | # check permissions so we don't redirect to repo that we don't have access to | |||
|
71 | # exposing it's name | |||
|
72 | target_repo_perm = HasRepoPermissionAny( | |||
|
73 | 'repository.read', 'repository.write', 'repository.admin')(repo_name) | |||
|
74 | if not target_repo_perm: | |||
|
75 | raise HTTPNotFound() | |||
69 |
|
76 | |||
70 | raise HTTPFound( |
|
77 | raise HTTPFound( | |
71 | h.route_path('pullrequest_show', repo_name=repo_name, |
|
78 | h.route_path('pullrequest_show', repo_name=repo_name, |
@@ -68,7 +68,8 b' class AdminPermissionsView(BaseAppView, ' | |||||
68 |
|
68 | |||
69 | c.user = User.get_default_user(refresh=True) |
|
69 | c.user = User.get_default_user(refresh=True) | |
70 |
|
70 | |||
71 | app_settings = SettingsModel().get_all_settings() |
|
71 | app_settings = c.rc_config | |
|
72 | ||||
72 | defaults = { |
|
73 | defaults = { | |
73 | 'anonymous': c.user.active, |
|
74 | 'anonymous': c.user.active, | |
74 | 'default_register_message': app_settings.get( |
|
75 | 'default_register_message': app_settings.get( |
@@ -47,7 +47,7 b' class AdminProcessManagementView(BaseApp' | |||||
47 | 'name': proc.name(), |
|
47 | 'name': proc.name(), | |
48 | 'mem_rss': mem.rss, |
|
48 | 'mem_rss': mem.rss, | |
49 | 'mem_vms': mem.vms, |
|
49 | 'mem_vms': mem.vms, | |
50 | 'cpu_percent': proc.cpu_percent(), |
|
50 | 'cpu_percent': proc.cpu_percent(interval=0.1), | |
51 | 'create_time': proc.create_time(), |
|
51 | 'create_time': proc.create_time(), | |
52 | 'cmd': ' '.join(proc.cmdline()), |
|
52 | 'cmd': ' '.join(proc.cmdline()), | |
53 | }) |
|
53 | }) |
@@ -19,6 +19,8 b'' | |||||
19 | # and proprietary license terms, please see https://rhodecode.com/licenses/ |
|
19 | # and proprietary license terms, please see https://rhodecode.com/licenses/ | |
20 | import datetime |
|
20 | import datetime | |
21 | import logging |
|
21 | import logging | |
|
22 | import time | |||
|
23 | ||||
22 | import formencode |
|
24 | import formencode | |
23 | import formencode.htmlfill |
|
25 | import formencode.htmlfill | |
24 |
|
26 | |||
@@ -63,7 +65,7 b' class AdminRepoGroupsView(BaseAppView, D' | |||||
63 | # and display only those we have ADMIN right |
|
65 | # and display only those we have ADMIN right | |
64 | groups_with_admin_rights = RepoGroupList( |
|
66 | groups_with_admin_rights = RepoGroupList( | |
65 | RepoGroup.query().all(), |
|
67 | RepoGroup.query().all(), | |
66 | perm_set=['group.admin']) |
|
68 | perm_set=['group.admin'], extra_kwargs=dict(user=self._rhodecode_user)) | |
67 | c.repo_groups = RepoGroup.groups_choices( |
|
69 | c.repo_groups = RepoGroup.groups_choices( | |
68 | groups=groups_with_admin_rights, |
|
70 | groups=groups_with_admin_rights, | |
69 | show_empty_group=allow_empty_group) |
|
71 | show_empty_group=allow_empty_group) | |
@@ -109,9 +111,9 b' class AdminRepoGroupsView(BaseAppView, D' | |||||
109 | def repo_group_list_data(self): |
|
111 | def repo_group_list_data(self): | |
110 | self.load_default_context() |
|
112 | self.load_default_context() | |
111 | column_map = { |
|
113 | column_map = { | |
112 |
'name |
|
114 | 'name': 'group_name_hash', | |
113 | 'desc': 'group_description', |
|
115 | 'desc': 'group_description', | |
114 |
'last_change |
|
116 | 'last_change': 'updated_on', | |
115 | 'top_level_repos': 'repos_total', |
|
117 | 'top_level_repos': 'repos_total', | |
116 | 'owner': 'user_username', |
|
118 | 'owner': 'user_username', | |
117 | } |
|
119 | } | |
@@ -131,9 +133,10 b' class AdminRepoGroupsView(BaseAppView, D' | |||||
131 |
|
133 | |||
132 | def last_change(last_change): |
|
134 | def last_change(last_change): | |
133 | if isinstance(last_change, datetime.datetime) and not last_change.tzinfo: |
|
135 | if isinstance(last_change, datetime.datetime) and not last_change.tzinfo: | |
134 |
|
|
136 | ts = time.time() | |
135 |
|
|
137 | utc_offset = (datetime.datetime.fromtimestamp(ts) | |
136 | last_change = last_change + delta |
|
138 | - datetime.datetime.utcfromtimestamp(ts)).total_seconds() | |
|
139 | last_change = last_change + datetime.timedelta(seconds=utc_offset) | |||
137 | return _render("last_change", last_change) |
|
140 | return _render("last_change", last_change) | |
138 |
|
141 | |||
139 | def desc(desc, personal): |
|
142 | def desc(desc, personal): | |
@@ -147,12 +150,8 b' class AdminRepoGroupsView(BaseAppView, D' | |||||
147 | def user_profile(username): |
|
150 | def user_profile(username): | |
148 | return _render('user_profile', username) |
|
151 | return _render('user_profile', username) | |
149 |
|
152 | |||
150 | auth_repo_group_list = RepoGroupList( |
|
153 | _perms = ['group.admin'] | |
151 | RepoGroup.query().all(), perm_set=['group.admin']) |
|
154 | allowed_ids = [-1] + self._rhodecode_user.repo_group_acl_ids_from_stack(_perms) | |
152 |
|
||||
153 | allowed_ids = [-1] |
|
|||
154 | for repo_group in auth_repo_group_list: |
|
|||
155 | allowed_ids.append(repo_group.group_id) |
|
|||
156 |
|
155 | |||
157 | repo_groups_data_total_count = RepoGroup.query()\ |
|
156 | repo_groups_data_total_count = RepoGroup.query()\ | |
158 | .filter(or_( |
|
157 | .filter(or_( | |
@@ -180,7 +179,7 b' class AdminRepoGroupsView(BaseAppView, D' | |||||
180 | # generate multiple IN to fix limitation problems |
|
179 | # generate multiple IN to fix limitation problems | |
181 | *in_filter_generator(RepoGroup.group_id, allowed_ids) |
|
180 | *in_filter_generator(RepoGroup.group_id, allowed_ids) | |
182 | )) \ |
|
181 | )) \ | |
183 | .outerjoin(Repository) \ |
|
182 | .outerjoin(Repository, Repository.group_id == RepoGroup.group_id) \ | |
184 | .join(User, User.user_id == RepoGroup.user_id) \ |
|
183 | .join(User, User.user_id == RepoGroup.user_id) \ | |
185 | .group_by(RepoGroup, User) |
|
184 | .group_by(RepoGroup, User) | |
186 |
|
185 | |||
@@ -224,9 +223,8 b' class AdminRepoGroupsView(BaseAppView, D' | |||||
224 | row = { |
|
223 | row = { | |
225 | "menu": quick_menu(repo_gr.group_name), |
|
224 | "menu": quick_menu(repo_gr.group_name), | |
226 | "name": repo_group_lnk(repo_gr.group_name), |
|
225 | "name": repo_group_lnk(repo_gr.group_name), | |
227 | "name_raw": repo_gr.group_name, |
|
226 | ||
228 | "last_change": last_change(repo_gr.updated_on), |
|
227 | "last_change": last_change(repo_gr.updated_on), | |
229 | "last_change_raw": datetime_to_time(repo_gr.updated_on), |
|
|||
230 |
|
228 | |||
231 | "last_changeset": "", |
|
229 | "last_changeset": "", | |
232 | "last_changeset_raw": "", |
|
230 | "last_changeset_raw": "", |
@@ -31,7 +31,6 b' from rhodecode import events' | |||||
31 | from rhodecode.apps._base import BaseAppView, DataGridAppView |
|
31 | from rhodecode.apps._base import BaseAppView, DataGridAppView | |
32 | from rhodecode.lib.celerylib.utils import get_task_id |
|
32 | from rhodecode.lib.celerylib.utils import get_task_id | |
33 |
|
33 | |||
34 | from rhodecode.lib.ext_json import json |
|
|||
35 | from rhodecode.lib.auth import ( |
|
34 | from rhodecode.lib.auth import ( | |
36 | LoginRequired, CSRFRequired, NotAnonymous, |
|
35 | LoginRequired, CSRFRequired, NotAnonymous, | |
37 | HasPermissionAny, HasRepoGroupPermissionAny) |
|
36 | HasPermissionAny, HasRepoGroupPermissionAny) | |
@@ -43,7 +42,8 b' from rhodecode.model.permission import P' | |||||
43 | from rhodecode.model.repo import RepoModel |
|
42 | from rhodecode.model.repo import RepoModel | |
44 | from rhodecode.model.scm import RepoList, RepoGroupList, ScmModel |
|
43 | from rhodecode.model.scm import RepoList, RepoGroupList, ScmModel | |
45 | from rhodecode.model.settings import SettingsModel |
|
44 | from rhodecode.model.settings import SettingsModel | |
46 |
from rhodecode.model.db import |
|
45 | from rhodecode.model.db import ( | |
|
46 | in_filter_generator, or_, func, Session, Repository, RepoGroup, User) | |||
47 |
|
47 | |||
48 | log = logging.getLogger(__name__) |
|
48 | log = logging.getLogger(__name__) | |
49 |
|
49 | |||
@@ -60,8 +60,6 b' class AdminReposView(BaseAppView, DataGr' | |||||
60 | perm_set=['group.write', 'group.admin']) |
|
60 | perm_set=['group.write', 'group.admin']) | |
61 | c.repo_groups = RepoGroup.groups_choices(groups=acl_groups) |
|
61 | c.repo_groups = RepoGroup.groups_choices(groups=acl_groups) | |
62 | c.repo_groups_choices = map(lambda k: safe_unicode(k[0]), c.repo_groups) |
|
62 | c.repo_groups_choices = map(lambda k: safe_unicode(k[0]), c.repo_groups) | |
63 | c.landing_revs_choices, c.landing_revs = \ |
|
|||
64 | ScmModel().get_repo_landing_revs(self.request.translate) |
|
|||
65 | c.personal_repo_group = self._rhodecode_user.personal_repo_group |
|
63 | c.personal_repo_group = self._rhodecode_user.personal_repo_group | |
66 |
|
64 | |||
67 | @LoginRequired() |
|
65 | @LoginRequired() | |
@@ -72,15 +70,94 b' class AdminReposView(BaseAppView, DataGr'
         renderer='rhodecode:templates/admin/repos/repos.mako')
     def repository_list(self):
         c = self.load_default_context()
+        return self._get_template_context(c)

-        repo_list = Repository.get_all_repos()
-        c.repo_list = RepoList(repo_list, perm_set=['repository.admin'])
+    @LoginRequired()
+    @NotAnonymous()
+    # perms check inside
+    @view_config(
+        route_name='repos_data', request_method='GET',
+        renderer='json_ext', xhr=True)
+    def repository_list_data(self):
+        self.load_default_context()
+        column_map = {
+            'name': 'repo_name',
+            'desc': 'description',
+            'last_change': 'updated_on',
+            'owner': 'user_username',
+        }
+        draw, start, limit = self._extract_chunk(self.request)
+        search_q, order_by, order_dir = self._extract_ordering(
+            self.request, column_map=column_map)
+
+        _perms = ['repository.admin']
+        allowed_ids = [-1] + self._rhodecode_user.repo_acl_ids_from_stack(_perms)
+
+        repos_data_total_count = Repository.query() \
+            .filter(or_(
+                # generate multiple IN to fix limitation problems
+                *in_filter_generator(Repository.repo_id, allowed_ids))
+            ) \
+            .count()
+
+        base_q = Session.query(
+            Repository.repo_id,
+            Repository.repo_name,
+            Repository.description,
+            Repository.repo_type,
+            Repository.repo_state,
+            Repository.private,
+            Repository.archived,
+            Repository.fork,
+            Repository.updated_on,
+            Repository._changeset_cache,
+            User,
+        ) \
+            .filter(or_(
+                # generate multiple IN to fix limitation problems
+                *in_filter_generator(Repository.repo_id, allowed_ids))
+            ) \
+            .join(User, User.user_id == Repository.user_id) \
+            .group_by(Repository, User)
+
+        if search_q:
+            like_expression = u'%{}%'.format(safe_unicode(search_q))
+            base_q = base_q.filter(or_(
+                Repository.repo_name.ilike(like_expression),
+            ))
+
+        repos_data_total_filtered_count = base_q.count()
+
+        sort_defined = False
+        if order_by == 'repo_name':
+            sort_col = func.lower(Repository.repo_name)
+            sort_defined = True
+        elif order_by == 'user_username':
+            sort_col = User.username
+        else:
+            sort_col = getattr(Repository, order_by, None)
+
+        if sort_defined or sort_col:
+            if order_dir == 'asc':
+                sort_col = sort_col.asc()
+            else:
+                sort_col = sort_col.desc()
+
+        base_q = base_q.order_by(sort_col)
+        base_q = base_q.offset(start).limit(limit)
+
+        repos_list = base_q.all()
+
         repos_data = RepoModel().get_repos_as_dict(
-            repo_list=
-        # json used to render the grid
-        c.data = json.dumps(repos_data)
+            repo_list=repos_list, admin=True, super_user_actions=True)

-        return self._get_template_context(c)
+        data = ({
+            'draw': draw,
+            'data': repos_data,
+            'recordsTotal': repos_data_total_count,
+            'recordsFiltered': repos_data_total_filtered_count,
+        })
+        return data

     @LoginRequired()
     @NotAnonymous()
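Both the count query and the data query in ``repository_list_data`` above build their ACL
filter from ``*in_filter_generator(Repository.repo_id, allowed_ids)``. The helper itself is
not part of this changeset; a minimal sketch of the chunked ``IN`` filtering it appears to
provide (the chunk size and the empty-input fallback are assumptions) looks like this::

    def in_filter_generator(qry_column, items, limit=500):
        # yield column.in_(chunk) criteria so that no single IN() clause
        # grows past `limit` elements (works around backend IN-size limits)
        if not items:
            # an empty id list still has to produce a valid, non-matching filter
            items = [-1]
        for i in range(0, len(items), limit):
            yield qry_column.in_(items[i:i + limit])

    # used as in the queries above:
    #   .filter(or_(*in_filter_generator(Repository.repo_id, allowed_ids)))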
@@ -151,8 +228,7 b' class AdminReposView(BaseAppView, DataGr'
         try:
             # CanWriteToGroup validators checks permissions of this POST
             form = RepoForm(
-                self.request.translate, repo_groups=c.repo_groups_choices
-                landing_revs=c.landing_revs_choices)()
+                self.request.translate, repo_groups=c.repo_groups_choices)()
             form_result = form.to_python(dict(self.request.POST))
             copy_permissions = form_result.get('repo_copy_permissions')
             # create is done sometimes async on celery, db transaction
@@ -445,7 +445,7 b' class AdminSettingsView(BaseAppView):'
     def settings_issuetracker(self):
         c = self.load_default_context()
         c.active = 'issuetracker'
-        defaults = SettingsModel().get_all_settings()
+        defaults = c.rc_config

         entry_key = 'rhodecode_issuetracker_pat_'

@@ -518,7 +518,7 b' class AdminSettingsView(BaseAppView):'
     @CSRFRequired()
     @view_config(
         route_name='admin_settings_issuetracker_delete', request_method='POST',
-        renderer='rhodecode:templates/admin/settings/settings.mako')
+        renderer='json_ext', xhr=True)
     def settings_issuetracker_delete(self):
         _ = self.request.translate
         self.load_default_context()
@@ -528,8 +528,11 b' class AdminSettingsView(BaseAppView):'
         except Exception:
             log.exception('Failed to delete issue tracker setting %s', uid)
             raise HTTPNotFound()
-        h.flash(_('Removed issue tracker entry'), category='success')
-        raise HTTPFound(h.route_path('admin_settings_issuetracker'))
+
+        SettingsModel().invalidate_settings_cache()
+        h.flash(_('Removed issue tracker entry.'), category='success')
+
+        return {'deleted': uid}

     @LoginRequired()
     @HasPermissionAllDecorator('hg.admin')
@@ -570,8 +573,7 b' class AdminSettingsView(BaseAppView):'

         email_kwargs = {
             'date': datetime.datetime.now(),
-            'user': c.rhodecode_user,
-            'rhodecode_version': c.rhodecode_version
+            'user': c.rhodecode_user
         }

         (subject, headers, email_body,
@@ -155,6 +155,10 b' class AdminSystemInfoSettingsView(BaseAp'

         ]

+        c.vcsserver_data_items = [
+            (k, v) for k,v in (val('vcs_server_config') or {}).items()
+        ]
+
         if snapshot:
             if c.allowed_to_snapshot:
                 c.data_items.pop(0)  # remove server info
@@ -99,12 +99,8 b' class AdminUserGroupsView(BaseAppView, D'
         def user_profile(username):
             return _render('user_profile', username)

-        auth_user_group_list = UserGroupList(
-            UserGroup.query().all(), perm_set=['usergroup.admin'])
-
-        allowed_ids = [-1]
-        for user_group in auth_user_group_list:
-            allowed_ids.append(user_group.users_group_id)
+        _perms = ['usergroup.admin']
+        allowed_ids = [-1] + self._rhodecode_user.user_group_acl_ids_from_stack(_perms)

         user_groups_data_total_count = UserGroup.query()\
             .filter(or_(
@@ -134,7 +130,7 b' class AdminUserGroupsView(BaseAppView, D'
                 # generate multiple IN to fix limitation problems
                 *in_filter_generator(UserGroup.users_group_id, allowed_ids)
             )) \
-            .outerjoin(UserGroupMember) \
+            .outerjoin(UserGroupMember, UserGroupMember.users_group_id == UserGroup.users_group_id) \
             .join(User, User.user_id == UserGroup.user_id) \
             .group_by(UserGroup, User)

@@ -175,7 +171,6 b' class AdminUserGroupsView(BaseAppView, D'
         for user_gr in auth_user_group_list:
             row = {
                 "users_group_name": user_group_name(user_gr.users_group_name),
-                "name_raw": h.escape(user_gr.users_group_name),
                 "description": h.escape(user_gr.user_group_description),
                 "members": user_gr.member_count,
                 # NOTE(marcink): because of advanced query we
@@ -31,6 +31,7 b' from pyramid.response import Response'
 from rhodecode import events
 from rhodecode.apps._base import BaseAppView, DataGridAppView, UserAppView
 from rhodecode.apps.ssh_support import SshKeyFileChangeEvent
+from rhodecode.authentication.base import get_authn_registry, RhodeCodeExternalAuthPlugin
 from rhodecode.authentication.plugins import auth_rhodecode
 from rhodecode.events import trigger
 from rhodecode.model.db import true
@@ -43,6 +44,7 b' from rhodecode.lib.ext_json import json'
 from rhodecode.lib.auth import (
     LoginRequired, HasPermissionAllDecorator, CSRFRequired)
 from rhodecode.lib import helpers as h
+from rhodecode.lib.helpers import SqlPage
 from rhodecode.lib.utils2 import safe_int, safe_unicode, AttributeDict
 from rhodecode.model.auth_token import AuthTokenModel
 from rhodecode.model.forms import (
@@ -249,7 +251,32 b' class UsersView(UserAppView):' | |||||
249 | in there as well. |
|
251 | in there as well. | |
250 | """ |
|
252 | """ | |
251 |
|
253 | |||
|
254 | def get_auth_plugins(self): | |||
|
255 | valid_plugins = [] | |||
|
256 | authn_registry = get_authn_registry(self.request.registry) | |||
|
257 | for plugin in authn_registry.get_plugins_for_authentication(): | |||
|
258 | if isinstance(plugin, RhodeCodeExternalAuthPlugin): | |||
|
259 | valid_plugins.append(plugin) | |||
|
260 | elif plugin.name == 'rhodecode': | |||
|
261 | valid_plugins.append(plugin) | |||
|
262 | ||||
|
263 | # extend our choices if user has set a bound plugin which isn't enabled at the | |||
|
264 | # moment | |||
|
265 | extern_type = self.db_user.extern_type | |||
|
266 | if extern_type not in [x.uid for x in valid_plugins]: | |||
|
267 | try: | |||
|
268 | plugin = authn_registry.get_plugin_by_uid(extern_type) | |||
|
269 | if plugin: | |||
|
270 | valid_plugins.append(plugin) | |||
|
271 | ||||
|
272 | except Exception: | |||
|
273 | log.exception( | |||
|
274 | 'Could not extend user plugins with `{}`'.format(extern_type)) | |||
|
275 | return valid_plugins | |||
|
276 | ||||
252 | def load_default_context(self): |
|
277 | def load_default_context(self): | |
|
278 | req = self.request | |||
|
279 | ||||
253 | c = self._get_local_tmpl_context() |
|
280 | c = self._get_local_tmpl_context() | |
254 | c.allow_scoped_tokens = self.ALLOW_SCOPED_TOKENS |
|
281 | c.allow_scoped_tokens = self.ALLOW_SCOPED_TOKENS | |
255 | c.allowed_languages = [ |
|
282 | c.allowed_languages = [ | |
@@ -263,7 +290,10 b' class UsersView(UserAppView):' | |||||
263 | ('ru', 'Russian (ru)'), |
|
290 | ('ru', 'Russian (ru)'), | |
264 | ('zh', 'Chinese (zh)'), |
|
291 | ('zh', 'Chinese (zh)'), | |
265 | ] |
|
292 | ] | |
266 | req = self.request |
|
293 | ||
|
294 | c.allowed_extern_types = [ | |||
|
295 | (x.uid, x.get_display_name()) for x in self.get_auth_plugins() | |||
|
296 | ] | |||
267 |
|
297 | |||
268 | c.available_permissions = req.registry.settings['available_permissions'] |
|
298 | c.available_permissions = req.registry.settings['available_permissions'] | |
269 | PermissionModel().set_global_permission_choices( |
|
299 | PermissionModel().set_global_permission_choices( | |
@@ -297,7 +327,7 b' class UsersView(UserAppView):' | |||||
297 | old_values = c.user.get_api_data() |
|
327 | old_values = c.user.get_api_data() | |
298 | try: |
|
328 | try: | |
299 | form_result = _form.to_python(dict(self.request.POST)) |
|
329 | form_result = _form.to_python(dict(self.request.POST)) | |
300 |
skip_attrs = [' |
|
330 | skip_attrs = ['extern_name'] | |
301 | # TODO: plugin should define if username can be updated |
|
331 | # TODO: plugin should define if username can be updated | |
302 | if c.extern_type != "rhodecode": |
|
332 | if c.extern_type != "rhodecode": | |
303 | # forbid updating username for external accounts |
|
333 | # forbid updating username for external accounts | |
@@ -347,59 +377,69 b' class UsersView(UserAppView):' | |||||
347 | _repos = c.user.repositories |
|
377 | _repos = c.user.repositories | |
348 | _repo_groups = c.user.repository_groups |
|
378 | _repo_groups = c.user.repository_groups | |
349 | _user_groups = c.user.user_groups |
|
379 | _user_groups = c.user.user_groups | |
|
380 | _artifacts = c.user.artifacts | |||
350 |
|
381 | |||
351 | handle_repos = None |
|
382 | handle_repos = None | |
352 | handle_repo_groups = None |
|
383 | handle_repo_groups = None | |
353 | handle_user_groups = None |
|
384 | handle_user_groups = None | |
354 | # dummy call for flash of handle |
|
385 | handle_artifacts = None | |
355 | set_handle_flash_repos = lambda: None |
|
386 | ||
356 | set_handle_flash_repo_groups = lambda: None |
|
387 | # calls for flash of handle based on handle case detach or delete | |
357 |
|
|
388 | def set_handle_flash_repos(): | |
|
389 | handle = handle_repos | |||
|
390 | if handle == 'detach': | |||
|
391 | h.flash(_('Detached %s repositories') % len(_repos), | |||
|
392 | category='success') | |||
|
393 | elif handle == 'delete': | |||
|
394 | h.flash(_('Deleted %s repositories') % len(_repos), | |||
|
395 | category='success') | |||
|
396 | ||||
|
397 | def set_handle_flash_repo_groups(): | |||
|
398 | handle = handle_repo_groups | |||
|
399 | if handle == 'detach': | |||
|
400 | h.flash(_('Detached %s repository groups') % len(_repo_groups), | |||
|
401 | category='success') | |||
|
402 | elif handle == 'delete': | |||
|
403 | h.flash(_('Deleted %s repository groups') % len(_repo_groups), | |||
|
404 | category='success') | |||
|
405 | ||||
|
406 | def set_handle_flash_user_groups(): | |||
|
407 | handle = handle_user_groups | |||
|
408 | if handle == 'detach': | |||
|
409 | h.flash(_('Detached %s user groups') % len(_user_groups), | |||
|
410 | category='success') | |||
|
411 | elif handle == 'delete': | |||
|
412 | h.flash(_('Deleted %s user groups') % len(_user_groups), | |||
|
413 | category='success') | |||
|
414 | ||||
|
415 | def set_handle_flash_artifacts(): | |||
|
416 | handle = handle_artifacts | |||
|
417 | if handle == 'detach': | |||
|
418 | h.flash(_('Detached %s artifacts') % len(_artifacts), | |||
|
419 | category='success') | |||
|
420 | elif handle == 'delete': | |||
|
421 | h.flash(_('Deleted %s artifacts') % len(_artifacts), | |||
|
422 | category='success') | |||
358 |
|
423 | |||
359 | if _repos and self.request.POST.get('user_repos'): |
|
424 | if _repos and self.request.POST.get('user_repos'): | |
360 | do = self.request.POST['user_repos'] |
|
425 | handle_repos = self.request.POST['user_repos'] | |
361 | if do == 'detach': |
|
|||
362 | handle_repos = 'detach' |
|
|||
363 | set_handle_flash_repos = lambda: h.flash( |
|
|||
364 | _('Detached %s repositories') % len(_repos), |
|
|||
365 | category='success') |
|
|||
366 | elif do == 'delete': |
|
|||
367 | handle_repos = 'delete' |
|
|||
368 | set_handle_flash_repos = lambda: h.flash( |
|
|||
369 | _('Deleted %s repositories') % len(_repos), |
|
|||
370 | category='success') |
|
|||
371 |
|
426 | |||
372 | if _repo_groups and self.request.POST.get('user_repo_groups'): |
|
427 | if _repo_groups and self.request.POST.get('user_repo_groups'): | |
373 | do = self.request.POST['user_repo_groups'] |
|
428 | handle_repo_groups = self.request.POST['user_repo_groups'] | |
374 | if do == 'detach': |
|
|||
375 | handle_repo_groups = 'detach' |
|
|||
376 | set_handle_flash_repo_groups = lambda: h.flash( |
|
|||
377 | _('Detached %s repository groups') % len(_repo_groups), |
|
|||
378 | category='success') |
|
|||
379 | elif do == 'delete': |
|
|||
380 | handle_repo_groups = 'delete' |
|
|||
381 | set_handle_flash_repo_groups = lambda: h.flash( |
|
|||
382 | _('Deleted %s repository groups') % len(_repo_groups), |
|
|||
383 | category='success') |
|
|||
384 |
|
429 | |||
385 | if _user_groups and self.request.POST.get('user_user_groups'): |
|
430 | if _user_groups and self.request.POST.get('user_user_groups'): | |
386 | do = self.request.POST['user_user_groups'] |
|
431 | handle_user_groups = self.request.POST['user_user_groups'] | |
387 | if do == 'detach': |
|
432 | ||
388 | handle_user_groups = 'detach' |
|
433 | if _artifacts and self.request.POST.get('user_artifacts'): | |
389 | set_handle_flash_user_groups = lambda: h.flash( |
|
434 | handle_artifacts = self.request.POST['user_artifacts'] | |
390 | _('Detached %s user groups') % len(_user_groups), |
|
|||
391 | category='success') |
|
|||
392 | elif do == 'delete': |
|
|||
393 | handle_user_groups = 'delete' |
|
|||
394 | set_handle_flash_user_groups = lambda: h.flash( |
|
|||
395 | _('Deleted %s user groups') % len(_user_groups), |
|
|||
396 | category='success') |
|
|||
397 |
|
435 | |||
398 | old_values = c.user.get_api_data() |
|
436 | old_values = c.user.get_api_data() | |
|
437 | ||||
399 | try: |
|
438 | try: | |
400 | UserModel().delete(c.user, handle_repos=handle_repos, |
|
439 | UserModel().delete(c.user, handle_repos=handle_repos, | |
401 | handle_repo_groups=handle_repo_groups, |
|
440 | handle_repo_groups=handle_repo_groups, | |
402 |
handle_user_groups=handle_user_groups |
|
441 | handle_user_groups=handle_user_groups, | |
|
442 | handle_artifacts=handle_artifacts) | |||
403 |
|
443 | |||
404 | audit_logger.store_web( |
|
444 | audit_logger.store_web( | |
405 | 'user.delete', action_data={'old_data': old_values}, |
|
445 | 'user.delete', action_data={'old_data': old_values}, | |
@@ -409,7 +449,9 b' class UsersView(UserAppView):' | |||||
409 | set_handle_flash_repos() |
|
449 | set_handle_flash_repos() | |
410 | set_handle_flash_repo_groups() |
|
450 | set_handle_flash_repo_groups() | |
411 | set_handle_flash_user_groups() |
|
451 | set_handle_flash_user_groups() | |
412 | h.flash(_('Successfully deleted user'), category='success') |
|
452 | set_handle_flash_artifacts() | |
|
453 | username = h.escape(old_values['username']) | |||
|
454 | h.flash(_('Successfully deleted user `{}`').format(username), category='success') | |||
413 | except (UserOwnsReposException, UserOwnsRepoGroupsException, |
|
455 | except (UserOwnsReposException, UserOwnsRepoGroupsException, | |
414 | UserOwnsUserGroupsException, DefaultUserException) as e: |
|
456 | UserOwnsUserGroupsException, DefaultUserException) as e: | |
415 | h.flash(e, category='warning') |
|
457 | h.flash(e, category='warning') | |
@@ -1187,19 +1229,45 b' class UsersView(UserAppView):' | |||||
1187 | filter_term = self.request.GET.get('filter') |
|
1229 | filter_term = self.request.GET.get('filter') | |
1188 | user_log = UserModel().get_user_log(c.user, filter_term) |
|
1230 | user_log = UserModel().get_user_log(c.user, filter_term) | |
1189 |
|
1231 | |||
1190 |
def url_generator( |
|
1232 | def url_generator(page_num): | |
|
1233 | query_params = { | |||
|
1234 | 'page': page_num | |||
|
1235 | } | |||
1191 | if filter_term: |
|
1236 | if filter_term: | |
1192 |
|
|
1237 | query_params['filter'] = filter_term | |
1193 |
return self.request.current_route_path(_query= |
|
1238 | return self.request.current_route_path(_query=query_params) | |
1194 |
|
1239 | |||
1195 |
c.audit_logs = |
|
1240 | c.audit_logs = SqlPage( | |
1196 | user_log, page=p, items_per_page=10, url=url_generator) |
|
1241 | user_log, page=p, items_per_page=10, url_maker=url_generator) | |
1197 | c.filter_term = filter_term |
|
1242 | c.filter_term = filter_term | |
1198 | return self._get_template_context(c) |
|
1243 | return self._get_template_context(c) | |
1199 |
|
1244 | |||
1200 | @LoginRequired() |
|
1245 | @LoginRequired() | |
1201 | @HasPermissionAllDecorator('hg.admin') |
|
1246 | @HasPermissionAllDecorator('hg.admin') | |
1202 | @view_config( |
|
1247 | @view_config( | |
|
1248 | route_name='edit_user_audit_logs_download', request_method='GET', | |||
|
1249 | renderer='string') | |||
|
1250 | def user_audit_logs_download(self): | |||
|
1251 | _ = self.request.translate | |||
|
1252 | c = self.load_default_context() | |||
|
1253 | c.user = self.db_user | |||
|
1254 | ||||
|
1255 | user_log = UserModel().get_user_log(c.user, filter_term=None) | |||
|
1256 | ||||
|
1257 | audit_log_data = {} | |||
|
1258 | for entry in user_log: | |||
|
1259 | audit_log_data[entry.user_log_id] = entry.get_dict() | |||
|
1260 | ||||
|
1261 | response = Response(json.dumps(audit_log_data, indent=4)) | |||
|
1262 | response.content_disposition = str( | |||
|
1263 | 'attachment; filename=%s' % 'user_{}_audit_logs.json'.format(c.user.user_id)) | |||
|
1264 | response.content_type = 'application/json' | |||
|
1265 | ||||
|
1266 | return response | |||
|
1267 | ||||
|
1268 | @LoginRequired() | |||
|
1269 | @HasPermissionAllDecorator('hg.admin') | |||
|
1270 | @view_config( | |||
1203 | route_name='edit_user_perms_summary', request_method='GET', |
|
1271 | route_name='edit_user_perms_summary', request_method='GET', | |
1204 | renderer='rhodecode:templates/admin/users/user_edit.mako') |
|
1272 | renderer='rhodecode:templates/admin/users/user_edit.mako') | |
1205 | def user_perms_summary(self): |
|
1273 | def user_perms_summary(self): |
@@ -43,6 +43,14 b' def includeme(config):'
         pattern=ADMIN_PREFIX + '/debug_style',
         debug_style=True)
     config.add_route(
+        name='debug_style_email',
+        pattern=ADMIN_PREFIX + '/debug_style/email/{email_id}',
+        debug_style=True)
+    config.add_route(
+        name='debug_style_email_plain_rendered',
+        pattern=ADMIN_PREFIX + '/debug_style/email-rendered/{email_id}',
+        debug_style=True)
+    config.add_route(
         name='debug_style_template',
         pattern=ADMIN_PREFIX + '/debug_style/t/{t_path}',
         debug_style=True)
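With the two routes above registered, every e-mail template wired up in the debug-style view
can be previewed straight from the browser. Assuming the usual ``/_admin`` value of
``ADMIN_PREFIX``, the URLs look roughly like this; the optional ``email`` parameter is read by
the view and additionally sends the rendered message through the regular celery task::

    # HTML preview of one template; email_id matches the keys defined in the view,
    # e.g. registration, password_reset, cs_comment+file, pull_request_comment+status
    /_admin/debug_style/email/pull_request_comment+status

    # plaintext-rendered variant of the same template
    /_admin/debug_style/email-rendered/pull_request_comment+status

    # render and also deliver a test message
    /_admin/debug_style/email/registration?email=someone@example.com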
@@ -20,10 +20,15 b''

 import os
 import logging
+import datetime

 from pyramid.view import view_config
 from pyramid.renderers import render_to_response
 from rhodecode.apps._base import BaseAppView
+from rhodecode.lib.celerylib import run_task, tasks
+from rhodecode.lib.utils2 import AttributeDict
+from rhodecode.model.db import User
+from rhodecode.model.notification import EmailNotificationModel

 log = logging.getLogger(__name__)

@@ -46,6 +51,317 b' class DebugStyleView(BaseAppView):' | |||||
46 | request=self.request) |
|
51 | request=self.request) | |
47 |
|
52 | |||
48 | @view_config( |
|
53 | @view_config( | |
|
54 | route_name='debug_style_email', request_method='GET', | |||
|
55 | renderer=None) | |||
|
56 | @view_config( | |||
|
57 | route_name='debug_style_email_plain_rendered', request_method='GET', | |||
|
58 | renderer=None) | |||
|
59 | def render_email(self): | |||
|
60 | c = self.load_default_context() | |||
|
61 | email_id = self.request.matchdict['email_id'] | |||
|
62 | c.active = 'emails' | |||
|
63 | ||||
|
64 | pr = AttributeDict( | |||
|
65 | pull_request_id=123, | |||
|
66 | title='digital_ocean: fix redis, elastic search start on boot, ' | |||
|
67 | 'fix fd limits on supervisor, set postgres 11 version', | |||
|
68 | description=''' | |||
|
69 | Check if we should use full-topic or mini-topic. | |||
|
70 | ||||
|
71 | - full topic produces some problems with merge states etc | |||
|
72 | - server-mini-topic needs probably tweeks. | |||
|
73 | ''', | |||
|
74 | repo_name='foobar', | |||
|
75 | source_ref_parts=AttributeDict(type='branch', name='fix-ticket-2000'), | |||
|
76 | target_ref_parts=AttributeDict(type='branch', name='master'), | |||
|
77 | ) | |||
|
78 | target_repo = AttributeDict(repo_name='repo_group/target_repo') | |||
|
79 | source_repo = AttributeDict(repo_name='repo_group/source_repo') | |||
|
80 | user = User.get_by_username(self.request.GET.get('user')) or self._rhodecode_db_user | |||
|
81 | # file/commit changes for PR update | |||
|
82 | commit_changes = AttributeDict({ | |||
|
83 | 'added': ['aaaaaaabbbbb', 'cccccccddddddd'], | |||
|
84 | 'removed': ['eeeeeeeeeee'], | |||
|
85 | }) | |||
|
86 | file_changes = AttributeDict({ | |||
|
87 | 'added': ['a/file1.md', 'file2.py'], | |||
|
88 | 'modified': ['b/modified_file.rst'], | |||
|
89 | 'removed': ['.idea'], | |||
|
90 | }) | |||
|
91 | email_kwargs = { | |||
|
92 | 'test': {}, | |||
|
93 | 'message': { | |||
|
94 | 'body': 'message body !' | |||
|
95 | }, | |||
|
96 | 'email_test': { | |||
|
97 | 'user': user, | |||
|
98 | 'date': datetime.datetime.now(), | |||
|
99 | }, | |||
|
100 | 'password_reset': { | |||
|
101 | 'password_reset_url': 'http://example.com/reset-rhodecode-password/token', | |||
|
102 | ||||
|
103 | 'user': user, | |||
|
104 | 'date': datetime.datetime.now(), | |||
|
105 | 'email': 'test@rhodecode.com', | |||
|
106 | 'first_admin_email': User.get_first_super_admin().email | |||
|
107 | }, | |||
|
108 | 'password_reset_confirmation': { | |||
|
109 | 'new_password': 'new-password-example', | |||
|
110 | 'user': user, | |||
|
111 | 'date': datetime.datetime.now(), | |||
|
112 | 'email': 'test@rhodecode.com', | |||
|
113 | 'first_admin_email': User.get_first_super_admin().email | |||
|
114 | }, | |||
|
115 | 'registration': { | |||
|
116 | 'user': user, | |||
|
117 | 'date': datetime.datetime.now(), | |||
|
118 | }, | |||
|
119 | ||||
|
120 | 'pull_request_comment': { | |||
|
121 | 'user': user, | |||
|
122 | ||||
|
123 | 'status_change': None, | |||
|
124 | 'status_change_type': None, | |||
|
125 | ||||
|
126 | 'pull_request': pr, | |||
|
127 | 'pull_request_commits': [], | |||
|
128 | ||||
|
129 | 'pull_request_target_repo': target_repo, | |||
|
130 | 'pull_request_target_repo_url': 'http://target-repo/url', | |||
|
131 | ||||
|
132 | 'pull_request_source_repo': source_repo, | |||
|
133 | 'pull_request_source_repo_url': 'http://source-repo/url', | |||
|
134 | ||||
|
135 | 'pull_request_url': 'http://localhost/pr1', | |||
|
136 | 'pr_comment_url': 'http://comment-url', | |||
|
137 | 'pr_comment_reply_url': 'http://comment-url#reply', | |||
|
138 | ||||
|
139 | 'comment_file': None, | |||
|
140 | 'comment_line': None, | |||
|
141 | 'comment_type': 'note', | |||
|
142 | 'comment_body': 'This is my comment body. *I like !*', | |||
|
143 | 'comment_id': 2048, | |||
|
144 | 'renderer_type': 'markdown', | |||
|
145 | 'mention': True, | |||
|
146 | ||||
|
147 | }, | |||
|
148 | 'pull_request_comment+status': { | |||
|
149 | 'user': user, | |||
|
150 | ||||
|
151 | 'status_change': 'approved', | |||
|
152 | 'status_change_type': 'approved', | |||
|
153 | ||||
|
154 | 'pull_request': pr, | |||
|
155 | 'pull_request_commits': [], | |||
|
156 | ||||
|
157 | 'pull_request_target_repo': target_repo, | |||
|
158 | 'pull_request_target_repo_url': 'http://target-repo/url', | |||
|
159 | ||||
|
160 | 'pull_request_source_repo': source_repo, | |||
|
161 | 'pull_request_source_repo_url': 'http://source-repo/url', | |||
|
162 | ||||
|
163 | 'pull_request_url': 'http://localhost/pr1', | |||
|
164 | 'pr_comment_url': 'http://comment-url', | |||
|
165 | 'pr_comment_reply_url': 'http://comment-url#reply', | |||
|
166 | ||||
|
167 | 'comment_type': 'todo', | |||
|
168 | 'comment_file': None, | |||
|
169 | 'comment_line': None, | |||
|
170 | 'comment_body': ''' | |||
|
171 | I think something like this would be better | |||
|
172 | ||||
|
173 | ```py | |||
|
174 | ||||
|
175 | def db(): | |||
|
176 | global connection | |||
|
177 | return connection | |||
|
178 | ||||
|
179 | ``` | |||
|
180 | ||||
|
181 | ''', | |||
|
182 | 'comment_id': 2048, | |||
|
183 | 'renderer_type': 'markdown', | |||
|
184 | 'mention': True, | |||
|
185 | ||||
|
186 | }, | |||
|
187 | 'pull_request_comment+file': { | |||
|
188 | 'user': user, | |||
|
189 | ||||
|
190 | 'status_change': None, | |||
|
191 | 'status_change_type': None, | |||
|
192 | ||||
|
193 | 'pull_request': pr, | |||
|
194 | 'pull_request_commits': [], | |||
|
195 | ||||
|
196 | 'pull_request_target_repo': target_repo, | |||
|
197 | 'pull_request_target_repo_url': 'http://target-repo/url', | |||
|
198 | ||||
|
199 | 'pull_request_source_repo': source_repo, | |||
|
200 | 'pull_request_source_repo_url': 'http://source-repo/url', | |||
|
201 | ||||
|
202 | 'pull_request_url': 'http://localhost/pr1', | |||
|
203 | ||||
|
204 | 'pr_comment_url': 'http://comment-url', | |||
|
205 | 'pr_comment_reply_url': 'http://comment-url#reply', | |||
|
206 | ||||
|
207 | 'comment_file': 'rhodecode/model/db.py', | |||
|
208 | 'comment_line': 'o1210', | |||
|
209 | 'comment_type': 'todo', | |||
|
210 | 'comment_body': ''' | |||
|
211 | I like this ! | |||
|
212 | ||||
|
213 | But please check this code:: | |||
|
214 | ||||
|
215 | def main(): | |||
|
216 | print 'ok' | |||
|
217 | ||||
|
218 | This should work better ! | |||
|
219 | ''', | |||
|
220 | 'comment_id': 2048, | |||
|
221 | 'renderer_type': 'rst', | |||
|
222 | 'mention': True, | |||
|
223 | ||||
|
224 | }, | |||
|
225 | ||||
|
226 | 'pull_request_update': { | |||
|
227 | 'updating_user': user, | |||
|
228 | ||||
|
229 | 'status_change': None, | |||
|
230 | 'status_change_type': None, | |||
|
231 | ||||
|
232 | 'pull_request': pr, | |||
|
233 | 'pull_request_commits': [], | |||
|
234 | ||||
|
235 | 'pull_request_target_repo': target_repo, | |||
|
236 | 'pull_request_target_repo_url': 'http://target-repo/url', | |||
|
237 | ||||
|
238 | 'pull_request_source_repo': source_repo, | |||
|
239 | 'pull_request_source_repo_url': 'http://source-repo/url', | |||
|
240 | ||||
|
241 | 'pull_request_url': 'http://localhost/pr1', | |||
|
242 | ||||
|
243 | # update comment links | |||
|
244 | 'pr_comment_url': 'http://comment-url', | |||
|
245 | 'pr_comment_reply_url': 'http://comment-url#reply', | |||
|
246 | 'ancestor_commit_id': 'f39bd443', | |||
|
247 | 'added_commits': commit_changes.added, | |||
|
248 | 'removed_commits': commit_changes.removed, | |||
|
249 | 'changed_files': (file_changes.added + file_changes.modified + file_changes.removed), | |||
|
250 | 'added_files': file_changes.added, | |||
|
251 | 'modified_files': file_changes.modified, | |||
|
252 | 'removed_files': file_changes.removed, | |||
|
253 | }, | |||
|
254 | ||||
|
255 | 'cs_comment': { | |||
|
256 | 'user': user, | |||
|
257 | 'commit': AttributeDict(idx=123, raw_id='a'*40, message='Commit message'), | |||
|
258 | 'status_change': None, | |||
|
259 | 'status_change_type': None, | |||
|
260 | ||||
|
261 | 'commit_target_repo_url': 'http://foo.example.com/#comment1', | |||
|
262 | 'repo_name': 'test-repo', | |||
|
263 | 'comment_type': 'note', | |||
|
264 | 'comment_file': None, | |||
|
265 | 'comment_line': None, | |||
|
266 | 'commit_comment_url': 'http://comment-url', | |||
|
267 | 'commit_comment_reply_url': 'http://comment-url#reply', | |||
|
268 | 'comment_body': 'This is my comment body. *I like !*', | |||
|
269 | 'comment_id': 2048, | |||
|
270 | 'renderer_type': 'markdown', | |||
|
271 | 'mention': True, | |||
|
272 | }, | |||
|
273 | 'cs_comment+status': { | |||
|
274 | 'user': user, | |||
|
275 | 'commit': AttributeDict(idx=123, raw_id='a' * 40, message='Commit message'), | |||
|
276 | 'status_change': 'approved', | |||
|
277 | 'status_change_type': 'approved', | |||
|
278 | ||||
|
279 | 'commit_target_repo_url': 'http://foo.example.com/#comment1', | |||
|
280 | 'repo_name': 'test-repo', | |||
|
281 | 'comment_type': 'note', | |||
|
282 | 'comment_file': None, | |||
|
283 | 'comment_line': None, | |||
|
284 | 'commit_comment_url': 'http://comment-url', | |||
|
285 | 'commit_comment_reply_url': 'http://comment-url#reply', | |||
|
286 | 'comment_body': ''' | |||
|
287 | Hello **world** | |||
|
288 | ||||
|
289 | This is a multiline comment :) | |||
|
290 | ||||
|
291 | - list | |||
|
292 | - list2 | |||
|
293 | ''', | |||
|
294 | 'comment_id': 2048, | |||
|
295 | 'renderer_type': 'markdown', | |||
|
296 | 'mention': True, | |||
|
297 | }, | |||
|
298 | 'cs_comment+file': { | |||
|
299 | 'user': user, | |||
|
300 | 'commit': AttributeDict(idx=123, raw_id='a' * 40, message='Commit message'), | |||
|
301 | 'status_change': None, | |||
|
302 | 'status_change_type': None, | |||
|
303 | ||||
|
304 | 'commit_target_repo_url': 'http://foo.example.com/#comment1', | |||
|
305 | 'repo_name': 'test-repo', | |||
|
306 | ||||
|
307 | 'comment_type': 'note', | |||
|
308 | 'comment_file': 'test-file.py', | |||
|
309 | 'comment_line': 'n100', | |||
|
310 | ||||
|
311 | 'commit_comment_url': 'http://comment-url', | |||
|
312 | 'commit_comment_reply_url': 'http://comment-url#reply', | |||
|
313 | 'comment_body': 'This is my comment body. *I like !*', | |||
|
314 | 'comment_id': 2048, | |||
|
315 | 'renderer_type': 'markdown', | |||
|
316 | 'mention': True, | |||
|
317 | }, | |||
|
318 | ||||
|
319 | 'pull_request': { | |||
|
320 | 'user': user, | |||
|
321 | 'pull_request': pr, | |||
|
322 | 'pull_request_commits': [ | |||
|
323 | ('472d1df03bf7206e278fcedc6ac92b46b01c4e21', '''\ | |||
|
324 | my-account: moved email closer to profile as it's similar data just moved outside. | |||
|
325 | '''), | |||
|
326 | ('cbfa3061b6de2696c7161ed15ba5c6a0045f90a7', '''\ | |||
|
327 | users: description edit fixes | |||
|
328 | ||||
|
329 | - tests | |||
|
330 | - added metatags info | |||
|
331 | '''), | |||
|
332 | ], | |||
|
333 | ||||
|
334 | 'pull_request_target_repo': target_repo, | |||
|
335 | 'pull_request_target_repo_url': 'http://target-repo/url', | |||
|
336 | ||||
|
337 | 'pull_request_source_repo': source_repo, | |||
|
338 | 'pull_request_source_repo_url': 'http://source-repo/url', | |||
|
339 | ||||
|
340 | 'pull_request_url': 'http://code.rhodecode.com/_pull-request/123', | |||
|
341 | } | |||
|
342 | ||||
|
343 | } | |||
|
344 | ||||
|
345 | template_type = email_id.split('+')[0] | |||
|
346 | (c.subject, c.headers, c.email_body, | |||
|
347 | c.email_body_plaintext) = EmailNotificationModel().render_email( | |||
|
348 | template_type, **email_kwargs.get(email_id, {})) | |||
|
349 | ||||
|
350 | test_email = self.request.GET.get('email') | |||
|
351 | if test_email: | |||
|
352 | recipients = [test_email] | |||
|
353 | run_task(tasks.send_email, recipients, c.subject, | |||
|
354 | c.email_body_plaintext, c.email_body) | |||
|
355 | ||||
|
356 | if self.request.matched_route.name == 'debug_style_email_plain_rendered': | |||
|
357 | template = 'debug_style/email_plain_rendered.mako' | |||
|
358 | else: | |||
|
359 | template = 'debug_style/email.mako' | |||
|
360 | return render_to_response( | |||
|
361 | template, self._get_template_context(c), | |||
|
362 | request=self.request) | |||
|
363 | ||||
|
364 | @view_config( | |||
49 | route_name='debug_style_template', request_method='GET', |
|
365 | route_name='debug_style_template', request_method='GET', | |
50 | renderer=None) |
|
366 | renderer=None) | |
51 | def template(self): |
|
367 | def template(self): | |
@@ -53,7 +369,18 b' class DebugStyleView(BaseAppView):'
         c = self.load_default_context()
         c.active = os.path.splitext(t_path)[0]
         c.came_from = ''
+        c.email_types = {
+            'cs_comment+file': {},
+            'cs_comment+status': {},
+
+            'pull_request_comment+file': {},
+            'pull_request_comment+status': {},
+
+            'pull_request_update': {},
+        }
+        c.email_types.update(EmailNotificationModel.email_types)

         return render_to_response(
             'debug_style/' + t_path, self._get_template_context(c),
-            request=self.request)
\ No newline at end of file
+            request=self.request)
+
@@ -44,6 +44,9 b' def includeme(config):'
     config.add_route(
         name='download_file',
         pattern='/_file_store/download/{fid}')
+    config.add_route(
+        name='download_file_by_token',
+        pattern='/_file_store/token-download/{_auth_token}/{fid}')

     # Scan module for configuration decorators.
     config.scan('.views', ignore='.tests')
@@ -26,7 +26,8 b' import hashlib'
 from rhodecode.lib.ext_json import json
 from rhodecode.apps.file_store import utils
 from rhodecode.apps.file_store.extensions import resolve_extensions
-from rhodecode.apps.file_store.exceptions import
+from rhodecode.apps.file_store.exceptions import (
+    FileNotAllowedException, FileOverSizeException)

 METADATA_VER = 'v1'

@@ -91,6 +92,9 b' class LocalFileStorage(object):'
         self.base_path = base_path
         self.extensions = resolve_extensions([], groups=extension_groups)

+    def __repr__(self):
+        return '{}@{}'.format(self.__class__, self.base_path)
+
     def store_path(self, filename):
         """
         Returns absolute file path of the filename, joined to the
@@ -140,16 +144,21 b' class LocalFileStorage(object):'
         :param ext: extension to check
         :param extensions: iterable of extensions to validate against (or self.extensions)
         """
+        def normalize_ext(_ext):
+            if _ext.startswith('.'):
+                _ext = _ext[1:]
+            return _ext.lower()

         extensions = extensions or self.extensions
         if not extensions:
             return True
-        if ext.startswith('.'):
-            ext = ext[1:]
-        return ext.lower() in extensions
+
+        ext = normalize_ext(ext)
+
+        return ext in [normalize_ext(x) for x in extensions]

     def save_file(self, file_obj, filename, directory=None, extensions=None,
-                  extra_metadata=None, **kwargs):
+                  extra_metadata=None, max_filesize=None, **kwargs):
         """
         Saves a file object to the uploads location.
         Returns the resolved filename, i.e. the directory +
@@ -159,7 +168,9 b' class LocalFileStorage(object):'
         :param filename: original filename
         :param directory: relative path of sub-directory
         :param extensions: iterable of allowed extensions, if not default
+        :param max_filesize: maximum size of file that should be allowed
         :param extra_metadata: extra JSON metadata to store next to the file with .meta suffix
+
         """

         extensions = extensions or self.extensions
@@ -191,6 +202,12 b' class LocalFileStorage(object):'
             metadata = extra_metadata

         size = os.stat(path).st_size
+
+        if max_filesize and size > max_filesize:
+            # free up the copied file, and raise exc
+            os.remove(path)
+            raise FileOverSizeException()
+
         file_hash = self.calculate_path_hash(path)

         metadata.update(
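Combined with ``get_file_storage()`` (used the same way in the tests that follow), the new
``max_filesize`` guard can be exercised roughly like this; the store path and file names are
illustrative only::

    from rhodecode.apps.file_store import utils, config_keys
    from rhodecode.apps.file_store.exceptions import FileOverSizeException

    storage = utils.get_file_storage({config_keys.store_path: '/var/opt/rc-artifacts'})

    with open('build-output.tar.gz', 'rb') as fh:
        try:
            store_uid, metadata = storage.save_file(
                fh, 'build-output.tar.gz',
                extra_metadata={'filename': 'build-output.tar.gz'},
                max_filesize=1024 ** 3)  # anything over 1GB is rejected
        except FileOverSizeException:
            # save_file has already removed the oversized copy from the store
            pass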
@@ -21,7 +21,8 b' import os'
 import pytest

 from rhodecode.lib.ext_json import json
-from rhodecode.model.
+from rhodecode.model.auth_token import AuthTokenModel
+from rhodecode.model.db import Session, FileStore, Repository, User
 from rhodecode.tests import TestController
 from rhodecode.apps.file_store import utils, config_keys

@@ -32,6 +33,7 b' def route_path(name, params=None, **kwar'
     base_url = {
         'upload_file': '/_file_store/upload',
         'download_file': '/_file_store/download/{fid}',
+        'download_file_by_token': '/_file_store/token-download/{_auth_token}/{fid}'

     }[name].format(**kwargs)

@@ -124,3 +126,136 b' class TestFileStoreViews(TestController)' | |||||
124 | status=200) |
|
126 | status=200) | |
125 |
|
127 | |||
126 | assert response.json['store_fid'] |
|
128 | assert response.json['store_fid'] | |
|
129 | ||||
|
130 | @pytest.fixture() | |||
|
131 | def create_artifact_factory(self, tmpdir): | |||
|
132 | def factory(user_id, content): | |||
|
133 | store_path = self.app._pyramid_settings[config_keys.store_path] | |||
|
134 | store = utils.get_file_storage({config_keys.store_path: store_path}) | |||
|
135 | fid = 'example.txt' | |||
|
136 | ||||
|
137 | filesystem_file = os.path.join(str(tmpdir), fid) | |||
|
138 | with open(filesystem_file, 'wb') as f: | |||
|
139 | f.write(content) | |||
|
140 | ||||
|
141 | with open(filesystem_file, 'rb') as f: | |||
|
142 | store_uid, metadata = store.save_file(f, fid, extra_metadata={'filename': fid}) | |||
|
143 | ||||
|
144 | entry = FileStore.create( | |||
|
145 | file_uid=store_uid, filename=metadata["filename"], | |||
|
146 | file_hash=metadata["sha256"], file_size=metadata["size"], | |||
|
147 | file_display_name='file_display_name', | |||
|
148 | file_description='repo artifact `{}`'.format(metadata["filename"]), | |||
|
149 | check_acl=True, user_id=user_id, | |||
|
150 | ) | |||
|
151 | Session().add(entry) | |||
|
152 | Session().commit() | |||
|
153 | return entry | |||
|
154 | return factory | |||
|
155 | ||||
|
156 | def test_download_file_non_scoped(self, user_util, create_artifact_factory): | |||
|
157 | user = self.log_user() | |||
|
158 | user_id = user['user_id'] | |||
|
159 | content = 'HELLO MY NAME IS ARTIFACT !' | |||
|
160 | ||||
|
161 | artifact = create_artifact_factory(user_id, content) | |||
|
162 | file_uid = artifact.file_uid | |||
|
163 | response = self.app.get(route_path('download_file', fid=file_uid), status=200) | |||
|
164 | assert response.text == content | |||
|
165 | ||||
|
166 | # log-in to new user and test download again | |||
|
167 | user = user_util.create_user(password='qweqwe') | |||
|
168 | self.log_user(user.username, 'qweqwe') | |||
|
169 | response = self.app.get(route_path('download_file', fid=file_uid), status=200) | |||
|
170 | assert response.text == content | |||
|
171 | ||||
|
172 | def test_download_file_scoped_to_repo(self, user_util, create_artifact_factory): | |||
|
173 | user = self.log_user() | |||
|
174 | user_id = user['user_id'] | |||
|
175 | content = 'HELLO MY NAME IS ARTIFACT !' | |||
|
176 | ||||
|
177 | artifact = create_artifact_factory(user_id, content) | |||
|
178 | # bind to repo | |||
|
179 | repo = user_util.create_repo() | |||
|
180 | repo_id = repo.repo_id | |||
|
181 | artifact.scope_repo_id = repo_id | |||
|
182 | Session().add(artifact) | |||
|
183 | Session().commit() | |||
|
184 | ||||
|
185 | file_uid = artifact.file_uid | |||
|
186 | response = self.app.get(route_path('download_file', fid=file_uid), status=200) | |||
|
187 | assert response.text == content | |||
|
188 | ||||
|
189 | # log-in to new user and test download again | |||
|
190 | user = user_util.create_user(password='qweqwe') | |||
|
191 | self.log_user(user.username, 'qweqwe') | |||
|
192 | response = self.app.get(route_path('download_file', fid=file_uid), status=200) | |||
|
193 | assert response.text == content | |||
|
194 | ||||
|
195 | # forbid user the rights to repo | |||
|
196 | repo = Repository.get(repo_id) | |||
|
197 | user_util.grant_user_permission_to_repo(repo, user, 'repository.none') | |||
|
198 | self.app.get(route_path('download_file', fid=file_uid), status=404) | |||
|
199 | ||||
|
200 | def test_download_file_scoped_to_user(self, user_util, create_artifact_factory): | |||
|
201 | user = self.log_user() | |||
|
202 | user_id = user['user_id'] | |||
|
203 | content = 'HELLO MY NAME IS ARTIFACT !' | |||
|
204 | ||||
|
205 | artifact = create_artifact_factory(user_id, content) | |||
|
206 | # bind to user | |||
|
207 | user = user_util.create_user(password='qweqwe') | |||
|
208 | ||||
|
209 | artifact.scope_user_id = user.user_id | |||
|
210 | Session().add(artifact) | |||
|
211 | Session().commit() | |||
|
212 | ||||
|
213 | # artifact creator doesn't have access since it's bind to another user | |||
|
214 | file_uid = artifact.file_uid | |||
|
215 | self.app.get(route_path('download_file', fid=file_uid), status=404) | |||
|
216 | ||||
|
217 | # log-in to new user and test download again, should be ok since we're bind to this artifact | |||
|
218 | self.log_user(user.username, 'qweqwe') | |||
|
219 | response = self.app.get(route_path('download_file', fid=file_uid), status=200) | |||
|
220 | assert response.text == content | |||
|
221 | ||||
|
222 | def test_download_file_scoped_to_repo_with_bad_token(self, user_util, create_artifact_factory): | |||
|
223 | user_id = User.get_first_super_admin().user_id | |||
|
224 | content = 'HELLO MY NAME IS ARTIFACT !' | |||
|
225 | ||||
|
226 | artifact = create_artifact_factory(user_id, content) | |||
|
227 | # bind to repo | |||
|
228 | repo = user_util.create_repo() | |||
|
229 | repo_id = repo.repo_id | |||
|
230 | artifact.scope_repo_id = repo_id | |||
|
231 | Session().add(artifact) | |||
|
232 | Session().commit() | |||
|
233 | ||||
|
234 | file_uid = artifact.file_uid | |||
|
235 | self.app.get(route_path('download_file_by_token', | |||
|
236 | _auth_token='bogus', fid=file_uid), status=302) | |||
|
237 | ||||
|
238 | def test_download_file_scoped_to_repo_with_token(self, user_util, create_artifact_factory): | |||
|
239 | user = User.get_first_super_admin() | |||
|
240 | AuthTokenModel().create(user, 'test artifact token', | |||
|
241 | role=AuthTokenModel.cls.ROLE_ARTIFACT_DOWNLOAD) | |||
|
242 | ||||
|
243 | user = User.get_first_super_admin() | |||
|
244 | artifact_token = user.artifact_token | |||
|
245 | ||||
|
246 | user_id = User.get_first_super_admin().user_id | |||
|
247 | content = 'HELLO MY NAME IS ARTIFACT !' | |||
|
248 | ||||
|
249 | artifact = create_artifact_factory(user_id, content) | |||
|
250 | # bind to repo | |||
|
251 | repo = user_util.create_repo() | |||
|
252 | repo_id = repo.repo_id | |||
|
253 | artifact.scope_repo_id = repo_id | |||
|
254 | Session().add(artifact) | |||
|
255 | Session().commit() | |||
|
256 | ||||
|
257 | file_uid = artifact.file_uid | |||
|
258 | response = self.app.get( | |||
|
259 | route_path('download_file_by_token', | |||
|
260 | _auth_token=artifact_token, fid=file_uid), status=200) | |||
|
261 | assert response.text == content |
@@ -25,7 +25,7 b' import pathlib2'


 def get_file_storage(settings):
-    from rhodecode.apps.file_store.local_store import LocalFileStorage
+    from rhodecode.apps.file_store.backends.local_store import LocalFileStorage
     from rhodecode.apps.file_store import config_keys
     store_path = settings.get(config_keys.store_path)
     return LocalFileStorage(base_path=store_path)
@@ -30,8 +30,10 b' from rhodecode.apps.file_store.exception'

 from rhodecode.lib import helpers as h
 from rhodecode.lib import audit_logger
-from rhodecode.lib.auth import (CSRFRequired, NotAnonymous, HasRepoPermissionAny, HasRepoGroupPermissionAny)
-from rhodecode.model.db import Session, FileStore
+from rhodecode.lib.auth import (
+    CSRFRequired, NotAnonymous, HasRepoPermissionAny, HasRepoGroupPermissionAny,
+    LoginRequired)
+from rhodecode.model.db import Session, FileStore, UserApiKeys

 log = logging.getLogger(__name__)

@@ -44,6 +46,55 b' class FileStoreView(BaseAppView):' | |||||
44 | self.storage = utils.get_file_storage(self.request.registry.settings) |
|
46 | self.storage = utils.get_file_storage(self.request.registry.settings) | |
45 | return c |
|
47 | return c | |
46 |
|
48 | |||
|
49 | def _serve_file(self, file_uid): | |||
|
50 | ||||
|
51 | if not self.storage.exists(file_uid): | |||
|
52 | store_path = self.storage.store_path(file_uid) | |||
|
53 | log.debug('File with FID:%s not found in the store under `%s`', | |||
|
54 | file_uid, store_path) | |||
|
55 | raise HTTPNotFound() | |||
|
56 | ||||
|
57 | db_obj = FileStore().query().filter(FileStore.file_uid == file_uid).scalar() | |||
|
58 | if not db_obj: | |||
|
59 | raise HTTPNotFound() | |||
|
60 | ||||
|
61 | # private upload for user | |||
|
62 | if db_obj.check_acl and db_obj.scope_user_id: | |||
|
63 | log.debug('Artifact: checking scope access for bound artifact user: `%s`', | |||
|
64 | db_obj.scope_user_id) | |||
|
65 | user = db_obj.user | |||
|
66 | if self._rhodecode_db_user.user_id != user.user_id: | |||
|
67 | log.warning('Access to file store object forbidden') | |||
|
68 | raise HTTPNotFound() | |||
|
69 | ||||
|
70 | # scoped to repository permissions | |||
|
71 | if db_obj.check_acl and db_obj.scope_repo_id: | |||
|
72 | log.debug('Artifact: checking scope access for bound artifact repo: `%s`', | |||
|
73 | db_obj.scope_repo_id) | |||
|
74 | repo = db_obj.repo | |||
|
75 | perm_set = ['repository.read', 'repository.write', 'repository.admin'] | |||
|
76 | has_perm = HasRepoPermissionAny(*perm_set)(repo.repo_name, 'FileStore check') | |||
|
77 | if not has_perm: | |||
|
78 | log.warning('Access to file store object `%s` forbidden', file_uid) | |||
|
79 | raise HTTPNotFound() | |||
|
80 | ||||
|
81 | # scoped to repository group permissions | |||
|
82 | if db_obj.check_acl and db_obj.scope_repo_group_id: | |||
|
83 | log.debug('Artifact: checking scope access for bound artifact repo group: `%s`', | |||
|
84 | db_obj.scope_repo_group_id) | |||
|
85 | repo_group = db_obj.repo_group | |||
|
86 | perm_set = ['group.read', 'group.write', 'group.admin'] | |||
|
87 | has_perm = HasRepoGroupPermissionAny(*perm_set)(repo_group.group_name, 'FileStore check') | |||
|
88 | if not has_perm: | |||
|
89 | log.warning('Access to file store object `%s` forbidden', file_uid) | |||
|
90 | raise HTTPNotFound() | |||
|
91 | ||||
|
92 | FileStore.bump_access_counter(file_uid) | |||
|
93 | ||||
|
94 | file_path = self.storage.store_path(file_uid) | |||
|
95 | return FileResponse(file_path) | |||
|
96 | ||||
|
97 | @LoginRequired() | |||
47 | @NotAnonymous() |
|
98 | @NotAnonymous() | |
48 | @CSRFRequired() |
|
99 | @CSRFRequired() | |
49 | @view_config(route_name='upload_file', request_method='POST', renderer='json_ext') |
|
100 | @view_config(route_name='upload_file', request_method='POST', renderer='json_ext') | |
@@ -84,7 +135,7 b' class FileStoreView(BaseAppView):'
         entry = FileStore.create(
             file_uid=store_uid, filename=metadata["filename"],
             file_hash=metadata["sha256"], file_size=metadata["size"],
-            file_description='upload attachment',
+            file_description=u'upload attachment',
             check_acl=False, user_id=self._rhodecode_user.user_id
         )
         Session().add(entry)
@@ -99,46 +150,25 b' class FileStoreView(BaseAppView):' | |||||
99 | return {'store_fid': store_uid, |
|
150 | return {'store_fid': store_uid, | |
100 | 'access_path': h.route_path('download_file', fid=store_uid)} |
|
151 | 'access_path': h.route_path('download_file', fid=store_uid)} | |
101 |
|
152 | |||
|
153 | # ACL is checked by scopes, if no scope the file is accessible to all | |||
102 | @view_config(route_name='download_file') |
|
154 | @view_config(route_name='download_file') | |
103 | def download_file(self): |
|
155 | def download_file(self): | |
104 | self.load_default_context() |
|
156 | self.load_default_context() | |
105 | file_uid = self.request.matchdict['fid'] |
|
157 | file_uid = self.request.matchdict['fid'] | |
106 | log.debug('Requesting FID:%s from store %s', file_uid, self.storage) |
|
158 | log.debug('Requesting FID:%s from store %s', file_uid, self.storage) | |
107 |
|
159 | return self._serve_file(file_uid) | ||
108 | if not self.storage.exists(file_uid): |
|
|||
109 | log.debug('File with FID:%s not found in the store', file_uid) |
|
|||
110 | raise HTTPNotFound() |
|
|||
111 |
|
||||
112 | db_obj = FileStore().query().filter(FileStore.file_uid == file_uid).scalar() |
|
|||
113 | if not db_obj: |
|
|||
114 | raise HTTPNotFound() |
|
|||
115 |
|
||||
116 | # private upload for user |
|
|||
117 | if db_obj.check_acl and db_obj.scope_user_id: |
|
|||
118 | user = db_obj.user |
|
|||
119 | if self._rhodecode_db_user.user_id != user.user_id: |
|
|||
120 | log.warning('Access to file store object forbidden') |
|
|||
121 | raise HTTPNotFound() |
|
|||
122 |
|
160 | |||
123 | # scoped to repository permissions |
|
161 | # in addition to @LoginRequired ACL is checked by scopes | |
124 | if db_obj.check_acl and db_obj.scope_repo_id: |
|
162 | @LoginRequired(auth_token_access=[UserApiKeys.ROLE_ARTIFACT_DOWNLOAD]) | |
125 | repo = db_obj.repo |
|
163 | @NotAnonymous() | |
126 | perm_set = ['repository.read', 'repository.write', 'repository.admin'] |
|
164 | @view_config(route_name='download_file_by_token') | |
127 | has_perm = HasRepoPermissionAny(*perm_set)(repo.repo_name, 'FileStore check') |
|
165 | def download_file_by_token(self): | |
128 | if not has_perm: |
|
166 | """ | |
129 | log.warning('Access to file store object forbidden') |
|
167 | Special view that allows to access the download file by special URL that | |
130 | raise HTTPNotFound() |
|
168 | is stored inside the URL. | |
131 |
|
|
169 | ||
132 | # scoped to repository group permissions |
|
170 | http://example.com/_file_store/token-download/TOKEN/FILE_UID | |
133 | if db_obj.check_acl and db_obj.scope_repo_group_id: |
|
171 | """ | |
134 | repo_group = db_obj.repo_group |
|
172 | self.load_default_context() | |
135 | perm_set = ['group.read', 'group.write', 'group.admin'] |
|
173 | file_uid = self.request.matchdict['fid'] | |
136 | has_perm = HasRepoGroupPermissionAny(*perm_set)(repo_group.group_name, 'FileStore check') |
|
174 | return self._serve_file(file_uid) | |
137 | if not has_perm: |
|
|||
138 | log.warning('Access to file store object forbidden') |
|
|||
139 | raise HTTPNotFound() |
|
|||
140 |
|
||||
141 | FileStore.bump_access_counter(file_uid) |
|
|||
142 |
|
||||
143 | file_path = self.storage.store_path(file_uid) |
|
|||
144 | return FileResponse(file_path) |
|
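
The hunk above folds the per-scope ACL checks into a shared `_serve_file` helper and adds a `download_file_by_token` view gated by an auth token with the artifact-download role. A minimal client-side sketch of fetching such a URL (host, token and file UID are placeholders, not values from this changeset; the path layout is the one quoted in the view docstring):

    import urllib.request

    base_url = 'https://code.example.com'        # assumed RhodeCode instance
    auth_token = 'ARTIFACT_DOWNLOAD_TOKEN'       # token with the artifact-download role
    file_uid = 'FILE_UID'                        # uid returned when the artifact was stored

    # URL layout from the docstring: /_file_store/token-download/TOKEN/FILE_UID
    url = '{}/_file_store/token-download/{}/{}'.format(base_url, auth_token, file_uid)
    with urllib.request.urlopen(url) as response:
        payload = response.read()
    print('downloaded {} bytes'.format(len(payload)))
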
@@ -83,7 +83,7 b' class GistUtility(object):' | |||||
83 | Session().commit() |
|
83 | Session().commit() | |
84 |
|
84 | |||
85 |
|
85 | |||
86 | @pytest.fixture |
|
86 | @pytest.fixture() | |
87 | def create_gist(request): |
|
87 | def create_gist(request): | |
88 | gist_utility = GistUtility() |
|
88 | gist_utility = GistUtility() | |
89 | request.addfinalizer(gist_utility.cleanup) |
|
89 | request.addfinalizer(gist_utility.cleanup) | |
@@ -159,7 +159,7 b' class TestGistsController(TestController' | |||||
159 | params={'lifetime': -1, |
|
159 | params={'lifetime': -1, | |
160 | 'content': 'gist test', |
|
160 | 'content': 'gist test', | |
161 | 'filename': 'foo', |
|
161 | 'filename': 'foo', | |
162 | '
|
162 | 'gist_type': 'public', | |
163 | 'gist_acl_level': Gist.ACL_LEVEL_PUBLIC, |
|
163 | 'gist_acl_level': Gist.ACL_LEVEL_PUBLIC, | |
164 | 'csrf_token': self.csrf_token}, |
|
164 | 'csrf_token': self.csrf_token}, | |
165 | status=302) |
|
165 | status=302) | |
@@ -174,7 +174,7 b' class TestGistsController(TestController' | |||||
174 | params={'lifetime': -1, |
|
174 | params={'lifetime': -1, | |
175 | 'content': 'gist test', |
|
175 | 'content': 'gist test', | |
176 | 'filename': '/home/foo', |
|
176 | 'filename': '/home/foo', | |
177 | '
|
177 | 'gist_type': 'public', | |
178 | 'gist_acl_level': Gist.ACL_LEVEL_PUBLIC, |
|
178 | 'gist_acl_level': Gist.ACL_LEVEL_PUBLIC, | |
179 | 'csrf_token': self.csrf_token}, |
|
179 | 'csrf_token': self.csrf_token}, | |
180 | status=200) |
|
180 | status=200) | |
@@ -197,7 +197,7 b' class TestGistsController(TestController' | |||||
197 | params={'lifetime': -1, |
|
197 | params={'lifetime': -1, | |
198 | 'content': 'private gist test', |
|
198 | 'content': 'private gist test', | |
199 | 'filename': 'private-foo', |
|
199 | 'filename': 'private-foo', | |
200 | '
|
200 | 'gist_type': 'private', | |
201 | 'gist_acl_level': Gist.ACL_LEVEL_PUBLIC, |
|
201 | 'gist_acl_level': Gist.ACL_LEVEL_PUBLIC, | |
202 | 'csrf_token': self.csrf_token}, |
|
202 | 'csrf_token': self.csrf_token}, | |
203 | status=302) |
|
203 | status=302) | |
@@ -216,7 +216,7 b' class TestGistsController(TestController' | |||||
216 | params={'lifetime': -1, |
|
216 | params={'lifetime': -1, | |
217 | 'content': 'private gist test', |
|
217 | 'content': 'private gist test', | |
218 | 'filename': 'private-foo', |
|
218 | 'filename': 'private-foo', | |
219 | '
|
219 | 'gist_type': 'private', | |
220 | 'gist_acl_level': Gist.ACL_LEVEL_PRIVATE, |
|
220 | 'gist_acl_level': Gist.ACL_LEVEL_PRIVATE, | |
221 | 'csrf_token': self.csrf_token}, |
|
221 | 'csrf_token': self.csrf_token}, | |
222 | status=302) |
|
222 | status=302) | |
@@ -236,7 +236,7 b' class TestGistsController(TestController' | |||||
236 | 'content': 'gist test', |
|
236 | 'content': 'gist test', | |
237 | 'filename': 'foo-desc', |
|
237 | 'filename': 'foo-desc', | |
238 | 'description': 'gist-desc', |
|
238 | 'description': 'gist-desc', | |
239 | '
|
239 | 'gist_type': 'public', | |
240 | 'gist_acl_level': Gist.ACL_LEVEL_PUBLIC, |
|
240 | 'gist_acl_level': Gist.ACL_LEVEL_PUBLIC, | |
241 | 'csrf_token': self.csrf_token}, |
|
241 | 'csrf_token': self.csrf_token}, | |
242 | status=302) |
|
242 | status=302) | |
@@ -252,7 +252,7 b' class TestGistsController(TestController' | |||||
252 | 'content': 'gist test', |
|
252 | 'content': 'gist test', | |
253 | 'filename': 'foo-desc', |
|
253 | 'filename': 'foo-desc', | |
254 | 'description': 'gist-desc', |
|
254 | 'description': 'gist-desc', | |
255 | '
|
255 | 'gist_type': 'public', | |
256 | 'gist_acl_level': Gist.ACL_LEVEL_PUBLIC, |
|
256 | 'gist_acl_level': Gist.ACL_LEVEL_PUBLIC, | |
257 | 'csrf_token': self.csrf_token |
|
257 | 'csrf_token': self.csrf_token | |
258 | } |
|
258 | } |
@@ -72,7 +72,7 b' class GistView(BaseAppView):' | |||||
72 | @LoginRequired() |
|
72 | @LoginRequired() | |
73 | @view_config( |
|
73 | @view_config( | |
74 | route_name='gists_show', request_method='GET', |
|
74 | route_name='gists_show', request_method='GET', | |
75 | renderer='rhodecode:templates/admin/gists/index.mako') |
|
75 | renderer='rhodecode:templates/admin/gists/gist_index.mako') | |
76 | def gist_show_all(self): |
|
76 | def gist_show_all(self): | |
77 | c = self.load_default_context() |
|
77 | c = self.load_default_context() | |
78 |
|
78 | |||
@@ -136,7 +136,7 b' class GistView(BaseAppView):' | |||||
136 | @NotAnonymous() |
|
136 | @NotAnonymous() | |
137 | @view_config( |
|
137 | @view_config( | |
138 | route_name='gists_new', request_method='GET', |
|
138 | route_name='gists_new', request_method='GET', | |
139 | renderer='rhodecode:templates/admin/gists/new.mako') |
|
139 | renderer='rhodecode:templates/admin/gists/gist_new.mako') | |
140 | def gist_new(self): |
|
140 | def gist_new(self): | |
141 | c = self.load_default_context() |
|
141 | c = self.load_default_context() | |
142 | return self._get_template_context(c) |
|
142 | return self._get_template_context(c) | |
@@ -146,21 +146,26 b' class GistView(BaseAppView):' | |||||
146 | @CSRFRequired() |
|
146 | @CSRFRequired() | |
147 | @view_config( |
|
147 | @view_config( | |
148 | route_name='gists_create', request_method='POST', |
|
148 | route_name='gists_create', request_method='POST', | |
149 | renderer='rhodecode:templates/admin/gists/new.mako') |
|
149 | renderer='rhodecode:templates/admin/gists/gist_new.mako') | |
150 | def gist_create(self): |
|
150 | def gist_create(self): | |
151 | _ = self.request.translate |
|
151 | _ = self.request.translate | |
152 | c = self.load_default_context() |
|
152 | c = self.load_default_context() | |
153 |
|
153 | |||
154 | data = dict(self.request.POST) |
|
154 | data = dict(self.request.POST) | |
155 | data['filename'] = data.get('filename') or Gist.DEFAULT_FILENAME |
|
155 | data['filename'] = data.get('filename') or Gist.DEFAULT_FILENAME | |
|
156 | ||||
156 | data['nodes'] = [{ |
|
157 | data['nodes'] = [{ | |
157 | 'filename': data['filename'], |
|
158 | 'filename': data['filename'], | |
158 | 'content': data.get('content'), |
|
159 | 'content': data.get('content'), | |
159 | 'mimetype': data.get('mimetype') # None is autodetect |
|
160 | 'mimetype': data.get('mimetype') # None is autodetect | |
160 | }] |
|
161 | }] | |
|
162 | gist_type = { | |||
|
163 | 'public': Gist.GIST_PUBLIC, | |||
|
164 | 'private': Gist.GIST_PRIVATE | |||
|
165 | }.get(data.get('gist_type')) or Gist.GIST_PRIVATE | |||
161 |
|
166 | |||
162 | data['gist_type'] = (
|
167 | data['gist_type'] = gist_type | |
163 | Gist.GIST_PUBLIC if data.get('public') else Gist.GIST_PRIVATE) |
|
168 | ||
164 | data['gist_acl_level'] = ( |
|
169 | data['gist_acl_level'] = ( | |
165 | data.get('gist_acl_level') or Gist.ACL_LEVEL_PRIVATE) |
|
170 | data.get('gist_acl_level') or Gist.ACL_LEVEL_PRIVATE) | |
166 |
|
171 | |||
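
The new mapping above converts the submitted `gist_type` form value into the internal `Gist.GIST_PUBLIC` / `Gist.GIST_PRIVATE` constants and falls back to private for anything unexpected. The same pattern in isolation, with stand-in constants:

    # stand-in constants for Gist.GIST_PUBLIC / Gist.GIST_PRIVATE
    GIST_PUBLIC, GIST_PRIVATE = 'public', 'private'

    def resolve_gist_type(form_value):
        # unknown, missing or empty values fall back to the safer private type
        return {'public': GIST_PUBLIC, 'private': GIST_PRIVATE}.get(form_value) or GIST_PRIVATE

    assert resolve_gist_type('public') == GIST_PUBLIC
    assert resolve_gist_type('private') == GIST_PRIVATE
    assert resolve_gist_type(None) == GIST_PRIVATE
    assert resolve_gist_type('bogus') == GIST_PRIVATE
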
@@ -196,7 +201,7 b' class GistView(BaseAppView):' | |||||
196 | errors['filename'] = errors['nodes.0.filename'] |
|
201 | errors['filename'] = errors['nodes.0.filename'] | |
197 | del errors['nodes.0.filename'] |
|
202 | del errors['nodes.0.filename'] | |
198 |
|
203 | |||
199 | data = render('rhodecode:templates/admin/gists/new.mako', |
|
204 | data = render('rhodecode:templates/admin/gists/gist_new.mako', | |
200 | self._get_template_context(c), self.request) |
|
205 | self._get_template_context(c), self.request) | |
201 | html = formencode.htmlfill.render( |
|
206 | html = formencode.htmlfill.render( | |
202 | data, |
|
207 | data, | |
@@ -260,10 +265,10 b' class GistView(BaseAppView):' | |||||
260 | @LoginRequired() |
|
265 | @LoginRequired() | |
261 | @view_config( |
|
266 | @view_config( | |
262 | route_name='gist_show', request_method='GET', |
|
267 | route_name='gist_show', request_method='GET', | |
263 | renderer='rhodecode:templates/admin/gists/show.mako') |
|
268 | renderer='rhodecode:templates/admin/gists/gist_show.mako') | |
264 | @view_config( |
|
269 | @view_config( | |
265 | route_name='gist_show_rev', request_method='GET', |
|
270 | route_name='gist_show_rev', request_method='GET', | |
266 | renderer='rhodecode:templates/admin/gists/show.mako') |
|
271 | renderer='rhodecode:templates/admin/gists/gist_show.mako') | |
267 | @view_config( |
|
272 | @view_config( | |
268 | route_name='gist_show_formatted', request_method='GET', |
|
273 | route_name='gist_show_formatted', request_method='GET', | |
269 | renderer=None) |
|
274 | renderer=None) | |
@@ -304,7 +309,7 b' class GistView(BaseAppView):' | |||||
304 | @NotAnonymous() |
|
309 | @NotAnonymous() | |
305 | @view_config( |
|
310 | @view_config( | |
306 | route_name='gist_edit', request_method='GET', |
|
311 | route_name='gist_edit', request_method='GET', | |
307 | renderer='rhodecode:templates/admin/gists/edit.mako') |
|
312 | renderer='rhodecode:templates/admin/gists/gist_edit.mako') | |
308 | def gist_edit(self): |
|
313 | def gist_edit(self): | |
309 | _ = self.request.translate |
|
314 | _ = self.request.translate | |
310 | gist_id = self.request.matchdict['gist_id'] |
|
315 | gist_id = self.request.matchdict['gist_id'] | |
@@ -338,7 +343,7 b' class GistView(BaseAppView):' | |||||
338 | @CSRFRequired() |
|
343 | @CSRFRequired() | |
339 | @view_config( |
|
344 | @view_config( | |
340 | route_name='gist_update', request_method='POST', |
|
345 | route_name='gist_update', request_method='POST', | |
341 | renderer='rhodecode:templates/admin/gists/edit.mako') |
|
346 | renderer='rhodecode:templates/admin/gists/gist_edit.mako') | |
342 | def gist_update(self): |
|
347 | def gist_update(self): | |
343 | _ = self.request.translate |
|
348 | _ = self.request.translate | |
344 | gist_id = self.request.matchdict['gist_id'] |
|
349 | gist_id = self.request.matchdict['gist_id'] |
@@ -44,6 +44,14 b' def includeme(config):' | |||||
44 | pattern='/') |
|
44 | pattern='/') | |
45 |
|
45 | |||
46 | config.add_route( |
|
46 | config.add_route( | |
|
47 | name='main_page_repos_data', | |||
|
48 | pattern='/_home_repos') | |||
|
49 | ||||
|
50 | config.add_route( | |||
|
51 | name='main_page_repo_groups_data', | |||
|
52 | pattern='/_home_repo_groups') | |||
|
53 | ||||
|
54 | config.add_route( | |||
47 | name='user_autocomplete_data', |
|
55 | name='user_autocomplete_data', | |
48 | pattern='/_users') |
|
56 | pattern='/_users') | |
49 |
|
57 |
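
The two routes registered above back the new main-page repository and repository-group grids, which are loaded over XHR. A minimal Pyramid sketch of the same route/view pairing (assumes only that Pyramid is installed; this is not RhodeCode's actual wiring):

    from wsgiref.simple_server import make_server
    from pyramid.config import Configurator

    def repos_data(request):
        # in RhodeCode this returns the DataTables payload for the repositories grid
        return {'data': [], 'recordsTotal': 0, 'recordsFiltered': 0}

    if __name__ == '__main__':
        with Configurator() as config:
            # same route name/pattern as registered in the hunk above
            config.add_route('main_page_repos_data', '/_home_repos')
            config.add_view(repos_data, route_name='main_page_repos_data',
                            renderer='json', xhr=True)
            app = config.make_wsgi_app()
        make_server('127.0.0.1', 6543, app).serve_forever()
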
@@ -22,7 +22,7 b'' | |||||
22 | import pytest |
|
22 | import pytest | |
23 |
|
23 | |||
24 | import rhodecode |
|
24 | import rhodecode | |
25 | from rhodecode.model.db import Repository |
|
25 | from rhodecode.model.db import Repository, RepoGroup, User | |
26 | from rhodecode.model.meta import Session |
|
26 | from rhodecode.model.meta import Session | |
27 | from rhodecode.model.repo import RepoModel |
|
27 | from rhodecode.model.repo import RepoModel | |
28 | from rhodecode.model.repo_group import RepoGroupModel |
|
28 | from rhodecode.model.repo_group import RepoGroupModel | |
@@ -37,6 +37,8 b' fixture = Fixture()' | |||||
37 | def route_path(name, **kwargs): |
|
37 | def route_path(name, **kwargs): | |
38 | return { |
|
38 | return { | |
39 | 'home': '/', |
|
39 | 'home': '/', | |
|
40 | 'main_page_repos_data': '/_home_repos', | |||
|
41 | 'main_page_repo_groups_data': '/_home_repo_groups', | |||
40 | 'repo_group_home': '/{repo_group_name}' |
|
42 | 'repo_group_home': '/{repo_group_name}' | |
41 | }[name].format(**kwargs) |
|
43 | }[name].format(**kwargs) | |
42 |
|
44 | |||
@@ -47,11 +49,42 b' class TestHomeController(TestController)' | |||||
47 | self.log_user() |
|
49 | self.log_user() | |
48 | response = self.app.get(route_path('home')) |
|
50 | response = self.app.get(route_path('home')) | |
49 | # if global permission is set |
|
51 | # if global permission is set | |
50 | response.mustcontain('
|
52 | response.mustcontain('New Repository') | |
|
53 | ||||
|
54 | def test_index_grid_repos(self, xhr_header): | |||
|
55 | self.log_user() | |||
|
56 | response = self.app.get(route_path('main_page_repos_data'), extra_environ=xhr_header) | |||
|
57 | # search for objects inside the JavaScript JSON | |||
|
58 | for obj in Repository.getAll(): | |||
|
59 | response.mustcontain('<a href=\\"/{}\\">'.format(obj.repo_name)) | |||
|
60 | ||||
|
61 | def test_index_grid_repo_groups(self, xhr_header): | |||
|
62 | self.log_user() | |||
|
63 | response = self.app.get(route_path('main_page_repo_groups_data'), | |||
|
64 | extra_environ=xhr_header,) | |||
51 |
|
65 | |||
52 | # search for objects inside the JavaScript JSON |
|
66 | # search for objects inside the JavaScript JSON | |
53 | for
|
67 | for obj in RepoGroup.getAll(): | |
54 | response.mustcontain('"
|
68 | response.mustcontain('<a href=\\"/{}\\">'.format(obj.group_name)) | |
|
69 | ||||
|
70 | def test_index_grid_repo_groups_without_access(self, xhr_header, user_util): | |||
|
71 | user = user_util.create_user(password='qweqwe') | |||
|
72 | group_ok = user_util.create_repo_group(owner=user) | |||
|
73 | group_id_ok = group_ok.group_id | |||
|
74 | ||||
|
75 | group_forbidden = user_util.create_repo_group(owner=User.get_first_super_admin()) | |||
|
76 | group_id_forbidden = group_forbidden.group_id | |||
|
77 | ||||
|
78 | user_util.grant_user_permission_to_repo_group(group_forbidden, user, 'group.none') | |||
|
79 | self.log_user(user.username, 'qweqwe') | |||
|
80 | ||||
|
81 | self.app.get(route_path('main_page_repo_groups_data'), | |||
|
82 | extra_environ=xhr_header, | |||
|
83 | params={'repo_group_id': group_id_ok}, status=200) | |||
|
84 | ||||
|
85 | self.app.get(route_path('main_page_repo_groups_data'), | |||
|
86 | extra_environ=xhr_header, | |||
|
87 | params={'repo_group_id': group_id_forbidden}, status=404) | |||
55 |
|
88 | |||
56 | def test_index_contains_statics_with_ver(self): |
|
89 | def test_index_contains_statics_with_ver(self): | |
57 | from rhodecode.lib.base import calculate_version_hash |
|
90 | from rhodecode.lib.base import calculate_version_hash | |
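
The new grid tests request these endpoints as AJAX calls via `extra_environ=xhr_header`. A sketch of what that fixture is assumed to supply (the standard header injected into the WSGI environ so the `xhr=True` view predicate matches):

    # assumed shape of the xhr_header fixture used by the tests above
    xhr_header = {'HTTP_X_REQUESTED_WITH': 'XMLHttpRequest'}

    # typical use inside a test (app being a webtest.TestApp; shown for illustration):
    # response = app.get('/_home_repos', extra_environ=xhr_header, status=200)
    print(xhr_header)
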
@@ -62,11 +95,11 b' class TestHomeController(TestController)' | |||||
62 | rhodecode_version_hash = calculate_version_hash( |
|
95 | rhodecode_version_hash = calculate_version_hash( | |
63 | {'beaker.session.secret': 'test-rc-uytcxaz'}) |
|
96 | {'beaker.session.secret': 'test-rc-uytcxaz'}) | |
64 | response.mustcontain('style.css?ver={0}'.format(rhodecode_version_hash)) |
|
97 | response.mustcontain('style.css?ver={0}'.format(rhodecode_version_hash)) | |
65 | response.mustcontain('scripts.js?ver={0}'.format(rhodecode_version_hash)) |
|
98 | response.mustcontain('scripts.min.js?ver={0}'.format(rhodecode_version_hash)) | |
66 |
|
99 | |||
67 | def test_index_contains_backend_specific_details(self, backend): |
|
100 | def test_index_contains_backend_specific_details(self, backend, xhr_header): | |
68 | self.log_user() |
|
101 | self.log_user() | |
69 | response = self.app.get(route_path('
|
102 | response = self.app.get(route_path('main_page_repos_data'), extra_environ=xhr_header) | |
70 | tip = backend.repo.get_commit().raw_id |
|
103 | tip = backend.repo.get_commit().raw_id | |
71 |
|
104 | |||
72 | # html in javascript variable: |
|
105 | # html in javascript variable: | |
@@ -81,39 +114,44 b' class TestHomeController(TestController)' | |||||
81 | response = self.app.get(route_path('home'), status=302) |
|
114 | response = self.app.get(route_path('home'), status=302) | |
82 | assert 'login' in response.location |
|
115 | assert 'login' in response.location | |
83 |
|
116 | |||
84 | def test_index_page_on_groups(self, autologin_user,
|
117 | def test_index_page_on_groups_with_wrong_group_id(self, autologin_user, xhr_header): | |
85 | response = self.app.get(route_path('repo_group_home', repo_group_name='gr1')) |
|
118 | group_id = 918123 | |
86 | response.mustcontain("gr1/repo_in_group") |
|
119 | self.app.get( | |
|
120 | route_path('main_page_repo_groups_data'), | |||
|
121 | params={'repo_group_id': group_id}, | |||
|
122 | status=404, extra_environ=xhr_header) | |||
87 |
|
123 | |||
88 | def test_index_page_on_group_with_trailing_slash( |
|
124 | def test_index_page_on_groups(self, autologin_user, user_util, xhr_header): | |
89 | self, autologin_user, repo_group): |
|
125 | gr = user_util.create_repo_group() | |
90 | response = self.app.get(route_path('repo_group_home', repo_group_name='gr1') + '/') |
|
126 | repo = user_util.create_repo(parent=gr) | |
91 | response.mustcontain("gr1/repo_in_group") |
|
127 | repo_name = repo.repo_name | |
|
128 | group_id = gr.group_id | |||
92 |
|
129 | |||
93 | @pytest.fixture(scope='class') |
|
130 | response = self.app.get(route_path( | |
94 | def repo_group(self, request): |
|
131 | 'repo_group_home', repo_group_name=gr.group_name)) | |
95 | gr = fixture.create_repo_group('gr1') |
|
132 | response.mustcontain('d.repo_group_id = {}'.format(group_id)) | |
96 | fixture.create_repo(name='gr1/repo_in_group', repo_group=gr) |
|
|||
97 |
|
133 | |||
98 | @request.addfinalizer |
|
134 | response = self.app.get( | |
99 | def cleanup(): |
|
135 | route_path('main_page_repos_data'), | |
100 | RepoModel().delete('gr1/repo_in_group') |
|
136 | params={'repo_group_id': group_id}, | |
101 | RepoGroupModel().delete(repo_group='gr1', force_delete=True) |
|
137 | extra_environ=xhr_header,) | |
102 | Session().commit() |
|
138 | response.mustcontain(repo_name) | |
103 |
|
139 | |||
104 |
def test_index_ |
|
140 | def test_index_page_on_group_with_trailing_slash(self, autologin_user, user_util, xhr_header): | |
105 |
|
|
141 | gr = user_util.create_repo_group() | |
106 | username = user.username |
|
142 | repo = user_util.create_repo(parent=gr) | |
107 | user.name = '<img src="/image1" onload="alert(\'Hello, World!\');">' |
|
143 | repo_name = repo.repo_name | |
108 | user.lastname = '#"><img src=x onerror=prompt(document.cookie);>' |
|
144 | group_id = gr.group_id | |
109 |
|
145 | |||
110 | Session().add(user) |
|
146 | response = self.app.get(route_path( | |
111 | Session().commit() |
|
147 | 'repo_group_home', repo_group_name=gr.group_name+'/')) | |
112 | user_util.create_repo(owner=username) |
|
148 | response.mustcontain('d.repo_group_id = {}'.format(group_id)) | |
113 |
|
149 | |||
114 | response = self.app.get(
|
150 | response = self.app.get( | |
115 | response.mustcontain(h.html_escape(user.first_name)) |
|
151 | route_path('main_page_repos_data'), | |
116 | response.mustcontain(h.html_escape(user.last_name)) |
|
152 | params={'repo_group_id': group_id}, | |
|
153 | extra_environ=xhr_header, ) | |||
|
154 | response.mustcontain(repo_name) | |||
117 |
|
155 | |||
118 | @pytest.mark.parametrize("name, state", [ |
|
156 | @pytest.mark.parametrize("name, state", [ | |
119 | ('Disabled', False), |
|
157 | ('Disabled', False), | |
@@ -137,5 +175,5 b' class TestHomeController(TestController)' | |||||
137 | def test_logout_form_contains_csrf(self, autologin_user, csrf_token): |
|
175 | def test_logout_form_contains_csrf(self, autologin_user, csrf_token): | |
138 | response = self.app.get(route_path('home')) |
|
176 | response = self.app.get(route_path('home')) | |
139 | assert_response = response.assert_response() |
|
177 | assert_response = response.assert_response() | |
140 | element = assert_response.get_element('.logout
|
178 | element = assert_response.get_element('.logout [name=csrf_token]') | |
141 | assert element.value == csrf_token |
|
179 | assert element.value == csrf_token |
@@ -22,29 +22,30 b' import re' | |||||
22 | import logging |
|
22 | import logging | |
23 | import collections |
|
23 | import collections | |
24 |
|
24 | |||
|
25 | from pyramid.httpexceptions import HTTPNotFound | |||
25 | from pyramid.view import view_config |
|
26 | from pyramid.view import view_config | |
26 |
|
27 | |||
27 | from rhodecode.apps._base import BaseAppView |
|
28 | from rhodecode.apps._base import BaseAppView, DataGridAppView | |
28 | from rhodecode.lib import helpers as h |
|
29 | from rhodecode.lib import helpers as h | |
29 | from rhodecode.lib.auth import ( |
|
30 | from rhodecode.lib.auth import ( | |
30 | LoginRequired, NotAnonymous, HasRepoGroupPermissionAnyDecorator, CSRFRequired
|
31 | LoginRequired, NotAnonymous, HasRepoGroupPermissionAnyDecorator, CSRFRequired, | |
|
32 | HasRepoGroupPermissionAny, AuthUser) | |||
31 | from rhodecode.lib.codeblocks import filenode_as_lines_tokens |
|
33 | from rhodecode.lib.codeblocks import filenode_as_lines_tokens | |
32 | from rhodecode.lib.index import searcher_from_config |
|
34 | from rhodecode.lib.index import searcher_from_config | |
33 | from rhodecode.lib.utils2 import safe_unicode, str2bool, safe_int |
|
35 | from rhodecode.lib.utils2 import safe_unicode, str2bool, safe_int | |
34 | from rhodecode.lib.ext_json import json |
|
|||
35 | from rhodecode.lib.vcs.nodes import FileNode |
|
36 | from rhodecode.lib.vcs.nodes import FileNode | |
36 | from rhodecode.model.db import ( |
|
37 | from rhodecode.model.db import ( | |
37 | func, true, or_, case, in_filter_generator,
|
38 | func, true, or_, case, in_filter_generator, Session, | |
|
39 | Repository, RepoGroup, User, UserGroup) | |||
38 | from rhodecode.model.repo import RepoModel |
|
40 | from rhodecode.model.repo import RepoModel | |
39 | from rhodecode.model.repo_group import RepoGroupModel |
|
41 | from rhodecode.model.repo_group import RepoGroupModel | |
40 | from rhodecode.model.scm import RepoGroupList, RepoList |
|
|||
41 | from rhodecode.model.user import UserModel |
|
42 | from rhodecode.model.user import UserModel | |
42 | from rhodecode.model.user_group import UserGroupModel |
|
43 | from rhodecode.model.user_group import UserGroupModel | |
43 |
|
44 | |||
44 | log = logging.getLogger(__name__) |
|
45 | log = logging.getLogger(__name__) | |
45 |
|
46 | |||
46 |
|
47 | |||
47 | class HomeView(BaseAppView): |
|
48 | class HomeView(BaseAppView, DataGridAppView): | |
48 |
|
49 | |||
49 | def load_default_context(self): |
|
50 | def load_default_context(self): | |
50 | c = self._get_local_tmpl_context() |
|
51 | c = self._get_local_tmpl_context() | |
@@ -112,7 +113,12 b' class HomeView(BaseAppView):' | |||||
112 | ['repository.read', 'repository.write', 'repository.admin'], |
|
113 | ['repository.read', 'repository.write', 'repository.admin'], | |
113 | cache=False, name_filter=name_contains) or [-1] |
|
114 | cache=False, name_filter=name_contains) or [-1] | |
114 |
|
115 | |||
115 | query =
|
116 | query = Session().query( | |
|
117 | Repository.repo_name, | |||
|
118 | Repository.repo_id, | |||
|
119 | Repository.repo_type, | |||
|
120 | Repository.private, | |||
|
121 | )\ | |||
116 | .filter(Repository.archived.isnot(true()))\ |
|
122 | .filter(Repository.archived.isnot(true()))\ | |
117 | .filter(or_( |
|
123 | .filter(or_( | |
118 | # generate multiple IN to fix limitation problems |
|
124 | # generate multiple IN to fix limitation problems | |
@@ -158,7 +164,10 b' class HomeView(BaseAppView):' | |||||
158 | ['group.read', 'group.write', 'group.admin'], |
|
164 | ['group.read', 'group.write', 'group.admin'], | |
159 | cache=False, name_filter=name_contains) or [-1] |
|
165 | cache=False, name_filter=name_contains) or [-1] | |
160 |
|
166 | |||
161 | query =
|
167 | query = Session().query( | |
|
168 | RepoGroup.group_id, | |||
|
169 | RepoGroup.group_name, | |||
|
170 | )\ | |||
162 | .filter(or_( |
|
171 | .filter(or_( | |
163 | # generate multiple IN to fix limitation problems |
|
172 | # generate multiple IN to fix limitation problems | |
164 | *in_filter_generator(RepoGroup.group_id, allowed_ids) |
|
173 | *in_filter_generator(RepoGroup.group_id, allowed_ids) | |
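
Both queries above select only the columns the grids need and keep the `in_filter_generator` trick of splitting a long list of permitted ids into several `IN (...)` clauses. An illustrative, self-contained stand-in for that helper (the real one lives in `rhodecode.model.db` and yields SQLAlchemy expressions rather than strings):

    def chunked_in_clauses(column, ids, chunk_size=300):
        # emit several smaller IN (...) clauses so no single clause exceeds
        # the parameter limits of the database backend
        ids = list(ids) or [-1]
        for start in range(0, len(ids), chunk_size):
            part = ids[start:start + chunk_size]
            yield '{} IN ({})'.format(column, ', '.join(str(i) for i in part))

    clauses = list(chunked_in_clauses('repo_id', range(700)))
    print(len(clauses))  # 3 chunks of at most 300 ids each
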
@@ -449,6 +458,7 b' class HomeView(BaseAppView):' | |||||
449 | 'id': -10, |
|
458 | 'id': -10, | |
450 | 'value': query, |
|
459 | 'value': query, | |
451 | 'value_display': label, |
|
460 | 'value_display': label, | |
|
461 | 'value_icon': '<i class="icon-code"></i>', | |||
452 | 'type': 'search', |
|
462 | 'type': 'search', | |
453 | 'subtype': 'repo', |
|
463 | 'subtype': 'repo', | |
454 | 'url': h.route_path('search_repo', |
|
464 | 'url': h.route_path('search_repo', | |
@@ -466,6 +476,7 b' class HomeView(BaseAppView):' | |||||
466 | 'id': -20, |
|
476 | 'id': -20, | |
467 | 'value': query, |
|
477 | 'value': query, | |
468 | 'value_display': label, |
|
478 | 'value_display': label, | |
|
479 | 'value_icon': '<i class="icon-history"></i>', | |||
469 | 'type': 'search', |
|
480 | 'type': 'search', | |
470 | 'subtype': 'repo', |
|
481 | 'subtype': 'repo', | |
471 | 'url': h.route_path('search_repo', |
|
482 | 'url': h.route_path('search_repo', | |
@@ -491,6 +502,7 b' class HomeView(BaseAppView):' | |||||
491 | 'id': -30, |
|
502 | 'id': -30, | |
492 | 'value': query, |
|
503 | 'value': query, | |
493 | 'value_display': label, |
|
504 | 'value_display': label, | |
|
505 | 'value_icon': '<i class="icon-code"></i>', | |||
494 | 'type': 'search', |
|
506 | 'type': 'search', | |
495 | 'subtype': 'repo_group', |
|
507 | 'subtype': 'repo_group', | |
496 | 'url': h.route_path('search_repo_group', |
|
508 | 'url': h.route_path('search_repo_group', | |
@@ -508,6 +520,7 b' class HomeView(BaseAppView):' | |||||
508 | 'id': -40, |
|
520 | 'id': -40, | |
509 | 'value': query, |
|
521 | 'value': query, | |
510 | 'value_display': label, |
|
522 | 'value_display': label, | |
|
523 | 'value_icon': '<i class="icon-history"></i>', | |||
511 | 'type': 'search', |
|
524 | 'type': 'search', | |
512 | 'subtype': 'repo_group', |
|
525 | 'subtype': 'repo_group', | |
513 | 'url': h.route_path('search_repo_group', |
|
526 | 'url': h.route_path('search_repo_group', | |
@@ -529,6 +542,7 b' class HomeView(BaseAppView):' | |||||
529 | 'id': -1, |
|
542 | 'id': -1, | |
530 | 'value': query, |
|
543 | 'value': query, | |
531 | 'value_display': u'File search for: `{}`'.format(query), |
|
544 | 'value_display': u'File search for: `{}`'.format(query), | |
|
545 | 'value_icon': '<i class="icon-code"></i>', | |||
532 | 'type': 'search', |
|
546 | 'type': 'search', | |
533 | 'subtype': 'global', |
|
547 | 'subtype': 'global', | |
534 | 'url': h.route_path('search', |
|
548 | 'url': h.route_path('search', | |
@@ -539,6 +553,7 b' class HomeView(BaseAppView):' | |||||
539 | 'id': -2, |
|
553 | 'id': -2, | |
540 | 'value': query, |
|
554 | 'value': query, | |
541 | 'value_display': u'Commit search for: `{}`'.format(query), |
|
555 | 'value_display': u'Commit search for: `{}`'.format(query), | |
|
556 | 'value_icon': '<i class="icon-history"></i>', | |||
542 | 'type': 'search', |
|
557 | 'type': 'search', | |
543 | 'subtype': 'global', |
|
558 | 'subtype': 'global', | |
544 | 'url': h.route_path('search', |
|
559 | 'url': h.route_path('search', | |
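
Each of the hunks above adds a `value_icon` HTML snippet to the goto-search suggestion entries. For reference, the assembled shape of one such entry (values are illustrative; in the view the URL comes from `h.route_path`):

    suggestion = {
        'id': -10,
        'value': 'requests',
        'value_display': 'Search for: `requests`',
        'value_icon': '<i class="icon-code"></i>',  # field added by this change
        'type': 'search',
        'subtype': 'repo',
        'url': '/some-repo/_search?q=requests',     # illustrative URL only
    }
    print(suggestion['value_icon'])
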
@@ -667,23 +682,6 b' class HomeView(BaseAppView):' | |||||
667 |
|
682 | |||
668 | return {'suggestions': res} |
|
683 | return {'suggestions': res} | |
669 |
|
684 | |||
670 | def _get_groups_and_repos(self, repo_group_id=None): |
|
|||
671 | # repo groups groups |
|
|||
672 | repo_group_list = RepoGroup.get_all_repo_groups(group_id=repo_group_id) |
|
|||
673 | _perms = ['group.read', 'group.write', 'group.admin'] |
|
|||
674 | repo_group_list_acl = RepoGroupList(repo_group_list, perm_set=_perms) |
|
|||
675 | repo_group_data = RepoGroupModel().get_repo_groups_as_dict( |
|
|||
676 | repo_group_list=repo_group_list_acl, admin=False) |
|
|||
677 |
|
||||
678 | # repositories |
|
|||
679 | repo_list = Repository.get_all_repos(group_id=repo_group_id) |
|
|||
680 | _perms = ['repository.read', 'repository.write', 'repository.admin'] |
|
|||
681 | repo_list_acl = RepoList(repo_list, perm_set=_perms) |
|
|||
682 | repo_data = RepoModel().get_repos_as_dict( |
|
|||
683 | repo_list=repo_list_acl, admin=False) |
|
|||
684 |
|
||||
685 | return repo_data, repo_group_data |
|
|||
686 |
|
||||
687 | @LoginRequired() |
|
685 | @LoginRequired() | |
688 | @view_config( |
|
686 | @view_config( | |
689 | route_name='home', request_method='GET', |
|
687 | route_name='home', request_method='GET', | |
@@ -691,17 +689,74 b' class HomeView(BaseAppView):' | |||||
691 | def main_page(self): |
|
689 | def main_page(self): | |
692 | c = self.load_default_context() |
|
690 | c = self.load_default_context() | |
693 | c.repo_group = None |
|
691 | c.repo_group = None | |
694 |
|
||||
695 | repo_data, repo_group_data = self._get_groups_and_repos() |
|
|||
696 | # json used to render the grids |
|
|||
697 | c.repos_data = json.dumps(repo_data) |
|
|||
698 | c.repo_groups_data = json.dumps(repo_group_data) |
|
|||
699 |
|
||||
700 | return self._get_template_context(c) |
|
692 | return self._get_template_context(c) | |
701 |
|
693 | |||
|
694 | def _main_page_repo_groups_data(self, repo_group_id): | |||
|
695 | column_map = { | |||
|
696 | 'name': 'group_name_hash', | |||
|
697 | 'desc': 'group_description', | |||
|
698 | 'last_change': 'updated_on', | |||
|
699 | 'owner': 'user_username', | |||
|
700 | } | |||
|
701 | draw, start, limit = self._extract_chunk(self.request) | |||
|
702 | search_q, order_by, order_dir = self._extract_ordering( | |||
|
703 | self.request, column_map=column_map) | |||
|
704 | return RepoGroupModel().get_repo_groups_data_table( | |||
|
705 | draw, start, limit, | |||
|
706 | search_q, order_by, order_dir, | |||
|
707 | self._rhodecode_user, repo_group_id) | |||
|
708 | ||||
|
709 | def _main_page_repos_data(self, repo_group_id): | |||
|
710 | column_map = { | |||
|
711 | 'name': 'repo_name', | |||
|
712 | 'desc': 'description', | |||
|
713 | 'last_change': 'updated_on', | |||
|
714 | 'owner': 'user_username', | |||
|
715 | } | |||
|
716 | draw, start, limit = self._extract_chunk(self.request) | |||
|
717 | search_q, order_by, order_dir = self._extract_ordering( | |||
|
718 | self.request, column_map=column_map) | |||
|
719 | return RepoModel().get_repos_data_table( | |||
|
720 | draw, start, limit, | |||
|
721 | search_q, order_by, order_dir, | |||
|
722 | self._rhodecode_user, repo_group_id) | |||
|
723 | ||||
702 | @LoginRequired() |
|
724 | @LoginRequired() | |
703 | @HasRepoGroupPermissionAnyDecorator( |
|
725 | @view_config( | |
704 | 'group.read', 'group.write', 'group.admin') |
|
726 | route_name='main_page_repo_groups_data', | |
|
727 | request_method='GET', renderer='json_ext', xhr=True) | |||
|
728 | def main_page_repo_groups_data(self): | |||
|
729 | self.load_default_context() | |||
|
730 | repo_group_id = safe_int(self.request.GET.get('repo_group_id')) | |||
|
731 | ||||
|
732 | if repo_group_id: | |||
|
733 | group = RepoGroup.get_or_404(repo_group_id) | |||
|
734 | _perms = AuthUser.repo_group_read_perms | |||
|
735 | if not HasRepoGroupPermissionAny(*_perms)( | |||
|
736 | group.group_name, 'user is allowed to list repo group children'): | |||
|
737 | raise HTTPNotFound() | |||
|
738 | ||||
|
739 | return self._main_page_repo_groups_data(repo_group_id) | |||
|
740 | ||||
|
741 | @LoginRequired() | |||
|
742 | @view_config( | |||
|
743 | route_name='main_page_repos_data', | |||
|
744 | request_method='GET', renderer='json_ext', xhr=True) | |||
|
745 | def main_page_repos_data(self): | |||
|
746 | self.load_default_context() | |||
|
747 | repo_group_id = safe_int(self.request.GET.get('repo_group_id')) | |||
|
748 | ||||
|
749 | if repo_group_id: | |||
|
750 | group = RepoGroup.get_or_404(repo_group_id) | |||
|
751 | _perms = AuthUser.repo_group_read_perms | |||
|
752 | if not HasRepoGroupPermissionAny(*_perms)( | |||
|
753 | group.group_name, 'user is allowed to list repo group children'): | |||
|
754 | raise HTTPNotFound() | |||
|
755 | ||||
|
756 | return self._main_page_repos_data(repo_group_id) | |||
|
757 | ||||
|
758 | @LoginRequired() | |||
|
759 | @HasRepoGroupPermissionAnyDecorator(*AuthUser.repo_group_read_perms) | |||
705 | @view_config( |
|
760 | @view_config( | |
706 | route_name='repo_group_home', request_method='GET', |
|
761 | route_name='repo_group_home', request_method='GET', | |
707 | renderer='rhodecode:templates/index_repo_group.mako') |
|
762 | renderer='rhodecode:templates/index_repo_group.mako') | |
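
The two data views above accept an optional `repo_group_id` parameter and answer with 404 unless the requesting user can read that group. The guard pattern in isolation, as a plain-Python simplification of the `RepoGroup.get_or_404` plus `HasRepoGroupPermissionAny` combination:

    def safe_int(value):
        try:
            return int(value)
        except (TypeError, ValueError):
            return None

    def resolve_group_id(params, readable_group_ids):
        # an unreadable or unknown group behaves exactly like a missing one (404)
        repo_group_id = safe_int(params.get('repo_group_id'))
        if repo_group_id and repo_group_id not in readable_group_ids:
            raise LookupError('HTTP 404: repo group is not readable by this user')
        return repo_group_id

    print(resolve_group_id({'repo_group_id': '3'}, {1, 2, 3}))  # -> 3
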
@@ -711,16 +766,6 b' class HomeView(BaseAppView):' | |||||
711 | def repo_group_main_page(self): |
|
766 | def repo_group_main_page(self): | |
712 | c = self.load_default_context() |
|
767 | c = self.load_default_context() | |
713 | c.repo_group = self.request.db_repo_group |
|
768 | c.repo_group = self.request.db_repo_group | |
714 | repo_data, repo_group_data = self._get_groups_and_repos(c.repo_group.group_id) |
|
|||
715 |
|
||||
716 | # update every 5 min |
|
|||
717 | if self.request.db_repo_group.last_commit_cache_update_diff > 60 * 5: |
|
|||
718 | self.request.db_repo_group.update_commit_cache() |
|
|||
719 |
|
||||
720 | # json used to render the grids |
|
|||
721 | c.repos_data = json.dumps(repo_data) |
|
|||
722 | c.repo_groups_data = json.dumps(repo_group_data) |
|
|||
723 |
|
||||
724 | return self._get_template_context(c) |
|
769 | return self._get_template_context(c) | |
725 |
|
770 | |||
726 | @LoginRequired() |
|
771 | @LoginRequired() |
@@ -22,7 +22,7 b'' | |||||
22 | import logging |
|
22 | import logging | |
23 | import itertools |
|
23 | import itertools | |
24 |
|
24 | |||
25 | from webhelpers.feedgenerator import Atom1Feed, Rss201rev2Feed |
|
25 | ||
26 |
|
26 | |||
27 | from pyramid.view import view_config |
|
27 | from pyramid.view import view_config | |
28 | from pyramid.httpexceptions import HTTPBadRequest |
|
28 | from pyramid.httpexceptions import HTTPBadRequest | |
@@ -34,10 +34,11 b' from rhodecode.model.db import (' | |||||
34 | or_, joinedload, Repository, UserLog, UserFollowing, User, UserApiKeys) |
|
34 | or_, joinedload, Repository, UserLog, UserFollowing, User, UserApiKeys) | |
35 | from rhodecode.model.meta import Session |
|
35 | from rhodecode.model.meta import Session | |
36 | import rhodecode.lib.helpers as h |
|
36 | import rhodecode.lib.helpers as h | |
37 | from rhodecode.lib.helpers import Page |
|
37 | from rhodecode.lib.helpers import SqlPage | |
38 | from rhodecode.lib.user_log_filter import user_log_filter |
|
38 | from rhodecode.lib.user_log_filter import user_log_filter | |
39 | from rhodecode.lib.auth import LoginRequired, NotAnonymous, CSRFRequired, HasRepoPermissionAny |
|
39 | from rhodecode.lib.auth import LoginRequired, NotAnonymous, CSRFRequired, HasRepoPermissionAny | |
40 | from rhodecode.lib.utils2 import safe_int, AttributeDict, md5_safe |
|
40 | from rhodecode.lib.utils2 import safe_int, AttributeDict, md5_safe | |
|
41 | from rhodecode.lib.feedgenerator.feedgenerator import Atom1Feed, Rss201rev2Feed | |||
41 | from rhodecode.model.scm import ScmModel |
|
42 | from rhodecode.model.scm import ScmModel | |
42 |
|
43 | |||
43 | log = logging.getLogger(__name__) |
|
44 | log = logging.getLogger(__name__) | |
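
The import change above swaps webhelpers' feed classes for the feedgenerator module now bundled with RhodeCode, and the views read `feed.content_type` instead of `mime_type`. A sketch of building a feed with it (requires a RhodeCode environment; `add_item` and the constructor arguments are assumed to follow the Django-style API this module mirrors):

    from rhodecode.lib.feedgenerator.feedgenerator import Atom1Feed

    feed = Atom1Feed(
        title='journal feed', link='https://code.example.com/_admin/journal',
        description='recent activity', language='en-us')
    feed.add_item(
        title='commits pushed to docs', link='https://code.example.com/docs',
        description='3 commits pushed')
    body = feed.writeString('utf-8')
    print(feed.content_type)  # attribute the views now use instead of mime_type
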
@@ -166,7 +167,7 b' class JournalView(BaseAppView):' | |||||
166 | description=desc) |
|
167 | description=desc) | |
167 |
|
168 | |||
168 | response = Response(feed.writeString('utf-8')) |
|
169 | response = Response(feed.writeString('utf-8')) | |
169 |
response.content_type = feed. |
|
170 | response.content_type = feed.content_type | |
170 | return response |
|
171 | return response | |
171 |
|
172 | |||
172 | def _rss_feed(self, repos, search_term, public=True): |
|
173 | def _rss_feed(self, repos, search_term, public=True): | |
@@ -212,7 +213,7 b' class JournalView(BaseAppView):' | |||||
212 | description=desc) |
|
213 | description=desc) | |
213 |
|
214 | |||
214 | response = Response(feed.writeString('utf-8')) |
|
215 | response = Response(feed.writeString('utf-8')) | |
215 |
response.content_type = feed. |
|
216 | response.content_type = feed.content_type | |
216 | return response |
|
217 | return response | |
217 |
|
218 | |||
218 | @LoginRequired() |
|
219 | @LoginRequired() | |
@@ -232,15 +233,15 b' class JournalView(BaseAppView):' | |||||
232 |
|
233 | |||
233 | journal = self._get_journal_data(following, c.search_term) |
|
234 | journal = self._get_journal_data(following, c.search_term) | |
234 |
|
235 | |||
235 | def url_generator(
|
236 | def url_generator(page_num): | |
236 | query_params = { |
|
237 | query_params = { | |
|
238 | 'page': page_num, | |||
237 | 'filter': c.search_term |
|
239 | 'filter': c.search_term | |
238 | } |
|
240 | } | |
239 | query_params.update(kw) |
|
|||
240 | return self.request.current_route_path(_query=query_params) |
|
241 | return self.request.current_route_path(_query=query_params) | |
241 |
|
242 | |||
242 | c.journal_pager = Page( |
|
243 | c.journal_pager = SqlPage( | |
243 | journal, page=p, items_per_page=20, url=url_generator) |
|
244 | journal, page=p, items_per_page=20, url_maker=url_generator) | |
244 | c.journal_day_aggreagate = self._get_daily_aggregate(c.journal_pager) |
|
245 | c.journal_day_aggreagate = self._get_daily_aggregate(c.journal_pager) | |
245 |
|
246 | |||
246 | c.journal_data = render( |
|
247 | c.journal_data = render( | |
@@ -333,13 +334,14 b' class JournalView(BaseAppView):' | |||||
333 |
|
334 | |||
334 | journal = self._get_journal_data(c.following, c.search_term) |
|
335 | journal = self._get_journal_data(c.following, c.search_term) | |
335 |
|
336 | |||
336 | def url_generator(
|
337 | def url_generator(page_num): | |
337 | query_params = {
|
338 | query_params = { | |
338 | query_params.update(kw) |
|
339 | 'page': page_num | |
|
340 | } | |||
339 | return self.request.current_route_path(_query=query_params) |
|
341 | return self.request.current_route_path(_query=query_params) | |
340 |
|
342 | |||
341 | c.journal_pager = Page( |
|
343 | c.journal_pager = SqlPage( | |
342 | journal, page=p, items_per_page=20, url=url_generator) |
|
344 | journal, page=p, items_per_page=20, url_maker=url_generator) | |
343 | c.journal_day_aggreagate = self._get_daily_aggregate(c.journal_pager) |
|
345 | c.journal_day_aggreagate = self._get_daily_aggregate(c.journal_pager) | |
344 |
|
346 | |||
345 | c.journal_data = render( |
|
347 | c.journal_data = render( |
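
Both journal views now hand `SqlPage` a `url_maker(page_num)` callable instead of the old keyword-based `url` generator. A standalone equivalent of such a callable:

    try:
        from urllib.parse import urlencode  # Python 3
    except ImportError:
        from urllib import urlencode        # Python 2, matching the era of this code

    def make_url_maker(base_path, **fixed_params):
        # SqlPage calls url_maker(page_num) to build each page link; fixed query
        # arguments such as the journal filter are captured in the closure
        def url_maker(page_num):
            params = dict(fixed_params, page=page_num)
            return '{}?{}'.format(base_path, urlencode(sorted(params.items())))
        return url_maker

    url_for_page = make_url_maker('/_admin/journal', filter='username:admin')
    print(url_for_page(2))  # /_admin/journal?filter=username%3Aadmin&page=2
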
@@ -93,7 +93,7 b' class TestLoginController(object):' | |||||
93 | session = response.get_session_from_response() |
|
93 | session = response.get_session_from_response() | |
94 | username = session['rhodecode_user'].get('username') |
|
94 | username = session['rhodecode_user'].get('username') | |
95 | assert username == 'test_admin' |
|
95 | assert username == 'test_admin' | |
96 | response.mustcontain('
|
96 | response.mustcontain('logout') | |
97 |
|
97 | |||
98 | def test_login_regular_ok(self): |
|
98 | def test_login_regular_ok(self): | |
99 | response = self.app.post(route_path('login'), |
|
99 | response = self.app.post(route_path('login'), | |
@@ -104,8 +104,7 b' class TestLoginController(object):' | |||||
104 | session = response.get_session_from_response() |
|
104 | session = response.get_session_from_response() | |
105 | username = session['rhodecode_user'].get('username') |
|
105 | username = session['rhodecode_user'].get('username') | |
106 | assert username == 'test_regular' |
|
106 | assert username == 'test_regular' | |
107 |
|
107 | response.mustcontain('logout') | ||
108 | response.mustcontain('/%s' % HG_REPO) |
|
|||
109 |
|
108 | |||
110 | def test_login_regular_forbidden_when_super_admin_restriction(self): |
|
109 | def test_login_regular_forbidden_when_super_admin_restriction(self): | |
111 | from rhodecode.authentication.plugins.auth_rhodecode import RhodeCodeAuthPlugin |
|
110 | from rhodecode.authentication.plugins.auth_rhodecode import RhodeCodeAuthPlugin | |
@@ -225,7 +224,7 b' class TestLoginController(object):' | |||||
225 | session = response.get_session_from_response() |
|
224 | session = response.get_session_from_response() | |
226 | username = session['rhodecode_user'].get('username') |
|
225 | username = session['rhodecode_user'].get('username') | |
227 | assert username == temp_user |
|
226 | assert username == temp_user | |
228 | response.mustcontain('
|
227 | response.mustcontain('logout') | |
229 |
|
228 | |||
230 | # new password should be bcrypted, after log-in and transfer |
|
229 | # new password should be bcrypted, after log-in and transfer | |
231 | user = User.get_by_username(temp_user) |
|
230 | user = User.get_by_username(temp_user) | |
@@ -401,7 +400,7 b' class TestLoginController(object):' | |||||
401 | ) # This should be overridden |
|
400 | ) # This should be overridden | |
402 |
|
401 | |||
403 | assert_session_flash( |
|
402 | assert_session_flash( | |
404 | response, 'You have successfully registered with RhodeCode') |
|
403 | response, 'You have successfully registered with RhodeCode. You can log-in now.') | |
405 |
|
404 | |||
406 | ret = Session().query(User).filter( |
|
405 | ret = Session().query(User).filter( | |
407 | User.username == 'test_regular4').one() |
|
406 | User.username == 'test_regular4').one() |
@@ -88,7 +88,7 b' class TestPasswordReset(TestController):' | |||||
88 |
|
88 | |||
89 | response = self.app.get(route_path('reset_password')) |
|
89 | response = self.app.get(route_path('reset_password')) | |
90 |
|
90 | |||
91 | assert_response =
|
91 | assert_response = response.assert_response() | |
92 | if show_reset: |
|
92 | if show_reset: | |
93 | response.mustcontain('Send password reset email') |
|
93 | response.mustcontain('Send password reset email') | |
94 | assert_response.one_element_exists('#email') |
|
94 | assert_response.one_element_exists('#email') |
@@ -90,7 +90,7 b' class TestRegisterCaptcha(object):' | |||||
90 |
|
90 | |||
91 | response = app.get(ADMIN_PREFIX + '/register') |
|
91 | response = app.get(ADMIN_PREFIX + '/register') | |
92 |
|
92 | |||
93 | assertr =
|
93 | assertr = response.assert_response() | |
94 | if active: |
|
94 | if active: | |
95 | assertr.one_element_exists('#recaptcha_field') |
|
95 | assertr.one_element_exists('#recaptcha_field') | |
96 | else: |
|
96 | else: | |
@@ -128,6 +128,6 b' class TestRegisterCaptcha(object):' | |||||
128 | else: |
|
128 | else: | |
129 | # If captche input is invalid we expect to stay on the registration |
|
129 | # If captche input is invalid we expect to stay on the registration | |
130 | # page with an error message displayed. |
|
130 | # page with an error message displayed. | |
131 | assertr =
|
131 | assertr = response.assert_response() | |
132 | assert response.status_int == 200 |
|
132 | assert response.status_int == 200 | |
133 | assertr.one_element_exists('#recaptcha_field ~ span.error-message') |
|
133 | assertr.one_element_exists('#recaptcha_field ~ span.error-message') |
@@ -312,8 +312,6 b' class LoginView(BaseAppView):' | |||||
312 | action_data = {'data': new_user.get_api_data(), |
|
312 | action_data = {'data': new_user.get_api_data(), | |
313 | 'user_agent': self.request.user_agent} |
|
313 | 'user_agent': self.request.user_agent} | |
314 |
|
314 | |||
315 |
|
||||
316 |
|
||||
317 | if external_identity: |
|
315 | if external_identity: | |
318 | action_data['external_identity'] = external_identity |
|
316 | action_data['external_identity'] = external_identity | |
319 |
|
317 | |||
@@ -329,7 +327,12 b' class LoginView(BaseAppView):' | |||||
329 | event = UserRegistered(user=new_user, session=self.session) |
|
327 | event = UserRegistered(user=new_user, session=self.session) | |
330 | trigger(event) |
|
328 | trigger(event) | |
331 | h.flash( |
|
329 | h.flash( | |
332 | _('You have successfully registered with RhodeCode'), |
|
330 | _('You have successfully registered with RhodeCode. You can log-in now.'), | |
|
331 | category='success') | |||
|
332 | if external_identity: | |||
|
333 | h.flash( | |||
|
334 | _('Please use the {identity} button to log-in').format( | |||
|
335 | identity=external_identity), | |||
333 | category='success') |
|
336 | category='success') | |
334 | Session().commit() |
|
337 | Session().commit() | |
335 |
|
338 |
@@ -33,22 +33,21 b' from rhodecode import forms' | |||||
33 | from rhodecode.lib import helpers as h |
|
33 | from rhodecode.lib import helpers as h | |
34 | from rhodecode.lib import audit_logger |
|
34 | from rhodecode.lib import audit_logger | |
35 | from rhodecode.lib.ext_json import json |
|
35 | from rhodecode.lib.ext_json import json | |
36 | from rhodecode.lib.auth import LoginRequired, NotAnonymous, CSRFRequired, \ |
|
36 | from rhodecode.lib.auth import ( | |
37 | HasRepoPermissionAny, HasRepoGroupPermissionAny |
|
37 | LoginRequired, NotAnonymous, CSRFRequired, | |
|
38 | HasRepoPermissionAny, HasRepoGroupPermissionAny, AuthUser) | |||
38 | from rhodecode.lib.channelstream import ( |
|
39 | from rhodecode.lib.channelstream import ( | |
39 | channelstream_request, ChannelstreamException) |
|
40 | channelstream_request, ChannelstreamException) | |
40 | from rhodecode.lib.utils2 import safe_int, md5, str2bool |
|
41 | from rhodecode.lib.utils2 import safe_int, md5, str2bool | |
41 | from rhodecode.model.auth_token import AuthTokenModel |
|
42 | from rhodecode.model.auth_token import AuthTokenModel | |
42 | from rhodecode.model.comment import CommentsModel |
|
43 | from rhodecode.model.comment import CommentsModel | |
43 | from rhodecode.model.db import ( |
|
44 | from rhodecode.model.db import ( | |
44 | IntegrityError,
|
45 | IntegrityError, or_, in_filter_generator, | |
45 | Repository, UserEmailMap, UserApiKeys, UserFollowing, |
|
46 | Repository, UserEmailMap, UserApiKeys, UserFollowing, | |
46 | PullRequest, UserBookmark, RepoGroup) |
|
47 | PullRequest, UserBookmark, RepoGroup) | |
47 | from rhodecode.model.meta import Session |
|
48 | from rhodecode.model.meta import Session | |
48 | from rhodecode.model.pull_request import PullRequestModel |
|
49 | from rhodecode.model.pull_request import PullRequestModel | |
49 | from rhodecode.model.scm import RepoList |
|
|||
50 | from rhodecode.model.user import UserModel |
|
50 | from rhodecode.model.user import UserModel | |
51 | from rhodecode.model.repo import RepoModel |
|
|||
52 | from rhodecode.model.user_group import UserGroupModel |
|
51 | from rhodecode.model.user_group import UserGroupModel | |
53 | from rhodecode.model.validation_schema.schemas import user_schema |
|
52 | from rhodecode.model.validation_schema.schemas import user_schema | |
54 |
|
53 | |||
@@ -345,22 +344,59 b' class MyAccountView(BaseAppView, DataGri' | |||||
345 | 'You should see a new live message now.'} |
|
344 | 'You should see a new live message now.'} | |
346 |
|
345 | |||
347 | def _load_my_repos_data(self, watched=False): |
|
346 | def _load_my_repos_data(self, watched=False): | |
|
347 | ||||
|
348 | allowed_ids = [-1] + self._rhodecode_user.repo_acl_ids_from_stack(AuthUser.repo_read_perms) | |||
|
349 | ||||
348 | if watched: |
|
350 | if watched: | |
349 | admin = False |
|
351 | # repos user watch | |
350 |
|
|
352 | repo_list = Session().query( | |
351 | .filter(UserFollowing.user_id == self._rhodecode_user.user_id)\ |
|
353 | Repository | |
352 | .options(joinedload(UserFollowing.follows_repository))\ |
|
354 | ) \ | |
|
355 | .join( | |||
|
356 | (UserFollowing, UserFollowing.follows_repo_id == Repository.repo_id) | |||
|
357 | ) \ | |||
|
358 | .filter( | |||
|
359 | UserFollowing.user_id == self._rhodecode_user.user_id | |||
|
360 | ) \ | |||
|
361 | .filter(or_( | |||
|
362 | # generate multiple IN to fix limitation problems | |||
|
363 | *in_filter_generator(Repository.repo_id, allowed_ids)) | |||
|
364 | ) \ | |||
|
365 | .order_by(Repository.repo_name) \ | |||
353 | .all() |
|
366 | .all() | |
354 | repo_list = [x.follows_repository for x in follows_repos] |
|
367 | ||
355 | else: |
|
368 | else: | |
356 | admin = True |
|
369 | # repos user is owner of | |
357 |
repo_list = |
|
370 | repo_list = Session().query( | |
358 | user_id=self._rhodecode_user.user_id) |
|
371 | Repository | |
359 | repo_list = RepoList(repo_list, perm_set=[ |
|
372 | ) \ | |
360 | 'repository.read', 'repository.write', 'repository.admin']) |
|
373 | .filter( | |
|
374 | Repository.user_id == self._rhodecode_user.user_id | |||
|
375 | ) \ | |||
|
376 | .filter(or_( | |||
|
377 | # generate multiple IN to fix limitation problems | |||
|
378 | *in_filter_generator(Repository.repo_id, allowed_ids)) | |||
|
379 | ) \ | |||
|
380 | .order_by(Repository.repo_name) \ | |||
|
381 | .all() | |||
361 |
|
382 | |||
362 | repos_data = RepoModel().get_repos_as_dict( |
|
383 | _render = self.request.get_partial_renderer( | |
363 | repo_list=repo_list, admin=admin, short_name=False) |
|
384 | 'rhodecode:templates/data_table/_dt_elements.mako') | |
|
385 | ||||
|
386 | def repo_lnk(name, rtype, rstate, private, archived, fork_of): | |||
|
387 | return _render('repo_name', name, rtype, rstate, private, archived, fork_of, | |||
|
388 | short_name=False, admin=False) | |||
|
389 | ||||
|
390 | repos_data = [] | |||
|
391 | for repo in repo_list: | |||
|
392 | row = { | |||
|
393 | "name": repo_lnk(repo.repo_name, repo.repo_type, repo.repo_state, | |||
|
394 | repo.private, repo.archived, repo.fork), | |||
|
395 | "name_raw": repo.repo_name.lower(), | |||
|
396 | } | |||
|
397 | ||||
|
398 | repos_data.append(row) | |||
|
399 | ||||
364 | # json used to render the grid |
|
400 | # json used to render the grid | |
365 | return json.dumps(repos_data) |
|
401 | return json.dumps(repos_data) | |
366 |
|
402 | |||
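
The rewritten `_load_my_repos_data` above pre-filters both the watched and the owned repository queries by the ids the user may read (`repo_acl_ids_from_stack`). The filtering step reduced to an in-memory example (ids and names are made up):

    allowed_ids = {-1, 3, 7}   # -1 keeps the generated IN() clause non-empty, as in the view
    followed = [(3, 'docs'), (5, 'secret-repo'), (7, 'tools')]

    # only repositories whose id is in the caller's readable set reach the grid
    visible = [name for repo_id, name in followed if repo_id in allowed_ids]
    assert visible == ['docs', 'tools']
    print(visible)
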
@@ -398,16 +434,19 b' class MyAccountView(BaseAppView, DataGri' | |||||
398 | def my_account_bookmarks(self): |
|
434 | def my_account_bookmarks(self): | |
399 | c = self.load_default_context() |
|
435 | c = self.load_default_context() | |
400 | c.active = 'bookmarks' |
|
436 | c.active = 'bookmarks' | |
|
437 | c.bookmark_items = UserBookmark.get_bookmarks_for_user( | |||
|
438 | self._rhodecode_db_user.user_id, cache=False) | |||
401 | return self._get_template_context(c) |
|
439 | return self._get_template_context(c) | |
402 |
|
440 | |||
403 | def _process_entry(self, entry, user_id): |
|
441 | def _process_bookmark_entry(self, entry, user_id): | |
404 | position = safe_int(entry.get('position')) |
|
442 | position = safe_int(entry.get('position')) | |
|
443 | cur_position = safe_int(entry.get('cur_position')) | |||
405 | if position is None: |
|
444 | if position is None: | |
406 | return |
|
445 | return | |
407 |
|
446 | |||
408 | # check if this is an existing entry |
|
447 | # check if this is an existing entry | |
409 | is_new = False |
|
448 | is_new = False | |
410 | db_entry = UserBookmark().get_by_position_for_user(position, user_id) |
|
449 | db_entry = UserBookmark().get_by_position_for_user(cur_position, user_id) | |
411 |
|
450 | |||
412 | if db_entry and str2bool(entry.get('remove')): |
|
451 | if db_entry and str2bool(entry.get('remove')): | |
413 | log.debug('Marked bookmark %s for deletion', db_entry) |
|
452 | log.debug('Marked bookmark %s for deletion', db_entry) | |
@@ -446,12 +485,12 b' class MyAccountView(BaseAppView, DataGri' | |||||
446 | should_save = True |
|
485 | should_save = True | |
447 |
|
486 | |||
448 | if should_save: |
|
487 | if should_save: | |
449 | log.debug('Saving bookmark %s, new:%s', db_entry, is_new) |
|
|||
450 | # mark user and position |
|
488 | # mark user and position | |
451 | db_entry.user_id = user_id |
|
489 | db_entry.user_id = user_id | |
452 | db_entry.position = position |
|
490 | db_entry.position = position | |
453 | db_entry.title = entry.get('title') |
|
491 | db_entry.title = entry.get('title') | |
454 | db_entry.redirect_url = entry.get('redirect_url') or default_redirect_url |
|
492 | db_entry.redirect_url = entry.get('redirect_url') or default_redirect_url | |
|
493 | log.debug('Saving bookmark %s, new:%s', db_entry, is_new) | |||
455 |
|
494 | |||
456 | Session().add(db_entry) |
|
495 | Session().add(db_entry) | |
457 |
|
496 | |||
@@ -468,15 +507,31 b' class MyAccountView(BaseAppView, DataGri' | |||||
468 | controls = peppercorn.parse(self.request.POST.items()) |
|
507 | controls = peppercorn.parse(self.request.POST.items()) | |
469 | user_id = c.user.user_id |
|
508 | user_id = c.user.user_id | |
470 |
|
509 | |||
471 | try: |
|
510 | # validate positions | |
|
511 | positions = {} | |||
472 |
|
|
512 | for entry in controls.get('bookmarks', []): | |
473 | self._process_entry(entry, user_id) |
|
513 | position = safe_int(entry['position']) | |
|
514 | if position is None: | |||
|
515 | continue | |||
|
516 | ||||
|
517 | if position in positions: | |||
|
518 | h.flash(_("Position {} is defined twice. " | |||
|
519 | "Please correct this error.").format(position), category='error') | |||
|
520 | return HTTPFound(h.route_path('my_account_bookmarks')) | |||
|
521 | ||||
|
522 | entry['position'] = position | |||
|
523 | entry['cur_position'] = safe_int(entry.get('cur_position')) | |||
|
524 | positions[position] = entry | |||
|
525 | ||||
|
526 | try: | |||
|
527 | for entry in positions.values(): | |||
|
528 | self._process_bookmark_entry(entry, user_id) | |||
474 |
|
529 | |||
475 | Session().commit() |
|
530 | Session().commit() | |
476 | h.flash(_("Update Bookmarks"), category='success') |
|
531 | h.flash(_("Update Bookmarks"), category='success') | |
477 | except IntegrityError: |
|
532 | except IntegrityError: | |
478 | h.flash(_("Failed to update bookmarks. " |
|
533 | h.flash(_("Failed to update bookmarks. " | |
479 | "Make sure an unique position is used"), category='error') |
|
534 | "Make sure an unique position is used."), category='error') | |
480 |
|
535 | |||
481 | return HTTPFound(h.route_path('my_account_bookmarks')) |
|
536 | return HTTPFound(h.route_path('my_account_bookmarks')) | |
482 |
|
537 | |||
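
The update handler above now rejects bookmark payloads that reuse a position before anything is written. A simplified version of that duplicate-position check:

    def validate_positions(entries):
        positions = {}
        for entry in entries:
            position = entry.get('position')
            if position is None:
                continue
            if position in positions:
                raise ValueError('Position {} is defined twice.'.format(position))
            positions[position] = entry
        return positions

    print(sorted(validate_positions([{'position': 0}, {'position': 1}])))
    # a payload like [{'position': 0}, {'position': 0}] would raise ValueError
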
@@ -582,6 +637,7 b' class MyAccountView(BaseAppView, DataGri' | |||||
582 | 'email': c.user.email, |
|
637 | 'email': c.user.email, | |
583 | 'firstname': c.user.firstname, |
|
638 | 'firstname': c.user.firstname, | |
584 | 'lastname': c.user.lastname, |
|
639 | 'lastname': c.user.lastname, | |
|
640 | 'description': c.user.description, | |||
585 | } |
|
641 | } | |
586 | c.form = forms.RcForm( |
|
642 | c.form = forms.RcForm( | |
587 | schema, appstruct=appstruct, |
|
643 | schema, appstruct=appstruct, | |
@@ -664,7 +720,8 b' class MyAccountView(BaseAppView, DataGri' | |||||
664 | 'target_repo': _render('pullrequest_target_repo', |
|
720 | 'target_repo': _render('pullrequest_target_repo', | |
665 | pr.target_repo.repo_name), |
|
721 | pr.target_repo.repo_name), | |
666 | 'name': _render('pullrequest_name', |
|
722 | 'name': _render('pullrequest_name', | |
667 | pr.pull_request_id, pr. |
|
723 | pr.pull_request_id, pr.pull_request_state, | |
|
724 | pr.work_in_progress, pr.target_repo.repo_name, | |||
668 | short=True), |
|
725 | short=True), | |
669 | 'name_raw': pr.pull_request_id, |
|
726 | 'name_raw': pr.pull_request_id, | |
670 | 'status': _render('pullrequest_status', |
|
727 | 'status': _render('pullrequest_status', |
@@ -28,7 +28,7 b' from rhodecode.apps._base import BaseApp' | |||||
28 | from rhodecode.lib.auth import LoginRequired, NotAnonymous, CSRFRequired |
|
28 | from rhodecode.lib.auth import LoginRequired, NotAnonymous, CSRFRequired | |
29 |
|
29 | |||
30 | from rhodecode.lib import helpers as h |
|
30 | from rhodecode.lib import helpers as h | |
31 | from rhodecode.lib.helpers import Page |
|
31 | from rhodecode.lib.helpers import SqlPage | |
32 | from rhodecode.lib.utils2 import safe_int |
|
32 | from rhodecode.lib.utils2 import safe_int | |
33 | from rhodecode.model.db import Notification |
|
33 | from rhodecode.model.db import Notification | |
34 | from rhodecode.model.notification import NotificationModel |
|
34 | from rhodecode.model.notification import NotificationModel | |
@@ -74,13 +74,16 b' class MyAccountNotificationsView(BaseApp' | |||||
74 |
|
74 | |||
75 | p = safe_int(self.request.GET.get('page', 1), 1) |
|
75 | p = safe_int(self.request.GET.get('page', 1), 1) | |
76 |
|
76 | |||
77 | def url_generator( |
|
77 | def url_generator(page_num): | |
|
78 | query_params = { | |||
|
79 | 'page': page_num | |||
|
80 | } | |||
78 | _query = self.request.GET.mixed() |
|
81 | _query = self.request.GET.mixed() | |
79 |
|
|
82 | query_params.update(_query) | |
80 | return self.request.current_route_path(_query= |
|
83 | return self.request.current_route_path(_query=query_params) | |
81 |
|
84 | |||
82 | c.notifications = Page(notifications, page=p, items_per_page=10, |
|
85 | c.notifications = SqlPage(notifications, page=p, items_per_page=10, | |
83 | url=url_generator) |
|
86 | url_maker=url_generator) | |
84 |
|
87 | |||
85 | c.unread_type = 'unread' |
|
88 | c.unread_type = 'unread' | |
86 | c.all_type = 'all' |
|
89 | c.all_type = 'all' |
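The notifications hunk above switches the paginator to a URL maker that receives the target page number and merges it with the current query string. A rough sketch of that pattern using only the standard library (illustrative helper names, not RhodeCode APIs):

    from urllib.parse import urlencode

    def make_url_generator(base_path, current_query):
        # current_query: dict with the request's existing GET parameters.
        def url_generator(page_num):
            query_params = dict(current_query)
            query_params['page'] = page_num
            return '{}?{}'.format(base_path, urlencode(query_params))
        return url_generator

    url_for_page = make_url_generator('/notifications', {'type': 'unread'})
    print(url_for_page(2))  # -> /notifications?type=unread&page=2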
@@ -46,9 +46,16 b' class RepoGroupSettingsView(RepoGroupApp' | |||||
46 | route_name='edit_repo_group_advanced', request_method='GET', |
|
46 | route_name='edit_repo_group_advanced', request_method='GET', | |
47 | renderer='rhodecode:templates/admin/repo_groups/repo_group_edit.mako') |
|
47 | renderer='rhodecode:templates/admin/repo_groups/repo_group_edit.mako') | |
48 | def edit_repo_group_advanced(self): |
|
48 | def edit_repo_group_advanced(self): | |
|
49 | _ = self.request.translate | |||
49 | c = self.load_default_context() |
|
50 | c = self.load_default_context() | |
50 | c.active = 'advanced' |
|
51 | c.active = 'advanced' | |
51 | c.repo_group = self.db_repo_group |
|
52 | c.repo_group = self.db_repo_group | |
|
53 | ||||
|
54 | # update commit cache if GET flag is present | |||
|
55 | if self.request.GET.get('update_commit_cache'): | |||
|
56 | self.db_repo_group.update_commit_cache() | |||
|
57 | h.flash(_('updated commit cache'), category='success') | |||
|
58 | ||||
52 | return self._get_template_context(c) |
|
59 | return self._get_template_context(c) | |
53 |
|
60 | |||
54 | @LoginRequired() |
|
61 | @LoginRequired() |
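The repository group hunk above lets a GET flag trigger a commit-cache refresh before the page renders, confirmed via a flash message. A framework-agnostic sketch of that query-flag pattern (the callables are placeholders, not the actual view code):

    def maybe_refresh_cache(query_params, refresh, notify):
        # query_params: mapping of GET parameters; refresh rebuilds the cache,
        # notify flashes a confirmation to the user.
        if query_params.get('update_commit_cache'):
            refresh()
            notify('updated commit cache')

    # Example wiring with plain callables:
    maybe_refresh_cache({'update_commit_cache': '1'}, refresh=lambda: None, notify=print)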
@@ -79,6 +79,10 b' def includeme(config):' | |||||
79 | pattern='/{repo_name:.*?[^/]}/changeset/{commit_id}/comment/preview', repo_route=True) |
|
79 | pattern='/{repo_name:.*?[^/]}/changeset/{commit_id}/comment/preview', repo_route=True) | |
80 |
|
80 | |||
81 | config.add_route( |
|
81 | config.add_route( | |
|
82 | name='repo_commit_comment_attachment_upload', | |||
|
83 | pattern='/{repo_name:.*?[^/]}/changeset/{commit_id}/comment/attachment_upload', repo_route=True) | |||
|
84 | ||||
|
85 | config.add_route( | |||
82 | name='repo_commit_comment_delete', |
|
86 | name='repo_commit_comment_delete', | |
83 | pattern='/{repo_name:.*?[^/]}/changeset/{commit_id}/comment/{comment_id}/delete', repo_route=True) |
|
87 | pattern='/{repo_name:.*?[^/]}/changeset/{commit_id}/comment/{comment_id}/delete', repo_route=True) | |
84 |
|
88 |
@@ -98,7 +98,7 b' class TestRepoCommitCommentsView(TestCon' | |||||
98 | assert notification.type_ == Notification.TYPE_CHANGESET_COMMENT |
|
98 | assert notification.type_ == Notification.TYPE_CHANGESET_COMMENT | |
99 |
|
99 | |||
100 | author = notification.created_by_user.username_and_name |
|
100 | author = notification.created_by_user.username_and_name | |
101 | sbj = '{0} left a {1} on commit `{2}` in the {3} repository'.format( |
|
101 | sbj = '@{0} left a {1} on commit `{2}` in the `{3}` repository'.format( | |
102 | author, comment_type, h.show_id(commit), backend.repo_name) |
|
102 | author, comment_type, h.show_id(commit), backend.repo_name) | |
103 | assert sbj == notification.subject |
|
103 | assert sbj == notification.subject | |
104 |
|
104 | |||
@@ -159,7 +159,7 b' class TestRepoCommitCommentsView(TestCon' | |||||
159 | assert comment.revision == commit_id |
|
159 | assert comment.revision == commit_id | |
160 |
|
160 | |||
161 | author = notification.created_by_user.username_and_name |
|
161 | author = notification.created_by_user.username_and_name | |
162 | sbj = '{0} left a {1} on file `{2}` in commit `{3}` in the {4} repository'.format( |
|
162 | sbj = '@{0} left a {1} on file `{2}` in commit `{3}` in the `{4}` repository'.format( | |
163 | author, comment_type, f_path, h.show_id(commit), backend.repo_name) |
|
163 | author, comment_type, f_path, h.show_id(commit), backend.repo_name) | |
164 |
|
164 | |||
165 | assert sbj == notification.subject |
|
165 | assert sbj == notification.subject | |
@@ -230,7 +230,7 b' class TestRepoCommitCommentsView(TestCon' | |||||
230 | assert notification.type_ == Notification.TYPE_CHANGESET_COMMENT |
|
230 | assert notification.type_ == Notification.TYPE_CHANGESET_COMMENT | |
231 |
|
231 | |||
232 | author = notification.created_by_user.username_and_name |
|
232 | author = notification.created_by_user.username_and_name | |
233 | sbj = '[status: Approved] {0} left a note on commit `{1}` in the {2} repository'.format( |
|
233 | sbj = '[status: Approved] @{0} left a note on commit `{1}` in the `{2}` repository'.format( | |
234 | author, h.show_id(commit), backend.repo_name) |
|
234 | author, h.show_id(commit), backend.repo_name) | |
235 | assert sbj == notification.subject |
|
235 | assert sbj == notification.subject | |
236 |
|
236 | |||
@@ -299,14 +299,14 b' class TestRepoCommitCommentsView(TestCon' | |||||
299 |
|
299 | |||
300 | def assert_comment_links(response, comments, inline_comments): |
|
300 | def assert_comment_links(response, comments, inline_comments): | |
301 | if comments == 1: |
|
301 | if comments == 1: | |
302 | comments_text = "%d |
|
302 | comments_text = "%d General" % comments | |
303 | else: |
|
303 | else: | |
304 | comments_text = "%d |
|
304 | comments_text = "%d General" % comments | |
305 |
|
305 | |||
306 | if inline_comments == 1: |
|
306 | if inline_comments == 1: | |
307 | inline_comments_text = "%d Inline |
|
307 | inline_comments_text = "%d Inline" % inline_comments | |
308 | else: |
|
308 | else: | |
309 | inline_comments_text = "%d Inline |
|
309 | inline_comments_text = "%d Inline" % inline_comments | |
310 |
|
310 | |||
311 | if comments: |
|
311 | if comments: | |
312 | response.mustcontain('<a href="#comments">%s</a>,' % comments_text) |
|
312 | response.mustcontain('<a href="#comments">%s</a>,' % comments_text) | |
@@ -315,6 +315,6 b' def assert_comment_links(response, comme' | |||||
315 |
|
315 | |||
316 | if inline_comments: |
|
316 | if inline_comments: | |
317 | response.mustcontain( |
|
317 | response.mustcontain( | |
318 | 'id="inline-comments-counter">%s |
|
318 | 'id="inline-comments-counter">%s' % inline_comments_text) | |
319 | else: |
|
319 | else: | |
320 | response.mustcontain(inline_comments_text) |
|
320 | response.mustcontain(inline_comments_text) |
@@ -20,6 +20,7 b'' | |||||
20 |
|
20 | |||
21 | import pytest |
|
21 | import pytest | |
22 |
|
22 | |||
|
23 | from rhodecode.apps.repository.tests.test_repo_compare import ComparePage | |||
23 | from rhodecode.lib.helpers import _shorten_commit_id |
|
24 | from rhodecode.lib.helpers import _shorten_commit_id | |
24 |
|
25 | |||
25 |
|
26 | |||
@@ -79,13 +80,22 b' class TestRepoCommitView(object):' | |||||
79 | 'git': '03fa803d7e9fb14daa9a3089e0d1494eda75d986', |
|
80 | 'git': '03fa803d7e9fb14daa9a3089e0d1494eda75d986', | |
80 | 'svn': '337', |
|
81 | 'svn': '337', | |
81 | } |
|
82 | } | |
|
83 | diff_stat = { | |||
|
84 | 'hg': (21, 943, 288), | |||
|
85 | 'git': (20, 941, 286), | |||
|
86 | 'svn': (21, 943, 288), | |||
|
87 | } | |||
|
88 | ||||
82 | commit_id = commit_id[backend.alias] |
|
89 | commit_id = commit_id[backend.alias] | |
83 | response = self.app.get(route_path( |
|
90 | response = self.app.get(route_path( | |
84 | 'repo_commit', |
|
91 | 'repo_commit', | |
85 | repo_name=backend.repo_name, commit_id=commit_id)) |
|
92 | repo_name=backend.repo_name, commit_id=commit_id)) | |
86 |
|
93 | |||
87 | response.mustcontain(_shorten_commit_id(commit_id)) |
|
94 | response.mustcontain(_shorten_commit_id(commit_id)) | |
88 | response.mustcontain('21 files changed: 943 inserted, 288 deleted') |
|
95 | ||
|
96 | compare_page = ComparePage(response) | |||
|
97 | file_changes = diff_stat[backend.alias] | |||
|
98 | compare_page.contains_change_summary(*file_changes) | |||
89 |
|
99 | |||
90 | # files op files |
|
100 | # files op files | |
91 | response.mustcontain('File not present at commit: %s' % |
|
101 | response.mustcontain('File not present at commit: %s' % | |
@@ -121,20 +131,24 b' class TestRepoCommitView(object):' | |||||
121 | response.mustcontain(_shorten_commit_id(commit_ids[0])) |
|
131 | response.mustcontain(_shorten_commit_id(commit_ids[0])) | |
122 | response.mustcontain(_shorten_commit_id(commit_ids[1])) |
|
132 | response.mustcontain(_shorten_commit_id(commit_ids[1])) | |
123 |
|
133 | |||
|
134 | compare_page = ComparePage(response) | |||
|
135 | ||||
124 | # svn is special |
|
136 | # svn is special | |
125 | if backend.alias == 'svn': |
|
137 | if backend.alias == 'svn': | |
126 | response.mustcontain('new file 10644') |
|
138 | response.mustcontain('new file 10644') | |
127 | response.mustcontain('1 file changed: 5 inserted, 1 deleted') |
|
139 | for file_changes in [(1, 5, 1), (12, 236, 22), (21, 943, 288)]: | |
128 | response.mustcontain('12 files changed: 236 inserted, 22 deleted') |
|
140 | compare_page.contains_change_summary(*file_changes) | |
129 | response.mustcontain('21 files changed: 943 inserted, 288 deleted') |
|
141 | elif backend.alias == 'git': | |
|
142 | response.mustcontain('new file 100644') | |||
|
143 | for file_changes in [(12, 222, 20), (20, 941, 286)]: | |||
|
144 | compare_page.contains_change_summary(*file_changes) | |||
130 | else: |
|
145 | else: | |
131 | response.mustcontain('new file 100644') |
|
146 | response.mustcontain('new file 100644') | |
132 | response.mustcontain('12 files changed: 222 inserted, 20 deleted') |
|
147 | for file_changes in [(12, 222, 20), (21, 943, 288)]: | |
133 | response.mustcontain('21 files changed: 943 inserted, 288 deleted') |
|
148 | compare_page.contains_change_summary(*file_changes) | |
134 |
|
149 | |||
135 | # files op files |
|
150 | # files op files | |
136 | response.mustcontain('File not present at commit: %s' % |
|
151 | response.mustcontain('File not present at commit: %s' % _shorten_commit_id(commit_ids[1])) | |
137 | _shorten_commit_id(commit_ids[1])) |
|
|||
138 | response.mustcontain('Added docstrings to vcs.cli') # commit msg |
|
152 | response.mustcontain('Added docstrings to vcs.cli') # commit msg | |
139 | response.mustcontain('Changed theme to ADC theme') # commit msg |
|
153 | response.mustcontain('Changed theme to ADC theme') # commit msg | |
140 |
|
154 | |||
@@ -166,13 +180,21 b' class TestRepoCommitView(object):' | |||||
166 | response.mustcontain('File not present at commit: %s' % |
|
180 | response.mustcontain('File not present at commit: %s' % | |
167 | _shorten_commit_id(commit_ids[1])) |
|
181 | _shorten_commit_id(commit_ids[1])) | |
168 |
|
182 | |||
|
183 | compare_page = ComparePage(response) | |||
|
184 | ||||
169 | # svn is special |
|
185 | # svn is special | |
170 | if backend.alias == 'svn': |
|
186 | if backend.alias == 'svn': | |
171 | response.mustcontain('new file 10644') |
|
187 | response.mustcontain('new file 10644') | |
172 | response.mustcontain('32 files changed: 1179 inserted, 310 deleted') |
|
188 | file_changes = (32, 1179, 310) | |
|
189 | compare_page.contains_change_summary(*file_changes) | |||
|
190 | elif backend.alias == 'git': | |||
|
191 | response.mustcontain('new file 100644') | |||
|
192 | file_changes = (31, 1163, 306) | |||
|
193 | compare_page.contains_change_summary(*file_changes) | |||
173 | else: |
|
194 | else: | |
174 | response.mustcontain('new file 100644') |
|
195 | response.mustcontain('new file 100644') | |
175 | response.mustcontain('32 files changed: 1165 inserted, 308 deleted') |
|
196 | file_changes = (32, 1165, 308) | |
|
197 | compare_page.contains_change_summary(*file_changes) | |||
176 |
|
198 | |||
177 | response.mustcontain('Added docstrings to vcs.cli') # commit msg |
|
199 | response.mustcontain('Added docstrings to vcs.cli') # commit msg | |
178 | response.mustcontain('Changed theme to ADC theme') # commit msg |
|
200 | response.mustcontain('Changed theme to ADC theme') # commit msg | |
@@ -246,7 +268,7 b' new file mode 120000' | |||||
246 | """, |
|
268 | """, | |
247 | 'git': r"""diff --git a/README b/README |
|
269 | 'git': r"""diff --git a/README b/README | |
248 | new file mode 120000 |
|
270 | new file mode 120000 | |
249 | index 0000000000000000000000000000000000000000..92cacd285355271487b7e379dba6ca60f9a554a4 |
|
271 | index 0000000..92cacd2 | |
250 | --- /dev/null |
|
272 | --- /dev/null | |
251 | +++ b/README |
|
273 | +++ b/README | |
252 | @@ -0,0 +1 @@ |
|
274 | @@ -0,0 +1 @@ | |
@@ -300,6 +322,6 b' Added a symlink' | |||||
300 |
|
322 | |||
301 | # right pane diff menus |
|
323 | # right pane diff menus | |
302 | if right_menu: |
|
324 | if right_menu: | |
303 | for elem in ['Hide whitespace changes', 'Toggle |
|
325 | for elem in ['Hide whitespace changes', 'Toggle wide diff', | |
304 | 'Show full context diff']: |
|
326 | 'Show full context diff']: | |
305 | response.mustcontain(elem) |
|
327 | response.mustcontain(elem) |
@@ -623,8 +623,8 b' class ComparePage(AssertResponse):' | |||||
623 |
|
623 | |||
624 | def contains_change_summary(self, files_changed, inserted, deleted): |
|
624 | def contains_change_summary(self, files_changed, inserted, deleted): | |
625 | template = ( |
|
625 | template = ( | |
626 |
|
|
626 | '{files_changed} file{plural} changed: ' | |
627 | "{inserted} inserted, {deleted} deleted |
|
627 | '<span class="op-added">{inserted} inserted</span>, <span class="op-deleted">{deleted} deleted</span>') | |
628 | self.response.mustcontain(template.format( |
|
628 | self.response.mustcontain(template.format( | |
629 | files_changed=files_changed, |
|
629 | files_changed=files_changed, | |
630 | plural="s" if files_changed > 1 else "", |
|
630 | plural="s" if files_changed > 1 else "", |
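The helper above now expects the inserted/deleted counts wrapped in op-added / op-deleted spans. A small standalone illustration of building and checking that summary string (a sketch only, not the test-suite helper itself):

    def change_summary_html(files_changed, inserted, deleted):
        # Pluralise "file" and wrap the counts in the spans the page renders.
        return (
            '{files_changed} file{plural} changed: '
            '<span class="op-added">{inserted} inserted</span>, '
            '<span class="op-deleted">{deleted} deleted</span>'
        ).format(
            files_changed=files_changed,
            plural='s' if files_changed > 1 else '',
            inserted=inserted,
            deleted=deleted,
        )

    assert change_summary_html(1, 5, 1) == (
        '1 file changed: <span class="op-added">5 inserted</span>, '
        '<span class="op-deleted">1 deleted</span>')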
@@ -102,7 +102,7 b' class TestCompareView(object):' | |||||
102 | 'git': { |
|
102 | 'git': { | |
103 | 'tag': 'v0.2.2', |
|
103 | 'tag': 'v0.2.2', | |
104 | 'branch': 'master', |
|
104 | 'branch': 'master', | |
105 | 'response': (7 |
|
105 | 'response': (70, 1855, 3002) | |
106 | }, |
|
106 | }, | |
107 | } |
|
107 | } | |
108 |
|
108 |
@@ -20,6 +20,7 b'' | |||||
20 |
|
20 | |||
21 | import pytest |
|
21 | import pytest | |
22 |
|
22 | |||
|
23 | from rhodecode.apps.repository.tests.test_repo_compare import ComparePage | |||
23 | from rhodecode.lib.vcs import nodes |
|
24 | from rhodecode.lib.vcs import nodes | |
24 | from rhodecode.lib.vcs.backends.base import EmptyCommit |
|
25 | from rhodecode.lib.vcs.backends.base import EmptyCommit | |
25 | from rhodecode.tests.fixture import Fixture |
|
26 | from rhodecode.tests.fixture import Fixture | |
@@ -49,18 +50,18 b' class TestSideBySideDiff(object):' | |||||
49 | 'hg': { |
|
50 | 'hg': { | |
50 | 'commits': ['25d7e49c18b159446cadfa506a5cf8ad1cb04067', |
|
51 | 'commits': ['25d7e49c18b159446cadfa506a5cf8ad1cb04067', | |
51 | '603d6c72c46d953420c89d36372f08d9f305f5dd'], |
|
52 | '603d6c72c46d953420c89d36372f08d9f305f5dd'], | |
52 | 'changes': |
|
53 | 'changes': (21, 943, 288), | |
53 | }, |
|
54 | }, | |
54 | 'git': { |
|
55 | 'git': { | |
55 | 'commits': ['6fc9270775aaf5544c1deb014f4ddd60c952fcbb', |
|
56 | 'commits': ['6fc9270775aaf5544c1deb014f4ddd60c952fcbb', | |
56 | '03fa803d7e9fb14daa9a3089e0d1494eda75d986'], |
|
57 | '03fa803d7e9fb14daa9a3089e0d1494eda75d986'], | |
57 | 'changes': |
|
58 | 'changes': (20, 941, 286), | |
58 | }, |
|
59 | }, | |
59 |
|
60 | |||
60 | 'svn': { |
|
61 | 'svn': { | |
61 | 'commits': ['336', |
|
62 | 'commits': ['336', | |
62 | '337'], |
|
63 | '337'], | |
63 | 'changes': |
|
64 | 'changes': (21, 943, 288), | |
64 | }, |
|
65 | }, | |
65 | } |
|
66 | } | |
66 |
|
67 | |||
@@ -79,26 +80,27 b' class TestSideBySideDiff(object):' | |||||
79 | params=dict(target_repo=backend.repo_name, diffmode='sidebyside') |
|
80 | params=dict(target_repo=backend.repo_name, diffmode='sidebyside') | |
80 | )) |
|
81 | )) | |
81 |
|
82 | |||
82 | response.mustcontain(file_changes) |
|
83 | compare_page = ComparePage(response) | |
83 | response.mustcontain('Expand 1 commit') |
|
84 | compare_page.contains_change_summary(*file_changes) | |
|
85 | response.mustcontain('Collapse 1 commit') | |||
84 |
|
86 | |||
85 | def test_diff_sidebyside_two_commits(self, app, backend): |
|
87 | def test_diff_sidebyside_two_commits(self, app, backend): | |
86 | commit_id_range = { |
|
88 | commit_id_range = { | |
87 | 'hg': { |
|
89 | 'hg': { | |
88 | 'commits': ['4fdd71e9427417b2e904e0464c634fdee85ec5a7', |
|
90 | 'commits': ['4fdd71e9427417b2e904e0464c634fdee85ec5a7', | |
89 | '603d6c72c46d953420c89d36372f08d9f305f5dd'], |
|
91 | '603d6c72c46d953420c89d36372f08d9f305f5dd'], | |
90 | 'changes': |
|
92 | 'changes': (32, 1165, 308), | |
91 | }, |
|
93 | }, | |
92 | 'git': { |
|
94 | 'git': { | |
93 | 'commits': ['f5fbf9cfd5f1f1be146f6d3b38bcd791a7480c13', |
|
95 | 'commits': ['f5fbf9cfd5f1f1be146f6d3b38bcd791a7480c13', | |
94 | '03fa803d7e9fb14daa9a3089e0d1494eda75d986'], |
|
96 | '03fa803d7e9fb14daa9a3089e0d1494eda75d986'], | |
95 | 'changes': |
|
97 | 'changes': (31, 1163, 306), | |
96 | }, |
|
98 | }, | |
97 |
|
99 | |||
98 | 'svn': { |
|
100 | 'svn': { | |
99 | 'commits': ['335', |
|
101 | 'commits': ['335', | |
100 | '337'], |
|
102 | '337'], | |
101 | 'changes': |
|
103 | 'changes': (32, 1179, 310), | |
102 | }, |
|
104 | }, | |
103 | } |
|
105 | } | |
104 |
|
106 | |||
@@ -117,8 +119,36 b' class TestSideBySideDiff(object):' | |||||
117 | params=dict(target_repo=backend.repo_name, diffmode='sidebyside') |
|
119 | params=dict(target_repo=backend.repo_name, diffmode='sidebyside') | |
118 | )) |
|
120 | )) | |
119 |
|
121 | |||
120 | response.mustcontain(file_changes) |
|
122 | compare_page = ComparePage(response) | |
121 | response.mustcontain('Expand 2 commits') |
|
123 | compare_page.contains_change_summary(*file_changes) | |
|
124 | ||||
|
125 | response.mustcontain('Collapse 2 commits') | |||
|
126 | ||||
|
127 | def test_diff_sidebyside_collapsed_commits(self, app, backend_svn): | |||
|
128 | commit_id_range = { | |||
|
129 | ||||
|
130 | 'svn': { | |||
|
131 | 'commits': ['330', | |||
|
132 | '337'], | |||
|
133 | ||||
|
134 | }, | |||
|
135 | } | |||
|
136 | ||||
|
137 | commit_info = commit_id_range['svn'] | |||
|
138 | commit2, commit1 = commit_info['commits'] | |||
|
139 | ||||
|
140 | response = self.app.get(route_path( | |||
|
141 | 'repo_compare', | |||
|
142 | repo_name=backend_svn.repo_name, | |||
|
143 | source_ref_type='rev', | |||
|
144 | source_ref=commit2, | |||
|
145 | target_repo=backend_svn.repo_name, | |||
|
146 | target_ref_type='rev', | |||
|
147 | target_ref=commit1, | |||
|
148 | params=dict(target_repo=backend_svn.repo_name, diffmode='sidebyside') | |||
|
149 | )) | |||
|
150 | ||||
|
151 | response.mustcontain('Expand 7 commits') | |||
122 |
|
152 | |||
123 | @pytest.mark.xfail(reason='GIT does not handle empty commit compare correct (missing 1 commit)') |
|
153 | @pytest.mark.xfail(reason='GIT does not handle empty commit compare correct (missing 1 commit)') | |
124 | def test_diff_side_by_side_from_0_commit(self, app, backend, backend_stub): |
|
154 | def test_diff_side_by_side_from_0_commit(self, app, backend, backend_stub): | |
@@ -145,14 +175,14 b' class TestSideBySideDiff(object):' | |||||
145 | params=dict(diffmode='sidebyside') |
|
175 | params=dict(diffmode='sidebyside') | |
146 | )) |
|
176 | )) | |
147 |
|
177 | |||
148 | response.mustcontain(' |
|
178 | response.mustcontain('Collapse 2 commits') | |
149 | response.mustcontain('123 file changed') |
|
179 | response.mustcontain('123 file changed') | |
150 |
|
180 | |||
151 | response.mustcontain( |
|
181 | response.mustcontain( | |
152 | 'r%s:%s...r%s:%s' % ( |
|
182 | 'r%s:%s...r%s:%s' % ( | |
153 | commit1.idx, commit1.short_id, commit2.idx, commit2.short_id)) |
|
183 | commit1.idx, commit1.short_id, commit2.idx, commit2.short_id)) | |
154 |
|
184 | |||
155 | response.mustcontain( |
|
185 | response.mustcontain(f_path) | |
156 |
|
186 | |||
157 | @pytest.mark.xfail(reason='GIT does not handle empty commit compare correct (missing 1 commit)') |
|
187 | @pytest.mark.xfail(reason='GIT does not handle empty commit compare correct (missing 1 commit)') | |
158 | def test_diff_side_by_side_from_0_commit_with_file_filter(self, app, backend, backend_stub): |
|
188 | def test_diff_side_by_side_from_0_commit_with_file_filter(self, app, backend, backend_stub): | |
@@ -179,14 +209,14 b' class TestSideBySideDiff(object):' | |||||
179 | params=dict(f_path=f_path, target_repo=repo.repo_name, diffmode='sidebyside') |
|
209 | params=dict(f_path=f_path, target_repo=repo.repo_name, diffmode='sidebyside') | |
180 | )) |
|
210 | )) | |
181 |
|
211 | |||
182 | response.mustcontain(' |
|
212 | response.mustcontain('Collapse 2 commits') | |
183 | response.mustcontain('1 file changed') |
|
213 | response.mustcontain('1 file changed') | |
184 |
|
214 | |||
185 | response.mustcontain( |
|
215 | response.mustcontain( | |
186 | 'r%s:%s...r%s:%s' % ( |
|
216 | 'r%s:%s...r%s:%s' % ( | |
187 | commit1.idx, commit1.short_id, commit2.idx, commit2.short_id)) |
|
217 | commit1.idx, commit1.short_id, commit2.idx, commit2.short_id)) | |
188 |
|
218 | |||
189 | response.mustcontain( |
|
219 | response.mustcontain(f_path) | |
190 |
|
220 | |||
191 | def test_diff_side_by_side_with_empty_file(self, app, backend, backend_stub): |
|
221 | def test_diff_side_by_side_with_empty_file(self, app, backend, backend_stub): | |
192 | commits = [ |
|
222 | commits = [ | |
@@ -211,32 +241,32 b' class TestSideBySideDiff(object):' | |||||
211 | params=dict(f_path=f_path, target_repo=repo.repo_name, diffmode='sidebyside') |
|
241 | params=dict(f_path=f_path, target_repo=repo.repo_name, diffmode='sidebyside') | |
212 | )) |
|
242 | )) | |
213 |
|
243 | |||
214 | response.mustcontain(' |
|
244 | response.mustcontain('Collapse 2 commits') | |
215 | response.mustcontain('1 file changed') |
|
245 | response.mustcontain('1 file changed') | |
216 |
|
246 | |||
217 | response.mustcontain( |
|
247 | response.mustcontain( | |
218 | 'r%s:%s...r%s:%s' % ( |
|
248 | 'r%s:%s...r%s:%s' % ( | |
219 | commit2.idx, commit2.short_id, commit3.idx, commit3.short_id)) |
|
249 | commit2.idx, commit2.short_id, commit3.idx, commit3.short_id)) | |
220 |
|
250 | |||
221 | response.mustcontain( |
|
251 | response.mustcontain(f_path) | |
222 |
|
252 | |||
223 | def test_diff_sidebyside_two_commits_with_file_filter(self, app, backend): |
|
253 | def test_diff_sidebyside_two_commits_with_file_filter(self, app, backend): | |
224 | commit_id_range = { |
|
254 | commit_id_range = { | |
225 | 'hg': { |
|
255 | 'hg': { | |
226 | 'commits': ['4fdd71e9427417b2e904e0464c634fdee85ec5a7', |
|
256 | 'commits': ['4fdd71e9427417b2e904e0464c634fdee85ec5a7', | |
227 | '603d6c72c46d953420c89d36372f08d9f305f5dd'], |
|
257 | '603d6c72c46d953420c89d36372f08d9f305f5dd'], | |
228 | 'changes': |
|
258 | 'changes': (1, 3, 3) | |
229 | }, |
|
259 | }, | |
230 | 'git': { |
|
260 | 'git': { | |
231 | 'commits': ['f5fbf9cfd5f1f1be146f6d3b38bcd791a7480c13', |
|
261 | 'commits': ['f5fbf9cfd5f1f1be146f6d3b38bcd791a7480c13', | |
232 | '03fa803d7e9fb14daa9a3089e0d1494eda75d986'], |
|
262 | '03fa803d7e9fb14daa9a3089e0d1494eda75d986'], | |
233 | 'changes': |
|
263 | 'changes': (1, 3, 3) | |
234 | }, |
|
264 | }, | |
235 |
|
265 | |||
236 | 'svn': { |
|
266 | 'svn': { | |
237 | 'commits': ['335', |
|
267 | 'commits': ['335', | |
238 | '337'], |
|
268 | '337'], | |
239 | 'changes': |
|
269 | 'changes': (1, 3, 3) | |
240 | }, |
|
270 | }, | |
241 | } |
|
271 | } | |
242 | f_path = 'docs/conf.py' |
|
272 | f_path = 'docs/conf.py' | |
@@ -255,5 +285,7 b' class TestSideBySideDiff(object):' | |||||
255 | params=dict(f_path=f_path, target_repo=backend.repo_name, diffmode='sidebyside') |
|
285 | params=dict(f_path=f_path, target_repo=backend.repo_name, diffmode='sidebyside') | |
256 | )) |
|
286 | )) | |
257 |
|
287 | |||
258 | response.mustcontain(' |
|
288 | response.mustcontain('Collapse 2 commits') | |
259 | response.mustcontain(file_changes) |
|
289 | ||
|
290 | compare_page = ComparePage(response) | |||
|
291 | compare_page.contains_change_summary(*file_changes) |
@@ -41,7 +41,7 b' def route_path(name, params=None, **kwar' | |||||
41 | class TestFeedView(TestController): |
|
41 | class TestFeedView(TestController): | |
42 |
|
42 | |||
43 | @pytest.mark.parametrize("feed_type,response_types,content_type",[ |
|
43 | @pytest.mark.parametrize("feed_type,response_types,content_type",[ | |
44 | ('rss', ['<rss version="2.0" |
|
44 | ('rss', ['<rss version="2.0"'], | |
45 | "application/rss+xml"), |
|
45 | "application/rss+xml"), | |
46 | ('atom', ['xmlns="http://www.w3.org/2005/Atom"', 'xml:lang="en-us"'], |
|
46 | ('atom', ['xmlns="http://www.w3.org/2005/Atom"', 'xml:lang="en-us"'], | |
47 | "application/atom+xml"), |
|
47 | "application/atom+xml"), | |
@@ -116,7 +116,7 b' class TestFeedView(TestController):' | |||||
116 | self, backend, user_util, feed_type): |
|
116 | self, backend, user_util, feed_type): | |
117 | user = user_util.create_user() |
|
117 | user = user_util.create_user() | |
118 | auth_token = AuthTokenModel().create( |
|
118 | auth_token = AuthTokenModel().create( | |
119 | user.user_id, 'test-token', -1, AuthTokenModel.cls.ROLE_API) |
|
119 | user.user_id, u'test-token', -1, AuthTokenModel.cls.ROLE_API) | |
120 | auth_token = auth_token.api_key |
|
120 | auth_token = auth_token.api_key | |
121 |
|
121 | |||
122 | self.app.get( |
|
122 | self.app.get( | |
@@ -127,7 +127,7 b' class TestFeedView(TestController):' | |||||
127 | status=302) |
|
127 | status=302) | |
128 |
|
128 | |||
129 | auth_token = AuthTokenModel().create( |
|
129 | auth_token = AuthTokenModel().create( | |
130 | user.user_id, 'test-token', -1, AuthTokenModel.cls.ROLE_FEED) |
|
130 | user.user_id, u'test-token', -1, AuthTokenModel.cls.ROLE_FEED) | |
131 | auth_token = auth_token.api_key |
|
131 | auth_token = auth_token.api_key | |
132 | self.app.get( |
|
132 | self.app.get( | |
133 | route_path( |
|
133 | route_path( |
@@ -23,6 +23,7 b' import os' | |||||
23 | import mock |
|
23 | import mock | |
24 | import pytest |
|
24 | import pytest | |
25 |
|
25 | |||
|
26 | from rhodecode.apps.repository.tests.test_repo_compare import ComparePage | |||
26 | from rhodecode.apps.repository.views.repo_files import RepoFilesView |
|
27 | from rhodecode.apps.repository.views.repo_files import RepoFilesView | |
27 | from rhodecode.lib import helpers as h |
|
28 | from rhodecode.lib import helpers as h | |
28 | from rhodecode.lib.compat import OrderedDict |
|
29 | from rhodecode.lib.compat import OrderedDict | |
@@ -616,8 +617,11 b' class TestFilesDiff(object):' | |||||
616 | }) |
|
617 | }) | |
617 | # use redirect since this is OLD view redirecting to compare page |
|
618 | # use redirect since this is OLD view redirecting to compare page | |
618 | response = response.follow() |
|
619 | response = response.follow() | |
619 | response.mustcontain(' |
|
620 | response.mustcontain('Collapse 1 commit') | |
620 | response.mustcontain('1 file changed: 0 inserted, 0 deleted') |
|
621 | file_changes = (1, 0, 0) | |
|
622 | ||||
|
623 | compare_page = ComparePage(response) | |||
|
624 | compare_page.contains_change_summary(*file_changes) | |||
621 |
|
625 | |||
622 | if backend.alias == 'svn': |
|
626 | if backend.alias == 'svn': | |
623 | response.mustcontain('new file 10644') |
|
627 | response.mustcontain('new file 10644') |
@@ -130,7 +130,6 b' class TestRepoForkViewTests(TestControll' | |||||
130 | 'repo_type': backend.alias, |
|
130 | 'repo_type': backend.alias, | |
131 | 'description': description, |
|
131 | 'description': description, | |
132 | 'private': 'False', |
|
132 | 'private': 'False', | |
133 | 'landing_rev': 'rev:tip', |
|
|||
134 | 'csrf_token': csrf_token, |
|
133 | 'csrf_token': csrf_token, | |
135 | } |
|
134 | } | |
136 |
|
135 | |||
@@ -159,7 +158,6 b' class TestRepoForkViewTests(TestControll' | |||||
159 | 'repo_type': backend.alias, |
|
158 | 'repo_type': backend.alias, | |
160 | 'description': description, |
|
159 | 'description': description, | |
161 | 'private': 'False', |
|
160 | 'private': 'False', | |
162 | 'landing_rev': 'rev:tip', |
|
|||
163 | 'csrf_token': csrf_token, |
|
161 | 'csrf_token': csrf_token, | |
164 | } |
|
162 | } | |
165 | self.app.post( |
|
163 | self.app.post( | |
@@ -172,8 +170,8 b' class TestRepoForkViewTests(TestControll' | |||||
172 | route_path('repo_creating_check', repo_name=fork_name)) |
|
170 | route_path('repo_creating_check', repo_name=fork_name)) | |
173 | # test if we have a message that fork is ok |
|
171 | # test if we have a message that fork is ok | |
174 | assert_session_flash(response, |
|
172 | assert_session_flash(response, | |
175 | 'Forked repository %s as <a href="/%s">%s</a>' |
|
173 | 'Forked repository %s as <a href="/%s">%s</a>' % ( | |
176 |
|
|
174 | repo_name, fork_name, fork_name)) | |
177 |
|
175 | |||
178 | # test if the fork was created in the database |
|
176 | # test if the fork was created in the database | |
179 | fork_repo = Session().query(Repository)\ |
|
177 | fork_repo = Session().query(Repository)\ | |
@@ -205,7 +203,6 b' class TestRepoForkViewTests(TestControll' | |||||
205 | 'repo_type': backend.alias, |
|
203 | 'repo_type': backend.alias, | |
206 | 'description': description, |
|
204 | 'description': description, | |
207 | 'private': 'False', |
|
205 | 'private': 'False', | |
208 | 'landing_rev': 'rev:tip', |
|
|||
209 | 'csrf_token': csrf_token, |
|
206 | 'csrf_token': csrf_token, | |
210 | } |
|
207 | } | |
211 | self.app.post( |
|
208 | self.app.post( | |
@@ -218,8 +215,8 b' class TestRepoForkViewTests(TestControll' | |||||
218 | route_path('repo_creating_check', repo_name=fork_name_full)) |
|
215 | route_path('repo_creating_check', repo_name=fork_name_full)) | |
219 | # test if we have a message that fork is ok |
|
216 | # test if we have a message that fork is ok | |
220 | assert_session_flash(response, |
|
217 | assert_session_flash(response, | |
221 | 'Forked repository %s as <a href="/%s">%s</a>' |
|
218 | 'Forked repository %s as <a href="/%s">%s</a>' % ( | |
222 |
|
|
219 | repo_name, fork_name_full, fork_name_full)) | |
223 |
|
220 | |||
224 | # test if the fork was created in the database |
|
221 | # test if the fork was created in the database | |
225 | fork_repo = Session().query(Repository)\ |
|
222 | fork_repo = Session().query(Repository)\ |
@@ -84,7 +84,7 b' class TestRepoIssueTracker(object):' | |||||
84 | extra_environ=xhr_header, params=data) |
|
84 | extra_environ=xhr_header, params=data) | |
85 |
|
85 | |||
86 | assert response.body == \ |
|
86 | assert response.body == \ | |
87 | 'example of <a class="issue-tracker-link" href="http://url">prefix</a> replacement' |
|
87 | 'example of <a class="tooltip issue-tracker-link" href="http://url" title="description">prefix</a> replacement' | |
88 |
|
88 | |||
89 | @request.addfinalizer |
|
89 | @request.addfinalizer | |
90 | def cleanup(): |
|
90 | def cleanup(): | |
@@ -125,7 +125,7 b' class TestRepoIssueTracker(object):' | |||||
125 | self.settings_model.delete_entries(self.uid) |
|
125 | self.settings_model.delete_entries(self.uid) | |
126 |
|
126 | |||
127 | def test_delete_issuetracker_pattern( |
|
127 | def test_delete_issuetracker_pattern( | |
128 | self, autologin_user, backend, csrf_token, settings_util): |
|
128 | self, autologin_user, backend, csrf_token, settings_util, xhr_header): | |
129 | repo = backend.create_repo() |
|
129 | repo = backend.create_repo() | |
130 | repo_name = repo.repo_name |
|
130 | repo_name = repo.repo_name | |
131 | entry_key = 'issuetracker_pat_' |
|
131 | entry_key = 'issuetracker_pat_' | |
@@ -141,8 +141,9 b' class TestRepoIssueTracker(object):' | |||||
141 | repo_name=backend.repo.repo_name), |
|
141 | repo_name=backend.repo.repo_name), | |
142 | { |
|
142 | { | |
143 | 'uid': uid, |
|
143 | 'uid': uid, | |
144 | 'csrf_token': csrf_token |
|
144 | 'csrf_token': csrf_token, | |
145 | }, status=302) |
|
145 | '': '' | |
|
146 | }, extra_environ=xhr_header, status=200) | |||
146 | settings = IssueTrackerSettingsModel( |
|
147 | settings = IssueTrackerSettingsModel( | |
147 | repo=Repository.get_by_repo_name(repo_name)).get_repo_settings() |
|
148 | repo=Repository.get_by_repo_name(repo_name)).get_repo_settings() | |
148 | assert 'rhodecode_%s%s' % (entry_key, uid) not in settings |
|
149 | assert 'rhodecode_%s%s' % (entry_key, uid) not in settings |
@@ -62,7 +62,7 b' class TestRepoPermissionsView(object):' | |||||
62 | route_path('edit_repo_perms', |
|
62 | route_path('edit_repo_perms', | |
63 | repo_name=repo_name), form_data).follow() |
|
63 | repo_name=repo_name), form_data).follow() | |
64 |
|
64 | |||
65 | assert 'Repository permissions updated' in response |
|
65 | assert 'Repository access permissions updated' in response | |
66 |
|
66 | |||
67 | # revoke given |
|
67 | # revoke given | |
68 | form_data = permission_update_data_generator( |
|
68 | form_data = permission_update_data_generator( | |
@@ -74,4 +74,4 b' class TestRepoPermissionsView(object):' | |||||
74 | route_path('edit_repo_perms', |
|
74 | route_path('edit_repo_perms', | |
75 | repo_name=repo_name), form_data).follow() |
|
75 | repo_name=repo_name), form_data).follow() | |
76 |
|
76 | |||
77 | assert 'Repository permissions updated' in response |
|
77 | assert 'Repository access permissions updated' in response |
@@ -101,12 +101,11 b' class TestPullrequestsView(object):' | |||||
101 | for commit_id in pull_request.revisions: |
|
101 | for commit_id in pull_request.revisions: | |
102 | response.mustcontain(commit_id) |
|
102 | response.mustcontain(commit_id) | |
103 |
|
103 | |||
104 |
|
|
104 | response.mustcontain(pull_request.target_ref_parts.type) | |
105 |
|
|
105 | response.mustcontain(pull_request.target_ref_parts.name) | |
106 | target_clone_url = pull_request.target_repo.clone_url() |
|
|||
107 | assert target_clone_url in response |
|
|||
108 |
|
106 | |||
109 |
|
|
107 | response.mustcontain('class="pull-request-merge"') | |
|
108 | ||||
110 | if pr_merge_enabled: |
|
109 | if pr_merge_enabled: | |
111 | response.mustcontain('Pull request reviewer approval is pending') |
|
110 | response.mustcontain('Pull request reviewer approval is pending') | |
112 | else: |
|
111 | else: | |
@@ -261,7 +260,7 b' class TestPullrequestsView(object):' | |||||
261 | True, True, '', MergeFailureReason.MISSING_TARGET_REF, |
|
260 | True, True, '', MergeFailureReason.MISSING_TARGET_REF, | |
262 | metadata={'target_ref': PullRequest.unicode_to_reference(unicode_reference)}) |
|
261 | metadata={'target_ref': PullRequest.unicode_to_reference(unicode_reference)}) | |
263 | response.assert_response().element_contains( |
|
262 | response.assert_response().element_contains( | |
264 | ' |
|
263 | 'div[data-role="merge-message"]', merge_resp.merge_status_message) | |
265 |
|
264 | |||
266 | def test_comment_and_close_pull_request_custom_message_approved( |
|
265 | def test_comment_and_close_pull_request_custom_message_approved( | |
267 | self, pr_util, csrf_token, xhr_header): |
|
266 | self, pr_util, csrf_token, xhr_header): | |
@@ -284,7 +283,7 b' class TestPullrequestsView(object):' | |||||
284 | journal = UserLog.query()\ |
|
283 | journal = UserLog.query()\ | |
285 | .filter(UserLog.user_id == author)\ |
|
284 | .filter(UserLog.user_id == author)\ | |
286 | .filter(UserLog.repository_id == repo) \ |
|
285 | .filter(UserLog.repository_id == repo) \ | |
287 | .order_by( |
|
286 | .order_by(UserLog.user_log_id.asc()) \ | |
288 | .all() |
|
287 | .all() | |
289 | assert journal[-1].action == 'repo.pull_request.close' |
|
288 | assert journal[-1].action == 'repo.pull_request.close' | |
290 |
|
289 | |||
@@ -323,7 +322,7 b' class TestPullrequestsView(object):' | |||||
323 |
|
322 | |||
324 | journal = UserLog.query()\ |
|
323 | journal = UserLog.query()\ | |
325 | .filter(UserLog.user_id == author, UserLog.repository_id == repo) \ |
|
324 | .filter(UserLog.user_id == author, UserLog.repository_id == repo) \ | |
326 | .order_by( |
|
325 | .order_by(UserLog.user_log_id.asc()) \ | |
327 | .all() |
|
326 | .all() | |
328 | assert journal[-1].action == 'repo.pull_request.close' |
|
327 | assert journal[-1].action == 'repo.pull_request.close' | |
329 |
|
328 | |||
@@ -467,7 +466,7 b' class TestPullrequestsView(object):' | |||||
467 | .filter(Notification.created_by == pull_request.author.user_id, |
|
466 | .filter(Notification.created_by == pull_request.author.user_id, | |
468 | Notification.type_ == Notification.TYPE_PULL_REQUEST, |
|
467 | Notification.type_ == Notification.TYPE_PULL_REQUEST, | |
469 | Notification.subject.contains( |
|
468 | Notification.subject.contains( | |
470 | " |
|
469 | "requested a pull request review. !%s" % pull_request_id)) | |
471 | assert len(notifications.all()) == 1 |
|
470 | assert len(notifications.all()) == 1 | |
472 |
|
471 | |||
473 | # Change reviewers and check that a notification was made |
|
472 | # Change reviewers and check that a notification was made | |
@@ -536,9 +535,9 b' class TestPullrequestsView(object):' | |||||
536 |
|
535 | |||
537 | # Check generated diff contents |
|
536 | # Check generated diff contents | |
538 | response = response.follow() |
|
537 | response = response.follow() | |
539 | assert 'content_of_ancestor' not in response.body |
|
538 | response.mustcontain(no=['content_of_ancestor']) | |
540 |
|
|
539 | response.mustcontain(no=['content_of_ancestor-child']) | |
541 |
|
|
540 | response.mustcontain('content_of_change') | |
542 |
|
541 | |||
543 | def test_merge_pull_request_enabled(self, pr_util, csrf_token): |
|
542 | def test_merge_pull_request_enabled(self, pr_util, csrf_token): | |
544 | # Clear any previous calls to rcextensions |
|
543 | # Clear any previous calls to rcextensions | |
@@ -549,11 +548,10 b' class TestPullrequestsView(object):' | |||||
549 | pull_request_id = pull_request.pull_request_id |
|
548 | pull_request_id = pull_request.pull_request_id | |
550 | repo_name = pull_request.target_repo.scm_instance().name, |
|
549 | repo_name = pull_request.target_repo.scm_instance().name, | |
551 |
|
550 | |||
552 | response = self.app.post( |
|
551 | url = route_path('pullrequest_merge', | |
553 | route_path('pullrequest_merge', |
|
|||
554 | repo_name=str(repo_name[0]), |
|
552 | repo_name=str(repo_name[0]), | |
555 | pull_request_id=pull_request_id) |
|
553 | pull_request_id=pull_request_id) | |
556 |
|
|
554 | response = self.app.post(url, params={'csrf_token': csrf_token}).follow() | |
557 |
|
555 | |||
558 | pull_request = PullRequest.get(pull_request_id) |
|
556 | pull_request = PullRequest.get(pull_request_id) | |
559 |
|
557 | |||
@@ -563,7 +561,7 b' class TestPullrequestsView(object):' | |||||
563 | pull_request, ChangesetStatus.STATUS_APPROVED) |
|
561 | pull_request, ChangesetStatus.STATUS_APPROVED) | |
564 |
|
562 | |||
565 | # Check the relevant log entries were added |
|
563 | # Check the relevant log entries were added | |
566 | user_logs = UserLog.query().order_by( |
|
564 | user_logs = UserLog.query().order_by(UserLog.user_log_id.desc()).limit(3) | |
567 | actions = [log.action for log in user_logs] |
|
565 | actions = [log.action for log in user_logs] | |
568 | pr_commit_ids = PullRequestModel()._get_commit_ids(pull_request) |
|
566 | pr_commit_ids = PullRequestModel()._get_commit_ids(pull_request) | |
569 | expected_actions = [ |
|
567 | expected_actions = [ | |
@@ -573,7 +571,7 b' class TestPullrequestsView(object):' | |||||
573 | ] |
|
571 | ] | |
574 | assert actions == expected_actions |
|
572 | assert actions == expected_actions | |
575 |
|
573 | |||
576 | user_logs = UserLog.query().order_by( |
|
574 | user_logs = UserLog.query().order_by(UserLog.user_log_id.desc()).limit(4) | |
577 | actions = [log for log in user_logs] |
|
575 | actions = [log for log in user_logs] | |
578 | assert actions[-1].action == 'user.push' |
|
576 | assert actions[-1].action == 'user.push' | |
579 | assert actions[-1].action_data['commit_ids'] == pr_commit_ids |
|
577 | assert actions[-1].action_data['commit_ids'] == pr_commit_ids | |
@@ -690,8 +688,8 b' class TestPullrequestsView(object):' | |||||
690 | pull_request_id=pull_request.pull_request_id)) |
|
688 | pull_request_id=pull_request.pull_request_id)) | |
691 |
|
689 | |||
692 | assert response.status_int == 200 |
|
690 | assert response.status_int == 200 | |
693 |
|
|
691 | response.mustcontain('Pull request updated to') | |
694 |
|
|
692 | response.mustcontain('with 1 added, 0 removed commits.') | |
695 |
|
693 | |||
696 | # check that we have now both revisions |
|
694 | # check that we have now both revisions | |
697 | pull_request = PullRequest.get(pull_request_id) |
|
695 | pull_request = PullRequest.get(pull_request_id) | |
@@ -735,10 +733,10 b' class TestPullrequestsView(object):' | |||||
735 | backend.pull_heads(source, heads=['change-rebased']) |
|
733 | backend.pull_heads(source, heads=['change-rebased']) | |
736 |
|
734 | |||
737 | # update PR |
|
735 | # update PR | |
738 | self.app.post( |
|
736 | url = route_path('pullrequest_update', | |
739 | route_path('pullrequest_update', |
|
|||
740 | repo_name=target.repo_name, |
|
737 | repo_name=target.repo_name, | |
741 | pull_request_id=pull_request_id) |
|
738 | pull_request_id=pull_request_id) | |
|
739 | self.app.post(url, | |||
742 | params={'update_commits': 'true', 'csrf_token': csrf_token}, |
|
740 | params={'update_commits': 'true', 'csrf_token': csrf_token}, | |
743 | status=200) |
|
741 | status=200) | |
744 |
|
742 | |||
@@ -753,8 +751,8 b' class TestPullrequestsView(object):' | |||||
753 | repo_name=target.repo_name, |
|
751 | repo_name=target.repo_name, | |
754 | pull_request_id=pull_request.pull_request_id)) |
|
752 | pull_request_id=pull_request.pull_request_id)) | |
755 | assert response.status_int == 200 |
|
753 | assert response.status_int == 200 | |
756 |
|
|
754 | response.mustcontain('Pull request updated to') | |
757 |
|
|
755 | response.mustcontain('with 1 added, 1 removed commits.') | |
758 |
|
756 | |||
759 | def test_update_target_revision_with_removal_of_1_commit_git(self, backend_git, csrf_token): |
|
757 | def test_update_target_revision_with_removal_of_1_commit_git(self, backend_git, csrf_token): | |
760 | backend = backend_git |
|
758 | backend = backend_git | |
@@ -801,10 +799,10 b' class TestPullrequestsView(object):' | |||||
801 | vcsrepo.run_git_command(['reset', '--soft', 'HEAD~2']) |
|
799 | vcsrepo.run_git_command(['reset', '--soft', 'HEAD~2']) | |
802 |
|
800 | |||
803 | # update PR |
|
801 | # update PR | |
804 | self.app.post( |
|
802 | url = route_path('pullrequest_update', | |
805 | route_path('pullrequest_update', |
|
|||
806 | repo_name=target.repo_name, |
|
803 | repo_name=target.repo_name, | |
807 | pull_request_id=pull_request_id) |
|
804 | pull_request_id=pull_request_id) | |
|
805 | self.app.post(url, | |||
808 | params={'update_commits': 'true', 'csrf_token': csrf_token}, |
|
806 | params={'update_commits': 'true', 'csrf_token': csrf_token}, | |
809 | status=200) |
|
807 | status=200) | |
810 |
|
808 | |||
@@ -871,6 +869,7 b' class TestPullrequestsView(object):' | |||||
871 | {'message': 'new-feature', 'branch': branch_name}, |
|
869 | {'message': 'new-feature', 'branch': branch_name}, | |
872 | ] |
|
870 | ] | |
873 | repo = backend_git.create_repo(commits) |
|
871 | repo = backend_git.create_repo(commits) | |
|
872 | repo_name = repo.repo_name | |||
874 | commit_ids = backend_git.commit_ids |
|
873 | commit_ids = backend_git.commit_ids | |
875 |
|
874 | |||
876 | pull_request = PullRequest() |
|
875 | pull_request = PullRequest() | |
@@ -888,13 +887,15 b' class TestPullrequestsView(object):' | |||||
888 | Session().add(pull_request) |
|
887 | Session().add(pull_request) | |
889 | Session().commit() |
|
888 | Session().commit() | |
890 |
|
889 | |||
|
890 | pull_request_id = pull_request.pull_request_id | |||
|
891 | ||||
891 | vcs = repo.scm_instance() |
|
892 | vcs = repo.scm_instance() | |
892 | vcs.remove_ref('refs/heads/{}'.format(branch_name)) |
|
893 | vcs.remove_ref('refs/heads/{}'.format(branch_name)) | |
893 |
|
894 | |||
894 | response = self.app.get(route_path( |
|
895 | response = self.app.get(route_path( | |
895 | 'pullrequest_show', |
|
896 | 'pullrequest_show', | |
896 | repo_name= |
|
897 | repo_name=repo_name, | |
897 | pull_request_id= |
|
898 | pull_request_id=pull_request_id)) | |
898 |
|
899 | |||
899 | assert response.status_int == 200 |
|
900 | assert response.status_int == 200 | |
900 |
|
901 | |||
@@ -958,15 +959,15 b' class TestPullrequestsView(object):' | |||||
958 | else: |
|
959 | else: | |
959 | vcs.strip(pr_util.commit_ids['new-feature']) |
|
960 | vcs.strip(pr_util.commit_ids['new-feature']) | |
960 |
|
961 | |||
961 | response = self.app.post( |
|
962 | url = route_path('pullrequest_update', | |
962 | route_path('pullrequest_update', |
|
|||
963 | repo_name=pull_request.target_repo.repo_name, |
|
963 | repo_name=pull_request.target_repo.repo_name, | |
964 | pull_request_id=pull_request.pull_request_id) |
|
964 | pull_request_id=pull_request.pull_request_id) | |
|
965 | response = self.app.post(url, | |||
965 | params={'update_commits': 'true', |
|
966 | params={'update_commits': 'true', | |
966 | 'csrf_token': csrf_token}) |
|
967 | 'csrf_token': csrf_token}) | |
967 |
|
968 | |||
968 | assert response.status_int == 200 |
|
969 | assert response.status_int == 200 | |
969 | assert response.body == 'true' |
|
970 | assert response.body == '{"response": true, "redirect_url": null}' | |
970 |
|
971 | |||
971 | # Make sure that after update, it won't raise 500 errors |
|
972 | # Make sure that after update, it won't raise 500 errors | |
972 | response = self.app.get(route_path( |
|
973 | response = self.app.get(route_path( | |
@@ -992,12 +993,13 b' class TestPullrequestsView(object):' | |||||
992 | pull_request_id=pull_request.pull_request_id)) |
|
993 | pull_request_id=pull_request.pull_request_id)) | |
993 | assert response.status_int == 200 |
|
994 | assert response.status_int == 200 | |
994 |
|
995 | |||
995 |
|
|
996 | source = response.assert_response().get_element('.pr-source-info') | |
996 |
|
|
997 | source_parent = source.getparent() | |
997 | assert len( |
|
998 | assert len(source_parent) == 1 | |
998 | target = response.assert_response().get_element('.pr-targetinfo .tag') |
|
999 | ||
999 | target_children = target.getchildren() |
|
1000 | target = response.assert_response().get_element('.pr-target-info') | |
1000 | assert len(target_children) == 1 |
|
1001 | target_parent = target.getparent() | |
|
1002 | assert len(target_parent) == 1 | |||
1001 |
|
1003 | |||
1002 | expected_origin_link = route_path( |
|
1004 | expected_origin_link = route_path( | |
1003 | 'repo_commits', |
|
1005 | 'repo_commits', | |
@@ -1007,10 +1009,8 b' class TestPullrequestsView(object):' | |||||
1007 | 'repo_commits', |
|
1009 | 'repo_commits', | |
1008 | repo_name=pull_request.target_repo.scm_instance().name, |
|
1010 | repo_name=pull_request.target_repo.scm_instance().name, | |
1009 | params=dict(branch='target')) |
|
1011 | params=dict(branch='target')) | |
1010 | assert |
|
1012 | assert source_parent.attrib['href'] == expected_origin_link | |
1011 | assert origin_children[0].text == 'branch: origin' |
|
1013 | assert target_parent.attrib['href'] == expected_target_link | |
1012 | assert target_children[0].attrib['href'] == expected_target_link |
|
|||
1013 | assert target_children[0].text == 'branch: target' |
|
|||
1014 |
|
1014 | |||
1015 | def test_bookmark_is_not_a_link(self, pr_util): |
|
1015 | def test_bookmark_is_not_a_link(self, pr_util): | |
1016 | pull_request = pr_util.create_pull_request() |
|
1016 | pull_request = pr_util.create_pull_request() | |
@@ -1025,13 +1025,13 b' class TestPullrequestsView(object):' | |||||
1025 | pull_request_id=pull_request.pull_request_id)) |
|
1025 | pull_request_id=pull_request.pull_request_id)) | |
1026 | assert response.status_int == 200 |
|
1026 | assert response.status_int == 200 | |
1027 |
|
1027 | |||
1028 |
|
|
1028 | source = response.assert_response().get_element('.pr-source-info') | |
1029 | assert |
|
1029 | assert source.text.strip() == 'bookmark:origin' | |
1030 | assert origin.getchildren() == [] |
|
1030 | assert source.getparent().attrib.get('href') is None | |
1031 |
|
1031 | |||
1032 |
target = response.assert_response().get_element('.pr-targetinfo |
|
1032 | target = response.assert_response().get_element('.pr-target-info') | |
1033 |
assert target.text.strip() == 'bookmark: |
|
1033 | assert target.text.strip() == 'bookmark:target' | |
1034 |
assert target.get |
|
1034 | assert target.getparent().attrib.get('href') is None | |
1035 |
|
1035 | |||
1036 | def test_tag_is_not_a_link(self, pr_util): |
|
1036 | def test_tag_is_not_a_link(self, pr_util): | |
1037 | pull_request = pr_util.create_pull_request() |
|
1037 | pull_request = pr_util.create_pull_request() | |
@@ -1046,13 +1046,13 b' class TestPullrequestsView(object):' | |||||
1046 | pull_request_id=pull_request.pull_request_id)) |
|
1046 | pull_request_id=pull_request.pull_request_id)) | |
1047 | assert response.status_int == 200 |
|
1047 | assert response.status_int == 200 | |
1048 |
|
1048 | |||
1049 |
|
|
1049 | source = response.assert_response().get_element('.pr-source-info') | |
1050 | assert |
|
1050 | assert source.text.strip() == 'tag:origin' | |
1051 | assert origin.getchildren() == [] |
|
1051 | assert source.getparent().attrib.get('href') is None | |
1052 |
|
1052 | |||
1053 | target = response.assert_response().get_element('.pr-targetinfo |
|
1053 | target = response.assert_response().get_element('.pr-target-info') | |
1054 | assert target.text.strip() == 'tag: |
|
1054 | assert target.text.strip() == 'tag:target' | |
1055 | assert target.get |
|
1055 | assert target.getparent().attrib.get('href') is None | |
1056 |
|
1056 | |||
1057 | @pytest.mark.parametrize('mergeable', [True, False]) |
|
1057 | @pytest.mark.parametrize('mergeable', [True, False]) | |
1058 | def test_shadow_repository_link( |
|
1058 | def test_shadow_repository_link( | |
@@ -1205,14 +1205,11 b' class TestPullrequestsControllerDelete(o' | |||||
1205 |
|
1205 | |||
1206 |
|
1206 | |||
1207 | def assert_pull_request_status(pull_request, expected_status): |
|
1207 | def assert_pull_request_status(pull_request, expected_status): | |
1208 | status = ChangesetStatusModel().calculated_review_status( |
|
1208 | status = ChangesetStatusModel().calculated_review_status(pull_request=pull_request) | |
1209 | pull_request=pull_request) |
|
|||
1210 | assert status == expected_status |
|
1209 | assert status == expected_status | |
1211 |
|
1210 | |||
1212 |
|
1211 | |||
1213 | @pytest.mark.parametrize('route', ['pullrequest_new', 'pullrequest_create']) |
|
1212 | @pytest.mark.parametrize('route', ['pullrequest_new', 'pullrequest_create']) | |
1214 | @pytest.mark.usefixtures("autologin_user") |
|
1213 | @pytest.mark.usefixtures("autologin_user") | |
1215 | def test_forbidde_to_repo_summary_for_svn_repositories(backend_svn, app, route): |
|
1214 | def test_forbidde_to_repo_summary_for_svn_repositories(backend_svn, app, route): | |
1216 | response = app.get( |
|
1215 | app.get(route_path(route, repo_name=backend_svn.repo_name), status=404) | |
1217 | route_path(route, repo_name=backend_svn.repo_name), status=404) |
|
|||
1218 |
|
@@ -236,7 +236,7 b' class TestSummaryView(object):' | |||||
236 | with scm_patcher: |
|
236 | with scm_patcher: | |
237 | response = self.app.get( |
|
237 | response = self.app.get( | |
238 | route_path('repo_summary', repo_name=repo_name)) |
|
238 | route_path('repo_summary', repo_name=repo_name)) | |
239 | assert_response = |
|
239 | assert_response = response.assert_response() | |
240 | assert_response.element_contains( |
|
240 | assert_response.element_contains( | |
241 | '.main .alert-warning strong', 'Missing requirements') |
|
241 | '.main .alert-warning strong', 'Missing requirements') | |
242 | assert_response.element_contains( |
|
242 | assert_response.element_contains( | |
@@ -327,7 +327,7 b' def summary_view(context_stub, request_s' | |||||
327 | @pytest.mark.usefixtures('app') |
|
327 | @pytest.mark.usefixtures('app') | |
328 | class TestCreateReferenceData(object): |
|
328 | class TestCreateReferenceData(object): | |
329 |
|
329 | |||
330 | @pytest.fixture |
|
330 | @pytest.fixture() | |
331 | def example_refs(self): |
|
331 | def example_refs(self): | |
332 | section_1_refs = OrderedDict((('a', 'a_id'), ('b', 'b_id'))) |
|
332 | section_1_refs = OrderedDict((('a', 'a_id'), ('b', 'b_id'))) | |
333 | example_refs = [ |
|
333 | example_refs = [ |
@@ -98,7 +98,7 b' class TestVcsSettings(object):' | |||||
98 | repo_name = backend.repo_name |
|
98 | repo_name = backend.repo_name | |
99 | response = self.app.get(route_path('edit_repo_vcs', repo_name=repo_name)) |
|
99 | response = self.app.get(route_path('edit_repo_vcs', repo_name=repo_name)) | |
100 |
|
100 | |||
101 | assert_response = |
|
101 | assert_response = response.assert_response() | |
102 | element = assert_response.get_element('#inherit_global_settings') |
|
102 | element = assert_response.get_element('#inherit_global_settings') | |
103 | assert element.checked |
|
103 | assert element.checked | |
104 |
|
104 | |||
@@ -111,7 +111,7 b' class TestVcsSettings(object):' | |||||
111 | repo, 'inherit_vcs_settings', checked_value, 'bool') |
|
111 | repo, 'inherit_vcs_settings', checked_value, 'bool') | |
112 | response = self.app.get(route_path('edit_repo_vcs', repo_name=repo_name)) |
|
112 | response = self.app.get(route_path('edit_repo_vcs', repo_name=repo_name)) | |
113 |
|
113 | |||
114 | assert_response = |
|
114 | assert_response = response.assert_response() | |
115 | element = assert_response.get_element('#inherit_global_settings') |
|
115 | element = assert_response.get_element('#inherit_global_settings') | |
116 | assert element.checked == checked_value |
|
116 | assert element.checked == checked_value | |
117 |
|
117 | |||
@@ -491,7 +491,7 b' class TestVcsSettings(object):' | |||||
491 |
|
491 | |||
492 | response = self.app.get( |
|
492 | response = self.app.get( | |
493 | route_path('edit_repo_vcs', repo_name=repo_name), status=200) |
|
493 | route_path('edit_repo_vcs', repo_name=repo_name), status=200) | |
494 | assert_response = |
|
494 | assert_response = response.assert_response() | |
495 | for branch in branches: |
|
495 | for branch in branches: | |
496 | css_selector = '[name=branch_value_{}]'.format(branch.ui_id) |
|
496 | css_selector = '[name=branch_value_{}]'.format(branch.ui_id) | |
497 | element = assert_response.get_element(css_selector) |
|
497 | element = assert_response.get_element(css_selector) | |
@@ -668,7 +668,7 b' class TestVcsSettings(object):' | |||||
668 | Session().commit() |
|
668 | Session().commit() | |
669 |
|
669 | |||
670 | def assert_repo_value_equals_global_value(self, response, setting): |
|
670 | def assert_repo_value_equals_global_value(self, response, setting): | |
671 | assert_response = |
|
671 | assert_response = response.assert_response() | |
672 | global_css_selector = '[name={}_inherited]'.format(setting) |
|
672 | global_css_selector = '[name={}_inherited]'.format(setting) | |
673 | repo_css_selector = '[name={}]'.format(setting) |
|
673 | repo_css_selector = '[name={}]'.format(setting) | |
674 | repo_element = assert_response.get_element(repo_css_selector) |
|
674 | repo_element = assert_response.get_element(repo_css_selector) |
@@ -59,7 +59,7 b' class TestAdminRepoVcsSettings(object):' | |||||
59 | rhodecode.CONFIG, {'labs_settings_active': 'true'}): |
|
59 | rhodecode.CONFIG, {'labs_settings_active': 'true'}): | |
60 | response = self.app.get(vcs_settings_url) |
|
60 | response = self.app.get(vcs_settings_url) | |
61 |
|
61 | |||
62 |
assertr = |
|
62 | assertr = response.assert_response() | |
63 | assertr.one_element_exists('#rhodecode_{}'.format(setting_name)) |
|
63 | assertr.one_element_exists('#rhodecode_{}'.format(setting_name)) | |
64 |
|
64 | |||
65 | @pytest.mark.parametrize('setting_name, setting_backends', [ |
|
65 | @pytest.mark.parametrize('setting_name, setting_backends', [ |
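
The test hunks above all make the same move: instead of wrapping the response in a separate assertion helper, the tests now ask the response for one via response.assert_response() and then call get_element(), element_contains(), or one_element_exists() on the result. Below is a minimal, self-contained sketch of that pattern; ResponseAsserter and FakeResponse are illustrative stand-ins for this example only (not RhodeCode's classes), and it assumes lxml with cssselect is installed:

from lxml import html  # assumption: lxml + cssselect are available


class ResponseAsserter(object):
    """Illustrative stand-in for the object returned by assert_response()."""

    def __init__(self, body):
        self.doc = html.fromstring(body)

    def get_element(self, css_selector):
        # exactly one match expected, mirroring how the tests use it
        elements = self.doc.cssselect(css_selector)
        assert len(elements) == 1, 'expected one element for %r' % css_selector
        return elements[0]

    def one_element_exists(self, css_selector):
        return self.get_element(css_selector) is not None

    def element_contains(self, css_selector, text):
        assert text in self.get_element(css_selector).text_content()


class FakeResponse(object):
    """Illustrative stand-in for the test-client response object."""

    def __init__(self, body):
        self.body = body

    def assert_response(self):
        # the pattern the diffs switch to: the response hands out its asserter
        return ResponseAsserter(self.body)


response = FakeResponse(
    '<div class="main"><div class="alert-warning">'
    '<strong>Missing requirements</strong></div></div>')
assert_response = response.assert_response()
assert_response.element_contains('.main .alert-warning strong', 'Missing requirements')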
@@ -22,6 +22,7 b' import logging'
 from pyramid.view import view_config

 from rhodecode.apps._base import RepoAppView
+from rhodecode.lib.helpers import SqlPage
 from rhodecode.lib import helpers as h
 from rhodecode.lib.auth import LoginRequired, HasRepoPermissionAnyDecorator
 from rhodecode.lib.utils2 import safe_int

@@ -33,8 +34,6 b' log = logging.getLogger(__name__)'
 class AuditLogsView(RepoAppView):
     def load_default_context(self):
         c = self._get_local_tmpl_context()
-
-
         return c

     @LoginRequired()

@@ -54,12 +53,15 b' class AuditLogsView(RepoAppView):'
         filter_term = self.request.GET.get('filter')
         user_log = RepoModel().get_repo_log(c.db_repo, filter_term)

-        def url_generator(
+        def url_generator(page_num):
+            query_params = {
+                'page': page_num
+            }
             if filter_term:
-
-            return self.request.current_route_path(_query=
+                query_params['filter'] = filter_term
+            return self.request.current_route_path(_query=query_params)

-        c.audit_logs =
-            user_log, page=p, items_per_page=10, url=url_generator)
+        c.audit_logs = SqlPage(
+            user_log, page=p, items_per_page=10, url_maker=url_generator)
         c.filter_term = filter_term
         return self._get_template_context(c)
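
The audit-log hunk switches the view from a url= callable to a url_maker= callable: the pager now hands the callable a plain page number, and url_generator builds the query string (page plus optional filter) itself before SqlPage renders the links. The sketch below illustrates that contract under stated assumptions; make_url_generator, PagerStandIn, and the /repo/audit-logs path are made up for this example, while SqlPage itself comes from rhodecode.lib.helpers as the import hunk shows:

from urllib.parse import urlencode  # Python 3 import, used only for this sketch


def make_url_generator(base_path, filter_term=None):
    # mirrors the url_generator(page_num) shape used in AuditLogsView
    def url_generator(page_num):
        query_params = {'page': page_num}
        if filter_term:
            query_params['filter'] = filter_term
        return '%s?%s' % (base_path, urlencode(query_params))
    return url_generator


class PagerStandIn(object):
    """Only the slice of the pager contract this example needs."""

    def __init__(self, items, page, items_per_page, url_maker):
        self.items = list(items)
        self.page = page
        self.items_per_page = items_per_page
        self.url_maker = url_maker

    def page_url(self, page_num):
        # the pager calls url_maker with a plain page number
        return self.url_maker(page_num)


pager = PagerStandIn(
    items=range(100), page=1, items_per_page=10,
    url_maker=make_url_generator('/repo/audit-logs', filter_term='push'))
print(pager.page_url(2))  # -> /repo/audit-logs?page=2&filter=push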
Renamed files (content omitted in this truncated dump):

- rhodecode/templates/admin/gists/edit.mako to rhodecode/templates/admin/gists/gist_edit.mako
- rhodecode/templates/admin/gists/index.mako to rhodecode/templates/admin/gists/gist_index.mako
- rhodecode/templates/admin/gists/new.mako to rhodecode/templates/admin/gists/gist_new.mako
- rhodecode/templates/admin/gists/show.mako to rhodecode/templates/admin/gists/gist_show.mako
The requested commit or file is too big and content was truncated. Show full diff |
1 | NO CONTENT: modified file |
|
NO CONTENT: modified file | ||
The requested commit or file is too big and content was truncated. Show full diff |
1 | NO CONTENT: modified file |
|
NO CONTENT: modified file | ||
The requested commit or file is too big and content was truncated. Show full diff |
1 | NO CONTENT: modified file |
|
NO CONTENT: modified file | ||
The requested commit or file is too big and content was truncated. Show full diff |
1 | NO CONTENT: modified file |
|
NO CONTENT: modified file | ||
The requested commit or file is too big and content was truncated. Show full diff |
1 | NO CONTENT: modified file |
|
NO CONTENT: modified file | ||
The requested commit or file is too big and content was truncated. Show full diff |
1 | NO CONTENT: modified file |
|
NO CONTENT: modified file | ||
The requested commit or file is too big and content was truncated. Show full diff |
1 | NO CONTENT: modified file |
|
NO CONTENT: modified file | ||
The requested commit or file is too big and content was truncated. Show full diff |
1 | NO CONTENT: modified file |
|
NO CONTENT: modified file | ||
The requested commit or file is too big and content was truncated. Show full diff |
1 | NO CONTENT: modified file |
|
NO CONTENT: modified file | ||
The requested commit or file is too big and content was truncated. Show full diff |
1 | NO CONTENT: modified file |
|
NO CONTENT: modified file | ||
The requested commit or file is too big and content was truncated. Show full diff |
1 | NO CONTENT: modified file |
|
NO CONTENT: modified file | ||
The requested commit or file is too big and content was truncated. Show full diff |
1 | NO CONTENT: modified file |
|
NO CONTENT: modified file | ||
The requested commit or file is too big and content was truncated. Show full diff |
1 | NO CONTENT: modified file |
|
NO CONTENT: modified file | ||
The requested commit or file is too big and content was truncated. Show full diff |
1 | NO CONTENT: modified file |
|
NO CONTENT: modified file | ||
The requested commit or file is too big and content was truncated. Show full diff |
1 | NO CONTENT: modified file |
|
NO CONTENT: modified file | ||
The requested commit or file is too big and content was truncated. Show full diff |
1 | NO CONTENT: modified file |
|
NO CONTENT: modified file | ||
The requested commit or file is too big and content was truncated. Show full diff |
1 | NO CONTENT: modified file |
|
NO CONTENT: modified file | ||
The requested commit or file is too big and content was truncated. Show full diff |
1 | NO CONTENT: modified file |
|
NO CONTENT: modified file | ||
The requested commit or file is too big and content was truncated. Show full diff |
1 | NO CONTENT: modified file |
|
NO CONTENT: modified file | ||
The requested commit or file is too big and content was truncated. Show full diff |
1 | NO CONTENT: modified file |
|
NO CONTENT: modified file | ||
The requested commit or file is too big and content was truncated. Show full diff |
1 | NO CONTENT: modified file |
|
NO CONTENT: modified file | ||
The requested commit or file is too big and content was truncated. Show full diff |
1 | NO CONTENT: modified file |
|
NO CONTENT: modified file | ||
The requested commit or file is too big and content was truncated. Show full diff |
1 | NO CONTENT: modified file |
|
NO CONTENT: modified file | ||
The requested commit or file is too big and content was truncated. Show full diff |
1 | NO CONTENT: modified file |
|
NO CONTENT: modified file | ||
The requested commit or file is too big and content was truncated. Show full diff |
1 | NO CONTENT: modified file |
|
NO CONTENT: modified file | ||
The requested commit or file is too big and content was truncated. Show full diff |
1 | NO CONTENT: modified file |
|
NO CONTENT: modified file | ||
The requested commit or file is too big and content was truncated. Show full diff |
1 | NO CONTENT: modified file |
|
NO CONTENT: modified file | ||
The requested commit or file is too big and content was truncated. Show full diff |
1 | NO CONTENT: modified file |
|
NO CONTENT: modified file | ||
The requested commit or file is too big and content was truncated. Show full diff |
1 | NO CONTENT: modified file |
|
NO CONTENT: modified file | ||
The requested commit or file is too big and content was truncated. Show full diff |
1 | NO CONTENT: modified file |
|
NO CONTENT: modified file | ||
The requested commit or file is too big and content was truncated. Show full diff |
1 | NO CONTENT: modified file |
|
NO CONTENT: modified file | ||
The requested commit or file is too big and content was truncated. Show full diff |
1 | NO CONTENT: modified file |
|
NO CONTENT: modified file | ||
The requested commit or file is too big and content was truncated. Show full diff |
1 | NO CONTENT: modified file |
|
NO CONTENT: modified file | ||
The requested commit or file is too big and content was truncated. Show full diff |
1 | NO CONTENT: modified file |
|
NO CONTENT: modified file | ||
The requested commit or file is too big and content was truncated. Show full diff |
1 | NO CONTENT: modified file |
|
NO CONTENT: modified file | ||
The requested commit or file is too big and content was truncated. Show full diff |
1 | NO CONTENT: modified file |
|
NO CONTENT: modified file | ||
The requested commit or file is too big and content was truncated. Show full diff |
1 | NO CONTENT: modified file |
|
NO CONTENT: modified file | ||
The requested commit or file is too big and content was truncated. Show full diff |
1 | NO CONTENT: modified file |
|
NO CONTENT: modified file | ||
The requested commit or file is too big and content was truncated. Show full diff |
1 | NO CONTENT: modified file |
|
NO CONTENT: modified file | ||
The requested commit or file is too big and content was truncated. Show full diff |
1 | NO CONTENT: modified file |
|
NO CONTENT: modified file | ||
The requested commit or file is too big and content was truncated. Show full diff |
1 | NO CONTENT: modified file |
|
NO CONTENT: modified file | ||
The requested commit or file is too big and content was truncated. Show full diff |
1 | NO CONTENT: file was removed |
|
NO CONTENT: file was removed | ||
The requested commit or file is too big and content was truncated. Show full diff |
1 | NO CONTENT: file was removed |
|
NO CONTENT: file was removed | ||
The requested commit or file is too big and content was truncated. Show full diff |
1 | NO CONTENT: file was removed |
|
NO CONTENT: file was removed | ||
The requested commit or file is too big and content was truncated. Show full diff |
General Comments 0
You need to be logged in to leave comments.
Login now