@@ -0,0 +1,137 b'' | |||||

.. _svn-http:

|svn| With Write Over HTTP
^^^^^^^^^^^^^^^^^^^^^^^^^^

To use |svn| with read/write support over the |svn| HTTP protocol, you have to
configure the HTTP |svn| backend.

Prerequisites
=============

- Enable HTTP support inside the admin VCS settings on your |RCE| instance.
- Install the following tools on the machine that is running an instance of
  |RCE|: ``Apache HTTP Server`` and ``mod_dav_svn``.


Using the Ubuntu 14.04 distribution as an example, execute the following:

.. code-block:: bash

    $ sudo apt-get install apache2 libapache2-mod-svn

Once installed, you need to enable ``dav_svn`` and ``headers``:

.. code-block:: bash

    $ sudo a2enmod dav_svn
    $ sudo a2enmod headers

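To verify that both modules are enabled, you can list the loaded Apache
modules; this assumes the Debian/Ubuntu layout used above:

.. code-block:: bash

    $ apache2ctl -M | grep -E 'dav_svn|headers'

If nothing is printed, re-run the ``a2enmod`` commands and restart Apache.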

Configuring Apache Setup
========================

.. tip::

    It is recommended to run Apache on a port other than 80, due to possible
    conflicts with other HTTP servers like nginx. To do this, set the
    ``Listen`` parameter in the ``/etc/apache2/ports.conf`` file, for example
    ``Listen 8090``.


.. warning::

    Make sure the Apache instance which runs the mod_dav_svn module is
    only accessible by |RCE|. Otherwise, anyone would be able to browse
    the repositories or run Subversion operations (checkout/commit/etc.).

It is also recommended to run Apache as the same user as |RCE|; otherwise,
permission issues could occur. To do this, edit the ``/etc/apache2/envvars``
file:

.. code-block:: bash

    export APACHE_RUN_USER=rhodecode
    export APACHE_RUN_GROUP=rhodecode

1. To configure Apache, create and edit a virtual hosts file, for example
   :file:`/etc/apache2/sites-available/default.conf`. Below is an example
   of how to use it with the auto-generated ``mod_dav_svn.conf`` file
   from a configured |RCE| instance.

   .. code-block:: apache

       <VirtualHost *:8090>
           ServerAdmin rhodecode-admin@localhost
           DocumentRoot /var/www/html
           ErrorLog ${APACHE_LOG_DIR}/error.log
           CustomLog ${APACHE_LOG_DIR}/access.log combined
           Include /home/user/.rccontrol/enterprise-1/mod_dav_svn.conf
       </VirtualHost>

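   After enabling the site and reloading Apache, you can check that the
   chosen port answers; the site name, port, and URL below follow the
   example configuration above and may differ on your system:

   .. code-block:: bash

       $ sudo a2ensite default
       $ sudo service apache2 reload
       $ curl -I http://127.0.0.1:8090/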

2. Go to the :menuselection:`Admin --> Settings --> VCS` page, enable
   :guilabel:`Proxy Subversion HTTP requests`, and specify the
   :guilabel:`Subversion HTTP Server URL`.

3. Open the |RCE| configuration file,
   :file:`/home/{user}/.rccontrol/{instance-id}/rhodecode.ini`

4. Add the following configuration options in the ``[app:main]``
   section if you don't have them yet.

   This enables mapping of the created |RCE| repo groups into special
   |svn| paths. Each time a new repository group is created, the system will
   update the template file and create a new mapping. The Apache web server
   needs to be reloaded to pick up the changes in this file.
   To do this, simply configure ``svn.proxy.reload_cmd`` inside the .ini
   file. Example configuration:

   .. code-block:: ini

       ############################################################
       ### Subversion proxy support (mod_dav_svn)               ###
       ### Maps RhodeCode repo groups into SVN paths for Apache ###
       ############################################################
       ## Enable or disable the config file generation.
       svn.proxy.generate_config = true
       ## Generate config file with `SVNListParentPath` set to `On`.
       svn.proxy.list_parent_path = true
       ## Set location and file name of generated config file.
       svn.proxy.config_file_path = %(here)s/mod_dav_svn.conf
       ## Used as a prefix to the <Location> block in the generated config file.
       ## In most cases it should be set to `/`.
       svn.proxy.location_root = /
       ## Command to reload the mod dav svn configuration on change.
       ## Example: `/etc/init.d/apache2 reload`
       svn.proxy.reload_cmd = /etc/init.d/apache2 reload
       ## If the timeout expires before the reload command finishes, the
       ## command will be killed. Setting it to zero means no timeout.
       ## Defaults to 10 seconds.
       #svn.proxy.reload_timeout = 10

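Each repository group is written into the generated file as a ``<Location>``
block prefixed by ``svn.proxy.location_root``. For illustration only, a
hypothetical fragment of the generated file might look like this (the group
name and parent path are made up):

.. code-block:: apache

    <Location "/my-repo-group">
        DAV svn
        SVNParentPath "/home/user/repos/my-repo-group"
        SVNListParentPath On
    </Location>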

This would create a special template file called ``mod_dav_svn.conf``. We
used that file path in the Apache config above inside the ``Include``
statement. It's also possible to generate the config from the
:menuselection:`Admin --> Settings --> VCS` page.


Using |svn|
===========

Once |svn| has been enabled on your instance, you can use it with the
following examples. For more |svn| information, see the `Subversion Red Book`_.

.. code-block:: bash

    # To clone a repository
    svn checkout http://my-svn-server.example.com/my-svn-repo

    # To commit changes
    svn commit


.. _Subversion Red Book: http://svnbook.red-bean.com/en/1.7/svn-book.html#svn.ref.svn
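If Apache listens on a non-default port as suggested above, include the port
in the URL; the hostname, port, and repo path here are illustrative:

.. code-block:: bash

    # Checkout through the Apache mod_dav_svn endpoint on the example port
    svn checkout http://my-svn-server.example.com:8090/my-repo-group/my-svn-repo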
@@ -0,0 +1,17 b'' | |||||

.. _checklist-pull-request:

=======================
Pull Request Checklists
=======================


Checklists for Pull Request
===========================

- Informative description
- Linear commit history
- Rebased on top of latest changes
- Add ticket references, e.g. fixes #123, references #123, etc.
NO CONTENT: new file 100644, binary diff hidden

NO CONTENT: new file 100644, binary diff hidden
@@ -0,0 +1,159 b'' | |||||

|RCE| 4.5.0 |RNS|
-----------------

Release Date
^^^^^^^^^^^^

- 2016-12-02


New Features
^^^^^^^^^^^^

- Diffs: re-implemented diff engine. Added: syntax highlighting inside diffs,
  new side-by-side view with commenting and live chat. Enabled soft-wrapping
  of long lines and much improved rendering speed for large diffs.
- File source view: new file display engine. The file view now soft-wraps
  long lines. Double-clicking inside the file view shows occurrences of the
  clicked item. Added pygments-markdown-lexer for highlighting markdown
  syntax.
- Files annotation: added new grouped annotations. Color all related commits
  by double-clicking a single commit in the annotation view.
- Pull request reviewers (EE only): added new default reviewers
  functionality. Allows picking users or user groups defined as reviewers for
  a new pull request. Picking reviewers can be based on branch name, changed
  file name patterns, or the original author of the changed source code,
  e.g. *.css -> design team, master branch -> repo owner. Fixes #1131.
- Pull request reviewers: store and show reasons why a given person is a
  reviewer. Manually adding reviewers after creating a PR will now also be
  indicated, together with who added a given person to review.
- Integrations: the Webhooks integration now allows using variables inside
  the call URL. Currently supported variables are ${repo_name}, ${repo_type},
  ${repo_id}, ${repo_url}, ${branch}, ${commit_id}, ${pull_request_id},
  ${pull_request_url}. Commits are now grouped by branches as well.
  Allows much easier integration with CI systems.
- Integrations (EE only): allow a wildcard * project key in the Jira
  integration settings to allow referencing multiple projects per commit.
  Fixes #4267.
- Live notifications: RhodeCode sends live notifications to online users on
  certain events and pages. Currently this works on: invite to chat, update
  pull request, and commit/inline comment. Part of the live code review
  system. Allows users to update the reviewed code while doing the review and
  never miss any updates or comment replies as they happen. Requires
  channelstream to be enabled.
- Repository groups: added default personal repository groups. Personal
  groups are an isolated playground for users, allowing them to create
  projects or forks. Adds a new setting to automatically create personal repo
  groups for newly created users. New groups are created from a specified
  pattern, for example /u/{username}. Implements #4003.
- Security: it's now possible to disable the password reset functionality.
  This is useful for cases when users only use LDAP or similar types of
  authentication. Implements #3944.
- Pull requests: exposed shadow repositories to end users. Users are now
  given access to the shadow repository, which represents the state after the
  merge is performed. This way users, and especially CI servers, can perform
  code analysis of the final merged code much more easily.
- Pull requests: the My account > pull requests page now uses a datagrid.
  It's faster, filterable, and sortable. Fixes #4297.
- Pull requests: the delete pull request action was moved from my account
  into the pull request view itself. This is where everyone was looking
  for it.
- Pull requests: improved showing failed merges with a proper status in the
  pull request page.
- User groups: overhaul of the edit user group page. Added a new selector for
  adding new user group members.
- Licensing (EE only): exposed an unlock link to deactivate users that are
  over the license limit, to unlock full functionality. This might happen
  when migrating from CE into EE, and your license supports fewer active
  users than allowed.
- Global settings: added a new header/footer template to allow flash
  filtering. In case a license warning appears and an admin wants to hide it
  for some time, the new template can be used to do this.
- System info: added a create snapshot button to easily generate a system
  state report. Comes in handy for support and reporting. The system state
  holds information such as free disk/memory, CPU load, and some RhodeCode
  settings.
- System info: fetch and show vcs settings from vcsserver. Fixes #4276.
- System info: use real memory usage based on the new psutil api available.
- System info: added info about temporary storage.
- System info: expose inode limits and usage. Fixes #4282.
- Ui: added a new icon for merge commits.


General
^^^^^^^

- Notifications: moved all notifications into polymer for consistency.
  Fixes #4201.
- Live chat (EE): improved UI for live-chat. Use the Codemirror editor as
  the input for the text box.
- Api: WARNING DEPRECATION, refactored repository group schemas. Fixes #4133.
  When using create_repo, create_repo_group, update_repo, or
  update_repo_group, the *_name parameter now takes the full path including
  sub repository groups. This is the only way to add a resource under another
  repository group. Furthermore, giving a non-existing path will no longer
  create the missing structure. This change makes the api more consistent; it
  better validates the errors in the data sent to a given api call.
- Pull requests: disabled subrepo handling on pull requests. It means users
  can now use more types of repositories with subrepos to create pull
  requests. Since handling is disabled, repositories behind authentication,
  or outside of the network, can be used.
- VCSServer: fetch backend info from vcsserver, including git/hg/svn versions
  and connection information.
- Svn support: it's no longer required to put in the repo root path to
  generate the mod-dav-svn config. Fixes #4203.
- Svn support: added a reload command option (svn.proxy.reload_cmd) to ini
  files. Apache can now be automatically reloaded when the mod_dav_svn
  config changes.
- Svn support: added a view to trigger the (re)generation of the Apache
  mod_dav_svn configuration file. Users are able to generate such a file
  manually by clicking that button.
- Dependency: updated subversion library to 1.9.
- Dependency: updated ipython to 5.1.0.
- Dependency: updated psutil to 4.3.1.


Security
^^^^^^^^

- Hipchat: escape user-entered data to avoid xss/formatting problems.
- VCSServer: obfuscate credentials added into the remote url during remote
  repository creation. Prevents leaking of those credentials inside
  RhodeCode logs.


Performance
^^^^^^^^^^^

- Diffs: the new diff engine is much smarter when it comes to showing huge
  diffs. The rendering speed should be much improved in such cases; however,
  showing the full diff is still supported.
- VCS backends: when using a repo object from the database, re-use this
  information instead of trying to detect a backend. Reduces the traffic to
  vcsserver.
- Pull requests: added a column to hold the last merge revision. This will
  skip heavy recalculation of merge state if nothing changed inside a pull
  request.
- File source view: don't load the file if it is over the size limit, since
  it won't be displayed anyway. This increases the speed of loading the page
  when a file is above the defined cut-off limit.


Fixes
^^^^^

- Users admin: fixed the search filter in the user admin page.
- Autocomplete: improved the lookup of users with non-ascii characters. In
  case of a unicode email, the previous method could generate wrong data and
  prevent such users from showing up in search.
- Svn: added a request header downgrade for the COPY command to work on an
  https setup. Fixes #4307.
- Svn: added handling of renamed files inside our generated changes
  metadata. Fixes #4258.
- Pull requests: fixed a problem with creating pull requests on empty
  repositories.
- Events: use the branch from the previous commit for repo push event
  commits data so that per-branch grouping works. Fixes #4233.
- Login: make sure recaptcha data is always validated. Fixes #4279.
- Vcs: use the commit date as the modification time when creating archives.
  Fixes a problem with unstable hashes for archives. Fixes #4247.
- Issue trackers: fixed a bug where saving an empty issue tracker via the
  form was causing an exception. Fixes #4278.
- Styling: fixed gravatar size for pull request reviewers.
- Ldap: fixed an email extraction typo. An empty email from the LDAP server
  will now not overwrite the stored one.
- Integrations: use consistent formatting of users data in the Slack
  integration.
- Meta-tags: meta tags are not taken into account when truncating
  descriptions that are too long. Fixes #4305.
@@ -0,0 +1,14 b'' | |||||
Copyright 2006 Google Inc.
http://code.google.com/p/google-diff-match-patch/

Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at

  http://www.apache.org/licenses/LICENSE-2.0

Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
@@ -0,0 +1,12 b'' | |||||
Copyright © 2015 Jürgen Hermann <jh@web.de>

Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at

  http://www.apache.org/licenses/LICENSE-2.0

Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
@@ -0,0 +1,665 b'' | |||||
# -*- coding: utf-8 -*-

# Copyright (C) 2011-2016 RhodeCode GmbH
#
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU Affero General Public License, version 3
# (only), as published by the Free Software Foundation.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU Affero General Public License
# along with this program. If not, see <http://www.gnu.org/licenses/>.
#
# This program is dual-licensed. If you wish to learn more about the
# RhodeCode Enterprise Edition, including its added features, Support services,
# and proprietary license terms, please see https://rhodecode.com/licenses/

import logging
import difflib
from itertools import groupby

from pygments import lex
from pygments.formatters.html import _get_ttype_class as pygment_token_class
from rhodecode.lib.helpers import (
    get_lexer_for_filenode, get_lexer_safe, html_escape)
from rhodecode.lib.utils2 import AttributeDict
from rhodecode.lib.vcs.nodes import FileNode
from rhodecode.lib.diff_match_patch import diff_match_patch
from rhodecode.lib.diffs import LimitedDiffContainer
from pygments.lexers import get_lexer_by_name

plain_text_lexer = get_lexer_by_name(
    'text', stripall=False, stripnl=False, ensurenl=False)


log = logging.getLogger()


def filenode_as_lines_tokens(filenode, lexer=None):
    lexer = lexer or get_lexer_for_filenode(filenode)
    log.debug('Generating file node pygment tokens for %s, %s', lexer, filenode)
    tokens = tokenize_string(filenode.content, lexer)
    lines = split_token_stream(tokens, split_string='\n')
    rv = list(lines)
    return rv


def tokenize_string(content, lexer):
    """
    Use pygments to tokenize some content based on a lexer
    ensuring all original new lines and whitespace is preserved
    """

    lexer.stripall = False
    lexer.stripnl = False
    lexer.ensurenl = False
    for token_type, token_text in lex(content, lexer):
        yield pygment_token_class(token_type), token_text


def split_token_stream(tokens, split_string=u'\n'):
    """
    Take a list of (TokenType, text) tuples and split them by a string

    >>> split_token_stream([(TEXT, 'some\ntext'), (TEXT, 'more\n')])
    [(TEXT, 'some'), (TEXT, 'text'),
     (TEXT, 'more'), (TEXT, 'text')]
    """

    buffer = []
    for token_class, token_text in tokens:
        parts = token_text.split(split_string)
        for part in parts[:-1]:
            buffer.append((token_class, part))
            yield buffer
            buffer = []

        buffer.append((token_class, parts[-1]))

    if buffer:
        yield buffer


def filenode_as_annotated_lines_tokens(filenode):
    """
    Take a file node and return a list of annotations => lines, if no
    annotation is found, it will be None.

    eg:

    [
        (annotation1, [
            (1, line1_tokens_list),
            (2, line2_tokens_list),
        ]),
        (annotation2, [
            (3, line1_tokens_list),
        ]),
        (None, [
            (4, line1_tokens_list),
        ]),
        (annotation1, [
            (5, line1_tokens_list),
            (6, line2_tokens_list),
        ])
    ]
    """

    commit_cache = {}  # cache commit_getter lookups

    def _get_annotation(commit_id, commit_getter):
        if commit_id not in commit_cache:
            commit_cache[commit_id] = commit_getter()
        return commit_cache[commit_id]

    annotation_lookup = {
        line_no: _get_annotation(commit_id, commit_getter)
        for line_no, commit_id, commit_getter, line_content
        in filenode.annotate
    }

    annotations_lines = ((annotation_lookup.get(line_no), line_no, tokens)
                         for line_no, tokens
                         in enumerate(filenode_as_lines_tokens(filenode), 1))

    grouped_annotations_lines = groupby(annotations_lines, lambda x: x[0])

    for annotation, group in grouped_annotations_lines:
        yield (
            annotation, [(line_no, tokens)
                         for (_, line_no, tokens) in group]
        )


def render_tokenstream(tokenstream):
    result = []
    for token_class, token_ops_texts in rollup_tokenstream(tokenstream):

        if token_class:
            result.append(u'<span class="%s">' % token_class)
        else:
            result.append(u'<span>')

        for op_tag, token_text in token_ops_texts:

            if op_tag:
                result.append(u'<%s>' % op_tag)

            escaped_text = html_escape(token_text)

            # TODO: dan: investigate showing hidden characters like space/nl/tab
            # escaped_text = escaped_text.replace(' ', '<sp> </sp>')
            # escaped_text = escaped_text.replace('\n', '<nl>\n</nl>')
            # escaped_text = escaped_text.replace('\t', '<tab>\t</tab>')

            result.append(escaped_text)

            if op_tag:
                result.append(u'</%s>' % op_tag)

        result.append(u'</span>')

    html = ''.join(result)
    return html


def rollup_tokenstream(tokenstream):
    """
    Group a token stream of the format:

        ('class', 'op', 'text')
    or
        ('class', 'text')

    into

        [('class1',
            [('op1', 'text'),
             ('op2', 'text')]),
         ('class2',
            [('op3', 'text')])]

    This is used to get the minimal tags necessary when
    rendering to html eg for a token stream ie.

    <span class="A"><ins>he</ins>llo</span>
    vs
    <span class="A"><ins>he</ins></span><span class="A">llo</span>

    If a 2 tuple is passed in, the output op will be an empty string.

    eg:

    >>> rollup_tokenstream([('classA', '', 'h'),
                            ('classA', 'del', 'ell'),
                            ('classA', '', 'o'),
                            ('classB', '', ' '),
                            ('classA', '', 'the'),
                            ('classA', '', 're'),
                            ])

    [('classA', [('', 'h'), ('del', 'ell'), ('', 'o')]),
     ('classB', [('', ' ')]),
     ('classA', [('', 'there')])]

    """
    if tokenstream and len(tokenstream[0]) == 2:
        tokenstream = ((t[0], '', t[1]) for t in tokenstream)

    result = []
    for token_class, op_list in groupby(tokenstream, lambda t: t[0]):
        ops = []
        for token_op, token_text_list in groupby(op_list, lambda o: o[1]):
            text_buffer = []
            for t_class, t_op, t_text in token_text_list:
                text_buffer.append(t_text)
            ops.append((token_op, ''.join(text_buffer)))
        result.append((token_class, ops))
    return result

|
225 | def tokens_diff(old_tokens, new_tokens, use_diff_match_patch=True): | |||
|
226 | """ | |||
|
227 | Converts a list of (token_class, token_text) tuples to a list of | |||
|
228 | (token_class, token_op, token_text) tuples where token_op is one of | |||
|
229 | ('ins', 'del', '') | |||
|
230 | ||||
|
231 | :param old_tokens: list of (token_class, token_text) tuples of old line | |||
|
232 | :param new_tokens: list of (token_class, token_text) tuples of new line | |||
|
233 | :param use_diff_match_patch: boolean, will use google's diff match patch | |||
|
234 | library which has options to 'smooth' out the character by character | |||
|
235 | differences making nicer ins/del blocks | |||
|
236 | """ | |||
|
237 | ||||
|
238 | old_tokens_result = [] | |||
|
239 | new_tokens_result = [] | |||
|
240 | ||||
|
241 | similarity = difflib.SequenceMatcher(None, | |||
|
242 | ''.join(token_text for token_class, token_text in old_tokens), | |||
|
243 | ''.join(token_text for token_class, token_text in new_tokens) | |||
|
244 | ).ratio() | |||
|
245 | ||||
|
246 | if similarity < 0.6: # return, the blocks are too different | |||
|
247 | for token_class, token_text in old_tokens: | |||
|
248 | old_tokens_result.append((token_class, '', token_text)) | |||
|
249 | for token_class, token_text in new_tokens: | |||
|
250 | new_tokens_result.append((token_class, '', token_text)) | |||
|
251 | return old_tokens_result, new_tokens_result, similarity | |||
|
252 | ||||
|
253 | token_sequence_matcher = difflib.SequenceMatcher(None, | |||
|
254 | [x[1] for x in old_tokens], | |||
|
255 | [x[1] for x in new_tokens]) | |||
|
256 | ||||
|
257 | for tag, o1, o2, n1, n2 in token_sequence_matcher.get_opcodes(): | |||
|
258 | # check the differences by token block types first to give a more | |||
|
259 | # nicer "block" level replacement vs character diffs | |||
|
260 | ||||
|
261 | if tag == 'equal': | |||
|
262 | for token_class, token_text in old_tokens[o1:o2]: | |||
|
263 | old_tokens_result.append((token_class, '', token_text)) | |||
|
264 | for token_class, token_text in new_tokens[n1:n2]: | |||
|
265 | new_tokens_result.append((token_class, '', token_text)) | |||
|
266 | elif tag == 'delete': | |||
|
267 | for token_class, token_text in old_tokens[o1:o2]: | |||
|
268 | old_tokens_result.append((token_class, 'del', token_text)) | |||
|
269 | elif tag == 'insert': | |||
|
270 | for token_class, token_text in new_tokens[n1:n2]: | |||
|
271 | new_tokens_result.append((token_class, 'ins', token_text)) | |||
|
272 | elif tag == 'replace': | |||
|
        # if same type token blocks must be replaced, do a diff on the
        # characters in the token blocks to show individual changes

        old_char_tokens = []
        new_char_tokens = []
        for token_class, token_text in old_tokens[o1:o2]:
            for char in token_text:
                old_char_tokens.append((token_class, char))

        for token_class, token_text in new_tokens[n1:n2]:
            for char in token_text:
                new_char_tokens.append((token_class, char))

        old_string = ''.join([token_text for
                              token_class, token_text in old_char_tokens])
        new_string = ''.join([token_text for
                              token_class, token_text in new_char_tokens])

        char_sequence = difflib.SequenceMatcher(None, old_string, new_string)
        copcodes = char_sequence.get_opcodes()
        obuffer, nbuffer = [], []

        if use_diff_match_patch:
            dmp = diff_match_patch()
            dmp.Diff_EditCost = 11  # TODO: dan: extract this to a setting
            reps = dmp.diff_main(old_string, new_string)
            dmp.diff_cleanupEfficiency(reps)

            a, b = 0, 0
            for op, rep in reps:
                l = len(rep)
                if op == 0:
                    for i, c in enumerate(rep):
                        obuffer.append((old_char_tokens[a + i][0], '', c))
                        nbuffer.append((new_char_tokens[b + i][0], '', c))
                    a += l
                    b += l
                elif op == -1:
                    for i, c in enumerate(rep):
                        obuffer.append((old_char_tokens[a + i][0], 'del', c))
                    a += l
                elif op == 1:
                    for i, c in enumerate(rep):
                        nbuffer.append((new_char_tokens[b + i][0], 'ins', c))
                    b += l
        else:
            for ctag, co1, co2, cn1, cn2 in copcodes:
                if ctag == 'equal':
                    for token_class, token_text in old_char_tokens[co1:co2]:
                        obuffer.append((token_class, '', token_text))
                    for token_class, token_text in new_char_tokens[cn1:cn2]:
                        nbuffer.append((token_class, '', token_text))
                elif ctag == 'delete':
                    for token_class, token_text in old_char_tokens[co1:co2]:
                        obuffer.append((token_class, 'del', token_text))
                elif ctag == 'insert':
                    for token_class, token_text in new_char_tokens[cn1:cn2]:
                        nbuffer.append((token_class, 'ins', token_text))
                elif ctag == 'replace':
                    for token_class, token_text in old_char_tokens[co1:co2]:
                        obuffer.append((token_class, 'del', token_text))
                    for token_class, token_text in new_char_tokens[cn1:cn2]:
                        nbuffer.append((token_class, 'ins', token_text))

        old_tokens_result.extend(obuffer)
        new_tokens_result.extend(nbuffer)

    return old_tokens_result, new_tokens_result, similarity

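The non-diff_match_patch branch above tags per-character 'del'/'ins' spans from `difflib.SequenceMatcher` opcodes. A minimal standalone sketch of the opcode shape it consumes (the helper name `char_opcodes` is illustrative, not part of this module):

```python
import difflib


def char_opcodes(old, new):
    # Character-level opcodes, as consumed by the replace branch above:
    # each entry is (tag, o1, o2, n1, n2) with tag in
    # 'equal', 'delete', 'insert', 'replace'.
    return difflib.SequenceMatcher(None, old, new).get_opcodes()


ops = char_opcodes("colour", "color")
tags = [tag for tag, o1, o2, n1, n2 in ops]
```

Here "colour" vs "color" produces an equal span, a one-character delete, and a trailing equal span, which the code above would emit as plain, 'del', and plain tokens respectively.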

class DiffSet(object):
    """
    An object for parsing the diff result from diffs.DiffProcessor and
    adding highlighting, side by side/unified renderings and line diffs
    """

    HL_REAL = 'REAL'  # highlights using original file, slow
    HL_FAST = 'FAST'  # highlights using just the line, fast but not correct
                      # in the case of multiline code
    HL_NONE = 'NONE'  # no highlighting, fastest

    def __init__(self, highlight_mode=HL_REAL, repo_name=None,
                 source_node_getter=lambda filename: None,
                 target_node_getter=lambda filename: None,
                 source_nodes=None, target_nodes=None,
                 max_file_size_limit=150 * 1024,  # files over this size will
                                                  # use fast highlighting
                 comments=None,
                 ):

        self.highlight_mode = highlight_mode
        self.highlighted_filenodes = {}
        self.source_node_getter = source_node_getter
        self.target_node_getter = target_node_getter
        self.source_nodes = source_nodes or {}
        self.target_nodes = target_nodes or {}
        self.repo_name = repo_name
        self.comments = comments or {}
        self.max_file_size_limit = max_file_size_limit

    def render_patchset(self, patchset, source_ref=None, target_ref=None):
        diffset = AttributeDict(dict(
            lines_added=0,
            lines_deleted=0,
            changed_files=0,
            files=[],
            limited_diff=isinstance(patchset, LimitedDiffContainer),
            repo_name=self.repo_name,
            source_ref=source_ref,
            target_ref=target_ref,
        ))
        for patch in patchset:
            filediff = self.render_patch(patch)
            filediff.diffset = diffset
            diffset.files.append(filediff)
            diffset.changed_files += 1
            if not patch['stats']['binary']:
                diffset.lines_added += patch['stats']['added']
                diffset.lines_deleted += patch['stats']['deleted']

        return diffset

    _lexer_cache = {}

    def _get_lexer_for_filename(self, filename):
        # cached because we might need to call it twice for source/target
        if filename not in self._lexer_cache:
            self._lexer_cache[filename] = get_lexer_safe(filepath=filename)
        return self._lexer_cache[filename]

    def render_patch(self, patch):
        log.debug('rendering diff for %r', patch['filename'])

        source_filename = patch['original_filename']
        target_filename = patch['filename']

        source_lexer = plain_text_lexer
        target_lexer = plain_text_lexer

        if not patch['stats']['binary']:
            if self.highlight_mode == self.HL_REAL:
                if (source_filename and patch['operation'] in ('D', 'M')
                        and source_filename not in self.source_nodes):
                    self.source_nodes[source_filename] = (
                        self.source_node_getter(source_filename))

                if (target_filename and patch['operation'] in ('A', 'M')
                        and target_filename not in self.target_nodes):
                    self.target_nodes[target_filename] = (
                        self.target_node_getter(target_filename))

            elif self.highlight_mode == self.HL_FAST:
                source_lexer = self._get_lexer_for_filename(source_filename)
                target_lexer = self._get_lexer_for_filename(target_filename)

        source_file = self.source_nodes.get(source_filename, source_filename)
        target_file = self.target_nodes.get(target_filename, target_filename)

        source_filenode, target_filenode = None, None

        # TODO: dan: FileNode.lexer works on the content of the file - which
        # can be slow - issue #4289 explains a lexer clean up - which once
        # done can allow caching a lexer for a filenode to avoid the file lookup
        if isinstance(source_file, FileNode):
            source_filenode = source_file
            source_lexer = source_file.lexer
        if isinstance(target_file, FileNode):
            target_filenode = target_file
            target_lexer = target_file.lexer

        source_file_path, target_file_path = None, None

        if source_filename != '/dev/null':
            source_file_path = source_filename
        if target_filename != '/dev/null':
            target_file_path = target_filename

        source_file_type = source_lexer.name
        target_file_type = target_lexer.name

        op_hunks = patch['chunks'][0]
        hunks = patch['chunks'][1:]

        filediff = AttributeDict({
            'source_file_path': source_file_path,
            'target_file_path': target_file_path,
            'source_filenode': source_filenode,
            'target_filenode': target_filenode,
            'hunks': [],
            'source_file_type': target_file_type,
            'target_file_type': source_file_type,
            'patch': patch,
            'source_mode': patch['stats']['old_mode'],
            'target_mode': patch['stats']['new_mode'],
            'limited_diff': isinstance(patch, LimitedDiffContainer),
            'diffset': self,
        })

        for hunk in hunks:
            hunkbit = self.parse_hunk(hunk, source_file, target_file)
            hunkbit.filediff = filediff
            filediff.hunks.append(hunkbit)
        return filediff

    def parse_hunk(self, hunk, source_file, target_file):
        result = AttributeDict(dict(
            source_start=hunk['source_start'],
            source_length=hunk['source_length'],
            target_start=hunk['target_start'],
            target_length=hunk['target_length'],
            section_header=hunk['section_header'],
            lines=[],
        ))
        before, after = [], []

        for line in hunk['lines']:
            if line['action'] == 'unmod':
                result.lines.extend(
                    self.parse_lines(before, after, source_file, target_file))
                after.append(line)
                before.append(line)
            elif line['action'] == 'add':
                after.append(line)
            elif line['action'] == 'del':
                before.append(line)
            elif line['action'] == 'old-no-nl':
                before.append(line)
            elif line['action'] == 'new-no-nl':
                after.append(line)

        result.lines.extend(
            self.parse_lines(before, after, source_file, target_file))
        result.unified = self.as_unified(result.lines)
        result.sideside = result.lines
        return result

    def parse_lines(self, before_lines, after_lines, source_file, target_file):
        # TODO: dan: investigate doing the diff comparison and fast highlighting
        # on the entire before and after buffered block lines rather than by
        # line, this means we can get better 'fast' highlighting if the context
        # allows it - eg.
        # line 4: """
        # line 5: this gets highlighted as a string
        # line 6: """

        lines = []
        while before_lines or after_lines:
            before, after = None, None
            before_tokens, after_tokens = None, None

            if before_lines:
                before = before_lines.pop(0)
            if after_lines:
                after = after_lines.pop(0)

            original = AttributeDict()
            modified = AttributeDict()

            if before:
                if before['action'] == 'old-no-nl':
                    before_tokens = [('nonl', before['line'])]
                else:
                    before_tokens = self.get_line_tokens(
                        line_text=before['line'],
                        line_number=before['old_lineno'],
                        file=source_file)
                original.lineno = before['old_lineno']
                original.content = before['line']
                original.action = self.action_to_op(before['action'])
                original.comments = self.get_comments_for(
                    'old', source_file, before['old_lineno'])

            if after:
                if after['action'] == 'new-no-nl':
                    after_tokens = [('nonl', after['line'])]
                else:
                    after_tokens = self.get_line_tokens(
                        line_text=after['line'],
                        line_number=after['new_lineno'],
                        file=target_file)
                modified.lineno = after['new_lineno']
                modified.content = after['line']
                modified.action = self.action_to_op(after['action'])
                modified.comments = self.get_comments_for(
                    'new', target_file, after['new_lineno'])

            # diff the lines
            if before_tokens and after_tokens:
                o_tokens, m_tokens, similarity = tokens_diff(
                    before_tokens, after_tokens)
                original.content = render_tokenstream(o_tokens)
                modified.content = render_tokenstream(m_tokens)
            elif before_tokens:
                original.content = render_tokenstream(
                    [(x[0], '', x[1]) for x in before_tokens])
            elif after_tokens:
                modified.content = render_tokenstream(
                    [(x[0], '', x[1]) for x in after_tokens])

            lines.append(AttributeDict({
                'original': original,
                'modified': modified,
            }))

        return lines

    def get_comments_for(self, version, file, line_number):
        if hasattr(file, 'unicode_path'):
            file = file.unicode_path

        if not isinstance(file, basestring):
            return None

        line_key = {
            'old': 'o',
            'new': 'n',
        }[version] + str(line_number)

        return self.comments.get(file, {}).get(line_key)

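`get_comments_for` builds its lookup key by prefixing the line number with `o` for old-side lines and `n` for new-side lines. A small standalone sketch of that key scheme (the sample comments store is illustrative, not real data from this repository):

```python
def comment_line_key(version, line_number):
    # mirrors the line_key construction in get_comments_for above
    return {'old': 'o', 'new': 'n'}[version] + str(line_number)


# illustrative store shape: {file_path: {line_key: [comments]}}
comments = {'setup.py': {'o12': ['needs a docstring'], 'n3': ['looks good']}}

old_side = comments.get('setup.py', {}).get(comment_line_key('old', 12))
new_side = comments.get('setup.py', {}).get(comment_line_key('new', 3))
```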
    def get_line_tokens(self, line_text, line_number, file=None):
        filenode = None
        filename = None

        if isinstance(file, basestring):
            filename = file
        elif isinstance(file, FileNode):
            filenode = file
            filename = file.unicode_path

        if self.highlight_mode == self.HL_REAL and filenode:
            if line_number and file.size < self.max_file_size_limit:
                return self.get_tokenized_filenode_line(file, line_number)

        if self.highlight_mode in (self.HL_REAL, self.HL_FAST) and filename:
            lexer = self._get_lexer_for_filename(filename)
            return list(tokenize_string(line_text, lexer))

        return list(tokenize_string(line_text, plain_text_lexer))

    def get_tokenized_filenode_line(self, filenode, line_number):
        if filenode not in self.highlighted_filenodes:
            tokenized_lines = filenode_as_lines_tokens(filenode, filenode.lexer)
            self.highlighted_filenodes[filenode] = tokenized_lines
        return self.highlighted_filenodes[filenode][line_number - 1]

    def action_to_op(self, action):
        return {
            'add': '+',
            'del': '-',
            'unmod': ' ',
            'old-no-nl': ' ',
            'new-no-nl': ' ',
        }.get(action, action)

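`action_to_op` maps parser actions to single-character diff operators and relies on `dict.get(action, action)` to pass any unrecognised action through unchanged. A tiny standalone sketch of that fallback behaviour:

```python
def action_to_op(action):
    # same mapping as DiffSet.action_to_op above; unknown actions fall
    # through unchanged because of the dict.get default argument
    return {
        'add': '+',
        'del': '-',
        'unmod': ' ',
        'old-no-nl': ' ',
        'new-no-nl': ' ',
    }.get(action, action)
```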
    def as_unified(self, lines):
        """Return a generator that yields the lines of a diff in unified order"""
        def generator():
            buf = []
            for line in lines:

                if buf and not line.original or line.original.action == ' ':
                    for b in buf:
                        yield b
                    buf = []

                if line.original:
                    if line.original.action == ' ':
                        yield (line.original.lineno, line.modified.lineno,
                               line.original.action, line.original.content,
                               line.original.comments)
                        continue

                    if line.original.action == '-':
                        yield (line.original.lineno, None,
                               line.original.action, line.original.content,
                               line.original.comments)

                    if line.modified.action == '+':
                        buf.append((
                            None, line.modified.lineno,
                            line.modified.action, line.modified.content,
                            line.modified.comments))
                        continue

                if line.modified:
                    yield (None, line.modified.lineno,
                           line.modified.action, line.modified.content,
                           line.modified.comments)

            for b in buf:
                yield b

        return generator()
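`as_unified` streams deletions immediately but buffers the additions paired with them until the next context line, so a replaced run comes out as a block of `-` lines followed by a block of `+` lines. A simplified standalone sketch of that ordering, using plain `(original, modified)` dict pairs in place of the AttributeDict rows built by `parse_lines`:

```python
def unified_order(pairs):
    # Yield ('-'/'+'/' ', content) tuples in unified-diff order:
    # deletions stream immediately, paired additions buffer until the
    # next context line (or end of input). A sketch of the as_unified
    # logic above; `pairs` is an illustrative simplification.
    buf = []
    for original, modified in pairs:
        if original:
            if original['action'] == ' ':
                for b in buf:
                    yield b
                buf = []
                yield (' ', original['content'])
                continue
            if original['action'] == '-':
                yield ('-', original['content'])
            if modified and modified['action'] == '+':
                buf.append(('+', modified['content']))
            continue
        if modified:
            yield ('+', modified['content'])
    for b in buf:
        yield b


rows = [
    ({'action': '-', 'content': 'old1'}, {'action': '+', 'content': 'new1'}),
    ({'action': '-', 'content': 'old2'}, {'action': '+', 'content': 'new2'}),
    ({'action': ' ', 'content': 'ctx'}, {'action': ' ', 'content': 'ctx'}),
]
out = list(unified_order(rows))
```

Two replace rows followed by a context row come out grouped: both deletions first, then both additions, then the context line.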
NO CONTENT: new file 100644
@@ -1,5 +1,5 @@
 [bumpversion]
-current_version = 4.
+current_version = 4.5.0
 message = release: Bump version {current_version} to {new_version}
 
 [bumpversion:file:rhodecode/VERSION]
@@ -43,6 +43,7 @@ syntax: regexp
 ^rhodecode/public/css/style-polymer.css$
 ^rhodecode/public/js/rhodecode-components.html$
 ^rhodecode/public/js/scripts.js$
+^rhodecode/public/js/rhodecode-components.js$
 ^rhodecode/public/js/src/components/root-styles.gen.html$
 ^rhodecode/public/js/vendors/webcomponentsjs/
 ^rhodecode\.db$
@@ -4,26 +4,21 @@ done = false
 [task:bump_version]
 done = true
 
-[task:rc_tools_pinned]
-done = true
-
 [task:fixes_on_stable]
-done = true
 
 [task:pip2nix_generated]
-done = true
 
 [task:changelog_updated]
-done = true
 
 [task:generate_api_docs]
-done = true
+
+[task:updated_translation]
 
 [release]
-state =
-version = 4.
+state = in_progress
+version = 4.5.0
 
-[task:updated_translation]
+[task:rc_tools_pinned]
 
 [task:generate_js_routes]
 
@@ -11,5 +11,5 @@ module.exports = function(grunt) {
   grunt.loadNpmTasks('grunt-crisper');
   grunt.loadNpmTasks('grunt-contrib-copy');
 
-  grunt.registerTask('default', ['less:production', 'less:components', 'concat:polymercss', 'copy','vulcanize', 'crisper'
+  grunt.registerTask('default', ['less:production', 'less:components', 'concat:polymercss', 'copy', 'concat:dist', 'vulcanize', 'crisper']);
 };
@@ -12,6 +12,10 @@ permission notice:
 file:licenses/msgpack_license.txt
 Copyright (c) 2009 - tornado
 file:licenses/tornado_license.txt
+Copyright (c) 2015 - pygments-markdown-lexer
+file:licenses/pygments_markdown_lexer_license.txt
+Copyright 2006 - diff_match_patch
+file:licenses/diff_match_patch_license.txt
 
 All licensed under the Apache License, Version 2.0 (the "License");
 you may not use this file except in compliance with the License.
@@ -2,7 +2,6 @@
 WEBPACK=./node_modules/webpack/bin/webpack.js
 GRUNT=grunt
 NODE_PATH=./node_modules
-FLAKE8=flake8 setup.py pytest_pylons/ rhodecode/ --select=E124 --ignore=E711,E712,E510,E121,E122,E126,E127,E128,E501,F401 --max-line-length=100 --exclude=*rhodecode/lib/dbmigrate/*,*rhodecode/tests/*,*rhodecode/lib/vcs/utils/*
 CI_PREFIX=enterprise
 
 .PHONY: docs docs-clean ci-docs clean test test-clean test-lint test-only
@@ -25,13 +24,6 @@ test: test-clean test-only
 test-clean:
 	rm -rf coverage.xml htmlcov junit.xml pylint.log result
 
-test-lint:
-	if [ "$$IN_NIX_SHELL" = "1" ]; then \
-		$(FLAKE8); \
-	else \
-		$(FLAKE8) --format=pylint --exit-zero > pylint.log; \
-	fi
-
 test-only:
 	PYTHONHASHSEED=random py.test -vv -r xw --cov=rhodecode --cov-report=term-missing --cov-report=html rhodecode/tests/
 
@@ -10,6 +10,8 @@
     "paper-tooltip": "PolymerElements/paper-tooltip#^1.1.2",
     "paper-toast": "PolymerElements/paper-toast#^1.3.0",
     "paper-toggle-button": "PolymerElements/paper-toggle-button#^1.2.0",
-    "iron-ajax": "PolymerElements/iron-ajax#^1.4.3"
+    "iron-ajax": "PolymerElements/iron-ajax#^1.4.3",
+    "iron-autogrow-textarea": "PolymerElements/iron-autogrow-textarea#^1.0.13",
+    "iron-a11y-keys": "PolymerElements/iron-a11y-keys#^1.0.6"
   }
 }
@@ -1,7 +1,7 @@
 
 
 ################################################################################
-##
+## RHODECODE COMMUNITY EDITION CONFIGURATION ##
 # The %(here)s variable will be replaced with the parent directory of this file#
 ################################################################################
 
@@ -64,7 +64,7 @@ asyncore_use_poll = true
 ##########################
 ## GUNICORN WSGI SERVER ##
 ##########################
-## run with gunicorn --log-config
+## run with gunicorn --log-config rhodecode.ini --paste rhodecode.ini
 
 #use = egg:gunicorn#main
 ## Sets the number of process workers. You must set `instance_id = *`
@@ -91,11 +91,13 @@ asyncore_use_poll = true
 #timeout = 21600
 
 
-## prefix middleware for RhodeCode
+## prefix middleware for RhodeCode.
 ## recommended when using proxy setup.
 ## allows to set RhodeCode under a prefix in server.
-## eg https://server.com/
-##
+## eg https://server.com/custom_prefix. Enable `filter-with =` option below as well.
+## And set your prefix like: `prefix = /custom_prefix`
+## be sure to also set beaker.session.cookie_path = /custom_prefix if you need
+## to make your cookies only work on prefix url
 [filter:proxy-prefix]
 use = egg:PasteDeploy#prefix
 prefix = /
@@ -194,17 +196,17 @@ rss_items_per_page = 10
 rss_include_diff = false
 
 ## gist URL alias, used to create nicer urls for gist. This should be an
-## url that does rewrites to _admin/gists/
+## url that does rewrites to _admin/gists/{gistid}.
 ## example: http://gist.rhodecode.org/{gistid}. Empty means use the internal
-## RhodeCode url, ie. http[s]://rhodecode.server/_admin/gists/
+## RhodeCode url, ie. http[s]://rhodecode.server/_admin/gists/{gistid}
 gist_alias_url =
 
 ## List of controllers (using glob pattern syntax) that AUTH TOKENS could be
 ## used for access.
-## Adding ?auth_token
+## Adding ?auth_token=TOKEN_HASH to the url authenticates this request as if it
 ## came from the the logged in user who own this authentication token.
 ##
-## Syntax is
+## Syntax is ControllerClass:function_pattern.
 ## To enable access to raw_files put `FilesController:raw`.
 ## To enable access to patches add `ChangesetController:changeset_patch`.
 ## The list should be "," separated and on a single line.
@@ -377,15 +379,15 @@ beaker.session.lock_dir = %(here)s/data/
 
 ## Secure encrypted cookie. Requires AES and AES python libraries
 ## you must disable beaker.session.secret to use this
-#beaker.session.encrypt_key =
-#beaker.session.validate_key =
+#beaker.session.encrypt_key = key_for_encryption
+#beaker.session.validate_key = validation_key
 
 ## sets session as invalid(also logging out user) if it haven not been
 ## accessed for given amount of time in seconds
 beaker.session.timeout = 2592000
 beaker.session.httponly = true
-## Path to use for the cookie.
-#beaker.session.cookie_path = /
+## Path to use for the cookie. Set to prefix if you use prefix middleware
+#beaker.session.cookie_path = /custom_prefix
 
 ## uncomment for https secure cookie
 beaker.session.secure = false
@@ -403,8 +405,8 @@ beaker.session.auto = false
 ## Full text search indexer is available in rhodecode-tools under
 ## `rhodecode-tools index` command
 
-# WHOOSH Backend, doesn't require additional services to run
-# it works good with few dozen repos
+## WHOOSH Backend, doesn't require additional services to run
+## it works good with few dozen repos
 search.module = rhodecode.lib.index.whoosh
 search.location = %(here)s/data/index
 
@@ -511,7 +513,7 @@ sqlalchemy.db1.url = sqlite:///%(here)s/
 
 ## print the sql statements to output
 sqlalchemy.db1.echo = false
-## recycle the connections after this am
+## recycle the connections after this amount of seconds
 sqlalchemy.db1.pool_recycle = 3600
 sqlalchemy.db1.convert_unicode = true
 
@@ -533,19 +535,19 b' vcs.server = localhost:9900' | |||||
533 |
|
535 | |||
534 | ## Web server connectivity protocol, responsible for web based VCS operatations |
|
536 | ## Web server connectivity protocol, responsible for web based VCS operatations | |
535 | ## Available protocols are: |
|
537 | ## Available protocols are: | |
536 | ## `pyro4` - use pyro4 server
|
538 | ## `pyro4` - use pyro4 server | |
537 | ## `http` - use http-rpc backend (default)
|
539 | ## `http` - use http-rpc backend (default) | |
538 | vcs.server.protocol = http |
|
540 | vcs.server.protocol = http | |
539 |
|
541 | |||
540 | ## Push/Pull operations protocol, available options are: |
|
542 | ## Push/Pull operations protocol, available options are: | |
541 | ## `pyro4` - use pyro4 server
|
543 | ## `pyro4` - use pyro4 server | |
542 | ## `rhodecode.lib.middleware.utils.scm_app_http` - Http based, recommended |
|
544 | ## `http` - use http-rpc backend (default) | |
543 | ## `vcsserver.scm_app` - internal app (EE only) |
|
545 | ## | |
544 | vcs.scm_app_implementation =
|
546 | vcs.scm_app_implementation = http | |
545 |
|
547 | |||
546 | ## Push/Pull operations hooks protocol, available options are: |
|
548 | ## Push/Pull operations hooks protocol, available options are: | |
547 | ## `pyro4` - use pyro4 server
|
549 | ## `pyro4` - use pyro4 server | |
548 | ## `http` - use http-rpc backend (default)
|
550 | ## `http` - use http-rpc backend (default) | |
549 | vcs.hooks.protocol = http |
|
551 | vcs.hooks.protocol = http | |
550 |
|
552 | |||
551 | vcs.server.log_level = debug |
|
553 | vcs.server.log_level = debug | |
@@ -574,12 +576,15 b' svn.proxy.generate_config = false' | |||||
574 | svn.proxy.list_parent_path = true |
|
576 | svn.proxy.list_parent_path = true | |
575 | ## Set location and file name of generated config file. |
|
577 | ## Set location and file name of generated config file. | |
576 | svn.proxy.config_file_path = %(here)s/mod_dav_svn.conf |
|
578 | svn.proxy.config_file_path = %(here)s/mod_dav_svn.conf | |
577 | ## File system path to the directory containing the repositories served by |
|
579 | ## Used as a prefix to the `Location` block in the generated config file. | |
578 | ## RhodeCode. |
|
580 | ## In most cases it should be set to `/`. | |
579 | svn.proxy.parent_path_root = /path/to/repo_store |
|
|||
580 | ## Used as a prefix to the <Location> block in the generated config file. In |
|
|||
581 | ## most cases it should be set to `/`. |
|
|||
582 | svn.proxy.location_root = / |
|
581 | svn.proxy.location_root = / | |
|
582 | ## Command to reload the mod dav svn configuration on change. | |||
|
583 | ## Example: `/etc/init.d/apache2 reload` | |||
|
584 | #svn.proxy.reload_cmd = /etc/init.d/apache2 reload | |||
|
585 | ## If the timeout expires before the reload command finishes, the command will | |||
|
586 | ## be killed. Setting it to zero means no timeout. Defaults to 10 seconds. | |||
|
587 | #svn.proxy.reload_timeout = 10 | |||
583 |
|
588 | |||
584 |
|
589 | |||
585 | ################################ |
|
590 | ################################ |
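For context on the `svn.proxy.*` options changed in the hunk above: when config generation is enabled, RhodeCode writes an Apache `mod_dav_svn` snippet to `svn.proxy.config_file_path`. A rough sketch of what such a generated file can look like (the paths and exact directives are illustrative, not the literal generated output):

```apache
# Illustrative only: RhodeCode writes the real file to
# svn.proxy.config_file_path based on the svn.proxy.* settings.
<Location "/">                         # svn.proxy.location_root
  DAV svn
  SVNParentPath "/path/to/repo_store"  # the repository store
  SVNListParentPath On                 # svn.proxy.list_parent_path
</Location>
```

After these options change, the reload command (`svn.proxy.reload_cmd`) is what makes Apache pick up the regenerated file, bounded by `svn.proxy.reload_timeout`.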
@@ -1,7 +1,7 b'' | |||||
1 |
|
1 | |||
2 |
|
2 | |||
3 | ################################################################################ |
|
3 | ################################################################################ | |
4 |
## |
|
4 | ## RHODECODE COMMUNITY EDITION CONFIGURATION ## | |
5 | # The %(here)s variable will be replaced with the parent directory of this file# |
|
5 | # The %(here)s variable will be replaced with the parent directory of this file# | |
6 | ################################################################################ |
|
6 | ################################################################################ | |
7 |
|
7 | |||
@@ -64,7 +64,7 b' port = 5000' | |||||
64 | ########################## |
|
64 | ########################## | |
65 | ## GUNICORN WSGI SERVER ## |
|
65 | ## GUNICORN WSGI SERVER ## | |
66 | ########################## |
|
66 | ########################## | |
67 | ## run with gunicorn --log-config
|
67 | ## run with gunicorn --log-config rhodecode.ini --paste rhodecode.ini | |
68 |
|
68 | |||
69 | use = egg:gunicorn#main |
|
69 | use = egg:gunicorn#main | |
70 | ## Sets the number of process workers. You must set `instance_id = *` |
|
70 | ## Sets the number of process workers. You must set `instance_id = *` | |
@@ -91,11 +91,13 b' max_requests_jitter = 30' | |||||
91 | timeout = 21600 |
|
91 | timeout = 21600 | |
92 |
|
92 | |||
93 |
|
93 | |||
94 | ## prefix middleware for RhodeCode
|
94 | ## prefix middleware for RhodeCode. | |
95 | ## recommended when using proxy setup. |
|
95 | ## recommended when using proxy setup. | |
96 | ## allows to set RhodeCode under a prefix in server. |
|
96 | ## allows to set RhodeCode under a prefix in server. | |
97 | ## eg https://server.com/
|
97 | ## eg https://server.com/custom_prefix. Enable `filter-with =` option below as well. | |
98 | ##
|
98 | ## And set your prefix like: `prefix = /custom_prefix` | |
|
99 | ## be sure to also set beaker.session.cookie_path = /custom_prefix if you need | |||
|
100 | ## to make your cookies only work on prefix url | |||
99 | [filter:proxy-prefix] |
|
101 | [filter:proxy-prefix] | |
100 | use = egg:PasteDeploy#prefix |
|
102 | use = egg:PasteDeploy#prefix | |
101 | prefix = / |
|
103 | prefix = / | |
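The comments added in the hunk above tie three settings together: the PasteDeploy prefix filter, the `filter-with` hook, and the session cookie path. A sketch of how they line up in `rhodecode.ini`, assuming the example prefix `/custom_prefix` (the prefix value itself depends on your proxy setup):

```ini
# Illustrative excerpt; adjust the prefix to your deployment.
[filter:proxy-prefix]
use = egg:PasteDeploy#prefix
prefix = /custom_prefix

[app:main]
# route requests through the prefix middleware
filter-with = proxy-prefix
# keep session cookies scoped to the same prefix
beaker.session.cookie_path = /custom_prefix
```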
@@ -168,17 +170,17 b' rss_items_per_page = 10' | |||||
168 | rss_include_diff = false |
|
170 | rss_include_diff = false | |
169 |
|
171 | |||
170 | ## gist URL alias, used to create nicer urls for gist. This should be an |
|
172 | ## gist URL alias, used to create nicer urls for gist. This should be an | |
171 | ## url that does rewrites to _admin/gists/
|
173 | ## url that does rewrites to _admin/gists/{gistid}. | |
172 | ## example: http://gist.rhodecode.org/{gistid}. Empty means use the internal |
|
174 | ## example: http://gist.rhodecode.org/{gistid}. Empty means use the internal | |
173 | ## RhodeCode url, ie. http[s]://rhodecode.server/_admin/gists/
|
175 | ## RhodeCode url, ie. http[s]://rhodecode.server/_admin/gists/{gistid} | |
174 | gist_alias_url = |
|
176 | gist_alias_url = | |
175 |
|
177 | |||
176 | ## List of controllers (using glob pattern syntax) that AUTH TOKENS could be |
|
178 | ## List of controllers (using glob pattern syntax) that AUTH TOKENS could be | |
177 | ## used for access. |
|
179 | ## used for access. | |
178 | ## Adding ?auth_token
|
180 | ## Adding ?auth_token=TOKEN_HASH to the url authenticates this request as if it | |
179 | ## came from the the logged in user who own this authentication token. |
|
181 | ## came from the the logged in user who own this authentication token. | |
180 | ## |
|
182 | ## | |
181 | ## Syntax is
|
183 | ## Syntax is ControllerClass:function_pattern. | |
182 | ## To enable access to raw_files put `FilesController:raw`. |
|
184 | ## To enable access to raw_files put `FilesController:raw`. | |
183 | ## To enable access to patches add `ChangesetController:changeset_patch`. |
|
185 | ## To enable access to patches add `ChangesetController:changeset_patch`. | |
184 | ## The list should be "," separated and on a single line. |
|
186 | ## The list should be "," separated and on a single line. | |
@@ -351,15 +353,15 b' beaker.session.lock_dir = %(here)s/data/' | |||||
351 |
|
353 | |||
352 | ## Secure encrypted cookie. Requires AES and AES python libraries |
|
354 | ## Secure encrypted cookie. Requires AES and AES python libraries | |
353 | ## you must disable beaker.session.secret to use this |
|
355 | ## you must disable beaker.session.secret to use this | |
354 | #beaker.session.encrypt_key =
|
356 | #beaker.session.encrypt_key = key_for_encryption | |
355 | #beaker.session.validate_key =
|
357 | #beaker.session.validate_key = validation_key | |
356 |
|
358 | |||
357 | ## sets session as invalid(also logging out user) if it haven not been |
|
359 | ## sets session as invalid(also logging out user) if it haven not been | |
358 | ## accessed for given amount of time in seconds |
|
360 | ## accessed for given amount of time in seconds | |
359 | beaker.session.timeout = 2592000 |
|
361 | beaker.session.timeout = 2592000 | |
360 | beaker.session.httponly = true |
|
362 | beaker.session.httponly = true | |
361 | ## Path to use for the cookie. |
|
363 | ## Path to use for the cookie. Set to prefix if you use prefix middleware | |
362 | #beaker.session.cookie_path = /
|
364 | #beaker.session.cookie_path = /custom_prefix | |
363 |
|
365 | |||
364 | ## uncomment for https secure cookie |
|
366 | ## uncomment for https secure cookie | |
365 | beaker.session.secure = false |
|
367 | beaker.session.secure = false | |
@@ -377,8 +379,8 b' beaker.session.auto = false' | |||||
377 | ## Full text search indexer is available in rhodecode-tools under |
|
379 | ## Full text search indexer is available in rhodecode-tools under | |
378 | ## `rhodecode-tools index` command |
|
380 | ## `rhodecode-tools index` command | |
379 |
|
381 | |||
380 | # WHOOSH Backend, doesn't require additional services to run |
|
382 | ## WHOOSH Backend, doesn't require additional services to run | |
381 | # it works good with few dozen repos |
|
383 | ## it works good with few dozen repos | |
382 | search.module = rhodecode.lib.index.whoosh |
|
384 | search.module = rhodecode.lib.index.whoosh | |
383 | search.location = %(here)s/data/index |
|
385 | search.location = %(here)s/data/index | |
384 |
|
386 | |||
@@ -480,7 +482,7 b' sqlalchemy.db1.url = postgresql://postgr' | |||||
480 |
|
482 | |||
481 | ## print the sql statements to output |
|
483 | ## print the sql statements to output | |
482 | sqlalchemy.db1.echo = false |
|
484 | sqlalchemy.db1.echo = false | |
483 | ## recycle the connections after this amount of seconds
|
485 | ## recycle the connections after this amount of seconds | |
484 | sqlalchemy.db1.pool_recycle = 3600 |
|
486 | sqlalchemy.db1.pool_recycle = 3600 | |
485 | sqlalchemy.db1.convert_unicode = true |
|
487 | sqlalchemy.db1.convert_unicode = true | |
486 |
|
488 | |||
@@ -502,20 +504,20 b' vcs.server = localhost:9900' | |||||
502 |
|
504 | |||
503 | ## Web server connectivity protocol, responsible for web based VCS operatations |
|
505 | ## Web server connectivity protocol, responsible for web based VCS operatations | |
504 | ## Available protocols are: |
|
506 | ## Available protocols are: | |
505 | ## `pyro4` - use pyro4 server
|
507 | ## `pyro4` - use pyro4 server | |
506 | ## `http` - use http-rpc backend (default)
|
508 | ## `http` - use http-rpc backend (default) | |
507 |
|
|
509 | vcs.server.protocol = http | |
508 |
|
510 | |||
509 | ## Push/Pull operations protocol, available options are: |
|
511 | ## Push/Pull operations protocol, available options are: | |
510 | ## `pyro4` - use pyro4 server
|
512 | ## `pyro4` - use pyro4 server | |
511 | ## `rhodecode.lib.middleware.utils.scm_app_http` - Http based, recommended |
|
513 | ## `http` - use http-rpc backend (default) | |
512 | ## `vcsserver.scm_app` - internal app (EE only) |
|
514 | ## | |
513 |
|
|
515 | vcs.scm_app_implementation = http | |
514 |
|
516 | |||
515 | ## Push/Pull operations hooks protocol, available options are: |
|
517 | ## Push/Pull operations hooks protocol, available options are: | |
516 | ## `pyro4` - use pyro4 server
|
518 | ## `pyro4` - use pyro4 server | |
517 | ## `http` - use http-rpc backend (default)
|
519 | ## `http` - use http-rpc backend (default) | |
518 |
|
|
520 | vcs.hooks.protocol = http | |
519 |
|
521 | |||
520 | vcs.server.log_level = info |
|
522 | vcs.server.log_level = info | |
521 | ## Start VCSServer with this instance as a subprocess, usefull for development |
|
523 | ## Start VCSServer with this instance as a subprocess, usefull for development | |
@@ -543,12 +545,15 b' svn.proxy.generate_config = false' | |||||
543 | svn.proxy.list_parent_path = true |
|
545 | svn.proxy.list_parent_path = true | |
544 | ## Set location and file name of generated config file. |
|
546 | ## Set location and file name of generated config file. | |
545 | svn.proxy.config_file_path = %(here)s/mod_dav_svn.conf |
|
547 | svn.proxy.config_file_path = %(here)s/mod_dav_svn.conf | |
546 | ## File system path to the directory containing the repositories served by |
|
548 | ## Used as a prefix to the `Location` block in the generated config file. | |
547 | ## RhodeCode. |
|
549 | ## In most cases it should be set to `/`. | |
548 | svn.proxy.parent_path_root = /path/to/repo_store |
|
|||
549 | ## Used as a prefix to the <Location> block in the generated config file. In |
|
|||
550 | ## most cases it should be set to `/`. |
|
|||
551 | svn.proxy.location_root = / |
|
550 | svn.proxy.location_root = / | |
|
551 | ## Command to reload the mod dav svn configuration on change. | |||
|
552 | ## Example: `/etc/init.d/apache2 reload` | |||
|
553 | #svn.proxy.reload_cmd = /etc/init.d/apache2 reload | |||
|
554 | ## If the timeout expires before the reload command finishes, the command will | |||
|
555 | ## be killed. Setting it to zero means no timeout. Defaults to 10 seconds. | |||
|
556 | #svn.proxy.reload_timeout = 10 | |||
552 |
|
557 | |||
553 |
|
558 | |||
554 | ################################ |
|
559 | ################################ |
@@ -4,27 +4,55 b'' | |||||
4 | # derivation. For advanced tweaks to pimp up the development environment we use |
|
4 | # derivation. For advanced tweaks to pimp up the development environment we use | |
5 | # "shell.nix" so that it does not have to clutter this file. |
|
5 | # "shell.nix" so that it does not have to clutter this file. | |
6 |
|
6 | |||
7 | { pkgs ? (import <nixpkgs> {}) |
|
7 | args@ | |
8 |
|
|
8 | { pythonPackages ? "python27Packages" | |
9 | , pythonExternalOverrides ? self: super: {} |
|
9 | , pythonExternalOverrides ? self: super: {} | |
10 | , doCheck ? true |
|
10 | , doCheck ? true | |
|
11 | , ... | |||
11 | }: |
|
12 | }: | |
12 |
|
13 | |||
13 | let pkgs_ = pkgs; in |
|
|||
14 |
|
||||
15 | let |
|
14 | let | |
16 | pkgs = pkgs_.overridePackages (self: super: { |
|
15 | ||
17 | # Override subversion derivation to |
|
16 | # Use nixpkgs from args or import them. We use this indirect approach | |
18 | # - activate python bindings |
|
17 | # through args to be able to use the name `pkgs` for our customized packages. | |
19 | # - set version to 1.8 |
|
18 | # Otherwise we will end up with an infinite recursion. | |
20 | subversion = super.subversion18.override { |
|
19 | nixpkgs = args.pkgs or (import <nixpkgs> { }); | |
21 | httpSupport = true; |
|
20 | ||
22 | pythonBindings = true; |
|
21 | # johbo: Interim bridge which allows us to build with the upcoming | |
23 | python = self.python27Packages.python; |
|
22 | # nixos.16.09 branch (unstable at the moment of writing this note) and the | |
24 | }; |
|
23 | # current stable nixos-16.03. | |
|
24 | backwardsCompatibleFetchgit = { ... }@args: | |||
|
25 | let | |||
|
26 | origSources = nixpkgs.fetchgit args; | |||
|
27 | in | |||
|
28 | nixpkgs.lib.overrideDerivation origSources (oldAttrs: { | |||
|
29 | NIX_PREFETCH_GIT_CHECKOUT_HOOK = '' | |||
|
30 | find $out -name '.git*' -print0 | xargs -0 rm -rf | |||
|
31 | ''; | |||
|
32 | }); | |||
|
33 | ||||
|
34 | # Create a customized version of nixpkgs which should be used throughout the | |||
|
35 | # rest of this file. | |||
|
36 | pkgs = nixpkgs.overridePackages (self: super: { | |||
|
37 | fetchgit = backwardsCompatibleFetchgit; | |||
25 | }); |
|
38 | }); | |
26 |
|
39 | |||
27 | inherit (pkgs.lib) fix extends; |
|
40 | # Evaluates to the last segment of a file system path. | |
|
41 | basename = path: with pkgs.lib; last (splitString "/" path); | |||
|
42 | ||||
|
43 | # source code filter used as arugment to builtins.filterSource. | |||
|
44 | src-filter = path: type: with pkgs.lib; | |||
|
45 | let | |||
|
46 | ext = last (splitString "." path); | |||
|
47 | in | |||
|
48 | !builtins.elem (basename path) [ | |||
|
49 | ".git" ".hg" "__pycache__" ".eggs" | |||
|
50 | "bower_components" "node_modules" | |||
|
51 | "build" "data" "result" "tmp"] && | |||
|
52 | !builtins.elem ext ["egg-info" "pyc"] && | |||
|
53 | # TODO: johbo: This check is wrong, since "path" contains an absolute path, | |||
|
54 | # it would still be good to restore it since we want to ignore "result-*". | |||
|
55 | !hasPrefix "result" path; | |||
28 |
|
56 | |||
29 | basePythonPackages = with builtins; if isAttrs pythonPackages |
|
57 | basePythonPackages = with builtins; if isAttrs pythonPackages | |
30 | then pythonPackages |
|
58 | then pythonPackages | |
@@ -34,25 +62,6 b' let' | |||||
34 | pkgs.buildBowerComponents or |
|
62 | pkgs.buildBowerComponents or | |
35 | (import ./pkgs/backport-16.03-build-bower-components.nix { inherit pkgs; }); |
|
63 | (import ./pkgs/backport-16.03-build-bower-components.nix { inherit pkgs; }); | |
36 |
|
64 | |||
37 | elem = builtins.elem; |
|
|||
38 | basename = path: with pkgs.lib; last (splitString "/" path); |
|
|||
39 | startsWith = prefix: full: let |
|
|||
40 | actualPrefix = builtins.substring 0 (builtins.stringLength prefix) full; |
|
|||
41 | in actualPrefix == prefix; |
|
|||
42 |
|
||||
43 | src-filter = path: type: with pkgs.lib; |
|
|||
44 | let |
|
|||
45 | ext = last (splitString "." path); |
|
|||
46 | in |
|
|||
47 | !elem (basename path) [ |
|
|||
48 | ".git" ".hg" "__pycache__" ".eggs" |
|
|||
49 | "bower_components" "node_modules" |
|
|||
50 | "build" "data" "result" "tmp"] && |
|
|||
51 | !elem ext ["egg-info" "pyc"] && |
|
|||
52 | # TODO: johbo: This check is wrong, since "path" contains an absolute path, |
|
|||
53 | # it would still be good to restore it since we want to ignore "result-*". |
|
|||
54 | !startsWith "result" path; |
|
|||
55 |
|
||||
56 | sources = pkgs.config.rc.sources or {}; |
|
65 | sources = pkgs.config.rc.sources or {}; | |
57 | version = builtins.readFile ./rhodecode/VERSION; |
|
66 | version = builtins.readFile ./rhodecode/VERSION; | |
58 | rhodecode-enterprise-ce-src = builtins.filterSource src-filter ./.; |
|
67 | rhodecode-enterprise-ce-src = builtins.filterSource src-filter ./.; | |
@@ -147,18 +156,6 b' let' | |||||
147 | then "${pkgs.glibcLocales}/lib/locale/locale-archive" |
|
156 | then "${pkgs.glibcLocales}/lib/locale/locale-archive" | |
148 | else ""; |
|
157 | else ""; | |
149 |
|
158 | |||
150 | # Somewhat snappier setup of the development environment |
|
|||
151 | # TODO: move into shell.nix |
|
|||
152 | # TODO: think of supporting a stable path again, so that multiple shells |
|
|||
153 | # can share it. |
|
|||
154 | shellHook = '' |
|
|||
155 | tmp_path=$(mktemp -d) |
|
|||
156 | export PATH="$tmp_path/bin:$PATH" |
|
|||
157 | export PYTHONPATH="$tmp_path/${self.python.sitePackages}:$PYTHONPATH" |
|
|||
158 | mkdir -p $tmp_path/${self.python.sitePackages} |
|
|||
159 | python setup.py develop --prefix $tmp_path --allow-hosts "" |
|
|||
160 | '' + linkNodeAndBowerPackages; |
|
|||
161 |
|
||||
162 | preCheck = '' |
|
159 | preCheck = '' | |
163 | export PATH="$out/bin:$PATH" |
|
160 | export PATH="$out/bin:$PATH" | |
164 | ''; |
|
161 | ''; | |
@@ -226,16 +223,16 b' let' | |||||
226 | rhodecode-testdata-src = sources.rhodecode-testdata or ( |
|
223 | rhodecode-testdata-src = sources.rhodecode-testdata or ( | |
227 | pkgs.fetchhg { |
|
224 | pkgs.fetchhg { | |
228 | url = "https://code.rhodecode.com/upstream/rc_testdata"; |
|
225 | url = "https://code.rhodecode.com/upstream/rc_testdata"; | |
229 | rev = "v0.
|
226 | rev = "v0.9.0"; | |
230 | sha256 = "0hy1ba134rq2f9si85yx7j4qhc9ky0hjzdk553s3q026i7km809m"; |
|
227 | sha256 = "0k0ccb7cncd6mmzwckfbr6l7fsymcympwcm948qc3i0f0m6bbg1y"; | |
231 | }); |
|
228 | }); | |
232 |
|
229 | |||
233 | # Apply all overrides and fix the final package set |
|
230 | # Apply all overrides and fix the final package set | |
234 | myPythonPackagesUnfix = |
|
231 | myPythonPackagesUnfix = with pkgs.lib; | |
235 | (extends pythonExternalOverrides |
|
232 | (extends pythonExternalOverrides | |
236 | (extends pythonLocalOverrides |
|
233 | (extends pythonLocalOverrides | |
237 | (extends pythonOverrides |
|
234 | (extends pythonOverrides | |
238 | pythonGeneratedPackages))); |
|
235 | pythonGeneratedPackages))); | |
239 | myPythonPackages = (fix myPythonPackagesUnfix); |
|
236 | myPythonPackages = (pkgs.lib.fix myPythonPackagesUnfix); | |
240 |
|
237 | |||
241 | in myPythonPackages.rhodecode-enterprise-ce |
|
238 | in myPythonPackages.rhodecode-enterprise-ce |
@@ -62,6 +62,24 b' 3. Select :guilabel:`Save`, and you will' | |||||
62 |
|
62 | |||
63 | .. _md-rst: |
|
63 | .. _md-rst: | |
64 |
|
64 | |||
|
65 | ||||
|
66 | Suppress license warnings or errors | |||
|
67 | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ | |||
|
68 | ||||
|
69 | In case you're running on maximum allowed users, RhodeCode will display a | |||
|
70 | warning message on pages that you're close to the license limits. | |||
|
71 | It's often not desired to show that all the time. Here's how you can suppress | |||
|
72 | the license messages. | |||
|
73 | ||||
|
74 | 1. From the |RCE| interface, select | |||
|
75 | :menuselection:`Admin --> Settings --> Global` | |||
|
76 | 2. Select :guilabel:`Flash message filtering` from the drop-down menu. | |||
|
77 | 3. Select :guilabel:`Save`, and you will no longer see the license message | |||
|
78 | once your page refreshes. | |||
|
79 | ||||
|
80 | .. _admin-tricks-suppress-license-messages: | |||
|
81 | ||||
|
82 | ||||
65 | Markdown or RST Rendering |
|
83 | Markdown or RST Rendering | |
66 | ^^^^^^^^^^^^^^^^^^^^^^^^^ |
|
84 | ^^^^^^^^^^^^^^^^^^^^^^^^^ | |
67 |
|
85 |
@@ -7,10 +7,11 b' Use the following example to configure A' | |||||
7 |
|
7 | |||
8 | .. code-block:: apache |
|
8 | .. code-block:: apache | |
9 |
|
9 | |||
10 | <Location /<someprefix>
|
10 | <Location /<someprefix>/ # Change <someprefix> into your chosen prefix | |
11 | ProxyPass http://127.0.0.1:5000/<someprefix> |
|
11 | ProxyPreserveHost On | |
12 | ProxyPass
|
12 | ProxyPass "http://127.0.0.1:5000/" | |
13 | SetEnvIf X-Url-Scheme https HTTPS=1 |
|
13 | ProxyPassReverse "http://127.0.0.1:5000/" | |
|
14 | Header set X-Url-Scheme https env=HTTPS | |||
14 | </Location> |
|
15 | </Location> | |
15 |
|
16 | |||
16 | In addition to the regular Apache setup you will need to add the following |
|
17 | In addition to the regular Apache setup you will need to add the following |
@@ -9,9 +9,14 b' Backup and Restore' | |||||
9 | To snapshot an instance of |RCE|, and save its settings, you need to backup the |
|
9 | To snapshot an instance of |RCE|, and save its settings, you need to backup the | |
10 | following parts of the system at the same time. |
|
10 | following parts of the system at the same time. | |
11 |
|
11 | |||
12 | * The |repos| managed by the instance. |
|
12 | * The |repos| managed by the instance together with the stored Gists. | |
13 | * The |RCE| database. |
|
13 | * The |RCE| database. | |
14 | * Any configuration files or extensions that you've configured. |
|
14 | * Any configuration files or extensions that you've configured. In most | |
|
15 | cases it's only the :file:`rhodecode.ini` file. | |||
|
16 | * Installer files such as those in `/opt/rhodecode` can be backed-up, however | |||
|
17 | it's not required since in case of a recovery installer simply | |||
|
18 | re-creates those. | |||
|
19 | ||||
15 |
|
20 | |||
16 | .. important:: |
|
21 | .. important:: | |
17 |
|
22 | |||
@@ -29,11 +34,17 b' Repository Backup' | |||||
29 | ^^^^^^^^^^^^^^^^^ |
|
34 | ^^^^^^^^^^^^^^^^^ | |
30 |
|
35 | |||
31 | To back up your |repos|, use the API to get a list of all |repos| managed, |
|
36 | To back up your |repos|, use the API to get a list of all |repos| managed, | |
32 | and then clone them to your backup location. |
|
37 | and then clone them to your backup location. This is the most safe backup option. | |
|
38 | Backing up the storage directory could potentially result in a backup of | |||
|
39 | partially committed files or commits. (Backup taking place during a big push) | |||
|
40 | As an alternative you could use a rsync or simple `cp` commands if you can | |||
|
41 | ensure your instance is only in read-only mode or stopped at the moment. | |||
|
42 | ||||
33 |
|
43 | |||
34 | Use the ``get_repos`` method to list all your managed |repos|, |
|
44 | Use the ``get_repos`` method to list all your managed |repos|, | |
35 | and use the ``clone_uri`` information that is returned. See the :ref:`api` |
|
45 | and use the ``clone_uri`` information that is returned. See the :ref:`api` | |
36 | for more information. |
|
46 | for more information. Be sure to keep the structure or repositories with their | |
|
47 | repository groups. | |||
37 |
|
48 | |||
38 | .. important:: |
|
49 | .. important:: | |
39 |
|
50 | |||
@@ -54,13 +65,19 b' backup location:' | |||||
54 | .. code-block:: bash |
|
65 | .. code-block:: bash | |
55 |
|
66 | |||
56 | # For MySQL DBs |
|
67 | # For MySQL DBs | |
57 | $ mysqldump -u <uname> -p <pass> db_name > mysql-db-backup |
|
68 | $ mysqldump -u <uname> -p <pass> rhodecode_db_name > mysql-db-backup | |
|
69 | # MySQL restore command | |||
|
70 | $ mysql -u <uname> -p <pass> rhodecode_db_name < mysql-db-backup | |||
58 |
|
71 | |||
59 | # For PostgreSQL DBs |
|
72 | # For PostgreSQL DBs | |
60 | $ pg_dump dbname > postgresql-db-backup |
|
73 | $ PGPASSWORD=<pass> pg_dump rhodecode_db_name > postgresql-db-backup | |
|
74 | # PosgreSQL restore | |||
|
75 | $ PGPASSWORD=<pass> psql -U <uname> -h localhost -d rhodecode_db_name -1 -f postgresql-db-backup | |||
61 |
|
76 | |||
62 | # For SQL
|
77 | # For SQLite | |
63 | $ sqlite3 rhodecode.db ‘.dump’ > sqlite-db-backup |
|
78 | $ sqlite3 rhodecode.db ‘.dump’ > sqlite-db-backup | |
|
79 | # SQLite restore | |||
|
80 | $ copy sqlite-db-backup rhodecode.db | |||
64 |
|
81 | |||
65 |
|
82 | |||
66 | The default |RCE| SQLite database location is |
|
83 | The default |RCE| SQLite database location is | |
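The dump commands in the hunk above are easy to wrap in a small script so every backup gets a predictable, timestamped name. A minimal sketch, assuming a PostgreSQL database named `rhodecode_db_name` (the helper function and the naming scheme are illustrative, not part of RhodeCode):

```shell
#!/bin/sh
# Compose a predictable backup file name for a given database and date stamp.
backup_name() {
    printf '%s-backup-%s.sql' "$1" "$2"
}

# Hypothetical usage (requires pg_dump and valid credentials):
#   PGPASSWORD=<pass> pg_dump rhodecode_db_name \
#       > "$(backup_name rhodecode_db_name "$(date +%Y%m%d)")"
backup_name rhodecode_db_name 20160101
```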
@@ -75,13 +92,18 b' Configuration File Backup' | |||||
75 | Depending on your setup, you could have a number of configuration files that |
|
92 | Depending on your setup, you could have a number of configuration files that | |
76 | should be backed up. You may have some, or all of the configuration files |
|
93 | should be backed up. You may have some, or all of the configuration files | |
77 | listed in the :ref:`config-rce-files` section. Ideally you should back these |
|
94 | listed in the :ref:`config-rce-files` section. Ideally you should back these | |
78 | up at the same time as the database and |repos|. |
|
95 | up at the same time as the database and |repos|. It really depends on if you need | |
|
96 | the configuration file like logs, custom modules. We always recommend backing | |||
|
97 | those up. | |||
79 |
|
98 | |||
80 | Gist Backup |
|
99 | Gist Backup | |
81 | ^^^^^^^^^^^ |
|
100 | ^^^^^^^^^^^ | |
82 |
|
101 | |||
83 |
To backup the gists on your |RCE| instance you |
|
102 | To backup the gists on your |RCE| instance you usually have to backup the | |
84 | ``get_gists`` API methods to fetch the gists for each user on the instance. |
|
103 | gist storage path. If this haven't been changed it's located inside | |
|
104 | :file:`.rc_gist_store` and the metadata in :file:`.rc_gist_metadata`. | |||
|
105 | You can use the ``get_users`` and ``get_gists`` API methods to fetch the | |||
|
106 | gists for each user on the instance. | |||
85 |
|
107 | |||
86 | Extension Backups |
|
108 | Extension Backups | |
87 | ^^^^^^^^^^^^^^^^^ |
|
109 | ^^^^^^^^^^^^^^^^^ | |
@@ -100,15 +122,17 b' the :ref:`indexing-ref` section.' | |||||
100 | Restoration Steps |
|
122 | Restoration Steps | |
101 | ----------------- |
|
123 | ----------------- | |
102 |
|
124 | |||
103 | To restore an instance of |RCE| from its backed up components,
|
125 | To restore an instance of |RCE| from its backed up components, to a fresh | |
104 | following steps. |
|
126 | system use the following steps. | |
105 |
|
127 | |||
106 | 1. Install a new instance of |RCE|. |
|
128 | 1. Install a new instance of |RCE| using sqlite option as database. | |
107 | 2. Once installed, configure the instance to use the backed up |
|
129 | 2. Restore your database. | |
108 | :file:`rhodecode.ini` file. Ensure this file points to the backed up |
|
130 | 3. Once installed, replace you backed up the :file:`rhodecode.ini` with your | |
|
131 | backup version. Ensure this file points to the restored | |||
109 | database, see the :ref:`config-database` section. |
|
132 | database, see the :ref:`config-database` section. | |
110 |
|
|
133 | 4. Restart |RCE| and remap and rescan your |repos| to verify filesystem access, | |
111 | :ref:`remap-rescan` section. |
|
134 | see the :ref:`remap-rescan` section. | |
|
135 | ||||
112 |
|
136 | |||
113 | Post Restoration Steps |
|
137 | Post Restoration Steps | |
114 | ^^^^^^^^^^^^^^^^^^^^^^ |
|
138 | ^^^^^^^^^^^^^^^^^^^^^^ |
@@ -22,8 +22,8 b' account permissions.' | |||||
22 | .. code-block:: bash |
|
22 | .. code-block:: bash | |
23 |
|
23 | |||
24 | # Open iShell from the terminal |
|
24 | # Open iShell from the terminal | |
25 | $ .rccontrol/enterprise-
|
25 | $ .rccontrol/enterprise-1/profile/bin/paster \ | |
26 | ishell .rccontrol/enterprise-
|
26 | ishell .rccontrol/enterprise-1/rhodecode.ini | |
27 |
|
27 | |||
28 | .. code-block:: mysql |
|
28 | .. code-block:: mysql | |
29 |
|
29 | |||
@@ -52,11 +52,12 b' following example to make changes to thi' | |||||
52 | .. code-block:: mysql |
|
52 | .. code-block:: mysql | |
53 |
|
53 | |||
54 | # Use this example to enable global .hgrc access |
|
54 | # Use this example to enable global .hgrc access | |
55 | In [
|
55 | In [1]: new_option = RhodeCodeUi() | |
56 | In [
|
56 | In [2]: new_option.ui_section='web' | |
57 | In [
|
57 | In [3]: new_option.ui_key='allow_push' | |
58 | In [
|
58 | In [4]: new_option.ui_value='*' | |
59 | In [
|
59 | In [5]: Session().add(new_option);Session().commit() | |
|
60 | In [6]: exit() | |||
60 |
|
61 | |||
61 | Manually Reset Password |
|
62 | Manually Reset Password | |
62 | ^^^^^^^^^^^^^^^^^^^^^^^ |
|
63 | ^^^^^^^^^^^^^^^^^^^^^^^ | |
@@ -72,24 +73,47 b' Use the following code example to carry ' | |||||
72 | .. code-block:: bash |
|
73 | .. code-block:: bash | |
73 |
|
74 | |||
74 | # starts the ishell interactive prompt |
|
75 | # starts the ishell interactive prompt | |
75 | $ .rccontrol/enterprise-
|
76 | $ .rccontrol/enterprise-1/profile/bin/paster \ | |
76 | ishell .rccontrol/enterprise-
|
77 | ishell .rccontrol/enterprise-1/rhodecode.ini | |
77 |
|
78 | |||
78 | .. code-block:: mysql |
|
79 | .. code-block:: mysql | |
79 |
|
80 | |||
80 | from rhodecode.lib.auth import generate_auth_token |
|
81 | In [1]: from rhodecode.lib.auth import generate_auth_token | |
81 | from rhodecode.lib.auth import get_crypt_password |
|
82 | In [2]: from rhodecode.lib.auth import get_crypt_password | |
82 |
|
||||
83 | # Enter the user name whose password you wish to change |
|
83 | # Enter the user name whose password you wish to change | |
84 | my_user = 'USERNAME' |
|
84 | In [3]: my_user = 'USERNAME' | |
85 | u = User.get_by_username(my_user) |
|
85 | In [4]: u = User.get_by_username(my_user) | |
86 |
|
||||
87 | # If this fails then the user does not exist |
|
86 | # If this fails then the user does not exist | |
88 | u.auth_token = generate_auth_token(my_user) |
|
87 | In [5]: u.auth_token = generate_auth_token(my_user) | |
89 |
|
||||
90 | # Set the new password |
|
88 | # Set the new password | |
91 | u.password = get_crypt_password('PASSWORD') |
|
89 | In [6]: u.password = get_crypt_password('PASSWORD') | |
|
90 | In [7]: Session().add(u);Session().commit() | |||
|
91 | In [8]: exit() | |||
|
92 | ||||
|
93 | ||||
|
94 | ||||
|
95 | Change user details | |||
|
96 | ^^^^^^^^^^^^^^^^^^^ | |||
|
97 | ||||
|
98 | If you need to manually change some of users details, use the following steps. | |||
|
99 | ||||
|
100 | 1. Navigate to your |RCE| install location. | |||
|
101 | 2. Run the interactive ``ishell`` prompt. | |||
|
102 | 3. Set the new details for the user. | |||
92 |
|
103 | |||
93 | Session().add(u) |
|
104 | Use the following code example to carry out these steps. | |
94 | Session().commit() |
|
105 | ||
95 | exit |
|
106 | .. code-block:: bash | |
|
107 | ||||
|
108 | # starts the ishell interactive prompt | |||
|
109 | $ .rccontrol/enterprise-1/profile/bin/paster \ | |||
|
110 | ishell .rccontrol/enterprise-1/rhodecode.ini | |||
|
111 | ||||
|
112 | .. code-block:: mysql | |||
|
113 | ||||
|
114 | # Use this example to change email and username of LDAP user | |||
|
115 | In [1]: my_user = User.get_by_username('some_username') | |||
|
116 | In [2]: my_user.email = 'new_email@foobar.com' | |||
|
117 | In [3]: my_user.username = 'SomeUser' | |||
|
118 | In [4]: Session().add(my_user);Session().commit() | |||
|
119 | In [5]: exit() |
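The reset flow in the hunk above leans on two helpers from ``rhodecode.lib.auth``. As a hedged illustration only, here are stdlib stand-ins that mimic the *shape* of what those helpers return; the real implementations differ and use a stronger password-hashing scheme:

```python
import hashlib
import secrets

def generate_auth_token(username):
    # Stand-in: a random, user-scoped 40-char hex token. The real
    # rhodecode.lib.auth.generate_auth_token is implemented differently.
    return hashlib.sha1((username + secrets.token_hex(16)).encode()).hexdigest()

def get_crypt_password(password):
    # Stand-in: salted one-way hash stored as "salt$digest". The real
    # helper uses a proper password-hashing scheme, not plain sha256.
    salt = secrets.token_hex(8)
    digest = hashlib.sha256((salt + password).encode()).hexdigest()
    return salt + "$" + digest

token = generate_auth_token("USERNAME")
hashed = get_crypt_password("PASSWORD")
```

Because the salt is random, two calls with the same password produce different stored values, which is why the documented flow always writes a freshly computed hash.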
@@ -36,7 +36,7 b' 1. On your local machine create the publ' | |||||
36 | Your public key has been saved in /home/user/.ssh/id_rsa.pub. |
|
36 | Your public key has been saved in /home/user/.ssh/id_rsa.pub. | |
37 | The key fingerprint is: |
|
37 | The key fingerprint is: | |
38 | 02:82:38:95:e5:30:d2:ad:17:60:15:7f:94:17:9f:30 user@ubuntu |
|
38 | 02:82:38:95:e5:30:d2:ad:17:60:15:7f:94:17:9f:30 user@ubuntu | |
39 | The key's randomart image is: |
|
39 | The key\'s randomart image is: | |
40 | +--[ RSA 2048]----+ |
|
40 | +--[ RSA 2048]----+ | |
41 |
|
41 | |||
42 | 2. SFTP to your server, and copy the public key to the ``~/.ssh`` folder. |
|
42 | 2. SFTP to your server, and copy the public key to the ``~/.ssh`` folder. |
@@ -18,6 +18,7 b' The following are the most common system' | |||||
18 |
|
18 | |||
19 | config-files-overview |
|
19 | config-files-overview | |
20 | vcs-server |
|
20 | vcs-server | |
|
21 | svn-http | |||
21 | apache-config |
|
22 | apache-config | |
22 | nginx-config |
|
23 | nginx-config | |
23 | backup-restore |
|
24 | backup-restore |
@@ -18,7 +18,7 b' 1. Open ishell from the terminal and use' | |||||
18 | 2. Run the following commands, and ensure that |RCE| has write access to the |
|
18 | 2. Run the following commands, and ensure that |RCE| has write access to the | |
19 | new directory: |
|
19 | new directory: | |
20 |
|
20 | |||
21 |
.. code-block:: |
|
21 | .. code-block:: bash | |
22 |
|
22 | |||
23 |
# Once logged into the database, use SQL to redirect |
|
23 | # Once logged into the database, use SQL to redirect | |
24 |
# the large files location |
|
24 | # the large files location |
@@ -298,133 +298,7 b' For a more detailed explanation of the l' | |||||
298 | format = %(asctime)s.%(msecs)03d %(levelname)-5.5s [%(name)s] %(message)s |
|
298 | format = %(asctime)s.%(msecs)03d %(levelname)-5.5s [%(name)s] %(message)s | |
299 | datefmt = %Y-%m-%d %H:%M:%S |
|
299 | datefmt = %Y-%m-%d %H:%M:%S | |
300 |
|
300 | |||
301 | .. _svn-http: |
|
|||
302 |
|
||||
303 | |svn| With Write Over HTTP |
|
|||
304 | ^^^^^^^^^^^^^^^^^^^^^^^^^^ |
|
|||
305 |
|
||||
306 | To use |svn| with read/write support over the |svn| HTTP protocol, you have to |
|
|||
307 | configure the HTTP |svn| backend. |
|
|||
308 |
|
||||
309 | Prerequisites |
|
|||
310 | ============= |
|
|||
311 |
|
||||
312 | - Enable HTTP support inside the admin VCS settings on your |RCE| instance |
|
|||
313 | - You need to install the following tools on the machine that is running an |
|
|||
314 | instance of |RCE|: |
|
|||
315 | ``Apache HTTP Server`` and |
|
|||
316 | ``mod_dav_svn``. |
|
|||
317 |
|
||||
318 |
|
||||
319 | Using Ubuntu Distribution as an example you can run: |
|
|||
320 |
|
||||
321 | .. code-block:: bash |
|
|||
322 |
|
||||
323 | $ sudo apt-get install apache2 libapache2-mod-svn |
|
|||
324 |
|
||||
325 | Once installed you need to enable ``dav_svn``: |
|
|||
326 |
|
||||
327 | .. code-block:: bash |
|
|||
328 |
|
||||
329 | $ sudo a2enmod dav_svn |
|
|||
330 |
|
||||
331 | Configuring Apache Setup |
|
|||
332 | ======================== |
|
|||
333 |
|
||||
334 | .. tip:: |
|
|||
335 |
|
||||
336 | It is recommended to run Apache on a port other than 80, due to possible |
|
|||
337 | conflicts with other HTTP servers like nginx. To do this, set the |
|
|||
338 | ``Listen`` parameter in the ``/etc/apache2/ports.conf`` file, for example |
|
|||
339 | ``Listen 8090``. |
|
|||
340 |
|
||||
341 |
|
||||
342 | .. warning:: |
|
|||
343 |
|
||||
344 | Make sure your Apache instance which runs the mod_dav_svn module is |
|
|||
345 | only accessible by RhodeCode. Otherwise everyone is able to browse |
|
|||
346 | the repositories or run subversion operations (checkout/commit/etc.). |
|
|||
347 |
|
||||
348 | It is also recommended to run apache as the same user as |RCE|, otherwise |
|
|||
349 | permission issues could occur. To do this edit the ``/etc/apache2/envvars`` |
|
|||
350 |
|
||||
351 | .. code-block:: apache |
|
|||
352 |
|
||||
353 | export APACHE_RUN_USER=rhodecode |
|
|||
354 | export APACHE_RUN_GROUP=rhodecode |
|
|||
355 |
|
||||
356 | 1. To configure Apache, create and edit a virtual hosts file, for example |
|
|||
357 | :file:`/etc/apache2/sites-available/default.conf`. Below is an example |
|
|||
358 | how to use one with auto-generated config ```mod_dav_svn.conf``` |
|
|||
359 | from configured |RCE| instance. |
|
|||
360 |
|
||||
361 | .. code-block:: apache |
|
|||
362 |
|
||||
363 | <VirtualHost *:8080> |
|
|||
364 | ServerAdmin rhodecode-admin@localhost |
|
|||
365 | DocumentRoot /var/www/html |
|
|||
366 | ErrorLog ${'${APACHE_LOG_DIR}'}/error.log |
|
|||
367 | CustomLog ${'${APACHE_LOG_DIR}'}/access.log combined |
|
|||
368 | Include /home/user/.rccontrol/enterprise-1/mod_dav_svn.conf |
|
|||
369 | </VirtualHost> |
|
|||
370 |
|
||||
371 |
|
||||
372 | 2. Go to the :menuselection:`Admin --> Settings --> VCS` page, and |
|
|||
373 | enable :guilabel:`Proxy Subversion HTTP requests`, and specify the |
|
|||
374 | :guilabel:`Subversion HTTP Server URL`. |
|
|||
375 |
|
||||
376 | 3. Open the |RCE| configuration file, |
|
|||
377 | :file:`/home/{user}/.rccontrol/{instance-id}/rhodecode.ini` |
|
|||
378 |
|
||||
379 | 4. Add the following configuration option in the ``[app:main]`` |
|
|||
380 | section if you don't have it yet. |
|
|||
381 |
|
||||
382 | This enables mapping of the created |RCE| repo groups into special |svn| paths. |
|
|||
383 | Each time a new repository group is created, the system will update |
|
|||
384 | the template file and create new mapping. Apache web server needs to be |
|
|||
385 | reloaded to pick up the changes on this file. |
|
|||
386 | It's recommended to add reload into a crontab so the changes can be picked |
|
|||
387 | automatically once someone creates a repository group inside RhodeCode. |
|
|||
388 |
|
||||
389 |
|
||||
390 | .. code-block:: ini |
|
|||
391 |
|
||||
392 | ############################################## |
|
|||
393 | ### Subversion proxy support (mod_dav_svn) ### |
|
|||
394 | ############################################## |
|
|||
395 | ## Enable or disable the config file generation. |
|
|||
396 | svn.proxy.generate_config = true |
|
|||
397 | ## Generate config file with `SVNListParentPath` set to `On`. |
|
|||
398 | svn.proxy.list_parent_path = true |
|
|||
399 | ## Set location and file name of generated config file. |
|
|||
400 | svn.proxy.config_file_path = %(here)s/mod_dav_svn.conf |
|
|||
401 | ## File system path to the directory containing the repositories served by |
|
|||
402 | ## RhodeCode. |
|
|||
403 | svn.proxy.parent_path_root = /path/to/repo_store |
|
|||
404 | ## Used as a prefix to the <Location> block in the generated config file. In |
|
|||
405 | ## most cases it should be set to `/`. |
|
|||
406 | svn.proxy.location_root = / |
|
|||
407 |
|
||||
408 |
|
||||
409 | This would create a special template file called ```mod_dav_svn.conf```. We |
|
|||
410 | used that file path in the apache config above inside the Include statement. |
|
|||
411 |
|
||||
412 |
|
||||
413 | Using |svn| |
|
|||
414 | =========== |
|
|||
415 |
|
||||
416 | Once |svn| has been enabled on your instance, you can use it with the |
|
|||
417 | following examples. For more |svn| information, see the `Subversion Red Book`_ |
|
|||
418 |
|
||||
419 | .. code-block:: bash |
|
|||
420 |
|
||||
421 | # To clone a repository |
|
|||
422 | svn checkout http://my-svn-server.example.com/my-svn-repo |
|
|||
423 |
|
||||
424 | # svn commit |
|
|||
425 | svn commit |
|
|||
426 |
|
301 | |||
427 | .. _Subversion Red Book: http://svnbook.red-bean.com/en/1.7/svn-book.html#svn.ref.svn |
|
302 | .. _Subversion Red Book: http://svnbook.red-bean.com/en/1.7/svn-book.html#svn.ref.svn | |
428 |
|
303 | |||
429 |
|
304 | .. _Ask Ubuntu: http://askubuntu.com/questions/162391/how-do-i-fix-my-locale-issue | ||
430 | .. _Ask Ubuntu: http://askubuntu.com/questions/162391/how-do-i-fix-my-locale-issue No newline at end of file |
|
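The section removed above (now in ``svn-http``, see the new file) describes how |RCE| regenerates ``mod_dav_svn.conf`` from the ``svn.proxy.*`` settings, mapping each repository group to an Apache ``<Location>`` block. A hedged sketch of that mapping follows; the directive names come from the section itself, but the exact template |RCE| emits is an assumption:

```python
def render_mod_dav_svn_conf(parent_path_root, location_root, repo_groups,
                            list_parent_path=True):
    # One <Location> block per repository group; SVNParentPath points at
    # the group's directory under the repository store.
    blocks = []
    for group in repo_groups:
        blocks.append("\n".join([
            "<Location %s%s>" % (location_root, group),
            "    DAV svn",
            "    SVNParentPath %s/%s" % (parent_path_root, group),
            "    SVNListParentPath %s" % ("On" if list_parent_path else "Off"),
            "</Location>",
        ]))
    return "\n\n".join(blocks)

conf = render_mod_dav_svn_conf("/path/to/repo_store", "/", ["foo/bar"])
```

This is why Apache has to be reloaded after a new repository group is created: the generated file gains a new ``<Location>`` block that Apache only picks up on reload.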
@@ -1,7 +1,7 b'' | |||||
1 | .. _deprecated-methods-ref: |
|
1 | .. _deprecated-methods-ref: | |
2 |
|
2 | |||
3 | deprecated methods |
|
3 | deprecated methods | |
4 | ================= |
|
4 | ================== | |
5 |
|
5 | |||
6 | changeset_comment |
|
6 | changeset_comment | |
7 | ----------------- |
|
7 | ----------------- |
@@ -1,7 +1,7 b'' | |||||
1 | .. _gist-methods-ref: |
|
1 | .. _gist-methods-ref: | |
2 |
|
2 | |||
3 | gist methods |
|
3 | gist methods | |
4 |
============ |
|
4 | ============ | |
5 |
|
5 | |||
6 | create_gist |
|
6 | create_gist | |
7 | ----------- |
|
7 | ----------- |
@@ -1,10 +1,10 b'' | |||||
1 | .. _license-methods-ref: |
|
1 | .. _license-methods-ref: | |
2 |
|
2 | |||
3 | license methods |
|
3 | license methods | |
4 |
=============== |
|
4 | =============== | |
5 |
|
5 | |||
6 | get_license_info (EE only) |
|
6 | get_license_info (EE only) | |
7 | ---------------- |
|
7 | -------------------------- | |
8 |
|
8 | |||
9 | .. py:function:: get_license_info(apiuser) |
|
9 | .. py:function:: get_license_info(apiuser) | |
10 |
|
10 | |||
@@ -32,7 +32,7 b' get_license_info (EE only)' | |||||
32 |
|
32 | |||
33 |
|
33 | |||
34 | set_license_key (EE only) |
|
34 | set_license_key (EE only) | |
35 | --------------- |
|
35 | ------------------------- | |
36 |
|
36 | |||
37 | .. py:function:: set_license_key(apiuser, key) |
|
37 | .. py:function:: set_license_key(apiuser, key) | |
38 |
|
38 |
@@ -1,7 +1,7 b'' | |||||
1 | .. _pull-request-methods-ref: |
|
1 | .. _pull-request-methods-ref: | |
2 |
|
2 | |||
3 | pull_request methods |
|
3 | pull_request methods | |
4 | ================= |
|
4 | ==================== | |
5 |
|
5 | |||
6 | close_pull_request |
|
6 | close_pull_request | |
7 | ------------------ |
|
7 | ------------------ | |
@@ -103,6 +103,10 b' create_pull_request' | |||||
103 | :type description: Optional(str) |
|
103 | :type description: Optional(str) | |
104 | :param reviewers: Set the new pull request reviewers list. |
|
104 | :param reviewers: Set the new pull request reviewers list. | |
105 | :type reviewers: Optional(list) |
|
105 | :type reviewers: Optional(list) | |
|
106 | Accepts username strings or objects of the format: | |||
|
107 | { | |||
|
108 | 'username': 'nick', 'reasons': ['original author'] | |||
|
109 | } | |||
106 |
|
110 | |||
107 |
|
111 | |||
108 | get_pull_request |
|
112 | get_pull_request | |
@@ -165,6 +169,15 b' get_pull_request' | |||||
165 | "commit_id": "<commit_id>", |
|
169 | "commit_id": "<commit_id>", | |
166 | } |
|
170 | } | |
167 | }, |
|
171 | }, | |
|
172 | "merge": { | |||
|
173 | "clone_url": "<clone_url>", | |||
|
174 | "reference": | |||
|
175 | { | |||
|
176 | "name": "<name>", | |||
|
177 | "type": "<type>", | |||
|
178 | "commit_id": "<commit_id>", | |||
|
179 | } | |||
|
180 | }, | |||
168 | "author": <user_obj>, |
|
181 | "author": <user_obj>, | |
169 | "reviewers": [ |
|
182 | "reviewers": [ | |
170 | ... |
|
183 | ... | |
@@ -241,6 +254,15 b' get_pull_requests' | |||||
241 | "commit_id": "<commit_id>", |
|
254 | "commit_id": "<commit_id>", | |
242 | } |
|
255 | } | |
243 | }, |
|
256 | }, | |
|
257 | "merge": { | |||
|
258 | "clone_url": "<clone_url>", | |||
|
259 | "reference": | |||
|
260 | { | |||
|
261 | "name": "<name>", | |||
|
262 | "type": "<type>", | |||
|
263 | "commit_id": "<commit_id>", | |||
|
264 | } | |||
|
265 | }, | |||
244 | "author": <user_obj>, |
|
266 | "author": <user_obj>, | |
245 | "reviewers": [ |
|
267 | "reviewers": [ | |
246 | ... |
|
268 | ... | |
@@ -284,7 +306,12 b' merge_pull_request' | |||||
284 | "executed": "<bool>", |
|
306 | "executed": "<bool>", | |
285 | "failure_reason": "<int>", |
|
307 | "failure_reason": "<int>", | |
286 | "merge_commit_id": "<merge_commit_id>", |
|
308 | "merge_commit_id": "<merge_commit_id>", | |
287 | "possible": "<bool>" |
|
309 | "possible": "<bool>", | |
|
310 | "merge_ref": { | |||
|
311 | "commit_id": "<commit_id>", | |||
|
312 | "type": "<type>", | |||
|
313 | "name": "<name>" | |||
|
314 | } | |||
288 | }, |
|
315 | }, | |
289 | "error": null |
|
316 | "error": null | |
290 |
|
317 |
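The ``merge_pull_request`` hunk above adds a ``merge_ref`` object to the response. A hedged sketch of client-side handling, with every value hypothetical and the response shaped like the example output in the hunk:

```python
import json

# Hypothetical response body mirroring the documented example output.
raw = json.dumps({
    "id": 1,
    "result": {
        "executed": True,
        "failure_reason": 0,
        "merge_commit_id": "deadbeef",
        "possible": True,
        "merge_ref": {"commit_id": "deadbeef", "type": "branch", "name": "default"},
    },
    "error": None,
})

resp = json.loads(raw)
result = resp["result"]
merged_into = None
if resp["error"] is None and result["executed"]:
    # merge_ref tells the client which ref now contains the merge commit.
    merged_into = result["merge_ref"]["name"]
```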
@@ -1,24 +1,25 b'' | |||||
1 | .. _repo-group-methods-ref: |
|
1 | .. _repo-group-methods-ref: | |
2 |
|
2 | |||
3 | repo_group methods |
|
3 | repo_group methods | |
4 | ================= |
|
4 | ================== | |
5 |
|
5 | |||
6 | create_repo_group |
|
6 | create_repo_group | |
7 | ----------------- |
|
7 | ----------------- | |
8 |
|
8 | |||
9 |
.. py:function:: create_repo_group(apiuser, group_name, |
|
9 | .. py:function:: create_repo_group(apiuser, group_name, owner=<Optional:<OptionalAttr:apiuser>>, description=<Optional:''>, copy_permissions=<Optional:False>) | |
10 |
|
10 | |||
11 | Creates a repository group. |
|
11 | Creates a repository group. | |
12 |
|
12 | |||
13 |
* If the repository group name contains "/", |
|
13 | * If the repository group name contains "/", repository group will be | |
14 | groups will be created. |
|
14 | created inside a repository group or nested repository groups | |
15 |
|
15 | |||
16 |
For example "foo/bar/ |
|
16 | For example "foo/bar/group1" will create repository group called "group1" | |
17 | (with "foo" as parent). It will also create the "baz" repository |
|
17 | inside group "foo/bar". You have to have permissions to access and | |
18 | with "bar" as |repo| group. |
|
18 | write to the last repository group ("bar" in this example) | |
19 |
|
19 | |||
20 |
This command can only be run using an |authtoken| with a |
|
20 | This command can only be run using an |authtoken| with at least | |
21 | permissions. |
|
21 | permissions to create repository groups, or admin permissions to | |
|
22 | parent repository groups. | |||
22 |
|
23 | |||
23 | :param apiuser: This is filled automatically from the |authtoken|. |
|
24 | :param apiuser: This is filled automatically from the |authtoken|. | |
24 | :type apiuser: AuthUser |
|
25 | :type apiuser: AuthUser | |
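The reworded docstring above explains the "/" naming rule for ``create_repo_group``. The rule itself can be sketched in a few lines; this is an illustration of the naming convention only, not RhodeCode's implementation:

```python
def split_group_name(full_name):
    # "foo/bar/group1" -> parent path "foo/bar", new group "group1";
    # a name without "/" has no parent group.
    parent, _, name = full_name.rpartition("/")
    return (parent or None), name

assert split_group_name("foo/bar/group1") == ("foo/bar", "group1")
assert split_group_name("group1") == (None, "group1")
```

The permission check described in the docstring applies to the parent component ("foo/bar" here), since that is the group being written into.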
@@ -73,7 +74,7 b' delete_repo_group' | |||||
73 |
|
74 | |||
74 | id : <id_given_in_input> |
|
75 | id : <id_given_in_input> | |
75 | result : { |
|
76 | result : { | |
76 | 'msg': 'deleted repo group ID:<repogroupid> <repogroupname> |
|
77 | 'msg': 'deleted repo group ID:<repogroupid> <repogroupname>' | |
77 | 'repo_group': null |
|
78 | 'repo_group': null | |
78 | } |
|
79 | } | |
79 | error : null |
|
80 | error : null | |
@@ -325,13 +326,22 b' revoke_user_permission_from_repo_group' | |||||
325 | update_repo_group |
|
326 | update_repo_group | |
326 | ----------------- |
|
327 | ----------------- | |
327 |
|
328 | |||
328 |
.. py:function:: update_repo_group(apiuser, repogroupid, group_name=<Optional:''>, description=<Optional:''>, owner=<Optional:<OptionalAttr:apiuser> |
|
329 | .. py:function:: update_repo_group(apiuser, repogroupid, group_name=<Optional:''>, description=<Optional:''>, owner=<Optional:<OptionalAttr:apiuser>>, enable_locking=<Optional:False>) | |
329 |
|
330 | |||
330 | Updates repository group with the details given. |
|
331 | Updates repository group with the details given. | |
331 |
|
332 | |||
332 | This command can only be run using an |authtoken| with admin |
|
333 | This command can only be run using an |authtoken| with admin | |
333 | permissions. |
|
334 | permissions. | |
334 |
|
335 | |||
|
336 | * If the group_name name contains "/", repository group will be updated | |||
|
337 | accordingly with a repository group or nested repository groups | |||
|
338 | ||||
|
339 | For example repogroupid=group-test group_name="foo/bar/group-test" | |||
|
340 | will update repository group called "group-test" and place it | |||
|
341 | inside group "foo/bar". | |||
|
342 | You have to have permissions to access and write to the last repository | |||
|
343 | group ("bar" in this example) | |||
|
344 | ||||
335 | :param apiuser: This is filled automatically from the |authtoken|. |
|
345 | :param apiuser: This is filled automatically from the |authtoken|. | |
336 | :type apiuser: AuthUser |
|
346 | :type apiuser: AuthUser | |
337 | :param repogroupid: Set the ID of repository group. |
|
347 | :param repogroupid: Set the ID of repository group. | |
@@ -342,8 +352,6 b' update_repo_group' | |||||
342 | :type description: str |
|
352 | :type description: str | |
343 | :param owner: Set the |repo| group owner. |
|
353 | :param owner: Set the |repo| group owner. | |
344 | :type owner: str |
|
354 | :type owner: str | |
345 | :param parent: Set the |repo| group parent. |
|
|||
346 | :type parent: str or int |
|
|||
347 | :param enable_locking: Enable |repo| locking. The default is false. |
|
355 | :param enable_locking: Enable |repo| locking. The default is false. | |
348 | :type enable_locking: bool |
|
356 | :type enable_locking: bool | |
349 |
|
357 |
@@ -1,7 +1,7 b'' | |||||
1 | .. _repo-methods-ref: |
|
1 | .. _repo-methods-ref: | |
2 |
|
2 | |||
3 | repo methods |
|
3 | repo methods | |
4 |
============ |
|
4 | ============ | |
5 |
|
5 | |||
6 | add_field_to_repo |
|
6 | add_field_to_repo | |
7 | ----------------- |
|
7 | ----------------- | |
@@ -68,15 +68,16 b' create_repo' | |||||
68 |
|
68 | |||
69 | Creates a repository. |
|
69 | Creates a repository. | |
70 |
|
70 | |||
71 |
* If the repository name contains "/", |
|
71 | * If the repository name contains "/", repository will be created inside | |
72 | groups will be created. |
|
72 | a repository group or nested repository groups | |
73 |
|
73 | |||
74 |
For example "foo/bar/ |
|
74 | For example "foo/bar/repo1" will create |repo| called "repo1" inside | |
75 | (with "foo" as parent). It will also create the "baz" repository |
|
75 | group "foo/bar". You have to have permissions to access and write to | |
76 | with "bar" as |repo| group. |
|
76 | the last repository group ("bar" in this example) | |
77 |
|
77 | |||
78 | This command can only be run using an |authtoken| with at least |
|
78 | This command can only be run using an |authtoken| with at least | |
79 | write permissions to the |repo|. |
|
79 | permissions to create repositories, or write permissions to | |
|
80 | parent repository groups. | |||
80 |
|
81 | |||
81 | :param apiuser: This is filled automatically from the |authtoken|. |
|
82 | :param apiuser: This is filled automatically from the |authtoken|. | |
82 | :type apiuser: AuthUser |
|
83 | :type apiuser: AuthUser | |
@@ -88,9 +89,9 b' create_repo' | |||||
88 | :type owner: Optional(str) |
|
89 | :type owner: Optional(str) | |
89 | :param description: Set the repository description. |
|
90 | :param description: Set the repository description. | |
90 | :type description: Optional(str) |
|
91 | :type description: Optional(str) | |
91 | :param private: |
|
92 | :param private: set repository as private | |
92 | :type private: bool |
|
93 | :type private: bool | |
93 | :param clone_uri: |
|
94 | :param clone_uri: set clone_uri | |
94 | :type clone_uri: str |
|
95 | :type clone_uri: str | |
95 | :param landing_rev: <rev_type>:<rev> |
|
96 | :param landing_rev: <rev_type>:<rev> | |
96 | :type landing_rev: str |
|
97 | :type landing_rev: str | |
@@ -125,7 +126,7 b' create_repo' | |||||
125 | id : <id_given_in_input> |
|
126 | id : <id_given_in_input> | |
126 | result : null |
|
127 | result : null | |
127 | error : { |
|
128 | error : { | |
128 | 'failed to create repository `<repo_name>` |
|
129 | 'failed to create repository `<repo_name>`' | |
129 | } |
|
130 | } | |
130 |
|
131 | |||
131 |
|
132 | |||
@@ -164,25 +165,29 b' delete_repo' | |||||
164 | fork_repo |
|
165 | fork_repo | |
165 | --------- |
|
166 | --------- | |
166 |
|
167 | |||
167 |
.. py:function:: fork_repo(apiuser, repoid, fork_name, owner=<Optional:<OptionalAttr:apiuser>>, description=<Optional:''>, |
|
168 | .. py:function:: fork_repo(apiuser, repoid, fork_name, owner=<Optional:<OptionalAttr:apiuser>>, description=<Optional:''>, private=<Optional:False>, clone_uri=<Optional:None>, landing_rev=<Optional:'rev:tip'>, copy_permissions=<Optional:False>) | |
168 |
|
169 | |||
169 | Creates a fork of the specified |repo|. |
|
170 | Creates a fork of the specified |repo|. | |
170 |
|
171 | |||
171 | * If using |RCE| with Celery this will immediately return a success |
|
172 | * If the fork_name contains "/", fork will be created inside | |
172 | message, even though the fork will be created asynchronously. |
|
173 | a repository group or nested repository groups | |
173 |
|
174 | |||
174 | This command can only be run using an |authtoken| with fork |
|
175 | For example "foo/bar/fork-repo" will create fork called "fork-repo" | |
175 | permissions on the |repo|. |
|
176 | inside group "foo/bar". You have to have permissions to access and | |
|
177 | write to the last repository group ("bar" in this example) | |||
|
178 | ||||
|
179 | This command can only be run using an |authtoken| with minimum | |||
|
180 | read permissions on the forked repo, and fork permissions for the user. | |||
176 |
|
181 | |||
177 | :param apiuser: This is filled automatically from the |authtoken|. |
|
182 | :param apiuser: This is filled automatically from the |authtoken|. | |
178 | :type apiuser: AuthUser |
|
183 | :type apiuser: AuthUser | |
179 | :param repoid: Set repository name or repository ID. |
|
184 | :param repoid: Set repository name or repository ID. | |
180 | :type repoid: str or int |
|
185 | :type repoid: str or int | |
181 | :param fork_name: Set the fork name. |
|
186 | :param fork_name: Set the fork name, including its repository group membership. | |
182 | :type fork_name: str |
|
187 | :type fork_name: str | |
183 | :param owner: Set the fork owner. |
|
188 | :param owner: Set the fork owner. | |
184 | :type owner: str |
|
189 | :type owner: str | |
185 | :param description: Set the fork descripton. |
|
190 | :param description: Set the fork description. | |
186 | :type description: str |
|
191 | :type description: str | |
187 | :param copy_permissions: Copy permissions from parent |repo|. The |
|
192 | :param copy_permissions: Copy permissions from parent |repo|. The | |
188 | default is False. |
|
193 | default is False. | |
@@ -729,7 +734,7 b' lock' | |||||
729 | id : <id_given_in_input> |
|
734 | id : <id_given_in_input> | |
730 | result : null |
|
735 | result : null | |
731 | error : { |
|
736 | error : { | |
732 | 'Error occurred locking repository `<reponame>` |
|
737 | 'Error occurred locking repository `<reponame>`' | |
733 | } |
|
738 | } | |
734 |
|
739 | |||
735 |
|
740 | |||
@@ -923,24 +928,31 b' strip' | |||||
923 | update_repo |
|
928 | update_repo | |
924 | ----------- |
|
929 | ----------- | |
925 |
|
930 | |||
926 | .. py:function:: update_repo(apiuser, repoid, name=<Optional:None>, owner=<Optional:<OptionalAttr:apiuser>>, group=<Optional:None>, fork_of=<Optional:None>, description=<Optional:''>, private=<Optional:False>, clone_uri=<Optional:None>, landing_rev=<Optional:'rev:tip'>, enable_statistics=<Optional:False>, enable_locking=<Optional:False>, enable_downloads=<Optional:False>, fields=<Optional:''>) |
|
931 | .. py:function:: update_repo(apiuser, repoid, repo_name=<Optional:None>, owner=<Optional:<OptionalAttr:apiuser>>, description=<Optional:''>, private=<Optional:False>, clone_uri=<Optional:None>, landing_rev=<Optional:'rev:tip'>, fork_of=<Optional:None>, enable_statistics=<Optional:False>, enable_locking=<Optional:False>, enable_downloads=<Optional:False>, fields=<Optional:''>) | |
927 |
|
932 | |||
928 | Updates a repository with the given information. |
|
933 | Updates a repository with the given information. | |
929 |
|
934 | |||
930 | This command can only be run using an |authtoken| with at least |
|
935 | This command can only be run using an |authtoken| with at least | |
931 |
|
|
936 | admin permissions to the |repo|. | |
|
937 | ||||
|
938 | * If the repository name contains "/", repository will be updated | |||
|
939 | accordingly with a repository group or nested repository groups | |||
|
940 | ||||
|
941 | For example repoid=repo-test name="foo/bar/repo-test" will update |repo| | |||
|
942 | called "repo-test" and place it inside group "foo/bar". | |||
|
943 | You have to have permissions to access and write to the last repository | |||
|
944 | group ("bar" in this example) | |||
932 |
|
945 | |||
933 | :param apiuser: This is filled automatically from the |authtoken|. |
|
946 | :param apiuser: This is filled automatically from the |authtoken|. | |
934 | :type apiuser: AuthUser |
|
947 | :type apiuser: AuthUser | |
935 | :param repoid: repository name or repository ID. |
|
948 | :param repoid: repository name or repository ID. | |
936 | :type repoid: str or int |
|
949 | :type repoid: str or int | |
937 |
:param name: Update the |repo| name |
|
950 | :param repo_name: Update the |repo| name, including the | |
938 | :type name: str |
|
951 | repository group it's in. | |
|
952 | :type repo_name: str | |||
939 | :param owner: Set the |repo| owner. |
|
953 | :param owner: Set the |repo| owner. | |
940 | :type owner: str |
|
954 | :type owner: str | |
941 |
:param |
|
955 | :param fork_of: Set the |repo| as fork of another |repo|. | |
942 | :type group: str |
|
|||
943 | :param fork_of: Set the master |repo| name. |
|
|||
944 | :type fork_of: str |
|
956 | :type fork_of: str | |
945 | :param description: Update the |repo| description. |
|
957 | :param description: Update the |repo| description. | |
946 | :type description: str |
|
958 | :type description: str | |
@@ -948,16 +960,13 b' update_repo' | |||||
948 | :type private: bool |
|
960 | :type private: bool | |
949 | :param clone_uri: Update the |repo| clone URI. |
|
961 | :param clone_uri: Update the |repo| clone URI. | |
950 | :type clone_uri: str |
|
962 | :type clone_uri: str | |
951 | :param landing_rev: Set the |repo| landing revision. Default is |
|
963 | :param landing_rev: Set the |repo| landing revision. Default is ``rev:tip``. | |
952 | ``tip``. |
|
|||
953 | :type landing_rev: str |
|
964 | :type landing_rev: str | |
954 | :param enable_statistics: Enable statistics on the |repo|, |
|
965 | :param enable_statistics: Enable statistics on the |repo|, (True | False). | |
955 | (True | False). |
|
|||
956 | :type enable_statistics: bool |
|
966 | :type enable_statistics: bool | |
957 | :param enable_locking: Enable |repo| locking. |
|
967 | :param enable_locking: Enable |repo| locking. | |
958 | :type enable_locking: bool |
|
968 | :type enable_locking: bool | |
959 | :param enable_downloads: Enable downloads from the |repo|, |
|
969 | :param enable_downloads: Enable downloads from the |repo|, (True | False). | |
960 | (True | False). |
|
|||
961 | :type enable_downloads: bool |
|
970 | :type enable_downloads: bool | |
962 | :param fields: Add extra fields to the |repo|. Use the following |
|
971 | :param fields: Add extra fields to the |repo|. Use the following | |
963 | example format: ``field_key=field_val,field_key2=fieldval2``. |
|
972 | example format: ``field_key=field_val,field_key2=fieldval2``. |
@@ -1,7 +1,7 b'' | |||||
1 | .. _server-methods-ref: |
|
1 | .. _server-methods-ref: | |
2 |
|
2 | |||
3 | server methods |
|
3 | server methods | |
4 |
============== |
|
4 | ============== | |
5 |
|
5 | |||
6 | get_ip |
|
6 | get_ip | |
7 | ------ |
|
7 | ------ |
@@ -1,7 +1,7 b'' | |||||
1 | .. _user-group-methods-ref: |
|
1 | .. _user-group-methods-ref: | |
2 |
|
2 | |||
3 | user_group methods |
|
3 | user_group methods | |
4 | ================= |
|
4 | ================== | |
5 |
|
5 | |||
6 | add_user_to_user_group |
|
6 | add_user_to_user_group | |
7 | ---------------------- |
|
7 | ---------------------- |
@@ -1,12 +1,12 b'' | |||||
1 | .. _user-methods-ref: |
|
1 | .. _user-methods-ref: | |
2 |
|
2 | |||
3 | user methods |
|
3 | user methods | |
4 |
============ |
|
4 | ============ | |
5 |
|
5 | |||
6 | create_user |
|
6 | create_user | |
7 | ----------- |
|
7 | ----------- | |
8 |
|
8 | |||
9 | .. py:function:: create_user(apiuser, username, email, password=<Optional:''>, firstname=<Optional:''>, lastname=<Optional:''>, active=<Optional:True>, admin=<Optional:False>, extern_name=<Optional:'rhodecode'>, extern_type=<Optional:'rhodecode'>, force_password_change=<Optional:False>) |
|
9 | .. py:function:: create_user(apiuser, username, email, password=<Optional:''>, firstname=<Optional:''>, lastname=<Optional:''>, active=<Optional:True>, admin=<Optional:False>, extern_name=<Optional:'rhodecode'>, extern_type=<Optional:'rhodecode'>, force_password_change=<Optional:False>, create_personal_repo_group=<Optional:None>) | |
10 |
|
10 | |||
11 | Creates a new user and returns the new user object. |
|
11 | Creates a new user and returns the new user object. | |
12 |
|
12 | |||
@@ -39,7 +39,8 b' create_user' | |||||
39 | :param force_password_change: Force the new user to change password |
|
39 | :param force_password_change: Force the new user to change password | |
40 | on next login. |
|
40 | on next login. | |
41 | :type force_password_change: Optional(``True`` | ``False``) |
|
41 | :type force_password_change: Optional(``True`` | ``False``) | |
42 |
|
42 | :param create_personal_repo_group: Create a personal repo group for this user | ||
|
43 | :type create_personal_repo_group: Optional(``True`` | ``False``) | |||
43 | Example output: |
|
44 | Example output: | |
44 |
|
45 | |||
45 | .. code-block:: bash |
|
46 | .. code-block:: bash | |
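The ``create_user`` hunk above documents the new ``create_personal_repo_group`` flag. A hedged sketch of an API request body that would exercise it; the token is a placeholder and the envelope shape is an assumption, while the parameter names come from the hunk:

```python
import json

payload = {
    "id": 1,
    "auth_token": "PLACEHOLDER_TOKEN",
    "method": "create_user",
    "args": {
        "username": "newuser",
        "email": "newuser@example.com",
        "password": "initial-password",
        "force_password_change": True,
        # New optional flag introduced by this change:
        "create_personal_repo_group": True,
    },
}
body = json.dumps(payload)
```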
@@ -163,6 +164,7 b' get_user' | |||||
163 | "usergroup.read", |
|
164 | "usergroup.read", | |
164 | "hg.repogroup.create.false", |
|
165 | "hg.repogroup.create.false", | |
165 | "hg.create.none", |
|
166 | "hg.create.none", | |
|
167 | "hg.password_reset.enabled", | |||
166 | "hg.extern_activate.manual", |
|
168 | "hg.extern_activate.manual", | |
167 | "hg.create.write_on_repogroup.false", |
|
169 | "hg.create.write_on_repogroup.false", | |
168 | "hg.usergroup.create.false", |
|
170 | "hg.usergroup.create.false", |
@@ -1,7 +1,7 b'' | |||||
1 |
|
1 | |||
2 | ===== |
|
2 | =================== | |
3 | API |
|
3 | CONTRIBUTING TO API | |
4 | ===== |
|
4 | =================== | |
5 |
|
5 | |||
6 |
|
6 | |||
7 |
|
7 |
@@ -130,7 +130,7 b' is a very small pencil which has to be c' | |||||
130 | ticket. |
|
130 | ticket. | |
131 |
|
131 | |||
132 |
|
132 | |||
133 | .. figure:: images/redmine-description.png |
|
133 | .. figure:: ../images/redmine-description.png | |
134 | :alt: Example of pencil to change the ticket description |
|
134 | :alt: Example of pencil to change the ticket description | |
135 |
|
135 | |||
136 | Shows an example of the pencil which lets you change the description. |
|
136 | Shows an example of the pencil which lets you change the description. |
@@ -9,9 +9,6 b'' | |||||
9 | .. Avoid duplicating the quickstart instructions by importing the README |
|
9 | .. Avoid duplicating the quickstart instructions by importing the README | |
10 | file. |
|
10 | file. | |
11 |
|
11 | |||
12 | .. include:: ../../../acceptance_tests/README.rst |
|
|||
13 |
|
||||
14 |
|
||||
15 |
|
|
12 | ||
16 | Choices of technology and tools |
|
13 | Choices of technology and tools | |
17 | =============================== |
|
14 | =============================== |
@@ -88,10 +88,10 @@ let
 };

 Sphinx = buildPythonPackage (rec {
-  name = "Sphinx-1.4.
+  name = "Sphinx-1.4.8";
   src = fetchurl {
-    url = "https://pypi.python.org/packages/
+    url = "https://pypi.python.org/packages/1f/f6/e54a7aad73e35232356103771ae76306dadd8546b024c646fbe75135571c/${name}.tar.gz";
-    md5 = "64ce2ec08d37ed56313a98232cbe2aee";
+    md5 = "5ec718a4855917e149498bba91b74e67";
   };
   propagatedBuildInputs = [
     docutils
@@ -20,8 +20,10 @@ and commit files and |repos| while manag
 * Migration from existing databases.
 * |RCM| SDK.
 * Built-in analytics
+* Built in integrations including: Slack, Jenkins, Webhooks, Jira, Redmine, Hipchat
 * Pluggable authentication system.
-* Support for |LDAP|, Crowd, CAS, PAM.
+* Support for AD, |LDAP|, Crowd, CAS, PAM.
+* Support for external authentication via Oauth Google, Github, Bitbucket, Twitter.
 * Debug modes of operation.
 * Private and public gists.
 * Gists with limited lifetimes and within instance only sharing.
@@ -50,3 +50,4 @@ See pages specific to each type of integ
 redmine
 jira
 webhook
+
@@ -7,6 +7,16 @@ The Webhook integration allows you to PO
 or pull requests to a custom http endpoint as a json dict with details of the
 event.

+Starting with the 4.5.0 release, the webhook integration allows variables
+inside the URL. For example, in the URL `https://server-example.com/${repo_name}`,
+`${repo_name}` will be replaced with the name of the repository the event
+was triggered from. Some variables, such as
+`${branch}`, will result in the webhook being called multiple times when
+multiple branches are pushed.
+
+Some variables, such as `${pull_request_id}`, will be replaced only in
+pull request related events.
+
 To create a webhook integration, select "webhook" in the integration settings
-and use the
+and use the URL and key from any custom webhook you created previously. See
-:ref:`creating-integrations` for additional instructions.
\ No newline at end of file
+:ref:`creating-integrations` for additional instructions.
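The `${variable}` substitution described in the webhook hunk above can be sketched in Python. The helper name and the call-per-branch loop are illustrative, not RhodeCode's actual implementation; `string.Template.safe_substitute` conveniently leaves unknown placeholders intact, matching the documented behaviour where `${pull_request_id}` is only filled for pull request events.

```python
from string import Template

def expand_webhook_url(url_template, variables):
    """Expand ${var} placeholders in a webhook URL template.

    Unknown variables are left as-is, so event-specific placeholders
    such as ${pull_request_id} survive non-PR events unchanged.
    """
    return Template(url_template).safe_substitute(variables)

# One webhook call per pushed branch, mirroring the documented
# multiple-calls behaviour of ${branch}:
urls = [
    expand_webhook_url("https://server-example.com/${repo_name}/${branch}",
                       {"repo_name": "my-repo", "branch": branch})
    for branch in ("default", "stable")
]
```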
@@ -7,38 +7,6 @@ Release Date
 - 2016-08-12


-General
-^^^^^^^
-
-- Subversion: detect requests also based on magic path.
-  This adds subversion 1.9 support for SVN backend.
-- Summary/changelog: unified how data is displayed for those pages.
-  * use consistent order of columns
-  * fix the link to commit status
-  * fix order of displaying comments
-- Live-chat: refactor live chat system for code review based on
-  latest channelstream changes.
-- SVN: Add template to generate the apache mod_dav_svn config for all
-  repository groups. Repository groups can now be automatically mapped to be
-  supported by SVN backend. Set `svn.proxy.generate_config = true` and similar
-  options found inside .ini config.
-- Readme/markup: improved order of generating readme files. Fixes #4050
-  * we now use order based on default system renderer
-  * having multiple readme files will pick correct one as set renderer
-- Api: add a max_file_bytes parameter to get_nodes so that large files
-  can be skipped.
-- Auth-ldap: added flag to set debug mode for LDAP connections.
-- Labs: moved rebase-merge option from labs settings into VCS settings.
-- System: send platform type and version to upgrade endpoint when checking
-  for new versions.
-- Packaging: update rhodecode-tools from 0.8.3 to 0.10.0
-- Packaging: update codemirror from 5.4.0 to 5.11.0
-- Packaging: updated pygments to 2.1.3
-- Packaging: bumped supervisor to 3.3.0
-- Packaging: bumped psycopg2 to 2.6.1
-- Packaging: bumped mercurial to 3.8.4
-
-
 New Features
 ^^^^^^^^^^^^

@@ -64,6 +32,38 @@ New Features
 onto comments you submitted while doing a code-review.


+General
+^^^^^^^
+
+- Subversion: detect requests also based on magic path.
+  This adds subversion 1.9 support for SVN backend.
+- Summary/changelog: unified how data is displayed for those pages.
+  * use consistent order of columns
+  * fix the link to commit status
+  * fix order of displaying comments
+- Live chat: refactor live chat system for code review based on
+  latest channelstream changes.
+- SVN: Add template to generate the apache mod_dav_svn config for all
+  repository groups. Repository groups can now be automatically mapped to be
+  supported by SVN backend. Set `svn.proxy.generate_config = true` and similar
+  options found inside .ini config.
+- Readme/markup: improved order of generating readme files. Fixes #4050
+  * we now use order based on default system renderer
+  * having multiple readme files will pick correct one as set renderer
+- Api: add a max_file_bytes parameter to get_nodes so that large files
+  can be skipped.
+- Auth-ldap: added flag to set debug mode for LDAP connections.
+- Labs: moved rebase-merge option from labs settings into VCS settings.
+- System: send platform type and version to upgrade endpoint when checking
+  for new versions.
+- Packaging: update rhodecode-tools from 0.8.3 to 0.10.0
+- Packaging: update codemirror from 5.4.0 to 5.11.0
+- Packaging: updated pygments to 2.1.3
+- Packaging: bumped supervisor to 3.3.0
+- Packaging: bumped psycopg2 to 2.6.1
+- Packaging: bumped mercurial to 3.8.4
+
+
 Security
 ^^^^^^^^

@@ -105,7 +105,7 @@ Fixes
 support to gevent compatible handling.
 - Diff2way: fixed unicode problem on non-ascii files.
 - Full text search: whoosh schema uses now bigger ints, fixes #4035
-- File
+- File browser: optimized cached tree calculation, reduced load times by
   50% on complex file trees.
 - Styling: #4086 fixing bug where long commit messages did not wrap in file view.
 - SVN: Ignore the content length header from response, fixes #4112.
@@ -6,6 +6,27 @@ Release Date

 - 2016-08-23

+
+New Features
+^^^^^^^^^^^^
+
+
+
+General
+^^^^^^^
+
+
+
+Security
+^^^^^^^^
+
+
+
+Performance
+^^^^^^^^^^^
+
+
+
 Fixes
 ^^^^^

@@ -7,18 +7,6 @@ Release Date
 - 2016-09-16


-General
-^^^^^^^
-
-- UI: introduced Polymer webcomponents into core application. RhodeCode will
-  be now shipped together with Polymer framework webcomponents. Most of
-  dynamic UI components that require large amounts of interaction
-  will be done now with Polymer.
-- live-notifications: use rhodecode-toast for live notifications instead of
-  toastr jquery plugin.
-- Svn: moved svn http support out of labs settings. It's tested and stable now.
-
-
 New Features
 ^^^^^^^^^^^^

@@ -29,11 +17,11 @@ New Features
 It will allow to configure exactly which projects use which integrations.
 - Integrations: show branches/commits separately when posting push events
   to hipchat/slack, fixes #4192.
-- Pull
+- Pull requests: summary page now shows update dates for pull request to
   easier see which one were receantly updated.
 - UI: hidden inline comments will be shown in side view when browsing the diffs
 - Diffs: added inline comments toggle into pull requests diff view. #2884
-- Live
+- Live chat: added summon reviewers functionality. You can now request
   presence from online users into a chat for collaborative code-review.
   This requires channelstream to be enabled.
 - UX: added a static 502 page for gateway error. Once configured via
@@ -41,6 +29,18 @@ New Features
 backend servers are offline. Fixes #4202.


+General
+^^^^^^^
+
+- UI: introduced Polymer webcomponents into core application. RhodeCode will
+  be now shipped together with Polymer framework webcomponents. Most of
+  dynamic UI components that require large amounts of interaction
+  will be done now with Polymer.
+- Live notifications: use rhodecode-toast for live notifications instead of
+  toastr jquery plugin.
+- Svn: moved svn http support out of labs settings. It's tested and stable now.
+
+
 Security
 ^^^^^^^^

@@ -67,12 +67,12 @@ Fixes
 match rest of ui, fixes: #4200.
 - UX: show multiple tags/branches in changelog/summary instead of
   truncating them.
-- My
+- My account: fix test notifications for IE10+
 - Vcs: change way refs are retrieved for git so same name branch/tags and
   remotes can be supported, fixes #298.
 - Lexers: added small extensions table to extend syntax highlighting for file
   sources. Fixes #4227.
 - Search: fix bug where file path link was wrong when the repository name was
   in the file path, fixes #4228
--
+- Pagination: fixed INT overflow bug.
 - Events: send pushed commits always in the correct in order.
@@ -7,19 +7,18 @@ Release Date
 - 2016-09-27


+New Features
+^^^^^^^^^^^^
+
+
 General
 ^^^^^^^

--
+- Channelstream: auto-generate the url to channelstream server if it's not
   explicitly defined in the config. It allows to use a relative server
   without knowing its address upfront.


-New Features
-^^^^^^^^^^^^
-
-
-
 Security
 ^^^^^^^^

@@ -34,7 +33,7 @@ Fixes
 ^^^^^

 - GIT: properly extract branches on events and submit them to integrations.
-- Pullrequests: fix problems with unicode in auto-generated descriptions
+- Pull requests: fix problems with unicode in auto-generated descriptions
 - Gist: fixed bug in update functionality of Gists that auto changed them
   to private.
 - SVN: add proper escaping in the autogenerated svn mod_dav config
@@ -7,21 +7,21 @@ Release Date
 - 2016-10-17


-General
-^^^^^^^
-
-- packaging: pinned against rhodecode-tools 0.10.1
-
-
 New Features
 ^^^^^^^^^^^^



+General
+^^^^^^^
+
+- Packaging: pinned against rhodecode-tools 0.10.1
+
+
 Security
 ^^^^^^^^

--
+- Integrations: fix 500 error on integrations page when delegated admin
   tried to access integration page after adding some integrations.
   Permission checks were to strict for delegated admins.

@@ -34,8 +34,8 @@ Performance
 Fixes
 ^^^^^

--
+- Vcsserver: make sure we correctly ping against bundled HG/GIT/SVN binaries.
   This should fix a problem where system binaries could be used accidentally
   by the RhodeCode.
--
+- LDAP: fixed email extraction issues. Empty email addresses from LDAP server
   will no longer take precedence over those stored inside RhodeCode database.
@@ -9,6 +9,7 @@ Release Notes
 .. toctree::
    :maxdepth: 1

+   release-notes-4.5.0.rst
    release-notes-4.4.2.rst
    release-notes-4.4.1.rst
    release-notes-4.4.0.rst
@@ -4,7 +4,7 @@
 Scaling Best Practices
 ======================

-When deploying |RCE| at scale; 100s of users, multiple instances, CI servers,
+When deploying |RCE| at scale; 1000s of users, multiple instances, CI servers,
 there are a number of steps you can take to ensure you are getting the
 most out of your system.

@@ -15,20 +15,23 @@ You can configure multiple |RCE| instanc
 set of |repos|. This lets users work on an instance that has less traffic
 than those being hit by CI servers. To configure this, use |RCC| to install
 multiple instances and configure the database and |repos| connection. If you
-do need to reset the database connection, see the
+do need to reset/adjust the database connection, see the
 :ref:`config-database` section.

-Once configured, set your CI servers to use a particular instance and for
-user specific instances you can configure loads balancing. See the
-:ref:`nginx-ws-ref` section for examples.
+You can then configure a load balancer to split traffic between the
+CI-dedicated instance and the instance that end users use.
+See the :ref:`nginx-ws-ref` section for NGINX examples.

 Switch to Database Sessions
 ---------------------------

-To increase
-large scale deployment, we recommend switching from file-based
-sessions to database-based user sessions. For configuration details, see the
-:ref:`db-session-ref` section.
+To increase |RCE| performance, switch from the default file-based sessions to
+database-based sessions. That way your distributed instances do not need to
+share file storage in order to use sessions.
+Database-based sessions have the additional advantage over file-based
+ones that they do not require periodic cleanup; the session library
+cleans them up for you. For configuration details,
+see the :ref:`db-session-ref` section.

 Tuning |RCE|
 ------------
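The database-session switch described in the hunk above is driven by Beaker session options in the |RCE| ``.ini`` file. A minimal sketch, assuming a PostgreSQL session store; the option names follow the Beaker documentation, and the connection URL and table name are illustrative:

```ini
# Store user sessions in the database instead of on disk,
# so clustered instances need no shared session directory.
beaker.session.type = ext:database
beaker.session.table_name = db_session
beaker.session.sa.url = postgresql://user:pass@localhost/rhodecode
```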
@@ -6,7 +6,9 @@
 },
 "js": {
 "src": "rhodecode/public/js/src",
-"dest": "rhodecode/public/js"
+"dest": "rhodecode/public/js",
+"bower": "bower_components",
+"node_modules": "node_modules"
 }
 },
 "copy": {
@@ -34,7 +36,8 @@
 "<%= dirs.js.src %>/bootstrap.js",
 "<%= dirs.js.src %>/mousetrap.js",
 "<%= dirs.js.src %>/moment.js",
-"<%= dirs.js.s
+"<%= dirs.js.node_modules %>/appenlight-client/appenlight-client.min.js",
+"<%= dirs.js.node_modules %>/favico.js/favico-0.3.10.min.js",
 "<%= dirs.js.src %>/i18n_utils.js",
 "<%= dirs.js.src %>/deform.js",
 "<%= dirs.js.src %>/plugins/jquery.pjax.js",
@@ -64,7 +67,6 @@
 "<%= dirs.js.src %>/rhodecode/utils/ie.js",
 "<%= dirs.js.src %>/rhodecode/utils/os.js",
 "<%= dirs.js.src %>/rhodecode/utils/topics.js",
-"<%= dirs.js.src %>/rhodecode/widgets/multiselect.js",
 "<%= dirs.js.src %>/rhodecode/init.js",
 "<%= dirs.js.src %>/rhodecode/codemirror.js",
 "<%= dirs.js.src %>/rhodecode/comments.js",
@@ -144,7 +146,9 @@
 "less:development",
 "less:components",
 "concat:polymercss",
-"vulcanize"
+"vulcanize",
+"crisper",
+"concat:dist"
 ]
 },
 "js": {
@@ -13,6 +13,8 @@
 "grunt-crisper": "^1.0.1",
 "grunt-vulcanize": "^1.0.0",
 "jshint": "^2.9.1-rc3",
-"bower": "^1.7.9"
+"bower": "^1.7.9",
+"favico.js": "^0.3.10",
+"appenlight-client": "git+https://git@github.com/AppEnlight/appenlight-client-js.git#0.5.0"
 }
 }
@@ -4,10 +4,12 @@ buildEnv { name = "bower-env"; ignoreCol
 (fetchbower "polymer" "Polymer/polymer#1.6.1" "Polymer/polymer#^1.6.1" "09mm0jgk457gvwqlc155swch7gjr6fs3g7spnvhi6vh5b6518540")
 (fetchbower "paper-button" "PolymerElements/paper-button#1.0.13" "PolymerElements/paper-button#^1.0.13" "0i3y153nqk06pn0gk282vyybnl3g1w3w41d5i9z659cgn27g3fvm")
 (fetchbower "paper-spinner" "PolymerElements/paper-spinner#1.2.0" "PolymerElements/paper-spinner#^1.2.0" "1av1m6y81jw3hjhz1yqy3rwcgxarjzl58ldfn4q6sn51pgzngfqb")
-(fetchbower "paper-tooltip" "PolymerElements/paper-tooltip#1.1.
+(fetchbower "paper-tooltip" "PolymerElements/paper-tooltip#1.1.3" "PolymerElements/paper-tooltip#^1.1.2" "0vmrm1n8k9sk9nvqy03q177axy22pia6i3j1gxbk72j3pqiqvg6k")
 (fetchbower "paper-toast" "PolymerElements/paper-toast#1.3.0" "PolymerElements/paper-toast#^1.3.0" "0x9rqxsks5455s8pk4aikpp99ijdn6kxr9gvhwh99nbcqdzcxq1m")
 (fetchbower "paper-toggle-button" "PolymerElements/paper-toggle-button#1.2.0" "PolymerElements/paper-toggle-button#^1.2.0" "0mphcng3ngspbpg4jjn0mb91nvr4xc1phq3qswib15h6sfww1b2w")
 (fetchbower "iron-ajax" "PolymerElements/iron-ajax#1.4.3" "PolymerElements/iron-ajax#^1.4.3" "0m3dx27arwmlcp00b7n516sc5a51f40p9vapr1nvd57l3i3z0pzm")
+(fetchbower "iron-autogrow-textarea" "PolymerElements/iron-autogrow-textarea#1.0.13" "PolymerElements/iron-autogrow-textarea#^1.0.13" "0zwhpl97vii1s8k0lgain8i9dnw29b0mxc5ixdscx9las13n2lqq")
+(fetchbower "iron-a11y-keys" "PolymerElements/iron-a11y-keys#1.0.6" "PolymerElements/iron-a11y-keys#^1.0.6" "1xz3mgghfcxixq28sdb654iaxj4nyi1bzcwf77ydkms6fviqs9mv")
 (fetchbower "iron-flex-layout" "PolymerElements/iron-flex-layout#1.3.1" "PolymerElements/iron-flex-layout#^1.0.0" "0nswv3ih3bhflgcd2wjfmddqswzgqxb2xbq65jk9w3rkj26hplbl")
 (fetchbower "paper-behaviors" "PolymerElements/paper-behaviors#1.0.12" "PolymerElements/paper-behaviors#^1.0.0" "012bqk97awgz55cn7rm9g7cckrdhkqhls3zvp8l6nd4rdwcrdzq8")
 (fetchbower "paper-material" "PolymerElements/paper-material#1.0.6" "PolymerElements/paper-material#^1.0.0" "0rljmknfdbm5aabvx9pk77754zckj3l127c3mvnmwkpkkr353xnh")
@@ -19,13 +21,13 @@ buildEnv { name = "bower-env"; ignoreCol
 (fetchbower "iron-checked-element-behavior" "PolymerElements/iron-checked-element-behavior#1.0.5" "PolymerElements/iron-checked-element-behavior#^1.0.0" "0l0yy4ah454s8bzfv076s8by7h67zy9ni6xb932qwyhx8br6c1m7")
 (fetchbower "promise-polyfill" "polymerlabs/promise-polyfill#1.0.1" "polymerlabs/promise-polyfill#^1.0.0" "045bj2caav3famr5hhxgs1dx7n08r4s46mlzwb313vdy17is38xb")
 (fetchbower "iron-behaviors" "PolymerElements/iron-behaviors#1.0.17" "PolymerElements/iron-behaviors#^1.0.0" "021qvkmbk32jrrmmphpmwgby4bzi5jyf47rh1bxmq2ip07ly4bpr")
+(fetchbower "iron-validatable-behavior" "PolymerElements/iron-validatable-behavior#1.1.1" "PolymerElements/iron-validatable-behavior#^1.0.0" "1yhxlvywhw2klbbgm3f3cmanxfxggagph4ii635zv0c13707wslv")
+(fetchbower "iron-form-element-behavior" "PolymerElements/iron-form-element-behavior#1.0.6" "PolymerElements/iron-form-element-behavior#^1.0.0" "0rdhxivgkdhhz2yadgdbjfc70l555p3y83vjh8rfj5hr0asyn6q1")
+(fetchbower "iron-a11y-keys-behavior" "polymerelements/iron-a11y-keys-behavior#1.1.9" "polymerelements/iron-a11y-keys-behavior#^1.0.0" "1imm4gc84qizihhbyhfa8lwjh3myhj837f79i5m04xjgwrjmkaf6")
 (fetchbower "paper-ripple" "PolymerElements/paper-ripple#1.0.8" "PolymerElements/paper-ripple#^1.0.0" "0r9sq8ik7wwrw0qb82c3rw0c030ljwd3s466c9y4qbcrsbvfjnns")
 (fetchbower "font-roboto" "PolymerElements/font-roboto#1.0.1" "PolymerElements/font-roboto#^1.0.1" "02jz43r0wkyr3yp7rq2rc08l5cwnsgca9fr54sr4rhsnl7cjpxrj")
 (fetchbower "iron-meta" "PolymerElements/iron-meta#1.1.2" "PolymerElements/iron-meta#^1.0.0" "1wl4dx8fnsknw9z9xi8bpc4cy9x70c11x4zxwxnj73hf3smifppl")
 (fetchbower "iron-resizable-behavior" "PolymerElements/iron-resizable-behavior#1.0.5" "PolymerElements/iron-resizable-behavior#^1.0.0" "1fd5zmbr2hax42vmcasncvk7lzi38fmb1kyii26nn8pnnjak7zkn")
 (fetchbower "iron-selector" "PolymerElements/iron-selector#1.5.2" "PolymerElements/iron-selector#^1.0.0" "1ajv46llqzvahm5g6g75w7nfyjcslp53ji0wm96l2k94j87spv3r")
 (fetchbower "web-animations-js" "web-animations/web-animations-js#2.2.2" "web-animations/web-animations-js#^2.2.0" "1izfvm3l67vwys0bqbhidi9rqziw2f8wv289386sc6jsxzgkzhga")
-(fetchbower "iron-a11y-keys-behavior" "PolymerElements/iron-a11y-keys-behavior#1.1.7" "PolymerElements/iron-a11y-keys-behavior#^1.0.0" "070z46dbbz242002gmqrgy28x0y1fcqp9hnvbi05r3zphiqfx3l7")
-(fetchbower "iron-validatable-behavior" "PolymerElements/iron-validatable-behavior#1.1.1" "PolymerElements/iron-validatable-behavior#^1.0.0" "1yhxlvywhw2klbbgm3f3cmanxfxggagph4ii635zv0c13707wslv")
-(fetchbower "iron-form-element-behavior" "PolymerElements/iron-form-element-behavior#1.0.6" "PolymerElements/iron-form-element-behavior#^1.0.0" "0rdhxivgkdhhz2yadgdbjfc70l555p3y83vjh8rfj5hr0asyn6q1")
 ]; }
@@ -103,6 +103,34 @@ let
 sha1 = "a2e14ff85c2d6bf8c8080e5aa55129ebc6a2d320";
 };
 };
+"bower-1.7.9" = {
+name = "bower";
+packageName = "bower";
+version = "1.7.9";
+src = fetchurl {
+url = "https://registry.npmjs.org/bower/-/bower-1.7.9.tgz";
+sha1 = "b7296c2393e0d75edaa6ca39648132dd255812b0";
+};
+};
+"favico.js-0.3.10" = {
+name = "favico.js";
+packageName = "favico.js";
+version = "0.3.10";
+src = fetchurl {
+url = "https://registry.npmjs.org/favico.js/-/favico.js-0.3.10.tgz";
+sha1 = "80586e27a117f24a8d51c18a99bdc714d4339301";
+};
+};
+"appenlight-client-git+https://git@github.com/AppEnlight/appenlight-client-js.git#0.5.0" = {
+name = "appenlight-client";
+packageName = "appenlight-client";
+version = "0.5.0";
+src = fetchgit {
+url = "https://git@github.com/AppEnlight/appenlight-client-js.git";
+rev = "b1d6853345dc3e96468b34537810b3eb77e0764f";
+sha256 = "2ef00aef7dafdecdc1666d2e83fc190a796849985d04a8f0fad148d64aa4f8db";
+};
+};
 "async-0.1.22" = {
 name = "async";
 packageName = "async";
@@ -301,13 +329,13 @@ let
       sha1 = "fadd834b9683073da179b3eae6d9c0d15053f73e";
     };
   };
-  "inherits-2.0.
+  "inherits-2.0.3" = {
     name = "inherits";
     packageName = "inherits";
-    version = "2.0.
+    version = "2.0.3";
     src = fetchurl {
-      url = "https://registry.npmjs.org/inherits/-/inherits-2.0.
-      sha1 = "b17d08d326b4423e568eff719f91b0b1cbdf69f1";
+      url = "https://registry.npmjs.org/inherits/-/inherits-2.0.3.tgz";
+      sha1 = "633c2c83e3da42a502f52466022480f4208261de";
     };
   };
   "minimatch-0.3.0" = {
@@ -580,13 +608,13 @@ let
       sha1 = "6cbfea22b3b830304e9a5fb371d54fa480c9d7cf";
     };
   };
-  "lodash-4.1
+  "lodash-4.16.2" = {
     name = "lodash";
     packageName = "lodash";
-    version = "4.1
+    version = "4.16.2";
     src = fetchurl {
-      url = "https://registry.npmjs.org/lodash/-/lodash-4.1
-      sha1 = "3162391d8f0140aa22cf8f6b3c34d6b7f63d3aa9";
+      url = "https://registry.npmjs.org/lodash/-/lodash-4.16.2.tgz";
+      sha1 = "3e626db827048a699281a8a125226326cfc0e652";
     };
   };
   "errno-0.1.4" = {
@@ -598,13 +626,13 @@ let
       sha1 = "b896e23a9e5e8ba33871fc996abd3635fc9a1c7d";
     };
   };
-  "graceful-fs-4.1.
+  "graceful-fs-4.1.8" = {
     name = "graceful-fs";
     packageName = "graceful-fs";
-    version = "4.1.
+    version = "4.1.8";
     src = fetchurl {
-      url = "https://registry.npmjs.org/graceful-fs/-/graceful-fs-4.1.
-      sha1 = "514c38772b31bee2e08bedc21a0aeb3abf54c19e";
+      url = "https://registry.npmjs.org/graceful-fs/-/graceful-fs-4.1.8.tgz";
+      sha1 = "da3e11135eb2168bdd374532c4e2649751672890";
     };
   };
   "image-size-0.5.0" = {
@@ -670,13 +698,13 @@ let
       sha1 = "857fcabfc3397d2625b8228262e86aa7a011b05d";
     };
   };
-  "asap-2.0.
+  "asap-2.0.5" = {
     name = "asap";
     packageName = "asap";
-    version = "2.0.
+    version = "2.0.5";
     src = fetchurl {
-      url = "https://registry.npmjs.org/asap/-/asap-2.0.
-      sha1 = "b391bf7f6bfbc65706022fec8f49c4b07fecf589";
+      url = "https://registry.npmjs.org/asap/-/asap-2.0.5.tgz";
+      sha1 = "522765b50c3510490e52d7dcfe085ef9ba96958f";
     };
   };
   "gaze-0.5.2" = {
@@ -778,13 +806,13 @@ let
       sha1 = "f197d6eaff34c9085577484b2864375b294f5697";
     };
   };
-  "dom5-1.3.
+  "dom5-1.3.6" = {
     name = "dom5";
     packageName = "dom5";
-    version = "1.3.
+    version = "1.3.6";
     src = fetchurl {
-      url = "https://registry.npmjs.org/dom5/-/dom5-1.3.
-      sha1 = "07e514522c245c7aa8512aa3f9118e8bcab9f909";
+      url = "https://registry.npmjs.org/dom5/-/dom5-1.3.6.tgz";
+      sha1 = "a7088a9fc5f3b08dc9f6eda4c7abaeb241945e0d";
     };
   };
   "array-back-1.0.3" = {
@@ -832,13 +860,13 @@ let
       sha1 = "a2d6ce740d15f0d92b1b26763e2ce9c0e361fd98";
     };
   };
-  "typical-2.
+  "typical-2.6.0" = {
     name = "typical";
     packageName = "typical";
-    version = "2.
+    version = "2.6.0";
     src = fetchurl {
-      url = "https://registry.npmjs.org/typical/-/typical-2.
-      sha1 = "81244918aa28180c9e602aa457173404be0604f1";
+      url = "https://registry.npmjs.org/typical/-/typical-2.6.0.tgz";
+      sha1 = "89d51554ab139848a65bcc2c8772f8fb450c40ed";
     };
   };
   "ansi-escape-sequences-2.2.2" = {
@@ -958,22 +986,22 @@ let
       sha1 = "a09136f72ec043d27c893707c2b159bfad7de93f";
     };
   };
-  "test-value-2.
+  "test-value-2.1.0" = {
     name = "test-value";
     packageName = "test-value";
-    version = "2.
+    version = "2.1.0";
     src = fetchurl {
-      url = "https://registry.npmjs.org/test-value/-/test-value-2.
-      sha1 = "0d65c45ee0b48a757c4507a5e98ec2680a9db137";
+      url = "https://registry.npmjs.org/test-value/-/test-value-2.1.0.tgz";
+      sha1 = "11da6ff670f3471a73b625ca4f3fdcf7bb748291";
     };
   };
-  "@types/clone-0.1.
+  "@types/clone-0.1.30" = {
     name = "@types/clone";
     packageName = "@types/clone";
-    version = "0.1.
+    version = "0.1.30";
     src = fetchurl {
-      url = "https://registry.npmjs.org/@types/clone/-/clone-0.1.
-      sha1 = "65a0be88189ffddcd373e450aa6b68c9c83218b7";
+      url = "https://registry.npmjs.org/@types/clone/-/clone-0.1.30.tgz";
+      sha1 = "e7365648c1b42136a59c7d5040637b3b5c83b614";
     };
   };
   "@types/node-4.0.30" = {
@@ -985,13 +1013,13 @@ let
       sha1 = "553f490ed3030311620f88003e7abfc0edcb301e";
     };
   };
-  "@types/parse5-0.0.
+  "@types/parse5-0.0.31" = {
     name = "@types/parse5";
     packageName = "@types/parse5";
-    version = "0.0.
+    version = "0.0.31";
     src = fetchurl {
-      url = "https://registry.npmjs.org/@types/parse5/-/parse5-0.0.
-      sha1 = "2a38cb7145bb157688d4ad2c46944c6dffae3cc6";
+      url = "https://registry.npmjs.org/@types/parse5/-/parse5-0.0.31.tgz";
+      sha1 = "e827a493a443b156e1b582a2e4c3bdc0040f2ee7";
     };
   };
   "clone-1.0.2" = {
@@ -1012,13 +1040,13 @@ let
       sha1 = "9b7f3b0de32be78dc2401b17573ccaf0f6f59d94";
     };
   };
-  "@types/node-6.0.
+  "@types/node-6.0.41" = {
     name = "@types/node";
     packageName = "@types/node";
-    version = "6.0.
+    version = "6.0.41";
     src = fetchurl {
-      url = "https://registry.npmjs.org/@types/node/-/node-6.0.
-      sha1 = "a1e081f2ec60074113d3a1fbf11f35d304f30e39";
+      url = "https://registry.npmjs.org/@types/node/-/node-6.0.41.tgz";
+      sha1 = "578cf53aaec65887bcaf16792f8722932e8ff8ea";
     };
   };
   "es6-promise-2.3.0" = {
@@ -1093,13 +1121,13 @@ let
       sha1 = "5a5b53af4693110bebb0867aa3430dd3b70a1018";
     };
   };
-  "espree-3.1
+  "espree-3.3.1" = {
     name = "espree";
     packageName = "espree";
-    version = "3.1
+    version = "3.3.1";
     src = fetchurl {
-      url = "https://registry.npmjs.org/espree/-/espree-3.1
-      sha1 = "fd5deec76a97a5120a9cd3a7cb1177a0923b11d2";
+      url = "https://registry.npmjs.org/espree/-/espree-3.3.1.tgz";
+      sha1 = "42107376856738a65ff3b5877f3a58bd52497643";
     };
   };
   "estraverse-3.1.0" = {
@@ -1183,13 +1211,13 @@ let
       sha1 = "96e3b70d5779f6ad49cd032673d1c312767ba581";
     };
   };
-  "optionator-0.8.
+  "optionator-0.8.2" = {
     name = "optionator";
     packageName = "optionator";
-    version = "0.8.
+    version = "0.8.2";
     src = fetchurl {
-      url = "https://registry.npmjs.org/optionator/-/optionator-0.8.
-      sha1 = "e31b4932cdd5fb862a8b0d10bc63d3ee1ec7d78b";
+      url = "https://registry.npmjs.org/optionator/-/optionator-0.8.2.tgz";
+      sha1 = "364c5e409d3f4d6301d6c0b4c05bba50180aeb64";
     };
   };
   "source-map-0.2.0" = {
@@ -1246,13 +1274,31 @@ let
       sha1 = "3b09924edf9f083c0490fdd4c0bc4421e04764ee";
     };
   };
-  "fast-levenshtein-
+  "fast-levenshtein-2.0.4" = {
     name = "fast-levenshtein";
     packageName = "fast-levenshtein";
-    version = "
+    version = "2.0.4";
+    src = fetchurl {
+      url = "https://registry.npmjs.org/fast-levenshtein/-/fast-levenshtein-2.0.4.tgz";
+      sha1 = "e31e729eea62233c60a7bc9dce2bdcc88b4fffe3";
+    };
+  };
+  "acorn-4.0.3" = {
+    name = "acorn";
+    packageName = "acorn";
+    version = "4.0.3";
     src = fetchurl {
-      url = "https://registry.npmjs.org/
-      sha1 = "e6a754cc8f15e58987aa9cbd27af66fd6f4e5af9";
+      url = "https://registry.npmjs.org/acorn/-/acorn-4.0.3.tgz";
+      sha1 = "1a3e850b428e73ba6b09d1cc527f5aaad4d03ef1";
+    };
+  };
+  "acorn-jsx-3.0.1" = {
+    name = "acorn-jsx";
+    packageName = "acorn-jsx";
+    version = "3.0.1";
+    src = fetchurl {
+      url = "https://registry.npmjs.org/acorn-jsx/-/acorn-jsx-3.0.1.tgz";
+      sha1 = "afdf9488fb1ecefc8348f6fb22f464e32a58b36b";
     };
   };
   "acorn-3.3.0" = {
@@ -1264,15 +1310,6 @@ let
       sha1 = "45e37fb39e8da3f25baee3ff5369e2bb5f22017a";
     };
   };
-  "acorn-jsx-3.0.1" = {
-    name = "acorn-jsx";
-    packageName = "acorn-jsx";
-    version = "3.0.1";
-    src = fetchurl {
-      url = "https://registry.npmjs.org/acorn-jsx/-/acorn-jsx-3.0.1.tgz";
-      sha1 = "afdf9488fb1ecefc8348f6fb22f464e32a58b36b";
-    };
-  };
   "boxen-0.3.1" = {
     name = "boxen";
     packageName = "boxen";
@@ -1282,13 +1319,13 @@ let
       sha1 = "a7d898243ae622f7abb6bb604d740a76c6a5461b";
     };
   };
-  "configstore-2.
+  "configstore-2.1.0" = {
     name = "configstore";
     packageName = "configstore";
-    version = "2.
+    version = "2.1.0";
     src = fetchurl {
-      url = "https://registry.npmjs.org/configstore/-/configstore-2.
-      sha1 = "8d81e9cdfa73ebd0e06bc985147856b2f1c4e764";
+      url = "https://registry.npmjs.org/configstore/-/configstore-2.1.0.tgz";
+      sha1 = "737a3a7036e9886102aa6099e47bb33ab1aba1a1";
     };
   };
   "is-npm-1.0.0" = {
@@ -1399,13 +1436,13 @@ let
       sha1 = "ef9e31386f031a7f0d643af82fde50c457ef00cb";
     };
   };
-  "dot-prop-
+  "dot-prop-3.0.0" = {
     name = "dot-prop";
     packageName = "dot-prop";
-    version = "
+    version = "3.0.0";
     src = fetchurl {
-      url = "https://registry.npmjs.org/dot-prop/-/dot-prop-
-      sha1 = "848e28f7f1d50740c6747ab3cb07670462b6f89c";
+      url = "https://registry.npmjs.org/dot-prop/-/dot-prop-3.0.0.tgz";
+      sha1 = "1b708af094a49c9a0e7dbcad790aba539dac1177";
     };
   };
   "os-tmpdir-1.0.1" = {
@@ -1426,13 +1463,13 @@ let
       sha1 = "83cf05c6d6458fc4d5ac6362ea325d92f2754217";
     };
   };
-  "uuid-2.0.
+  "uuid-2.0.3" = {
     name = "uuid";
     packageName = "uuid";
-    version = "2.0.
+    version = "2.0.3";
     src = fetchurl {
-      url = "https://registry.npmjs.org/uuid/-/uuid-2.0.
-      sha1 = "48bd5698f0677e3c7901a1c46ef15b1643794726";
+      url = "https://registry.npmjs.org/uuid/-/uuid-2.0.3.tgz";
+      sha1 = "67e2e863797215530dff318e5bf9dcebfd47b21a";
     };
   };
   "write-file-atomic-1.2.0" = {
@@ -1489,13 +1526,13 @@ let
       sha1 = "56eb027d65b4d2dce6cb2e2d32c4d4afc9e1d707";
     };
   };
-  "package-json-2.
+  "package-json-2.4.0" = {
     name = "package-json";
     packageName = "package-json";
-    version = "2.
+    version = "2.4.0";
     src = fetchurl {
-      url = "https://registry.npmjs.org/package-json/-/package-json-2.
-      sha1 = "14895311a963d18edf8801e06b67ea87795d15b9";
+      url = "https://registry.npmjs.org/package-json/-/package-json-2.4.0.tgz";
+      sha1 = "0d15bd67d1cbbddbb2ca222ff2edb86bcb31a8bb";
     };
   };
   "got-5.6.0" = {
@@ -1507,13 +1544,13 @@ let
       sha1 = "bb1d7ee163b78082bbc8eb836f3f395004ea6fbf";
     };
   };
-  "rc-1.1.6" = {
-    name = "r
-    packageName = "r
-    version = "
+  "registry-auth-token-3.0.1" = {
+    name = "registry-auth-token";
+    packageName = "registry-auth-token";
+    version = "3.0.1";
     src = fetchurl {
-      url = "https://registry.npmjs.org/r
-      sha1 = "43651b76b6ae53b5c802f1151fa3fc3b059969c9";
+      url = "https://registry.npmjs.org/registry-auth-token/-/registry-auth-token-3.0.1.tgz";
+      sha1 = "c3ee5ec585bce29f88bf41629a3944c71ed53e25";
     };
   };
   "registry-url-3.1.0" = {
@@ -1651,13 +1688,13 @@ let
       sha1 = "f38b0ae81d3747d628001f41dafc652ace671c0a";
     };
   };
-  "unzip-response-1.0.
+  "unzip-response-1.0.1" = {
     name = "unzip-response";
     packageName = "unzip-response";
-    version = "1.0.
+    version = "1.0.1";
     src = fetchurl {
-      url = "https://registry.npmjs.org/unzip-response/-/unzip-response-1.0.
-      sha1 = "bfda54eeec658f00c2df4d4494b9dca0ca00f3e4";
+      url = "https://registry.npmjs.org/unzip-response/-/unzip-response-1.0.1.tgz";
+      sha1 = "4a73959f2989470fa503791cefb54e1dbbc68412";
     };
   };
   "url-parse-lax-1.0.0" = {
@@ -1768,6 +1805,15 @@ let
       sha1 = "d4f4562b0ce3696e41ac52d0e002e57a635dc6dc";
     };
   };
+  "rc-1.1.6" = {
+    name = "rc";
+    packageName = "rc";
+    version = "1.1.6";
+    src = fetchurl {
+      url = "https://registry.npmjs.org/rc/-/rc-1.1.6.tgz";
+      sha1 = "43651b76b6ae53b5c802f1151fa3fc3b059969c9";
+    };
+  };
   "ini-1.3.4" = {
     name = "ini";
     packageName = "ini";
@@ -1858,13 +1904,13 @@ let
       sha1 = "3678bd8ab995057c07ade836ed2ef087da811d45";
     };
   };
-  "glob-7.0
+  "glob-7.1.0" = {
     name = "glob";
     packageName = "glob";
-    version = "7.0
+    version = "7.1.0";
     src = fetchurl {
-      url = "https://registry.npmjs.org/glob/-/glob-7.0
-      sha1 = "211bafaf49e525b8cd93260d14ab136152b3f57a";
+      url = "https://registry.npmjs.org/glob/-/glob-7.1.0.tgz";
+      sha1 = "36add856d746d0d99e4cc2797bba1ae2c67272fd";
     };
   };
   "fs.realpath-1.0.0" = {
@@ -1885,13 +1931,13 @@ let
       sha1 = "db3204cd5a9de2e6cd890b85c6e2f66bcf4f620a";
     };
   };
-  "once-1.
+  "once-1.4.0" = {
     name = "once";
     packageName = "once";
-    version = "1.
+    version = "1.4.0";
     src = fetchurl {
-      url = "https://registry.npmjs.org/once/-/once-1.
-      sha1 = "b2e261557ce4c314ec8304f3fa82663e4297ca20";
+      url = "https://registry.npmjs.org/once/-/once-1.4.0.tgz";
+      sha1 = "583b1aa775961d4b113ac17d9c50baef9dd76bd1";
     };
   };
   "wrappy-1.0.2" = {
@@ -2034,7 +2080,7 @@ let
     (sources."grunt-contrib-less-1.4.0" // {
       dependencies = [
         sources."async-2.0.1"
-        sources."lodash-4.1
+        sources."lodash-4.16.2"
       ];
     })
     (sources."grunt-contrib-watch-0.6.1" // {
@@ -2062,6 +2108,9 @@ let
         sources."lodash-3.7.0"
       ];
     })
+    sources."bower-1.7.9"
+    sources."favico.js-0.3.10"
+    sources."appenlight-client-git+https://git@github.com/AppEnlight/appenlight-client-js.git#0.5.0"
     sources."async-0.1.22"
     sources."coffee-script-1.3.3"
     sources."colors-0.6.2"
@@ -2097,7 +2146,7 @@ let
         sources."underscore.string-2.3.3"
       ];
     })
-    sources."inherits-2.0.
+    sources."inherits-2.0.3"
     sources."lru-cache-2.7.3"
     sources."sigmund-1.0.1"
     sources."graceful-fs-1.2.3"
@@ -2127,7 +2176,7 @@ let
     sources."amdefine-1.0.0"
     (sources."less-2.7.1" // {
       dependencies = [
-        sources."graceful-fs-4.1.
+        sources."graceful-fs-4.1.8"
         sources."source-map-0.5.6"
       ];
     })
@@ -2138,7 +2187,7 @@ let
     sources."promise-7.1.1"
     sources."prr-0.0.0"
     sources."minimist-0.0.8"
-    sources."asap-2.0.
+    sources."asap-2.0.5"
    sources."gaze-0.5.2"
    sources."tiny-lr-fork-0.0.5"
    (sources."globule-0.1.0" // {
@@ -2155,17 +2204,17 @@ let
     })
     sources."debug-0.7.4"
     sources."command-line-args-2.1.6"
-    sources."dom5-1.3.
+    sources."dom5-1.3.6"
     sources."array-back-1.0.3"
     sources."command-line-usage-2.0.5"
     sources."core-js-2.4.1"
     sources."feature-detect-es6-1.3.1"
     (sources."find-replace-1.0.2" // {
       dependencies = [
-        sources."test-value-2.
+        sources."test-value-2.1.0"
       ];
     })
-    sources."typical-2.
+    sources."typical-2.6.0"
     sources."ansi-escape-sequences-2.2.2"
     sources."column-layout-2.1.4"
     sources."wordwrapjs-1.2.1"
@@ -2182,11 +2231,11 @@ let
     sources."object-tools-2.0.6"
     sources."object-get-2.1.0"
     sources."test-value-1.1.0"
-    sources."@types/clone-0.1.
+    sources."@types/clone-0.1.30"
     sources."@types/node-4.0.30"
-    (sources."@types/parse5-0.0.
+    (sources."@types/parse5-0.0.31" // {
       dependencies = [
-        sources."@types/node-6.0.
+        sources."@types/node-6.0.41"
       ];
     })
     sources."clone-1.0.2"
@@ -2205,26 +2254,30 @@ let
       sources."source-map-0.2.0"
       ];
     })
-    sources."espree-3.1
+    sources."espree-3.3.1"
     sources."estraverse-3.1.0"
     sources."path-is-absolute-1.0.0"
     sources."babel-runtime-6.11.6"
     sources."regenerator-runtime-0.9.5"
     sources."esutils-1.1.6"
     sources."isarray-0.0.1"
-    sources."optionator-0.8.
+    sources."optionator-0.8.2"
    sources."prelude-ls-1.1.2"
    sources."deep-is-0.1.3"
    sources."wordwrap-1.0.0"
    sources."type-check-0.3.2"
    sources."levn-0.3.0"
-    sources."fast-levenshtein-
+    sources."fast-levenshtein-2.0.4"
-    sources."acorn-
+    sources."acorn-4.0.3"
-    sources."acorn-jsx-3.0.1"
+    (sources."acorn-jsx-3.0.1" // {
+      dependencies = [
+        sources."acorn-3.3.0"
+      ];
+    })
     sources."boxen-0.3.1"
-    (sources."configstore-2.
+    (sources."configstore-2.1.0" // {
       dependencies = [
-        sources."graceful-fs-4.1.
+        sources."graceful-fs-4.1.8"
       ];
     })
     sources."is-npm-1.0.0"
@@ -2239,13 +2292,13 @@ let
     sources."number-is-nan-1.0.0"
     sources."code-point-at-1.0.0"
     sources."is-fullwidth-code-point-1.0.0"
-    sources."dot-prop-
+    sources."dot-prop-3.0.0"
     sources."os-tmpdir-1.0.1"
     sources."osenv-0.1.3"
-    sources."uuid-2.0.
+    sources."uuid-2.0.3"
     (sources."write-file-atomic-1.2.0" // {
       dependencies = [
-        sources."graceful-fs-4.1.
+        sources."graceful-fs-4.1.8"
       ];
     })
     sources."xdg-basedir-2.0.0"
@@ -2253,13 +2306,9 @@ let
     sources."os-homedir-1.0.1"
     sources."imurmurhash-0.1.4"
     sources."slide-1.1.6"
-    sources."package-json-2.
+    sources."package-json-2.4.0"
     sources."got-5.6.0"
-    (sources."rc-1.1.6" // {
-      dependencies = [
-        sources."minimist-1.2.0"
-      ];
-    })
+    sources."registry-auth-token-3.0.1"
     sources."registry-url-3.1.0"
     sources."semver-5.3.0"
     sources."create-error-class-3.0.2"
@@ -2279,7 +2328,7 @@ let
       ];
     })
     sources."timed-out-2.0.0"
-    sources."unzip-response-1.0.
+    sources."unzip-response-1.0.1"
     sources."url-parse-lax-1.0.0"
     sources."capture-stack-trace-1.0.0"
     sources."error-ex-1.3.0"
@@ -2291,11 +2340,16 @@ let
     sources."string_decoder-0.10.31"
     sources."util-deprecate-1.0.2"
     sources."prepend-http-1.0.4"
+    (sources."rc-1.1.6" // {
+      dependencies = [
+        sources."minimist-1.2.0"
+      ];
+    })
     sources."ini-1.3.4"
     sources."strip-json-comments-1.0.4"
     (sources."cli-1.0.0" // {
       dependencies = [
-        sources."glob-7.0
+        sources."glob-7.1.0"
         sources."minimatch-3.0.3"
       ];
     })
@@ -2308,7 +2362,7 @@ let
     sources."shelljs-0.3.0"
     sources."fs.realpath-1.0.0"
     sources."inflight-1.0.5"
-    sources."once-1.
+    sources."once-1.4.0"
     sources."wrappy-1.0.2"
     sources."brace-expansion-1.1.6"
     sources."balanced-match-0.4.2"
@@ -15,19 +15,6 @@ let
     };
   };

-  # johbo: Interim bridge which allows us to build with the upcoming
-  # nixos.16.09 branch (unstable at the moment of writing this note) and the
-  # current stable nixos-16.03.
-  backwardsCompatibleFetchgit = { ... }@args:
-    let
-      origSources = pkgs.fetchgit args;
-    in
-    pkgs.lib.overrideDerivation origSources (oldAttrs: {
-      NIX_PREFETCH_GIT_CHECKOUT_HOOK = ''
-        find $out -name '.git*' -print0 | xargs -0 rm -rf
-      '';
-    });
-
 in

 self: super: {
@@ -77,6 +64,9 b' self: super: {' | |||||
77 | }); |
|
64 | }); | |
78 |
|
65 | |||
79 | lxml = super.lxml.override (attrs: { |
|
66 | lxml = super.lxml.override (attrs: { | |
|
67 | # johbo: On 16.09 we need this to compile on darwin, otherwise compilation | |||
|
68 | # fails on Darwin. | |||
|
69 | hardeningDisable = if pkgs.stdenv.isDarwin then [ "format" ] else null; | |||
80 | buildInputs = with self; [ |
|
70 | buildInputs = with self; [ | |
81 | pkgs.libxml2 |
|
71 | pkgs.libxml2 | |
82 | pkgs.libxslt |
|
72 | pkgs.libxslt | |
@@ -110,7 +100,7 b' self: super: {' | |||||
110 | }); |
|
100 | }); | |
111 |
|
101 | |||
112 | py-gfm = super.py-gfm.override { |
|
102 | py-gfm = super.py-gfm.override { | |
113 |
src = |
|
103 | src = pkgs.fetchgit { | |
114 | url = "https://code.rhodecode.com/upstream/py-gfm"; |
|
104 | url = "https://code.rhodecode.com/upstream/py-gfm"; | |
115 | rev = "0d66a19bc16e3d49de273c0f797d4e4781e8c0f2"; |
|
105 | rev = "0d66a19bc16e3d49de273c0f797d4e4781e8c0f2"; | |
116 | sha256 = "0ryp74jyihd3ckszq31bml5jr3bciimhfp7va7kw6ld92930ksv3"; |
|
106 | sha256 = "0ryp74jyihd3ckszq31bml5jr3bciimhfp7va7kw6ld92930ksv3"; | |
@@ -134,7 +124,7 b' self: super: {' | |||||
134 |
|
124 | |||
135 | Pylons = super.Pylons.override (attrs: { |
|
125 | Pylons = super.Pylons.override (attrs: { | |
136 | name = "Pylons-1.0.1-patch1"; |
|
126 | name = "Pylons-1.0.1-patch1"; | |
137 |
src = |
|
127 | src = pkgs.fetchgit { | |
138 | url = "https://code.rhodecode.com/upstream/pylons"; |
|
128 | url = "https://code.rhodecode.com/upstream/pylons"; | |
139 | rev = "707354ee4261b9c10450404fc9852ccea4fd667d"; |
|
129 | rev = "707354ee4261b9c10450404fc9852ccea4fd667d"; | |
140 | sha256 = "b2763274c2780523a335f83a1df65be22ebe4ff413a7bc9e9288d23c1f62032e"; |
|
130 | sha256 = "b2763274c2780523a335f83a1df65be22ebe4ff413a7bc9e9288d23c1f62032e"; | |
@@ -190,7 +180,8 b' self: super: {' | |||||
190 | pkgs.openldap |
|
180 | pkgs.openldap | |
191 | pkgs.openssl |
|
181 | pkgs.openssl | |
192 | ]; |
|
182 | ]; | |
193 | NIX_CFLAGS_COMPILE = "-I${pkgs.cyrus_sasl}/include/sasl"; |
|
183 | # TODO: johbo: Remove the "or" once we drop 16.03 support. | |
|
184 | NIX_CFLAGS_COMPILE = "-I${pkgs.cyrus_sasl.dev or pkgs.cyrus_sasl}/include/sasl"; | |||
194 | }); |
|
185 | }); | |
195 |
|
186 | |||
196 | python-pam = super.python-pam.override (attrs: |
|
187 | python-pam = super.python-pam.override (attrs: |
@@ -431,6 +431,19 @@
 license = [ pkgs.lib.licenses.psfl ];
 };
 };
+backports.shutil-get-terminal-size = super.buildPythonPackage {
+name = "backports.shutil-get-terminal-size-1.0.0";
+buildInputs = with self; [];
+doCheck = false;
+propagatedBuildInputs = with self; [];
+src = fetchurl {
+url = "https://pypi.python.org/packages/ec/9c/368086faa9c016efce5da3e0e13ba392c9db79e3ab740b763fe28620b18b/backports.shutil_get_terminal_size-1.0.0.tar.gz";
+md5 = "03267762480bd86b50580dc19dff3c66";
+};
+meta = {
+license = [ pkgs.lib.licenses.mit ];
+};
+};
 bottle = super.buildPythonPackage {
 name = "bottle-0.12.8";
 buildInputs = with self; [];
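Each `fetchurl` block in these hunks pins a tarball by its MD5 digest. A rough Python sketch of the integrity check that Nix performs for the `md5` attribute (the function name here is illustrative, not part of Nix):

```python
import hashlib

def verify_md5(data: bytes, expected: str) -> bool:
    # Compare the MD5 digest of the downloaded bytes against the pinned
    # hash, as Nix's fetchurl does for the `md5` attribute above.
    return hashlib.md5(data).hexdigest() == expected
```

A mismatch makes the fetch fail, which is why every version bump in this diff also updates the corresponding `md5` line.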
@@ -678,17 +691,17 @@
 license = [ pkgs.lib.licenses.asl20 ];
 };
 };
-
+enum34 = super.buildPythonPackage {
-name = "
+name = "enum34-1.1.6";
 buildInputs = with self; [];
 doCheck = false;
-propagatedBuildInputs = with self; [
+propagatedBuildInputs = with self; [];
 src = fetchurl {
-url = "https://pypi.python.org/packages/8f/b5/9a73c66c7dba273bac8758398f060c008a25f3e84531063b42503b5d0a95/flake8-2.4.1.tar.gz";
+url = "https://pypi.python.org/packages/bf/3e/31d502c25302814a7c2f1d3959d2a3b3f78e509002ba91aea64993936876/enum34-1.1.6.tar.gz";
-md5 = "ed45d3db81a3b7c88bd63c6e37ca1d65";
+md5 = "5f13a0841a61f7fc295c514490d120d0";
 };
 meta = {
-license = [ pkgs.lib.licenses.
+license = [ pkgs.lib.licenses.bsdOriginal ];
 };
 };
 future = super.buildPythonPackage {
@@ -809,26 +822,39 @@
 };
 };
 ipdb = super.buildPythonPackage {
-name = "ipdb-0.
+name = "ipdb-0.10.1";
 buildInputs = with self; [];
 doCheck = false;
-propagatedBuildInputs = with self; [ipython];
+propagatedBuildInputs = with self; [ipython setuptools];
 src = fetchurl {
-url = "https://pypi.python.org/packages/f0/25/d7dd430ced6cd8dc242a933c8682b5dbf32eb4011d82f87e34209e5ec845/ipdb-0.8.zip";
+url = "https://pypi.python.org/packages/eb/0a/0a37dc19572580336ad3813792c0d18c8d7117c2d66fc63c501f13a7a8f8/ipdb-0.10.1.tar.gz";
-md5 = "96dca0712efa01aa5eaf6b22071dd3ed";
+md5 = "4aeab65f633ddc98ebdb5eebf08dc713";
 };
 meta = {
-license = [ pkgs.lib.licenses.
+license = [ pkgs.lib.licenses.bsdOriginal ];
 };
 };
 ipython = super.buildPythonPackage {
-name = "ipython-
+name = "ipython-5.1.0";
+buildInputs = with self; [];
+doCheck = false;
+propagatedBuildInputs = with self; [setuptools decorator pickleshare simplegeneric traitlets prompt-toolkit Pygments pexpect backports.shutil-get-terminal-size pathlib2 pexpect];
+src = fetchurl {
+url = "https://pypi.python.org/packages/89/63/a9292f7cd9d0090a0f995e1167f3f17d5889dcbc9a175261719c513b9848/ipython-5.1.0.tar.gz";
+md5 = "47c8122420f65b58784cb4b9b4af35e3";
+};
+meta = {
+license = [ pkgs.lib.licenses.bsdOriginal ];
+};
+};
+ipython-genutils = super.buildPythonPackage {
+name = "ipython-genutils-0.1.0";
 buildInputs = with self; [];
 doCheck = false;
 propagatedBuildInputs = with self; [];
 src = fetchurl {
-url = "https://pypi.python.org/packages/06
+url = "https://pypi.python.org/packages/71/b7/a64c71578521606edbbce15151358598f3dfb72a3431763edc2baf19e71f/ipython_genutils-0.1.0.tar.gz";
-md5 = "a749d90c16068687b0ec45a27e72ef8f";
+md5 = "9a8afbe0978adbcbfcb3b35b2d015a56";
 };
 meta = {
 license = [ pkgs.lib.licenses.bsdOriginal ];
@@ -886,19 +912,6 @@
 license = [ pkgs.lib.licenses.bsdOriginal ];
 };
 };
-mccabe = super.buildPythonPackage {
-name = "mccabe-0.3";
-buildInputs = with self; [];
-doCheck = false;
-propagatedBuildInputs = with self; [];
-src = fetchurl {
-url = "https://pypi.python.org/packages/c9/2e/75231479e11a906b64ac43bad9d0bb534d00080b18bdca8db9da46e1faf7/mccabe-0.3.tar.gz";
-md5 = "81640948ff226f8c12b3277059489157";
-};
-meta = {
-license = [ { fullName = "Expat license"; } pkgs.lib.licenses.mit ];
-};
-};
 meld3 = super.buildPythonPackage {
 name = "meld3-1.0.2";
 buildInputs = with self; [];
@@ -990,17 +1003,17 @@
 license = [ { fullName = "LGPL"; } { fullName = "GNU Library or Lesser General Public License (LGPL)"; } ];
 };
 };
-p
+pathlib2 = super.buildPythonPackage {
-name = "p
+name = "pathlib2-2.1.0";
 buildInputs = with self; [];
 doCheck = false;
-propagatedBuildInputs = with self; [];
+propagatedBuildInputs = with self; [six];
 src = fetchurl {
-url = "https://pypi.python.org/packages/8b/de/259f5e735897ada1683489dd514b2a1c91aaa74e5e6b68f80acf128a6368/pep8-1.5.7.tar.gz";
+url = "https://pypi.python.org/packages/c9/27/8448b10d8440c08efeff0794adf7d0ed27adb98372c70c7b38f3947d4749/pathlib2-2.1.0.tar.gz";
-md5 = "f6adbdd69365ecca20513c709f9b7c93";
+md5 = "38e4f58b4d69dfcb9edb49a54a8b28d2";
 };
 meta = {
-license = [
+license = [ pkgs.lib.licenses.mit ];
 };
 };
 peppercorn = super.buildPythonPackage {
@@ -1016,14 +1029,53 @@
 license = [ { fullName = "BSD-derived (http://www.repoze.org/LICENSE.txt)"; } ];
 };
 };
+pexpect = super.buildPythonPackage {
+name = "pexpect-4.2.1";
+buildInputs = with self; [];
+doCheck = false;
+propagatedBuildInputs = with self; [ptyprocess];
+src = fetchurl {
+url = "https://pypi.python.org/packages/e8/13/d0b0599099d6cd23663043a2a0bb7c61e58c6ba359b2656e6fb000ef5b98/pexpect-4.2.1.tar.gz";
+md5 = "3694410001a99dff83f0b500a1ca1c95";
+};
+meta = {
+license = [ pkgs.lib.licenses.isc { fullName = "ISC License (ISCL)"; } ];
+};
+};
+pickleshare = super.buildPythonPackage {
+name = "pickleshare-0.7.4";
+buildInputs = with self; [];
+doCheck = false;
+propagatedBuildInputs = with self; [pathlib2];
+src = fetchurl {
+url = "https://pypi.python.org/packages/69/fe/dd137d84daa0fd13a709e448138e310d9ea93070620c9db5454e234af525/pickleshare-0.7.4.tar.gz";
+md5 = "6a9e5dd8dfc023031f6b7b3f824cab12";
+};
+meta = {
+license = [ pkgs.lib.licenses.mit ];
+};
+};
+prompt-toolkit = super.buildPythonPackage {
+name = "prompt-toolkit-1.0.9";
+buildInputs = with self; [];
+doCheck = false;
+propagatedBuildInputs = with self; [six wcwidth];
+src = fetchurl {
+url = "https://pypi.python.org/packages/83/14/5ac258da6c530eca02852ee25c7a9ff3ca78287bb4c198d0d0055845d856/prompt_toolkit-1.0.9.tar.gz";
+md5 = "a39f91a54308fb7446b1a421c11f227c";
+};
+meta = {
+license = [ pkgs.lib.licenses.bsdOriginal ];
+};
+};
 psutil = super.buildPythonPackage {
-name = "psutil-
+name = "psutil-4.3.1";
 buildInputs = with self; [];
 doCheck = false;
 propagatedBuildInputs = with self; [];
 src = fetchurl {
-url = "https://pypi.python.org/packages/df/47/ee54ef14dd40f8ce831a7581001a5096494dc99fe71586260ca6b531fe86/psutil-2.2.1.tar.gz";
+url = "https://pypi.python.org/packages/78/cc/f267a1371f229bf16db6a4e604428c3b032b823b83155bd33cef45e49a53/psutil-4.3.1.tar.gz";
-md5 = "1a2b58cd9e3a53528bb6148f0c4d5244";
+md5 = "199a366dba829c88bddaf5b41d19ddc0";
 };
 meta = {
 license = [ pkgs.lib.licenses.bsdOriginal ];
@@ -1042,6 +1094,19 @@
 license = [ pkgs.lib.licenses.zpt21 { fullName = "GNU Library or Lesser General Public License (LGPL)"; } { fullName = "LGPL with exceptions or ZPL"; } ];
 };
 };
+ptyprocess = super.buildPythonPackage {
+name = "ptyprocess-0.5.1";
+buildInputs = with self; [];
+doCheck = false;
+propagatedBuildInputs = with self; [];
+src = fetchurl {
+url = "https://pypi.python.org/packages/db/d7/b465161910f3d1cef593c5e002bff67e0384898f597f1a7fdc8db4c02bf6/ptyprocess-0.5.1.tar.gz";
+md5 = "94e537122914cc9ec9c1eadcd36e73a1";
+};
+meta = {
+license = [ ];
+};
+};
 py = super.buildPythonPackage {
 name = "py-1.4.29";
 buildInputs = with self; [];
@@ -1120,6 +1185,19 @@
 license = [ pkgs.lib.licenses.mit ];
 };
 };
+pygments-markdown-lexer = super.buildPythonPackage {
+name = "pygments-markdown-lexer-0.1.0.dev39";
+buildInputs = with self; [];
+doCheck = false;
+propagatedBuildInputs = with self; [Pygments];
+src = fetchurl {
+url = "https://pypi.python.org/packages/c3/12/674cdee66635d638cedb2c5d9c85ce507b7b2f91bdba29e482f1b1160ff6/pygments-markdown-lexer-0.1.0.dev39.zip";
+md5 = "6360fe0f6d1f896e35b7a0142ce6459c";
+};
+meta = {
+license = [ pkgs.lib.licenses.asl20 ];
+};
+};
 pyparsing = super.buildPythonPackage {
 name = "pyparsing-1.5.7";
 buildInputs = with self; [];
@@ -1420,10 +1498,10 @@
 };
 };
 rhodecode-enterprise-ce = super.buildPythonPackage {
-name = "rhodecode-enterprise-ce-4.
+name = "rhodecode-enterprise-ce-4.5.0";
-buildInputs = with self; [WebTest configobj cssselect
+buildInputs = with self; [WebTest configobj cssselect lxml mock pytest pytest-cov pytest-runner pytest-sugar];
 doCheck = true;
-propagatedBuildInputs = with self; [Babel Beaker FormEncode Mako Markdown MarkupSafe MySQL-python Paste PasteDeploy PasteScript Pygments Pylons Pyro4 Routes SQLAlchemy Tempita URLObject WebError WebHelpers WebHelpers2 WebOb WebTest Whoosh alembic amqplib anyjson appenlight-client authomatic backport-ipaddress celery channelstream colander decorator deform docutils gevent gunicorn infrae.cache ipython iso8601 kombu msgpack-python packaging psycopg2 py-gfm pycrypto pycurl pyparsing pyramid pyramid-debugtoolbar pyramid-mako pyramid-beaker pysqlite python-dateutil python-ldap python-memcached python-pam recaptcha-client repoze.lru requests simplejson waitress zope.cachedescriptors dogpile.cache dogpile.core psutil py-bcrypt];
+propagatedBuildInputs = with self; [Babel Beaker FormEncode Mako Markdown MarkupSafe MySQL-python Paste PasteDeploy PasteScript Pygments pygments-markdown-lexer Pylons Pyro4 Routes SQLAlchemy Tempita URLObject WebError WebHelpers WebHelpers2 WebOb WebTest Whoosh alembic amqplib anyjson appenlight-client authomatic backport-ipaddress celery channelstream colander decorator deform docutils gevent gunicorn infrae.cache ipython iso8601 kombu msgpack-python packaging psycopg2 py-gfm pycrypto pycurl pyparsing pyramid pyramid-debugtoolbar pyramid-mako pyramid-beaker pysqlite python-dateutil python-ldap python-memcached python-pam recaptcha-client repoze.lru requests simplejson subprocess32 waitress zope.cachedescriptors dogpile.cache dogpile.core psutil py-bcrypt];
 src = ./.;
 meta = {
 license = [ { fullName = "AGPLv3, and Commercial License"; } ];
@@ -1494,6 +1572,19 @@
 license = [ pkgs.lib.licenses.mit ];
 };
 };
+simplegeneric = super.buildPythonPackage {
+name = "simplegeneric-0.8.1";
+buildInputs = with self; [];
+doCheck = false;
+propagatedBuildInputs = with self; [];
+src = fetchurl {
+url = "https://pypi.python.org/packages/3d/57/4d9c9e3ae9a255cd4e1106bb57e24056d3d0709fc01b2e3e345898e49d5b/simplegeneric-0.8.1.zip";
+md5 = "f9c1fab00fd981be588fc32759f474e3";
+};
+meta = {
+license = [ pkgs.lib.licenses.zpt21 ];
+};
+};
 simplejson = super.buildPythonPackage {
 name = "simplejson-3.7.2";
 buildInputs = with self; [];
@@ -1546,6 +1637,19 @@
 license = [ { fullName = "BSD-derived (http://www.repoze.org/LICENSE.txt)"; } ];
 };
 };
+traitlets = super.buildPythonPackage {
+name = "traitlets-4.3.1";
+buildInputs = with self; [];
+doCheck = false;
+propagatedBuildInputs = with self; [ipython-genutils six decorator enum34];
+src = fetchurl {
+url = "https://pypi.python.org/packages/b1/d6/5b5aa6d5c474691909b91493da1e8972e309c9f01ecfe4aeafd272eb3234/traitlets-4.3.1.tar.gz";
+md5 = "dd0b1b6e5d31ce446d55a4b5e5083c98";
+};
+meta = {
+license = [ pkgs.lib.licenses.bsdOriginal ];
+};
+};
 transifex-client = super.buildPythonPackage {
 name = "transifex-client-0.10";
 buildInputs = with self; [];
@@ -1637,6 +1741,19 @@
 license = [ pkgs.lib.licenses.zpt21 ];
 };
 };
+wcwidth = super.buildPythonPackage {
+name = "wcwidth-0.1.7";
+buildInputs = with self; [];
+doCheck = false;
+propagatedBuildInputs = with self; [];
+src = fetchurl {
+url = "https://pypi.python.org/packages/55/11/e4a2bb08bb450fdbd42cc709dd40de4ed2c472cf0ccb9e64af22279c5495/wcwidth-0.1.7.tar.gz";
+md5 = "b3b6a0a08f0c8a34d1de8cf44150a4ad";
+};
+meta = {
+license = [ pkgs.lib.licenses.mit ];
+};
+};
 ws4py = super.buildPythonPackage {
 name = "ws4py-0.3.5";
 buildInputs = with self; [];
@@ -1718,5 +1835,30 @@

 ### Test requirements


+pytest-sugar = super.buildPythonPackage {
+name = "pytest-sugar-0.7.1";
+buildInputs = with self; [];
+doCheck = false;
+propagatedBuildInputs = with self; [pytest termcolor];
+src = fetchurl {
+url = "https://pypi.python.org/packages/03/97/05d988b4fa870e7373e8ee4582408543b9ca2bd35c3c67b569369c6f9c49/pytest-sugar-0.7.1.tar.gz";
+md5 = "7400f7c11f3d572b2c2a3b60352d35fe";
+};
+meta = {
+license = [ pkgs.lib.licenses.bsdOriginal ];
+};
+};
+termcolor = super.buildPythonPackage {
+name = "termcolor-1.1.0";
+buildInputs = with self; [];
+doCheck = false;
+propagatedBuildInputs = with self; [];
+src = fetchurl {
+url = "https://pypi.python.org/packages/8a/48/a76be51647d0eb9f10e2a4511bf3ffb8cc1e6b14e9e4fab46173aa79f981/termcolor-1.1.0.tar.gz";
+md5 = "043e89644f8909d462fbbfa511c768df";
+};
+meta = {
+license = [ pkgs.lib.licenses.mit ];
+};
+};
 }
@@ -1,9 +1,9 @@
 [pytest]
 testpaths = ./rhodecode
-pylons_config = test.ini
+pylons_config = rhodecode/tests/rhodecode.ini
-vcsserver_protocol = p
+vcsserver_protocol = http
-vcsserver_config = rhodecode/tests/vcsserver.ini
+vcsserver_config_pyro4 = rhodecode/tests/vcsserver_pyro4.ini
-vcsserver_config_http = rhodecode/tests/vcsserver_p
+vcsserver_config_http = rhodecode/tests/vcsserver_http.ini
 norecursedirs = tests/scripts
 addopts = -k "not _BaseTest"
 markers =
@@ -1,5 +1,6 @@
 Babel==1.3
 Beaker==1.7.0
+Chameleon==2.24
 CProfileV==1.0.6
 FormEncode==1.2.4
 Jinja2==2.7.3
@@ -11,6 +12,7 @@ Paste==2.0.2
 PasteDeploy==1.5.2
 PasteScript==1.7.5
 Pygments==2.1.3
+pygments-markdown-lexer==0.1.0.dev39

 # TODO: This version is not available on PyPI
 # Pylons==1.0.2.dev20160108
@@ -62,7 +64,6 @@ dogpile.cache==0.6.1
 dogpile.core==0.4.1
 dulwich==0.12.0
 ecdsa==0.11
-flake8==2.4.1
 future==0.14.3
 futures==3.0.2
 gevent==1.1.1
@@ -77,13 +78,12 @@ gunicorn==19.6.0
 gnureadline==6.3.3
 infrae.cache==1.0.1
 invoke==0.13.0
-ipdb==0.
+ipdb==0.10.1
-ipython==
+ipython==5.1.0
 iso8601==0.1.11
 itsdangerous==0.24
 kombu==1.5.1
 lxml==3.4.4
-mccabe==0.3
 meld3==1.0.2
 mock==1.0.1
 msgpack-python==0.4.6
@@ -91,8 +91,7 @@ nose==1.3.6
 objgraph==2.0.0
 packaging==15.2
 paramiko==1.15.1
-pep8==1.5.7
-psutil==2.2.1
+psutil==4.3.1
 psycopg2==2.6.1
 py==1.4.29
 py-bcrypt==0.4
@@ -141,6 +140,7 @@ transifex-client==0.10
 translationstring==1.3
 trollius==1.0.4
 uWSGI==2.0.11.2
+urllib3==1.16
 venusian==1.0
 waitress==0.8.9
 wsgiref==0.1.2
@@ -51,7 +51,7 @@ PYRAMID_SETTINGS = {}
 EXTENSIONS = {}

 __version__ = ('.'.join((str(each) for each in VERSION[:3])))
-__dbversion__ =
+__dbversion__ = 63  # defines current db version for migrations
 __platform__ = platform.system()
 __license__ = 'AGPLv3, and Commercial License'
 __author__ = 'RhodeCode GmbH'
@@ -35,6 +35,9 @@ def includeme(config):
 config.add_route(
     name='admin_settings_open_source',
     pattern=ADMIN_PREFIX + '/settings/open_source')
+config.add_route(
+    name='admin_settings_vcs_svn_generate_cfg',
+    pattern=ADMIN_PREFIX + '/settings/vcs/svn_generate_cfg')

 # Scan module for configuration decorators.
 config.scan()
@@ -24,8 +24,11 @@ import logging
 from pylons import tmpl_context as c
 from pyramid.view import view_config

-from rhodecode.lib.auth import
+from rhodecode.lib.auth import (
+    LoginRequired, HasPermissionAllDecorator, CSRFRequired)
 from rhodecode.lib.utils import read_opensource_licenses
+from rhodecode.svn_support.utils import generate_mod_dav_svn_config
+from rhodecode.translation import _

 from .navigation import navigation_list

@@ -53,3 +56,27 @@ class AdminSettingsView(object):
         sorted(read_opensource_licenses().items(), key=lambda t: t[0]))

         return {}
+
+    @LoginRequired()
+    @CSRFRequired()
+    @HasPermissionAllDecorator('hg.admin')
+    @view_config(
+        route_name='admin_settings_vcs_svn_generate_cfg',
+        request_method='POST', renderer='json')
+    def vcs_svn_generate_config(self):
+        try:
+            generate_mod_dav_svn_config(self.request.registry)
+            msg = {
+                'message': _('Apache configuration for Subversion generated.'),
+                'level': 'success',
+            }
+        except Exception:
+            log.exception(
+                'Exception while generating the Apache configuration for Subversion.')
+            msg = {
+                'message': _('Failed to generate the Apache configuration for Subversion.'),
+                'level': 'error',
+            }
+
+        data = {'message': msg}
+        return data
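The new view above wraps the config generation in a try/except and returns a JSON notification either way. A minimal standalone sketch of that response pattern, with the generator injected as a callable (the function name here is illustrative, not RhodeCode's API):

```python
def generate_config_response(generate_fn):
    # Run the injected generator and translate success or failure into the
    # {'message': {'message': ..., 'level': ...}} shape the view returns.
    try:
        generate_fn()
        msg = {
            'message': 'Apache configuration for Subversion generated.',
            'level': 'success',
        }
    except Exception:
        msg = {
            'message': 'Failed to generate the Apache configuration for Subversion.',
            'level': 'error',
        }
    return {'message': msg}
```

Injecting the generator keeps the error-handling branch trivially testable without touching Apache or the Pyramid registry.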
@@ -23,8 +23,11 @@ import json
 import mock
 import pytest

+from rhodecode.lib.utils2 import safe_unicode
 from rhodecode.lib.vcs import settings
+from rhodecode.model.meta import Session
 from rhodecode.model.repo import RepoModel
+from rhodecode.model.user import UserModel
 from rhodecode.tests import TEST_USER_ADMIN_LOGIN
 from rhodecode.api.tests.utils import (
     build_data, api_call, assert_ok, assert_error, crash)
@@ -36,29 +39,37 @@ fixture = Fixture()

 @pytest.mark.usefixtures("testuser_api", "app")
 class TestCreateRepo(object):
-    def test_api_create_repo(self, backend):
-        repo_name = 'api-repo-1'
+
+    @pytest.mark.parametrize('given, expected_name, expected_exc', [
+        ('api repo-1', 'api-repo-1', False),
+        ('api-repo 1-ąć', 'api-repo-1-ąć', False),
+        (u'unicode-ąć', u'unicode-ąć', False),
+        ('some repo v1.2', 'some-repo-v1.2', False),
+        ('v2.0', 'v2.0', False),
+    ])
+    def test_api_create_repo(self, backend, given, expected_name, expected_exc):
+
         id_, params = build_data(
             self.apikey,
             'create_repo',
-            repo_name=
+            repo_name=given,
             owner=TEST_USER_ADMIN_LOGIN,
             repo_type=backend.alias,
         )
         response = api_call(self.app, params)

-        repo = RepoModel().get_by_repo_name(repo_name)
-
-        assert repo is not None
         ret = {
-            'msg': 'Created new repository `%s`' % (
+            'msg': 'Created new repository `%s`' % (expected_name,),
             'success': True,
             'task': None,
         }
         expected = ret
         assert_ok(id_, expected, given=response.body)

-        id_, params = build_data(self.apikey, 'get_repo', repoid=repo_name)
+        repo = RepoModel().get_by_repo_name(safe_unicode(expected_name))
+        assert repo is not None
+
+        id_, params = build_data(self.apikey, 'get_repo', repoid=expected_name)
         response = api_call(self.app, params)
         body = json.loads(response.body)
@@ -66,7 +77,7 b' class TestCreateRepo(object):' | |||||
66 | assert body['result']['enable_locking'] is False |
|
77 | assert body['result']['enable_locking'] is False | |
67 | assert body['result']['enable_statistics'] is False |
|
78 | assert body['result']['enable_statistics'] is False | |
68 |
|
79 | |||
69 |
fixture.destroy_repo( |
|
80 | fixture.destroy_repo(safe_unicode(expected_name)) | |
70 |
|
81 | |||
71 | def test_api_create_restricted_repo_type(self, backend): |
|
82 | def test_api_create_restricted_repo_type(self, backend): | |
72 | repo_name = 'api-repo-type-{0}'.format(backend.alias) |
|
83 | repo_name = 'api-repo-type-{0}'.format(backend.alias) | |
@@ -158,6 +169,21 b' class TestCreateRepo(object):' | |||||
158 | fixture.destroy_repo(repo_name) |
|
169 | fixture.destroy_repo(repo_name) | |
159 | fixture.destroy_repo_group(repo_group_name) |
|
170 | fixture.destroy_repo_group(repo_group_name) | |
160 |
|
171 | |||
|
172 | def test_create_repo_in_group_that_doesnt_exist(self, backend, user_util): | |||
|
173 | repo_group_name = 'fake_group' | |||
|
174 | ||||
|
175 | repo_name = '%s/api-repo-gr' % (repo_group_name,) | |||
|
176 | id_, params = build_data( | |||
|
177 | self.apikey, 'create_repo', | |||
|
178 | repo_name=repo_name, | |||
|
179 | owner=TEST_USER_ADMIN_LOGIN, | |||
|
180 | repo_type=backend.alias,) | |||
|
181 | response = api_call(self.app, params) | |||
|
182 | ||||
|
183 | expected = {'repo_group': 'Repository group `{}` does not exist'.format( | |||
|
184 | repo_group_name)} | |||
|
185 | assert_error(id_, expected, given=response.body) | |||
|
186 | ||||
161 | def test_api_create_repo_unknown_owner(self, backend): |
|
187 | def test_api_create_repo_unknown_owner(self, backend): | |
162 | repo_name = 'api-repo-2' |
|
188 | repo_name = 'api-repo-2' | |
163 | owner = 'i-dont-exist' |
|
189 | owner = 'i-dont-exist' | |
@@ -218,10 +244,48 b' class TestCreateRepo(object):' | |||||
218 | owner=owner) |
|
244 | owner=owner) | |
219 | response = api_call(self.app, params) |
|
245 | response = api_call(self.app, params) | |
220 |
|
246 | |||
221 | expected = 'Only RhodeCode admin can specify `owner` param' |
|
247 | expected = 'Only RhodeCode super-admin can specify `owner` param' | |
222 | assert_error(id_, expected, given=response.body) |
|
248 | assert_error(id_, expected, given=response.body) | |
223 | fixture.destroy_repo(repo_name) |
|
249 | fixture.destroy_repo(repo_name) | |
224 |
|
250 | |||
|
251 | def test_api_create_repo_by_non_admin_no_parent_group_perms(self, backend): | |||
|
252 | repo_group_name = 'no-access' | |||
|
253 | fixture.create_repo_group(repo_group_name) | |||
|
254 | repo_name = 'no-access/api-repo' | |||
|
255 | ||||
|
256 | id_, params = build_data( | |||
|
257 | self.apikey_regular, 'create_repo', | |||
|
258 | repo_name=repo_name, | |||
|
259 | repo_type=backend.alias) | |||
|
260 | response = api_call(self.app, params) | |||
|
261 | ||||
|
262 | expected = {'repo_group': 'Repository group `{}` does not exist'.format( | |||
|
263 | repo_group_name)} | |||
|
264 | assert_error(id_, expected, given=response.body) | |||
|
265 | fixture.destroy_repo_group(repo_group_name) | |||
|
266 | fixture.destroy_repo(repo_name) | |||
|
267 | ||||
|
268 | def test_api_create_repo_non_admin_no_permission_to_create_to_root_level( | |||
|
269 | self, backend, user_util): | |||
|
270 | ||||
|
271 | regular_user = user_util.create_user() | |||
|
272 | regular_user_api_key = regular_user.api_key | |||
|
273 | ||||
|
274 | usr = UserModel().get_by_username(regular_user.username) | |||
|
275 | usr.inherit_default_permissions = False | |||
|
276 | Session().add(usr) | |||
|
277 | ||||
|
278 | repo_name = backend.new_repo_name() | |||
|
279 | id_, params = build_data( | |||
|
280 | regular_user_api_key, 'create_repo', | |||
|
281 | repo_name=repo_name, | |||
|
282 | repo_type=backend.alias) | |||
|
283 | response = api_call(self.app, params) | |||
|
284 | expected = { | |||
|
285 | "repo_name": "You do not have the permission to " | |||
|
286 | "store repositories in the root location."} | |||
|
287 | assert_error(id_, expected, given=response.body) | |||
|
288 | ||||
225 | def test_api_create_repo_exists(self, backend): |
|
289 | def test_api_create_repo_exists(self, backend): | |
226 | repo_name = backend.repo_name |
|
290 | repo_name = backend.repo_name | |
227 | id_, params = build_data( |
|
291 | id_, params = build_data( | |
@@ -230,7 +294,9 b' class TestCreateRepo(object):' | |||||
230 | owner=TEST_USER_ADMIN_LOGIN, |
|
294 | owner=TEST_USER_ADMIN_LOGIN, | |
231 | repo_type=backend.alias,) |
|
295 | repo_type=backend.alias,) | |
232 | response = api_call(self.app, params) |
|
296 | response = api_call(self.app, params) | |
233 | expected = "repo `%s` already exist" % (repo_name,) |
|
297 | expected = { | |
|
298 | 'unique_repo_name': 'Repository with name `{}` already exists'.format( | |||
|
299 | repo_name)} | |||
234 | assert_error(id_, expected, given=response.body) |
|
300 | assert_error(id_, expected, given=response.body) | |
235 |
|
301 | |||
236 | @mock.patch.object(RepoModel, 'create', crash) |
|
302 | @mock.patch.object(RepoModel, 'create', crash) | |
@@ -245,26 +311,40 b' class TestCreateRepo(object):' | |||||
245 | expected = 'failed to create repository `%s`' % (repo_name,) |
|
311 | expected = 'failed to create repository `%s`' % (repo_name,) | |
246 | assert_error(id_, expected, given=response.body) |
|
312 | assert_error(id_, expected, given=response.body) | |
247 |
|
313 | |||
248 | def test_create_repo_with_extra_slashes_in_name(self, backend, user_util): |
|
314 | @pytest.mark.parametrize('parent_group, dirty_name, expected_name', [ | |
249 | existing_repo_group = user_util.create_repo_group() |
|
315 | (None, 'foo bar x', 'foo-bar-x'), | |
250 | dirty_repo_name = '//{}/repo_name//'.format( |
|
316 | ('foo', '/foo//bar x', 'foo/bar-x'), | |
251 | existing_repo_group.group_name) |
|
317 | ('foo-bar', 'foo-bar //bar x', 'foo-bar/bar-x'), | |
252 | cleaned_repo_name = '{}/repo_name'.format( |
|
318 | ]) | |
253 | existing_repo_group.group_name) |
|
319 | def test_create_repo_with_extra_slashes_in_name( | |
|
320 | self, backend, parent_group, dirty_name, expected_name): | |||
|
321 | ||||
|
322 | if parent_group: | |||
|
323 | gr = fixture.create_repo_group(parent_group) | |||
|
324 | assert gr.group_name == parent_group | |||
254 |
|
325 | |||
255 | id_, params = build_data( |
|
326 | id_, params = build_data( | |
256 | self.apikey, 'create_repo', |
|
327 | self.apikey, 'create_repo', | |
257 |
repo_name=dirty_ |
|
328 | repo_name=dirty_name, | |
258 | repo_type=backend.alias, |
|
329 | repo_type=backend.alias, | |
259 | owner=TEST_USER_ADMIN_LOGIN,) |
|
330 | owner=TEST_USER_ADMIN_LOGIN,) | |
260 | response = api_call(self.app, params) |
|
331 | response = api_call(self.app, params) | |
261 | repo = RepoModel().get_by_repo_name(cleaned_repo_name) |
|
332 | expected ={ | |
|
333 | "msg": "Created new repository `{}`".format(expected_name), | |||
|
334 | "task": None, | |||
|
335 | "success": True | |||
|
336 | } | |||
|
337 | assert_ok(id_, expected, response.body) | |||
|
338 | ||||
|
339 | repo = RepoModel().get_by_repo_name(expected_name) | |||
262 | assert repo is not None |
|
340 | assert repo is not None | |
263 |
|
341 | |||
264 | expected = { |
|
342 | expected = { | |
265 |
'msg': 'Created new repository `%s`' % ( |
|
343 | 'msg': 'Created new repository `%s`' % (expected_name,), | |
266 | 'success': True, |
|
344 | 'success': True, | |
267 | 'task': None, |
|
345 | 'task': None, | |
268 | } |
|
346 | } | |
269 | assert_ok(id_, expected, given=response.body) |
|
347 | assert_ok(id_, expected, given=response.body) | |
270 |
fixture.destroy_repo( |
|
348 | fixture.destroy_repo(expected_name) | |
|
349 | if parent_group: | |||
|
350 | fixture.destroy_repo_group(parent_group) |
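The parametrized cases in the hunks above pin down how dirty repository names are normalized before creation: whitespace runs become dashes, redundant slashes are stripped, and unicode segments survive. A rough standalone sketch of that normalization (a hypothetical helper written for illustration, not RhodeCode's actual implementation):

```python
# -*- coding: utf-8 -*-
import re


def slugify_repo_name(name):
    """Approximate the name cleanup the parametrized tests above expect.

    Hypothetical helper: each path segment is stripped, whitespace runs
    collapse into single dashes, and duplicate/leading/trailing slashes
    are removed. Unicode characters and dots are kept as-is.
    """
    # split into group/repo segments, dropping empty ones from '//'
    segments = [seg for seg in name.split('/') if seg]
    cleaned = [re.sub(r'\s+', '-', seg.strip()) for seg in segments]
    return '/'.join(cleaned)
```

Running the test table through it reproduces the expected names, e.g. `slugify_repo_name('foo-bar //bar x')` yields `'foo-bar/bar-x'`.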
@@ -54,55 +54,10 b' class TestCreateRepoGroup(object):'
             'repo_group': repo_group.get_api_data()
         }
         expected = ret
-        assert_ok(id_, expected, given=response.body)
-        fixture.destroy_repo_group(repo_group_name)
-
-    def test_api_create_repo_group_regular_user(self):
-        repo_group_name = 'api-repo-group'
-
-        usr = UserModel().get_by_username(self.TEST_USER_LOGIN)
-        usr.inherit_default_permissions = False
-        Session().add(usr)
-        UserModel().grant_perm(
-            self.TEST_USER_LOGIN, 'hg.repogroup.create.true')
-        Session().commit()
-
-        repo_group = RepoGroupModel.cls.get_by_group_name(repo_group_name)
-        assert repo_group is None
-
-        id_, params = build_data(
-            self.apikey_regular, 'create_repo_group',
-            group_name=repo_group_name,
-            owner=TEST_USER_ADMIN_LOGIN,)
-        response = api_call(self.app, params)
-
-        repo_group = RepoGroupModel.cls.get_by_group_name(repo_group_name)
-        assert repo_group is not None
-        ret = {
-            'msg': 'Created new repo group `%s`' % (repo_group_name,),
-            'repo_group': repo_group.get_api_data()
-        }
-        expected = ret
-        assert_ok(id_, expected, given=response.body)
-        fixture.destroy_repo_group(repo_group_name)
-        UserModel().revoke_perm(
-            self.TEST_USER_LOGIN, 'hg.repogroup.create.true')
-        usr = UserModel().get_by_username(self.TEST_USER_LOGIN)
-        usr.inherit_default_permissions = True
-        Session().add(usr)
-        Session().commit()
-
-    def test_api_create_repo_group_regular_user_no_permission(self):
-        repo_group_name = 'api-repo-group'
-
-        id_, params = build_data(
-            self.apikey_regular, 'create_repo_group',
-            group_name=repo_group_name,
-            owner=TEST_USER_ADMIN_LOGIN,)
-        response = api_call(self.app, params)
-
-        expected = "Access was denied to this resource."
-        assert_error(id_, expected, given=response.body)
-
+        try:
+            assert_ok(id_, expected, given=response.body)
+        finally:
+            fixture.destroy_repo_group(repo_group_name)
 
     def test_api_create_repo_group_in_another_group(self):
         repo_group_name = 'api-repo-group'
@@ -127,9 +82,11 b' class TestCreateRepoGroup(object):'
             'repo_group': repo_group.get_api_data()
         }
         expected = ret
-        assert_ok(id_, expected, given=response.body)
-        fixture.destroy_repo_group(full_repo_group_name)
-        fixture.destroy_repo_group(repo_group_name)
+        try:
+            assert_ok(id_, expected, given=response.body)
+        finally:
+            fixture.destroy_repo_group(full_repo_group_name)
+            fixture.destroy_repo_group(repo_group_name)
 
     def test_api_create_repo_group_in_another_group_not_existing(self):
         repo_group_name = 'api-repo-group-no'
@@ -144,7 +101,10 b' class TestCreateRepoGroup(object):'
             owner=TEST_USER_ADMIN_LOGIN,
             copy_permissions=True)
         response = api_call(self.app, params)
-        expected = 'repository group `%s` does not exist' % (repo_group_name,)
+        expected = {
+            'repo_group':
+                'Parent repository group `{}` does not exist'.format(
+                    repo_group_name)}
         assert_error(id_, expected, given=response.body)
 
     def test_api_create_repo_group_that_exists(self):
@@ -159,9 +119,139 b' class TestCreateRepoGroup(object):'
             group_name=repo_group_name,
             owner=TEST_USER_ADMIN_LOGIN,)
         response = api_call(self.app, params)
-        expected = 'repo group `%s` already exist' % (repo_group_name,)
+        expected = {
+            'unique_repo_group_name':
+                'Repository group with name `{}` already exists'.format(
+                    repo_group_name)}
+        try:
+            assert_error(id_, expected, given=response.body)
+        finally:
+            fixture.destroy_repo_group(repo_group_name)
+
+    def test_api_create_repo_group_regular_user_wit_root_location_perms(
+            self, user_util):
+        regular_user = user_util.create_user()
+        regular_user_api_key = regular_user.api_key
+
+        repo_group_name = 'api-repo-group-by-regular-user'
+
+        usr = UserModel().get_by_username(regular_user.username)
+        usr.inherit_default_permissions = False
+        Session().add(usr)
+
+        UserModel().grant_perm(
+            regular_user.username, 'hg.repogroup.create.true')
+        Session().commit()
+
+        repo_group = RepoGroupModel.cls.get_by_group_name(repo_group_name)
+        assert repo_group is None
+
+        id_, params = build_data(
+            regular_user_api_key, 'create_repo_group',
+            group_name=repo_group_name)
+        response = api_call(self.app, params)
+
+        repo_group = RepoGroupModel.cls.get_by_group_name(repo_group_name)
+        assert repo_group is not None
+        expected = {
+            'msg': 'Created new repo group `%s`' % (repo_group_name,),
+            'repo_group': repo_group.get_api_data()
+        }
+        try:
+            assert_ok(id_, expected, given=response.body)
+        finally:
+            fixture.destroy_repo_group(repo_group_name)
+
+    def test_api_create_repo_group_regular_user_with_admin_perms_to_parent(
+            self, user_util):
+
+        repo_group_name = 'api-repo-group-parent'
+
+        repo_group = RepoGroupModel.cls.get_by_group_name(repo_group_name)
+        assert repo_group is None
+        # create the parent
+        fixture.create_repo_group(repo_group_name)
+
+        # user perms
+        regular_user = user_util.create_user()
+        regular_user_api_key = regular_user.api_key
+
+        usr = UserModel().get_by_username(regular_user.username)
+        usr.inherit_default_permissions = False
+        Session().add(usr)
+
+        RepoGroupModel().grant_user_permission(
+            repo_group_name, regular_user.username, 'group.admin')
+        Session().commit()
+
+        full_repo_group_name = repo_group_name + '/' + repo_group_name
+        id_, params = build_data(
+            regular_user_api_key, 'create_repo_group',
+            group_name=full_repo_group_name)
+        response = api_call(self.app, params)
+
+        repo_group = RepoGroupModel.cls.get_by_group_name(full_repo_group_name)
+        assert repo_group is not None
+        expected = {
+            'msg': 'Created new repo group `{}`'.format(full_repo_group_name),
+            'repo_group': repo_group.get_api_data()
+        }
+        try:
+            assert_ok(id_, expected, given=response.body)
+        finally:
+            fixture.destroy_repo_group(full_repo_group_name)
+            fixture.destroy_repo_group(repo_group_name)
+
+    def test_api_create_repo_group_regular_user_no_permission_to_create_to_root_level(self):
+        repo_group_name = 'api-repo-group'
+
+        id_, params = build_data(
+            self.apikey_regular, 'create_repo_group',
+            group_name=repo_group_name)
+        response = api_call(self.app, params)
+
+        expected = {
+            'repo_group':
+                u'You do not have the permission to store '
+                u'repository groups in the root location.'}
         assert_error(id_, expected, given=response.body)
-        fixture.destroy_repo_group(repo_group_name)
+
+    def test_api_create_repo_group_regular_user_no_parent_group_perms(self):
+        repo_group_name = 'api-repo-group-regular-user'
+
+        repo_group = RepoGroupModel.cls.get_by_group_name(repo_group_name)
+        assert repo_group is None
+        # create the parent
+        fixture.create_repo_group(repo_group_name)
+
+        full_repo_group_name = repo_group_name+'/'+repo_group_name
+
+        id_, params = build_data(
+            self.apikey_regular, 'create_repo_group',
+            group_name=full_repo_group_name)
+        response = api_call(self.app, params)
+
+        expected = {
+            'repo_group':
+                'Parent repository group `{}` does not exist'.format(
+                    repo_group_name)}
+        try:
+            assert_error(id_, expected, given=response.body)
+        finally:
+            fixture.destroy_repo_group(repo_group_name)
+
+    def test_api_create_repo_group_regular_user_no_permission_to_specify_owner(
+            self):
+        repo_group_name = 'api-repo-group'
+
+        id_, params = build_data(
+            self.apikey_regular, 'create_repo_group',
+            group_name=repo_group_name,
+            owner=TEST_USER_ADMIN_LOGIN,)
+        response = api_call(self.app, params)
+
+        expected = "Only RhodeCode super-admin can specify `owner` param"
+        assert_error(id_, expected, given=response.body)
 
     @mock.patch.object(RepoGroupModel, 'create', crash)
     def test_api_create_repo_group_exception_occurred(self):
@@ -28,6 +28,7 b' from rhodecode.tests import ('
 from rhodecode.api.tests.utils import (
     build_data, api_call, assert_ok, assert_error, jsonify, crash)
 from rhodecode.tests.fixture import Fixture
+from rhodecode.model.db import RepoGroup
 
 
 # TODO: mikhail: remove fixture from here
@@ -145,6 +146,36 b' class TestCreateUser(object):'
         finally:
             fixture.destroy_user(usr.user_id)
 
+    def test_api_create_user_with_personal_repo_group(self):
+        username = 'test_new_api_user_personal_group'
+        email = username + "@foo.com"
+
+        id_, params = build_data(
+            self.apikey, 'create_user',
+            username=username,
+            email=email, extern_name='rhodecode',
+            create_personal_repo_group=True)
+        response = api_call(self.app, params)
+
+        usr = UserModel().get_by_username(username)
+        ret = {
+            'msg': 'created new user `%s`' % (username,),
+            'user': jsonify(usr.get_api_data(include_secrets=True)),
+        }
+
+        personal_group = RepoGroup.get_by_group_name(username)
+        assert personal_group
+        assert personal_group.personal == True
+        assert personal_group.user.username == username
+
+        try:
+            expected = ret
+            assert_ok(id_, expected, given=response.body)
+        finally:
+            fixture.destroy_repo_group(username)
+            fixture.destroy_user(usr.user_id)
+
+
     @mock.patch.object(UserModel, 'create_or_update', crash)
     def test_api_create_user_when_exception_happened(self):
 
@@ -24,6 +24,7 b' import pytest'
 
 from rhodecode.model.meta import Session
 from rhodecode.model.repo import RepoModel
+from rhodecode.model.repo_group import RepoGroupModel
 from rhodecode.model.user import UserModel
 from rhodecode.tests import TEST_USER_ADMIN_LOGIN
 from rhodecode.api.tests.utils import (
@@ -99,11 +100,35 b' class TestApiForkRepo(object):'
         finally:
             fixture.destroy_repo(fork_name)
 
+    def test_api_fork_repo_non_admin_into_group_no_permission(self, backend, user_util):
+        source_name = backend['minimal'].repo_name
+        repo_group = user_util.create_repo_group()
+        repo_group_name = repo_group.group_name
+        fork_name = '%s/api-repo-fork' % repo_group_name
+
+        id_, params = build_data(
+            self.apikey_regular, 'fork_repo',
+            repoid=source_name,
+            fork_name=fork_name)
+        response = api_call(self.app, params)
+
+        expected = {
+            'repo_group': 'Repository group `{}` does not exist'.format(
+                repo_group_name)}
+        try:
+            assert_error(id_, expected, given=response.body)
+        finally:
+            fixture.destroy_repo(fork_name)
+
     def test_api_fork_repo_non_admin_into_group(self, backend, user_util):
         source_name = backend['minimal'].repo_name
         repo_group = user_util.create_repo_group()
         fork_name = '%s/api-repo-fork' % repo_group.group_name
 
+        RepoGroupModel().grant_user_permission(
+            repo_group, self.TEST_USER_LOGIN, 'group.admin')
+        Session().commit()
+
         id_, params = build_data(
             self.apikey_regular, 'fork_repo',
             repoid=source_name,
@@ -129,10 +154,11 b' class TestApiForkRepo(object):'
             fork_name=fork_name,
             owner=TEST_USER_ADMIN_LOGIN)
         response = api_call(self.app, params)
-        expected = 'Only RhodeCode admin can specify `owner` param'
+        expected = 'Only RhodeCode super-admin can specify `owner` param'
         assert_error(id_, expected, given=response.body)
 
-    def test_api_fork_repo_non_admin_no_permission_to_fork(self, backend):
+    def test_api_fork_repo_non_admin_no_permission_of_source_repo(
+            self, backend):
         source_name = backend['minimal'].repo_name
         RepoModel().grant_user_permission(repo=source_name,
                                           user=self.TEST_USER_LOGIN,
@@ -147,19 +173,44 b' class TestApiForkRepo(object):'
         assert_error(id_, expected, given=response.body)
 
     def test_api_fork_repo_non_admin_no_permission_to_fork_to_root_level(
-            self, backend):
+            self, backend, user_util):
+
+        regular_user = user_util.create_user()
+        regular_user_api_key = regular_user.api_key
+        usr = UserModel().get_by_username(regular_user.username)
+        usr.inherit_default_permissions = False
+        Session().add(usr)
+        UserModel().grant_perm(regular_user.username, 'hg.fork.repository')
+
         source_name = backend['minimal'].repo_name
+        fork_name = backend.new_repo_name()
+        id_, params = build_data(
+            regular_user_api_key, 'fork_repo',
+            repoid=source_name,
+            fork_name=fork_name)
+        response = api_call(self.app, params)
+        expected = {
+            "repo_name": "You do not have the permission to "
+                         "store repositories in the root location."}
+        assert_error(id_, expected, given=response.body)
 
-        usr = UserModel().get_by_username(self.TEST_USER_LOGIN)
+    def test_api_fork_repo_non_admin_no_permission_to_fork(
+            self, backend, user_util):
+
+        regular_user = user_util.create_user()
+        regular_user_api_key = regular_user.api_key
+        usr = UserModel().get_by_username(regular_user.username)
         usr.inherit_default_permissions = False
         Session().add(usr)
 
+        source_name = backend['minimal'].repo_name
         fork_name = backend.new_repo_name()
         id_, params = build_data(
-            self.apikey_regular, 'fork_repo',
+            regular_user_api_key, 'fork_repo',
             repoid=source_name,
             fork_name=fork_name)
         response = api_call(self.app, params)
+
         expected = "Access was denied to this resource."
         assert_error(id_, expected, given=response.body)
 
@@ -189,7 +240,9 b' class TestApiForkRepo(object):'
         response = api_call(self.app, params)
 
         try:
-            expected = "fork `%s` already exist" % (fork_name,)
+            expected = {
+                'unique_repo_name': 'Repository with name `{}` already exists'.format(
+                    fork_name)}
             assert_error(id_, expected, given=response.body)
         finally:
             fixture.destroy_repo(fork_repo.repo_name)
@@ -205,7 +258,9 b' class TestApiForkRepo(object):'
             owner=TEST_USER_ADMIN_LOGIN)
         response = api_call(self.app, params)
 
-        expected = "repo `%s` already exist" % (fork_name,)
+        expected = {
+            'unique_repo_name': 'Repository with name `{}` already exists'.format(
+                fork_name)}
         assert_error(id_, expected, given=response.body)
 
     @mock.patch.object(RepoModel, 'create_fork', crash)
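A recurring change across these hunks is that validation errors move from flat strings (e.g. ``repo `x` already exist``) to dicts keyed by the failing field (`unique_repo_name`, `repo_group`, `repo_name`). A client consuming the JSON-RPC error payload can branch on the value's type; a minimal sketch, with a hypothetical helper name and response shapes:

```python
def first_validation_error(error):
    """Normalize a JSON-RPC `error` payload to a (field, message) pair.

    Hypothetical helper: the old API format was a plain string, the new
    format is a dict keyed by the field that failed validation. For dicts
    we return the first field in sorted order.
    """
    if isinstance(error, dict):
        field, message = next(iter(sorted(error.items())))
        return field, message
    # old-style flat message: no field information available
    return None, error
```

Usage: `first_validation_error({'unique_repo_name': '...'})` lets the caller distinguish a name collision from, say, a missing parent group, which the old string format forced clients to do by text matching.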
@@ -34,6 +34,7 b' pytestmark = pytest.mark.backends("git",'
 class TestGetPullRequest(object):
 
     def test_api_get_pull_request(self, pr_util):
+        from rhodecode.model.pull_request import PullRequestModel
         pull_request = pr_util.create_pull_request(mergeable=True)
         id_, params = build_data(
             self.apikey, 'get_pull_request',
@@ -57,6 +58,8 b' class TestGetPullRequest(object):'
         target_url = unicode(
             pull_request.target_repo.clone_url()
             .with_netloc('test.example.com:80'))
+        shadow_url = unicode(
+            PullRequestModel().get_shadow_clone_url(pull_request))
         expected = {
             'pull_request_id': pull_request.pull_request_id,
             'url': pr_url,
@@ -89,15 +92,24 b' class TestGetPullRequest(object):'
                     'commit_id': pull_request.target_ref_parts.commit_id,
                 },
             },
+            'merge': {
+                'clone_url': shadow_url,
+                'reference': {
+                    'name': pull_request.shadow_merge_ref.name,
+                    'type': pull_request.shadow_merge_ref.type,
+                    'commit_id': pull_request.shadow_merge_ref.commit_id,
+                },
+            },
             'author': pull_request.author.get_api_data(include_secrets=False,
                                                        details='basic'),
             'reviewers': [
                 {
                     'user': reviewer.get_api_data(include_secrets=False,
                                                   details='basic'),
+                    'reasons': reasons,
                     'review_status': st[0][1].status if st else 'not_reviewed',
                 }
-                for reviewer, st in pull_request.reviewers_statuses()
+                for reviewer, reasons, st in pull_request.reviewers_statuses()
             ]
         }
         assert_ok(id_, expected, response.body)
@@ -45,8 +45,13 b' class TestGetServerInfo(object):' | |||||
45 | expected['uptime'] = resp['result']['uptime'] |
|
45 | expected['uptime'] = resp['result']['uptime'] | |
46 | expected['load'] = resp['result']['load'] |
|
46 | expected['load'] = resp['result']['load'] | |
47 | expected['cpu'] = resp['result']['cpu'] |
|
47 | expected['cpu'] = resp['result']['cpu'] | |
48 |
expected[' |
|
48 | expected['storage'] = resp['result']['storage'] | |
49 | expected['server_ip'] = '127.0.0.1:80' |
|
49 | expected['storage_temp'] = resp['result']['storage_temp'] | |
|
50 | expected['storage_inodes'] = resp['result']['storage_inodes'] | |||
|
51 | expected['server'] = resp['result']['server'] | |||
|
52 | ||||
|
53 | expected['index_storage'] = resp['result']['index_storage'] | |||
|
54 | expected['storage'] = resp['result']['storage'] | |||
50 |
|
55 | |||
51 | assert_ok(id_, expected, given=response.body) |
|
56 | assert_ok(id_, expected, given=response.body) | |
52 |
|
57 | |||
@@ -59,7 +64,21 b' class TestGetServerInfo(object):' | |||||
59 | expected['uptime'] = resp['result']['uptime'] |
|
64 | expected['uptime'] = resp['result']['uptime'] | |
60 | expected['load'] = resp['result']['load'] |
|
65 | expected['load'] = resp['result']['load'] | |
61 | expected['cpu'] = resp['result']['cpu'] |
|
66 | expected['cpu'] = resp['result']['cpu'] | |
62 | expected[' |
|
67 | expected['storage'] = resp['result']['storage'] | |
63 | expected['server_ip'] = '127.0.0.1:80' |
|
68 | expected['storage_temp'] = resp['result']['storage_temp'] | |
|
69 | expected['storage_inodes'] = resp['result']['storage_inodes'] | |||
|
70 | expected['server'] = resp['result']['server'] | |||
|
71 | ||||
|
72 | expected['index_storage'] = resp['result']['index_storage'] | |||
|
73 | expected['storage'] = resp['result']['storage'] | |||
64 |
|
74 | |||
65 | assert_ok(id_, expected, given=response.body) |
|
75 | assert_ok(id_, expected, given=response.body) | |
|
76 | ||||
|
77 | def test_api_get_server_info_data_for_search_index_build(self): | |||
|
78 | id_, params = build_data(self.apikey, 'get_server_info') | |||
|
79 | response = api_call(self.app, params) | |||
|
80 | resp = response.json | |||
|
81 | ||||
|
82 | # required by indexer | |||
|
83 | assert resp['result']['index_storage'] | |||
|
84 | assert resp['result']['storage'] |
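These hunks replace the single disk field in the `get_server_info` result with finer-grained keys (`storage`, `storage_temp`, `storage_inodes`, `index_storage`, `server`), and the new test asserts the two keys the indexer requires. A small sketch of a presence check over that result shape (the helper name is illustrative, not part of the API):

```python
# Keys the updated get_server_info tests read from the result dict
REQUIRED_KEYS = {
    'uptime', 'load', 'cpu', 'storage', 'storage_temp',
    'storage_inodes', 'index_storage', 'server',
}

def missing_server_info_keys(result):
    # Return the expected keys absent from an API result dict, sorted
    return sorted(REQUIRED_KEYS - set(result))

complete = {key: 'stub' for key in REQUIRED_KEYS}
partial = {'uptime': 1, 'load': 0.5}
```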
@@ -32,42 +32,37 b' from rhodecode.api.tests.utils import (' | |||||
32 | class TestMergePullRequest(object): |
|
32 | class TestMergePullRequest(object): | |
33 | @pytest.mark.backends("git", "hg") |
|
33 | @pytest.mark.backends("git", "hg") | |
34 | def test_api_merge_pull_request(self, pr_util, no_notifications): |
|
34 | def test_api_merge_pull_request(self, pr_util, no_notifications): | |
35 | pull_request = pr_util.create_pull_request() |
|
35 | pull_request = pr_util.create_pull_request(mergeable=True) | |
36 | pull_request_2 = PullRequestModel().create( |
|
|||
37 | created_by=pull_request.author, |
|
|||
38 | source_repo=pull_request.source_repo, |
|
|||
39 | source_ref=pull_request.source_ref, |
|
|||
40 | target_repo=pull_request.target_repo, |
|
|||
41 | target_ref=pull_request.target_ref, |
|
|||
42 | revisions=pull_request.revisions, |
|
|||
43 | reviewers=(), |
|
|||
44 | title=pull_request.title, |
|
|||
45 | description=pull_request.description, |
|
|||
46 | ) |
|
|||
47 | author = pull_request.user_id |
|
36 | author = pull_request.user_id | |
48 | repo = pull_request |
|
37 | repo = pull_request.target_repo.repo_id | |
49 | pull_request_ |
|
38 | pull_request_id = pull_request.pull_request_id | |
50 | pull_request_ |
|
39 | pull_request_repo = pull_request.target_repo.repo_name | |
51 | Session().commit() |
|
|||
52 |
|
40 | |||
53 | id_, params = build_data( |
|
41 | id_, params = build_data( | |
54 | self.apikey, 'merge_pull_request', |
|
42 | self.apikey, 'merge_pull_request', | |
55 | repoid=pull_request_ |
|
43 | repoid=pull_request_repo, | |
56 | pullrequestid=pull_request_ |
|
44 | pullrequestid=pull_request_id) | |
|
45 | ||||
57 | response = api_call(self.app, params) |
|
46 | response = api_call(self.app, params) | |
58 |
|
47 | |||
|
48 | # The above api call detaches the pull request DB object from the | |||
|
49 | # session because of an unconditional transaction rollback in our | |||
|
50 | # middleware. Therefore we need to add it back here if we want to use | |||
|
51 | # it. | |||
|
52 | Session().add(pull_request) | |||
|
53 | ||||
59 | expected = { |
|
54 | expected = { | |
60 | 'executed': True, |
|
55 | 'executed': True, | |
61 | 'failure_reason': 0, |
|
56 | 'failure_reason': 0, | |
62 | 'possible': True |
|
57 | 'possible': True, | |
|
58 | 'merge_commit_id': pull_request.shadow_merge_ref.commit_id, | |||
|
59 | 'merge_ref': pull_request.shadow_merge_ref._asdict() | |||
63 | } |
|
60 | } | |
64 |
|
61 | |||
65 | response_json = response.json['result'] |
|
62 | response_json = response.json['result'] | |
66 | assert response_json['merge_commit_id'] |
|
|||
67 | response_json.pop('merge_commit_id') |
|
|||
68 | assert response_json == expected |
|
63 | assert response_json == expected | |
69 |
|
64 | |||
70 | action = 'user_merged_pull_request:%d' % (pull_request_ |
|
65 | action = 'user_merged_pull_request:%d' % (pull_request_id, ) | |
71 | journal = UserLog.query()\ |
|
66 | journal = UserLog.query()\ | |
72 | .filter(UserLog.user_id == author)\ |
|
67 | .filter(UserLog.user_id == author)\ | |
73 | .filter(UserLog.repository_id == repo)\ |
|
68 | .filter(UserLog.repository_id == repo)\ | |
@@ -77,11 +72,11 b' class TestMergePullRequest(object):' | |||||
77 |
|
72 | |||
78 | id_, params = build_data( |
|
73 | id_, params = build_data( | |
79 | self.apikey, 'merge_pull_request', |
|
74 | self.apikey, 'merge_pull_request', | |
80 | repoid=pull_request_ |
|
75 | repoid=pull_request_repo, pullrequestid=pull_request_id) | |
81 | response = api_call(self.app, params) |
|
76 | response = api_call(self.app, params) | |
82 |
|
77 | |||
83 | expected = 'pull request `%s` merge failed, pull request is closed' % ( |
|
78 | expected = 'pull request `%s` merge failed, pull request is closed' % ( | |
84 | pull_request_ |
|
79 | pull_request_id) | |
85 | assert_error(id_, expected, given=response.body) |
|
80 | assert_error(id_, expected, given=response.body) | |
86 |
|
81 | |||
87 | @pytest.mark.backends("git", "hg") |
|
82 | @pytest.mark.backends("git", "hg") |
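The updated merge test compares the full result, including both `merge_commit_id` and the `merge_ref` mapping, in one assertion. As the API hunk further down shows, the commit id is no longer a direct field of the merge response but is re-extracted from the merge reference for backwards compatibility; a sketch of that conversion with illustrative namedtuples:

```python
from collections import namedtuple

# Illustrative shapes; the real MergeResponse/Reference classes differ
MergeRef = namedtuple('MergeRef', ['name', 'type', 'commit_id'])
MergeResponse = namedtuple(
    'MergeResponse', ['executed', 'failure_reason', 'possible', 'merge_ref'])

def to_api_dict(merge_response):
    # _asdict() leaves nested namedtuples intact, so the old top-level
    # merge_commit_id can be restored from merge_ref for older clients
    data = dict(merge_response._asdict())
    data['merge_commit_id'] = data['merge_ref'].commit_id
    return data

result = to_api_dict(
    MergeResponse(True, 0, True, MergeRef('pr-merge', 'branch', 'abc123')))
```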
@@ -32,35 +32,60 b' fixture = Fixture()' | |||||
32 |
|
32 | |||
33 | UPDATE_REPO_NAME = 'api_update_me' |
|
33 | UPDATE_REPO_NAME = 'api_update_me' | |
34 |
|
34 | |||
35 | class SAME_AS_UPDATES(object): """ Constant used for tests below """ |
|
35 | ||
|
36 | class SAME_AS_UPDATES(object): | |||
|
37 | """ Constant used for tests below """ | |||
|
38 | ||||
36 |
|
39 | |||
37 | @pytest.mark.usefixtures("testuser_api", "app") |
|
40 | @pytest.mark.usefixtures("testuser_api", "app") | |
38 | class TestApiUpdateRepo(object): |
|
41 | class TestApiUpdateRepo(object): | |
39 |
|
42 | |||
40 | @pytest.mark.parametrize("updates, expected", [ |
|
43 | @pytest.mark.parametrize("updates, expected", [ | |
41 | ({'owner': TEST_USER_REGULAR_LOGIN}, |
|
44 | ({'owner': TEST_USER_REGULAR_LOGIN}, | |
42 | ({'description': 'new description'}, SAME_AS_UPDATES), |
|
45 | SAME_AS_UPDATES), | |
43 | ({'clone_uri': 'http://foo.com/repo'}, SAME_AS_UPDATES), |
|
46 | ||
44 | ({'clone_uri': None}, {'clone_uri': ''}), |
|
47 | ({'description': 'new description'}, | |
45 | ({'clone_uri': ''}, {'clone_uri': ''}), |
|
48 | SAME_AS_UPDATES), | |
46 | ({'landing_rev': 'branch:master'}, {'landing_rev': ['branch','master']}), |
|
49 | ||
47 | ({'enable_statistics': True}, SAME_AS_UPDATES), |
|
50 | ({'clone_uri': 'http://foo.com/repo'}, | |
48 |
|
|
51 | SAME_AS_UPDATES), | |
49 | ({'enable_downloads': True}, SAME_AS_UPDATES), |
|
52 | ||
50 | ({'n |
|
53 | ({'clone_uri': None}, | |
|
54 | {'clone_uri': ''}), | |||
|
55 | ||||
|
56 | ({'clone_uri': ''}, | |||
|
57 | {'clone_uri': ''}), | |||
|
58 | ||||
|
59 | ({'landing_rev': 'rev:tip'}, | |||
|
60 | {'landing_rev': ['rev', 'tip']}), | |||
|
61 | ||||
|
62 | ({'enable_statistics': True}, | |||
|
63 | SAME_AS_UPDATES), | |||
|
64 | ||||
|
65 | ({'enable_locking': True}, | |||
|
66 | SAME_AS_UPDATES), | |||
|
67 | ||||
|
68 | ({'enable_downloads': True}, | |||
|
69 | SAME_AS_UPDATES), | |||
|
70 | ||||
|
71 | ({'repo_name': 'new_repo_name'}, | |||
|
72 | { | |||
51 | 'repo_name': 'new_repo_name', |
|
73 | 'repo_name': 'new_repo_name', | |
52 | 'url': 'http://test.example.com:80/new_repo_name' |
|
74 | 'url': 'http://test.example.com:80/new_repo_name' | |
53 | }), |
|
75 | }), | |
54 | ({'group': 'test_group_for_update'}, { |
|
76 | ||
55 |
|
|
77 | ({'repo_name': 'test_group_for_update/{}'.format(UPDATE_REPO_NAME), | |
56 | 'url': 'http://test.example.com:80/test_group_for_update/%s' % UPDATE_REPO_NAME |
|
78 | '_group': 'test_group_for_update'}, | |
57 |
|
|
79 | { | |
|
80 | 'repo_name': 'test_group_for_update/{}'.format(UPDATE_REPO_NAME), | |||
|
81 | 'url': 'http://test.example.com:80/test_group_for_update/{}'.format(UPDATE_REPO_NAME) | |||
|
82 | }), | |||
58 | ]) |
|
83 | ]) | |
59 | def test_api_update_repo(self, updates, expected, backend): |
|
84 | def test_api_update_repo(self, updates, expected, backend): | |
60 | repo_name = UPDATE_REPO_NAME |
|
85 | repo_name = UPDATE_REPO_NAME | |
61 | repo = fixture.create_repo(repo_name, repo_type=backend.alias) |
|
86 | repo = fixture.create_repo(repo_name, repo_type=backend.alias) | |
62 | if updates.get('group'): |
|
87 | if updates.get('_group'): | |
63 | fixture.create_repo_group(updates['group']) |
|
88 | fixture.create_repo_group(updates['_group']) | |
64 |
|
89 | |||
65 | expected_api_data = repo.get_api_data(include_secrets=True) |
|
90 | expected_api_data = repo.get_api_data(include_secrets=True) | |
66 | if expected is SAME_AS_UPDATES: |
|
91 | if expected is SAME_AS_UPDATES: | |
@@ -68,15 +93,12 b' class TestApiUpdateRepo(object):' | |||||
68 | else: |
|
93 | else: | |
69 | expected_api_data.update(expected) |
|
94 | expected_api_data.update(expected) | |
70 |
|
95 | |||
71 |
|
||||
72 | id_, params = build_data( |
|
96 | id_, params = build_data( | |
73 | self.apikey, 'update_repo', repoid=repo_name, **updates) |
|
97 | self.apikey, 'update_repo', repoid=repo_name, **updates) | |
74 | response = api_call(self.app, params) |
|
98 | response = api_call(self.app, params) | |
75 |
|
99 | |||
76 | if updates.get('name'): |
|
100 | if updates.get('repo_name'): | |
77 | repo_name = updates['name'] |
|
101 | repo_name = updates['repo_name'] | |
78 | if updates.get('group'): |
|
|||
79 | repo_name = '/'.join([updates['group'], repo_name]) |
|
|||
80 |
|
102 | |||
81 | try: |
|
103 | try: | |
82 | expected = { |
|
104 | expected = { | |
@@ -86,8 +108,8 b' class TestApiUpdateRepo(object):' | |||||
86 | assert_ok(id_, expected, given=response.body) |
|
108 | assert_ok(id_, expected, given=response.body) | |
87 | finally: |
|
109 | finally: | |
88 | fixture.destroy_repo(repo_name) |
|
110 | fixture.destroy_repo(repo_name) | |
89 | if updates.get('group'): |
|
111 | if updates.get('_group'): | |
90 | fixture.destroy_repo_group(updates['group']) |
|
112 | fixture.destroy_repo_group(updates['_group']) | |
91 |
|
113 | |||
92 | def test_api_update_repo_fork_of_field(self, backend): |
|
114 | def test_api_update_repo_fork_of_field(self, backend): | |
93 | master_repo = backend.create_repo() |
|
115 | master_repo = backend.create_repo() | |
@@ -118,19 +140,23 b' class TestApiUpdateRepo(object):' | |||||
118 | id_, params = build_data( |
|
140 | id_, params = build_data( | |
119 | self.apikey, 'update_repo', repoid=repo.repo_name, **updates) |
|
141 | self.apikey, 'update_repo', repoid=repo.repo_name, **updates) | |
120 | response = api_call(self.app, params) |
|
142 | response = api_call(self.app, params) | |
121 | expected = 'repository `{}` does not exist'.format(master_repo_name) |
|
143 | expected = { | |
|
144 | 'repo_fork_of': 'Fork with id `{}` does not exists'.format( | |||
|
145 | master_repo_name)} | |||
122 | assert_error(id_, expected, given=response.body) |
|
146 | assert_error(id_, expected, given=response.body) | |
123 |
|
147 | |||
124 | def test_api_update_repo_with_repo_group_not_existing(self): |
|
148 | def test_api_update_repo_with_repo_group_not_existing(self): | |
125 | repo_name = 'admin_owned' |
|
149 | repo_name = 'admin_owned' | |
|
150 | fake_repo_group = 'test_group_for_update' | |||
126 | fixture.create_repo(repo_name) |
|
151 | fixture.create_repo(repo_name) | |
127 | updates = {'group': 'test_group_for_update'} |
|
152 | updates = {'repo_name': '{}/{}'.format(fake_repo_group, repo_name)} | |
128 | id_, params = build_data( |
|
153 | id_, params = build_data( | |
129 | self.apikey, 'update_repo', repoid=repo_name, **updates) |
|
154 | self.apikey, 'update_repo', repoid=repo_name, **updates) | |
130 | response = api_call(self.app, params) |
|
155 | response = api_call(self.app, params) | |
131 | try: |
|
156 | try: | |
132 | expected = 'repository group `%s` does not exist' % ( |
|
157 | expected = { | |
133 | updates['group'],) |
|
158 | 'repo_group': 'Repository group `{}` does not exist'.format(fake_repo_group) | |
|
159 | } | |||
134 | assert_error(id_, expected, given=response.body) |
|
160 | assert_error(id_, expected, given=response.body) | |
135 | finally: |
|
161 | finally: | |
136 | fixture.destroy_repo(repo_name) |
|
162 | fixture.destroy_repo(repo_name) |
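The parametrized cases above drop the separate `group` update key: moving a repository into a group is now expressed through a `repo_name` that carries the full path, with `_group` kept only as test bookkeeping for fixture setup and teardown. Under that convention the path construction reduces to (hypothetical helper):

```python
def target_repo_name(repo_name, group=None):
    # Full-path addressing used by the updated update_repo API: the
    # group prefix is part of repo_name rather than a separate field
    return '{}/{}'.format(group, repo_name) if group else repo_name

moved = target_repo_name('api_update_me', 'test_group_for_update')
unmoved = target_repo_name('api_update_me')
```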
@@ -30,22 +30,30 b' from rhodecode.api.tests.utils import (' | |||||
30 |
|
30 | |||
31 | @pytest.mark.usefixtures("testuser_api", "app") |
|
31 | @pytest.mark.usefixtures("testuser_api", "app") | |
32 | class TestApiUpdateRepoGroup(object): |
|
32 | class TestApiUpdateRepoGroup(object): | |
|
33 | ||||
33 | def test_update_group_name(self, user_util): |
|
34 | def test_update_group_name(self, user_util): | |
34 | new_group_name = 'new-group' |
|
35 | new_group_name = 'new-group' | |
35 | initial_name = self._update(user_util, group_name=new_group_name) |
|
36 | initial_name = self._update(user_util, group_name=new_group_name) | |
36 | assert RepoGroupModel()._get_repo_group(initial_name) is None |
|
37 | assert RepoGroupModel()._get_repo_group(initial_name) is None | |
37 |
|
|
38 | new_group = RepoGroupModel()._get_repo_group(new_group_name) | |
|
39 | assert new_group is not None | |||
|
40 | assert new_group.full_path == new_group_name | |||
38 |
|
41 | |||
39 | def test_update_parent(self, user_util): |
|
42 | def test_update_group_name_change_parent(self, user_util): | |
|
43 | ||||
40 | parent_group = user_util.create_repo_group() |
|
44 | parent_group = user_util.create_repo_group() | |
41 | initial_name = self._update(user_util, parent=parent_group.name) |
|
45 | parent_group_name = parent_group.name | |
42 |
|
46 | |||
43 | expected_group_name = '{}/{}'.format(parent_group |
|
47 | expected_group_name = '{}/{}'.format(parent_group_name, 'new-group') | |
|
48 | initial_name = self._update(user_util, group_name=expected_group_name) | |||
|
49 | ||||
44 | repo_group = RepoGroupModel()._get_repo_group(expected_group_name) |
|
50 | repo_group = RepoGroupModel()._get_repo_group(expected_group_name) | |
|
51 | ||||
45 | assert repo_group is not None |
|
52 | assert repo_group is not None | |
46 | assert repo_group.group_name == expected_group_name |
|
53 | assert repo_group.group_name == expected_group_name | |
47 | assert repo_group. |
|
54 | assert repo_group.full_path == expected_group_name | |
48 | assert RepoGroupModel()._get_repo_group(initial_name) is None |
|
55 | assert RepoGroupModel()._get_repo_group(initial_name) is None | |
|
56 | ||||
49 | new_path = os.path.join( |
|
57 | new_path = os.path.join( | |
50 | RepoGroupModel().repos_path, *repo_group.full_path_splitted) |
|
58 | RepoGroupModel().repos_path, *repo_group.full_path_splitted) | |
51 | assert os.path.exists(new_path) |
|
59 | assert os.path.exists(new_path) | |
@@ -67,15 +75,47 b' class TestApiUpdateRepoGroup(object):' | |||||
67 | repo_group = RepoGroupModel()._get_repo_group(initial_name) |
|
75 | repo_group = RepoGroupModel()._get_repo_group(initial_name) | |
68 | assert repo_group.user.username == owner |
|
76 | assert repo_group.user.username == owner | |
69 |
|
77 | |||
70 | def test_api_update_repo_group_by_regular_user_no_permission( |
|
78 | def test_update_group_name_conflict_with_existing(self, user_util): | |
71 | self, backend): |
|
79 | group_1 = user_util.create_repo_group() | |
72 |
|
|
80 | group_2 = user_util.create_repo_group() | |
73 | repo_name = repo.repo_name |
|
81 | repo_group_name_1 = group_1.group_name | |
|
82 | repo_group_name_2 = group_2.group_name | |||
74 |
|
83 | |||
75 | id_, params = build_data( |
|
84 | id_, params = build_data( | |
76 | self.apikey |
|
85 | self.apikey, 'update_repo_group', repogroupid=repo_group_name_1, | |
|
86 | group_name=repo_group_name_2) | |||
|
87 | response = api_call(self.app, params) | |||
|
88 | expected = { | |||
|
89 | 'unique_repo_group_name': | |||
|
90 | 'Repository group with name `{}` already exists'.format( | |||
|
91 | repo_group_name_2)} | |||
|
92 | assert_error(id_, expected, given=response.body) | |||
|
93 | ||||
|
94 | def test_api_update_repo_group_by_regular_user_no_permission(self, user_util): | |||
|
95 | temp_user = user_util.create_user() | |||
|
96 | temp_user_api_key = temp_user.api_key | |||
|
97 | parent_group = user_util.create_repo_group() | |||
|
98 | repo_group_name = parent_group.group_name | |||
|
99 | id_, params = build_data( | |||
|
100 | temp_user_api_key, 'update_repo_group', repogroupid=repo_group_name) | |||
77 | response = api_call(self.app, params) |
|
101 | response = api_call(self.app, params) | |
78 | expected = 'repository group `%s` does not exist' % (repo_name,) |
|
102 | expected = 'repository group `%s` does not exist' % (repo_group_name,) | |
|
103 | assert_error(id_, expected, given=response.body) | |||
|
104 | ||||
|
105 | def test_api_update_repo_group_regular_user_no_root_write_permissions( | |||
|
106 | self, user_util): | |||
|
107 | temp_user = user_util.create_user() | |||
|
108 | temp_user_api_key = temp_user.api_key | |||
|
109 | parent_group = user_util.create_repo_group(owner=temp_user.username) | |||
|
110 | repo_group_name = parent_group.group_name | |||
|
111 | ||||
|
112 | id_, params = build_data( | |||
|
113 | temp_user_api_key, 'update_repo_group', repogroupid=repo_group_name, | |||
|
114 | group_name='at-root-level') | |||
|
115 | response = api_call(self.app, params) | |||
|
116 | expected = { | |||
|
117 | 'repo_group': 'You do not have the permission to store ' | |||
|
118 | 'repository groups in the root location.'} | |||
79 | assert_error(id_, expected, given=response.body) |
|
119 | assert_error(id_, expected, given=response.body) | |
80 |
|
120 | |||
81 | def _update(self, user_util, **kwargs): |
|
121 | def _update(self, user_util, **kwargs): | |
@@ -89,7 +129,10 b' class TestApiUpdateRepoGroup(object):' | |||||
89 | self.apikey, 'update_repo_group', repogroupid=initial_name, |
|
129 | self.apikey, 'update_repo_group', repogroupid=initial_name, | |
90 | **kwargs) |
|
130 | **kwargs) | |
91 | response = api_call(self.app, params) |
|
131 | response = api_call(self.app, params) | |
92 | ret = { |
|
132 | ||
|
133 | repo_group = RepoGroupModel.cls.get(repo_group.group_id) | |||
|
134 | ||||
|
135 | expected = { | |||
93 | 'msg': 'updated repository group ID:{} {}'.format( |
|
136 | 'msg': 'updated repository group ID:{} {}'.format( | |
94 | repo_group.group_id, repo_group.group_name), |
|
137 | repo_group.group_id, repo_group.group_name), | |
95 | 'repo_group': { |
|
138 | 'repo_group': { | |
@@ -103,5 +146,5 b' class TestApiUpdateRepoGroup(object):' | |||||
103 | if repo_group.parent_group else None) |
|
146 | if repo_group.parent_group else None) | |
104 | } |
|
147 | } | |
105 | } |
|
148 | } | |
106 | assert_ok(id_, |
|
149 | assert_ok(id_, expected, given=response.body) | |
107 | return initial_name |
|
150 | return initial_name |
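The new negative tests expect validation failures as field-keyed dicts rather than bare strings (e.g. `unique_repo_group_name`, `repo_group`). A sketch of that error shape, with the message text taken from the hunks above and the helper name illustrative:

```python
def unique_group_name_error(group_name):
    # Field-keyed validation error returned when renaming a repo group
    # onto a name that already exists
    return {
        'unique_repo_group_name':
            'Repository group with name `{}` already exists'.format(
                group_name),
    }

error = unique_group_name_error('docs')
```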
@@ -249,7 +249,7 b' class TestRepoAccess(object):' | |||||
249 | fake_repo = Mock() |
|
249 | fake_repo = Mock() | |
250 | with self.repo_perm_patch as rmock: |
|
250 | with self.repo_perm_patch as rmock: | |
251 | rmock.return_value = repo_mock |
|
251 | rmock.return_value = repo_mock | |
252 | assert utils. |
|
252 | assert utils.validate_repo_permissions( | |
253 | 'fake_user', 'fake_repo_id', fake_repo, |
|
253 | 'fake_user', 'fake_repo_id', fake_repo, | |
254 | ['perm1', 'perm2']) |
|
254 | ['perm1', 'perm2']) | |
255 | rmock.assert_called_once_with(*['perm1', 'perm2']) |
|
255 | rmock.assert_called_once_with(*['perm1', 'perm2']) | |
@@ -263,6 +263,6 b' class TestRepoAccess(object):' | |||||
263 | with self.repo_perm_patch as rmock: |
|
263 | with self.repo_perm_patch as rmock: | |
264 | rmock.return_value = repo_mock |
|
264 | rmock.return_value = repo_mock | |
265 | with pytest.raises(JSONRPCError) as excinfo: |
|
265 | with pytest.raises(JSONRPCError) as excinfo: | |
266 | utils. |
|
266 | utils.validate_repo_permissions( | |
267 | 'fake_user', 'fake_repo_id', fake_repo, 'perms') |
|
267 | 'fake_user', 'fake_repo_id', fake_repo, 'perms') | |
268 | assert 'fake_repo_id' in excinfo |
|
268 | assert 'fake_repo_id' in excinfo |
@@ -26,7 +26,8 b' import collections' | |||||
26 | import logging |
|
26 | import logging | |
27 |
|
27 | |||
28 | from rhodecode.api.exc import JSONRPCError |
|
28 | from rhodecode.api.exc import JSONRPCError | |
29 | from rhodecode.lib.auth import HasPermissionAnyApi, HasRepoPermissionAnyApi |
|
29 | from rhodecode.lib.auth import HasPermissionAnyApi, HasRepoPermissionAnyApi, \ | |
|
30 | HasRepoGroupPermissionAnyApi | |||
30 | from rhodecode.lib.utils import safe_unicode |
|
31 | from rhodecode.lib.utils import safe_unicode | |
31 | from rhodecode.controllers.utils import get_commit_from_ref_name |
|
32 | from rhodecode.controllers.utils import get_commit_from_ref_name | |
32 | from rhodecode.lib.vcs.exceptions import RepositoryError |
|
33 | from rhodecode.lib.vcs.exceptions import RepositoryError | |
@@ -153,7 +154,7 b' def has_superadmin_permission(apiuser):' | |||||
153 | return False |
|
154 | return False | |
154 |
|
155 | |||
155 |
|
156 | |||
156 | def |
|
157 | def validate_repo_permissions(apiuser, repoid, repo, perms): | |
157 | """ |
|
158 | """ | |
158 | Raise JsonRPCError if apiuser is not authorized or return True |
|
159 | Raise JsonRPCError if apiuser is not authorized or return True | |
159 |
|
160 | |||
@@ -170,6 +171,36 b' def has_repo_permissions(apiuser, repoid' | |||||
170 | return True |
|
171 | return True | |
171 |
|
172 | |||
172 |
|
173 | |||
|
174 | def validate_repo_group_permissions(apiuser, repogroupid, repo_group, perms): | |||
|
175 | """ | |||
|
176 | Raise JsonRPCError if apiuser is not authorized or return True | |||
|
177 | ||||
|
178 | :param apiuser: | |||
|
179 | :param repogroupid: just the id of repository group | |||
|
180 | :param repo_group: instance of repo_group | |||
|
181 | :param perms: | |||
|
182 | """ | |||
|
183 | if not HasRepoGroupPermissionAnyApi(*perms)( | |||
|
184 | user=apiuser, group_name=repo_group.group_name): | |||
|
185 | raise JSONRPCError( | |||
|
186 | 'repository group `%s` does not exist' % repogroupid) | |||
|
187 | ||||
|
188 | return True | |||
|
189 | ||||
|
190 | ||||
|
191 | def validate_set_owner_permissions(apiuser, owner): | |||
|
192 | if isinstance(owner, Optional): | |||
|
193 | owner = get_user_or_error(apiuser.user_id) | |||
|
194 | else: | |||
|
195 | if has_superadmin_permission(apiuser): | |||
|
196 | owner = get_user_or_error(owner) | |||
|
197 | else: | |||
|
198 | # forbid setting owner for non-admins | |||
|
199 | raise JSONRPCError( | |||
|
200 | 'Only RhodeCode super-admin can specify `owner` param') | |||
|
201 | return owner | |||
|
202 | ||||
|
203 | ||||
173 | def get_user_or_error(userid): |
|
204 | def get_user_or_error(userid): | |
174 | """ |
|
205 | """ | |
175 | Get user by id or name or return JsonRPCError if not found |
|
206 | Get user by id or name or return JsonRPCError if not found |
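The new `validate_set_owner_permissions` helper defaults an unset owner to the calling user and only lets super-admins pick an explicit owner. A self-contained sketch of that branch structure, with a stand-in `Optional` marker (the real one lives in the rhodecode API layer) and a plain exception in place of `JSONRPCError`:

```python
class Optional(object):
    # Minimal stand-in for the Optional default-value marker
    def __init__(self, default):
        self.default = default

def resolve_owner(apiuser_id, owner, is_superadmin):
    # Unset owner -> the calling API user; explicit owner -> allowed
    # for super-admins only
    if isinstance(owner, Optional):
        return apiuser_id
    if not is_superadmin:
        raise PermissionError(
            'Only RhodeCode super-admin can specify `owner` param')
    return owner
```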
@@ -25,7 +25,7 b' from rhodecode.api import jsonrpc_method' | |||||
25 | from rhodecode.api.utils import ( |
|
25 | from rhodecode.api.utils import ( | |
26 | has_superadmin_permission, Optional, OAttr, get_repo_or_error, |
|
26 | has_superadmin_permission, Optional, OAttr, get_repo_or_error, | |
27 | get_pull_request_or_error, get_commit_or_error, get_user_or_error, |
|
27 | get_pull_request_or_error, get_commit_or_error, get_user_or_error, | |
28 |
|
|
28 | validate_repo_permissions, resolve_ref_or_error) | |
29 | from rhodecode.lib.auth import (HasRepoPermissionAnyApi) |
|
29 | from rhodecode.lib.auth import (HasRepoPermissionAnyApi) | |
30 | from rhodecode.lib.base import vcs_operation_context |
|
30 | from rhodecode.lib.base import vcs_operation_context | |
31 | from rhodecode.lib.utils2 import str2bool |
|
31 | from rhodecode.lib.utils2 import str2bool | |
@@ -96,6 +96,15 b' def get_pull_request(request, apiuser, r' | |||||
96 | "commit_id": "<commit_id>", |
|
96 | "commit_id": "<commit_id>", | |
97 | } |
|
97 | } | |
98 | }, |
|
98 | }, | |
|
99 | "merge": { | |||
|
100 | "clone_url": "<clone_url>", | |||
|
101 | "reference": | |||
|
102 | { | |||
|
103 | "name": "<name>", | |||
|
104 | "type": "<type>", | |||
|
105 | "commit_id": "<commit_id>", | |||
|
106 | } | |||
|
107 | }, | |||
99 | "author": <user_obj>, |
|
108 | "author": <user_obj>, | |
100 | "reviewers": [ |
|
109 | "reviewers": [ | |
101 | ... |
|
110 | ... | |
@@ -178,6 +187,15 b' def get_pull_requests(request, apiuser, ' | |||||
178 | "commit_id": "<commit_id>", |
|
187 | "commit_id": "<commit_id>", | |
179 | } |
|
188 | } | |
180 | }, |
|
189 | }, | |
|
190 | "merge": { | |||
|
191 | "clone_url": "<clone_url>", | |||
|
192 | "reference": | |||
|
193 | { | |||
|
194 | "name": "<name>", | |||
|
195 | "type": "<type>", | |||
|
196 | "commit_id": "<commit_id>", | |||
|
197 | } | |||
|
198 | }, | |||
181 | "author": <user_obj>, |
|
199 | "author": <user_obj>, | |
182 | "reviewers": [ |
|
200 | "reviewers": [ | |
183 | ... |
|
201 | ... | |
@@ -197,7 +215,7 b' def get_pull_requests(request, apiuser, ' | |||||
197 | if not has_superadmin_permission(apiuser): |
|
215 | if not has_superadmin_permission(apiuser): | |
198 | _perms = ( |
|
216 | _perms = ( | |
199 | 'repository.admin', 'repository.write', 'repository.read',) |
|
217 | 'repository.admin', 'repository.write', 'repository.read',) | |
200 |
|
|
218 | validate_repo_permissions(apiuser, repoid, repo, _perms) | |
201 |
|
219 | |||
202 | status = Optional.extract(status) |
|
220 | status = Optional.extract(status) | |
203 | pull_requests = PullRequestModel().get_all(repo, statuses=[status]) |
|
221 | pull_requests = PullRequestModel().get_all(repo, statuses=[status]) | |
@@ -232,7 +250,12 b' def merge_pull_request(request, apiuser,' | |||||
232 | "executed": "<bool>", |
|
250 | "executed": "<bool>", | |
233 | "failure_reason": "<int>", |
|
251 | "failure_reason": "<int>", | |
234 | "merge_commit_id": "<merge_commit_id>", |
|
252 | "merge_commit_id": "<merge_commit_id>", | |
235 | "possible": "<bool>" |
|
253 | "possible": "<bool>", | |
|
254 | "merge_ref": { | |||
|
255 | "commit_id": "<commit_id>", | |||
|
256 | "type": "<type>", | |||
|
257 | "name": "<name>" | |||
|
258 | } | |||
236 | }, |
|
259 | }, | |
237 | "error": null |
|
260 | "error": null | |
238 |
|
261 | |||
@@ -260,13 +283,21 b' def merge_pull_request(request, apiuser,' | |||||
260 | request.environ, repo_name=target_repo.repo_name, |
|
283 | request.environ, repo_name=target_repo.repo_name, | |
261 | username=apiuser.username, action='push', |
|
284 | username=apiuser.username, action='push', | |
262 | scm=target_repo.repo_type) |
|
285 | scm=target_repo.repo_type) | |
263 | data = PullRequestModel().merge(pull_request, apiuser, extras=extras) |
|
286 | merge_response = PullRequestModel().merge( | |
264 | if data.executed: |
|
287 | pull_request, apiuser, extras=extras) | |
|
288 | if merge_response.executed: | |||
265 | PullRequestModel().close_pull_request( |
|
289 | PullRequestModel().close_pull_request( | |
266 | pull_request.pull_request_id, apiuser) |
|
290 | pull_request.pull_request_id, apiuser) | |
267 |
|
291 | |||
268 | Session().commit() |
|
292 | Session().commit() | |
269 | return data |
|
293 | ||
|
294 | # In previous versions the merge response directly contained the merge | |||
|
295 | # commit id. It is now contained in the merge reference object. To be | |||
|
296 | # backwards compatible we have to extract it again. | |||
|
297 | merge_response = merge_response._asdict() | |||
|
298 | merge_response['merge_commit_id'] = merge_response['merge_ref'].commit_id | |||
|
299 | ||||
|
300 | return merge_response | |||
270 |
|
301 | |||
271 |
|
302 | |||
272 | @jsonrpc_method() |
|
303 | @jsonrpc_method() | |
@@ -463,12 +494,17 b' def create_pull_request(' | |||||
463 | :type description: Optional(str) |
|
494 | :type description: Optional(str) | |
464 | :param reviewers: Set the new pull request reviewers list. |
|
495 | :param reviewers: Set the new pull request reviewers list. | |
465 | :type reviewers: Optional(list) |
|
496 | :type reviewers: Optional(list) | |
|
497 | Accepts username strings or objects of the format: | |||
|
498 | { | |||
|
499 | 'username': 'nick', 'reasons': ['original author'] | |||
|
500 | } | |||
466 | """ |
|
501 | """ | |
|
502 | ||||
467 | source = get_repo_or_error(source_repo) |
|
503 | source = get_repo_or_error(source_repo) | |
468 | target = get_repo_or_error(target_repo) |
|
504 | target = get_repo_or_error(target_repo) | |
469 | if not has_superadmin_permission(apiuser): |
|
505 | if not has_superadmin_permission(apiuser): | |
470 | _perms = ('repository.admin', 'repository.write', 'repository.read',) |
|
506 | _perms = ('repository.admin', 'repository.write', 'repository.read',) | |
471 |
|
|
507 | validate_repo_permissions(apiuser, source_repo, source, _perms) | |
472 |
|
508 | |||
473 | full_source_ref = resolve_ref_or_error(source_ref, source) |
|
509 | full_source_ref = resolve_ref_or_error(source_ref, source) | |
474 | full_target_ref = resolve_ref_or_error(target_ref, target) |
|
510 | full_target_ref = resolve_ref_or_error(target_ref, target) | |
@@ -490,12 +526,21 b' def create_pull_request(' | |||||
490 | if not ancestor: |
|
526 | if not ancestor: | |
491 | raise JSONRPCError('no common ancestor found') |
|
527 | raise JSONRPCError('no common ancestor found') | |
492 |
|
528 | |||
493 | reviewer_ |
|
529 | reviewer_objects = Optional.extract(reviewers) or [] | |
494 | if not isinstance(reviewer_ |
|
530 | if not isinstance(reviewer_objects, list): | |
495 | raise JSONRPCError('reviewers should be specified as a list') |
|
531 | raise JSONRPCError('reviewers should be specified as a list') | |
496 |
|
532 | |||
497 | reviewer_users = [get_user_or_error(n) for n in reviewer_names] |
|
533 | reviewers_reasons = [] | |
498 | reviewer_ids = [u.user_id for u in reviewer_users] |
|
534 | for reviewer_object in reviewer_objects: | |
|
535 | reviewer_reasons = [] | |||
|
536 | if isinstance(reviewer_object, (basestring, int)): | |||
|
537 | reviewer_username = reviewer_object | |||
|
538 | else: | |||
|
539 | reviewer_username = reviewer_object['username'] | |||
|
540 | reviewer_reasons = reviewer_object.get('reasons', []) | |||
|
541 | ||||
|
542 | user = get_user_or_error(reviewer_username) | |||
|
543 | reviewers_reasons.append((user.user_id, reviewer_reasons)) | |||
499 |
|
544 | |||
500 | pull_request_model = PullRequestModel() |
|
545 | pull_request_model = PullRequestModel() | |
501 | pull_request = pull_request_model.create( |
|
546 | pull_request = pull_request_model.create( | |
@@ -506,7 +551,7 b' def create_pull_request(' | |||||
506 | target_ref=full_target_ref, |
|
551 | target_ref=full_target_ref, | |
507 | revisions=reversed( |
|
552 | revisions=reversed( | |
508 | [commit.raw_id for commit in reversed(commit_ranges)]), |
|
553 | [commit.raw_id for commit in reversed(commit_ranges)]), | |
509 |
reviewers=reviewer |
|
554 | reviewers=reviewers_reasons, | |
510 | title=title, |
|
555 | title=title, | |
511 | description=Optional.extract(description) |
|
556 | description=Optional.extract(description) | |
512 | ) |
|
557 | ) | |
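With this change `create_pull_request` accepts reviewers either as plain usernames or as objects carrying review reasons, collected into `(user_id, reasons)` pairs. A sketch of that normalization step (the source targets Python 2 and checks `basestring`; this sketch uses `str`):

```python
def normalize_reviewers(reviewer_objects):
    # Each entry is a plain username (str/int) or a dict such as
    # {'username': 'nick', 'reasons': ['original author']}
    normalized = []
    for obj in reviewer_objects:
        if isinstance(obj, (str, int)):
            username, reasons = obj, []
        else:
            username, reasons = obj['username'], obj.get('reasons', [])
        normalized.append((username, reasons))
    return normalized

pairs = normalize_reviewers(
    ['nick', {'username': 'ann', 'reasons': ['original author']}])
```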
@@ -585,12 +630,23 b' def update_pull_request(' | |||||
585 | 'pull request `%s` update failed, pull request is closed' % ( |
|
630 | 'pull request `%s` update failed, pull request is closed' % ( | |
586 | pullrequestid,)) |
|
631 | pullrequestid,)) | |
587 |
|
632 | |||
588 | reviewer_names = Optional.extract(reviewers) or [] |
|
633 | reviewer_objects = Optional.extract(reviewers) or [] | |
589 | if not isinstance(reviewer_names, list): |
|
634 | if not isinstance(reviewer_objects, list): | |
590 | raise JSONRPCError('reviewers should be specified as a list') |
|
635 | raise JSONRPCError('reviewers should be specified as a list') | |
591 |
|
636 | |||
592 | reviewer_users = [get_user_or_error(n) for n in reviewer_names] |
|
637 | reviewers_reasons = [] | |
593 | reviewer_ids = [u.user_id for u in reviewer_users] |
|
638 | reviewer_ids = set() | |
|
639 | for reviewer_object in reviewer_objects: | |||
|
640 | reviewer_reasons = [] | |||
|
641 | if isinstance(reviewer_object, (int, basestring)): | |||
|
642 | reviewer_username = reviewer_object | |||
|
643 | else: | |||
|
644 | reviewer_username = reviewer_object['username'] | |||
|
645 | reviewer_reasons = reviewer_object.get('reasons', []) | |||
|
646 | ||||
|
647 | user = get_user_or_error(reviewer_username) | |||
|
648 | reviewer_ids.add(user.user_id) | |||
|
649 | reviewers_reasons.append((user.user_id, reviewer_reasons)) | |||
594 |
|
650 | |||
595 | title = Optional.extract(title) |
|
651 | title = Optional.extract(title) | |
596 | description = Optional.extract(description) |
|
652 | description = Optional.extract(description) | |
@@ -603,15 +659,15 b' def update_pull_request(' | |||||
603 | commit_changes = {"added": [], "common": [], "removed": []} |
|
659 | commit_changes = {"added": [], "common": [], "removed": []} | |
604 | if str2bool(Optional.extract(update_commits)): |
|
660 | if str2bool(Optional.extract(update_commits)): | |
605 | if PullRequestModel().has_valid_update_type(pull_request): |
|
661 | if PullRequestModel().has_valid_update_type(pull_request): | |
606 |
|
|
662 | update_response = PullRequestModel().update_commits( | |
607 | pull_request) |
|
663 | pull_request) | |
608 |
commit_changes = |
|
664 | commit_changes = update_response.changes or commit_changes | |
609 | Session().commit() |
|
665 | Session().commit() | |
610 |
|
666 | |||
611 | reviewers_changes = {"added": [], "removed": []} |
|
667 | reviewers_changes = {"added": [], "removed": []} | |
612 | if reviewer_ids: |
|
668 | if reviewer_ids: | |
613 | added_reviewers, removed_reviewers = \ |
|
669 | added_reviewers, removed_reviewers = \ | |
614 | PullRequestModel().update_reviewers(pull_request, reviewer_ids) |
|
670 | PullRequestModel().update_reviewers(pull_request, reviewers_reasons) | |
615 |
|
671 | |||
616 | reviewers_changes['added'] = sorted( |
|
672 | reviewers_changes['added'] = sorted( | |
617 | [get_user_or_error(n).username for n in added_reviewers]) |
|
673 | [get_user_or_error(n).username for n in added_reviewers]) | |
@@ -631,5 +687,5 b' def update_pull_request(' | |||||
631 | 'updated_commits': commit_changes, |
|
687 | 'updated_commits': commit_changes, | |
632 | 'updated_reviewers': reviewers_changes |
|
688 | 'updated_reviewers': reviewers_changes | |
633 | } |
|
689 | } | |
|
690 | ||||
634 | return data |
|
691 | return data | |
635 |
|
@@ -21,29 +21,26 b'' | |||||
21 | import logging |
|
21 | import logging | |
22 | import time |
|
22 | import time | |
23 |
|
23 | |||
24 | import colander |
|
24 | import rhodecode | |
25 |
|
25 | from rhodecode.api import ( | ||
26 | from rhodecode import BACKENDS |
|
26 | jsonrpc_method, JSONRPCError, JSONRPCForbidden, JSONRPCValidationError) | |
27 | from rhodecode.api import jsonrpc_method, JSONRPCError, JSONRPCForbidden, json |
|
|||
28 | from rhodecode.api.utils import ( |
|
27 | from rhodecode.api.utils import ( | |
29 | has_superadmin_permission, Optional, OAttr, get_repo_or_error, |
|
28 | has_superadmin_permission, Optional, OAttr, get_repo_or_error, | |
30 | get_user_group_or_error, get_user_or_error, has_repo_permissions, |
|
29 | get_user_group_or_error, get_user_or_error, validate_repo_permissions, | |
31 | get_perm_or_error, store_update, get_repo_group_or_error, parse_args, |
|
30 | get_perm_or_error, parse_args, get_origin, build_commit_data, | |
32 | get_origin, build_commit_data) |
|
31 | validate_set_owner_permissions) | |
33 | from rhodecode.lib.auth import ( |
|
32 | from rhodecode.lib.auth import HasPermissionAnyApi, HasUserGroupPermissionAnyApi | |
34 | HasPermissionAnyApi, HasRepoGroupPermissionAnyApi, |
|
|||
35 | HasUserGroupPermissionAnyApi) |
|
|||
36 | from rhodecode.lib.exceptions import StatusChangeOnClosedPullRequestError |
|
33 | from rhodecode.lib.exceptions import StatusChangeOnClosedPullRequestError | |
37 | from rhodecode.lib.utils import map_groups |
|
|||
38 | from rhodecode.lib.utils2 import str2bool, time_to_datetime |
|
34 | from rhodecode.lib.utils2 import str2bool, time_to_datetime | |
|
35 | from rhodecode.lib.ext_json import json | |||
39 | from rhodecode.model.changeset_status import ChangesetStatusModel |
|
36 | from rhodecode.model.changeset_status import ChangesetStatusModel | |
40 | from rhodecode.model.comment import ChangesetCommentsModel |
|
37 | from rhodecode.model.comment import ChangesetCommentsModel | |
41 | from rhodecode.model.db import ( |
|
38 | from rhodecode.model.db import ( | |
42 | Session, ChangesetStatus, RepositoryField, Repository) |
|
39 | Session, ChangesetStatus, RepositoryField, Repository) | |
43 | from rhodecode.model.repo import RepoModel |
|
40 | from rhodecode.model.repo import RepoModel | |
44 | from rhodecode.model.repo_group import RepoGroupModel |
|
|||
45 | from rhodecode.model.scm import ScmModel, RepoList |
|
41 | from rhodecode.model.scm import ScmModel, RepoList | |
46 | from rhodecode.model.settings import SettingsModel, VcsSettingsModel |
|
42 | from rhodecode.model.settings import SettingsModel, VcsSettingsModel | |
|
43 | from rhodecode.model import validation_schema | |||
47 | from rhodecode.model.validation_schema.schemas import repo_schema |
|
44 | from rhodecode.model.validation_schema.schemas import repo_schema | |
48 |
|
45 | |||
49 | log = logging.getLogger(__name__) |
|
46 | log = logging.getLogger(__name__) | |
@@ -177,6 +174,7 b' def get_repo(request, apiuser, repoid, c' | |||||
177 |
|
174 | |||
178 | repo = get_repo_or_error(repoid) |
|
175 | repo = get_repo_or_error(repoid) | |
179 | cache = Optional.extract(cache) |
|
176 | cache = Optional.extract(cache) | |
|
177 | ||||
180 | include_secrets = False |
|
178 | include_secrets = False | |
181 | if has_superadmin_permission(apiuser): |
|
179 | if has_superadmin_permission(apiuser): | |
182 | include_secrets = True |
|
180 | include_secrets = True | |
@@ -184,7 +182,7 b' def get_repo(request, apiuser, repoid, c' | |||||
184 | # check if we have at least read permission for this repo ! |
|
182 | # check if we have at least read permission for this repo ! | |
185 | _perms = ( |
|
183 | _perms = ( | |
186 | 'repository.admin', 'repository.write', 'repository.read',) |
|
184 | 'repository.admin', 'repository.write', 'repository.read',) | |
187 | has_repo_permissions(apiuser, repoid, repo, _perms) |
185 | validate_repo_permissions(apiuser, repoid, repo, _perms) | |
188 |
|
186 | |||
189 | permissions = [] |
|
187 | permissions = [] | |
190 | for _user in repo.permissions(): |
|
188 | for _user in repo.permissions(): | |
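Several hunks in this file swap an inline `has_repo_permissions` call for the shared `validate_repo_permissions` helper. A rough sketch of the behaviour it appears to implement — denial is reported as "repository does not exist" so unauthorized callers cannot probe for repository names; the `has_perm` callable here is a hypothetical stand-in for the real permission lookup, and the signature is simplified:

```python
class JSONRPCError(Exception):
    """Stand-in for rhodecode.api.JSONRPCError."""


def validate_repo_permissions(apiuser, repoid, repo, perms, has_perm):
    """Raise unless `apiuser` holds at least one of `perms` on `repo`.

    The denial message deliberately claims the repository does not
    exist, hiding whether the repo id was valid at all."""
    if not has_perm(apiuser, repo, *perms):
        raise JSONRPCError('repository `%s` does not exist' % (repoid,))
    return True
```

Centralizing the check keeps the "pretend it does not exist" policy consistent across every API method that used to open-code it.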
@@ -292,7 +290,7 b' def get_repo_changeset(request, apiuser,' | |||||
292 | if not has_superadmin_permission(apiuser): |
|
290 | if not has_superadmin_permission(apiuser): | |
293 | _perms = ( |
|
291 | _perms = ( | |
294 | 'repository.admin', 'repository.write', 'repository.read',) |
|
292 | 'repository.admin', 'repository.write', 'repository.read',) | |
295 | has_repo_permissions(apiuser, repoid, repo, _perms) |
293 | validate_repo_permissions(apiuser, repoid, repo, _perms) | |
296 |
|
294 | |||
297 | changes_details = Optional.extract(details) |
|
295 | changes_details = Optional.extract(details) | |
298 | _changes_details_types = ['basic', 'extended', 'full'] |
|
296 | _changes_details_types = ['basic', 'extended', 'full'] | |
@@ -355,7 +353,7 b' def get_repo_changesets(request, apiuser' | |||||
355 | if not has_superadmin_permission(apiuser): |
|
353 | if not has_superadmin_permission(apiuser): | |
356 | _perms = ( |
|
354 | _perms = ( | |
357 | 'repository.admin', 'repository.write', 'repository.read',) |
|
355 | 'repository.admin', 'repository.write', 'repository.read',) | |
358 | has_repo_permissions(apiuser, repoid, repo, _perms) |
356 | validate_repo_permissions(apiuser, repoid, repo, _perms) | |
359 |
|
357 | |||
360 | changes_details = Optional.extract(details) |
|
358 | changes_details = Optional.extract(details) | |
361 | _changes_details_types = ['basic', 'extended', 'full'] |
|
359 | _changes_details_types = ['basic', 'extended', 'full'] | |
@@ -450,7 +448,7 b' def get_repo_nodes(request, apiuser, rep' | |||||
450 | if not has_superadmin_permission(apiuser): |
|
448 | if not has_superadmin_permission(apiuser): | |
451 | _perms = ( |
|
449 | _perms = ( | |
452 | 'repository.admin', 'repository.write', 'repository.read',) |
|
450 | 'repository.admin', 'repository.write', 'repository.read',) | |
453 | has_repo_permissions(apiuser, repoid, repo, _perms) |
451 | validate_repo_permissions(apiuser, repoid, repo, _perms) | |
454 |
|
452 | |||
455 | ret_type = Optional.extract(ret_type) |
|
453 | ret_type = Optional.extract(ret_type) | |
456 | details = Optional.extract(details) |
|
454 | details = Optional.extract(details) | |
@@ -523,7 +521,7 b' def get_repo_refs(request, apiuser, repo' | |||||
523 | repo = get_repo_or_error(repoid) |
|
521 | repo = get_repo_or_error(repoid) | |
524 | if not has_superadmin_permission(apiuser): |
|
522 | if not has_superadmin_permission(apiuser): | |
525 | _perms = ('repository.admin', 'repository.write', 'repository.read',) |
|
523 | _perms = ('repository.admin', 'repository.write', 'repository.read',) | |
526 | has_repo_permissions(apiuser, repoid, repo, _perms) |
524 | validate_repo_permissions(apiuser, repoid, repo, _perms) | |
527 |
|
525 | |||
528 | try: |
|
526 | try: | |
529 | # check if repo is not empty by any chance, skip quicker if it is. |
|
527 | # check if repo is not empty by any chance, skip quicker if it is. | |
@@ -538,26 +536,30 b' def get_repo_refs(request, apiuser, repo' | |||||
538 |
|
536 | |||
539 |
|
537 | |||
540 | @jsonrpc_method() |
|
538 | @jsonrpc_method() | |
541 | def create_repo(request, apiuser, repo_name, repo_type, |
|
539 | def create_repo( | |
542 | owner=Optional(OAttr('apiuser')), description=Optional(''), |
|
540 | request, apiuser, repo_name, repo_type, | |
543 | private=Optional(False), clone_uri=Optional(None), |
|
541 | owner=Optional(OAttr('apiuser')), | |
544 | landing_rev=Optional('rev:tip'), |
|
542 | description=Optional(''), | |
545 | enable_statistics=Optional(False), |
543 | private=Optional(False), | |
546 | enable_locking=Optional(False), |
544 | clone_uri=Optional(None), | |
547 | enable_downloads=Optional(False), |
|
545 | landing_rev=Optional('rev:tip'), | |
548 | copy_permissions=Optional(False)): |
546 | enable_statistics=Optional(False), | |
|
547 | enable_locking=Optional(False), | |||
|
548 | enable_downloads=Optional(False), | |||
|
549 | copy_permissions=Optional(False)): | |||
549 | """ |
|
550 | """ | |
550 | Creates a repository. |
|
551 | Creates a repository. | |
551 |
|
552 | |||
552 | * If the repository name contains "/", all the required repository |
|
553 | * If the repository name contains "/", repository will be created inside | |
553 | groups will be created. |
|
554 | a repository group or nested repository groups | |
554 |
|
555 | |||
555 | For example "foo/bar/baz" will create |repo| groups "foo" and "bar" |
|
556 | For example "foo/bar/repo1" will create |repo| called "repo1" inside | |
556 | (with "foo" as parent). It will also create the "baz" repository |
|
557 | group "foo/bar". You have to have permissions to access and write to | |
557 | with "bar" as |repo| group. |
|
558 | the last repository group ("bar" in this example) | |
558 |
|
559 | |||
559 | This command can only be run using an |authtoken| with at least |
|
560 | This command can only be run using an |authtoken| with at least | |
560 | write permissions to the |repo|. |
|
561 | permissions to create repositories, or write permissions to | |
|
562 | parent repository groups. | |||
561 |
|
563 | |||
562 | :param apiuser: This is filled automatically from the |authtoken|. |
|
564 | :param apiuser: This is filled automatically from the |authtoken|. | |
563 | :type apiuser: AuthUser |
|
565 | :type apiuser: AuthUser | |
@@ -569,9 +571,9 b' def create_repo(request, apiuser, repo_n' | |||||
569 | :type owner: Optional(str) |
|
571 | :type owner: Optional(str) | |
570 | :param description: Set the repository description. |
|
572 | :param description: Set the repository description. | |
571 | :type description: Optional(str) |
|
573 | :type description: Optional(str) | |
572 | :param private: |
|
574 | :param private: set repository as private | |
573 | :type private: bool |
|
575 | :type private: bool | |
574 | :param clone_uri: |
|
576 | :param clone_uri: set clone_uri | |
575 | :type clone_uri: str |
|
577 | :type clone_uri: str | |
576 | :param landing_rev: <rev_type>:<rev> |
|
578 | :param landing_rev: <rev_type>:<rev> | |
577 | :type landing_rev: str |
|
579 | :type landing_rev: str | |
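The updated docstring above says a name containing "/" places the new repository inside (possibly nested) repository groups, e.g. "foo/bar/repo1" becomes repo "repo1" in group "foo/bar". The split the API performs can be illustrated with this small helper — hypothetical, not a function from the codebase:

```python
def split_repo_name(full_name):
    """Split 'foo/bar/repo1' into ('foo/bar', 'repo1').

    A name without '/' belongs to no repository group, so the
    group part comes back as None."""
    head, _sep, tail = full_name.rpartition('/')
    return (head or None, tail)
```

Write permission is then checked against the last group in the path ("bar" in the example), which is the group the repository actually lands in.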
@@ -606,53 +608,17 b' def create_repo(request, apiuser, repo_n' | |||||
606 | id : <id_given_in_input> |
|
608 | id : <id_given_in_input> | |
607 | result : null |
|
609 | result : null | |
608 | error : { |
|
610 | error : { | |
609 | 'failed to create repository `<repo_name>` |
|
611 | 'failed to create repository `<repo_name>`' | |
610 | } |
|
612 | } | |
611 |
|
613 | |||
612 | """ |
|
614 | """ | |
613 | schema = repo_schema.RepoSchema() |
|
|||
614 | try: |
|
|||
615 | data = schema.deserialize({ |
|
|||
616 | 'repo_name': repo_name |
|
|||
617 | }) |
|
|||
618 | except colander.Invalid as e: |
|
|||
619 | raise JSONRPCError("Validation failed: %s" % (e.asdict(),)) |
|
|||
620 | repo_name = data['repo_name'] |
|
|||
621 |
|
615 | |||
622 | (repo_name_cleaned, |
|
616 | owner = validate_set_owner_permissions(apiuser, owner) | |
623 | parent_group_name) = RepoGroupModel()._get_group_name_and_parent( |
|
|||
624 | repo_name) |
|
|||
625 |
|
||||
626 | if not HasPermissionAnyApi( |
|
|||
627 | 'hg.admin', 'hg.create.repository')(user=apiuser): |
|
|||
628 | # check if we have admin permission for this repo group if given ! |
|
|||
629 |
|
||||
630 | if parent_group_name: |
|
|||
631 | repogroupid = parent_group_name |
|
|||
632 | repo_group = get_repo_group_or_error(parent_group_name) |
|
|||
633 |
|
617 | |||
634 | _perms = ('group.admin',) |
|
618 | description = Optional.extract(description) | |
635 | if not HasRepoGroupPermissionAnyApi(*_perms)( |
|
619 | copy_permissions = Optional.extract(copy_permissions) | |
636 | user=apiuser, group_name=repo_group.group_name): |
|
620 | clone_uri = Optional.extract(clone_uri) | |
637 | raise JSONRPCError( |
|
621 | landing_commit_ref = Optional.extract(landing_rev) | |
638 | 'repository group `%s` does not exist' % ( |
|
|||
639 | repogroupid,)) |
|
|||
640 | else: |
|
|||
641 | raise JSONRPCForbidden() |
|
|||
642 |
|
||||
643 | if not has_superadmin_permission(apiuser): |
|
|||
644 | if not isinstance(owner, Optional): |
|
|||
645 | # forbid setting owner for non-admins |
|
|||
646 | raise JSONRPCError( |
|
|||
647 | 'Only RhodeCode admin can specify `owner` param') |
|
|||
648 |
|
||||
649 | if isinstance(owner, Optional): |
|
|||
650 | owner = apiuser.user_id |
|
|||
651 |
|
||||
652 | owner = get_user_or_error(owner) |
|
|||
653 |
|
||||
654 | if RepoModel().get_by_repo_name(repo_name): |
|
|||
655 | raise JSONRPCError("repo `%s` already exist" % repo_name) |
|
|||
656 |
|
622 | |||
657 | defs = SettingsModel().get_default_repo_settings(strip_prefix=True) |
|
623 | defs = SettingsModel().get_default_repo_settings(strip_prefix=True) | |
658 | if isinstance(private, Optional): |
|
624 | if isinstance(private, Optional): | |
@@ -666,32 +632,44 b' def create_repo(request, apiuser, repo_n' | |||||
666 | if isinstance(enable_downloads, Optional): |
|
632 | if isinstance(enable_downloads, Optional): | |
667 | enable_downloads = defs.get('repo_enable_downloads') |
|
633 | enable_downloads = defs.get('repo_enable_downloads') | |
668 |
|
634 | |||
669 | clone_uri = Optional.extract(clone_uri) |
|
635 | schema = repo_schema.RepoSchema().bind( | |
670 | description = Optional.extract(description) |
|
636 | repo_type_options=rhodecode.BACKENDS.keys(), | |
671 | landing_rev = Optional.extract(landing_rev) |
|
637 | # user caller | |
672 | copy_permissions = Optional.extract(copy_permissions) |
|
638 | user=apiuser) | |
673 |
|
639 | |||
674 | try: |
|
640 | try: | |
675 | # create structure of groups and return the last group |
|
641 | schema_data = schema.deserialize(dict( | |
676 | repo_group = map_groups(repo_name) |
642 | repo_name=repo_name, | |
|
643 | repo_type=repo_type, | |||
|
644 | repo_owner=owner.username, | |||
|
645 | repo_description=description, | |||
|
646 | repo_landing_commit_ref=landing_commit_ref, | |||
|
647 | repo_clone_uri=clone_uri, | |||
|
648 | repo_private=private, | |||
|
649 | repo_copy_permissions=copy_permissions, | |||
|
650 | repo_enable_statistics=enable_statistics, | |||
|
651 | repo_enable_downloads=enable_downloads, | |||
|
652 | repo_enable_locking=enable_locking)) | |||
|
653 | except validation_schema.Invalid as err: | |||
|
654 | raise JSONRPCValidationError(colander_exc=err) | |||
|
655 | ||||
|
656 | try: | |||
677 | data = { |
|
657 | data = { | |
678 | 'repo_name': repo_name_cleaned, |
|
|||
679 | 'repo_name_full': repo_name, |
|
|||
680 | 'repo_type': repo_type, |
|
|||
681 | 'repo_description': description, |
|
|||
682 | 'owner': owner, |
|
658 | 'owner': owner, | |
683 | 'repo_private': private, |
|
659 | 'repo_name': schema_data['repo_group']['repo_name_without_group'], | |
684 | 'clone_uri': clone_uri, |
|
660 | 'repo_name_full': schema_data['repo_name'], | |
685 | 'repo_group': repo_group, |
|
661 | 'repo_group': schema_data['repo_group']['repo_group_id'], | |
686 | 'repo_landing_rev': landing_rev, |
|
662 | 'repo_type': schema_data['repo_type'], | |
687 | 'enable_statistics': enable_statistics, |
|
663 | 'repo_description': schema_data['repo_description'], | |
688 | 'enable_locking': enable_locking, |
|
664 | 'repo_private': schema_data['repo_private'], | |
689 | 'enable_downloads': enable_downloads, |
|
665 | 'clone_uri': schema_data['repo_clone_uri'], | |
690 | 'repo_copy_permissions': copy_permissions, |
|
666 | 'repo_landing_rev': schema_data['repo_landing_commit_ref'], | |
|
667 | 'enable_statistics': schema_data['repo_enable_statistics'], | |||
|
668 | 'enable_locking': schema_data['repo_enable_locking'], | |||
|
669 | 'enable_downloads': schema_data['repo_enable_downloads'], | |||
|
670 | 'repo_copy_permissions': schema_data['repo_copy_permissions'], | |||
691 | } |
|
671 | } | |
692 |
|
672 | |||
693 | if repo_type not in BACKENDS.keys(): |
|
|||
694 | raise Exception("Invalid backend type %s" % repo_type) |
|
|||
695 | task = RepoModel().create(form_data=data, cur_user=owner) |
|
673 | task = RepoModel().create(form_data=data, cur_user=owner) | |
696 | from celery.result import BaseAsyncResult |
|
674 | from celery.result import BaseAsyncResult | |
697 | task_id = None |
|
675 | task_id = None | |
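The new `create_repo` body above validates its raw arguments through a bound colander schema (`schema.deserialize(...)`) and turns `validation_schema.Invalid` into a `JSONRPCValidationError`. A self-contained sketch of that validate-then-raise pattern, with a hand-rolled `Invalid` standing in for colander's (the field checks are illustrative, not the real schema rules):

```python
class Invalid(Exception):
    """Stand-in for colander.Invalid, carrying a field -> message map."""
    def __init__(self, errors):
        super().__init__(errors)
        self.errors = errors

    def asdict(self):
        return dict(self.errors)


def deserialize_repo_data(data, known_backends):
    """Validate raw JSON-RPC arguments; return cleaned data or raise."""
    errors = {}
    if not data.get('repo_name'):
        errors['repo_name'] = 'repo_name is required'
    if data.get('repo_type') not in known_backends:
        errors['repo_type'] = 'unknown backend %r' % (data.get('repo_type'),)
    if errors:
        raise Invalid(errors)
    return dict(data)
```

Collecting all field errors before raising is what lets the API report every invalid parameter in one structured response instead of failing on the first.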
@@ -699,17 +677,17 b' def create_repo(request, apiuser, repo_n' | |||||
699 | task_id = task.task_id |
|
677 | task_id = task.task_id | |
700 | # no commit, it's done in RepoModel, or async via celery |
|
678 | # no commit, it's done in RepoModel, or async via celery | |
701 | return { |
|
679 | return { | |
702 | 'msg': "Created new repository `%s`" % (repo_name,), |
|
680 | 'msg': "Created new repository `%s`" % (schema_data['repo_name'],), | |
703 | 'success': True, # cannot return the repo data here since fork |
|
681 | 'success': True, # cannot return the repo data here since fork | |
704 | # can be done async |
|
682 | # can be done async | |
705 | 'task': task_id |
|
683 | 'task': task_id | |
706 | } |
|
684 | } | |
707 | except Exception: |
|
685 | except Exception: | |
708 | log.exception( |
|
686 | log.exception( | |
709 | u"Exception while trying to create the repository %s", |
|
687 | u"Exception while trying to create the repository %s", | |
710 | repo_name) |
|
688 | schema_data['repo_name']) | |
711 | raise JSONRPCError( |
|
689 | raise JSONRPCError( | |
712 | 'failed to create repository `%s`' % (repo_name,)) |
|
690 | 'failed to create repository `%s`' % (schema_data['repo_name'],)) | |
713 |
|
691 | |||
714 |
|
692 | |||
715 | @jsonrpc_method() |
|
693 | @jsonrpc_method() | |
@@ -735,7 +713,7 b' def add_field_to_repo(request, apiuser, ' | |||||
735 | repo = get_repo_or_error(repoid) |
|
713 | repo = get_repo_or_error(repoid) | |
736 | if not has_superadmin_permission(apiuser): |
|
714 | if not has_superadmin_permission(apiuser): | |
737 | _perms = ('repository.admin',) |
|
715 | _perms = ('repository.admin',) | |
738 | has_repo_permissions(apiuser, repoid, repo, _perms) |
716 | validate_repo_permissions(apiuser, repoid, repo, _perms) | |
739 |
|
717 | |||
740 | label = Optional.extract(label) or key |
|
718 | label = Optional.extract(label) or key | |
741 | description = Optional.extract(description) |
|
719 | description = Optional.extract(description) | |
@@ -778,7 +756,7 b' def remove_field_from_repo(request, apiu' | |||||
778 | repo = get_repo_or_error(repoid) |
|
756 | repo = get_repo_or_error(repoid) | |
779 | if not has_superadmin_permission(apiuser): |
|
757 | if not has_superadmin_permission(apiuser): | |
780 | _perms = ('repository.admin',) |
|
758 | _perms = ('repository.admin',) | |
781 | has_repo_permissions(apiuser, repoid, repo, _perms) |
759 | validate_repo_permissions(apiuser, repoid, repo, _perms) | |
782 |
|
760 | |||
783 | field = RepositoryField.get_by_key_name(key, repo) |
|
761 | field = RepositoryField.get_by_key_name(key, repo) | |
784 | if not field: |
|
762 | if not field: | |
@@ -800,33 +778,38 b' def remove_field_from_repo(request, apiu' | |||||
800 |
|
778 | |||
801 |
|
779 | |||
802 | @jsonrpc_method() |
|
780 | @jsonrpc_method() | |
803 | def update_repo(request, apiuser, repoid, name=Optional(None), |
|
781 | def update_repo( | |
804 | owner=Optional(OAttr('apiuser')), |
|
782 | request, apiuser, repoid, repo_name=Optional(None), | |
805 | group=Optional(None), |
|
783 | owner=Optional(OAttr('apiuser')), description=Optional(''), | |
806 | fork_of=Optional(None), |
|
784 | private=Optional(False), clone_uri=Optional(None), | |
807 | description=Optional(''), private=Optional(False), |
785 | landing_rev=Optional('rev:tip'), fork_of=Optional(None), | |
808 | clone_uri=Optional(None), landing_rev=Optional('rev:tip'), |
|
786 | enable_statistics=Optional(False), | |
809 | enable_statistics=Optional(False), |
787 | enable_locking=Optional(False), | |
810 | enable_locking=Optional(False), |
788 | enable_downloads=Optional(False), fields=Optional('')): | |
811 | enable_downloads=Optional(False), |
|
|||
812 | fields=Optional('')): |
|
|||
813 | """ |
|
789 | """ | |
814 | Updates a repository with the given information. |
|
790 | Updates a repository with the given information. | |
815 |
|
791 | |||
816 | This command can only be run using an |authtoken| with at least |
|
792 | This command can only be run using an |authtoken| with at least | |
817 | write permissions to the |repo|. |
793 | admin permissions to the |repo|. | |
|
794 | ||||
|
795 | * If the repository name contains "/", repository will be updated | |||
|
796 | accordingly with a repository group or nested repository groups | |||
|
797 | ||||
|
798 | For example repoid=repo-test name="foo/bar/repo-test" will update |repo| | |||
|
799 | called "repo-test" and place it inside group "foo/bar". | |||
|
800 | You have to have permissions to access and write to the last repository | |||
|
801 | group ("bar" in this example) | |||
818 |
|
802 | |||
819 | :param apiuser: This is filled automatically from the |authtoken|. |
|
803 | :param apiuser: This is filled automatically from the |authtoken|. | |
820 | :type apiuser: AuthUser |
|
804 | :type apiuser: AuthUser | |
821 | :param repoid: repository name or repository ID. |
|
805 | :param repoid: repository name or repository ID. | |
822 | :type repoid: str or int |
|
806 | :type repoid: str or int | |
823 |
:param name: Update the |repo| name |
|
807 | :param repo_name: Update the |repo| name, including the | |
824 | :type name: str |
|
808 | repository group it's in. | |
|
809 | :type repo_name: str | |||
825 | :param owner: Set the |repo| owner. |
|
810 | :param owner: Set the |repo| owner. | |
826 | :type owner: str |
|
811 | :type owner: str | |
827 | :param group: Set the |repo| group. |
|
812 | :param fork_of: Set the |repo| as fork of another |repo|. | |
828 | :type group: str |
|
|||
829 | :param fork_of: Set the master |repo| name. |
|
|||
830 | :type fork_of: str |
|
813 | :type fork_of: str | |
831 | :param description: Update the |repo| description. |
|
814 | :param description: Update the |repo| description. | |
832 | :type description: str |
|
815 | :type description: str | |
@@ -834,69 +817,115 b' def update_repo(request, apiuser, repoid' | |||||
834 | :type private: bool |
|
817 | :type private: bool | |
835 | :param clone_uri: Update the |repo| clone URI. |
|
818 | :param clone_uri: Update the |repo| clone URI. | |
836 | :type clone_uri: str |
|
819 | :type clone_uri: str | |
837 | :param landing_rev: Set the |repo| landing revision. Default is |
|
820 | :param landing_rev: Set the |repo| landing revision. Default is ``rev:tip``. | |
838 | ``tip``. |
|
|||
839 | :type landing_rev: str |
|
821 | :type landing_rev: str | |
840 | :param enable_statistics: Enable statistics on the |repo|, |
|
822 | :param enable_statistics: Enable statistics on the |repo|, (True | False). | |
841 | (True | False). |
|
|||
842 | :type enable_statistics: bool |
|
823 | :type enable_statistics: bool | |
843 | :param enable_locking: Enable |repo| locking. |
|
824 | :param enable_locking: Enable |repo| locking. | |
844 | :type enable_locking: bool |
|
825 | :type enable_locking: bool | |
845 | :param enable_downloads: Enable downloads from the |repo|, |
|
826 | :param enable_downloads: Enable downloads from the |repo|, (True | False). | |
846 | (True | False). |
|
|||
847 | :type enable_downloads: bool |
|
827 | :type enable_downloads: bool | |
848 | :param fields: Add extra fields to the |repo|. Use the following |
|
828 | :param fields: Add extra fields to the |repo|. Use the following | |
849 | example format: ``field_key=field_val,field_key2=fieldval2``. |
|
829 | example format: ``field_key=field_val,field_key2=fieldval2``. | |
850 | Escape ', ' with \, |
|
830 | Escape ', ' with \, | |
851 | :type fields: str |
|
831 | :type fields: str | |
852 | """ |
|
832 | """ | |
|
833 | ||||
853 | repo = get_repo_or_error(repoid) |
|
834 | repo = get_repo_or_error(repoid) | |
|
835 | ||||
854 | include_secrets = False |
|
836 | include_secrets = False | |
855 | if has_superadmin_permission(apiuser): |
|
837 | if not has_superadmin_permission(apiuser): | |
|
838 | validate_repo_permissions(apiuser, repoid, repo, ('repository.admin',)) | |||
|
839 | else: | |||
856 | include_secrets = True |
|
840 | include_secrets = True | |
857 | else: |
|
841 | ||
858 | _perms = ('repository.admin',) |
|
842 | updates = dict( | |
859 | has_repo_permissions(apiuser, repoid, repo, _perms) |
|
843 | repo_name=repo_name | |
|
844 | if not isinstance(repo_name, Optional) else repo.repo_name, | |||
|
845 | ||||
|
846 | fork_id=fork_of | |||
|
847 | if not isinstance(fork_of, Optional) else repo.fork.repo_name if repo.fork else None, | |||
|
848 | ||||
|
849 | user=owner | |||
|
850 | if not isinstance(owner, Optional) else repo.user.username, | |||
|
851 | ||||
|
852 | repo_description=description | |||
|
853 | if not isinstance(description, Optional) else repo.description, | |||
|
854 | ||||
|
855 | repo_private=private | |||
|
856 | if not isinstance(private, Optional) else repo.private, | |||
|
857 | ||||
|
858 | clone_uri=clone_uri | |||
|
859 | if not isinstance(clone_uri, Optional) else repo.clone_uri, | |||
|
860 | ||||
|
861 | repo_landing_rev=landing_rev | |||
|
862 | if not isinstance(landing_rev, Optional) else repo._landing_revision, | |||
|
863 | ||||
|
864 | repo_enable_statistics=enable_statistics | |||
|
865 | if not isinstance(enable_statistics, Optional) else repo.enable_statistics, | |||
|
866 | ||||
|
867 | repo_enable_locking=enable_locking | |||
|
868 | if not isinstance(enable_locking, Optional) else repo.enable_locking, | |||
|
869 | ||||
|
870 | repo_enable_downloads=enable_downloads | |||
|
871 | if not isinstance(enable_downloads, Optional) else repo.enable_downloads) | |||
|
872 | ||||
|
873 | ref_choices, _labels = ScmModel().get_repo_landing_revs(repo=repo) | |||
860 |
|
874 | |||
861 | updates = { |
|
875 | schema = repo_schema.RepoSchema().bind( | |
862 | # update function requires this. |
|
876 | repo_type_options=rhodecode.BACKENDS.keys(), | |
863 | 'repo_name': repo.just_name |
|
877 | repo_ref_options=ref_choices, | |
864 | } |
|
878 | # user caller | |
865 | repo_group = group |
|
879 | user=apiuser, | |
866 | if not isinstance(repo_group, Optional): |
|
880 | old_values=repo.get_api_data()) | |
867 | repo_group = get_repo_group_or_error(repo_group) |
|
881 | try: | |
868 | repo_group = repo_group.group_id |
|
882 | schema_data = schema.deserialize(dict( | |
|
883 | # we save old value, users cannot change type | |||
|
884 | repo_type=repo.repo_type, | |||
|
885 | ||||
|
886 | repo_name=updates['repo_name'], | |||
|
887 | repo_owner=updates['user'], | |||
|
888 | repo_description=updates['repo_description'], | |||
|
889 | repo_clone_uri=updates['clone_uri'], | |||
|
890 | repo_fork_of=updates['fork_id'], | |||
|
891 | repo_private=updates['repo_private'], | |||
|
892 | repo_landing_commit_ref=updates['repo_landing_rev'], | |||
|
893 | repo_enable_statistics=updates['repo_enable_statistics'], | |||
|
894 | repo_enable_downloads=updates['repo_enable_downloads'], | |||
|
895 | repo_enable_locking=updates['repo_enable_locking'])) | |||
|
896 | except validation_schema.Invalid as err: | |||
|
897 | raise JSONRPCValidationError(colander_exc=err) | |||
869 |
|
898 | |||
870 | repo_fork_of = fork_of |
|
899 | # save validated data back into the updates dict | |
871 | if not isinstance(repo_fork_of, Optional): |
|
-    repo_fork_of = get_repo_or_error(repo_fork_of)
-    repo_fork_of = repo_fork_of.repo_id
+    validated_updates = dict(
+        repo_name=schema_data['repo_group']['repo_name_without_group'],
+        repo_group=schema_data['repo_group']['repo_group_id'],
+        user=schema_data['repo_owner'],
+        repo_description=schema_data['repo_description'],
+        repo_private=schema_data['repo_private'],
+        clone_uri=schema_data['repo_clone_uri'],
+        repo_landing_rev=schema_data['repo_landing_commit_ref'],
+        repo_enable_statistics=schema_data['repo_enable_statistics'],
+        repo_enable_locking=schema_data['repo_enable_locking'],
+        repo_enable_downloads=schema_data['repo_enable_downloads'],
+    )
+
+    if schema_data['repo_fork_of']:
+        fork_repo = get_repo_or_error(schema_data['repo_fork_of'])
+        validated_updates['fork_id'] = fork_repo.repo_id
+
+    # extra fields
+    fields = parse_args(Optional.extract(fields), key_prefix='ex_')
+    if fields:
+        validated_updates.update(fields)
 
     try:
-        store_update(updates, name, 'repo_name')
-        store_update(updates, repo_group, 'repo_group')
-        store_update(updates, repo_fork_of, 'fork_id')
-        store_update(updates, owner, 'user')
-        store_update(updates, description, 'repo_description')
-        store_update(updates, private, 'repo_private')
-        store_update(updates, clone_uri, 'clone_uri')
-        store_update(updates, landing_rev, 'repo_landing_rev')
-        store_update(updates, enable_statistics, 'repo_enable_statistics')
-        store_update(updates, enable_locking, 'repo_enable_locking')
-        store_update(updates, enable_downloads, 'repo_enable_downloads')
-
-        # extra fields
-        fields = parse_args(Optional.extract(fields), key_prefix='ex_')
-        if fields:
-            updates.update(fields)
-
-        RepoModel().update(repo, **updates)
+        RepoModel().update(repo, **validated_updates)
         Session().commit()
         return {
-            'msg': 'updated repo ID:%s %s' % (
-                repo.repo_id, repo.repo_name),
-            'repository': repo.get_api_data(
-                include_secrets=include_secrets)
+            'msg': 'updated repo ID:%s %s' % (repo.repo_id, repo.repo_name),
+            'repository': repo.get_api_data(include_secrets=include_secrets)
         }
     except Exception:
         log.exception(
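The hunk above replaces a series of per-field `store_update` calls with a single dict of already-validated values that is applied in one `RepoModel().update(...)` call. A minimal stand-alone sketch of that merge pattern (the function and field names here are illustrative, not RhodeCode's API; omitted fields are modelled as `None` rather than RhodeCode's `Optional` sentinel):

```python
def build_updates(supplied, current):
    # Copy the current record, then overlay only the fields the caller
    # actually supplied (None here means "not supplied").
    merged = dict(current)
    merged.update({k: v for k, v in supplied.items() if v is not None})
    return merged


current = {'repo_description': 'old', 'repo_private': False}
supplied = {'repo_description': None, 'repo_private': True}
print(build_updates(supplied, current))
# {'repo_description': 'old', 'repo_private': True}
```

Collecting everything into one validated dict before touching the model keeps validation and persistence in separate steps, which is the point of the refactoring shown in the diff.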
@@ -908,26 +937,33 @@ def update_repo(request, apiuser, repoid
 @jsonrpc_method()
 def fork_repo(request, apiuser, repoid, fork_name,
               owner=Optional(OAttr('apiuser')),
-              description=Optional(''), copy_permissions=Optional(False),
-              private=Optional(False), landing_rev=Optional('rev:tip')):
+              description=Optional(''),
+              private=Optional(False),
+              clone_uri=Optional(None),
+              landing_rev=Optional('rev:tip'),
+              copy_permissions=Optional(False)):
     """
     Creates a fork of the specified |repo|.
 
-    * If using |RCE| with Celery this will immediately return a success
-      message, even though the fork will be created asynchronously.
+    * If the fork_name contains "/", fork will be created inside
+      a repository group or nested repository groups
+
+      For example "foo/bar/fork-repo" will create fork called "fork-repo"
+      inside group "foo/bar". You have to have permissions to access and
+      write to the last repository group ("bar" in this example)
 
-    This command can only be run using an |authtoken| with fork
-    permissions on the |repo|.
+    This command can only be run using an |authtoken| with minimum
+    read permissions of the forked repo, create fork permissions for an user.
 
     :param apiuser: This is filled automatically from the |authtoken|.
     :type apiuser: AuthUser
     :param repoid: Set repository name or repository ID.
     :type repoid: str or int
-    :param fork_name: Set the fork name.
+    :param fork_name: Set the fork name, including it's repository group membership.
     :type fork_name: str
     :param owner: Set the fork owner.
     :type owner: str
-    :param description: Set the fork descripton.
+    :param description: Set the fork description.
     :type description: str
     :param copy_permissions: Copy permissions from parent |repo|. The
         default is False.
965 | error: null |
|
1001 | error: null | |
966 |
|
1002 | |||
967 | """ |
|
1003 | """ | |
968 | if not has_superadmin_permission(apiuser): |
|
|||
969 | if not HasPermissionAnyApi('hg.fork.repository')(user=apiuser): |
|
|||
970 | raise JSONRPCForbidden() |
|
|||
971 |
|
1004 | |||
972 | repo = get_repo_or_error(repoid) |
|
1005 | repo = get_repo_or_error(repoid) | |
973 | repo_name = repo.repo_name |
|
1006 | repo_name = repo.repo_name | |
974 |
|
1007 | |||
975 | (fork_name_cleaned, |
|
|||
976 | parent_group_name) = RepoGroupModel()._get_group_name_and_parent( |
|
|||
977 | fork_name) |
|
|||
978 |
|
||||
979 | if not has_superadmin_permission(apiuser): |
|
1008 | if not has_superadmin_permission(apiuser): | |
980 | # check if we have at least read permission for |
|
1009 | # check if we have at least read permission for | |
981 | # this repo that we fork ! |
|
1010 | # this repo that we fork ! | |
982 | _perms = ( |
|
1011 | _perms = ( | |
983 | 'repository.admin', 'repository.write', 'repository.read') |
|
1012 | 'repository.admin', 'repository.write', 'repository.read') | |
984 |
|
|
1013 | validate_repo_permissions(apiuser, repoid, repo, _perms) | |
985 |
|
1014 | |||
986 | if not isinstance(owner, Optional): |
|
1015 | # check if the regular user has at least fork permissions as well | |
987 | # forbid setting owner for non super admins |
|
1016 | if not HasPermissionAnyApi('hg.fork.repository')(user=apiuser): | |
988 |
raise JSONRPC |
|
1017 | raise JSONRPCForbidden() | |
989 | 'Only RhodeCode admin can specify `owner` param' |
|
1018 | ||
990 | ) |
|
1019 | # check if user can set owner parameter | |
991 | # check if we have a create.repo permission if not maybe the parent |
|
1020 | owner = validate_set_owner_permissions(apiuser, owner) | |
992 | # group permission |
|
|||
993 | if not HasPermissionAnyApi('hg.create.repository')(user=apiuser): |
|
|||
994 | if parent_group_name: |
|
|||
995 | repogroupid = parent_group_name |
|
|||
996 | repo_group = get_repo_group_or_error(parent_group_name) |
|
|||
997 |
|
1021 | |||
998 | _perms = ('group.admin',) |
|
1022 | description = Optional.extract(description) | |
999 | if not HasRepoGroupPermissionAnyApi(*_perms)( |
|
1023 | copy_permissions = Optional.extract(copy_permissions) | |
1000 | user=apiuser, group_name=repo_group.group_name): |
|
1024 | clone_uri = Optional.extract(clone_uri) | |
1001 | raise JSONRPCError( |
|
1025 | landing_commit_ref = Optional.extract(landing_rev) | |
1002 | 'repository group `%s` does not exist' % ( |
|
1026 | private = Optional.extract(private) | |
1003 | repogroupid,)) |
|
|||
1004 | else: |
|
|||
1005 | raise JSONRPCForbidden() |
|
|||
1006 |
|
1027 | |||
1007 | _repo = RepoModel().get_by_repo_name(fork_name) |
|
1028 | schema = repo_schema.RepoSchema().bind( | |
1008 | if _repo: |
|
1029 | repo_type_options=rhodecode.BACKENDS.keys(), | |
1009 | type_ = 'fork' if _repo.fork else 'repo' |
|
1030 | # user caller | |
1010 | raise JSONRPCError("%s `%s` already exist" % (type_, fork_name)) |
|
1031 | user=apiuser) | |
1011 |
|
||||
1012 | if isinstance(owner, Optional): |
|
|||
1013 | owner = apiuser.user_id |
|
|||
1014 |
|
||||
1015 | owner = get_user_or_error(owner) |
|
|||
1016 |
|
1032 | |||
1017 | try: |
|
1033 | try: | |
1018 | # create structure of groups and return the last group |
|
1034 | schema_data = schema.deserialize(dict( | |
1019 |
|
|
1035 | repo_name=fork_name, | |
1020 | form_data = { |
|
1036 | repo_type=repo.repo_type, | |
1021 | 'repo_name': fork_name_cleaned, |
|
1037 | repo_owner=owner.username, | |
1022 | 'repo_name_full': fork_name, |
|
1038 | repo_description=description, | |
1023 | 'repo_group': repo_group.group_id if repo_group else None, |
|
1039 | repo_landing_commit_ref=landing_commit_ref, | |
1024 | 'repo_type': repo.repo_type, |
|
1040 | repo_clone_uri=clone_uri, | |
1025 | 'description': Optional.extract(description), |
|
1041 | repo_private=private, | |
1026 | 'private': Optional.extract(private), |
|
1042 | repo_copy_permissions=copy_permissions)) | |
1027 | 'copy_permissions': Optional.extract(copy_permissions), |
|
1043 | except validation_schema.Invalid as err: | |
1028 | 'landing_rev': Optional.extract(landing_rev), |
|
1044 | raise JSONRPCValidationError(colander_exc=err) | |
|
1045 | ||||
|
1046 | try: | |||
|
1047 | data = { | |||
1029 | 'fork_parent_id': repo.repo_id, |
|
1048 | 'fork_parent_id': repo.repo_id, | |
|
1049 | ||||
|
1050 | 'repo_name': schema_data['repo_group']['repo_name_without_group'], | |||
|
1051 | 'repo_name_full': schema_data['repo_name'], | |||
|
1052 | 'repo_group': schema_data['repo_group']['repo_group_id'], | |||
|
1053 | 'repo_type': schema_data['repo_type'], | |||
|
1054 | 'description': schema_data['repo_description'], | |||
|
1055 | 'private': schema_data['repo_private'], | |||
|
1056 | 'copy_permissions': schema_data['repo_copy_permissions'], | |||
|
1057 | 'landing_rev': schema_data['repo_landing_commit_ref'], | |||
1030 | } |
|
1058 | } | |
1031 |
|
1059 | |||
1032 |
task = RepoModel().create_fork( |
|
1060 | task = RepoModel().create_fork(data, cur_user=owner) | |
1033 | # no commit, it's done in RepoModel, or async via celery |
|
1061 | # no commit, it's done in RepoModel, or async via celery | |
1034 | from celery.result import BaseAsyncResult |
|
1062 | from celery.result import BaseAsyncResult | |
1035 | task_id = None |
|
1063 | task_id = None | |
@@ -1037,16 +1065,18 @@ def fork_repo(request, apiuser, repoid,
             task_id = task.task_id
         return {
             'msg': 'Created fork of `%s` as `%s`' % (
-                repo.repo_name, fork_name),
+                repo.repo_name, schema_data['repo_name']),
             'success': True,  # cannot return the repo data here since fork
                               # can be done async
             'task': task_id
         }
     except Exception:
-        log.exception("Exception occurred while trying to fork a repo")
+        log.exception(
+            u"Exception while trying to create fork %s",
+            schema_data['repo_name'])
         raise JSONRPCError(
             'failed to fork repository `%s` as `%s`' % (
-                repo_name, fork_name))
+                repo_name, schema_data['repo_name']))
 
 
 @jsonrpc_method()
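For context on how a method such as `fork_repo` is reached: RhodeCode's API takes a JSON body naming the method, an auth token, and the method args. This sketch only builds such a payload; the endpoint URL, token, and exact wire format are assumptions to be checked against the RhodeCode API documentation:

```python
import json


def build_api_payload(method, auth_token, **args):
    # JSON body in the shape used by RhodeCode's JSON-RPC style API
    # (token value here is a placeholder, not a real credential).
    return json.dumps(
        {'id': 1, 'auth_token': auth_token, 'method': method, 'args': args})


payload = build_api_payload(
    'fork_repo', 'TOKEN_PLACEHOLDER',
    repoid='repo1', fork_name='foo/bar/fork-repo')
print(payload)
```

Note that, as the code above shows, a successful response only carries a task id; the fork itself may still be running asynchronously via Celery.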
@@ -1082,7 +1112,7 @@ def delete_repo(request, apiuser, repoid
     repo = get_repo_or_error(repoid)
     if not has_superadmin_permission(apiuser):
         _perms = ('repository.admin',)
+        validate_repo_permissions(apiuser, repoid, repo, _perms)
 
     try:
         handle_forks = Optional.extract(forks)
@@ -1157,7 +1187,7 @@ def invalidate_cache(request, apiuser, r
     repo = get_repo_or_error(repoid)
     if not has_superadmin_permission(apiuser):
         _perms = ('repository.admin', 'repository.write',)
+        validate_repo_permissions(apiuser, repoid, repo, _perms)
 
     delete = Optional.extract(delete_keys)
     try:
@@ -1228,7 +1258,7 @@ def lock(request, apiuser, repoid, locke
       id : <id_given_in_input>
       result : null
       error : {
-          'Error occurred locking repository `<reponame>`
+          'Error occurred locking repository `<reponame>`'
       }
     """
 
@@ -1236,7 +1266,7 @@ def lock(request, apiuser, repoid, locke
     if not has_superadmin_permission(apiuser):
         # check if we have at least write permission for this repo !
         _perms = ('repository.admin', 'repository.write',)
+        validate_repo_permissions(apiuser, repoid, repo, _perms)
 
     # make sure normal user does not pass someone else userid,
     # he is not allowed to do that
@@ -1347,7 +1377,7 @@ def comment_commit(
     repo = get_repo_or_error(repoid)
     if not has_superadmin_permission(apiuser):
         _perms = ('repository.read', 'repository.write', 'repository.admin')
+        validate_repo_permissions(apiuser, repoid, repo, _perms)
 
     if isinstance(userid, Optional):
         userid = apiuser.user_id
@@ -1438,7 +1468,7 @@ def grant_user_permission(request, apius
     perm = get_perm_or_error(perm)
     if not has_superadmin_permission(apiuser):
         _perms = ('repository.admin',)
+        validate_repo_permissions(apiuser, repoid, repo, _perms)
 
     try:
 
@@ -1492,7 +1522,7 @@ def revoke_user_permission(request, apiu
     user = get_user_or_error(userid)
     if not has_superadmin_permission(apiuser):
         _perms = ('repository.admin',)
+        validate_repo_permissions(apiuser, repoid, repo, _perms)
 
     try:
         RepoModel().revoke_user_permission(repo=repo, user=user)
@@ -1560,7 +1590,7 @@ def grant_user_group_permission(request,
     perm = get_perm_or_error(perm)
     if not has_superadmin_permission(apiuser):
         _perms = ('repository.admin',)
+        validate_repo_permissions(apiuser, repoid, repo, _perms)
 
     user_group = get_user_group_or_error(usergroupid)
     if not has_superadmin_permission(apiuser):
@@ -1625,7 +1655,7 @@ def revoke_user_group_permission(request
     repo = get_repo_or_error(repoid)
     if not has_superadmin_permission(apiuser):
         _perms = ('repository.admin',)
+        validate_repo_permissions(apiuser, repoid, repo, _perms)
 
     user_group = get_user_group_or_error(usergroupid)
     if not has_superadmin_permission(apiuser):
@@ -1701,7 +1731,7 @@ def pull(request, apiuser, repoid):
     repo = get_repo_or_error(repoid)
     if not has_superadmin_permission(apiuser):
         _perms = ('repository.admin',)
+        validate_repo_permissions(apiuser, repoid, repo, _perms)
 
     try:
         ScmModel().pull_changes(repo.repo_name, apiuser.username)
@@ -1764,7 +1794,7 @@ def strip(request, apiuser, repoid, revi
     repo = get_repo_or_error(repoid)
    if not has_superadmin_permission(apiuser):
         _perms = ('repository.admin',)
+        validate_repo_permissions(apiuser, repoid, repo, _perms)
 
     try:
         ScmModel().strip(repo, revision, branch)
@@ -21,19 +21,18 @@
 
 import logging
 
-import colander
-
-from rhodecode.api import jsonrpc_method, JSONRPCError, JSONRPCForbidden
+from rhodecode.api import JSONRPCValidationError
+from rhodecode.api import jsonrpc_method, JSONRPCError
 from rhodecode.api.utils import (
     has_superadmin_permission, Optional, OAttr, get_user_or_error,
-    store_update, get_repo_group_or_error,
-    get_perm_or_error, get_user_group_or_error, get_origin)
+    get_repo_group_or_error, get_perm_or_error, get_user_group_or_error,
+    get_origin, validate_repo_group_permissions, validate_set_owner_permissions)
 from rhodecode.lib.auth import (
-    HasPermissionAnyApi, HasRepoGroupPermissionAnyApi,
-    HasUserGroupPermissionAnyApi)
-from rhodecode.model.db import Session, RepoGroup
+    HasRepoGroupPermissionAnyApi, HasUserGroupPermissionAnyApi)
+from rhodecode.model.db import Session
 from rhodecode.model.repo_group import RepoGroupModel
 from rhodecode.model.scm import RepoGroupList
+from rhodecode.model import validation_schema
 from rhodecode.model.validation_schema.schemas import repo_group_schema
 
 
@@ -142,21 +141,24 @@ def get_repo_groups(request, apiuser):
 
 
 @jsonrpc_method()
-def create_repo_group(request, apiuser, group_name, description=Optional(''),
-                      owner=Optional(OAttr('apiuser')),
-                      copy_permissions=Optional(False)):
+def create_repo_group(
+        request, apiuser, group_name,
+        owner=Optional(OAttr('apiuser')),
+        description=Optional(''),
+        copy_permissions=Optional(False)):
     """
     Creates a repository group.
 
-    * If the repository group name contains "/", all repository
-      groups will be created.
+    * If the repository group name contains "/", repository group will be
+      created inside a repository group or nested repository groups
 
-      For example "foo/bar/baz" will create |repo| groups "foo" and "bar"
-      (with "foo" as parent). It will also create the "baz" repository
-      with "bar" as |repo| group.
+      For example "foo/bar/group1" will create repository group called "group1"
+      inside group "foo/bar". You have to have permissions to access and
+      write to the last repository group ("bar" in this example)
 
-    This command can only be run using an |authtoken| with admin
-    permissions.
+    This command can only be run using an |authtoken| with at least
+    permissions to create repository groups, or admin permissions to
+    parent repository groups.
 
     :param apiuser: This is filled automatically from the |authtoken|.
     :type apiuser: AuthUser
@@ -193,72 +195,64 @@ def create_repo_group(request, apiuser,
 
     """
 
-    schema = repo_group_schema.RepoGroupSchema()
-    try:
-        data = schema.deserialize({
-            'group_name': group_name
-        })
-    except colander.Invalid as e:
-        raise JSONRPCError("Validation failed: %s" % (e.asdict(),))
-    group_name = data['group_name']
+    owner = validate_set_owner_permissions(apiuser, owner)
 
-    if isinstance(owner, Optional):
-        owner = apiuser.user_id
-
-    group_description = Optional.extract(description)
+    description = Optional.extract(description)
     copy_permissions = Optional.extract(copy_permissions)
 
-    # get by full name with parents, check if it already exist
-    if RepoGroup.get_by_group_name(group_name):
-        raise JSONRPCError("repo group `%s` already exist" % (group_name,))
-
-    (group_name_cleaned,
-     parent_group_name) = RepoGroupModel()._get_group_name_and_parent(
-        group_name)
+    schema = repo_group_schema.RepoGroupSchema().bind(
+        # user caller
+        user=apiuser)
 
-    parent_group = None
-    if parent_group_name:
-        parent_group = get_repo_group_or_error(parent_group_name)
+    try:
+        schema_data = schema.deserialize(dict(
+            repo_group_name=group_name,
+            repo_group_owner=owner.username,
+            repo_group_description=description,
+            repo_group_copy_permissions=copy_permissions,
+        ))
+    except validation_schema.Invalid as err:
+        raise JSONRPCValidationError(colander_exc=err)
 
-    if not HasPermissionAnyApi(
-            'hg.admin', 'hg.repogroup.create.true')(user=apiuser):
-        # check if we have admin permission for this parent repo group !
-        # users without admin or hg.repogroup.create can only create other
-        # groups in groups they own so this is a required, but can be empty
-        parent_group = getattr(parent_group, 'group_name', '')
-        _perms = ('group.admin',)
-        if not HasRepoGroupPermissionAnyApi(*_perms)(
-                user=apiuser, group_name=parent_group):
-            raise JSONRPCForbidden()
+    validated_group_name = schema_data['repo_group_name']
 
     try:
         repo_group = RepoGroupModel().create(
-            group_name=group_name,
-            group_description=group_description,
             owner=owner,
-            copy_permissions=copy_permissions)
+            group_name=validated_group_name,
+            group_description=schema_data['repo_group_name'],
+            copy_permissions=schema_data['repo_group_copy_permissions'])
         Session().commit()
         return {
-            'msg': 'Created new repo group `%s`' % group_name,
+            'msg': 'Created new repo group `%s`' % validated_group_name,
             'repo_group': repo_group.get_api_data()
         }
     except Exception:
         log.exception("Exception occurred while trying create repo group")
         raise JSONRPCError(
-            'failed to create repo group `%s`' % (group_name,))
+            'failed to create repo group `%s`' % (validated_group_name,))
 
 
 @jsonrpc_method()
 def update_repo_group(
         request, apiuser, repogroupid, group_name=Optional(''),
         description=Optional(''), owner=Optional(OAttr('apiuser')),
-        parent=Optional(None), enable_locking=Optional(False)):
+        enable_locking=Optional(False)):
     """
     Updates repository group with the details given.
 
     This command can only be run using an |authtoken| with admin
     permissions.
 
+    * If the group_name name contains "/", repository group will be updated
+      accordingly with a repository group or nested repository groups
+
+      For example repogroupid=group-test group_name="foo/bar/group-test"
+      will update repository group called "group-test" and place it
+      inside group "foo/bar".
+      You have to have permissions to access and write to the last repository
+      group ("bar" in this example)
+
     :param apiuser: This is filled automatically from the |authtoken|.
     :type apiuser: AuthUser
     :param repogroupid: Set the ID of repository group.
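The hunks above funnel all incoming arguments through `schema.deserialize(...)` and convert validation failures into a JSON-RPC validation error. The real code uses the colander library; a stdlib-only stand-in for that "validate first, then use only validated data" flow (the `Invalid` class, `deserialize` function, and field name here are made up for illustration):

```python
class Invalid(Exception):
    # Carries a field -> message mapping, like a schema validation error.
    def __init__(self, errors):
        super().__init__(str(errors))
        self.errors = errors


def deserialize(data):
    # Reject empty or missing names; return the data only if it is valid.
    errors = {}
    if not data.get('repo_group_name'):
        errors['repo_group_name'] = 'Required'
    if errors:
        raise Invalid(errors)
    return data


try:
    deserialize({'repo_group_name': ''})
except Invalid as err:
    print('validation failed:', err.errors)
```

The caller then only ever touches `schema_data`, never the raw arguments, which is the invariant the refactoring in this diff establishes.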
@@ -269,29 +263,55 @@ def update_repo_group(
     :type description: str
     :param owner: Set the |repo| group owner.
     :type owner: str
-    :param parent: Set the |repo| group parent.
-    :type parent: str or int
     :param enable_locking: Enable |repo| locking. The default is false.
     :type enable_locking: bool
     """
 
     repo_group = get_repo_group_or_error(repogroupid)
+
     if not has_superadmin_permission(apiuser):
-        # check if we have admin permission for this repo group !
-        _perms = ('group.admin',)
-        if not HasRepoGroupPermissionAnyApi(*_perms)(
-                user=apiuser, group_name=repo_group.group_name):
-            raise JSONRPCError(
-                'repository group `%s` does not exist' % (repogroupid,))
+        validate_repo_group_permissions(
+            apiuser, repogroupid, repo_group, ('group.admin',))
+
+    updates = dict(
+        group_name=group_name
+        if not isinstance(group_name, Optional) else repo_group.group_name,
+
+        group_description=description
+        if not isinstance(description, Optional) else repo_group.group_description,
+
+        user=owner
+        if not isinstance(owner, Optional) else repo_group.user.username,
+
+        enable_locking=enable_locking
+        if not isinstance(enable_locking, Optional) else repo_group.enable_locking
+    )
 
-    updates = {}
+    schema = repo_group_schema.RepoGroupSchema().bind(
+        # user caller
+        user=apiuser,
+        old_values=repo_group.get_api_data())
+
     try:
-        store_update(updates, group_name, 'group_name')
-        store_update(updates, description, 'group_description')
-        store_update(updates, owner, 'user')
-        store_update(updates, parent, 'group_parent_id')
-        store_update(updates, enable_locking, 'enable_locking')
-        repo_group = RepoGroupModel().update(repo_group, updates)
+        schema_data = schema.deserialize(dict(
+            repo_group_name=updates['group_name'],
+            repo_group_owner=updates['user'],
+            repo_group_description=updates['group_description'],
+            repo_group_enable_locking=updates['enable_locking'],
+        ))
+    except validation_schema.Invalid as err:
+        raise JSONRPCValidationError(colander_exc=err)
+
+    validated_updates = dict(
+        group_name=schema_data['repo_group']['repo_group_name_without_group'],
+        group_parent_id=schema_data['repo_group']['repo_group_id'],
+        user=schema_data['repo_group_owner'],
+        group_description=schema_data['repo_group_description'],
+        enable_locking=schema_data['repo_group_enable_locking'],
+    )
+
+    try:
+        RepoGroupModel().update(repo_group, validated_updates)
         Session().commit()
         return {
             'msg': 'updated repository group ID:%s %s' % (
@@ -299,7 +319,9 @@ def update_repo_group(
             'repo_group': repo_group.get_api_data()
         }
     except Exception:
-        log.exception("Exception occurred while trying update repo group")
+        log.exception(
+            u"Exception occurred while trying update repo group %s",
+            repogroupid)
         raise JSONRPCError('failed to update repository group `%s`'
                            % (repogroupid,))
 
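The `updates = dict(...)` block above falls back to the stored value whenever an argument is still the `Optional` sentinel, i.e. the caller never passed it. The same pattern in isolation, with a simplified `Optional` (not the real RhodeCode class):

```python
class Optional(object):
    # Simplified sentinel wrapping a default value; an instance of this
    # class means "the caller did not supply this argument".
    def __init__(self, default):
        self.default = default


def pick(value, current):
    # Keep the current stored value unless the caller supplied a new one.
    return current if isinstance(value, Optional) else value


print(pick(Optional(''), 'docs-group'))  # caller omitted -> 'docs-group'
print(pick('new-name', 'docs-group'))    # caller supplied -> 'new-name'
```

Using a sentinel class instead of `None` lets callers explicitly pass `None` or `False` as real values, which is why the diff tests `isinstance(..., Optional)` rather than truthiness.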
@@ -321,7 +343,7 @@ def delete_repo_group(request, apiuser,
 
       id : <id_given_in_input>
       result : {
-        'msg': 'deleted repo group ID:<repogroupid> <repogroupname>
+        'msg': 'deleted repo group ID:<repogroupid> <repogroupname>'
         'repo_group': null
       }
       error : null
@@ -340,12 +362,9 @@ def delete_repo_group(request, apiuser,
 
     repo_group = get_repo_group_or_error(repogroupid)
     if not has_superadmin_permission(apiuser):
-        # check if we have admin permission for this repo group !
-        _perms = ('group.admin',)
-        if not HasRepoGroupPermissionAnyApi(*_perms)(
-                user=apiuser, group_name=repo_group.group_name):
-            raise JSONRPCError(
-                'repository group `%s` does not exist' % (repogroupid,))
+        validate_repo_group_permissions(
+            apiuser, repogroupid, repo_group, ('group.admin',))
+
     try:
         RepoGroupModel().delete(repo_group)
         Session().commit()
@@ -408,12 +427,8 @@ def grant_user_permission_to_repo_group(
     repo_group = get_repo_group_or_error(repogroupid)
 
     if not has_superadmin_permission(apiuser):
-        # check if we have admin permission for this repo group !
-        _perms = ('group.admin',)
-        if not HasRepoGroupPermissionAnyApi(*_perms)(
-                user=apiuser, group_name=repo_group.group_name):
-            raise JSONRPCError(
-                'repository group `%s` does not exist' % (repogroupid,))
+        validate_repo_group_permissions(
+            apiuser, repogroupid, repo_group, ('group.admin',))
 
     user = get_user_or_error(userid)
     perm = get_perm_or_error(perm, prefix='group.')
@@ -487,12 +502,8 @@ def revoke_user_permission_from_repo_gro
     repo_group = get_repo_group_or_error(repogroupid)
 
     if not has_superadmin_permission(apiuser):
-        # check if we have admin permission for this repo group !
-        _perms = ('group.admin',)
-        if not HasRepoGroupPermissionAnyApi(*_perms)(
-                user=apiuser, group_name=repo_group.group_name):
-            raise JSONRPCError(
-                'repository group `%s` does not exist' % (repogroupid,))
+        validate_repo_group_permissions(
+            apiuser, repogroupid, repo_group, ('group.admin',))
 
     user = get_user_or_error(userid)
     apply_to_children = Optional.extract(apply_to_children)
@@ -569,12 +580,8 @@ def grant_user_group_permission_to_repo_
     perm = get_perm_or_error(perm, prefix='group.')
     user_group = get_user_group_or_error(usergroupid)
     if not has_superadmin_permission(apiuser):
-        # check if we have admin permission for this repo group !
-        _perms = ('group.admin',)
-        if not HasRepoGroupPermissionAnyApi(*_perms)(
-                user=apiuser, group_name=repo_group.group_name):
-            raise JSONRPCError(
-                'repository group `%s` does not exist' % (repogroupid,))
+        validate_repo_group_permissions(
+            apiuser, repogroupid, repo_group, ('group.admin',))
 
     # check if we have at least read permission for this user group !
     _perms = ('usergroup.read', 'usergroup.write', 'usergroup.admin',)
@@ -656,12 +663,8 @@ def revoke_user_group_permission_from_re
     repo_group = get_repo_group_or_error(repogroupid)
     user_group = get_user_group_or_error(usergroupid)
     if not has_superadmin_permission(apiuser):
-        # check if we have admin permission for this repo group !
-        _perms = ('group.admin',)
-        if not HasRepoGroupPermissionAnyApi(*_perms)(
-                user=apiuser, group_name=repo_group.group_name):
-            raise JSONRPCError(
-                'repository group `%s` does not exist' % (repogroupid,))
+        validate_repo_group_permissions(
+            apiuser, repogroupid, repo_group, ('group.admin',))
 
     # check if we have at least read permission for this user group !
     _perms = ('usergroup.read', 'usergroup.write', 'usergroup.admin',)
@@ -60,7 +60,13 @@ def get_server_info(request, apiuser):
     if not has_superadmin_permission(apiuser):
         raise JSONRPCForbidden()
 
-    return ScmModel().get_server_info(request.environ)
+    server_info = ScmModel().get_server_info(request.environ)
+    # rhodecode-index requires those
+
+    server_info['index_storage'] = server_info['search']['value']['location']
+    server_info['storage'] = server_info['storage']['value']['path']
+
+    return server_info
 
 
 @jsonrpc_method()
@@ -25,7 +25,7 @@ from rhodecode.api.utils import (
     Optional, OAttr, has_superadmin_permission, get_user_or_error, store_update)
 from rhodecode.lib.auth import AuthUser, PasswordGenerator
 from rhodecode.lib.exceptions import DefaultUserException
-from rhodecode.lib.utils2 import safe_int
+from rhodecode.lib.utils2 import safe_int, str2bool
 from rhodecode.model.db import Session, User, Repository
 from rhodecode.model.user import UserModel
 
@@ -81,6 +81,7 @@ def get_user(request, apiuser, userid=Op
         "usergroup.read",
         "hg.repogroup.create.false",
         "hg.create.none",
+        "hg.password_reset.enabled",
         "hg.extern_activate.manual",
         "hg.create.write_on_repogroup.false",
         "hg.usergroup.create.false",
@@ -154,7 +155,8 @@ def create_user(request, apiuser, userna
                 active=Optional(True), admin=Optional(False),
                 extern_name=Optional('rhodecode'),
                 extern_type=Optional('rhodecode'),
-                force_password_change=Optional(False)):
+                force_password_change=Optional(False),
+                create_personal_repo_group=Optional(None)):
     """
     Creates a new user and returns the new user object.
 
@@ -187,7 +189,8 @@ def create_user(request, apiuser, userna
     :param force_password_change: Force the new user to change password
         on next login.
     :type force_password_change: Optional(``True`` | ``False``)
-
+    :param create_personal_repo_group: Create personal repo group for this user
+    :type create_personal_repo_group: Optional(``True`` | ``False``)
     Example output:
 
     .. code-block:: bash
@@ -229,6 +232,9 @@ def create_user(request, apiuser, userna
             Optional.extract(extern_name) != 'rhodecode'):
         # generate temporary password if user is external
         password = PasswordGenerator().gen_password(length=16)
+    create_repo_group = Optional.extract(create_personal_repo_group)
+    if isinstance(create_repo_group, basestring):
+        create_repo_group = str2bool(create_repo_group)
 
     try:
         user = UserModel().create_or_update(
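Reviewer note: the hunk above coerces the new `create_personal_repo_group` flag, which may arrive over JSON-RPC as a string, into a boolean before it reaches `UserModel`. A standalone sketch of that coercion — the `str2bool` body here is an assumed stand-in for `rhodecode.lib.utils2.str2bool`, and `coerce_repo_group_flag` is a hypothetical helper name, not from the diff:

```python
def str2bool(value):
    # assumed stand-in for rhodecode.lib.utils2.str2bool
    if isinstance(value, str):
        return value.strip().lower() in ('true', 'yes', 'on', 'y', 't', '1')
    return bool(value)


def coerce_repo_group_flag(raw):
    # mirrors the diff: only string values are passed through str2bool,
    # so an unset Optional(None) default stays None
    if isinstance(raw, str):
        return str2bool(raw)
    return raw
```

With this, API callers get `True`/`False` for `"true"`/`"false"` inputs while an omitted flag stays `None`, letting the model decide the default.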
@@ -242,6 +248,7 @@ def create_user(request, apiuser, userna
             extern_type=Optional.extract(extern_type),
             extern_name=Optional.extract(extern_name),
             force_password_change=Optional.extract(force_password_change),
+            create_repo_group=create_repo_group
         )
         Session().commit()
         return {
@@ -226,6 +226,15 @@ class RhodeCodeAuthPluginBase(object):
         """
         raise NotImplementedError("Not implemented in base class")
 
+    def get_url_slug(self):
+        """
+        Returns a slug which should be used when constructing URLs which refer
+        to this plugin. By default it returns the plugin name. If the name is
+        not suitable for using it in an URL the plugin should override this
+        method.
+        """
+        return self.name
+
     @property
     def is_headers_auth(self):
         """
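Reviewer note: the new `get_url_slug()` hook defaults to the plugin name, and a plugin whose name is not URL-safe is expected to override it. A hypothetical override — the class names and the egg-style plugin name below are illustrative, not taken from the diff:

```python
class AuthPluginBase(object):
    # hypothetical stand-in for RhodeCodeAuthPluginBase
    name = 'egg:rhodecode-enterprise-ce#headers'

    def get_url_slug(self):
        # default behavior from the diff: fall back to the plugin name
        return self.name


class HeadersAuthPlugin(AuthPluginBase):
    def get_url_slug(self):
        # keep only the part after '#' so the slug fits in a URL segment
        return self.name.split('#', 1)[-1]
```

Resources built from plugins (see the `AuthnPluginResourceBase.__name__` change below in this diff) then pick up the safe slug automatically.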
@@ -45,7 +45,7 @@ class AuthnPluginResourceBase(AuthnResou
 
     def __init__(self, plugin):
         self.plugin = plugin
-        self.__name__ = plugin.name
+        self.__name__ = plugin.get_url_slug()
         self.display_name = plugin.get_display_name()
 
 
@@ -84,7 +84,7 @@ class AuthnPluginViewBase(object):
 
         try:
             valid_data = schema.deserialize(data)
-        except colander.Invalid, e:
+        except colander.Invalid as e:
             # Display error message and display form again.
             self.request.session.flash(
                 _('Errors exist when saving plugin settings. '
@@ -31,9 +31,9 @@ from pyramid.authorization import ACLAut
 from pyramid.config import Configurator
 from pyramid.settings import asbool, aslist
 from pyramid.wsgi import wsgiapp
-from pyramid.httpexceptions import HTTPError, HTTPInternalServerError
+from pyramid.httpexceptions import (
+    HTTPError, HTTPInternalServerError, HTTPFound)
 from pyramid.events import ApplicationCreated
-import pyramid.httpexceptions as httpexceptions
 from pyramid.renderers import render_to_response
 from routes.middleware import RoutesMiddleware
 import routes.util
@@ -44,10 +44,10 @@ from rhodecode.config import patches
 from rhodecode.config.routing import STATIC_FILE_PREFIX
 from rhodecode.config.environment import (
     load_environment, load_pyramid_environment)
-from rhodecode.lib.exceptions import VCSServerUnavailable
-from rhodecode.lib.vcs.exceptions import VCSCommunicationError
 from rhodecode.lib.middleware import csrf
 from rhodecode.lib.middleware.appenlight import wrap_in_appenlight_if_enabled
+from rhodecode.lib.middleware.error_handling import (
+    PylonsErrorHandlingMiddleware)
 from rhodecode.lib.middleware.https_fixup import HttpsFixup
 from rhodecode.lib.middleware.vcs import VCSMiddleware
 from rhodecode.lib.plugins.utils import register_rhodecode_plugin
@@ -186,53 +186,27 @@ def make_not_found_view(config):
     pylons_app, appenlight_client = wrap_in_appenlight_if_enabled(
         pylons_app, settings)
 
-    # The VCSMiddleware shall operate like a fallback if pyramid doesn't find
-    # a view to handle the request. Therefore we wrap it around the pylons app.
+    # The pylons app is executed inside of the pyramid 404 exception handler.
+    # Exceptions which are raised inside of it are not handled by pyramid
+    # again. Therefore we add a middleware that invokes the error handler in
+    # case of an exception or error response. This way we return proper error
+    # HTML pages in case of an error.
+    reraise = (settings.get('debugtoolbar.enabled', False) or
+               rhodecode.disable_error_handler)
+    pylons_app = PylonsErrorHandlingMiddleware(
+        pylons_app, error_handler, reraise)
+
+    # The VCSMiddleware shall operate like a fallback if pyramid doesn't find a
+    # view to handle the request. Therefore it is wrapped around the pylons
+    # app. It has to be outside of the error handling otherwise error responses
+    # from the vcsserver are converted to HTML error pages. This confuses the
+    # command line tools and the user won't get a meaningful error message.
     if vcs_server_enabled:
         pylons_app = VCSMiddleware(
             pylons_app, settings, appenlight_client, registry=config.registry)
 
-    pylons_app_as_view = wsgiapp(pylons_app)
-
-    def pylons_app_with_error_handler(context, request):
-        """
-        Handle exceptions from rc pylons app:
-
-        - old webob type exceptions get converted to pyramid exceptions
-        - pyramid exceptions are passed to the error handler view
-        """
-        def is_vcs_response(response):
-            return 'X-RhodeCode-Backend' in response.headers
-
-        def is_http_error(response):
-            # webob type error responses
-            return (400 <= response.status_int <= 599)
-
-        def is_error_handling_needed(response):
-            return is_http_error(response) and not is_vcs_response(response)
-
-        try:
-            response = pylons_app_as_view(context, request)
-            if is_error_handling_needed(response):
-                response = webob_to_pyramid_http_response(response)
-                return error_handler(response, request)
-        except HTTPError as e:  # pyramid type exceptions
-            return error_handler(e, request)
-        except Exception as e:
-            log.exception(e)
-
-            if (settings.get('debugtoolbar.enabled', False) or
-                    rhodecode.disable_error_handler):
-                raise
-
-            if isinstance(e, VCSCommunicationError):
-                return error_handler(VCSServerUnavailable(), request)
-
-            return error_handler(HTTPInternalServerError(), request)
-
-        return response
-
-    return pylons_app_with_error_handler
+    # Convert WSGI app to pyramid view and return it.
+    return wsgiapp(pylons_app)
 
 
 def add_pylons_compat_data(registry, global_config, settings):
@@ -243,16 +217,6 @@ def add_pylons_compat_data(registry, glo
     registry._pylons_compat_settings = settings
 
 
-def webob_to_pyramid_http_response(webob_response):
-    ResponseClass = httpexceptions.status_map[webob_response.status_int]
-    pyramid_response = ResponseClass(webob_response.status)
-    pyramid_response.status = webob_response.status
-    pyramid_response.headers.update(webob_response.headers)
-    if pyramid_response.headers['content-type'] == 'text/html':
-        pyramid_response.headers['content-type'] = 'text/html; charset=UTF-8'
-    return pyramid_response
-
-
 def error_handler(exception, request):
     from rhodecode.model.settings import SettingsModel
     from rhodecode.lib.utils2 import AttributeDict
@@ -466,10 +430,11 @@ def _sanitize_vcs_settings(settings):
     """
     _string_setting(settings, 'vcs.svn.compatible_version', '')
    _string_setting(settings, 'git_rev_filter', '--all')
-    _string_setting(settings, 'vcs.hooks.protocol', 'pyro4')
+    _string_setting(settings, 'vcs.hooks.protocol', 'http')
+    _string_setting(settings, 'vcs.scm_app_implementation', 'http')
     _string_setting(settings, 'vcs.server', '')
     _string_setting(settings, 'vcs.server.log_level', 'debug')
-    _string_setting(settings, 'vcs.server.protocol', 'pyro4')
+    _string_setting(settings, 'vcs.server.protocol', 'http')
     _bool_setting(settings, 'startup.import_repos', 'false')
     _bool_setting(settings, 'vcs.hooks.direct_calls', 'false')
     _bool_setting(settings, 'vcs.server.enable', 'true')
@@ -477,6 +442,13 @@ def _sanitize_vcs_settings(settings):
     _list_setting(settings, 'vcs.backends', 'hg, git, svn')
     _int_setting(settings, 'vcs.connection_timeout', 3600)
 
+    # Support legacy values of vcs.scm_app_implementation. Legacy
+    # configurations may use 'rhodecode.lib.middleware.utils.scm_app_http'
+    # which is now mapped to 'http'.
+    scm_app_impl = settings['vcs.scm_app_implementation']
+    if scm_app_impl == 'rhodecode.lib.middleware.utils.scm_app_http':
+        settings['vcs.scm_app_implementation'] = 'http'
+
 
 def _int_setting(settings, name, default):
     settings[name] = int(settings.get(name, default))
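Reviewer note: the sanitizer change above maps the legacy dotted-path value of `vcs.scm_app_implementation` to the short `'http'` token so that old INI files keep working. The same normalization in isolation (the function name is illustrative, not from the diff):

```python
LEGACY_SCM_APP = 'rhodecode.lib.middleware.utils.scm_app_http'


def normalize_scm_app(settings):
    # map the legacy dotted-path value to the new short token,
    # leaving any other value untouched
    if settings.get('vcs.scm_app_implementation') == LEGACY_SCM_APP:
        settings['vcs.scm_app_implementation'] = 'http'
    return settings
```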
@@ -501,5 +473,8 @@ def _list_setting(settings, name, defaul
         settings[name] = aslist(raw_value)
 
 
-def _string_setting(settings, name, default):
-    settings[name] = settings.get(name, default).lower()
+def _string_setting(settings, name, default, lower=True):
+    value = settings.get(name, default)
+    if lower:
+        value = value.lower()
+    settings[name] = value
@@ -196,7 +196,7 @@ def make_map(config):
     rmap.connect('user_autocomplete_data', '/_users', controller='home',
                  action='user_autocomplete_data', jsroute=True)
     rmap.connect('user_group_autocomplete_data', '/_user_groups', controller='home',
-                 action='user_group_autocomplete_data')
+                 action='user_group_autocomplete_data', jsroute=True)
 
     rmap.connect(
         'user_profile', '/_profiles/{username}', controller='users',
@@ -305,7 +305,7 @@ def make_map(config):
         m.connect('delete_user', '/users/{user_id}',
                   action='delete', conditions={'method': ['DELETE']})
         m.connect('edit_user', '/users/{user_id}/edit',
-                  action='edit', conditions={'method': ['GET']})
+                  action='edit', conditions={'method': ['GET']}, jsroute=True)
         m.connect('user', '/users/{user_id}',
                   action='show', conditions={'method': ['GET']})
         m.connect('force_password_reset_user', '/users/{user_id}/password_reset',
@@ -389,7 +389,7 @@ def make_map(config):
 
         m.connect('edit_user_group_members',
                   '/user_groups/{user_group_id}/edit/members', jsroute=True,
-                  action='
+                  action='user_group_members', conditions={'method': ['GET']})
 
     # ADMIN PERMISSIONS ROUTES
     with rmap.submapper(path_prefix=ADMIN_PREFIX,
@@ -699,6 +699,9 @@ def make_map(config):
     rmap.connect('repo_refs_changelog_data', '/{repo_name}/refs-data-changelog',
                  controller='summary', action='repo_refs_changelog_data',
                  requirements=URL_NAME_REQUIREMENTS, jsroute=True)
+    rmap.connect('repo_default_reviewers_data', '/{repo_name}/default-reviewers',
+                 controller='summary', action='repo_default_reviewers_data',
+                 jsroute=True, requirements=URL_NAME_REQUIREMENTS)
 
     rmap.connect('changeset_home', '/{repo_name}/changeset/{revision}',
                  controller='changeset', revision='tip', jsroute=True,
@@ -824,6 +827,10 @@ def make_map(config):
                  controller='admin/repos', action='repo_delete_svn_pattern',
                  conditions={'method': ['DELETE'], 'function': check_repo},
                  requirements=URL_NAME_REQUIREMENTS)
+    rmap.connect('repo_pullrequest_settings', '/{repo_name}/settings/pullrequest',
+                 controller='admin/repos', action='repo_settings_pullrequest',
+                 conditions={'method': ['GET', 'POST'], 'function': check_repo},
+                 requirements=URL_NAME_REQUIREMENTS)
 
     # still working url for backward compat.
     rmap.connect('raw_changeset_home_depraced',
@@ -28,10 +28,9 @@ import logging
 
 import formencode
 import peppercorn
-from formencode import htmlfill
 
 from pylons import request, response, tmpl_context as c, url
-from pylons.controllers.util import
+from pylons.controllers.util import redirect
 from pylons.i18n.translation import _
 from webob.exc import HTTPNotFound, HTTPForbidden
 from sqlalchemy.sql.expression import or_
@@ -45,7 +44,7 @@ from rhodecode.lib import helpers as h
 from rhodecode.lib.base import BaseController, render
 from rhodecode.lib.auth import LoginRequired, NotAnonymous
 from rhodecode.lib.utils import jsonify
-from rhodecode.lib.utils2 import
+from rhodecode.lib.utils2 import time_to_datetime
 from rhodecode.lib.ext_json import json
 from rhodecode.lib.vcs.exceptions import VCSError, NodeNotChangedError
 from rhodecode.model import validation_schema
@@ -39,19 +39,20 @@ from rhodecode.lib.auth import (
     LoginRequired, NotAnonymous, AuthUser, generate_auth_token)
 from rhodecode.lib.base import BaseController, render
 from rhodecode.lib.utils import jsonify
-from rhodecode.lib.utils2 import safe_int, md5
+from rhodecode.lib.utils2 import safe_int, md5, str2bool
 from rhodecode.lib.ext_json import json
 
 from rhodecode.model.validation_schema.schemas import user_schema
 from rhodecode.model.db import (
-    Repository, PullRequest, PullRequestReviewers, UserEmailMap, User,
-    UserFollowing)
+    Repository, PullRequest, UserEmailMap, User, UserFollowing)
 from rhodecode.model.forms import UserForm
 from rhodecode.model.scm import RepoList
 from rhodecode.model.user import UserModel
 from rhodecode.model.repo import RepoModel
 from rhodecode.model.auth_token import AuthTokenModel
 from rhodecode.model.meta import Session
+from rhodecode.model.pull_request import PullRequestModel
+from rhodecode.model.comment import ChangesetCommentsModel
 
 log = logging.getLogger(__name__)
 
@@ -289,25 +290,85 b' class MyAccountController(BaseController' | |||||
289 | category='success') |
|
290 | category='success') | |
290 | return redirect(url('my_account_emails')) |
|
291 | return redirect(url('my_account_emails')) | |
291 |
|
292 | |||
|
293 | def _extract_ordering(self, request): | |||
|
294 | column_index = safe_int(request.GET.get('order[0][column]')) | |||
|
295 | order_dir = request.GET.get('order[0][dir]', 'desc') | |||
|
296 | order_by = request.GET.get( | |||
|
297 | 'columns[%s][data][sort]' % column_index, 'name_raw') | |||
|
298 | return order_by, order_dir | |||
|
299 | ||||
|
300 | def _get_pull_requests_list(self, statuses): | |||
|
301 | start = safe_int(request.GET.get('start'), 0) | |||
|
302 | length = safe_int(request.GET.get('length'), c.visual.dashboard_items) | |||
|
303 | order_by, order_dir = self._extract_ordering(request) | |||
|
304 | ||||
|
305 | pull_requests = PullRequestModel().get_im_participating_in( | |||
|
306 | user_id=c.rhodecode_user.user_id, | |||
|
307 | statuses=statuses, | |||
|
308 | offset=start, length=length, order_by=order_by, | |||
|
309 | order_dir=order_dir) | |||
|
310 | ||||
|
311 | pull_requests_total_count = PullRequestModel().count_im_participating_in( | |||
|
312 | user_id=c.rhodecode_user.user_id, statuses=statuses) | |||
|
313 | ||||
|
314 | from rhodecode.lib.utils import PartialRenderer | |||
|
315 | _render = PartialRenderer('data_table/_dt_elements.html') | |||
|
316 | data = [] | |||
|
317 | for pr in pull_requests: | |||
|
318 | repo_id = pr.target_repo_id | |||
|
+            comments = ChangesetCommentsModel().get_all_comments(
+                repo_id, pull_request=pr)
+            owned = pr.user_id == c.rhodecode_user.user_id
+            status = pr.calculated_review_status()
+
+            data.append({
+                'target_repo': _render('pullrequest_target_repo',
+                                       pr.target_repo.repo_name),
+                'name': _render('pullrequest_name',
+                                pr.pull_request_id, pr.target_repo.repo_name,
+                                short=True),
+                'name_raw': pr.pull_request_id,
+                'status': _render('pullrequest_status', status),
+                'title': _render(
+                    'pullrequest_title', pr.title, pr.description),
+                'description': h.escape(pr.description),
+                'updated_on': _render('pullrequest_updated_on',
+                                      h.datetime_to_time(pr.updated_on)),
+                'updated_on_raw': h.datetime_to_time(pr.updated_on),
+                'created_on': _render('pullrequest_updated_on',
+                                      h.datetime_to_time(pr.created_on)),
+                'created_on_raw': h.datetime_to_time(pr.created_on),
+                'author': _render('pullrequest_author',
+                                  pr.author.full_contact, ),
+                'author_raw': pr.author.full_name,
+                'comments': _render('pullrequest_comments', len(comments)),
+                'comments_raw': len(comments),
+                'closed': pr.is_closed(),
+                'owned': owned
+            })
+        # json used to render the grid
+        data = ({
+            'data': data,
+            'recordsTotal': pull_requests_total_count,
+            'recordsFiltered': pull_requests_total_count,
+        })
+        return data
+
     def my_account_pullrequests(self):
         c.active = 'pullrequests'
         self.__load_data()
-        c.show_closed = request.GET.get('pr_show_closed')
-
-        def _filter(pr):
-            s = sorted(pr, key=lambda o: o.created_on, reverse=True)
-            if not c.show_closed:
-                s = filter(lambda p: p.status != PullRequest.STATUS_CLOSED, s)
-            return s
+        c.show_closed = str2bool(request.GET.get('pr_show_closed'))
 
-        c.my_pull_requests = _filter(
-            PullRequest.query().filter(
-                PullRequest.user_id == c.rhodecode_user.user_id).all())
-        my_prs = [
-            x.pull_request for x in PullRequestReviewers.query().filter(
-                PullRequestReviewers.user_id == c.rhodecode_user.user_id).all()]
-        c.participate_in_pull_requests = _filter(my_prs)
-        return render('admin/my_account/my_account.html')
+        statuses = [PullRequest.STATUS_NEW, PullRequest.STATUS_OPEN]
+        if c.show_closed:
+            statuses += [PullRequest.STATUS_CLOSED]
+        data = self._get_pull_requests_list(statuses)
+        if not request.is_xhr:
+            c.data_participate = json.dumps(data['data'])
+            c.records_total_participate = data['recordsTotal']
+            return render('admin/my_account/my_account.html')
+        else:
+            return json.dumps(data)
 
     def my_account_auth_tokens(self):
         c.active = 'auth_tokens'
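The rewritten `my_account_pullrequests` wraps the `pr_show_closed` query parameter in `str2bool` so that string values like `'0'` or `'false'` no longer count as truthy. A minimal stdlib-only sketch of such a helper (the accepted spellings here are an assumption; RhodeCode's real `rhodecode.lib.utils2.str2bool` may differ):

```python
def str2bool(value):
    """Interpret common string spellings of a boolean; None stays False.

    Simplified sketch of a query-parameter parser like the one used in
    the hunk above.
    """
    if value is None:
        return False
    if isinstance(value, bool):
        return value
    return str(value).strip().lower() in ('true', 'yes', 'on', 'y', 't', '1')

# A bare request.GET.get(...) would treat the string '0' as truthy;
# parsing it first gives the intended behaviour:
print(str2bool('0'))     # False
print(str2bool('true'))  # True
print(str2bool(None))    # False
```

This is why the old `c.show_closed = request.GET.get('pr_show_closed')` line was a latent bug: any non-empty string, including `'0'`, enabled the closed filter.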
@@ -57,7 +57,7 @@ class PermissionsController(BaseController):
         super(PermissionsController, self).__before__()
 
     def __load_data(self):
-        PermissionModel().set_global_permission_choices(c, translator=_)
+        PermissionModel().set_global_permission_choices(c, gettext_translator=_)
 
     @HasPermissionAllDecorator('hg.admin')
     def permission_application(self):
@@ -92,6 +92,7 @@ class PermissionsController(BaseController):
         self.__load_data()
         _form = ApplicationPermissionsForm(
             [x[0] for x in c.register_choices],
+            [x[0] for x in c.password_reset_choices],
             [x[0] for x in c.extern_activate_choices])()
 
         try:
@@ -160,6 +160,7 @@ class ReposController(BaseRepoController):
         self.__load_defaults()
         form_result = {}
         task_id = None
+        c.personal_repo_group = c.rhodecode_user.personal_repo_group
         try:
             # CanWriteToGroup validators checks permissions of this POST
             form_result = RepoForm(repo_groups=c.repo_groups_choices,
173 | if isinstance(task, BaseAsyncResult): |
|
174 | if isinstance(task, BaseAsyncResult): | |
174 | task_id = task.task_id |
|
175 | task_id = task.task_id | |
175 | except formencode.Invalid as errors: |
|
176 | except formencode.Invalid as errors: | |
176 | c.personal_repo_group = RepoGroup.get_by_group_name( |
|
|||
177 | c.rhodecode_user.username) |
|
|||
178 | return htmlfill.render( |
|
177 | return htmlfill.render( | |
179 | render('admin/repos/repo_add.html'), |
|
178 | render('admin/repos/repo_add.html'), | |
180 | defaults=errors.value, |
|
179 | defaults=errors.value, | |
@@ -215,7 +214,7 @@ class ReposController(BaseRepoController):
         c.repo_groups = RepoGroup.groups_choices(groups=acl_groups)
         c.repo_groups_choices = map(lambda k: unicode(k[0]), c.repo_groups)
         choices, c.landing_revs = ScmModel().get_repo_landing_revs()
-        c.personal_repo_group =
+        c.personal_repo_group = c.rhodecode_user.personal_repo_group
         c.new_repo = repo_name_slug(new_repo)
 
         ## apply the defaults from defaults page
@@ -299,9 +298,8 @@ class ReposController(BaseRepoController):
         repo_model = RepoModel()
         changed_name = repo_name
 
+        c.personal_repo_group = c.rhodecode_user.personal_repo_group
         # override the choices with extracted revisions !
-        c.personal_repo_group = RepoGroup.get_by_group_name(
-            c.rhodecode_user.username)
         repo = Repository.get_by_repo_name(repo_name)
         old_data = {
             'repo_name': repo_name,
@@ -399,8 +397,7 @@ class ReposController(BaseRepoController):
 
         c.repo_fields = RepositoryField.query()\
             .filter(RepositoryField.repository == c.repo_info).all()
-        c.personal_repo_group = RepoGroup.get_by_group_name(
-            c.rhodecode_user.username)
+        c.personal_repo_group = c.rhodecode_user.personal_repo_group
         c.active = 'settings'
         return htmlfill.render(
             render('admin/repos/repo_edit.html'),
@@ -34,6 +34,7 @@ import packaging.version
 from pylons import request, tmpl_context as c, url, config
 from pylons.controllers.util import redirect
 from pylons.i18n.translation import _, lazy_ugettext
+from pyramid.threadlocal import get_current_registry
 from webob.exc import HTTPBadRequest
 
 import rhodecode
@@ -54,6 +55,7 @@ from rhodecode.model.db import RhodeCode
 from rhodecode.model.forms import ApplicationSettingsForm, \
     ApplicationUiSettingsForm, ApplicationVisualisationForm, \
     LabsSettingsForm, IssueTrackerPatternsForm
+from rhodecode.model.repo_group import RepoGroupModel
 
 from rhodecode.model.scm import ScmModel
 from rhodecode.model.notification import EmailNotificationModel
@@ -63,6 +65,7 @@ from rhodecode.model.settings import (
     SettingsModel)
 
 from rhodecode.model.supervisor import SupervisorModel, SUPERVISOR_MASTER
+from rhodecode.svn_support.config_keys import generate_config
 
 
 log = logging.getLogger(__name__)
@@ -134,6 +137,10 @@ class SettingsController(BaseController):
         c.svn_branch_patterns = model.get_global_svn_branch_patterns()
         c.svn_tag_patterns = model.get_global_svn_tag_patterns()
 
+        # TODO: Replace with request.registry after migrating to pyramid.
+        pyramid_settings = get_current_registry().settings
+        c.svn_proxy_generate_config = pyramid_settings[generate_config]
+
         application_form = ApplicationUiSettingsForm()()
 
         try:
@@ -186,6 +193,10 @@ class SettingsController(BaseController):
         c.svn_branch_patterns = model.get_global_svn_branch_patterns()
         c.svn_tag_patterns = model.get_global_svn_tag_patterns()
 
+        # TODO: Replace with request.registry after migrating to pyramid.
+        pyramid_settings = get_current_registry().settings
+        c.svn_proxy_generate_config = pyramid_settings[generate_config]
+
         return htmlfill.render(
             render('admin/settings/settings.html'),
             defaults=self._form_defaults(),
@@ -235,6 +246,8 @@ class SettingsController(BaseController):
         """POST /admin/settings/global: All items in the collection"""
         # url('admin_settings_global')
         c.active = 'global'
+        c.personal_repo_group_default_pattern = RepoGroupModel()\
+            .get_personal_group_name_pattern()
         application_form = ApplicationSettingsForm()()
         try:
             form_result = application_form.to_python(dict(request.POST))
@@ -249,16 +262,18 @@ class SettingsController(BaseController):
 
         try:
             settings = [
-                ('title', 'rhodecode_title'),
-                ('realm', 'rhodecode_realm'),
-                ('pre_code', 'rhodecode_pre_code'),
-                ('post_code', 'rhodecode_post_code'),
-                ('captcha_public_key', 'rhodecode_captcha_public_key'),
-                ('captcha_private_key', 'rhodecode_captcha_private_key'),
+                ('title', 'rhodecode_title', 'unicode'),
+                ('realm', 'rhodecode_realm', 'unicode'),
+                ('pre_code', 'rhodecode_pre_code', 'unicode'),
+                ('post_code', 'rhodecode_post_code', 'unicode'),
+                ('captcha_public_key', 'rhodecode_captcha_public_key', 'unicode'),
+                ('captcha_private_key', 'rhodecode_captcha_private_key', 'unicode'),
+                ('create_personal_repo_group', 'rhodecode_create_personal_repo_group', 'bool'),
+                ('personal_repo_group_pattern', 'rhodecode_personal_repo_group_pattern', 'unicode'),
             ]
-            for setting, form_key in settings:
+            for setting, form_key, type_ in settings:
                 sett = SettingsModel().create_or_update_setting(
-                    setting, form_result[form_key])
+                    setting, form_result[form_key], type_)
                 Session().add(sett)
 
             Session().commit()
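The hunk above extends each settings tuple from `(form_field, setting_name)` to `(form_field, setting_name, type_)` so one loop can upsert values of different types. A standalone sketch of that pattern over an in-memory store (the dict-based `create_or_update_setting` here is illustrative, not RhodeCode's real model API):

```python
def create_or_update_setting(store, name, value, type_):
    # Coerce the raw form value according to its declared type, then upsert.
    coerced = {'unicode': str, 'bool': bool, 'int': int}[type_](value)
    store[name] = coerced
    return coerced

# (setting key, form key, declared type) triples, as in the diff above:
settings = [
    ('title', 'rhodecode_title', 'unicode'),
    ('create_personal_repo_group', 'rhodecode_create_personal_repo_group', 'bool'),
]
form_result = {'rhodecode_title': 'My Server',
               'rhodecode_create_personal_repo_group': 1}

store = {}
for setting, form_key, type_ in settings:
    create_or_update_setting(store, setting, form_result[form_key], type_)

print(store)  # {'title': 'My Server', 'create_personal_repo_group': True}
```

Carrying the type alongside the key means new boolean or integer settings (like `create_personal_repo_group`) need no special-case code in the loop.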
@@ -277,6 +292,8 @@ class SettingsController(BaseController):
         """GET /admin/settings/global: All items in the collection"""
         # url('admin_settings_global')
         c.active = 'global'
+        c.personal_repo_group_default_pattern = RepoGroupModel()\
+            .get_personal_group_name_pattern()
 
         return htmlfill.render(
             render('admin/settings/settings.html'),
@@ -397,19 +414,20 @@ class SettingsController(BaseController):
         settings_model = IssueTrackerSettingsModel()
 
         form = IssueTrackerPatternsForm()().to_python(request.POST)
-        for uid in form['delete_patterns']:
-            settings_model.delete_entries(uid)
+        if form:
+            for uid in form.get('delete_patterns', []):
+                settings_model.delete_entries(uid)
 
-        for pattern in form['patterns']:
+        for pattern in form.get('patterns', []):
             for setting, value, type_ in pattern:
                 sett = settings_model.create_or_update_setting(
                     setting, value, type_)
                 Session().add(sett)
 
         Session().commit()
 
         SettingsModel().invalidate_settings_cache()
         h.flash(_('Updated issue tracker entries'), category='success')
         return redirect(url('admin_settings_issuetracker'))
 
     @HasPermissionAllDecorator('hg.admin')
@@ -530,65 +548,93 @@ class SettingsController(BaseController):
         """GET /admin/settings/system: All items in the collection"""
         # url('admin_settings_system')
         snapshot = str2bool(request.GET.get('snapshot'))
-        c.active = 'system'
+        defaults = self._form_defaults()
 
-        defaults = self._form_defaults()
-        c.rhodecode_ini = rhodecode.CONFIG
+        c.active = 'system'
         c.rhodecode_update_url = defaults.get('rhodecode_update_url')
         server_info = ScmModel().get_server_info(request.environ)
+
         for key, val in server_info.iteritems():
             setattr(c, key, val)
 
-        if c.disk['percent'] > 90:
-            h.flash(h.literal(_(
-                'Critical: your disk space is very low <b>%s%%</b> used' %
-                c.disk['percent'])), 'error')
-        elif c.disk['percent'] > 70:
-            h.flash(h.literal(_(
-                'Warning: your disk space is running low <b>%s%%</b> used' %
-                c.disk['percent'])), 'warning')
+        def val(name, subkey='human_value'):
+            return server_info[name][subkey]
+
+        def state(name):
+            return server_info[name]['state']
+
+        def val2(name):
+            val = server_info[name]['human_value']
+            state = server_info[name]['state']
+            return val, state
 
-        try:
-            c.uptime_age = h._age(
-                h.time_to_datetime(c.boot_time), False, show_suffix=False)
-        except TypeError:
-            c.uptime_age = c.boot_time
+        c.data_items = [
+            # update info
+            (_('Update info'), h.literal(
+                '<span class="link" id="check_for_update" >%s.</span>' % (
+                    _('Check for updates')) +
+                '<br/> <span >%s.</span>' % (_('Note: please make sure this server can access `%s` for the update link to work') % c.rhodecode_update_url)
+                ), ''),
+
+            # RhodeCode specific
+            (_('RhodeCode Version'), val('rhodecode_app')['text'], state('rhodecode_app')),
+            (_('RhodeCode Server IP'), val('server')['server_ip'], state('server')),
+            (_('RhodeCode Server ID'), val('server')['server_id'], state('server')),
+            (_('RhodeCode Configuration'), val('rhodecode_config')['path'], state('rhodecode_config')),
+            ('', '', ''),  # spacer
+
+            # Database
+            (_('Database'), val('database')['url'], state('database')),
+            (_('Database version'), val('database')['version'], state('database')),
+            ('', '', ''),  # spacer
 
-        try:
-            c.system_memory = '%s/%s, %s%% (%s%%) used%s' % (
-                h.format_byte_size_binary(c.memory['used']),
-                h.format_byte_size_binary(c.memory['total']),
-                c.memory['percent2'],
-                c.memory['percent'],
-                ' %s' % c.memory['error'] if 'error' in c.memory else '')
-        except TypeError:
-            c.system_memory = 'NOT AVAILABLE'
+            # Platform/Python
+            (_('Platform'), val('platform')['name'], state('platform')),
+            (_('Platform UUID'), val('platform')['uuid'], state('platform')),
+            (_('Python version'), val('python')['version'], state('python')),
+            (_('Python path'), val('python')['executable'], state('python')),
+            ('', '', ''),  # spacer
+
+            # Systems stats
+            (_('CPU'), val('cpu'), state('cpu')),
+            (_('Load'), val('load')['text'], state('load')),
+            (_('Memory'), val('memory')['text'], state('memory')),
+            (_('Uptime'), val('uptime')['text'], state('uptime')),
+            ('', '', ''),  # spacer
+
+            # Repo storage
+            (_('Storage location'), val('storage')['path'], state('storage')),
+            (_('Storage info'), val('storage')['text'], state('storage')),
+            (_('Storage inodes'), val('storage_inodes')['text'], state('storage_inodes')),
 
-        rhodecode_ini_safe = rhodecode.CONFIG.copy()
-        blacklist = [
-            'rhodecode_license_key',
-            'routes.map',
-            'pylons.h',
-            'pylons.app_globals',
-            'pylons.environ_config',
-            'sqlalchemy.db1.url',
-            ('app_conf', 'sqlalchemy.db1.url')
+            (_('Gist storage location'), val('storage_gist')['path'], state('storage_gist')),
+            (_('Gist storage info'), val('storage_gist')['text'], state('storage_gist')),
+
+            (_('Archive cache storage location'), val('storage_archive')['path'], state('storage_archive')),
+            (_('Archive cache info'), val('storage_archive')['text'], state('storage_archive')),
+
+            (_('Temp storage location'), val('storage_temp')['path'], state('storage_temp')),
+            (_('Temp storage info'), val('storage_temp')['text'], state('storage_temp')),
+
+            (_('Search info'), val('search')['text'], state('search')),
+            (_('Search location'), val('search')['location'], state('search')),
+            ('', '', ''),  # spacer
+
+            # VCS specific
+            (_('VCS Backends'), val('vcs_backends'), state('vcs_backends')),
+            (_('VCS Server'), val('vcs_server')['text'], state('vcs_server')),
+            (_('GIT'), val('git'), state('git')),
+            (_('HG'), val('hg'), state('hg')),
+            (_('SVN'), val('svn'), state('svn')),
+
         ]
-        for k in blacklist:
-            if isinstance(k, tuple):
-                section, key = k
-                if section in rhodecode_ini_safe:
-                    rhodecode_ini_safe[section].pop(key, None)
-            else:
-                rhodecode_ini_safe.pop(k, None)
-
-        c.rhodecode_ini_safe = rhodecode_ini_safe
 
         # TODO: marcink, figure out how to allow only selected users to do this
-        c.allowed_to_snapshot =
+        c.allowed_to_snapshot = c.rhodecode_user.admin
 
         if snapshot:
             if c.allowed_to_snapshot:
+                c.data_items.pop(0)  # remove server info
                 return render('admin/settings/settings_system_snapshot.html')
             else:
                 h.flash('You are not allowed to do this', category='warning')
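The rewritten system-info view above replaces ad-hoc formatting with small closures (`val`, `state`) that read a nested `server_info` dict, then builds the page as a flat list of `(label, value, state)` tuples. A reduced sketch of that pattern over a hypothetical `server_info` payload (the keys and shapes here are assumptions for illustration):

```python
# Hypothetical server_info shape: each entry carries a display value
# (sometimes a nested dict) plus a health state.
server_info = {
    'cpu': {'human_value': '8 cores', 'state': 'ok'},
    'memory': {'human_value': {'text': '4/16 GB used'}, 'state': 'warning'},
}

def val(name, subkey='human_value'):
    # Closure over server_info: fetch the display value for an entry.
    return server_info[name][subkey]

def state(name):
    # Closure over server_info: fetch the health state for an entry.
    return server_info[name]['state']

data_items = [
    ('CPU', val('cpu'), state('cpu')),
    ('Memory', val('memory')['text'], state('memory')),
    ('', '', ''),  # spacer row, rendered as a blank line by the template
]

print(data_items[0])  # ('CPU', '8 cores', 'ok')
```

The payoff is that the template only iterates one uniform list, and adding a metric is a one-line change to `data_items` rather than a new `try/except` formatting block.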
@@ -25,6 +25,7 @@ User Groups crud controller for pylons
 import logging
 import formencode
 
+import peppercorn
 from formencode import htmlfill
 from pylons import request, tmpl_context as c, url, config
 from pylons.controllers.util import redirect
@@ -40,7 +41,7 @@ from rhodecode.lib.utils import jsonify,
 from rhodecode.lib.utils2 import safe_unicode, str2bool, safe_int
 from rhodecode.lib.auth import (
     LoginRequired, NotAnonymous, HasUserGroupPermissionAnyDecorator,
-    HasPermissionAnyDecorator)
+    HasPermissionAnyDecorator, XHRRequired)
 from rhodecode.lib.base import BaseController, render
 from rhodecode.model.permission import PermissionModel
 from rhodecode.model.scm import UserGroupList
64 | def __before__(self): |
|
65 | def __before__(self): | |
65 | super(UserGroupsController, self).__before__() |
|
66 | super(UserGroupsController, self).__before__() | |
66 | c.available_permissions = config['available_permissions'] |
|
67 | c.available_permissions = config['available_permissions'] | |
67 | PermissionModel().set_global_permission_choices(c, translator=_) |
|
68 | PermissionModel().set_global_permission_choices(c, gettext_translator=_) | |
68 |
|
69 | |||
69 | def __load_data(self, user_group_id): |
|
70 | def __load_data(self, user_group_id): | |
70 | c.group_members_obj = [x.user for x in c.user_group.members] |
|
71 | c.group_members_obj = [x.user for x in c.user_group.members] | |
71 | c.group_members_obj.sort(key=lambda u: u.username.lower()) |
|
72 | c.group_members_obj.sort(key=lambda u: u.username.lower()) | |
72 |
|
||||
73 | c.group_members = [(x.user_id, x.username) for x in c.group_members_obj] |
|
73 | c.group_members = [(x.user_id, x.username) for x in c.group_members_obj] | |
74 |
|
74 | |||
75 | c.available_members = [(x.user_id, x.username) |
|
|||
76 | for x in User.query().all()] |
|
|||
77 | c.available_members.sort(key=lambda u: u[1].lower()) |
|
|||
78 |
|
||||
79 | def __load_defaults(self, user_group_id): |
|
75 | def __load_defaults(self, user_group_id): | |
80 | """ |
|
76 | """ | |
81 | Load defaults settings for edit, and update |
|
77 | Load defaults settings for edit, and update | |
@@ -207,20 +203,21 @@ class UserGroupsController(BaseController):
         c.active = 'settings'
         self.__load_data(user_group_id)
 
-        available_members = [safe_unicode(x[0]) for x in c.available_members]
-
         users_group_form = UserGroupForm(
-            edit=True, old_data=c.user_group.get_dict(),
-            available_members=available_members, allow_disabled=True)()
+            edit=True, old_data=c.user_group.get_dict(), allow_disabled=True)()
 
         try:
             form_result = users_group_form.to_python(request.POST)
+            pstruct = peppercorn.parse(request.POST.items())
+            form_result['users_group_members'] = pstruct['user_group_members']
+
             UserGroupModel().update(c.user_group, form_result)
-            gr = form_result['users_group_name']
+            updated_user_group = form_result['users_group_name']
             action_logger(c.rhodecode_user,
-                          'admin_updated_users_group:%s' % gr,
+                          'admin_updated_users_group:%s' % updated_user_group,
                           None, self.ip_addr, self.sa)
-            h.flash(_('Updated user group %s') % gr,
+            h.flash(_('Updated user group %s') % updated_user_group,
+                    category='success')
             Session().commit()
         except formencode.Invalid as errors:
             defaults = errors.value
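The update path above switches to `peppercorn.parse`, which rebuilds nested structures from flat `(name, value)` form pairs delimited by `__start__`/`__end__` markers. A stdlib-only sketch of the sequence case (the real peppercorn library also handles mappings, renames, and arbitrary nesting, so this is a simplified illustration, not its actual implementation):

```python
def parse_sequences(fields):
    """Fold flat (name, value) pairs into lists delimited by
    __start__/__end__ 'name:sequence' markers -- a simplified sketch of
    what peppercorn.parse does for the user_group_members field."""
    result, current, name = {}, None, None
    for key, value in fields:
        if key == '__start__' and value.endswith(':sequence'):
            name, current = value.split(':', 1)[0], []
        elif key == '__end__':
            result[name] = current
            current = None
        elif current is not None:
            current.append(value)  # inside a sequence: collect values
        else:
            result[key] = value    # top-level scalar field
    return result

# Flat pairs as a browser form would submit them:
fields = [
    ('users_group_name', 'devs'),
    ('__start__', 'user_group_members:sequence'),
    ('member', 'alice'),
    ('member', 'bob'),
    ('__end__', 'user_group_members:sequence'),
]
pstruct = parse_sequences(fields)
print(pstruct['user_group_members'])  # ['alice', 'bob']
```

This is why the old `available_members` plumbing could be dropped: the member list now arrives as a structured sequence instead of a pre-validated select-box payload.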
@@ -462,19 +459,29 @@ class UserGroupsController(BaseController):
         return render('admin/user_groups/user_group_edit.html')
 
     @HasUserGroupPermissionAnyDecorator('usergroup.admin')
-    def edit_members(self, user_group_id):
+    @XHRRequired()
+    @jsonify
+    def user_group_members(self, user_group_id):
         user_group_id = safe_int(user_group_id)
-        c.user_group = UserGroup.get_or_404(user_group_id)
-        c.active = 'members'
-        c.group_members_obj = sorted((x.user for x in c.user_group.members),
-                                     key=lambda u: u.username.lower())
+        user_group = UserGroup.get_or_404(user_group_id)
+        group_members_obj = sorted((x.user for x in user_group.members),
+                                   key=lambda u: u.username.lower())
 
-        group_members = [(x.user_id, x.username) for x in c.group_members_obj]
+        group_members = [
+            {
+                'id': user.user_id,
+                'first_name': user.name,
+                'last_name': user.lastname,
+                'username': user.username,
+                'icon_link': h.gravatar_url(user.email, 30),
+                'value_display': h.person(user.email),
+                'value': user.username,
+                'value_type': 'user',
+                'active': user.active,
+            }
+            for user in group_members_obj
+        ]
 
-        if request.is_xhr:
-            return jsonify(lambda *a, **k: {
-                'members': group_members
-            })
-
-        c.group_members = group_members
-        return render('admin/user_groups/user_group_edit.html')
+        return {
+            'members': group_members
+        }
@@ -45,12 +45,13 @@ from rhodecode.model.db import (
     PullRequestReviewers, User, UserEmailMap, UserIpMap, RepoGroup)
 from rhodecode.model.forms import (
     UserForm, UserPermissionsForm, UserIndividualPermissionsForm)
+from rhodecode.model.repo_group import RepoGroupModel
 from rhodecode.model.user import UserModel
 from rhodecode.model.meta import Session
 from rhodecode.model.permission import PermissionModel
 from rhodecode.lib.utils import action_logger
 from rhodecode.lib.ext_json import json
-from rhodecode.lib.utils2 import datetime_to_time, safe_int
+from rhodecode.lib.utils2 import datetime_to_time, safe_int, AttributeDict
 
 log = logging.getLogger(__name__)
 
@@ -73,7 +74,7 @@ class UsersController(BaseController):
             ('ru', 'Russian (ru)'),
             ('zh', 'Chinese (zh)'),
         ]
-        PermissionModel().set_global_permission_choices(c, translator=_)
+        PermissionModel().set_global_permission_choices(c, gettext_translator=_)
 
     @HasPermissionAllDecorator('hg.admin')
     def index(self):
@@ -120,6 +121,16 @@ class UsersController(BaseController):
         c.data = json.dumps(users_data)
         return render('admin/users/users.html')
 
+    def _get_personal_repo_group_template_vars(self):
+        DummyUser = AttributeDict({
+            'username': '${username}',
+            'user_id': '${user_id}',
+        })
+        c.default_create_repo_group = RepoGroupModel() \
+            .get_default_create_personal_repo_group()
+        c.personal_repo_group_name = RepoGroupModel() \
+            .get_personal_group_name(DummyUser)
+
     @HasPermissionAllDecorator('hg.admin')
     @auth.CSRFRequired()
     def create(self):
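`_get_personal_repo_group_template_vars` above feeds a `DummyUser` carrying literal `${username}`/`${user_id}` placeholders into the name-pattern helper, so the add-user template can preview what the personal group will be called. A sketch of an `AttributeDict` making that possible (attribute-style access over a plain dict; RhodeCode's version in `rhodecode.lib.utils2` may differ in detail):

```python
class AttributeDict(dict):
    """Dict whose keys are also readable as attributes, so template
    helpers written against a real user object accept a placeholder."""
    def __getattr__(self, name):
        try:
            return self[name]
        except KeyError:
            raise AttributeError(name)

# Placeholder user: pattern helpers see the literal '${username}' text,
# which the page later substitutes client-side as the admin types.
dummy = AttributeDict({'username': '${username}', 'user_id': '${user_id}'})
print(dummy.username)  # ${username}

# The same class works with concrete values:
real = AttributeDict({'username': 'alice', 'user_id': 42})
print(real.username)   # alice
```

The trick is duck typing: any code that only reads `user.username` cannot tell a placeholder dict from a real ORM user, so the same formatting path serves both preview and creation.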
@@ -143,6 +154,7 @@ class UsersController(BaseController):
                 % {'user_link': user_link}), category='success')
             Session().commit()
         except formencode.Invalid as errors:
+            self._get_personal_repo_group_template_vars()
             return htmlfill.render(
                 render('admin/users/user_add.html'),
                 defaults=errors.value,
@@ -163,6 +175,7 @@ class UsersController(BaseController):
         """GET /users/new: Form to create a new item"""
         # url('new_user')
         c.default_extern_type = auth_rhodecode.RhodeCodeAuthPlugin.name
+        self._get_personal_repo_group_template_vars()
         return render('admin/users/user_add.html')
 
     @HasPermissionAllDecorator('hg.admin')
@@ -339,22 +352,41 @@ class UsersController(BaseController):

         user_id = safe_int(user_id)
         c.user = User.get_or_404(user_id)
+        personal_repo_group = RepoGroup.get_user_personal_repo_group(
+            c.user.user_id)
+        if personal_repo_group:
+            return redirect(url('edit_user_advanced', user_id=user_id))

+        personal_repo_group_name = RepoGroupModel().get_personal_group_name(
+            c.user)
+        named_personal_group = RepoGroup.get_by_group_name(
+            personal_repo_group_name)
         try:
-            desc = RepoGroupModel.PERSONAL_GROUP_DESC % {
-                'username': c.user.username}
-            if not RepoGroup.get_by_group_name(c.user.username):
-                RepoGroupModel().create(group_name=c.user.username,
-                                        group_description=desc,
-                                        owner=c.user.username)

-            msg = _('Created repository group `%s`' % (c.user.username,))
+            if named_personal_group and named_personal_group.user_id == c.user.user_id:
+                # migrate the same named group, and mark it as personal
+                named_personal_group.personal = True
+                Session().add(named_personal_group)
+                Session().commit()
+                msg = _('Linked repository group `%s` as personal' % (
+                    personal_repo_group_name,))
                 h.flash(msg, category='success')
+            elif not named_personal_group:
+                RepoGroupModel().create_personal_repo_group(c.user)
+
+                msg = _('Created repository group `%s`' % (
+                    personal_repo_group_name,))
+                h.flash(msg, category='success')
+            else:
+                msg = _('Repository group `%s` is already taken' % (
+                    personal_repo_group_name,))
+                h.flash(msg, category='warning')
         except Exception:
             log.exception("Exception during repository group creation")
             msg = _(
                 'An error occurred during repository group creation for user')
             h.flash(msg, category='error')
+            Session().rollback()

         return redirect(url('edit_user_advanced', user_id=user_id))

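The branch added above resolves a same-named repository group in one of three ways: link it as the user's personal group, create a fresh one, or warn that the name is taken. A minimal standalone sketch of that decision logic (the dict shape and function name are hypothetical, not RhodeCode's RepoGroup model):

```python
def resolve_personal_group(existing_group, owner_id):
    """Decide what to do with a group matching the personal-group name.

    Mirrors the three-way branch in the change above; 'link' marks an
    already-owned same-named group as personal, 'create' builds a new
    group, 'taken' means someone else owns the name.
    """
    if existing_group is None:
        return 'create'   # no group with that name yet
    if existing_group['owner_id'] == owner_id:
        return 'link'     # same-named group owned by the user
    return 'taken'        # name owned by a different user


print(resolve_personal_group(None, 42))              # prints: create
print(resolve_personal_group({'owner_id': 42}, 42))  # prints: link
print(resolve_personal_group({'owner_id': 7}, 42))   # prints: taken
```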
@@ -397,7 +429,9 @@ class UsersController(BaseController):

         c.active = 'advanced'
         c.perm_user = AuthUser(user_id=user_id, ip_addr=self.ip_addr)
-        c.personal_repo_group = RepoGroup.get_by_group_name(user.username)
+        c.personal_repo_group = c.perm_user.personal_repo_group
+        c.personal_repo_group_name = RepoGroupModel()\
+            .get_personal_group_name(user)
         c.first_admin = User.get_first_super_admin()
         defaults = user.get_dict()

@@ -32,7 +32,7 @@ from pylons.i18n.translation import _
 from pylons.controllers.util import redirect

 from rhodecode.lib import auth
-from rhodecode.lib import diffs
+from rhodecode.lib import diffs, codeblocks
 from rhodecode.lib.auth import (
     LoginRequired, HasRepoPermissionAnyDecorator, NotAnonymous)
 from rhodecode.lib.base import BaseRepoController, render
@@ -43,7 +43,7 @@ from rhodecode.lib.utils import action_l
 from rhodecode.lib.utils2 import safe_unicode
 from rhodecode.lib.vcs.backends.base import EmptyCommit
 from rhodecode.lib.vcs.exceptions import (
-    RepositoryError, CommitDoesNotExistError)
+    RepositoryError, CommitDoesNotExistError, NodeDoesNotExistError)
 from rhodecode.model.db import ChangesetComment, ChangesetStatus
 from rhodecode.model.changeset_status import ChangesetStatusModel
 from rhodecode.model.comment import ChangesetCommentsModel
@@ -156,15 +156,24 @@ class ChangesetController(BaseRepoContro
         c.ignorews_url = _ignorews_url
         c.context_url = _context_url
         c.fulldiff = fulldiff = request.GET.get('fulldiff')
+
+        # fetch global flags of ignore ws or context lines
+        context_lcl = get_line_ctx('', request.GET)
+        ign_whitespace_lcl = get_ignore_ws('', request.GET)
+
+        # diff_limit will cut off the whole diff if the limit is applied
+        # otherwise it will just hide the big files from the front-end
+        diff_limit = self.cut_off_limit_diff
+        file_limit = self.cut_off_limit_file
+
         # get ranges of commit ids if preset
         commit_range = commit_id_range.split('...')[:2]
-        enable_comments = True
+
         try:
             pre_load = ['affected_files', 'author', 'branch', 'date',
                         'message', 'parents']

             if len(commit_range) == 2:
-                enable_comments = False
                 commits = c.rhodecode_repo.get_commits(
                     start_id=commit_range[0], end_id=commit_range[1],
                     pre_load=pre_load)
@@ -190,88 +199,78 @@ class ChangesetController(BaseRepoContro
             c.lines_deleted = 0

         c.commit_statuses = ChangesetStatus.STATUSES
-        c.comments = []
-        c.statuses = []
         c.inline_comments = []
         c.inline_cnt = 0
         c.files = []

+        c.statuses = []
+        c.comments = []
+        if len(c.commit_ranges) == 1:
+            commit = c.commit_ranges[0]
+            c.comments = ChangesetCommentsModel().get_comments(
+                c.rhodecode_db_repo.repo_id,
+                revision=commit.raw_id)
+            c.statuses.append(ChangesetStatusModel().get_status(
+                c.rhodecode_db_repo.repo_id, commit.raw_id))
+            # comments from PR
+            statuses = ChangesetStatusModel().get_statuses(
+                c.rhodecode_db_repo.repo_id, commit.raw_id,
+                with_revisions=True)
+            prs = set(st.pull_request for st in statuses
+                      if st.pull_request is not None)
+            # from associated statuses, check the pull requests, and
+            # show comments from them
+            for pr in prs:
+                c.comments.extend(pr.comments)
+
         # Iterate over ranges (default commit view is always one commit)
         for commit in c.commit_ranges:
-            if method == 'show':
-                c.statuses.extend([ChangesetStatusModel().get_status(
-                    c.rhodecode_db_repo.repo_id, commit.raw_id)])
-
-                c.comments.extend(ChangesetCommentsModel().get_comments(
-                    c.rhodecode_db_repo.repo_id,
-                    revision=commit.raw_id))
-
-                # comments from PR
-                st = ChangesetStatusModel().get_statuses(
-                    c.rhodecode_db_repo.repo_id, commit.raw_id,
-                    with_revisions=True)
-
-                # from associated statuses, check the pull requests, and
-                # show comments from them
-
-                prs = set(x.pull_request for x in
-                          filter(lambda x: x.pull_request is not None, st))
-                for pr in prs:
-                    c.comments.extend(pr.comments)
-
-                inlines = ChangesetCommentsModel().get_inline_comments(
-                    c.rhodecode_db_repo.repo_id, revision=commit.raw_id)
-                c.inline_comments.extend(inlines.iteritems())
-
             c.changes[commit.raw_id] = []

             commit2 = commit
             commit1 = commit.parents[0] if commit.parents else EmptyCommit()

-            # fetch global flags of ignore ws or context lines
-            context_lcl = get_line_ctx('', request.GET)
-            ign_whitespace_lcl = get_ignore_ws('', request.GET)
-
             _diff = c.rhodecode_repo.get_diff(
                 commit1, commit2,
                 ignore_whitespace=ign_whitespace_lcl, context=context_lcl)
+            diff_processor = diffs.DiffProcessor(
+                _diff, format='newdiff', diff_limit=diff_limit,
+                file_limit=file_limit, show_full_diff=fulldiff)

-            # diff_limit will cut off the whole diff if the limit is applied
-            # otherwise it will just hide the big files from the front-end
-            diff_limit = self.cut_off_limit_diff
-            file_limit = self.cut_off_limit_file
-
-            diff_processor = diffs.DiffProcessor(
-                _diff, format='gitdiff', diff_limit=diff_limit,
-                file_limit=file_limit, show_full_diff=fulldiff)
             commit_changes = OrderedDict()
             if method == 'show':
                 _parsed = diff_processor.prepare()
                 c.limited_diff = isinstance(_parsed, diffs.LimitedDiffContainer)
-                for f in _parsed:
-                    c.files.append(f)
-                    st = f['stats']
-                    c.lines_added += st['added']
-                    c.lines_deleted += st['deleted']
-                    fid = h.FID(commit.raw_id, f['filename'])
-                    diff = diff_processor.as_html(enable_comments=enable_comments,
-                                                  parsed_lines=[f])
-                    commit_changes[fid] = [
-                        commit1.raw_id, commit2.raw_id,
-                        f['operation'], f['filename'], diff, st, f]
+
+                _parsed = diff_processor.prepare()
+
+                def _node_getter(commit):
+                    def get_node(fname):
+                        try:
+                            return commit.get_node(fname)
+                        except NodeDoesNotExistError:
+                            return None
+                    return get_node
+
+                inline_comments = ChangesetCommentsModel().get_inline_comments(
+                    c.rhodecode_db_repo.repo_id, revision=commit.raw_id)
+                c.inline_cnt += len(inline_comments)
+
+                diffset = codeblocks.DiffSet(
+                    repo_name=c.repo_name,
+                    source_node_getter=_node_getter(commit1),
+                    target_node_getter=_node_getter(commit2),
+                    comments=inline_comments
+                ).render_patchset(_parsed, commit1.raw_id, commit2.raw_id)
+                c.changes[commit.raw_id] = diffset
             else:
                 # downloads/raw we only need RAW diff nothing else
                 diff = diff_processor.as_raw()
-                commit_changes[''] = [None, None, None, None, diff, None, None]
-            c.changes[commit.raw_id] = commit_changes
+                c.changes[commit.raw_id] = [None, None, None, None, diff, None, None]

         # sort comments by how they were generated
         c.comments = sorted(c.comments, key=lambda x: x.comment_id)

-        # count inline comments
-        for __, lines in c.inline_comments:
-            for comments in lines.values():
-                c.inline_cnt += len(comments)
-
         if len(c.commit_ranges) == 1:
             c.commit = c.commit_ranges[0]
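The `_node_getter` helper introduced in this change is a plain closure: it binds a commit once and returns a path lookup that yields `None` instead of raising when a file is absent on that side of the diff. A standalone sketch with stand-in classes (`FakeCommit` and this local `NodeDoesNotExistError` are hypothetical stubs, not RhodeCode's real VCS objects):

```python
class NodeDoesNotExistError(Exception):
    """Stand-in for rhodecode.lib.vcs.exceptions.NodeDoesNotExistError."""


class FakeCommit(object):
    """Minimal commit stub mapping file paths to contents."""
    def __init__(self, files):
        self.files = files

    def get_node(self, fname):
        if fname not in self.files:
            raise NodeDoesNotExistError(fname)
        return self.files[fname]


def _node_getter(commit):
    # Bind `commit` once; callers then look up nodes by path and get
    # None instead of an exception for files missing on that side.
    def get_node(fname):
        try:
            return commit.get_node(fname)
        except NodeDoesNotExistError:
            return None
    return get_node


get_node = _node_getter(FakeCommit({'README.rst': 'hello'}))
print(get_node('README.rst'))   # prints: hello
print(get_node('missing.txt'))  # prints: None
```

Passing one such getter per side lets the diff renderer ask for source and target file nodes uniformly, treating added and deleted files as "missing on one side" rather than as errors.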
@@ -31,13 +31,14 @@ from pylons.i18n.translation import _

 from rhodecode.controllers.utils import parse_path_ref, get_commit_from_ref_name
 from rhodecode.lib import helpers as h
-from rhodecode.lib import diffs
+from rhodecode.lib import diffs, codeblocks
 from rhodecode.lib.auth import LoginRequired, HasRepoPermissionAnyDecorator
 from rhodecode.lib.base import BaseRepoController, render
 from rhodecode.lib.utils import safe_str
 from rhodecode.lib.utils2 import safe_unicode, str2bool
 from rhodecode.lib.vcs.exceptions import (
-    EmptyRepositoryError, RepositoryError, RepositoryRequirementError)
+    EmptyRepositoryError, RepositoryError, RepositoryRequirementError,
+    NodeDoesNotExistError)
 from rhodecode.model.db import Repository, ChangesetStatus

 log = logging.getLogger(__name__)
78 | def index(self, repo_name): |
|
79 | def index(self, repo_name): | |
79 | c.compare_home = True |
|
80 | c.compare_home = True | |
80 | c.commit_ranges = [] |
|
81 | c.commit_ranges = [] | |
81 |
c. |
|
82 | c.diffset = None | |
82 | c.limited_diff = False |
|
83 | c.limited_diff = False | |
83 | source_repo = c.rhodecode_db_repo.repo_name |
|
84 | source_repo = c.rhodecode_db_repo.repo_name | |
84 | target_repo = request.GET.get('target_repo', source_repo) |
|
85 | target_repo = request.GET.get('target_repo', source_repo) | |
@@ -239,28 +240,24 @@ class CompareController(BaseRepoControll
             commit1=source_commit, commit2=target_commit,
             path1=source_path, path=target_path)
         diff_processor = diffs.DiffProcessor(
-            txtdiff, format='gitdiff', diff_limit=diff_limit,
+            txtdiff, format='newdiff', diff_limit=diff_limit,
             file_limit=file_limit, show_full_diff=c.fulldiff)
         _parsed = diff_processor.prepare()

-        c.limited_diff = False
-        if isinstance(_parsed, diffs.LimitedDiffContainer):
-            c.limited_diff = True
+        def _node_getter(commit):
+            """ Returns a function that returns a node for a commit or None """
+            def get_node(fname):
+                try:
+                    return commit.get_node(fname)
+                except NodeDoesNotExistError:
+                    return None
+            return get_node

-        c.files = []
-        c.changes = {}
-        c.lines_added = 0
-        c.lines_deleted = 0
-        for f in _parsed:
-            st = f['stats']
-            if not st['binary']:
-                c.lines_added += st['added']
-                c.lines_deleted += st['deleted']
-            fid = h.FID('', f['filename'])
-            c.files.append([fid, f['operation'], f['filename'], f['stats'], f])
-            htmldiff = diff_processor.as_html(
-                enable_comments=False, parsed_lines=[f])
-            c.changes[fid] = [f['operation'], f['filename'], htmldiff, f]
+        c.diffset = codeblocks.DiffSet(
+            repo_name=source_repo.repo_name,
+            source_node_getter=_node_getter(source_commit),
+            target_node_getter=_node_getter(target_commit),
+        ).render_patchset(_parsed, source_ref, target_ref)

         c.preview_mode = merge

@@ -36,6 +36,8 @@ from webob.exc import HTTPNotFound, HTTP
 from rhodecode.controllers.utils import parse_path_ref
 from rhodecode.lib import diffs, helpers as h, caches
 from rhodecode.lib.compat import OrderedDict
+from rhodecode.lib.codeblocks import (
+    filenode_as_lines_tokens, filenode_as_annotated_lines_tokens)
 from rhodecode.lib.utils import jsonify, action_logger
 from rhodecode.lib.utils2 import (
     convert_line_endings, detect_mode, safe_str, str2bool)
@@ -221,9 +223,19 @@ class FilesController(BaseRepoController
         c.file_author = True
         c.file_tree = ''
         if c.file.is_file():
-            c.renderer = (
-                c.renderer and h.renderer_from_filename(c.file.path))
+            c.file_source_page = 'true'
             c.file_last_commit = c.file.last_commit
+            if c.file.size < self.cut_off_limit_file:
+                if c.annotate:  # annotation has precedence over renderer
+                    c.annotated_lines = filenode_as_annotated_lines_tokens(
+                        c.file
+                    )
+                else:
+                    c.renderer = (
+                        c.renderer and h.renderer_from_filename(c.file.path)
+                    )
+                    if not c.renderer:
+                        c.lines = filenode_as_lines_tokens(c.file)

             c.on_branch_head = self._is_valid_head(
                 commit_id, c.rhodecode_repo)
@@ -233,6 +245,7 @@ class FilesController(BaseRepoController
             c.authors = [(h.email(author),
                           h.person(author, 'username_or_name_or_email'))]
         else:
+            c.file_source_page = 'false'
             c.authors = []
             c.file_tree = self._get_tree_at_commit(
                 repo_name, c.commit.raw_id, f_path)
@@ -60,8 +60,7 @@ class ForksController(BaseRepoController
        c.repo_groups_choices = map(lambda k: unicode(k[0]), c.repo_groups)
        choices, c.landing_revs = ScmModel().get_repo_landing_revs()
        c.landing_revs_choices = choices
-       c.personal_repo_group = RepoGroup.get_by_group_name(
-           c.rhodecode_user.username)
+       c.personal_repo_group = c.rhodecode_user.personal_repo_group

    def __load_data(self, repo_name=None):
        """
@@ -286,4 +286,3 @@ class HomeController(BaseController):
         _user_groups = _user_groups

         return {'suggestions': _user_groups}
-
@@ -22,6 +22,7 @@
 pull requests controller for rhodecode for initializing pull requests
 """

+import peppercorn
 import formencode
 import logging

@@ -29,22 +30,26 @@ from webob.exc import HTTPNotFound, HTTP
 from pylons import request, tmpl_context as c, url
 from pylons.controllers.util import redirect
 from pylons.i18n.translation import _
+from pyramid.threadlocal import get_current_registry
 from sqlalchemy.sql import func
 from sqlalchemy.sql.expression import or_

 from rhodecode import events
-from rhodecode.lib import auth, diffs, helpers as h
+from rhodecode.lib import auth, diffs, helpers as h, codeblocks
 from rhodecode.lib.ext_json import json
 from rhodecode.lib.base import (
     BaseRepoController, render, vcs_operation_context)
 from rhodecode.lib.auth import (
     LoginRequired, HasRepoPermissionAnyDecorator, NotAnonymous,
     HasAcceptedRepoType, XHRRequired)
+from rhodecode.lib.channelstream import channelstream_request
+from rhodecode.lib.compat import OrderedDict
 from rhodecode.lib.utils import jsonify
 from rhodecode.lib.utils2 import safe_int, safe_str, str2bool, safe_unicode
-from rhodecode.lib.vcs.backends.base import EmptyCommit
+from rhodecode.lib.vcs.backends.base import EmptyCommit, UpdateFailureReason
 from rhodecode.lib.vcs.exceptions import (
-    EmptyRepositoryError, CommitDoesNotExistError, RepositoryRequirementError)
+    EmptyRepositoryError, CommitDoesNotExistError, RepositoryRequirementError,
+    NodeDoesNotExistError)
 from rhodecode.lib.diffs import LimitedDiffContainer
 from rhodecode.model.changeset_status import ChangesetStatusModel
 from rhodecode.model.comment import ChangesetCommentsModel
@@ -61,7 +66,7 @@ class PullrequestsController(BaseRepoCon
     def __before__(self):
         super(PullrequestsController, self).__before__()

-    def _load_compare_data(self, pull_request, enable_comments=True):
+    def _load_compare_data(self, pull_request, inline_comments, enable_comments=True):
         """
         Load context data needed for generating compare diff

@@ -114,6 +119,7 @@ class PullrequestsController(BaseRepoCon
         except RepositoryRequirementError:
             c.missing_requirements = True

+        c.changes = {}
         c.missing_commits = False
         if (c.missing_requirements or
             isinstance(source_commit, EmptyCommit) or
@@ -123,11 +129,31 @@ class PullrequestsController(BaseRepoCon
         else:
             vcs_diff = PullRequestModel().get_diff(pull_request)
             diff_processor = diffs.DiffProcessor(
-                vcs_diff, format='gitdiff', diff_limit=diff_limit,
+                vcs_diff, format='newdiff', diff_limit=diff_limit,
                 file_limit=file_limit, show_full_diff=c.fulldiff)
             _parsed = diff_processor.prepare()

-            c.limited_diff = isinstance(_parsed, LimitedDiffContainer)
+            commit_changes = OrderedDict()
+            _parsed = diff_processor.prepare()
+            c.limited_diff = isinstance(_parsed, diffs.LimitedDiffContainer)
+
+            _parsed = diff_processor.prepare()
+
+            def _node_getter(commit):
+                def get_node(fname):
+                    try:
+                        return commit.get_node(fname)
+                    except NodeDoesNotExistError:
+                        return None
+                return get_node
+
+            c.diffset = codeblocks.DiffSet(
+                repo_name=c.repo_name,
+                source_node_getter=_node_getter(target_commit),
+                target_node_getter=_node_getter(source_commit),
+                comments=inline_comments
+            ).render_patchset(_parsed, target_commit.raw_id, source_commit.raw_id)
+

         c.files = []
         c.changes = {}
@@ -136,17 +162,17 @@ class PullrequestsController(BaseRepoCon
         c.included_files = []
         c.deleted_files = []

-        for f in _parsed:
-            st = f['stats']
-            c.lines_added += st['added']
-            c.lines_deleted += st['deleted']
+        # for f in _parsed:
+        #     st = f['stats']
+        #     c.lines_added += st['added']
+        #     c.lines_deleted += st['deleted']

-            fid = h.FID('', f['filename'])
-            c.files.append([fid, f['operation'], f['filename'], f['stats']])
-            c.included_files.append(f['filename'])
-            html_diff = diff_processor.as_html(enable_comments=enable_comments,
-                                               parsed_lines=[f])
-            c.changes[fid] = [f['operation'], f['filename'], html_diff, f]
+        #     fid = h.FID('', f['filename'])
+        #     c.files.append([fid, f['operation'], f['filename'], f['stats']])
+        #     c.included_files.append(f['filename'])
+        #     html_diff = diff_processor.as_html(enable_comments=enable_comments,
+        #                                        parsed_lines=[f])
+        #     c.changes[fid] = [f['operation'], f['filename'], html_diff, f]

     def _extract_ordering(self, request):
         column_index = safe_int(request.GET.get('order[0][column]'))
@@ -295,10 +321,12 @@ class PullrequestsController(BaseRepoCon
             redirect(url('pullrequest_home', repo_name=source_repo.repo_name))

         default_target_repo = source_repo
-        if (source_repo.parent and
-                not source_repo.parent.scm_instance().is_empty()):
-            # change default if we have a parent repo
-            default_target_repo = source_repo.parent
+
+        if source_repo.parent:
+            parent_vcs_obj = source_repo.parent.scm_instance()
+            if parent_vcs_obj and not parent_vcs_obj.is_empty():
+                # change default if we have a parent repo
+                default_target_repo = source_repo.parent

         target_repo_data = PullRequestModel().generate_repo_data(
             default_target_repo)
@@ -360,7 +388,8 @@ class PullrequestsController(BaseRepoCon
             add_parent = False
             if repo.parent:
                 if filter_query in repo.parent.repo_name:
-                    if not repo.parent.scm_instance().is_empty():
+                    parent_vcs_obj = repo.parent.scm_instance()
+                    if parent_vcs_obj and not parent_vcs_obj.is_empty():
                         add_parent = True

             limit = 20 - 1 if add_parent else 20
@@ -397,8 +426,10 @@ class PullrequestsController(BaseRepoCon
         if not repo:
             raise HTTPNotFound

+        controls = peppercorn.parse(request.POST.items())
+
         try:
-            _form = PullRequestForm(repo.repo_id)().to_python(request.POST)
+            _form = PullRequestForm(repo.repo_id)().to_python(controls)
         except formencode.Invalid as errors:
             if errors.error_dict.get('revisions'):
                 msg = 'Revisions: %s' % errors.error_dict['revisions']
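The `peppercorn.parse(request.POST.items())` call introduced here turns the flat list of POST `(name, value)` pairs into nested structures, grouping fields that appear between `__start__`/`__end__` marker fields. A simplified stand-in makes the idea concrete (this is a sketch of sequence handling only, not the real peppercorn implementation, which also supports mappings and field renames):

```python
def parse_structured(fields):
    """Group flat (name, value) pairs into lists between
    __start__/__end__ markers, peppercorn-style (sequences only)."""
    result = {}
    i = 0
    while i < len(fields):
        name, value = fields[i]
        if name == '__start__':
            # marker value looks like 'review_members:sequence'
            key = value.split(':')[0]
            items = []
            i += 1
            while fields[i][0] != '__end__':
                items.append(fields[i][1])
                i += 1
            result[key] = items
        else:
            result[name] = value
        i += 1
    return result


fields = [
    ('csrf_token', 'abc123'),
    ('__start__', 'review_members:sequence'),
    ('member', '2'),
    ('member', '5'),
    ('__end__', 'review_members:sequence'),
]
print(parse_structured(fields))
```

This is why the controller can later check `'review_members' in controls` and hand the whole list to `_update_reviewers`, instead of reassembling repeated form fields by hand.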
@@ -417,7 +448,8 b' class PullrequestsController(BaseRepoCon' | |||||
417 | target_repo = _form['target_repo'] |
|
448 | target_repo = _form['target_repo'] | |
418 | target_ref = _form['target_ref'] |
|
449 | target_ref = _form['target_ref'] | |
419 | commit_ids = _form['revisions'][::-1] |
|
450 | commit_ids = _form['revisions'][::-1] | |
420 |
reviewers = |
|
451 | reviewers = [ | |
|
452 | (r['user_id'], r['reasons']) for r in _form['review_members']] | |||
421 |
|
453 | |||
422 | # find the ancestor for this pr |
|
454 | # find the ancestor for this pr | |
423 | source_db_repo = Repository.get_by_repo_name(_form['source_repo']) |
|
455 | source_db_repo = Repository.get_by_repo_name(_form['source_repo']) | |
@@ -476,8 +508,11 @@ class PullrequestsController(BaseRepoCon
         allowed_to_update = PullRequestModel().check_user_update(
             pull_request, c.rhodecode_user)
         if allowed_to_update:
-            if 'reviewers_ids' in request.POST:
-                self._update_reviewers(pull_request_id)
+            controls = peppercorn.parse(request.POST.items())
+
+            if 'review_members' in controls:
+                self._update_reviewers(
+                    pull_request_id, controls['review_members'])
             elif str2bool(request.POST.get('update_commits', 'false')):
                 self._update_commits(pull_request)
             elif str2bool(request.POST.get('close_pull_request', 'false')):
@@ -506,32 +541,51 @@ class PullrequestsController(BaseRepoCon
             return
 
     def _update_commits(self, pull_request):
-        try:
-            if PullRequestModel().has_valid_update_type(pull_request):
-                updated_version, changes = PullRequestModel().update_commits(
-                    pull_request)
-                if updated_version:
-                    msg = _(
-                        u'Pull request updated to "{source_commit_id}" with '
-                        u'{count_added} added, {count_removed} removed '
-                        u'commits.'
-                    ).format(
-                        source_commit_id=pull_request.source_ref_parts.commit_id,
-                        count_added=len(changes.added),
-                        count_removed=len(changes.removed))
-                    h.flash(msg, category='success')
-                else:
-                    h.flash(_("Nothing changed in pull request."),
-                            category='warning')
-            else:
-                msg = _(
-                    u"Skipping update of pull request due to reference "
-                    u"type: {reference_type}"
-                ).format(reference_type=pull_request.source_ref_parts.type)
-                h.flash(msg, category='warning')
-        except CommitDoesNotExistError:
-            h.flash(
-                _(u'Update failed due to missing commits.'), category='error')
+        resp = PullRequestModel().update_commits(pull_request)
+
+        if resp.executed:
+            msg = _(
+                u'Pull request updated to "{source_commit_id}" with '
+                u'{count_added} added, {count_removed} removed commits.')
+            msg = msg.format(
+                source_commit_id=pull_request.source_ref_parts.commit_id,
+                count_added=len(resp.changes.added),
+                count_removed=len(resp.changes.removed))
+            h.flash(msg, category='success')
+
+            registry = get_current_registry()
+            rhodecode_plugins = getattr(registry, 'rhodecode_plugins', {})
+            channelstream_config = rhodecode_plugins.get('channelstream', {})
+            if channelstream_config.get('enabled'):
+                message = msg + (
+                    ' - <a onclick="window.location.reload()">'
+                    '<strong>{}</strong></a>'.format(_('Reload page')))
+                channel = '/repo${}$/pr/{}'.format(
+                    pull_request.target_repo.repo_name,
+                    pull_request.pull_request_id
+                )
+                payload = {
+                    'type': 'message',
+                    'user': 'system',
+                    'exclude_users': [request.user.username],
+                    'channel': channel,
+                    'message': {
+                        'message': message,
+                        'level': 'success',
+                        'topic': '/notifications'
+                    }
+                }
+                channelstream_request(
+                    channelstream_config, [payload], '/message',
+                    raise_exc=False)
+        else:
+            msg = PullRequestModel.UPDATE_STATUS_MESSAGES[resp.reason]
+            warning_reasons = [
+                UpdateFailureReason.NO_CHANGE,
+                UpdateFailureReason.WRONG_REF_TPYE,
+            ]
+            category = 'warning' if resp.reason in warning_reasons else 'error'
+            h.flash(msg, category=category)
 
     @auth.CSRFRequired()
     @LoginRequired()
@@ -601,11 +655,10 @@ class PullrequestsController(BaseRepoCon
                 merge_resp.failure_reason)
             h.flash(msg, category='error')
 
-    def _update_reviewers(self, pull_request_id):
-        reviewers
-            lambda v: v not in [None, ''],
-            request.POST.get('reviewers_ids', '').split(',')))
-        PullRequestModel().update_reviewers(pull_request_id, reviewers_ids)
+    def _update_reviewers(self, pull_request_id, review_members):
+        reviewers = [
+            (int(r['user_id']), r['reasons']) for r in review_members]
+        PullRequestModel().update_reviewers(pull_request_id, reviewers)
         Session().commit()
 
     def _reject_close(self, pull_request):
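The `_update_reviewers` rewrite above consumes the nested structure that `peppercorn.parse` rebuilds from flat form POST pairs. The extraction step itself is plain Python; the sketch below uses a hypothetical `controls` dict standing in for real parsed form data (the keys mirror the diff, the values are made up):

```python
# Hypothetical shape of what peppercorn.parse() could return for the
# pull-request update form: a nested dict built from flat POST pairs.
controls = {
    'review_members': [
        {'user_id': '2', 'reasons': ['source code owner']},
        {'user_id': '5', 'reasons': []},
    ]
}

# Same extraction as _update_reviewers in the diff:
# (user_id, reasons) tuples, with user_id coerced to int.
reviewers = [
    (int(r['user_id']), r['reasons']) for r in controls['review_members']]
print(reviewers)  # [(2, ['source code owner']), (5, [])]
```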
@@ -655,6 +708,10 @@ class PullrequestsController(BaseRepoCon
             c.pull_request, c.rhodecode_user) and not c.pull_request.is_closed()
         c.allowed_to_merge = PullRequestModel().check_user_merge(
             c.pull_request, c.rhodecode_user) and not c.pull_request.is_closed()
+        c.shadow_clone_url = PullRequestModel().get_shadow_clone_url(
+            c.pull_request)
+        c.allowed_to_delete = PullRequestModel().check_user_delete(
+            c.pull_request, c.rhodecode_user) and not c.pull_request.is_closed()
 
         cc_model = ChangesetCommentsModel()
 
@@ -669,23 +726,13 @@ class PullrequestsController(BaseRepoCon
         c.pr_merge_status = False
         # load compare data into template context
         enable_comments = not c.pull_request.is_closed()
-        self._load_compare_data(c.pull_request, enable_comments=enable_comments)
 
-        # this is a hack to properly display links, when creating PR, the
-        # compare view and others uses different notation, and
-        # compare_commits.html renders links based on the target_repo.
-        # We need to swap that here to generate it properly on the html side
-        c.target_repo = c.source_repo
 
         # inline comments
-        c.inline_cnt = 0
         c.inline_comments = cc_model.get_inline_comments(
             c.rhodecode_db_repo.repo_id,
             pull_request=pull_request_id)
-
-        for __, lines in c.inline_comments:
-            for comments in lines.values():
-                c.inline_cnt += len(comments)
+        c.inline_cnt = len(c.inline_comments)
 
         # outdated comments
         c.outdated_cnt = 0
@@ -702,6 +749,15 @@ class PullrequestsController(BaseRepoCon
         else:
             c.outdated_comments = {}
 
+        self._load_compare_data(
+            c.pull_request, c.inline_comments, enable_comments=enable_comments)
+
+        # this is a hack to properly display links, when creating PR, the
+        # compare view and others uses different notation, and
+        # compare_commits.html renders links based on the target_repo.
+        # We need to swap that here to generate it properly on the html side
+        c.target_repo = c.source_repo
+
         # comments
         c.comments = cc_model.get_comments(c.rhodecode_db_repo.repo_id,
                                            pull_request=pull_request_id)
@@ -251,6 +251,16 @@ class SummaryController(BaseRepoControll
             }
         return data
 
+    @LoginRequired()
+    @HasRepoPermissionAnyDecorator('repository.read', 'repository.write',
+                                   'repository.admin')
+    @jsonify
+    def repo_default_reviewers_data(self, repo_name):
+        return {
+            'reviewers': [utils.reviewer_as_json(
+                user=c.rhodecode_db_repo.user, reasons=None)]
+        }
+
     @jsonify
     def repo_refs_changelog_data(self, repo_name):
         repo = c.rhodecode_repo
@@ -86,3 +86,21 @@ def get_commit_from_ref_name(repo, ref_n
             '%s "%s" does not exist' % (ref_type, ref_name))
 
     return repo_scm.get_commit(commit_id)
+
+
+def reviewer_as_json(user, reasons):
+    """
+    Returns json struct of a reviewer for frontend
+
+    :param user: the reviewer
+    :param reasons: list of strings of why they are reviewers
+    """
+
+    return {
+        'user_id': user.user_id,
+        'reasons': reasons,
+        'username': user.username,
+        'firstname': user.firstname,
+        'lastname': user.lastname,
+        'gravatar_link': h.gravatar_url(user.email, 14),
+    }
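The new `reviewer_as_json` helper simply flattens a user row into the dict the frontend consumes. A standalone sketch with a stubbed user object and a placeholder string in place of `h.gravatar_url` (both are hypothetical stand-ins, not the real RhodeCode objects):

```python
# Stand-in for the DB user object; attribute names mirror the diff.
class StubUser(object):
    def __init__(self, user_id, username, firstname, lastname, email):
        self.user_id = user_id
        self.username = username
        self.firstname = firstname
        self.lastname = lastname
        self.email = email


def reviewer_as_json(user, reasons):
    # the gravatar link is faked here; the real helper calls
    # h.gravatar_url(user.email, 14)
    return {
        'user_id': user.user_id,
        'reasons': reasons,
        'username': user.username,
        'firstname': user.firstname,
        'lastname': user.lastname,
        'gravatar_link': 'gravatar://{}?s=14'.format(user.email),
    }


data = reviewer_as_json(
    StubUser(2, 'admin', 'Admin', 'User', 'admin@example.com'), reasons=None)
print(data['username'])  # admin
```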
@@ -48,6 +48,7 @@ from rhodecode.events.base import Rhodec
 
 from rhodecode.events.user import (  # noqa
     UserPreCreate,
+    UserPostCreate,
     UserPreUpdate,
     UserRegistered
 )
@@ -51,6 +51,19 @@ class UserPreCreate(RhodecodeEvent):
         self.user_data = user_data
 
 
+@implementer(IUserPreCreate)
+class UserPostCreate(RhodecodeEvent):
+    """
+    An instance of this class is emitted as an :term:`event` after a new user
+    object is created.
+    """
+    name = 'user-post-create'
+    display_name = lazy_ugettext('user post create')
+
+    def __init__(self, user_data):
+        self.user_data = user_data
+
+
 @implementer(IUserPreUpdate)
 class UserPreUpdate(RhodecodeEvent):
     """
@@ -161,7 +161,7 @@ class HipchatIntegrationType(Integration
         comment_text = data['comment']['text']
         if len(comment_text) > 200:
             comment_text = '{comment_text}<a href="{comment_url}">...<a/>'.format(
-                comment_text=comment_text[:200],
+                comment_text=h.html_escape(comment_text[:200]),
                 comment_url=data['comment']['url'],
             )
 
@@ -179,8 +179,8 @@ class HipchatIntegrationType(Integration
                 number=data['pullrequest']['pull_request_id'],
                 pr_url=data['pullrequest']['url'],
                 pr_status=data['pullrequest']['status'],
-                pr_title=data['pullrequest']['title'],
-                comment_text=comment_text
+                pr_title=h.html_escape(data['pullrequest']['title']),
+                comment_text=h.html_escape(comment_text)
             )
         )
 
@@ -193,7 +193,7 @@ class HipchatIntegrationType(Integration
                 number=data['pullrequest']['pull_request_id'],
                 pr_url=data['pullrequest']['url'],
                 pr_status=data['pullrequest']['status'],
-                pr_title=data['pullrequest']['title'],
+                pr_title=h.html_escape(data['pullrequest']['title']),
             )
         )
 
@@ -206,21 +206,20 @@ class HipchatIntegrationType(Integration
         }.get(event.__class__, str(event.__class__))
 
         return ('Pull request <a href="{url}">#{number}</a> - {title} '
-                '{action} by {user}').format(
+                '{action} by <b>{user}</b>').format(
             user=data['actor']['username'],
             number=data['pullrequest']['pull_request_id'],
             url=data['pullrequest']['url'],
-            title=data['pullrequest']['title'],
+            title=h.html_escape(data['pullrequest']['title']),
             action=action
         )
 
     def format_repo_push_event(self, data):
         branch_data = {branch['name']: branch
                        for branch in data['push']['branches']}
 
         branches_commits = {}
         for commit in data['push']['commits']:
-            log.critical(commit)
             if commit['branch'] not in branches_commits:
                 branch_commits = {'branch': branch_data[commit['branch']],
                                   'commits': []}
@@ -238,7 +237,7 @@ class HipchatIntegrationType(Integration
     def format_repo_create_event(self, data):
         return '<a href="{}">{}</a> ({}) repository created by <b>{}</b>'.format(
             data['repo']['url'],
-            data['repo']['repo_name'],
+            h.html_escape(data['repo']['repo_name']),
             data['repo']['repo_type'],
             data['actor']['username'],
         )
@@ -173,7 +173,7 @@ class SlackIntegrationType(IntegrationTy
 
         return (textwrap.dedent(
             '''
-            {user} commented on pull request <{pr_url}|#{number}> - {pr_title}:
+            *{user}* commented on pull request <{pr_url}|#{number}> - {pr_title}:
             >>> {comment_status}{comment_text}
             ''').format(
             comment_status=comment_status,
@@ -208,7 +208,7 @@ class SlackIntegrationType(IntegrationTy
         }.get(event.__class__, str(event.__class__))
 
         return ('Pull request <{url}|#{number}> - {title} '
-                '{action} by {user}').format(
+                '`{action}` by *{user}*').format(
             user=data['actor']['username'],
             number=data['pullrequest']['pull_request_id'],
             url=data['pullrequest']['url'],
@@ -218,11 +218,10 @@ class SlackIntegrationType(IntegrationTy
 
     def format_repo_push_event(self, data):
         branch_data = {branch['name']: branch
                        for branch in data['push']['branches']}
 
         branches_commits = {}
         for commit in data['push']['commits']:
-            log.critical(commit)
             if commit['branch'] not in branches_commits:
                 branch_commits = {'branch': branch_data[commit['branch']],
                                   'commits': []}
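Both the Hipchat and Slack `format_repo_push_event` methods above group pushed commits by branch before rendering. The grouping loop can be run standalone; the `commits` and `branch_data` payloads below are hypothetical sample data shaped like the event dicts the integrations receive:

```python
from collections import OrderedDict

# Hypothetical push payload, shaped like data['push'] in the diff.
commits = [
    {'branch': 'default', 'raw_id': 'aaa1'},
    {'branch': 'stable', 'raw_id': 'bbb2'},
    {'branch': 'default', 'raw_id': 'ccc3'},
]
branch_data = {'default': {'name': 'default'}, 'stable': {'name': 'stable'}}

# Same grouping loop as format_repo_push_event: bucket commits per branch,
# keeping the branch metadata alongside its commit list.
branches_commits = OrderedDict()
for commit in commits:
    if commit['branch'] not in branches_commits:
        branches_commits[commit['branch']] = {
            'branch': branch_data[commit['branch']], 'commits': []}
    branches_commits[commit['branch']]['commits'].append(commit)

summary = [(b, len(v['commits'])) for b, v in branches_commits.items()]
print(summary)  # [('default', 2), ('stable', 1)]
```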
@@ -19,13 +19,15 @@
 # and proprietary license terms, please see https://rhodecode.com/licenses/
 
 from __future__ import unicode_literals
+import string
+from collections import OrderedDict
 
 import deform
 import logging
 import requests
 import colander
 from celery.task import task
-from mako.template import Template
+from requests.packages.urllib3.util.retry import Retry
 
 from rhodecode import events
 from rhodecode.translation import _
@@ -33,12 +35,127 @@ from rhodecode.integrations.types.base i
 
 log = logging.getLogger(__name__)
 
+# updating this required to update the `base_vars` passed in url calling func
+WEBHOOK_URL_VARS = [
+    'repo_name',
+    'repo_type',
+    'repo_id',
+    'repo_url',
+
+    # special attrs below that we handle, using multi-call
+    'branch',
+    'commit_id',
+
+    # pr events vars
+    'pull_request_id',
+    'pull_request_url',
+
+]
+URL_VARS = ', '.join('${' + x + '}' for x in WEBHOOK_URL_VARS)
+
+
+class WebhookHandler(object):
+    def __init__(self, template_url, secret_token):
+        self.template_url = template_url
+        self.secret_token = secret_token
+
+    def get_base_parsed_template(self, data):
+        """
+        initially parses the passed in template with some common variables
+        available on ALL calls
+        """
+        # note: make sure to update the `WEBHOOK_URL_VARS` if this changes
+        common_vars = {
+            'repo_name': data['repo']['repo_name'],
+            'repo_type': data['repo']['repo_type'],
+            'repo_id': data['repo']['repo_id'],
+            'repo_url': data['repo']['url'],
+        }
+
+        return string.Template(
+            self.template_url).safe_substitute(**common_vars)
+
+    def repo_push_event_handler(self, event, data):
+        url = self.get_base_parsed_template(data)
+        url_cals = []
+        branch_data = OrderedDict()
+        for obj in data['push']['branches']:
+            branch_data[obj['name']] = obj
+
+        branches_commits = OrderedDict()
+        for commit in data['push']['commits']:
+            if commit['branch'] not in branches_commits:
+                branch_commits = {'branch': branch_data[commit['branch']],
+                                  'commits': []}
+                branches_commits[commit['branch']] = branch_commits
+
+            branch_commits = branches_commits[commit['branch']]
+            branch_commits['commits'].append(commit)
+
+        if '${branch}' in url:
+            # call it multiple times, for each branch if used in variables
+            for branch, commit_ids in branches_commits.items():
+                branch_url = string.Template(url).safe_substitute(branch=branch)
+                # call further down for each commit if used
+                if '${commit_id}' in branch_url:
+                    for commit_data in commit_ids['commits']:
+                        commit_id = commit_data['raw_id']
+                        commit_url = string.Template(branch_url).safe_substitute(
+                            commit_id=commit_id)
+                        # register per-commit call
+                        log.debug(
+                            'register webhook call(%s) to url %s', event, commit_url)
+                        url_cals.append((commit_url, self.secret_token, data))
+
+                else:
+                    # register per-branch call
+                    log.debug(
+                        'register webhook call(%s) to url %s', event, branch_url)
+                    url_cals.append((branch_url, self.secret_token, data))
+
+        else:
+            log.debug(
+                'register webhook call(%s) to url %s', event, url)
+            url_cals.append((url, self.secret_token, data))
+
+        return url_cals
+
+    def repo_create_event_handler(self, event, data):
+        url = self.get_base_parsed_template(data)
+        log.debug(
+            'register webhook call(%s) to url %s', event, url)
+        return [(url, self.secret_token, data)]
+
+    def pull_request_event_handler(self, event, data):
+        url = self.get_base_parsed_template(data)
+        log.debug(
+            'register webhook call(%s) to url %s', event, url)
+        url = string.Template(url).safe_substitute(
+            pull_request_id=data['pullrequest']['pull_request_id'],
+            pull_request_url=data['pullrequest']['url'])
+        return [(url, self.secret_token, data)]
+
+    def __call__(self, event, data):
+        if isinstance(event, events.RepoPushEvent):
+            return self.repo_push_event_handler(event, data)
+        elif isinstance(event, events.RepoCreateEvent):
+            return self.repo_create_event_handler(event, data)
+        elif isinstance(event, events.PullRequestEvent):
+            return self.pull_request_event_handler(event, data)
+        else:
+            raise ValueError('event type not supported: %s' % events)
+
 
 class WebhookSettingsSchema(colander.Schema):
     url = colander.SchemaNode(
         colander.String(),
         title=_('Webhook URL'),
-        description=_('URL of the webhook to receive POST event.'),
+        description=
+            _('URL of the webhook to receive POST event. Following variables '
+              'are allowed to be used: {vars}. Some of the variables would '
+              'trigger multiple calls, like ${{branch}} or ${{commit_id}}. '
+              'Webhook will be called as many times as unique objects in '
+              'data in such cases.').format(vars=URL_VARS),
         missing=colander.required,
         required=True,
         validator=colander.url,
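The `WebhookHandler` above relies on `string.Template.safe_substitute`, which leaves unknown `${...}` placeholders untouched so a later pass can fill them in per branch and then per commit. A standalone sketch of that multi-pass expansion; the hook URL and commit ids are hypothetical examples:

```python
import string

# Hypothetical template URL using the same ${...} variables as the diff.
template = 'http://hooks.example.com/${repo_name}/${branch}/${commit_id}'

# First pass: common variables only; safe_substitute leaves ${branch}
# and ${commit_id} intact instead of raising KeyError.
url = string.Template(template).safe_substitute(repo_name='myrepo')
assert '${branch}' in url and '${commit_id}' in url

# Later passes: expand once per branch, then once per commit, so one
# template can fan out into several webhook calls.
calls = []
for branch, commit_ids in [('default', ['deadbeef', 'cafebabe'])]:
    branch_url = string.Template(url).safe_substitute(branch=branch)
    for commit_id in commit_ids:
        calls.append(
            string.Template(branch_url).safe_substitute(commit_id=commit_id))

print(calls)
# ['http://hooks.example.com/myrepo/default/deadbeef',
#  'http://hooks.example.com/myrepo/default/cafebabe']
```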
@@ -58,8 +175,6 @@ class WebhookSettingsSchema(colander.Sch
     )
 
 
-
-
 class WebhookIntegrationType(IntegrationTypeBase):
     key = 'webhook'
     display_name = _('Webhook')
@@ -104,14 +219,30 @@ class WebhookIntegrationType(Integration
             return
 
         data = event.as_dict()
-        post_to_webhook(data, self.settings)
+        template_url = self.settings['url']
+
+        handler = WebhookHandler(template_url, self.settings['secret_token'])
+        url_calls = handler(event, data)
+        log.debug('webhook: calling following urls: %s',
+                  [x[0] for x in url_calls])
+        post_to_webhook(url_calls)
 
 
 @task(ignore_result=True)
-def post_to_webhook(
-    log.debug('sending event:%s to webhook %s', data['name'], settings['url'])
-    resp = requests.post(settings['url'], json={
-        'token': settings['secret_token'],
-        'event': data
-    })
-    resp.raise_for_status()  # raise exception on a failed request
+def post_to_webhook(url_calls):
+    max_retries = 3
+    for url, token, data in url_calls:
+        # retry max N times
+        retries = Retry(
+            total=max_retries,
+            backoff_factor=0.15,
+            status_forcelist=[500, 502, 503, 504])
+        req_session = requests.Session()
+        req_session.mount(
+            'http://', requests.adapters.HTTPAdapter(max_retries=retries))
+
+        resp = req_session.post(url, json={
+            'token': token,
+            'event': data
+        })
+        resp.raise_for_status()  # raise exception on a failed request
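The rewritten `post_to_webhook` delegates retrying to urllib3's `Retry` (3 attempts, exponential backoff, retry only on 5xx statuses). The same give-up policy can be sketched in plain Python without the `requests` dependency; `do_post` below is a hypothetical stand-in for the real HTTP call, returning a status code:

```python
import time

# Plain-Python sketch of the retry policy the diff configures via
# urllib3's Retry: up to max_retries attempts, exponential backoff,
# retrying only on the listed 5xx statuses.
def post_with_retries(do_post, max_retries=3, backoff_factor=0.15,
                      status_forcelist=(500, 502, 503, 504)):
    for attempt in range(max_retries):
        status = do_post()
        if status not in status_forcelist:
            return status
        time.sleep(backoff_factor * (2 ** attempt))
    return status  # last failing status after exhausting retries

# Simulated endpoint: fails twice with 503, then succeeds.
responses = iter([503, 503, 200])
result = post_with_retries(lambda: next(responses))
print(result)  # 200
```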
@@ -49,7 +49,7 @@ from rhodecode.model.meta import Session
 from rhodecode.model.user import UserModel
 from rhodecode.model.db import (
     User, Repository, Permission, UserToPerm, UserGroupToPerm, UserGroupMember,
-    UserIpMap, UserApiKeys)
+    UserIpMap, UserApiKeys, RepoGroup)
 from rhodecode.lib import caches
 from rhodecode.lib.utils2 import safe_unicode, aslist, safe_str, md5
 from rhodecode.lib.utils import (
@@ -983,6 +983,9 @@ class AuthUser(object):
         inherit = self.inherit_default_permissions
         return AuthUser.check_ip_allowed(self.user_id, self.ip_addr,
                                          inherit_from_default=inherit)
+    @property
+    def personal_repo_group(self):
+        return RepoGroup.get_user_personal_repo_group(self.user_id)
 
     @classmethod
     def check_ip_allowed(cls, user_id, ip_addr, inherit_from_default):
@@ -163,7 +163,8 @@ def get_access_path(environ):
 
 
 def vcs_operation_context(
-        environ, repo_name, username, action, scm, check_locking=True
+        environ, repo_name, username, action, scm, check_locking=True,
+        is_shadow_repo=False):
     """
     Generate the context for a vcs operation, e.g. push or pull.
 
@@ -200,6 +201,7 @@ def vcs_operation_context(
         'locked_by': locked_by,
         'server_url': utils2.get_server_url(environ),
         'hooks': get_enabled_hook_classes(ui_settings),
+        'is_shadow_repo': is_shadow_repo,
     }
     return extras
 
@@ -363,6 +365,18 @@ def attach_context_attributes(context, r
     # Fix this and remove it from base controller.
     # context.repo_name = get_repo_slug(request)  # can be empty
 
+    diffmode = 'sideside'
+    if request.GET.get('diffmode'):
+        if request.GET['diffmode'] == 'unified':
+            diffmode = 'unified'
+    elif request.session.get('diffmode'):
+        diffmode = request.session['diffmode']
+
+    context.diffmode = diffmode
+
+    if request.session.get('diffmode') != diffmode:
+        request.session['diffmode'] = diffmode
+
     context.csrf_token = auth.get_csrf_token()
     context.backends = rhodecode.BACKENDS.keys()
     context.backends.sort()
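The diffmode block added to `attach_context_attributes` is a three-way preference fallback: an explicit `?diffmode=unified` query parameter wins, else the value remembered in the session, else the `'sideside'` default, with the result persisted back into the session. A self-contained sketch where plain dicts stand in for the framework's `request.GET` and `request.session` objects:

```python
# Sketch of the diffmode preference fallback from the diff; get_params
# and session are plain dicts standing in for request objects.
def resolve_diffmode(get_params, session):
    diffmode = 'sideside'
    if get_params.get('diffmode'):
        if get_params['diffmode'] == 'unified':
            diffmode = 'unified'
    elif session.get('diffmode'):
        diffmode = session['diffmode']

    # persist the choice so later requests without the parameter keep it
    if session.get('diffmode') != diffmode:
        session['diffmode'] = diffmode
    return diffmode

session = {}
print(resolve_diffmode({'diffmode': 'unified'}, session))  # unified
print(resolve_diffmode({}, session))                       # unified (remembered)
```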
@@ -24,9 +24,10 @@ import logging
 import threading
 
 from beaker.cache import _cache_decorate, cache_regions, region_invalidate
+from sqlalchemy.exc import IntegrityError
 
 from rhodecode.lib.utils import safe_str, md5
 from rhodecode.model.db import Session, CacheKey
 
 log = logging.getLogger(__name__)
 
@@ -18,16 +18,14 @@
 # RhodeCode Enterprise Edition, including its added features, Support services,
 # and proprietary license terms, please see https://rhodecode.com/licenses/
 
+import hashlib
+import itsdangerous
 import logging
 import os
-
-import itsdangerous
 import requests
-
 from dogpile.core import ReadWriteMutex
 
 import rhodecode.lib.helpers as h
-
 from rhodecode.lib.auth import HasRepoPermissionAny
 from rhodecode.lib.ext_json import json
 from rhodecode.model.db import User
@@ -81,7 +79,7 @@ def get_user_data(user_id):
         'username': user.username,
         'first_name': user.name,
         'last_name': user.lastname,
-        'icon_link': h.gravatar_url(user.email,
+        'icon_link': h.gravatar_url(user.email, 60),
         'display_name': h.person(user, 'username_or_name_or_email'),
         'display_link': h.link_to_user(user),
         'notifications': user.user_data.get('notification_status', True)
@@ -169,7 +167,9 b' def parse_channels_info(info_result, inc' | |||||
169 |
|
167 | |||
170 |
|
168 | |||
171 | def log_filepath(history_location, channel_name): |
|
169 | def log_filepath(history_location, channel_name): | |
172 | filename = '{}.log'.format(channel_name.encode('hex')) |
|
170 | hasher = hashlib.sha256() | |
|
171 | hasher.update(channel_name.encode('utf8')) | |||
|
172 | filename = '{}.log'.format(hasher.hexdigest()) | |||
173 | filepath = os.path.join(history_location, filename) |
|
173 | filepath = os.path.join(history_location, filename) | |
174 | return filepath |
|
174 | return filepath | |
175 |
|
175 |
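The `log_filepath` change above replaces `channel_name.encode('hex')`, a Python-2-only codec, with an explicit SHA-256 digest. The rewritten function can be run standalone (the `/var/log/channels` path is only an example):

```python
import hashlib
import os

def log_filepath(history_location, channel_name):
    # Hash the channel name so any unicode name maps to a fixed-length,
    # filesystem-safe filename. str.encode('hex') from the old version
    # does not exist on Python 3; an explicit sha256 hexdigest works on
    # both Python 2 and 3 and also avoids unbounded filename lengths.
    hasher = hashlib.sha256()
    hasher.update(channel_name.encode('utf8'))
    filename = '{}.log'.format(hasher.hexdigest())
    return os.path.join(history_location, filename)

path = log_filepath('/var/log/channels', u'general')
```

The hashing is deterministic, so the same channel always resolves to the same log file across processes and restarts.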
NO CONTENT: modified file
The requested commit or file is too big and content was truncated.
NO CONTENT: modified file, binary diff hidden
NO CONTENT: file was removed
General Comments 0