@@ -0,0 +1,137 b''
.. _svn-http:

|svn| With Write Over HTTP
^^^^^^^^^^^^^^^^^^^^^^^^^^

To use |svn| with read/write support over the |svn| HTTP protocol, you have
to configure the HTTP |svn| backend.

Prerequisites
=============

- Enable HTTP support inside the admin VCS settings on your |RCE| instance
- You need to install the following tools on the machine that is running an
  instance of |RCE|:
  ``Apache HTTP Server`` and ``mod_dav_svn``.


Using the Ubuntu 14.04 distribution as an example, execute the following:

.. code-block:: bash

    $ sudo apt-get install apache2 libapache2-mod-svn

Once installed, you need to enable ``dav_svn``:

.. code-block:: bash

    $ sudo a2enmod dav_svn
    $ sudo a2enmod headers


Configuring Apache Setup
========================

.. tip::

    It is recommended to run Apache on a port other than 80, due to possible
    conflicts with other HTTP servers like nginx. To do this, set the
    ``Listen`` parameter in the ``/etc/apache2/ports.conf`` file, for example
    ``Listen 8090``.


.. warning::

    Make sure the Apache instance which runs the mod_dav_svn module is
    only accessible by |RCE|. Otherwise, anyone is able to browse
    the repositories or run Subversion operations (checkout/commit/etc.).

It is also recommended to run Apache as the same user as |RCE|; otherwise,
permission issues could occur. To do this, edit the ``/etc/apache2/envvars``
file:

.. code-block:: apache

    export APACHE_RUN_USER=rhodecode
    export APACHE_RUN_GROUP=rhodecode
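
Changes to ``/etc/apache2/envvars`` only take effect after Apache is
restarted. A minimal sketch, assuming the standard Ubuntu service layout
used in the examples above:

.. code-block:: bash

    $ sudo /etc/init.d/apache2 restart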

1. To configure Apache, create and edit a virtual hosts file, for example
   :file:`/etc/apache2/sites-available/default.conf`. Below is an example of
   how to use one with the auto-generated ``mod_dav_svn.conf`` config file
   from a configured |RCE| instance.

   .. code-block:: apache

       <VirtualHost *:8090>
           ServerAdmin rhodecode-admin@localhost
           DocumentRoot /var/www/html
           ErrorLog ${APACHE_LOG_DIR}/error.log
           CustomLog ${APACHE_LOG_DIR}/access.log combined
           Include /home/user/.rccontrol/enterprise-1/mod_dav_svn.conf
       </VirtualHost>


2. Go to the :menuselection:`Admin --> Settings --> VCS` page, enable
   :guilabel:`Proxy Subversion HTTP requests`, and specify the
   :guilabel:`Subversion HTTP Server URL`.
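
   For example, assuming the virtual host above is used (host name and port
   are illustrative), the server URL would typically be:

   .. code-block:: text

       http://localhost:8090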

3. Open the |RCE| configuration file,
   :file:`/home/{user}/.rccontrol/{instance-id}/rhodecode.ini`

4. Add the following configuration option in the ``[app:main]``
   section if it is not already present.

   This enables mapping of the created |RCE| repo groups into special
   |svn| paths. Each time a new repository group is created, the system will
   update the template file and create a new mapping. The Apache web server
   needs to be reloaded to pick up the changes to this file.
   To do this, simply configure ``svn.proxy.reload_cmd`` inside the .ini file.
   Example configuration:

   .. code-block:: ini

       ############################################################
       ### Subversion proxy support (mod_dav_svn)               ###
       ### Maps RhodeCode repo groups into SVN paths for Apache ###
       ############################################################
       ## Enable or disable the config file generation.
       svn.proxy.generate_config = true
       ## Generate config file with `SVNListParentPath` set to `On`.
       svn.proxy.list_parent_path = true
       ## Set location and file name of generated config file.
       svn.proxy.config_file_path = %(here)s/mod_dav_svn.conf
       ## Used as a prefix to the <Location> block in the generated config file.
       ## In most cases it should be set to `/`.
       svn.proxy.location_root = /
       ## Command to reload the mod dav svn configuration on change.
       ## Example: `/etc/init.d/apache2 reload`
       svn.proxy.reload_cmd = /etc/init.d/apache2 reload
       ## If the timeout expires before the reload command finishes, the command
       ## will be killed. Setting it to zero means no timeout. Defaults to 10
       ## seconds.
       #svn.proxy.reload_timeout = 10


This would create a special template file called ``mod_dav_svn.conf``. We
used that file path in the Apache config above, inside the ``Include``
statement. It's also possible to generate the config from the
:menuselection:`Admin --> Settings --> VCS` page.
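
The exact contents of ``mod_dav_svn.conf`` are generated by |RCE| from its
own template. A minimal sketch of what such a file can look like (the
repository path is illustrative):

.. code-block:: apache

    <Location />
        DAV svn
        SVNParentPath /path/to/repository/groups
        SVNListParentPath On
    </Location>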


Using |svn|
===========

Once |svn| has been enabled on your instance, you can use it with the
following examples. For more |svn| information, see the `Subversion Red Book`_.

.. code-block:: bash

    # To check out a repository
    svn checkout http://my-svn-server.example.com/my-svn-repo

    # To commit changes
    svn commit
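
A typical edit cycle, using standard |svn| commands (the file name is
illustrative), could then look like:

.. code-block:: bash

    cd my-svn-repo
    svn add new-file.txt
    svn commit -m "Add new-file.txt"
    svn update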


.. _Subversion Red Book: http://svnbook.red-bean.com/en/1.7/svn-book.html#svn.ref.svn
@@ -0,0 +1,17 b''
.. _checklist-pull-request:

=======================
Pull Request Checklists
=======================



Checklists for Pull Request
===========================


- Informative description
- Linear commit history
- Rebased on top of latest changes
- Add ticket references, e.g. fixes #123, references #123, etc.
NO CONTENT: new file 100644, binary diff hidden
@@ -0,0 +1,159 b''
|RCE| 4.5.0 |RNS|
-----------------

Release Date
^^^^^^^^^^^^

- 2016-12-02


New Features
^^^^^^^^^^^^

- Diffs: re-implemented diff engine. Added: syntax highlighting inside diffs,
  new side-by-side view with commenting and live chat. Enabled soft-wrapping
  of long lines and much improved rendering speed for large diffs.
- File source view: new file display engine. The file view now
  soft-wraps long lines. Double clicking inside the file view shows
  occurrences of the clicked item. Added pygments-markdown-lexer for
  highlighting markdown syntax.
- Files annotation: Added new grouped annotations. Color all related commits
  by double clicking a single commit in the annotation view.
- Pull request reviewers (EE only): added new default reviewers functionality.
  Allows picking users or user groups defined as reviewers for new pull
  requests. Picking reviewers can be based on branch name, changed file name
  patterns or the original author of the changed source code, e.g.
  *.css -> design team, master branch -> repo owner. Fixes #1131.
- Pull request reviewers: store and show reasons why a given person is a
  reviewer. Manually adding reviewers after creating a PR will now also be
  indicated, together with who added a given person to review.
- Integrations: the Webhooks integration now allows using variables inside the
  call URL. Currently supported variables are ${repo_name}, ${repo_type},
  ${repo_id}, ${repo_url}, ${branch}, ${commit_id}, ${pull_request_id},
  ${pull_request_url}. Commits are now grouped by branches as well.
  Allows much easier integration with CI systems; see the example after this
  list.
- Integrations (EE only): allow a wildcard * project key in the Jira
  integration settings to allow referencing multiple projects per commit,
  fixes #4267.
- Live notifications: RhodeCode sends live notifications to online
  users on certain events and pages. Currently this works on: invite to chat,
  update pull request, commit/inline comment. Part of the live code review
  system. Allows users to update the reviewed code while doing the review and
  never miss any updates or comment replies as they happen. Requires
  channelstream to be enabled.
- Repository groups: added default personal repository groups. Personal groups
  are an isolated playground for users, allowing them to create projects or
  forks. Adds a new setting to automatically create personal repo groups for
  newly created users. New groups are created from a specified pattern, for
  example /u/{username}. Implements #4003.
- Security: It's now possible to disable the password reset functionality.
  This is useful for cases when users only use LDAP or similar types of
  authentication. Implements #3944.
- Pull requests: exposed shadow repositories to end users. Users are now given
  access to the shadow repository, which represents the state after the merge
  is performed. In this way users, and especially CI servers, can much more
  easily perform code analysis of the final merged code.
- Pull requests: the My account > pull request page now uses a datagrid.
  It's faster, filterable and sortable. Fixes #4297.
- Pull requests: the delete pull request action was moved from my account
  into the pull request view itself. This is where everyone was looking
  for it.
- Pull requests: improved showing of failed merges with a proper status in
  the pull request page.
- User groups: overhaul of the edit user group page. Added a new selector for
  adding new user group members.
- Licensing (EE only): exposed an unlock link to deactivate users that are
  over the license limit, to unlock full functionality. This might happen when
  migrating from CE into EE, and your license supports fewer active users than
  allowed.
- Global settings: added a new header/footer template to allow flash
  filtering, for cases when a license warning appears and an admin wants to
  hide it for some time. The new template can be used to do this.
- System info: added a create snapshot button to easily generate a system
  state report. Comes in handy for support and reporting. The system state
  holds information such as free disk/memory, CPU load and some of the
  RhodeCode settings.
- System info: fetch and show vcs settings from the vcsserver. Fixes #4276.
- System info: use real memory usage based on the new psutil api available.
- System info: added info about temporary storage.
- System info: expose inode limits and usage. Fixes #4282.
- Ui: added a new icon for merge commits.
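

As a hypothetical illustration of the webhook variables described above (the
CI endpoint is invented for this example), a configured call URL could look
like:

.. code-block:: text

    http://ci.example.com/build?repo=${repo_name}&branch=${branch}&commit=${commit_id}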


General
^^^^^^^

- Notifications: moved all notifications into polymer for consistency.
  Fixes #4201.
- Live chat (EE): Improved UI for live-chat. Use the Codemirror editor as
  the input for the text box.
- Api: WARNING DEPRECATION, refactored repository group schemas. Fixes #4133.
  When using create_repo, create_repo_group, update_repo, update_repo_group
  the *_name parameter now takes the full path, including sub repository
  groups. This is the only way to add a resource under another repository
  group. Furthermore, giving a non-existing path will no longer create the
  missing structure. This change makes the api more consistent; it better
  validates the errors in the data sent to a given api call. See the sketch
  after this list.
- Pull requests: disabled subrepo handling on pull requests. It means users
  can now use more types of repositories with subrepos to create pull
  requests. Since handling is disabled, repositories behind authentication,
  or outside of the network, can be used.
- VCSServer: fetch backend info from the vcsserver, including git/hg/svn
  versions and connection information.
- Svn support: it's no longer required to put in the repo root path to
  generate the mod-dav-svn config. Fixes #4203.
- Svn support: Added a reload command option (svn.proxy.reload_cmd) to the
  ini files. Apache can now be automatically reloaded when the mod_dav_svn
  config changes.
- Svn support: Added a view to trigger the (re)generation of the Apache
  mod_dav_svn configuration file. Users are able to generate such a file
  manually by clicking that button.
- Dependency: updated the subversion library to 1.9.
- Dependency: updated ipython to 5.1.0.
- Dependency: updated psutil to 4.3.1.
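

A hypothetical sketch of the repository group schema change described in the
Api entry above (group names are illustrative):

.. code-block:: text

    # 'parent' must already exist; missing parents are no longer auto-created
    create_repo_group(group_name="parent/child")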


Security
^^^^^^^^

- Hipchat: escape user entered data to avoid xss/formatting problems.
- VCSServer: obfuscate credentials added into the remote url during remote
  repository creation. Prevents leaking of those credentials inside
  the RhodeCode logs.


Performance
^^^^^^^^^^^

- Diffs: the new diff engine is much smarter when it comes to showing huge
  diffs. The rendering speed should be much improved in such cases, however
  showing the full diff is still supported.
- VCS backends: when using a repo object from the database, re-use this
  information instead of trying to detect a backend. Reduces the traffic to
  the vcsserver.
- Pull requests: Added a column to hold the last merge revision. This will
  skip heavy recalculation of the merge state if nothing changed inside a
  pull request.
- File source view: don't load the file if it is over the size limit, since
  it won't be displayed anyway. This increases the speed of loading the page
  when a file is above the defined cut-off limit.


Fixes
^^^^^

- Users admin: fixed the search filter in the user admin page.
- Autocomplete: improved the lookup of users with non-ascii characters. In
  the case of a unicode email, the previous method could generate wrong data
  and make the search not show such users.
- Svn: added a request header downgrade for the COPY command to work on an
  https setup. Fixes #4307.
- Svn: added handling of renamed files inside our generated changes metadata.
  Fixes #4258.
- Pull requests: fixed a problem with creating pull requests on empty
  repositories.
- Events: use the branch from the previous commit for repo push event commits
  data so that per-branch grouping works. Fixes #4233.
- Login: make sure recaptcha data is always validated. Fixes #4279.
- Vcs: Use the commit date as the modification time when creating archives.
  Fixes a problem with unstable hashes for archives. Fixes #4247.
- Issue trackers: fixed a bug where saving an empty issue tracker via the
  form was causing an exception. Fixes #4278.
- Styling: fixed the gravatar size for pull request reviewers.
- Ldap: fixed an email extraction typo. An empty email from the LDAP server
  will now not overwrite the stored one.
- Integrations: use consistent formatting of user data in the Slack
  integration.
- Meta-tags: meta tags are not taken into account when truncating
  descriptions that are too long. Fixes #4305.
@@ -0,0 +1,14 b''
Copyright 2006 Google Inc.
http://code.google.com/p/google-diff-match-patch/

Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at

http://www.apache.org/licenses/LICENSE-2.0

Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
@@ -0,0 +1,12 b''
Copyright © 2015 Jürgen Hermann <jh@web.de>

Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0

Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
@@ -0,0 +1,665 b''
# -*- coding: utf-8 -*-

# Copyright (C) 2011-2016 RhodeCode GmbH
#
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU Affero General Public License, version 3
# (only), as published by the Free Software Foundation.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU Affero General Public License
# along with this program. If not, see <http://www.gnu.org/licenses/>.
#
# This program is dual-licensed. If you wish to learn more about the
# RhodeCode Enterprise Edition, including its added features, Support services,
# and proprietary license terms, please see https://rhodecode.com/licenses/

import logging
import difflib
from itertools import groupby

from pygments import lex
from pygments.formatters.html import _get_ttype_class as pygment_token_class
from rhodecode.lib.helpers import (
    get_lexer_for_filenode, get_lexer_safe, html_escape)
from rhodecode.lib.utils2 import AttributeDict
from rhodecode.lib.vcs.nodes import FileNode
from rhodecode.lib.diff_match_patch import diff_match_patch
from rhodecode.lib.diffs import LimitedDiffContainer
from pygments.lexers import get_lexer_by_name

plain_text_lexer = get_lexer_by_name(
    'text', stripall=False, stripnl=False, ensurenl=False)


log = logging.getLogger()


def filenode_as_lines_tokens(filenode, lexer=None):
    lexer = lexer or get_lexer_for_filenode(filenode)
    log.debug('Generating file node pygment tokens for %s, %s', lexer, filenode)
    tokens = tokenize_string(filenode.content, lexer)
    lines = split_token_stream(tokens, split_string='\n')
    rv = list(lines)
    return rv


def tokenize_string(content, lexer):
    """
    Use pygments to tokenize some content based on a lexer
    ensuring all original new lines and whitespace is preserved
    """

    lexer.stripall = False
    lexer.stripnl = False
    lexer.ensurenl = False
    for token_type, token_text in lex(content, lexer):
        yield pygment_token_class(token_type), token_text

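# Illustrative usage sketch (not part of the original module): tokenizing a
# small snippet with a named lexer. The exact class strings depend on the
# installed pygments version.
#
#   >>> lexer = get_lexer_by_name('python')
#   >>> list(tokenize_string(u'x = 1\n', lexer))
#   [('n', u'x'), ('', u' '), ('o', u'='), ('', u' '), ('mi', u'1'), ('', u'\n')]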

def split_token_stream(tokens, split_string=u'\n'):
    """
    Take a list of (TokenType, text) tuples and split them by a string

    >>> list(split_token_stream([(TEXT, 'some\ntext'), (TEXT, 'more\n')]))
    [[(TEXT, 'some')],
     [(TEXT, 'text'), (TEXT, 'more')],
     [(TEXT, '')]]
    """

    buffer = []
    for token_class, token_text in tokens:
        parts = token_text.split(split_string)
        for part in parts[:-1]:
            buffer.append((token_class, part))
            yield buffer
            buffer = []

        buffer.append((token_class, parts[-1]))

    if buffer:
        yield buffer


def filenode_as_annotated_lines_tokens(filenode):
    """
    Take a file node and return a list of annotations => lines, if no annotation
    is found, it will be None.

    eg:

    [
        (annotation1, [
            (1, line1_tokens_list),
            (2, line2_tokens_list),
        ]),
        (annotation2, [
            (3, line1_tokens_list),
        ]),
        (None, [
            (4, line1_tokens_list),
        ]),
        (annotation1, [
            (5, line1_tokens_list),
            (6, line2_tokens_list),
        ])
    ]
    """

    commit_cache = {}  # cache commit_getter lookups

    def _get_annotation(commit_id, commit_getter):
        if commit_id not in commit_cache:
            commit_cache[commit_id] = commit_getter()
        return commit_cache[commit_id]

    annotation_lookup = {
        line_no: _get_annotation(commit_id, commit_getter)
        for line_no, commit_id, commit_getter, line_content
        in filenode.annotate
    }

    annotations_lines = ((annotation_lookup.get(line_no), line_no, tokens)
                         for line_no, tokens
                         in enumerate(filenode_as_lines_tokens(filenode), 1))

    grouped_annotations_lines = groupby(annotations_lines, lambda x: x[0])

    for annotation, group in grouped_annotations_lines:
        yield (
            annotation, [(line_no, tokens)
                         for (_, line_no, tokens) in group]
        )


def render_tokenstream(tokenstream):
    result = []
    for token_class, token_ops_texts in rollup_tokenstream(tokenstream):

        if token_class:
            result.append(u'<span class="%s">' % token_class)
        else:
            result.append(u'<span>')

        for op_tag, token_text in token_ops_texts:

            if op_tag:
                result.append(u'<%s>' % op_tag)

            escaped_text = html_escape(token_text)

            # TODO: dan: investigate showing hidden characters like space/nl/tab
            # escaped_text = escaped_text.replace(' ', '<sp> </sp>')
            # escaped_text = escaped_text.replace('\n', '<nl>\n</nl>')
            # escaped_text = escaped_text.replace('\t', '<tab>\t</tab>')

            result.append(escaped_text)

            if op_tag:
                result.append(u'</%s>' % op_tag)

        result.append(u'</span>')

    html = ''.join(result)
    return html


def rollup_tokenstream(tokenstream):
    """
    Group a token stream of the format:

        ('class', 'op', 'text')
        or
        ('class', 'text')

    into

        [('class1',
            [('op1', 'text'),
             ('op2', 'text')]),
         ('class2',
            [('op3', 'text')])]

    This is used to get the minimal tags necessary when
    rendering to html eg for a token stream ie.

    <span class="A"><ins>he</ins>llo</span>
    vs
    <span class="A"><ins>he</ins></span><span class="A">llo</span>

    If a 2 tuple is passed in, the output op will be an empty string.

    eg:

    >>> rollup_tokenstream([('classA', '', 'h'),
    ...                     ('classA', 'del', 'ell'),
    ...                     ('classA', '', 'o'),
    ...                     ('classB', '', ' '),
    ...                     ('classA', '', 'the'),
    ...                     ('classA', '', 're'),
    ...                     ])
    [('classA', [('', 'h'), ('del', 'ell'), ('', 'o')]),
     ('classB', [('', ' ')]),
     ('classA', [('', 'there')])]
    """
    if tokenstream and len(tokenstream[0]) == 2:
        tokenstream = ((t[0], '', t[1]) for t in tokenstream)

    result = []
    for token_class, op_list in groupby(tokenstream, lambda t: t[0]):
        ops = []
        for token_op, token_text_list in groupby(op_list, lambda o: o[1]):
            text_buffer = []
            for t_class, t_op, t_text in token_text_list:
                text_buffer.append(t_text)
            ops.append((token_op, ''.join(text_buffer)))
        result.append((token_class, ops))
    return result


def tokens_diff(old_tokens, new_tokens, use_diff_match_patch=True):
    """
    Converts a list of (token_class, token_text) tuples to a list of
    (token_class, token_op, token_text) tuples where token_op is one of
    ('ins', 'del', '')

    :param old_tokens: list of (token_class, token_text) tuples of old line
    :param new_tokens: list of (token_class, token_text) tuples of new line
    :param use_diff_match_patch: boolean, will use google's diff match patch
        library which has options to 'smooth' out the character by character
        differences making nicer ins/del blocks
    """

    old_tokens_result = []
    new_tokens_result = []

    similarity = difflib.SequenceMatcher(None,
        ''.join(token_text for token_class, token_text in old_tokens),
        ''.join(token_text for token_class, token_text in new_tokens)
    ).ratio()

    if similarity < 0.6:  # return, the blocks are too different
        for token_class, token_text in old_tokens:
            old_tokens_result.append((token_class, '', token_text))
        for token_class, token_text in new_tokens:
            new_tokens_result.append((token_class, '', token_text))
        return old_tokens_result, new_tokens_result, similarity

    token_sequence_matcher = difflib.SequenceMatcher(None,
        [x[1] for x in old_tokens],
        [x[1] for x in new_tokens])

    for tag, o1, o2, n1, n2 in token_sequence_matcher.get_opcodes():
        # check the differences by token block types first to give a more
        # nicer "block" level replacement vs character diffs

        if tag == 'equal':
            for token_class, token_text in old_tokens[o1:o2]:
                old_tokens_result.append((token_class, '', token_text))
            for token_class, token_text in new_tokens[n1:n2]:
                new_tokens_result.append((token_class, '', token_text))
        elif tag == 'delete':
            for token_class, token_text in old_tokens[o1:o2]:
                old_tokens_result.append((token_class, 'del', token_text))
        elif tag == 'insert':
            for token_class, token_text in new_tokens[n1:n2]:
                new_tokens_result.append((token_class, 'ins', token_text))
        elif tag == 'replace':
            # if same type token blocks must be replaced, do a diff on the
            # characters in the token blocks to show individual changes

            old_char_tokens = []
            new_char_tokens = []
            for token_class, token_text in old_tokens[o1:o2]:
                for char in token_text:
                    old_char_tokens.append((token_class, char))

            for token_class, token_text in new_tokens[n1:n2]:
                for char in token_text:
                    new_char_tokens.append((token_class, char))

            old_string = ''.join([token_text for
                token_class, token_text in old_char_tokens])
            new_string = ''.join([token_text for
                token_class, token_text in new_char_tokens])

            char_sequence = difflib.SequenceMatcher(
                None, old_string, new_string)
            copcodes = char_sequence.get_opcodes()
            obuffer, nbuffer = [], []

            if use_diff_match_patch:
                dmp = diff_match_patch()
                dmp.Diff_EditCost = 11  # TODO: dan: extract this to a setting
                reps = dmp.diff_main(old_string, new_string)
                dmp.diff_cleanupEfficiency(reps)

                a, b = 0, 0
                for op, rep in reps:
                    l = len(rep)
                    if op == 0:
                        for i, c in enumerate(rep):
                            obuffer.append((old_char_tokens[a+i][0], '', c))
                            nbuffer.append((new_char_tokens[b+i][0], '', c))
                        a += l
                        b += l
                    elif op == -1:
                        for i, c in enumerate(rep):
                            obuffer.append((old_char_tokens[a+i][0], 'del', c))
                        a += l
                    elif op == 1:
                        for i, c in enumerate(rep):
                            nbuffer.append((new_char_tokens[b+i][0], 'ins', c))
                        b += l
            else:
                for ctag, co1, co2, cn1, cn2 in copcodes:
                    if ctag == 'equal':
                        for token_class, token_text in old_char_tokens[co1:co2]:
                            obuffer.append((token_class, '', token_text))
                        for token_class, token_text in new_char_tokens[cn1:cn2]:
                            nbuffer.append((token_class, '', token_text))
                    elif ctag == 'delete':
                        for token_class, token_text in old_char_tokens[co1:co2]:
                            obuffer.append((token_class, 'del', token_text))
                    elif ctag == 'insert':
                        for token_class, token_text in new_char_tokens[cn1:cn2]:
                            nbuffer.append((token_class, 'ins', token_text))
                    elif ctag == 'replace':
                        for token_class, token_text in old_char_tokens[co1:co2]:
                            obuffer.append((token_class, 'del', token_text))
                        for token_class, token_text in new_char_tokens[cn1:cn2]:
                            nbuffer.append((token_class, 'ins', token_text))

            old_tokens_result.extend(obuffer)
            new_tokens_result.extend(nbuffer)

    return old_tokens_result, new_tokens_result, similarity

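# Illustrative pipeline sketch (not part of the original module): diffing two
# one-line token streams and rendering them to HTML with <ins>/<del> markup.
# The exact output shown is an assumption for plain-text tokens.
#
#   old_line = list(tokenize_string(u'x = 1\n', plain_text_lexer))
#   new_line = list(tokenize_string(u'x = 2\n', plain_text_lexer))
#   o_toks, n_toks, similarity = tokens_diff(old_line, new_line)
#   render_tokenstream(o_toks)  # e.g. u'<span>x = <del>1</del>\n</span>'
#   render_tokenstream(n_toks)  # e.g. u'<span>x = <ins>2</ins>\n</span>'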

class DiffSet(object):
    """
    An object for parsing the diff result from diffs.DiffProcessor and
    adding highlighting, side by side/unified renderings and line diffs
    """

    HL_REAL = 'REAL'  # highlights using original file, slow
    HL_FAST = 'FAST'  # highlights using just the line, fast but not correct
                      # in the case of multiline code
    HL_NONE = 'NONE'  # no highlighting, fastest

    def __init__(self, highlight_mode=HL_REAL, repo_name=None,
                 source_node_getter=lambda filename: None,
                 target_node_getter=lambda filename: None,
                 source_nodes=None, target_nodes=None,
                 # files over this size will use fast highlighting
                 max_file_size_limit=150 * 1024,
                 comments=None,
                 ):

        self.highlight_mode = highlight_mode
        self.highlighted_filenodes = {}
        self.source_node_getter = source_node_getter
        self.target_node_getter = target_node_getter
        self.source_nodes = source_nodes or {}
        self.target_nodes = target_nodes or {}
        self.repo_name = repo_name
        self.comments = comments or {}
        self.max_file_size_limit = max_file_size_limit

    def render_patchset(self, patchset, source_ref=None, target_ref=None):
        diffset = AttributeDict(dict(
            lines_added=0,
            lines_deleted=0,
            changed_files=0,
            files=[],
            limited_diff=isinstance(patchset, LimitedDiffContainer),
            repo_name=self.repo_name,
            source_ref=source_ref,
            target_ref=target_ref,
        ))
        for patch in patchset:
            filediff = self.render_patch(patch)
            filediff.diffset = diffset
            diffset.files.append(filediff)
            diffset.changed_files += 1
            if not patch['stats']['binary']:
                diffset.lines_added += patch['stats']['added']
                diffset.lines_deleted += patch['stats']['deleted']

        return diffset

    _lexer_cache = {}

    def _get_lexer_for_filename(self, filename):
        # cached because we might need to call it twice for source/target
        if filename not in self._lexer_cache:
            self._lexer_cache[filename] = get_lexer_safe(filepath=filename)
        return self._lexer_cache[filename]

    def render_patch(self, patch):
        log.debug('rendering diff for %r', patch['filename'])

        source_filename = patch['original_filename']
        target_filename = patch['filename']

        source_lexer = plain_text_lexer
        target_lexer = plain_text_lexer

        if not patch['stats']['binary']:
            if self.highlight_mode == self.HL_REAL:
                if (source_filename and patch['operation'] in ('D', 'M')
                        and source_filename not in self.source_nodes):
                    self.source_nodes[source_filename] = (
                        self.source_node_getter(source_filename))

                if (target_filename and patch['operation'] in ('A', 'M')
                        and target_filename not in self.target_nodes):
                    self.target_nodes[target_filename] = (
                        self.target_node_getter(target_filename))

            elif self.highlight_mode == self.HL_FAST:
                source_lexer = self._get_lexer_for_filename(source_filename)
                target_lexer = self._get_lexer_for_filename(target_filename)

        source_file = self.source_nodes.get(source_filename, source_filename)
        target_file = self.target_nodes.get(target_filename, target_filename)

        source_filenode, target_filenode = None, None

        # TODO: dan: FileNode.lexer works on the content of the file - which
        # can be slow - issue #4289 explains a lexer clean up - which once
        # done can allow caching a lexer for a filenode to avoid the file lookup
        if isinstance(source_file, FileNode):
            source_filenode = source_file
            source_lexer = source_file.lexer
        if isinstance(target_file, FileNode):
            target_filenode = target_file
            target_lexer = target_file.lexer

        source_file_path, target_file_path = None, None

        if source_filename != '/dev/null':
            source_file_path = source_filename
        if target_filename != '/dev/null':
            target_file_path = target_filename

        source_file_type = source_lexer.name
        target_file_type = target_lexer.name

        op_hunks = patch['chunks'][0]
        hunks = patch['chunks'][1:]

        filediff = AttributeDict({
            'source_file_path': source_file_path,
            'target_file_path': target_file_path,
            'source_filenode': source_filenode,
            'target_filenode': target_filenode,
            'hunks': [],
            'source_file_type': source_file_type,
            'target_file_type': target_file_type,
            'patch': patch,
            'source_mode': patch['stats']['old_mode'],
            'target_mode': patch['stats']['new_mode'],
            'limited_diff': isinstance(patch, LimitedDiffContainer),
            'diffset': self,
        })

        for hunk in hunks:
            hunkbit = self.parse_hunk(hunk, source_file, target_file)
            hunkbit.filediff = filediff
            filediff.hunks.append(hunkbit)
        return filediff

    def parse_hunk(self, hunk, source_file, target_file):
        result = AttributeDict(dict(
            source_start=hunk['source_start'],
            source_length=hunk['source_length'],
            target_start=hunk['target_start'],
            target_length=hunk['target_length'],
            section_header=hunk['section_header'],
            lines=[],
        ))
        before, after = [], []

        for line in hunk['lines']:
            if line['action'] == 'unmod':
                result.lines.extend(
                    self.parse_lines(before, after, source_file, target_file))
                after.append(line)
                before.append(line)
            elif line['action'] == 'add':
                after.append(line)
            elif line['action'] == 'del':
                before.append(line)
            elif line['action'] == 'old-no-nl':
                before.append(line)
            elif line['action'] == 'new-no-nl':
                after.append(line)

        result.lines.extend(
            self.parse_lines(before, after, source_file, target_file))
        result.unified = self.as_unified(result.lines)
        result.sideside = result.lines
        return result

    def parse_lines(self, before_lines, after_lines, source_file, target_file):
        # TODO: dan: investigate doing the diff comparison and fast highlighting
        # on the entire before and after buffered block lines rather than by
        # line, this means we can get better 'fast' highlighting if the context
        # allows it - eg.
        # line 4: """
        # line 5: this gets highlighted as a string
        # line 6: """

        lines = []
        while before_lines or after_lines:
            before, after = None, None
            before_tokens, after_tokens = None, None

            if before_lines:
                before = before_lines.pop(0)
            if after_lines:
                after = after_lines.pop(0)

            original = AttributeDict()
            modified = AttributeDict()

            if before:
                if before['action'] == 'old-no-nl':
                    before_tokens = [('nonl', before['line'])]
                else:
                    before_tokens = self.get_line_tokens(
                        line_text=before['line'], line_number=before['old_lineno'],
                        file=source_file)
                original.lineno = before['old_lineno']
                original.content = before['line']
                original.action = self.action_to_op(before['action'])
                original.comments = self.get_comments_for('old',
                    source_file, before['old_lineno'])

            if after:
                if after['action'] == 'new-no-nl':
                    after_tokens = [('nonl', after['line'])]
                else:
                    after_tokens = self.get_line_tokens(
                        line_text=after['line'], line_number=after['new_lineno'],
                        file=target_file)
                modified.lineno = after['new_lineno']
                modified.content = after['line']
                modified.action = self.action_to_op(after['action'])
                modified.comments = self.get_comments_for('new',
                    target_file, after['new_lineno'])

            # diff the lines
            if before_tokens and after_tokens:
                o_tokens, m_tokens, similarity = tokens_diff(
                    before_tokens, after_tokens)
                original.content = render_tokenstream(o_tokens)
                modified.content = render_tokenstream(m_tokens)
            elif before_tokens:
                original.content = render_tokenstream(
                    [(x[0], '', x[1]) for x in before_tokens])
            elif after_tokens:
                modified.content = render_tokenstream(
                    [(x[0], '', x[1]) for x in after_tokens])

            lines.append(AttributeDict({
                'original': original,
                'modified': modified,
            }))

        return lines

    def get_comments_for(self, version, file, line_number):
        if hasattr(file, 'unicode_path'):
            file = file.unicode_path

        if not isinstance(file, basestring):
            return None

        line_key = {
            'old': 'o',
            'new': 'n',
        }[version] + str(line_number)

        return self.comments.get(file, {}).get(line_key)

    def get_line_tokens(self, line_text, line_number, file=None):
        filenode = None
        filename = None

        if isinstance(file, basestring):
            filename = file
        elif isinstance(file, FileNode):
            filenode = file
            filename = file.unicode_path

        if self.highlight_mode == self.HL_REAL and filenode:
            if line_number and file.size < self.max_file_size_limit:
                return self.get_tokenized_filenode_line(file, line_number)

        if self.highlight_mode in (self.HL_REAL, self.HL_FAST) and filename:
            lexer = self._get_lexer_for_filename(filename)
            return list(tokenize_string(line_text, lexer))

        return list(tokenize_string(line_text, plain_text_lexer))

    def get_tokenized_filenode_line(self, filenode, line_number):

        if filenode not in self.highlighted_filenodes:
            tokenized_lines = filenode_as_lines_tokens(filenode, filenode.lexer)
            self.highlighted_filenodes[filenode] = tokenized_lines
        return self.highlighted_filenodes[filenode][line_number - 1]

    def action_to_op(self, action):
        return {
            'add': '+',
            'del': '-',
            'unmod': ' ',
            'old-no-nl': ' ',
            'new-no-nl': ' ',
        }.get(action, action)

    def as_unified(self, lines):
        """ Return a generator that yields the lines of a diff in unified order """
        def generator():
            buf = []
            for line in lines:

                if (buf and not line.original) or line.original.action == ' ':
                    for b in buf:
                        yield b
                    buf = []

                if line.original:
                    if line.original.action == ' ':
                        yield (line.original.lineno, line.modified.lineno,
                               line.original.action, line.original.content,
                               line.original.comments)
                        continue

                    if line.original.action == '-':
                        yield (line.original.lineno, None,
                               line.original.action, line.original.content,
                               line.original.comments)

                    if line.modified.action == '+':
                        buf.append((
                            None, line.modified.lineno,
                            line.modified.action, line.modified.content,
                            line.modified.comments))
                        continue

                if line.modified:
                    yield (None, line.modified.lineno,
                           line.modified.action, line.modified.content,
                           line.modified.comments)

            for b in buf:
                yield b

        return generator()
@@ -0,0 +1,3640 b''
# -*- coding: utf-8 -*-

# Copyright (C) 2010-2016 RhodeCode GmbH
#
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU Affero General Public License, version 3
# (only), as published by the Free Software Foundation.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU Affero General Public License
# along with this program. If not, see <http://www.gnu.org/licenses/>.
#
# This program is dual-licensed. If you wish to learn more about the
# RhodeCode Enterprise Edition, including its added features, Support services,
# and proprietary license terms, please see https://rhodecode.com/licenses/

"""
Database Models for RhodeCode Enterprise
"""

import re
import os
import sys
import time
import hashlib
import logging
import datetime
import warnings
import ipaddress
import functools
import traceback
import collections


from sqlalchemy import *
from sqlalchemy.exc import IntegrityError
from sqlalchemy.ext.declarative import declared_attr
from sqlalchemy.ext.hybrid import hybrid_property
from sqlalchemy.orm import (
    relationship, joinedload, class_mapper, validates, aliased)
from sqlalchemy.sql.expression import true
from beaker.cache import cache_region, region_invalidate
from webob.exc import HTTPNotFound
from zope.cachedescriptors.property import Lazy as LazyProperty

from pylons import url
from pylons.i18n.translation import lazy_ugettext as _

from rhodecode.lib.vcs import get_backend, get_vcs_instance
from rhodecode.lib.vcs.utils.helpers import get_scm
from rhodecode.lib.vcs.exceptions import VCSError
from rhodecode.lib.vcs.backends.base import (
    EmptyCommit, Reference, MergeFailureReason)
from rhodecode.lib.utils2 import (
    str2bool, safe_str, get_commit_safe, safe_unicode, remove_prefix, md5_safe,
    time_to_datetime, aslist, Optional, safe_int, get_clone_url, AttributeDict,
    glob2re)
from rhodecode.lib.jsonalchemy import MutationObj, JsonType, JSONDict
from rhodecode.lib.ext_json import json
from rhodecode.lib.caching_query import FromCache
from rhodecode.lib.encrypt import AESCipher

from rhodecode.model.meta import Base, Session

URL_SEP = '/'
log = logging.getLogger(__name__)

# =============================================================================
# BASE CLASSES
# =============================================================================

# this is propagated from the .ini file, rhodecode.encrypted_values.secret or
# beaker.session.secret if the first is not set,
# and initialized at environment.py
ENCRYPTION_KEY = None

# used to sort permissions by types, '#' used here is not allowed to be in
# usernames, and it's very early in sorted string.printable table.
PERMISSION_TYPE_SORT = {
    'admin': '####',
    'write': '###',
    'read': '##',
    'none': '#',
}


def display_sort(obj):
    """
    Sort function used to sort permissions in .permissions() function of
    Repository, RepoGroup, UserGroup. Also it puts the default user in front
    of all other resources
    """

    if obj.username == User.DEFAULT_USER:
        return '#####'
    prefix = PERMISSION_TYPE_SORT.get(obj.permission.split('.')[-1], '')
    return prefix + obj.username


def _hash_key(k):
    return md5_safe(k)


class EncryptedTextValue(TypeDecorator):
    """
    Special column for encrypted long text data, use like::

        value = Column("encrypted_value", EncryptedValue(), nullable=False)

    This column is intelligent, so if a value is in unencrypted form it
    returns the unencrypted form, but on save it always encrypts
    """
    impl = Text

    def process_bind_param(self, value, dialect):
        if not value:
            return value
        if value.startswith('enc$aes$') or value.startswith('enc$aes_hmac$'):
            # protect against double encrypting if someone manually starts
            # doing
            raise ValueError('value needs to be in unencrypted format, ie. '
                             'not starting with enc$aes')
        return 'enc$aes_hmac$%s' % AESCipher(
            ENCRYPTION_KEY, hmac=True).encrypt(value)

    def process_result_value(self, value, dialect):
        import rhodecode

        if not value:
            return value

        parts = value.split('$', 3)
        if not len(parts) == 3:
            # probably not encrypted values
            return value
        else:
            if parts[0] != 'enc':
                # parts ok but without our header ?
                return value
            enc_strict_mode = str2bool(rhodecode.CONFIG.get(
                'rhodecode.encrypted_values.strict') or True)
            # at that stage we know it's our encryption
            if parts[1] == 'aes':
                decrypted_data = AESCipher(ENCRYPTION_KEY).decrypt(parts[2])
            elif parts[1] == 'aes_hmac':
                decrypted_data = AESCipher(
                    ENCRYPTION_KEY, hmac=True,
                    strict_verification=enc_strict_mode).decrypt(parts[2])
            else:
                raise ValueError(
                    'Encryption type part is wrong, must be `aes` '
                    'or `aes_hmac`, got `%s` instead' % (parts[1]))
            return decrypted_data

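# Illustrative usage sketch (not part of the original file): how
# EncryptedTextValue round-trips a value, assuming ENCRYPTION_KEY has been
# initialized from the .ini settings at environment setup.
#
#   col = EncryptedTextValue()
#   stored = col.process_bind_param(u'secret', None)  # -> 'enc$aes_hmac$...'
#   plain = col.process_result_value(stored, None)    # -> u'secret'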
|
159 | ||
|
160 | class BaseModel(object): | |
|
161 | """ | |
|
162 | Base Model for all classes | |
|
163 | """ | |
|
164 | ||
|
165 | @classmethod | |
|
166 | def _get_keys(cls): | |
|
167 | """return column names for this model """ | |
|
168 | return class_mapper(cls).c.keys() | |
|
169 | ||
|
170 | def get_dict(self): | |
|
171 | """ | |
|
172 | return dict with keys and values corresponding | |
|
173 | to this model data """ | |
|
174 | ||
|
175 | d = {} | |
|
176 | for k in self._get_keys(): | |
|
177 | d[k] = getattr(self, k) | |
|
178 | ||
|
179 | # also use __json__() if present to get additional fields | |
|
180 | _json_attr = getattr(self, '__json__', None) | |
|
181 | if _json_attr: | |
|
182 | # update with attributes from __json__ | |
|
183 | if callable(_json_attr): | |
|
184 | _json_attr = _json_attr() | |
|
185 | for k, val in _json_attr.iteritems(): | |
|
186 | d[k] = val | |
|
187 | return d | |
|
188 | ||
|
189 | def get_appstruct(self): | |
|
190 | """return list with keys and values tuples corresponding | |
|
191 | to this model data """ | |
|
192 | ||
|
193 | l = [] | |
|
194 | for k in self._get_keys(): | |
|
195 | l.append((k, getattr(self, k),)) | |
|
196 | return l | |
|
197 | ||
|
198 | def populate_obj(self, populate_dict): | |
|
199 | """populate model with data from given populate_dict""" | |
|
200 | ||
|
201 | for k in self._get_keys(): | |
|
202 | if k in populate_dict: | |
|
203 | setattr(self, k, populate_dict[k]) | |
|
204 | ||
|
205 | @classmethod | |
|
206 | def query(cls): | |
|
207 | return Session().query(cls) | |
|
208 | ||
|
209 | @classmethod | |
|
210 | def get(cls, id_): | |
|
211 | if id_: | |
|
212 | return cls.query().get(id_) | |
|
213 | ||
|
214 | @classmethod | |
|
215 | def get_or_404(cls, id_): | |
|
216 | try: | |
|
217 | id_ = int(id_) | |
|
218 | except (TypeError, ValueError): | |
|
219 | raise HTTPNotFound | |
|
220 | ||
|
221 | res = cls.query().get(id_) | |
|
222 | if not res: | |
|
223 | raise HTTPNotFound | |
|
224 | return res | |
|
225 | ||
|
226 | @classmethod | |
|
227 | def getAll(cls): | |
|
228 | # deprecated and left for backward compatibility | |
|
229 | return cls.get_all() | |
|
230 | ||
|
231 | @classmethod | |
|
232 | def get_all(cls): | |
|
233 | return cls.query().all() | |
|
234 | ||
|
235 | @classmethod | |
|
236 | def delete(cls, id_): | |
|
237 | obj = cls.query().get(id_) | |
|
238 | Session().delete(obj) | |
|
239 | ||
|
240 | @classmethod | |
|
241 | def identity_cache(cls, session, attr_name, value): | |
|
242 | exist_in_session = [] | |
|
243 | for (item_cls, pkey), instance in session.identity_map.items(): | |
|
244 | if cls == item_cls and getattr(instance, attr_name) == value: | |
|
245 | exist_in_session.append(instance) | |
|
246 | if exist_in_session: | |
|
247 | if len(exist_in_session) == 1: | |
|
248 | return exist_in_session[0] | |
|
249 | log.exception( | |
|
250 | 'multiple objects with attr %s and ' | |
|
251 | 'value %s found with same name: %r', | |
|
252 | attr_name, value, exist_in_session) | |
|
253 | ||
|
254 | def __repr__(self): | |
|
255 | if hasattr(self, '__unicode__'): | |
|
256 | # python repr needs to return str | |
|
257 | try: | |
|
258 | return safe_str(self.__unicode__()) | |
|
259 | except UnicodeDecodeError: | |
|
260 | pass | |
|
261 | return '<DB:%s>' % (self.__class__.__name__) | |
|
262 | ||
|
263 | ||
|
class RhodeCodeSetting(Base, BaseModel):
    __tablename__ = 'rhodecode_settings'
    __table_args__ = (
        UniqueConstraint('app_settings_name'),
        {'extend_existing': True, 'mysql_engine': 'InnoDB',
         'mysql_charset': 'utf8', 'sqlite_autoincrement': True}
    )

    SETTINGS_TYPES = {
        'str': safe_str,
        'int': safe_int,
        'unicode': safe_unicode,
        'bool': str2bool,
        'list': functools.partial(aslist, sep=',')
    }
    DEFAULT_UPDATE_URL = 'https://rhodecode.com/api/v1/info/versions'
    GLOBAL_CONF_KEY = 'app_settings'

    app_settings_id = Column("app_settings_id", Integer(), nullable=False, unique=True, default=None, primary_key=True)
    app_settings_name = Column("app_settings_name", String(255), nullable=True, unique=None, default=None)
    _app_settings_value = Column("app_settings_value", String(4096), nullable=True, unique=None, default=None)
    _app_settings_type = Column("app_settings_type", String(255), nullable=True, unique=None, default=None)

    def __init__(self, key='', val='', type='unicode'):
        self.app_settings_name = key
        self.app_settings_type = type
        self.app_settings_value = val

    @validates('_app_settings_value')
    def validate_settings_value(self, key, val):
        assert type(val) == unicode
        return val

    @hybrid_property
    def app_settings_value(self):
        v = self._app_settings_value
        _type = self.app_settings_type
        if _type:
            _type = self.app_settings_type.split('.')[0]
        # decode the encrypted value
        if 'encrypted' in self.app_settings_type:
            cipher = EncryptedTextValue()
            v = safe_unicode(cipher.process_result_value(v, None))

        converter = self.SETTINGS_TYPES.get(_type) or \
            self.SETTINGS_TYPES['unicode']
        return converter(v)

    @app_settings_value.setter
    def app_settings_value(self, val):
        """
        Setter that will always make sure we use unicode in app_settings_value

        :param val:
        """
        val = safe_unicode(val)
        # encode the encrypted value
        if 'encrypted' in self.app_settings_type:
            cipher = EncryptedTextValue()
            val = safe_unicode(cipher.process_bind_param(val, None))
        self._app_settings_value = val

    @hybrid_property
    def app_settings_type(self):
        return self._app_settings_type

    @app_settings_type.setter
    def app_settings_type(self, val):
        if val.split('.')[0] not in self.SETTINGS_TYPES:
            raise Exception('type must be one of %s got %s'
                            % (self.SETTINGS_TYPES.keys(), val))
        self._app_settings_type = val

    def __unicode__(self):
        return u"<%s('%s:%s[%s]')>" % (
            self.__class__.__name__,
            self.app_settings_name, self.app_settings_value,
            self.app_settings_type
        )

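# Usage sketch for RhodeCodeSetting (hypothetical key and values): the name
# stored in app_settings_type picks the converter from SETTINGS_TYPES that
# is applied whenever the value is read back.
#
#   setting = RhodeCodeSetting('show_version', 'true', 'bool')
#   setting.app_settings_value            # -> True, via str2bool
#   setting.app_settings_type = 'color'   # raises: not in SETTINGS_TYPES
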
class RhodeCodeUi(Base, BaseModel):
    __tablename__ = 'rhodecode_ui'
    __table_args__ = (
        UniqueConstraint('ui_key'),
        {'extend_existing': True, 'mysql_engine': 'InnoDB',
         'mysql_charset': 'utf8', 'sqlite_autoincrement': True}
    )

    HOOK_REPO_SIZE = 'changegroup.repo_size'
    # HG
    HOOK_PRE_PULL = 'preoutgoing.pre_pull'
    HOOK_PULL = 'outgoing.pull_logger'
    HOOK_PRE_PUSH = 'prechangegroup.pre_push'
    HOOK_PUSH = 'changegroup.push_logger'

    # TODO: johbo: Unify way how hooks are configured for git and hg,
    # git part is currently hardcoded.

    # SVN PATTERNS
    SVN_BRANCH_ID = 'vcs_svn_branch'
    SVN_TAG_ID = 'vcs_svn_tag'

    ui_id = Column(
        "ui_id", Integer(), nullable=False, unique=True, default=None,
        primary_key=True)
    ui_section = Column(
        "ui_section", String(255), nullable=True, unique=None, default=None)
    ui_key = Column(
        "ui_key", String(255), nullable=True, unique=None, default=None)
    ui_value = Column(
        "ui_value", String(255), nullable=True, unique=None, default=None)
    ui_active = Column(
        "ui_active", Boolean(), nullable=True, unique=None, default=True)

    def __repr__(self):
        return '<%s[%s]%s=>%s>' % (self.__class__.__name__, self.ui_section,
                                   self.ui_key, self.ui_value)


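# Usage sketch for RhodeCodeUi (the hook callable path is made up): rows
# mirror ini-style [section] key = value entries, e.g. registering the
# Mercurial push hook under the 'hooks' section.
#
#   hook = RhodeCodeUi()
#   hook.ui_section = 'hooks'
#   hook.ui_key = RhodeCodeUi.HOOK_PUSH       # 'changegroup.push_logger'
#   hook.ui_value = 'python:my_hooks.log_push'
#   Session().add(hook)
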
class RepoRhodeCodeSetting(Base, BaseModel):
    __tablename__ = 'repo_rhodecode_settings'
    __table_args__ = (
        UniqueConstraint(
            'app_settings_name', 'repository_id',
            name='uq_repo_rhodecode_setting_name_repo_id'),
        {'extend_existing': True, 'mysql_engine': 'InnoDB',
         'mysql_charset': 'utf8', 'sqlite_autoincrement': True}
    )

    repository_id = Column(
        "repository_id", Integer(), ForeignKey('repositories.repo_id'),
        nullable=False)
    app_settings_id = Column(
        "app_settings_id", Integer(), nullable=False, unique=True,
        default=None, primary_key=True)
    app_settings_name = Column(
        "app_settings_name", String(255), nullable=True, unique=None,
        default=None)
    _app_settings_value = Column(
        "app_settings_value", String(4096), nullable=True, unique=None,
        default=None)
    _app_settings_type = Column(
        "app_settings_type", String(255), nullable=True, unique=None,
        default=None)

    repository = relationship('Repository')

    def __init__(self, repository_id, key='', val='', type='unicode'):
        self.repository_id = repository_id
        self.app_settings_name = key
        self.app_settings_type = type
        self.app_settings_value = val

    @validates('_app_settings_value')
    def validate_settings_value(self, key, val):
        assert type(val) == unicode
        return val

    @hybrid_property
    def app_settings_value(self):
        v = self._app_settings_value
        type_ = self.app_settings_type
        SETTINGS_TYPES = RhodeCodeSetting.SETTINGS_TYPES
        converter = SETTINGS_TYPES.get(type_) or SETTINGS_TYPES['unicode']
        return converter(v)

    @app_settings_value.setter
    def app_settings_value(self, val):
        """
        Setter that will always make sure we use unicode in app_settings_value

        :param val:
        """
        self._app_settings_value = safe_unicode(val)

    @hybrid_property
    def app_settings_type(self):
        return self._app_settings_type

    @app_settings_type.setter
    def app_settings_type(self, val):
        SETTINGS_TYPES = RhodeCodeSetting.SETTINGS_TYPES
        if val not in SETTINGS_TYPES:
            raise Exception('type must be one of %s got %s'
                            % (SETTINGS_TYPES.keys(), val))
        self._app_settings_type = val

    def __unicode__(self):
        return u"<%s('%s:%s:%s[%s]')>" % (
            self.__class__.__name__, self.repository.repo_name,
            self.app_settings_name, self.app_settings_value,
            self.app_settings_type
        )


class RepoRhodeCodeUi(Base, BaseModel):
    __tablename__ = 'repo_rhodecode_ui'
    __table_args__ = (
        UniqueConstraint(
            'repository_id', 'ui_section', 'ui_key',
            name='uq_repo_rhodecode_ui_repository_id_section_key'),
        {'extend_existing': True, 'mysql_engine': 'InnoDB',
         'mysql_charset': 'utf8', 'sqlite_autoincrement': True}
    )

    repository_id = Column(
        "repository_id", Integer(), ForeignKey('repositories.repo_id'),
        nullable=False)
    ui_id = Column(
        "ui_id", Integer(), nullable=False, unique=True, default=None,
        primary_key=True)
    ui_section = Column(
        "ui_section", String(255), nullable=True, unique=None, default=None)
    ui_key = Column(
        "ui_key", String(255), nullable=True, unique=None, default=None)
    ui_value = Column(
        "ui_value", String(255), nullable=True, unique=None, default=None)
    ui_active = Column(
        "ui_active", Boolean(), nullable=True, unique=None, default=True)

    repository = relationship('Repository')

    def __repr__(self):
        return '<%s[%s:%s]%s=>%s>' % (
            self.__class__.__name__, self.repository.repo_name,
            self.ui_section, self.ui_key, self.ui_value)


class User(Base, BaseModel):
    __tablename__ = 'users'
    __table_args__ = (
        UniqueConstraint('username'), UniqueConstraint('email'),
        Index('u_username_idx', 'username'),
        Index('u_email_idx', 'email'),
        {'extend_existing': True, 'mysql_engine': 'InnoDB',
         'mysql_charset': 'utf8', 'sqlite_autoincrement': True}
    )
    DEFAULT_USER = 'default'
    DEFAULT_USER_EMAIL = 'anonymous@rhodecode.org'
    DEFAULT_GRAVATAR_URL = 'https://secure.gravatar.com/avatar/{md5email}?d=identicon&s={size}'

    user_id = Column("user_id", Integer(), nullable=False, unique=True, default=None, primary_key=True)
    username = Column("username", String(255), nullable=True, unique=None, default=None)
    password = Column("password", String(255), nullable=True, unique=None, default=None)
    active = Column("active", Boolean(), nullable=True, unique=None, default=True)
    admin = Column("admin", Boolean(), nullable=True, unique=None, default=False)
    name = Column("firstname", String(255), nullable=True, unique=None, default=None)
    lastname = Column("lastname", String(255), nullable=True, unique=None, default=None)
    _email = Column("email", String(255), nullable=True, unique=None, default=None)
    last_login = Column("last_login", DateTime(timezone=False), nullable=True, unique=None, default=None)
    extern_type = Column("extern_type", String(255), nullable=True, unique=None, default=None)
    extern_name = Column("extern_name", String(255), nullable=True, unique=None, default=None)
    api_key = Column("api_key", String(255), nullable=True, unique=None, default=None)
    inherit_default_permissions = Column("inherit_default_permissions", Boolean(), nullable=False, unique=None, default=True)
    created_on = Column('created_on', DateTime(timezone=False), nullable=False, default=datetime.datetime.now)
    _user_data = Column("user_data", LargeBinary(), nullable=True)  # JSON data

    user_log = relationship('UserLog')
    user_perms = relationship('UserToPerm', primaryjoin="User.user_id==UserToPerm.user_id", cascade='all')

    repositories = relationship('Repository')
    repository_groups = relationship('RepoGroup')
    user_groups = relationship('UserGroup')

    user_followers = relationship('UserFollowing', primaryjoin='UserFollowing.follows_user_id==User.user_id', cascade='all')
    followings = relationship('UserFollowing', primaryjoin='UserFollowing.user_id==User.user_id', cascade='all')

    repo_to_perm = relationship('UserRepoToPerm', primaryjoin='UserRepoToPerm.user_id==User.user_id', cascade='all')
    repo_group_to_perm = relationship('UserRepoGroupToPerm', primaryjoin='UserRepoGroupToPerm.user_id==User.user_id', cascade='all')
    user_group_to_perm = relationship('UserUserGroupToPerm', primaryjoin='UserUserGroupToPerm.user_id==User.user_id', cascade='all')

    group_member = relationship('UserGroupMember', cascade='all')

    notifications = relationship('UserNotification', cascade='all')
    # notifications assigned to this user
    user_created_notifications = relationship('Notification', cascade='all')
    # comments created by this user
    user_comments = relationship('ChangesetComment', cascade='all')
    # user profile extra info
    user_emails = relationship('UserEmailMap', cascade='all')
    user_ip_map = relationship('UserIpMap', cascade='all')
    user_auth_tokens = relationship('UserApiKeys', cascade='all')
    # gists
    user_gists = relationship('Gist', cascade='all')
    # user pull requests
    user_pull_requests = relationship('PullRequest', cascade='all')
    # external identities
    extenal_identities = relationship(
        'ExternalIdentity',
        primaryjoin="User.user_id==ExternalIdentity.local_user_id",
        cascade='all')

    def __unicode__(self):
        return u"<%s('id:%s:%s')>" % (self.__class__.__name__,
                                      self.user_id, self.username)

    @hybrid_property
    def email(self):
        return self._email

    @email.setter
    def email(self, val):
        self._email = val.lower() if val else None

    @property
    def firstname(self):
        # alias for future
        return self.name

    @property
    def emails(self):
        other = UserEmailMap.query().filter(UserEmailMap.user == self).all()
        return [self.email] + [x.email for x in other]

    @property
    def auth_tokens(self):
        return [self.api_key] + [x.api_key for x in self.extra_auth_tokens]

    @property
    def extra_auth_tokens(self):
        return UserApiKeys.query().filter(UserApiKeys.user == self).all()

    @property
    def feed_token(self):
        feed_tokens = UserApiKeys.query()\
            .filter(UserApiKeys.user == self)\
            .filter(UserApiKeys.role == UserApiKeys.ROLE_FEED)\
            .all()
        if feed_tokens:
            return feed_tokens[0].api_key
        else:
            # use the main token so we don't end up with nothing...
            return self.api_key

    @classmethod
    def extra_valid_auth_tokens(cls, user, role=None):
        tokens = UserApiKeys.query().filter(UserApiKeys.user == user)\
            .filter(or_(UserApiKeys.expires == -1,
                        UserApiKeys.expires >= time.time()))
        if role:
            tokens = tokens.filter(or_(UserApiKeys.role == role,
                                       UserApiKeys.role == UserApiKeys.ROLE_ALL))
        return tokens.all()

    @property
    def ip_addresses(self):
        ret = UserIpMap.query().filter(UserIpMap.user == self).all()
        return [x.ip_addr for x in ret]

    @property
    def username_and_name(self):
        return '%s (%s %s)' % (self.username, self.firstname, self.lastname)

    @property
    def username_or_name_or_email(self):
        full_name = self.full_name if self.full_name != ' ' else None
        return self.username or full_name or self.email


    @property
    def full_name(self):
        return '%s %s' % (self.firstname, self.lastname)

    @property
    def full_name_or_username(self):
        return ('%s %s' % (self.firstname, self.lastname)
                if (self.firstname and self.lastname) else self.username)

    @property
    def full_contact(self):
        return '%s %s <%s>' % (self.firstname, self.lastname, self.email)

    @property
    def short_contact(self):
        return '%s %s' % (self.firstname, self.lastname)

    @property
    def is_admin(self):
        return self.admin

    @property
    def AuthUser(self):
        """
        Returns instance of AuthUser for this user
        """
        from rhodecode.lib.auth import AuthUser
        return AuthUser(user_id=self.user_id, api_key=self.api_key,
                        username=self.username)

    @hybrid_property
    def user_data(self):
        if not self._user_data:
            return {}

        try:
            return json.loads(self._user_data)
        except TypeError:
            return {}

    @user_data.setter
    def user_data(self, val):
        if not isinstance(val, dict):
            raise Exception('user_data must be dict, got %s' % type(val))
        try:
            self._user_data = json.dumps(val)
        except Exception:
            log.error(traceback.format_exc())

    @classmethod
    def get_by_username(cls, username, case_insensitive=False,
                        cache=False, identity_cache=False):
        session = Session()

        if case_insensitive:
            q = cls.query().filter(
                func.lower(cls.username) == func.lower(username))
        else:
            q = cls.query().filter(cls.username == username)

        if cache:
            if identity_cache:
                val = cls.identity_cache(session, 'username', username)
                if val:
                    return val
            else:
                q = q.options(
                    FromCache("sql_cache_short",
                              "get_user_by_name_%s" % _hash_key(username)))

        return q.scalar()

    @classmethod
    def get_by_auth_token(cls, auth_token, cache=False, fallback=True):
        q = cls.query().filter(cls.api_key == auth_token)

        if cache:
            q = q.options(FromCache("sql_cache_short",
                                    "get_auth_token_%s" % auth_token))
        res = q.scalar()

        if fallback and not res:
            # fallback to additional keys
            _res = UserApiKeys.query()\
                .filter(UserApiKeys.api_key == auth_token)\
                .filter(or_(UserApiKeys.expires == -1,
                            UserApiKeys.expires >= time.time()))\
                .first()
            if _res:
                res = _res.user
        return res

    @classmethod
    def get_by_email(cls, email, case_insensitive=False, cache=False):

        if case_insensitive:
            q = cls.query().filter(func.lower(cls.email) == func.lower(email))

        else:
            q = cls.query().filter(cls.email == email)

        if cache:
            q = q.options(FromCache("sql_cache_short",
                                    "get_email_key_%s" % _hash_key(email)))

        ret = q.scalar()
        if ret is None:
            q = UserEmailMap.query()
            # try fetching in alternate email map
            if case_insensitive:
                q = q.filter(func.lower(UserEmailMap.email) == func.lower(email))
            else:
                q = q.filter(UserEmailMap.email == email)
            q = q.options(joinedload(UserEmailMap.user))
            if cache:
                q = q.options(FromCache("sql_cache_short",
                                        "get_email_map_key_%s" % email))
            ret = getattr(q.scalar(), 'user', None)

        return ret

    @classmethod
    def get_from_cs_author(cls, author):
        """
        Tries to get User objects out of commit author string

        :param author:
        """
        from rhodecode.lib.helpers import email, author_name
        # Valid email in the attribute passed, see if they're in the system
        _email = email(author)
        if _email:
            user = cls.get_by_email(_email, case_insensitive=True)
            if user:
                return user
        # Maybe we can match by username?
        _author = author_name(author)
        user = cls.get_by_username(_author, case_insensitive=True)
        if user:
            return user

    def update_userdata(self, **kwargs):
        usr = self
        old = usr.user_data
        old.update(**kwargs)
        usr.user_data = old
        Session().add(usr)
        log.debug('updated userdata with %s', kwargs)

    def update_lastlogin(self):
        """Update user lastlogin"""
        self.last_login = datetime.datetime.now()
        Session().add(self)
        log.debug('updated user %s lastlogin', self.username)

    def update_lastactivity(self):
        """Update user lastactivity"""
        usr = self
        old = usr.user_data
        old.update({'last_activity': time.time()})
        usr.user_data = old
        Session().add(usr)
        log.debug('updated user %s lastactivity', usr.username)

    def update_password(self, new_password, change_api_key=False):
        from rhodecode.lib.auth import get_crypt_password, generate_auth_token

        self.password = get_crypt_password(new_password)
        if change_api_key:
            self.api_key = generate_auth_token(self.username)
        Session().add(self)

    @classmethod
    def get_first_super_admin(cls):
        user = User.query().filter(User.admin == true()).first()
        if user is None:
            raise Exception('FATAL: Missing administrative account!')
        return user

    @classmethod
    def get_all_super_admins(cls):
        """
        Returns all admin accounts sorted by username
        """
        return User.query().filter(User.admin == true())\
            .order_by(User.username.asc()).all()

    @classmethod
    def get_default_user(cls, cache=False):
        user = User.get_by_username(User.DEFAULT_USER, cache=cache)
        if user is None:
            raise Exception('FATAL: Missing default account!')
        return user

    def _get_default_perms(self, user, suffix=''):
        from rhodecode.model.permission import PermissionModel
        return PermissionModel().get_default_perms(user.user_perms, suffix)

    def get_default_perms(self, suffix=''):
        return self._get_default_perms(self, suffix)

    def get_api_data(self, include_secrets=False, details='full'):
        """
        Common function for generating user related data for API

        :param include_secrets: By default secrets in the API data will be
            replaced by a placeholder value to prevent exposing this data by
            accident. In case this data shall be exposed, set this flag to
            ``True``.

        :param details: details can be 'basic' or 'full'; 'basic' gives only
            a subset of the available user information that includes user_id,
            name and emails.
        """
        user = self
        user_data = self.user_data
        data = {
            'user_id': user.user_id,
            'username': user.username,
            'firstname': user.name,
            'lastname': user.lastname,
            'email': user.email,
            'emails': user.emails,
        }
        if details == 'basic':
            return data

        api_key_length = 40
        api_key_replacement = '*' * api_key_length

        extras = {
            'api_key': api_key_replacement,
            'api_keys': [api_key_replacement],
            'active': user.active,
            'admin': user.admin,
            'extern_type': user.extern_type,
            'extern_name': user.extern_name,
            'last_login': user.last_login,
            'ip_addresses': user.ip_addresses,
            'language': user_data.get('language')
        }
        data.update(extras)

        if include_secrets:
            data['api_key'] = user.api_key
            data['api_keys'] = user.auth_tokens
        return data

    def __json__(self):
        data = {
            'full_name': self.full_name,
            'full_name_or_username': self.full_name_or_username,
            'short_contact': self.short_contact,
            'full_contact': self.full_contact,
        }
        data.update(self.get_api_data())
        return data

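# Usage sketch for the User lookup helpers above (values hypothetical):
#
#   u = User.get_by_username('admin', case_insensitive=True)
#   u = User.get_by_auth_token(token)    # falls back to UserApiKeys rows
#   u = User.get_by_email('Admin@Example.com', case_insensitive=True)
#   # get_by_email also consults UserEmailMap when no primary email matches.
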
class UserApiKeys(Base, BaseModel):
    __tablename__ = 'user_api_keys'
    __table_args__ = (
        Index('uak_api_key_idx', 'api_key'),
        Index('uak_api_key_expires_idx', 'api_key', 'expires'),
        UniqueConstraint('api_key'),
        {'extend_existing': True, 'mysql_engine': 'InnoDB',
         'mysql_charset': 'utf8', 'sqlite_autoincrement': True}
    )
    __mapper_args__ = {}

    # ApiKey role
    ROLE_ALL = 'token_role_all'
    ROLE_HTTP = 'token_role_http'
    ROLE_VCS = 'token_role_vcs'
    ROLE_API = 'token_role_api'
    ROLE_FEED = 'token_role_feed'
    ROLES = [ROLE_ALL, ROLE_HTTP, ROLE_VCS, ROLE_API, ROLE_FEED]

    user_api_key_id = Column("user_api_key_id", Integer(), nullable=False, unique=True, default=None, primary_key=True)
    user_id = Column("user_id", Integer(), ForeignKey('users.user_id'), nullable=True, unique=None, default=None)
    api_key = Column("api_key", String(255), nullable=False, unique=True)
    description = Column('description', UnicodeText().with_variant(UnicodeText(1024), 'mysql'))
    expires = Column('expires', Float(53), nullable=False)
    role = Column('role', String(255), nullable=True)
    created_on = Column('created_on', DateTime(timezone=False), nullable=False, default=datetime.datetime.now)

    user = relationship('User', lazy='joined')

    @classmethod
    def _get_role_name(cls, role):
        return {
            cls.ROLE_ALL: _('all'),
            cls.ROLE_HTTP: _('http/web interface'),
            cls.ROLE_VCS: _('vcs (git/hg/svn protocol)'),
            cls.ROLE_API: _('api calls'),
            cls.ROLE_FEED: _('feed access'),
        }.get(role, role)

    @property
    def expired(self):
        if self.expires == -1:
            return False
        return time.time() > self.expires

    @property
    def role_humanized(self):
        return self._get_role_name(self.role)

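# Sketch of the token expiry convention used by UserApiKeys (hypothetical
# values): expires == -1 marks a token that never expires, anything else is
# compared against time.time().
#
#   key = UserApiKeys()
#   key.expires = -1
#   key.expired                      # -> False, -1 short-circuits the check
#   key.expires = time.time() - 1
#   key.expired                      # -> True
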
class UserEmailMap(Base, BaseModel):
    __tablename__ = 'user_email_map'
    __table_args__ = (
        Index('uem_email_idx', 'email'),
        UniqueConstraint('email'),
        {'extend_existing': True, 'mysql_engine': 'InnoDB',
         'mysql_charset': 'utf8', 'sqlite_autoincrement': True}
    )
    __mapper_args__ = {}

    email_id = Column("email_id", Integer(), nullable=False, unique=True, default=None, primary_key=True)
    user_id = Column("user_id", Integer(), ForeignKey('users.user_id'), nullable=True, unique=None, default=None)
    _email = Column("email", String(255), nullable=True, unique=False, default=None)
    user = relationship('User', lazy='joined')

    @validates('_email')
    def validate_email(self, key, email):
        # check that this email is not already someone's main one
        main_email = Session().query(User).filter(User.email == email).scalar()
        if main_email is not None:
            raise AttributeError('email %s is present in user table' % email)
        return email

    @hybrid_property
    def email(self):
        return self._email

    @email.setter
    def email(self, val):
        self._email = val.lower() if val else None


class UserIpMap(Base, BaseModel):
    __tablename__ = 'user_ip_map'
    __table_args__ = (
        UniqueConstraint('user_id', 'ip_addr'),
        {'extend_existing': True, 'mysql_engine': 'InnoDB',
         'mysql_charset': 'utf8', 'sqlite_autoincrement': True}
    )
    __mapper_args__ = {}

    ip_id = Column("ip_id", Integer(), nullable=False, unique=True, default=None, primary_key=True)
    user_id = Column("user_id", Integer(), ForeignKey('users.user_id'), nullable=True, unique=None, default=None)
    ip_addr = Column("ip_addr", String(255), nullable=True, unique=False, default=None)
    active = Column("active", Boolean(), nullable=True, unique=None, default=True)
    description = Column("description", String(10000), nullable=True, unique=None, default=None)
    user = relationship('User', lazy='joined')

    @classmethod
    def _get_ip_range(cls, ip_addr):
        net = ipaddress.ip_network(ip_addr, strict=False)
        return [str(net.network_address), str(net.broadcast_address)]

    def __json__(self):
        return {
            'ip_addr': self.ip_addr,
            'ip_range': self._get_ip_range(self.ip_addr),
        }

    def __unicode__(self):
        return u"<%s('user_id:%s=>%s')>" % (self.__class__.__name__,
                                            self.user_id, self.ip_addr)

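# Usage sketch for UserIpMap._get_ip_range above (hypothetical network):
# strict=False lets a host address with a mask through, and the range is
# reported as [network_address, broadcast_address].
#
#   UserIpMap._get_ip_range(u'10.0.0.5/24')
#   # -> ['10.0.0.0', '10.0.0.255']
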
class UserLog(Base, BaseModel):
    __tablename__ = 'user_logs'
    __table_args__ = (
        {'extend_existing': True, 'mysql_engine': 'InnoDB',
         'mysql_charset': 'utf8', 'sqlite_autoincrement': True},
    )
    user_log_id = Column("user_log_id", Integer(), nullable=False, unique=True, default=None, primary_key=True)
    user_id = Column("user_id", Integer(), ForeignKey('users.user_id'), nullable=True, unique=None, default=None)
    username = Column("username", String(255), nullable=True, unique=None, default=None)
    repository_id = Column("repository_id", Integer(), ForeignKey('repositories.repo_id'), nullable=True)
    repository_name = Column("repository_name", String(255), nullable=True, unique=None, default=None)
    user_ip = Column("user_ip", String(255), nullable=True, unique=None, default=None)
    action = Column("action", Text().with_variant(Text(1200000), 'mysql'), nullable=True, unique=None, default=None)
    action_date = Column("action_date", DateTime(timezone=False), nullable=True, unique=None, default=None)

    def __unicode__(self):
        return u"<%s('id:%s:%s')>" % (self.__class__.__name__,
                                      self.repository_name,
                                      self.action)

    @property
    def action_as_day(self):
        return datetime.date(*self.action_date.timetuple()[:3])

    user = relationship('User')
    repository = relationship('Repository', cascade='')


class UserGroup(Base, BaseModel):
    __tablename__ = 'users_groups'
    __table_args__ = (
        {'extend_existing': True, 'mysql_engine': 'InnoDB',
         'mysql_charset': 'utf8', 'sqlite_autoincrement': True},
    )

    users_group_id = Column("users_group_id", Integer(), nullable=False, unique=True, default=None, primary_key=True)
    users_group_name = Column("users_group_name", String(255), nullable=False, unique=True, default=None)
    user_group_description = Column("user_group_description", String(10000), nullable=True, unique=None, default=None)
    users_group_active = Column("users_group_active", Boolean(), nullable=True, unique=None, default=None)
    inherit_default_permissions = Column("users_group_inherit_default_permissions", Boolean(), nullable=False, unique=None, default=True)
    user_id = Column("user_id", Integer(), ForeignKey('users.user_id'), nullable=False, unique=False, default=None)
    created_on = Column('created_on', DateTime(timezone=False), nullable=False, default=datetime.datetime.now)
    _group_data = Column("group_data", LargeBinary(), nullable=True)  # JSON data

    members = relationship('UserGroupMember', cascade="all, delete, delete-orphan", lazy="joined")
    users_group_to_perm = relationship('UserGroupToPerm', cascade='all')
    users_group_repo_to_perm = relationship('UserGroupRepoToPerm', cascade='all')
    users_group_repo_group_to_perm = relationship('UserGroupRepoGroupToPerm', cascade='all')
    user_user_group_to_perm = relationship('UserUserGroupToPerm', cascade='all')
    user_group_user_group_to_perm = relationship('UserGroupUserGroupToPerm', primaryjoin="UserGroupUserGroupToPerm.target_user_group_id==UserGroup.users_group_id", cascade='all')

    user = relationship('User')

    @hybrid_property
    def group_data(self):
        if not self._group_data:
            return {}

        try:
            return json.loads(self._group_data)
        except TypeError:
            return {}

    @group_data.setter
    def group_data(self, val):
        try:
            self._group_data = json.dumps(val)
        except Exception:
            log.error(traceback.format_exc())

    def __unicode__(self):
        return u"<%s('id:%s:%s')>" % (self.__class__.__name__,
                                      self.users_group_id,
                                      self.users_group_name)

    @classmethod
    def get_by_group_name(cls, group_name, cache=False,
                          case_insensitive=False):
        if case_insensitive:
            q = cls.query().filter(func.lower(cls.users_group_name) ==
                                   func.lower(group_name))

        else:
            q = cls.query().filter(cls.users_group_name == group_name)
        if cache:
            q = q.options(FromCache(
                "sql_cache_short",
                "get_group_%s" % _hash_key(group_name)))
        return q.scalar()

    @classmethod
    def get(cls, user_group_id, cache=False):
        user_group = cls.query()
        if cache:
            user_group = user_group.options(FromCache(
                "sql_cache_short",
                "get_users_group_%s" % user_group_id))
        return user_group.get(user_group_id)

    def permissions(self, with_admins=True, with_owner=True):
        q = UserUserGroupToPerm.query().filter(UserUserGroupToPerm.user_group == self)
        q = q.options(joinedload(UserUserGroupToPerm.user_group),
                      joinedload(UserUserGroupToPerm.user),
                      joinedload(UserUserGroupToPerm.permission),)

        # get owners and admins and permissions. We do a trick of re-writing
        # objects from sqlalchemy to named-tuples because the sqlalchemy
        # session keeps a global reference, so changing one object would
        # propagate to all others. Without the copy, if an admin is also an
        # owner, setting admin_row on one row would change both objects.
        perm_rows = []
        for _usr in q.all():
            usr = AttributeDict(_usr.user.get_dict())
            usr.permission = _usr.permission.permission_name
            perm_rows.append(usr)

        # filter the perm rows by 'default' first and then sort them by
        # admin,write,read,none permissions sorted again alphabetically in
        # each group
        perm_rows = sorted(perm_rows, key=display_sort)

        _admin_perm = 'usergroup.admin'
        owner_row = []
        if with_owner:
            usr = AttributeDict(self.user.get_dict())
            usr.owner_row = True
            usr.permission = _admin_perm
            owner_row.append(usr)

        super_admin_rows = []
        if with_admins:
            for usr in User.get_all_super_admins():
                # if this admin is also owner, don't double the record
                if usr.user_id == owner_row[0].user_id:
                    owner_row[0].admin_row = True
                else:
                    usr = AttributeDict(usr.get_dict())
                    usr.admin_row = True
                    usr.permission = _admin_perm
                    super_admin_rows.append(usr)

        return super_admin_rows + owner_row + perm_rows

    def permission_user_groups(self):
        q = UserGroupUserGroupToPerm.query().filter(UserGroupUserGroupToPerm.target_user_group == self)
        q = q.options(joinedload(UserGroupUserGroupToPerm.user_group),
                      joinedload(UserGroupUserGroupToPerm.target_user_group),
                      joinedload(UserGroupUserGroupToPerm.permission),)

        perm_rows = []
        for _user_group in q.all():
            usr = AttributeDict(_user_group.user_group.get_dict())
            usr.permission = _user_group.permission.permission_name
            perm_rows.append(usr)

        return perm_rows

    def _get_default_perms(self, user_group, suffix=''):
        from rhodecode.model.permission import PermissionModel
        return PermissionModel().get_default_perms(user_group.users_group_to_perm, suffix)

    def get_default_perms(self, suffix=''):
        return self._get_default_perms(self, suffix)

    def get_api_data(self, with_group_members=True, include_secrets=False):
        """
        :param include_secrets: See :meth:`User.get_api_data`, this parameter
            is basically forwarded.

        """
        user_group = self

        data = {
            'users_group_id': user_group.users_group_id,
            'group_name': user_group.users_group_name,
            'group_description': user_group.user_group_description,
            'active': user_group.users_group_active,
            'owner': user_group.user.username,
        }
        if with_group_members:
            users = []
            for user in user_group.members:
                user = user.user
                users.append(user.get_api_data(include_secrets=include_secrets))
            data['users'] = users

        return data

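# Sketch of the row ordering produced by UserGroup.permissions() above
# (names hypothetical): super-admin rows come first, then the owner row,
# then the explicit rows sorted by display_sort.
#
#   rows = a_user_group.permissions(with_admins=True, with_owner=True)
#   # -> [super admins..., owner, admin/write/read/none rows, each block
#   #     sorted alphabetically]
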
class UserGroupMember(Base, BaseModel):
    __tablename__ = 'users_groups_members'
    __table_args__ = (
        {'extend_existing': True, 'mysql_engine': 'InnoDB',
         'mysql_charset': 'utf8', 'sqlite_autoincrement': True},
    )

    users_group_member_id = Column("users_group_member_id", Integer(), nullable=False, unique=True, default=None, primary_key=True)
    users_group_id = Column("users_group_id", Integer(), ForeignKey('users_groups.users_group_id'), nullable=False, unique=None, default=None)
    user_id = Column("user_id", Integer(), ForeignKey('users.user_id'), nullable=False, unique=None, default=None)

    user = relationship('User', lazy='joined')
    users_group = relationship('UserGroup')

    def __init__(self, gr_id='', u_id=''):
        self.users_group_id = gr_id
        self.user_id = u_id


class RepositoryField(Base, BaseModel):
    __tablename__ = 'repositories_fields'
    __table_args__ = (
        UniqueConstraint('repository_id', 'field_key'),  # no-multi field
        {'extend_existing': True, 'mysql_engine': 'InnoDB',
         'mysql_charset': 'utf8', 'sqlite_autoincrement': True},
    )
    PREFIX = 'ex_'  # prefix used in form to not conflict with already existing fields

    repo_field_id = Column("repo_field_id", Integer(), nullable=False, unique=True, default=None, primary_key=True)
    repository_id = Column("repository_id", Integer(), ForeignKey('repositories.repo_id'), nullable=False, unique=None, default=None)
    field_key = Column("field_key", String(250))
    field_label = Column("field_label", String(1024), nullable=False)
    field_value = Column("field_value", String(10000), nullable=False)
    field_desc = Column("field_desc", String(1024), nullable=False)
    field_type = Column("field_type", String(255), nullable=False, unique=None)
    created_on = Column('created_on', DateTime(timezone=False), nullable=False, default=datetime.datetime.now)

    repository = relationship('Repository')

    @property
    def field_key_prefixed(self):
        return 'ex_%s' % self.field_key

    @classmethod
    def un_prefix_key(cls, key):
        if key.startswith(cls.PREFIX):
            return key[len(cls.PREFIX):]
        return key

    @classmethod
    def get_by_key_name(cls, key, repo):
        row = cls.query()\
            .filter(cls.repository == repo)\
            .filter(cls.field_key == key).scalar()
        return row

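# Usage sketch for the RepositoryField key prefixing above (hypothetical
# key): PREFIX is prepended for form rendering and stripped again on the
# way back in.
#
#   field = RepositoryField()
#   field.field_key = 'ticket_url'
#   field.field_key_prefixed                         # -> 'ex_ticket_url'
#   RepositoryField.un_prefix_key('ex_ticket_url')   # -> 'ticket_url'
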
class Repository(Base, BaseModel):
    __tablename__ = 'repositories'
    __table_args__ = (
        Index('r_repo_name_idx', 'repo_name', mysql_length=255),
        {'extend_existing': True, 'mysql_engine': 'InnoDB',
         'mysql_charset': 'utf8', 'sqlite_autoincrement': True},
    )
    DEFAULT_CLONE_URI = '{scheme}://{user}@{netloc}/{repo}'
    DEFAULT_CLONE_URI_ID = '{scheme}://{user}@{netloc}/_{repoid}'

    STATE_CREATED = 'repo_state_created'
    STATE_PENDING = 'repo_state_pending'
    STATE_ERROR = 'repo_state_error'

    LOCK_AUTOMATIC = 'lock_auto'
    LOCK_API = 'lock_api'
    LOCK_WEB = 'lock_web'
    LOCK_PULL = 'lock_pull'

    NAME_SEP = URL_SEP

    repo_id = Column(
        "repo_id", Integer(), nullable=False, unique=True, default=None,
        primary_key=True)
    _repo_name = Column(
        "repo_name", Text(), nullable=False, default=None)
    _repo_name_hash = Column(
        "repo_name_hash", String(255), nullable=False, unique=True)
    repo_state = Column("repo_state", String(255), nullable=True)

    clone_uri = Column(
        "clone_uri", EncryptedTextValue(), nullable=True, unique=False,
        default=None)
    repo_type = Column(
        "repo_type", String(255), nullable=False, unique=False, default=None)
    user_id = Column(
        "user_id", Integer(), ForeignKey('users.user_id'), nullable=False,
        unique=False, default=None)
    private = Column(
        "private", Boolean(), nullable=True, unique=None, default=None)
    enable_statistics = Column(
        "statistics", Boolean(), nullable=True, unique=None, default=True)
    enable_downloads = Column(
        "downloads", Boolean(), nullable=True, unique=None, default=True)
    description = Column(
        "description", String(10000), nullable=True, unique=None, default=None)
    created_on = Column(
        'created_on', DateTime(timezone=False), nullable=True, unique=None,
        default=datetime.datetime.now)
    updated_on = Column(
        'updated_on', DateTime(timezone=False), nullable=True, unique=None,
        default=datetime.datetime.now)
    _landing_revision = Column(
        "landing_revision", String(255), nullable=False, unique=False,
        default=None)
    enable_locking = Column(
        "enable_locking", Boolean(), nullable=False, unique=None,
        default=False)
    _locked = Column(
        "locked", String(255), nullable=True, unique=False, default=None)
    _changeset_cache = Column(
        "changeset_cache", LargeBinary(), nullable=True)  # JSON data

    fork_id = Column(
        "fork_id", Integer(), ForeignKey('repositories.repo_id'),
        nullable=True, unique=False, default=None)
    group_id = Column(
        "group_id", Integer(), ForeignKey('groups.group_id'), nullable=True,
        unique=False, default=None)

    user = relationship('User', lazy='joined')
    fork = relationship('Repository', remote_side=repo_id, lazy='joined')
    group = relationship('RepoGroup', lazy='joined')
    repo_to_perm = relationship(
        'UserRepoToPerm', cascade='all',
        order_by='UserRepoToPerm.repo_to_perm_id')
    users_group_to_perm = relationship('UserGroupRepoToPerm', cascade='all')
    stats = relationship('Statistics', cascade='all', uselist=False)

    followers = relationship(
        'UserFollowing',
        primaryjoin='UserFollowing.follows_repo_id==Repository.repo_id',
        cascade='all')
    extra_fields = relationship(
        'RepositoryField', cascade="all, delete, delete-orphan")
    logs = relationship('UserLog')
    comments = relationship(
        'ChangesetComment', cascade="all, delete, delete-orphan")
    pull_requests_source = relationship(
        'PullRequest',
        primaryjoin='PullRequest.source_repo_id==Repository.repo_id',
        cascade="all, delete, delete-orphan")
    pull_requests_target = relationship(
        'PullRequest',
        primaryjoin='PullRequest.target_repo_id==Repository.repo_id',
        cascade="all, delete, delete-orphan")
    ui = relationship('RepoRhodeCodeUi', cascade="all")
    settings = relationship('RepoRhodeCodeSetting', cascade="all")
    integrations = relationship('Integration',
                                cascade="all, delete, delete-orphan")

    def __unicode__(self):
        return u"<%s('%s:%s')>" % (self.__class__.__name__, self.repo_id,
                                   safe_unicode(self.repo_name))

    @hybrid_property
    def landing_rev(self):
        # always should return [rev_type, rev]
        if self._landing_revision:
            _rev_info = self._landing_revision.split(':')
            if len(_rev_info) < 2:
                _rev_info.insert(0, 'rev')
            return [_rev_info[0], _rev_info[1]]
        return [None, None]

    @landing_rev.setter
    def landing_rev(self, val):
        if ':' not in val:
            raise ValueError('value must be delimited with `:` and consist '
                             'of <rev_type>:<rev>, got %s instead' % val)
        self._landing_revision = val

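    # Usage sketch for the landing_rev property above (hypothetical values):
    # the value is stored as '<rev_type>:<rev>' and always read back as a two
    # element list; bare legacy values without a type prefix come back as
    # type 'rev'.
    #
    #   repo.landing_rev = 'branch:default'
    #   repo.landing_rev                 # -> ['branch', 'default']
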
    @hybrid_property
    def locked(self):
        if self._locked:
            user_id, timelocked, reason = self._locked.split(':')
            lock_values = int(user_id), timelocked, reason
        else:
            lock_values = [None, None, None]
        return lock_values

    @locked.setter
    def locked(self, val):
        if val and isinstance(val, (list, tuple)):
            self._locked = ':'.join(map(str, val))
        else:
            self._locked = None

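    # Usage sketch for the locked property above (hypothetical values): the
    # lock is persisted as 'user_id:timestamp:reason' and unpacked on read,
    # with the user id coerced back to int.
    #
    #   repo.locked = [2, time.time(), Repository.LOCK_API]
    #   user_id, locked_since, reason = repo.locked
    #   repo.locked = None               # releases the lock
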
    @hybrid_property
    def changeset_cache(self):
        from rhodecode.lib.vcs.backends.base import EmptyCommit
        dummy = EmptyCommit().__json__()
        if not self._changeset_cache:
            return dummy
        try:
            return json.loads(self._changeset_cache)
        except TypeError:
            return dummy
        except Exception:
            log.error(traceback.format_exc())
            return dummy

    @changeset_cache.setter
    def changeset_cache(self, val):
        try:
            self._changeset_cache = json.dumps(val)
        except Exception:
            log.error(traceback.format_exc())

    @hybrid_property
    def repo_name(self):
        return self._repo_name

    @repo_name.setter
    def repo_name(self, value):
        self._repo_name = value
        self._repo_name_hash = hashlib.sha1(safe_str(value)).hexdigest()

    @classmethod
    def normalize_repo_name(cls, repo_name):
        """
        Normalizes os specific repo_name to the format internally stored inside
        database using URL_SEP

        :param cls:
        :param repo_name:
        """
        return cls.NAME_SEP.join(repo_name.split(os.sep))

    @classmethod
    def get_by_repo_name(cls, repo_name, cache=False, identity_cache=False):
        session = Session()
        q = session.query(cls).filter(cls.repo_name == repo_name)

        if cache:
            if identity_cache:
                val = cls.identity_cache(session, 'repo_name', repo_name)
                if val:
                    return val
            else:
                q = q.options(
                    FromCache("sql_cache_short",
                              "get_repo_by_name_%s" % _hash_key(repo_name)))

        return q.scalar()

    @classmethod
    def get_by_full_path(cls, repo_full_path):
        repo_name = repo_full_path.split(cls.base_path(), 1)[-1]
        repo_name = cls.normalize_repo_name(repo_name)
        return cls.get_by_repo_name(repo_name.strip(URL_SEP))

    @classmethod
    def get_repo_forks(cls, repo_id):
        return cls.query().filter(Repository.fork_id == repo_id)

    @classmethod
    def base_path(cls):
        """
        Returns the base path where all repos are stored

        :param cls:
        """
        q = Session().query(RhodeCodeUi)\
            .filter(RhodeCodeUi.ui_key == cls.NAME_SEP)
        q = q.options(FromCache("sql_cache_short", "repository_repo_path"))
        return q.one().ui_value

    @classmethod
    def is_valid(cls, repo_name):
        """
        returns True if given repo name is a valid filesystem repository

        :param cls:
        :param repo_name:
        """
        from rhodecode.lib.utils import is_valid_repo

        return is_valid_repo(repo_name, cls.base_path())

    @classmethod
    def get_all_repos(cls, user_id=Optional(None), group_id=Optional(None),
                      case_insensitive=True):
        q = Repository.query()

        if not isinstance(user_id, Optional):
            q = q.filter(Repository.user_id == user_id)

        if not isinstance(group_id, Optional):
            q = q.filter(Repository.group_id == group_id)

        if case_insensitive:
            q = q.order_by(func.lower(Repository.repo_name))
        else:
            q = q.order_by(Repository.repo_name)
        return q.all()

    @property
    def forks(self):
        """
        Return forks of this repo
        """
        return Repository.get_repo_forks(self.repo_id)

    @property
    def parent(self):
        """
        Returns fork parent
        """
        return self.fork

    @property
    def just_name(self):
        return self.repo_name.split(self.NAME_SEP)[-1]

    @property
    def groups_with_parents(self):
        groups = []
        if self.group is None:
            return groups

        cur_gr = self.group
        groups.insert(0, cur_gr)
        while 1:
            gr = getattr(cur_gr, 'parent_group', None)
            cur_gr = cur_gr.parent_group
            if gr is None:
                break
            groups.insert(0, gr)

        return groups

    @property
    def groups_and_repo(self):
        return self.groups_with_parents, self

    @LazyProperty
    def repo_path(self):
        """
        Returns base full path for that repository, meaning where it
        actually exists on a filesystem
        """
        q = Session().query(RhodeCodeUi).filter(
            RhodeCodeUi.ui_key == self.NAME_SEP)
        q = q.options(FromCache("sql_cache_short", "repository_repo_path"))
        return q.one().ui_value

    @property
    def repo_full_path(self):
        p = [self.repo_path]
        # we need to split the name by / since this is how we store the
        # names in the database, but that eventually needs to be converted
        # into a valid system path
        p += self.repo_name.split(self.NAME_SEP)
        return os.path.join(*map(safe_unicode, p))

    @property
    def cache_keys(self):
        """
        Returns associated cache keys for that repo
        """
        return CacheKey.query()\
            .filter(CacheKey.cache_args == self.repo_name)\
            .order_by(CacheKey.cache_key)\
            .all()

    def get_new_name(self, repo_name):
        """
        returns new full repository name based on assigned group and the new
        repo_name

        :param repo_name:
        """
        path_prefix = self.group.full_path_splitted if self.group else []
        return self.NAME_SEP.join(path_prefix + [repo_name])

    @property
    def _config(self):
        """
        Returns db based config object.
        """
        from rhodecode.lib.utils import make_db_config
        return make_db_config(clear_session=False, repo=self)

    def permissions(self, with_admins=True, with_owner=True):
        q = UserRepoToPerm.query().filter(UserRepoToPerm.repository == self)
        q = q.options(joinedload(UserRepoToPerm.repository),
                      joinedload(UserRepoToPerm.user),
                      joinedload(UserRepoToPerm.permission),)

        # get owners and admins and permissions. We do a trick of re-writing
        # objects from sqlalchemy to named-tuples because the sqlalchemy
        # session keeps a global reference, so changing one object would
        # propagate to all others. Without the copy, if an admin is also an
        # owner, setting admin_row on one row would change both objects.
1582 | perm_rows = [] | |
|
1583 | for _usr in q.all(): | |
|
1584 | usr = AttributeDict(_usr.user.get_dict()) | |
|
1585 | usr.permission = _usr.permission.permission_name | |
|
1586 | perm_rows.append(usr) | |
|
1587 | ||
|
1588 | # filter the perm rows by 'default' first and then sort them by | |
|
1589 | # admin,write,read,none permissions sorted again alphabetically in | |
|
1590 | # each group | |
|
1591 | perm_rows = sorted(perm_rows, key=display_sort) | |
|
1592 | ||
|
1593 | _admin_perm = 'repository.admin' | |
|
1594 | owner_row = [] | |
|
1595 | if with_owner: | |
|
1596 | usr = AttributeDict(self.user.get_dict()) | |
|
1597 | usr.owner_row = True | |
|
1598 | usr.permission = _admin_perm | |
|
1599 | owner_row.append(usr) | |
|
1600 | ||
|
1601 | super_admin_rows = [] | |
|
1602 | if with_admins: | |
|
1603 | for usr in User.get_all_super_admins(): | |
|
1604 | # if this admin is also owner, don't double the record | |
|
1605 | if usr.user_id == owner_row[0].user_id: | |
|
1606 | owner_row[0].admin_row = True | |
|
1607 | else: | |
|
1608 | usr = AttributeDict(usr.get_dict()) | |
|
1609 | usr.admin_row = True | |
|
1610 | usr.permission = _admin_perm | |
|
1611 | super_admin_rows.append(usr) | |
|
1612 | ||
|
1613 | return super_admin_rows + owner_row + perm_rows | |
|
1614 | ||
|
1615 | def permission_user_groups(self): | |
|
1616 | q = UserGroupRepoToPerm.query().filter( | |
|
1617 | UserGroupRepoToPerm.repository == self) | |
|
1618 | q = q.options(joinedload(UserGroupRepoToPerm.repository), | |
|
1619 | joinedload(UserGroupRepoToPerm.users_group), | |
|
1620 | joinedload(UserGroupRepoToPerm.permission),) | |
|
1621 | ||
|
1622 | perm_rows = [] | |
|
1623 | for _user_group in q.all(): | |
|
1624 | usr = AttributeDict(_user_group.users_group.get_dict()) | |
|
1625 | usr.permission = _user_group.permission.permission_name | |
|
1626 | perm_rows.append(usr) | |
|
1627 | ||
|
1628 | return perm_rows | |
|
1629 | ||
|
1630 | def get_api_data(self, include_secrets=False): | |
|
1631 | """ | |
|
1632 | Common function for generating repo api data | |
|
1633 | ||
|
1634 | :param include_secrets: See :meth:`User.get_api_data`. | |
|
1635 | ||
|
1636 | """ | |
|
1637 | # TODO: mikhail: this is an anti-pattern; these methods should | |

1638 | # probably be moved to the model level. | |
|
1639 | from rhodecode.model.settings import SettingsModel | |
|
1640 | ||
|
1641 | repo = self | |
|
1642 | _user_id, _time, _reason = self.locked | |
|
1643 | ||
|
1644 | data = { | |
|
1645 | 'repo_id': repo.repo_id, | |
|
1646 | 'repo_name': repo.repo_name, | |
|
1647 | 'repo_type': repo.repo_type, | |
|
1648 | 'clone_uri': repo.clone_uri or '', | |
|
1649 | 'url': url('summary_home', repo_name=self.repo_name, qualified=True), | |
|
1650 | 'private': repo.private, | |
|
1651 | 'created_on': repo.created_on, | |
|
1652 | 'description': repo.description, | |
|
1653 | 'landing_rev': repo.landing_rev, | |
|
1654 | 'owner': repo.user.username, | |
|
1655 | 'fork_of': repo.fork.repo_name if repo.fork else None, | |
|
1656 | 'enable_statistics': repo.enable_statistics, | |
|
1657 | 'enable_locking': repo.enable_locking, | |
|
1658 | 'enable_downloads': repo.enable_downloads, | |
|
1659 | 'last_changeset': repo.changeset_cache, | |
|
1660 | 'locked_by': User.get(_user_id).get_api_data( | |
|
1661 | include_secrets=include_secrets) if _user_id else None, | |
|
1662 | 'locked_date': time_to_datetime(_time) if _time else None, | |
|
1663 | 'lock_reason': _reason if _reason else None, | |
|
1664 | } | |
|
1665 | ||
|
1666 | # TODO: mikhail: should be per-repo settings here | |
|
1667 | rc_config = SettingsModel().get_all_settings() | |
|
1668 | repository_fields = str2bool( | |
|
1669 | rc_config.get('rhodecode_repository_fields')) | |
|
1670 | if repository_fields: | |
|
1671 | for f in self.extra_fields: | |
|
1672 | data[f.field_key_prefixed] = f.field_value | |
|
1673 | ||
|
1674 | return data | |
|
1675 | ||
|
1676 | @classmethod | |
|
1677 | def lock(cls, repo, user_id, lock_time=None, lock_reason=None): | |
|
1678 | if not lock_time: | |
|
1679 | lock_time = time.time() | |
|
1680 | if not lock_reason: | |
|
1681 | lock_reason = cls.LOCK_AUTOMATIC | |
|
1682 | repo.locked = [user_id, lock_time, lock_reason] | |
|
1683 | Session().add(repo) | |
|
1684 | Session().commit() | |
|
1685 | ||
|
1686 | @classmethod | |
|
1687 | def unlock(cls, repo): | |
|
1688 | repo.locked = None | |
|
1689 | Session().add(repo) | |
|
1690 | Session().commit() | |
|
1691 | ||
|
1692 | @classmethod | |
|
1693 | def getlock(cls, repo): | |
|
1694 | return repo.locked | |
|
1695 | ||
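# Illustrative usage sketch (added; `repo` and `user` are hypothetical objects):
#
#     Repository.lock(repo, user.user_id)   # stores [user_id, lock_time, lock_reason]
#     Repository.getlock(repo)              # -> [user_id, lock_time, lock_reason]
#     Repository.unlock(repo)               # clears the lock (sets locked to None)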
|
1696 | def is_user_lock(self, user_id): | |
|
1697 | if self.lock[0]: | |
|
1698 | lock_user_id = safe_int(self.lock[0]) | |
|
1699 | user_id = safe_int(user_id) | |
|
1700 | # both are ints, and they are equal | |
|
1701 | return all([lock_user_id, user_id]) and lock_user_id == user_id | |
|
1702 | ||
|
1703 | return False | |
|
1704 | ||
|
1705 | def get_locking_state(self, action, user_id, only_when_enabled=True): | |
|
1706 | """ | |
|
1707 | Checks locking on this repository. If locking is enabled and a lock | |

1708 | is present, returns a tuple of (make_lock, locked, locked_by). | |

1709 | make_lock has three states: None (do nothing), True (make a lock) | |

1710 | and False (release a lock). The value is later propagated to the | |

1711 | hooks, which perform the actual locking; think of it as a signal. | |
|
1712 | ||
|
1713 | """ | |
|
1714 | # TODO: johbo: This is part of the business logic and should be moved | |
|
1715 | # into the RepositoryModel. | |
|
1716 | ||
|
1717 | if action not in ('push', 'pull'): | |
|
1718 | raise ValueError("Invalid action value: %s" % repr(action)) | |
|
1719 | ||
|
1720 | # defines whether a 'locked' error should be raised to the user | |

1721 | currently_locked = False | |

1722 | # defines whether a new lock should be made; tri-state (None/True/False) | |
|
1723 | make_lock = None | |
|
1724 | repo = self | |
|
1725 | user = User.get(user_id) | |
|
1726 | ||
|
1727 | lock_info = repo.locked | |
|
1728 | ||
|
1729 | if repo and (repo.enable_locking or not only_when_enabled): | |
|
1730 | if action == 'push': | |
|
1731 | # check if it's already locked; if it is, compare users | |
|
1732 | locked_by_user_id = lock_info[0] | |
|
1733 | if user.user_id == locked_by_user_id: | |
|
1734 | log.debug( | |
|
1735 | 'Got `push` action from user %s, now unlocking', user) | |
|
1736 | # unlock if we have push from user who locked | |
|
1737 | make_lock = False | |
|
1738 | else: | |
|
1739 | # a different user holds the lock; reject with the status | |

1740 | # code defined in settings (default is HTTP 423 Locked) | |

1741 | log.debug('Repo %s is currently locked by user id %s', repo, locked_by_user_id) | |
|
1742 | currently_locked = True | |
|
1743 | elif action == 'pull': | |
|
1744 | # [0] user [1] date | |
|
1745 | if lock_info[0] and lock_info[1]: | |
|
1746 | log.debug('Repo %s is currently locked by user id %s', repo, lock_info[0]) | |
|
1747 | currently_locked = True | |
|
1748 | else: | |
|
1749 | log.debug('Setting lock on repo %s by %s', repo, user) | |
|
1750 | make_lock = True | |
|
1751 | ||
|
1752 | else: | |
|
1753 | log.debug('Repository %s does not have locking enabled', repo) | |
|
1754 | ||
|
1755 | log.debug('FINAL locking values make_lock:%s,locked:%s,locked_by:%s', | |
|
1756 | make_lock, currently_locked, lock_info) | |
|
1757 | ||
|
1758 | from rhodecode.lib.auth import HasRepoPermissionAny | |
|
1759 | perm_check = HasRepoPermissionAny('repository.write', 'repository.admin') | |
|
1760 | if make_lock and not perm_check(repo_name=repo.repo_name, user=user): | |
|
1761 | # if we don't have at least write permission we cannot make a lock | |
|
1762 | log.debug('lock state reset back to FALSE due to lack ' | |

1763 | 'of at least write permission') | |
|
1764 | make_lock = False | |
|
1765 | ||
|
1766 | return make_lock, currently_locked, lock_info | |
|
1767 | ||
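# Illustrative usage sketch (added; objects are hypothetical):
#
#     make_lock, locked, locked_by = repo.get_locking_state('push', user.user_id)
#     # make_lock: None = do nothing, True = acquire lock, False = release lock
#     # locked: whether a "repository locked" error should be raised to the caller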
|
1768 | @property | |
|
1769 | def last_db_change(self): | |
|
1770 | return self.updated_on | |
|
1771 | ||
|
1772 | @property | |
|
1773 | def clone_uri_hidden(self): | |
|
1774 | clone_uri = self.clone_uri | |
|
1775 | if clone_uri: | |
|
1776 | import urlobject | |
|
1777 | url_obj = urlobject.URLObject(clone_uri) | |
|
1778 | if url_obj.password: | |
|
1779 | clone_uri = url_obj.with_password('*****') | |
|
1780 | return clone_uri | |
|
1781 | ||
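# Illustrative sketch (added; the URI is hypothetical): the password portion
# of a stored clone URI is masked before display:
#
#     repo.clone_uri          # 'https://bob:secret@example.com/repo'
#     repo.clone_uri_hidden   # 'https://bob:*****@example.com/repo'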
|
1782 | def clone_url(self, **override): | |
|
1783 | qualified_home_url = url('home', qualified=True) | |
|
1784 | ||
|
1785 | uri_tmpl = None | |
|
1786 | if 'with_id' in override: | |
|
1787 | uri_tmpl = self.DEFAULT_CLONE_URI_ID | |
|
1788 | del override['with_id'] | |
|
1789 | ||
|
1790 | if 'uri_tmpl' in override: | |
|
1791 | uri_tmpl = override['uri_tmpl'] | |
|
1792 | del override['uri_tmpl'] | |
|
1793 | ||
|
1794 | # the template was not overridden via **override | |
|
1795 | if not uri_tmpl: | |
|
1796 | uri_tmpl = self.DEFAULT_CLONE_URI | |
|
1797 | try: | |
|
1798 | from pylons import tmpl_context as c | |
|
1799 | uri_tmpl = c.clone_uri_tmpl | |
|
1800 | except Exception: | |
|
1801 | # we may be called outside of a request context, i.e. without | |

1802 | # tmpl_context set up; fall back to the default template | |
|
1803 | pass | |
|
1804 | ||
|
1805 | return get_clone_url(uri_tmpl=uri_tmpl, | |
|
1806 | qualifed_home_url=qualified_home_url, | |
|
1807 | repo_name=self.repo_name, | |
|
1808 | repo_id=self.repo_id, **override) | |
|
1809 | ||
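# Illustrative usage sketch (added):
#
#     repo.clone_url()                 # rendered from the configured template
#     repo.clone_url(with_id=True)     # uses the id-based DEFAULT_CLONE_URI_ID template
#     repo.clone_url(uri_tmpl='...')   # explicit template override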
|
1810 | def set_state(self, state): | |
|
1811 | self.repo_state = state | |
|
1812 | Session().add(self) | |
|
1813 | #========================================================================== | |
|
1814 | # SCM PROPERTIES | |
|
1815 | #========================================================================== | |
|
1816 | ||
|
1817 | def get_commit(self, commit_id=None, commit_idx=None, pre_load=None): | |
|
1818 | return get_commit_safe( | |
|
1819 | self.scm_instance(), commit_id, commit_idx, pre_load=pre_load) | |
|
1820 | ||
|
1821 | def get_changeset(self, rev=None, pre_load=None): | |
|
1822 | warnings.warn("Use get_commit", DeprecationWarning) | |
|
1823 | commit_id = None | |
|
1824 | commit_idx = None | |
|
1825 | if isinstance(rev, basestring): | |
|
1826 | commit_id = rev | |
|
1827 | else: | |
|
1828 | commit_idx = rev | |
|
1829 | return self.get_commit(commit_id=commit_id, commit_idx=commit_idx, | |
|
1830 | pre_load=pre_load) | |
|
1831 | ||
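# Illustrative sketch (added): the deprecated `rev` argument is dispatched on
# type -- a string is treated as a commit id, anything else as a numeric index:
#
#     repo.get_changeset('d38f2c6a...')   # -> get_commit(commit_id='d38f2c6a...')
#     repo.get_changeset(42)              # -> get_commit(commit_idx=42)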
|
1832 | def get_landing_commit(self): | |
|
1833 | """ | |
|
1834 | Returns landing commit, or if that doesn't exist returns the tip | |
|
1835 | """ | |
|
1836 | _rev_type, _rev = self.landing_rev | |
|
1837 | commit = self.get_commit(_rev) | |
|
1838 | if isinstance(commit, EmptyCommit): | |
|
1839 | return self.get_commit() | |
|
1840 | return commit | |
|
1841 | ||
|
1842 | def update_commit_cache(self, cs_cache=None, config=None): | |
|
1843 | """ | |
|
1844 | Update cache of last changeset for repository, keys should be:: | |
|
1845 | ||
|
1846 | short_id | |
|
1847 | raw_id | |
|
1848 | revision | |
|
1849 | parents | |
|
1850 | message | |
|
1851 | date | |
|
1852 | author | |
|
1853 | ||
|
1854 | :param cs_cache: | |
|
1855 | """ | |
|
1856 | from rhodecode.lib.vcs.backends.base import BaseChangeset | |
|
1857 | if cs_cache is None: | |
|
1858 | # use no-cache version here | |
|
1859 | scm_repo = self.scm_instance(cache=False, config=config) | |
|
1860 | if scm_repo: | |
|
1861 | cs_cache = scm_repo.get_commit( | |
|
1862 | pre_load=["author", "date", "message", "parents"]) | |
|
1863 | else: | |
|
1864 | cs_cache = EmptyCommit() | |
|
1865 | ||
|
1866 | if isinstance(cs_cache, BaseChangeset): | |
|
1867 | cs_cache = cs_cache.__json__() | |
|
1868 | ||
|
1869 | def is_outdated(new_cs_cache): | |
|
1870 | if (new_cs_cache['raw_id'] != self.changeset_cache['raw_id'] or | |
|
1871 | new_cs_cache['revision'] != self.changeset_cache['revision']): | |
|
1872 | return True | |
|
1873 | return False | |
|
1874 | ||
|
1875 | # check if we maybe already have the latest revision cached | |
|
1876 | if is_outdated(cs_cache) or not self.changeset_cache: | |
|
1877 | _default = datetime.datetime.fromtimestamp(0) | |
|
1878 | last_change = cs_cache.get('date') or _default | |
|
1879 | log.debug('updated repo %s with new cs cache %s', | |
|
1880 | self.repo_name, cs_cache) | |
|
1881 | self.updated_on = last_change | |
|
1882 | self.changeset_cache = cs_cache | |
|
1883 | Session().add(self) | |
|
1884 | Session().commit() | |
|
1885 | else: | |
|
1886 | log.debug('Skipping update_commit_cache for repo:`%s` ' | |
|
1887 | 'commit already with latest changes', self.repo_name) | |
|
1888 | ||
|
1889 | @property | |
|
1890 | def tip(self): | |
|
1891 | return self.get_commit('tip') | |
|
1892 | ||
|
1893 | @property | |
|
1894 | def author(self): | |
|
1895 | return self.tip.author | |
|
1896 | ||
|
1897 | @property | |
|
1898 | def last_change(self): | |
|
1899 | return self.scm_instance().last_change | |
|
1900 | ||
|
1901 | def get_comments(self, revisions=None): | |
|
1902 | """ | |
|
1903 | Returns comments for this repository grouped by revisions | |
|
1904 | ||
|
1905 | :param revisions: filter query by revisions only | |
|
1906 | """ | |
|
1907 | cmts = ChangesetComment.query()\ | |
|
1908 | .filter(ChangesetComment.repo == self) | |
|
1909 | if revisions: | |
|
1910 | cmts = cmts.filter(ChangesetComment.revision.in_(revisions)) | |
|
1911 | grouped = collections.defaultdict(list) | |
|
1912 | for cmt in cmts.all(): | |
|
1913 | grouped[cmt.revision].append(cmt) | |
|
1914 | return grouped | |
|
1915 | ||
|
1916 | def statuses(self, revisions=None): | |
|
1917 | """ | |
|
1918 | Returns statuses for this repository | |
|
1919 | ||
|
1920 | :param revisions: list of revisions to get statuses for | |
|
1921 | """ | |
|
1922 | statuses = ChangesetStatus.query()\ | |
|
1923 | .filter(ChangesetStatus.repo == self)\ | |
|
1924 | .filter(ChangesetStatus.version == 0) | |
|
1925 | ||
|
1926 | if revisions: | |
|
1927 | # filter in chunks to avoid hitting database limits on IN-clause size | |
|
1928 | size = 500 | |
|
1929 | status_results = [] | |
|
1930 | for chunk in xrange(0, len(revisions), size): | |
|
1931 | status_results += statuses.filter( | |
|
1932 | ChangesetStatus.revision.in_( | |
|
1933 | revisions[chunk: chunk+size]) | |
|
1934 | ).all() | |
|
1935 | else: | |
|
1936 | status_results = statuses.all() | |
|
1937 | ||
|
1938 | grouped = {} | |
|
1939 | ||
|
1940 | # maybe we have an open pull request without a status yet? | |
|
1941 | stat = ChangesetStatus.STATUS_UNDER_REVIEW | |
|
1942 | status_lbl = ChangesetStatus.get_status_lbl(stat) | |
|
1943 | for pr in PullRequest.query().filter(PullRequest.source_repo == self).all(): | |
|
1944 | for rev in pr.revisions: | |
|
1945 | pr_id = pr.pull_request_id | |
|
1946 | pr_repo = pr.target_repo.repo_name | |
|
1947 | grouped[rev] = [stat, status_lbl, pr_id, pr_repo] | |
|
1948 | ||
|
1949 | for stat in status_results: | |
|
1950 | pr_id = pr_repo = None | |
|
1951 | if stat.pull_request: | |
|
1952 | pr_id = stat.pull_request.pull_request_id | |
|
1953 | pr_repo = stat.pull_request.target_repo.repo_name | |
|
1954 | grouped[stat.revision] = [str(stat.status), stat.status_lbl, | |
|
1955 | pr_id, pr_repo] | |
|
1956 | return grouped | |
|
1957 | ||
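# Illustrative sketch (added; values are hypothetical): the returned mapping
# has the shape {revision: [status, status_label, pull_request_id, pr_target_repo]},
# e.g.
#
#     {'d38f2c6a...': ['approved', 'Approved', 12, 'web/backend']}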
|
1958 | # ========================================================================== | |
|
1959 | # SCM CACHE INSTANCE | |
|
1960 | # ========================================================================== | |
|
1961 | ||
|
1962 | def scm_instance(self, **kwargs): | |
|
1963 | import rhodecode | |
|
1964 | ||
|
1965 | # Passing a config bypasses the cache; currently this is only | |

1966 | # used by repo2dbmapper | |
|
1967 | config = kwargs.pop('config', None) | |
|
1968 | cache = kwargs.pop('cache', None) | |
|
1969 | full_cache = str2bool(rhodecode.CONFIG.get('vcs_full_cache')) | |
|
1970 | # if cache is NOT defined, use the global default; otherwise the | |

1971 | # caller has full control over cache behaviour | |
|
1972 | if cache is None and full_cache and not config: | |
|
1973 | return self._get_instance_cached() | |
|
1974 | return self._get_instance(cache=bool(cache), config=config) | |
|
1975 | ||
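# Illustrative usage sketch (added):
#
#     repo.scm_instance()               # cached vcs instance (when vcs_full_cache is on)
#     repo.scm_instance(cache=False)    # force a fresh, uncached vcs instance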
|
1976 | def _get_instance_cached(self): | |
|
1977 | @cache_region('long_term') | |
|
1978 | def _get_repo(cache_key): | |
|
1979 | return self._get_instance() | |
|
1980 | ||
|
1981 | invalidator_context = CacheKey.repo_context_cache( | |
|
1982 | _get_repo, self.repo_name, None, thread_scoped=True) | |
|
1983 | ||
|
1984 | with invalidator_context as context: | |
|
1985 | context.invalidate() | |
|
1986 | repo = context.compute() | |
|
1987 | ||
|
1988 | return repo | |
|
1989 | ||
|
1990 | def _get_instance(self, cache=True, config=None): | |
|
1991 | config = config or self._config | |
|
1992 | custom_wire = { | |
|
1993 | 'cache': cache # controls the vcs.remote cache | |
|
1994 | } | |
|
1995 | ||
|
1996 | repo = get_vcs_instance( | |
|
1997 | repo_path=safe_str(self.repo_full_path), | |
|
1998 | config=config, | |
|
1999 | with_wire=custom_wire, | |
|
2000 | create=False) | |
|
2001 | ||
|
2002 | return repo | |
|
2003 | ||
|
2004 | def __json__(self): | |
|
2005 | return {'landing_rev': self.landing_rev} | |
|
2006 | ||
|
2007 | def get_dict(self): | |
|
2008 | ||
|
2009 | # Since we transformed `repo_name` to a hybrid property, we need to | |
|
2010 | # keep compatibility with the code which uses `repo_name` field. | |
|
2011 | ||
|
2012 | result = super(Repository, self).get_dict() | |
|
2013 | result['repo_name'] = result.pop('_repo_name', None) | |
|
2014 | return result | |
|
2015 | ||
|
2016 | ||
|
2017 | class RepoGroup(Base, BaseModel): | |
|
2018 | __tablename__ = 'groups' | |
|
2019 | __table_args__ = ( | |
|
2020 | UniqueConstraint('group_name', 'group_parent_id'), | |
|
2021 | CheckConstraint('group_id != group_parent_id'), | |
|
2022 | {'extend_existing': True, 'mysql_engine': 'InnoDB', | |
|
2023 | 'mysql_charset': 'utf8', 'sqlite_autoincrement': True}, | |
|
2024 | ) | |
|
2025 | __mapper_args__ = {'order_by': 'group_name'} | |
|
2026 | ||
|
2027 | CHOICES_SEPARATOR = '/' # used to generate select2 choices for nested groups | |
|
2028 | ||
|
2029 | group_id = Column("group_id", Integer(), nullable=False, unique=True, default=None, primary_key=True) | |
|
2030 | group_name = Column("group_name", String(255), nullable=False, unique=True, default=None) | |
|
2031 | group_parent_id = Column("group_parent_id", Integer(), ForeignKey('groups.group_id'), nullable=True, unique=None, default=None) | |
|
2032 | group_description = Column("group_description", String(10000), nullable=True, unique=None, default=None) | |
|
2033 | enable_locking = Column("enable_locking", Boolean(), nullable=False, unique=None, default=False) | |
|
2034 | user_id = Column("user_id", Integer(), ForeignKey('users.user_id'), nullable=False, unique=False, default=None) | |
|
2035 | created_on = Column('created_on', DateTime(timezone=False), nullable=False, default=datetime.datetime.now) | |
|
2036 | ||
|
2037 | repo_group_to_perm = relationship('UserRepoGroupToPerm', cascade='all', order_by='UserRepoGroupToPerm.group_to_perm_id') | |
|
2038 | users_group_to_perm = relationship('UserGroupRepoGroupToPerm', cascade='all') | |
|
2039 | parent_group = relationship('RepoGroup', remote_side=group_id) | |
|
2040 | user = relationship('User') | |
|
2041 | integrations = relationship('Integration', | |
|
2042 | cascade="all, delete, delete-orphan") | |
|
2043 | ||
|
2044 | def __init__(self, group_name='', parent_group=None): | |
|
2045 | self.group_name = group_name | |
|
2046 | self.parent_group = parent_group | |
|
2047 | ||
|
2048 | def __unicode__(self): | |
|
2049 | return u"<%s('id:%s:%s')>" % (self.__class__.__name__, self.group_id, | |
|
2050 | self.group_name) | |
|
2051 | ||
|
2052 | @classmethod | |
|
2053 | def _generate_choice(cls, repo_group): | |
|
2054 | from webhelpers.html import literal as _literal | |
|
2055 | _name = lambda k: _literal(cls.CHOICES_SEPARATOR.join(k)) | |
|
2056 | return repo_group.group_id, _name(repo_group.full_path_splitted) | |
|
2057 | ||
|
2058 | @classmethod | |
|
2059 | def groups_choices(cls, groups=None, show_empty_group=True): | |
|
2060 | if not groups: | |
|
2061 | groups = cls.query().all() | |
|
2062 | ||
|
2063 | repo_groups = [] | |
|
2064 | if show_empty_group: | |
|
2065 | repo_groups = [('-1', u'-- %s --' % _('No parent'))] | |
|
2066 | ||
|
2067 | repo_groups.extend([cls._generate_choice(x) for x in groups]) | |
|
2068 | ||
|
2069 | repo_groups = sorted( | |
|
2070 | repo_groups, key=lambda t: t[1].split(cls.CHOICES_SEPARATOR)[0]) | |
|
2071 | return repo_groups | |
|
2072 | ||
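# Illustrative sketch (added; ids are hypothetical): the result is a list of
# (group_id, full_path) tuples suitable for a select2 widget, e.g.
#
#     [('-1', u'-- No parent --'), (3, u'web'), (7, u'web/backend')]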
|
2073 | @classmethod | |
|
2074 | def url_sep(cls): | |
|
2075 | return URL_SEP | |
|
2076 | ||
|
2077 | @classmethod | |
|
2078 | def get_by_group_name(cls, group_name, cache=False, case_insensitive=False): | |
|
2079 | if case_insensitive: | |
|
2080 | gr = cls.query().filter(func.lower(cls.group_name) | |
|
2081 | == func.lower(group_name)) | |
|
2082 | else: | |
|
2083 | gr = cls.query().filter(cls.group_name == group_name) | |
|
2084 | if cache: | |
|
2085 | gr = gr.options(FromCache( | |
|
2086 | "sql_cache_short", | |
|
2087 | "get_group_%s" % _hash_key(group_name))) | |
|
2088 | return gr.scalar() | |
|
2089 | ||
|
2090 | @classmethod | |
|
2091 | def get_all_repo_groups(cls, user_id=Optional(None), group_id=Optional(None), | |
|
2092 | case_insensitive=True): | |
|
2093 | q = RepoGroup.query() | |
|
2094 | ||
|
2095 | if not isinstance(user_id, Optional): | |
|
2096 | q = q.filter(RepoGroup.user_id == user_id) | |
|
2097 | ||
|
2098 | if not isinstance(group_id, Optional): | |
|
2099 | q = q.filter(RepoGroup.group_parent_id == group_id) | |
|
2100 | ||
|
2101 | if case_insensitive: | |
|
2102 | q = q.order_by(func.lower(RepoGroup.group_name)) | |
|
2103 | else: | |
|
2104 | q = q.order_by(RepoGroup.group_name) | |
|
2105 | return q.all() | |
|
2106 | ||
|
2107 | @property | |
|
2108 | def parents(self): | |
|
2109 | parents_recursion_limit = 10 | |
|
2110 | groups = [] | |
|
2111 | if self.parent_group is None: | |
|
2112 | return groups | |
|
2113 | cur_gr = self.parent_group | |
|
2114 | groups.insert(0, cur_gr) | |
|
2115 | cnt = 0 | |
|
2116 | while 1: | |
|
2117 | cnt += 1 | |
|
2118 | gr = getattr(cur_gr, 'parent_group', None) | |
|
2119 | cur_gr = cur_gr.parent_group | |
|
2120 | if gr is None: | |
|
2121 | break | |
|
2122 | if cnt == parents_recursion_limit: | |
|
2123 | # this will prevent accidental infinite loops | |
|
2124 | log.error(('more than %s parents found for group %s, stopping ' | |
|
2125 | 'recursive parent fetching' % (parents_recursion_limit, self))) | |
|
2126 | break | |
|
2127 | ||
|
2128 | groups.insert(0, gr) | |
|
2129 | return groups | |
|
2130 | ||
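# Illustrative sketch (added): for a group 'web/backend/api', `parents` walks
# up the chain (at most 10 levels) and returns ancestors ordered from the root:
#
#     group.parents   # -> [<RepoGroup 'web'>, <RepoGroup 'web/backend'>]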
|
2131 | @property | |
|
2132 | def children(self): | |
|
2133 | return RepoGroup.query().filter(RepoGroup.parent_group == self) | |
|
2134 | ||
|
2135 | @property | |
|
2136 | def name(self): | |
|
2137 | return self.group_name.split(RepoGroup.url_sep())[-1] | |
|
2138 | ||
|
2139 | @property | |
|
2140 | def full_path(self): | |
|
2141 | return self.group_name | |
|
2142 | ||
|
2143 | @property | |
|
2144 | def full_path_splitted(self): | |
|
2145 | return self.group_name.split(RepoGroup.url_sep()) | |
|
2146 | ||
|
2147 | @property | |
|
2148 | def repositories(self): | |
|
2149 | return Repository.query()\ | |
|
2150 | .filter(Repository.group == self)\ | |
|
2151 | .order_by(Repository.repo_name) | |
|
2152 | ||
|
2153 | @property | |
|
2154 | def repositories_recursive_count(self): | |
|
2155 | cnt = self.repositories.count() | |
|
2156 | ||
|
2157 | def children_count(group): | |
|
2158 | cnt = 0 | |
|
2159 | for child in group.children: | |
|
2160 | cnt += child.repositories.count() | |
|
2161 | cnt += children_count(child) | |
|
2162 | return cnt | |
|
2163 | ||
|
2164 | return cnt + children_count(self) | |
|
2165 | ||
|
2166 | def _recursive_objects(self, include_repos=True): | |
|
2167 | all_ = [] | |
|
2168 | ||
|
2169 | def _get_members(root_gr): | |
|
2170 | if include_repos: | |
|
2171 | for r in root_gr.repositories: | |
|
2172 | all_.append(r) | |
|
2173 | childs = root_gr.children.all() | |
|
2174 | if childs: | |
|
2175 | for gr in childs: | |
|
2176 | all_.append(gr) | |
|
2177 | _get_members(gr) | |
|
2178 | ||
|
2179 | _get_members(self) | |
|
2180 | return [self] + all_ | |
|
2181 | ||
|
2182 | def recursive_groups_and_repos(self): | |
|
2183 | """ | |
|
2184 | Recursively returns all groups, together with the repositories in those groups | |
|
2185 | """ | |
|
2186 | return self._recursive_objects() | |
|
2187 | ||
|
2188 | def recursive_groups(self): | |
|
2189 | """ | |
|
2190 | Returns all child groups of this group, including children of children | |
|
2191 | """ | |
|
2192 | return self._recursive_objects(include_repos=False) | |
|
2193 | ||
|
2194 | def get_new_name(self, group_name): | |
|
2195 | """ | |
|
2196 | Returns the new full group name based on the parent group and the new name | |
|
2197 | ||
|
2198 | :param group_name: | |
|
2199 | """ | |
|
2200 | path_prefix = (self.parent_group.full_path_splitted if | |
|
2201 | self.parent_group else []) | |
|
2202 | return RepoGroup.url_sep().join(path_prefix + [group_name]) | |
|
2203 | ||
|
2204 | def permissions(self, with_admins=True, with_owner=True): | |
|
2205 | q = UserRepoGroupToPerm.query().filter(UserRepoGroupToPerm.group == self) | |
|
2206 | q = q.options(joinedload(UserRepoGroupToPerm.group), | |
|
2207 | joinedload(UserRepoGroupToPerm.user), | |
|
2208 | joinedload(UserRepoGroupToPerm.permission),) | |
|
2209 | ||
|
2210 | # get owners, admins and their permissions. We rewrite the sqlalchemy | |

2211 | # objects into plain AttributeDicts because the sqlalchemy session | |

2212 | # holds a global reference, so changing one object propagates to all | |

2213 | # others. Without the copy, marking an owner who is also an admin as | |

2214 | # admin_row would alter both records. | |
|
2215 | perm_rows = [] | |
|
2216 | for _usr in q.all(): | |
|
2217 | usr = AttributeDict(_usr.user.get_dict()) | |
|
2218 | usr.permission = _usr.permission.permission_name | |
|
2219 | perm_rows.append(usr) | |
|
2220 | ||
|
2221 | # sort the perm rows: the 'default' user first, then by permission | |

2222 | # (admin, write, read, none), alphabetically within each permission | |

2223 | # group | |
|
2224 | perm_rows = sorted(perm_rows, key=display_sort) | |
|
2225 | ||
|
2226 | _admin_perm = 'group.admin' | |
|
2227 | owner_row = [] | |
|
2228 | if with_owner: | |
|
2229 | usr = AttributeDict(self.user.get_dict()) | |
|
2230 | usr.owner_row = True | |
|
2231 | usr.permission = _admin_perm | |
|
2232 | owner_row.append(usr) | |
|
2233 | ||
|
2234 | super_admin_rows = [] | |
|
2235 | if with_admins: | |
|
2236 | for usr in User.get_all_super_admins(): | |
|
2237 | # if this admin is also owner, don't double the record | |
|
2238 | if owner_row and usr.user_id == owner_row[0].user_id: | |
|
2239 | owner_row[0].admin_row = True | |
|
2240 | else: | |
|
2241 | usr = AttributeDict(usr.get_dict()) | |
|
2242 | usr.admin_row = True | |
|
2243 | usr.permission = _admin_perm | |
|
2244 | super_admin_rows.append(usr) | |
|
2245 | ||
|
2246 | return super_admin_rows + owner_row + perm_rows | |
|
2247 | ||
|
2248 | def permission_user_groups(self): | |
|
2249 | q = UserGroupRepoGroupToPerm.query().filter(UserGroupRepoGroupToPerm.group == self) | |
|
2250 | q = q.options(joinedload(UserGroupRepoGroupToPerm.group), | |
|
2251 | joinedload(UserGroupRepoGroupToPerm.users_group), | |
|
2252 | joinedload(UserGroupRepoGroupToPerm.permission),) | |
|
2253 | ||
|
2254 | perm_rows = [] | |
|
2255 | for _user_group in q.all(): | |
|
2256 | usr = AttributeDict(_user_group.users_group.get_dict()) | |
|
2257 | usr.permission = _user_group.permission.permission_name | |
|
2258 | perm_rows.append(usr) | |
|
2259 | ||
|
2260 | return perm_rows | |
|
2261 | ||
|
2262 | def get_api_data(self): | |
|
2263 | """ | |
|
2264 | Common function for generating api data | |
|
2265 | ||
|
2266 | """ | |
|
2267 | group = self | |
|
2268 | data = { | |
|
2269 | 'group_id': group.group_id, | |
|
2270 | 'group_name': group.group_name, | |
|
2271 | 'group_description': group.group_description, | |
|
2272 | 'parent_group': group.parent_group.group_name if group.parent_group else None, | |
|
2273 | 'repositories': [x.repo_name for x in group.repositories], | |
|
2274 | 'owner': group.user.username, | |
|
2275 | } | |
|
2276 | return data | |
|
2277 | ||
|
2278 | ||
|
2279 | class Permission(Base, BaseModel): | |
|
2280 | __tablename__ = 'permissions' | |
|
2281 | __table_args__ = ( | |
|
2282 | Index('p_perm_name_idx', 'permission_name'), | |
|
2283 | {'extend_existing': True, 'mysql_engine': 'InnoDB', | |
|
2284 | 'mysql_charset': 'utf8', 'sqlite_autoincrement': True}, | |
|
2285 | ) | |
|
2286 | PERMS = [ | |
|
2287 | ('hg.admin', _('RhodeCode Super Administrator')), | |
|
2288 | ||
|
2289 | ('repository.none', _('Repository no access')), | |
|
2290 | ('repository.read', _('Repository read access')), | |
|
2291 | ('repository.write', _('Repository write access')), | |
|
2292 | ('repository.admin', _('Repository admin access')), | |
|
2293 | ||
|
2294 | ('group.none', _('Repository group no access')), | |
|
2295 | ('group.read', _('Repository group read access')), | |
|
2296 | ('group.write', _('Repository group write access')), | |
|
2297 | ('group.admin', _('Repository group admin access')), | |
|
2298 | ||
|
2299 | ('usergroup.none', _('User group no access')), | |
|
2300 | ('usergroup.read', _('User group read access')), | |
|
2301 | ('usergroup.write', _('User group write access')), | |
|
2302 | ('usergroup.admin', _('User group admin access')), | |
|
2303 | ||
|
2304 | ('hg.repogroup.create.false', _('Repository Group creation disabled')), | |
|
2305 | ('hg.repogroup.create.true', _('Repository Group creation enabled')), | |
|
2306 | ||
|
2307 | ('hg.usergroup.create.false', _('User Group creation disabled')), | |
|
2308 | ('hg.usergroup.create.true', _('User Group creation enabled')), | |
|
2309 | ||
|
2310 | ('hg.create.none', _('Repository creation disabled')), | |
|
2311 | ('hg.create.repository', _('Repository creation enabled')), | |
|
2312 | ('hg.create.write_on_repogroup.true', _('Repository creation enabled with write permission to a repository group')), | |
|
2313 | ('hg.create.write_on_repogroup.false', _('Repository creation disabled with write permission to a repository group')), | |
|
2314 | ||
|
2315 | ('hg.fork.none', _('Repository forking disabled')), | |
|
2316 | ('hg.fork.repository', _('Repository forking enabled')), | |
|
2317 | ||
|
2318 | ('hg.register.none', _('Registration disabled')), | |
|
2319 | ('hg.register.manual_activate', _('User Registration with manual account activation')), | |
|
2320 | ('hg.register.auto_activate', _('User Registration with automatic account activation')), | |
|
2321 | ||
|
2322 | ('hg.extern_activate.manual', _('Manual activation of external account')), | |
|
2323 | ('hg.extern_activate.auto', _('Automatic activation of external account')), | |
|
2324 | ||
|
2325 | ('hg.inherit_default_perms.false', _('Inherit object permissions from default user disabled')), | |
|
2326 | ('hg.inherit_default_perms.true', _('Inherit object permissions from default user enabled')), | |
|
2327 | ] | |
|
2328 | ||
|
2329 | # definition of system default permissions for DEFAULT user | |
|
2330 | DEFAULT_USER_PERMISSIONS = [ | |
|
2331 | 'repository.read', | |
|
2332 | 'group.read', | |
|
2333 | 'usergroup.read', | |
|
2334 | 'hg.create.repository', | |
|
2335 | 'hg.repogroup.create.false', | |
|
2336 | 'hg.usergroup.create.false', | |
|
2337 | 'hg.create.write_on_repogroup.true', | |
|
2338 | 'hg.fork.repository', | |
|
2339 | 'hg.register.manual_activate', | |
|
2340 | 'hg.extern_activate.auto', | |
|
2341 | 'hg.inherit_default_perms.true', | |
|
2342 | ] | |
|
2343 | ||
|
2344 | # Weight defines which permissions are more important. | |

2345 | # The higher the number, the more important the permission. | |
|
2347 | PERM_WEIGHTS = { | |
|
2348 | 'repository.none': 0, | |
|
2349 | 'repository.read': 1, | |
|
2350 | 'repository.write': 3, | |
|
2351 | 'repository.admin': 4, | |
|
2352 | ||
|
2353 | 'group.none': 0, | |
|
2354 | 'group.read': 1, | |
|
2355 | 'group.write': 3, | |
|
2356 | 'group.admin': 4, | |
|
2357 | ||
|
2358 | 'usergroup.none': 0, | |
|
2359 | 'usergroup.read': 1, | |
|
2360 | 'usergroup.write': 3, | |
|
2361 | 'usergroup.admin': 4, | |
|
2362 | ||
|
2363 | 'hg.repogroup.create.false': 0, | |
|
2364 | 'hg.repogroup.create.true': 1, | |
|
2365 | ||
|
2366 | 'hg.usergroup.create.false': 0, | |
|
2367 | 'hg.usergroup.create.true': 1, | |
|
2368 | ||
|
2369 | 'hg.fork.none': 0, | |
|
2370 | 'hg.fork.repository': 1, | |
|
2371 | 'hg.create.none': 0, | |
|
2372 | 'hg.create.repository': 1 | |
|
2373 | } | |
|
2374 | ||
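# Illustrative sketch (added): the weights make it easy to pick the strongest
# of several permissions:
#
#     max(['repository.read', 'repository.write'],
#         key=Permission.PERM_WEIGHTS.get)   # -> 'repository.write'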
|
2375 | permission_id = Column("permission_id", Integer(), nullable=False, unique=True, default=None, primary_key=True) | |
|
2376 | permission_name = Column("permission_name", String(255), nullable=True, unique=None, default=None) | |
|
2377 | permission_longname = Column("permission_longname", String(255), nullable=True, unique=None, default=None) | |
|
2378 | ||
|
2379 | def __unicode__(self): | |
|
2380 | return u"<%s('%s:%s')>" % ( | |
|
2381 | self.__class__.__name__, self.permission_id, self.permission_name | |
|
2382 | ) | |
|
2383 | ||
|
2384 | @classmethod | |
|
2385 | def get_by_key(cls, key): | |
|
2386 | return cls.query().filter(cls.permission_name == key).scalar() | |
|
2387 | ||
|
2388 | @classmethod | |
|
2389 | def get_default_repo_perms(cls, user_id, repo_id=None): | |
|
2390 | q = Session().query(UserRepoToPerm, Repository, Permission)\ | |
|
2391 | .join((Permission, UserRepoToPerm.permission_id == Permission.permission_id))\ | |
|
2392 | .join((Repository, UserRepoToPerm.repository_id == Repository.repo_id))\ | |
|
2393 | .filter(UserRepoToPerm.user_id == user_id) | |
|
2394 | if repo_id: | |
|
2395 | q = q.filter(UserRepoToPerm.repository_id == repo_id) | |
|
2396 | return q.all() | |
|
2397 | ||
|
2398 | @classmethod | |
|
2399 | def get_default_repo_perms_from_user_group(cls, user_id, repo_id=None): | |
|
2400 | q = Session().query(UserGroupRepoToPerm, Repository, Permission)\ | |
|
2401 | .join( | |
|
2402 | Permission, | |
|
2403 | UserGroupRepoToPerm.permission_id == Permission.permission_id)\ | |
|
2404 | .join( | |
|
2405 | Repository, | |
|
2406 | UserGroupRepoToPerm.repository_id == Repository.repo_id)\ | |
|
2407 | .join( | |
|
2408 | UserGroup, | |
|
2409 | UserGroupRepoToPerm.users_group_id == | |
|
2410 | UserGroup.users_group_id)\ | |
|
2411 | .join( | |
|
2412 | UserGroupMember, | |
|
2413 | UserGroupRepoToPerm.users_group_id == | |
|
2414 | UserGroupMember.users_group_id)\ | |
|
2415 | .filter( | |
|
2416 | UserGroupMember.user_id == user_id, | |
|
2417 | UserGroup.users_group_active == true()) | |
|
2418 | if repo_id: | |
|
2419 | q = q.filter(UserGroupRepoToPerm.repository_id == repo_id) | |
|
2420 | return q.all() | |
|
2421 | ||
|
2422 | @classmethod | |
|
2423 | def get_default_group_perms(cls, user_id, repo_group_id=None): | |
|
2424 | q = Session().query(UserRepoGroupToPerm, RepoGroup, Permission)\ | |
|
2425 | .join((Permission, UserRepoGroupToPerm.permission_id == Permission.permission_id))\ | |
|
2426 | .join((RepoGroup, UserRepoGroupToPerm.group_id == RepoGroup.group_id))\ | |
|
2427 | .filter(UserRepoGroupToPerm.user_id == user_id) | |
|
2428 | if repo_group_id: | |
|
2429 | q = q.filter(UserRepoGroupToPerm.group_id == repo_group_id) | |
|
2430 | return q.all() | |
|
2431 | ||
|
2432 | @classmethod | |
|
2433 | def get_default_group_perms_from_user_group( | |
|
2434 | cls, user_id, repo_group_id=None): | |
|
2435 | q = Session().query(UserGroupRepoGroupToPerm, RepoGroup, Permission)\ | |
|
2436 | .join( | |
|
2437 | Permission, | |
|
2438 | UserGroupRepoGroupToPerm.permission_id == | |
|
2439 | Permission.permission_id)\ | |
|
2440 | .join( | |
|
2441 | RepoGroup, | |
|
2442 | UserGroupRepoGroupToPerm.group_id == RepoGroup.group_id)\ | |
|
2443 | .join( | |
|
2444 | UserGroup, | |
|
2445 | UserGroupRepoGroupToPerm.users_group_id == | |
|
2446 | UserGroup.users_group_id)\ | |
|
2447 | .join( | |
|
2448 | UserGroupMember, | |
|
2449 | UserGroupRepoGroupToPerm.users_group_id == | |
|
2450 | UserGroupMember.users_group_id)\ | |
|
2451 | .filter( | |
|
2452 | UserGroupMember.user_id == user_id, | |
|
2453 | UserGroup.users_group_active == true()) | |
|
2454 | if repo_group_id: | |
|
2455 | q = q.filter(UserGroupRepoGroupToPerm.group_id == repo_group_id) | |
|
2456 | return q.all() | |
|
2457 | ||
|
2458 | @classmethod | |
|
2459 | def get_default_user_group_perms(cls, user_id, user_group_id=None): | |
|
2460 | q = Session().query(UserUserGroupToPerm, UserGroup, Permission)\ | |
|
2461 | .join((Permission, UserUserGroupToPerm.permission_id == Permission.permission_id))\ | |
|
2462 | .join((UserGroup, UserUserGroupToPerm.user_group_id == UserGroup.users_group_id))\ | |
|
2463 | .filter(UserUserGroupToPerm.user_id == user_id) | |
|
2464 | if user_group_id: | |
|
2465 | q = q.filter(UserUserGroupToPerm.user_group_id == user_group_id) | |
|
2466 | return q.all() | |
|
2467 | ||
|
2468 | @classmethod | |
|
2469 | def get_default_user_group_perms_from_user_group( | |
|
2470 | cls, user_id, user_group_id=None): | |
|
2471 | TargetUserGroup = aliased(UserGroup, name='target_user_group') | |
|
2472 | q = Session().query(UserGroupUserGroupToPerm, UserGroup, Permission)\ | |
|
2473 | .join( | |
|
2474 | Permission, | |
|
2475 | UserGroupUserGroupToPerm.permission_id == | |
|
2476 | Permission.permission_id)\ | |
|
2477 | .join( | |
|
2478 | TargetUserGroup, | |
|
2479 | UserGroupUserGroupToPerm.target_user_group_id == | |
|
2480 | TargetUserGroup.users_group_id)\ | |
|
2481 | .join( | |
|
2482 | UserGroup, | |
|
2483 | UserGroupUserGroupToPerm.user_group_id == | |
|
2484 | UserGroup.users_group_id)\ | |
|
2485 | .join( | |
|
2486 | UserGroupMember, | |
|
2487 | UserGroupUserGroupToPerm.user_group_id == | |
|
2488 | UserGroupMember.users_group_id)\ | |
|
2489 | .filter( | |
|
2490 | UserGroupMember.user_id == user_id, | |
|
2491 | UserGroup.users_group_active == true()) | |
|
2492 | if user_group_id: | |
|
2493 | q = q.filter( | |
|
2494 | UserGroupUserGroupToPerm.user_group_id == user_group_id) | |
|
2495 | ||
|
2496 | return q.all() | |
|
2497 | ||
|
2498 | ||
|
2499 | class UserRepoToPerm(Base, BaseModel): | |
|
2500 | __tablename__ = 'repo_to_perm' | |
|
2501 | __table_args__ = ( | |
|
2502 | UniqueConstraint('user_id', 'repository_id', 'permission_id'), | |
|
2503 | {'extend_existing': True, 'mysql_engine': 'InnoDB', | |
|
2504 | 'mysql_charset': 'utf8', 'sqlite_autoincrement': True} | |
|
2505 | ) | |
|
2506 | repo_to_perm_id = Column("repo_to_perm_id", Integer(), nullable=False, unique=True, default=None, primary_key=True) | |
|
2507 | user_id = Column("user_id", Integer(), ForeignKey('users.user_id'), nullable=False, unique=None, default=None) | |
|
2508 | permission_id = Column("permission_id", Integer(), ForeignKey('permissions.permission_id'), nullable=False, unique=None, default=None) | |
|
2509 | repository_id = Column("repository_id", Integer(), ForeignKey('repositories.repo_id'), nullable=False, unique=None, default=None) | |
|
2510 | ||
|
2511 | user = relationship('User') | |
|
2512 | repository = relationship('Repository') | |
|
2513 | permission = relationship('Permission') | |
|
2514 | ||
|
2515 | @classmethod | |
|
2516 | def create(cls, user, repository, permission): | |
|
2517 | n = cls() | |
|
2518 | n.user = user | |
|
2519 | n.repository = repository | |
|
2520 | n.permission = permission | |
|
2521 | Session().add(n) | |
|
2522 | return n | |
|
2523 | ||
|
2524 | def __unicode__(self): | |
|
2525 | return u'<%s => %s >' % (self.user, self.repository) | |
|
2526 | ||
|
2527 | ||
|
2528 | class UserUserGroupToPerm(Base, BaseModel): | |
|
2529 | __tablename__ = 'user_user_group_to_perm' | |
|
2530 | __table_args__ = ( | |
|
2531 | UniqueConstraint('user_id', 'user_group_id', 'permission_id'), | |
|
2532 | {'extend_existing': True, 'mysql_engine': 'InnoDB', | |
|
2533 | 'mysql_charset': 'utf8', 'sqlite_autoincrement': True} | |
|
2534 | ) | |
|
2535 | user_user_group_to_perm_id = Column("user_user_group_to_perm_id", Integer(), nullable=False, unique=True, default=None, primary_key=True) | |
|
2536 | user_id = Column("user_id", Integer(), ForeignKey('users.user_id'), nullable=False, unique=None, default=None) | |
|
2537 | permission_id = Column("permission_id", Integer(), ForeignKey('permissions.permission_id'), nullable=False, unique=None, default=None) | |
|
2538 | user_group_id = Column("user_group_id", Integer(), ForeignKey('users_groups.users_group_id'), nullable=False, unique=None, default=None) | |
|
2539 | ||
|
2540 | user = relationship('User') | |
|
2541 | user_group = relationship('UserGroup') | |
|
2542 | permission = relationship('Permission') | |
|
2543 | ||
|
2544 | @classmethod | |
|
2545 | def create(cls, user, user_group, permission): | |
|
2546 | n = cls() | |
|
2547 | n.user = user | |
|
2548 | n.user_group = user_group | |
|
2549 | n.permission = permission | |
|
2550 | Session().add(n) | |
|
2551 | return n | |
|
2552 | ||
|
2553 | def __unicode__(self): | |
|
2554 | return u'<%s => %s >' % (self.user, self.user_group) | |
|
2555 | ||
|
2556 | ||
|
2557 | class UserToPerm(Base, BaseModel): | |
|
2558 | __tablename__ = 'user_to_perm' | |
|
2559 | __table_args__ = ( | |
|
2560 | UniqueConstraint('user_id', 'permission_id'), | |
|
2561 | {'extend_existing': True, 'mysql_engine': 'InnoDB', | |
|
2562 | 'mysql_charset': 'utf8', 'sqlite_autoincrement': True} | |
|
2563 | ) | |
|
2564 | user_to_perm_id = Column("user_to_perm_id", Integer(), nullable=False, unique=True, default=None, primary_key=True) | |
|
2565 | user_id = Column("user_id", Integer(), ForeignKey('users.user_id'), nullable=False, unique=None, default=None) | |
|
2566 | permission_id = Column("permission_id", Integer(), ForeignKey('permissions.permission_id'), nullable=False, unique=None, default=None) | |
|
2567 | ||
|
2568 | user = relationship('User') | |
|
2569 | permission = relationship('Permission', lazy='joined') | |
|
2570 | ||
|
2571 | def __unicode__(self): | |
|
2572 | return u'<%s => %s >' % (self.user, self.permission) | |
|
2573 | ||
|
2574 | ||
|
2575 | class UserGroupRepoToPerm(Base, BaseModel): | |
|
2576 | __tablename__ = 'users_group_repo_to_perm' | |
|
2577 | __table_args__ = ( | |
|
2578 | UniqueConstraint('repository_id', 'users_group_id', 'permission_id'), | |
|
2579 | {'extend_existing': True, 'mysql_engine': 'InnoDB', | |
|
2580 | 'mysql_charset': 'utf8', 'sqlite_autoincrement': True} | |
|
2581 | ) | |
|
2582 | users_group_to_perm_id = Column("users_group_to_perm_id", Integer(), nullable=False, unique=True, default=None, primary_key=True) | |
|
2583 | users_group_id = Column("users_group_id", Integer(), ForeignKey('users_groups.users_group_id'), nullable=False, unique=None, default=None) | |
|
2584 | permission_id = Column("permission_id", Integer(), ForeignKey('permissions.permission_id'), nullable=False, unique=None, default=None) | |
|
2585 | repository_id = Column("repository_id", Integer(), ForeignKey('repositories.repo_id'), nullable=False, unique=None, default=None) | |
|
2586 | ||
|
2587 | users_group = relationship('UserGroup') | |
|
2588 | permission = relationship('Permission') | |
|
2589 | repository = relationship('Repository') | |
|
2590 | ||
|
2591 | @classmethod | |
|
2592 | def create(cls, users_group, repository, permission): | |
|
2593 | n = cls() | |
|
2594 | n.users_group = users_group | |
|
2595 | n.repository = repository | |
|
2596 | n.permission = permission | |
|
2597 | Session().add(n) | |
|
2598 | return n | |
|
2599 | ||
|
2600 | def __unicode__(self): | |
|
2601 | return u'<UserGroupRepoToPerm:%s => %s >' % (self.users_group, self.repository) | |
|
2602 | ||
|
2603 | ||
|
2604 | class UserGroupUserGroupToPerm(Base, BaseModel): | |
|
2605 | __tablename__ = 'user_group_user_group_to_perm' | |
|
2606 | __table_args__ = ( | |
|
2607 | UniqueConstraint('target_user_group_id', 'user_group_id', 'permission_id'), | |
|
2608 | CheckConstraint('target_user_group_id != user_group_id'), | |
|
2609 | {'extend_existing': True, 'mysql_engine': 'InnoDB', | |
|
2610 | 'mysql_charset': 'utf8', 'sqlite_autoincrement': True} | |
|
2611 | ) | |
|
2612 | user_group_user_group_to_perm_id = Column("user_group_user_group_to_perm_id", Integer(), nullable=False, unique=True, default=None, primary_key=True) | |
|
2613 | target_user_group_id = Column("target_user_group_id", Integer(), ForeignKey('users_groups.users_group_id'), nullable=False, unique=None, default=None) | |
|
2614 | permission_id = Column("permission_id", Integer(), ForeignKey('permissions.permission_id'), nullable=False, unique=None, default=None) | |
|
2615 | user_group_id = Column("user_group_id", Integer(), ForeignKey('users_groups.users_group_id'), nullable=False, unique=None, default=None) | |
|
2616 | ||
|
2617 | target_user_group = relationship('UserGroup', primaryjoin='UserGroupUserGroupToPerm.target_user_group_id==UserGroup.users_group_id') | |
|
2618 | user_group = relationship('UserGroup', primaryjoin='UserGroupUserGroupToPerm.user_group_id==UserGroup.users_group_id') | |
|
2619 | permission = relationship('Permission') | |
|
2620 | ||
|
2621 | @classmethod | |
|
2622 | def create(cls, target_user_group, user_group, permission): | |
|
2623 | n = cls() | |
|
2624 | n.target_user_group = target_user_group | |
|
2625 | n.user_group = user_group | |
|
2626 | n.permission = permission | |
|
2627 | Session().add(n) | |
|
2628 | return n | |
|
2629 | ||
|
2630 | def __unicode__(self): | |
|
2631 | return u'<UserGroupUserGroup:%s => %s >' % (self.target_user_group, self.user_group) | |
|
2632 | ||
|
2633 | ||
|
2634 | class UserGroupToPerm(Base, BaseModel): | |
|
2635 | __tablename__ = 'users_group_to_perm' | |
|
2636 | __table_args__ = ( | |
|
2637 | UniqueConstraint('users_group_id', 'permission_id',), | |
|
2638 | {'extend_existing': True, 'mysql_engine': 'InnoDB', | |
|
2639 | 'mysql_charset': 'utf8', 'sqlite_autoincrement': True} | |
|
2640 | ) | |
|
2641 | users_group_to_perm_id = Column("users_group_to_perm_id", Integer(), nullable=False, unique=True, default=None, primary_key=True) | |
|
2642 | users_group_id = Column("users_group_id", Integer(), ForeignKey('users_groups.users_group_id'), nullable=False, unique=None, default=None) | |
|
2643 | permission_id = Column("permission_id", Integer(), ForeignKey('permissions.permission_id'), nullable=False, unique=None, default=None) | |
|
2644 | ||
|
2645 | users_group = relationship('UserGroup') | |
|
2646 | permission = relationship('Permission') | |
|
2647 | ||
|
2648 | ||
|
2649 | class UserRepoGroupToPerm(Base, BaseModel): | |
|
2650 | __tablename__ = 'user_repo_group_to_perm' | |
|
2651 | __table_args__ = ( | |
|
2652 | UniqueConstraint('user_id', 'group_id', 'permission_id'), | |
|
2653 | {'extend_existing': True, 'mysql_engine': 'InnoDB', | |
|
2654 | 'mysql_charset': 'utf8', 'sqlite_autoincrement': True} | |
|
2655 | ) | |
|
2656 | ||
|
2657 | group_to_perm_id = Column("group_to_perm_id", Integer(), nullable=False, unique=True, default=None, primary_key=True) | |
|
2658 | user_id = Column("user_id", Integer(), ForeignKey('users.user_id'), nullable=False, unique=None, default=None) | |
|
2659 | group_id = Column("group_id", Integer(), ForeignKey('groups.group_id'), nullable=False, unique=None, default=None) | |
|
2660 | permission_id = Column("permission_id", Integer(), ForeignKey('permissions.permission_id'), nullable=False, unique=None, default=None) | |
|
2661 | ||
|
2662 | user = relationship('User') | |
|
2663 | group = relationship('RepoGroup') | |
|
2664 | permission = relationship('Permission') | |
|
2665 | ||
|
2666 | @classmethod | |
|
2667 | def create(cls, user, repository_group, permission): | |
|
2668 | n = cls() | |
|
2669 | n.user = user | |
|
2670 | n.group = repository_group | |
|
2671 | n.permission = permission | |
|
2672 | Session().add(n) | |
|
2673 | return n | |
|
2674 | ||
|
2675 | ||
|
2676 | class UserGroupRepoGroupToPerm(Base, BaseModel): | |
|
2677 | __tablename__ = 'users_group_repo_group_to_perm' | |
|
2678 | __table_args__ = ( | |
|
2679 | UniqueConstraint('users_group_id', 'group_id'), | |
|
2680 | {'extend_existing': True, 'mysql_engine': 'InnoDB', | |
|
2681 | 'mysql_charset': 'utf8', 'sqlite_autoincrement': True} | |
|
2682 | ) | |
|
2683 | ||
|
2684 | users_group_repo_group_to_perm_id = Column("users_group_repo_group_to_perm_id", Integer(), nullable=False, unique=True, default=None, primary_key=True) | |
|
2685 | users_group_id = Column("users_group_id", Integer(), ForeignKey('users_groups.users_group_id'), nullable=False, unique=None, default=None) | |
|
2686 | group_id = Column("group_id", Integer(), ForeignKey('groups.group_id'), nullable=False, unique=None, default=None) | |
|
2687 | permission_id = Column("permission_id", Integer(), ForeignKey('permissions.permission_id'), nullable=False, unique=None, default=None) | |
|
2688 | ||
|
2689 | users_group = relationship('UserGroup') | |
|
2690 | permission = relationship('Permission') | |
|
2691 | group = relationship('RepoGroup') | |
|
2692 | ||
|
2693 | @classmethod | |
|
2694 | def create(cls, user_group, repository_group, permission): | |
|
2695 | n = cls() | |
|
2696 | n.users_group = user_group | |
|
2697 | n.group = repository_group | |
|
2698 | n.permission = permission | |
|
2699 | Session().add(n) | |
|
2700 | return n | |
|
2701 | ||
|
2702 | def __unicode__(self): | |
|
2703 | return u'<UserGroupRepoGroupToPerm:%s => %s >' % (self.users_group, self.group) | |
|
2704 | ||
|
2705 | ||
|
2706 | class Statistics(Base, BaseModel): | |
|
2707 | __tablename__ = 'statistics' | |
|
2708 | __table_args__ = ( | |
|
2709 | UniqueConstraint('repository_id'), | |
|
2710 | {'extend_existing': True, 'mysql_engine': 'InnoDB', | |
|
2711 | 'mysql_charset': 'utf8', 'sqlite_autoincrement': True} | |
|
2712 | ) | |
|
2713 | stat_id = Column("stat_id", Integer(), nullable=False, unique=True, default=None, primary_key=True) | |
|
2714 | repository_id = Column("repository_id", Integer(), ForeignKey('repositories.repo_id'), nullable=False, unique=True, default=None) | |
|
2715 | stat_on_revision = Column("stat_on_revision", Integer(), nullable=False) | |
|
2716 | commit_activity = Column("commit_activity", LargeBinary(1000000), nullable=False)  # JSON data | |

2717 | commit_activity_combined = Column("commit_activity_combined", LargeBinary(), nullable=False)  # JSON data | |

2718 | languages = Column("languages", LargeBinary(1000000), nullable=False)  # JSON data | |
|
2719 | ||
|
2720 | repository = relationship('Repository', single_parent=True) | |
|
2721 | ||
|
2722 | ||
|
2723 | class UserFollowing(Base, BaseModel): | |
|
2724 | __tablename__ = 'user_followings' | |
|
2725 | __table_args__ = ( | |
|
2726 | UniqueConstraint('user_id', 'follows_repository_id'), | |
|
2727 | UniqueConstraint('user_id', 'follows_user_id'), | |
|
2728 | {'extend_existing': True, 'mysql_engine': 'InnoDB', | |
|
2729 | 'mysql_charset': 'utf8', 'sqlite_autoincrement': True} | |
|
2730 | ) | |
|
2731 | ||
|
2732 | user_following_id = Column("user_following_id", Integer(), nullable=False, unique=True, default=None, primary_key=True) | |
|
2733 | user_id = Column("user_id", Integer(), ForeignKey('users.user_id'), nullable=False, unique=None, default=None) | |
|
2734 | follows_repo_id = Column("follows_repository_id", Integer(), ForeignKey('repositories.repo_id'), nullable=True, unique=None, default=None) | |
|
2735 | follows_user_id = Column("follows_user_id", Integer(), ForeignKey('users.user_id'), nullable=True, unique=None, default=None) | |
|
2736 | follows_from = Column('follows_from', DateTime(timezone=False), nullable=True, unique=None, default=datetime.datetime.now) | |
|
2737 | ||
|
2738 | user = relationship('User', primaryjoin='User.user_id==UserFollowing.user_id') | |
|
2739 | ||
|
2740 | follows_user = relationship('User', primaryjoin='User.user_id==UserFollowing.follows_user_id') | |
|
2741 | follows_repository = relationship('Repository', order_by='Repository.repo_name') | |
|
2742 | ||
|
2743 | @classmethod | |
|
2744 | def get_repo_followers(cls, repo_id): | |
|
2745 | return cls.query().filter(cls.follows_repo_id == repo_id) | |
|
2746 | ||
|
2747 | ||
|
2748 | class CacheKey(Base, BaseModel): | |
|
2749 | __tablename__ = 'cache_invalidation' | |
|
2750 | __table_args__ = ( | |
|
2751 | UniqueConstraint('cache_key'), | |
|
2752 | Index('key_idx', 'cache_key'), | |
|
2753 | {'extend_existing': True, 'mysql_engine': 'InnoDB', | |
|
2754 | 'mysql_charset': 'utf8', 'sqlite_autoincrement': True}, | |
|
2755 | ) | |
|
2756 | CACHE_TYPE_ATOM = 'ATOM' | |
|
2757 | CACHE_TYPE_RSS = 'RSS' | |
|
2758 | CACHE_TYPE_README = 'README' | |
|
2759 | ||
|
2760 | cache_id = Column("cache_id", Integer(), nullable=False, unique=True, default=None, primary_key=True) | |
|
2761 | cache_key = Column("cache_key", String(255), nullable=True, unique=None, default=None) | |
|
2762 | cache_args = Column("cache_args", String(255), nullable=True, unique=None, default=None) | |
|
2763 | cache_active = Column("cache_active", Boolean(), nullable=True, unique=None, default=False) | |
|
2764 | ||
|
2765 | def __init__(self, cache_key, cache_args=''): | |
|
2766 | self.cache_key = cache_key | |
|
2767 | self.cache_args = cache_args | |
|
2768 | self.cache_active = False | |
|
2769 | ||
|
2770 | def __unicode__(self): | |
|
2771 | return u"<%s('%s:%s[%s]')>" % ( | |
|
2772 | self.__class__.__name__, | |
|
2773 | self.cache_id, self.cache_key, self.cache_active) | |
|
2774 | ||
|
2775 | def _cache_key_partition(self): | |
|
2776 | prefix, repo_name, suffix = self.cache_key.partition(self.cache_args) | |
|
2777 | return prefix, repo_name, suffix | |
|
2778 | ||
|
2779 | def get_prefix(self): | |
|
2780 | """ | |
|
2781 | Try to extract prefix from existing cache key. The key could consist | |
|
2782 | of prefix, repo_name, suffix | |
|
2783 | """ | |
|
2784 | # this returns prefix, repo_name, suffix | |
|
2785 | return self._cache_key_partition()[0] | |
|
2786 | ||
|
2787 | def get_suffix(self): | |
|
2788 | """ | |
|
2789 | Get the suffix that might have been used in _get_cache_key to | |

2790 | generate self.cache_key. Only used for informational purposes | |

2791 | in repo_edit.html. | |
|
2792 | """ | |
|
2793 | # prefix, repo_name, suffix | |
|
2794 | return self._cache_key_partition()[2] | |
|
2795 | ||
|
2796 | @classmethod | |
|
2797 | def delete_all_cache(cls): | |
|
2798 | """ | |
|
2799 | Delete all cache keys from database. | |
|
2800 | Should only be run when all instances are down and all entries | |
|
2801 | thus stale. | |
|
2802 | """ | |
|
2803 | cls.query().delete() | |
|
2804 | Session().commit() | |
|
2805 | ||
|
2806 | @classmethod | |
|
2807 | def get_cache_key(cls, repo_name, cache_type): | |
|
2808 | """ | |
|
2809 | ||
|
2810 | Generate a cache key for this process of the RhodeCode instance. | |

2811 | The prefix will most likely be the process id, or an explicitly | |

2812 | set instance_id from the .ini file. | |
|
2813 | """ | |
|
2814 | import rhodecode | |
|
2815 | prefix = safe_unicode(rhodecode.CONFIG.get('instance_id') or '') | |
|
2816 | ||
|
2817 | repo_as_unicode = safe_unicode(repo_name) | |
|
2818 | key = u'{}_{}'.format(repo_as_unicode, cache_type) \ | |
|
2819 | if cache_type else repo_as_unicode | |
|
2820 | ||
|
2821 | return u'{}{}'.format(prefix, key) | |
|
2822 | ||
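# Illustrative sketch (added; instance_id 'prod1' is hypothetical):
#
#     CacheKey.get_cache_key('web/api', CacheKey.CACHE_TYPE_README)
#     # -> u'prod1web/api_README'
#     CacheKey.get_cache_key('web/api', None)
#     # -> u'prod1web/api'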
|
2823 | @classmethod | |
|
2824 | def set_invalidate(cls, repo_name, delete=False): | |
|
2825 | """ | |
|
2826 | Mark all caches of a repo as invalid in the database. | |
|
2827 | """ | |
|
2828 | ||
|
2829 | try: | |
|
2830 | qry = Session().query(cls).filter(cls.cache_args == repo_name) | |
|
2831 | if delete: | |
|
2832 | log.debug('cache objects deleted for repo %s', | |
|
2833 | safe_str(repo_name)) | |
|
2834 | qry.delete() | |
|
2835 | else: | |
|
2836 | log.debug('cache objects marked as invalid for repo %s', | |
|
2837 | safe_str(repo_name)) | |
|
2838 | qry.update({"cache_active": False}) | |
|
2839 | ||
|
2840 | Session().commit() | |
|
2841 | except Exception: | |
|
2842 | log.exception( | |
|
2843 | 'Cache key invalidation failed for repository %s', | |
|
2844 | safe_str(repo_name)) | |
|
2845 | Session().rollback() | |
|
2846 | ||
|
2847 | @classmethod | |
|
2848 | def get_active_cache(cls, cache_key): | |
|
2849 | inv_obj = cls.query().filter(cls.cache_key == cache_key).scalar() | |
|
2850 | if inv_obj: | |
|
2851 | return inv_obj | |
|
2852 | return None | |
|
2853 | ||
|
2854 | @classmethod | |
|
2855 | def repo_context_cache(cls, compute_func, repo_name, cache_type, | |
|
2856 | thread_scoped=False): | |
|
2857 | """ | |
|
2858 | @cache_region('long_term') | |
|
2859 | def _heavy_calculation(cache_key): | |
|
2860 | return 'result' | |
|
2861 | ||
|
2862 | cache_context = CacheKey.repo_context_cache( | |
|
2863 | _heavy_calculation, repo_name, cache_type) | |
|
2864 | ||
|
2865 | with cache_context as context: | |
|
2866 | context.invalidate() | |
|
2867 | computed = context.compute() | |
|
2868 | ||
|
2869 | assert computed == 'result' | |
|
2870 | """ | |
|
2871 | from rhodecode.lib import caches | |
|
2872 | return caches.InvalidationContext( | |
|
2873 | compute_func, repo_name, cache_type, thread_scoped=thread_scoped) | |


class ChangesetComment(Base, BaseModel):
    __tablename__ = 'changeset_comments'
    __table_args__ = (
        Index('cc_revision_idx', 'revision'),
        {'extend_existing': True, 'mysql_engine': 'InnoDB',
         'mysql_charset': 'utf8', 'sqlite_autoincrement': True},
    )

    COMMENT_OUTDATED = u'comment_outdated'

    comment_id = Column('comment_id', Integer(), nullable=False, primary_key=True)
    repo_id = Column('repo_id', Integer(), ForeignKey('repositories.repo_id'), nullable=False)
    revision = Column('revision', String(40), nullable=True)
    pull_request_id = Column("pull_request_id", Integer(), ForeignKey('pull_requests.pull_request_id'), nullable=True)
    pull_request_version_id = Column("pull_request_version_id", Integer(), ForeignKey('pull_request_versions.pull_request_version_id'), nullable=True)
    line_no = Column('line_no', Unicode(10), nullable=True)
    hl_lines = Column('hl_lines', Unicode(512), nullable=True)
    f_path = Column('f_path', Unicode(1000), nullable=True)
    user_id = Column('user_id', Integer(), ForeignKey('users.user_id'), nullable=False)
    text = Column('text', UnicodeText().with_variant(UnicodeText(25000), 'mysql'), nullable=False)
    created_on = Column('created_on', DateTime(timezone=False), nullable=False, default=datetime.datetime.now)
    modified_at = Column('modified_at', DateTime(timezone=False), nullable=False, default=datetime.datetime.now)
    renderer = Column('renderer', Unicode(64), nullable=True)
    display_state = Column('display_state', Unicode(128), nullable=True)

    author = relationship('User', lazy='joined')
    repo = relationship('Repository')
    status_change = relationship('ChangesetStatus', cascade="all, delete, delete-orphan")
    pull_request = relationship('PullRequest', lazy='joined')
    pull_request_version = relationship('PullRequestVersion')

    @classmethod
    def get_users(cls, revision=None, pull_request_id=None):
        """
        Returns the users associated with this ChangesetComment, i.e. those
        who actually commented.

        :param cls:
        :param revision:
        """
        q = Session().query(User)\
            .join(ChangesetComment.author)
        if revision:
            q = q.filter(cls.revision == revision)
        elif pull_request_id:
            q = q.filter(cls.pull_request_id == pull_request_id)
        return q.all()

    def render(self, mentions=False):
        from rhodecode.lib import helpers as h
        return h.render(self.text, renderer=self.renderer, mentions=mentions)

    def __repr__(self):
        if self.comment_id:
            return '<DB:ChangesetComment #%s>' % self.comment_id
        else:
            return '<DB:ChangesetComment at %#x>' % id(self)
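
# A short sketch of the get_users() filter logic above; illustrative values,
# bound Session assumed. Note the elif: only one filter applies per call.
def _comment_users_sketch():
    by_commit = ChangesetComment.get_users(revision='a' * 40)
    by_pull_request = ChangesetComment.get_users(pull_request_id=1)
    return by_commit, by_pull_request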


class ChangesetStatus(Base, BaseModel):
    __tablename__ = 'changeset_statuses'
    __table_args__ = (
        Index('cs_revision_idx', 'revision'),
        Index('cs_version_idx', 'version'),
        UniqueConstraint('repo_id', 'revision', 'version'),
        {'extend_existing': True, 'mysql_engine': 'InnoDB',
         'mysql_charset': 'utf8', 'sqlite_autoincrement': True}
    )
    STATUS_NOT_REVIEWED = DEFAULT = 'not_reviewed'
    STATUS_APPROVED = 'approved'
    STATUS_REJECTED = 'rejected'
    STATUS_UNDER_REVIEW = 'under_review'

    STATUSES = [
        (STATUS_NOT_REVIEWED, _("Not Reviewed")),  # (no icon) and default
        (STATUS_APPROVED, _("Approved")),
        (STATUS_REJECTED, _("Rejected")),
        (STATUS_UNDER_REVIEW, _("Under Review")),
    ]

    changeset_status_id = Column('changeset_status_id', Integer(), nullable=False, primary_key=True)
    repo_id = Column('repo_id', Integer(), ForeignKey('repositories.repo_id'), nullable=False)
    user_id = Column("user_id", Integer(), ForeignKey('users.user_id'), nullable=False, unique=None)
    revision = Column('revision', String(40), nullable=False)
    status = Column('status', String(128), nullable=False, default=DEFAULT)
    changeset_comment_id = Column('changeset_comment_id', Integer(), ForeignKey('changeset_comments.comment_id'))
    modified_at = Column('modified_at', DateTime(), nullable=False, default=datetime.datetime.now)
    version = Column('version', Integer(), nullable=False, default=0)
    pull_request_id = Column("pull_request_id", Integer(), ForeignKey('pull_requests.pull_request_id'), nullable=True)

    author = relationship('User', lazy='joined')
    repo = relationship('Repository')
    comment = relationship('ChangesetComment', lazy='joined')
    pull_request = relationship('PullRequest', lazy='joined')

    def __unicode__(self):
        return u"<%s('%s[%s]:%s')>" % (
            self.__class__.__name__,
            self.status, self.version, self.author
        )

    @classmethod
    def get_status_lbl(cls, value):
        return dict(cls.STATUSES).get(value)

    @property
    def status_lbl(self):
        return ChangesetStatus.get_status_lbl(self.status)
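
# A tiny sketch of label lookup via the STATUSES mapping above; unknown
# status values simply yield None. Purely illustrative.
def _status_label_sketch():
    approved_label = ChangesetStatus.get_status_lbl(
        ChangesetStatus.STATUS_APPROVED)   # the translated "Approved" label
    missing = ChangesetStatus.get_status_lbl('no-such-status')  # None
    return approved_label, missing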


class _PullRequestBase(BaseModel):
    """
    Common attributes of pull request and version entries.
    """

    # .status values
    STATUS_NEW = u'new'
    STATUS_OPEN = u'open'
    STATUS_CLOSED = u'closed'

    title = Column('title', Unicode(255), nullable=True)
    description = Column(
        'description', UnicodeText().with_variant(UnicodeText(10240), 'mysql'),
        nullable=True)
    # new/open/closed status of pull request (not approve/reject/etc)
    status = Column('status', Unicode(255), nullable=False, default=STATUS_NEW)
    created_on = Column(
        'created_on', DateTime(timezone=False), nullable=False,
        default=datetime.datetime.now)
    updated_on = Column(
        'updated_on', DateTime(timezone=False), nullable=False,
        default=datetime.datetime.now)

    @declared_attr
    def user_id(cls):
        return Column(
            "user_id", Integer(), ForeignKey('users.user_id'), nullable=False,
            unique=None)

    # 500 revisions max
    _revisions = Column(
        'revisions', UnicodeText().with_variant(UnicodeText(20500), 'mysql'))

    @declared_attr
    def source_repo_id(cls):
        # TODO: dan: rename column to source_repo_id
        return Column(
            'org_repo_id', Integer(), ForeignKey('repositories.repo_id'),
            nullable=False)

    source_ref = Column('org_ref', Unicode(255), nullable=False)

    @declared_attr
    def target_repo_id(cls):
        # TODO: dan: rename column to target_repo_id
        return Column(
            'other_repo_id', Integer(), ForeignKey('repositories.repo_id'),
            nullable=False)

    target_ref = Column('other_ref', Unicode(255), nullable=False)

    # TODO: dan: rename column to last_merge_source_rev
    _last_merge_source_rev = Column(
        'last_merge_org_rev', String(40), nullable=True)
    # TODO: dan: rename column to last_merge_target_rev
    _last_merge_target_rev = Column(
        'last_merge_other_rev', String(40), nullable=True)
    _last_merge_status = Column('merge_status', Integer(), nullable=True)
    merge_rev = Column('merge_rev', String(40), nullable=True)

    @hybrid_property
    def revisions(self):
        return self._revisions.split(':') if self._revisions else []

    @revisions.setter
    def revisions(self, val):
        self._revisions = ':'.join(val)

    @declared_attr
    def author(cls):
        return relationship('User', lazy='joined')

    @declared_attr
    def source_repo(cls):
        return relationship(
            'Repository',
            primaryjoin='%s.source_repo_id==Repository.repo_id' % cls.__name__)

    @property
    def source_ref_parts(self):
        refs = self.source_ref.split(':')
        return Reference(refs[0], refs[1], refs[2])

    @declared_attr
    def target_repo(cls):
        return relationship(
            'Repository',
            primaryjoin='%s.target_repo_id==Repository.repo_id' % cls.__name__)

    @property
    def target_ref_parts(self):
        refs = self.target_ref.split(':')
        return Reference(refs[0], refs[1], refs[2])
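
# A sketch of the two string packings above: revisions are stored as a
# colon-joined list, refs as 'type:name:commit_id' (Reference is assumed to
# be a (type, name, commit_id) namedtuple). Values are illustrative.
def _pull_request_base_sketch(pull_request):
    pull_request.revisions = ['deadbeef', 'cafebabe']
    assert pull_request._revisions == 'deadbeef:cafebabe'
    assert pull_request.revisions == ['deadbeef', 'cafebabe']

    pull_request.source_ref = u'branch:default:deadbeefcafe'
    ref = pull_request.source_ref_parts
    return ref.type, ref.name, ref.commit_id  # ('branch', 'default', 'deadbeefcafe')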


class PullRequest(Base, _PullRequestBase):
    __tablename__ = 'pull_requests'
    __table_args__ = (
        {'extend_existing': True, 'mysql_engine': 'InnoDB',
         'mysql_charset': 'utf8', 'sqlite_autoincrement': True},
    )

    pull_request_id = Column(
        'pull_request_id', Integer(), nullable=False, primary_key=True)

    def __repr__(self):
        if self.pull_request_id:
            return '<DB:PullRequest #%s>' % self.pull_request_id
        else:
            return '<DB:PullRequest at %#x>' % id(self)

    reviewers = relationship('PullRequestReviewers',
                             cascade="all, delete, delete-orphan")
    statuses = relationship('ChangesetStatus')
    comments = relationship('ChangesetComment',
                            cascade="all, delete, delete-orphan")
    versions = relationship('PullRequestVersion',
                            cascade="all, delete, delete-orphan")

    def is_closed(self):
        return self.status == self.STATUS_CLOSED

    def get_api_data(self):
        from rhodecode.model.pull_request import PullRequestModel
        pull_request = self
        merge_status = PullRequestModel().merge_status(pull_request)
        data = {
            'pull_request_id': pull_request.pull_request_id,
            'url': url('pullrequest_show', repo_name=self.target_repo.repo_name,
                       pull_request_id=self.pull_request_id,
                       qualified=True),
            'title': pull_request.title,
            'description': pull_request.description,
            'status': pull_request.status,
            'created_on': pull_request.created_on,
            'updated_on': pull_request.updated_on,
            'commit_ids': pull_request.revisions,
            'review_status': pull_request.calculated_review_status(),
            'mergeable': {
                'status': merge_status[0],
                'message': unicode(merge_status[1]),
            },
            'source': {
                'clone_url': pull_request.source_repo.clone_url(),
                'repository': pull_request.source_repo.repo_name,
                'reference': {
                    'name': pull_request.source_ref_parts.name,
                    'type': pull_request.source_ref_parts.type,
                    'commit_id': pull_request.source_ref_parts.commit_id,
                },
            },
            'target': {
                'clone_url': pull_request.target_repo.clone_url(),
                'repository': pull_request.target_repo.repo_name,
                'reference': {
                    'name': pull_request.target_ref_parts.name,
                    'type': pull_request.target_ref_parts.type,
                    'commit_id': pull_request.target_ref_parts.commit_id,
                },
            },
            'author': pull_request.author.get_api_data(include_secrets=False,
                                                       details='basic'),
            'reviewers': [
                {
                    'user': reviewer.get_api_data(include_secrets=False,
                                                  details='basic'),
                    'review_status': st[0][1].status if st else 'not_reviewed',
                }
                for reviewer, st in pull_request.reviewers_statuses()
            ]
        }

        return data

    def __json__(self):
        return {
            'revisions': self.revisions,
        }

    def calculated_review_status(self):
        # TODO: anderson: 13.05.15 Used only on templates/my_account_pullrequests.html
        # because it's tricky to use ChangesetStatusModel from there
        warnings.warn("Use calculated_review_status from ChangesetStatusModel", DeprecationWarning)
        from rhodecode.model.changeset_status import ChangesetStatusModel
        return ChangesetStatusModel().calculated_review_status(self)

    def reviewers_statuses(self):
        warnings.warn("Use reviewers_statuses from ChangesetStatusModel", DeprecationWarning)
        from rhodecode.model.changeset_status import ChangesetStatusModel
        return ChangesetStatusModel().reviewers_statuses(self)
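
# A sketch of pulling the aggregated API payload for one pull request;
# assumes a bound Session and that BaseModel provides a get() lookup.
def _pull_request_api_sketch(pull_request_id):
    pull_request = PullRequest.get(pull_request_id)
    data = pull_request.get_api_data()
    # merge state is computed live by PullRequestModel, not read from the DB
    return data['status'], data['mergeable']['status'], data['commit_ids']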


class PullRequestVersion(Base, _PullRequestBase):
    __tablename__ = 'pull_request_versions'
    __table_args__ = (
        {'extend_existing': True, 'mysql_engine': 'InnoDB',
         'mysql_charset': 'utf8', 'sqlite_autoincrement': True},
    )

    pull_request_version_id = Column(
        'pull_request_version_id', Integer(), nullable=False, primary_key=True)
    pull_request_id = Column(
        'pull_request_id', Integer(),
        ForeignKey('pull_requests.pull_request_id'), nullable=False)
    pull_request = relationship('PullRequest')

    def __repr__(self):
        if self.pull_request_version_id:
            return '<DB:PullRequestVersion #%s>' % self.pull_request_version_id
        else:
            return '<DB:PullRequestVersion at %#x>' % id(self)


class PullRequestReviewers(Base, BaseModel):
    __tablename__ = 'pull_request_reviewers'
    __table_args__ = (
        {'extend_existing': True, 'mysql_engine': 'InnoDB',
         'mysql_charset': 'utf8', 'sqlite_autoincrement': True},
    )

    def __init__(self, user=None, pull_request=None):
        self.user = user
        self.pull_request = pull_request

    pull_requests_reviewers_id = Column(
        'pull_requests_reviewers_id', Integer(), nullable=False,
        primary_key=True)
    pull_request_id = Column(
        "pull_request_id", Integer(),
        ForeignKey('pull_requests.pull_request_id'), nullable=False)
    user_id = Column(
        "user_id", Integer(), ForeignKey('users.user_id'), nullable=True)

    user = relationship('User')
    pull_request = relationship('PullRequest')


class Notification(Base, BaseModel):
    __tablename__ = 'notifications'
    __table_args__ = (
        Index('notification_type_idx', 'type'),
        {'extend_existing': True, 'mysql_engine': 'InnoDB',
         'mysql_charset': 'utf8', 'sqlite_autoincrement': True},
    )

    TYPE_CHANGESET_COMMENT = u'cs_comment'
    TYPE_MESSAGE = u'message'
    TYPE_MENTION = u'mention'
    TYPE_REGISTRATION = u'registration'
    TYPE_PULL_REQUEST = u'pull_request'
    TYPE_PULL_REQUEST_COMMENT = u'pull_request_comment'

    notification_id = Column('notification_id', Integer(), nullable=False, primary_key=True)
    subject = Column('subject', Unicode(512), nullable=True)
    body = Column('body', UnicodeText().with_variant(UnicodeText(50000), 'mysql'), nullable=True)
    created_by = Column("created_by", Integer(), ForeignKey('users.user_id'), nullable=True)
    created_on = Column('created_on', DateTime(timezone=False), nullable=False, default=datetime.datetime.now)
    type_ = Column('type', Unicode(255))

    created_by_user = relationship('User')
    notifications_to_users = relationship('UserNotification', lazy='joined',
                                          cascade="all, delete, delete-orphan")

    @property
    def recipients(self):
        return [x.user for x in UserNotification.query()\
                .filter(UserNotification.notification == self)\
                .order_by(UserNotification.user_id.asc()).all()]

    @classmethod
    def create(cls, created_by, subject, body, recipients, type_=None):
        if type_ is None:
            type_ = Notification.TYPE_MESSAGE

        notification = cls()
        notification.created_by_user = created_by
        notification.subject = subject
        notification.body = body
        notification.type_ = type_
        notification.created_on = datetime.datetime.now()

        for u in recipients:
            assoc = UserNotification()
            assoc.notification = notification

            # if created_by is among the recipients, mark their own
            # notification as read
            if u.user_id == created_by.user_id:
                assoc.read = True

            u.notifications.append(assoc)
        Session().add(notification)

        return notification

    @property
    def description(self):
        from rhodecode.model.notification import NotificationModel
        return NotificationModel().make_description(self)
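
# A minimal sketch of Notification.create(); assumes two persisted User
# objects and a bound Session. Subject/body values are illustrative.
def _notification_sketch(admin_user, other_user):
    notification = Notification.create(
        created_by=admin_user,
        subject=u'Heads up',
        body=u'The review queue is growing.',
        recipients=[admin_user, other_user],
        type_=Notification.TYPE_MESSAGE)
    Session().commit()
    # the creator's own copy was created pre-marked as read
    return notification.recipients  # ordered by user_id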


class UserNotification(Base, BaseModel):
    __tablename__ = 'user_to_notification'
    __table_args__ = (
        UniqueConstraint('user_id', 'notification_id'),
        {'extend_existing': True, 'mysql_engine': 'InnoDB',
         'mysql_charset': 'utf8', 'sqlite_autoincrement': True}
    )
    user_id = Column('user_id', Integer(), ForeignKey('users.user_id'), primary_key=True)
    notification_id = Column("notification_id", Integer(), ForeignKey('notifications.notification_id'), primary_key=True)
    read = Column('read', Boolean, default=False)
    sent_on = Column('sent_on', DateTime(timezone=False), nullable=True, unique=None)

    user = relationship('User', lazy="joined")
    notification = relationship('Notification', lazy="joined",
                                order_by=lambda: Notification.created_on.desc(),)

    def mark_as_read(self):
        self.read = True
        Session().add(self)


class Gist(Base, BaseModel):
    __tablename__ = 'gists'
    __table_args__ = (
        Index('g_gist_access_id_idx', 'gist_access_id'),
        Index('g_created_on_idx', 'created_on'),
        {'extend_existing': True, 'mysql_engine': 'InnoDB',
         'mysql_charset': 'utf8', 'sqlite_autoincrement': True}
    )
    GIST_PUBLIC = u'public'
    GIST_PRIVATE = u'private'
    DEFAULT_FILENAME = u'gistfile1.txt'

    ACL_LEVEL_PUBLIC = u'acl_public'
    ACL_LEVEL_PRIVATE = u'acl_private'

    gist_id = Column('gist_id', Integer(), primary_key=True)
    gist_access_id = Column('gist_access_id', Unicode(250))
    gist_description = Column('gist_description', UnicodeText().with_variant(UnicodeText(1024), 'mysql'))
    gist_owner = Column('user_id', Integer(), ForeignKey('users.user_id'), nullable=True)
    gist_expires = Column('gist_expires', Float(53), nullable=False)
    gist_type = Column('gist_type', Unicode(128), nullable=False)
    created_on = Column('created_on', DateTime(timezone=False), nullable=False, default=datetime.datetime.now)
    modified_at = Column('modified_at', DateTime(timezone=False), nullable=False, default=datetime.datetime.now)
    acl_level = Column('acl_level', Unicode(128), nullable=True)

    owner = relationship('User')

    def __repr__(self):
        return '<Gist:[%s]%s>' % (self.gist_type, self.gist_access_id)

    @classmethod
    def get_or_404(cls, id_):
        res = cls.query().filter(cls.gist_access_id == id_).scalar()
        if not res:
            raise HTTPNotFound
        return res

    @classmethod
    def get_by_access_id(cls, gist_access_id):
        return cls.query().filter(cls.gist_access_id == gist_access_id).scalar()

    def gist_url(self):
        import rhodecode
        alias_url = rhodecode.CONFIG.get('gist_alias_url')
        if alias_url:
            return alias_url.replace('{gistid}', self.gist_access_id)

        return url('gist', gist_id=self.gist_access_id, qualified=True)

    @classmethod
    def base_path(cls):
        """
        Returns the base path where all gists are stored

        :param cls:
        """
        from rhodecode.model.gist import GIST_STORE_LOC
        q = Session().query(RhodeCodeUi)\
            .filter(RhodeCodeUi.ui_key == URL_SEP)
        q = q.options(FromCache("sql_cache_short", "repository_repo_path"))
        return os.path.join(q.one().ui_value, GIST_STORE_LOC)

    def get_api_data(self):
        """
        Common function for generating gist related data for API
        """
        gist = self
        data = {
            'gist_id': gist.gist_id,
            'type': gist.gist_type,
            'access_id': gist.gist_access_id,
            'description': gist.gist_description,
            'url': gist.gist_url(),
            'expires': gist.gist_expires,
            'created_on': gist.created_on,
            'modified_at': gist.modified_at,
            'content': None,
            'acl_level': gist.acl_level,
        }
        return data

    def __json__(self):
        data = dict()
        data.update(self.get_api_data())
        return data

    # SCM functions

    def scm_instance(self, **kwargs):
        full_repo_path = os.path.join(self.base_path(), self.gist_access_id)
        return get_vcs_instance(
            repo_path=safe_str(full_repo_path), create=False)
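
# A sketch of the gist_alias_url substitution in gist_url() above; the alias
# value is illustrative and would normally come from the .ini configuration.
def _gist_url_sketch(gist):
    import rhodecode
    rhodecode.CONFIG['gist_alias_url'] = 'http://gist.example.com/{gistid}'
    return gist.gist_url()  # -> 'http://gist.example.com/<gist_access_id>'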


class DbMigrateVersion(Base, BaseModel):
    __tablename__ = 'db_migrate_version'
    __table_args__ = (
        {'extend_existing': True, 'mysql_engine': 'InnoDB',
         'mysql_charset': 'utf8', 'sqlite_autoincrement': True},
    )
    repository_id = Column('repository_id', String(250), primary_key=True)
    repository_path = Column('repository_path', Text)
    version = Column('version', Integer)


class ExternalIdentity(Base, BaseModel):
    __tablename__ = 'external_identities'
    __table_args__ = (
        Index('local_user_id_idx', 'local_user_id'),
        Index('external_id_idx', 'external_id'),
        {'extend_existing': True, 'mysql_engine': 'InnoDB',
         'mysql_charset': 'utf8'})

    external_id = Column('external_id', Unicode(255), default=u'',
                         primary_key=True)
    external_username = Column('external_username', Unicode(1024), default=u'')
    local_user_id = Column('local_user_id', Integer(),
                           ForeignKey('users.user_id'), primary_key=True)
    provider_name = Column('provider_name', Unicode(255), default=u'',
                           primary_key=True)
    access_token = Column('access_token', String(1024), default=u'')
    alt_token = Column('alt_token', String(1024), default=u'')
    token_secret = Column('token_secret', String(1024), default=u'')

    @classmethod
    def by_external_id_and_provider(cls, external_id, provider_name,
                                    local_user_id=None):
        """
        Returns ExternalIdentity instance based on search params

        :param external_id:
        :param provider_name:
        :return: ExternalIdentity
        """
        query = cls.query()
        query = query.filter(cls.external_id == external_id)
        query = query.filter(cls.provider_name == provider_name)
        if local_user_id:
            query = query.filter(cls.local_user_id == local_user_id)
        return query.first()

    @classmethod
    def user_by_external_id_and_provider(cls, external_id, provider_name):
        """
        Returns User instance based on search params

        :param external_id:
        :param provider_name:
        :return: User
        """
        query = User.query()
        query = query.filter(cls.external_id == external_id)
        query = query.filter(cls.provider_name == provider_name)
        query = query.filter(User.user_id == cls.local_user_id)
        return query.first()

    @classmethod
    def by_local_user_id(cls, local_user_id):
        """
        Returns all tokens for user

        :param local_user_id:
        :return: ExternalIdentity
        """
        query = cls.query()
        query = query.filter(cls.local_user_id == local_user_id)
        return query
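
# A sketch of the three lookup helpers above, using illustrative provider
# data; a bound Session is assumed.
def _external_identity_sketch():
    identity = ExternalIdentity.by_external_id_and_provider(
        external_id=u'12345', provider_name=u'github')
    user = ExternalIdentity.user_by_external_id_and_provider(
        external_id=u'12345', provider_name=u'github')
    tokens = ExternalIdentity.by_local_user_id(local_user_id=1).all()
    return identity, user, tokens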


class Integration(Base, BaseModel):
    __tablename__ = 'integrations'
    __table_args__ = (
        {'extend_existing': True, 'mysql_engine': 'InnoDB',
         'mysql_charset': 'utf8', 'sqlite_autoincrement': True}
    )

    integration_id = Column('integration_id', Integer(), primary_key=True)
    integration_type = Column('integration_type', String(255))
    enabled = Column('enabled', Boolean(), nullable=False)
    name = Column('name', String(255), nullable=False)
    child_repos_only = Column('child_repos_only', Boolean(), nullable=False,
                              default=False)

    settings = Column(
        'settings_json', MutationObj.as_mutable(
            JsonType(dialect_map=dict(mysql=UnicodeText(16384)))))
    repo_id = Column(
        'repo_id', Integer(), ForeignKey('repositories.repo_id'),
        nullable=True, unique=None, default=None)
    repo = relationship('Repository', lazy='joined')

    repo_group_id = Column(
        'repo_group_id', Integer(), ForeignKey('groups.group_id'),
        nullable=True, unique=None, default=None)
    repo_group = relationship('RepoGroup', lazy='joined')

    @property
    def scope(self):
        if self.repo:
            return repr(self.repo)
        if self.repo_group:
            if self.child_repos_only:
                return repr(self.repo_group) + ' (child repos only)'
            else:
                return repr(self.repo_group) + ' (recursive)'
        if self.child_repos_only:
            return 'root_repos'
        return 'global'

    def __repr__(self):
        return '<Integration(%r, %r)>' % (self.integration_type, self.scope)
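
# A sketch of the scope resolution order above (repo > repo group > global)
# for an in-memory instance; the integration type and name are illustrative.
def _integration_scope_sketch():
    integration = Integration(integration_type='slack', name='demo')
    integration.child_repos_only = False
    assert integration.scope == 'global'
    integration.child_repos_only = True
    assert integration.scope == 'root_repos'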


class RepoReviewRuleUser(Base, BaseModel):
    __tablename__ = 'repo_review_rules_users'
    __table_args__ = (
        {'extend_existing': True, 'mysql_engine': 'InnoDB',
         'mysql_charset': 'utf8', 'sqlite_autoincrement': True,}
    )
    repo_review_rule_user_id = Column(
        'repo_review_rule_user_id', Integer(), primary_key=True)
    repo_review_rule_id = Column("repo_review_rule_id",
        Integer(), ForeignKey('repo_review_rules.repo_review_rule_id'))
    user_id = Column("user_id", Integer(), ForeignKey('users.user_id'),
        nullable=False)
    user = relationship('User')


class RepoReviewRuleUserGroup(Base, BaseModel):
    __tablename__ = 'repo_review_rules_users_groups'
    __table_args__ = (
        {'extend_existing': True, 'mysql_engine': 'InnoDB',
         'mysql_charset': 'utf8', 'sqlite_autoincrement': True,}
    )
    repo_review_rule_users_group_id = Column(
        'repo_review_rule_users_group_id', Integer(), primary_key=True)
    repo_review_rule_id = Column("repo_review_rule_id",
        Integer(), ForeignKey('repo_review_rules.repo_review_rule_id'))
    users_group_id = Column("users_group_id", Integer(),
        ForeignKey('users_groups.users_group_id'), nullable=False)
    users_group = relationship('UserGroup')


class RepoReviewRule(Base, BaseModel):
    __tablename__ = 'repo_review_rules'
    __table_args__ = (
        {'extend_existing': True, 'mysql_engine': 'InnoDB',
         'mysql_charset': 'utf8', 'sqlite_autoincrement': True,}
    )

    repo_review_rule_id = Column(
        'repo_review_rule_id', Integer(), primary_key=True)
    repo_id = Column(
        "repo_id", Integer(), ForeignKey('repositories.repo_id'))
    repo = relationship('Repository', backref='review_rules')

    _branch_pattern = Column("branch_pattern", UnicodeText().with_variant(UnicodeText(255), 'mysql'),
                             default=u'*')  # glob
    _file_pattern = Column("file_pattern", UnicodeText().with_variant(UnicodeText(255), 'mysql'),
                           default=u'*')  # glob

    use_authors_for_review = Column("use_authors_for_review", Boolean(),
                                    nullable=False, default=False)
    rule_users = relationship('RepoReviewRuleUser')
    rule_user_groups = relationship('RepoReviewRuleUserGroup')

    @hybrid_property
    def branch_pattern(self):
        return self._branch_pattern or '*'

    def _validate_glob(self, value):
        re.compile('^' + glob2re(value) + '$')

    @branch_pattern.setter
    def branch_pattern(self, value):
        self._validate_glob(value)
        self._branch_pattern = value or '*'

    @hybrid_property
    def file_pattern(self):
        return self._file_pattern or '*'

    @file_pattern.setter
    def file_pattern(self, value):
        self._validate_glob(value)
        self._file_pattern = value or '*'

    def matches(self, branch, files_changed):
        """
        Check if this review rule matches a branch/files in a pull request

        :param branch: branch name for the commit
        :param files_changed: list of file paths changed in the pull request
        """

        branch = branch or ''
        files_changed = files_changed or []

        branch_matches = True
        if branch:
            branch_regex = re.compile('^' + glob2re(self.branch_pattern) + '$')
            branch_matches = bool(branch_regex.search(branch))

        files_matches = True
        if self.file_pattern != '*':
            files_matches = False
            file_regex = re.compile(glob2re(self.file_pattern))
            for filename in files_changed:
                if file_regex.search(filename):
                    files_matches = True
                    break

        return branch_matches and files_matches

    @property
    def review_users(self):
        """ Returns the users which this rule applies to """

        users = set()
        users |= set([
            rule_user.user for rule_user in self.rule_users
            if rule_user.user.active])
        users |= set(
            member.user
            for rule_user_group in self.rule_user_groups
            for member in rule_user_group.users_group.members
            if member.user.active
        )
        return users

    def __repr__(self):
        return '<RepoReviewerRule(id=%r, repo=%r)>' % (
            self.repo_review_rule_id, self.repo)
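
# A sketch of matches() with an in-memory rule; glob2re is assumed to give
# fnmatch-like semantics. Patterns and paths are illustrative.
def _review_rule_sketch():
    rule = RepoReviewRule()
    rule.branch_pattern = 'release/*'
    rule.file_pattern = '*/templates/*'
    assert rule.matches('release/4.5', ['rhodecode/templates/base.html'])
    assert not rule.matches('default', ['rhodecode/templates/base.html'])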
@@ -1,5 +1,5 b''
 [bumpversion]
-current_version = 4.
+current_version = 4.5.0
 message = release: Bump version {current_version} to {new_version}
 
 [bumpversion:file:rhodecode/VERSION]
@@ -43,6 +43,7 b' syntax: regexp'
 ^rhodecode/public/css/style-polymer.css$
 ^rhodecode/public/js/rhodecode-components.html$
 ^rhodecode/public/js/scripts.js$
+^rhodecode/public/js/rhodecode-components.js$
 ^rhodecode/public/js/src/components/root-styles.gen.html$
 ^rhodecode/public/js/vendors/webcomponentsjs/
 ^rhodecode\.db$
@@ -4,26 +4,21 b' done = false'
 [task:bump_version]
 done = true
 
-[task:rc_tools_pinned]
-done = true
-
 [task:fixes_on_stable]
-done = true
 
 [task:pip2nix_generated]
-done = true
 
 [task:changelog_updated]
-done = true
 
 [task:generate_api_docs]
-done = true
+
+[task:updated_translation]
 
 [release]
-state =
-version = 4.
+state = in_progress
+version = 4.5.0
 
-[task:updated_translation]
+[task:rc_tools_pinned]
 
 [task:generate_js_routes]
 
@@ -11,5 +11,5 b' module.exports = function(grunt) {'
   grunt.loadNpmTasks('grunt-crisper');
   grunt.loadNpmTasks('grunt-contrib-copy');
 
-  grunt.registerTask('default', ['less:production', 'less:components', 'concat:polymercss', 'copy','vulcanize', 'crisper']);
+  grunt.registerTask('default', ['less:production', 'less:components', 'concat:polymercss', 'copy', 'concat:dist', 'vulcanize', 'crisper']);
 };
@@ -12,6 +12,10 b' permission notice:'
 file:licenses/msgpack_license.txt
 Copyright (c) 2009 - tornado
 file:licenses/tornado_license.txt
+Copyright (c) 2015 - pygments-markdown-lexer
+file:licenses/pygments_markdown_lexer_license.txt
+Copyright 2006 - diff_match_patch
+file:licenses/diff_match_patch_license.txt
 
 All licensed under the Apache License, Version 2.0 (the "License");
 you may not use this file except in compliance with the License.
@@ -2,7 +2,6 b''
 WEBPACK=./node_modules/webpack/bin/webpack.js
 GRUNT=grunt
 NODE_PATH=./node_modules
-FLAKE8=flake8 setup.py pytest_pylons/ rhodecode/ --select=E124 --ignore=E711,E712,E510,E121,E122,E126,E127,E128,E501,F401 --max-line-length=100 --exclude=*rhodecode/lib/dbmigrate/*,*rhodecode/tests/*,*rhodecode/lib/vcs/utils/*
 CI_PREFIX=enterprise
 
 .PHONY: docs docs-clean ci-docs clean test test-clean test-lint test-only
@@ -25,13 +24,6 b' test: test-clean test-only'
 test-clean:
 	rm -rf coverage.xml htmlcov junit.xml pylint.log result
 
-test-lint:
-	if [ "$$IN_NIX_SHELL" = "1" ]; then \
-		$(FLAKE8); \
-	else \
-		$(FLAKE8) --format=pylint --exit-zero > pylint.log; \
-	fi
-
 test-only:
 	PYTHONHASHSEED=random py.test -vv -r xw --cov=rhodecode --cov-report=term-missing --cov-report=html rhodecode/tests/
 
@@ -10,6 +10,8 b''
     "paper-tooltip": "PolymerElements/paper-tooltip#^1.1.2",
     "paper-toast": "PolymerElements/paper-toast#^1.3.0",
     "paper-toggle-button": "PolymerElements/paper-toggle-button#^1.2.0",
-    "iron-ajax": "PolymerElements/iron-ajax#^1.4.3"
+    "iron-ajax": "PolymerElements/iron-ajax#^1.4.3",
+    "iron-autogrow-textarea": "PolymerElements/iron-autogrow-textarea#^1.0.13",
+    "iron-a11y-keys": "PolymerElements/iron-a11y-keys#^1.0.6"
   }
 }
@@ -1,7 +1,7 b''
 
 
 ################################################################################
-##
+## RHODECODE COMMUNITY EDITION CONFIGURATION ##
 # The %(here)s variable will be replaced with the parent directory of this file#
 ################################################################################
 
|
64 | 64 | ########################## |
|
65 | 65 | ## GUNICORN WSGI SERVER ## |
|
66 | 66 | ########################## |
|
67 |
## run with gunicorn --log-config |
|
|
67 | ## run with gunicorn --log-config rhodecode.ini --paste rhodecode.ini | |
|
68 | 68 | |
|
69 | 69 | #use = egg:gunicorn#main |
|
70 | 70 | ## Sets the number of process workers. You must set `instance_id = *` |
@@ -91,11 +91,13 b' asyncore_use_poll = true' | |||
|
91 | 91 | #timeout = 21600 |
|
92 | 92 | |
|
93 | 93 | |
|
94 |
## prefix middleware for RhodeCode |
|
|
94 | ## prefix middleware for RhodeCode. | |
|
95 | 95 | ## recommended when using proxy setup. |
|
96 | 96 | ## allows to set RhodeCode under a prefix in server. |
|
97 |
## eg https://server.com/ |
|
|
98 |
## |
|
|
97 | ## eg https://server.com/custom_prefix. Enable `filter-with =` option below as well. | |
|
98 | ## And set your prefix like: `prefix = /custom_prefix` | |
|
99 | ## be sure to also set beaker.session.cookie_path = /custom_prefix if you need | |
|
100 | ## to make your cookies only work on prefix url | |
|
99 | 101 | [filter:proxy-prefix] |
|
100 | 102 | use = egg:PasteDeploy#prefix |
|
101 | 103 | prefix = / |
@@ -194,17 +196,17 b' rss_items_per_page = 10' | |||
|
194 | 196 | rss_include_diff = false |
|
195 | 197 | |
|
196 | 198 | ## gist URL alias, used to create nicer urls for gist. This should be an |
|
197 |
## url that does rewrites to _admin/gists/ |
|
|
199 | ## url that does rewrites to _admin/gists/{gistid}. | |
|
198 | 200 | ## example: http://gist.rhodecode.org/{gistid}. Empty means use the internal |
|
199 |
## RhodeCode url, ie. http[s]://rhodecode.server/_admin/gists/ |
|
|
201 | ## RhodeCode url, ie. http[s]://rhodecode.server/_admin/gists/{gistid} | |
|
200 | 202 | gist_alias_url = |
|
201 | 203 | |
|
202 | 204 | ## List of controllers (using glob pattern syntax) that AUTH TOKENS could be |
|
203 | 205 | ## used for access. |
|
204 |
## Adding ?auth_token |
|
|
206 | ## Adding ?auth_token=TOKEN_HASH to the url authenticates this request as if it | |
|
205 | 207 | ## came from the the logged in user who own this authentication token. |
|
206 | 208 | ## |
|
207 |
## Syntax is |
|
|
209 | ## Syntax is ControllerClass:function_pattern. | |
|
208 | 210 | ## To enable access to raw_files put `FilesController:raw`. |
|
209 | 211 | ## To enable access to patches add `ChangesetController:changeset_patch`. |
|
210 | 212 | ## The list should be "," separated and on a single line. |
@@ -377,15 +379,15 b' beaker.session.lock_dir = %(here)s/data/'
 
 ## Secure encrypted cookie. Requires AES and AES python libraries
 ## you must disable beaker.session.secret to use this
-#beaker.session.encrypt_key =
-#beaker.session.validate_key =
+#beaker.session.encrypt_key = key_for_encryption
+#beaker.session.validate_key = validation_key
 
 ## sets session as invalid(also logging out user) if it haven not been
 ## accessed for given amount of time in seconds
 beaker.session.timeout = 2592000
 beaker.session.httponly = true
-## Path to use for the cookie.
-#beaker.session.cookie_path = /
+## Path to use for the cookie. Set to prefix if you use prefix middleware
+#beaker.session.cookie_path = /custom_prefix
 
 ## uncomment for https secure cookie
 beaker.session.secure = false
@@ -403,8 +405,8 b' beaker.session.auto = false'
 ## Full text search indexer is available in rhodecode-tools under
 ## `rhodecode-tools index` command
 
-# WHOOSH Backend, doesn't require additional services to run
-# it works good with few dozen repos
+## WHOOSH Backend, doesn't require additional services to run
+## it works good with few dozen repos
 search.module = rhodecode.lib.index.whoosh
 search.location = %(here)s/data/index
 
@@ -511,7 +513,7 b' sqlalchemy.db1.url = sqlite:///%(here)s/'
 
 ## print the sql statements to output
 sqlalchemy.db1.echo = false
-## recycle the connections after this am
+## recycle the connections after this amount of seconds
 sqlalchemy.db1.pool_recycle = 3600
 sqlalchemy.db1.convert_unicode = true
 
|
533 | 535 | |
|
534 | 536 | ## Web server connectivity protocol, responsible for web based VCS operatations |
|
535 | 537 | ## Available protocols are: |
|
536 |
## `pyro4` - us |
|
|
537 |
## `http` - us |
|
|
538 | ## `pyro4` - use pyro4 server | |
|
539 | ## `http` - use http-rpc backend (default) | |
|
538 | 540 | vcs.server.protocol = http |
|
539 | 541 | |
|
540 | 542 | ## Push/Pull operations protocol, available options are: |
|
541 |
## `pyro4` - us |
|
|
542 | ## `rhodecode.lib.middleware.utils.scm_app_http` - Http based, recommended | |
|
543 | ## `vcsserver.scm_app` - internal app (EE only) | |
|
544 |
vcs.scm_app_implementation = |
|
|
543 | ## `pyro4` - use pyro4 server | |
|
544 | ## `http` - use http-rpc backend (default) | |
|
545 | ## | |
|
546 | vcs.scm_app_implementation = http | |
|
545 | 547 | |
|
546 | 548 | ## Push/Pull operations hooks protocol, available options are: |
|
547 |
## `pyro4` - us |
|
|
548 |
## `http` - us |
|
|
549 | ## `pyro4` - use pyro4 server | |
|
550 | ## `http` - use http-rpc backend (default) | |
|
549 | 551 | vcs.hooks.protocol = http |
|
550 | 552 | |
|
551 | 553 | vcs.server.log_level = debug |
@@ -574,12 +576,15 b' svn.proxy.generate_config = false'
 svn.proxy.list_parent_path = true
 ## Set location and file name of generated config file.
 svn.proxy.config_file_path = %(here)s/mod_dav_svn.conf
-## File system path to the directory containing the repositories served by
-## RhodeCode.
-svn.proxy.parent_path_root = /path/to/repo_store
-## Used as a prefix to the <Location> block in the generated config file. In
-## most cases it should be set to `/`.
+## Used as a prefix to the `Location` block in the generated config file.
+## In most cases it should be set to `/`.
 svn.proxy.location_root = /
+## Command to reload the mod dav svn configuration on change.
+## Example: `/etc/init.d/apache2 reload`
+#svn.proxy.reload_cmd = /etc/init.d/apache2 reload
+## If the timeout expires before the reload command finishes, the command will
+## be killed. Setting it to zero means no timeout. Defaults to 10 seconds.
+#svn.proxy.reload_timeout = 10
 
 
 ################################
@@ -1,7 +1,7 b''
 
 
 ################################################################################
-##
+## RHODECODE COMMUNITY EDITION CONFIGURATION ##
 # The %(here)s variable will be replaced with the parent directory of this file#
 ################################################################################
 
@@ -64,7 +64,7 b' port = 5000'
 ##########################
 ## GUNICORN WSGI SERVER ##
 ##########################
-## run with gunicorn --log-config
+## run with gunicorn --log-config rhodecode.ini --paste rhodecode.ini
 
 use = egg:gunicorn#main
 ## Sets the number of process workers. You must set `instance_id = *`
@@ -91,11 +91,13 b' max_requests_jitter = 30'
 timeout = 21600
 
 
-## prefix middleware for RhodeCode
+## prefix middleware for RhodeCode.
 ## recommended when using proxy setup.
 ## allows to set RhodeCode under a prefix in server.
-## eg https://server.com/
-##
+## eg https://server.com/custom_prefix. Enable `filter-with =` option below as well.
+## And set your prefix like: `prefix = /custom_prefix`
+## be sure to also set beaker.session.cookie_path = /custom_prefix if you need
+## to make your cookies only work on prefix url
 [filter:proxy-prefix]
 use = egg:PasteDeploy#prefix
 prefix = /
@@ -168,17 +170,17 b' rss_items_per_page = 10'
 rss_include_diff = false
 
 ## gist URL alias, used to create nicer urls for gist. This should be an
-## url that does rewrites to _admin/gists/
+## url that does rewrites to _admin/gists/{gistid}.
 ## example: http://gist.rhodecode.org/{gistid}. Empty means use the internal
-## RhodeCode url, ie. http[s]://rhodecode.server/_admin/gists/
+## RhodeCode url, ie. http[s]://rhodecode.server/_admin/gists/{gistid}
 gist_alias_url =
 
 ## List of controllers (using glob pattern syntax) that AUTH TOKENS could be
 ## used for access.
-## Adding ?auth_token
+## Adding ?auth_token=TOKEN_HASH to the url authenticates this request as if it
 ## came from the the logged in user who own this authentication token.
 ##
-## Syntax is
+## Syntax is ControllerClass:function_pattern.
 ## To enable access to raw_files put `FilesController:raw`.
 ## To enable access to patches add `ChangesetController:changeset_patch`.
 ## The list should be "," separated and on a single line.
@@ -351,15 +353,15 b' beaker.session.lock_dir = %(here)s/data/'
 
 ## Secure encrypted cookie. Requires AES and AES python libraries
 ## you must disable beaker.session.secret to use this
-#beaker.session.encrypt_key =
-#beaker.session.validate_key =
+#beaker.session.encrypt_key = key_for_encryption
+#beaker.session.validate_key = validation_key
 
 ## sets session as invalid(also logging out user) if it haven not been
 ## accessed for given amount of time in seconds
 beaker.session.timeout = 2592000
 beaker.session.httponly = true
-## Path to use for the cookie.
-#beaker.session.cookie_path = /
+## Path to use for the cookie. Set to prefix if you use prefix middleware
+#beaker.session.cookie_path = /custom_prefix
 
 ## uncomment for https secure cookie
 beaker.session.secure = false
@@ -377,8 +379,8 b' beaker.session.auto = false'
 ## Full text search indexer is available in rhodecode-tools under
 ## `rhodecode-tools index` command
 
-# WHOOSH Backend, doesn't require additional services to run
-# it works good with few dozen repos
+## WHOOSH Backend, doesn't require additional services to run
+## it works good with few dozen repos
 search.module = rhodecode.lib.index.whoosh
 search.location = %(here)s/data/index
 
@@ -480,7 +482,7 b' sqlalchemy.db1.url = postgresql://postgr'
 
 ## print the sql statements to output
 sqlalchemy.db1.echo = false
-## recycle the connections after this am
+## recycle the connections after this amount of seconds
 sqlalchemy.db1.pool_recycle = 3600
 sqlalchemy.db1.convert_unicode = true
 
@@ -502,20 +504,20 b' vcs.server = localhost:9900'
 
 ## Web server connectivity protocol, responsible for web based VCS operatations
 ## Available protocols are:
-## `pyro4` - us
-## `http` - us
-
+## `pyro4` - use pyro4 server
+## `http` - use http-rpc backend (default)
+vcs.server.protocol = http
 
 ## Push/Pull operations protocol, available options are:
-## `pyro4` - us
-## `rhodecode.lib.middleware.utils.scm_app_http` - Http based, recommended
-## `vcsserver.scm_app` - internal app (EE only)
-
+## `pyro4` - use pyro4 server
+## `http` - use http-rpc backend (default)
+##
+vcs.scm_app_implementation = http
 
 ## Push/Pull operations hooks protocol, available options are:
-## `pyro4` - us
-## `http` - us
-
+## `pyro4` - use pyro4 server
+## `http` - use http-rpc backend (default)
+vcs.hooks.protocol = http
 
 vcs.server.log_level = info
 ## Start VCSServer with this instance as a subprocess, usefull for development
@@ -543,12 +545,15 b' svn.proxy.generate_config = false'
 svn.proxy.list_parent_path = true
 ## Set location and file name of generated config file.
 svn.proxy.config_file_path = %(here)s/mod_dav_svn.conf
-## File system path to the directory containing the repositories served by
-## RhodeCode.
-svn.proxy.parent_path_root = /path/to/repo_store
-## Used as a prefix to the <Location> block in the generated config file. In
-## most cases it should be set to `/`.
+## Used as a prefix to the `Location` block in the generated config file.
+## In most cases it should be set to `/`.
 svn.proxy.location_root = /
+## Command to reload the mod dav svn configuration on change.
+## Example: `/etc/init.d/apache2 reload`
+#svn.proxy.reload_cmd = /etc/init.d/apache2 reload
+## If the timeout expires before the reload command finishes, the command will
+## be killed. Setting it to zero means no timeout. Defaults to 10 seconds.
+#svn.proxy.reload_timeout = 10
 
 
 ################################
@@ -4,27 +4,55 b'' | |||
|
4 | 4 | # derivation. For advanced tweaks to pimp up the development environment we use |
|
5 | 5 | # "shell.nix" so that it does not have to clutter this file. |
|
6 | 6 | |
|
7 | { pkgs ? (import <nixpkgs> {}) | |
|
8 |
|
|
|
7 | args@ | |
|
8 | { pythonPackages ? "python27Packages" | |
|
9 | 9 | , pythonExternalOverrides ? self: super: {} |
|
10 | 10 | , doCheck ? true |
|
11 | , ... | |
|
11 | 12 | }: |
|
12 | 13 | |
|
13 | let pkgs_ = pkgs; in | |
|
14 | ||
|
15 | 14 | let |
|
16 | pkgs = pkgs_.overridePackages (self: super: { | |
|
17 | # Override subversion derivation to | |
|
18 | # - activate python bindings | |
|
19 | # - set version to 1.8 | |
|
20 | subversion = super.subversion18.override { | |
|
21 | httpSupport = true; | |
|
22 | pythonBindings = true; | |
|
23 | python = self.python27Packages.python; | |
|
24 | }; | |
|
15 | ||
|
16 | # Use nixpkgs from args or import them. We use this indirect approach | |
|
17 | # through args to be able to use the name `pkgs` for our customized packages. | |
|
18 | # Otherwise we will end up with an infinite recursion. | |
|
19 | nixpkgs = args.pkgs or (import <nixpkgs> { }); | |
|
20 | ||
|
21 | # johbo: Interim bridge which allows us to build with the upcoming | |
|
22 | # nixos.16.09 branch (unstable at the moment of writing this note) and the | |
|
23 | # current stable nixos-16.03. | |
|
24 | backwardsCompatibleFetchgit = { ... }@args: | |
|
25 | let | |
|
26 | origSources = nixpkgs.fetchgit args; | |
|
27 | in | |
|
28 | nixpkgs.lib.overrideDerivation origSources (oldAttrs: { | |
|
29 | NIX_PREFETCH_GIT_CHECKOUT_HOOK = '' | |
|
30 | find $out -name '.git*' -print0 | xargs -0 rm -rf | |
|
31 | ''; | |
|
25 | 32 | }); |
|
26 | 33 | |
|
27 | inherit (pkgs.lib) fix extends; | |
|
34 | # Create a customized version of nixpkgs which should be used throughout the | |
|
35 | # rest of this file. | |
|
36 | pkgs = nixpkgs.overridePackages (self: super: { | |
|
37 | fetchgit = backwardsCompatibleFetchgit; | |
|
38 | }); | |
|
39 | ||
|
40 | # Evaluates to the last segment of a file system path. | |
|
41 | basename = path: with pkgs.lib; last (splitString "/" path); | |
|
42 | ||
|
43 | # source code filter used as argument to builtins.filterSource. | |
|
44 | src-filter = path: type: with pkgs.lib; | |
|
45 | let | |
|
46 | ext = last (splitString "." path); | |
|
47 | in | |
|
48 | !builtins.elem (basename path) [ | |
|
49 | ".git" ".hg" "__pycache__" ".eggs" | |
|
50 | "bower_components" "node_modules" | |
|
51 | "build" "data" "result" "tmp"] && | |
|
52 | !builtins.elem ext ["egg-info" "pyc"] && | |
|
53 | # TODO: johbo: This check is wrong, since "path" contains an absolute path, | |
|
54 | # it would still be good to restore it since we want to ignore "result-*". | |
|
55 | !hasPrefix "result" path; | |
|
28 | 56 | |
|
29 | 57 | basePythonPackages = with builtins; if isAttrs pythonPackages |
|
30 | 58 | then pythonPackages |
@@ -34,25 +62,6 b' let' | |||
|
34 | 62 | pkgs.buildBowerComponents or |
|
35 | 63 | (import ./pkgs/backport-16.03-build-bower-components.nix { inherit pkgs; }); |
|
36 | 64 | |
|
37 | elem = builtins.elem; | |
|
38 | basename = path: with pkgs.lib; last (splitString "/" path); | |
|
39 | startsWith = prefix: full: let | |
|
40 | actualPrefix = builtins.substring 0 (builtins.stringLength prefix) full; | |
|
41 | in actualPrefix == prefix; | |
|
42 | ||
|
43 | src-filter = path: type: with pkgs.lib; | |
|
44 | let | |
|
45 | ext = last (splitString "." path); | |
|
46 | in | |
|
47 | !elem (basename path) [ | |
|
48 | ".git" ".hg" "__pycache__" ".eggs" | |
|
49 | "bower_components" "node_modules" | |
|
50 | "build" "data" "result" "tmp"] && | |
|
51 | !elem ext ["egg-info" "pyc"] && | |
|
52 | # TODO: johbo: This check is wrong, since "path" contains an absolute path, | |
|
53 | # it would still be good to restore it since we want to ignore "result-*". | |
|
54 | !startsWith "result" path; | |
|
55 | ||
|
56 | 65 | sources = pkgs.config.rc.sources or {}; |
|
57 | 66 | version = builtins.readFile ./rhodecode/VERSION; |
|
58 | 67 | rhodecode-enterprise-ce-src = builtins.filterSource src-filter ./.; |
@@ -147,18 +156,6 b' let' | |||
|
147 | 156 | then "${pkgs.glibcLocales}/lib/locale/locale-archive" |
|
148 | 157 | else ""; |
|
149 | 158 | |
|
150 | # Somewhat snappier setup of the development environment | |
|
151 | # TODO: move into shell.nix | |
|
152 | # TODO: think of supporting a stable path again, so that multiple shells | |
|
153 | # can share it. | |
|
154 | shellHook = '' | |
|
155 | tmp_path=$(mktemp -d) | |
|
156 | export PATH="$tmp_path/bin:$PATH" | |
|
157 | export PYTHONPATH="$tmp_path/${self.python.sitePackages}:$PYTHONPATH" | |
|
158 | mkdir -p $tmp_path/${self.python.sitePackages} | |
|
159 | python setup.py develop --prefix $tmp_path --allow-hosts "" | |
|
160 | '' + linkNodeAndBowerPackages; | |
|
161 | ||
|
162 | 159 | preCheck = '' |
|
163 | 160 | export PATH="$out/bin:$PATH" |
|
164 | 161 | ''; |
@@ -226,16 +223,16 b' let' | |||
|
226 | 223 | rhodecode-testdata-src = sources.rhodecode-testdata or ( |
|
227 | 224 | pkgs.fetchhg { |
|
228 | 225 | url = "https://code.rhodecode.com/upstream/rc_testdata"; |
|
229 | rev = "v0.
|
|
230 | sha256 = "0hy1ba134rq2f9si85yx7j4qhc9ky0hjzdk553s3q026i7km809m"; | |
|
226 | rev = "v0.9.0"; | |
|
227 | sha256 = "0k0ccb7cncd6mmzwckfbr6l7fsymcympwcm948qc3i0f0m6bbg1y"; | |
|
231 | 228 | }); |
|
232 | 229 | |
|
233 | 230 | # Apply all overrides and fix the final package set |
|
234 | myPythonPackagesUnfix = | |
|
231 | myPythonPackagesUnfix = with pkgs.lib; | |
|
235 | 232 | (extends pythonExternalOverrides |
|
236 | 233 | (extends pythonLocalOverrides |
|
237 | 234 | (extends pythonOverrides |
|
238 | 235 | pythonGeneratedPackages))); |
|
239 | myPythonPackages = (fix myPythonPackagesUnfix); | |
|
236 | myPythonPackages = (pkgs.lib.fix myPythonPackagesUnfix); | |
|
240 | 237 | |
|
241 | 238 | in myPythonPackages.rhodecode-enterprise-ce |
@@ -62,6 +62,24 b' 3. Select :guilabel:`Save`, and you will' | |||
|
62 | 62 | |
|
63 | 63 | .. _md-rst: |
|
64 | 64 | |
|
65 | ||
|
66 | Suppress license warnings or errors | |
|
67 | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ | |
|
68 | ||
|
69 | In case you're running at the maximum allowed number of users, RhodeCode will display a | |
|
70 | warning message on pages when you're close to the license limits. | |
|
71 | It's often not desirable to show that all the time. Here's how you can suppress | |
|
72 | the license messages. | |
|
73 | ||
|
74 | 1. From the |RCE| interface, select | |
|
75 | :menuselection:`Admin --> Settings --> Global` | |
|
76 | 2. Select :guilabel:`Flash message filtering` from the drop-down menu. | |
|
77 | 3. Select :guilabel:`Save`, and you will no longer see the license message | |
|
78 | once your page refreshes. | |
|
79 | ||
|
80 | .. _admin-tricks-suppress-license-messages: | |
|
81 | ||
|
82 | ||
|
65 | 83 | Markdown or RST Rendering |
|
66 | 84 | ^^^^^^^^^^^^^^^^^^^^^^^^^ |
|
67 | 85 |
@@ -7,10 +7,11 b' Use the following example to configure A' | |||
|
7 | 7 | |
|
8 | 8 | .. code-block:: apache |
|
9 | 9 | |
|
10 | <Location /<someprefix>
|
|
11 | ProxyPass http://127.0.0.1:5000/<someprefix> | |
|
12 | ProxyPassReverse http://127.0.0.1:5000/<someprefix> | |
|
|
13 | SetEnvIf X-Url-Scheme https HTTPS=1 | |
|
10 | <Location /<someprefix>/ # Change <someprefix> into your chosen prefix | |
|
11 | ProxyPreserveHost On | |
|
12 | ProxyPass "http://127.0.0.1:5000/" | |
|
13 | ProxyPassReverse "http://127.0.0.1:5000/" | |
|
14 | Header set X-Url-Scheme https env=HTTPS | |
|
14 | 15 | </Location> |
|
15 | 16 | |
|
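For the ``ProxyPass``/``ProxyPassReverse`` directives in the updated snippet to take effect, the Apache proxy modules must be enabled as well. A minimal sketch, assuming a Debian/Ubuntu layout (module and service names may differ on other systems):

.. code-block:: bash

    # enable the reverse-proxy modules used by the snippet above
    $ sudo a2enmod proxy proxy_http headers
    $ sudo service apache2 restart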
16 | 17 | In addition to the regular Apache setup you will need to add the following |
@@ -9,9 +9,14 b' Backup and Restore' | |||
|
9 | 9 | To snapshot an instance of |RCE|, and save its settings, you need to backup the |
|
10 | 10 | following parts of the system at the same time. |
|
11 | 11 | |
|
12 | * The |repos| managed by the instance. | |
|
12 | * The |repos| managed by the instance together with the stored Gists. | |
|
13 | 13 | * The |RCE| database. |
|
14 | * Any configuration files or extensions that you've configured. | |
|
14 | * Any configuration files or extensions that you've configured. In most | |
|
15 | cases it's only the :file:`rhodecode.ini` file. | |
|
16 | * Installer files such as those in `/opt/rhodecode` can be backed up; however, | |
|
17 | this is not required, since in case of a recovery the installer simply | |
|
18 | re-creates them. | |
|
19 | ||
|
15 | 20 | |
|
16 | 21 | .. important:: |
|
17 | 22 | |
@@ -29,11 +34,17 b' Repository Backup' | |||
|
29 | 34 | ^^^^^^^^^^^^^^^^^ |
|
30 | 35 | |
|
31 | 36 | To back up your |repos|, use the API to get a list of all |repos| managed, |
|
32 | and then clone them to your backup location. | |
|
37 | and then clone them to your backup location. This is the safest backup option. | |
|
38 | Backing up the storage directory could potentially result in a backup of | |
|
39 | partially committed files or commits (e.g. a backup taking place during a big push). | |
|
40 | As an alternative you could use rsync or a simple `cp` command if you can | |
|
41 | ensure your instance is in read-only mode or stopped at the time. | |
|
42 | ||
|
33 | 43 | |
|
34 | 44 | Use the ``get_repos`` method to list all your managed |repos|, |
|
35 | 45 | and use the ``clone_uri`` information that is returned. See the :ref:`api` |
|
36 | for more information. | |
|
46 | for more information. Be sure to keep the structure of repositories with their | |
|
47 | repository groups. | |
|
37 | 48 | |
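A minimal sketch of such an API-driven backup, assuming an admin auth token, the default ``/_admin/api`` JSON-RPC endpoint, and Mercurial |repos| (host, token, and paths are placeholders; use the matching clone command for each |repo| type):

.. code-block:: bash

    # list all managed repositories, then clone each one into the backup location
    curl -s -X POST https://rhodecode.example.com/_admin/api \
        -H 'Content-Type: application/json' \
        -d '{"id": 1, "auth_token": "<token>", "method": "get_repos", "args": {}}' \
      | jq -r '.result[].repo_name' \
      | while read -r repo; do
            hg clone "https://rhodecode.example.com/${repo}" "backup/${repo}"
        done

Cloning this way preserves the ``group/subgroup/repo`` paths as directories, which keeps the repository group structure mentioned above.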
|
38 | 49 | .. important:: |
|
39 | 50 | |
@@ -54,13 +65,19 b' backup location:' | |||
|
54 | 65 | .. code-block:: bash |
|
55 | 66 | |
|
56 | 67 | # For MySQL DBs |
|
57 | $ mysqldump -u <uname> -p <pass> db_name > mysql-db-backup | |
|
68 | $ mysqldump -u <uname> -p<pass> rhodecode_db_name > mysql-db-backup | |
|
69 | # MySQL restore command | |
|
70 | $ mysql -u <uname> -p<pass> rhodecode_db_name < mysql-db-backup | |
|
58 | 71 | |
|
59 | 72 | # For PostgreSQL DBs |
|
60 | $ pg_dump dbname > postgresql-db-backup | |
|
73 | $ PGPASSWORD=<pass> pg_dump rhodecode_db_name > postgresql-db-backup | |
|
74 | # PostgreSQL restore | |
|
75 | $ PGPASSWORD=<pass> psql -U <uname> -h localhost -d rhodecode_db_name -1 -f postgresql-db-backup | |
|
61 | 76 | |
|
62 | # For SQLite DBs | |
|
|
77 | # For SQLite | |
|
63 | 78 | $ sqlite3 rhodecode.db ‘.dump’ > sqlite-db-backup |
|
79 | # SQLite restore | |
|
80 | $ cp sqlite-db-backup rhodecode.db | |
|
64 | 81 | |
|
65 | 82 | |
|
66 | 83 | The default |RCE| SQLite database location is |
@@ -75,13 +92,18 b' Configuration File Backup' | |||
|
75 | 92 | Depending on your setup, you could have a number of configuration files that |
|
76 | 93 | should be backed up. You may have some, or all of the configuration files |
|
77 | 94 | listed in the :ref:`config-rce-files` section. Ideally you should back these |
|
78 | up at the same time as the database and |repos|. | |
|
95 | up at the same time as the database and |repos|. It really depends on whether | |
|
96 | you need auxiliary files such as logs or custom modules. We always recommend backing | |
|
97 | those up. | |
|
79 | 98 | |
|
80 | 99 | Gist Backup |
|
81 | 100 | ^^^^^^^^^^^ |
|
82 | 101 | |
|
83 | To backup the gists on your |RCE| instance you can use the ``get_users`` and | |
|
|
84 | ``get_gists`` API methods to fetch the gists for each user on the instance. | |
|
102 | To backup the gists on your |RCE| instance you usually have to backup the | |
|
103 | gist storage path. If this hasn't been changed, it's located inside | |
|
104 | :file:`.rc_gist_store` and the metadata in :file:`.rc_gist_metadata`. | |
|
105 | You can use the ``get_users`` and ``get_gists`` API methods to fetch the | |
|
106 | gists for each user on the instance. | |
|
85 | 107 | |
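A minimal sketch of the storage-level approach, assuming the default gist locations inside the |repos| storage root (both paths are illustrative):

.. code-block:: bash

    # copy the gist payloads together with their metadata
    rsync -a /path/to/repo_store/.rc_gist_store /backup/gists/
    rsync -a /path/to/repo_store/.rc_gist_metadata /backup/gists/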
|
86 | 108 | Extension Backups |
|
87 | 109 | ^^^^^^^^^^^^^^^^^ |
@@ -100,15 +122,17 b' the :ref:`indexing-ref` section.' | |||
|
100 | 122 | Restoration Steps |
|
101 | 123 | ----------------- |
|
102 | 124 | |
|
103 | To restore an instance of |RCE| from its backed up components, use the | |
|
|
104 | following steps. | |
|
125 | To restore an instance of |RCE| from its backed up components, to a fresh | |
|
126 | system use the following steps. | |
|
105 | 127 | |
|
106 | 1. Install a new instance of |RCE|. | |
|
107 | 2. Once installed, configure the instance to use the backed up | |
|
108 | :file:`rhodecode.ini` file. Ensure this file points to the backed up | |
|
128 | 1. Install a new instance of |RCE| using the SQLite option as the database. | |
|
129 | 2. Restore your database. | |
|
130 | 3. Once installed, replace the :file:`rhodecode.ini` file with your backed | |
|
131 | up version. Ensure this file points to the restored | |
|
109 | 132 | database, see the :ref:`config-database` section. |
|
110 |
|
|
|
111 | :ref:`remap-rescan` section. | |
|
133 | 4. Restart |RCE| and remap and rescan your |repos| to verify filesystem access, | |
|
134 | see the :ref:`remap-rescan` section. | |
|
135 | ||
|
112 | 136 | |
|
113 | 137 | Post Restoration Steps |
|
114 | 138 | ^^^^^^^^^^^^^^^^^^^^^^ |
@@ -22,8 +22,8 b' account permissions.' | |||
|
22 | 22 | .. code-block:: bash |
|
23 | 23 | |
|
24 | 24 | # Open iShell from the terminal |
|
25 | $ .rccontrol/enterprise-
|
|
26 | ishell .rccontrol/enterprise-
|
|
25 | $ .rccontrol/enterprise-1/profile/bin/paster \ | |
|
26 | ishell .rccontrol/enterprise-1/rhodecode.ini | |
|
27 | 27 | |
|
28 | 28 | .. code-block:: mysql |
|
29 | 29 | |
@@ -52,11 +52,12 b' following example to make changes to thi' | |||
|
52 | 52 | .. code-block:: mysql |
|
53 | 53 | |
|
54 | 54 | # Use this example to enable global .hgrc access |
|
55 | In [
|
56 | In [
|
57 | In [
|
58 | In [
|
59 | In [
|
|
55 | In [1]: new_option = RhodeCodeUi() | |
|
56 | In [2]: new_option.ui_section='web' | |
|
57 | In [3]: new_option.ui_key='allow_push' | |
|
58 | In [4]: new_option.ui_value='*' | |
|
59 | In [5]: Session().add(new_option);Session().commit() | |
|
60 | In [6]: exit() | |
|
60 | 61 | |
|
61 | 62 | Manually Reset Password |
|
62 | 63 | ^^^^^^^^^^^^^^^^^^^^^^^ |
@@ -72,24 +73,47 b' Use the following code example to carry ' | |||
|
72 | 73 | .. code-block:: bash |
|
73 | 74 | |
|
74 | 75 | # starts the ishell interactive prompt |
|
75 | $ .rccontrol/enterprise-
|
|
76 | ishell .rccontrol/enterprise-
|
|
76 | $ .rccontrol/enterprise-1/profile/bin/paster \ | |
|
77 | ishell .rccontrol/enterprise-1/rhodecode.ini | |
|
77 | 78 | |
|
78 | 79 | .. code-block:: mysql |
|
79 | 80 | |
|
80 | from rhodecode.lib.auth import generate_auth_token | |
|
81 | from rhodecode.lib.auth import get_crypt_password | |
|
82 | ||
|
81 | In [1]: from rhodecode.lib.auth import generate_auth_token | |
|
82 | In [2]: from rhodecode.lib.auth import get_crypt_password | |
|
83 | 83 | # Enter the user name whose password you wish to change |
|
84 | my_user = 'USERNAME' | |
|
85 | u = User.get_by_username(my_user) | |
|
86 | ||
|
84 | In [3]: my_user = 'USERNAME' | |
|
85 | In [4]: u = User.get_by_username(my_user) | |
|
87 | 86 | # If this fails then the user does not exist |
|
88 | u.auth_token = generate_auth_token(my_user) | |
|
89 | ||
|
87 | In [5]: u.auth_token = generate_auth_token(my_user) | |
|
90 | 88 | # Set the new password |
|
91 | u.password = get_crypt_password('PASSWORD') | |
|
89 | In [6]: u.password = get_crypt_password('PASSWORD') | |
|
90 | In [7]: Session().add(u);Session().commit() | |
|
91 | In [8]: exit() | |
|
92 | ||
|
93 | ||
|
94 | ||
|
95 | Change user details | |
|
96 | ^^^^^^^^^^^^^^^^^^^ | |
|
97 | ||
|
98 | If you need to manually change some of a user's details, use the following steps. | |
|
99 | ||
|
100 | 1. Navigate to your |RCE| install location. | |
|
101 | 2. Run the interactive ``ishell`` prompt. | |
|
102 | 3. Set new attributes for the user. | |
|
92 | 103 | |
|
93 | Session().add(u) | |
|
94 | Session().commit() | |
|
95 | exit | |
|
104 | Use the following code example to carry out these steps. | |
|
105 | ||
|
106 | .. code-block:: bash | |
|
107 | ||
|
108 | # starts the ishell interactive prompt | |
|
109 | $ .rccontrol/enterprise-1/profile/bin/paster \ | |
|
110 | ishell .rccontrol/enterprise-1/rhodecode.ini | |
|
111 | ||
|
112 | .. code-block:: mysql | |
|
113 | ||
|
114 | # Use this example to change email and username of LDAP user | |
|
115 | In [1]: my_user = User.get_by_username('some_username') | |
|
116 | In [2]: my_user.email = 'new_email@foobar.com' | |
|
117 | In [3]: my_user.username = 'SomeUser' | |
|
118 | In [4]: Session().add(my_user);Session().commit() | |
|
119 | In [5]: exit() |
@@ -36,7 +36,7 b' 1. On your local machine create the publ' | |||
|
36 | 36 | Your public key has been saved in /home/user/.ssh/id_rsa.pub. |
|
37 | 37 | The key fingerprint is: |
|
38 | 38 | 02:82:38:95:e5:30:d2:ad:17:60:15:7f:94:17:9f:30 user@ubuntu |
|
39 | The key's randomart image is: | |
|
39 | The key\'s randomart image is: | |
|
40 | 40 | +--[ RSA 2048]----+ |
|
41 | 41 | |
|
42 | 42 | 2. SFTP to your server, and copy the public key to the ``~/.ssh`` folder. |
@@ -18,6 +18,7 b' The following are the most common system' | |||
|
18 | 18 | |
|
19 | 19 | config-files-overview |
|
20 | 20 | vcs-server |
|
21 | svn-http | |
|
21 | 22 | apache-config |
|
22 | 23 | nginx-config |
|
23 | 24 | backup-restore |
@@ -18,7 +18,7 b' 1. Open ishell from the terminal and use' | |||
|
18 | 18 | 2. Run the following commands, and ensure that |RCE| has write access to the |
|
19 | 19 | new directory: |
|
20 | 20 | |
|
21 | .. code-block::
|
|
21 | .. code-block:: bash | |
|
22 | 22 | |
|
23 | 23 | # Once logged into the database, use SQL to redirect
|
24 | 24 | # the large files location
@@ -298,133 +298,7 b' For a more detailed explanation of the l' | |||
|
298 | 298 | format = %(asctime)s.%(msecs)03d %(levelname)-5.5s [%(name)s] %(message)s |
|
299 | 299 | datefmt = %Y-%m-%d %H:%M:%S |
|
300 | 300 | |
|
301 | .. _svn-http: | |
|
302 | ||
|
303 | |svn| With Write Over HTTP | |
|
304 | ^^^^^^^^^^^^^^^^^^^^^^^^^^ | |
|
305 | ||
|
306 | To use |svn| with read/write support over the |svn| HTTP protocol, you have to | |
|
307 | configure the HTTP |svn| backend. | |
|
308 | ||
|
309 | Prerequisites | |
|
310 | ============= | |
|
311 | ||
|
312 | - Enable HTTP support inside the admin VCS settings on your |RCE| instance | |
|
313 | - You need to install the following tools on the machine that is running an | |
|
314 | instance of |RCE|: | |
|
315 | ``Apache HTTP Server`` and | |
|
316 | ``mod_dav_svn``. | |
|
317 | ||
|
318 | ||
|
319 | Using Ubuntu Distribution as an example you can run: | |
|
320 | ||
|
321 | .. code-block:: bash | |
|
322 | ||
|
323 | $ sudo apt-get install apache2 libapache2-mod-svn | |
|
324 | ||
|
325 | Once installed you need to enable ``dav_svn``: | |
|
326 | ||
|
327 | .. code-block:: bash | |
|
328 | ||
|
329 | $ sudo a2enmod dav_svn | |
|
330 | ||
|
331 | Configuring Apache Setup | |
|
332 | ======================== | |
|
333 | ||
|
334 | .. tip:: | |
|
335 | ||
|
336 | It is recommended to run Apache on a port other than 80, due to possible | |
|
337 | conflicts with other HTTP servers like nginx. To do this, set the | |
|
338 | ``Listen`` parameter in the ``/etc/apache2/ports.conf`` file, for example | |
|
339 | ``Listen 8090``. | |
|
340 | ||
|
341 | ||
|
342 | .. warning:: | |
|
343 | ||
|
344 | Make sure your Apache instance which runs the mod_dav_svn module is | |
|
345 | only accessible by RhodeCode. Otherwise everyone is able to browse | |
|
346 | the repositories or run subversion operations (checkout/commit/etc.). | |
|
347 | ||
|
348 | It is also recommended to run apache as the same user as |RCE|, otherwise | |
|
349 | permission issues could occur. To do this edit the ``/etc/apache2/envvars`` | |
|
350 | ||
|
351 | .. code-block:: apache | |
|
352 | ||
|
353 | export APACHE_RUN_USER=rhodecode | |
|
354 | export APACHE_RUN_GROUP=rhodecode | |
|
355 | ||
|
356 | 1. To configure Apache, create and edit a virtual hosts file, for example | |
|
357 | :file:`/etc/apache2/sites-available/default.conf`. Below is an example | |
|
358 | how to use one with auto-generated config ```mod_dav_svn.conf``` | |
|
359 | from configured |RCE| instance. | |
|
360 | ||
|
361 | .. code-block:: apache | |
|
362 | ||
|
363 | <VirtualHost *:8080> | |
|
364 | ServerAdmin rhodecode-admin@localhost | |
|
365 | DocumentRoot /var/www/html | |
|
366 | ErrorLog ${'${APACHE_LOG_DIR}'}/error.log | |
|
367 | CustomLog ${'${APACHE_LOG_DIR}'}/access.log combined | |
|
368 | Include /home/user/.rccontrol/enterprise-1/mod_dav_svn.conf | |
|
369 | </VirtualHost> | |
|
370 | ||
|
371 | ||
|
372 | 2. Go to the :menuselection:`Admin --> Settings --> VCS` page, and | |
|
373 | enable :guilabel:`Proxy Subversion HTTP requests`, and specify the | |
|
374 | :guilabel:`Subversion HTTP Server URL`. | |
|
375 | ||
|
376 | 3. Open the |RCE| configuration file, | |
|
377 | :file:`/home/{user}/.rccontrol/{instance-id}/rhodecode.ini` | |
|
378 | ||
|
379 | 4. Add the following configuration option in the ``[app:main]`` | |
|
380 | section if you don't have it yet. | |
|
381 | ||
|
382 | This enables mapping of the created |RCE| repo groups into special |svn| paths. | |
|
383 | Each time a new repository group is created, the system will update | |
|
384 | the template file and create new mapping. Apache web server needs to be | |
|
385 | reloaded to pick up the changes on this file. | |
|
386 | It's recommended to add reload into a crontab so the changes can be picked | |
|
387 | automatically once someone creates a repository group inside RhodeCode. | |
|
388 | ||
|
389 | ||
|
390 | .. code-block:: ini | |
|
391 | ||
|
392 | ############################################## | |
|
393 | ### Subversion proxy support (mod_dav_svn) ### | |
|
394 | ############################################## | |
|
395 | ## Enable or disable the config file generation. | |
|
396 | svn.proxy.generate_config = true | |
|
397 | ## Generate config file with `SVNListParentPath` set to `On`. | |
|
398 | svn.proxy.list_parent_path = true | |
|
399 | ## Set location and file name of generated config file. | |
|
400 | svn.proxy.config_file_path = %(here)s/mod_dav_svn.conf | |
|
401 | ## File system path to the directory containing the repositories served by | |
|
402 | ## RhodeCode. | |
|
403 | svn.proxy.parent_path_root = /path/to/repo_store | |
|
404 | ## Used as a prefix to the <Location> block in the generated config file. In | |
|
405 | ## most cases it should be set to `/`. | |
|
406 | svn.proxy.location_root = / | |
|
407 | ||
|
408 | ||
|
409 | This would create a special template file called ```mod_dav_svn.conf```. We | |
|
410 | used that file path in the apache config above inside the Include statement. | |
|
411 | ||
|
412 | ||
|
413 | Using |svn| | |
|
414 | =========== | |
|
415 | ||
|
416 | Once |svn| has been enabled on your instance, you can use it with the | |
|
417 | following examples. For more |svn| information, see the `Subversion Red Book`_ | |
|
418 | ||
|
419 | .. code-block:: bash | |
|
420 | ||
|
421 | # To clone a repository | |
|
422 | svn checkout http://my-svn-server.example.com/my-svn-repo | |
|
423 | ||
|
424 | # svn commit | |
|
425 | svn commit | |
|
426 | 301 | |
|
427 | 302 | .. _Subversion Red Book: http://svnbook.red-bean.com/en/1.7/svn-book.html#svn.ref.svn |
|
428 | 303 | |
|
429 | ||
|
430 | .. _Ask Ubuntu: http://askubuntu.com/questions/162391/how-do-i-fix-my-locale-issue No newline at end of file | |
|
304 | .. _Ask Ubuntu: http://askubuntu.com/questions/162391/how-do-i-fix-my-locale-issue |
@@ -1,7 +1,7 b'' | |||
|
1 | 1 | .. _deprecated-methods-ref: |
|
2 | 2 | |
|
3 | 3 | deprecated methods |
|
4 | ================= | |
|
4 | ================== | |
|
5 | 5 | |
|
6 | 6 | changeset_comment |
|
7 | 7 | ----------------- |
@@ -1,7 +1,7 b'' | |||
|
1 | 1 | .. _gist-methods-ref: |
|
2 | 2 | |
|
3 | 3 | gist methods |
|
4 | ============
|
|
4 | ============ | |
|
5 | 5 | |
|
6 | 6 | create_gist |
|
7 | 7 | ----------- |
@@ -1,10 +1,10 b'' | |||
|
1 | 1 | .. _license-methods-ref: |
|
2 | 2 | |
|
3 | 3 | license methods |
|
4 | ===============
|
|
4 | =============== | |
|
5 | 5 | |
|
6 | 6 | get_license_info (EE only) |
|
7 | ---------------- | |
|
7 | -------------------------- | |
|
8 | 8 | |
|
9 | 9 | .. py:function:: get_license_info(apiuser) |
|
10 | 10 | |
@@ -32,7 +32,7 b' get_license_info (EE only)' | |||
|
32 | 32 | |
|
33 | 33 | |
|
34 | 34 | set_license_key (EE only) |
|
35 | --------------- | |
|
35 | ------------------------- | |
|
36 | 36 | |
|
37 | 37 | .. py:function:: set_license_key(apiuser, key) |
|
38 | 38 |
@@ -1,7 +1,7 b'' | |||
|
1 | 1 | .. _pull-request-methods-ref: |
|
2 | 2 | |
|
3 | 3 | pull_request methods |
|
4 | ================= | |
|
4 | ==================== | |
|
5 | 5 | |
|
6 | 6 | close_pull_request |
|
7 | 7 | ------------------ |
@@ -103,6 +103,10 b' create_pull_request' | |||
|
103 | 103 | :type description: Optional(str) |
|
104 | 104 | :param reviewers: Set the new pull request reviewers list. |
|
105 | 105 | :type reviewers: Optional(list) |
|
106 | Accepts username strings or objects of the format: | |
|
107 | { | |
|
108 | 'username': 'nick', 'reasons': ['original author'] | |
|
109 | } | |
|
106 | 110 | |
|
107 | 111 | |
|
108 | 112 | get_pull_request |
@@ -165,6 +169,15 b' get_pull_request' | |||
|
165 | 169 | "commit_id": "<commit_id>", |
|
166 | 170 | } |
|
167 | 171 | }, |
|
172 | "merge": { | |
|
173 | "clone_url": "<clone_url>", | |
|
174 | "reference": | |
|
175 | { | |
|
176 | "name": "<name>", | |
|
177 | "type": "<type>", | |
|
178 | "commit_id": "<commit_id>", | |
|
179 | } | |
|
180 | }, | |
|
168 | 181 | "author": <user_obj>, |
|
169 | 182 | "reviewers": [ |
|
170 | 183 | ... |
@@ -241,6 +254,15 b' get_pull_requests' | |||
|
241 | 254 | "commit_id": "<commit_id>", |
|
242 | 255 | } |
|
243 | 256 | }, |
|
257 | "merge": { | |
|
258 | "clone_url": "<clone_url>", | |
|
259 | "reference": | |
|
260 | { | |
|
261 | "name": "<name>", | |
|
262 | "type": "<type>", | |
|
263 | "commit_id": "<commit_id>", | |
|
264 | } | |
|
265 | }, | |
|
244 | 266 | "author": <user_obj>, |
|
245 | 267 | "reviewers": [ |
|
246 | 268 | ... |
@@ -284,7 +306,12 b' merge_pull_request' | |||
|
284 | 306 | "executed": "<bool>", |
|
285 | 307 | "failure_reason": "<int>", |
|
286 | 308 | "merge_commit_id": "<merge_commit_id>", |
|
287 | "possible": "<bool>" | |
|
309 | "possible": "<bool>", | |
|
310 | "merge_ref": { | |
|
311 | "commit_id": "<commit_id>", | |
|
312 | "type": "<type>", | |
|
313 | "name": "<name>" | |
|
314 | } | |
|
288 | 315 | }, |
|
289 | 316 | "error": null |
|
290 | 317 |
@@ -1,24 +1,25 b'' | |||
|
1 | 1 | .. _repo-group-methods-ref: |
|
2 | 2 | |
|
3 | 3 | repo_group methods |
|
4 | ================= | |
|
4 | ================== | |
|
5 | 5 | |
|
6 | 6 | create_repo_group |
|
7 | 7 | ----------------- |
|
8 | 8 | |
|
9 | .. py:function:: create_repo_group(apiuser, group_name,
|
|
9 | .. py:function:: create_repo_group(apiuser, group_name, owner=<Optional:<OptionalAttr:apiuser>>, description=<Optional:''>, copy_permissions=<Optional:False>) | |
|
10 | 10 | |
|
11 | 11 | Creates a repository group. |
|
12 | 12 | |
|
13 | * If the repository group name contains "/", all the required repository | |
|
|
14 | groups will be created. | |
|
13 | * If the repository group name contains "/", the repository group will be | |
|
14 | created inside a repository group or nested repository groups | |
|
15 | 15 | |
|
16 | For example "foo/bar/baz" will create groups "foo" and "bar" | |
|
|
17 | (with "foo" as parent). It will also create the "baz" repository | |
|
18 | with "bar" as |repo| group. | |
|
16 | For example "foo/bar/group1" will create repository group called "group1" | |
|
17 | inside group "foo/bar". You have to have permissions to access and | |
|
18 | write to the last repository group ("bar" in this example) | |
|
19 | 19 | |
|
20 | This command can only be run using an |authtoken| with admin | |
|
|
21 | permissions. | |
|
20 | This command can only be run using an |authtoken| with at least | |
|
21 | permissions to create repository groups, or admin permissions to | |
|
22 | parent repository groups. | |
|
22 | 23 | |
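For illustration, a sketch of calling this method over the JSON-RPC endpoint to create ``group1`` inside the existing ``foo/bar`` group (host and token are placeholders):

.. code-block:: bash

    curl -s -X POST https://rhodecode.example.com/_admin/api \
        -H 'Content-Type: application/json' \
        -d '{"id": 1, "auth_token": "<token>", "method": "create_repo_group",
             "args": {"group_name": "foo/bar/group1", "description": "nested group"}}'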
|
23 | 24 | :param apiuser: This is filled automatically from the |authtoken|. |
|
24 | 25 | :type apiuser: AuthUser |
@@ -73,7 +74,7 b' delete_repo_group' | |||
|
73 | 74 | |
|
74 | 75 | id : <id_given_in_input> |
|
75 | 76 | result : { |
|
76 | 'msg': 'deleted repo group ID:<repogroupid> <repogroupname> | |
|
77 | 'msg': 'deleted repo group ID:<repogroupid> <repogroupname>' | |
|
77 | 78 | 'repo_group': null |
|
78 | 79 | } |
|
79 | 80 | error : null |
@@ -325,13 +326,22 b' revoke_user_permission_from_repo_group' | |||
|
325 | 326 | update_repo_group |
|
326 | 327 | ----------------- |
|
327 | 328 | |
|
328 | .. py:function:: update_repo_group(apiuser, repogroupid, group_name=<Optional:''>, description=<Optional:''>, owner=<Optional:<OptionalAttr:apiuser>
|
|
329 | .. py:function:: update_repo_group(apiuser, repogroupid, group_name=<Optional:''>, description=<Optional:''>, owner=<Optional:<OptionalAttr:apiuser>>, enable_locking=<Optional:False>) | |
|
329 | 330 | |
|
330 | 331 | Updates repository group with the details given. |
|
331 | 332 | |
|
332 | 333 | This command can only be run using an |authtoken| with admin |
|
333 | 334 | permissions. |
|
334 | 335 | |
|
336 | * If the group_name contains "/", the repository group will be updated | |
|
337 | accordingly inside a repository group or nested repository groups | |
|
338 | ||
|
339 | For example repogroupid=group-test group_name="foo/bar/group-test" | |
|
340 | will update repository group called "group-test" and place it | |
|
341 | inside group "foo/bar". | |
|
342 | You have to have permissions to access and write to the last repository | |
|
343 | group ("bar" in this example) | |
|
344 | ||
|
335 | 345 | :param apiuser: This is filled automatically from the |authtoken|. |
|
336 | 346 | :type apiuser: AuthUser |
|
337 | 347 | :param repogroupid: Set the ID of repository group. |
@@ -342,8 +352,6 b' update_repo_group' | |||
|
342 | 352 | :type description: str |
|
343 | 353 | :param owner: Set the |repo| group owner. |
|
344 | 354 | :type owner: str |
|
345 | :param parent: Set the |repo| group parent. | |
|
346 | :type parent: str or int | |
|
347 | 355 | :param enable_locking: Enable |repo| locking. The default is false. |
|
348 | 356 | :type enable_locking: bool |
|
349 | 357 |
@@ -1,7 +1,7 b'' | |||
|
1 | 1 | .. _repo-methods-ref: |
|
2 | 2 | |
|
3 | 3 | repo methods |
|
4 | ============
|
|
4 | ============ | |
|
5 | 5 | |
|
6 | 6 | add_field_to_repo |
|
7 | 7 | ----------------- |
@@ -68,15 +68,16 b' create_repo' | |||
|
68 | 68 | |
|
69 | 69 | Creates a repository. |
|
70 | 70 | |
|
71 | * If the repository name contains "/", all the required repository | |
|
|
72 | groups will be created. | |
|
71 | * If the repository name contains "/", repository will be created inside | |
|
72 | a repository group or nested repository groups | |
|
73 | 73 | |
|
74 | For example "foo/bar/baz" will create groups "foo" and "bar" | |
|
|
75 | (with "foo" as parent). It will also create the "baz" repository | |
|
76 | with "bar" as |repo| group. | |
|
74 | For example "foo/bar/repo1" will create |repo| called "repo1" inside | |
|
75 | group "foo/bar". You have to have permissions to access and write to | |
|
76 | the last repository group ("bar" in this example) | |
|
77 | 77 | |
|
78 | 78 | This command can only be run using an |authtoken| with at least |
|
79 | write permissions to the |repo|. | |
|
79 | permissions to create repositories, or write permissions to | |
|
80 | parent repository groups. | |
|
80 | 81 | |
|
81 | 82 | :param apiuser: This is filled automatically from the |authtoken|. |
|
82 | 83 | :type apiuser: AuthUser |
@@ -88,9 +89,9 b' create_repo' | |||
|
88 | 89 | :type owner: Optional(str) |
|
89 | 90 | :param description: Set the repository description. |
|
90 | 91 | :type description: Optional(str) |
|
91 | :param private: | |
|
92 | :param private: Set the repository as private. | |
|
92 | 93 | :type private: bool |
|
93 | :param clone_uri: | |
|
94 | :param clone_uri: Set the clone URI. | |
|
94 | 95 | :type clone_uri: str |
|
95 | 96 | :param landing_rev: <rev_type>:<rev> |
|
96 | 97 | :type landing_rev: str |
@@ -125,7 +126,7 b' create_repo' | |||
|
125 | 126 | id : <id_given_in_input> |
|
126 | 127 | result : null |
|
127 | 128 | error : { |
|
128 | 'failed to create repository `<repo_name>` | |
|
129 | 'failed to create repository `<repo_name>`' | |
|
129 | 130 | } |
|
130 | 131 | |
|
131 | 132 | |
@@ -164,25 +165,29 b' delete_repo' | |||
|
164 | 165 | fork_repo |
|
165 | 166 | --------- |
|
166 | 167 | |
|
167 | .. py:function:: fork_repo(apiuser, repoid, fork_name, owner=<Optional:<OptionalAttr:apiuser>>, description=<Optional:''>,
|
|
168 | .. py:function:: fork_repo(apiuser, repoid, fork_name, owner=<Optional:<OptionalAttr:apiuser>>, description=<Optional:''>, private=<Optional:False>, clone_uri=<Optional:None>, landing_rev=<Optional:'rev:tip'>, copy_permissions=<Optional:False>) | |
|
168 | 169 | |
|
169 | 170 | Creates a fork of the specified |repo|. |
|
170 | 171 | |
|
171 | * If using |RCE| with Celery this will immediately return a success | |
|
172 | message, even though the fork will be created asynchronously. | |
|
172 | * If the fork_name contains "/", fork will be created inside | |
|
173 | a repository group or nested repository groups | |
|
173 | 174 | |
|
174 | This command can only be run using an |authtoken| with fork | |
|
175 | permissions on the |repo|. | |
|
175 | For example "foo/bar/fork-repo" will create fork called "fork-repo" | |
|
176 | inside group "foo/bar". You have to have permissions to access and | |
|
177 | write to the last repository group ("bar" in this example) | |
|
178 | ||
|
179 | This command can only be run using an |authtoken| with at least | |
|
180 | read permissions to the forked repo, and create-fork permissions for the user. | |
|
176 | 181 | |
|
177 | 182 | :param apiuser: This is filled automatically from the |authtoken|. |
|
178 | 183 | :type apiuser: AuthUser |
|
179 | 184 | :param repoid: Set repository name or repository ID. |
|
180 | 185 | :type repoid: str or int |
|
181 | :param fork_name: Set the fork name. | |
|
186 | :param fork_name: Set the fork name, including its repository group membership. | |
|
182 | 187 | :type fork_name: str |
|
183 | 188 | :param owner: Set the fork owner. |
|
184 | 189 | :type owner: str |
|
185 | :param description: Set the fork descripton. | |
|
190 | :param description: Set the fork description. | |
|
186 | 191 | :type description: str |
|
187 | 192 | :param copy_permissions: Copy permissions from parent |repo|. The |
|
188 | 193 | default is False. |
@@ -729,7 +734,7 b' lock' | |||
|
729 | 734 | id : <id_given_in_input> |
|
730 | 735 | result : null |
|
731 | 736 | error : { |
|
732 | 'Error occurred locking repository `<reponame>` | |
|
737 | 'Error occurred locking repository `<reponame>`' | |
|
733 | 738 | } |
|
734 | 739 | |
|
735 | 740 | |
@@ -923,24 +928,31 b' strip' | |||
|
923 | 928 | update_repo |
|
924 | 929 | ----------- |
|
925 | 930 | |
|
926 | .. py:function:: update_repo(apiuser, repoid, name=<Optional:None>, owner=<Optional:<OptionalAttr:apiuser>>, group=<Optional:None>, fork_of=<Optional:None>, description=<Optional:''>, private=<Optional:False>, clone_uri=<Optional:None>, landing_rev=<Optional:'rev:tip'>, enable_statistics=<Optional:False>, enable_locking=<Optional:False>, enable_downloads=<Optional:False>, fields=<Optional:''>) | |
|
931 | .. py:function:: update_repo(apiuser, repoid, repo_name=<Optional:None>, owner=<Optional:<OptionalAttr:apiuser>>, description=<Optional:''>, private=<Optional:False>, clone_uri=<Optional:None>, landing_rev=<Optional:'rev:tip'>, fork_of=<Optional:None>, enable_statistics=<Optional:False>, enable_locking=<Optional:False>, enable_downloads=<Optional:False>, fields=<Optional:''>) | |
|
927 | 932 | |
|
928 | 933 | Updates a repository with the given information. |
|
929 | 934 | |
|
930 | 935 | This command can only be run using an |authtoken| with at least |
|
931 |
|
|
|
936 | admin permissions to the |repo|. | |
|
937 | ||
|
938 | * If the repository name contains "/", the repository will be updated | |
|
939 | accordingly inside a repository group or nested repository groups | |
|
940 | ||
|
941 | For example repoid=repo-test name="foo/bar/repo-test" will update |repo| | |
|
942 | called "repo-test" and place it inside group "foo/bar". | |
|
943 | You have to have permissions to access and write to the last repository | |
|
944 | group ("bar" in this example) | |
|
932 | 945 | |
|
933 | 946 | :param apiuser: This is filled automatically from the |authtoken|. |
|
934 | 947 | :type apiuser: AuthUser |
|
935 | 948 | :param repoid: repository name or repository ID. |
|
936 | 949 | :type repoid: str or int |
|
937 | :param name: Update the |repo| name
|
|
938 | :type name: str | |
|
950 | :param repo_name: Update the |repo| name, including the | |
|
951 | repository group it's in. | |
|
952 | :type repo_name: str | |
|
939 | 953 | :param owner: Set the |repo| owner. |
|
940 | 954 | :type owner: str |
|
941 | :param group:
|
|
942 | :type group: str | |
|
943 | :param fork_of: Set the master |repo| name. | |
|
955 | :param fork_of: Set the |repo| as fork of another |repo|. | |
|
944 | 956 | :type fork_of: str |
|
945 | 957 | :param description: Update the |repo| description. |
|
946 | 958 | :type description: str |
@@ -948,16 +960,13 b' update_repo' | |||
|
948 | 960 | :type private: bool |
|
949 | 961 | :param clone_uri: Update the |repo| clone URI. |
|
950 | 962 | :type clone_uri: str |
|
951 | :param landing_rev: Set the |repo| landing revision. Default is | |
|
952 | ``tip``. | |
|
963 | :param landing_rev: Set the |repo| landing revision. Default is ``rev:tip``. | |
|
953 | 964 | :type landing_rev: str |
|
954 | :param enable_statistics: Enable statistics on the |repo|, | |
|
955 | (True | False). | |
|
965 | :param enable_statistics: Enable statistics on the |repo|, (True | False). | |
|
956 | 966 | :type enable_statistics: bool |
|
957 | 967 | :param enable_locking: Enable |repo| locking. |
|
958 | 968 | :type enable_locking: bool |
|
959 | :param enable_downloads: Enable downloads from the |repo|, | |
|
960 | (True | False). | |
|
969 | :param enable_downloads: Enable downloads from the |repo|, (True | False). | |
|
961 | 970 | :type enable_downloads: bool |
|
962 | 971 | :param fields: Add extra fields to the |repo|. Use the following |
|
963 | 972 | example format: ``field_key=field_val,field_key2=fieldval2``. |
@@ -1,7 +1,7 b'' | |||
|
1 | 1 | .. _server-methods-ref: |
|
2 | 2 | |
|
3 | 3 | server methods |
|
4 | ==============
|
|
4 | ============== | |
|
5 | 5 | |
|
6 | 6 | get_ip |
|
7 | 7 | ------ |
@@ -1,7 +1,7 b'' | |||
|
1 | 1 | .. _user-group-methods-ref: |
|
2 | 2 | |
|
3 | 3 | user_group methods |
|
4 | ================= | |
|
4 | ================== | |
|
5 | 5 | |
|
6 | 6 | add_user_to_user_group |
|
7 | 7 | ---------------------- |
@@ -1,12 +1,12 b'' | |||
|
1 | 1 | .. _user-methods-ref: |
|
2 | 2 | |
|
3 | 3 | user methods |
|
4 | ============
|
|
4 | ============ | |
|
5 | 5 | |
|
6 | 6 | create_user |
|
7 | 7 | ----------- |
|
8 | 8 | |
|
9 | .. py:function:: create_user(apiuser, username, email, password=<Optional:''>, firstname=<Optional:''>, lastname=<Optional:''>, active=<Optional:True>, admin=<Optional:False>, extern_name=<Optional:'rhodecode'>, extern_type=<Optional:'rhodecode'>, force_password_change=<Optional:False>) | |
|
9 | .. py:function:: create_user(apiuser, username, email, password=<Optional:''>, firstname=<Optional:''>, lastname=<Optional:''>, active=<Optional:True>, admin=<Optional:False>, extern_name=<Optional:'rhodecode'>, extern_type=<Optional:'rhodecode'>, force_password_change=<Optional:False>, create_personal_repo_group=<Optional:None>) | |
|
10 | 10 | |
|
11 | 11 | Creates a new user and returns the new user object. |
|
12 | 12 | |
@@ -39,7 +39,8 b' create_user' | |||
|
39 | 39 | :param force_password_change: Force the new user to change password |
|
40 | 40 | on next login. |
|
41 | 41 | :type force_password_change: Optional(``True`` | ``False``) |
|
42 | ||
|
42 | :param create_personal_repo_group: Create personal repo group for this user | |
|
43 | :type create_personal_repo_group: Optional(``True`` | ``False``) | |
|
43 | 44 | Example output: |
|
44 | 45 | |
|
45 | 46 | .. code-block:: bash |
@@ -163,6 +164,7 b' get_user' | |||
|
163 | 164 | "usergroup.read", |
|
164 | 165 | "hg.repogroup.create.false", |
|
165 | 166 | "hg.create.none", |
|
167 | "hg.password_reset.enabled", | |
|
166 | 168 | "hg.extern_activate.manual", |
|
167 | 169 | "hg.create.write_on_repogroup.false", |
|
168 | 170 | "hg.usergroup.create.false", |
@@ -1,7 +1,7 b'' | |||
|
1 | 1 | |
|
2 | ===== | |
|
3 | API | |
|
4 | ===== | |
|
2 | =================== | |
|
3 | CONTRIBUTING TO API | |
|
4 | =================== | |
|
5 | 5 | |
|
6 | 6 | |
|
7 | 7 |
@@ -130,7 +130,7 b' is a very small pencil which has to be c' | |||
|
130 | 130 | ticket. |
|
131 | 131 | |
|
132 | 132 | |
|
133 | .. figure:: images/redmine-description.png | |
|
133 | .. figure:: ../images/redmine-description.png | |
|
134 | 134 | :alt: Example of pencil to change the ticket description |
|
135 | 135 | |
|
136 | 136 | Shows an example of the pencil which lets you change the description. |
@@ -9,9 +9,6 b'' | |||
|
9 | 9 | .. Avoid duplicating the quickstart instructions by importing the README |
|
10 | 10 | file. |
|
11 | 11 | |
|
12 | .. include:: ../../../acceptance_tests/README.rst | |
|
13 | ||
|
14 | ||
|
15 | 12 |
|
|
16 | 13 | Choices of technology and tools |
|
17 | 14 | =============================== |
@@ -88,10 +88,10 b' let' | |||
|
88 | 88 | }; |
|
89 | 89 | |
|
90 | 90 | Sphinx = buildPythonPackage (rec { |
|
91 | name = "Sphinx-1.4.
|
|
91 | name = "Sphinx-1.4.8"; | |
|
92 | 92 | src = fetchurl { |
|
93 | url = "https://pypi.python.org/packages/
|
|
94 | md5 = "64ce2ec08d37ed56313a98232cbe2aee"; | |
|
93 | url = "https://pypi.python.org/packages/1f/f6/e54a7aad73e35232356103771ae76306dadd8546b024c646fbe75135571c/${name}.tar.gz"; | |
|
94 | md5 = "5ec718a4855917e149498bba91b74e67"; | |
|
95 | 95 | }; |
|
96 | 96 | propagatedBuildInputs = [ |
|
97 | 97 | docutils |
@@ -20,8 +20,10 b' and commit files and |repos| while manag' | |||
|
20 | 20 | * Migration from existing databases. |
|
21 | 21 | * |RCM| SDK. |
|
22 | 22 | * Built-in analytics |
|
23 | * Built-in integrations including: Slack, Jenkins, Webhooks, Jira, Redmine, Hipchat | |
|
23 | 24 | * Pluggable authentication system. |
|
24 | * Support for |LDAP|, Crowd, CAS, PAM. | |
|
25 | * Support for AD, |LDAP|, Crowd, CAS, PAM. | |
|
26 | * Support for external authentication via OAuth: Google, GitHub, Bitbucket, Twitter. | |
|
25 | 27 | * Debug modes of operation. |
|
26 | 28 | * Private and public gists. |
|
27 | 29 | * Gists with limited lifetimes and within instance only sharing. |
@@ -50,3 +50,4 b' See pages specific to each type of integ' | |||
|
50 | 50 | redmine |
|
51 | 51 | jira |
|
52 | 52 | webhook |
|
53 |
@@ -7,6 +7,16 b' The Webhook integration allows you to PO' | |||
|
7 | 7 | or pull requests to a custom http endpoint as a json dict with details of the |
|
8 | 8 | event. |
|
9 | 9 | |
|
10 | Starting from the 4.5.0 release, the webhook integration allows you to use | |
|
11 | variables inside the URL. For example, in the URL `https://server-example.com/${repo_name}` | |
|
12 | ${repo_name} will be replaced with the name of the repository from which the | |
|
13 | event is triggered. Some of the variables, like | |
|
14 | `${branch}`, will result in the webhook being called multiple times when | |
|
15 | multiple branches are pushed. | |
|
16 | ||
|
17 | Some of the variables like `${pull_request_id}` will be replaced only in | |
|
18 | pull request related events. | |
|
19 | ||
|
10 | 20 | To create a webhook integration, select "webhook" in the integration settings |
|
11 | and use the
|
|
12 | :ref:`creating-integrations` for additional instructions. No newline at end of file | |
|
21 | and use the URL and key from any previous custom webhook you created. See | |
|
22 | :ref:`creating-integrations` for additional instructions. |
@@ -7,38 +7,6 b' Release Date' | |||
|
7 | 7 | - 2016-08-12 |
|
8 | 8 | |
|
9 | 9 | |
|
10 | General | |
|
11 | ^^^^^^^ | |
|
12 | ||
|
13 | - Subversion: detect requests also based on magic path. | |
|
14 | This adds subversion 1.9 support for SVN backend. | |
|
15 | - Summary/changelog: unified how data is displayed for those pages. | |
|
16 | * use consistent order of columns | |
|
17 | * fix the link to commit status | |
|
18 | * fix order of displaying comments | |
|
19 | - Live-chat: refactor live chat system for code review based on | |
|
20 | latest channelstream changes. | |
|
21 | - SVN: Add template to generate the apache mod_dav_svn config for all | |
|
22 | repository groups. Repository groups can now be automatically mapped to be | |
|
23 | supported by SVN backend. Set `svn.proxy.generate_config = true` and similar | |
|
24 | options found inside .ini config. | |
|
25 | - Readme/markup: improved order of generating readme files. Fixes #4050 | |
|
26 | * we now use order based on default system renderer | |
|
27 | * having multiple readme files will pick correct one as set renderer | |
|
28 | - Api: add a max_file_bytes parameter to get_nodes so that large files | |
|
29 | can be skipped. | |
|
30 | - Auth-ldap: added flag to set debug mode for LDAP connections. | |
|
31 | - Labs: moved rebase-merge option from labs settings into VCS settings. | |
|
32 | - System: send platform type and version to upgrade endpoint when checking | |
|
33 | for new versions. | |
|
34 | - Packaging: update rhodecode-tools from 0.8.3 to 0.10.0 | |
|
35 | - Packaging: update codemirror from 5.4.0 to 5.11.0 | |
|
36 | - Packaging: updated pygments to 2.1.3 | |
|
37 | - Packaging: bumped supervisor to 3.3.0 | |
|
38 | - Packaging: bumped psycopg2 to 2.6.1 | |
|
39 | - Packaging: bumped mercurial to 3.8.4 | |
|
40 | ||
|
41 | ||
|
42 | 10 | New Features |
|
43 | 11 | ^^^^^^^^^^^^ |
|
44 | 12 | |
@@ -64,6 +32,38 b' New Features' | |||
|
64 | 32 | onto comments you submitted while doing a code-review. |
|
65 | 33 | |
|
66 | 34 | |
|
35 | General | |
|
36 | ^^^^^^^ | |
|
37 | ||
|
38 | - Subversion: detect requests also based on magic path. | |
|
39 | This adds subversion 1.9 support for SVN backend. | |
|
40 | - Summary/changelog: unified how data is displayed for those pages. | |
|
41 | * use consistent order of columns | |
|
42 | * fix the link to commit status | |
|
43 | * fix order of displaying comments | |
|
44 | - Live chat: refactor live chat system for code review based on | |
|
45 | latest channelstream changes. | |
|
46 | - SVN: Add template to generate the apache mod_dav_svn config for all | |
|
47 | repository groups. Repository groups can now be automatically mapped to be | |
|
48 | supported by SVN backend. Set `svn.proxy.generate_config = true` and similar | |
|
49 | options found inside .ini config. | |
|
50 | - Readme/markup: improved order of generating readme files. Fixes #4050 | |
|
51 | * we now use order based on default system renderer | |
|
52 | * having multiple readme files will pick correct one as set renderer | |
|
53 | - Api: add a max_file_bytes parameter to get_nodes so that large files | |
|
54 | can be skipped. | |
|
55 | - Auth-ldap: added flag to set debug mode for LDAP connections. | |
|
56 | - Labs: moved rebase-merge option from labs settings into VCS settings. | |
|
57 | - System: send platform type and version to upgrade endpoint when checking | |
|
58 | for new versions. | |
|
59 | - Packaging: update rhodecode-tools from 0.8.3 to 0.10.0 | |
|
60 | - Packaging: update codemirror from 5.4.0 to 5.11.0 | |
|
61 | - Packaging: updated pygments to 2.1.3 | |
|
62 | - Packaging: bumped supervisor to 3.3.0 | |
|
63 | - Packaging: bumped psycopg2 to 2.6.1 | |
|
64 | - Packaging: bumped mercurial to 3.8.4 | |
|
65 | ||
|
66 | ||
|
67 | 67 | Security |
|
68 | 68 | ^^^^^^^^ |
|
69 | 69 | |
@@ -105,7 +105,7 b' Fixes' | |||
|
105 | 105 | support to gevent compatible handling. |
|
106 | 106 | - Diff2way: fixed unicode problem on non-ascii files. |
|
107 | 107 | - Full text search: whoosh schema uses now bigger ints, fixes #4035 |
|
108 | - File
|
|
108 | - File browser: optimized cached tree calculation, reduced load times by | |
|
109 | 109 | 50% on complex file trees. |
|
110 | 110 | - Styling: #4086 fixing bug where long commit messages did not wrap in file view. |
|
111 | 111 | - SVN: Ignore the content length header from response, fixes #4112. |
@@ -6,6 +6,27 b' Release Date' | |||
|
6 | 6 | |
|
7 | 7 | - 2016-08-23 |
|
8 | 8 | |
|
9 | ||
|
10 | New Features | |
|
11 | ^^^^^^^^^^^^ | |
|
12 | ||
|
13 | ||
|
14 | ||
|
15 | General | |
|
16 | ^^^^^^^ | |
|
17 | ||
|
18 | ||
|
19 | ||
|
20 | Security | |
|
21 | ^^^^^^^^ | |
|
22 | ||
|
23 | ||
|
24 | ||
|
25 | Performance | |
|
26 | ^^^^^^^^^^^ | |
|
27 | ||
|
28 | ||
|
29 | ||
|
9 | 30 | Fixes |
|
10 | 31 | ^^^^^ |
|
11 | 32 |
@@ -7,18 +7,6 b' Release Date' | |||
|
7 | 7 | - 2016-09-16 |
|
8 | 8 | |
|
9 | 9 | |
|
10 | General | |
|
11 | ^^^^^^^ | |
|
12 | ||
|
13 | - UI: introduced Polymer webcomponents into core application. RhodeCode will | |
|
14 | be now shipped together with Polymer framework webcomponents. Most of | |
|
15 | dynamic UI components that require large amounts of interaction | |
|
16 | will be done now with Polymer. | |
|
17 | - live-notifications: use rhodecode-toast for live notifications instead of | |
|
18 | toastr jquery plugin. | |
|
19 | - Svn: moved svn http support out of labs settings. It's tested and stable now. | |
|
20 | ||
|
21 | ||
|
22 | 10 | New Features |
|
23 | 11 | ^^^^^^^^^^^^ |
|
24 | 12 | |
@@ -29,11 +17,11 b' New Features' | |||
|
29 | 17 | It will allow to configure exactly which projects use which integrations. |
|
30 | 18 | - Integrations: show branches/commits separately when posting push events |
|
31 | 19 | to hipchat/slack, fixes #4192. |
|
32 | - Pull
|
|
20 | - Pull requests: summary page now shows update dates for pull request to | |
|
33 | 21 | more easily see which ones were recently updated.
|
34 | 22 | - UI: hidden inline comments will be shown in side view when browsing the diffs |
|
35 | 23 | - Diffs: added inline comments toggle into pull requests diff view. #2884 |
|
36 | - Live
|
|
24 | - Live chat: added summon reviewers functionality. You can now request | |
|
37 | 25 | presence from online users into a chat for collaborative code-review. |
|
38 | 26 | This requires channelstream to be enabled. |
|
39 | 27 | - UX: added a static 502 page for gateway error. Once configured via |
@@ -41,6 +29,18 b' New Features' | |||
|
41 | 29 | backend servers are offline. Fixes #4202. |
|
42 | 30 | |
|
43 | 31 | |
|
32 | General | |
|
33 | ^^^^^^^ | |
|
34 | ||
|
35 | - UI: introduced Polymer webcomponents into core application. RhodeCode will | |
|
36 | be now shipped together with Polymer framework webcomponents. Most of | |
|
37 | dynamic UI components that require large amounts of interaction | |
|
38 | will be done now with Polymer. | |
|
39 | - Live notifications: use rhodecode-toast for live notifications instead of | |
|
40 | toastr jquery plugin. | |
|
41 | - Svn: moved svn http support out of labs settings. It's tested and stable now. | |
|
42 | ||
|
43 | ||
|
44 | 44 | Security |
|
45 | 45 | ^^^^^^^^ |
|
46 | 46 | |
@@ -67,12 +67,12 b' Fixes' | |||
|
67 | 67 | match rest of ui, fixes: #4200. |
|
68 | 68 | - UX: show multiple tags/branches in changelog/summary instead of |
|
69 | 69 | truncating them. |
|
70 | - My
|
|
70 | - My account: fix test notifications for IE10+ | |
|
71 | 71 | - Vcs: change way refs are retrieved for git so same name branch/tags and |
|
72 | 72 | remotes can be supported, fixes #298. |
|
73 | 73 | - Lexers: added small extensions table to extend syntax highlighting for file |
|
74 | 74 | sources. Fixes #4227. |
|
75 | 75 | - Search: fix bug where file path link was wrong when the repository name was |
|
76 | 76 | in the file path, fixes #4228 |
|
77 | -
|
|
77 | - Pagination: fixed INT overflow bug. | |
|
78 | 78 | - Events: send pushed commits always in the correct order.
@@ -7,19 +7,18 b' Release Date' | |||
|
7 | 7 | - 2016-09-27 |
|
8 | 8 | |
|
9 | 9 | |
|
10 | New Features | |
|
11 | ^^^^^^^^^^^^ | |
|
12 | ||
|
13 | ||
|
10 | 14 | General |
|
11 | 15 | ^^^^^^^ |
|
12 | 16 | |
|
13 | -
|
|
17 | - Channelstream: auto-generate the url to channelstream server if it's not | |
|
14 | 18 | explicitly defined in the config. It allows to use a relative server |
|
15 | 19 | without knowing its address upfront. |
|
16 | 20 | |
|
17 | 21 | |
|
18 | New Features | |
|
19 | ^^^^^^^^^^^^ | |
|
20 | ||
|
21 | ||
|
22 | ||
|
23 | 22 | Security |
|
24 | 23 | ^^^^^^^^ |
|
25 | 24 |
@@ -7,21 +7,21 b' Release Date' | |||
|
7 | 7 | - 2016-10-17 |
|
8 | 8 | |
|
9 | 9 | |
|
10 | General | |
|
11 | ^^^^^^^ | |
|
12 | ||
|
13 | - packaging: pinned against rhodecode-tools 0.10.1 | |
|
14 | ||
|
15 | ||
|
16 | 10 | New Features |
|
17 | 11 | ^^^^^^^^^^^^ |
|
18 | 12 | |
|
19 | 13 | |
|
20 | 14 | |
|
15 | General | |
|
16 | ^^^^^^^ | |
|
17 | ||
|
18 | - Packaging: pinned against rhodecode-tools 0.10.1 | |
|
19 | ||
|
20 | ||
|
21 | 21 | Security |
|
22 | 22 | ^^^^^^^^ |
|
23 | 23 | |
|
24 | -
|
|
24 | - Integrations: fix 500 error on integrations page when delegated admin | |
|
25 | 25 | tried to access integration page after adding some integrations. |
|
26 | 26 | Permission checks were too strict for delegated admins.
|
27 | 27 | |
@@ -34,8 +34,8 b' Performance' | |||
|
34 | 34 | Fixes |
|
35 | 35 | ^^^^^ |
|
36 | 36 | |
|
37 | -
|
|
37 | - Vcsserver: make sure we correctly ping against bundled HG/GIT/SVN binaries. | |
|
38 | 38 | This should fix a problem where system binaries could be used accidentally |
|
39 | 39 | by the RhodeCode. |
|
40 | -
|
|
40 | - LDAP: fixed email extraction issues. Empty email addresses from LDAP server | |
|
41 | 41 | will no longer take precedence over those stored inside RhodeCode database. |
@@ -9,6 +9,7 b' Release Notes' | |||
|
9 | 9 | .. toctree:: |
|
10 | 10 | :maxdepth: 1 |
|
11 | 11 | |
|
12 | release-notes-4.5.0.rst | |
|
12 | 13 | release-notes-4.4.2.rst |
|
13 | 14 | release-notes-4.4.1.rst |
|
14 | 15 | release-notes-4.4.0.rst |
@@ -4,7 +4,7 b'' | |||
|
4 | 4 | Scaling Best Practices |
|
5 | 5 | ====================== |
|
6 | 6 | |
|
7 | When deploying |RCE| at scale; 100s of users, multiple instances, CI servers, | |
|
7 | When deploying |RCE| at scale; 1000s of users, multiple instances, CI servers, | |
|
8 | 8 | there are a number of steps you can take to ensure you are getting the |
|
9 | 9 | most out of your system. |
|
10 | 10 | |
@@ -15,20 +15,23 b' You can configure multiple |RCE| instanc' | |||
|
15 | 15 | set of |repos|. This lets users work on an instance that has less traffic |
|
16 | 16 | than those being hit by CI servers. To configure this, use |RCC| to install |
|
17 | 17 | multiple instances and configure the database and |repos| connection. If you |
|
18 | do need to reset the database connection, see the | |
|
18 | do need to reset/adjust the database connection, see the | |
|
19 | 19 | :ref:`config-database` section. |
|
20 | 20 | |
|
21 | Once configured, set your CI servers to use a particular instance and for | |
|
22 | user specific instances you can configure loads balancing. See the | |
|
23 | :ref:`nginx-ws-ref` section for examples. | |
|
21 | You can then configure a load-balancer to balance the traffic between the | |
|
22 | dedicated CI instance and the instance that end users use. | |
|
23 | See the :ref:`nginx-ws-ref` section for examples on how to do it in NGINX. | |
|
24 | 24 | |
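A minimal sketch of such a split in NGINX, assuming two instances listening on ports 10002 (end users) and 10003 (CI); names, ports, and paths are illustrative, see :ref:`nginx-ws-ref` for complete examples:

.. code-block:: bash

    # one upstream per instance; point CI jobs at the CI server block only
    cat > /etc/nginx/conf.d/rhodecode-split.conf <<'EOF'
    upstream rc_users { server 127.0.0.1:10002; }
    upstream rc_ci    { server 127.0.0.1:10003; }
    EOF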
|
25 | 25 | Switch to Database Sessions |
|
26 | 26 | --------------------------- |
|
27 | 27 | |
|
28 | To increase |RCE| performance in a | |
|
|
29 | large scale deployment, we recommend switching from file-based | |
|
30 | sessions to database-based user sessions. For configuration details, see the | |
|
31 | :ref:`db-session-ref` section. | |
|
28 | To increase |RCE| performance, switch from the default file-based sessions to | |
|
29 | database-based ones. This way, your distributed instances do not need to | |
|
30 | share file storage in order to use sessions. | |
|
31 | Database-based sessions have an additional advantage over file-based | |
|
32 | ones: they don't need a periodic cleanup, as the session library | |
|
33 | cleans them up automatically. For configuration details, | |
|
34 | see the :ref:`db-session-ref` section. | |
|
32 | 35 | |
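A minimal sketch of the relevant :file:`rhodecode.ini` options, assuming beaker-backed sessions and a PostgreSQL database (the connection URL is a placeholder; see :ref:`db-session-ref` for the authoritative settings):

.. code-block:: bash

    # switch the session backend from files to the database
    cat >> rhodecode.ini <<'EOF'
    beaker.session.type = ext:database
    beaker.session.table_name = db_session
    beaker.session.sa.url = postgresql://user:pass@localhost/rhodecode
    EOF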
|
33 | 36 | Tuning |RCE| |
|
34 | 37 | ------------ |
@@ -6,7 +6,9 b'' | |||
|
6 | 6 | }, |
|
7 | 7 | "js": { |
|
8 | 8 | "src": "rhodecode/public/js/src", |
|
9 | "dest": "rhodecode/public/js" | |
|
9 | "dest": "rhodecode/public/js", | |
|
10 | "bower": "bower_components", | |
|
11 | "node_modules": "node_modules" | |
|
10 | 12 | } |
|
11 | 13 | }, |
|
12 | 14 | "copy": { |
@@ -34,7 +36,8 b'' | |||
|
34 | 36 | "<%= dirs.js.src %>/bootstrap.js", |
|
35 | 37 | "<%= dirs.js.src %>/mousetrap.js", |
|
36 | 38 | "<%= dirs.js.src %>/moment.js", |
|
37 | "<%= dirs.js.s
|
|
39 | "<%= dirs.js.node_modules %>/appenlight-client/appenlight-client.min.js", | |
|
40 | "<%= dirs.js.node_modules %>/favico.js/favico-0.3.10.min.js", | |
|
38 | 41 | "<%= dirs.js.src %>/i18n_utils.js", |
|
39 | 42 | "<%= dirs.js.src %>/deform.js", |
|
40 | 43 | "<%= dirs.js.src %>/plugins/jquery.pjax.js", |
@@ -64,7 +67,6 b'' | |||
|
64 | 67 | "<%= dirs.js.src %>/rhodecode/utils/ie.js", |
|
65 | 68 | "<%= dirs.js.src %>/rhodecode/utils/os.js", |
|
66 | 69 | "<%= dirs.js.src %>/rhodecode/utils/topics.js", |
|
67 | "<%= dirs.js.src %>/rhodecode/widgets/multiselect.js", | |
|
68 | 70 | "<%= dirs.js.src %>/rhodecode/init.js", |
|
69 | 71 | "<%= dirs.js.src %>/rhodecode/codemirror.js", |
|
70 | 72 | "<%= dirs.js.src %>/rhodecode/comments.js", |
@@ -144,7 +146,9 b'' | |||
|
144 | 146 | "less:development", |
|
145 | 147 | "less:components", |
|
146 | 148 | "concat:polymercss", |
|
147 | "vulcanize" | |
|
149 | "vulcanize", | |
|
150 | "crisper", | |
|
151 | "concat:dist" | |
|
148 | 152 | ] |
|
149 | 153 | }, |
|
150 | 154 | "js": { |
@@ -13,6 +13,8 b'' | |||
|
13 | 13 | "grunt-crisper": "^1.0.1", |
|
14 | 14 | "grunt-vulcanize": "^1.0.0", |
|
15 | 15 | "jshint": "^2.9.1-rc3", |
|
16 | "bower": "^1.7.9" | |
|
16 | "bower": "^1.7.9", | |
|
17 | "favico.js": "^0.3.10", | |
|
18 | "appenlight-client": "git+https://git@github.com/AppEnlight/appenlight-client-js.git#0.5.0" | |
|
17 | 19 | } |
|
18 | 20 | } |
@@ -4,10 +4,12 b' buildEnv { name = "bower-env"; ignoreCol' | |||
|
4 | 4 | (fetchbower "polymer" "Polymer/polymer#1.6.1" "Polymer/polymer#^1.6.1" "09mm0jgk457gvwqlc155swch7gjr6fs3g7spnvhi6vh5b6518540") |
|
5 | 5 | (fetchbower "paper-button" "PolymerElements/paper-button#1.0.13" "PolymerElements/paper-button#^1.0.13" "0i3y153nqk06pn0gk282vyybnl3g1w3w41d5i9z659cgn27g3fvm") |
|
6 | 6 | (fetchbower "paper-spinner" "PolymerElements/paper-spinner#1.2.0" "PolymerElements/paper-spinner#^1.2.0" "1av1m6y81jw3hjhz1yqy3rwcgxarjzl58ldfn4q6sn51pgzngfqb") |
|
7 | (fetchbower "paper-tooltip" "PolymerElements/paper-tooltip#1.1.

7 | (fetchbower "paper-tooltip" "PolymerElements/paper-tooltip#1.1.3" "PolymerElements/paper-tooltip#^1.1.2" "0vmrm1n8k9sk9nvqy03q177axy22pia6i3j1gxbk72j3pqiqvg6k") | |
|
8 | 8 | (fetchbower "paper-toast" "PolymerElements/paper-toast#1.3.0" "PolymerElements/paper-toast#^1.3.0" "0x9rqxsks5455s8pk4aikpp99ijdn6kxr9gvhwh99nbcqdzcxq1m") |
|
9 | 9 | (fetchbower "paper-toggle-button" "PolymerElements/paper-toggle-button#1.2.0" "PolymerElements/paper-toggle-button#^1.2.0" "0mphcng3ngspbpg4jjn0mb91nvr4xc1phq3qswib15h6sfww1b2w") |
|
10 | 10 | (fetchbower "iron-ajax" "PolymerElements/iron-ajax#1.4.3" "PolymerElements/iron-ajax#^1.4.3" "0m3dx27arwmlcp00b7n516sc5a51f40p9vapr1nvd57l3i3z0pzm") |
|
11 | (fetchbower "iron-autogrow-textarea" "PolymerElements/iron-autogrow-textarea#1.0.13" "PolymerElements/iron-autogrow-textarea#^1.0.13" "0zwhpl97vii1s8k0lgain8i9dnw29b0mxc5ixdscx9las13n2lqq") | |
|
12 | (fetchbower "iron-a11y-keys" "PolymerElements/iron-a11y-keys#1.0.6" "PolymerElements/iron-a11y-keys#^1.0.6" "1xz3mgghfcxixq28sdb654iaxj4nyi1bzcwf77ydkms6fviqs9mv") | |
|
11 | 13 | (fetchbower "iron-flex-layout" "PolymerElements/iron-flex-layout#1.3.1" "PolymerElements/iron-flex-layout#^1.0.0" "0nswv3ih3bhflgcd2wjfmddqswzgqxb2xbq65jk9w3rkj26hplbl") |
|
12 | 14 | (fetchbower "paper-behaviors" "PolymerElements/paper-behaviors#1.0.12" "PolymerElements/paper-behaviors#^1.0.0" "012bqk97awgz55cn7rm9g7cckrdhkqhls3zvp8l6nd4rdwcrdzq8") |
|
13 | 15 | (fetchbower "paper-material" "PolymerElements/paper-material#1.0.6" "PolymerElements/paper-material#^1.0.0" "0rljmknfdbm5aabvx9pk77754zckj3l127c3mvnmwkpkkr353xnh") |
@@ -19,13 +21,13 b' buildEnv { name = "bower-env"; ignoreCol' | |||
|
19 | 21 | (fetchbower "iron-checked-element-behavior" "PolymerElements/iron-checked-element-behavior#1.0.5" "PolymerElements/iron-checked-element-behavior#^1.0.0" "0l0yy4ah454s8bzfv076s8by7h67zy9ni6xb932qwyhx8br6c1m7") |
|
20 | 22 | (fetchbower "promise-polyfill" "polymerlabs/promise-polyfill#1.0.1" "polymerlabs/promise-polyfill#^1.0.0" "045bj2caav3famr5hhxgs1dx7n08r4s46mlzwb313vdy17is38xb") |
|
21 | 23 | (fetchbower "iron-behaviors" "PolymerElements/iron-behaviors#1.0.17" "PolymerElements/iron-behaviors#^1.0.0" "021qvkmbk32jrrmmphpmwgby4bzi5jyf47rh1bxmq2ip07ly4bpr") |
|
24 | (fetchbower "iron-validatable-behavior" "PolymerElements/iron-validatable-behavior#1.1.1" "PolymerElements/iron-validatable-behavior#^1.0.0" "1yhxlvywhw2klbbgm3f3cmanxfxggagph4ii635zv0c13707wslv") | |
|
25 | (fetchbower "iron-form-element-behavior" "PolymerElements/iron-form-element-behavior#1.0.6" "PolymerElements/iron-form-element-behavior#^1.0.0" "0rdhxivgkdhhz2yadgdbjfc70l555p3y83vjh8rfj5hr0asyn6q1") | |
|
26 | (fetchbower "iron-a11y-keys-behavior" "polymerelements/iron-a11y-keys-behavior#1.1.9" "polymerelements/iron-a11y-keys-behavior#^1.0.0" "1imm4gc84qizihhbyhfa8lwjh3myhj837f79i5m04xjgwrjmkaf6") | |
|
22 | 27 | (fetchbower "paper-ripple" "PolymerElements/paper-ripple#1.0.8" "PolymerElements/paper-ripple#^1.0.0" "0r9sq8ik7wwrw0qb82c3rw0c030ljwd3s466c9y4qbcrsbvfjnns") |
|
23 | 28 | (fetchbower "font-roboto" "PolymerElements/font-roboto#1.0.1" "PolymerElements/font-roboto#^1.0.1" "02jz43r0wkyr3yp7rq2rc08l5cwnsgca9fr54sr4rhsnl7cjpxrj") |
|
24 | 29 | (fetchbower "iron-meta" "PolymerElements/iron-meta#1.1.2" "PolymerElements/iron-meta#^1.0.0" "1wl4dx8fnsknw9z9xi8bpc4cy9x70c11x4zxwxnj73hf3smifppl") |
|
25 | 30 | (fetchbower "iron-resizable-behavior" "PolymerElements/iron-resizable-behavior#1.0.5" "PolymerElements/iron-resizable-behavior#^1.0.0" "1fd5zmbr2hax42vmcasncvk7lzi38fmb1kyii26nn8pnnjak7zkn") |
|
26 | 31 | (fetchbower "iron-selector" "PolymerElements/iron-selector#1.5.2" "PolymerElements/iron-selector#^1.0.0" "1ajv46llqzvahm5g6g75w7nfyjcslp53ji0wm96l2k94j87spv3r") |
|
27 | 32 | (fetchbower "web-animations-js" "web-animations/web-animations-js#2.2.2" "web-animations/web-animations-js#^2.2.0" "1izfvm3l67vwys0bqbhidi9rqziw2f8wv289386sc6jsxzgkzhga") |
|
28 | (fetchbower "iron-a11y-keys-behavior" "PolymerElements/iron-a11y-keys-behavior#1.1.7" "PolymerElements/iron-a11y-keys-behavior#^1.0.0" "070z46dbbz242002gmqrgy28x0y1fcqp9hnvbi05r3zphiqfx3l7") | |
|
29 | (fetchbower "iron-validatable-behavior" "PolymerElements/iron-validatable-behavior#1.1.1" "PolymerElements/iron-validatable-behavior#^1.0.0" "1yhxlvywhw2klbbgm3f3cmanxfxggagph4ii635zv0c13707wslv") | |
|
30 | (fetchbower "iron-form-element-behavior" "PolymerElements/iron-form-element-behavior#1.0.6" "PolymerElements/iron-form-element-behavior#^1.0.0" "0rdhxivgkdhhz2yadgdbjfc70l555p3y83vjh8rfj5hr0asyn6q1") | |
|
31 | 33 | ]; } |
@@ -103,6 +103,34 b' let' | |||
|
103 | 103 | sha1 = "a2e14ff85c2d6bf8c8080e5aa55129ebc6a2d320"; |
|
104 | 104 | }; |
|
105 | 105 | }; |
|
106 | "bower-1.7.9" = { | |
|
107 | name = "bower"; | |
|
108 | packageName = "bower"; | |
|
109 | version = "1.7.9"; | |
|
110 | src = fetchurl { | |
|
111 | url = "https://registry.npmjs.org/bower/-/bower-1.7.9.tgz"; | |
|
112 | sha1 = "b7296c2393e0d75edaa6ca39648132dd255812b0"; | |
|
113 | }; | |
|
114 | }; | |
|
115 | "favico.js-0.3.10" = { | |
|
116 | name = "favico.js"; | |
|
117 | packageName = "favico.js"; | |
|
118 | version = "0.3.10"; | |
|
119 | src = fetchurl { | |
|
120 | url = "https://registry.npmjs.org/favico.js/-/favico.js-0.3.10.tgz"; | |
|
121 | sha1 = "80586e27a117f24a8d51c18a99bdc714d4339301"; | |
|
122 | }; | |
|
123 | }; | |
|
124 | "appenlight-client-git+https://git@github.com/AppEnlight/appenlight-client-js.git#0.5.0" = { | |
|
125 | name = "appenlight-client"; | |
|
126 | packageName = "appenlight-client"; | |
|
127 | version = "0.5.0"; | |
|
128 | src = fetchgit { | |
|
129 | url = "https://git@github.com/AppEnlight/appenlight-client-js.git"; | |
|
130 | rev = "b1d6853345dc3e96468b34537810b3eb77e0764f"; | |
|
131 | sha256 = "2ef00aef7dafdecdc1666d2e83fc190a796849985d04a8f0fad148d64aa4f8db"; | |
|
132 | }; | |
|
133 | }; | |
|
106 | 134 | "async-0.1.22" = { |
|
107 | 135 | name = "async"; |
|
108 | 136 | packageName = "async"; |
@@ -301,13 +329,13 b' let' | |||
|
301 | 329 | sha1 = "fadd834b9683073da179b3eae6d9c0d15053f73e"; |
|
302 | 330 | }; |
|
303 | 331 | }; |
|
304 | "inherits-2.0.

332 | "inherits-2.0.3" = { | |
|
305 | 333 | name = "inherits"; |
|
306 | 334 | packageName = "inherits"; |
|
307 | version = "2.0.

335 | version = "2.0.3"; | |
|
308 | 336 | src = fetchurl { |
|
309 | url = "https://registry.npmjs.org/inherits/-/inherits-2.0.

310 | sha1 = "b17d08d326b4423e568eff719f91b0b1cbdf69f1"; | |
|
337 | url = "https://registry.npmjs.org/inherits/-/inherits-2.0.3.tgz"; | |
|
338 | sha1 = "633c2c83e3da42a502f52466022480f4208261de"; | |
|
311 | 339 | }; |
|
312 | 340 | }; |
|
313 | 341 | "minimatch-0.3.0" = { |
@@ -580,13 +608,13 b' let' | |||
|
580 | 608 | sha1 = "6cbfea22b3b830304e9a5fb371d54fa480c9d7cf"; |
|
581 | 609 | }; |
|
582 | 610 | }; |
|
583 | "lodash-4.1

611 | "lodash-4.16.2" = { | |
|
584 | 612 | name = "lodash"; |
|
585 | 613 | packageName = "lodash"; |
|
586 | version = "4.1

614 | version = "4.16.2"; | |
|
587 | 615 | src = fetchurl { |
|
588 | url = "https://registry.npmjs.org/lodash/-/lodash-4.1

589 | sha1 = "3162391d8f0140aa22cf8f6b3c34d6b7f63d3aa9"; | |
|
616 | url = "https://registry.npmjs.org/lodash/-/lodash-4.16.2.tgz"; | |
|
617 | sha1 = "3e626db827048a699281a8a125226326cfc0e652"; | |
|
590 | 618 | }; |
|
591 | 619 | }; |
|
592 | 620 | "errno-0.1.4" = { |
@@ -598,13 +626,13 b' let' | |||
|
598 | 626 | sha1 = "b896e23a9e5e8ba33871fc996abd3635fc9a1c7d"; |
|
599 | 627 | }; |
|
600 | 628 | }; |
|
601 | "graceful-fs-4.1.

629 | "graceful-fs-4.1.8" = { | |
|
602 | 630 | name = "graceful-fs"; |
|
603 | 631 | packageName = "graceful-fs"; |
|
604 | version = "4.1.

632 | version = "4.1.8"; | |
|
605 | 633 | src = fetchurl { |
|
606 | url = "https://registry.npmjs.org/graceful-fs/-/graceful-fs-4.1.

607 | sha1 = "514c38772b31bee2e08bedc21a0aeb3abf54c19e"; | |
|
634 | url = "https://registry.npmjs.org/graceful-fs/-/graceful-fs-4.1.8.tgz"; | |
|
635 | sha1 = "da3e11135eb2168bdd374532c4e2649751672890"; | |
|
608 | 636 | }; |
|
609 | 637 | }; |
|
610 | 638 | "image-size-0.5.0" = { |
@@ -670,13 +698,13 b' let' | |||
|
670 | 698 | sha1 = "857fcabfc3397d2625b8228262e86aa7a011b05d"; |
|
671 | 699 | }; |
|
672 | 700 | }; |
|
673 | "asap-2.0.

701 | "asap-2.0.5" = { | |
|
674 | 702 | name = "asap"; |
|
675 | 703 | packageName = "asap"; |
|
676 | version = "2.0.

704 | version = "2.0.5"; | |
|
677 | 705 | src = fetchurl { |
|
678 | url = "https://registry.npmjs.org/asap/-/asap-2.0.

679 | sha1 = "b391bf7f6bfbc65706022fec8f49c4b07fecf589"; | |
|
706 | url = "https://registry.npmjs.org/asap/-/asap-2.0.5.tgz"; | |
|
707 | sha1 = "522765b50c3510490e52d7dcfe085ef9ba96958f"; | |
|
680 | 708 | }; |
|
681 | 709 | }; |
|
682 | 710 | "gaze-0.5.2" = { |
@@ -778,13 +806,13 b' let' | |||
|
778 | 806 | sha1 = "f197d6eaff34c9085577484b2864375b294f5697"; |
|
779 | 807 | }; |
|
780 | 808 | }; |
|
781 | "dom5-1.3.

809 | "dom5-1.3.6" = { | |
|
782 | 810 | name = "dom5"; |
|
783 | 811 | packageName = "dom5"; |
|
784 | version = "1.3.

812 | version = "1.3.6"; | |
|
785 | 813 | src = fetchurl { |
|
786 | url = "https://registry.npmjs.org/dom5/-/dom5-1.3.

787 | sha1 = "07e514522c245c7aa8512aa3f9118e8bcab9f909"; | |
|
814 | url = "https://registry.npmjs.org/dom5/-/dom5-1.3.6.tgz"; | |
|
815 | sha1 = "a7088a9fc5f3b08dc9f6eda4c7abaeb241945e0d"; | |
|
788 | 816 | }; |
|
789 | 817 | }; |
|
790 | 818 | "array-back-1.0.3" = { |
@@ -832,13 +860,13 b' let' | |||
|
832 | 860 | sha1 = "a2d6ce740d15f0d92b1b26763e2ce9c0e361fd98"; |
|
833 | 861 | }; |
|
834 | 862 | }; |
|
835 | "typical-2.

863 | "typical-2.6.0" = { | |
|
836 | 864 | name = "typical"; |
|
837 | 865 | packageName = "typical"; |
|
838 | version = "2.

866 | version = "2.6.0"; | |
|
839 | 867 | src = fetchurl { |
|
840 | url = "https://registry.npmjs.org/typical/-/typical-2.

841 | sha1 = "81244918aa28180c9e602aa457173404be0604f1"; | |
|
868 | url = "https://registry.npmjs.org/typical/-/typical-2.6.0.tgz"; | |
|
869 | sha1 = "89d51554ab139848a65bcc2c8772f8fb450c40ed"; | |
|
842 | 870 | }; |
|
843 | 871 | }; |
|
844 | 872 | "ansi-escape-sequences-2.2.2" = { |
@@ -958,22 +986,22 b' let' | |||
|
958 | 986 | sha1 = "a09136f72ec043d27c893707c2b159bfad7de93f"; |
|
959 | 987 | }; |
|
960 | 988 | }; |
|
961 | "test-value-2.

989 | "test-value-2.1.0" = { | |
|
962 | 990 | name = "test-value"; |
|
963 | 991 | packageName = "test-value"; |
|
964 | version = "2.

992 | version = "2.1.0"; | |
|
965 | 993 | src = fetchurl { |
|
966 | url = "https://registry.npmjs.org/test-value/-/test-value-2.

967 | sha1 = "0d65c45ee0b48a757c4507a5e98ec2680a9db137"; | |
|
968 | }; | |
|
969 | }; | |
|
970 | "@types/clone-0.1.

994 | url = "https://registry.npmjs.org/test-value/-/test-value-2.1.0.tgz"; | |
|
995 | sha1 = "11da6ff670f3471a73b625ca4f3fdcf7bb748291"; | |
|
996 | }; | |
|
997 | }; | |
|
998 | "@types/clone-0.1.30" = { | |
|
971 | 999 | name = "@types/clone"; |
|
972 | 1000 | packageName = "@types/clone"; |
|
973 | version = "0.1.

1001 | version = "0.1.30"; | |
|
974 | 1002 | src = fetchurl { |
|
975 | url = "https://registry.npmjs.org/@types/clone/-/clone-0.1.

976 | sha1 = "65a0be88189ffddcd373e450aa6b68c9c83218b7"; | |
|
1003 | url = "https://registry.npmjs.org/@types/clone/-/clone-0.1.30.tgz"; | |
|
1004 | sha1 = "e7365648c1b42136a59c7d5040637b3b5c83b614"; | |
|
977 | 1005 | }; |
|
978 | 1006 | }; |
|
979 | 1007 | "@types/node-4.0.30" = { |
@@ -985,13 +1013,13 b' let' | |||
|
985 | 1013 | sha1 = "553f490ed3030311620f88003e7abfc0edcb301e"; |
|
986 | 1014 | }; |
|
987 | 1015 | }; |
|
988 | "@types/parse5-0.0.

1016 | "@types/parse5-0.0.31" = { | |
|
989 | 1017 | name = "@types/parse5"; |
|
990 | 1018 | packageName = "@types/parse5"; |
|
991 | version = "0.0.

1019 | version = "0.0.31"; | |
|
992 | 1020 | src = fetchurl { |
|
993 | url = "https://registry.npmjs.org/@types/parse5/-/parse5-0.0.

994 | sha1 = "2a38cb7145bb157688d4ad2c46944c6dffae3cc6"; | |
|
1021 | url = "https://registry.npmjs.org/@types/parse5/-/parse5-0.0.31.tgz"; | |
|
1022 | sha1 = "e827a493a443b156e1b582a2e4c3bdc0040f2ee7"; | |
|
995 | 1023 | }; |
|
996 | 1024 | }; |
|
997 | 1025 | "clone-1.0.2" = { |
@@ -1012,13 +1040,13 b' let' | |||
|
1012 | 1040 | sha1 = "9b7f3b0de32be78dc2401b17573ccaf0f6f59d94"; |
|
1013 | 1041 | }; |
|
1014 | 1042 | }; |
|
1015 | "@types/node-6.0.

1043 | "@types/node-6.0.41" = { | |
|
1016 | 1044 | name = "@types/node"; |
|
1017 | 1045 | packageName = "@types/node"; |
|
1018 | version = "6.0.

1046 | version = "6.0.41"; | |
|
1019 | 1047 | src = fetchurl { |
|
1020 | url = "https://registry.npmjs.org/@types/node/-/node-6.0.

1021 | sha1 = "a1e081f2ec60074113d3a1fbf11f35d304f30e39"; | |
|
1048 | url = "https://registry.npmjs.org/@types/node/-/node-6.0.41.tgz"; | |
|
1049 | sha1 = "578cf53aaec65887bcaf16792f8722932e8ff8ea"; | |
|
1022 | 1050 | }; |
|
1023 | 1051 | }; |
|
1024 | 1052 | "es6-promise-2.3.0" = { |
@@ -1093,13 +1121,13 b' let' | |||
|
1093 | 1121 | sha1 = "5a5b53af4693110bebb0867aa3430dd3b70a1018"; |
|
1094 | 1122 | }; |
|
1095 | 1123 | }; |
|
1096 | "espree-3.1

1124 | "espree-3.3.1" = { | |
|
1097 | 1125 | name = "espree"; |
|
1098 | 1126 | packageName = "espree"; |
|
1099 | version = "3.1

1127 | version = "3.3.1"; | |
|
1100 | 1128 | src = fetchurl { |
|
1101 | url = "https://registry.npmjs.org/espree/-/espree-3.1

1102 | sha1 = "fd5deec76a97a5120a9cd3a7cb1177a0923b11d2"; | |
|
1129 | url = "https://registry.npmjs.org/espree/-/espree-3.3.1.tgz"; | |
|
1130 | sha1 = "42107376856738a65ff3b5877f3a58bd52497643"; | |
|
1103 | 1131 | }; |
|
1104 | 1132 | }; |
|
1105 | 1133 | "estraverse-3.1.0" = { |
@@ -1183,13 +1211,13 b' let' | |||
|
1183 | 1211 | sha1 = "96e3b70d5779f6ad49cd032673d1c312767ba581"; |
|
1184 | 1212 | }; |
|
1185 | 1213 | }; |
|
1186 | "optionator-0.8.

1214 | "optionator-0.8.2" = { | |
|
1187 | 1215 | name = "optionator"; |
|
1188 | 1216 | packageName = "optionator"; |
|
1189 | version = "0.8.

1217 | version = "0.8.2"; | |
|
1190 | 1218 | src = fetchurl { |
|
1191 | url = "https://registry.npmjs.org/optionator/-/optionator-0.8.

1192 | sha1 = "e31b4932cdd5fb862a8b0d10bc63d3ee1ec7d78b"; | |
|
1219 | url = "https://registry.npmjs.org/optionator/-/optionator-0.8.2.tgz"; | |
|
1220 | sha1 = "364c5e409d3f4d6301d6c0b4c05bba50180aeb64"; | |
|
1193 | 1221 | }; |
|
1194 | 1222 | }; |
|
1195 | 1223 | "source-map-0.2.0" = { |
@@ -1246,13 +1274,31 b' let' | |||
|
1246 | 1274 | sha1 = "3b09924edf9f083c0490fdd4c0bc4421e04764ee"; |
|
1247 | 1275 | }; |
|
1248 | 1276 | }; |
|
1249 | "fast-levenshtein-

1277 | "fast-levenshtein-2.0.4" = { | |
|
1250 | 1278 | name = "fast-levenshtein"; |
|
1251 | 1279 | packageName = "fast-levenshtein"; |
|
1252 | version = "

1280 | version = "2.0.4"; | |
|
1281 | src = fetchurl { | |
|
1282 | url = "https://registry.npmjs.org/fast-levenshtein/-/fast-levenshtein-2.0.4.tgz"; | |
|
1283 | sha1 = "e31e729eea62233c60a7bc9dce2bdcc88b4fffe3"; | |
|
1284 | }; | |
|
1285 | }; | |
|
1286 | "acorn-4.0.3" = { | |
|
1287 | name = "acorn"; | |
|
1288 | packageName = "acorn"; | |
|
1289 | version = "4.0.3"; | |
|
1253 | 1290 | src = fetchurl { |
|
1254 | url = "https://registry.npmjs.org/

1255 | sha1 = "e6a754cc8f15e58987aa9cbd27af66fd6f4e5af9"; | |
|
1291 | url = "https://registry.npmjs.org/acorn/-/acorn-4.0.3.tgz"; | |
|
1292 | sha1 = "1a3e850b428e73ba6b09d1cc527f5aaad4d03ef1"; | |
|
1293 | }; | |
|
1294 | }; | |
|
1295 | "acorn-jsx-3.0.1" = { | |
|
1296 | name = "acorn-jsx"; | |
|
1297 | packageName = "acorn-jsx"; | |
|
1298 | version = "3.0.1"; | |
|
1299 | src = fetchurl { | |
|
1300 | url = "https://registry.npmjs.org/acorn-jsx/-/acorn-jsx-3.0.1.tgz"; | |
|
1301 | sha1 = "afdf9488fb1ecefc8348f6fb22f464e32a58b36b"; | |
|
1256 | 1302 | }; |
|
1257 | 1303 | }; |
|
1258 | 1304 | "acorn-3.3.0" = { |
@@ -1264,15 +1310,6 b' let' | |||
|
1264 | 1310 | sha1 = "45e37fb39e8da3f25baee3ff5369e2bb5f22017a"; |
|
1265 | 1311 | }; |
|
1266 | 1312 | }; |
|
1267 | "acorn-jsx-3.0.1" = { | |
|
1268 | name = "acorn-jsx"; | |
|
1269 | packageName = "acorn-jsx"; | |
|
1270 | version = "3.0.1"; | |
|
1271 | src = fetchurl { | |
|
1272 | url = "https://registry.npmjs.org/acorn-jsx/-/acorn-jsx-3.0.1.tgz"; | |
|
1273 | sha1 = "afdf9488fb1ecefc8348f6fb22f464e32a58b36b"; | |
|
1274 | }; | |
|
1275 | }; | |
|
1276 | 1313 | "boxen-0.3.1" = { |
|
1277 | 1314 | name = "boxen"; |
|
1278 | 1315 | packageName = "boxen"; |
@@ -1282,13 +1319,13 b' let' | |||
|
1282 | 1319 | sha1 = "a7d898243ae622f7abb6bb604d740a76c6a5461b"; |
|
1283 | 1320 | }; |
|
1284 | 1321 | }; |
|
1285 | "configstore-2.

1322 | "configstore-2.1.0" = { | |
|
1286 | 1323 | name = "configstore"; |
|
1287 | 1324 | packageName = "configstore"; |
|
1288 | version = "2.

1325 | version = "2.1.0"; | |
|
1289 | 1326 | src = fetchurl { |
|
1290 | url = "https://registry.npmjs.org/configstore/-/configstore-2.

1291 | sha1 = "8d81e9cdfa73ebd0e06bc985147856b2f1c4e764"; | |
|
1327 | url = "https://registry.npmjs.org/configstore/-/configstore-2.1.0.tgz"; | |
|
1328 | sha1 = "737a3a7036e9886102aa6099e47bb33ab1aba1a1"; | |
|
1292 | 1329 | }; |
|
1293 | 1330 | }; |
|
1294 | 1331 | "is-npm-1.0.0" = { |
@@ -1399,13 +1436,13 b' let' | |||
|
1399 | 1436 | sha1 = "ef9e31386f031a7f0d643af82fde50c457ef00cb"; |
|
1400 | 1437 | }; |
|
1401 | 1438 | }; |
|
1402 | "dot-prop-

1439 | "dot-prop-3.0.0" = { | |
|
1403 | 1440 | name = "dot-prop"; |
|
1404 | 1441 | packageName = "dot-prop"; |
|
1405 | version = "

1442 | version = "3.0.0"; | |
|
1406 | 1443 | src = fetchurl { |
|
1407 | url = "https://registry.npmjs.org/dot-prop/-/dot-prop-

1408 | sha1 = "848e28f7f1d50740c6747ab3cb07670462b6f89c"; | |
|
1444 | url = "https://registry.npmjs.org/dot-prop/-/dot-prop-3.0.0.tgz"; | |
|
1445 | sha1 = "1b708af094a49c9a0e7dbcad790aba539dac1177"; | |
|
1409 | 1446 | }; |
|
1410 | 1447 | }; |
|
1411 | 1448 | "os-tmpdir-1.0.1" = { |
@@ -1426,13 +1463,13 b' let' | |||
|
1426 | 1463 | sha1 = "83cf05c6d6458fc4d5ac6362ea325d92f2754217"; |
|
1427 | 1464 | }; |
|
1428 | 1465 | }; |
|
1429 | "uuid-2.0.

1466 | "uuid-2.0.3" = { | |
|
1430 | 1467 | name = "uuid"; |
|
1431 | 1468 | packageName = "uuid"; |
|
1432 | version = "2.0.

1469 | version = "2.0.3"; | |
|
1433 | 1470 | src = fetchurl { |
|
1434 | url = "https://registry.npmjs.org/uuid/-/uuid-2.0.

1435 | sha1 = "48bd5698f0677e3c7901a1c46ef15b1643794726"; | |
|
1471 | url = "https://registry.npmjs.org/uuid/-/uuid-2.0.3.tgz"; | |
|
1472 | sha1 = "67e2e863797215530dff318e5bf9dcebfd47b21a"; | |
|
1436 | 1473 | }; |
|
1437 | 1474 | }; |
|
1438 | 1475 | "write-file-atomic-1.2.0" = { |
@@ -1489,13 +1526,13 b' let' | |||
|
1489 | 1526 | sha1 = "56eb027d65b4d2dce6cb2e2d32c4d4afc9e1d707"; |
|
1490 | 1527 | }; |
|
1491 | 1528 | }; |
|
1492 | "package-json-2.

1529 | "package-json-2.4.0" = { | |
|
1493 | 1530 | name = "package-json"; |
|
1494 | 1531 | packageName = "package-json"; |
|
1495 | version = "2.

1532 | version = "2.4.0"; | |
|
1496 | 1533 | src = fetchurl { |
|
1497 | url = "https://registry.npmjs.org/package-json/-/package-json-2.

1498 | sha1 = "14895311a963d18edf8801e06b67ea87795d15b9"; | |
|
1534 | url = "https://registry.npmjs.org/package-json/-/package-json-2.4.0.tgz"; | |
|
1535 | sha1 = "0d15bd67d1cbbddbb2ca222ff2edb86bcb31a8bb"; | |
|
1499 | 1536 | }; |
|
1500 | 1537 | }; |
|
1501 | 1538 | "got-5.6.0" = { |
@@ -1507,13 +1544,13 b' let' | |||
|
1507 | 1544 | sha1 = "bb1d7ee163b78082bbc8eb836f3f395004ea6fbf"; |
|
1508 | 1545 | }; |
|
1509 | 1546 | }; |
|
1510 | "rc-1.1.6" = { | |
|
1511 | name = "rc"; |

1512 | packageName = "rc"; |

1513 | version = "1.1.6"; |

1547 | "registry-auth-token-3.0.1" = { | |
|
1548 | name = "registry-auth-token"; | |
|
1549 | packageName = "registry-auth-token"; | |
|
1550 | version = "3.0.1"; | |
|
1514 | 1551 | src = fetchurl { |
|
1515 | url = "https://registry.npmjs.org/rc/-/rc-1.1.6.tgz"; |

1516 | sha1 = "43651b76b6ae53b5c802f1151fa3fc3b059969c9"; | |
|
1552 | url = "https://registry.npmjs.org/registry-auth-token/-/registry-auth-token-3.0.1.tgz"; | |
|
1553 | sha1 = "c3ee5ec585bce29f88bf41629a3944c71ed53e25"; | |
|
1517 | 1554 | }; |
|
1518 | 1555 | }; |
|
1519 | 1556 | "registry-url-3.1.0" = { |
@@ -1651,13 +1688,13 b' let' | |||
|
1651 | 1688 | sha1 = "f38b0ae81d3747d628001f41dafc652ace671c0a"; |
|
1652 | 1689 | }; |
|
1653 | 1690 | }; |
|
1654 | "unzip-response-1.0.

1691 | "unzip-response-1.0.1" = { | |
|
1655 | 1692 | name = "unzip-response"; |
|
1656 | 1693 | packageName = "unzip-response"; |
|
1657 | version = "1.0.

1694 | version = "1.0.1"; | |
|
1658 | 1695 | src = fetchurl { |
|
1659 | url = "https://registry.npmjs.org/unzip-response/-/unzip-response-1.0.

1660 | sha1 = "bfda54eeec658f00c2df4d4494b9dca0ca00f3e4"; | |
|
1696 | url = "https://registry.npmjs.org/unzip-response/-/unzip-response-1.0.1.tgz"; | |
|
1697 | sha1 = "4a73959f2989470fa503791cefb54e1dbbc68412"; | |
|
1661 | 1698 | }; |
|
1662 | 1699 | }; |
|
1663 | 1700 | "url-parse-lax-1.0.0" = { |
@@ -1768,6 +1805,15 b' let' | |||
|
1768 | 1805 | sha1 = "d4f4562b0ce3696e41ac52d0e002e57a635dc6dc"; |
|
1769 | 1806 | }; |
|
1770 | 1807 | }; |
|
1808 | "rc-1.1.6" = { | |
|
1809 | name = "rc"; | |
|
1810 | packageName = "rc"; | |
|
1811 | version = "1.1.6"; | |
|
1812 | src = fetchurl { | |
|
1813 | url = "https://registry.npmjs.org/rc/-/rc-1.1.6.tgz"; | |
|
1814 | sha1 = "43651b76b6ae53b5c802f1151fa3fc3b059969c9"; | |
|
1815 | }; | |
|
1816 | }; | |
|
1771 | 1817 | "ini-1.3.4" = { |
|
1772 | 1818 | name = "ini"; |
|
1773 | 1819 | packageName = "ini"; |
@@ -1858,13 +1904,13 b' let' | |||
|
1858 | 1904 | sha1 = "3678bd8ab995057c07ade836ed2ef087da811d45"; |
|
1859 | 1905 | }; |
|
1860 | 1906 | }; |
|
1861 | "glob-7.0

1907 | "glob-7.1.0" = { | |
|
1862 | 1908 | name = "glob"; |
|
1863 | 1909 | packageName = "glob"; |
|
1864 | version = "7.0

1910 | version = "7.1.0"; | |
|
1865 | 1911 | src = fetchurl { |
|
1866 | url = "https://registry.npmjs.org/glob/-/glob-7.0

1867 | sha1 = "211bafaf49e525b8cd93260d14ab136152b3f57a"; | |
|
1912 | url = "https://registry.npmjs.org/glob/-/glob-7.1.0.tgz"; | |
|
1913 | sha1 = "36add856d746d0d99e4cc2797bba1ae2c67272fd"; | |
|
1868 | 1914 | }; |
|
1869 | 1915 | }; |
|
1870 | 1916 | "fs.realpath-1.0.0" = { |
@@ -1885,13 +1931,13 b' let' | |||
|
1885 | 1931 | sha1 = "db3204cd5a9de2e6cd890b85c6e2f66bcf4f620a"; |
|
1886 | 1932 | }; |
|
1887 | 1933 | }; |
|
1888 | "once-1.

1934 | "once-1.4.0" = { | |
|
1889 | 1935 | name = "once"; |
|
1890 | 1936 | packageName = "once"; |
|
1891 | version = "1.

1937 | version = "1.4.0"; | |
|
1892 | 1938 | src = fetchurl { |
|
1893 | url = "https://registry.npmjs.org/once/-/once-1.

1894 | sha1 = "b2e261557ce4c314ec8304f3fa82663e4297ca20"; | |
|
1939 | url = "https://registry.npmjs.org/once/-/once-1.4.0.tgz"; | |
|
1940 | sha1 = "583b1aa775961d4b113ac17d9c50baef9dd76bd1"; | |
|
1895 | 1941 | }; |
|
1896 | 1942 | }; |
|
1897 | 1943 | "wrappy-1.0.2" = { |
@@ -2034,7 +2080,7 b' let' | |||
|
2034 | 2080 | (sources."grunt-contrib-less-1.4.0" // { |
|
2035 | 2081 | dependencies = [ |
|
2036 | 2082 | sources."async-2.0.1" |
|
2037 | sources."lodash-4.1

2083 | sources."lodash-4.16.2" | |
|
2038 | 2084 | ]; |
|
2039 | 2085 | }) |
|
2040 | 2086 | (sources."grunt-contrib-watch-0.6.1" // { |
@@ -2062,6 +2108,9 b' let' | |||
|
2062 | 2108 | sources."lodash-3.7.0" |
|
2063 | 2109 | ]; |
|
2064 | 2110 | }) |
|
2111 | sources."bower-1.7.9" | |
|
2112 | sources."favico.js-0.3.10" | |
|
2113 | sources."appenlight-client-git+https://git@github.com/AppEnlight/appenlight-client-js.git#0.5.0" | |
|
2065 | 2114 | sources."async-0.1.22" |
|
2066 | 2115 | sources."coffee-script-1.3.3" |
|
2067 | 2116 | sources."colors-0.6.2" |
@@ -2097,7 +2146,7 b' let' | |||
|
2097 | 2146 | sources."underscore.string-2.3.3" |
|
2098 | 2147 | ]; |
|
2099 | 2148 | }) |
|
2100 | sources."inherits-2.0.

2149 | sources."inherits-2.0.3" | |
|
2101 | 2150 | sources."lru-cache-2.7.3" |
|
2102 | 2151 | sources."sigmund-1.0.1" |
|
2103 | 2152 | sources."graceful-fs-1.2.3" |
@@ -2127,7 +2176,7 b' let' | |||
|
2127 | 2176 | sources."amdefine-1.0.0" |
|
2128 | 2177 | (sources."less-2.7.1" // { |
|
2129 | 2178 | dependencies = [ |
|
2130 | sources."graceful-fs-4.1.

2179 | sources."graceful-fs-4.1.8" | |
|
2131 | 2180 | sources."source-map-0.5.6" |
|
2132 | 2181 | ]; |
|
2133 | 2182 | }) |
@@ -2138,7 +2187,7 b' let' | |||
|
2138 | 2187 | sources."promise-7.1.1" |
|
2139 | 2188 | sources."prr-0.0.0" |
|
2140 | 2189 | sources."minimist-0.0.8" |
|
2141 | sources."asap-2.0.

2190 | sources."asap-2.0.5" | |
|
2142 | 2191 | sources."gaze-0.5.2" |
|
2143 | 2192 | sources."tiny-lr-fork-0.0.5" |
|
2144 | 2193 | (sources."globule-0.1.0" // { |
@@ -2155,17 +2204,17 b' let' | |||
|
2155 | 2204 | }) |
|
2156 | 2205 | sources."debug-0.7.4" |
|
2157 | 2206 | sources."command-line-args-2.1.6" |
|
2158 | sources."dom5-1.3.

2207 | sources."dom5-1.3.6" | |
|
2159 | 2208 | sources."array-back-1.0.3" |
|
2160 | 2209 | sources."command-line-usage-2.0.5" |
|
2161 | 2210 | sources."core-js-2.4.1" |
|
2162 | 2211 | sources."feature-detect-es6-1.3.1" |
|
2163 | 2212 | (sources."find-replace-1.0.2" // { |
|
2164 | 2213 | dependencies = [ |
|
2165 | sources."test-value-2.

2214 | sources."test-value-2.1.0" | |
|
2166 | 2215 | ]; |
|
2167 | 2216 | }) |
|
2168 | sources."typical-2.

2217 | sources."typical-2.6.0" | |
|
2169 | 2218 | sources."ansi-escape-sequences-2.2.2" |
|
2170 | 2219 | sources."column-layout-2.1.4" |
|
2171 | 2220 | sources."wordwrapjs-1.2.1" |
@@ -2182,11 +2231,11 b' let' | |||
|
2182 | 2231 | sources."object-tools-2.0.6" |
|
2183 | 2232 | sources."object-get-2.1.0" |
|
2184 | 2233 | sources."test-value-1.1.0" |
|
2185 | sources."@types/clone-0.1.

2234 | sources."@types/clone-0.1.30" | |
|
2186 | 2235 | sources."@types/node-4.0.30" |
|
2187 | (sources."@types/parse5-0.0.

2236 | (sources."@types/parse5-0.0.31" // { | |
|
2188 | 2237 | dependencies = [ |
|
2189 | sources."@types/node-6.0.

2238 | sources."@types/node-6.0.41" | |
|
2190 | 2239 | ]; |
|
2191 | 2240 | }) |
|
2192 | 2241 | sources."clone-1.0.2" |
@@ -2205,26 +2254,30 b' let' | |||
|
2205 | 2254 | sources."source-map-0.2.0" |
|
2206 | 2255 | ]; |
|
2207 | 2256 | }) |
|
2208 | sources."espree-3.1

2257 | sources."espree-3.3.1" | |
|
2209 | 2258 | sources."estraverse-3.1.0" |
|
2210 | 2259 | sources."path-is-absolute-1.0.0" |
|
2211 | 2260 | sources."babel-runtime-6.11.6" |
|
2212 | 2261 | sources."regenerator-runtime-0.9.5" |
|
2213 | 2262 | sources."esutils-1.1.6" |
|
2214 | 2263 | sources."isarray-0.0.1" |
|
2215 | sources."optionator-0.8.

2264 | sources."optionator-0.8.2" | |
|
2216 | 2265 | sources."prelude-ls-1.1.2" |
|
2217 | 2266 | sources."deep-is-0.1.3" |
|
2218 | 2267 | sources."wordwrap-1.0.0" |
|
2219 | 2268 | sources."type-check-0.3.2" |
|
2220 | 2269 | sources."levn-0.3.0" |
|
2221 | sources."fast-levenshtein-

2270 | sources."fast-levenshtein-2.0.4" | |
|
2271 | sources."acorn-4.0.3" | |
|
2272 | (sources."acorn-jsx-3.0.1" // { | |
|
2273 | dependencies = [ | |
|
2222 | 2274 | sources."acorn-3.3.0" |
|
2223 | sources."acorn-jsx-3.0.1" | |
|
2275 | ]; | |
|
2276 | }) | |
|
2224 | 2277 | sources."boxen-0.3.1" |
|
2225 | (sources."configstore-2.

2278 | (sources."configstore-2.1.0" // { | |
|
2226 | 2279 | dependencies = [ |
|
2227 | sources."graceful-fs-4.1.

2280 | sources."graceful-fs-4.1.8" | |
|
2228 | 2281 | ]; |
|
2229 | 2282 | }) |
|
2230 | 2283 | sources."is-npm-1.0.0" |
@@ -2239,13 +2292,13 b' let' | |||
|
2239 | 2292 | sources."number-is-nan-1.0.0" |
|
2240 | 2293 | sources."code-point-at-1.0.0" |
|
2241 | 2294 | sources."is-fullwidth-code-point-1.0.0" |
|
2242 | sources."dot-prop-

2295 | sources."dot-prop-3.0.0" | |
|
2243 | 2296 | sources."os-tmpdir-1.0.1" |
|
2244 | 2297 | sources."osenv-0.1.3" |
|
2245 | sources."uuid-2.0.

2298 | sources."uuid-2.0.3" | |
|
2246 | 2299 | (sources."write-file-atomic-1.2.0" // { |
|
2247 | 2300 | dependencies = [ |
|
2248 | sources."graceful-fs-4.1.

2301 | sources."graceful-fs-4.1.8" | |
|
2249 | 2302 | ]; |
|
2250 | 2303 | }) |
|
2251 | 2304 | sources."xdg-basedir-2.0.0" |
@@ -2253,13 +2306,9 b' let' | |||
|
2253 | 2306 | sources."os-homedir-1.0.1" |
|
2254 | 2307 | sources."imurmurhash-0.1.4" |
|
2255 | 2308 | sources."slide-1.1.6" |
|
2256 | sources."package-json-2.

2309 | sources."package-json-2.4.0" | |
|
2257 | 2310 | sources."got-5.6.0" |
|
2258 | (sources."rc-1.1.6" // { | |
|
2259 | dependencies = [ | |
|
2260 | sources."minimist-1.2.0" | |
|
2261 | ]; | |
|
2262 | }) | |
|
2311 | sources."registry-auth-token-3.0.1" | |
|
2263 | 2312 | sources."registry-url-3.1.0" |
|
2264 | 2313 | sources."semver-5.3.0" |
|
2265 | 2314 | sources."create-error-class-3.0.2" |
@@ -2279,7 +2328,7 b' let' | |||
|
2279 | 2328 | ]; |
|
2280 | 2329 | }) |
|
2281 | 2330 | sources."timed-out-2.0.0" |
|
2282 | sources."unzip-response-1.0.

2331 | sources."unzip-response-1.0.1" | |
|
2283 | 2332 | sources."url-parse-lax-1.0.0" |
|
2284 | 2333 | sources."capture-stack-trace-1.0.0" |
|
2285 | 2334 | sources."error-ex-1.3.0" |
@@ -2291,11 +2340,16 b' let' | |||
|
2291 | 2340 | sources."string_decoder-0.10.31" |
|
2292 | 2341 | sources."util-deprecate-1.0.2" |
|
2293 | 2342 | sources."prepend-http-1.0.4" |
|
2343 | (sources."rc-1.1.6" // { | |
|
2344 | dependencies = [ | |
|
2345 | sources."minimist-1.2.0" | |
|
2346 | ]; | |
|
2347 | }) | |
|
2294 | 2348 | sources."ini-1.3.4" |
|
2295 | 2349 | sources."strip-json-comments-1.0.4" |
|
2296 | 2350 | (sources."cli-1.0.0" // { |
|
2297 | 2351 | dependencies = [ |
|
2298 | sources."glob-7.0

2352 | sources."glob-7.1.0" | |
|
2299 | 2353 | sources."minimatch-3.0.3" |
|
2300 | 2354 | ]; |
|
2301 | 2355 | }) |
@@ -2308,7 +2362,7 b' let' | |||
|
2308 | 2362 | sources."shelljs-0.3.0" |
|
2309 | 2363 | sources."fs.realpath-1.0.0" |
|
2310 | 2364 | sources."inflight-1.0.5" |
|
2311 | sources."once-1.

2365 | sources."once-1.4.0" | |
|
2312 | 2366 | sources."wrappy-1.0.2" |
|
2313 | 2367 | sources."brace-expansion-1.1.6" |
|
2314 | 2368 | sources."balanced-match-0.4.2" |
@@ -15,19 +15,6 b' let' | |||
|
15 | 15 | }; |
|
16 | 16 | }; |
|
17 | 17 | |
|
18 | # johbo: Interim bridge which allows us to build with the upcoming | |
|
19 | # nixos.16.09 branch (unstable at the moment of writing this note) and the | |
|
20 | # current stable nixos-16.03. | |
|
21 | backwardsCompatibleFetchgit = { ... }@args: | |
|
22 | let | |
|
23 | origSources = pkgs.fetchgit args; | |
|
24 | in | |
|
25 | pkgs.lib.overrideDerivation origSources (oldAttrs: { | |
|
26 | NIX_PREFETCH_GIT_CHECKOUT_HOOK = '' | |
|
27 | find $out -name '.git*' -print0 | xargs -0 rm -rf | |
|
28 | ''; | |
|
29 | }); | |
|
30 | ||
|
31 | 18 | in |
|
32 | 19 | |
|
33 | 20 | self: super: { |
@@ -77,6 +64,9 b' self: super: {' | |||
|
77 | 64 | }); |
|
78 | 65 | |
|
79 | 66 | lxml = super.lxml.override (attrs: { |
|
67 | # johbo: On 16.09 we need this to compile on darwin, otherwise compilation | |
|
68 | # fails on Darwin. | |
|
69 | hardeningDisable = if pkgs.stdenv.isDarwin then [ "format" ] else null; | |
|
80 | 70 | buildInputs = with self; [ |
|
81 | 71 | pkgs.libxml2 |
|
82 | 72 | pkgs.libxslt |
@@ -110,7 +100,7 b' self: super: {' | |||
|
110 | 100 | }); |
|
111 | 101 | |
|
112 | 102 | py-gfm = super.py-gfm.override { |
|
113 | src = backwardsCompatibleFetchgit { |

103 | src = pkgs.fetchgit { | |
|
114 | 104 | url = "https://code.rhodecode.com/upstream/py-gfm"; |
|
115 | 105 | rev = "0d66a19bc16e3d49de273c0f797d4e4781e8c0f2"; |
|
116 | 106 | sha256 = "0ryp74jyihd3ckszq31bml5jr3bciimhfp7va7kw6ld92930ksv3"; |
@@ -134,7 +124,7 b' self: super: {' | |||
|
134 | 124 | |
|
135 | 125 | Pylons = super.Pylons.override (attrs: { |
|
136 | 126 | name = "Pylons-1.0.1-patch1"; |
|
137 | src = backwardsCompatibleFetchgit { |

127 | src = pkgs.fetchgit { | |
|
138 | 128 | url = "https://code.rhodecode.com/upstream/pylons"; |
|
139 | 129 | rev = "707354ee4261b9c10450404fc9852ccea4fd667d"; |
|
140 | 130 | sha256 = "b2763274c2780523a335f83a1df65be22ebe4ff413a7bc9e9288d23c1f62032e"; |
@@ -190,7 +180,8 b' self: super: {' | |||
|
190 | 180 | pkgs.openldap |
|
191 | 181 | pkgs.openssl |
|
192 | 182 | ]; |
|
193 | NIX_CFLAGS_COMPILE = "-I${pkgs.cyrus_sasl}/include/sasl"; | |
|
183 | # TODO: johbo: Remove the "or" once we drop 16.03 support. | |
|
184 | NIX_CFLAGS_COMPILE = "-I${pkgs.cyrus_sasl.dev or pkgs.cyrus_sasl}/include/sasl"; | |
|
194 | 185 | }); |
|
195 | 186 | |
|
196 | 187 | python-pam = super.python-pam.override (attrs: |
@@ -431,6 +431,19 b'' | |||
|
431 | 431 | license = [ pkgs.lib.licenses.psfl ]; |
|
432 | 432 | }; |
|
433 | 433 | }; |
|
434 | backports.shutil-get-terminal-size = super.buildPythonPackage { | |
|
435 | name = "backports.shutil-get-terminal-size-1.0.0"; | |
|
436 | buildInputs = with self; []; | |
|
437 | doCheck = false; | |
|
438 | propagatedBuildInputs = with self; []; | |
|
439 | src = fetchurl { | |
|
440 | url = "https://pypi.python.org/packages/ec/9c/368086faa9c016efce5da3e0e13ba392c9db79e3ab740b763fe28620b18b/backports.shutil_get_terminal_size-1.0.0.tar.gz"; | |
|
441 | md5 = "03267762480bd86b50580dc19dff3c66"; | |
|
442 | }; | |
|
443 | meta = { | |
|
444 | license = [ pkgs.lib.licenses.mit ]; | |
|
445 | }; | |
|
446 | }; | |
|
434 | 447 | bottle = super.buildPythonPackage { |
|
435 | 448 | name = "bottle-0.12.8"; |
|
436 | 449 | buildInputs = with self; []; |
@@ -678,17 +691,17 b'' | |||
|
678 | 691 | license = [ pkgs.lib.licenses.asl20 ]; |
|
679 | 692 | }; |
|
680 | 693 | }; |
|
681 | flake8 = super.buildPythonPackage { |

682 | name = "flake8-2.4.1"; |

694 | enum34 = super.buildPythonPackage { | |
|
695 | name = "enum34-1.1.6"; | |
|
683 | 696 | buildInputs = with self; []; |
|
684 | 697 | doCheck = false; |
|
685 | propagatedBuildInputs = with self; [

698 | propagatedBuildInputs = with self; []; | |
|
686 | 699 | src = fetchurl { |
|
687 | url = "https://pypi.python.org/packages/8f/b5/9a73c66c7dba273bac8758398f060c008a25f3e84531063b42503b5d0a95/flake8-2.4.1.tar.gz"; | |
|
688 | md5 = "ed45d3db81a3b7c88bd63c6e37ca1d65"; | |
|
700 | url = "https://pypi.python.org/packages/bf/3e/31d502c25302814a7c2f1d3959d2a3b3f78e509002ba91aea64993936876/enum34-1.1.6.tar.gz"; | |
|
701 | md5 = "5f13a0841a61f7fc295c514490d120d0"; | |
|
689 | 702 | }; |
|
690 | 703 | meta = { |
|
691 | license = [ pkgs.lib.licenses.

704 | license = [ pkgs.lib.licenses.bsdOriginal ]; | |
|
692 | 705 | }; |
|
693 | 706 | }; |
|
694 | 707 | future = super.buildPythonPackage { |
@@ -809,26 +822,39 b'' | |||
|
809 | 822 | }; |
|
810 | 823 | }; |
|
811 | 824 | ipdb = super.buildPythonPackage { |
|
812 | name = "ipdb-0.8"; |

825 | name = "ipdb-0.10.1"; | |
|
813 | 826 | buildInputs = with self; []; |
|
814 | 827 | doCheck = false; |
|
815 | propagatedBuildInputs = with self; [ipython]; | |
|
828 | propagatedBuildInputs = with self; [ipython setuptools]; | |
|
816 | 829 | src = fetchurl { |
|
817 | url = "https://pypi.python.org/packages/f0/25/d7dd430ced6cd8dc242a933c8682b5dbf32eb4011d82f87e34209e5ec845/ipdb-0.8.zip"; | |
|
818 | md5 = "96dca0712efa01aa5eaf6b22071dd3ed"; | |
|
830 | url = "https://pypi.python.org/packages/eb/0a/0a37dc19572580336ad3813792c0d18c8d7117c2d66fc63c501f13a7a8f8/ipdb-0.10.1.tar.gz"; | |
|
831 | md5 = "4aeab65f633ddc98ebdb5eebf08dc713"; | |
|
819 | 832 | }; |
|
820 | 833 | meta = { |
|
821 | license = [ pkgs.lib.licenses.

834 | license = [ pkgs.lib.licenses.bsdOriginal ]; | |
|
822 | 835 | }; |
|
823 | 836 | }; |
|
824 | 837 | ipython = super.buildPythonPackage { |
|
825 | name = "ipython-

838 | name = "ipython-5.1.0"; | |
|
839 | buildInputs = with self; []; | |
|
840 | doCheck = false; | |
|
841 | propagatedBuildInputs = with self; [setuptools decorator pickleshare simplegeneric traitlets prompt-toolkit Pygments pexpect backports.shutil-get-terminal-size pathlib2 pexpect]; | |
|
842 | src = fetchurl { | |
|
843 | url = "https://pypi.python.org/packages/89/63/a9292f7cd9d0090a0f995e1167f3f17d5889dcbc9a175261719c513b9848/ipython-5.1.0.tar.gz"; | |
|
844 | md5 = "47c8122420f65b58784cb4b9b4af35e3"; | |
|
845 | }; | |
|
846 | meta = { | |
|
847 | license = [ pkgs.lib.licenses.bsdOriginal ]; | |
|
848 | }; | |
|
849 | }; | |
|
850 | ipython-genutils = super.buildPythonPackage { | |
|
851 | name = "ipython-genutils-0.1.0"; | |
|
826 | 852 | buildInputs = with self; []; |
|
827 | 853 | doCheck = false; |
|
828 | 854 | propagatedBuildInputs = with self; []; |
|
829 | 855 | src = fetchurl { |
|
830 | url = "https://pypi.python.org/packages/06

831 | md5 = "a749d90c16068687b0ec45a27e72ef8f"; | |
|
856 | url = "https://pypi.python.org/packages/71/b7/a64c71578521606edbbce15151358598f3dfb72a3431763edc2baf19e71f/ipython_genutils-0.1.0.tar.gz"; | |
|
857 | md5 = "9a8afbe0978adbcbfcb3b35b2d015a56"; | |
|
832 | 858 | }; |
|
833 | 859 | meta = { |
|
834 | 860 | license = [ pkgs.lib.licenses.bsdOriginal ]; |
@@ -886,19 +912,6 b'' | |||
|
886 | 912 | license = [ pkgs.lib.licenses.bsdOriginal ]; |
|
887 | 913 | }; |
|
888 | 914 | }; |
|
889 | mccabe = super.buildPythonPackage { | |
|
890 | name = "mccabe-0.3"; | |
|
891 | buildInputs = with self; []; | |
|
892 | doCheck = false; | |
|
893 | propagatedBuildInputs = with self; []; | |
|
894 | src = fetchurl { | |
|
895 | url = "https://pypi.python.org/packages/c9/2e/75231479e11a906b64ac43bad9d0bb534d00080b18bdca8db9da46e1faf7/mccabe-0.3.tar.gz"; | |
|
896 | md5 = "81640948ff226f8c12b3277059489157"; | |
|
897 | }; | |
|
898 | meta = { | |
|
899 | license = [ { fullName = "Expat license"; } pkgs.lib.licenses.mit ]; | |
|
900 | }; | |
|
901 | }; | |
|
902 | 915 | meld3 = super.buildPythonPackage { |
|
903 | 916 | name = "meld3-1.0.2"; |
|
904 | 917 | buildInputs = with self; []; |
@@ -990,17 +1003,17 b'' | |||
|
990 | 1003 | license = [ { fullName = "LGPL"; } { fullName = "GNU Library or Lesser General Public License (LGPL)"; } ]; |
|
991 | 1004 | }; |
|
992 | 1005 | }; |
|
993 | pep8 = super.buildPythonPackage { |

994 | name = "pep8-1.5.7"; |

1006 | pathlib2 = super.buildPythonPackage { | |
|
1007 | name = "pathlib2-2.1.0"; | |
|
995 | 1008 | buildInputs = with self; []; |
|
996 | 1009 | doCheck = false; |
|
997 | propagatedBuildInputs = with self; []; | |
|
1010 | propagatedBuildInputs = with self; [six]; | |
|
998 | 1011 | src = fetchurl { |
|
999 | url = "https://pypi.python.org/packages/8b/de/259f5e735897ada1683489dd514b2a1c91aaa74e5e6b68f80acf128a6368/pep8-1.5.7.tar.gz"; | |
|
1000 | md5 = "f6adbdd69365ecca20513c709f9b7c93"; | |
|
1012 | url = "https://pypi.python.org/packages/c9/27/8448b10d8440c08efeff0794adf7d0ed27adb98372c70c7b38f3947d4749/pathlib2-2.1.0.tar.gz"; | |
|
1013 | md5 = "38e4f58b4d69dfcb9edb49a54a8b28d2"; | |
|
1001 | 1014 | }; |
|
1002 | 1015 | meta = { |
|
1003 | license = [

1016 | license = [ pkgs.lib.licenses.mit ]; | |
|
1004 | 1017 | }; |
|
1005 | 1018 | }; |
|
1006 | 1019 | peppercorn = super.buildPythonPackage { |
@@ -1016,14 +1029,53 b'' | |||
|
1016 | 1029 | license = [ { fullName = "BSD-derived (http://www.repoze.org/LICENSE.txt)"; } ]; |
|
1017 | 1030 | }; |
|
1018 | 1031 | }; |
|
1032 | pexpect = super.buildPythonPackage { | |
|
1033 | name = "pexpect-4.2.1"; | |
|
1034 | buildInputs = with self; []; | |
|
1035 | doCheck = false; | |
|
1036 | propagatedBuildInputs = with self; [ptyprocess]; | |
|
1037 | src = fetchurl { | |
|
1038 | url = "https://pypi.python.org/packages/e8/13/d0b0599099d6cd23663043a2a0bb7c61e58c6ba359b2656e6fb000ef5b98/pexpect-4.2.1.tar.gz"; | |
|
1039 | md5 = "3694410001a99dff83f0b500a1ca1c95"; | |
|
1040 | }; | |
|
1041 | meta = { | |
|
1042 | license = [ pkgs.lib.licenses.isc { fullName = "ISC License (ISCL)"; } ]; | |
|
1043 | }; | |
|
1044 | }; | |
|
1045 | pickleshare = super.buildPythonPackage { | |
|
1046 | name = "pickleshare-0.7.4"; | |
|
1047 | buildInputs = with self; []; | |
|
1048 | doCheck = false; | |
|
1049 | propagatedBuildInputs = with self; [pathlib2]; | |
|
1050 | src = fetchurl { | |
|
1051 | url = "https://pypi.python.org/packages/69/fe/dd137d84daa0fd13a709e448138e310d9ea93070620c9db5454e234af525/pickleshare-0.7.4.tar.gz"; | |
|
1052 | md5 = "6a9e5dd8dfc023031f6b7b3f824cab12"; | |
|
1053 | }; | |
|
1054 | meta = { | |
|
1055 | license = [ pkgs.lib.licenses.mit ]; | |
|
1056 | }; | |
|
1057 | }; | |
|
1058 | prompt-toolkit = super.buildPythonPackage { | |
|
1059 | name = "prompt-toolkit-1.0.9"; | |
|
1060 | buildInputs = with self; []; | |
|
1061 | doCheck = false; | |
|
1062 | propagatedBuildInputs = with self; [six wcwidth]; | |
|
1063 | src = fetchurl { | |
|
1064 | url = "https://pypi.python.org/packages/83/14/5ac258da6c530eca02852ee25c7a9ff3ca78287bb4c198d0d0055845d856/prompt_toolkit-1.0.9.tar.gz"; | |
|
1065 | md5 = "a39f91a54308fb7446b1a421c11f227c"; | |
|
1066 | }; | |
|
1067 | meta = { | |
|
1068 | license = [ pkgs.lib.licenses.bsdOriginal ]; | |
|
1069 | }; | |
|
1070 | }; | |
|
1019 | 1071 | psutil = super.buildPythonPackage { |
|
1020 | name = "psutil-2.2.1"; |

1072 | name = "psutil-4.3.1"; | |
|
1021 | 1073 | buildInputs = with self; []; |
|
1022 | 1074 | doCheck = false; |
|
1023 | 1075 | propagatedBuildInputs = with self; []; |
|
1024 | 1076 | src = fetchurl { |
|
1025 | url = "https://pypi.python.org/packages/df/47/ee54ef14dd40f8ce831a7581001a5096494dc99fe71586260ca6b531fe86/psutil-2.2.1.tar.gz"; | |
|
1026 | md5 = "1a2b58cd9e3a53528bb6148f0c4d5244"; | |
|
1077 | url = "https://pypi.python.org/packages/78/cc/f267a1371f229bf16db6a4e604428c3b032b823b83155bd33cef45e49a53/psutil-4.3.1.tar.gz"; | |
|
1078 | md5 = "199a366dba829c88bddaf5b41d19ddc0"; | |
|
1027 | 1079 | }; |
|
1028 | 1080 | meta = { |
|
1029 | 1081 | license = [ pkgs.lib.licenses.bsdOriginal ]; |
@@ -1042,6 +1094,19 b'' | |||
|
1042 | 1094 | license = [ pkgs.lib.licenses.zpt21 { fullName = "GNU Library or Lesser General Public License (LGPL)"; } { fullName = "LGPL with exceptions or ZPL"; } ]; |
|
1043 | 1095 | }; |
|
1044 | 1096 | }; |
|
1097 | ptyprocess = super.buildPythonPackage { | |
|
1098 | name = "ptyprocess-0.5.1"; | |
|
1099 | buildInputs = with self; []; | |
|
1100 | doCheck = false; | |
|
1101 | propagatedBuildInputs = with self; []; | |
|
1102 | src = fetchurl { | |
|
1103 | url = "https://pypi.python.org/packages/db/d7/b465161910f3d1cef593c5e002bff67e0384898f597f1a7fdc8db4c02bf6/ptyprocess-0.5.1.tar.gz"; | |
|
1104 | md5 = "94e537122914cc9ec9c1eadcd36e73a1"; | |
|
1105 | }; | |
|
1106 | meta = { | |
|
1107 | license = [ ]; | |
|
1108 | }; | |
|
1109 | }; | |
|
1045 | 1110 | py = super.buildPythonPackage { |
|
1046 | 1111 | name = "py-1.4.29"; |
|
1047 | 1112 | buildInputs = with self; []; |
@@ -1120,6 +1185,19 b'' | |||
|
1120 | 1185 | license = [ pkgs.lib.licenses.mit ]; |
|
1121 | 1186 | }; |
|
1122 | 1187 | }; |
|
1188 | pygments-markdown-lexer = super.buildPythonPackage { | |
|
1189 | name = "pygments-markdown-lexer-0.1.0.dev39"; | |
|
1190 | buildInputs = with self; []; | |
|
1191 | doCheck = false; | |
|
1192 | propagatedBuildInputs = with self; [Pygments]; | |
|
1193 | src = fetchurl { | |
|
1194 | url = "https://pypi.python.org/packages/c3/12/674cdee66635d638cedb2c5d9c85ce507b7b2f91bdba29e482f1b1160ff6/pygments-markdown-lexer-0.1.0.dev39.zip"; | |
|
1195 | md5 = "6360fe0f6d1f896e35b7a0142ce6459c"; | |
|
1196 | }; | |
|
1197 | meta = { | |
|
1198 | license = [ pkgs.lib.licenses.asl20 ]; | |
|
1199 | }; | |
|
1200 | }; | |
|
1123 | 1201 | pyparsing = super.buildPythonPackage { |
|
1124 | 1202 | name = "pyparsing-1.5.7"; |
|
1125 | 1203 | buildInputs = with self; []; |
@@ -1420,10 +1498,10 b'' | |||
|
1420 | 1498 | }; |
|
1421 | 1499 | }; |
|
1422 | 1500 | rhodecode-enterprise-ce = super.buildPythonPackage { |
|
1423 | name = "rhodecode-enterprise-ce-4.

1424 | buildInputs = with self; [WebTest configobj cssselect

1501 | name = "rhodecode-enterprise-ce-4.5.0"; | |
|
1502 | buildInputs = with self; [WebTest configobj cssselect lxml mock pytest pytest-cov pytest-runner pytest-sugar]; | |
|
1425 | 1503 | doCheck = true; |
|
1426 | propagatedBuildInputs = with self; [Babel Beaker FormEncode Mako Markdown MarkupSafe MySQL-python Paste PasteDeploy PasteScript Pygments Pylons Pyro4 Routes SQLAlchemy Tempita URLObject WebError WebHelpers WebHelpers2 WebOb WebTest Whoosh alembic amqplib anyjson appenlight-client authomatic backport-ipaddress celery channelstream colander decorator deform docutils gevent gunicorn infrae.cache ipython iso8601 kombu msgpack-python packaging psycopg2 py-gfm pycrypto pycurl pyparsing pyramid pyramid-debugtoolbar pyramid-mako pyramid-beaker pysqlite python-dateutil python-ldap python-memcached python-pam recaptcha-client repoze.lru requests simplejson waitress zope.cachedescriptors dogpile.cache dogpile.core psutil py-bcrypt]; | |
|
1504 | propagatedBuildInputs = with self; [Babel Beaker FormEncode Mako Markdown MarkupSafe MySQL-python Paste PasteDeploy PasteScript Pygments pygments-markdown-lexer Pylons Pyro4 Routes SQLAlchemy Tempita URLObject WebError WebHelpers WebHelpers2 WebOb WebTest Whoosh alembic amqplib anyjson appenlight-client authomatic backport-ipaddress celery channelstream colander decorator deform docutils gevent gunicorn infrae.cache ipython iso8601 kombu msgpack-python packaging psycopg2 py-gfm pycrypto pycurl pyparsing pyramid pyramid-debugtoolbar pyramid-mako pyramid-beaker pysqlite python-dateutil python-ldap python-memcached python-pam recaptcha-client repoze.lru requests simplejson subprocess32 waitress zope.cachedescriptors dogpile.cache dogpile.core psutil py-bcrypt]; | |
|
1427 | 1505 | src = ./.; |
|
1428 | 1506 | meta = { |
|
1429 | 1507 | license = [ { fullName = "AGPLv3, and Commercial License"; } ]; |
@@ -1494,6 +1572,19 b'' | |||
|
1494 | 1572 | license = [ pkgs.lib.licenses.mit ]; |
|
1495 | 1573 | }; |
|
1496 | 1574 | }; |
|
1575 | simplegeneric = super.buildPythonPackage { | |
|
1576 | name = "simplegeneric-0.8.1"; | |
|
1577 | buildInputs = with self; []; | |
|
1578 | doCheck = false; | |
|
1579 | propagatedBuildInputs = with self; []; | |
|
1580 | src = fetchurl { | |
|
1581 | url = "https://pypi.python.org/packages/3d/57/4d9c9e3ae9a255cd4e1106bb57e24056d3d0709fc01b2e3e345898e49d5b/simplegeneric-0.8.1.zip"; | |
|
1582 | md5 = "f9c1fab00fd981be588fc32759f474e3"; | |
|
1583 | }; | |
|
1584 | meta = { | |
|
1585 | license = [ pkgs.lib.licenses.zpt21 ]; | |
|
1586 | }; | |
|
1587 | }; | |
|
1497 | 1588 | simplejson = super.buildPythonPackage { |
|
1498 | 1589 | name = "simplejson-3.7.2"; |
|
1499 | 1590 | buildInputs = with self; []; |
@@ -1546,6 +1637,19 b'' | |||
|
1546 | 1637 | license = [ { fullName = "BSD-derived (http://www.repoze.org/LICENSE.txt)"; } ]; |
|
1547 | 1638 | }; |
|
1548 | 1639 | }; |
|
1640 | traitlets = super.buildPythonPackage { | |
|
1641 | name = "traitlets-4.3.1"; | |
|
1642 | buildInputs = with self; []; | |
|
1643 | doCheck = false; | |
|
1644 | propagatedBuildInputs = with self; [ipython-genutils six decorator enum34]; | |
|
1645 | src = fetchurl { | |
|
1646 | url = "https://pypi.python.org/packages/b1/d6/5b5aa6d5c474691909b91493da1e8972e309c9f01ecfe4aeafd272eb3234/traitlets-4.3.1.tar.gz"; | |
|
1647 | md5 = "dd0b1b6e5d31ce446d55a4b5e5083c98"; | |
|
1648 | }; | |
|
1649 | meta = { | |
|
1650 | license = [ pkgs.lib.licenses.bsdOriginal ]; | |
|
1651 | }; | |
|
1652 | }; | |
|
1549 | 1653 | transifex-client = super.buildPythonPackage { |
|
1550 | 1654 | name = "transifex-client-0.10"; |
|
1551 | 1655 | buildInputs = with self; []; |
@@ -1637,6 +1741,19 b'' | |||
|
1637 | 1741 | license = [ pkgs.lib.licenses.zpt21 ]; |
|
1638 | 1742 | }; |
|
1639 | 1743 | }; |
|
1744 | wcwidth = super.buildPythonPackage { | |
|
1745 | name = "wcwidth-0.1.7"; | |
|
1746 | buildInputs = with self; []; | |
|
1747 | doCheck = false; | |
|
1748 | propagatedBuildInputs = with self; []; | |
|
1749 | src = fetchurl { | |
|
1750 | url = "https://pypi.python.org/packages/55/11/e4a2bb08bb450fdbd42cc709dd40de4ed2c472cf0ccb9e64af22279c5495/wcwidth-0.1.7.tar.gz"; | |
|
1751 | md5 = "b3b6a0a08f0c8a34d1de8cf44150a4ad"; | |
|
1752 | }; | |
|
1753 | meta = { | |
|
1754 | license = [ pkgs.lib.licenses.mit ]; | |
|
1755 | }; | |
|
1756 | }; | |
|
1640 | 1757 | ws4py = super.buildPythonPackage { |
|
1641 | 1758 | name = "ws4py-0.3.5"; |
|
1642 | 1759 | buildInputs = with self; []; |
@@ -1718,5 +1835,30 b'' | |||
|
1718 | 1835 | |
|
1719 | 1836 | ### Test requirements |
|
1720 | 1837 | |
|
1721 | ||
|
1838 | pytest-sugar = super.buildPythonPackage { | |
|
1839 | name = "pytest-sugar-0.7.1"; | |
|
1840 | buildInputs = with self; []; | |
|
1841 | doCheck = false; | |
|
1842 | propagatedBuildInputs = with self; [pytest termcolor]; | |
|
1843 | src = fetchurl { | |
|
1844 | url = "https://pypi.python.org/packages/03/97/05d988b4fa870e7373e8ee4582408543b9ca2bd35c3c67b569369c6f9c49/pytest-sugar-0.7.1.tar.gz"; | |
|
1845 | md5 = "7400f7c11f3d572b2c2a3b60352d35fe"; | |
|
1846 | }; | |
|
1847 | meta = { | |
|
1848 | license = [ pkgs.lib.licenses.bsdOriginal ]; | |
|
1849 | }; | |
|
1850 | }; | |
|
1851 | termcolor = super.buildPythonPackage { | |
|
1852 | name = "termcolor-1.1.0"; | |
|
1853 | buildInputs = with self; []; | |
|
1854 | doCheck = false; | |
|
1855 | propagatedBuildInputs = with self; []; | |
|
1856 | src = fetchurl { | |
|
1857 | url = "https://pypi.python.org/packages/8a/48/a76be51647d0eb9f10e2a4511bf3ffb8cc1e6b14e9e4fab46173aa79f981/termcolor-1.1.0.tar.gz"; | |
|
1858 | md5 = "043e89644f8909d462fbbfa511c768df"; | |
|
1859 | }; | |
|
1860 | meta = { | |
|
1861 | license = [ pkgs.lib.licenses.mit ]; | |
|
1862 | }; | |
|
1863 | }; | |
|
1722 | 1864 | } |
@@ -1,9 +1,9 b'' | |||
|
1 | 1 | [pytest] |
|
2 | 2 | testpaths = ./rhodecode |
|
3 | pylons_config = test.ini | |
|
4 | vcsserver_protocol = pyro4 |

5 | vcsserver_config = rhodecode/tests/vcsserver.ini | |
|
6 | vcsserver_config_http = rhodecode/tests/vcsserver_p

3 | pylons_config = rhodecode/tests/rhodecode.ini | |
|
4 | vcsserver_protocol = http | |
|
5 | vcsserver_config_pyro4 = rhodecode/tests/vcsserver_pyro4.ini | |
|
6 | vcsserver_config_http = rhodecode/tests/vcsserver_http.ini | |
|
7 | 7 | norecursedirs = tests/scripts |
|
8 | 8 | addopts = -k "not _BaseTest" |
|
9 | 9 | markers = |
@@ -1,5 +1,6 b'' | |||
|
1 | 1 | Babel==1.3 |
|
2 | 2 | Beaker==1.7.0 |
|
3 | Chameleon==2.24 | |
|
3 | 4 | CProfileV==1.0.6 |
|
4 | 5 | FormEncode==1.2.4 |
|
5 | 6 | Jinja2==2.7.3 |
@@ -11,6 +12,7 b' Paste==2.0.2' | |||
|
11 | 12 | PasteDeploy==1.5.2 |
|
12 | 13 | PasteScript==1.7.5 |
|
13 | 14 | Pygments==2.1.3 |
|
15 | pygments-markdown-lexer==0.1.0.dev39 | |
|
14 | 16 | |
|
15 | 17 | # TODO: This version is not available on PyPI |
|
16 | 18 | # Pylons==1.0.2.dev20160108 |
@@ -62,7 +64,6 b' dogpile.cache==0.6.1' | |||
|
62 | 64 | dogpile.core==0.4.1 |
|
63 | 65 | dulwich==0.12.0 |
|
64 | 66 | ecdsa==0.11 |
|
65 | flake8==2.4.1 | |
|
66 | 67 | future==0.14.3 |
|
67 | 68 | futures==3.0.2 |
|
68 | 69 | gevent==1.1.1 |
@@ -77,13 +78,12 b' gunicorn==19.6.0' | |||
|
77 | 78 | gnureadline==6.3.3 |
|
78 | 79 | infrae.cache==1.0.1 |
|
79 | 80 | invoke==0.13.0 |
|
80 | ipdb==0.8 |

81 | ipython==

81 | ipdb==0.10.1 | |
|
82 | ipython==5.1.0 | |
|
82 | 83 | iso8601==0.1.11 |
|
83 | 84 | itsdangerous==0.24 |
|
84 | 85 | kombu==1.5.1 |
|
85 | 86 | lxml==3.4.4 |
|
86 | mccabe==0.3 | |
|
87 | 87 | meld3==1.0.2 |
|
88 | 88 | mock==1.0.1 |
|
89 | 89 | msgpack-python==0.4.6 |
@@ -91,8 +91,7 b' nose==1.3.6' | |||
|
91 | 91 | objgraph==2.0.0 |
|
92 | 92 | packaging==15.2 |
|
93 | 93 | paramiko==1.15.1 |
|
94 | pep8==1.5.7 | |
|
95 | psutil==2.2.1 | |
|
94 | psutil==4.3.1 | |
|
96 | 95 | psycopg2==2.6.1 |
|
97 | 96 | py==1.4.29 |
|
98 | 97 | py-bcrypt==0.4 |
@@ -141,6 +140,7 b' transifex-client==0.10' | |||
|
141 | 140 | translationstring==1.3 |
|
142 | 141 | trollius==1.0.4 |
|
143 | 142 | uWSGI==2.0.11.2 |
|
143 | urllib3==1.16 | |
|
144 | 144 | venusian==1.0 |
|
145 | 145 | waitress==0.8.9 |
|
146 | 146 | wsgiref==0.1.2 |
@@ -51,7 +51,7 b' PYRAMID_SETTINGS = {}' | |||
|
51 | 51 | EXTENSIONS = {} |
|
52 | 52 | |
|
53 | 53 | __version__ = ('.'.join((str(each) for each in VERSION[:3]))) |
|
54 | __dbversion__ =

54 | __dbversion__ = 63 # defines current db version for migrations | |
|
55 | 55 | __platform__ = platform.system() |
|
56 | 56 | __license__ = 'AGPLv3, and Commercial License' |
|
57 | 57 | __author__ = 'RhodeCode GmbH' |
@@ -35,6 +35,9 b' def includeme(config):' | |||
|
35 | 35 | config.add_route( |
|
36 | 36 | name='admin_settings_open_source', |
|
37 | 37 | pattern=ADMIN_PREFIX + '/settings/open_source') |
|
38 | config.add_route( | |
|
39 | name='admin_settings_vcs_svn_generate_cfg', | |
|
40 | pattern=ADMIN_PREFIX + '/settings/vcs/svn_generate_cfg') | |
|
38 | 41 | |
|
39 | 42 | # Scan module for configuration decorators. |
|
40 | 43 | config.scan() |
@@ -24,8 +24,11 b' import logging' | |||
|
24 | 24 | from pylons import tmpl_context as c |
|
25 | 25 | from pyramid.view import view_config |
|
26 | 26 | |
|
27 | from rhodecode.lib.auth import LoginRequired, HasPermissionAllDecorator |

27 | from rhodecode.lib.auth import ( | |
|
28 | LoginRequired, HasPermissionAllDecorator, CSRFRequired) | |
|
28 | 29 | from rhodecode.lib.utils import read_opensource_licenses |
|
30 | from rhodecode.svn_support.utils import generate_mod_dav_svn_config | |
|
31 | from rhodecode.translation import _ | |
|
29 | 32 | |
|
30 | 33 | from .navigation import navigation_list |
|
31 | 34 | |
@@ -53,3 +56,27 b' class AdminSettingsView(object):' | |||
|
53 | 56 | sorted(read_opensource_licenses().items(), key=lambda t: t[0])) |
|
54 | 57 | |
|
55 | 58 | return {} |
|
59 | ||
|
60 | @LoginRequired() | |
|
61 | @CSRFRequired() | |
|
62 | @HasPermissionAllDecorator('hg.admin') | |
|
63 | @view_config( | |
|
64 | route_name='admin_settings_vcs_svn_generate_cfg', | |
|
65 | request_method='POST', renderer='json') | |
|
66 | def vcs_svn_generate_config(self): | |
|
67 | try: | |
|
68 | generate_mod_dav_svn_config(self.request.registry) | |
|
69 | msg = { | |
|
70 | 'message': _('Apache configuration for Subversion generated.'), | |
|
71 | 'level': 'success', | |
|
72 | } | |
|
73 | except Exception: | |
|
74 | log.exception( | |
|
75 | 'Exception while generating the Apache configuration for Subversion.') | |
|
76 | msg = { | |
|
77 | 'message': _('Failed to generate the Apache configuration for Subversion.'), | |
|
78 | 'level': 'error', | |
|
79 | } | |
|
80 | ||
|
81 | data = {'message': msg} | |
|
82 | return data |
@@ -23,8 +23,11 b' import json' | |||
|
23 | 23 | import mock |
|
24 | 24 | import pytest |
|
25 | 25 | |
|
26 | from rhodecode.lib.utils2 import safe_unicode | |
|
26 | 27 | from rhodecode.lib.vcs import settings |
|
28 | from rhodecode.model.meta import Session | |
|
27 | 29 | from rhodecode.model.repo import RepoModel |
|
30 | from rhodecode.model.user import UserModel | |
|
28 | 31 | from rhodecode.tests import TEST_USER_ADMIN_LOGIN |
|
29 | 32 | from rhodecode.api.tests.utils import ( |
|
30 | 33 | build_data, api_call, assert_ok, assert_error, crash) |
@@ -36,29 +39,37 b' fixture = Fixture()' | |||
|
36 | 39 | |
|
37 | 40 | @pytest.mark.usefixtures("testuser_api", "app") |
|
38 | 41 | class TestCreateRepo(object): |
|
39 | def test_api_create_repo(self, backend): | |
|
40 | repo_name = 'api-repo-1' | |
|
42 | ||
|
43 | @pytest.mark.parametrize('given, expected_name, expected_exc', [ | |
|
44 | ('api repo-1', 'api-repo-1', False), | |
|
45 | ('api-repo 1-ąć', 'api-repo-1-ąć', False), | |
|
46 | (u'unicode-ąć', u'unicode-ąć', False), | |
|
47 | ('some repo v1.2', 'some-repo-v1.2', False), | |
|
48 | ('v2.0', 'v2.0', False), | |
|
49 | ]) | |
|
50 | def test_api_create_repo(self, backend, given, expected_name, expected_exc): | |
|
51 | ||
|
41 | 52 | id_, params = build_data( |
|
42 | 53 | self.apikey, |
|
43 | 54 | 'create_repo', |
|
44 |
repo_name= |
|
|
55 | repo_name=given, | |
|
45 | 56 | owner=TEST_USER_ADMIN_LOGIN, |
|
46 | 57 | repo_type=backend.alias, |
|
47 | 58 | ) |
|
48 | 59 | response = api_call(self.app, params) |
|
49 | 60 | |
|
50 | repo = RepoModel().get_by_repo_name(repo_name) | |
|
51 | ||
|
52 | assert repo is not None | |
|
53 | 61 | ret = { |
|
54 |
'msg': 'Created new repository `%s`' % ( |
|
|
62 | 'msg': 'Created new repository `%s`' % (expected_name,), | |
|
55 | 63 | 'success': True, |
|
56 | 64 | 'task': None, |
|
57 | 65 | } |
|
58 | 66 | expected = ret |
|
59 | 67 | assert_ok(id_, expected, given=response.body) |
|
60 | 68 | |
|
61 | id_, params = build_data(self.apikey, 'get_repo', repoid=repo_name) | |
|
69 | repo = RepoModel().get_by_repo_name(safe_unicode(expected_name)) | |
|
70 | assert repo is not None | |
|
71 | ||
|
72 | id_, params = build_data(self.apikey, 'get_repo', repoid=expected_name) | |
|
62 | 73 | response = api_call(self.app, params) |
|
63 | 74 | body = json.loads(response.body) |
|
64 | 75 | |
@@ -66,7 +77,7 b' class TestCreateRepo(object):' | |||
|
66 | 77 | assert body['result']['enable_locking'] is False |
|
67 | 78 | assert body['result']['enable_statistics'] is False |
|
68 | 79 | |
|
69 |
fixture.destroy_repo( |
|
|
80 | fixture.destroy_repo(safe_unicode(expected_name)) | |
|
70 | 81 | |
|
71 | 82 | def test_api_create_restricted_repo_type(self, backend): |
|
72 | 83 | repo_name = 'api-repo-type-{0}'.format(backend.alias) |
@@ -158,6 +169,21 b' class TestCreateRepo(object):' | |||
|
158 | 169 | fixture.destroy_repo(repo_name) |
|
159 | 170 | fixture.destroy_repo_group(repo_group_name) |
|
160 | 171 | |
|
172 | def test_create_repo_in_group_that_doesnt_exist(self, backend, user_util): | |
|
173 | repo_group_name = 'fake_group' | |
|
174 | ||
|
175 | repo_name = '%s/api-repo-gr' % (repo_group_name,) | |
|
176 | id_, params = build_data( | |
|
177 | self.apikey, 'create_repo', | |
|
178 | repo_name=repo_name, | |
|
179 | owner=TEST_USER_ADMIN_LOGIN, | |
|
180 | repo_type=backend.alias,) | |
|
181 | response = api_call(self.app, params) | |
|
182 | ||
|
183 | expected = {'repo_group': 'Repository group `{}` does not exist'.format( | |
|
184 | repo_group_name)} | |
|
185 | assert_error(id_, expected, given=response.body) | |
|
186 | ||
|
161 | 187 | def test_api_create_repo_unknown_owner(self, backend): |
|
162 | 188 | repo_name = 'api-repo-2' |
|
163 | 189 | owner = 'i-dont-exist' |
@@ -218,10 +244,48 b' class TestCreateRepo(object):' | |||
|
218 | 244 | owner=owner) |
|
219 | 245 | response = api_call(self.app, params) |
|
220 | 246 | |
|
221 | expected = 'Only RhodeCode admin can specify `owner` param' | |
|
247 | expected = 'Only RhodeCode super-admin can specify `owner` param' | |
|
222 | 248 | assert_error(id_, expected, given=response.body) |
|
223 | 249 | fixture.destroy_repo(repo_name) |
|
224 | 250 | |
|
251 | def test_api_create_repo_by_non_admin_no_parent_group_perms(self, backend): | |
|
252 | repo_group_name = 'no-access' | |
|
253 | fixture.create_repo_group(repo_group_name) | |
|
254 | repo_name = 'no-access/api-repo' | |
|
255 | ||
|
256 | id_, params = build_data( | |
|
257 | self.apikey_regular, 'create_repo', | |
|
258 | repo_name=repo_name, | |
|
259 | repo_type=backend.alias) | |
|
260 | response = api_call(self.app, params) | |
|
261 | ||
|
262 | expected = {'repo_group': 'Repository group `{}` does not exist'.format( | |
|
263 | repo_group_name)} | |
|
264 | assert_error(id_, expected, given=response.body) | |
|
265 | fixture.destroy_repo_group(repo_group_name) | |
|
266 | fixture.destroy_repo(repo_name) | |
|
267 | ||
|
268 | def test_api_create_repo_non_admin_no_permission_to_create_to_root_level( | |
|
269 | self, backend, user_util): | |
|
270 | ||
|
271 | regular_user = user_util.create_user() | |
|
272 | regular_user_api_key = regular_user.api_key | |
|
273 | ||
|
274 | usr = UserModel().get_by_username(regular_user.username) | |
|
275 | usr.inherit_default_permissions = False | |
|
276 | Session().add(usr) | |
|
277 | ||
|
278 | repo_name = backend.new_repo_name() | |
|
279 | id_, params = build_data( | |
|
280 | regular_user_api_key, 'create_repo', | |
|
281 | repo_name=repo_name, | |
|
282 | repo_type=backend.alias) | |
|
283 | response = api_call(self.app, params) | |
|
284 | expected = { | |
|
285 | "repo_name": "You do not have the permission to " | |
|
286 | "store repositories in the root location."} | |
|
287 | assert_error(id_, expected, given=response.body) | |
|
288 | ||
|
225 | 289 | def test_api_create_repo_exists(self, backend): |
|
226 | 290 | repo_name = backend.repo_name |
|
227 | 291 | id_, params = build_data( |
@@ -230,7 +294,9 b' class TestCreateRepo(object):' | |||
|
230 | 294 | owner=TEST_USER_ADMIN_LOGIN, |
|
231 | 295 | repo_type=backend.alias,) |
|
232 | 296 | response = api_call(self.app, params) |
|
233 | expected = "repo `%s` already exist" % (repo_name,) | |
|
297 | expected = { | |
|
298 | 'unique_repo_name': 'Repository with name `{}` already exists'.format( | |
|
299 | repo_name)} | |
|
234 | 300 | assert_error(id_, expected, given=response.body) |
|
235 | 301 | |
|
236 | 302 | @mock.patch.object(RepoModel, 'create', crash) |
@@ -245,26 +311,40 b' class TestCreateRepo(object):' | |||
|
245 | 311 | expected = 'failed to create repository `%s`' % (repo_name,) |
|
246 | 312 | assert_error(id_, expected, given=response.body) |
|
247 | 313 | |
|
248 | def test_create_repo_with_extra_slashes_in_name(self, backend, user_util): | |
|
249 | existing_repo_group = user_util.create_repo_group() | |
|
250 | dirty_repo_name = '//{}/repo_name//'.format( | |
|
251 | existing_repo_group.group_name) | |
|
252 | cleaned_repo_name = '{}/repo_name'.format( | |
|
253 | existing_repo_group.group_name) | |
|
314 | @pytest.mark.parametrize('parent_group, dirty_name, expected_name', [ | |
|
315 | (None, 'foo bar x', 'foo-bar-x'), | |
|
316 | ('foo', '/foo//bar x', 'foo/bar-x'), | |
|
317 | ('foo-bar', 'foo-bar //bar x', 'foo-bar/bar-x'), | |
|
318 | ]) | |
|
319 | def test_create_repo_with_extra_slashes_in_name( | |
|
320 | self, backend, parent_group, dirty_name, expected_name): | |
|
321 | ||
|
322 | if parent_group: | |
|
323 | gr = fixture.create_repo_group(parent_group) | |
|
324 | assert gr.group_name == parent_group | |
|
254 | 325 | |
|
255 | 326 | id_, params = build_data( |
|
256 | 327 | self.apikey, 'create_repo', |
|
257 |
repo_name=dirty_ |
|
|
328 | repo_name=dirty_name, | |
|
258 | 329 | repo_type=backend.alias, |
|
259 | 330 | owner=TEST_USER_ADMIN_LOGIN,) |
|
260 | 331 | response = api_call(self.app, params) |
|
261 | repo = RepoModel().get_by_repo_name(cleaned_repo_name) | |
|
332 | expected ={ | |
|
333 | "msg": "Created new repository `{}`".format(expected_name), | |
|
334 | "task": None, | |
|
335 | "success": True | |
|
336 | } | |
|
337 | assert_ok(id_, expected, response.body) | |
|
338 | ||
|
339 | repo = RepoModel().get_by_repo_name(expected_name) | |
|
262 | 340 | assert repo is not None |
|
263 | 341 | |
|
264 | 342 | expected = { |
|
265 |
'msg': 'Created new repository `%s`' % ( |
|
|
343 | 'msg': 'Created new repository `%s`' % (expected_name,), | |
|
266 | 344 | 'success': True, |
|
267 | 345 | 'task': None, |
|
268 | 346 | } |
|
269 | 347 | assert_ok(id_, expected, given=response.body) |
|
270 |
fixture.destroy_repo( |
|
|
348 | fixture.destroy_repo(expected_name) | |
|
349 | if parent_group: | |
|
350 | fixture.destroy_repo_group(parent_group) |
@@ -54,56 +54,11 b' class TestCreateRepoGroup(object):' | |||
|
54 | 54 | 'repo_group': repo_group.get_api_data() |
|
55 | 55 | } |
|
56 | 56 | expected = ret |
|
57 | try: | |
|
57 | 58 | assert_ok(id_, expected, given=response.body) |
|
59 | finally: | |
|
58 | 60 | fixture.destroy_repo_group(repo_group_name) |
|
59 | 61 | |
|
60 | def test_api_create_repo_group_regular_user(self): | |
|
61 | repo_group_name = 'api-repo-group' | |
|
62 | ||
|
63 | usr = UserModel().get_by_username(self.TEST_USER_LOGIN) | |
|
64 | usr.inherit_default_permissions = False | |
|
65 | Session().add(usr) | |
|
66 | UserModel().grant_perm( | |
|
67 | self.TEST_USER_LOGIN, 'hg.repogroup.create.true') | |
|
68 | Session().commit() | |
|
69 | ||
|
70 | repo_group = RepoGroupModel.cls.get_by_group_name(repo_group_name) | |
|
71 | assert repo_group is None | |
|
72 | ||
|
73 | id_, params = build_data( | |
|
74 | self.apikey_regular, 'create_repo_group', | |
|
75 | group_name=repo_group_name, | |
|
76 | owner=TEST_USER_ADMIN_LOGIN,) | |
|
77 | response = api_call(self.app, params) | |
|
78 | ||
|
79 | repo_group = RepoGroupModel.cls.get_by_group_name(repo_group_name) | |
|
80 | assert repo_group is not None | |
|
81 | ret = { | |
|
82 | 'msg': 'Created new repo group `%s`' % (repo_group_name,), | |
|
83 | 'repo_group': repo_group.get_api_data() | |
|
84 | } | |
|
85 | expected = ret | |
|
86 | assert_ok(id_, expected, given=response.body) | |
|
87 | fixture.destroy_repo_group(repo_group_name) | |
|
88 | UserModel().revoke_perm( | |
|
89 | self.TEST_USER_LOGIN, 'hg.repogroup.create.true') | |
|
90 | usr = UserModel().get_by_username(self.TEST_USER_LOGIN) | |
|
91 | usr.inherit_default_permissions = True | |
|
92 | Session().add(usr) | |
|
93 | Session().commit() | |
|
94 | ||
|
95 | def test_api_create_repo_group_regular_user_no_permission(self): | |
|
96 | repo_group_name = 'api-repo-group' | |
|
97 | ||
|
98 | id_, params = build_data( | |
|
99 | self.apikey_regular, 'create_repo_group', | |
|
100 | group_name=repo_group_name, | |
|
101 | owner=TEST_USER_ADMIN_LOGIN,) | |
|
102 | response = api_call(self.app, params) | |
|
103 | ||
|
104 | expected = "Access was denied to this resource." | |
|
105 | assert_error(id_, expected, given=response.body) | |
|
106 | ||
|
107 | 62 | def test_api_create_repo_group_in_another_group(self): |
|
108 | 63 | repo_group_name = 'api-repo-group' |
|
109 | 64 | |
@@ -127,7 +82,9 b' class TestCreateRepoGroup(object):' | |||
|
127 | 82 | 'repo_group': repo_group.get_api_data() |
|
128 | 83 | } |
|
129 | 84 | expected = ret |
|
85 | try: | |
|
130 | 86 | assert_ok(id_, expected, given=response.body) |
|
87 | finally: | |
|
131 | 88 | fixture.destroy_repo_group(full_repo_group_name) |
|
132 | 89 | fixture.destroy_repo_group(repo_group_name) |
|
133 | 90 | |
@@ -144,7 +101,10 b' class TestCreateRepoGroup(object):' | |||
|
144 | 101 | owner=TEST_USER_ADMIN_LOGIN, |
|
145 | 102 | copy_permissions=True) |
|
146 | 103 | response = api_call(self.app, params) |
|
147 | expected = 'repository group `%s` does not exist' % (repo_group_name,) | |
|
104 | expected = { | |
|
105 | 'repo_group': | |
|
106 | 'Parent repository group `{}` does not exist'.format( | |
|
107 | repo_group_name)} | |
|
148 | 108 | assert_error(id_, expected, given=response.body) |
|
149 | 109 | |
|
150 | 110 | def test_api_create_repo_group_that_exists(self): |
@@ -159,10 +119,140 b' class TestCreateRepoGroup(object):' | |||
|
159 | 119 | group_name=repo_group_name, |
|
160 | 120 | owner=TEST_USER_ADMIN_LOGIN,) |
|
161 | 121 | response = api_call(self.app, params) |
|
162 | expected = 'repo group `%s` already exist' % (repo_group_name,) | |
|
122 | expected = { | |
|
123 | 'unique_repo_group_name': | |
|
124 | 'Repository group with name `{}` already exists'.format( | |
|
125 | repo_group_name)} | |
|
126 | try: | |
|
163 | 127 | assert_error(id_, expected, given=response.body) |
|
128 | finally: | |
|
164 | 129 | fixture.destroy_repo_group(repo_group_name) |
|
165 | 130 | |
|
131 | def test_api_create_repo_group_regular_user_wit_root_location_perms( | |
|
132 | self, user_util): | |
|
133 | regular_user = user_util.create_user() | |
|
134 | regular_user_api_key = regular_user.api_key | |
|
135 | ||
|
136 | repo_group_name = 'api-repo-group-by-regular-user' | |
|
137 | ||
|
138 | usr = UserModel().get_by_username(regular_user.username) | |
|
139 | usr.inherit_default_permissions = False | |
|
140 | Session().add(usr) | |
|
141 | ||
|
142 | UserModel().grant_perm( | |
|
143 | regular_user.username, 'hg.repogroup.create.true') | |
|
144 | Session().commit() | |
|
145 | ||
|
146 | repo_group = RepoGroupModel.cls.get_by_group_name(repo_group_name) | |
|
147 | assert repo_group is None | |
|
148 | ||
|
149 | id_, params = build_data( | |
|
150 | regular_user_api_key, 'create_repo_group', | |
|
151 | group_name=repo_group_name) | |
|
152 | response = api_call(self.app, params) | |
|
153 | ||
|
154 | repo_group = RepoGroupModel.cls.get_by_group_name(repo_group_name) | |
|
155 | assert repo_group is not None | |
|
156 | expected = { | |
|
157 | 'msg': 'Created new repo group `%s`' % (repo_group_name,), | |
|
158 | 'repo_group': repo_group.get_api_data() | |
|
159 | } | |
|
160 | try: | |
|
161 | assert_ok(id_, expected, given=response.body) | |
|
162 | finally: | |
|
163 | fixture.destroy_repo_group(repo_group_name) | |
|
164 | ||
|
165 | def test_api_create_repo_group_regular_user_with_admin_perms_to_parent( | |
|
166 | self, user_util): | |
|
167 | ||
|
168 | repo_group_name = 'api-repo-group-parent' | |
|
169 | ||
|
170 | repo_group = RepoGroupModel.cls.get_by_group_name(repo_group_name) | |
|
171 | assert repo_group is None | |
|
172 | # create the parent | |
|
173 | fixture.create_repo_group(repo_group_name) | |
|
174 | ||
|
175 | # user perms | |
|
176 | regular_user = user_util.create_user() | |
|
177 | regular_user_api_key = regular_user.api_key | |
|
178 | ||
|
179 | usr = UserModel().get_by_username(regular_user.username) | |
|
180 | usr.inherit_default_permissions = False | |
|
181 | Session().add(usr) | |
|
182 | ||
|
183 | RepoGroupModel().grant_user_permission( | |
|
184 | repo_group_name, regular_user.username, 'group.admin') | |
|
185 | Session().commit() | |
|
186 | ||
|
187 | full_repo_group_name = repo_group_name + '/' + repo_group_name | |
|
188 | id_, params = build_data( | |
|
189 | regular_user_api_key, 'create_repo_group', | |
|
190 | group_name=full_repo_group_name) | |
|
191 | response = api_call(self.app, params) | |
|
192 | ||
|
193 | repo_group = RepoGroupModel.cls.get_by_group_name(full_repo_group_name) | |
|
194 | assert repo_group is not None | |
|
195 | expected = { | |
|
196 | 'msg': 'Created new repo group `{}`'.format(full_repo_group_name), | |
|
197 | 'repo_group': repo_group.get_api_data() | |
|
198 | } | |
|
199 | try: | |
|
200 | assert_ok(id_, expected, given=response.body) | |
|
201 | finally: | |
|
202 | fixture.destroy_repo_group(full_repo_group_name) | |
|
203 | fixture.destroy_repo_group(repo_group_name) | |
|
204 | ||
|
205 | def test_api_create_repo_group_regular_user_no_permission_to_create_to_root_level(self): | |
|
206 | repo_group_name = 'api-repo-group' | |
|
207 | ||
|
208 | id_, params = build_data( | |
|
209 | self.apikey_regular, 'create_repo_group', | |
|
210 | group_name=repo_group_name) | |
|
211 | response = api_call(self.app, params) | |
|
212 | ||
|
213 | expected = { | |
|
214 | 'repo_group': | |
|
215 | u'You do not have the permission to store ' | |
|
216 | u'repository groups in the root location.'} | |
|
217 | assert_error(id_, expected, given=response.body) | |
|
218 | ||
|
219 | def test_api_create_repo_group_regular_user_no_parent_group_perms(self): | |
|
220 | repo_group_name = 'api-repo-group-regular-user' | |
|
221 | ||
|
222 | repo_group = RepoGroupModel.cls.get_by_group_name(repo_group_name) | |
|
223 | assert repo_group is None | |
|
224 | # create the parent | |
|
225 | fixture.create_repo_group(repo_group_name) | |
|
226 | ||
|
227 | full_repo_group_name = repo_group_name+'/'+repo_group_name | |
|
228 | ||
|
229 | id_, params = build_data( | |
|
230 | self.apikey_regular, 'create_repo_group', | |
|
231 | group_name=full_repo_group_name) | |
|
232 | response = api_call(self.app, params) | |
|
233 | ||
|
234 | expected = { | |
|
235 | 'repo_group': | |
|
236 | 'Parent repository group `{}` does not exist'.format( | |
|
237 | repo_group_name)} | |
|
238 | try: | |
|
239 | assert_error(id_, expected, given=response.body) | |
|
240 | finally: | |
|
241 | fixture.destroy_repo_group(repo_group_name) | |
|
242 | ||
|
243 | def test_api_create_repo_group_regular_user_no_permission_to_specify_owner( | |
|
244 | self): | |
|
245 | repo_group_name = 'api-repo-group' | |
|
246 | ||
|
247 | id_, params = build_data( | |
|
248 | self.apikey_regular, 'create_repo_group', | |
|
249 | group_name=repo_group_name, | |
|
250 | owner=TEST_USER_ADMIN_LOGIN,) | |
|
251 | response = api_call(self.app, params) | |
|
252 | ||
|
253 | expected = "Only RhodeCode super-admin can specify `owner` param" | |
|
254 | assert_error(id_, expected, given=response.body) | |
|
255 | ||
|
166 | 256 | @mock.patch.object(RepoGroupModel, 'create', crash) |
|
167 | 257 | def test_api_create_repo_group_exception_occurred(self): |
|
168 | 258 | repo_group_name = 'api-repo-group' |
@@ -28,6 +28,7 b' from rhodecode.tests import (' | |||
|
28 | 28 | from rhodecode.api.tests.utils import ( |
|
29 | 29 | build_data, api_call, assert_ok, assert_error, jsonify, crash) |
|
30 | 30 | from rhodecode.tests.fixture import Fixture |
|
31 | from rhodecode.model.db import RepoGroup | |
|
31 | 32 | |
|
32 | 33 | |
|
33 | 34 | # TODO: mikhail: remove fixture from here |
@@ -145,6 +146,36 b' class TestCreateUser(object):' | |||
|
145 | 146 | finally: |
|
146 | 147 | fixture.destroy_user(usr.user_id) |
|
147 | 148 | |
|
149 | def test_api_create_user_with_personal_repo_group(self): | |
|
150 | username = 'test_new_api_user_personal_group' | |
|
151 | email = username + "@foo.com" | |
|
152 | ||
|
153 | id_, params = build_data( | |
|
154 | self.apikey, 'create_user', | |
|
155 | username=username, | |
|
156 | email=email, extern_name='rhodecode', | |
|
157 | create_personal_repo_group=True) | |
|
158 | response = api_call(self.app, params) | |
|
159 | ||
|
160 | usr = UserModel().get_by_username(username) | |
|
161 | ret = { | |
|
162 | 'msg': 'created new user `%s`' % (username,), | |
|
163 | 'user': jsonify(usr.get_api_data(include_secrets=True)), | |
|
164 | } | |
|
165 | ||
|
166 | personal_group = RepoGroup.get_by_group_name(username) | |
|
167 | assert personal_group | |
|
168 | assert personal_group.personal == True | |
|
169 | assert personal_group.user.username == username | |
|
170 | ||
|
171 | try: | |
|
172 | expected = ret | |
|
173 | assert_ok(id_, expected, given=response.body) | |
|
174 | finally: | |
|
175 | fixture.destroy_repo_group(username) | |
|
176 | fixture.destroy_user(usr.user_id) | |
|
177 | ||
|
178 | ||
|
148 | 179 | @mock.patch.object(UserModel, 'create_or_update', crash) |
|
149 | 180 | def test_api_create_user_when_exception_happened(self): |
|
150 | 181 |
@@ -24,6 +24,7 b' import pytest' | |||
|
24 | 24 | |
|
25 | 25 | from rhodecode.model.meta import Session |
|
26 | 26 | from rhodecode.model.repo import RepoModel |
|
27 | from rhodecode.model.repo_group import RepoGroupModel | |
|
27 | 28 | from rhodecode.model.user import UserModel |
|
28 | 29 | from rhodecode.tests import TEST_USER_ADMIN_LOGIN |
|
29 | 30 | from rhodecode.api.tests.utils import ( |
@@ -99,11 +100,35 b' class TestApiForkRepo(object):' | |||
|
99 | 100 | finally: |
|
100 | 101 | fixture.destroy_repo(fork_name) |
|
101 | 102 | |
|
103 | def test_api_fork_repo_non_admin_into_group_no_permission(self, backend, user_util): | |
|
104 | source_name = backend['minimal'].repo_name | |
|
105 | repo_group = user_util.create_repo_group() | |
|
106 | repo_group_name = repo_group.group_name | |
|
107 | fork_name = '%s/api-repo-fork' % repo_group_name | |
|
108 | ||
|
109 | id_, params = build_data( | |
|
110 | self.apikey_regular, 'fork_repo', | |
|
111 | repoid=source_name, | |
|
112 | fork_name=fork_name) | |
|
113 | response = api_call(self.app, params) | |
|
114 | ||
|
115 | expected = { | |
|
116 | 'repo_group': 'Repository group `{}` does not exist'.format( | |
|
117 | repo_group_name)} | |
|
118 | try: | |
|
119 | assert_error(id_, expected, given=response.body) | |
|
120 | finally: | |
|
121 | fixture.destroy_repo(fork_name) | |
|
122 | ||
|
102 | 123 | def test_api_fork_repo_non_admin_into_group(self, backend, user_util): |
|
103 | 124 | source_name = backend['minimal'].repo_name |
|
104 | 125 | repo_group = user_util.create_repo_group() |
|
105 | 126 | fork_name = '%s/api-repo-fork' % repo_group.group_name |
|
106 | 127 | |
|
128 | RepoGroupModel().grant_user_permission( | |
|
129 | repo_group, self.TEST_USER_LOGIN, 'group.admin') | |
|
130 | Session().commit() | |
|
131 | ||
|
107 | 132 | id_, params = build_data( |
|
108 | 133 | self.apikey_regular, 'fork_repo', |
|
109 | 134 | repoid=source_name, |
@@ -129,10 +154,11 b' class TestApiForkRepo(object):' | |||
|
129 | 154 | fork_name=fork_name, |
|
130 | 155 | owner=TEST_USER_ADMIN_LOGIN) |
|
131 | 156 | response = api_call(self.app, params) |
|
132 | expected = 'Only RhodeCode admin can specify `owner` param' | |
|
157 | expected = 'Only RhodeCode super-admin can specify `owner` param' | |
|
133 | 158 | assert_error(id_, expected, given=response.body) |
|
134 | 159 | |
|
135 |
def test_api_fork_repo_non_admin_no_permission_ |
|
|
160 | def test_api_fork_repo_non_admin_no_permission_of_source_repo( | |
|
161 | self, backend): | |
|
136 | 162 | source_name = backend['minimal'].repo_name |
|
137 | 163 | RepoModel().grant_user_permission(repo=source_name, |
|
138 | 164 | user=self.TEST_USER_LOGIN, |
@@ -147,19 +173,44 b' class TestApiForkRepo(object):' | |||
|
147 | 173 | assert_error(id_, expected, given=response.body) |
|
148 | 174 | |
|
149 | 175 | def test_api_fork_repo_non_admin_no_permission_to_fork_to_root_level( |
|
150 | self, backend): | |
|
176 | self, backend, user_util): | |
|
177 | ||
|
178 | regular_user = user_util.create_user() | |
|
179 | regular_user_api_key = regular_user.api_key | |
|
180 | usr = UserModel().get_by_username(regular_user.username) | |
|
181 | usr.inherit_default_permissions = False | |
|
182 | Session().add(usr) | |
|
183 | UserModel().grant_perm(regular_user.username, 'hg.fork.repository') | |
|
184 | ||
|
151 | 185 | source_name = backend['minimal'].repo_name |
|
186 | fork_name = backend.new_repo_name() | |
|
187 | id_, params = build_data( | |
|
188 | regular_user_api_key, 'fork_repo', | |
|
189 | repoid=source_name, | |
|
190 | fork_name=fork_name) | |
|
191 | response = api_call(self.app, params) | |
|
192 | expected = { | |
|
193 | "repo_name": "You do not have the permission to " | |
|
194 | "store repositories in the root location."} | |
|
195 | assert_error(id_, expected, given=response.body) | |
|
152 | 196 | |
|
153 | usr = UserModel().get_by_username(self.TEST_USER_LOGIN) | |
|
197 | def test_api_fork_repo_non_admin_no_permission_to_fork( | |
|
198 | self, backend, user_util): | |
|
199 | ||
|
200 | regular_user = user_util.create_user() | |
|
201 | regular_user_api_key = regular_user.api_key | |
|
202 | usr = UserModel().get_by_username(regular_user.username) | |
|
154 | 203 | usr.inherit_default_permissions = False |
|
155 | 204 | Session().add(usr) |
|
156 | 205 | |
|
206 | source_name = backend['minimal'].repo_name | |
|
157 | 207 | fork_name = backend.new_repo_name() |
|
158 | 208 | id_, params = build_data( |
|
159 |
|
|
|
209 | regular_user_api_key, 'fork_repo', | |
|
160 | 210 | repoid=source_name, |
|
161 | 211 | fork_name=fork_name) |
|
162 | 212 | response = api_call(self.app, params) |
|
213 | ||
|
163 | 214 | expected = "Access was denied to this resource." |
|
164 | 215 | assert_error(id_, expected, given=response.body) |
|
165 | 216 | |
@@ -189,7 +240,9 b' class TestApiForkRepo(object):' | |||
|
189 | 240 | response = api_call(self.app, params) |
|
190 | 241 | |
|
191 | 242 | try: |
|
192 | expected = "fork `%s` already exist" % (fork_name,) | |
|
243 | expected = { | |
|
244 | 'unique_repo_name': 'Repository with name `{}` already exists'.format( | |
|
245 | fork_name)} | |
|
193 | 246 | assert_error(id_, expected, given=response.body) |
|
194 | 247 | finally: |
|
195 | 248 | fixture.destroy_repo(fork_repo.repo_name) |
@@ -205,7 +258,9 b' class TestApiForkRepo(object):' | |||
|
205 | 258 | owner=TEST_USER_ADMIN_LOGIN) |
|
206 | 259 | response = api_call(self.app, params) |
|
207 | 260 | |
|
208 | expected = "repo `%s` already exist" % (fork_name,) | |
|
261 | expected = { | |
|
262 | 'unique_repo_name': 'Repository with name `{}` already exists'.format( | |
|
263 | fork_name)} | |
|
209 | 264 | assert_error(id_, expected, given=response.body) |
|
210 | 265 | |
|
211 | 266 | @mock.patch.object(RepoModel, 'create_fork', crash) |
@@ -34,6 +34,7 b' pytestmark = pytest.mark.backends("git",' | |||
|
34 | 34 | class TestGetPullRequest(object): |
|
35 | 35 | |
|
36 | 36 | def test_api_get_pull_request(self, pr_util): |
|
37 | from rhodecode.model.pull_request import PullRequestModel | |
|
37 | 38 | pull_request = pr_util.create_pull_request(mergeable=True) |
|
38 | 39 | id_, params = build_data( |
|
39 | 40 | self.apikey, 'get_pull_request', |
@@ -57,6 +58,8 b' class TestGetPullRequest(object):' | |||
|
57 | 58 | target_url = unicode( |
|
58 | 59 | pull_request.target_repo.clone_url() |
|
59 | 60 | .with_netloc('test.example.com:80')) |
|
61 | shadow_url = unicode( | |
|
62 | PullRequestModel().get_shadow_clone_url(pull_request)) | |
|
60 | 63 | expected = { |
|
61 | 64 | 'pull_request_id': pull_request.pull_request_id, |
|
62 | 65 | 'url': pr_url, |
@@ -89,15 +92,24 b' class TestGetPullRequest(object):' | |||
|
89 | 92 | 'commit_id': pull_request.target_ref_parts.commit_id, |
|
90 | 93 | }, |
|
91 | 94 | }, |
|
95 | 'merge': { | |
|
96 | 'clone_url': shadow_url, | |
|
97 | 'reference': { | |
|
98 | 'name': pull_request.shadow_merge_ref.name, | |
|
99 | 'type': pull_request.shadow_merge_ref.type, | |
|
100 | 'commit_id': pull_request.shadow_merge_ref.commit_id, | |
|
101 | }, | |
|
102 | }, | |
|
92 | 103 | 'author': pull_request.author.get_api_data(include_secrets=False, |
|
93 | 104 | details='basic'), |
|
94 | 105 | 'reviewers': [ |
|
95 | 106 | { |
|
96 | 107 | 'user': reviewer.get_api_data(include_secrets=False, |
|
97 | 108 | details='basic'), |
|
109 | 'reasons': reasons, | |
|
98 | 110 | 'review_status': st[0][1].status if st else 'not_reviewed', |
|
99 | 111 | } |
|
100 | for reviewer, st in pull_request.reviewers_statuses() | |
|
112 | for reviewer, reasons, st in pull_request.reviewers_statuses() | |
|
101 | 113 | ] |
|
102 | 114 | } |
|
103 | 115 | assert_ok(id_, expected, response.body) |
@@ -45,8 +45,13 b' class TestGetServerInfo(object):' | |||
|
45 | 45 | expected['uptime'] = resp['result']['uptime'] |
|
46 | 46 | expected['load'] = resp['result']['load'] |
|
47 | 47 | expected['cpu'] = resp['result']['cpu'] |
|
48 |
expected[' |
|
|
49 | expected['server_ip'] = '127.0.0.1:80' | |
|
48 | expected['storage'] = resp['result']['storage'] | |
|
49 | expected['storage_temp'] = resp['result']['storage_temp'] | |
|
50 | expected['storage_inodes'] = resp['result']['storage_inodes'] | |
|
51 | expected['server'] = resp['result']['server'] | |
|
52 | ||
|
53 | expected['index_storage'] = resp['result']['index_storage'] | |
|
54 | expected['storage'] = resp['result']['storage'] | |
|
50 | 55 | |
|
51 | 56 | assert_ok(id_, expected, given=response.body) |
|
52 | 57 | |
@@ -59,7 +64,21 b' class TestGetServerInfo(object):' | |||
|
59 | 64 | expected['uptime'] = resp['result']['uptime'] |
|
60 | 65 | expected['load'] = resp['result']['load'] |
|
61 | 66 | expected['cpu'] = resp['result']['cpu'] |
|
62 |
expected[' |
|
|
63 | expected['server_ip'] = '127.0.0.1:80' | |
|
67 | expected['storage'] = resp['result']['storage'] | |
|
68 | expected['storage_temp'] = resp['result']['storage_temp'] | |
|
69 | expected['storage_inodes'] = resp['result']['storage_inodes'] | |
|
70 | expected['server'] = resp['result']['server'] | |
|
71 | ||
|
72 | expected['index_storage'] = resp['result']['index_storage'] | |
|
73 | expected['storage'] = resp['result']['storage'] | |
|
64 | 74 | |
|
65 | 75 | assert_ok(id_, expected, given=response.body) |
|
76 | ||
|
77 | def test_api_get_server_info_data_for_search_index_build(self): | |
|
78 | id_, params = build_data(self.apikey, 'get_server_info') | |
|
79 | response = api_call(self.app, params) | |
|
80 | resp = response.json | |
|
81 | ||
|
82 | # required by indexer | |
|
83 | assert resp['result']['index_storage'] | |
|
84 | assert resp['result']['storage'] |
@@ -32,42 +32,37 b' from rhodecode.api.tests.utils import (' | |||
|
32 | 32 | class TestMergePullRequest(object): |
|
33 | 33 | @pytest.mark.backends("git", "hg") |
|
34 | 34 | def test_api_merge_pull_request(self, pr_util, no_notifications): |
|
35 | pull_request = pr_util.create_pull_request() | |
|
36 | pull_request_2 = PullRequestModel().create( | |
|
37 | created_by=pull_request.author, | |
|
38 | source_repo=pull_request.source_repo, | |
|
39 | source_ref=pull_request.source_ref, | |
|
40 | target_repo=pull_request.target_repo, | |
|
41 | target_ref=pull_request.target_ref, | |
|
42 | revisions=pull_request.revisions, | |
|
43 | reviewers=(), | |
|
44 | title=pull_request.title, | |
|
45 | description=pull_request.description, | |
|
46 | ) | |
|
35 | pull_request = pr_util.create_pull_request(mergeable=True) | |
|
47 | 36 | author = pull_request.user_id |
|
48 |
repo = pull_request |
|
|
49 |
pull_request_ |
|
|
50 |
pull_request_ |
|
|
51 | Session().commit() | |
|
37 | repo = pull_request.target_repo.repo_id | |
|
38 | pull_request_id = pull_request.pull_request_id | |
|
39 | pull_request_repo = pull_request.target_repo.repo_name | |
|
52 | 40 | |
|
53 | 41 | id_, params = build_data( |
|
54 | 42 | self.apikey, 'merge_pull_request', |
|
55 |
repoid=pull_request_ |
|
|
56 |
pullrequestid=pull_request_ |
|
|
43 | repoid=pull_request_repo, | |
|
44 | pullrequestid=pull_request_id) | |
|
45 | ||
|
57 | 46 | response = api_call(self.app, params) |
|
58 | 47 | |
|
48 | # The above api call detaches the pull request DB object from the | |
|
49 | # session because of an unconditional transaction rollback in our | |
|
50 | # middleware. Therefore we need to add it back here if we want to use | |
|
51 | # it. | |
|
52 | Session().add(pull_request) | |
|
53 | ||
|
59 | 54 | expected = { |
|
60 | 55 | 'executed': True, |
|
61 | 56 | 'failure_reason': 0, |
|
62 | 'possible': True | |
|
57 | 'possible': True, | |
|
58 | 'merge_commit_id': pull_request.shadow_merge_ref.commit_id, | |
|
59 | 'merge_ref': pull_request.shadow_merge_ref._asdict() | |
|
63 | 60 | } |
|
64 | 61 | |
|
65 | 62 | response_json = response.json['result'] |
|
66 | assert response_json['merge_commit_id'] | |
|
67 | response_json.pop('merge_commit_id') | |
|
68 | 63 | assert response_json == expected |
|
69 | 64 | |
|
70 |
action = 'user_merged_pull_request:%d' % (pull_request_ |
|
|
65 | action = 'user_merged_pull_request:%d' % (pull_request_id, ) | |
|
71 | 66 | journal = UserLog.query()\ |
|
72 | 67 | .filter(UserLog.user_id == author)\ |
|
73 | 68 | .filter(UserLog.repository_id == repo)\ |
@@ -77,11 +72,11 b' class TestMergePullRequest(object):' | |||
|
77 | 72 | |
|
78 | 73 | id_, params = build_data( |
|
79 | 74 | self.apikey, 'merge_pull_request', |
|
80 |
repoid=pull_request_ |
|
|
75 | repoid=pull_request_repo, pullrequestid=pull_request_id) | |
|
81 | 76 | response = api_call(self.app, params) |
|
82 | 77 | |
|
83 | 78 | expected = 'pull request `%s` merge failed, pull request is closed' % ( |
|
84 |
pull_request_ |
|
|
79 | pull_request_id) | |
|
85 | 80 | assert_error(id_, expected, given=response.body) |
|
86 | 81 | |
|
87 | 82 | @pytest.mark.backends("git", "hg") |
@@ -32,35 +32,60 b' fixture = Fixture()' | |||
|
32 | 32 | |
|
33 | 33 | UPDATE_REPO_NAME = 'api_update_me' |
|
34 | 34 | |
|
35 | class SAME_AS_UPDATES(object): """ Constant used for tests below """ | |
|
35 | ||
|
36 | class SAME_AS_UPDATES(object): | |
|
37 | """ Constant used for tests below """ | |
|
38 | ||
|
36 | 39 | |
|
37 | 40 | @pytest.mark.usefixtures("testuser_api", "app") |
|
38 | 41 | class TestApiUpdateRepo(object): |
|
39 | 42 | |
|
40 | 43 | @pytest.mark.parametrize("updates, expected", [ |
|
41 |
({'owner': TEST_USER_REGULAR_LOGIN}, |
|
|
42 | ({'description': 'new description'}, SAME_AS_UPDATES), | |
|
43 | ({'clone_uri': 'http://foo.com/repo'}, SAME_AS_UPDATES), | |
|
44 | ({'clone_uri': None}, {'clone_uri': ''}), | |
|
45 | ({'clone_uri': ''}, {'clone_uri': ''}), | |
|
46 | ({'landing_rev': 'branch:master'}, {'landing_rev': ['branch','master']}), | |
|
47 | ({'enable_statistics': True}, SAME_AS_UPDATES), | |
|
48 |
|
|
|
49 | ({'enable_downloads': True}, SAME_AS_UPDATES), | |
|
50 |
({'n |
|
|
44 | ({'owner': TEST_USER_REGULAR_LOGIN}, | |
|
45 | SAME_AS_UPDATES), | |
|
46 | ||
|
47 | ({'description': 'new description'}, | |
|
48 | SAME_AS_UPDATES), | |
|
49 | ||
|
50 | ({'clone_uri': 'http://foo.com/repo'}, | |
|
51 | SAME_AS_UPDATES), | |
|
52 | ||
|
53 | ({'clone_uri': None}, | |
|
54 | {'clone_uri': ''}), | |
|
55 | ||
|
56 | ({'clone_uri': ''}, | |
|
57 | {'clone_uri': ''}), | |
|
58 | ||
|
59 | ({'landing_rev': 'rev:tip'}, | |
|
60 | {'landing_rev': ['rev', 'tip']}), | |
|
61 | ||
|
62 | ({'enable_statistics': True}, | |
|
63 | SAME_AS_UPDATES), | |
|
64 | ||
|
65 | ({'enable_locking': True}, | |
|
66 | SAME_AS_UPDATES), | |
|
67 | ||
|
68 | ({'enable_downloads': True}, | |
|
69 | SAME_AS_UPDATES), | |
|
70 | ||
|
71 | ({'repo_name': 'new_repo_name'}, | |
|
72 | { | |
|
51 | 73 | 'repo_name': 'new_repo_name', |
|
52 |
'url': 'http://test.example.com:80/new_repo_name' |
|
|
74 | 'url': 'http://test.example.com:80/new_repo_name' | |
|
53 | 75 | }), |
|
54 | ({'group': 'test_group_for_update'}, { | |
|
55 |
|
|
|
56 | 'url': 'http://test.example.com:80/test_group_for_update/%s' % UPDATE_REPO_NAME | |
|
76 | ||
|
77 | ({'repo_name': 'test_group_for_update/{}'.format(UPDATE_REPO_NAME), | |
|
78 | '_group': 'test_group_for_update'}, | |
|
79 | { | |
|
80 | 'repo_name': 'test_group_for_update/{}'.format(UPDATE_REPO_NAME), | |
|
81 | 'url': 'http://test.example.com:80/test_group_for_update/{}'.format(UPDATE_REPO_NAME) | |
|
57 | 82 | }), |
|
58 | 83 | ]) |
|
59 | 84 | def test_api_update_repo(self, updates, expected, backend): |
|
60 | 85 | repo_name = UPDATE_REPO_NAME |
|
61 | 86 | repo = fixture.create_repo(repo_name, repo_type=backend.alias) |
|
62 | if updates.get('group'): | |
|
63 | fixture.create_repo_group(updates['group']) | |
|
87 | if updates.get('_group'): | |
|
88 | fixture.create_repo_group(updates['_group']) | |
|
64 | 89 | |
|
65 | 90 | expected_api_data = repo.get_api_data(include_secrets=True) |
|
66 | 91 | if expected is SAME_AS_UPDATES: |
@@ -68,15 +93,12 b' class TestApiUpdateRepo(object):' | |||
|
68 | 93 | else: |
|
69 | 94 | expected_api_data.update(expected) |
|
70 | 95 | |
|
71 | ||
|
72 | 96 | id_, params = build_data( |
|
73 | 97 | self.apikey, 'update_repo', repoid=repo_name, **updates) |
|
74 | 98 | response = api_call(self.app, params) |
|
75 | 99 | |
|
76 | if updates.get('name'): | |
|
77 | repo_name = updates['name'] | |
|
78 | if updates.get('group'): | |
|
79 | repo_name = '/'.join([updates['group'], repo_name]) | |
|
100 | if updates.get('repo_name'): | |
|
101 | repo_name = updates['repo_name'] | |
|
80 | 102 | |
|
81 | 103 | try: |
|
82 | 104 | expected = { |
@@ -86,8 +108,8 b' class TestApiUpdateRepo(object):' | |||
|
86 | 108 | assert_ok(id_, expected, given=response.body) |
|
87 | 109 | finally: |
|
88 | 110 | fixture.destroy_repo(repo_name) |
|
89 | if updates.get('group'): | |
|
90 | fixture.destroy_repo_group(updates['group']) | |
|
111 | if updates.get('_group'): | |
|
112 | fixture.destroy_repo_group(updates['_group']) | |
|
91 | 113 | |
|
92 | 114 | def test_api_update_repo_fork_of_field(self, backend): |
|
93 | 115 | master_repo = backend.create_repo() |
@@ -118,19 +140,23 b' class TestApiUpdateRepo(object):' | |||
|
118 | 140 | id_, params = build_data( |
|
119 | 141 | self.apikey, 'update_repo', repoid=repo.repo_name, **updates) |
|
120 | 142 | response = api_call(self.app, params) |
|
121 | expected = 'repository `{}` does not exist'.format(master_repo_name) | |
|
143 | expected = { | |
|
144 | 'repo_fork_of': 'Fork with id `{}` does not exists'.format( | |
|
145 | master_repo_name)} | |
|
122 | 146 | assert_error(id_, expected, given=response.body) |
|
123 | 147 | |
|
124 | 148 | def test_api_update_repo_with_repo_group_not_existing(self): |
|
125 | 149 | repo_name = 'admin_owned' |
|
150 | fake_repo_group = 'test_group_for_update' | |
|
126 | 151 | fixture.create_repo(repo_name) |
|
127 | updates = {'group': 'test_group_for_update'} | |
|
152 | updates = {'repo_name': '{}/{}'.format(fake_repo_group, repo_name)} | |
|
128 | 153 | id_, params = build_data( |
|
129 | 154 | self.apikey, 'update_repo', repoid=repo_name, **updates) |
|
130 | 155 | response = api_call(self.app, params) |
|
131 | 156 | try: |
|
132 | expected = 'repository group `%s` does not exist' % ( | |
|
133 | updates['group'],) | |
|
157 | expected = { | |
|
158 | 'repo_group': 'Repository group `{}` does not exist'.format(fake_repo_group) | |
|
159 | } | |
|
134 | 160 | assert_error(id_, expected, given=response.body) |
|
135 | 161 | finally: |
|
136 | 162 | fixture.destroy_repo(repo_name) |
@@ -30,22 +30,30 b' from rhodecode.api.tests.utils import (' | |||
|
30 | 30 | |
|
31 | 31 | @pytest.mark.usefixtures("testuser_api", "app") |
|
32 | 32 | class TestApiUpdateRepoGroup(object): |
|
33 | ||
|
33 | 34 | def test_update_group_name(self, user_util): |
|
34 | 35 | new_group_name = 'new-group' |
|
35 | 36 | initial_name = self._update(user_util, group_name=new_group_name) |
|
36 | 37 | assert RepoGroupModel()._get_repo_group(initial_name) is None |
|
37 |
|
|
|
38 | new_group = RepoGroupModel()._get_repo_group(new_group_name) | |
|
39 | assert new_group is not None | |
|
40 | assert new_group.full_path == new_group_name | |
|
38 | 41 | |
|
39 | def test_update_parent(self, user_util): | |
|
42 | def test_update_group_name_change_parent(self, user_util): | |
|
43 | ||
|
40 | 44 | parent_group = user_util.create_repo_group() |
|
41 | initial_name = self._update(user_util, parent=parent_group.name) | |
|
45 | parent_group_name = parent_group.name | |
|
42 | 46 | |
|
43 |
expected_group_name = '{}/{}'.format(parent_group |
|
|
47 | expected_group_name = '{}/{}'.format(parent_group_name, 'new-group') | |
|
48 | initial_name = self._update(user_util, group_name=expected_group_name) | |
|
49 | ||
|
44 | 50 | repo_group = RepoGroupModel()._get_repo_group(expected_group_name) |
|
51 | ||
|
45 | 52 | assert repo_group is not None |
|
46 | 53 | assert repo_group.group_name == expected_group_name |
|
47 |
assert repo_group. |
|
|
54 | assert repo_group.full_path == expected_group_name | |
|
48 | 55 | assert RepoGroupModel()._get_repo_group(initial_name) is None |
|
56 | ||
|
49 | 57 | new_path = os.path.join( |
|
50 | 58 | RepoGroupModel().repos_path, *repo_group.full_path_splitted) |
|
51 | 59 | assert os.path.exists(new_path) |
@@ -67,15 +75,47 b' class TestApiUpdateRepoGroup(object):' | |||
|
67 | 75 | repo_group = RepoGroupModel()._get_repo_group(initial_name) |
|
68 | 76 | assert repo_group.user.username == owner |
|
69 | 77 | |
|
70 | def test_api_update_repo_group_by_regular_user_no_permission( | |
|
71 | self, backend): | |
|
72 |
|
|
|
73 | repo_name = repo.repo_name | |
|
78 | def test_update_group_name_conflict_with_existing(self, user_util): | |
|
79 | group_1 = user_util.create_repo_group() | |
|
80 | group_2 = user_util.create_repo_group() | |
|
81 | repo_group_name_1 = group_1.group_name | |
|
82 | repo_group_name_2 = group_2.group_name | |
|
74 | 83 | |
|
75 | 84 | id_, params = build_data( |
|
76 |
self.apikey |
|
|
85 | self.apikey, 'update_repo_group', repogroupid=repo_group_name_1, | |
|
86 | group_name=repo_group_name_2) | |
|
87 | response = api_call(self.app, params) | |
|
88 | expected = { | |
|
89 | 'unique_repo_group_name': | |
|
90 | 'Repository group with name `{}` already exists'.format( | |
|
91 | repo_group_name_2)} | |
|
92 | assert_error(id_, expected, given=response.body) | |
|
93 | ||
|
94 | def test_api_update_repo_group_by_regular_user_no_permission(self, user_util): | |
|
95 | temp_user = user_util.create_user() | |
|
96 | temp_user_api_key = temp_user.api_key | |
|
97 | parent_group = user_util.create_repo_group() | |
|
98 | repo_group_name = parent_group.group_name | |
|
99 | id_, params = build_data( | |
|
100 | temp_user_api_key, 'update_repo_group', repogroupid=repo_group_name) | |
|
77 | 101 | response = api_call(self.app, params) |
|
78 | expected = 'repository group `%s` does not exist' % (repo_name,) | |
|
102 | expected = 'repository group `%s` does not exist' % (repo_group_name,) | |
|
103 | assert_error(id_, expected, given=response.body) | |
|
104 | ||
|
105 | def test_api_update_repo_group_regular_user_no_root_write_permissions( | |
|
106 | self, user_util): | |
|
107 | temp_user = user_util.create_user() | |
|
108 | temp_user_api_key = temp_user.api_key | |
|
109 | parent_group = user_util.create_repo_group(owner=temp_user.username) | |
|
110 | repo_group_name = parent_group.group_name | |
|
111 | ||
|
112 | id_, params = build_data( | |
|
113 | temp_user_api_key, 'update_repo_group', repogroupid=repo_group_name, | |
|
114 | group_name='at-root-level') | |
|
115 | response = api_call(self.app, params) | |
|
116 | expected = { | |
|
117 | 'repo_group': 'You do not have the permission to store ' | |
|
118 | 'repository groups in the root location.'} | |
|
79 | 119 | assert_error(id_, expected, given=response.body) |
|
80 | 120 | |
|
81 | 121 | def _update(self, user_util, **kwargs): |
@@ -89,7 +129,10 b' class TestApiUpdateRepoGroup(object):' | |||
|
89 | 129 | self.apikey, 'update_repo_group', repogroupid=initial_name, |
|
90 | 130 | **kwargs) |
|
91 | 131 | response = api_call(self.app, params) |
|
92 | ret = { | |
|
132 | ||
|
133 | repo_group = RepoGroupModel.cls.get(repo_group.group_id) | |
|
134 | ||
|
135 | expected = { | |
|
93 | 136 | 'msg': 'updated repository group ID:{} {}'.format( |
|
94 | 137 | repo_group.group_id, repo_group.group_name), |
|
95 | 138 | 'repo_group': { |
@@ -103,5 +146,5 b' class TestApiUpdateRepoGroup(object):' | |||
|
103 | 146 | if repo_group.parent_group else None) |
|
104 | 147 | } |
|
105 | 148 | } |
|
106 |
assert_ok(id_, |
|
|
149 | assert_ok(id_, expected, given=response.body) | |
|
107 | 150 | return initial_name |
@@ -249,7 +249,7 b' class TestRepoAccess(object):' | |||
|
249 | 249 | fake_repo = Mock() |
|
250 | 250 | with self.repo_perm_patch as rmock: |
|
251 | 251 | rmock.return_value = repo_mock |
|
252 |
assert utils. |
|
|
252 | assert utils.validate_repo_permissions( | |
|
253 | 253 | 'fake_user', 'fake_repo_id', fake_repo, |
|
254 | 254 | ['perm1', 'perm2']) |
|
255 | 255 | rmock.assert_called_once_with(*['perm1', 'perm2']) |
@@ -263,6 +263,6 b' class TestRepoAccess(object):' | |||
|
263 | 263 | with self.repo_perm_patch as rmock: |
|
264 | 264 | rmock.return_value = repo_mock |
|
265 | 265 | with pytest.raises(JSONRPCError) as excinfo: |
|
266 |
utils. |
|
|
266 | utils.validate_repo_permissions( | |
|
267 | 267 | 'fake_user', 'fake_repo_id', fake_repo, 'perms') |
|
268 | 268 | assert 'fake_repo_id' in excinfo |
@@ -26,7 +26,8 b' import collections' | |||
|
26 | 26 | import logging |
|
27 | 27 | |
|
28 | 28 | from rhodecode.api.exc import JSONRPCError |
|
29 | from rhodecode.lib.auth import HasPermissionAnyApi, HasRepoPermissionAnyApi | |
|
29 | from rhodecode.lib.auth import HasPermissionAnyApi, HasRepoPermissionAnyApi, \ | |
|
30 | HasRepoGroupPermissionAnyApi | |
|
30 | 31 | from rhodecode.lib.utils import safe_unicode |
|
31 | 32 | from rhodecode.controllers.utils import get_commit_from_ref_name |
|
32 | 33 | from rhodecode.lib.vcs.exceptions import RepositoryError |
@@ -153,7 +154,7 b' def has_superadmin_permission(apiuser):' | |||
|
153 | 154 | return False |
|
154 | 155 | |
|
155 | 156 | |
|
156 |
def |
|
|
157 | def validate_repo_permissions(apiuser, repoid, repo, perms): | |
|
157 | 158 | """ |
|
158 | 159 | Raise JsonRPCError if apiuser is not authorized or return True |
|
159 | 160 | |
@@ -170,6 +171,36 b' def has_repo_permissions(apiuser, repoid' | |||
|
170 | 171 | return True |
|
171 | 172 | |
|
172 | 173 | |
|
174 | def validate_repo_group_permissions(apiuser, repogroupid, repo_group, perms): | |
|
175 | """ | |
|
176 | Raise JsonRPCError if apiuser is not authorized or return True | |
|
177 | ||
|
178 | :param apiuser: | |
|
179 | :param repogroupid: just the id of repository group | |
|
180 | :param repo_group: instance of repo_group | |
|
181 | :param perms: | |
|
182 | """ | |
|
183 | if not HasRepoGroupPermissionAnyApi(*perms)( | |
|
184 | user=apiuser, group_name=repo_group.group_name): | |
|
185 | raise JSONRPCError( | |
|
186 | 'repository group `%s` does not exist' % repogroupid) | |
|
187 | ||
|
188 | return True | |
|
189 | ||
|
190 | ||
|
191 | def validate_set_owner_permissions(apiuser, owner): | |
|
192 | if isinstance(owner, Optional): | |
|
193 | owner = get_user_or_error(apiuser.user_id) | |
|
194 | else: | |
|
195 | if has_superadmin_permission(apiuser): | |
|
196 | owner = get_user_or_error(owner) | |
|
197 | else: | |
|
198 | # forbid setting owner for non-admins | |
|
199 | raise JSONRPCError( | |
|
200 | 'Only RhodeCode super-admin can specify `owner` param') | |
|
201 | return owner | |
|
202 | ||
|
203 | ||
|
173 | 204 | def get_user_or_error(userid): |
|
174 | 205 | """ |
|
175 | 206 | Get user by id or name or return JsonRPCError if not found |
@@ -25,7 +25,7 b' from rhodecode.api import jsonrpc_method' | |||
|
25 | 25 | from rhodecode.api.utils import ( |
|
26 | 26 | has_superadmin_permission, Optional, OAttr, get_repo_or_error, |
|
27 | 27 | get_pull_request_or_error, get_commit_or_error, get_user_or_error, |
|
28 |
|
|
|
28 | validate_repo_permissions, resolve_ref_or_error) | |
|
29 | 29 | from rhodecode.lib.auth import (HasRepoPermissionAnyApi) |
|
30 | 30 | from rhodecode.lib.base import vcs_operation_context |
|
31 | 31 | from rhodecode.lib.utils2 import str2bool |
@@ -96,6 +96,15 b' def get_pull_request(request, apiuser, r' | |||
|
96 | 96 | "commit_id": "<commit_id>", |
|
97 | 97 | } |
|
98 | 98 | }, |
|
99 | "merge": { | |
|
100 | "clone_url": "<clone_url>", | |
|
101 | "reference": | |
|
102 | { | |
|
103 | "name": "<name>", | |
|
104 | "type": "<type>", | |
|
105 | "commit_id": "<commit_id>", | |
|
106 | } | |
|
107 | }, | |
|
99 | 108 | "author": <user_obj>, |
|
100 | 109 | "reviewers": [ |
|
101 | 110 | ... |
@@ -178,6 +187,15 b' def get_pull_requests(request, apiuser, ' | |||
|
178 | 187 | "commit_id": "<commit_id>", |
|
179 | 188 | } |
|
180 | 189 | }, |
|
190 | "merge": { | |
|
191 | "clone_url": "<clone_url>", | |
|
192 | "reference": | |
|
193 | { | |
|
194 | "name": "<name>", | |
|
195 | "type": "<type>", | |
|
196 | "commit_id": "<commit_id>", | |
|
197 | } | |
|
198 | }, | |
|
181 | 199 | "author": <user_obj>, |
|
182 | 200 | "reviewers": [ |
|
183 | 201 | ... |
@@ -197,7 +215,7 b' def get_pull_requests(request, apiuser, ' | |||
|
197 | 215 | if not has_superadmin_permission(apiuser): |
|
198 | 216 | _perms = ( |
|
199 | 217 | 'repository.admin', 'repository.write', 'repository.read',) |
|
200 |
|
|
|
218 | validate_repo_permissions(apiuser, repoid, repo, _perms) | |
|
201 | 219 | |
|
202 | 220 | status = Optional.extract(status) |
|
203 | 221 | pull_requests = PullRequestModel().get_all(repo, statuses=[status]) |
@@ -232,7 +250,12 b' def merge_pull_request(request, apiuser,' | |||
|
232 | 250 | "executed": "<bool>", |
|
233 | 251 | "failure_reason": "<int>", |
|
234 | 252 | "merge_commit_id": "<merge_commit_id>", |
|
235 | "possible": "<bool>" | |
|
253 | "possible": "<bool>", | |
|
254 | "merge_ref": { | |
|
255 | "commit_id": "<commit_id>", | |
|
256 | "type": "<type>", | |
|
257 | "name": "<name>" | |
|
258 | } | |
|
236 | 259 | }, |
|
237 | 260 | "error": null |
|
238 | 261 | |
@@ -260,13 +283,21 b' def merge_pull_request(request, apiuser,' | |||
|
260 | 283 | request.environ, repo_name=target_repo.repo_name, |
|
261 | 284 | username=apiuser.username, action='push', |
|
262 | 285 | scm=target_repo.repo_type) |
|
263 | data = PullRequestModel().merge(pull_request, apiuser, extras=extras) | |
|
264 | if data.executed: | |
|
286 | merge_response = PullRequestModel().merge( | |
|
287 | pull_request, apiuser, extras=extras) | |
|
288 | if merge_response.executed: | |
|
265 | 289 | PullRequestModel().close_pull_request( |
|
266 | 290 | pull_request.pull_request_id, apiuser) |
|
267 | 291 | |
|
268 | 292 | Session().commit() |
|
269 | return data | |
|
293 | ||
|
294 | # In previous versions the merge response directly contained the merge | |
|
295 | # commit id. It is now contained in the merge reference object. To be | |
|
296 | # backwards compatible we have to extract it again. | |
|
297 | merge_response = merge_response._asdict() | |
|
298 | merge_response['merge_commit_id'] = merge_response['merge_ref'].commit_id | |
|
299 | ||
|
300 | return merge_response | |
|
270 | 301 | |
|
271 | 302 | |
|
272 | 303 | @jsonrpc_method() |
@@ -463,12 +494,17 b' def create_pull_request(' | |||
|
463 | 494 | :type description: Optional(str) |
|
464 | 495 | :param reviewers: Set the new pull request reviewers list. |
|
465 | 496 | :type reviewers: Optional(list) |
|
497 | Accepts username strings or objects of the format: | |
|
498 | { | |
|
499 | 'username': 'nick', 'reasons': ['original author'] | |
|
500 | } | |
|
466 | 501 | """ |
|
502 | ||
|
467 | 503 | source = get_repo_or_error(source_repo) |
|
468 | 504 | target = get_repo_or_error(target_repo) |
|
469 | 505 | if not has_superadmin_permission(apiuser): |
|
470 | 506 | _perms = ('repository.admin', 'repository.write', 'repository.read',) |
|
471 |
|
|
|
507 | validate_repo_permissions(apiuser, source_repo, source, _perms) | |
|
472 | 508 | |
|
473 | 509 | full_source_ref = resolve_ref_or_error(source_ref, source) |
|
474 | 510 | full_target_ref = resolve_ref_or_error(target_ref, target) |
@@ -490,12 +526,21 b' def create_pull_request(' | |||
|
490 | 526 | if not ancestor: |
|
491 | 527 | raise JSONRPCError('no common ancestor found') |
|
492 | 528 | |
|
493 |
reviewer_ |
|
|
494 |
if not isinstance(reviewer_ |
|
|
529 | reviewer_objects = Optional.extract(reviewers) or [] | |
|
530 | if not isinstance(reviewer_objects, list): | |
|
495 | 531 | raise JSONRPCError('reviewers should be specified as a list') |
|
496 | 532 | |
|
497 | reviewer_users = [get_user_or_error(n) for n in reviewer_names] | |
|
498 | reviewer_ids = [u.user_id for u in reviewer_users] | |
|
533 | reviewers_reasons = [] | |
|
534 | for reviewer_object in reviewer_objects: | |
|
535 | reviewer_reasons = [] | |
|
536 | if isinstance(reviewer_object, (basestring, int)): | |
|
537 | reviewer_username = reviewer_object | |
|
538 | else: | |
|
539 | reviewer_username = reviewer_object['username'] | |
|
540 | reviewer_reasons = reviewer_object.get('reasons', []) | |
|
541 | ||
|
542 | user = get_user_or_error(reviewer_username) | |
|
543 | reviewers_reasons.append((user.user_id, reviewer_reasons)) | |
|
499 | 544 | |
|
500 | 545 | pull_request_model = PullRequestModel() |
|
501 | 546 | pull_request = pull_request_model.create( |
@@ -506,7 +551,7 b' def create_pull_request(' | |||
|
506 | 551 | target_ref=full_target_ref, |
|
507 | 552 | revisions=reversed( |
|
508 | 553 | [commit.raw_id for commit in reversed(commit_ranges)]), |
|
509 |
reviewers=reviewer |
|
|
554 | reviewers=reviewers_reasons, | |
|
510 | 555 | title=title, |
|
511 | 556 | description=Optional.extract(description) |
|
512 | 557 | ) |
@@ -585,12 +630,23 b' def update_pull_request(' | |||
|
585 | 630 | 'pull request `%s` update failed, pull request is closed' % ( |
|
586 | 631 | pullrequestid,)) |
|
587 | 632 | |
|
588 |
reviewer_ |
|
|
589 |
if not isinstance(reviewer_ |
|
|
633 | reviewer_objects = Optional.extract(reviewers) or [] | |
|
634 | if not isinstance(reviewer_objects, list): | |
|
590 | 635 | raise JSONRPCError('reviewers should be specified as a list') |
|
591 | 636 | |
|
592 | reviewer_users = [get_user_or_error(n) for n in reviewer_names] | |
|
593 | reviewer_ids = [u.user_id for u in reviewer_users] | |
|
637 | reviewers_reasons = [] | |
|
638 | reviewer_ids = set() | |
|
639 | for reviewer_object in reviewer_objects: | |
|
640 | reviewer_reasons = [] | |
|
641 | if isinstance(reviewer_object, (int, basestring)): | |
|
642 | reviewer_username = reviewer_object | |
|
643 | else: | |
|
644 | reviewer_username = reviewer_object['username'] | |
|
645 | reviewer_reasons = reviewer_object.get('reasons', []) | |
|
646 | ||
|
647 | user = get_user_or_error(reviewer_username) | |
|
648 | reviewer_ids.add(user.user_id) | |
|
649 | reviewers_reasons.append((user.user_id, reviewer_reasons)) | |
|
594 | 650 | |
|
595 | 651 | title = Optional.extract(title) |
|
596 | 652 | description = Optional.extract(description) |
@@ -603,15 +659,15 b' def update_pull_request(' | |||
|
603 | 659 | commit_changes = {"added": [], "common": [], "removed": []} |
|
604 | 660 | if str2bool(Optional.extract(update_commits)): |
|
605 | 661 | if PullRequestModel().has_valid_update_type(pull_request): |
|
606 |
|
|
|
662 | update_response = PullRequestModel().update_commits( | |
|
607 | 663 | pull_request) |
|
608 |
commit_changes = |
|
|
664 | commit_changes = update_response.changes or commit_changes | |
|
609 | 665 | Session().commit() |
|
610 | 666 | |
|
611 | 667 | reviewers_changes = {"added": [], "removed": []} |
|
612 | 668 | if reviewer_ids: |
|
613 | 669 | added_reviewers, removed_reviewers = \ |
|
614 |
PullRequestModel().update_reviewers(pull_request, reviewer |
|
|
670 | PullRequestModel().update_reviewers(pull_request, reviewers_reasons) | |
|
615 | 671 | |
|
616 | 672 | reviewers_changes['added'] = sorted( |
|
617 | 673 | [get_user_or_error(n).username for n in added_reviewers]) |
@@ -631,5 +687,5 b' def update_pull_request(' | |||
|
631 | 687 | 'updated_commits': commit_changes, |
|
632 | 688 | 'updated_reviewers': reviewers_changes |
|
633 | 689 | } |
|
690 | ||
|
634 | 691 | return data |
|
635 |
@@ -21,29 +21,26 b'' | |||
|
21 | 21 | import logging |
|
22 | 22 | import time |
|
23 | 23 | |
|
24 |
import co |
|
|
25 | ||
|
26 | from rhodecode import BACKENDS | |
|
27 | from rhodecode.api import jsonrpc_method, JSONRPCError, JSONRPCForbidden, json | |
|
24 | import rhodecode | |
|
25 | from rhodecode.api import ( | |
|
26 | jsonrpc_method, JSONRPCError, JSONRPCForbidden, JSONRPCValidationError) | |
|
28 | 27 | from rhodecode.api.utils import ( |
|
29 | 28 | has_superadmin_permission, Optional, OAttr, get_repo_or_error, |
|
30 |
get_user_group_or_error, get_user_or_error, |
|
|
31 | get_perm_or_error, store_update, get_repo_group_or_error, parse_args, | |
|
32 | get_origin, build_commit_data) | |
|
33 | from rhodecode.lib.auth import ( | |
|
34 | HasPermissionAnyApi, HasRepoGroupPermissionAnyApi, | |
|
35 | HasUserGroupPermissionAnyApi) | |
|
29 | get_user_group_or_error, get_user_or_error, validate_repo_permissions, | |
|
30 | get_perm_or_error, parse_args, get_origin, build_commit_data, | |
|
31 | validate_set_owner_permissions) | |
|
32 | from rhodecode.lib.auth import HasPermissionAnyApi, HasUserGroupPermissionAnyApi | |
|
36 | 33 | from rhodecode.lib.exceptions import StatusChangeOnClosedPullRequestError |
|
37 | from rhodecode.lib.utils import map_groups | |
|
38 | 34 | from rhodecode.lib.utils2 import str2bool, time_to_datetime |
|
35 | from rhodecode.lib.ext_json import json | |
|
39 | 36 | from rhodecode.model.changeset_status import ChangesetStatusModel |
|
40 | 37 | from rhodecode.model.comment import ChangesetCommentsModel |
|
41 | 38 | from rhodecode.model.db import ( |
|
42 | 39 | Session, ChangesetStatus, RepositoryField, Repository) |
|
43 | 40 | from rhodecode.model.repo import RepoModel |
|
44 | from rhodecode.model.repo_group import RepoGroupModel | |
|
45 | 41 | from rhodecode.model.scm import ScmModel, RepoList |
|
46 | 42 | from rhodecode.model.settings import SettingsModel, VcsSettingsModel |
|
43 | from rhodecode.model import validation_schema | |
|
47 | 44 | from rhodecode.model.validation_schema.schemas import repo_schema |
|
48 | 45 | |
|
49 | 46 | log = logging.getLogger(__name__) |
@@ -177,6 +174,7 b' def get_repo(request, apiuser, repoid, c' | |||
|
177 | 174 | |
|
178 | 175 | repo = get_repo_or_error(repoid) |
|
179 | 176 | cache = Optional.extract(cache) |
|
177 | ||
|
180 | 178 | include_secrets = False |
|
181 | 179 | if has_superadmin_permission(apiuser): |
|
182 | 180 | include_secrets = True |
@@ -184,7 +182,7 b' def get_repo(request, apiuser, repoid, c' | |||
|
184 | 182 | # check if we have at least read permission for this repo ! |
|
185 | 183 | _perms = ( |
|
186 | 184 | 'repository.admin', 'repository.write', 'repository.read',) |
|
187 | has_repo_permissions(apiuser, repoid, repo, _perms) | |
|
185 | validate_repo_permissions(apiuser, repoid, repo, _perms) | |
|
188 | 186 | |
|
189 | 187 | permissions = [] |
|
190 | 188 | for _user in repo.permissions(): |
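
Throughout this file the repeated inline permission boilerplate is swapped for a single `validate_repo_permissions` call. The helper itself lives in `rhodecode.api.utils` and is not part of this diff; a minimal sketch of the pattern it presumably encapsulates, with the permission class and error type assumed from the surrounding code:

.. code-block:: python

    # Hypothetical sketch only -- the real helper lives in rhodecode.api.utils
    # and may differ in detail.
    from rhodecode.api import JSONRPCError
    from rhodecode.lib.auth import HasRepoPermissionAnyApi

    def validate_repo_permissions(apiuser, repoid, repo, perms):
        """Raise JSONRPCError unless `apiuser` holds one of `perms` on `repo`."""
        if not HasRepoPermissionAnyApi(*perms)(
                user=apiuser, repo_name=repo.repo_name):
            # same deliberately vague message as the old inline checks, so
            # callers cannot probe for repository existence
            raise JSONRPCError('repository `%s` does not exist' % (repoid,))
        return True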
@@ -292,7 +290,7 b' def get_repo_changeset(request, apiuser,' | |||
|
292 | 290 | if not has_superadmin_permission(apiuser): |
|
293 | 291 | _perms = ( |
|
294 | 292 | 'repository.admin', 'repository.write', 'repository.read',) |
|
295 | has_repo_permissions(apiuser, repoid, repo, _perms) | |
|
293 | validate_repo_permissions(apiuser, repoid, repo, _perms) | |
|
296 | 294 | |
|
297 | 295 | changes_details = Optional.extract(details) |
|
298 | 296 | _changes_details_types = ['basic', 'extended', 'full'] |
@@ -355,7 +353,7 b' def get_repo_changesets(request, apiuser' | |||
|
355 | 353 | if not has_superadmin_permission(apiuser): |
|
356 | 354 | _perms = ( |
|
357 | 355 | 'repository.admin', 'repository.write', 'repository.read',) |
|
358 | has_repo_permissions(apiuser, repoid, repo, _perms) | |
|
356 | validate_repo_permissions(apiuser, repoid, repo, _perms) | |
|
359 | 357 | |
|
360 | 358 | changes_details = Optional.extract(details) |
|
361 | 359 | _changes_details_types = ['basic', 'extended', 'full'] |
@@ -450,7 +448,7 b' def get_repo_nodes(request, apiuser, rep' | |||
|
450 | 448 | if not has_superadmin_permission(apiuser): |
|
451 | 449 | _perms = ( |
|
452 | 450 | 'repository.admin', 'repository.write', 'repository.read',) |
|
453 | has_repo_permissions(apiuser, repoid, repo, _perms) | |
|
451 | validate_repo_permissions(apiuser, repoid, repo, _perms) | |
|
454 | 452 | |
|
455 | 453 | ret_type = Optional.extract(ret_type) |
|
456 | 454 | details = Optional.extract(details) |
@@ -523,7 +521,7 b' def get_repo_refs(request, apiuser, repo' | |||
|
523 | 521 | repo = get_repo_or_error(repoid) |
|
524 | 522 | if not has_superadmin_permission(apiuser): |
|
525 | 523 | _perms = ('repository.admin', 'repository.write', 'repository.read',) |
|
526 | has_repo_permissions(apiuser, repoid, repo, _perms) | |
|
524 | validate_repo_permissions(apiuser, repoid, repo, _perms) | |
|
527 | 525 | |
|
528 | 526 | try: |
|
529 | 527 | # check if repo is not empty by any chance, skip quicker if it is. |
@@ -538,9 +536,12 b' def get_repo_refs(request, apiuser, repo' | |||
|
538 | 536 | |
|
539 | 537 | |
|
540 | 538 | @jsonrpc_method() |
|
541 | def create_repo(request, apiuser, repo_name, repo_type, | |
|
542 | owner=Optional(OAttr('apiuser')), description=Optional(''), | |
|
543 | private=Optional(False), clone_uri=Optional(None), | |
|
539 | def create_repo( | |
|
540 | request, apiuser, repo_name, repo_type, | |
|
541 | owner=Optional(OAttr('apiuser')), | |
|
542 | description=Optional(''), | |
|
543 | private=Optional(False), | |
|
544 | clone_uri=Optional(None), | |
|
544 | 545 |
|
|
545 | 546 |
|
|
546 | 547 |
|
@@ -549,15 +550,16 b' def create_repo(request, apiuser, repo_n' | |||
|
549 | 550 | """ |
|
550 | 551 | Creates a repository. |
|
551 | 552 | |
|
552 | * If the repository name contains "/", | |
|
553 | groups will be created. | |
|
553 | * If the repository name contains "/", repository will be created inside | |
|
554 | a repository group or nested repository groups | |
|
554 | 555 | |
|
555 | For example "foo/bar/ | |
|
556 | (with "foo" as parent). It will also create the "baz" repository | |
|
557 | with "bar" as |repo| group. | |
|
556 | For example "foo/bar/repo1" will create |repo| called "repo1" inside | |
|
557 | group "foo/bar". You have to have permissions to access and write to | |
|
558 | the last repository group ("bar" in this example) | |
|
558 | 559 | |
|
559 | 560 | This command can only be run using an |authtoken| with at least |
|
560 | write permissions to the |repo|. | |
|
561 | permissions to create repositories, or write permissions to | |
|
562 | parent repository groups. | |
|
561 | 563 | |
|
562 | 564 | :param apiuser: This is filled automatically from the |authtoken|. |
|
563 | 565 | :type apiuser: AuthUser |
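
The rewritten docstring above spells out the nested-group behaviour of a slashed name. For illustration, a hypothetical JSON-RPC request body matching the "foo/bar/repo1" example (the id and auth token are placeholders):

.. code-block:: python

    # Hypothetical request body for the docstring example above.
    payload = {
        "id": 1,
        "auth_token": "<auth_token>",
        "method": "create_repo",
        "args": {
            "repo_name": "foo/bar/repo1",  # creates "repo1" inside group "foo/bar"
            "repo_type": "git",
        },
    }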
@@ -569,9 +571,9 b' def create_repo(request, apiuser, repo_n' | |||
|
569 | 571 | :type owner: Optional(str) |
|
570 | 572 | :param description: Set the repository description. |
|
571 | 573 | :type description: Optional(str) |
|
572 | :param private: | |
|
574 | :param private: set repository as private | |
|
573 | 575 | :type private: bool |
|
574 | :param clone_uri: | |
|
576 | :param clone_uri: set clone_uri | |
|
575 | 577 | :type clone_uri: str |
|
576 | 578 | :param landing_rev: <rev_type>:<rev> |
|
577 | 579 | :type landing_rev: str |
@@ -606,53 +608,17 b' def create_repo(request, apiuser, repo_n' | |||
|
606 | 608 | id : <id_given_in_input> |
|
607 | 609 | result : null |
|
608 | 610 | error : { |
|
609 | 'failed to create repository `<repo_name>` | |
|
611 | 'failed to create repository `<repo_name>`' | |
|
610 | 612 | } |
|
611 | 613 | |
|
612 | 614 | """ |
|
613 | schema = repo_schema.RepoSchema() | |
|
614 | try: | |
|
615 | data = schema.deserialize({ | |
|
616 | 'repo_name': repo_name | |
|
617 | }) | |
|
618 | except colander.Invalid as e: | |
|
619 | raise JSONRPCError("Validation failed: %s" % (e.asdict(),)) | |
|
620 | repo_name = data['repo_name'] | |
|
621 | 615 | |
|
622 | (repo_name_cleaned, | |
|
623 | parent_group_name) = RepoGroupModel()._get_group_name_and_parent( | |
|
624 | repo_name) | |
|
625 | ||
|
626 | if not HasPermissionAnyApi( | |
|
627 | 'hg.admin', 'hg.create.repository')(user=apiuser): | |
|
628 | # check if we have admin permission for this repo group if given ! | |
|
629 | ||
|
630 | if parent_group_name: | |
|
631 | repogroupid = parent_group_name | |
|
632 | repo_group = get_repo_group_or_error(parent_group_name) | |
|
616 | owner = validate_set_owner_permissions(apiuser, owner) | |
|
633 | 617 | |
|
634 | _perms = ('group.admin',) | |
|
635 | if not HasRepoGroupPermissionAnyApi(*_perms)( | |
|
636 | user=apiuser, group_name=repo_group.group_name): | |
|
637 | raise JSONRPCError( | |
|
638 | 'repository group `%s` does not exist' % ( | |
|
639 | repogroupid,)) | |
|
640 | else: | |
|
641 | raise JSONRPCForbidden() | |
|
642 | ||
|
643 | if not has_superadmin_permission(apiuser): | |
|
644 | if not isinstance(owner, Optional): | |
|
645 | # forbid setting owner for non-admins | |
|
646 | raise JSONRPCError( | |
|
647 | 'Only RhodeCode admin can specify `owner` param') | |
|
648 | ||
|
649 | if isinstance(owner, Optional): | |
|
650 | owner = apiuser.user_id | |
|
651 | ||
|
652 | owner = get_user_or_error(owner) | |
|
653 | ||
|
654 | if RepoModel().get_by_repo_name(repo_name): | |
|
655 | raise JSONRPCError("repo `%s` already exist" % repo_name) | |
|
618 | description = Optional.extract(description) | |
|
619 | copy_permissions = Optional.extract(copy_permissions) | |
|
620 | clone_uri = Optional.extract(clone_uri) | |
|
621 | landing_commit_ref = Optional.extract(landing_rev) | |
|
656 | 622 | |
|
657 | 623 | defs = SettingsModel().get_default_repo_settings(strip_prefix=True) |
|
658 | 624 | if isinstance(private, Optional): |
@@ -666,32 +632,44 b' def create_repo(request, apiuser, repo_n' | |||
|
666 | 632 | if isinstance(enable_downloads, Optional): |
|
667 | 633 | enable_downloads = defs.get('repo_enable_downloads') |
|
668 | 634 | |
|
669 | clone_uri = Optional.extract(clone_uri) | |
|
670 | description = Optional.extract(description) | |
|
671 | landing_rev = Optional.extract(landing_rev) | |
|
672 | copy_permissions = Optional.extract(copy_permissions) | |
|
635 | schema = repo_schema.RepoSchema().bind( | |
|
636 | repo_type_options=rhodecode.BACKENDS.keys(), | |
|
637 | # user caller | |
|
638 | user=apiuser) | |
|
673 | 639 | |
|
674 | 640 | try: |
|
675 | # create structure of groups and return the last group | |
|
676 | repo_group = map_groups(repo_name) | |
|
641 | schema_data = schema.deserialize(dict( | |
|
642 | repo_name=repo_name, | |
|
643 | repo_type=repo_type, | |
|
644 | repo_owner=owner.username, | |
|
645 | repo_description=description, | |
|
646 | repo_landing_commit_ref=landing_commit_ref, | |
|
647 | repo_clone_uri=clone_uri, | |
|
648 | repo_private=private, | |
|
649 | repo_copy_permissions=copy_permissions, | |
|
650 | repo_enable_statistics=enable_statistics, | |
|
651 | repo_enable_downloads=enable_downloads, | |
|
652 | repo_enable_locking=enable_locking)) | |
|
653 | except validation_schema.Invalid as err: | |
|
654 | raise JSONRPCValidationError(colander_exc=err) | |
|
655 | ||
|
656 | try: | |
|
677 | 657 | data = { |
|
678 | 'repo_name': repo_name_cleaned, | |
|
679 | 'repo_name_full': repo_name, | |
|
680 | 'repo_type': repo_type, | |
|
681 | 'repo_description': description, | |
|
682 | 658 | 'owner': owner, |
|
683 | 'repo_private': private, | |
|
684 | 'clone_uri': clone_uri, | |
|
685 | 'repo_group': repo_group.group_id if repo_group else None, | |
|
686 | 'repo_landing_rev': landing_rev, | |
|
687 | 'enable_statistics': enable_statistics, | |
|
688 | 'enable_locking': enable_locking, | |
|
689 | 'enable_downloads': enable_downloads, | |
|
690 | 'repo_copy_permissions': copy_permissions, | |
|
659 | 'repo_name': schema_data['repo_group']['repo_name_without_group'], | |
|
660 | 'repo_name_full': schema_data['repo_name'], | |
|
661 | 'repo_group': schema_data['repo_group']['repo_group_id'], | |
|
662 | 'repo_type': schema_data['repo_type'], | |
|
663 | 'repo_description': schema_data['repo_description'], | |
|
664 | 'repo_private': schema_data['repo_private'], | |
|
665 | 'clone_uri': schema_data['repo_clone_uri'], | |
|
666 | 'repo_landing_rev': schema_data['repo_landing_commit_ref'], | |
|
667 | 'enable_statistics': schema_data['repo_enable_statistics'], | |
|
668 | 'enable_locking': schema_data['repo_enable_locking'], | |
|
669 | 'enable_downloads': schema_data['repo_enable_downloads'], | |
|
670 | 'repo_copy_permissions': schema_data['repo_copy_permissions'], | |
|
691 | 671 | } |
|
692 | 672 | |
|
693 | if repo_type not in BACKENDS.keys(): | |
|
694 | raise Exception("Invalid backend type %s" % repo_type) | |
|
695 | 673 | task = RepoModel().create(form_data=data, cur_user=owner) |
|
696 | 674 | from celery.result import BaseAsyncResult |
|
697 | 675 | task_id = None |
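
The hand-rolled validation removed above is replaced by binding a colander schema and deserializing all call arguments in one step. A self-contained sketch of that bind/deserialize flow; the schema and its fields are invented stand-ins for `repo_schema.RepoSchema`:

.. code-block:: python

    import colander

    class DemoRepoSchema(colander.MappingSchema):
        # invented stand-in for repo_schema.RepoSchema
        repo_name = colander.SchemaNode(colander.String())
        repo_private = colander.SchemaNode(colander.Boolean(), missing=False)

    # bind() attaches extra values (caller, allowed types, ...) for validators
    schema = DemoRepoSchema().bind(user='apiuser')
    try:
        schema_data = schema.deserialize({'repo_name': 'foo/bar/repo1'})
    except colander.Invalid as err:
        print(err.asdict())  # maps node names to error messages
    else:
        print(schema_data)   # defaults applied, e.g. repo_private=False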
@@ -699,17 +677,17 b' def create_repo(request, apiuser, repo_n' | |||
|
699 | 677 | task_id = task.task_id |
|
700 | 678 | # no commit, it's done in RepoModel, or async via celery |
|
701 | 679 | return { |
|
702 | 'msg': "Created new repository `%s`" % (repo_name,), | |
|
680 | 'msg': "Created new repository `%s`" % (schema_data['repo_name'],), | |
|
703 | 681 | 'success': True, # cannot return the repo data here since fork |
|
704 | # can | |
|
682 | # can be done async | |
|
705 | 683 | 'task': task_id |
|
706 | 684 | } |
|
707 | 685 | except Exception: |
|
708 | 686 | log.exception( |
|
709 | 687 | u"Exception while trying to create the repository %s", |
|
710 | repo_name) | |
|
688 | schema_data['repo_name']) | |
|
711 | 689 | raise JSONRPCError( |
|
712 | 'failed to create repository `%s`' % (repo_name,)) | |
|
690 | 'failed to create repository `%s`' % (schema_data['repo_name'],)) | |
|
713 | 691 | |
|
714 | 692 | |
|
715 | 693 | @jsonrpc_method() |
@@ -735,7 +713,7 b' def add_field_to_repo(request, apiuser, ' | |||
|
735 | 713 | repo = get_repo_or_error(repoid) |
|
736 | 714 | if not has_superadmin_permission(apiuser): |
|
737 | 715 | _perms = ('repository.admin',) |
|
738 | has_repo_permissions(apiuser, repoid, repo, _perms) | |
|
716 | validate_repo_permissions(apiuser, repoid, repo, _perms) | |
|
739 | 717 | |
|
740 | 718 | label = Optional.extract(label) or key |
|
741 | 719 | description = Optional.extract(description) |
@@ -778,7 +756,7 b' def remove_field_from_repo(request, apiu' | |||
|
778 | 756 | repo = get_repo_or_error(repoid) |
|
779 | 757 | if not has_superadmin_permission(apiuser): |
|
780 | 758 | _perms = ('repository.admin',) |
|
781 | has_repo_permissions(apiuser, repoid, repo, _perms) | |
|
759 | validate_repo_permissions(apiuser, repoid, repo, _perms) | |
|
782 | 760 | |
|
783 | 761 | field = RepositoryField.get_by_key_name(key, repo) |
|
784 | 762 | if not field: |
@@ -800,33 +778,38 b' def remove_field_from_repo(request, apiu' | |||
|
800 | 778 | |
|
801 | 779 | |
|
802 | 780 | @jsonrpc_method() |
|
803 | def update_repo(request, apiuser, repoid, name=Optional(None), | |
|
804 | owner=Optional(OAttr('apiuser')), | |
|
805 | group=Optional(None), | |
|
806 | fork_of=Optional(None), | |
|
807 | description=Optional(''), private=Optional(False), | |
|
808 | clone_uri=Optional(None), landing_rev=Optional('rev:tip'), | |
|
781 | def update_repo( | |
|
782 | request, apiuser, repoid, repo_name=Optional(None), | |
|
783 | owner=Optional(OAttr('apiuser')), description=Optional(''), | |
|
784 | private=Optional(False), clone_uri=Optional(None), | |
|
785 | landing_rev=Optional('rev:tip'), fork_of=Optional(None), | |
|
809 | 786 | enable_statistics=Optional(False), |
|
810 | 787 | enable_locking=Optional(False), |
|
811 | enable_downloads=Optional(False), | |
|
812 | fields=Optional('')): | |
|
788 | enable_downloads=Optional(False), fields=Optional('')): | |
|
813 | 789 | """ |
|
814 | 790 | Updates a repository with the given information. |
|
815 | 791 | |
|
816 | 792 | This command can only be run using an |authtoken| with at least |
|
817 | write permissions to the |repo|. | |
|
793 | admin permissions to the |repo|. | |
|
794 | ||
|
795 | * If the repository name contains "/", repository will be updated | |
|
796 | accordingly with a repository group or nested repository groups | |
|
797 | ||
|
798 | For example repoid=repo-test name="foo/bar/repo-test" will update |repo| | |
|
799 | called "repo-test" and place it inside group "foo/bar". | |
|
800 | You have to have permissions to access and write to the last repository | |
|
801 | group ("bar" in this example) | |
|
818 | 802 | |
|
819 | 803 | :param apiuser: This is filled automatically from the |authtoken|. |
|
820 | 804 | :type apiuser: AuthUser |
|
821 | 805 | :param repoid: repository name or repository ID. |
|
822 | 806 | :type repoid: str or int |
|
823 | :param name: Update the |repo| name | |
|
824 | :type name: str | |
|
807 | :param repo_name: Update the |repo| name, including the | |
|
808 | repository group it's in. | |
|
809 | :type repo_name: str | |
|
825 | 810 | :param owner: Set the |repo| owner. |
|
826 | 811 | :type owner: str |
|
827 | :param group: Set the |repo| group. | |
|
828 | :type group: str | |
|
829 | :param fork_of: Set the master |repo| name. | |
|
812 | :param fork_of: Set the |repo| as fork of another |repo|. | |
|
830 | 813 | :type fork_of: str |
|
831 | 814 | :param description: Update the |repo| description. |
|
832 | 815 | :type description: str |
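
As with create_repo, a slash in the new `repo_name` re-parents the repository. A hypothetical call matching the docstring example above (token is a placeholder):

.. code-block:: python

    # Hypothetical request body for the docstring example.
    payload = {
        "id": 2,
        "auth_token": "<auth_token>",
        "method": "update_repo",
        "args": {
            "repoid": "repo-test",
            "repo_name": "foo/bar/repo-test",  # moves repo-test into "foo/bar"
        },
    }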
@@ -834,69 +817,115 b' def update_repo(request, apiuser, repoid' | |||
|
834 | 817 | :type private: bool |
|
835 | 818 | :param clone_uri: Update the |repo| clone URI. |
|
836 | 819 | :type clone_uri: str |
|
837 | :param landing_rev: Set the |repo| landing revision. Default is | |
|
838 | ``tip``. | |
|
820 | :param landing_rev: Set the |repo| landing revision. Default is ``rev:tip``. | |
|
839 | 821 | :type landing_rev: str |
|
840 | :param enable_statistics: Enable statistics on the |repo|, | |
|
841 | (True | False). | |
|
822 | :param enable_statistics: Enable statistics on the |repo|, (True | False). | |
|
842 | 823 | :type enable_statistics: bool |
|
843 | 824 | :param enable_locking: Enable |repo| locking. |
|
844 | 825 | :type enable_locking: bool |
|
845 | :param enable_downloads: Enable downloads from the |repo|, | |
|
846 | (True | False). | |
|
826 | :param enable_downloads: Enable downloads from the |repo|, (True | False). | |
|
847 | 827 | :type enable_downloads: bool |
|
848 | 828 | :param fields: Add extra fields to the |repo|. Use the following |
|
849 | 829 | example format: ``field_key=field_val,field_key2=fieldval2``. |
|
850 | 830 | Escape ', ' with \, |
|
851 | 831 | :type fields: str |
|
852 | 832 | """ |
|
833 | ||
|
853 | 834 | repo = get_repo_or_error(repoid) |
|
835 | ||
|
854 | 836 | include_secrets = False |
|
855 | if has_superadmin_permission(apiuser): | |
|
837 | if not has_superadmin_permission(apiuser): | |
|
838 | validate_repo_permissions(apiuser, repoid, repo, ('repository.admin',)) | |
|
839 | else: | |
|
856 | 840 | include_secrets = True |
|
857 | else: | |
|
858 | _perms = ('repository.admin',) | |
|
859 | has_repo_permissions(apiuser, repoid, repo, _perms) | |
|
841 | ||
|
842 | updates = dict( | |
|
843 | repo_name=repo_name | |
|
844 | if not isinstance(repo_name, Optional) else repo.repo_name, | |
|
845 | ||
|
846 | fork_id=fork_of | |
|
847 | if not isinstance(fork_of, Optional) else repo.fork.repo_name if repo.fork else None, | |
|
848 | ||
|
849 | user=owner | |
|
850 | if not isinstance(owner, Optional) else repo.user.username, | |
|
860 | 851 | |
|
861 | updates = { | |
|
862 | # update function requires this. | |
|
863 | 'repo_name': repo.just_name | |
|
864 | } | |
|
865 | repo_group = group | |
|
866 | if not isinstance(repo_group, Optional): | |
|
867 | repo_group = get_repo_group_or_error(repo_group) | |
|
868 | repo_group = repo_group.group_id | |
|
852 | repo_description=description | |
|
853 | if not isinstance(description, Optional) else repo.description, | |
|
854 | ||
|
855 | repo_private=private | |
|
856 | if not isinstance(private, Optional) else repo.private, | |
|
857 | ||
|
858 | clone_uri=clone_uri | |
|
859 | if not isinstance(clone_uri, Optional) else repo.clone_uri, | |
|
860 | ||
|
861 | repo_landing_rev=landing_rev | |
|
862 | if not isinstance(landing_rev, Optional) else repo._landing_revision, | |
|
863 | ||
|
864 | repo_enable_statistics=enable_statistics | |
|
865 | if not isinstance(enable_statistics, Optional) else repo.enable_statistics, | |
|
866 | ||
|
867 | repo_enable_locking=enable_locking | |
|
868 | if not isinstance(enable_locking, Optional) else repo.enable_locking, | |
|
869 | ||
|
870 | repo_enable_downloads=enable_downloads | |
|
871 | if not isinstance(enable_downloads, Optional) else repo.enable_downloads) | |
|
872 | ||
|
873 | ref_choices, _labels = ScmModel().get_repo_landing_revs(repo=repo) | |
|
869 | 874 | |
|
870 | repo_fork_of = fork_of | |
|
871 | if not isinstance(repo_fork_of, Optional): | |
|
872 | repo_fork_of = get_repo_or_error(repo_fork_of) | |
|
873 | repo_fork_of = repo_fork_of.repo_id | |
|
874 | ||
|
875 | schema = repo_schema.RepoSchema().bind( | |
|
876 | repo_type_options=rhodecode.BACKENDS.keys(), | |
|
877 | repo_ref_options=ref_choices, | |
|
878 | # user caller | |
|
879 | user=apiuser, | |
|
880 | old_values=repo.get_api_data()) | |
|
875 | 881 | try: |
|
876 | store_update(updates, name, 'repo_name') | |
|
877 | store_update(updates, repo_group, 'repo_group') | |
|
878 | store_update(updates, repo_fork_of, 'fork_id') | |
|
879 | store_update(updates, owner, 'user') | |
|
880 | store_update(updates, description, 'repo_description') | |
|
881 | store_update(updates, private, 'repo_private') | |
|
882 | store_update(updates, clone_uri, 'clone_uri') | |
|
883 | store_update(updates, landing_rev, 'repo_landing_rev') | |
|
884 | store_update(updates, enable_statistics, 'repo_enable_statistics') | |
|
885 | store_update(updates, enable_locking, 'repo_enable_locking') | |
|
886 | store_update(updates, enable_downloads, 'repo_enable_downloads') | |
|
882 | schema_data = schema.deserialize(dict( | |
|
883 | # we save old value, users cannot change type | |
|
884 | repo_type=repo.repo_type, | |
|
885 | ||
|
886 | repo_name=updates['repo_name'], | |
|
887 | repo_owner=updates['user'], | |
|
888 | repo_description=updates['repo_description'], | |
|
889 | repo_clone_uri=updates['clone_uri'], | |
|
890 | repo_fork_of=updates['fork_id'], | |
|
891 | repo_private=updates['repo_private'], | |
|
892 | repo_landing_commit_ref=updates['repo_landing_rev'], | |
|
893 | repo_enable_statistics=updates['repo_enable_statistics'], | |
|
894 | repo_enable_downloads=updates['repo_enable_downloads'], | |
|
895 | repo_enable_locking=updates['repo_enable_locking'])) | |
|
896 | except validation_schema.Invalid as err: | |
|
897 | raise JSONRPCValidationError(colander_exc=err) | |
|
898 | ||
|
899 | # save validated data back into the updates dict | |
|
900 | validated_updates = dict( | |
|
901 | repo_name=schema_data['repo_group']['repo_name_without_group'], | |
|
902 | repo_group=schema_data['repo_group']['repo_group_id'], | |
|
903 | ||
|
904 | user=schema_data['repo_owner'], | |
|
905 | repo_description=schema_data['repo_description'], | |
|
906 | repo_private=schema_data['repo_private'], | |
|
907 | clone_uri=schema_data['repo_clone_uri'], | |
|
908 | repo_landing_rev=schema_data['repo_landing_commit_ref'], | |
|
909 | repo_enable_statistics=schema_data['repo_enable_statistics'], | |
|
910 | repo_enable_locking=schema_data['repo_enable_locking'], | |
|
911 | repo_enable_downloads=schema_data['repo_enable_downloads'], | |
|
912 | ) | |
|
913 | ||
|
914 | if schema_data['repo_fork_of']: | |
|
915 | fork_repo = get_repo_or_error(schema_data['repo_fork_of']) | |
|
916 | validated_updates['fork_id'] = fork_repo.repo_id | |
|
887 | 917 | |
|
888 | 918 |
|
|
889 | 919 |
|
|
890 | 920 |
|
|
891 | updates.update(fields) | |
|
921 | validated_updates.update(fields) | |
|
892 | 922 | |
|
893 | RepoModel().update(repo, **updates) | |
|
923 | try: | |
|
924 | RepoModel().update(repo, **validated_updates) | |
|
894 | 925 | Session().commit() |
|
895 | 926 | return { |
|
896 | 'msg': 'updated repo ID:%s %s' % ( | |
|
897 | repo.repo_id, repo.repo_name), | |
|
898 | 'repository': repo.get_api_data( | |
|
899 | include_secrets=include_secrets) | |
|
927 | 'msg': 'updated repo ID:%s %s' % (repo.repo_id, repo.repo_name), | |
|
928 | 'repository': repo.get_api_data(include_secrets=include_secrets) | |
|
900 | 929 | } |
|
901 | 930 | except Exception: |
|
902 | 931 | log.exception( |
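
The updates dict above leans on a sentinel idiom: a parameter still wrapped in `Optional` means "not supplied", so the stored value is kept. A tiny self-contained sketch of the same pattern; `Optional` here is a simplified stand-in for the rhodecode.api one:

.. code-block:: python

    class Optional(object):
        """Simplified stand-in for the rhodecode.api Optional sentinel."""
        def __init__(self, default):
            self.default = default

    def pick(param, current):
        # keep the stored value unless the caller supplied a real one
        return param if not isinstance(param, Optional) else current

    print(pick(Optional(''), 'old description'))       # -> old description
    print(pick('new description', 'old description'))  # -> new description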
@@ -908,26 +937,33 b' def update_repo(request, apiuser, repoid' | |||
|
908 | 937 | @jsonrpc_method() |
|
909 | 938 | def fork_repo(request, apiuser, repoid, fork_name, |
|
910 | 939 | owner=Optional(OAttr('apiuser')), |
|
911 | description=Optional(''), | |
|
912 | private=Optional(False), | |
|
940 | description=Optional(''), | |
|
941 | private=Optional(False), | |
|
942 | clone_uri=Optional(None), | |
|
943 | landing_rev=Optional('rev:tip'), | |
|
944 | copy_permissions=Optional(False)): | |
|
913 | 945 | """ |
|
914 | 946 | Creates a fork of the specified |repo|. |
|
915 | 947 | |
|
916 | * If using |RCE| with Celery this will immediately return a success | |
|
917 | message, even though the fork will be created asynchronously. | |
|
948 | * If the fork_name contains "/", fork will be created inside | |
|
949 | a repository group or nested repository groups | |
|
918 | 950 | |
|
919 | This command can only be run using an |authtoken| with fork | |
|
920 | permissions on the |repo|. | |
|
951 | For example "foo/bar/fork-repo" will create fork called "fork-repo" | |
|
952 | inside group "foo/bar". You have to have permissions to access and | |
|
953 | write to the last repository group ("bar" in this example) | |
|
954 | ||
|
955 | This command can only be run using an |authtoken| with minimum | |
|
956 | read permissions of the forked repo, create fork permissions for an user. | |
|
921 | 957 | |
|
922 | 958 | :param apiuser: This is filled automatically from the |authtoken|. |
|
923 | 959 | :type apiuser: AuthUser |
|
924 | 960 | :param repoid: Set repository name or repository ID. |
|
925 | 961 | :type repoid: str or int |
|
926 | :param fork_name: Set the fork name. | |
|
962 | :param fork_name: Set the fork name, including its repository group membership. | |
|
927 | 963 | :type fork_name: str |
|
928 | 964 | :param owner: Set the fork owner. |
|
929 | 965 | :type owner: str |
|
930 | :param description: Set the fork descripton. | |
|
966 | :param description: Set the fork description. | |
|
931 | 967 | :type description: str |
|
932 | 968 | :param copy_permissions: Copy permissions from parent |repo|. The |
|
933 | 969 | default is False. |
@@ -965,71 +1001,63 b' def fork_repo(request, apiuser, repoid, ' | |||
|
965 | 1001 | error: null |
|
966 | 1002 | |
|
967 | 1003 | """ |
|
968 | if not has_superadmin_permission(apiuser): | |
|
969 | if not HasPermissionAnyApi('hg.fork.repository')(user=apiuser): | |
|
970 | raise JSONRPCForbidden() | |
|
971 | 1004 | |
|
972 | 1005 | repo = get_repo_or_error(repoid) |
|
973 | 1006 | repo_name = repo.repo_name |
|
974 | 1007 | |
|
975 | (fork_name_cleaned, | |
|
976 | parent_group_name) = RepoGroupModel()._get_group_name_and_parent( | |
|
977 | fork_name) | |
|
978 | ||
|
979 | 1008 | if not has_superadmin_permission(apiuser): |
|
980 | 1009 | # check if we have at least read permission for |
|
981 | 1010 | # this repo that we fork ! |
|
982 | 1011 | _perms = ( |
|
983 | 1012 | 'repository.admin', 'repository.write', 'repository.read') |
|
984 | has_repo_permissions(apiuser, repoid, repo, _perms) | |
|
1013 | validate_repo_permissions(apiuser, repoid, repo, _perms) | |
|
985 | 1014 | |
|
986 | if not isinstance(owner, Optional): | |
|
987 | # forbid setting owner for non super admins | |
|
988 | raise JSONRPCError( | |
|
989 | 'Only RhodeCode admin can specify `owner` param' | |
|
990 | ) | |
|
991 | # check if we have a create.repo permission if not maybe the parent | |
|
992 | # group permission | |
|
993 | if not HasPermissionAnyApi('hg.create.repository')(user=apiuser): | |
|
994 | if parent_group_name: | |
|
995 | repogroupid = parent_group_name | |
|
996 | repo_group = get_repo_group_or_error(parent_group_name) | |
|
997 | ||
|
998 | _perms = ('group.admin',) | |
|
999 | if not HasRepoGroupPermissionAnyApi(*_perms)( | |
|
1000 | user=apiuser, group_name=repo_group.group_name): | |
|
1001 | raise JSONRPCError( | |
|
1002 | 'repository group `%s` does not exist' % ( | |
|
1003 | repogroupid,)) | |
|
1004 | else: | |
|
1015 | # check if the regular user has at least fork permissions as well | |
|
1016 | if not HasPermissionAnyApi('hg.fork.repository')(user=apiuser): | |
|
1005 | 1017 | raise JSONRPCForbidden() |
|
1006 | 1018 | |
|
1007 | _repo = RepoModel().get_by_repo_name(fork_name) | |
|
1008 | if _repo: | |
|
1009 | type_ = 'fork' if _repo.fork else 'repo' | |
|
1010 | raise JSONRPCError("%s `%s` already exist" % (type_, fork_name)) | |
|
1019 | # check if user can set owner parameter | |
|
1020 | owner = validate_set_owner_permissions(apiuser, owner) | |
|
1011 | 1021 | |
|
1012 | if isinstance(owner, Optional): | |
|
1013 | owner = apiuser.user_id | |
|
1022 | description = Optional.extract(description) | |
|
1023 | copy_permissions = Optional.extract(copy_permissions) | |
|
1024 | clone_uri = Optional.extract(clone_uri) | |
|
1025 | landing_commit_ref = Optional.extract(landing_rev) | |
|
1026 | private = Optional.extract(private) | |
|
1014 | 1027 | |
|
1015 | owner = get_user_or_error(owner) | |
|
1028 | schema = repo_schema.RepoSchema().bind( | |
|
1029 | repo_type_options=rhodecode.BACKENDS.keys(), | |
|
1030 | # user caller | |
|
1031 | user=apiuser) | |
|
1016 | 1032 | |
|
1017 | 1033 | try: |
|
1018 | # create structure of groups and return the last group | |
|
1019 | repo_group = map_groups(fork_name) | |
|
1020 | form_data = { | |
|
1021 | 'repo_name': fork_name_cleaned, | |
|
1022 | 'repo_name_full': fork_name, | |
|
1023 | 'repo_group': repo_group.group_id if repo_group else None, | |
|
1024 | 'repo_type': repo.repo_type, | |
|
1025 | 'description': Optional.extract(description), | |
|
1026 | 'private': Optional.extract(private), | |
|
1027 | 'copy_permissions': Optional.extract(copy_permissions), | |
|
1028 | 'landing_rev': Optional.extract(landing_rev), | |
|
1034 | schema_data = schema.deserialize(dict( | |
|
1035 | repo_name=fork_name, | |
|
1036 | repo_type=repo.repo_type, | |
|
1037 | repo_owner=owner.username, | |
|
1038 | repo_description=description, | |
|
1039 | repo_landing_commit_ref=landing_commit_ref, | |
|
1040 | repo_clone_uri=clone_uri, | |
|
1041 | repo_private=private, | |
|
1042 | repo_copy_permissions=copy_permissions)) | |
|
1043 | except validation_schema.Invalid as err: | |
|
1044 | raise JSONRPCValidationError(colander_exc=err) | |
|
1045 | ||
|
1046 | try: | |
|
1047 | data = { | |
|
1029 | 1048 | 'fork_parent_id': repo.repo_id, |
|
1049 | ||
|
1050 | 'repo_name': schema_data['repo_group']['repo_name_without_group'], | |
|
1051 | 'repo_name_full': schema_data['repo_name'], | |
|
1052 | 'repo_group': schema_data['repo_group']['repo_group_id'], | |
|
1053 | 'repo_type': schema_data['repo_type'], | |
|
1054 | 'description': schema_data['repo_description'], | |
|
1055 | 'private': schema_data['repo_private'], | |
|
1056 | 'copy_permissions': schema_data['repo_copy_permissions'], | |
|
1057 | 'landing_rev': schema_data['repo_landing_commit_ref'], | |
|
1030 | 1058 | } |
|
1031 | 1059 | |
|
1032 | task = RepoModel().create_fork( | |
|
1060 | task = RepoModel().create_fork(data, cur_user=owner) | |
|
1033 | 1061 | # no commit, it's done in RepoModel, or async via celery |
|
1034 | 1062 | from celery.result import BaseAsyncResult |
|
1035 | 1063 | task_id = None |
@@ -1037,16 +1065,18 b' def fork_repo(request, apiuser, repoid, ' | |||
|
1037 | 1065 | task_id = task.task_id |
|
1038 | 1066 | return { |
|
1039 | 1067 | 'msg': 'Created fork of `%s` as `%s`' % ( |
|
1040 | repo.repo_name, | |
|
1068 | repo.repo_name, schema_data['repo_name']), | |
|
1041 | 1069 | 'success': True, # cannot return the repo data here since fork |
|
1042 | 1070 | # can be done async |
|
1043 | 1071 | 'task': task_id |
|
1044 | 1072 | } |
|
1045 | 1073 | except Exception: |
|
1046 | log.exception("Exception occurred while trying to fork a repo") | |
|
1074 | log.exception( | |
|
1075 | u"Exception while trying to create fork %s", | |
|
1076 | schema_data['repo_name']) | |
|
1047 | 1077 | raise JSONRPCError( |
|
1048 | 1078 | 'failed to fork repository `%s` as `%s`' % ( |
|
1049 | repo_name, | |
|
1079 | repo_name, schema_data['repo_name'])) | |
|
1050 | 1080 | |
|
1051 | 1081 | |
|
1052 | 1082 | @jsonrpc_method() |
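
Both create_repo and fork_repo hand the heavy lifting to the model and only report a task id when Celery is in play. A sketch of that probe, using the same `BaseAsyncResult` import the diff itself uses (Celery 3.x naming):

.. code-block:: python

    def extract_task_id(task):
        """Return the Celery task id, or None when the call ran synchronously."""
        from celery.result import BaseAsyncResult  # celery 3.x name, as in the diff
        if isinstance(task, BaseAsyncResult):
            return task.task_id
        return None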
@@ -1082,7 +1112,7 b' def delete_repo(request, apiuser, repoid' | |||
|
1082 | 1112 | repo = get_repo_or_error(repoid) |
|
1083 | 1113 | if not has_superadmin_permission(apiuser): |
|
1084 | 1114 | _perms = ('repository.admin',) |
|
1085 | has_repo_permissions(apiuser, repoid, repo, _perms) | |
|
1115 | validate_repo_permissions(apiuser, repoid, repo, _perms) | |
|
1086 | 1116 | |
|
1087 | 1117 | try: |
|
1088 | 1118 | handle_forks = Optional.extract(forks) |
@@ -1157,7 +1187,7 b' def invalidate_cache(request, apiuser, r' | |||
|
1157 | 1187 | repo = get_repo_or_error(repoid) |
|
1158 | 1188 | if not has_superadmin_permission(apiuser): |
|
1159 | 1189 | _perms = ('repository.admin', 'repository.write',) |
|
1160 | has_repo_permissions(apiuser, repoid, repo, _perms) | |
|
1190 | validate_repo_permissions(apiuser, repoid, repo, _perms) | |
|
1161 | 1191 | |
|
1162 | 1192 | delete = Optional.extract(delete_keys) |
|
1163 | 1193 | try: |
@@ -1228,7 +1258,7 b' def lock(request, apiuser, repoid, locke' | |||
|
1228 | 1258 | id : <id_given_in_input> |
|
1229 | 1259 | result : null |
|
1230 | 1260 | error : { |
|
1231 | 'Error occurred locking repository `<reponame>` | |
|
1261 | 'Error occurred locking repository `<reponame>`' | |
|
1232 | 1262 | } |
|
1233 | 1263 | """ |
|
1234 | 1264 | |
@@ -1236,7 +1266,7 b' def lock(request, apiuser, repoid, locke' | |||
|
1236 | 1266 | if not has_superadmin_permission(apiuser): |
|
1237 | 1267 | # check if we have at least write permission for this repo ! |
|
1238 | 1268 | _perms = ('repository.admin', 'repository.write',) |
|
1239 | has_repo_permissions(apiuser, repoid, repo, _perms) | |
|
1269 | validate_repo_permissions(apiuser, repoid, repo, _perms) | |
|
1240 | 1270 | |
|
1241 | 1271 | # make sure normal user does not pass someone else userid, |
|
1242 | 1272 | # he is not allowed to do that |
@@ -1347,7 +1377,7 b' def comment_commit(' | |||
|
1347 | 1377 | repo = get_repo_or_error(repoid) |
|
1348 | 1378 | if not has_superadmin_permission(apiuser): |
|
1349 | 1379 | _perms = ('repository.read', 'repository.write', 'repository.admin') |
|
1350 | has_repo_permissions(apiuser, repoid, repo, _perms) | |
|
1380 | validate_repo_permissions(apiuser, repoid, repo, _perms) | |
|
1351 | 1381 | |
|
1352 | 1382 | if isinstance(userid, Optional): |
|
1353 | 1383 | userid = apiuser.user_id |
@@ -1438,7 +1468,7 b' def grant_user_permission(request, apius' | |||
|
1438 | 1468 | perm = get_perm_or_error(perm) |
|
1439 | 1469 | if not has_superadmin_permission(apiuser): |
|
1440 | 1470 | _perms = ('repository.admin',) |
|
1441 | has_repo_permissions(apiuser, repoid, repo, _perms) | |
|
1471 | validate_repo_permissions(apiuser, repoid, repo, _perms) | |
|
1442 | 1472 | |
|
1443 | 1473 | try: |
|
1444 | 1474 | |
@@ -1492,7 +1522,7 b' def revoke_user_permission(request, apiu' | |||
|
1492 | 1522 | user = get_user_or_error(userid) |
|
1493 | 1523 | if not has_superadmin_permission(apiuser): |
|
1494 | 1524 | _perms = ('repository.admin',) |
|
1495 | has_repo_permissions(apiuser, repoid, repo, _perms) | |
|
1525 | validate_repo_permissions(apiuser, repoid, repo, _perms) | |
|
1496 | 1526 | |
|
1497 | 1527 | try: |
|
1498 | 1528 | RepoModel().revoke_user_permission(repo=repo, user=user) |
@@ -1560,7 +1590,7 b' def grant_user_group_permission(request,' | |||
|
1560 | 1590 | perm = get_perm_or_error(perm) |
|
1561 | 1591 | if not has_superadmin_permission(apiuser): |
|
1562 | 1592 | _perms = ('repository.admin',) |
|
1563 | has_repo_permissions(apiuser, repoid, repo, _perms) | |
|
1593 | validate_repo_permissions(apiuser, repoid, repo, _perms) | |
|
1564 | 1594 | |
|
1565 | 1595 | user_group = get_user_group_or_error(usergroupid) |
|
1566 | 1596 | if not has_superadmin_permission(apiuser): |
@@ -1625,7 +1655,7 b' def revoke_user_group_permission(request' | |||
|
1625 | 1655 | repo = get_repo_or_error(repoid) |
|
1626 | 1656 | if not has_superadmin_permission(apiuser): |
|
1627 | 1657 | _perms = ('repository.admin',) |
|
1628 | has_repo_permissions(apiuser, repoid, repo, _perms) | |
|
1658 | validate_repo_permissions(apiuser, repoid, repo, _perms) | |
|
1629 | 1659 | |
|
1630 | 1660 | user_group = get_user_group_or_error(usergroupid) |
|
1631 | 1661 | if not has_superadmin_permission(apiuser): |
@@ -1701,7 +1731,7 b' def pull(request, apiuser, repoid):' | |||
|
1701 | 1731 | repo = get_repo_or_error(repoid) |
|
1702 | 1732 | if not has_superadmin_permission(apiuser): |
|
1703 | 1733 | _perms = ('repository.admin',) |
|
1704 | has_repo_permissions(apiuser, repoid, repo, _perms) | |
|
1734 | validate_repo_permissions(apiuser, repoid, repo, _perms) | |
|
1705 | 1735 | |
|
1706 | 1736 | try: |
|
1707 | 1737 | ScmModel().pull_changes(repo.repo_name, apiuser.username) |
@@ -1764,7 +1794,7 b' def strip(request, apiuser, repoid, revi' | |||
|
1764 | 1794 | repo = get_repo_or_error(repoid) |
|
1765 | 1795 | if not has_superadmin_permission(apiuser): |
|
1766 | 1796 | _perms = ('repository.admin',) |
|
1767 | has_repo_permissions(apiuser, repoid, repo, _perms) | |
|
1797 | validate_repo_permissions(apiuser, repoid, repo, _perms) | |
|
1768 | 1798 | |
|
1769 | 1799 | try: |
|
1770 | 1800 | ScmModel().strip(repo, revision, branch) |
@@ -21,19 +21,18 b'' | |||
|
21 | 21 | |
|
22 | 22 | import logging |
|
23 | 23 | |
|
24 | import colander | |
|
25 | ||
|
26 | from rhodecode.api import jsonrpc_method, JSONRPCError, JSONRPCForbidden | |
|
24 | from rhodecode.api import JSONRPCValidationError | |
|
25 | from rhodecode.api import jsonrpc_method, JSONRPCError | |
|
27 | 26 | from rhodecode.api.utils import ( |
|
28 | 27 | has_superadmin_permission, Optional, OAttr, get_user_or_error, |
|
29 | store_update, get_repo_group_or_error, | |
|
30 | get_perm_or_error, get_user_group_or_error, get_origin) | |
|
28 | get_repo_group_or_error, get_perm_or_error, get_user_group_or_error, | |
|
29 | get_origin, validate_repo_group_permissions, validate_set_owner_permissions) | |
|
31 | 30 | from rhodecode.lib.auth import ( |
|
32 | HasPermissionAnyApi, HasRepoGroupPermissionAnyApi, | |
|
33 | HasUserGroupPermissionAnyApi) | |
|
34 | from rhodecode.model.db import Session, RepoGroup | |
|
31 | HasRepoGroupPermissionAnyApi, HasUserGroupPermissionAnyApi) | |
|
32 | from rhodecode.model.db import Session | |
|
35 | 33 | from rhodecode.model.repo_group import RepoGroupModel |
|
36 | 34 | from rhodecode.model.scm import RepoGroupList |
|
35 | from rhodecode.model import validation_schema | |
|
37 | 36 | from rhodecode.model.validation_schema.schemas import repo_group_schema |
|
38 | 37 | |
|
39 | 38 | |
@@ -142,21 +141,24 b' def get_repo_groups(request, apiuser):' | |||
|
142 | 141 | |
|
143 | 142 | |
|
144 | 143 | @jsonrpc_method() |
|
145 | def create_repo_group(request, apiuser, group_name, description=Optional(''), | |
|
144 | def create_repo_group( | |
|
145 | request, apiuser, group_name, | |
|
146 | 146 | owner=Optional(OAttr('apiuser')), |
|
147 | description=Optional(''), | |
|
147 | 148 | copy_permissions=Optional(False)): |
|
148 | 149 | """ |
|
149 | 150 | Creates a repository group. |
|
150 | 151 | |
|
151 | * If the repository group name contains "/", | |
|
152 | groups will be created. | |
|
152 | * If the repository group name contains "/", repository group will be | |
|
153 | created inside a repository group or nested repository groups | |
|
153 | 154 | |
|
154 | For example "foo/bar/ | |
|
155 | (with "foo" as parent). It will also create the "baz" repository | |
|
156 | with "bar" as |repo| group. | |
|
155 | For example "foo/bar/group1" will create repository group called "group1" | |
|
156 | inside group "foo/bar". You have to have permissions to access and | |
|
157 | write to the last repository group ("bar" in this example) | |
|
157 | 158 | |
|
158 | This command can only be run using an |authtoken| with a | |
|
159 | permissions. | |
|
159 | This command can only be run using an |authtoken| with at least | |
|
160 | permissions to create repository groups, or admin permissions to | |
|
161 | parent repository groups. | |
|
160 | 162 | |
|
161 | 163 | :param apiuser: This is filled automatically from the |authtoken|. |
|
162 | 164 | :type apiuser: AuthUser |
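
Like create_repo, the slashed `group_name` is decomposed into a parent path and a leaf name. An illustration only; the helper below is invented, and the real decomposition happens inside the bound RepoGroupSchema:

.. code-block:: python

    def split_group_name(full_name):
        # invented illustration of the "foo/bar/group1" decomposition
        if '/' not in full_name:
            return full_name, None       # top-level group, no parent
        parent, _, name = full_name.rpartition('/')
        return name, parent

    print(split_group_name('foo/bar/group1'))  # -> ('group1', 'foo/bar')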
@@ -193,72 +195,64 b' def create_repo_group(request, apiuser, ' | |||
|
193 | 195 | |
|
194 | 196 | """ |
|
195 | 197 | |
|
196 | schema = repo_group_schema.RepoGroupSchema() | |
|
197 | try: | |
|
198 | data = schema.deserialize({ | |
|
199 | 'group_name': group_name | |
|
200 | }) | |
|
201 | except colander.Invalid as e: | |
|
202 | raise JSONRPCError("Validation failed: %s" % (e.asdict(),)) | |
|
203 | group_name = data['group_name'] | |
|
198 | owner = validate_set_owner_permissions(apiuser, owner) | |
|
204 | 199 | |
|
205 | if isinstance(owner, Optional): | |
|
206 | owner = apiuser.user_id | |
|
207 | ||
|
208 | group_description = Optional.extract(description) | |
|
200 | description = Optional.extract(description) | |
|
209 | 201 | copy_permissions = Optional.extract(copy_permissions) |
|
210 | 202 | |
|
211 | # get by full name with parents, check if it already exist | |
|
212 | if RepoGroup.get_by_group_name(group_name): | |
|
213 | raise JSONRPCError("repo group `%s` already exist" % (group_name,)) | |
|
214 | ||
|
215 | (group_name_cleaned, | |
|
216 | parent_group_name) = RepoGroupModel()._get_group_name_and_parent( | |
|
217 | group_name) | |
|
203 | schema = repo_group_schema.RepoGroupSchema().bind( | |
|
204 | # user caller | |
|
205 | user=apiuser) | |
|
218 | 206 | |
|
219 | parent_group = None | |
|
220 | if parent_group_name: | |
|
221 | parent_group = get_repo_group_or_error(parent_group_name) | |
|
207 | try: | |
|
208 | schema_data = schema.deserialize(dict( | |
|
209 | repo_group_name=group_name, | |
|
210 | repo_group_owner=owner.username, | |
|
211 | repo_group_description=description, | |
|
212 | repo_group_copy_permissions=copy_permissions, | |
|
213 | )) | |
|
214 | except validation_schema.Invalid as err: | |
|
215 | raise JSONRPCValidationError(colander_exc=err) | |
|
222 | 216 | |
|
223 | if not HasPermissionAnyApi( | |
|
224 | 'hg.admin', 'hg.repogroup.create.true')(user=apiuser): | |
|
225 | # check if we have admin permission for this parent repo group ! | |
|
226 | # users without admin or hg.repogroup.create can only create other | |
|
227 | # groups in groups they own so this is a required, but can be empty | |
|
228 | parent_group = getattr(parent_group, 'group_name', '') | |
|
229 | _perms = ('group.admin',) | |
|
230 | if not HasRepoGroupPermissionAnyApi(*_perms)( | |
|
231 | user=apiuser, group_name=parent_group): | |
|
232 | raise JSONRPCForbidden() | |
|
217 | validated_group_name = schema_data['repo_group_name'] | |
|
233 | 218 | |
|
234 | 219 | try: |
|
235 | 220 | repo_group = RepoGroupModel().create( |
|
236 | group_name=group_name, | |
|
237 | group_description=group_description, | |
|
238 | 221 | owner=owner, |
|
239 | copy_permissions=copy_permissions) | |
|
222 | group_name=validated_group_name, | |
|
223 | group_description=schema_data['repo_group_name'], | |
|
224 | copy_permissions=schema_data['repo_group_copy_permissions']) | |
|
240 | 225 | Session().commit() |
|
241 | 226 | return { |
|
242 | 'msg': 'Created new repo group `%s`' % group_name, | |
|
227 | 'msg': 'Created new repo group `%s`' % validated_group_name, | |
|
243 | 228 | 'repo_group': repo_group.get_api_data() |
|
244 | 229 | } |
|
245 | 230 | except Exception: |
|
246 | 231 | log.exception("Exception occurred while trying create repo group") |
|
247 | 232 | raise JSONRPCError( |
|
248 | 'failed to create repo group `%s`' % (group_name,)) | |
|
233 | 'failed to create repo group `%s`' % (validated_group_name,)) | |
|
249 | 234 | |
|
250 | 235 | |
|
251 | 236 | @jsonrpc_method() |
|
252 | 237 | def update_repo_group( |
|
253 | 238 | request, apiuser, repogroupid, group_name=Optional(''), |
|
254 | 239 | description=Optional(''), owner=Optional(OAttr('apiuser')), |
|
255 | parent=Optional(None), enable_locking=Optional(False)): | |
|
240 | enable_locking=Optional(False)): | |
|
256 | 241 | """ |
|
257 | 242 | Updates repository group with the details given. |
|
258 | 243 | |
|
259 | 244 | This command can only be run using an |authtoken| with admin |
|
260 | 245 | permissions. |
|
261 | 246 | |
|
247 | * If the group_name name contains "/", repository group will be updated | |
|
248 | accordingly with a repository group or nested repository groups | |
|
249 | ||
|
250 | For example repogroupid=group-test group_name="foo/bar/group-test" | |
|
251 | will update repository group called "group-test" and place it | |
|
252 | inside group "foo/bar". | |
|
253 | You have to have permissions to access and write to the last repository | |
|
254 | group ("bar" in this example) | |
|
255 | ||
|
262 | 256 | :param apiuser: This is filled automatically from the |authtoken|. |
|
263 | 257 | :type apiuser: AuthUser |
|
264 | 258 | :param repogroupid: Set the ID of repository group. |
@@ -269,29 +263,55 b' def update_repo_group(' | |||
|
269 | 263 | :type description: str |
|
270 | 264 | :param owner: Set the |repo| group owner. |
|
271 | 265 | :type owner: str |
|
272 | :param parent: Set the |repo| group parent. | |
|
273 | :type parent: str or int | |
|
274 | 266 | :param enable_locking: Enable |repo| locking. The default is false. |
|
275 | 267 | :type enable_locking: bool |
|
276 | 268 | """ |
|
277 | 269 | |
|
278 | 270 | repo_group = get_repo_group_or_error(repogroupid) |
|
271 | ||
|
279 | 272 | if not has_superadmin_permission(apiuser): |
|
280 | # check if we have admin permission for this repo group ! | |
|
281 | _perms = ('group.admin',) | |
|
282 | if not HasRepoGroupPermissionAnyApi(*_perms)( | |
|
283 | user=apiuser, group_name=repo_group.group_name): | |
|
284 | raise JSONRPCError( | |
|
285 | 'repository group `%s` does not exist' % (repogroupid,)) | |
|
273 | validate_repo_group_permissions( | |
|
274 | apiuser, repogroupid, repo_group, ('group.admin',)) | |
|
275 | ||
|
276 | updates = dict( | |
|
277 | group_name=group_name | |
|
278 | if not isinstance(group_name, Optional) else repo_group.group_name, | |
|
279 | ||
|
280 | group_description=description | |
|
281 | if not isinstance(description, Optional) else repo_group.group_description, | |
|
282 | ||
|
283 | user=owner | |
|
284 | if not isinstance(owner, Optional) else repo_group.user.username, | |
|
285 | ||
|
286 | enable_locking=enable_locking | |
|
287 | if not isinstance(enable_locking, Optional) else repo_group.enable_locking | |
|
288 | ) | |
|
286 | 289 | |
|
287 | updates = {} | |
|
290 | schema = repo_group_schema.RepoGroupSchema().bind( | |
|
291 | # user caller | |
|
292 | user=apiuser, | |
|
293 | old_values=repo_group.get_api_data()) | |
|
294 | ||
|
288 | 295 | try: |
|
289 | store_update(updates, group_name, 'group_name') | |
|
290 | store_update(updates, description, 'group_description') | |
|
291 | store_update(updates, owner, 'user') | |
|
292 | store_update(updates, parent, 'group_parent_id') | |
|
293 | store_update(updates, enable_locking, 'enable_locking') | |
|
294 | repo_group = RepoGroupModel().update(repo_group, updates) | |
|
296 | schema_data = schema.deserialize(dict( | |
|
297 | repo_group_name=updates['group_name'], | |
|
298 | repo_group_owner=updates['user'], | |
|
299 | repo_group_description=updates['group_description'], | |
|
300 | repo_group_enable_locking=updates['enable_locking'], | |
|
301 | )) | |
|
302 | except validation_schema.Invalid as err: | |
|
303 | raise JSONRPCValidationError(colander_exc=err) | |
|
304 | ||
|
305 | validated_updates = dict( | |
|
306 | group_name=schema_data['repo_group']['repo_group_name_without_group'], | |
|
307 | group_parent_id=schema_data['repo_group']['repo_group_id'], | |
|
308 | user=schema_data['repo_group_owner'], | |
|
309 | group_description=schema_data['repo_group_description'], | |
|
310 | enable_locking=schema_data['repo_group_enable_locking'], | |
|
311 | ) | |
|
312 | ||
|
313 | try: | |
|
314 | RepoGroupModel().update(repo_group, validated_updates) | |
|
295 | 315 | Session().commit() |
|
296 | 316 | return { |
|
297 | 317 | 'msg': 'updated repository group ID:%s %s' % ( |
@@ -299,7 +319,9 b' def update_repo_group(' | |||
|
299 | 319 | 'repo_group': repo_group.get_api_data() |
|
300 | 320 | } |
|
301 | 321 | except Exception: |
|
302 | log.exception("Exception occurred while trying update repo group") | |
|
322 | log.exception( | |
|
323 | u"Exception occurred while trying update repo group %s", | |
|
324 | repogroupid) | |
|
303 | 325 | raise JSONRPCError('failed to update repository group `%s`' |
|
304 | 326 | % (repogroupid,)) |
|
305 | 327 | |
@@ -321,7 +343,7 b' def delete_repo_group(request, apiuser, ' | |||
|
321 | 343 | |
|
322 | 344 | id : <id_given_in_input> |
|
323 | 345 | result : { |
|
324 | 'msg': 'deleted repo group ID:<repogroupid> <repogroupname> | |
|
346 | 'msg': 'deleted repo group ID:<repogroupid> <repogroupname>' | |
|
325 | 347 | 'repo_group': null |
|
326 | 348 | } |
|
327 | 349 | error : null |
@@ -340,12 +362,9 b' def delete_repo_group(request, apiuser, ' | |||
|
340 | 362 | |
|
341 | 363 | repo_group = get_repo_group_or_error(repogroupid) |
|
342 | 364 | if not has_superadmin_permission(apiuser): |
|
343 | # check if we have admin permission for this repo group ! | |
|
344 | _perms = ('group.admin',) | |
|
345 | if not HasRepoGroupPermissionAnyApi(*_perms)( | |
|
346 | user=apiuser, group_name=repo_group.group_name): | |
|
347 | raise JSONRPCError( | |
|
348 | 'repository group `%s` does not exist' % (repogroupid,)) | |
|
365 | validate_repo_group_permissions( | |
|
366 | apiuser, repogroupid, repo_group, ('group.admin',)) | |
|
367 | ||
|
349 | 368 | try: |
|
350 | 369 | RepoGroupModel().delete(repo_group) |
|
351 | 370 | Session().commit() |
@@ -408,12 +427,8 b' def grant_user_permission_to_repo_group(' | |||
|
408 | 427 | repo_group = get_repo_group_or_error(repogroupid) |
|
409 | 428 | |
|
410 | 429 | if not has_superadmin_permission(apiuser): |
|
411 | # check if we have admin permission for this repo group ! | |
|
412 | _perms = ('group.admin',) | |
|
413 | if not HasRepoGroupPermissionAnyApi(*_perms)( | |
|
414 | user=apiuser, group_name=repo_group.group_name): | |
|
415 | raise JSONRPCError( | |
|
416 | 'repository group `%s` does not exist' % (repogroupid,)) | |
|
430 | validate_repo_group_permissions( | |
|
431 | apiuser, repogroupid, repo_group, ('group.admin',)) | |
|
417 | 432 | |
|
418 | 433 | user = get_user_or_error(userid) |
|
419 | 434 | perm = get_perm_or_error(perm, prefix='group.') |
@@ -487,12 +502,8 b' def revoke_user_permission_from_repo_gro' | |||
|
487 | 502 | repo_group = get_repo_group_or_error(repogroupid) |
|
488 | 503 | |
|
489 | 504 | if not has_superadmin_permission(apiuser): |
|
490 | # check if we have admin permission for this repo group ! | |
|
491 | _perms = ('group.admin',) | |
|
492 | if not HasRepoGroupPermissionAnyApi(*_perms)( | |
|
493 | user=apiuser, group_name=repo_group.group_name): | |
|
494 | raise JSONRPCError( | |
|
495 | 'repository group `%s` does not exist' % (repogroupid,)) | |
|
505 | validate_repo_group_permissions( | |
|
506 | apiuser, repogroupid, repo_group, ('group.admin',)) | |
|
496 | 507 | |
|
497 | 508 | user = get_user_or_error(userid) |
|
498 | 509 | apply_to_children = Optional.extract(apply_to_children) |
@@ -569,12 +580,8 b' def grant_user_group_permission_to_repo_' | |||
|
569 | 580 | perm = get_perm_or_error(perm, prefix='group.') |
|
570 | 581 | user_group = get_user_group_or_error(usergroupid) |
|
571 | 582 | if not has_superadmin_permission(apiuser): |
|
572 | # check if we have admin permission for this repo group ! | |
|
573 | _perms = ('group.admin',) | |
|
574 | if not HasRepoGroupPermissionAnyApi(*_perms)( | |
|
575 | user=apiuser, group_name=repo_group.group_name): | |
|
576 | raise JSONRPCError( | |
|
577 | 'repository group `%s` does not exist' % (repogroupid,)) | |
|
583 | validate_repo_group_permissions( | |
|
584 | apiuser, repogroupid, repo_group, ('group.admin',)) | |
|
578 | 585 | |
|
579 | 586 | # check if we have at least read permission for this user group ! |
|
580 | 587 | _perms = ('usergroup.read', 'usergroup.write', 'usergroup.admin',) |
@@ -656,12 +663,8 b' def revoke_user_group_permission_from_re' | |||
|
656 | 663 | repo_group = get_repo_group_or_error(repogroupid) |
|
657 | 664 | user_group = get_user_group_or_error(usergroupid) |
|
658 | 665 | if not has_superadmin_permission(apiuser): |
|
659 | # check if we have admin permission for this repo group ! | |
|
660 | _perms = ('group.admin',) | |
|
661 | if not HasRepoGroupPermissionAnyApi(*_perms)( | |
|
662 | user=apiuser, group_name=repo_group.group_name): | |
|
663 | raise JSONRPCError( | |
|
664 | 'repository group `%s` does not exist' % (repogroupid,)) | |
|
666 | validate_repo_group_permissions( | |
|
667 | apiuser, repogroupid, repo_group, ('group.admin',)) | |
|
665 | 668 | |
|
666 | 669 | # check if we have at least read permission for this user group ! |
|
667 | 670 | _perms = ('usergroup.read', 'usergroup.write', 'usergroup.admin',) |
@@ -60,7 +60,13 b' def get_server_info(request, apiuser):' | |||
|
60 | 60 | if not has_superadmin_permission(apiuser): |
|
61 | 61 | raise JSONRPCForbidden() |
|
62 | 62 | |
|
63 | return ScmModel().get_server_info(request.environ) | |
|
63 | server_info = ScmModel().get_server_info(request.environ) | |
|
64 | # rhodecode-index requires those | |
|
65 | ||
|
66 | server_info['index_storage'] = server_info['search']['value']['location'] | |
|
67 | server_info['storage'] = server_info['storage']['value']['path'] | |
|
68 | ||
|
69 | return server_info | |
|
64 | 70 | |
|
65 | 71 | |
|
66 | 72 | @jsonrpc_method() |
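
The added lines flatten the two nested values that rhodecode-index reads. A sketch of the reshaping, with an invented payload of the same shape:

.. code-block:: python

    # Invented sample payload mirroring the nested shape unpacked above.
    server_info = {
        'search': {'value': {'location': '/srv/rc/index'}},
        'storage': {'value': {'path': '/srv/rc/repos'}},
    }
    server_info['index_storage'] = server_info['search']['value']['location']
    server_info['storage'] = server_info['storage']['value']['path']
    print(server_info['index_storage'])  # /srv/rc/index
    print(server_info['storage'])        # /srv/rc/repos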
@@ -25,7 +25,7 b' from rhodecode.api.utils import (' | |||
|
25 | 25 | Optional, OAttr, has_superadmin_permission, get_user_or_error, store_update) |
|
26 | 26 | from rhodecode.lib.auth import AuthUser, PasswordGenerator |
|
27 | 27 | from rhodecode.lib.exceptions import DefaultUserException |
|
28 | from rhodecode.lib.utils2 import safe_int | |
|
28 | from rhodecode.lib.utils2 import safe_int, str2bool | |
|
29 | 29 | from rhodecode.model.db import Session, User, Repository |
|
30 | 30 | from rhodecode.model.user import UserModel |
|
31 | 31 | |
@@ -81,6 +81,7 b' def get_user(request, apiuser, userid=Op' | |||
|
81 | 81 | "usergroup.read", |
|
82 | 82 | "hg.repogroup.create.false", |
|
83 | 83 | "hg.create.none", |
|
84 | "hg.password_reset.enabled", | |
|
84 | 85 | "hg.extern_activate.manual", |
|
85 | 86 | "hg.create.write_on_repogroup.false", |
|
86 | 87 | "hg.usergroup.create.false", |
@@ -154,7 +155,8 b' def create_user(request, apiuser, userna' | |||
|
154 | 155 | active=Optional(True), admin=Optional(False), |
|
155 | 156 | extern_name=Optional('rhodecode'), |
|
156 | 157 | extern_type=Optional('rhodecode'), |
|
157 | force_password_change=Optional(False)): | |
|
158 | force_password_change=Optional(False), | |
|
159 | create_personal_repo_group=Optional(None)): | |
|
158 | 160 | """ |
|
159 | 161 | Creates a new user and returns the new user object. |
|
160 | 162 | |
@@ -187,7 +189,8 b' def create_user(request, apiuser, userna' | |||
|
187 | 189 | :param force_password_change: Force the new user to change password |
|
188 | 190 | on next login. |
|
189 | 191 | :type force_password_change: Optional(``True`` | ``False``) |
|
190 | ||
|
192 | :param create_personal_repo_group: Create personal repo group for this user | |
|
193 | :type create_personal_repo_group: Optional(``True`` | ``False``) | |
|
191 | 194 | Example output: |
|
192 | 195 | |
|
193 | 196 | .. code-block:: bash |
@@ -229,6 +232,9 b' def create_user(request, apiuser, userna' | |||
|
229 | 232 | Optional.extract(extern_name) != 'rhodecode'): |
|
230 | 233 | # generate temporary password if user is external |
|
231 | 234 | password = PasswordGenerator().gen_password(length=16) |
|
235 | create_repo_group = Optional.extract(create_personal_repo_group) | |
|
236 | if isinstance(create_repo_group, basestring): | |
|
237 | create_repo_group = str2bool(create_repo_group) | |
|
232 | 238 | |
|
233 | 239 | try: |
|
234 | 240 | user = UserModel().create_or_update( |
@@ -242,6 +248,7 b' def create_user(request, apiuser, userna' | |||
|
242 | 248 | extern_type=Optional.extract(extern_type), |
|
243 | 249 | extern_name=Optional.extract(extern_name), |
|
244 | 250 | force_password_change=Optional.extract(force_password_change), |
|
251 | create_repo_group=create_repo_group | |
|
245 | 252 | ) |
|
246 | 253 | Session().commit() |
|
247 | 254 | return { |
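
Because `create_personal_repo_group` may arrive over JSON-RPC as a string, the hunk above coerces it with str2bool before use. A sketch of that coercion; the accepted spellings are an assumption, the real helper lives in rhodecode.lib.utils2:

.. code-block:: python

    def str2bool(value):
        # sketch of rhodecode.lib.utils2.str2bool; accepted spellings assumed
        return str(value).strip().lower() in ('true', 'yes', 'on', 'y', 't', '1')

    flag = 'true'  # e.g. a value arriving as a string over JSON-RPC
    if isinstance(flag, str):  # the diff itself uses basestring (Python 2)
        flag = str2bool(flag)
    print(flag)  # True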
@@ -226,6 +226,15 b' class RhodeCodeAuthPluginBase(object):' | |||
|
226 | 226 | """ |
|
227 | 227 | raise NotImplementedError("Not implemented in base class") |
|
228 | 228 | |
|
229 | def get_url_slug(self): | |
|
230 | """ | |
|
231 | Returns a slug which should be used when constructing URLs which refer | |
|
232 | to this plugin. By default it returns the plugin name. If the name is | |
|
233 | not suitable for using it in an URL the plugin should override this | |
|
234 | method. | |
|
235 | """ | |
|
236 | return self.name | |
|
237 | ||
|
229 | 238 | @property |
|
230 | 239 | def is_headers_auth(self): |
|
231 | 240 | """ |
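
A plugin whose display name is not URL-safe would override the new hook. A hypothetical subclass, with a stand-in base class so the sketch stays self-contained:

.. code-block:: python

    class AuthPluginBase(object):
        """Stand-in for RhodeCodeAuthPluginBase, reduced to this sketch."""
        name = 'plugin'

        def get_url_slug(self):
            return self.name  # default: the plugin name

    class CrowdAuthPlugin(AuthPluginBase):  # hypothetical plugin
        name = 'Atlassian Crowd'

        def get_url_slug(self):
            return 'crowd'  # display name contains a space, so slugify it

    print(CrowdAuthPlugin().get_url_slug())  # crowd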
@@ -45,7 +45,7 b' class AuthnPluginResourceBase(AuthnResou' | |||
|
45 | 45 | |
|
46 | 46 | def __init__(self, plugin): |
|
47 | 47 | self.plugin = plugin |
|
48 | self.__name__ = plugin.name | |
|
48 | self.__name__ = plugin.get_url_slug() | |
|
49 | 49 | self.display_name = plugin.get_display_name() |
|
50 | 50 | |
|
51 | 51 |
@@ -84,7 +84,7 b' class AuthnPluginViewBase(object):' | |||
|
84 | 84 | |
|
85 | 85 | try: |
|
86 | 86 | valid_data = schema.deserialize(data) |
|
87 | except colander.Invalid, e: | |
|
87 | except colander.Invalid as e: | |
|
88 | 88 | # Display error message and display form again. |
|
89 | 89 | self.request.session.flash( |
|
90 | 90 | _('Errors exist when saving plugin settings. ' |
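
The change above swaps the old Python 2 comma spelling for `except ... as`, the only form accepted by both Python 2.6+ and Python 3. For example:

.. code-block:: python

    import colander

    schema = colander.SchemaNode(colander.Int())
    try:
        schema.deserialize('not-a-number')
    except colander.Invalid as e:  # `except colander.Invalid, e:` fails on Python 3
        print(e.asdict())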
@@ -31,9 +31,9 b' from pyramid.authorization import ACLAut' | |||
|
31 | 31 | from pyramid.config import Configurator |
|
32 | 32 | from pyramid.settings import asbool, aslist |
|
33 | 33 | from pyramid.wsgi import wsgiapp |
|
34 | from pyramid.httpexceptions import HTTPError, HTTPInternalServerError | |
|
34 | from pyramid.httpexceptions import ( | |
|
35 | HTTPError, HTTPInternalServerError, HTTPFound) | |
|
35 | 36 | from pyramid.events import ApplicationCreated |
|
36 | import pyramid.httpexceptions as httpexceptions | |
|
37 | 37 | from pyramid.renderers import render_to_response |
|
38 | 38 | from routes.middleware import RoutesMiddleware |
|
39 | 39 | import routes.util |
@@ -44,10 +44,10 b' from rhodecode.config import patches' | |||
|
44 | 44 | from rhodecode.config.routing import STATIC_FILE_PREFIX |
|
45 | 45 | from rhodecode.config.environment import ( |
|
46 | 46 | load_environment, load_pyramid_environment) |
|
47 | from rhodecode.lib.exceptions import VCSServerUnavailable | |
|
48 | from rhodecode.lib.vcs.exceptions import VCSCommunicationError | |
|
49 | 47 | from rhodecode.lib.middleware import csrf |
|
50 | 48 | from rhodecode.lib.middleware.appenlight import wrap_in_appenlight_if_enabled |
|
49 | from rhodecode.lib.middleware.error_handling import ( | |
|
50 | PylonsErrorHandlingMiddleware) | |
|
51 | 51 | from rhodecode.lib.middleware.https_fixup import HttpsFixup |
|
52 | 52 | from rhodecode.lib.middleware.vcs import VCSMiddleware |
|
53 | 53 | from rhodecode.lib.plugins.utils import register_rhodecode_plugin |
@@ -186,53 +186,27 b' def make_not_found_view(config):' | |||
|
186 | 186 | pylons_app, appenlight_client = wrap_in_appenlight_if_enabled( |
|
187 | 187 | pylons_app, settings) |
|
188 | 188 | |
|
189 | # The VCSMiddleware shall operate like a fallback if pyramid doesn't find | |
|
190 | # a view to handle the request. Therefore we wrap it around the pylons app. | |
|
189 | # The pylons app is executed inside of the pyramid 404 exception handler. | |
|
190 | # Exceptions which are raised inside of it are not handled by pyramid | |
|
191 | # again. Therefore we add a middleware that invokes the error handler in | |
|
192 | # case of an exception or error response. This way we return proper error | |
|
193 | # HTML pages in case of an error. | |
|
194 | reraise = (settings.get('debugtoolbar.enabled', False) or | |
|
195 | rhodecode.disable_error_handler) | |
|
196 | pylons_app = PylonsErrorHandlingMiddleware( | |
|
197 | pylons_app, error_handler, reraise) | |
|
198 | ||
|
199 | # The VCSMiddleware shall operate like a fallback if pyramid doesn't find a | |
|
200 | # view to handle the request. Therefore it is wrapped around the pylons | |
|
201 | # app. It has to be outside of the error handling otherwise error responses | |
|
202 | # from the vcsserver are converted to HTML error pages. This confuses the | |
|
203 | # command line tools and the user won't get a meaningful error message. | |
|
191 | 204 | if vcs_server_enabled: |
|
192 | 205 | pylons_app = VCSMiddleware( |
|
193 | 206 | pylons_app, settings, appenlight_client, registry=config.registry) |
|
194 | 207 | |
|
195 | pylons_app_as_view = wsgiapp(pylons_app) | |
|
196 | ||
|
197 | def pylons_app_with_error_handler(context, request): | |
|
198 | """ | |
|
199 | Handle exceptions from rc pylons app: | |
|
200 | ||
|
201 | - old webob type exceptions get converted to pyramid exceptions | |
|
202 | - pyramid exceptions are passed to the error handler view | |
|
203 | """ | |
|
204 | def is_vcs_response(response): | |
|
205 | return 'X-RhodeCode-Backend' in response.headers | |
|
206 | ||
|
207 | def is_http_error(response): | |
|
208 | # webob type error responses | |
|
209 | return (400 <= response.status_int <= 599) | |
|
210 | ||
|
211 | def is_error_handling_needed(response): | |
|
212 | return is_http_error(response) and not is_vcs_response(response) | |
|
213 | ||
|
214 | try: | |
|
215 | response = pylons_app_as_view(context, request) | |
|
216 | if is_error_handling_needed(response): | |
|
217 | response = webob_to_pyramid_http_response(response) | |
|
218 | return error_handler(response, request) | |
|
219 | except HTTPError as e: # pyramid type exceptions | |
|
220 | return error_handler(e, request) | |
|
221 | except Exception as e: | |
|
222 | log.exception(e) | |
|
223 | ||
|
224 | if (settings.get('debugtoolbar.enabled', False) or | |
|
225 | rhodecode.disable_error_handler): | |
|
226 | raise | |
|
227 | ||
|
228 | if isinstance(e, VCSCommunicationError): | |
|
229 | return error_handler(VCSServerUnavailable(), request) | |
|
230 | ||
|
231 | return error_handler(HTTPInternalServerError(), request) | |
|
232 | ||
|
233 | return response | |
|
234 | ||
|
235 | return pylons_app_with_error_handler | |
|
208 | # Convert WSGI app to pyramid view and return it. | |
|
209 | return wsgiapp(pylons_app) | |
|
236 | 210 | |
|
237 | 211 | |
|
238 | 212 | def add_pylons_compat_data(registry, global_config, settings): |
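
The comments above describe the new ordering: the error-handling middleware wraps the pylons app, while ``VCSMiddleware`` stays outside of it so vcsserver error responses reach command-line clients untouched. A rough sketch of the wrapping idea (assumed shape only; the real logic lives in ``PylonsErrorHandlingMiddleware``):

.. code-block:: python

    class SketchedErrorHandlingMiddleware(object):
        """Invoke an error handler on exceptions unless `reraise` is set."""

        def __init__(self, app, error_view, reraise=False):
            self.app = app
            self.error_view = error_view
            self.reraise = reraise

        def __call__(self, environ, start_response):
            try:
                return self.app(environ, start_response)
            except Exception:
                if self.reraise:  # e.g. the debug toolbar wants the raw traceback
                    raise
                return self.error_view(environ, start_response)
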
@@ -243,16 +217,6 b' def add_pylons_compat_data(registry, glo' | |||
|
243 | 217 | registry._pylons_compat_settings = settings |
|
244 | 218 | |
|
245 | 219 | |
|
246 | def webob_to_pyramid_http_response(webob_response): | |
|
247 | ResponseClass = httpexceptions.status_map[webob_response.status_int] | |
|
248 | pyramid_response = ResponseClass(webob_response.status) | |
|
249 | pyramid_response.status = webob_response.status | |
|
250 | pyramid_response.headers.update(webob_response.headers) | |
|
251 | if pyramid_response.headers['content-type'] == 'text/html': | |
|
252 | pyramid_response.headers['content-type'] = 'text/html; charset=UTF-8' | |
|
253 | return pyramid_response | |
|
254 | ||
|
255 | ||
|
256 | 220 | def error_handler(exception, request): |
|
257 | 221 | from rhodecode.model.settings import SettingsModel |
|
258 | 222 | from rhodecode.lib.utils2 import AttributeDict |
@@ -466,10 +430,11 b' def _sanitize_vcs_settings(settings):' | |||
|
466 | 430 | """ |
|
467 | 431 | _string_setting(settings, 'vcs.svn.compatible_version', '') |
|
468 | 432 | _string_setting(settings, 'git_rev_filter', '--all') |
|
469 | _string_setting(settings, 'vcs.hooks.protocol', 'pyro4') | |
|
|
433 | _string_setting(settings, 'vcs.hooks.protocol', 'http') | |
|
434 | _string_setting(settings, 'vcs.scm_app_implementation', 'http') | |
|
470 | 435 | _string_setting(settings, 'vcs.server', '') |
|
471 | 436 | _string_setting(settings, 'vcs.server.log_level', 'debug') |
|
472 | _string_setting(settings, 'vcs.server.protocol', 'pyro4') | |
|
|
437 | _string_setting(settings, 'vcs.server.protocol', 'http') | |
|
473 | 438 | _bool_setting(settings, 'startup.import_repos', 'false') |
|
474 | 439 | _bool_setting(settings, 'vcs.hooks.direct_calls', 'false') |
|
475 | 440 | _bool_setting(settings, 'vcs.server.enable', 'true') |
@@ -477,6 +442,13 b' def _sanitize_vcs_settings(settings):' | |||
|
477 | 442 | _list_setting(settings, 'vcs.backends', 'hg, git, svn') |
|
478 | 443 | _int_setting(settings, 'vcs.connection_timeout', 3600) |
|
479 | 444 | |
|
445 | # Support legacy values of vcs.scm_app_implementation. Legacy | |
|
446 | # configurations may use 'rhodecode.lib.middleware.utils.scm_app_http' | |
|
447 | # which is now mapped to 'http'. | |
|
448 | scm_app_impl = settings['vcs.scm_app_implementation'] | |
|
449 | if scm_app_impl == 'rhodecode.lib.middleware.utils.scm_app_http': | |
|
450 | settings['vcs.scm_app_implementation'] = 'http' | |
|
451 | ||
|
480 | 452 | |
|
481 | 453 | def _int_setting(settings, name, default): |
|
482 | 454 | settings[name] = int(settings.get(name, default)) |
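
The legacy-value normalization added in this hunk can be exercised in isolation; a runnable sketch with a plain dict standing in for the parsed ``.ini`` settings:

.. code-block:: python

    settings = {
        # legacy spelling still found in old configurations
        'vcs.scm_app_implementation':
            'rhodecode.lib.middleware.utils.scm_app_http',
    }
    if settings['vcs.scm_app_implementation'] == \
            'rhodecode.lib.middleware.utils.scm_app_http':
        settings['vcs.scm_app_implementation'] = 'http'

    assert settings['vcs.scm_app_implementation'] == 'http'
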
@@ -501,5 +473,8 b' def _list_setting(settings, name, defaul' | |||
|
501 | 473 | settings[name] = aslist(raw_value) |
|
502 | 474 | |
|
503 | 475 | |
|
504 | def _string_setting(settings, name, default): | |
|
505 | settings[name] = settings.get(name, default).lower() | |
|
|
476 | def _string_setting(settings, name, default, lower=True): | |
|
477 | value = settings.get(name, default) | |
|
478 | if lower: | |
|
479 | value = value.lower() | |
|
480 | settings[name] = value |
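
Usage sketch for the extended helper (the helper body is copied from the hunk above; the sample values are made up): string settings stay lower-cased by default, and callers that need to preserve case pass ``lower=False``.

.. code-block:: python

    def _string_setting(settings, name, default, lower=True):
        value = settings.get(name, default)
        if lower:
            value = value.lower()
        settings[name] = value

    settings = {'vcs.server': 'LOCALHOST:9900'}
    _string_setting(settings, 'vcs.server', '')
    assert settings['vcs.server'] == 'localhost:9900'

    _string_setting(settings, 'some.path', '/Repos/Mine', lower=False)
    assert settings['some.path'] == '/Repos/Mine'
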
@@ -196,7 +196,7 b' def make_map(config):' | |||
|
196 | 196 | rmap.connect('user_autocomplete_data', '/_users', controller='home', |
|
197 | 197 | action='user_autocomplete_data', jsroute=True) |
|
198 | 198 | rmap.connect('user_group_autocomplete_data', '/_user_groups', controller='home', |
|
199 | action='user_group_autocomplete_data') | |
|
199 | action='user_group_autocomplete_data', jsroute=True) | |
|
200 | 200 | |
|
201 | 201 | rmap.connect( |
|
202 | 202 | 'user_profile', '/_profiles/{username}', controller='users', |
@@ -305,7 +305,7 b' def make_map(config):' | |||
|
305 | 305 | m.connect('delete_user', '/users/{user_id}', |
|
306 | 306 | action='delete', conditions={'method': ['DELETE']}) |
|
307 | 307 | m.connect('edit_user', '/users/{user_id}/edit', |
|
308 | action='edit', conditions={'method': ['GET']}) | |
|
308 | action='edit', conditions={'method': ['GET']}, jsroute=True) | |
|
309 | 309 | m.connect('user', '/users/{user_id}', |
|
310 | 310 | action='show', conditions={'method': ['GET']}) |
|
311 | 311 | m.connect('force_password_reset_user', '/users/{user_id}/password_reset', |
@@ -389,7 +389,7 b' def make_map(config):' | |||
|
389 | 389 | |
|
390 | 390 | m.connect('edit_user_group_members', |
|
391 | 391 | '/user_groups/{user_group_id}/edit/members', jsroute=True, |
|
392 | action='edit_members', conditions={'method': ['GET']}) | |
|
|
392 | action='user_group_members', conditions={'method': ['GET']}) | |
|
393 | 393 | |
|
394 | 394 | # ADMIN PERMISSIONS ROUTES |
|
395 | 395 | with rmap.submapper(path_prefix=ADMIN_PREFIX, |
@@ -699,6 +699,9 b' def make_map(config):' | |||
|
699 | 699 | rmap.connect('repo_refs_changelog_data', '/{repo_name}/refs-data-changelog', |
|
700 | 700 | controller='summary', action='repo_refs_changelog_data', |
|
701 | 701 | requirements=URL_NAME_REQUIREMENTS, jsroute=True) |
|
702 | rmap.connect('repo_default_reviewers_data', '/{repo_name}/default-reviewers', | |
|
703 | controller='summary', action='repo_default_reviewers_data', | |
|
704 | jsroute=True, requirements=URL_NAME_REQUIREMENTS) | |
|
702 | 705 | |
|
703 | 706 | rmap.connect('changeset_home', '/{repo_name}/changeset/{revision}', |
|
704 | 707 | controller='changeset', revision='tip', jsroute=True, |
@@ -824,6 +827,10 b' def make_map(config):' | |||
|
824 | 827 | controller='admin/repos', action='repo_delete_svn_pattern', |
|
825 | 828 | conditions={'method': ['DELETE'], 'function': check_repo}, |
|
826 | 829 | requirements=URL_NAME_REQUIREMENTS) |
|
830 | rmap.connect('repo_pullrequest_settings', '/{repo_name}/settings/pullrequest', | |
|
831 | controller='admin/repos', action='repo_settings_pullrequest', | |
|
832 | conditions={'method': ['GET', 'POST'], 'function': check_repo}, | |
|
833 | requirements=URL_NAME_REQUIREMENTS) | |
|
827 | 834 | |
|
828 | 835 | # still working url for backward compat. |
|
829 | 836 | rmap.connect('raw_changeset_home_depraced', |
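
Several routes in this changeset gain ``jsroute=True``. That flag marks a route for export to the client-side JavaScript router; purely as an illustration (the data and serialization below are assumptions, not RhodeCode's actual mechanism), the exported registry boils down to name/URL-template pairs:

.. code-block:: python

    import json

    # (name, url-template) pairs for routes flagged with jsroute=True
    js_routes = [
        ('user_group_autocomplete_data', '/_user_groups'),
        ('edit_user', '/users/%(user_id)s/edit'),
        ('repo_default_reviewers_data', '/%(repo_name)s/default-reviewers'),
    ]

    # Serialized once and handed to the frontend router.
    print(json.dumps(dict(js_routes)))
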
@@ -28,10 +28,9 b' import logging' | |||
|
28 | 28 | |
|
29 | 29 | import formencode |
|
30 | 30 | import peppercorn |
|
31 | from formencode import htmlfill | |
|
32 | 31 | |
|
33 | 32 | from pylons import request, response, tmpl_context as c, url |
|
34 | from pylons.controllers.util import | |
|
|
33 | from pylons.controllers.util import redirect | |
|
35 | 34 | from pylons.i18n.translation import _ |
|
36 | 35 | from webob.exc import HTTPNotFound, HTTPForbidden |
|
37 | 36 | from sqlalchemy.sql.expression import or_ |
@@ -45,7 +44,7 b' from rhodecode.lib import helpers as h' | |||
|
45 | 44 | from rhodecode.lib.base import BaseController, render |
|
46 | 45 | from rhodecode.lib.auth import LoginRequired, NotAnonymous |
|
47 | 46 | from rhodecode.lib.utils import jsonify |
|
48 | from rhodecode.lib.utils2 import | |
|
|
47 | from rhodecode.lib.utils2 import time_to_datetime | |
|
49 | 48 | from rhodecode.lib.ext_json import json |
|
50 | 49 | from rhodecode.lib.vcs.exceptions import VCSError, NodeNotChangedError |
|
51 | 50 | from rhodecode.model import validation_schema |
@@ -39,19 +39,20 b' from rhodecode.lib.auth import (' | |||
|
39 | 39 | LoginRequired, NotAnonymous, AuthUser, generate_auth_token) |
|
40 | 40 | from rhodecode.lib.base import BaseController, render |
|
41 | 41 | from rhodecode.lib.utils import jsonify |
|
42 | from rhodecode.lib.utils2 import safe_int, md5 | |
|
42 | from rhodecode.lib.utils2 import safe_int, md5, str2bool | |
|
43 | 43 | from rhodecode.lib.ext_json import json |
|
44 | 44 | |
|
45 | 45 | from rhodecode.model.validation_schema.schemas import user_schema |
|
46 | 46 | from rhodecode.model.db import ( |
|
47 | Repository, PullRequest, UserEmailMap, User, | |
|
|
48 | UserFollowing) | |
|
47 | Repository, PullRequest, UserEmailMap, User, UserFollowing) | |
|
49 | 48 | from rhodecode.model.forms import UserForm |
|
50 | 49 | from rhodecode.model.scm import RepoList |
|
51 | 50 | from rhodecode.model.user import UserModel |
|
52 | 51 | from rhodecode.model.repo import RepoModel |
|
53 | 52 | from rhodecode.model.auth_token import AuthTokenModel |
|
54 | 53 | from rhodecode.model.meta import Session |
|
54 | from rhodecode.model.pull_request import PullRequestModel | |
|
55 | from rhodecode.model.comment import ChangesetCommentsModel | |
|
55 | 56 | |
|
56 | 57 | log = logging.getLogger(__name__) |
|
57 | 58 | |
@@ -289,25 +290,85 b' class MyAccountController(BaseController' | |||
|
289 | 290 | category='success') |
|
290 | 291 | return redirect(url('my_account_emails')) |
|
291 | 292 | |
|
293 | def _extract_ordering(self, request): | |
|
294 | column_index = safe_int(request.GET.get('order[0][column]')) | |
|
295 | order_dir = request.GET.get('order[0][dir]', 'desc') | |
|
296 | order_by = request.GET.get( | |
|
297 | 'columns[%s][data][sort]' % column_index, 'name_raw') | |
|
298 | return order_by, order_dir | |
|
299 | ||
|
300 | def _get_pull_requests_list(self, statuses): | |
|
301 | start = safe_int(request.GET.get('start'), 0) | |
|
302 | length = safe_int(request.GET.get('length'), c.visual.dashboard_items) | |
|
303 | order_by, order_dir = self._extract_ordering(request) | |
|
304 | ||
|
305 | pull_requests = PullRequestModel().get_im_participating_in( | |
|
306 | user_id=c.rhodecode_user.user_id, | |
|
307 | statuses=statuses, | |
|
308 | offset=start, length=length, order_by=order_by, | |
|
309 | order_dir=order_dir) | |
|
310 | ||
|
311 | pull_requests_total_count = PullRequestModel().count_im_participating_in( | |
|
312 | user_id=c.rhodecode_user.user_id, statuses=statuses) | |
|
313 | ||
|
314 | from rhodecode.lib.utils import PartialRenderer | |
|
315 | _render = PartialRenderer('data_table/_dt_elements.html') | |
|
316 | data = [] | |
|
317 | for pr in pull_requests: | |
|
318 | repo_id = pr.target_repo_id | |
|
319 | comments = ChangesetCommentsModel().get_all_comments( | |
|
320 | repo_id, pull_request=pr) | |
|
321 | owned = pr.user_id == c.rhodecode_user.user_id | |
|
322 | status = pr.calculated_review_status() | |
|
323 | ||
|
324 | data.append({ | |
|
325 | 'target_repo': _render('pullrequest_target_repo', | |
|
326 | pr.target_repo.repo_name), | |
|
327 | 'name': _render('pullrequest_name', | |
|
328 | pr.pull_request_id, pr.target_repo.repo_name, | |
|
329 | short=True), | |
|
330 | 'name_raw': pr.pull_request_id, | |
|
331 | 'status': _render('pullrequest_status', status), | |
|
332 | 'title': _render( | |
|
333 | 'pullrequest_title', pr.title, pr.description), | |
|
334 | 'description': h.escape(pr.description), | |
|
335 | 'updated_on': _render('pullrequest_updated_on', | |
|
336 | h.datetime_to_time(pr.updated_on)), | |
|
337 | 'updated_on_raw': h.datetime_to_time(pr.updated_on), | |
|
338 | 'created_on': _render('pullrequest_updated_on', | |
|
339 | h.datetime_to_time(pr.created_on)), | |
|
340 | 'created_on_raw': h.datetime_to_time(pr.created_on), | |
|
341 | 'author': _render('pullrequest_author', | |
|
342 | pr.author.full_contact, ), | |
|
343 | 'author_raw': pr.author.full_name, | |
|
344 | 'comments': _render('pullrequest_comments', len(comments)), | |
|
345 | 'comments_raw': len(comments), | |
|
346 | 'closed': pr.is_closed(), | |
|
347 | 'owned': owned | |
|
348 | }) | |
|
349 | # json used to render the grid | |
|
350 | data = ({ | |
|
351 | 'data': data, | |
|
352 | 'recordsTotal': pull_requests_total_count, | |
|
353 | 'recordsFiltered': pull_requests_total_count, | |
|
354 | }) | |
|
355 | return data | |
|
356 | ||
|
292 | 357 | def my_account_pullrequests(self): |
|
293 | 358 | c.active = 'pullrequests' |
|
294 | 359 | self.__load_data() |
|
295 | c.show_closed = request.GET.get('pr_show_closed') | |
|
296 | ||
|
297 | def _filter(pr): | |
|
298 | s = sorted(pr, key=lambda o: o.created_on, reverse=True) | |
|
299 | if not c.show_closed: | |
|
300 | s = filter(lambda p: p.status != PullRequest.STATUS_CLOSED, s) | |
|
301 | return s | |
|
360 | c.show_closed = str2bool(request.GET.get('pr_show_closed')) | |
|
302 | 361 | |
|
303 | c.my_pull_requests = _filter( | |
|
304 | PullRequest.query().filter( | |
|
305 | PullRequest.user_id == c.rhodecode_user.user_id).all()) | |
|
306 | my_prs = [ | |
|
307 | x.pull_request for x in PullRequestReviewers.query().filter( | |
|
308 | PullRequestReviewers.user_id == c.rhodecode_user.user_id).all()] | |
|
309 | c.participate_in_pull_requests = _filter(my_prs) | |
|
362 | statuses = [PullRequest.STATUS_NEW, PullRequest.STATUS_OPEN] | |
|
363 | if c.show_closed: | |
|
364 | statuses += [PullRequest.STATUS_CLOSED] | |
|
365 | data = self._get_pull_requests_list(statuses) | |
|
366 | if not request.is_xhr: | |
|
367 | c.data_participate = json.dumps(data['data']) | |
|
368 | c.records_total_participate = data['recordsTotal'] | |
|
310 | 369 | return render('admin/my_account/my_account.html') |
|
370 | else: | |
|
371 | return json.dumps(data) | |
|
311 | 372 | |
|
312 | 373 | def my_account_auth_tokens(self): |
|
313 | 374 | c.active = 'auth_tokens' |
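
The new ``_extract_ordering``/``_get_pull_requests_list`` pair reads DataTables-style query parameters. A standalone sketch of that parameter handling, with a plain dict standing in for ``request.GET``:

.. code-block:: python

    def safe_int(val, default=None):
        try:
            return int(val)
        except (TypeError, ValueError):
            return default

    GET = {  # what a DataTables grid sends for "page 3, sort column 2 asc"
        'start': '20', 'length': '10',
        'order[0][column]': '2', 'order[0][dir]': 'asc',
        'columns[2][data][sort]': 'updated_on_raw',
    }

    column_index = safe_int(GET.get('order[0][column]'))
    order_dir = GET.get('order[0][dir]', 'desc')
    order_by = GET.get('columns[%s][data][sort]' % column_index, 'name_raw')
    start = safe_int(GET.get('start'), 0)
    length = safe_int(GET.get('length'), 25)

    assert (order_by, order_dir, start, length) == \
        ('updated_on_raw', 'asc', 20, 10)
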
@@ -57,7 +57,7 b' class PermissionsController(BaseControll' | |||
|
57 | 57 | super(PermissionsController, self).__before__() |
|
58 | 58 | |
|
59 | 59 | def __load_data(self): |
|
60 | PermissionModel().set_global_permission_choices(c, translator=_) | |
|
60 | PermissionModel().set_global_permission_choices(c, gettext_translator=_) | |
|
61 | 61 | |
|
62 | 62 | @HasPermissionAllDecorator('hg.admin') |
|
63 | 63 | def permission_application(self): |
@@ -92,6 +92,7 b' class PermissionsController(BaseControll' | |||
|
92 | 92 | self.__load_data() |
|
93 | 93 | _form = ApplicationPermissionsForm( |
|
94 | 94 | [x[0] for x in c.register_choices], |
|
95 | [x[0] for x in c.password_reset_choices], | |
|
95 | 96 | [x[0] for x in c.extern_activate_choices])() |
|
96 | 97 | |
|
97 | 98 | try: |
@@ -160,6 +160,7 b' class ReposController(BaseRepoController' | |||
|
160 | 160 | self.__load_defaults() |
|
161 | 161 | form_result = {} |
|
162 | 162 | task_id = None |
|
163 | c.personal_repo_group = c.rhodecode_user.personal_repo_group | |
|
163 | 164 | try: |
|
164 | 165 | # CanWriteToGroup validators checks permissions of this POST |
|
165 | 166 | form_result = RepoForm(repo_groups=c.repo_groups_choices, |
@@ -173,8 +174,6 b' class ReposController(BaseRepoController' | |||
|
173 | 174 | if isinstance(task, BaseAsyncResult): |
|
174 | 175 | task_id = task.task_id |
|
175 | 176 | except formencode.Invalid as errors: |
|
176 | c.personal_repo_group = RepoGroup.get_by_group_name( | |
|
177 | c.rhodecode_user.username) | |
|
178 | 177 | return htmlfill.render( |
|
179 | 178 | render('admin/repos/repo_add.html'), |
|
180 | 179 | defaults=errors.value, |
@@ -215,7 +214,7 b' class ReposController(BaseRepoController' | |||
|
215 | 214 | c.repo_groups = RepoGroup.groups_choices(groups=acl_groups) |
|
216 | 215 | c.repo_groups_choices = map(lambda k: unicode(k[0]), c.repo_groups) |
|
217 | 216 | choices, c.landing_revs = ScmModel().get_repo_landing_revs() |
|
218 | c.personal_repo_group = RepoGroup.get_by_group_name(c.rhodecode_user.username) | |
|
|
217 | c.personal_repo_group = c.rhodecode_user.personal_repo_group | |
|
219 | 218 | c.new_repo = repo_name_slug(new_repo) |
|
220 | 219 | |
|
221 | 220 | ## apply the defaults from defaults page |
@@ -299,9 +298,8 b' class ReposController(BaseRepoController' | |||
|
299 | 298 | repo_model = RepoModel() |
|
300 | 299 | changed_name = repo_name |
|
301 | 300 | |
|
301 | c.personal_repo_group = c.rhodecode_user.personal_repo_group | |
|
302 | 302 | # override the choices with extracted revisions ! |
|
303 | c.personal_repo_group = RepoGroup.get_by_group_name( | |
|
304 | c.rhodecode_user.username) | |
|
305 | 303 | repo = Repository.get_by_repo_name(repo_name) |
|
306 | 304 | old_data = { |
|
307 | 305 | 'repo_name': repo_name, |
@@ -399,8 +397,7 b' class ReposController(BaseRepoController' | |||
|
399 | 397 | |
|
400 | 398 | c.repo_fields = RepositoryField.query()\ |
|
401 | 399 | .filter(RepositoryField.repository == c.repo_info).all() |
|
402 | c.personal_repo_group = RepoGroup.get_by_group_name( | |
|
|
403 | c.rhodecode_user.username) | |
|
400 | c.personal_repo_group = c.rhodecode_user.personal_repo_group | |
|
404 | 401 | c.active = 'settings' |
|
405 | 402 | return htmlfill.render( |
|
406 | 403 | render('admin/repos/repo_edit.html'), |
@@ -34,6 +34,7 b' import packaging.version' | |||
|
34 | 34 | from pylons import request, tmpl_context as c, url, config |
|
35 | 35 | from pylons.controllers.util import redirect |
|
36 | 36 | from pylons.i18n.translation import _, lazy_ugettext |
|
37 | from pyramid.threadlocal import get_current_registry | |
|
37 | 38 | from webob.exc import HTTPBadRequest |
|
38 | 39 | |
|
39 | 40 | import rhodecode |
@@ -54,6 +55,7 b' from rhodecode.model.db import RhodeCode' | |||
|
54 | 55 | from rhodecode.model.forms import ApplicationSettingsForm, \ |
|
55 | 56 | ApplicationUiSettingsForm, ApplicationVisualisationForm, \ |
|
56 | 57 | LabsSettingsForm, IssueTrackerPatternsForm |
|
58 | from rhodecode.model.repo_group import RepoGroupModel | |
|
57 | 59 | |
|
58 | 60 | from rhodecode.model.scm import ScmModel |
|
59 | 61 | from rhodecode.model.notification import EmailNotificationModel |
@@ -63,6 +65,7 b' from rhodecode.model.settings import (' | |||
|
63 | 65 | SettingsModel) |
|
64 | 66 | |
|
65 | 67 | from rhodecode.model.supervisor import SupervisorModel, SUPERVISOR_MASTER |
|
68 | from rhodecode.svn_support.config_keys import generate_config | |
|
66 | 69 | |
|
67 | 70 | |
|
68 | 71 | log = logging.getLogger(__name__) |
@@ -134,6 +137,10 b' class SettingsController(BaseController)' | |||
|
134 | 137 | c.svn_branch_patterns = model.get_global_svn_branch_patterns() |
|
135 | 138 | c.svn_tag_patterns = model.get_global_svn_tag_patterns() |
|
136 | 139 | |
|
140 | # TODO: Replace with request.registry after migrating to pyramid. | |
|
141 | pyramid_settings = get_current_registry().settings | |
|
142 | c.svn_proxy_generate_config = pyramid_settings[generate_config] | |
|
143 | ||
|
137 | 144 | application_form = ApplicationUiSettingsForm()() |
|
138 | 145 | |
|
139 | 146 | try: |
@@ -186,6 +193,10 b' class SettingsController(BaseController)' | |||
|
186 | 193 | c.svn_branch_patterns = model.get_global_svn_branch_patterns() |
|
187 | 194 | c.svn_tag_patterns = model.get_global_svn_tag_patterns() |
|
188 | 195 | |
|
196 | # TODO: Replace with request.registry after migrating to pyramid. | |
|
197 | pyramid_settings = get_current_registry().settings | |
|
198 | c.svn_proxy_generate_config = pyramid_settings[generate_config] | |
|
199 | ||
|
189 | 200 | return htmlfill.render( |
|
190 | 201 | render('admin/settings/settings.html'), |
|
191 | 202 | defaults=self._form_defaults(), |
@@ -235,6 +246,8 b' class SettingsController(BaseController)' | |||
|
235 | 246 | """POST /admin/settings/global: All items in the collection""" |
|
236 | 247 | # url('admin_settings_global') |
|
237 | 248 | c.active = 'global' |
|
249 | c.personal_repo_group_default_pattern = RepoGroupModel()\ | |
|
250 | .get_personal_group_name_pattern() | |
|
238 | 251 | application_form = ApplicationSettingsForm()() |
|
239 | 252 | try: |
|
240 | 253 | form_result = application_form.to_python(dict(request.POST)) |
@@ -249,16 +262,18 b' class SettingsController(BaseController)' | |||
|
249 | 262 | |
|
250 | 263 | try: |
|
251 | 264 | settings = [ |
|
252 | ('title', 'rhodecode_title'), | |
|
253 | ('realm', 'rhodecode_realm'), | |
|
254 | ('pre_code', 'rhodecode_pre_code'), | |
|
255 | ('post_code', 'rhodecode_post_code'), | |
|
256 | ('captcha_public_key', 'rhodecode_captcha_public_key'), | |
|
257 | ('captcha_private_key', 'rhodecode_captcha_private_key'), | |
|
265 | ('title', 'rhodecode_title', 'unicode'), | |
|
266 | ('realm', 'rhodecode_realm', 'unicode'), | |
|
267 | ('pre_code', 'rhodecode_pre_code', 'unicode'), | |
|
268 | ('post_code', 'rhodecode_post_code', 'unicode'), | |
|
269 | ('captcha_public_key', 'rhodecode_captcha_public_key', 'unicode'), | |
|
270 | ('captcha_private_key', 'rhodecode_captcha_private_key', 'unicode'), | |
|
271 | ('create_personal_repo_group', 'rhodecode_create_personal_repo_group', 'bool'), | |
|
272 | ('personal_repo_group_pattern', 'rhodecode_personal_repo_group_pattern', 'unicode'), | |
|
258 | 273 | ] |
|
259 | for setting, form_key in settings: | |
|
274 | for setting, form_key, type_ in settings: | |
|
260 | 275 | sett = SettingsModel().create_or_update_setting( |
|
261 | setting, form_result[form_key]) | |
|
276 | setting, form_result[form_key], type_) | |
|
262 | 277 | Session().add(sett) |
|
263 | 278 | |
|
264 | 279 | Session().commit() |
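
The settings tuples now carry a third element so that ``create_or_update_setting`` can persist each value with an explicit type instead of plain text. The unpacking pattern in isolation (stub data; the model call is replaced by a list append):

.. code-block:: python

    settings = [
        ('title', 'rhodecode_title', 'unicode'),
        ('create_personal_repo_group',
         'rhodecode_create_personal_repo_group', 'bool'),
    ]
    form_result = {
        'rhodecode_title': u'My Server',
        'rhodecode_create_personal_repo_group': True,
    }

    persisted = []
    for setting, form_key, type_ in settings:
        persisted.append((setting, form_result[form_key], type_))

    assert persisted[1] == ('create_personal_repo_group', True, 'bool')
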
@@ -277,6 +292,8 b' class SettingsController(BaseController)' | |||
|
277 | 292 | """GET /admin/settings/global: All items in the collection""" |
|
278 | 293 | # url('admin_settings_global') |
|
279 | 294 | c.active = 'global' |
|
295 | c.personal_repo_group_default_pattern = RepoGroupModel()\ | |
|
296 | .get_personal_group_name_pattern() | |
|
280 | 297 | |
|
281 | 298 | return htmlfill.render( |
|
282 | 299 | render('admin/settings/settings.html'), |
@@ -397,10 +414,11 b' class SettingsController(BaseController)' | |||
|
397 | 414 | settings_model = IssueTrackerSettingsModel() |
|
398 | 415 | |
|
399 | 416 | form = IssueTrackerPatternsForm()().to_python(request.POST) |
|
400 | for uid in form['delete_patterns']: | |
|
417 | if form: | |
|
418 | for uid in form.get('delete_patterns', []): | |
|
401 | 419 | settings_model.delete_entries(uid) |
|
402 | 420 | |
|
403 | for pattern in form['patterns']: | |
|
|
421 | for pattern in form.get('patterns', []): | |
|
404 | 422 | for setting, value, type_ in pattern: |
|
405 | 423 | sett = settings_model.create_or_update_setting( |
|
406 | 424 | setting, value, type_) |
@@ -530,65 +548,93 b' class SettingsController(BaseController)' | |||
|
530 | 548 | """GET /admin/settings/system: All items in the collection""" |
|
531 | 549 | # url('admin_settings_system') |
|
532 | 550 | snapshot = str2bool(request.GET.get('snapshot')) |
|
533 | c.active = 'system' | |
|
551 | defaults = self._form_defaults() | |
|
534 | 552 | |
|
535 | defaults = self._form_defaults() | |
|
536 | c.rhodecode_ini = rhodecode.CONFIG | |
|
553 | c.active = 'system' | |
|
537 | 554 | c.rhodecode_update_url = defaults.get('rhodecode_update_url') |
|
538 | 555 | server_info = ScmModel().get_server_info(request.environ) |
|
556 | ||
|
539 | 557 | for key, val in server_info.iteritems(): |
|
540 | 558 | setattr(c, key, val) |
|
541 | 559 | |
|
542 | if c.disk['percent'] > 90: | |
|
543 | h.flash(h.literal(_( | |
|
544 | 'Critical: your disk space is very low <b>%s%%</b> used' % | |
|
545 | c.disk['percent'])), 'error') | |
|
546 | elif c.disk['percent'] > 70: | |
|
547 | h.flash(h.literal(_( | |
|
548 | 'Warning: your disk space is running low <b>%s%%</b> used' % | |
|
549 | c.disk['percent'])), 'warning') | |
|
560 | def val(name, subkey='human_value'): | |
|
561 | return server_info[name][subkey] | |
|
562 | ||
|
563 | def state(name): | |
|
564 | return server_info[name]['state'] | |
|
565 | ||
|
566 | def val2(name): | |
|
567 | val = server_info[name]['human_value'] | |
|
568 | state = server_info[name]['state'] | |
|
569 | return val, state | |
|
550 | 570 | |
|
551 | try: | |
|
552 | c.uptime_age = h._age( | |
|
553 | h.time_to_datetime(c.boot_time), False, show_suffix=False) | |
|
554 | except TypeError: | |
|
555 | c.uptime_age = c.boot_time | |
|
571 | c.data_items = [ | |
|
572 | # update info | |
|
573 | (_('Update info'), h.literal( | |
|
574 | '<span class="link" id="check_for_update" >%s.</span>' % ( | |
|
575 | _('Check for updates')) + | |
|
576 | '<br/> <span >%s.</span>' % (_('Note: please make sure this server can access `%s` for the update link to work') % c.rhodecode_update_url) | |
|
577 | ), ''), | |
|
578 | ||
|
579 | # RhodeCode specific | |
|
580 | (_('RhodeCode Version'), val('rhodecode_app')['text'], state('rhodecode_app')), | |
|
581 | (_('RhodeCode Server IP'), val('server')['server_ip'], state('server')), | |
|
582 | (_('RhodeCode Server ID'), val('server')['server_id'], state('server')), | |
|
583 | (_('RhodeCode Configuration'), val('rhodecode_config')['path'], state('rhodecode_config')), | |
|
584 | ('', '', ''), # spacer | |
|
585 | ||
|
586 | # Database | |
|
587 | (_('Database'), val('database')['url'], state('database')), | |
|
588 | (_('Database version'), val('database')['version'], state('database')), | |
|
589 | ('', '', ''), # spacer | |
|
556 | 590 | |
|
557 | try: | |
|
558 | c.system_memory = '%s/%s, %s%% (%s%%) used%s' % ( | |
|
559 | h.format_byte_size_binary(c.memory['used']), | |
|
560 | h.format_byte_size_binary(c.memory['total']), | |
|
561 | c.memory['percent2'], | |
|
562 | c.memory['percent'], | |
|
563 | ' %s' % c.memory['error'] if 'error' in c.memory else '') | |
|
564 | except TypeError: | |
|
565 | c.system_memory = 'NOT AVAILABLE' | |
|
591 | # Platform/Python | |
|
592 | (_('Platform'), val('platform')['name'], state('platform')), | |
|
593 | (_('Platform UUID'), val('platform')['uuid'], state('platform')), | |
|
594 | (_('Python version'), val('python')['version'], state('python')), | |
|
595 | (_('Python path'), val('python')['executable'], state('python')), | |
|
596 | ('', '', ''), # spacer | |
|
597 | ||
|
598 | # Systems stats | |
|
599 | (_('CPU'), val('cpu'), state('cpu')), | |
|
600 | (_('Load'), val('load')['text'], state('load')), | |
|
601 | (_('Memory'), val('memory')['text'], state('memory')), | |
|
602 | (_('Uptime'), val('uptime')['text'], state('uptime')), | |
|
603 | ('', '', ''), # spacer | |
|
604 | ||
|
605 | # Repo storage | |
|
606 | (_('Storage location'), val('storage')['path'], state('storage')), | |
|
607 | (_('Storage info'), val('storage')['text'], state('storage')), | |
|
608 | (_('Storage inodes'), val('storage_inodes')['text'], state('storage_inodes')), | |
|
566 | 609 | |
|
567 | rhodecode_ini_safe = rhodecode.CONFIG.copy() | |
|
568 | blacklist = [ | |
|
569 | 'rhodecode_license_key', | |
|
570 | 'routes.map', | |
|
571 | 'pylons.h', | |
|
572 | 'pylons.app_globals', | |
|
573 | 'pylons.environ_config', | |
|
574 | 'sqlalchemy.db1.url', | |
|
575 | ('app_conf', 'sqlalchemy.db1.url') | |
|
610 | (_('Gist storage location'), val('storage_gist')['path'], state('storage_gist')), | |
|
611 | (_('Gist storage info'), val('storage_gist')['text'], state('storage_gist')), | |
|
612 | ||
|
613 | (_('Archive cache storage location'), val('storage_archive')['path'], state('storage_archive')), | |
|
614 | (_('Archive cache info'), val('storage_archive')['text'], state('storage_archive')), | |
|
615 | ||
|
616 | (_('Temp storage location'), val('storage_temp')['path'], state('storage_temp')), | |
|
617 | (_('Temp storage info'), val('storage_temp')['text'], state('storage_temp')), | |
|
618 | ||
|
619 | (_('Search info'), val('search')['text'], state('search')), | |
|
620 | (_('Search location'), val('search')['location'], state('search')), | |
|
621 | ('', '', ''), # spacer | |
|
622 | ||
|
623 | # VCS specific | |
|
624 | (_('VCS Backends'), val('vcs_backends'), state('vcs_backends')), | |
|
625 | (_('VCS Server'), val('vcs_server')['text'], state('vcs_server')), | |
|
626 | (_('GIT'), val('git'), state('git')), | |
|
627 | (_('HG'), val('hg'), state('hg')), | |
|
628 | (_('SVN'), val('svn'), state('svn')), | |
|
629 | ||
|
576 | 630 | ] |
|
577 | for k in blacklist: | |
|
578 | if isinstance(k, tuple): | |
|
579 | section, key = k | |
|
580 | if section in rhodecode_ini_safe: | |
|
581 | rhodecode_ini_safe[section].pop(key, None) | |
|
582 | else: | |
|
583 | rhodecode_ini_safe.pop(k, None) | |
|
584 | ||
|
585 | c.rhodecode_ini_safe = rhodecode_ini_safe | |
|
586 | 631 | |
|
587 | 632 | # TODO: marcink, figure out how to allow only selected users to do this |
|
588 | c.allowed_to_snapshot = False | |
|
|
633 | c.allowed_to_snapshot = c.rhodecode_user.admin | |
|
589 | 634 | |
|
590 | 635 | if snapshot: |
|
591 | 636 | if c.allowed_to_snapshot: |
|
637 | c.data_items.pop(0) # remove server info | |
|
592 | 638 | return render('admin/settings/settings_system_snapshot.html') |
|
593 | 639 | else: |
|
594 | 640 | h.flash('You are not allowed to do this', category='warning') |
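
The refactored system page drives everything off ``server_info``; ``val()`` and ``state()`` are thin accessors into that nested dict. A self-contained sketch with a stub entry in the shape those helpers expect:

.. code-block:: python

    server_info = {  # stub of ScmModel().get_server_info(...) output
        'memory': {
            'human_value': {'text': '4GB/16GB, 25% used'},
            'state': 'ok',
        },
    }

    def val(name, subkey='human_value'):
        return server_info[name][subkey]

    def state(name):
        return server_info[name]['state']

    data_item = ('Memory', val('memory')['text'], state('memory'))
    assert data_item == ('Memory', '4GB/16GB, 25% used', 'ok')
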
@@ -25,6 +25,7 b' User Groups crud controller for pylons' | |||
|
25 | 25 | import logging |
|
26 | 26 | import formencode |
|
27 | 27 | |
|
28 | import peppercorn | |
|
28 | 29 | from formencode import htmlfill |
|
29 | 30 | from pylons import request, tmpl_context as c, url, config |
|
30 | 31 | from pylons.controllers.util import redirect |
@@ -40,7 +41,7 b' from rhodecode.lib.utils import jsonify,' | |||
|
40 | 41 | from rhodecode.lib.utils2 import safe_unicode, str2bool, safe_int |
|
41 | 42 | from rhodecode.lib.auth import ( |
|
42 | 43 | LoginRequired, NotAnonymous, HasUserGroupPermissionAnyDecorator, |
|
43 | HasPermissionAnyDecorator) | |
|
44 | HasPermissionAnyDecorator, XHRRequired) | |
|
44 | 45 | from rhodecode.lib.base import BaseController, render |
|
45 | 46 | from rhodecode.model.permission import PermissionModel |
|
46 | 47 | from rhodecode.model.scm import UserGroupList |
@@ -64,18 +65,13 b' class UserGroupsController(BaseControlle' | |||
|
64 | 65 | def __before__(self): |
|
65 | 66 | super(UserGroupsController, self).__before__() |
|
66 | 67 | c.available_permissions = config['available_permissions'] |
|
67 | PermissionModel().set_global_permission_choices(c, translator=_) | |
|
68 | PermissionModel().set_global_permission_choices(c, gettext_translator=_) | |
|
68 | 69 | |
|
69 | 70 | def __load_data(self, user_group_id): |
|
70 | 71 | c.group_members_obj = [x.user for x in c.user_group.members] |
|
71 | 72 | c.group_members_obj.sort(key=lambda u: u.username.lower()) |
|
72 | ||
|
73 | 73 | c.group_members = [(x.user_id, x.username) for x in c.group_members_obj] |
|
74 | 74 | |
|
75 | c.available_members = [(x.user_id, x.username) | |
|
76 | for x in User.query().all()] | |
|
77 | c.available_members.sort(key=lambda u: u[1].lower()) | |
|
78 | ||
|
79 | 75 | def __load_defaults(self, user_group_id): |
|
80 | 76 | """ |
|
81 | 77 | Load defaults settings for edit, and update |
@@ -207,20 +203,21 b' class UserGroupsController(BaseControlle' | |||
|
207 | 203 | c.active = 'settings' |
|
208 | 204 | self.__load_data(user_group_id) |
|
209 | 205 | |
|
210 | available_members = [safe_unicode(x[0]) for x in c.available_members] | |
|
211 | ||
|
212 | 206 | users_group_form = UserGroupForm( |
|
213 | edit=True, old_data=c.user_group.get_dict(), | |
|
214 | available_members=available_members, allow_disabled=True)() | |
|
207 | edit=True, old_data=c.user_group.get_dict(), allow_disabled=True)() | |
|
215 | 208 | |
|
216 | 209 | try: |
|
217 | 210 | form_result = users_group_form.to_python(request.POST) |
|
211 | pstruct = peppercorn.parse(request.POST.items()) | |
|
212 | form_result['users_group_members'] = pstruct['user_group_members'] | |
|
213 | ||
|
218 | 214 | UserGroupModel().update(c.user_group, form_result) |
|
219 | gr = form_result['users_group_name'] | |
|
215 | updated_user_group = form_result['users_group_name'] | |
|
220 | 216 | action_logger(c.rhodecode_user, |
|
221 | 'admin_updated_users_group:%s' % gr, | |
|
217 | 'admin_updated_users_group:%s' % updated_user_group, | |
|
222 | 218 | None, self.ip_addr, self.sa) |
|
223 | h.flash(_('Updated user group %s') % gr, category='success') | |
|
|
219 | h.flash(_('Updated user group %s') % updated_user_group, | |
|
220 | category='success') | |
|
224 | 221 | Session().commit() |
|
225 | 222 | except formencode.Invalid as errors: |
|
226 | 223 | defaults = errors.value |
@@ -462,19 +459,29 b' class UserGroupsController(BaseControlle' | |||
|
462 | 459 | return render('admin/user_groups/user_group_edit.html') |
|
463 | 460 | |
|
464 | 461 | @HasUserGroupPermissionAnyDecorator('usergroup.admin') |
|
465 | def edit_members(self, user_group_id): | |
|
462 | @XHRRequired() | |
|
463 | @jsonify | |
|
464 | def user_group_members(self, user_group_id): | |
|
466 | 465 | user_group_id = safe_int(user_group_id) |
|
467 | c.user_group = UserGroup.get_or_404(user_group_id) | |
|
|
468 | c.active = 'members' | |
|
469 | c.group_members_obj = sorted((x.user for x in c.user_group.members), | |
|
466 | user_group = UserGroup.get_or_404(user_group_id) | |
|
467 | group_members_obj = sorted((x.user for x in user_group.members), | |
|
470 | 468 | key=lambda u: u.username.lower())
|
471 | 469 | |
|
472 | group_members = [(x.user_id, x.username) for x in c.group_members_obj] | |
|
470 | group_members = [ | |
|
471 | { | |
|
472 | 'id': user.user_id, | |
|
473 | 'first_name': user.name, | |
|
474 | 'last_name': user.lastname, | |
|
475 | 'username': user.username, | |
|
476 | 'icon_link': h.gravatar_url(user.email, 30), | |
|
477 | 'value_display': h.person(user.email), | |
|
478 | 'value': user.username, | |
|
479 | 'value_type': 'user', | |
|
480 | 'active': user.active, | |
|
481 | } | |
|
482 | for user in group_members_obj | |
|
483 | ] | |
|
473 | 484 | |
|
474 | if request.is_xhr: | |
|
475 | return jsonify(lambda *a, **k: { | |
|
485 | return { | |
|
476 | 486 | 'members': group_members
|
477 | }) | |
|
|
478 | ||
|
479 | c.group_members = group_members | |
|
480 | return render('admin/user_groups/user_group_edit.html') | |
|
487 | } |
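
The rewritten endpoint returns JSON instead of rendering a template. One member entry from the list built above would look roughly like this (all values invented for illustration):

.. code-block:: python

    member = {
        'id': 2,
        'first_name': 'Jane',
        'last_name': 'Doe',
        'username': 'jane',
        'icon_link': '<gravatar url, size 30>',
        'value_display': 'Jane Doe (jane@example.com)',
        'value': 'jane',
        'value_type': 'user',
        'active': True,
    }
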
@@ -45,12 +45,13 b' from rhodecode.model.db import (' | |||
|
45 | 45 | PullRequestReviewers, User, UserEmailMap, UserIpMap, RepoGroup) |
|
46 | 46 | from rhodecode.model.forms import ( |
|
47 | 47 | UserForm, UserPermissionsForm, UserIndividualPermissionsForm) |
|
48 | from rhodecode.model.repo_group import RepoGroupModel | |
|
48 | 49 | from rhodecode.model.user import UserModel |
|
49 | 50 | from rhodecode.model.meta import Session |
|
50 | 51 | from rhodecode.model.permission import PermissionModel |
|
51 | 52 | from rhodecode.lib.utils import action_logger |
|
52 | 53 | from rhodecode.lib.ext_json import json |
|
53 | from rhodecode.lib.utils2 import datetime_to_time, safe_int | |
|
54 | from rhodecode.lib.utils2 import datetime_to_time, safe_int, AttributeDict | |
|
54 | 55 | |
|
55 | 56 | log = logging.getLogger(__name__) |
|
56 | 57 | |
@@ -73,7 +74,7 b' class UsersController(BaseController):' | |||
|
73 | 74 | ('ru', 'Russian (ru)'), |
|
74 | 75 | ('zh', 'Chinese (zh)'), |
|
75 | 76 | ] |
|
76 | PermissionModel().set_global_permission_choices(c, translator=_) | |
|
77 | PermissionModel().set_global_permission_choices(c, gettext_translator=_) | |
|
77 | 78 | |
|
78 | 79 | @HasPermissionAllDecorator('hg.admin') |
|
79 | 80 | def index(self): |
@@ -120,6 +121,16 b' class UsersController(BaseController):' | |||
|
120 | 121 | c.data = json.dumps(users_data) |
|
121 | 122 | return render('admin/users/users.html') |
|
122 | 123 | |
|
124 | def _get_personal_repo_group_template_vars(self): | |
|
125 | DummyUser = AttributeDict({ | |
|
126 | 'username': '${username}', | |
|
127 | 'user_id': '${user_id}', | |
|
128 | }) | |
|
129 | c.default_create_repo_group = RepoGroupModel() \ | |
|
130 | .get_default_create_personal_repo_group() | |
|
131 | c.personal_repo_group_name = RepoGroupModel() \ | |
|
132 | .get_personal_group_name(DummyUser) | |
|
133 | ||
|
123 | 134 | @HasPermissionAllDecorator('hg.admin') |
|
124 | 135 | @auth.CSRFRequired() |
|
125 | 136 | def create(self): |
@@ -143,6 +154,7 b' class UsersController(BaseController):' | |||
|
143 | 154 | % {'user_link': user_link}), category='success') |
|
144 | 155 | Session().commit() |
|
145 | 156 | except formencode.Invalid as errors: |
|
157 | self._get_personal_repo_group_template_vars() | |
|
146 | 158 | return htmlfill.render( |
|
147 | 159 | render('admin/users/user_add.html'), |
|
148 | 160 | defaults=errors.value, |
@@ -163,6 +175,7 b' class UsersController(BaseController):' | |||
|
163 | 175 | """GET /users/new: Form to create a new item""" |
|
164 | 176 | # url('new_user') |
|
165 | 177 | c.default_extern_type = auth_rhodecode.RhodeCodeAuthPlugin.name |
|
178 | self._get_personal_repo_group_template_vars() | |
|
166 | 179 | return render('admin/users/user_add.html') |
|
167 | 180 | |
|
168 | 181 | @HasPermissionAllDecorator('hg.admin') |
@@ -339,22 +352,41 b' class UsersController(BaseController):' | |||
|
339 | 352 | |
|
340 | 353 | user_id = safe_int(user_id) |
|
341 | 354 | c.user = User.get_or_404(user_id) |
|
355 | personal_repo_group = RepoGroup.get_user_personal_repo_group( | |
|
356 | c.user.user_id) | |
|
357 | if personal_repo_group: | |
|
358 | return redirect(url('edit_user_advanced', user_id=user_id)) | |
|
342 | 359 | |
|
360 | personal_repo_group_name = RepoGroupModel().get_personal_group_name( | |
|
361 | c.user) | |
|
362 | named_personal_group = RepoGroup.get_by_group_name( | |
|
363 | personal_repo_group_name) | |
|
343 | 364 | try: |
|
344 | desc = RepoGroupModel.PERSONAL_GROUP_DESC % { | |
|
345 | 'username': c.user.username} | |
|
346 | if not RepoGroup.get_by_group_name(c.user.username): | |
|
347 | RepoGroupModel().create(group_name=c.user.username, | |
|
348 | group_description=desc, | |
|
349 | owner=c.user.username) | |
|
350 | 365 | |
|
351 | msg = _('Created repository group `%s`' % (c.user.username,)) | |
|
366 | if named_personal_group and named_personal_group.user_id == c.user.user_id: | |
|
367 | # migrate the same named group, and mark it as personal | |
|
368 | named_personal_group.personal = True | |
|
369 | Session().add(named_personal_group) | |
|
370 | Session().commit() | |
|
371 | msg = _('Linked repository group `%s` as personal' % ( | |
|
372 | personal_repo_group_name,)) | |
|
352 | 373 | h.flash(msg, category='success') |
|
374 | elif not named_personal_group: | |
|
375 | RepoGroupModel().create_personal_repo_group(c.user) | |
|
376 | ||
|
377 | msg = _('Created repository group `%s`' % ( | |
|
378 | personal_repo_group_name,)) | |
|
379 | h.flash(msg, category='success') | |
|
380 | else: | |
|
381 | msg = _('Repository group `%s` is already taken' % ( | |
|
382 | personal_repo_group_name,)) | |
|
383 | h.flash(msg, category='warning') | |
|
353 | 384 | except Exception: |
|
354 | 385 | log.exception("Exception during repository group creation") |
|
355 | 386 | msg = _( |
|
356 | 387 | 'An error occurred during repository group creation for user') |
|
357 | 388 | h.flash(msg, category='error') |
|
389 | Session().rollback() | |
|
358 | 390 | |
|
359 | 391 | return redirect(url('edit_user_advanced', user_id=user_id)) |
|
360 | 392 | |
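
The rewritten handler covers three cases: an existing group with the user's name gets linked as personal, a missing group gets created, and a name owned by someone else produces a warning. The decision condensed into a standalone sketch (model calls reduced to labels):

.. code-block:: python

    def resolve_personal_group(named_group, user_id):
        """named_group is None or a dict carrying the owning user_id."""
        if named_group and named_group['user_id'] == user_id:
            return 'link'    # mark the same-named group as personal
        elif not named_group:
            return 'create'  # create a fresh personal repo group
        return 'taken'       # owned by another user -> flash a warning

    assert resolve_personal_group({'user_id': 7}, 7) == 'link'
    assert resolve_personal_group(None, 7) == 'create'
    assert resolve_personal_group({'user_id': 1}, 7) == 'taken'
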
@@ -397,7 +429,9 b' class UsersController(BaseController):' | |||
|
397 | 429 | |
|
398 | 430 | c.active = 'advanced' |
|
399 | 431 | c.perm_user = AuthUser(user_id=user_id, ip_addr=self.ip_addr) |
|
400 | c.personal_repo_group = RepoGroup.get_by_group_name(user.username) | |
|
432 | c.personal_repo_group = c.perm_user.personal_repo_group | |
|
433 | c.personal_repo_group_name = RepoGroupModel()\ | |
|
434 | .get_personal_group_name(user) | |
|
401 | 435 | c.first_admin = User.get_first_super_admin() |
|
402 | 436 | defaults = user.get_dict() |
|
403 | 437 |
@@ -32,7 +32,7 b' from pylons.i18n.translation import _' | |||
|
32 | 32 | from pylons.controllers.util import redirect |
|
33 | 33 | |
|
34 | 34 | from rhodecode.lib import auth |
|
35 | from rhodecode.lib import diffs | |
|
35 | from rhodecode.lib import diffs, codeblocks | |
|
36 | 36 | from rhodecode.lib.auth import ( |
|
37 | 37 | LoginRequired, HasRepoPermissionAnyDecorator, NotAnonymous) |
|
38 | 38 | from rhodecode.lib.base import BaseRepoController, render |
@@ -43,7 +43,7 b' from rhodecode.lib.utils import action_l' | |||
|
43 | 43 | from rhodecode.lib.utils2 import safe_unicode |
|
44 | 44 | from rhodecode.lib.vcs.backends.base import EmptyCommit |
|
45 | 45 | from rhodecode.lib.vcs.exceptions import ( |
|
46 | RepositoryError, CommitDoesNotExistError) | |
|
46 | RepositoryError, CommitDoesNotExistError, NodeDoesNotExistError) | |
|
47 | 47 | from rhodecode.model.db import ChangesetComment, ChangesetStatus |
|
48 | 48 | from rhodecode.model.changeset_status import ChangesetStatusModel |
|
49 | 49 | from rhodecode.model.comment import ChangesetCommentsModel |
@@ -156,15 +156,24 b' class ChangesetController(BaseRepoContro' | |||
|
156 | 156 | c.ignorews_url = _ignorews_url |
|
157 | 157 | c.context_url = _context_url |
|
158 | 158 | c.fulldiff = fulldiff = request.GET.get('fulldiff') |
|
159 | ||
|
160 | # fetch global flags of ignore ws or context lines | |
|
161 | context_lcl = get_line_ctx('', request.GET) | |
|
162 | ign_whitespace_lcl = get_ignore_ws('', request.GET) | |
|
163 | ||
|
164 | # diff_limit will cut off the whole diff if the limit is applied | |
|
165 | # otherwise it will just hide the big files from the front-end | |
|
166 | diff_limit = self.cut_off_limit_diff | |
|
167 | file_limit = self.cut_off_limit_file | |
|
168 | ||
|
159 | 169 | # get ranges of commit ids if preset |
|
160 | 170 | commit_range = commit_id_range.split('...')[:2] |
|
161 | enable_comments = True | |
|
171 | ||
|
162 | 172 | try: |
|
163 | 173 | pre_load = ['affected_files', 'author', 'branch', 'date', |
|
164 | 174 | 'message', 'parents'] |
|
165 | 175 | |
|
166 | 176 | if len(commit_range) == 2: |
|
167 | enable_comments = False | |
|
168 | 177 | commits = c.rhodecode_repo.get_commits( |
|
169 | 178 | start_id=commit_range[0], end_id=commit_range[1], |
|
170 | 179 | pre_load=pre_load) |
@@ -190,88 +199,78 b' class ChangesetController(BaseRepoContro' | |||
|
190 | 199 | c.lines_deleted = 0 |
|
191 | 200 | |
|
192 | 201 | c.commit_statuses = ChangesetStatus.STATUSES |
|
193 | c.comments = [] | |
|
194 | c.statuses = [] | |
|
195 | 202 | c.inline_comments = [] |
|
196 | 203 | c.inline_cnt = 0 |
|
197 | 204 | c.files = [] |
|
198 | 205 | |
|
199 | # Iterate over ranges (default commit view is always one commit) | |
|
200 | for commit in c.commit_ranges: | |
|
201 | if method == 'show': | |
|
202 | c.statuses.extend([ChangesetStatusModel().get_status( | |
|
203 | c.rhodecode_db_repo.repo_id, commit.raw_id)]) | |
|
204 | ||
|
205 | c.comments.extend(ChangesetCommentsModel().get_comments( | |
|
206 | c.statuses = [] | |
|
207 | c.comments = [] | |
|
208 | if len(c.commit_ranges) == 1: | |
|
209 | commit = c.commit_ranges[0] | |
|
210 | c.comments = ChangesetCommentsModel().get_comments( | |
|
206 | 211 | c.rhodecode_db_repo.repo_id,
|
207 | revision=commit.raw_id)) | |
|
|
208 | ||
|
212 | revision=commit.raw_id) | |
|
213 | c.statuses.append(ChangesetStatusModel().get_status( | |
|
214 | c.rhodecode_db_repo.repo_id, commit.raw_id)) | |
|
209 | 215 |
|
|
210 | st = ChangesetStatusModel().get_statuses( | |
|
|
216 | statuses = ChangesetStatusModel().get_statuses( | |
|
211 | 217 | c.rhodecode_db_repo.repo_id, commit.raw_id,
|
212 | 218 | with_revisions=True)
|
213 | ||
|
219 | prs = set(st.pull_request for st in statuses | |
|
220 | if st.pull_request is not None) | |
|
214 | 221 |
|
|
215 | 222 |
|
|
216 | ||
|
217 | prs = set(x.pull_request for x in | |
|
218 | filter(lambda x: x.pull_request is not None, st)) | |
|
219 | 223 |
|
|
220 | 224 |
|
|
221 | 225 | |
|
222 | inlines = ChangesetCommentsModel().get_inline_comments( | |
|
223 | c.rhodecode_db_repo.repo_id, revision=commit.raw_id) | |
|
224 | c.inline_comments.extend(inlines.iteritems()) | |
|
225 | ||
|
226 | # Iterate over ranges (default commit view is always one commit) | |
|
227 | for commit in c.commit_ranges: | |
|
226 | 228 | c.changes[commit.raw_id] = [] |
|
227 | 229 | |
|
228 | 230 | commit2 = commit |
|
229 | 231 | commit1 = commit.parents[0] if commit.parents else EmptyCommit() |
|
230 | 232 | |
|
231 | # fetch global flags of ignore ws or context lines | |
|
232 | context_lcl = get_line_ctx('', request.GET) | |
|
233 | ign_whitespace_lcl = get_ignore_ws('', request.GET) | |
|
234 | ||
|
235 | 233 | _diff = c.rhodecode_repo.get_diff( |
|
236 | 234 | commit1, commit2, |
|
237 | 235 | ignore_whitespace=ign_whitespace_lcl, context=context_lcl) |
|
236 | diff_processor = diffs.DiffProcessor( | |
|
237 | _diff, format='newdiff', diff_limit=diff_limit, | |
|
238 | file_limit=file_limit, show_full_diff=fulldiff) | |
|
238 | 239 | |
|
239 | # diff_limit will cut off the whole diff if the limit is applied | |
|
240 | # otherwise it will just hide the big files from the front-end | |
|
241 | diff_limit = self.cut_off_limit_diff | |
|
242 | file_limit = self.cut_off_limit_file | |
|
243 | ||
|
244 | diff_processor = diffs.DiffProcessor( | |
|
245 | _diff, format='gitdiff', diff_limit=diff_limit, | |
|
246 | file_limit=file_limit, show_full_diff=fulldiff) | |
|
247 | 240 | commit_changes = OrderedDict() |
|
248 | 241 | if method == 'show': |
|
249 | 242 | _parsed = diff_processor.prepare() |
|
250 | 243 | c.limited_diff = isinstance(_parsed, diffs.LimitedDiffContainer) |
|
251 | for f in _parsed: | |
|
252 | c.files.append(f) | |
|
253 | st = f['stats'] | |
|
254 | c.lines_added += st['added'] | |
|
255 | c.lines_deleted += st['deleted'] | |
|
256 | fid = h.FID(commit.raw_id, f['filename']) | |
|
257 | diff = diff_processor.as_html(enable_comments=enable_comments, | |
|
258 | parsed_lines=[f]) | |
|
259 | commit_changes[fid] = [ | |
|
260 | commit1.raw_id, commit2.raw_id, | |
|
261 | f['operation'], f['filename'], diff, st, f] | |
|
244 | ||
|
245 | _parsed = diff_processor.prepare() | |
|
246 | ||
|
247 | def _node_getter(commit): | |
|
248 | def get_node(fname): | |
|
249 | try: | |
|
250 | return commit.get_node(fname) | |
|
251 | except NodeDoesNotExistError: | |
|
252 | return None | |
|
253 | return get_node | |
|
254 | ||
|
255 | inline_comments = ChangesetCommentsModel().get_inline_comments( | |
|
256 | c.rhodecode_db_repo.repo_id, revision=commit.raw_id) | |
|
257 | c.inline_cnt += len(inline_comments) | |
|
258 | ||
|
259 | diffset = codeblocks.DiffSet( | |
|
260 | repo_name=c.repo_name, | |
|
261 | source_node_getter=_node_getter(commit1), | |
|
262 | target_node_getter=_node_getter(commit2), | |
|
263 | comments=inline_comments | |
|
264 | ).render_patchset(_parsed, commit1.raw_id, commit2.raw_id) | |
|
265 | c.changes[commit.raw_id] = diffset | |
|
262 | 266 | else: |
|
263 | 267 | # downloads/raw we only need RAW diff nothing else |
|
264 | 268 | diff = diff_processor.as_raw() |
|
265 | commit_changes = [None, None, None, None, diff, None, None] | |
|
|
266 | c.changes[commit.raw_id] = commit_changes | |
|
269 | c.changes[commit.raw_id] = [None, None, None, None, diff, None, None] | |
|
267 | 270 | |
|
268 | 271 | # sort comments by how they were generated |
|
269 | 272 | c.comments = sorted(c.comments, key=lambda x: x.comment_id) |
|
270 | 273 | |
|
271 | # count inline comments | |
|
272 | for __, lines in c.inline_comments: | |
|
273 | for comments in lines.values(): | |
|
274 | c.inline_cnt += len(comments) | |
|
275 | 274 | |
|
276 | 275 | if len(c.commit_ranges) == 1: |
|
277 | 276 | c.commit = c.commit_ranges[0] |
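
The ``_node_getter`` closure introduced here (and repeated in the compare and pull-request controllers below) turns a commit into a safe file lookup. The pattern in isolation, with a stub commit object standing in for a real VCS commit:

.. code-block:: python

    class NodeDoesNotExistError(Exception):
        """Stub for rhodecode.lib.vcs.exceptions.NodeDoesNotExistError."""

    def _node_getter(commit):
        def get_node(fname):
            try:
                return commit.get_node(fname)
            except NodeDoesNotExistError:
                return None  # DiffSet treats a missing side as added/deleted
        return get_node

    class StubCommit(object):
        def get_node(self, fname):
            raise NodeDoesNotExistError(fname)

    assert _node_getter(StubCommit())('README.rst') is None
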
@@ -31,13 +31,14 b' from pylons.i18n.translation import _' | |||
|
31 | 31 | |
|
32 | 32 | from rhodecode.controllers.utils import parse_path_ref, get_commit_from_ref_name |
|
33 | 33 | from rhodecode.lib import helpers as h |
|
34 | from rhodecode.lib import diffs | |
|
34 | from rhodecode.lib import diffs, codeblocks | |
|
35 | 35 | from rhodecode.lib.auth import LoginRequired, HasRepoPermissionAnyDecorator |
|
36 | 36 | from rhodecode.lib.base import BaseRepoController, render |
|
37 | 37 | from rhodecode.lib.utils import safe_str |
|
38 | 38 | from rhodecode.lib.utils2 import safe_unicode, str2bool |
|
39 | 39 | from rhodecode.lib.vcs.exceptions import ( |
|
40 | EmptyRepositoryError, RepositoryError, RepositoryRequirementError) | |
|
|
40 | EmptyRepositoryError, RepositoryError, RepositoryRequirementError, | |
|
41 | NodeDoesNotExistError) | |
|
41 | 42 | from rhodecode.model.db import Repository, ChangesetStatus |
|
42 | 43 | |
|
43 | 44 | log = logging.getLogger(__name__) |
@@ -78,7 +79,7 b' class CompareController(BaseRepoControll' | |||
|
78 | 79 | def index(self, repo_name): |
|
79 | 80 | c.compare_home = True |
|
80 | 81 | c.commit_ranges = [] |
|
81 | c.files = [] | |
|
|
82 | c.diffset = None | |
|
82 | 83 | c.limited_diff = False |
|
83 | 84 | source_repo = c.rhodecode_db_repo.repo_name |
|
84 | 85 | target_repo = request.GET.get('target_repo', source_repo) |
@@ -239,28 +240,24 b' class CompareController(BaseRepoControll' | |||
|
239 | 240 | commit1=source_commit, commit2=target_commit, |
|
240 | 241 | path1=source_path, path=target_path) |
|
241 | 242 | diff_processor = diffs.DiffProcessor( |
|
242 | txtdiff, format='gitdiff', diff_limit=diff_limit, | |
|
|
243 | txtdiff, format='newdiff', diff_limit=diff_limit, | |
|
243 | 244 | file_limit=file_limit, show_full_diff=c.fulldiff) |
|
244 | 245 | _parsed = diff_processor.prepare() |
|
245 | 246 | |
|
246 | c.limited_diff = False | |
|
247 | if isinstance(_parsed, diffs.LimitedDiffContainer): | |
|
248 | c.limited_diff = True | |
|
247 | def _node_getter(commit): | |
|
248 | """ Returns a function that returns a node for a commit or None """ | |
|
249 | def get_node(fname): | |
|
250 | try: | |
|
251 | return commit.get_node(fname) | |
|
252 | except NodeDoesNotExistError: | |
|
253 | return None | |
|
254 | return get_node | |
|
249 | 255 | |
|
250 | c.files = [] | |
|
251 | c.changes = {} | |
|
252 | c.lines_added = 0 | |
|
253 | c.lines_deleted = 0 | |
|
254 | for f in _parsed: | |
|
255 | st = f['stats'] | |
|
256 | if not st['binary']: | |
|
257 | c.lines_added += st['added'] | |
|
258 | c.lines_deleted += st['deleted'] | |
|
259 | fid = h.FID('', f['filename']) | |
|
260 | c.files.append([fid, f['operation'], f['filename'], f['stats'], f]) | |
|
261 | htmldiff = diff_processor.as_html( | |
|
262 | enable_comments=False, parsed_lines=[f]) | |
|
263 | c.changes[fid] = [f['operation'], f['filename'], htmldiff, f] | |
|
256 | c.diffset = codeblocks.DiffSet( | |
|
257 | repo_name=source_repo.repo_name, | |
|
258 | source_node_getter=_node_getter(source_commit), | |
|
259 | target_node_getter=_node_getter(target_commit), | |
|
260 | ).render_patchset(_parsed, source_ref, target_ref) | |
|
264 | 261 | |
|
265 | 262 | c.preview_mode = merge |
|
266 | 263 |
@@ -36,6 +36,8 b' from webob.exc import HTTPNotFound, HTTP' | |||
|
36 | 36 | from rhodecode.controllers.utils import parse_path_ref |
|
37 | 37 | from rhodecode.lib import diffs, helpers as h, caches |
|
38 | 38 | from rhodecode.lib.compat import OrderedDict |
|
39 | from rhodecode.lib.codeblocks import ( | |
|
40 | filenode_as_lines_tokens, filenode_as_annotated_lines_tokens) | |
|
39 | 41 | from rhodecode.lib.utils import jsonify, action_logger |
|
40 | 42 | from rhodecode.lib.utils2 import ( |
|
41 | 43 | convert_line_endings, detect_mode, safe_str, str2bool) |
@@ -221,9 +223,19 b' class FilesController(BaseRepoController' | |||
|
221 | 223 | c.file_author = True |
|
222 | 224 | c.file_tree = '' |
|
223 | 225 | if c.file.is_file(): |
|
226 | c.file_source_page = 'true' | |
|
227 | c.file_last_commit = c.file.last_commit | |
|
228 | if c.file.size < self.cut_off_limit_file: | |
|
229 | if c.annotate: # annotation has precedence over renderer | |
|
230 | c.annotated_lines = filenode_as_annotated_lines_tokens( | |
|
231 | c.file | |
|
232 | ) | |
|
233 | else: | |
|
224 | 234 | c.renderer = ( |
|
225 | c.renderer and h.renderer_from_filename(c.file.path)) | |
|
|
226 | c.file_last_commit = c.file.last_commit | |
|
235 | c.renderer and h.renderer_from_filename(c.file.path) | |
|
236 | ) | |
|
237 | if not c.renderer: | |
|
238 | c.lines = filenode_as_lines_tokens(c.file) | |
|
227 | 239 | |
|
228 | 240 | c.on_branch_head = self._is_valid_head( |
|
229 | 241 | commit_id, c.rhodecode_repo) |
@@ -233,6 +245,7 b' class FilesController(BaseRepoController' | |||
|
233 | 245 | c.authors = [(h.email(author), |
|
234 | 246 | h.person(author, 'username_or_name_or_email'))] |
|
235 | 247 | else: |
|
248 | c.file_source_page = 'false' | |
|
236 | 249 | c.authors = [] |
|
237 | 250 | c.file_tree = self._get_tree_at_commit( |
|
238 | 251 | repo_name, c.commit.raw_id, f_path) |
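
The file-view branch added above encodes a small precedence rule: annotation beats a markup renderer, and oversized files get neither. As a standalone sketch:

.. code-block:: python

    def choose_file_view(size, cut_off_limit, annotate, has_renderer):
        if size >= cut_off_limit:
            return 'too-big'          # neither tokens nor annotation
        if annotate:                  # annotation has precedence over renderer
            return 'annotated-lines'
        if has_renderer:
            return 'rendered-markup'  # e.g. a README rendered as HTML
        return 'line-tokens'

    assert choose_file_view(10, 1024, True, True) == 'annotated-lines'
    assert choose_file_view(10, 1024, False, True) == 'rendered-markup'
    assert choose_file_view(4096, 1024, False, False) == 'too-big'
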
@@ -60,8 +60,7 b' class ForksController(BaseRepoController' | |||
|
60 | 60 | c.repo_groups_choices = map(lambda k: unicode(k[0]), c.repo_groups) |
|
61 | 61 | choices, c.landing_revs = ScmModel().get_repo_landing_revs() |
|
62 | 62 | c.landing_revs_choices = choices |
|
63 |
c.personal_repo_group = |
|
|
64 | c.rhodecode_user.username) | |
|
63 | c.personal_repo_group = c.rhodecode_user.personal_repo_group | |
|
65 | 64 | |
|
66 | 65 | def __load_data(self, repo_name=None): |
|
67 | 66 | """ |
@@ -286,4 +286,3 b' class HomeController(BaseController):' | |||
|
286 | 286 | _user_groups = _user_groups |
|
287 | 287 | |
|
288 | 288 | return {'suggestions': _user_groups} |
|
289 |
@@ -22,6 +22,7 b'' | |||
|
22 | 22 | pull requests controller for rhodecode for initializing pull requests |
|
23 | 23 | """ |
|
24 | 24 | |
|
25 | import peppercorn | |
|
25 | 26 | import formencode |
|
26 | 27 | import logging |
|
27 | 28 | |
@@ -29,22 +30,26 b' from webob.exc import HTTPNotFound, HTTP' | |||
|
29 | 30 | from pylons import request, tmpl_context as c, url |
|
30 | 31 | from pylons.controllers.util import redirect |
|
31 | 32 | from pylons.i18n.translation import _ |
|
33 | from pyramid.threadlocal import get_current_registry | |
|
32 | 34 | from sqlalchemy.sql import func |
|
33 | 35 | from sqlalchemy.sql.expression import or_ |
|
34 | 36 | |
|
35 | 37 | from rhodecode import events |
|
36 | from rhodecode.lib import auth, diffs, helpers as h | |
|
38 | from rhodecode.lib import auth, diffs, helpers as h, codeblocks | |
|
37 | 39 | from rhodecode.lib.ext_json import json |
|
38 | 40 | from rhodecode.lib.base import ( |
|
39 | 41 | BaseRepoController, render, vcs_operation_context) |
|
40 | 42 | from rhodecode.lib.auth import ( |
|
41 | 43 | LoginRequired, HasRepoPermissionAnyDecorator, NotAnonymous, |
|
42 | 44 | HasAcceptedRepoType, XHRRequired) |
|
45 | from rhodecode.lib.channelstream import channelstream_request | |
|
46 | from rhodecode.lib.compat import OrderedDict | |
|
43 | 47 | from rhodecode.lib.utils import jsonify |
|
44 | 48 | from rhodecode.lib.utils2 import safe_int, safe_str, str2bool, safe_unicode |
|
45 | from rhodecode.lib.vcs.backends.base import EmptyCommit | |
|
49 | from rhodecode.lib.vcs.backends.base import EmptyCommit, UpdateFailureReason | |
|
46 | 50 | from rhodecode.lib.vcs.exceptions import ( |
|
47 | EmptyRepositoryError, CommitDoesNotExistError, RepositoryRequirementError) | |
|
|
51 | EmptyRepositoryError, CommitDoesNotExistError, RepositoryRequirementError, | |
|
52 | NodeDoesNotExistError) | |
|
48 | 53 | from rhodecode.lib.diffs import LimitedDiffContainer |
|
49 | 54 | from rhodecode.model.changeset_status import ChangesetStatusModel |
|
50 | 55 | from rhodecode.model.comment import ChangesetCommentsModel |
@@ -61,7 +66,7 b' class PullrequestsController(BaseRepoCon' | |||
|
61 | 66 | def __before__(self): |
|
62 | 67 | super(PullrequestsController, self).__before__() |
|
63 | 68 | |
|
64 | def _load_compare_data(self, pull_request, enable_comments=True): | |
|
69 | def _load_compare_data(self, pull_request, inline_comments, enable_comments=True): | |
|
65 | 70 | """ |
|
66 | 71 | Load context data needed for generating compare diff |
|
67 | 72 | |
@@ -114,6 +119,7 b' class PullrequestsController(BaseRepoCon' | |||
|
114 | 119 | except RepositoryRequirementError: |
|
115 | 120 | c.missing_requirements = True |
|
116 | 121 | |
|
122 | c.changes = {} | |
|
117 | 123 | c.missing_commits = False |
|
118 | 124 | if (c.missing_requirements or |
|
119 | 125 | isinstance(source_commit, EmptyCommit) or |
@@ -123,11 +129,31 b' class PullrequestsController(BaseRepoCon' | |||
|
123 | 129 | else: |
|
124 | 130 | vcs_diff = PullRequestModel().get_diff(pull_request) |
|
125 | 131 | diff_processor = diffs.DiffProcessor( |
|
126 | vcs_diff, format=' | |

132 | vcs_diff, format='newdiff', diff_limit=diff_limit, | |
|
127 | 133 | file_limit=file_limit, show_full_diff=c.fulldiff) |
|
128 | 134 | _parsed = diff_processor.prepare() |
|
129 | 135 | |
|
130 | c.limited_diff = isinstance(_parsed, LimitedDiffContainer) | |
|
136 | commit_changes = OrderedDict() | |
|
137 | _parsed = diff_processor.prepare() | |
|
138 | c.limited_diff = isinstance(_parsed, diffs.LimitedDiffContainer) | |
|
139 | ||
|
140 | _parsed = diff_processor.prepare() | |
|
141 | ||
|
142 | def _node_getter(commit): | |
|
143 | def get_node(fname): | |
|
144 | try: | |
|
145 | return commit.get_node(fname) | |
|
146 | except NodeDoesNotExistError: | |
|
147 | return None | |
|
148 | return get_node | |
|
149 | ||
|
150 | c.diffset = codeblocks.DiffSet( | |
|
151 | repo_name=c.repo_name, | |
|
152 | source_node_getter=_node_getter(target_commit), | |
|
153 | target_node_getter=_node_getter(source_commit), | |
|
154 | comments=inline_comments | |
|
155 | ).render_patchset(_parsed, target_commit.raw_id, source_commit.raw_id) | |
|
156 | ||
|
131 | 157 | |
|
132 | 158 | c.files = [] |
|
133 | 159 | c.changes = {} |
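
The ``_node_getter`` closure above is what lets ``DiffSet.render_patchset`` handle added and deleted files uniformly: it returns a file node from the given commit, or ``None`` when the file does not exist on that side of the diff. A self-contained sketch of the same pattern (``FakeCommit`` and the exception class are stand-ins for the real vcs objects):

.. code-block:: python

    class NodeDoesNotExistError(Exception):
        pass

    class FakeCommit(object):
        def __init__(self, files):
            self.files = files  # maps path -> content

        def get_node(self, fname):
            try:
                return self.files[fname]
            except KeyError:
                raise NodeDoesNotExistError(fname)

    def _node_getter(commit):
        def get_node(fname):
            try:
                return commit.get_node(fname)
            except NodeDoesNotExistError:
                # file missing on this side of the diff (added/removed file)
                return None
        return get_node

    old = FakeCommit({'setup.py': 'print(1)'})
    new = FakeCommit({'setup.py': 'print(2)', 'README': 'hi'})
    get_old, get_new = _node_getter(old), _node_getter(new)
    assert get_old('README') is None   # file was added in the new commit
    assert get_new('README') == 'hi'
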
@@ -136,17 +162,17 b' class PullrequestsController(BaseRepoCon' | |||
|
136 | 162 | c.included_files = [] |
|
137 | 163 | c.deleted_files = [] |
|
138 | 164 | |
|
139 | for f in _parsed: | |
|
140 | st = f['stats'] | |
|
141 | c.lines_added += st['added'] | |
|
142 | c.lines_deleted += st['deleted'] | |
|
165 | # for f in _parsed: | |
|
166 | # st = f['stats'] | |
|
167 | # c.lines_added += st['added'] | |
|
168 | # c.lines_deleted += st['deleted'] | |
|
143 | 169 | |
|
144 | fid = h.FID('', f['filename']) | |
|
145 | c.files.append([fid, f['operation'], f['filename'], f['stats']]) | |
|
146 | c.included_files.append(f['filename']) | |
|
147 | html_diff = diff_processor.as_html(enable_comments=enable_comments, | |
|
148 | parsed_lines=[f]) | |
|
149 | c.changes[fid] = [f['operation'], f['filename'], html_diff, f] | |
|
170 | # fid = h.FID('', f['filename']) | |
|
171 | # c.files.append([fid, f['operation'], f['filename'], f['stats']]) | |
|
172 | # c.included_files.append(f['filename']) | |
|
173 | # html_diff = diff_processor.as_html(enable_comments=enable_comments, | |
|
174 | # parsed_lines=[f]) | |
|
175 | # c.changes[fid] = [f['operation'], f['filename'], html_diff, f] | |
|
150 | 176 | |
|
151 | 177 | def _extract_ordering(self, request): |
|
152 | 178 | column_index = safe_int(request.GET.get('order[0][column]')) |
@@ -295,8 +321,10 b' class PullrequestsController(BaseRepoCon' | |||
|
295 | 321 | redirect(url('pullrequest_home', repo_name=source_repo.repo_name)) |
|
296 | 322 | |
|
297 | 323 | default_target_repo = source_repo |
|
298 | if (source_repo.parent and | |
|
299 | not source_repo.parent.scm_instance().is_empty()): | |
|
324 | ||
|
325 | if source_repo.parent: | |
|
326 | parent_vcs_obj = source_repo.parent.scm_instance() | |
|
327 | if parent_vcs_obj and not parent_vcs_obj.is_empty(): | |
|
300 | 328 | # change default if we have a parent repo |
|
301 | 329 | default_target_repo = source_repo.parent |
|
302 | 330 | |
@@ -360,7 +388,8 b' class PullrequestsController(BaseRepoCon' | |||
|
360 | 388 | add_parent = False |
|
361 | 389 | if repo.parent: |
|
362 | 390 | if filter_query in repo.parent.repo_name: |
|
363 | if not repo.parent.scm_instance().is_empty(): | |

391 | parent_vcs_obj = repo.parent.scm_instance() | |
|
392 | if parent_vcs_obj and not parent_vcs_obj.is_empty(): | |
|
364 | 393 | add_parent = True |
|
365 | 394 | |
|
366 | 395 | limit = 20 - 1 if add_parent else 20 |
@@ -397,8 +426,10 b' class PullrequestsController(BaseRepoCon' | |||
|
397 | 426 | if not repo: |
|
398 | 427 | raise HTTPNotFound |
|
399 | 428 | |
|
429 | controls = peppercorn.parse(request.POST.items()) | |
|
430 | ||
|
400 | 431 | try: |
|
401 | _form = PullRequestForm(repo.repo_id)().to_python( | |

432 | _form = PullRequestForm(repo.repo_id)().to_python(controls) | |
|
402 | 433 | except formencode.Invalid as errors: |
|
403 | 434 | if errors.error_dict.get('revisions'): |
|
404 | 435 | msg = 'Revisions: %s' % errors.error_dict['revisions'] |
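
The new ``controls = peppercorn.parse(request.POST.items())`` call decodes the flat list of submitted form fields into nested lists and mappings, driven by ``__start__``/``__end__`` marker fields; that is why the form can now carry structured ``review_members`` entries. A minimal illustration (the field names are illustrative, not taken from the actual template):

.. code-block:: python

    import peppercorn

    # flat (name, value) pairs, as a browser would submit them
    fields = [
        ('__start__', 'review_members:sequence'),
        ('__start__', 'reviewer:mapping'),
        ('user_id', '2'),
        ('__end__', 'reviewer:mapping'),
        ('__end__', 'review_members:sequence'),
        ('pullrequest_title', 'Fix the frobnicator'),
    ]
    controls = peppercorn.parse(fields)
    # {'review_members': [{'user_id': '2'}],
    #  'pullrequest_title': 'Fix the frobnicator'}
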
@@ -417,7 +448,8 b' class PullrequestsController(BaseRepoCon' | |||
|
417 | 448 | target_repo = _form['target_repo'] |
|
418 | 449 | target_ref = _form['target_ref'] |
|
419 | 450 | commit_ids = _form['revisions'][::-1] |
|
420 | reviewers = | |

451 | reviewers = [ | |
|
452 | (r['user_id'], r['reasons']) for r in _form['review_members']] | |
|
421 | 453 | |
|
422 | 454 | # find the ancestor for this pr |
|
423 | 455 | source_db_repo = Repository.get_by_repo_name(_form['source_repo']) |
@@ -476,8 +508,11 b' class PullrequestsController(BaseRepoCon' | |||
|
476 | 508 | allowed_to_update = PullRequestModel().check_user_update( |
|
477 | 509 | pull_request, c.rhodecode_user) |
|
478 | 510 | if allowed_to_update: |
|
479 | if 'reviewers_ids' in request.POST: | |
|
480 | self._update_reviewers(pull_request_id) | |
|
511 | controls = peppercorn.parse(request.POST.items()) | |
|
512 | ||
|
513 | if 'review_members' in controls: | |
|
514 | self._update_reviewers( | |
|
515 | pull_request_id, controls['review_members']) | |
|
481 | 516 | elif str2bool(request.POST.get('update_commits', 'false')): |
|
482 | 517 | self._update_commits(pull_request) |
|
483 | 518 | elif str2bool(request.POST.get('close_pull_request', 'false')): |
@@ -506,32 +541,51 b' class PullrequestsController(BaseRepoCon' | |||
|
506 | 541 | return |
|
507 | 542 | |
|
508 | 543 | def _update_commits(self, pull_request): |
|
509 | try: | |
|
510 | if PullRequestModel().has_valid_update_type(pull_request): | |
|
511 | updated_version, changes = PullRequestModel().update_commits( | |
|
512 | pull_request) | |
|
513 | if updated_version: | |
|
544 | resp = PullRequestModel().update_commits(pull_request) | |
|
545 | ||
|
546 | if resp.executed: | |
|
514 | 547 |

515 | 548 |

516 | u'{count_added} added, {count_removed} removed ' | |

517 | u'commits.' | |

518 | ).format( | |

549 | u'{count_added} added, {count_removed} removed commits.') | |

550 | msg = msg.format( | |

519 | 551 |

520 | count_added=len(changes.added), | |

521 | count_removed=len(changes.removed)) | |

552 | count_added=len(resp.changes.added), | |

553 | count_removed=len(resp.changes.removed)) | |

522 | 554 |

555 ||
|
556 | registry = get_current_registry() | |
|
557 | rhodecode_plugins = getattr(registry, 'rhodecode_plugins', {}) | |
|
558 | channelstream_config = rhodecode_plugins.get('channelstream', {}) | |
|
559 | if channelstream_config.get('enabled'): | |
|
560 | message = msg + ( | |
|
561 | ' - <a onclick="window.location.reload()">' | |
|
562 | '<strong>{}</strong></a>'.format(_('Reload page'))) | |
|
563 | channel = '/repo${}$/pr/{}'.format( | |
|
564 | pull_request.target_repo.repo_name, | |
|
565 | pull_request.pull_request_id | |
|
566 | ) | |
|
567 | payload = { | |
|
568 | 'type': 'message', | |
|
569 | 'user': 'system', | |
|
570 | 'exclude_users': [request.user.username], | |
|
571 | 'channel': channel, | |
|
572 | 'message': { | |
|
573 | 'message': message, | |
|
574 | 'level': 'success', | |
|
575 | 'topic': '/notifications' | |
|
576 | } | |
|
577 | } | |
|
578 | channelstream_request( | |
|
579 | channelstream_config, [payload], '/message', | |
|
580 | raise_exc=False) | |
|
523 | 581 |
|
|
524 | h.flash(_("Nothing changed in pull request."), | |
|
525 | category='warning') | |
|
526 | else: | |
|
527 | msg = _( | |
|
528 | u"Skipping update of pull request due to reference " | |
|
529 | u"type: {reference_type}" | |
|
530 | ).format(reference_type=pull_request.source_ref_parts.type) | |
|
531 | h.flash(msg, category='warning') | |
|
532 | except CommitDoesNotExistError: | |
|
533 | h.flash( | |
|
534 | _(u'Update failed due to missing commits.'), category='error') | |
|
582 | msg = PullRequestModel.UPDATE_STATUS_MESSAGES[resp.reason] | |
|
583 | warning_reasons = [ | |
|
584 | UpdateFailureReason.NO_CHANGE, | |
|
585 | UpdateFailureReason.WRONG_REF_TPYE, | |
|
586 | ] | |
|
587 | category = 'warning' if resp.reason in warning_reasons else 'error' | |
|
588 | h.flash(msg, category=category) | |
|
535 | 589 | |
|
536 | 590 | @auth.CSRFRequired() |
|
537 | 591 | @LoginRequired() |
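
The channelstream block above pushes a live "pull request updated" notice, with a reload link, to everyone else watching the PR page. ``channelstream_request`` is RhodeCode's wrapper around an authenticated JSON POST to the channelstream server; a rough sketch of the wire call is below (the endpoint layout is an assumption based on this hunk, and real deployments also need the signed secret header the wrapper adds):

.. code-block:: python

    import requests

    def send_channelstream_messages(server_url, payloads):
        # channelstream accepts a JSON list of message payloads on /message;
        # authentication headers are omitted here for brevity
        resp = requests.post(server_url + '/message', json=payloads, timeout=5)
        resp.raise_for_status()
        return resp

    payload = {
        'type': 'message',
        'user': 'system',
        'channel': '/repo$demo$/pr/7',   # channel name format from the hunk
        'message': {'message': 'Pull request updated',
                    'level': 'success',
                    'topic': '/notifications'},
    }
    # send_channelstream_messages('http://127.0.0.1:8000', [payload])
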
@@ -601,11 +655,10 b' class PullrequestsController(BaseRepoCon' | |||
|
601 | 655 | merge_resp.failure_reason) |
|
602 | 656 | h.flash(msg, category='error') |
|
603 | 657 | |
|
604 | def _update_reviewers(self, pull_request_id): | |
|
605 | reviewers_ids = map(int, filter( | |

606 | lambda v: v not in [None, ''], | |
|
607 | request.POST.get('reviewers_ids', '').split(','))) | |
|
608 | PullRequestModel().update_reviewers(pull_request_id, reviewers_ids) | |
|
658 | def _update_reviewers(self, pull_request_id, review_members): | |
|
659 | reviewers = [ | |
|
660 | (int(r['user_id']), r['reasons']) for r in review_members] | |
|
661 | PullRequestModel().update_reviewers(pull_request_id, reviewers) | |
|
609 | 662 | Session().commit() |
|
610 | 663 | |
|
611 | 664 | def _reject_close(self, pull_request): |
@@ -655,6 +708,10 b' class PullrequestsController(BaseRepoCon' | |||
|
655 | 708 | c.pull_request, c.rhodecode_user) and not c.pull_request.is_closed() |
|
656 | 709 | c.allowed_to_merge = PullRequestModel().check_user_merge( |
|
657 | 710 | c.pull_request, c.rhodecode_user) and not c.pull_request.is_closed() |
|
711 | c.shadow_clone_url = PullRequestModel().get_shadow_clone_url( | |
|
712 | c.pull_request) | |
|
713 | c.allowed_to_delete = PullRequestModel().check_user_delete( | |
|
714 | c.pull_request, c.rhodecode_user) and not c.pull_request.is_closed() | |
|
658 | 715 | |
|
659 | 716 | cc_model = ChangesetCommentsModel() |
|
660 | 717 | |
@@ -669,23 +726,13 b' class PullrequestsController(BaseRepoCon' | |||
|
669 | 726 | c.pr_merge_status = False |
|
670 | 727 | # load compare data into template context |
|
671 | 728 | enable_comments = not c.pull_request.is_closed() |
|
672 | self._load_compare_data(c.pull_request, enable_comments=enable_comments) | |
|
673 | 729 | |
|
674 | # this is a hack to properly display links, when creating PR, the | |
|
675 | # compare view and others uses different notation, and | |
|
676 | # compare_commits.html renders links based on the target_repo. | |
|
677 | # We need to swap that here to generate it properly on the html side | |
|
678 | c.target_repo = c.source_repo | |
|
679 | 730 | |
|
680 | 731 | # inline comments |
|
681 | c.inline_cnt = 0 | |
|
682 | 732 | c.inline_comments = cc_model.get_inline_comments( |
|
683 | 733 | c.rhodecode_db_repo.repo_id, |
|
684 | pull_request=pull_request_id) | |

685 |

686 | for __, lines in c.inline_comments: | |
|
687 | for comments in lines.values(): | |
|
688 | c.inline_cnt += len(comments) | |
|
734 | pull_request=pull_request_id) | |
|
735 | c.inline_cnt = len(c.inline_comments) | |
|
689 | 736 | |
|
690 | 737 | # outdated comments |
|
691 | 738 | c.outdated_cnt = 0 |
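
Note the changed counting semantics here: the removed loops summed every individual comment inside the nested ``{file: {line: [comments]}}`` structure, while the new code takes ``len()`` of the top-level result, which counts entries rather than comments unless ``get_inline_comments`` now returns a flat, one-entry-per-comment result (that is an assumption about the model change, not visible in this diff). The removed logic, compacted into a runnable form:

.. code-block:: python

    # assumes the old shape: an iterable of (file_name, {line_no: [comments]})
    inline_comments = [
        ('setup.py', {'n10': ['looks good', 'typo here']}),
        ('README',   {'o3': ['please reword']}),
    ]
    inline_cnt = sum(
        len(comments)
        for __, lines in inline_comments
        for comments in lines.values())
    assert inline_cnt == 3             # three individual comments...
    assert len(inline_comments) == 2   # ...but only two top-level entries
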
@@ -702,6 +749,15 b' class PullrequestsController(BaseRepoCon' | |||
|
702 | 749 | else: |
|
703 | 750 | c.outdated_comments = {} |
|
704 | 751 | |
|
752 | self._load_compare_data( | |
|
753 | c.pull_request, c.inline_comments, enable_comments=enable_comments) | |
|
754 | ||
|
755 | # this is a hack to properly display links, when creating PR, the | |
|
756 | # compare view and others uses different notation, and | |
|
757 | # compare_commits.html renders links based on the target_repo. | |
|
758 | # We need to swap that here to generate it properly on the html side | |
|
759 | c.target_repo = c.source_repo | |
|
760 | ||
|
705 | 761 | # comments |
|
706 | 762 | c.comments = cc_model.get_comments(c.rhodecode_db_repo.repo_id, |
|
707 | 763 | pull_request=pull_request_id) |
@@ -251,6 +251,16 b' class SummaryController(BaseRepoControll' | |||
|
251 | 251 | } |
|
252 | 252 | return data |
|
253 | 253 | |
|
254 | @LoginRequired() | |
|
255 | @HasRepoPermissionAnyDecorator('repository.read', 'repository.write', | |
|
256 | 'repository.admin') | |
|
257 | @jsonify | |
|
258 | def repo_default_reviewers_data(self, repo_name): | |
|
259 | return { | |
|
260 | 'reviewers': [utils.reviewer_as_json( | |
|
261 | user=c.rhodecode_db_repo.user, reasons=None)] | |
|
262 | } | |
|
263 | ||
|
254 | 264 | @jsonify |
|
255 | 265 | def repo_refs_changelog_data(self, repo_name): |
|
256 | 266 | repo = c.rhodecode_repo |
@@ -86,3 +86,21 b' def get_commit_from_ref_name(repo, ref_n' | |||
|
86 | 86 | '%s "%s" does not exist' % (ref_type, ref_name)) |
|
87 | 87 | |
|
88 | 88 | return repo_scm.get_commit(commit_id) |
|
89 | ||
|
90 | ||
|
91 | def reviewer_as_json(user, reasons): | |
|
92 | """ | |
|
93 | Returns json struct of a reviewer for frontend | |
|
94 | ||
|
95 | :param user: the reviewer | |
|
96 | :param reasons: list of strings of why they are reviewers | |
|
97 | """ | |
|
98 | ||
|
99 | return { | |
|
100 | 'user_id': user.user_id, | |
|
101 | 'reasons': reasons, | |
|
102 | 'username': user.username, | |
|
103 | 'firstname': user.firstname, | |
|
104 | 'lastname': user.lastname, | |
|
105 | 'gravatar_link': h.gravatar_url(user.email, 14), | |
|
106 | } |
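
For reference, this helper produces the per-reviewer JSON the new frontend widgets consume. A self-contained usage sketch with a stand-in user object (all values illustrative):

.. code-block:: python

    import collections

    User = collections.namedtuple(
        'User', 'user_id username firstname lastname email')

    def gravatar_url(email, size):
        # stand-in for h.gravatar_url
        return 'https://gravatar.example.invalid/%s?s=%d' % (email, size)

    def reviewer_as_json(user, reasons):
        return {
            'user_id': user.user_id,
            'reasons': reasons,
            'username': user.username,
            'firstname': user.firstname,
            'lastname': user.lastname,
            'gravatar_link': gravatar_url(user.email, 14),
        }

    owner = User(2, 'admin', 'Admin', 'User', 'admin@example.invalid')
    print(reviewer_as_json(owner, reasons=['Repository owner']))
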
@@ -48,6 +48,7 b' from rhodecode.events.base import Rhodec' | |||
|
48 | 48 | |
|
49 | 49 | from rhodecode.events.user import ( # noqa |
|
50 | 50 | UserPreCreate, |
|
51 | UserPostCreate, | |
|
51 | 52 | UserPreUpdate, |
|
52 | 53 | UserRegistered |
|
53 | 54 | ) |
@@ -51,6 +51,19 b' class UserPreCreate(RhodecodeEvent):' | |||
|
51 | 51 | self.user_data = user_data |
|
52 | 52 | |
|
53 | 53 | |
|
54 | @implementer(IUserPreCreate) | |
|
55 | class UserPostCreate(RhodecodeEvent): | |
|
56 | """ | |
|
57 | An instance of this class is emitted as an :term:`event` after a new user | |
|
58 | object is created. | |
|
59 | """ | |
|
60 | name = 'user-post-create' | |
|
61 | display_name = lazy_ugettext('user post create') | |
|
62 | ||
|
63 | def __init__(self, user_data): | |
|
64 | self.user_data = user_data | |
|
65 | ||
|
66 | ||
|
54 | 67 | @implementer(IUserPreUpdate) |
|
55 | 68 | class UserPreUpdate(RhodecodeEvent): |
|
56 | 69 | """ |
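
``UserPostCreate`` follows the existing event pattern (note that the hunk registers it under the existing ``IUserPreCreate`` interface rather than a new one). Since RhodeCode events are plain Pyramid events, an integration could subscribe to it with a standard ``config.add_subscriber`` call; a hypothetical subscriber sketch:

.. code-block:: python

    from rhodecode.events import UserPostCreate

    def log_new_user(event):
        # user_data is whatever dict the emitting code passed in
        print('new user created: %s' % event.user_data.get('username'))

    def includeme(config):
        # `config` is the Pyramid Configurator used at application startup
        config.add_subscriber(log_new_user, UserPostCreate)
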
@@ -161,7 +161,7 b' class HipchatIntegrationType(Integration' | |||
|
161 | 161 | comment_text = data['comment']['text'] |
|
162 | 162 | if len(comment_text) > 200: |
|
163 | 163 | comment_text = '{comment_text}<a href="{comment_url}">...<a/>'.format( |
|
164 | comment_text=comment_text[:200], | |
|
164 | comment_text=h.html_escape(comment_text[:200]), | |
|
165 | 165 | comment_url=data['comment']['url'], |
|
166 | 166 | ) |
|
167 | 167 | |
@@ -179,8 +179,8 b' class HipchatIntegrationType(Integration' | |||
|
179 | 179 | number=data['pullrequest']['pull_request_id'], |
|
180 | 180 | pr_url=data['pullrequest']['url'], |
|
181 | 181 | pr_status=data['pullrequest']['status'], |
|
182 | pr_title=data['pullrequest']['title'], | |
|
183 | comment_text=comment_text | |
|
182 | pr_title=h.html_escape(data['pullrequest']['title']), | |
|
183 | comment_text=h.html_escape(comment_text) | |
|
184 | 184 | ) |
|
185 | 185 | ) |
|
186 | 186 | |
@@ -193,7 +193,7 b' class HipchatIntegrationType(Integration' | |||
|
193 | 193 | number=data['pullrequest']['pull_request_id'], |
|
194 | 194 | pr_url=data['pullrequest']['url'], |
|
195 | 195 | pr_status=data['pullrequest']['status'], |
|
196 | pr_title=data['pullrequest']['title'], | |
|
196 | pr_title=h.html_escape(data['pullrequest']['title']), | |
|
197 | 197 | ) |
|
198 | 198 | ) |
|
199 | 199 | |
@@ -206,11 +206,11 b' class HipchatIntegrationType(Integration' | |||
|
206 | 206 | }.get(event.__class__, str(event.__class__)) |
|
207 | 207 | |
|
208 | 208 | return ('Pull request <a href="{url}">#{number}</a> - {title} ' |
|
209 | '{action} by {user}').format( | |
|
209 | '{action} by <b>{user}</b>').format( | |
|
210 | 210 | user=data['actor']['username'], |
|
211 | 211 | number=data['pullrequest']['pull_request_id'], |
|
212 | 212 | url=data['pullrequest']['url'], |
|
213 | title=data['pullrequest']['title'], | |
|
213 | title=h.html_escape(data['pullrequest']['title']), | |
|
214 | 214 | action=action |
|
215 | 215 | ) |
|
216 | 216 | |
@@ -220,7 +220,6 b' class HipchatIntegrationType(Integration' | |||
|
220 | 220 | |
|
221 | 221 | branches_commits = {} |
|
222 | 222 | for commit in data['push']['commits']: |
|
223 | log.critical(commit) | |
|
224 | 223 | if commit['branch'] not in branches_commits: |
|
225 | 224 | branch_commits = {'branch': branch_data[commit['branch']], |
|
226 | 225 | 'commits': []} |
@@ -238,7 +237,7 b' class HipchatIntegrationType(Integration' | |||
|
238 | 237 | def format_repo_create_event(self, data): |
|
239 | 238 | return '<a href="{}">{}</a> ({}) repository created by <b>{}</b>'.format( |
|
240 | 239 | data['repo']['url'], |
|
241 | data['repo']['repo_name'], | |
|
240 | h.html_escape(data['repo']['repo_name']), | |
|
242 | 241 | data['repo']['repo_type'], |
|
243 | 242 | data['actor']['username'], |
|
244 | 243 | ) |
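
These ``h.html_escape`` calls close a markup-injection hole: repository names, pull request titles and comment text are user-controlled, and HipChat renders the message as HTML. The Python 2 stdlib equivalent behaves like this (shown only as an illustration of the escaping, not the helper RhodeCode actually uses):

.. code-block:: python

    import cgi

    title = '<img src=x onerror=alert(1)> my "fix"'
    safe = cgi.escape(title, quote=True)
    # '&lt;img src=x onerror=alert(1)&gt; my &quot;fix&quot;'
    message = 'Pull request <a href="{url}">#{number}</a> - {title}'.format(
        url='https://code.example.invalid/pr/7', number=7, title=safe)
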
@@ -173,7 +173,7 b' class SlackIntegrationType(IntegrationTy' | |||
|
173 | 173 | |
|
174 | 174 | return (textwrap.dedent( |
|
175 | 175 | ''' |
|
176 | {user} commented on pull request <{pr_url}|#{number}> - {pr_title}: | |
|
176 | *{user}* commented on pull request <{pr_url}|#{number}> - {pr_title}: | |
|
177 | 177 | >>> {comment_status}{comment_text} |
|
178 | 178 | ''').format( |
|
179 | 179 | comment_status=comment_status, |
@@ -208,7 +208,7 b' class SlackIntegrationType(IntegrationTy' | |||
|
208 | 208 | }.get(event.__class__, str(event.__class__)) |
|
209 | 209 | |
|
210 | 210 | return ('Pull request <{url}|#{number}> - {title} ' |
|
211 | '{action} by {user}').format( | |
|
211 | '`{action}` by *{user}*').format( | |
|
212 | 212 | user=data['actor']['username'], |
|
213 | 213 | number=data['pullrequest']['pull_request_id'], |
|
214 | 214 | url=data['pullrequest']['url'], |
@@ -222,7 +222,6 b' class SlackIntegrationType(IntegrationTy' | |||
|
222 | 222 | |
|
223 | 223 | branches_commits = {} |
|
224 | 224 | for commit in data['push']['commits']: |
|
225 | log.critical(commit) | |
|
226 | 225 | if commit['branch'] not in branches_commits: |
|
227 | 226 | branch_commits = {'branch': branch_data[commit['branch']], |
|
228 | 227 | 'commits': []} |
@@ -19,13 +19,15 b'' | |||
|
19 | 19 | # and proprietary license terms, please see https://rhodecode.com/licenses/ |
|
20 | 20 | |
|
21 | 21 | from __future__ import unicode_literals |
|
22 | import string | |
|
23 | from collections import OrderedDict | |
|
22 | 24 | |
|
23 | 25 | import deform |
|
24 | 26 | import logging |
|
25 | 27 | import requests |
|
26 | 28 | import colander |
|
27 | 29 | from celery.task import task |
|
28 | from mako.template import Template | |
|
30 | from requests.packages.urllib3.util.retry import Retry | |
|
29 | 31 | |
|
30 | 32 | from rhodecode import events |
|
31 | 33 | from rhodecode.translation import _ |
@@ -33,12 +35,127 b' from rhodecode.integrations.types.base i' | |||
|
33 | 35 | |
|
34 | 36 | log = logging.getLogger(__name__) |
|
35 | 37 | |
|
38 | # updating this required to update the `base_vars` passed in url calling func | |
|
39 | WEBHOOK_URL_VARS = [ | |
|
40 | 'repo_name', | |
|
41 | 'repo_type', | |
|
42 | 'repo_id', | |
|
43 | 'repo_url', | |
|
44 | ||
|
45 | # special attrs below that we handle, using multi-call | |
|
46 | 'branch', | |
|
47 | 'commit_id', | |
|
48 | ||
|
49 | # pr events vars | |
|
50 | 'pull_request_id', | |
|
51 | 'pull_request_url', | |
|
52 | ||
|
53 | ] | |
|
54 | URL_VARS = ', '.join('${' + x + '}' for x in WEBHOOK_URL_VARS) | |
|
55 | ||
|
56 | ||
|
57 | class WebhookHandler(object): | |
|
58 | def __init__(self, template_url, secret_token): | |
|
59 | self.template_url = template_url | |
|
60 | self.secret_token = secret_token | |
|
61 | ||
|
62 | def get_base_parsed_template(self, data): | |
|
63 | """ | |
|
64 | initially parses the passed in template with some common variables | |
|
65 | available on ALL calls | |
|
66 | """ | |
|
67 | # note: make sure to update the `WEBHOOK_URL_VARS` if this changes | |
|
68 | common_vars = { | |
|
69 | 'repo_name': data['repo']['repo_name'], | |
|
70 | 'repo_type': data['repo']['repo_type'], | |
|
71 | 'repo_id': data['repo']['repo_id'], | |
|
72 | 'repo_url': data['repo']['url'], | |
|
73 | } | |
|
74 | ||
|
75 | return string.Template( | |
|
76 | self.template_url).safe_substitute(**common_vars) | |
|
77 | ||
|
78 | def repo_push_event_handler(self, event, data): | |
|
79 | url = self.get_base_parsed_template(data) | |
|
80 | url_cals = [] | |
|
81 | branch_data = OrderedDict() | |
|
82 | for obj in data['push']['branches']: | |
|
83 | branch_data[obj['name']] = obj | |
|
84 | ||
|
85 | branches_commits = OrderedDict() | |
|
86 | for commit in data['push']['commits']: | |
|
87 | if commit['branch'] not in branches_commits: | |
|
88 | branch_commits = {'branch': branch_data[commit['branch']], | |
|
89 | 'commits': []} | |
|
90 | branches_commits[commit['branch']] = branch_commits | |
|
91 | ||
|
92 | branch_commits = branches_commits[commit['branch']] | |
|
93 | branch_commits['commits'].append(commit) | |
|
94 | ||
|
95 | if '${branch}' in url: | |
|
96 | # call it multiple times, for each branch if used in variables | |
|
97 | for branch, commit_ids in branches_commits.items(): | |
|
98 | branch_url = string.Template(url).safe_substitute(branch=branch) | |
|
99 | # call further down for each commit if used | |
|
100 | if '${commit_id}' in branch_url: | |
|
101 | for commit_data in commit_ids['commits']: | |
|
102 | commit_id = commit_data['raw_id'] | |
|
103 | commit_url = string.Template(branch_url).safe_substitute( | |
|
104 | commit_id=commit_id) | |
|
105 | # register per-commit call | |
|
106 | log.debug( | |
|
107 | 'register webhook call(%s) to url %s', event, commit_url) | |
|
108 | url_cals.append((commit_url, self.secret_token, data)) | |
|
109 | ||
|
110 | else: | |
|
111 | # register per-branch call | |
|
112 | log.debug( | |
|
113 | 'register webhook call(%s) to url %s', event, branch_url) | |
|
114 | url_cals.append((branch_url, self.secret_token, data)) | |
|
115 | ||
|
116 | else: | |
|
117 | log.debug( | |
|
118 | 'register webhook call(%s) to url %s', event, url) | |
|
119 | url_cals.append((url, self.secret_token, data)) | |
|
120 | ||
|
121 | return url_cals | |
|
122 | ||
|
123 | def repo_create_event_handler(self, event, data): | |
|
124 | url = self.get_base_parsed_template(data) | |
|
125 | log.debug( | |
|
126 | 'register webhook call(%s) to url %s', event, url) | |
|
127 | return [(url, self.secret_token, data)] | |
|
128 | ||
|
129 | def pull_request_event_handler(self, event, data): | |
|
130 | url = self.get_base_parsed_template(data) | |
|
131 | log.debug( | |
|
132 | 'register webhook call(%s) to url %s', event, url) | |
|
133 | url = string.Template(url).safe_substitute( | |
|
134 | pull_request_id=data['pullrequest']['pull_request_id'], | |
|
135 | pull_request_url=data['pullrequest']['url']) | |
|
136 | return [(url, self.secret_token, data)] | |
|
137 | ||
|
138 | def __call__(self, event, data): | |
|
139 | if isinstance(event, events.RepoPushEvent): | |
|
140 | return self.repo_push_event_handler(event, data) | |
|
141 | elif isinstance(event, events.RepoCreateEvent): | |
|
142 | return self.repo_create_event_handler(event, data) | |
|
143 | elif isinstance(event, events.PullRequestEvent): | |
|
144 | return self.pull_request_event_handler(event, data) | |
|
145 | else: | |
|
146 | raise ValueError('event type not supported: %s' % events) | |
|
147 | ||
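
The URL templating above relies on ``string.Template.safe_substitute``, which leaves unknown ``${var}`` placeholders intact instead of raising. That is what lets ``get_base_parsed_template`` fill in the repo-level variables first and leave ``${branch}``/``${commit_id}`` for the later per-branch and per-commit passes. A self-contained illustration (the hook URL is a placeholder):

.. code-block:: python

    import string

    template = 'http://hooks.example.invalid/${repo_name}/${branch}/${commit_id}'

    # first pass: repo variables only; ${branch}/${commit_id} survive untouched
    url = string.Template(template).safe_substitute(repo_name='demo')
    assert url == 'http://hooks.example.invalid/demo/${branch}/${commit_id}'

    # later passes expand one branch/commit at a time - one webhook call each
    for commit_id in ['abc123', 'def456']:
        branch_url = string.Template(url).safe_substitute(branch='default')
        print(string.Template(branch_url).safe_substitute(commit_id=commit_id))
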
|
36 | 148 | |
|
37 | 149 | class WebhookSettingsSchema(colander.Schema): |
|
38 | 150 | url = colander.SchemaNode( |
|
39 | 151 | colander.String(), |
|
40 | 152 | title=_('Webhook URL'), |
|
41 | description=_('URL of the webhook to receive POST event.'), | |
|
153 | description= | |
|
154 | _('URL of the webhook to receive POST event. Following variables ' | |
|
155 | 'are allowed to be used: {vars}. Some of the variables would ' | |
|
156 | 'trigger multiple calls, like ${{branch}} or ${{commit_id}}. ' | |
|
157 | 'Webhook will be called as many times as unique objects in ' | |
|
158 | 'data in such cases.').format(vars=URL_VARS), | |
|
42 | 159 | missing=colander.required, |
|
43 | 160 | required=True, |
|
44 | 161 | validator=colander.url, |
@@ -58,8 +175,6 b' class WebhookSettingsSchema(colander.Sch' | |||
|
58 | 175 | ) |
|
59 | 176 | |
|
60 | 177 | |
|
61 | ||
|
62 | ||
|
63 | 178 | class WebhookIntegrationType(IntegrationTypeBase): |
|
64 | 179 | key = 'webhook' |
|
65 | 180 | display_name = _('Webhook') |
@@ -104,14 +219,30 b' class WebhookIntegrationType(Integration' | |||
|
104 | 219 | return |
|
105 | 220 | |
|
106 | 221 | data = event.as_dict() |
|
107 | post_to_webhook(data, self.settings) | |
|
222 | template_url = self.settings['url'] | |
|
223 | ||
|
224 | handler = WebhookHandler(template_url, self.settings['secret_token']) | |
|
225 | url_calls = handler(event, data) | |
|
226 | log.debug('webhook: calling following urls: %s', | |
|
227 | [x[0] for x in url_calls]) | |
|
228 | post_to_webhook(url_calls) | |
|
108 | 229 | |
|
109 | 230 | |
|
110 | 231 | @task(ignore_result=True) |
|
111 | def post_to_webhook(data, settings): | |

112 | log.debug('sending event:%s to webhook %s', data['name'], settings['url']) | |
|
113 | resp = requests.post(settings['url'], json={ | |
|
114 | 'token': settings['secret_token'], | |
|
232 | def post_to_webhook(url_calls): | |
|
233 | max_retries = 3 | |
|
234 | for url, token, data in url_calls: | |
|
235 | # retry max N times | |
|
236 | retries = Retry( | |
|
237 | total=max_retries, | |
|
238 | backoff_factor=0.15, | |
|
239 | status_forcelist=[500, 502, 503, 504]) | |
|
240 | req_session = requests.Session() | |
|
241 | req_session.mount( | |
|
242 | 'http://', requests.adapters.HTTPAdapter(max_retries=retries)) | |
|
243 | ||
|
244 | resp = req_session.post(url, json={ | |
|
245 | 'token': token, | |
|
115 | 246 | 'event': data |
|
116 | 247 | }) |
|
117 | 248 | resp.raise_for_status() # raise exception on a failed request |
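
Two details of this retry setup are worth noting: the adapter is only mounted for ``http://``, so ``https://`` webhook URLs would not get retries as written, and urllib3's ``Retry`` does not retry POST requests on 5xx responses by default (POST is not in its default method whitelist), though connection errors are still retried. A standalone sketch of the same session setup with both gaps addressed (URL and token are placeholders):

.. code-block:: python

    import requests
    from requests.adapters import HTTPAdapter
    from requests.packages.urllib3.util.retry import Retry

    retries = Retry(
        total=3,
        backoff_factor=0.15,  # sleeps roughly 0.15 * 2**(n-1)s between attempts
        status_forcelist=[500, 502, 503, 504],
        method_whitelist=False)  # also retry POST on matching status codes
    session = requests.Session()
    session.mount('http://', HTTPAdapter(max_retries=retries))
    session.mount('https://', HTTPAdapter(max_retries=retries))

    resp = session.post('http://hooks.example.invalid/endpoint',
                        json={'token': 's3cr3t', 'event': {'name': 'demo'}})
    resp.raise_for_status()
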
@@ -49,7 +49,7 b' from rhodecode.model.meta import Session' | |||
|
49 | 49 | from rhodecode.model.user import UserModel |
|
50 | 50 | from rhodecode.model.db import ( |
|
51 | 51 | User, Repository, Permission, UserToPerm, UserGroupToPerm, UserGroupMember, |
|
52 | UserIpMap, UserApiKeys) | |
|
52 | UserIpMap, UserApiKeys, RepoGroup) | |
|
53 | 53 | from rhodecode.lib import caches |
|
54 | 54 | from rhodecode.lib.utils2 import safe_unicode, aslist, safe_str, md5 |
|
55 | 55 | from rhodecode.lib.utils import ( |
@@ -983,6 +983,9 b' class AuthUser(object):' | |||
|
983 | 983 | inherit = self.inherit_default_permissions |
|
984 | 984 | return AuthUser.check_ip_allowed(self.user_id, self.ip_addr, |
|
985 | 985 | inherit_from_default=inherit) |
|
986 | @property | |
|
987 | def personal_repo_group(self): | |
|
988 | return RepoGroup.get_user_personal_repo_group(self.user_id) | |
|
986 | 989 | |
|
987 | 990 | @classmethod |
|
988 | 991 | def check_ip_allowed(cls, user_id, ip_addr, inherit_from_default): |
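
This new property is what the ``ForksController`` hunk near the top of the diff switches to (``c.rhodecode_user.personal_repo_group``): the lookup moves behind the user object, so callers stop passing identifiers around. A reduced sketch of the pattern, with ``find_personal_group`` as a hypothetical stand-in for ``RepoGroup.get_user_personal_repo_group``:

.. code-block:: python

    def find_personal_group(user_id):
        # hypothetical stand-in for the real database lookup
        return {'group_name': u'u%s-personal' % user_id}

    class AuthUser(object):
        def __init__(self, user_id):
            self.user_id = user_id

        @property
        def personal_repo_group(self):
            # resolved lazily, only when a controller/template asks for it
            return find_personal_group(self.user_id)

    assert AuthUser(2).personal_repo_group['group_name'] == u'u2-personal'
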
@@ -163,7 +163,8 b' def get_access_path(environ):' | |||
|
163 | 163 | |
|
164 | 164 | |
|
165 | 165 | def vcs_operation_context( |
|
166 | environ, repo_name, username, action, scm, check_locking=True): | |

166 | environ, repo_name, username, action, scm, check_locking=True, | |
|
167 | is_shadow_repo=False): | |
|
167 | 168 | """ |
|
168 | 169 | Generate the context for a vcs operation, e.g. push or pull. |
|
169 | 170 | |
@@ -200,6 +201,7 b' def vcs_operation_context(' | |||
|
200 | 201 | 'locked_by': locked_by, |
|
201 | 202 | 'server_url': utils2.get_server_url(environ), |
|
202 | 203 | 'hooks': get_enabled_hook_classes(ui_settings), |
|
204 | 'is_shadow_repo': is_shadow_repo, | |
|
203 | 205 | } |
|
204 | 206 | return extras |
|
205 | 207 | |
@@ -363,6 +365,18 b' def attach_context_attributes(context, r' | |||
|
363 | 365 | # Fix this and remove it from base controller. |
|
364 | 366 | # context.repo_name = get_repo_slug(request) # can be empty |
|
365 | 367 | |
|
368 | diffmode = 'sideside' | |
|
369 | if request.GET.get('diffmode'): | |
|
370 | if request.GET['diffmode'] == 'unified': | |
|
371 | diffmode = 'unified' | |
|
372 | elif request.session.get('diffmode'): | |
|
373 | diffmode = request.session['diffmode'] | |
|
374 | ||
|
375 | context.diffmode = diffmode | |
|
376 | ||
|
377 | if request.session.get('diffmode') != diffmode: | |
|
378 | request.session['diffmode'] = diffmode | |
|
379 | ||
|
366 | 380 | context.csrf_token = auth.get_csrf_token() |
|
367 | 381 | context.backends = rhodecode.BACKENDS.keys() |
|
368 | 382 | context.backends.sort() |
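
The ``diffmode`` block implements a small "sticky preference" pattern: an explicit ``?diffmode=unified`` query parameter wins, otherwise the value remembered in the session is reused, and the resolved value is written back to the session for the next request. The same logic extracted into a standalone function (plain dicts stand in for ``request.GET`` and ``request.session``):

.. code-block:: python

    def resolve_diffmode(get_params, session):
        diffmode = 'sideside'                      # default view
        if get_params.get('diffmode'):
            if get_params['diffmode'] == 'unified':
                diffmode = 'unified'               # explicit URL override
        elif session.get('diffmode'):
            diffmode = session['diffmode']         # remembered preference
        if session.get('diffmode') != diffmode:
            session['diffmode'] = diffmode         # persist for next request
        return diffmode

    session = {}
    assert resolve_diffmode({'diffmode': 'unified'}, session) == 'unified'
    assert resolve_diffmode({}, session) == 'unified'  # sticks across requests
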
@@ -24,9 +24,10 b' import logging' | |||
|
24 | 24 | import threading |
|
25 | 25 | |
|
26 | 26 | from beaker.cache import _cache_decorate, cache_regions, region_invalidate |
|
27 | from sqlalchemy.exc import IntegrityError | |
|
27 | 28 | |
|
28 | 29 | from rhodecode.lib.utils import safe_str, md5 |
|
29 | from rhodecode.model.db import Session, CacheKey, IntegrityError | |

30 | from rhodecode.model.db import Session, CacheKey | |
|
30 | 31 | |
|
31 | 32 | log = logging.getLogger(__name__) |
|
32 | 33 |
@@ -18,16 +18,14 b'' | |||
|
18 | 18 | # RhodeCode Enterprise Edition, including its added features, Support services, |
|
19 | 19 | # and proprietary license terms, please see https://rhodecode.com/licenses/ |
|
20 | 20 | |
|
21 | import hashlib | |
|
22 | import itsdangerous | |
|
21 | 23 | import logging |
|
22 | 24 | import os |
|
23 | ||
|
24 | import itsdangerous | |
|
25 | 25 | import requests |
|
26 | ||
|
27 | 26 | from dogpile.core import ReadWriteMutex |
|
28 | 27 | |
|
29 | 28 | import rhodecode.lib.helpers as h |
|
30 | ||
|
31 | 29 | from rhodecode.lib.auth import HasRepoPermissionAny |
|
32 | 30 | from rhodecode.lib.ext_json import json |
|
33 | 31 | from rhodecode.model.db import User |
@@ -81,7 +79,7 b' def get_user_data(user_id):' | |||
|
81 | 79 | 'username': user.username, |
|
82 | 80 | 'first_name': user.name, |
|
83 | 81 | 'last_name': user.lastname, |
|
84 | 'icon_link': h.gravatar_url(user.email, | |

82 | 'icon_link': h.gravatar_url(user.email, 60), | |
|
85 | 83 | 'display_name': h.person(user, 'username_or_name_or_email'), |
|
86 | 84 | 'display_link': h.link_to_user(user), |
|
87 | 85 | 'notifications': user.user_data.get('notification_status', True) |
@@ -169,7 +167,9 b' def parse_channels_info(info_result, inc' | |||
|
169 | 167 | |
|
170 | 168 | |
|
171 | 169 | def log_filepath(history_location, channel_name): |
|
172 | filename = '{}.log'.format(channel_name.encode('hex')) | |
|
170 | hasher = hashlib.sha256() | |
|
171 | hasher.update(channel_name.encode('utf8')) | |
|
172 | filename = '{}.log'.format(hasher.hexdigest()) | |
|
173 | 173 | filepath = os.path.join(history_location, filename) |
|
174 | 174 | return filepath |
|
175 | 175 |
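
The switch from ``channel_name.encode('hex')`` to a SHA-256 digest removes a Python-2-only codec and keeps log filenames fixed-length and filesystem-safe for arbitrary unicode channel names. A quick check of the resulting shape:

.. code-block:: python

    import hashlib

    channel_name = u'/repo$demo$/pr/7'
    filename = '{}.log'.format(
        hashlib.sha256(channel_name.encode('utf8')).hexdigest())
    # always 64 hex characters plus '.log', whatever the channel name was
    assert len(filename) == 68
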
|