release: Merge default into stable for release preparation
marcink -
r1170:729990d5 merge stable
@@ -0,0 +1,137 b''
1 .. _svn-http:
2
3 |svn| With Write Over HTTP
4 ^^^^^^^^^^^^^^^^^^^^^^^^^^
5
6 To use |svn| with read/write support over the |svn| HTTP protocol, you have to
7 configure the HTTP |svn| backend.
8
9 Prerequisites
10 =============
11
12 - Enable HTTP support inside the admin VCS settings on your |RCE| instance
13 - You need to install the following tools on the machine that is running an
14 instance of |RCE|:
15 ``Apache HTTP Server`` and ``mod_dav_svn``.
16
17
18 Using the Ubuntu 14.04 distribution as an example, execute the following:
19
20 .. code-block:: bash
21
22 $ sudo apt-get install apache2 libapache2-mod-svn
23
24 Once installed, enable the ``dav_svn`` and ``headers`` modules:
25
26 .. code-block:: bash
27
28 $ sudo a2enmod dav_svn
29 $ sudo a2enmod headers
30
31
32 Configuring Apache Setup
33 ========================
34
35 .. tip::
36
37 It is recommended to run Apache on a port other than 80, due to possible
38 conflicts with other HTTP servers like nginx. To do this, set the
39 ``Listen`` parameter in the ``/etc/apache2/ports.conf`` file, for example
40 ``Listen 8090``.
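If you prefer to script the port change, a ``sed`` one-liner works. The snippet below rehearses it on a scratch copy first; the 80 -> 8090 values are just the example from the tip, so adjust them and the real ``/etc/apache2/ports.conf`` path to your setup:

```shell
# Rehearse the Listen-port change on a scratch copy before touching
# the real /etc/apache2/ports.conf (example values from the tip above).
printf 'Listen 80\n' > /tmp/ports.conf
sed -i 's/^Listen 80$/Listen 8090/' /tmp/ports.conf
grep '^Listen' /tmp/ports.conf    # now: Listen 8090
```

Once the real file is updated, Apache must be restarted for the new port to take effect.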
41
42
43 .. warning::
44
45 Make sure the Apache instance running the ``mod_dav_svn`` module is
46 accessible only by |RCE|. Otherwise, anyone can browse
47 the repositories or run Subversion operations (checkout/commit/etc.).
48
49 It is also recommended to run Apache as the same user as |RCE|, otherwise
50 permission issues could occur. To do this, edit the ``/etc/apache2/envvars`` file:
51
52 .. code-block:: apache
53
54 export APACHE_RUN_USER=rhodecode
55 export APACHE_RUN_GROUP=rhodecode
56
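A quick way to confirm the override before restarting Apache is to source the file and echo the variables. The sketch below rehearses this against a scratch copy; the real file is ``/etc/apache2/envvars``:

```shell
# Sanity-check the user/group override by sourcing a scratch copy of
# the envvars file (real path: /etc/apache2/envvars).
cat > /tmp/envvars <<'EOF'
export APACHE_RUN_USER=rhodecode
export APACHE_RUN_GROUP=rhodecode
EOF
. /tmp/envvars
echo "$APACHE_RUN_USER:$APACHE_RUN_GROUP"   # rhodecode:rhodecode
```

After editing the real file, restart Apache so the new user and group take effect.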
57 1. To configure Apache, create and edit a virtual hosts file, for example
58 :file:`/etc/apache2/sites-available/default.conf`. Below is an example
59 that includes the auto-generated ``mod_dav_svn.conf`` config
60 from a configured |RCE| instance.
61
62 .. code-block:: apache
63
64 <VirtualHost *:8090>
65 ServerAdmin rhodecode-admin@localhost
66 DocumentRoot /var/www/html
67 ErrorLog ${APACHE_LOG_DIR}/error.log
68 CustomLog ${APACHE_LOG_DIR}/access.log combined
69 Include /home/user/.rccontrol/enterprise-1/mod_dav_svn.conf
70 </VirtualHost>
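Before reloading Apache it is worth checking that the ``Include``'d file actually exists, since a missing target makes ``configtest`` fail. A minimal sketch, using a stand-in path for the instance path shown in the example above:

```shell
# The Include target must exist or Apache will refuse to start.
# /tmp/mod_dav_svn.conf stands in for the example's
# /home/user/.rccontrol/enterprise-1/mod_dav_svn.conf path.
CONF=/tmp/mod_dav_svn.conf
touch "$CONF"
[ -r "$CONF" ] && echo "include target ok"
```

With the file in place, run ``sudo apachectl configtest`` and then reload Apache (``sudo service apache2 reload`` on Ubuntu 14.04).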
71
72
73 2. Go to the :menuselection:`Admin --> Settings --> VCS` page, and
74 enable :guilabel:`Proxy Subversion HTTP requests`, and specify the
75 :guilabel:`Subversion HTTP Server URL`.
76
77 3. Open the |RCE| configuration file,
78 :file:`/home/{user}/.rccontrol/{instance-id}/rhodecode.ini`
79
80 4. Add the following configuration options to the ``[app:main]``
81 section, if not already present.
82 
83 This enables mapping of the created |RCE| repo groups into special
84 |svn| paths. Each time a new repository group is created, the system will
85 update the template file and create a new mapping. The Apache web server
86 needs to be reloaded to pick up the changes to this file.
87 To automate this, configure ``svn.proxy.reload_cmd`` inside the .ini file.
88 Example configuration:
89
90
91 .. code-block:: ini
92
93 ############################################################
94 ### Subversion proxy support (mod_dav_svn) ###
95 ### Maps RhodeCode repo groups into SVN paths for Apache ###
96 ############################################################
97 ## Enable or disable the config file generation.
98 svn.proxy.generate_config = true
99 ## Generate config file with `SVNListParentPath` set to `On`.
100 svn.proxy.list_parent_path = true
101 ## Set location and file name of generated config file.
102 svn.proxy.config_file_path = %(here)s/mod_dav_svn.conf
103 ## Used as a prefix to the <Location> block in the generated config file.
104 ## In most cases it should be set to `/`.
105 svn.proxy.location_root = /
106 ## Command to reload the mod dav svn configuration on change.
107 ## Example: `/etc/init.d/apache2 reload`
108 svn.proxy.reload_cmd = /etc/init.d/apache2 reload
109 ## If the timeout expires before the reload command finishes, the command will
110 ## be killed. Setting it to zero means no timeout. Defaults to 10 seconds.
111 #svn.proxy.reload_timeout = 10
112
113
114 This creates a special template file called ``mod_dav_svn.conf``. We
115 used that file path in the Apache config above, inside the ``Include`` statement.
116 It's also possible to generate the config from the
117 :menuselection:`Admin --> Settings --> VCS` page.
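The generated file is plain ``mod_dav_svn`` configuration: one ``<Location>`` block per repository group, rooted at ``svn.proxy.location_root``. The fragment below is a hypothetical illustration of its shape (real group names and paths will differ), written so the same ``grep`` check can be run against an actual generated file:

```shell
# Hypothetical shape of a generated mod_dav_svn.conf; each repo group
# maps to one <Location> block under svn.proxy.location_root.
cat > /tmp/mod_dav_svn.conf <<'EOF'
<Location /my-group>
  DAV svn
  SVNParentPath /srv/repos/my-group
  SVNListParentPath On
</Location>
EOF
grep -c '<Location' /tmp/mod_dav_svn.conf   # 1
```

A quick ``grep -c '<Location'`` against the real file is an easy way to confirm it was regenerated after adding a repository group.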
118
119
120 Using |svn|
121 ===========
122
123 Once |svn| has been enabled on your instance, you can use it with the
124 following examples. For more |svn| information, see the `Subversion Red Book`_.
125
126 .. code-block:: bash
127
128 # To check out a repository
129 svn checkout http://my-svn-server.example.com/my-svn-repo
130 
131 # To commit changes
132 svn commit
133
134
135 .. _Subversion Red Book: http://svnbook.red-bean.com/en/1.7/svn-book.html#svn.ref.svn
136
@@ -0,0 +1,17 b''
1 .. _checklist-pull-request:
2
3 =======================
4 Pull Request Checklists
5 =======================
6
7
8
9 Checklists for Pull Request
10 ===========================
11
12
13 - Informative description
14 - Linear commit history
15 - Rebased on top of latest changes
16 - Add ticket references, e.g. fixes #123, references #123, etc.
17
1 NO CONTENT: new file 100644, binary diff hidden
@@ -0,0 +1,159 b''
1 |RCE| 4.5.0 |RNS|
2 -----------------
3
4 Release Date
5 ^^^^^^^^^^^^
6
7 - 2016-12-02
8
9
10 New Features
11 ^^^^^^^^^^^^
12
13 - Diffs: re-implemented diff engine. Added: Syntax highlighting inside diffs,
14 new side-by-side view with commenting and live chat. Enabled soft-wrapping of
15 long lines and much improved rendering speed for large diffs.
16 - File source view: new file display engine. The file view now
17 soft-wraps long lines. Double clicking inside the file view shows occurrences
18 of the clicked item. Added pygments-markdown-lexer for highlighting markdown syntax.
19 - Files annotation: Added new grouped annotations. Color all related commits
20 by double clicking a single commit in the annotation view.
21 - Pull request reviewers (EE only): added new default reviewers functionality.
22 Allows picking users or user groups defined as reviewers for new pull requests.
23 Picking reviewers can be based on branch name, changed file name patterns or
24 the original author of the changed source code, e.g. *.css -> design team,
25 master branch -> repo owner. Fixes #1131.
26 - Pull request reviewers: store and show reasons why given person is a reviewer.
27 Manually adding reviewers after creating a PR will now be also indicated
28 together with who added a given person to review.
29 - Integrations: Webhooks integration now allows using variables inside the
30 call URL. Currently supported variables are ${repo_name}, ${repo_type},
31 ${repo_id}, ${repo_url}, ${branch}, ${commit_id}, ${pull_request_id},
32 ${pull_request_url}. Commits are now grouped by branches as well.
33 Allows much easier integration with CI systems.
34 - Integrations (EE only): allow a wildcard * project key in Jira integration
35 settings so multiple projects can be referenced per commit, fixes #4267.
36 - Live notifications: RhodeCode sends live notifications to online
37 users on certain events and pages. Currently this works on: invite to chat,
38 update pull request, commit/inline comment. Part of live code review system.
39 Allows users to update the reviewed code while doing the review and never
40 miss any updates or comment replies as they happen. Requires channelstream
41 to be enabled.
42 - Repository groups: added default personal repository groups. Personal groups
43 are an isolated playground for users, allowing them to create projects or forks.
44 Adds new setting to automatically create personal repo groups for newly
45 created users. New groups are created from specified pattern, for example
46 /u/{username}. Implements #4003.
47 - Security: It's now possible to disable password reset functionality.
48 This is useful for cases when users only use LDAP or similar types of
49 authentication. Implements #3944
50 - Pull requests: exposed shadow repositories to end users. Users are now given
51 access to the shadow repository, which represents the state after the merge
52 is performed. In this way users, and especially CI servers, can much more
53 easily perform code analysis of the final merged code.
54 - Pull requests: My account > pull request page now uses datagrid.
55 It's faster, filterable and sortable. Fixes #4297.
56 - Pull requests: delete pull request action was moved from my account
57 into pull request view itself. This is where everyone was looking for it.
58 - Pull requests: improve showing failed merges with proper status in pull
59 request page.
60 - User groups: overhaul of edit user group page. Added new selector for
61 adding new user group members.
62 - Licensing (EE only): exposed unlock link to deactivate users that are over
63 license limit, to unlock full functionality. This might happen when migrating
64 from CE into EE, and your license supports fewer active users than allowed.
65 - Global settings: add a new header/footer template to allow flash filtering.
66 In case a license warning appears and an admin wants to hide it for some
67 time, the new template can be used to do this.
68 - System info: added create snapshot button to easily generate system state
69 report. Comes in handy for support and reporting. System state holds
70 information such as free disk/memory, CPU load and some of RhodeCode settings.
71 - System info: fetch and show vcs settings from vcsserver. Fixes #4276.
72 - System info: use real memory usage based on new psutil api available.
73 - System info: added info about temporary storage.
74 - System info: expose inode limits and usage. Fixes #4282.
75 - Ui: added new icon for merge commit.
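As a sketch of the webhook variable expansion described in the integrations note above (the CI URL and values are made up; the substitution itself is performed by RhodeCode when the hook fires, not by your shell):

```shell
# Made-up call URL showing how two of the documented variables expand;
# RhodeCode performs this substitution itself when firing the webhook.
repo_name=my-repo
commit_id=729990d5
url="http://ci.example.com/build?repo=${repo_name}&commit=${commit_id}"
echo "$url"
```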
76
77
78
79 General
80 ^^^^^^^
81
82 - Notifications: move all notifications into polymer for consistency.
83 Fixes #4201.
84 - Live chat (EE): Improved UI for live-chat. Use Codemirror editor as
85 input for text box.
86 - Api: WARNING DEPRECATION, refactor repository group schemas. Fixes #4133.
87 When using create_repo, create_repo_group, update_repo, update_repo_group
88 the *_name parameter now takes the full path including sub repository groups.
89 This is the only way to add a resource under another repository group.
90 Furthermore, giving a non-existing path will no longer create the missing
91 structure. This change makes the api more consistent and better validates
92 the errors in the data sent to a given api call.
93 - Pull requests: disable subrepo handling on pull requests. It means users can
94 now use more types of repositories with subrepos to create pull requests.
95 Since handling is disabled, repositories behind authentication or outside
96 of the network can be used.
97 - VCSServer: fetch backend info from vcsserver including git/hg/svn versions
98 and connection information.
99 - Svn support: it's no longer required to provide the repo root path to
100 generate the mod-dav-svn config. Fixes #4203.
101 - Svn support: Add reload command option (svn.proxy.reload_cmd) to ini files.
102 Apache can now be automatically reloaded when the mod_dav_svn config changes.
103 - Svn support: Add a view to trigger the (re)generation of Apache mod_dav_svn
104 configuration file. Users are able to generate such a file manually by
105 clicking that button.
106 - Dependency: updated subversion library to 1.9.
107 - Dependency: updated ipython to 5.1.0.
108 - Dependency: updated psutil to 4.3.1.
109
110
111 Security
112 ^^^^^^^^
113
114 - Hipchat: escape user-entered data to avoid xss/formatting problems.
115 - VCSServer: obfuscate credentials added into remote url during remote
116 repository creation. Prevents leaking of those credentials inside
117 RhodeCode logs.
118
119
120 Performance
121 ^^^^^^^^^^^
122
123 - Diffs: new diff engine is much smarter when it comes to showing huge diffs.
124 The rendering speed should be much improved in such cases, however showing
125 full diff is still supported.
126 - VCS backends: when using a repo object from database, re-use this information
127 instead of trying to detect a backend. Reduces the traffic to vcsserver.
128 - Pull requests: Add a column to hold the last merge revision. This will skip
129 heavy recalculation of merge state if nothing changed inside a pull request.
130 - File source view: don't load the file if it is over the size limit since it
131 won't be displayed anyway. This increases the speed of loading the page when
132 a file is above the defined cut-off limit.
133
134
135 Fixes
136 ^^^^^
137
138 - Users admin: fixed search filter in user admin page.
139 - Autocomplete: improve the lookup of users with non-ascii characters. In the
140 case of a unicode email, the previous method could generate wrong data and
141 prevent such users from showing up in search.
142 - Svn: added request header downgrade for COPY command to work on
143 https setup. Fixes #4307.
144 - Svn: add handling of renamed files inside our generated changes metadata.
145 Fixes #4258.
146 - Pull requests: fixed problem with creating pull requests on empty repositories.
147 - Events: use branch from previous commit for repo push event commits data so
148 that per-branch grouping works. Fixes #4233.
149 - Login: make sure recaptcha data is always validated. Fixes #4279.
150 - Vcs: Use commit date as modification time when creating archives.
151 Fixes problem with unstable hashes for archives. Fixes #4247.
152 - Issue trackers: fixed bug where saving empty issue tracker via form was
153 causing exception. Fixes #4278.
154 - Styling: fixed gravatar size for pull request reviewers.
155 - Ldap: fixed email extraction typo. An empty email from LDAP server will now
156 not overwrite the stored one.
157 - Integrations: use consistent formatting of users data in Slack integration.
158 - Meta-tags: meta tags are not taken into account when truncating descriptions
159 that are too long. Fixes #4305. No newline at end of file
@@ -0,0 +1,14 b''
1 Copyright 2006 Google Inc.
2 http://code.google.com/p/google-diff-match-patch/
3
4 Licensed under the Apache License, Version 2.0 (the "License");
5 you may not use this file except in compliance with the License.
6 You may obtain a copy of the License at
7
8 http://www.apache.org/licenses/LICENSE-2.0
9
10 Unless required by applicable law or agreed to in writing, software
11 distributed under the License is distributed on an "AS IS" BASIS,
12 WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
13 See the License for the specific language governing permissions and
14 limitations under the License.
@@ -0,0 +1,12 b''
1 Copyright © 2015 Jürgen Hermann <jh@web.de>
2
3 Licensed under the Apache License, Version 2.0 (the "License");
4 you may not use this file except in compliance with the License.
5 You may obtain a copy of the License at
6 http://www.apache.org/licenses/LICENSE-2.0
7
8 Unless required by applicable law or agreed to in writing, software
9 distributed under the License is distributed on an "AS IS" BASIS,
10 WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
11 See the License for the specific language governing permissions and
12 limitations under the License.
@@ -0,0 +1,665 b''
1 # -*- coding: utf-8 -*-
2
3 # Copyright (C) 2011-2016 RhodeCode GmbH
4 #
5 # This program is free software: you can redistribute it and/or modify
6 # it under the terms of the GNU Affero General Public License, version 3
7 # (only), as published by the Free Software Foundation.
8 #
9 # This program is distributed in the hope that it will be useful,
10 # but WITHOUT ANY WARRANTY; without even the implied warranty of
11 # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
12 # GNU General Public License for more details.
13 #
14 # You should have received a copy of the GNU Affero General Public License
15 # along with this program. If not, see <http://www.gnu.org/licenses/>.
16 #
17 # This program is dual-licensed. If you wish to learn more about the
18 # RhodeCode Enterprise Edition, including its added features, Support services,
19 # and proprietary license terms, please see https://rhodecode.com/licenses/
20
21 import logging
22 import difflib
23 from itertools import groupby
24
25 from pygments import lex
26 from pygments.formatters.html import _get_ttype_class as pygment_token_class
27 from rhodecode.lib.helpers import (
28 get_lexer_for_filenode, get_lexer_safe, html_escape)
29 from rhodecode.lib.utils2 import AttributeDict
30 from rhodecode.lib.vcs.nodes import FileNode
31 from rhodecode.lib.diff_match_patch import diff_match_patch
32 from rhodecode.lib.diffs import LimitedDiffContainer
33 from pygments.lexers import get_lexer_by_name
34
35 plain_text_lexer = get_lexer_by_name(
36 'text', stripall=False, stripnl=False, ensurenl=False)
37
38
39 log = logging.getLogger(__name__)
40
41
42 def filenode_as_lines_tokens(filenode, lexer=None):
43 lexer = lexer or get_lexer_for_filenode(filenode)
44 log.debug('Generating file node pygment tokens for %s, %s', lexer, filenode)
45 tokens = tokenize_string(filenode.content, lexer)
46 lines = split_token_stream(tokens, split_string='\n')
47 rv = list(lines)
48 return rv
49
50
51 def tokenize_string(content, lexer):
52 """
53 Use pygments to tokenize some content based on a lexer
54 ensuring all original new lines and whitespace is preserved
55 """
56
57 lexer.stripall = False
58 lexer.stripnl = False
59 lexer.ensurenl = False
60 for token_type, token_text in lex(content, lexer):
61 yield pygment_token_class(token_type), token_text
62
63
64 def split_token_stream(tokens, split_string=u'\n'):
65 """
66 Take a list of (TokenType, text) tuples and split them by a string
67
68 >>> list(split_token_stream([(TEXT, 'some\ntext'), (TEXT, 'more\n')]))
69 [[(TEXT, 'some')],
70 [(TEXT, 'text'), (TEXT, 'more')], [(TEXT, '')]]
71 """
72
73 buffer = []
74 for token_class, token_text in tokens:
75 parts = token_text.split(split_string)
76 for part in parts[:-1]:
77 buffer.append((token_class, part))
78 yield buffer
79 buffer = []
80
81 buffer.append((token_class, parts[-1]))
82
83 if buffer:
84 yield buffer
85
86
87 def filenode_as_annotated_lines_tokens(filenode):
88 """
89 Take a file node and return a list of annotations => lines, if no annotation
90 is found, it will be None.
91
92 eg:
93
94 [
95 (annotation1, [
96 (1, line1_tokens_list),
97 (2, line2_tokens_list),
98 ]),
99 (annotation2, [
100 (3, line1_tokens_list),
101 ]),
102 (None, [
103 (4, line1_tokens_list),
104 ]),
105 (annotation1, [
106 (5, line1_tokens_list),
107 (6, line2_tokens_list),
108 ])
109 ]
110 """
111
112 commit_cache = {} # cache commit_getter lookups
113
114 def _get_annotation(commit_id, commit_getter):
115 if commit_id not in commit_cache:
116 commit_cache[commit_id] = commit_getter()
117 return commit_cache[commit_id]
118
119 annotation_lookup = {
120 line_no: _get_annotation(commit_id, commit_getter)
121 for line_no, commit_id, commit_getter, line_content
122 in filenode.annotate
123 }
124
125 annotations_lines = ((annotation_lookup.get(line_no), line_no, tokens)
126 for line_no, tokens
127 in enumerate(filenode_as_lines_tokens(filenode), 1))
128
129 grouped_annotations_lines = groupby(annotations_lines, lambda x: x[0])
130
131 for annotation, group in grouped_annotations_lines:
132 yield (
133 annotation, [(line_no, tokens)
134 for (_, line_no, tokens) in group]
135 )
136
137
138 def render_tokenstream(tokenstream):
139 result = []
140 for token_class, token_ops_texts in rollup_tokenstream(tokenstream):
141
142 if token_class:
143 result.append(u'<span class="%s">' % token_class)
144 else:
145 result.append(u'<span>')
146
147 for op_tag, token_text in token_ops_texts:
148
149 if op_tag:
150 result.append(u'<%s>' % op_tag)
151
152 escaped_text = html_escape(token_text)
153
154 # TODO: dan: investigate showing hidden characters like space/nl/tab
155 # escaped_text = escaped_text.replace(' ', '<sp> </sp>')
156 # escaped_text = escaped_text.replace('\n', '<nl>\n</nl>')
157 # escaped_text = escaped_text.replace('\t', '<tab>\t</tab>')
158
159 result.append(escaped_text)
160
161 if op_tag:
162 result.append(u'</%s>' % op_tag)
163
164 result.append(u'</span>')
165
166 html = ''.join(result)
167 return html
168
169
170 def rollup_tokenstream(tokenstream):
171 """
172 Group a token stream of the format:
173
174 ('class', 'op', 'text')
175 or
176 ('class', 'text')
177
178 into
179
180 [('class1',
181 [('op1', 'text'),
182 ('op2', 'text')]),
183 ('class2',
184 [('op3', 'text')])]
185
186 This is used to get the minimal tags necessary when
187 rendering to html eg for a token stream ie.
188
189 <span class="A"><ins>he</ins>llo</span>
190 vs
191 <span class="A"><ins>he</ins></span><span class="A">llo</span>
192
193 If a 2 tuple is passed in, the output op will be an empty string.
194
195 eg:
196
197 >>> rollup_tokenstream([('classA', '', 'h'),
198 ('classA', 'del', 'ell'),
199 ('classA', '', 'o'),
200 ('classB', '', ' '),
201 ('classA', '', 'the'),
202 ('classA', '', 're'),
203 ])
204
205 [('classA', [('', 'h'), ('del', 'ell'), ('', 'o')],
206 ('classB', [('', ' ')],
207 ('classA', [('', 'there')]]
208
209 """
210 if tokenstream and len(tokenstream[0]) == 2:
211 tokenstream = ((t[0], '', t[1]) for t in tokenstream)
212
213 result = []
214 for token_class, op_list in groupby(tokenstream, lambda t: t[0]):
215 ops = []
216 for token_op, token_text_list in groupby(op_list, lambda o: o[1]):
217 text_buffer = []
218 for t_class, t_op, t_text in token_text_list:
219 text_buffer.append(t_text)
220 ops.append((token_op, ''.join(text_buffer)))
221 result.append((token_class, ops))
222 return result
223
224
225 def tokens_diff(old_tokens, new_tokens, use_diff_match_patch=True):
226 """
227 Converts a list of (token_class, token_text) tuples to a list of
228 (token_class, token_op, token_text) tuples where token_op is one of
229 ('ins', 'del', '')
230
231 :param old_tokens: list of (token_class, token_text) tuples of old line
232 :param new_tokens: list of (token_class, token_text) tuples of new line
233 :param use_diff_match_patch: boolean, will use google's diff match patch
234 library which has options to 'smooth' out the character by character
235 differences making nicer ins/del blocks
236 """
237
238 old_tokens_result = []
239 new_tokens_result = []
240
241 similarity = difflib.SequenceMatcher(None,
242 ''.join(token_text for token_class, token_text in old_tokens),
243 ''.join(token_text for token_class, token_text in new_tokens)
244 ).ratio()
245
246 if similarity < 0.6: # return, the blocks are too different
247 for token_class, token_text in old_tokens:
248 old_tokens_result.append((token_class, '', token_text))
249 for token_class, token_text in new_tokens:
250 new_tokens_result.append((token_class, '', token_text))
251 return old_tokens_result, new_tokens_result, similarity
252
253 token_sequence_matcher = difflib.SequenceMatcher(None,
254 [x[1] for x in old_tokens],
255 [x[1] for x in new_tokens])
256
257 for tag, o1, o2, n1, n2 in token_sequence_matcher.get_opcodes():
258 # check the differences by token block types first to give a more
259 # nicer "block" level replacement vs character diffs
260
261 if tag == 'equal':
262 for token_class, token_text in old_tokens[o1:o2]:
263 old_tokens_result.append((token_class, '', token_text))
264 for token_class, token_text in new_tokens[n1:n2]:
265 new_tokens_result.append((token_class, '', token_text))
266 elif tag == 'delete':
267 for token_class, token_text in old_tokens[o1:o2]:
268 old_tokens_result.append((token_class, 'del', token_text))
269 elif tag == 'insert':
270 for token_class, token_text in new_tokens[n1:n2]:
271 new_tokens_result.append((token_class, 'ins', token_text))
272 elif tag == 'replace':
273 # if same type token blocks must be replaced, do a diff on the
274 # characters in the token blocks to show individual changes
275
276 old_char_tokens = []
277 new_char_tokens = []
278 for token_class, token_text in old_tokens[o1:o2]:
279 for char in token_text:
280 old_char_tokens.append((token_class, char))
281
282 for token_class, token_text in new_tokens[n1:n2]:
283 for char in token_text:
284 new_char_tokens.append((token_class, char))
285
286 old_string = ''.join([token_text for
287 token_class, token_text in old_char_tokens])
288 new_string = ''.join([token_text for
289 token_class, token_text in new_char_tokens])
290
291 char_sequence = difflib.SequenceMatcher(
292 None, old_string, new_string)
293 copcodes = char_sequence.get_opcodes()
294 obuffer, nbuffer = [], []
295
296 if use_diff_match_patch:
297 dmp = diff_match_patch()
298 dmp.Diff_EditCost = 11 # TODO: dan: extract this to a setting
299 reps = dmp.diff_main(old_string, new_string)
300 dmp.diff_cleanupEfficiency(reps)
301
302 a, b = 0, 0
303 for op, rep in reps:
304 l = len(rep)
305 if op == 0:
306 for i, c in enumerate(rep):
307 obuffer.append((old_char_tokens[a+i][0], '', c))
308 nbuffer.append((new_char_tokens[b+i][0], '', c))
309 a += l
310 b += l
311 elif op == -1:
312 for i, c in enumerate(rep):
313 obuffer.append((old_char_tokens[a+i][0], 'del', c))
314 a += l
315 elif op == 1:
316 for i, c in enumerate(rep):
317 nbuffer.append((new_char_tokens[b+i][0], 'ins', c))
318 b += l
319 else:
320 for ctag, co1, co2, cn1, cn2 in copcodes:
321 if ctag == 'equal':
322 for token_class, token_text in old_char_tokens[co1:co2]:
323 obuffer.append((token_class, '', token_text))
324 for token_class, token_text in new_char_tokens[cn1:cn2]:
325 nbuffer.append((token_class, '', token_text))
326 elif ctag == 'delete':
327 for token_class, token_text in old_char_tokens[co1:co2]:
328 obuffer.append((token_class, 'del', token_text))
329 elif ctag == 'insert':
330 for token_class, token_text in new_char_tokens[cn1:cn2]:
331 nbuffer.append((token_class, 'ins', token_text))
332 elif ctag == 'replace':
333 for token_class, token_text in old_char_tokens[co1:co2]:
334 obuffer.append((token_class, 'del', token_text))
335 for token_class, token_text in new_char_tokens[cn1:cn2]:
336 nbuffer.append((token_class, 'ins', token_text))
337
338 old_tokens_result.extend(obuffer)
339 new_tokens_result.extend(nbuffer)
340
341 return old_tokens_result, new_tokens_result, similarity
342
343
344 class DiffSet(object):
345 """
346 An object for parsing the diff result from diffs.DiffProcessor and
347 adding highlighting, side by side/unified renderings and line diffs
348 """
349
350 HL_REAL = 'REAL' # highlights using original file, slow
351 HL_FAST = 'FAST' # highlights using just the line, fast but not correct
352 # in the case of multiline code
353 HL_NONE = 'NONE' # no highlighting, fastest
354
355 def __init__(self, highlight_mode=HL_REAL, repo_name=None,
356 source_node_getter=lambda filename: None,
357 target_node_getter=lambda filename: None,
358 source_nodes=None, target_nodes=None,
359 max_file_size_limit=150 * 1024, # files over this size will
360 # use fast highlighting
361 comments=None,
362 ):
363
364 self.highlight_mode = highlight_mode
365 self.highlighted_filenodes = {}
366 self.source_node_getter = source_node_getter
367 self.target_node_getter = target_node_getter
368 self.source_nodes = source_nodes or {}
369 self.target_nodes = target_nodes or {}
370 self.repo_name = repo_name
371 self.comments = comments or {}
372 self.max_file_size_limit = max_file_size_limit
373
374 def render_patchset(self, patchset, source_ref=None, target_ref=None):
375 diffset = AttributeDict(dict(
376 lines_added=0,
377 lines_deleted=0,
378 changed_files=0,
379 files=[],
380 limited_diff=isinstance(patchset, LimitedDiffContainer),
381 repo_name=self.repo_name,
382 source_ref=source_ref,
383 target_ref=target_ref,
384 ))
385 for patch in patchset:
386 filediff = self.render_patch(patch)
387 filediff.diffset = diffset
388 diffset.files.append(filediff)
389 diffset.changed_files += 1
390 if not patch['stats']['binary']:
391 diffset.lines_added += patch['stats']['added']
392 diffset.lines_deleted += patch['stats']['deleted']
393
394 return diffset
395
396 _lexer_cache = {}
397 def _get_lexer_for_filename(self, filename):
398 # cached because we might need to call it twice for source/target
399 if filename not in self._lexer_cache:
400 self._lexer_cache[filename] = get_lexer_safe(filepath=filename)
401 return self._lexer_cache[filename]
402
403 def render_patch(self, patch):
404 log.debug('rendering diff for %r', patch['filename'])
405
406 source_filename = patch['original_filename']
407 target_filename = patch['filename']
408
409 source_lexer = plain_text_lexer
410 target_lexer = plain_text_lexer
411
412 if not patch['stats']['binary']:
413 if self.highlight_mode == self.HL_REAL:
414 if (source_filename and patch['operation'] in ('D', 'M')
415 and source_filename not in self.source_nodes):
416 self.source_nodes[source_filename] = (
417 self.source_node_getter(source_filename))
418
419 if (target_filename and patch['operation'] in ('A', 'M')
420 and target_filename not in self.target_nodes):
421 self.target_nodes[target_filename] = (
422 self.target_node_getter(target_filename))
423
424 elif self.highlight_mode == self.HL_FAST:
425 source_lexer = self._get_lexer_for_filename(source_filename)
426 target_lexer = self._get_lexer_for_filename(target_filename)
427
428 source_file = self.source_nodes.get(source_filename, source_filename)
429 target_file = self.target_nodes.get(target_filename, target_filename)
430
431 source_filenode, target_filenode = None, None
432
433 # TODO: dan: FileNode.lexer works on the content of the file - which
434 # can be slow - issue #4289 explains a lexer clean up - which once
435 # done can allow caching a lexer for a filenode to avoid the file lookup
436 if isinstance(source_file, FileNode):
437 source_filenode = source_file
438 source_lexer = source_file.lexer
439 if isinstance(target_file, FileNode):
440 target_filenode = target_file
441 target_lexer = target_file.lexer
442
443 source_file_path, target_file_path = None, None
444
445 if source_filename != '/dev/null':
446 source_file_path = source_filename
447 if target_filename != '/dev/null':
448 target_file_path = target_filename
449
450 source_file_type = source_lexer.name
451 target_file_type = target_lexer.name
452
453 op_hunks = patch['chunks'][0]
454 hunks = patch['chunks'][1:]
455
456 filediff = AttributeDict({
457 'source_file_path': source_file_path,
458 'target_file_path': target_file_path,
459 'source_filenode': source_filenode,
460 'target_filenode': target_filenode,
461 'hunks': [],
462 'source_file_type': source_file_type,
463 'target_file_type': target_file_type,
464 'patch': patch,
465 'source_mode': patch['stats']['old_mode'],
466 'target_mode': patch['stats']['new_mode'],
467 'limited_diff': isinstance(patch, LimitedDiffContainer),
468 'diffset': self,
469 })
470
471 for hunk in hunks:
472 hunkbit = self.parse_hunk(hunk, source_file, target_file)
473 hunkbit.filediff = filediff
474 filediff.hunks.append(hunkbit)
475 return filediff
476
477 def parse_hunk(self, hunk, source_file, target_file):
478 result = AttributeDict(dict(
479 source_start=hunk['source_start'],
480 source_length=hunk['source_length'],
481 target_start=hunk['target_start'],
482 target_length=hunk['target_length'],
483 section_header=hunk['section_header'],
484 lines=[],
485 ))
486 before, after = [], []
487
488 for line in hunk['lines']:
489 if line['action'] == 'unmod':
490 result.lines.extend(
491 self.parse_lines(before, after, source_file, target_file))
492 after.append(line)
493 before.append(line)
494 elif line['action'] == 'add':
495 after.append(line)
496 elif line['action'] == 'del':
497 before.append(line)
498 elif line['action'] == 'old-no-nl':
499 before.append(line)
500 elif line['action'] == 'new-no-nl':
501 after.append(line)
502
503 result.lines.extend(
504 self.parse_lines(before, after, source_file, target_file))
505 result.unified = self.as_unified(result.lines)
506 result.sideside = result.lines
507 return result
508
509 def parse_lines(self, before_lines, after_lines, source_file, target_file):
510 # TODO: dan: investigate doing the diff comparison and fast highlighting
511 # on the entire before and after buffered block lines rather than by
512 # line, this means we can get better 'fast' highlighting if the context
513 # allows it - eg.
514 # line 4: """
515 # line 5: this gets highlighted as a string
516 # line 6: """
517
518 lines = []
519 while before_lines or after_lines:
520 before, after = None, None
521 before_tokens, after_tokens = None, None
522
523 if before_lines:
524 before = before_lines.pop(0)
525 if after_lines:
526 after = after_lines.pop(0)
527
528 original = AttributeDict()
529 modified = AttributeDict()
530
531 if before:
532 if before['action'] == 'old-no-nl':
533 before_tokens = [('nonl', before['line'])]
534 else:
535 before_tokens = self.get_line_tokens(
536 line_text=before['line'], line_number=before['old_lineno'],
537 file=source_file)
538 original.lineno = before['old_lineno']
539 original.content = before['line']
540 original.action = self.action_to_op(before['action'])
541 original.comments = self.get_comments_for('old',
542 source_file, before['old_lineno'])
543
544 if after:
545 if after['action'] == 'new-no-nl':
546 after_tokens = [('nonl', after['line'])]
547 else:
548 after_tokens = self.get_line_tokens(
549 line_text=after['line'], line_number=after['new_lineno'],
550 file=target_file)
551 modified.lineno = after['new_lineno']
552 modified.content = after['line']
553 modified.action = self.action_to_op(after['action'])
554 modified.comments = self.get_comments_for('new',
555 target_file, after['new_lineno'])
556
557 # diff the lines
558 if before_tokens and after_tokens:
559 o_tokens, m_tokens, similarity = tokens_diff(
560 before_tokens, after_tokens)
561 original.content = render_tokenstream(o_tokens)
562 modified.content = render_tokenstream(m_tokens)
563 elif before_tokens:
564 original.content = render_tokenstream(
565 [(x[0], '', x[1]) for x in before_tokens])
566 elif after_tokens:
567 modified.content = render_tokenstream(
568 [(x[0], '', x[1]) for x in after_tokens])
569
570 lines.append(AttributeDict({
571 'original': original,
572 'modified': modified,
573 }))
574
575 return lines
576
577 def get_comments_for(self, version, file, line_number):
578 if hasattr(file, 'unicode_path'):
579 file = file.unicode_path
580
581 if not isinstance(file, basestring):
582 return None
583
584 line_key = {
585 'old': 'o',
586 'new': 'n',
587 }[version] + str(line_number)
588
589 return self.comments.get(file, {}).get(line_key)
590
591 def get_line_tokens(self, line_text, line_number, file=None):
592 filenode = None
593 filename = None
594
595 if isinstance(file, basestring):
596 filename = file
597 elif isinstance(file, FileNode):
598 filenode = file
599 filename = file.unicode_path
600
601 if self.highlight_mode == self.HL_REAL and filenode:
602 if line_number and filenode.size < self.max_file_size_limit:
603 return self.get_tokenized_filenode_line(filenode, line_number)
604
605 if self.highlight_mode in (self.HL_REAL, self.HL_FAST) and filename:
606 lexer = self._get_lexer_for_filename(filename)
607 return list(tokenize_string(line_text, lexer))
608
609 return list(tokenize_string(line_text, plain_text_lexer))
610
611 def get_tokenized_filenode_line(self, filenode, line_number):
612
613 if filenode not in self.highlighted_filenodes:
614 tokenized_lines = filenode_as_lines_tokens(filenode, filenode.lexer)
615 self.highlighted_filenodes[filenode] = tokenized_lines
616 return self.highlighted_filenodes[filenode][line_number - 1]
617
618 def action_to_op(self, action):
619 return {
620 'add': '+',
621 'del': '-',
622 'unmod': ' ',
623 'old-no-nl': ' ',
624 'new-no-nl': ' ',
625 }.get(action, action)
626
627 def as_unified(self, lines):
628 """ Return a generator that yields the lines of a diff in unified order """
629 def generator():
630 buf = []
631 for line in lines:
632
633 if buf and (not line.original or line.original.action == ' '):
634 for b in buf:
635 yield b
636 buf = []
637
638 if line.original:
639 if line.original.action == ' ':
640 yield (line.original.lineno, line.modified.lineno,
641 line.original.action, line.original.content,
642 line.original.comments)
643 continue
644
645 if line.original.action == '-':
646 yield (line.original.lineno, None,
647 line.original.action, line.original.content,
648 line.original.comments)
649
650 if line.modified.action == '+':
651 buf.append((
652 None, line.modified.lineno,
653 line.modified.action, line.modified.content,
654 line.modified.comments))
655 continue
656
657 if line.modified:
658 yield (None, line.modified.lineno,
659 line.modified.action, line.modified.content,
660 line.modified.comments)
661
662 for b in buf:
663 yield b
664
665 return generator()
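The buffering in ``as_unified`` above can be illustrated with a standalone sketch: deletions are yielded immediately, additions are buffered, and the buffer is flushed once a context (unmodified) line is reached, so unified order is ``-`` before ``+`` within a hunk. This is a simplified illustration using plain tuples and dicts instead of the ``AttributeDict`` line objects; the names here are illustrative, not part of the module.

```python
def as_unified(pairs):
    """pairs: list of (original, modified) dicts (or None), mirroring the
    side-by-side line structure. Yields (old_lineno, new_lineno, op, content)
    tuples in unified-diff order."""
    buf = []
    for original, modified in pairs:
        # flush buffered additions when we hit a context line (or a line
        # with no original side)
        if buf and (not original or original['op'] == ' '):
            for b in buf:
                yield b
            buf = []
        if original:
            if original['op'] == ' ':
                yield (original['lineno'], modified['lineno'], ' ',
                       original['content'])
                continue
            if original['op'] == '-':
                # deletions go out immediately
                yield (original['lineno'], None, '-', original['content'])
            if modified and modified['op'] == '+':
                # additions wait until the next context line
                buf.append((None, modified['lineno'], '+',
                            modified['content']))
            continue
        if modified:
            yield (None, modified['lineno'], '+', modified['content'])
    for b in buf:
        yield b


pairs = [
    ({'op': ' ', 'lineno': 1, 'content': 'ctx'},
     {'op': ' ', 'lineno': 1, 'content': 'ctx'}),
    ({'op': '-', 'lineno': 2, 'content': 'old'},
     {'op': '+', 'lineno': 2, 'content': 'new'}),
    ({'op': ' ', 'lineno': 3, 'content': 'ctx2'},
     {'op': ' ', 'lineno': 3, 'content': 'ctx2'}),
]
result = list(as_unified(pairs))
```

For the replaced line above, this yields the ``-`` row, then the buffered ``+`` row, then the trailing context row, matching unified-diff ordering.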
This diff has been collapsed as it changes many lines, (3640 lines changed)
@@ -0,0 +1,3640 b''
1 # -*- coding: utf-8 -*-
2
3 # Copyright (C) 2010-2016 RhodeCode GmbH
4 #
5 # This program is free software: you can redistribute it and/or modify
6 # it under the terms of the GNU Affero General Public License, version 3
7 # (only), as published by the Free Software Foundation.
8 #
9 # This program is distributed in the hope that it will be useful,
10 # but WITHOUT ANY WARRANTY; without even the implied warranty of
11 # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
12 # GNU General Public License for more details.
13 #
14 # You should have received a copy of the GNU Affero General Public License
15 # along with this program. If not, see <http://www.gnu.org/licenses/>.
16 #
17 # This program is dual-licensed. If you wish to learn more about the
18 # RhodeCode Enterprise Edition, including its added features, Support services,
19 # and proprietary license terms, please see https://rhodecode.com/licenses/
20
21 """
22 Database Models for RhodeCode Enterprise
23 """
24
25 import re
26 import os
27 import sys
28 import time
29 import hashlib
30 import logging
31 import datetime
32 import warnings
33 import ipaddress
34 import functools
35 import traceback
36 import collections
37
38
39 from sqlalchemy import *
40 from sqlalchemy.exc import IntegrityError
41 from sqlalchemy.ext.declarative import declared_attr
42 from sqlalchemy.ext.hybrid import hybrid_property
43 from sqlalchemy.orm import (
44 relationship, joinedload, class_mapper, validates, aliased)
45 from sqlalchemy.sql.expression import true
46 from beaker.cache import cache_region, region_invalidate
47 from webob.exc import HTTPNotFound
48 from zope.cachedescriptors.property import Lazy as LazyProperty
49
50 from pylons import url
51 from pylons.i18n.translation import lazy_ugettext as _
52
53 from rhodecode.lib.vcs import get_backend, get_vcs_instance
54 from rhodecode.lib.vcs.utils.helpers import get_scm
55 from rhodecode.lib.vcs.exceptions import VCSError
56 from rhodecode.lib.vcs.backends.base import (
57 EmptyCommit, Reference, MergeFailureReason)
58 from rhodecode.lib.utils2 import (
59 str2bool, safe_str, get_commit_safe, safe_unicode, remove_prefix, md5_safe,
60 time_to_datetime, aslist, Optional, safe_int, get_clone_url, AttributeDict,
61 glob2re)
62 from rhodecode.lib.jsonalchemy import MutationObj, JsonType, JSONDict
63 from rhodecode.lib.ext_json import json
64 from rhodecode.lib.caching_query import FromCache
65 from rhodecode.lib.encrypt import AESCipher
66
67 from rhodecode.model.meta import Base, Session
68
69 URL_SEP = '/'
70 log = logging.getLogger(__name__)
71
72 # =============================================================================
73 # BASE CLASSES
74 # =============================================================================
75
76 # this is propagated from .ini file rhodecode.encrypted_values.secret or
77 # beaker.session.secret if first is not set.
78 # and initialized at environment.py
79 ENCRYPTION_KEY = None
80
81 # used to sort permissions by types, '#' used here is not allowed to be in
82 # usernames, and it's very early in sorted string.printable table.
83 PERMISSION_TYPE_SORT = {
84 'admin': '####',
85 'write': '###',
86 'read': '##',
87 'none': '#',
88 }
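The ``#``-prefix trick in ``PERMISSION_TYPE_SORT`` relies on ``'#'`` (0x23) sorting before ASCII digits and letters, so more ``#`` characters means an earlier position. A minimal, self-contained illustration using plain tuples instead of permission model objects (the usernames and permission strings here are made up for the example):

```python
# Prefixing usernames with runs of '#' makes higher permission levels
# sort first, because '#' orders before alphanumerics in ASCII.
PERMISSION_TYPE_SORT = {
    'admin': '####',
    'write': '###',
    'read': '##',
    'none': '#',
}

def sort_key(username, permission):
    # permission strings look like 'repository.write'; we key on the
    # last dotted segment
    prefix = PERMISSION_TYPE_SORT.get(permission.split('.')[-1], '')
    return prefix + username

entries = [
    ('zoe', 'repository.read'),
    ('adam', 'repository.none'),
    ('mia', 'repository.admin'),
]
ordered = sorted(entries, key=lambda e: sort_key(*e))
```

Despite 'adam' sorting first alphabetically, the admin entry comes out first and the 'none' entry last.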
89
90
91 def display_sort(obj):
92 """
93 Sort function used to sort permissions in .permissions() function of
94 Repository, RepoGroup, UserGroup. Also it puts the default user in front
95 of all other resources
96 """
97
98 if obj.username == User.DEFAULT_USER:
99 return '#####'
100 prefix = PERMISSION_TYPE_SORT.get(obj.permission.split('.')[-1], '')
101 return prefix + obj.username
102
103
104 def _hash_key(k):
105 return md5_safe(k)
106
107
108 class EncryptedTextValue(TypeDecorator):
109 """
110 Special column for encrypted long text data, use like::
111
112 value = Column("encrypted_value", EncryptedValue(), nullable=False)
113
114 This column is intelligent so if the value is in unencrypted form it returns
115 the unencrypted form, but on save it always encrypts
116 """
117 impl = Text
118
119 def process_bind_param(self, value, dialect):
120 if not value:
121 return value
122 if value.startswith('enc$aes$') or value.startswith('enc$aes_hmac$'):
123 # protect against double encrypting if someone manually starts
124 # doing
125 raise ValueError('value needs to be in unencrypted format, ie. '
126 'not starting with enc$aes')
127 return 'enc$aes_hmac$%s' % AESCipher(
128 ENCRYPTION_KEY, hmac=True).encrypt(value)
129
130 def process_result_value(self, value, dialect):
131 import rhodecode
132
133 if not value:
134 return value
135
136 parts = value.split('$', 3)
137 if len(parts) != 3:
138 # probably not encrypted values
139 return value
140 else:
141 if parts[0] != 'enc':
142 # parts ok but without our header ?
143 return value
144 enc_strict_mode = str2bool(rhodecode.CONFIG.get(
145 'rhodecode.encrypted_values.strict') or True)
146 # at that stage we know it's our encryption
147 if parts[1] == 'aes':
148 decrypted_data = AESCipher(ENCRYPTION_KEY).decrypt(parts[2])
149 elif parts[1] == 'aes_hmac':
150 decrypted_data = AESCipher(
151 ENCRYPTION_KEY, hmac=True,
152 strict_verification=enc_strict_mode).decrypt(parts[2])
153 else:
154 raise ValueError(
155 'Encryption type part is wrong, must be `aes` '
156 'or `aes_hmac`, got `%s` instead' % (parts[1]))
157 return decrypted_data
158
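The ``enc$<algo>$<payload>`` envelope handled by ``process_result_value`` above can be sketched without the AES machinery. The following is a minimal sketch of the header parsing only; the actual ``AESCipher`` decryption is deliberately left out, and ``parse_envelope`` is an illustrative name, not part of the module:

```python
def parse_envelope(value):
    """Return (algo, payload) for values in our 'enc$<algo>$<payload>'
    envelope, or None when the value looks like plaintext."""
    if not value:
        return None
    parts = value.split('$', 3)
    if len(parts) != 3 or parts[0] != 'enc':
        # probably not an encrypted value, pass it through as plaintext
        return None
    algo, payload = parts[1], parts[2]
    if algo not in ('aes', 'aes_hmac'):
        raise ValueError(
            'Encryption type part is wrong, must be `aes` '
            'or `aes_hmac`, got `%s` instead' % algo)
    return algo, payload
```

Plaintext and empty values fall through untouched, mirroring how the column stays readable for legacy, unencrypted rows.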
159
160 class BaseModel(object):
161 """
162 Base Model for all classes
163 """
164
165 @classmethod
166 def _get_keys(cls):
167 """return column names for this model """
168 return class_mapper(cls).c.keys()
169
170 def get_dict(self):
171 """
172 return dict with keys and values corresponding
173 to this model data """
174
175 d = {}
176 for k in self._get_keys():
177 d[k] = getattr(self, k)
178
179 # also use __json__() if present to get additional fields
180 _json_attr = getattr(self, '__json__', None)
181 if _json_attr:
182 # update with attributes from __json__
183 if callable(_json_attr):
184 _json_attr = _json_attr()
185 for k, val in _json_attr.iteritems():
186 d[k] = val
187 return d
188
189 def get_appstruct(self):
190 """return list with keys and values tuples corresponding
191 to this model data """
192
193 l = []
194 for k in self._get_keys():
195 l.append((k, getattr(self, k),))
196 return l
197
198 def populate_obj(self, populate_dict):
199 """populate model with data from given populate_dict"""
200
201 for k in self._get_keys():
202 if k in populate_dict:
203 setattr(self, k, populate_dict[k])
204
205 @classmethod
206 def query(cls):
207 return Session().query(cls)
208
209 @classmethod
210 def get(cls, id_):
211 if id_:
212 return cls.query().get(id_)
213
214 @classmethod
215 def get_or_404(cls, id_):
216 try:
217 id_ = int(id_)
218 except (TypeError, ValueError):
219 raise HTTPNotFound
220
221 res = cls.query().get(id_)
222 if not res:
223 raise HTTPNotFound
224 return res
225
226 @classmethod
227 def getAll(cls):
228 # deprecated and left for backward compatibility
229 return cls.get_all()
230
231 @classmethod
232 def get_all(cls):
233 return cls.query().all()
234
235 @classmethod
236 def delete(cls, id_):
237 obj = cls.query().get(id_)
238 Session().delete(obj)
239
240 @classmethod
241 def identity_cache(cls, session, attr_name, value):
242 exist_in_session = []
243 for (item_cls, pkey), instance in session.identity_map.items():
244 if cls == item_cls and getattr(instance, attr_name) == value:
245 exist_in_session.append(instance)
246 if exist_in_session:
247 if len(exist_in_session) == 1:
248 return exist_in_session[0]
249 log.exception(
250 'multiple objects with attr %s and '
251 'value %s found with same name: %r',
252 attr_name, value, exist_in_session)
253
254 def __repr__(self):
255 if hasattr(self, '__unicode__'):
256 # python repr needs to return str
257 try:
258 return safe_str(self.__unicode__())
259 except UnicodeDecodeError:
260 pass
261 return '<DB:%s>' % (self.__class__.__name__)
262
263
264 class RhodeCodeSetting(Base, BaseModel):
265 __tablename__ = 'rhodecode_settings'
266 __table_args__ = (
267 UniqueConstraint('app_settings_name'),
268 {'extend_existing': True, 'mysql_engine': 'InnoDB',
269 'mysql_charset': 'utf8', 'sqlite_autoincrement': True}
270 )
271
272 SETTINGS_TYPES = {
273 'str': safe_str,
274 'int': safe_int,
275 'unicode': safe_unicode,
276 'bool': str2bool,
277 'list': functools.partial(aslist, sep=',')
278 }
279 DEFAULT_UPDATE_URL = 'https://rhodecode.com/api/v1/info/versions'
280 GLOBAL_CONF_KEY = 'app_settings'
281
282 app_settings_id = Column("app_settings_id", Integer(), nullable=False, unique=True, default=None, primary_key=True)
283 app_settings_name = Column("app_settings_name", String(255), nullable=True, unique=None, default=None)
284 _app_settings_value = Column("app_settings_value", String(4096), nullable=True, unique=None, default=None)
285 _app_settings_type = Column("app_settings_type", String(255), nullable=True, unique=None, default=None)
286
287 def __init__(self, key='', val='', type='unicode'):
288 self.app_settings_name = key
289 self.app_settings_type = type
290 self.app_settings_value = val
291
292 @validates('_app_settings_value')
293 def validate_settings_value(self, key, val):
294 assert type(val) == unicode
295 return val
296
297 @hybrid_property
298 def app_settings_value(self):
299 v = self._app_settings_value
300 _type = self.app_settings_type
301 if _type:
302 _type = self.app_settings_type.split('.')[0]
303 # decode the encrypted value
304 if 'encrypted' in self.app_settings_type:
305 cipher = EncryptedTextValue()
306 v = safe_unicode(cipher.process_result_value(v, None))
307
308 converter = self.SETTINGS_TYPES.get(_type) or \
309 self.SETTINGS_TYPES['unicode']
310 return converter(v)
311
312 @app_settings_value.setter
313 def app_settings_value(self, val):
314 """
315 Setter that will always make sure we use unicode in app_settings_value
316
317 :param val:
318 """
319 val = safe_unicode(val)
320 # encode the encrypted value
321 if 'encrypted' in self.app_settings_type:
322 cipher = EncryptedTextValue()
323 val = safe_unicode(cipher.process_bind_param(val, None))
324 self._app_settings_value = val
325
326 @hybrid_property
327 def app_settings_type(self):
328 return self._app_settings_type
329
330 @app_settings_type.setter
331 def app_settings_type(self, val):
332 if val.split('.')[0] not in self.SETTINGS_TYPES:
333 raise Exception('type must be one of %s got %s'
334 % (self.SETTINGS_TYPES.keys(), val))
335 self._app_settings_type = val
336
337 def __unicode__(self):
338 return u"<%s('%s:%s[%s]')>" % (
339 self.__class__.__name__,
340 self.app_settings_name, self.app_settings_value,
341 self.app_settings_type
342 )
343
344
345 class RhodeCodeUi(Base, BaseModel):
346 __tablename__ = 'rhodecode_ui'
347 __table_args__ = (
348 UniqueConstraint('ui_key'),
349 {'extend_existing': True, 'mysql_engine': 'InnoDB',
350 'mysql_charset': 'utf8', 'sqlite_autoincrement': True}
351 )
352
353 HOOK_REPO_SIZE = 'changegroup.repo_size'
354 # HG
355 HOOK_PRE_PULL = 'preoutgoing.pre_pull'
356 HOOK_PULL = 'outgoing.pull_logger'
357 HOOK_PRE_PUSH = 'prechangegroup.pre_push'
358 HOOK_PUSH = 'changegroup.push_logger'
359
360 # TODO: johbo: Unify way how hooks are configured for git and hg,
361 # git part is currently hardcoded.
362
363 # SVN PATTERNS
364 SVN_BRANCH_ID = 'vcs_svn_branch'
365 SVN_TAG_ID = 'vcs_svn_tag'
366
367 ui_id = Column(
368 "ui_id", Integer(), nullable=False, unique=True, default=None,
369 primary_key=True)
370 ui_section = Column(
371 "ui_section", String(255), nullable=True, unique=None, default=None)
372 ui_key = Column(
373 "ui_key", String(255), nullable=True, unique=None, default=None)
374 ui_value = Column(
375 "ui_value", String(255), nullable=True, unique=None, default=None)
376 ui_active = Column(
377 "ui_active", Boolean(), nullable=True, unique=None, default=True)
378
379 def __repr__(self):
380 return '<%s[%s]%s=>%s]>' % (self.__class__.__name__, self.ui_section,
381 self.ui_key, self.ui_value)
382
383
384 class RepoRhodeCodeSetting(Base, BaseModel):
385 __tablename__ = 'repo_rhodecode_settings'
386 __table_args__ = (
387 UniqueConstraint(
388 'app_settings_name', 'repository_id',
389 name='uq_repo_rhodecode_setting_name_repo_id'),
390 {'extend_existing': True, 'mysql_engine': 'InnoDB',
391 'mysql_charset': 'utf8', 'sqlite_autoincrement': True}
392 )
393
394 repository_id = Column(
395 "repository_id", Integer(), ForeignKey('repositories.repo_id'),
396 nullable=False)
397 app_settings_id = Column(
398 "app_settings_id", Integer(), nullable=False, unique=True,
399 default=None, primary_key=True)
400 app_settings_name = Column(
401 "app_settings_name", String(255), nullable=True, unique=None,
402 default=None)
403 _app_settings_value = Column(
404 "app_settings_value", String(4096), nullable=True, unique=None,
405 default=None)
406 _app_settings_type = Column(
407 "app_settings_type", String(255), nullable=True, unique=None,
408 default=None)
409
410 repository = relationship('Repository')
411
412 def __init__(self, repository_id, key='', val='', type='unicode'):
413 self.repository_id = repository_id
414 self.app_settings_name = key
415 self.app_settings_type = type
416 self.app_settings_value = val
417
418 @validates('_app_settings_value')
419 def validate_settings_value(self, key, val):
420 assert type(val) == unicode
421 return val
422
423 @hybrid_property
424 def app_settings_value(self):
425 v = self._app_settings_value
426 type_ = self.app_settings_type
427 SETTINGS_TYPES = RhodeCodeSetting.SETTINGS_TYPES
428 converter = SETTINGS_TYPES.get(type_) or SETTINGS_TYPES['unicode']
429 return converter(v)
430
431 @app_settings_value.setter
432 def app_settings_value(self, val):
433 """
434 Setter that will always make sure we use unicode in app_settings_value
435
436 :param val:
437 """
438 self._app_settings_value = safe_unicode(val)
439
440 @hybrid_property
441 def app_settings_type(self):
442 return self._app_settings_type
443
444 @app_settings_type.setter
445 def app_settings_type(self, val):
446 SETTINGS_TYPES = RhodeCodeSetting.SETTINGS_TYPES
447 if val not in SETTINGS_TYPES:
448 raise Exception('type must be one of %s got %s'
449 % (SETTINGS_TYPES.keys(), val))
450 self._app_settings_type = val
451
452 def __unicode__(self):
453 return u"<%s('%s:%s:%s[%s]')>" % (
454 self.__class__.__name__, self.repository.repo_name,
455 self.app_settings_name, self.app_settings_value,
456 self.app_settings_type
457 )
458
459
460 class RepoRhodeCodeUi(Base, BaseModel):
461 __tablename__ = 'repo_rhodecode_ui'
462 __table_args__ = (
463 UniqueConstraint(
464 'repository_id', 'ui_section', 'ui_key',
465 name='uq_repo_rhodecode_ui_repository_id_section_key'),
466 {'extend_existing': True, 'mysql_engine': 'InnoDB',
467 'mysql_charset': 'utf8', 'sqlite_autoincrement': True}
468 )
469
470 repository_id = Column(
471 "repository_id", Integer(), ForeignKey('repositories.repo_id'),
472 nullable=False)
473 ui_id = Column(
474 "ui_id", Integer(), nullable=False, unique=True, default=None,
475 primary_key=True)
476 ui_section = Column(
477 "ui_section", String(255), nullable=True, unique=None, default=None)
478 ui_key = Column(
479 "ui_key", String(255), nullable=True, unique=None, default=None)
480 ui_value = Column(
481 "ui_value", String(255), nullable=True, unique=None, default=None)
482 ui_active = Column(
483 "ui_active", Boolean(), nullable=True, unique=None, default=True)
484
485 repository = relationship('Repository')
486
487 def __repr__(self):
488 return '<%s[%s:%s]%s=>%s]>' % (
489 self.__class__.__name__, self.repository.repo_name,
490 self.ui_section, self.ui_key, self.ui_value)
491
492
493 class User(Base, BaseModel):
494 __tablename__ = 'users'
495 __table_args__ = (
496 UniqueConstraint('username'), UniqueConstraint('email'),
497 Index('u_username_idx', 'username'),
498 Index('u_email_idx', 'email'),
499 {'extend_existing': True, 'mysql_engine': 'InnoDB',
500 'mysql_charset': 'utf8', 'sqlite_autoincrement': True}
501 )
502 DEFAULT_USER = 'default'
503 DEFAULT_USER_EMAIL = 'anonymous@rhodecode.org'
504 DEFAULT_GRAVATAR_URL = 'https://secure.gravatar.com/avatar/{md5email}?d=identicon&s={size}'
505
506 user_id = Column("user_id", Integer(), nullable=False, unique=True, default=None, primary_key=True)
507 username = Column("username", String(255), nullable=True, unique=None, default=None)
508 password = Column("password", String(255), nullable=True, unique=None, default=None)
509 active = Column("active", Boolean(), nullable=True, unique=None, default=True)
510 admin = Column("admin", Boolean(), nullable=True, unique=None, default=False)
511 name = Column("firstname", String(255), nullable=True, unique=None, default=None)
512 lastname = Column("lastname", String(255), nullable=True, unique=None, default=None)
513 _email = Column("email", String(255), nullable=True, unique=None, default=None)
514 last_login = Column("last_login", DateTime(timezone=False), nullable=True, unique=None, default=None)
515 extern_type = Column("extern_type", String(255), nullable=True, unique=None, default=None)
516 extern_name = Column("extern_name", String(255), nullable=True, unique=None, default=None)
517 api_key = Column("api_key", String(255), nullable=True, unique=None, default=None)
518 inherit_default_permissions = Column("inherit_default_permissions", Boolean(), nullable=False, unique=None, default=True)
519 created_on = Column('created_on', DateTime(timezone=False), nullable=False, default=datetime.datetime.now)
520 _user_data = Column("user_data", LargeBinary(), nullable=True) # JSON data
521
522 user_log = relationship('UserLog')
523 user_perms = relationship('UserToPerm', primaryjoin="User.user_id==UserToPerm.user_id", cascade='all')
524
525 repositories = relationship('Repository')
526 repository_groups = relationship('RepoGroup')
527 user_groups = relationship('UserGroup')
528
529 user_followers = relationship('UserFollowing', primaryjoin='UserFollowing.follows_user_id==User.user_id', cascade='all')
530 followings = relationship('UserFollowing', primaryjoin='UserFollowing.user_id==User.user_id', cascade='all')
531
532 repo_to_perm = relationship('UserRepoToPerm', primaryjoin='UserRepoToPerm.user_id==User.user_id', cascade='all')
533 repo_group_to_perm = relationship('UserRepoGroupToPerm', primaryjoin='UserRepoGroupToPerm.user_id==User.user_id', cascade='all')
534 user_group_to_perm = relationship('UserUserGroupToPerm', primaryjoin='UserUserGroupToPerm.user_id==User.user_id', cascade='all')
535
536 group_member = relationship('UserGroupMember', cascade='all')
537
538 notifications = relationship('UserNotification', cascade='all')
539 # notifications assigned to this user
540 user_created_notifications = relationship('Notification', cascade='all')
541 # comments created by this user
542 user_comments = relationship('ChangesetComment', cascade='all')
543 # user profile extra info
544 user_emails = relationship('UserEmailMap', cascade='all')
545 user_ip_map = relationship('UserIpMap', cascade='all')
546 user_auth_tokens = relationship('UserApiKeys', cascade='all')
547 # gists
548 user_gists = relationship('Gist', cascade='all')
549 # user pull requests
550 user_pull_requests = relationship('PullRequest', cascade='all')
551 # external identities
552 extenal_identities = relationship(
553 'ExternalIdentity',
554 primaryjoin="User.user_id==ExternalIdentity.local_user_id",
555 cascade='all')
556
557 def __unicode__(self):
558 return u"<%s('id:%s:%s')>" % (self.__class__.__name__,
559 self.user_id, self.username)
560
561 @hybrid_property
562 def email(self):
563 return self._email
564
565 @email.setter
566 def email(self, val):
567 self._email = val.lower() if val else None
568
569 @property
570 def firstname(self):
571 # alias for future
572 return self.name
573
574 @property
575 def emails(self):
576 other = UserEmailMap.query().filter(UserEmailMap.user==self).all()
577 return [self.email] + [x.email for x in other]
578
579 @property
580 def auth_tokens(self):
581 return [self.api_key] + [x.api_key for x in self.extra_auth_tokens]
582
583 @property
584 def extra_auth_tokens(self):
585 return UserApiKeys.query().filter(UserApiKeys.user == self).all()
586
587 @property
588 def feed_token(self):
589 feed_tokens = UserApiKeys.query()\
590 .filter(UserApiKeys.user == self)\
591 .filter(UserApiKeys.role == UserApiKeys.ROLE_FEED)\
592 .all()
593 if feed_tokens:
594 return feed_tokens[0].api_key
595 else:
596 # use the main token so we don't end up with nothing...
597 return self.api_key
598
599 @classmethod
600 def extra_valid_auth_tokens(cls, user, role=None):
601 tokens = UserApiKeys.query().filter(UserApiKeys.user == user)\
602 .filter(or_(UserApiKeys.expires == -1,
603 UserApiKeys.expires >= time.time()))
604 if role:
605 tokens = tokens.filter(or_(UserApiKeys.role == role,
606 UserApiKeys.role == UserApiKeys.ROLE_ALL))
607 return tokens.all()
608
609 @property
610 def ip_addresses(self):
611 ret = UserIpMap.query().filter(UserIpMap.user == self).all()
612 return [x.ip_addr for x in ret]
613
614 @property
615 def username_and_name(self):
616 return '%s (%s %s)' % (self.username, self.firstname, self.lastname)
617
618 @property
619 def username_or_name_or_email(self):
620 full_name = self.full_name if self.full_name != ' ' else None
621 return self.username or full_name or self.email
622
623 @property
624 def full_name(self):
625 return '%s %s' % (self.firstname, self.lastname)
626
627 @property
628 def full_name_or_username(self):
629 return ('%s %s' % (self.firstname, self.lastname)
630 if (self.firstname and self.lastname) else self.username)
631
632 @property
633 def full_contact(self):
634 return '%s %s <%s>' % (self.firstname, self.lastname, self.email)
635
636 @property
637 def short_contact(self):
638 return '%s %s' % (self.firstname, self.lastname)
639
640 @property
641 def is_admin(self):
642 return self.admin
643
644 @property
645 def AuthUser(self):
646 """
647 Returns instance of AuthUser for this user
648 """
649 from rhodecode.lib.auth import AuthUser
650 return AuthUser(user_id=self.user_id, api_key=self.api_key,
651 username=self.username)
652
653 @hybrid_property
654 def user_data(self):
655 if not self._user_data:
656 return {}
657
658 try:
659 return json.loads(self._user_data)
660 except TypeError:
661 return {}
662
663 @user_data.setter
664 def user_data(self, val):
665 if not isinstance(val, dict):
666 raise Exception('user_data must be dict, got %s' % type(val))
667 try:
668 self._user_data = json.dumps(val)
669 except Exception:
670 log.error(traceback.format_exc())
671
672 @classmethod
673 def get_by_username(cls, username, case_insensitive=False,
674 cache=False, identity_cache=False):
675 session = Session()
676
677 if case_insensitive:
678 q = cls.query().filter(
679 func.lower(cls.username) == func.lower(username))
680 else:
681 q = cls.query().filter(cls.username == username)
682
683 if cache:
684 if identity_cache:
685 val = cls.identity_cache(session, 'username', username)
686 if val:
687 return val
688 else:
689 q = q.options(
690 FromCache("sql_cache_short",
691 "get_user_by_name_%s" % _hash_key(username)))
692
693 return q.scalar()
694
695 @classmethod
696 def get_by_auth_token(cls, auth_token, cache=False, fallback=True):
697 q = cls.query().filter(cls.api_key == auth_token)
698
699 if cache:
700 q = q.options(FromCache("sql_cache_short",
701 "get_auth_token_%s" % auth_token))
702 res = q.scalar()
703
704 if fallback and not res:
705 # fallback to additional keys
706 _res = UserApiKeys.query()\
707 .filter(UserApiKeys.api_key == auth_token)\
708 .filter(or_(UserApiKeys.expires == -1,
709 UserApiKeys.expires >= time.time()))\
710 .first()
711 if _res:
712 res = _res.user
713 return res
714
715 @classmethod
716 def get_by_email(cls, email, case_insensitive=False, cache=False):
717
718 if case_insensitive:
719 q = cls.query().filter(func.lower(cls.email) == func.lower(email))
720
721 else:
722 q = cls.query().filter(cls.email == email)
723
724 if cache:
725 q = q.options(FromCache("sql_cache_short",
726 "get_email_key_%s" % _hash_key(email)))
727
728 ret = q.scalar()
729 if ret is None:
730 q = UserEmailMap.query()
731 # try fetching in alternate email map
732 if case_insensitive:
733 q = q.filter(func.lower(UserEmailMap.email) == func.lower(email))
734 else:
735 q = q.filter(UserEmailMap.email == email)
736 q = q.options(joinedload(UserEmailMap.user))
737 if cache:
738 q = q.options(FromCache("sql_cache_short",
739 "get_email_map_key_%s" % email))
740 ret = getattr(q.scalar(), 'user', None)
741
742 return ret
743
744 @classmethod
745 def get_from_cs_author(cls, author):
746 """
747 Tries to get User objects out of commit author string
748
749 :param author:
750 """
751 from rhodecode.lib.helpers import email, author_name
752 # Valid email in the attribute passed, see if they're in the system
753 _email = email(author)
754 if _email:
755 user = cls.get_by_email(_email, case_insensitive=True)
756 if user:
757 return user
758 # Maybe we can match by username?
759 _author = author_name(author)
760 user = cls.get_by_username(_author, case_insensitive=True)
761 if user:
762 return user
763
764 def update_userdata(self, **kwargs):
765 usr = self
766 old = usr.user_data
767 old.update(**kwargs)
768 usr.user_data = old
769 Session().add(usr)
770 log.debug('updated userdata with %s', kwargs)
771
772 def update_lastlogin(self):
773 """Update user lastlogin"""
774 self.last_login = datetime.datetime.now()
775 Session().add(self)
776 log.debug('updated user %s lastlogin', self.username)
777
778 def update_lastactivity(self):
779 """Update user lastactivity"""
780 usr = self
781 old = usr.user_data
782 old.update({'last_activity': time.time()})
783 usr.user_data = old
784 Session().add(usr)
785 log.debug('updated user %s lastactivity', usr.username)
786
787 def update_password(self, new_password, change_api_key=False):
788 from rhodecode.lib.auth import get_crypt_password, generate_auth_token
789
790 self.password = get_crypt_password(new_password)
791 if change_api_key:
792 self.api_key = generate_auth_token(self.username)
793 Session().add(self)
794
795 @classmethod
796 def get_first_super_admin(cls):
797 user = User.query().filter(User.admin == true()).first()
798 if user is None:
799 raise Exception('FATAL: Missing administrative account!')
800 return user
801
802 @classmethod
803 def get_all_super_admins(cls):
804 """
805 Returns all admin accounts sorted by username
806 """
807 return User.query().filter(User.admin == true())\
808 .order_by(User.username.asc()).all()
809
810 @classmethod
811 def get_default_user(cls, cache=False):
812 user = User.get_by_username(User.DEFAULT_USER, cache=cache)
813 if user is None:
814 raise Exception('FATAL: Missing default account!')
815 return user
816
817 def _get_default_perms(self, user, suffix=''):
818 from rhodecode.model.permission import PermissionModel
819 return PermissionModel().get_default_perms(user.user_perms, suffix)
820
821 def get_default_perms(self, suffix=''):
822 return self._get_default_perms(self, suffix)
823
824 def get_api_data(self, include_secrets=False, details='full'):
825 """
826 Common function for generating user related data for API
827
828 :param include_secrets: By default secrets in the API data will be replaced
829 by a placeholder value to prevent exposing this data by accident. In case
830 this data shall be exposed, set this flag to ``True``.
831
832 :param details: can be 'basic' or 'full'; 'basic' gives only a subset of
833 the available user information: user_id, name and emails.
834 """
835 user = self
836 user_data = self.user_data
837 data = {
838 'user_id': user.user_id,
839 'username': user.username,
840 'firstname': user.name,
841 'lastname': user.lastname,
842 'email': user.email,
843 'emails': user.emails,
844 }
845 if details == 'basic':
846 return data
847
848 api_key_length = 40
849 api_key_replacement = '*' * api_key_length
850
851 extras = {
852 'api_key': api_key_replacement,
853 'api_keys': [api_key_replacement],
854 'active': user.active,
855 'admin': user.admin,
856 'extern_type': user.extern_type,
857 'extern_name': user.extern_name,
858 'last_login': user.last_login,
859 'ip_addresses': user.ip_addresses,
860 'language': user_data.get('language')
861 }
862 data.update(extras)
863
864 if include_secrets:
865 data['api_key'] = user.api_key
866 data['api_keys'] = user.auth_tokens
867 return data
868
869 def __json__(self):
870 data = {
871 'full_name': self.full_name,
872 'full_name_or_username': self.full_name_or_username,
873 'short_contact': self.short_contact,
874 'full_contact': self.full_contact,
875 }
876 data.update(self.get_api_data())
877 return data
878
879
880 class UserApiKeys(Base, BaseModel):
881 __tablename__ = 'user_api_keys'
882 __table_args__ = (
883 Index('uak_api_key_idx', 'api_key'),
884 Index('uak_api_key_expires_idx', 'api_key', 'expires'),
885 UniqueConstraint('api_key'),
886 {'extend_existing': True, 'mysql_engine': 'InnoDB',
887 'mysql_charset': 'utf8', 'sqlite_autoincrement': True}
888 )
889 __mapper_args__ = {}
890
891 # ApiKey role
892 ROLE_ALL = 'token_role_all'
893 ROLE_HTTP = 'token_role_http'
894 ROLE_VCS = 'token_role_vcs'
895 ROLE_API = 'token_role_api'
896 ROLE_FEED = 'token_role_feed'
897 ROLES = [ROLE_ALL, ROLE_HTTP, ROLE_VCS, ROLE_API, ROLE_FEED]
898
899 user_api_key_id = Column("user_api_key_id", Integer(), nullable=False, unique=True, default=None, primary_key=True)
900 user_id = Column("user_id", Integer(), ForeignKey('users.user_id'), nullable=True, unique=None, default=None)
901 api_key = Column("api_key", String(255), nullable=False, unique=True)
902 description = Column('description', UnicodeText().with_variant(UnicodeText(1024), 'mysql'))
903 expires = Column('expires', Float(53), nullable=False)
904 role = Column('role', String(255), nullable=True)
905 created_on = Column('created_on', DateTime(timezone=False), nullable=False, default=datetime.datetime.now)
906
907 user = relationship('User', lazy='joined')
908
909 @classmethod
910 def _get_role_name(cls, role):
911 return {
912 cls.ROLE_ALL: _('all'),
913 cls.ROLE_HTTP: _('http/web interface'),
914 cls.ROLE_VCS: _('vcs (git/hg/svn protocol)'),
915 cls.ROLE_API: _('api calls'),
916 cls.ROLE_FEED: _('feed access'),
917 }.get(role, role)
918
919 @property
920 def expired(self):
921 if self.expires == -1:
922 return False
923 return time.time() > self.expires
924
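A standalone sketch of the `expired` logic above, assuming the same `-1` sentinel semantics; the names here are illustrative, not part of the model:

```python
import time

UNLIMITED = -1  # sentinel stored in `expires`: the token never expires

def is_expired(expires):
    # mirror of the `expired` property: -1 disables expiry checks,
    # anything else is compared against the current unix timestamp
    if expires == UNLIMITED:
        return False
    return time.time() > expires

print(is_expired(UNLIMITED))            # False
print(is_expired(time.time() - 10))     # True (ten seconds in the past)
print(is_expired(time.time() + 3600))   # False (an hour from now)
```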
925 @property
926 def role_humanized(self):
927 return self._get_role_name(self.role)
928
929
930 class UserEmailMap(Base, BaseModel):
931 __tablename__ = 'user_email_map'
932 __table_args__ = (
933 Index('uem_email_idx', 'email'),
934 UniqueConstraint('email'),
935 {'extend_existing': True, 'mysql_engine': 'InnoDB',
936 'mysql_charset': 'utf8', 'sqlite_autoincrement': True}
937 )
938 __mapper_args__ = {}
939
940 email_id = Column("email_id", Integer(), nullable=False, unique=True, default=None, primary_key=True)
941 user_id = Column("user_id", Integer(), ForeignKey('users.user_id'), nullable=True, unique=None, default=None)
942 _email = Column("email", String(255), nullable=True, unique=False, default=None)
943 user = relationship('User', lazy='joined')
944
945 @validates('_email')
946 def validate_email(self, key, email):
947 # check if this email is not main one
948 main_email = Session().query(User).filter(User.email == email).scalar()
949 if main_email is not None:
950 raise AttributeError('email %s is present in user table' % email)
951 return email
952
953 @hybrid_property
954 def email(self):
955 return self._email
956
957 @email.setter
958 def email(self, val):
959 self._email = val.lower() if val else None
960
961
962 class UserIpMap(Base, BaseModel):
963 __tablename__ = 'user_ip_map'
964 __table_args__ = (
965 UniqueConstraint('user_id', 'ip_addr'),
966 {'extend_existing': True, 'mysql_engine': 'InnoDB',
967 'mysql_charset': 'utf8', 'sqlite_autoincrement': True}
968 )
969 __mapper_args__ = {}
970
971 ip_id = Column("ip_id", Integer(), nullable=False, unique=True, default=None, primary_key=True)
972 user_id = Column("user_id", Integer(), ForeignKey('users.user_id'), nullable=True, unique=None, default=None)
973 ip_addr = Column("ip_addr", String(255), nullable=True, unique=False, default=None)
974 active = Column("active", Boolean(), nullable=True, unique=None, default=True)
975 description = Column("description", String(10000), nullable=True, unique=None, default=None)
976 user = relationship('User', lazy='joined')
977
978 @classmethod
979 def _get_ip_range(cls, ip_addr):
980 net = ipaddress.ip_network(ip_addr, strict=False)
981 return [str(net.network_address), str(net.broadcast_address)]
982
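A standalone sketch of how `_get_ip_range` resolves either a single address or a CIDR block into its boundary addresses (function name here is illustrative):

```python
import ipaddress

def get_ip_range(ip_addr):
    # strict=False accepts addresses with host bits set, so both plain
    # IPs and CIDR blocks normalize cleanly to a network
    net = ipaddress.ip_network(ip_addr, strict=False)
    return [str(net.network_address), str(net.broadcast_address)]

print(get_ip_range('192.168.1.0/24'))  # ['192.168.1.0', '192.168.1.255']
print(get_ip_range('10.0.0.5'))        # ['10.0.0.5', '10.0.0.5'] (a /32 host)
```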
983 def __json__(self):
984 return {
985 'ip_addr': self.ip_addr,
986 'ip_range': self._get_ip_range(self.ip_addr),
987 }
988
989 def __unicode__(self):
990 return u"<%s('user_id:%s=>%s')>" % (self.__class__.__name__,
991 self.user_id, self.ip_addr)
992
993 class UserLog(Base, BaseModel):
994 __tablename__ = 'user_logs'
995 __table_args__ = (
996 {'extend_existing': True, 'mysql_engine': 'InnoDB',
997 'mysql_charset': 'utf8', 'sqlite_autoincrement': True},
998 )
999 user_log_id = Column("user_log_id", Integer(), nullable=False, unique=True, default=None, primary_key=True)
1000 user_id = Column("user_id", Integer(), ForeignKey('users.user_id'), nullable=True, unique=None, default=None)
1001 username = Column("username", String(255), nullable=True, unique=None, default=None)
1002 repository_id = Column("repository_id", Integer(), ForeignKey('repositories.repo_id'), nullable=True)
1003 repository_name = Column("repository_name", String(255), nullable=True, unique=None, default=None)
1004 user_ip = Column("user_ip", String(255), nullable=True, unique=None, default=None)
1005 action = Column("action", Text().with_variant(Text(1200000), 'mysql'), nullable=True, unique=None, default=None)
1006 action_date = Column("action_date", DateTime(timezone=False), nullable=True, unique=None, default=None)
1007
1008 def __unicode__(self):
1009 return u"<%s('id:%s:%s')>" % (self.__class__.__name__,
1010 self.repository_name,
1011 self.action)
1012
1013 @property
1014 def action_as_day(self):
1015 return datetime.date(*self.action_date.timetuple()[:3])
1016
1017 user = relationship('User')
1018 repository = relationship('Repository', cascade='')
1019
1020
1021 class UserGroup(Base, BaseModel):
1022 __tablename__ = 'users_groups'
1023 __table_args__ = (
1024 {'extend_existing': True, 'mysql_engine': 'InnoDB',
1025 'mysql_charset': 'utf8', 'sqlite_autoincrement': True},
1026 )
1027
1028 users_group_id = Column("users_group_id", Integer(), nullable=False, unique=True, default=None, primary_key=True)
1029 users_group_name = Column("users_group_name", String(255), nullable=False, unique=True, default=None)
1030 user_group_description = Column("user_group_description", String(10000), nullable=True, unique=None, default=None)
1031 users_group_active = Column("users_group_active", Boolean(), nullable=True, unique=None, default=None)
1032 inherit_default_permissions = Column("users_group_inherit_default_permissions", Boolean(), nullable=False, unique=None, default=True)
1033 user_id = Column("user_id", Integer(), ForeignKey('users.user_id'), nullable=False, unique=False, default=None)
1034 created_on = Column('created_on', DateTime(timezone=False), nullable=False, default=datetime.datetime.now)
1035 _group_data = Column("group_data", LargeBinary(), nullable=True) # JSON data
1036
1037 members = relationship('UserGroupMember', cascade="all, delete, delete-orphan", lazy="joined")
1038 users_group_to_perm = relationship('UserGroupToPerm', cascade='all')
1039 users_group_repo_to_perm = relationship('UserGroupRepoToPerm', cascade='all')
1040 users_group_repo_group_to_perm = relationship('UserGroupRepoGroupToPerm', cascade='all')
1041 user_user_group_to_perm = relationship('UserUserGroupToPerm', cascade='all')
1042 user_group_user_group_to_perm = relationship('UserGroupUserGroupToPerm', primaryjoin="UserGroupUserGroupToPerm.target_user_group_id==UserGroup.users_group_id", cascade='all')
1043
1044 user = relationship('User')
1045
1046 @hybrid_property
1047 def group_data(self):
1048 if not self._group_data:
1049 return {}
1050
1051 try:
1052 return json.loads(self._group_data)
1053 except TypeError:
1054 return {}
1055
1056 @group_data.setter
1057 def group_data(self, val):
1058 try:
1059 self._group_data = json.dumps(val)
1060 except Exception:
1061 log.error(traceback.format_exc())
1062
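The `group_data` hybrid above is a tolerant JSON round-trip over a binary column. A standalone sketch of that getter/setter pair, with illustrative names:

```python
import json
import logging

log = logging.getLogger(__name__)

def load_group_data(raw):
    # getter logic: an empty column yields {}, as do non-string values
    if not raw:
        return {}
    try:
        return json.loads(raw)
    except TypeError:
        return {}

def dump_group_data(val):
    # setter logic: serialization failures are logged, never raised
    try:
        return json.dumps(val)
    except Exception:
        log.error('cannot serialize group data', exc_info=True)
        return None

raw = dump_group_data({'members': [1, 2]})
print(load_group_data(raw))   # {'members': [1, 2]}
print(load_group_data(None))  # {}
```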
1063 def __unicode__(self):
1064 return u"<%s('id:%s:%s')>" % (self.__class__.__name__,
1065 self.users_group_id,
1066 self.users_group_name)
1067
1068 @classmethod
1069 def get_by_group_name(cls, group_name, cache=False,
1070 case_insensitive=False):
1071 if case_insensitive:
1072 q = cls.query().filter(func.lower(cls.users_group_name) ==
1073 func.lower(group_name))
1074
1075 else:
1076 q = cls.query().filter(cls.users_group_name == group_name)
1077 if cache:
1078 q = q.options(FromCache(
1079 "sql_cache_short",
1080 "get_group_%s" % _hash_key(group_name)))
1081 return q.scalar()
1082
1083 @classmethod
1084 def get(cls, user_group_id, cache=False):
1085 user_group = cls.query()
1086 if cache:
1087 user_group = user_group.options(FromCache("sql_cache_short",
1088 "get_users_group_%s" % user_group_id))
1089 return user_group.get(user_group_id)
1090
1091 def permissions(self, with_admins=True, with_owner=True):
1092 q = UserUserGroupToPerm.query().filter(UserUserGroupToPerm.user_group == self)
1093 q = q.options(joinedload(UserUserGroupToPerm.user_group),
1094 joinedload(UserUserGroupToPerm.user),
1095 joinedload(UserUserGroupToPerm.permission),)
1096
1097 # get owners, admins and their permissions. We rewrite the
1098 # sqlalchemy objects into detached AttributeDict copies because the
1099 # sqlalchemy session holds a global reference, and changing one object
1100 # propagates to all others. Without the copy, if an admin is also the
1101 # owner, a change to the admin row would propagate to both objects.
1102 perm_rows = []
1103 for _usr in q.all():
1104 usr = AttributeDict(_usr.user.get_dict())
1105 usr.permission = _usr.permission.permission_name
1106 perm_rows.append(usr)
1107
1108 # filter the perm rows by 'default' first and then sort them by
1109 # admin,write,read,none permissions sorted again alphabetically in
1110 # each group
1111 perm_rows = sorted(perm_rows, key=display_sort)
1112
1113 _admin_perm = 'usergroup.admin'
1114 owner_row = []
1115 if with_owner:
1116 usr = AttributeDict(self.user.get_dict())
1117 usr.owner_row = True
1118 usr.permission = _admin_perm
1119 owner_row.append(usr)
1120
1121 super_admin_rows = []
1122 if with_admins:
1123 for usr in User.get_all_super_admins():
1124 # if this admin is also owner, don't double the record
1125 if usr.user_id == owner_row[0].user_id:
1126 owner_row[0].admin_row = True
1127 else:
1128 usr = AttributeDict(usr.get_dict())
1129 usr.admin_row = True
1130 usr.permission = _admin_perm
1131 super_admin_rows.append(usr)
1132
1133 return super_admin_rows + owner_row + perm_rows
1134
1135 def permission_user_groups(self):
1136 q = UserGroupUserGroupToPerm.query().filter(UserGroupUserGroupToPerm.target_user_group == self)
1137 q = q.options(joinedload(UserGroupUserGroupToPerm.user_group),
1138 joinedload(UserGroupUserGroupToPerm.target_user_group),
1139 joinedload(UserGroupUserGroupToPerm.permission),)
1140
1141 perm_rows = []
1142 for _user_group in q.all():
1143 usr = AttributeDict(_user_group.user_group.get_dict())
1144 usr.permission = _user_group.permission.permission_name
1145 perm_rows.append(usr)
1146
1147 return perm_rows
1148
1149 def _get_default_perms(self, user_group, suffix=''):
1150 from rhodecode.model.permission import PermissionModel
1151 return PermissionModel().get_default_perms(user_group.users_group_to_perm, suffix)
1152
1153 def get_default_perms(self, suffix=''):
1154 return self._get_default_perms(self, suffix)
1155
1156 def get_api_data(self, with_group_members=True, include_secrets=False):
1157 """
1158 :param include_secrets: See :meth:`User.get_api_data`, this parameter is
1159 basically forwarded.
1160
1161 """
1162 user_group = self
1163
1164 data = {
1165 'users_group_id': user_group.users_group_id,
1166 'group_name': user_group.users_group_name,
1167 'group_description': user_group.user_group_description,
1168 'active': user_group.users_group_active,
1169 'owner': user_group.user.username,
1170 }
1171 if with_group_members:
1172 users = []
1173 for user in user_group.members:
1174 user = user.user
1175 users.append(user.get_api_data(include_secrets=include_secrets))
1176 data['users'] = users
1177
1178 return data
1179
1180
1181 class UserGroupMember(Base, BaseModel):
1182 __tablename__ = 'users_groups_members'
1183 __table_args__ = (
1184 {'extend_existing': True, 'mysql_engine': 'InnoDB',
1185 'mysql_charset': 'utf8', 'sqlite_autoincrement': True},
1186 )
1187
1188 users_group_member_id = Column("users_group_member_id", Integer(), nullable=False, unique=True, default=None, primary_key=True)
1189 users_group_id = Column("users_group_id", Integer(), ForeignKey('users_groups.users_group_id'), nullable=False, unique=None, default=None)
1190 user_id = Column("user_id", Integer(), ForeignKey('users.user_id'), nullable=False, unique=None, default=None)
1191
1192 user = relationship('User', lazy='joined')
1193 users_group = relationship('UserGroup')
1194
1195 def __init__(self, gr_id='', u_id=''):
1196 self.users_group_id = gr_id
1197 self.user_id = u_id
1198
1199
1200 class RepositoryField(Base, BaseModel):
1201 __tablename__ = 'repositories_fields'
1202 __table_args__ = (
1203 UniqueConstraint('repository_id', 'field_key'), # no-multi field
1204 {'extend_existing': True, 'mysql_engine': 'InnoDB',
1205 'mysql_charset': 'utf8', 'sqlite_autoincrement': True},
1206 )
1207 PREFIX = 'ex_' # prefix used in form to not conflict with already existing fields
1208
1209 repo_field_id = Column("repo_field_id", Integer(), nullable=False, unique=True, default=None, primary_key=True)
1210 repository_id = Column("repository_id", Integer(), ForeignKey('repositories.repo_id'), nullable=False, unique=None, default=None)
1211 field_key = Column("field_key", String(250))
1212 field_label = Column("field_label", String(1024), nullable=False)
1213 field_value = Column("field_value", String(10000), nullable=False)
1214 field_desc = Column("field_desc", String(1024), nullable=False)
1215 field_type = Column("field_type", String(255), nullable=False, unique=None)
1216 created_on = Column('created_on', DateTime(timezone=False), nullable=False, default=datetime.datetime.now)
1217
1218 repository = relationship('Repository')
1219
1220 @property
1221 def field_key_prefixed(self):
1222 return '%s%s' % (self.PREFIX, self.field_key)
1223
1224 @classmethod
1225 def un_prefix_key(cls, key):
1226 if key.startswith(cls.PREFIX):
1227 return key[len(cls.PREFIX):]
1228 return key
1229
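The `PREFIX`/`un_prefix_key` pair namespaces extra-field keys in forms. A minimal standalone sketch of the stripping logic:

```python
PREFIX = 'ex_'  # same prefix RepositoryField uses to namespace form fields

def un_prefix_key(key):
    # strip the namespace prefix if present, otherwise return the key as-is
    if key.startswith(PREFIX):
        return key[len(PREFIX):]
    return key

print(un_prefix_key('ex_tracker_url'))  # tracker_url
print(un_prefix_key('tracker_url'))     # tracker_url (already bare)
```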
1230 @classmethod
1231 def get_by_key_name(cls, key, repo):
1232 row = cls.query()\
1233 .filter(cls.repository == repo)\
1234 .filter(cls.field_key == key).scalar()
1235 return row
1236
1237
1238 class Repository(Base, BaseModel):
1239 __tablename__ = 'repositories'
1240 __table_args__ = (
1241 Index('r_repo_name_idx', 'repo_name', mysql_length=255),
1242 {'extend_existing': True, 'mysql_engine': 'InnoDB',
1243 'mysql_charset': 'utf8', 'sqlite_autoincrement': True},
1244 )
1245 DEFAULT_CLONE_URI = '{scheme}://{user}@{netloc}/{repo}'
1246 DEFAULT_CLONE_URI_ID = '{scheme}://{user}@{netloc}/_{repoid}'
1247
1248 STATE_CREATED = 'repo_state_created'
1249 STATE_PENDING = 'repo_state_pending'
1250 STATE_ERROR = 'repo_state_error'
1251
1252 LOCK_AUTOMATIC = 'lock_auto'
1253 LOCK_API = 'lock_api'
1254 LOCK_WEB = 'lock_web'
1255 LOCK_PULL = 'lock_pull'
1256
1257 NAME_SEP = URL_SEP
1258
1259 repo_id = Column(
1260 "repo_id", Integer(), nullable=False, unique=True, default=None,
1261 primary_key=True)
1262 _repo_name = Column(
1263 "repo_name", Text(), nullable=False, default=None)
1264 _repo_name_hash = Column(
1265 "repo_name_hash", String(255), nullable=False, unique=True)
1266 repo_state = Column("repo_state", String(255), nullable=True)
1267
1268 clone_uri = Column(
1269 "clone_uri", EncryptedTextValue(), nullable=True, unique=False,
1270 default=None)
1271 repo_type = Column(
1272 "repo_type", String(255), nullable=False, unique=False, default=None)
1273 user_id = Column(
1274 "user_id", Integer(), ForeignKey('users.user_id'), nullable=False,
1275 unique=False, default=None)
1276 private = Column(
1277 "private", Boolean(), nullable=True, unique=None, default=None)
1278 enable_statistics = Column(
1279 "statistics", Boolean(), nullable=True, unique=None, default=True)
1280 enable_downloads = Column(
1281 "downloads", Boolean(), nullable=True, unique=None, default=True)
1282 description = Column(
1283 "description", String(10000), nullable=True, unique=None, default=None)
1284 created_on = Column(
1285 'created_on', DateTime(timezone=False), nullable=True, unique=None,
1286 default=datetime.datetime.now)
1287 updated_on = Column(
1288 'updated_on', DateTime(timezone=False), nullable=True, unique=None,
1289 default=datetime.datetime.now)
1290 _landing_revision = Column(
1291 "landing_revision", String(255), nullable=False, unique=False,
1292 default=None)
1293 enable_locking = Column(
1294 "enable_locking", Boolean(), nullable=False, unique=None,
1295 default=False)
1296 _locked = Column(
1297 "locked", String(255), nullable=True, unique=False, default=None)
1298 _changeset_cache = Column(
1299 "changeset_cache", LargeBinary(), nullable=True) # JSON data
1300
1301 fork_id = Column(
1302 "fork_id", Integer(), ForeignKey('repositories.repo_id'),
1303 nullable=True, unique=False, default=None)
1304 group_id = Column(
1305 "group_id", Integer(), ForeignKey('groups.group_id'), nullable=True,
1306 unique=False, default=None)
1307
1308 user = relationship('User', lazy='joined')
1309 fork = relationship('Repository', remote_side=repo_id, lazy='joined')
1310 group = relationship('RepoGroup', lazy='joined')
1311 repo_to_perm = relationship(
1312 'UserRepoToPerm', cascade='all',
1313 order_by='UserRepoToPerm.repo_to_perm_id')
1314 users_group_to_perm = relationship('UserGroupRepoToPerm', cascade='all')
1315 stats = relationship('Statistics', cascade='all', uselist=False)
1316
1317 followers = relationship(
1318 'UserFollowing',
1319 primaryjoin='UserFollowing.follows_repo_id==Repository.repo_id',
1320 cascade='all')
1321 extra_fields = relationship(
1322 'RepositoryField', cascade="all, delete, delete-orphan")
1323 logs = relationship('UserLog')
1324 comments = relationship(
1325 'ChangesetComment', cascade="all, delete, delete-orphan")
1326 pull_requests_source = relationship(
1327 'PullRequest',
1328 primaryjoin='PullRequest.source_repo_id==Repository.repo_id',
1329 cascade="all, delete, delete-orphan")
1330 pull_requests_target = relationship(
1331 'PullRequest',
1332 primaryjoin='PullRequest.target_repo_id==Repository.repo_id',
1333 cascade="all, delete, delete-orphan")
1334 ui = relationship('RepoRhodeCodeUi', cascade="all")
1335 settings = relationship('RepoRhodeCodeSetting', cascade="all")
1336 integrations = relationship('Integration',
1337 cascade="all, delete, delete-orphan")
1338
1339 def __unicode__(self):
1340 return u"<%s('%s:%s')>" % (self.__class__.__name__, self.repo_id,
1341 safe_unicode(self.repo_name))
1342
1343 @hybrid_property
1344 def landing_rev(self):
1345 # always should return [rev_type, rev]
1346 if self._landing_revision:
1347 _rev_info = self._landing_revision.split(':')
1348 if len(_rev_info) < 2:
1349 _rev_info.insert(0, 'rev')
1350 return [_rev_info[0], _rev_info[1]]
1351 return [None, None]
1352
1353 @landing_rev.setter
1354 def landing_rev(self, val):
1355 if ':' not in val:
1356 raise ValueError('value must be delimited with `:` and consist '
1357 'of <rev_type>:<rev>, got %s instead' % val)
1358 self._landing_revision = val
1359
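The `landing_rev` hybrid packs `<rev_type>:<rev>` into a single string column, defaulting legacy un-typed values to `rev`. A standalone sketch of that parse/validate logic, with illustrative names:

```python
def parse_landing_rev(stored):
    # mirrors the getter: the column stores '<rev_type>:<rev>';
    # legacy values without a type prefix default to 'rev'
    if not stored:
        return [None, None]
    rev_info = stored.split(':')
    if len(rev_info) < 2:
        rev_info.insert(0, 'rev')
    return [rev_info[0], rev_info[1]]

def validate_landing_rev(val):
    # mirrors the setter's validation before storing
    if ':' not in val:
        raise ValueError('value must be delimited with `:` and consist '
                         'of <rev_type>:<rev>, got %s instead' % val)
    return val

print(parse_landing_rev('branch:default'))  # ['branch', 'default']
print(parse_landing_rev('tip'))             # ['rev', 'tip']
```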
1360 @hybrid_property
1361 def locked(self):
1362 if self._locked:
1363 user_id, timelocked, reason = self._locked.split(':')
1364 lock_values = int(user_id), timelocked, reason
1365 else:
1366 lock_values = [None, None, None]
1367 return lock_values
1368
1369 @locked.setter
1370 def locked(self, val):
1371 if val and isinstance(val, (list, tuple)):
1372 self._locked = ':'.join(map(str, val))
1373 else:
1374 self._locked = None
1375
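The `locked` hybrid serializes a `(user_id, unix_time, reason)` triple into one colon-delimited string. A standalone sketch of both directions (assumes, like the model, that the reason itself contains no `:`):

```python
def serialize_lock(val):
    # setter logic: a (user_id, unix_time, reason) tuple/list
    # becomes 'uid:time:reason'; anything else clears the lock
    if val and isinstance(val, (list, tuple)):
        return ':'.join(map(str, val))
    return None

def parse_lock(raw):
    # getter logic: returns (user_id, time, reason) with user_id
    # coerced to int, or [None, None, None] when unlocked
    if raw:
        user_id, timelocked, reason = raw.split(':')
        return int(user_id), timelocked, reason
    return [None, None, None]

raw = serialize_lock([42, 1470000000, 'lock_auto'])
print(raw)                 # 42:1470000000:lock_auto
print(parse_lock(raw)[0])  # 42
```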
1376 @hybrid_property
1377 def changeset_cache(self):
1378 from rhodecode.lib.vcs.backends.base import EmptyCommit
1379 dummy = EmptyCommit().__json__()
1380 if not self._changeset_cache:
1381 return dummy
1382 try:
1383 return json.loads(self._changeset_cache)
1384 except TypeError:
1385 return dummy
1386 except Exception:
1387 log.error(traceback.format_exc())
1388 return dummy
1389
1390 @changeset_cache.setter
1391 def changeset_cache(self, val):
1392 try:
1393 self._changeset_cache = json.dumps(val)
1394 except Exception:
1395 log.error(traceback.format_exc())
1396
1397 @hybrid_property
1398 def repo_name(self):
1399 return self._repo_name
1400
1401 @repo_name.setter
1402 def repo_name(self, value):
1403 self._repo_name = value
1404 self._repo_name_hash = hashlib.sha1(safe_str(value)).hexdigest()
1405
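The setter above keeps a sha1 fingerprint alongside the name; since the name column is unbounded `Text`, the fixed-width hash is what can back a unique index. A standalone sketch, hedged for Python 3 (the original passes bytes via `safe_str`):

```python
import hashlib

def repo_name_hash(value):
    # stable 40-character sha1 fingerprint of the repo name, suitable
    # for the String(255) unique `repo_name_hash` column
    return hashlib.sha1(value.encode('utf8')).hexdigest()

h = repo_name_hash('group/my-repo')
print(len(h))  # 40
```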
1406 @classmethod
1407 def normalize_repo_name(cls, repo_name):
1408 """
1409 Normalizes an OS-specific repo_name to the format stored internally
1410 in the database, using URL_SEP
1411
1412 :param cls:
1413 :param repo_name:
1414 """
1415 return cls.NAME_SEP.join(repo_name.split(os.sep))
1416
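A standalone sketch of the normalization above; `sep` is parameterized here purely for illustration (the model always splits on the local `os.sep`):

```python
import os

URL_SEP = '/'  # separator used in database-stored repo names

def normalize_repo_name(repo_name, sep=os.sep):
    # convert an OS-specific path (e.g. backslashes on Windows) into
    # the URL_SEP-joined form stored in the database
    return URL_SEP.join(repo_name.split(sep))

print(normalize_repo_name('group\\sub\\repo', sep='\\'))  # group/sub/repo
print(normalize_repo_name('group/sub/repo', sep='/'))     # group/sub/repo
```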
1417 @classmethod
1418 def get_by_repo_name(cls, repo_name, cache=False, identity_cache=False):
1419 session = Session()
1420 q = session.query(cls).filter(cls.repo_name == repo_name)
1421
1422 if cache:
1423 if identity_cache:
1424 val = cls.identity_cache(session, 'repo_name', repo_name)
1425 if val:
1426 return val
1427 else:
1428 q = q.options(
1429 FromCache("sql_cache_short",
1430 "get_repo_by_name_%s" % _hash_key(repo_name)))
1431
1432 return q.scalar()
1433
1434 @classmethod
1435 def get_by_full_path(cls, repo_full_path):
1436 repo_name = repo_full_path.split(cls.base_path(), 1)[-1]
1437 repo_name = cls.normalize_repo_name(repo_name)
1438 return cls.get_by_repo_name(repo_name.strip(URL_SEP))
1439
1440 @classmethod
1441 def get_repo_forks(cls, repo_id):
1442 return cls.query().filter(Repository.fork_id == repo_id)
1443
1444 @classmethod
1445 def base_path(cls):
1446 """
1447 Returns the base path where all repos are stored
1448
1449 :param cls:
1450 """
1451 q = Session().query(RhodeCodeUi)\
1452 .filter(RhodeCodeUi.ui_key == cls.NAME_SEP)
1453 q = q.options(FromCache("sql_cache_short", "repository_repo_path"))
1454 return q.one().ui_value
1455
1456 @classmethod
1457 def is_valid(cls, repo_name):
1458 """
1459 returns True if given repo name is a valid filesystem repository
1460
1461 :param cls:
1462 :param repo_name:
1463 """
1464 from rhodecode.lib.utils import is_valid_repo
1465
1466 return is_valid_repo(repo_name, cls.base_path())
1467
1468 @classmethod
1469 def get_all_repos(cls, user_id=Optional(None), group_id=Optional(None),
1470 case_insensitive=True):
1471 q = Repository.query()
1472
1473 if not isinstance(user_id, Optional):
1474 q = q.filter(Repository.user_id == user_id)
1475
1476 if not isinstance(group_id, Optional):
1477 q = q.filter(Repository.group_id == group_id)
1478
1479 if case_insensitive:
1480 q = q.order_by(func.lower(Repository.repo_name))
1481 else:
1482 q = q.order_by(Repository.repo_name)
1483 return q.all()
1484
1485 @property
1486 def forks(self):
1487 """
1488 Return forks of this repo
1489 """
1490 return Repository.get_repo_forks(self.repo_id)
1491
1492 @property
1493 def parent(self):
1494 """
1495 Returns fork parent
1496 """
1497 return self.fork
1498
1499 @property
1500 def just_name(self):
1501 return self.repo_name.split(self.NAME_SEP)[-1]
1502
1503 @property
1504 def groups_with_parents(self):
1505 groups = []
1506 if self.group is None:
1507 return groups
1508
1509 cur_gr = self.group
1510 groups.insert(0, cur_gr)
1511 while 1:
1512 gr = getattr(cur_gr, 'parent_group', None)
1513 cur_gr = cur_gr.parent_group
1514 if gr is None:
1515 break
1516 groups.insert(0, gr)
1517
1518 return groups
1519
1520 @property
1521 def groups_and_repo(self):
1522 return self.groups_with_parents, self
1523
1524 @LazyProperty
1525 def repo_path(self):
1526 """
1527 Returns the full base path for this repository, i.e. where it
1528 actually exists on the filesystem
1529 """
1530 q = Session().query(RhodeCodeUi).filter(
1531 RhodeCodeUi.ui_key == self.NAME_SEP)
1532 q = q.options(FromCache("sql_cache_short", "repository_repo_path"))
1533 return q.one().ui_value
1534
1535 @property
1536 def repo_full_path(self):
1537 p = [self.repo_path]
1538 # we need to split the name by / since this is how we store the
1539 # names in the database, but that eventually needs to be converted
1540 # into a valid system path
1541 p += self.repo_name.split(self.NAME_SEP)
1542 return os.path.join(*map(safe_unicode, p))
1543
1544 @property
1545 def cache_keys(self):
1546 """
1547 Returns associated cache keys for that repo
1548 """
1549 return CacheKey.query()\
1550 .filter(CacheKey.cache_args == self.repo_name)\
1551 .order_by(CacheKey.cache_key)\
1552 .all()
1553
1554 def get_new_name(self, repo_name):
1555 """
1556 returns the new full repository name based on the assigned group and the new repo_name
1557 
1558 :param repo_name:
1559 """
1560 path_prefix = self.group.full_path_splitted if self.group else []
1561 return self.NAME_SEP.join(path_prefix + [repo_name])
1562
1563 @property
1564 def _config(self):
1565 """
1566 Returns db based config object.
1567 """
1568 from rhodecode.lib.utils import make_db_config
1569 return make_db_config(clear_session=False, repo=self)
1570
1571 def permissions(self, with_admins=True, with_owner=True):
1572 q = UserRepoToPerm.query().filter(UserRepoToPerm.repository == self)
1573 q = q.options(joinedload(UserRepoToPerm.repository),
1574 joinedload(UserRepoToPerm.user),
1575 joinedload(UserRepoToPerm.permission),)
1576
1577 # get owners, admins and their permissions. We rewrite the
1578 # sqlalchemy objects into detached AttributeDict copies because the
1579 # sqlalchemy session holds a global reference, and changing one object
1580 # propagates to all others. Without the copy, if an admin is also the
1581 # owner, a change to the admin row would propagate to both objects.
1582 perm_rows = []
1583 for _usr in q.all():
1584 usr = AttributeDict(_usr.user.get_dict())
1585 usr.permission = _usr.permission.permission_name
1586 perm_rows.append(usr)
1587
1588 # filter the perm rows by 'default' first and then sort them by
1589 # admin,write,read,none permissions sorted again alphabetically in
1590 # each group
1591 perm_rows = sorted(perm_rows, key=display_sort)
1592
1593 _admin_perm = 'repository.admin'
1594 owner_row = []
1595 if with_owner:
1596 usr = AttributeDict(self.user.get_dict())
1597 usr.owner_row = True
1598 usr.permission = _admin_perm
1599 owner_row.append(usr)
1600
1601 super_admin_rows = []
1602 if with_admins:
1603 for usr in User.get_all_super_admins():
1604 # if this admin is also owner, don't double the record
1605 if usr.user_id == owner_row[0].user_id:
1606 owner_row[0].admin_row = True
1607 else:
1608 usr = AttributeDict(usr.get_dict())
1609 usr.admin_row = True
1610 usr.permission = _admin_perm
1611 super_admin_rows.append(usr)
1612
1613 return super_admin_rows + owner_row + perm_rows
1614
1615 def permission_user_groups(self):
1616 q = UserGroupRepoToPerm.query().filter(
1617 UserGroupRepoToPerm.repository == self)
1618 q = q.options(joinedload(UserGroupRepoToPerm.repository),
1619 joinedload(UserGroupRepoToPerm.users_group),
1620 joinedload(UserGroupRepoToPerm.permission),)
1621
1622 perm_rows = []
1623 for _user_group in q.all():
1624 usr = AttributeDict(_user_group.users_group.get_dict())
1625 usr.permission = _user_group.permission.permission_name
1626 perm_rows.append(usr)
1627
1628 return perm_rows
1629
1630 def get_api_data(self, include_secrets=False):
1631 """
1632 Common function for generating repo api data
1633
1634 :param include_secrets: See :meth:`User.get_api_data`.
1635
1636 """
1637 # TODO: mikhail: Here there is an anti-pattern, we probably need to
1638 # move this methods on models level.
1639 from rhodecode.model.settings import SettingsModel
1640
1641 repo = self
1642 _user_id, _time, _reason = self.locked
1643
1644 data = {
1645 'repo_id': repo.repo_id,
1646 'repo_name': repo.repo_name,
1647 'repo_type': repo.repo_type,
1648 'clone_uri': repo.clone_uri or '',
1649 'url': url('summary_home', repo_name=self.repo_name, qualified=True),
1650 'private': repo.private,
1651 'created_on': repo.created_on,
1652 'description': repo.description,
1653 'landing_rev': repo.landing_rev,
1654 'owner': repo.user.username,
1655 'fork_of': repo.fork.repo_name if repo.fork else None,
1656 'enable_statistics': repo.enable_statistics,
1657 'enable_locking': repo.enable_locking,
1658 'enable_downloads': repo.enable_downloads,
1659 'last_changeset': repo.changeset_cache,
1660 'locked_by': User.get(_user_id).get_api_data(
1661 include_secrets=include_secrets) if _user_id else None,
1662 'locked_date': time_to_datetime(_time) if _time else None,
1663 'lock_reason': _reason if _reason else None,
1664 }
1665
1666 # TODO: mikhail: should be per-repo settings here
1667 rc_config = SettingsModel().get_all_settings()
1668 repository_fields = str2bool(
1669 rc_config.get('rhodecode_repository_fields'))
1670 if repository_fields:
1671 for f in self.extra_fields:
1672 data[f.field_key_prefixed] = f.field_value
1673
1674 return data
1675
1676 @classmethod
1677 def lock(cls, repo, user_id, lock_time=None, lock_reason=None):
1678 if not lock_time:
1679 lock_time = time.time()
1680 if not lock_reason:
1681 lock_reason = cls.LOCK_AUTOMATIC
1682 repo.locked = [user_id, lock_time, lock_reason]
1683 Session().add(repo)
1684 Session().commit()
1685
1686 @classmethod
1687 def unlock(cls, repo):
1688 repo.locked = None
1689 Session().add(repo)
1690 Session().commit()
1691
1692 @classmethod
1693 def getlock(cls, repo):
1694 return repo.locked
1695
1696 def is_user_lock(self, user_id):
1697 if self.locked[0]:
1698 lock_user_id = safe_int(self.locked[0])
1699 user_id = safe_int(user_id)
1700 # both are ints, and they are equal
1701 return all([lock_user_id, user_id]) and lock_user_id == user_id
1702
1703 return False
1704
1705 def get_locking_state(self, action, user_id, only_when_enabled=True):
1706 """
1707 Checks locking on this repository, if locking is enabled and lock is
1708 present returns a tuple of make_lock, locked, locked_by.
1709 make_lock can have 3 states: None (do nothing), True (make a lock),
1710 False (release the lock). This value is later propagated to hooks,
1711 which do the locking. Think of it as a signal telling the hooks what to do.
1712
1713 """
1714 # TODO: johbo: This is part of the business logic and should be moved
1715 # into the RepositoryModel.
1716
1717 if action not in ('push', 'pull'):
1718 raise ValueError("Invalid action value: %s" % repr(action))
1719
1720 # defines if a 'locked' error should be raised to the user
1721 currently_locked = False
1722 # defines if a new lock should be made; tri-state
1723 make_lock = None
1724 repo = self
1725 user = User.get(user_id)
1726
1727 lock_info = repo.locked
1728
1729 if repo and (repo.enable_locking or not only_when_enabled):
1730 if action == 'push':
1731 # check if it's already locked; if it is, compare users
1732 locked_by_user_id = lock_info[0]
1733 if user.user_id == locked_by_user_id:
1734 log.debug(
1735 'Got `push` action from user %s, now unlocking', user)
1736 # unlock if we have push from user who locked
1737 make_lock = False
1738 else:
1739 # a different user holds the lock; reject with the status
1740 # code defined in settings (default is HTTP 423 Locked)
1741 log.debug('Repo %s is currently locked by %s', repo, user)
1742 currently_locked = True
1743 elif action == 'pull':
1744 # [0] user [1] date
1745 if lock_info[0] and lock_info[1]:
1746 log.debug('Repo %s is currently locked by %s', repo, user)
1747 currently_locked = True
1748 else:
1749 log.debug('Setting lock on repo %s by %s', repo, user)
1750 make_lock = True
1751
1752 else:
1753 log.debug('Repository %s does not have locking enabled', repo)
1754
1755 log.debug('FINAL locking values make_lock:%s,locked:%s,locked_by:%s',
1756 make_lock, currently_locked, lock_info)
1757
1758 from rhodecode.lib.auth import HasRepoPermissionAny
1759 perm_check = HasRepoPermissionAny('repository.write', 'repository.admin')
1760 if make_lock and not perm_check(repo_name=repo.repo_name, user=user):
1761 # if we don't have at least write permission we cannot make a lock
1762 log.debug('lock state reset back to FALSE due to lack '
1763 'of at least write permission')
1764 make_lock = False
1765
1766 return make_lock, currently_locked, lock_info
1767
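# A minimal sketch of how a caller might consume the tri-state result
# above (`handle_locked_repo` is a hypothetical helper, for
# illustration only):
#
#   make_lock, locked, locked_by = repo.get_locking_state('push', user_id)
#   if make_lock:                  # acquire a lock for this user
#       Repository.lock(repo, user_id)
#   elif make_lock is False:       # push from the lock holder releases it
#       Repository.unlock(repo)
#   elif locked:                   # someone else holds the lock
#       handle_locked_repo(locked_by)  # e.g. respond with HTTP 423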
1768 @property
1769 def last_db_change(self):
1770 return self.updated_on
1771
1772 @property
1773 def clone_uri_hidden(self):
1774 clone_uri = self.clone_uri
1775 if clone_uri:
1776 import urlobject
1777 url_obj = urlobject.URLObject(clone_uri)
1778 if url_obj.password:
1779 clone_uri = url_obj.with_password('*****')
1780 return clone_uri
1781
1782 def clone_url(self, **override):
1783 qualified_home_url = url('home', qualified=True)
1784
1785 uri_tmpl = None
1786 if 'with_id' in override:
1787 uri_tmpl = self.DEFAULT_CLONE_URI_ID
1788 del override['with_id']
1789
1790 if 'uri_tmpl' in override:
1791 uri_tmpl = override['uri_tmpl']
1792 del override['uri_tmpl']
1793
1794 # we didn't override our tmpl from **overrides
1795 if not uri_tmpl:
1796 uri_tmpl = self.DEFAULT_CLONE_URI
1797 try:
1798 from pylons import tmpl_context as c
1799 uri_tmpl = c.clone_uri_tmpl
1800 except Exception:
1801 # ignore failures when called outside of a request context,
1802 # i.e. when tmpl_context is not set up
1803 pass
1804
1805 return get_clone_url(uri_tmpl=uri_tmpl,
1806 qualifed_home_url=qualified_home_url,
1807 repo_name=self.repo_name,
1808 repo_id=self.repo_id, **override)
1809
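# Example calls (the custom template below is illustrative; real
# templates come from the `clone_uri_tmpl` instance setting):
#
#   repo.clone_url()              # default template, e.g. DEFAULT_CLONE_URI
#   repo.clone_url(with_id=True)  # uses the DEFAULT_CLONE_URI_ID variant
#   repo.clone_url(uri_tmpl='{scheme}://{netloc}/_{repo_id}')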
1810 def set_state(self, state):
1811 self.repo_state = state
1812 Session().add(self)
1813 # ==========================================================================
1814 # SCM PROPERTIES
1815 # ==========================================================================
1816
1817 def get_commit(self, commit_id=None, commit_idx=None, pre_load=None):
1818 return get_commit_safe(
1819 self.scm_instance(), commit_id, commit_idx, pre_load=pre_load)
1820
1821 def get_changeset(self, rev=None, pre_load=None):
1822 warnings.warn("Use get_commit", DeprecationWarning)
1823 commit_id = None
1824 commit_idx = None
1825 if isinstance(rev, basestring):
1826 commit_id = rev
1827 else:
1828 commit_idx = rev
1829 return self.get_commit(commit_id=commit_id, commit_idx=commit_idx,
1830 pre_load=pre_load)
1831
1832 def get_landing_commit(self):
1833 """
1834 Returns the landing commit or, if that doesn't exist, the tip
1835 """
1836 _rev_type, _rev = self.landing_rev
1837 commit = self.get_commit(_rev)
1838 if isinstance(commit, EmptyCommit):
1839 return self.get_commit()
1840 return commit
1841
1842 def update_commit_cache(self, cs_cache=None, config=None):
1843 """
1844 Update cache of last changeset for repository, keys should be::
1845
1846 short_id
1847 raw_id
1848 revision
1849 parents
1850 message
1851 date
1852 author
1853
1854 :param cs_cache: dict of cached commit data with the keys listed above
1855 """
1856 from rhodecode.lib.vcs.backends.base import BaseChangeset
1857 if cs_cache is None:
1858 # use no-cache version here
1859 scm_repo = self.scm_instance(cache=False, config=config)
1860 if scm_repo:
1861 cs_cache = scm_repo.get_commit(
1862 pre_load=["author", "date", "message", "parents"])
1863 else:
1864 cs_cache = EmptyCommit()
1865
1866 if isinstance(cs_cache, BaseChangeset):
1867 cs_cache = cs_cache.__json__()
1868
1869 def is_outdated(new_cs_cache):
1870 if (new_cs_cache['raw_id'] != self.changeset_cache['raw_id'] or
1871 new_cs_cache['revision'] != self.changeset_cache['revision']):
1872 return True
1873 return False
1874
1875 # check if we have maybe already latest cached revision
1876 if is_outdated(cs_cache) or not self.changeset_cache:
1877 _default = datetime.datetime.fromtimestamp(0)
1878 last_change = cs_cache.get('date') or _default
1879 log.debug('updated repo %s with new cs cache %s',
1880 self.repo_name, cs_cache)
1881 self.updated_on = last_change
1882 self.changeset_cache = cs_cache
1883 Session().add(self)
1884 Session().commit()
1885 else:
1886 log.debug('Skipping update_commit_cache for repo:`%s`, '
1887 'commit cache already has the latest changes', self.repo_name)
1888
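# Illustrative shape of the cs_cache dict described above (values are
# made up; the real ones come from the commit being cached):
#
#   {'short_id': '1e4a21e1', 'raw_id': '1e4a21e1...', 'revision': 42,
#    'parents': [{'raw_id': '...'}], 'message': 'fixed bug',
#    'date': datetime.datetime(2016, 1, 1), 'author': 'Joe <joe@example.com>'}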
1889 @property
1890 def tip(self):
1891 return self.get_commit('tip')
1892
1893 @property
1894 def author(self):
1895 return self.tip.author
1896
1897 @property
1898 def last_change(self):
1899 return self.scm_instance().last_change
1900
1901 def get_comments(self, revisions=None):
1902 """
1903 Returns comments for this repository grouped by revisions
1904
1905 :param revisions: filter query by revisions only
1906 """
1907 cmts = ChangesetComment.query()\
1908 .filter(ChangesetComment.repo == self)
1909 if revisions:
1910 cmts = cmts.filter(ChangesetComment.revision.in_(revisions))
1911 grouped = collections.defaultdict(list)
1912 for cmt in cmts.all():
1913 grouped[cmt.revision].append(cmt)
1914 return grouped
1915
1916 def statuses(self, revisions=None):
1917 """
1918 Returns statuses for this repository
1919
1920 :param revisions: list of revisions to get statuses for
1921 """
1922 statuses = ChangesetStatus.query()\
1923 .filter(ChangesetStatus.repo == self)\
1924 .filter(ChangesetStatus.version == 0)
1925
1926 if revisions:
1927 # filter in chunks to avoid hitting database limits on IN clauses
1928 size = 500
1929 status_results = []
1930 for chunk in xrange(0, len(revisions), size):
1931 status_results += statuses.filter(
1932 ChangesetStatus.revision.in_(
1933 revisions[chunk: chunk+size])
1934 ).all()
1935 else:
1936 status_results = statuses.all()
1937
1938 grouped = {}
1939
1940 # maybe we have an open pull request without a status yet?
1941 stat = ChangesetStatus.STATUS_UNDER_REVIEW
1942 status_lbl = ChangesetStatus.get_status_lbl(stat)
1943 for pr in PullRequest.query().filter(PullRequest.source_repo == self).all():
1944 for rev in pr.revisions:
1945 pr_id = pr.pull_request_id
1946 pr_repo = pr.target_repo.repo_name
1947 grouped[rev] = [stat, status_lbl, pr_id, pr_repo]
1948
1949 for stat in status_results:
1950 pr_id = pr_repo = None
1951 if stat.pull_request:
1952 pr_id = stat.pull_request.pull_request_id
1953 pr_repo = stat.pull_request.target_repo.repo_name
1954 grouped[stat.revision] = [str(stat.status), stat.status_lbl,
1955 pr_id, pr_repo]
1956 return grouped
1957
1958 # ==========================================================================
1959 # SCM CACHE INSTANCE
1960 # ==========================================================================
1961
1962 def scm_instance(self, **kwargs):
1963 import rhodecode
1964
1965 # Passing a config skips the cache; currently this is only used
1966 # by repo2dbmapper
1967 config = kwargs.pop('config', None)
1968 cache = kwargs.pop('cache', None)
1969 full_cache = str2bool(rhodecode.CONFIG.get('vcs_full_cache'))
1970 # if cache is NOT defined, use the global default; otherwise the
1971 # caller has full control over the cache behaviour
1972 if cache is None and full_cache and not config:
1973 return self._get_instance_cached()
1974 return self._get_instance(cache=bool(cache), config=config)
1975
1976 def _get_instance_cached(self):
1977 @cache_region('long_term')
1978 def _get_repo(cache_key):
1979 return self._get_instance()
1980
1981 invalidator_context = CacheKey.repo_context_cache(
1982 _get_repo, self.repo_name, None, thread_scoped=True)
1983
1984 with invalidator_context as context:
1985 context.invalidate()
1986 repo = context.compute()
1987
1988 return repo
1989
1990 def _get_instance(self, cache=True, config=None):
1991 config = config or self._config
1992 custom_wire = {
1993 'cache': cache # controls the vcs.remote cache
1994 }
1995
1996 repo = get_vcs_instance(
1997 repo_path=safe_str(self.repo_full_path),
1998 config=config,
1999 with_wire=custom_wire,
2000 create=False)
2001
2002 return repo
2003
2004 def __json__(self):
2005 return {'landing_rev': self.landing_rev}
2006
2007 def get_dict(self):
2008
2009 # Since we transformed `repo_name` to a hybrid property, we need to
2010 # keep compatibility with the code which uses `repo_name` field.
2011
2012 result = super(Repository, self).get_dict()
2013 result['repo_name'] = result.pop('_repo_name', None)
2014 return result
2015
2016
2017 class RepoGroup(Base, BaseModel):
2018 __tablename__ = 'groups'
2019 __table_args__ = (
2020 UniqueConstraint('group_name', 'group_parent_id'),
2021 CheckConstraint('group_id != group_parent_id'),
2022 {'extend_existing': True, 'mysql_engine': 'InnoDB',
2023 'mysql_charset': 'utf8', 'sqlite_autoincrement': True},
2024 )
2025 __mapper_args__ = {'order_by': 'group_name'}
2026
2027 CHOICES_SEPARATOR = '/' # used to generate select2 choices for nested groups
2028
2029 group_id = Column("group_id", Integer(), nullable=False, unique=True, default=None, primary_key=True)
2030 group_name = Column("group_name", String(255), nullable=False, unique=True, default=None)
2031 group_parent_id = Column("group_parent_id", Integer(), ForeignKey('groups.group_id'), nullable=True, unique=None, default=None)
2032 group_description = Column("group_description", String(10000), nullable=True, unique=None, default=None)
2033 enable_locking = Column("enable_locking", Boolean(), nullable=False, unique=None, default=False)
2034 user_id = Column("user_id", Integer(), ForeignKey('users.user_id'), nullable=False, unique=False, default=None)
2035 created_on = Column('created_on', DateTime(timezone=False), nullable=False, default=datetime.datetime.now)
2036
2037 repo_group_to_perm = relationship('UserRepoGroupToPerm', cascade='all', order_by='UserRepoGroupToPerm.group_to_perm_id')
2038 users_group_to_perm = relationship('UserGroupRepoGroupToPerm', cascade='all')
2039 parent_group = relationship('RepoGroup', remote_side=group_id)
2040 user = relationship('User')
2041 integrations = relationship('Integration',
2042 cascade="all, delete, delete-orphan")
2043
2044 def __init__(self, group_name='', parent_group=None):
2045 self.group_name = group_name
2046 self.parent_group = parent_group
2047
2048 def __unicode__(self):
2049 return u"<%s('id:%s:%s')>" % (self.__class__.__name__, self.group_id,
2050 self.group_name)
2051
2052 @classmethod
2053 def _generate_choice(cls, repo_group):
2054 from webhelpers.html import literal as _literal
2055 _name = lambda k: _literal(cls.CHOICES_SEPARATOR.join(k))
2056 return repo_group.group_id, _name(repo_group.full_path_splitted)
2057
2058 @classmethod
2059 def groups_choices(cls, groups=None, show_empty_group=True):
2060 if not groups:
2061 groups = cls.query().all()
2062
2063 repo_groups = []
2064 if show_empty_group:
2065 repo_groups = [('-1', u'-- %s --' % _('No parent'))]
2066
2067 repo_groups.extend([cls._generate_choice(x) for x in groups])
2068
2069 repo_groups = sorted(
2070 repo_groups, key=lambda t: t[1].split(cls.CHOICES_SEPARATOR)[0])
2071 return repo_groups
2072
2073 @classmethod
2074 def url_sep(cls):
2075 return URL_SEP
2076
2077 @classmethod
2078 def get_by_group_name(cls, group_name, cache=False, case_insensitive=False):
2079 if case_insensitive:
2080 gr = cls.query().filter(func.lower(cls.group_name)
2081 == func.lower(group_name))
2082 else:
2083 gr = cls.query().filter(cls.group_name == group_name)
2084 if cache:
2085 gr = gr.options(FromCache(
2086 "sql_cache_short",
2087 "get_group_%s" % _hash_key(group_name)))
2088 return gr.scalar()
2089
2090 @classmethod
2091 def get_all_repo_groups(cls, user_id=Optional(None), group_id=Optional(None),
2092 case_insensitive=True):
2093 q = RepoGroup.query()
2094
2095 if not isinstance(user_id, Optional):
2096 q = q.filter(RepoGroup.user_id == user_id)
2097
2098 if not isinstance(group_id, Optional):
2099 q = q.filter(RepoGroup.group_parent_id == group_id)
2100
2101 if case_insensitive:
2102 q = q.order_by(func.lower(RepoGroup.group_name))
2103 else:
2104 q = q.order_by(RepoGroup.group_name)
2105 return q.all()
2106
2107 @property
2108 def parents(self):
2109 parents_recursion_limit = 10
2110 groups = []
2111 if self.parent_group is None:
2112 return groups
2113 cur_gr = self.parent_group
2114 groups.insert(0, cur_gr)
2115 cnt = 0
2116 while True:
2117 cnt += 1
2118 gr = getattr(cur_gr, 'parent_group', None)
2119 cur_gr = cur_gr.parent_group
2120 if gr is None:
2121 break
2122 if cnt == parents_recursion_limit:
2123 # this will prevent accidental infinite loops
2124 log.error('more than %s parents found for group %s, stopping '
2125 'recursive parent fetching', parents_recursion_limit, self)
2126 break
2127
2128 groups.insert(0, gr)
2129 return groups
2130
2131 @property
2132 def children(self):
2133 return RepoGroup.query().filter(RepoGroup.parent_group == self)
2134
2135 @property
2136 def name(self):
2137 return self.group_name.split(RepoGroup.url_sep())[-1]
2138
2139 @property
2140 def full_path(self):
2141 return self.group_name
2142
2143 @property
2144 def full_path_splitted(self):
2145 return self.group_name.split(RepoGroup.url_sep())
2146
2147 @property
2148 def repositories(self):
2149 return Repository.query()\
2150 .filter(Repository.group == self)\
2151 .order_by(Repository.repo_name)
2152
2153 @property
2154 def repositories_recursive_count(self):
2155 cnt = self.repositories.count()
2156
2157 def children_count(group):
2158 cnt = 0
2159 for child in group.children:
2160 cnt += child.repositories.count()
2161 cnt += children_count(child)
2162 return cnt
2163
2164 return cnt + children_count(self)
2165
2166 def _recursive_objects(self, include_repos=True):
2167 all_ = []
2168
2169 def _get_members(root_gr):
2170 if include_repos:
2171 for r in root_gr.repositories:
2172 all_.append(r)
2173 children = root_gr.children.all()
2174 if children:
2175 for gr in children:
2176 all_.append(gr)
2177 _get_members(gr)
2178
2179 _get_members(self)
2180 return [self] + all_
2181
2182 def recursive_groups_and_repos(self):
2183 """
2184 Recursively returns all groups, with the repositories in those groups
2185 """
2186 return self._recursive_objects()
2187
2188 def recursive_groups(self):
2189 """
2190 Returns all child groups of this group, including children of children
2191 """
2192 return self._recursive_objects(include_repos=False)
2193
2194 def get_new_name(self, group_name):
2195 """
2196 returns new full group name based on parent and new name
2197
2198 :param group_name:
2199 """
2200 path_prefix = (self.parent_group.full_path_splitted if
2201 self.parent_group else [])
2202 return RepoGroup.url_sep().join(path_prefix + [group_name])
2203
2204 def permissions(self, with_admins=True, with_owner=True):
2205 q = UserRepoGroupToPerm.query().filter(UserRepoGroupToPerm.group == self)
2206 q = q.options(joinedload(UserRepoGroupToPerm.group),
2207 joinedload(UserRepoGroupToPerm.user),
2208 joinedload(UserRepoGroupToPerm.permission),)
2209
2210 # get owners, admins and their permissions. We re-write the
2211 # SQLAlchemy objects into plain attribute dicts because the session
2212 # keeps a single global reference per object, so changing one object
2213 # propagates to all others. Without this, if an admin is also the
2214 # owner, setting admin_row on one row would change both objects
2215 perm_rows = []
2216 for _usr in q.all():
2217 usr = AttributeDict(_usr.user.get_dict())
2218 usr.permission = _usr.permission.permission_name
2219 perm_rows.append(usr)
2220
2221 # sort the perm rows: the 'default' user first, then by
2222 # admin/write/read/none permission, and alphabetically
2223 # within each permission group
2224 perm_rows = sorted(perm_rows, key=display_sort)
2225
2226 _admin_perm = 'group.admin'
2227 owner_row = []
2228 if with_owner:
2229 usr = AttributeDict(self.user.get_dict())
2230 usr.owner_row = True
2231 usr.permission = _admin_perm
2232 owner_row.append(usr)
2233
2234 super_admin_rows = []
2235 if with_admins:
2236 for usr in User.get_all_super_admins():
2237 # if this admin is also owner, don't double the record
2238 if owner_row and usr.user_id == owner_row[0].user_id:
2239 owner_row[0].admin_row = True
2240 else:
2241 usr = AttributeDict(usr.get_dict())
2242 usr.admin_row = True
2243 usr.permission = _admin_perm
2244 super_admin_rows.append(usr)
2245
2246 return super_admin_rows + owner_row + perm_rows
2247
2248 def permission_user_groups(self):
2249 q = UserGroupRepoGroupToPerm.query().filter(UserGroupRepoGroupToPerm.group == self)
2250 q = q.options(joinedload(UserGroupRepoGroupToPerm.group),
2251 joinedload(UserGroupRepoGroupToPerm.users_group),
2252 joinedload(UserGroupRepoGroupToPerm.permission),)
2253
2254 perm_rows = []
2255 for _user_group in q.all():
2256 usr = AttributeDict(_user_group.users_group.get_dict())
2257 usr.permission = _user_group.permission.permission_name
2258 perm_rows.append(usr)
2259
2260 return perm_rows
2261
2262 def get_api_data(self):
2263 """
2264 Common function for generating api data
2265
2266 """
2267 group = self
2268 data = {
2269 'group_id': group.group_id,
2270 'group_name': group.group_name,
2271 'group_description': group.group_description,
2272 'parent_group': group.parent_group.group_name if group.parent_group else None,
2273 'repositories': [x.repo_name for x in group.repositories],
2274 'owner': group.user.username,
2275 }
2276 return data
2277
2278
2279 class Permission(Base, BaseModel):
2280 __tablename__ = 'permissions'
2281 __table_args__ = (
2282 Index('p_perm_name_idx', 'permission_name'),
2283 {'extend_existing': True, 'mysql_engine': 'InnoDB',
2284 'mysql_charset': 'utf8', 'sqlite_autoincrement': True},
2285 )
2286 PERMS = [
2287 ('hg.admin', _('RhodeCode Super Administrator')),
2288
2289 ('repository.none', _('Repository no access')),
2290 ('repository.read', _('Repository read access')),
2291 ('repository.write', _('Repository write access')),
2292 ('repository.admin', _('Repository admin access')),
2293
2294 ('group.none', _('Repository group no access')),
2295 ('group.read', _('Repository group read access')),
2296 ('group.write', _('Repository group write access')),
2297 ('group.admin', _('Repository group admin access')),
2298
2299 ('usergroup.none', _('User group no access')),
2300 ('usergroup.read', _('User group read access')),
2301 ('usergroup.write', _('User group write access')),
2302 ('usergroup.admin', _('User group admin access')),
2303
2304 ('hg.repogroup.create.false', _('Repository Group creation disabled')),
2305 ('hg.repogroup.create.true', _('Repository Group creation enabled')),
2306
2307 ('hg.usergroup.create.false', _('User Group creation disabled')),
2308 ('hg.usergroup.create.true', _('User Group creation enabled')),
2309
2310 ('hg.create.none', _('Repository creation disabled')),
2311 ('hg.create.repository', _('Repository creation enabled')),
2312 ('hg.create.write_on_repogroup.true', _('Repository creation enabled with write permission to a repository group')),
2313 ('hg.create.write_on_repogroup.false', _('Repository creation disabled with write permission to a repository group')),
2314
2315 ('hg.fork.none', _('Repository forking disabled')),
2316 ('hg.fork.repository', _('Repository forking enabled')),
2317
2318 ('hg.register.none', _('Registration disabled')),
2319 ('hg.register.manual_activate', _('User Registration with manual account activation')),
2320 ('hg.register.auto_activate', _('User Registration with automatic account activation')),
2321
2322 ('hg.extern_activate.manual', _('Manual activation of external account')),
2323 ('hg.extern_activate.auto', _('Automatic activation of external account')),
2324
2325 ('hg.inherit_default_perms.false', _('Inherit object permissions from default user disabled')),
2326 ('hg.inherit_default_perms.true', _('Inherit object permissions from default user enabled')),
2327 ]
2328
2329 # definition of system default permissions for DEFAULT user
2330 DEFAULT_USER_PERMISSIONS = [
2331 'repository.read',
2332 'group.read',
2333 'usergroup.read',
2334 'hg.create.repository',
2335 'hg.repogroup.create.false',
2336 'hg.usergroup.create.false',
2337 'hg.create.write_on_repogroup.true',
2338 'hg.fork.repository',
2339 'hg.register.manual_activate',
2340 'hg.extern_activate.auto',
2341 'hg.inherit_default_perms.true',
2342 ]
2343
2344 # Weight defines which permissions are more important;
2345 # the higher the number, the more important the
2346 # permission
2347 PERM_WEIGHTS = {
2348 'repository.none': 0,
2349 'repository.read': 1,
2350 'repository.write': 3,
2351 'repository.admin': 4,
2352
2353 'group.none': 0,
2354 'group.read': 1,
2355 'group.write': 3,
2356 'group.admin': 4,
2357
2358 'usergroup.none': 0,
2359 'usergroup.read': 1,
2360 'usergroup.write': 3,
2361 'usergroup.admin': 4,
2362
2363 'hg.repogroup.create.false': 0,
2364 'hg.repogroup.create.true': 1,
2365
2366 'hg.usergroup.create.false': 0,
2367 'hg.usergroup.create.true': 1,
2368
2369 'hg.fork.none': 0,
2370 'hg.fork.repository': 1,
2371 'hg.create.none': 0,
2372 'hg.create.repository': 1
2373 }
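# Example of resolving competing permissions by weight; `max` with
# PERM_WEIGHTS.get picks the more important one:
#
#   max(['repository.read', 'repository.write'],
#       key=PERM_WEIGHTS.get)    # -> 'repository.write' (weight 3 > 1)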
2374
2375 permission_id = Column("permission_id", Integer(), nullable=False, unique=True, default=None, primary_key=True)
2376 permission_name = Column("permission_name", String(255), nullable=True, unique=None, default=None)
2377 permission_longname = Column("permission_longname", String(255), nullable=True, unique=None, default=None)
2378
2379 def __unicode__(self):
2380 return u"<%s('%s:%s')>" % (
2381 self.__class__.__name__, self.permission_id, self.permission_name
2382 )
2383
2384 @classmethod
2385 def get_by_key(cls, key):
2386 return cls.query().filter(cls.permission_name == key).scalar()
2387
2388 @classmethod
2389 def get_default_repo_perms(cls, user_id, repo_id=None):
2390 q = Session().query(UserRepoToPerm, Repository, Permission)\
2391 .join((Permission, UserRepoToPerm.permission_id == Permission.permission_id))\
2392 .join((Repository, UserRepoToPerm.repository_id == Repository.repo_id))\
2393 .filter(UserRepoToPerm.user_id == user_id)
2394 if repo_id:
2395 q = q.filter(UserRepoToPerm.repository_id == repo_id)
2396 return q.all()
2397
2398 @classmethod
2399 def get_default_repo_perms_from_user_group(cls, user_id, repo_id=None):
2400 q = Session().query(UserGroupRepoToPerm, Repository, Permission)\
2401 .join(
2402 Permission,
2403 UserGroupRepoToPerm.permission_id == Permission.permission_id)\
2404 .join(
2405 Repository,
2406 UserGroupRepoToPerm.repository_id == Repository.repo_id)\
2407 .join(
2408 UserGroup,
2409 UserGroupRepoToPerm.users_group_id ==
2410 UserGroup.users_group_id)\
2411 .join(
2412 UserGroupMember,
2413 UserGroupRepoToPerm.users_group_id ==
2414 UserGroupMember.users_group_id)\
2415 .filter(
2416 UserGroupMember.user_id == user_id,
2417 UserGroup.users_group_active == true())
2418 if repo_id:
2419 q = q.filter(UserGroupRepoToPerm.repository_id == repo_id)
2420 return q.all()
2421
2422 @classmethod
2423 def get_default_group_perms(cls, user_id, repo_group_id=None):
2424 q = Session().query(UserRepoGroupToPerm, RepoGroup, Permission)\
2425 .join((Permission, UserRepoGroupToPerm.permission_id == Permission.permission_id))\
2426 .join((RepoGroup, UserRepoGroupToPerm.group_id == RepoGroup.group_id))\
2427 .filter(UserRepoGroupToPerm.user_id == user_id)
2428 if repo_group_id:
2429 q = q.filter(UserRepoGroupToPerm.group_id == repo_group_id)
2430 return q.all()
2431
2432 @classmethod
2433 def get_default_group_perms_from_user_group(
2434 cls, user_id, repo_group_id=None):
2435 q = Session().query(UserGroupRepoGroupToPerm, RepoGroup, Permission)\
2436 .join(
2437 Permission,
2438 UserGroupRepoGroupToPerm.permission_id ==
2439 Permission.permission_id)\
2440 .join(
2441 RepoGroup,
2442 UserGroupRepoGroupToPerm.group_id == RepoGroup.group_id)\
2443 .join(
2444 UserGroup,
2445 UserGroupRepoGroupToPerm.users_group_id ==
2446 UserGroup.users_group_id)\
2447 .join(
2448 UserGroupMember,
2449 UserGroupRepoGroupToPerm.users_group_id ==
2450 UserGroupMember.users_group_id)\
2451 .filter(
2452 UserGroupMember.user_id == user_id,
2453 UserGroup.users_group_active == true())
2454 if repo_group_id:
2455 q = q.filter(UserGroupRepoGroupToPerm.group_id == repo_group_id)
2456 return q.all()
2457
2458 @classmethod
2459 def get_default_user_group_perms(cls, user_id, user_group_id=None):
2460 q = Session().query(UserUserGroupToPerm, UserGroup, Permission)\
2461 .join((Permission, UserUserGroupToPerm.permission_id == Permission.permission_id))\
2462 .join((UserGroup, UserUserGroupToPerm.user_group_id == UserGroup.users_group_id))\
2463 .filter(UserUserGroupToPerm.user_id == user_id)
2464 if user_group_id:
2465 q = q.filter(UserUserGroupToPerm.user_group_id == user_group_id)
2466 return q.all()
2467
2468 @classmethod
2469 def get_default_user_group_perms_from_user_group(
2470 cls, user_id, user_group_id=None):
2471 TargetUserGroup = aliased(UserGroup, name='target_user_group')
2472 q = Session().query(UserGroupUserGroupToPerm, UserGroup, Permission)\
2473 .join(
2474 Permission,
2475 UserGroupUserGroupToPerm.permission_id ==
2476 Permission.permission_id)\
2477 .join(
2478 TargetUserGroup,
2479 UserGroupUserGroupToPerm.target_user_group_id ==
2480 TargetUserGroup.users_group_id)\
2481 .join(
2482 UserGroup,
2483 UserGroupUserGroupToPerm.user_group_id ==
2484 UserGroup.users_group_id)\
2485 .join(
2486 UserGroupMember,
2487 UserGroupUserGroupToPerm.user_group_id ==
2488 UserGroupMember.users_group_id)\
2489 .filter(
2490 UserGroupMember.user_id == user_id,
2491 UserGroup.users_group_active == true())
2492 if user_group_id:
2493 q = q.filter(
2494 UserGroupUserGroupToPerm.user_group_id == user_group_id)
2495
2496 return q.all()
2497
2498
2499 class UserRepoToPerm(Base, BaseModel):
2500 __tablename__ = 'repo_to_perm'
2501 __table_args__ = (
2502 UniqueConstraint('user_id', 'repository_id', 'permission_id'),
2503 {'extend_existing': True, 'mysql_engine': 'InnoDB',
2504 'mysql_charset': 'utf8', 'sqlite_autoincrement': True}
2505 )
2506 repo_to_perm_id = Column("repo_to_perm_id", Integer(), nullable=False, unique=True, default=None, primary_key=True)
2507 user_id = Column("user_id", Integer(), ForeignKey('users.user_id'), nullable=False, unique=None, default=None)
2508 permission_id = Column("permission_id", Integer(), ForeignKey('permissions.permission_id'), nullable=False, unique=None, default=None)
2509 repository_id = Column("repository_id", Integer(), ForeignKey('repositories.repo_id'), nullable=False, unique=None, default=None)
2510
2511 user = relationship('User')
2512 repository = relationship('Repository')
2513 permission = relationship('Permission')
2514
2515 @classmethod
2516 def create(cls, user, repository, permission):
2517 n = cls()
2518 n.user = user
2519 n.repository = repository
2520 n.permission = permission
2521 Session().add(n)
2522 return n
2523
2524 def __unicode__(self):
2525 return u'<%s => %s >' % (self.user, self.repository)
2526
2527
2528 class UserUserGroupToPerm(Base, BaseModel):
2529 __tablename__ = 'user_user_group_to_perm'
2530 __table_args__ = (
2531 UniqueConstraint('user_id', 'user_group_id', 'permission_id'),
2532 {'extend_existing': True, 'mysql_engine': 'InnoDB',
2533 'mysql_charset': 'utf8', 'sqlite_autoincrement': True}
2534 )
2535 user_user_group_to_perm_id = Column("user_user_group_to_perm_id", Integer(), nullable=False, unique=True, default=None, primary_key=True)
2536 user_id = Column("user_id", Integer(), ForeignKey('users.user_id'), nullable=False, unique=None, default=None)
2537 permission_id = Column("permission_id", Integer(), ForeignKey('permissions.permission_id'), nullable=False, unique=None, default=None)
2538 user_group_id = Column("user_group_id", Integer(), ForeignKey('users_groups.users_group_id'), nullable=False, unique=None, default=None)
2539
2540 user = relationship('User')
2541 user_group = relationship('UserGroup')
2542 permission = relationship('Permission')
2543
2544 @classmethod
2545 def create(cls, user, user_group, permission):
2546 n = cls()
2547 n.user = user
2548 n.user_group = user_group
2549 n.permission = permission
2550 Session().add(n)
2551 return n
2552
2553 def __unicode__(self):
2554 return u'<%s => %s >' % (self.user, self.user_group)
2555
2556
2557 class UserToPerm(Base, BaseModel):
2558 __tablename__ = 'user_to_perm'
2559 __table_args__ = (
2560 UniqueConstraint('user_id', 'permission_id'),
2561 {'extend_existing': True, 'mysql_engine': 'InnoDB',
2562 'mysql_charset': 'utf8', 'sqlite_autoincrement': True}
2563 )
2564 user_to_perm_id = Column("user_to_perm_id", Integer(), nullable=False, unique=True, default=None, primary_key=True)
2565 user_id = Column("user_id", Integer(), ForeignKey('users.user_id'), nullable=False, unique=None, default=None)
2566 permission_id = Column("permission_id", Integer(), ForeignKey('permissions.permission_id'), nullable=False, unique=None, default=None)
2567
2568 user = relationship('User')
2569 permission = relationship('Permission', lazy='joined')
2570
2571 def __unicode__(self):
2572 return u'<%s => %s >' % (self.user, self.permission)
2573
2574
2575 class UserGroupRepoToPerm(Base, BaseModel):
2576 __tablename__ = 'users_group_repo_to_perm'
2577 __table_args__ = (
2578 UniqueConstraint('repository_id', 'users_group_id', 'permission_id'),
2579 {'extend_existing': True, 'mysql_engine': 'InnoDB',
2580 'mysql_charset': 'utf8', 'sqlite_autoincrement': True}
2581 )
2582 users_group_to_perm_id = Column("users_group_to_perm_id", Integer(), nullable=False, unique=True, default=None, primary_key=True)
2583 users_group_id = Column("users_group_id", Integer(), ForeignKey('users_groups.users_group_id'), nullable=False, unique=None, default=None)
2584 permission_id = Column("permission_id", Integer(), ForeignKey('permissions.permission_id'), nullable=False, unique=None, default=None)
2585 repository_id = Column("repository_id", Integer(), ForeignKey('repositories.repo_id'), nullable=False, unique=None, default=None)
2586
2587 users_group = relationship('UserGroup')
2588 permission = relationship('Permission')
2589 repository = relationship('Repository')
2590
2591 @classmethod
2592 def create(cls, users_group, repository, permission):
2593 n = cls()
2594 n.users_group = users_group
2595 n.repository = repository
2596 n.permission = permission
2597 Session().add(n)
2598 return n
2599
2600 def __unicode__(self):
2601 return u'<UserGroupRepoToPerm:%s => %s >' % (self.users_group, self.repository)
2602
2603
2604 class UserGroupUserGroupToPerm(Base, BaseModel):
2605 __tablename__ = 'user_group_user_group_to_perm'
2606 __table_args__ = (
2607 UniqueConstraint('target_user_group_id', 'user_group_id', 'permission_id'),
2608 CheckConstraint('target_user_group_id != user_group_id'),
2609 {'extend_existing': True, 'mysql_engine': 'InnoDB',
2610 'mysql_charset': 'utf8', 'sqlite_autoincrement': True}
2611 )
2612 user_group_user_group_to_perm_id = Column("user_group_user_group_to_perm_id", Integer(), nullable=False, unique=True, default=None, primary_key=True)
2613 target_user_group_id = Column("target_user_group_id", Integer(), ForeignKey('users_groups.users_group_id'), nullable=False, unique=None, default=None)
2614 permission_id = Column("permission_id", Integer(), ForeignKey('permissions.permission_id'), nullable=False, unique=None, default=None)
2615 user_group_id = Column("user_group_id", Integer(), ForeignKey('users_groups.users_group_id'), nullable=False, unique=None, default=None)
2616
2617 target_user_group = relationship('UserGroup', primaryjoin='UserGroupUserGroupToPerm.target_user_group_id==UserGroup.users_group_id')
2618 user_group = relationship('UserGroup', primaryjoin='UserGroupUserGroupToPerm.user_group_id==UserGroup.users_group_id')
2619 permission = relationship('Permission')
2620
2621 @classmethod
2622 def create(cls, target_user_group, user_group, permission):
2623 n = cls()
2624 n.target_user_group = target_user_group
2625 n.user_group = user_group
2626 n.permission = permission
2627 Session().add(n)
2628 return n
2629
2630 def __unicode__(self):
2631 return u'<UserGroupUserGroupToPerm:%s => %s >' % (self.target_user_group, self.user_group)
2632
2633
2634 class UserGroupToPerm(Base, BaseModel):
2635 __tablename__ = 'users_group_to_perm'
2636 __table_args__ = (
2637 UniqueConstraint('users_group_id', 'permission_id',),
2638 {'extend_existing': True, 'mysql_engine': 'InnoDB',
2639 'mysql_charset': 'utf8', 'sqlite_autoincrement': True}
2640 )
2641 users_group_to_perm_id = Column("users_group_to_perm_id", Integer(), nullable=False, unique=True, default=None, primary_key=True)
2642 users_group_id = Column("users_group_id", Integer(), ForeignKey('users_groups.users_group_id'), nullable=False, unique=None, default=None)
2643 permission_id = Column("permission_id", Integer(), ForeignKey('permissions.permission_id'), nullable=False, unique=None, default=None)
2644
2645 users_group = relationship('UserGroup')
2646 permission = relationship('Permission')
2647
2648
2649 class UserRepoGroupToPerm(Base, BaseModel):
2650 __tablename__ = 'user_repo_group_to_perm'
2651 __table_args__ = (
2652 UniqueConstraint('user_id', 'group_id', 'permission_id'),
2653 {'extend_existing': True, 'mysql_engine': 'InnoDB',
2654 'mysql_charset': 'utf8', 'sqlite_autoincrement': True}
2655 )
2656
2657 group_to_perm_id = Column("group_to_perm_id", Integer(), nullable=False, unique=True, default=None, primary_key=True)
2658 user_id = Column("user_id", Integer(), ForeignKey('users.user_id'), nullable=False, unique=None, default=None)
2659 group_id = Column("group_id", Integer(), ForeignKey('groups.group_id'), nullable=False, unique=None, default=None)
2660 permission_id = Column("permission_id", Integer(), ForeignKey('permissions.permission_id'), nullable=False, unique=None, default=None)
2661
2662 user = relationship('User')
2663 group = relationship('RepoGroup')
2664 permission = relationship('Permission')
2665
2666 @classmethod
2667 def create(cls, user, repository_group, permission):
2668 n = cls()
2669 n.user = user
2670 n.group = repository_group
2671 n.permission = permission
2672 Session().add(n)
2673 return n
2674
2675
2676 class UserGroupRepoGroupToPerm(Base, BaseModel):
2677 __tablename__ = 'users_group_repo_group_to_perm'
2678 __table_args__ = (
2679 UniqueConstraint('users_group_id', 'group_id'),
2680 {'extend_existing': True, 'mysql_engine': 'InnoDB',
2681 'mysql_charset': 'utf8', 'sqlite_autoincrement': True}
2682 )
2683
2684 users_group_repo_group_to_perm_id = Column("users_group_repo_group_to_perm_id", Integer(), nullable=False, unique=True, default=None, primary_key=True)
2685 users_group_id = Column("users_group_id", Integer(), ForeignKey('users_groups.users_group_id'), nullable=False, unique=None, default=None)
2686 group_id = Column("group_id", Integer(), ForeignKey('groups.group_id'), nullable=False, unique=None, default=None)
2687 permission_id = Column("permission_id", Integer(), ForeignKey('permissions.permission_id'), nullable=False, unique=None, default=None)
2688
2689 users_group = relationship('UserGroup')
2690 permission = relationship('Permission')
2691 group = relationship('RepoGroup')
2692
2693 @classmethod
2694 def create(cls, user_group, repository_group, permission):
2695 n = cls()
2696 n.users_group = user_group
2697 n.group = repository_group
2698 n.permission = permission
2699 Session().add(n)
2700 return n
2701
2702 def __unicode__(self):
2703 return u'<UserGroupRepoGroupToPerm:%s => %s >' % (self.users_group, self.group)
2704
2705
2706 class Statistics(Base, BaseModel):
2707 __tablename__ = 'statistics'
2708 __table_args__ = (
2709 UniqueConstraint('repository_id'),
2710 {'extend_existing': True, 'mysql_engine': 'InnoDB',
2711 'mysql_charset': 'utf8', 'sqlite_autoincrement': True}
2712 )
2713 stat_id = Column("stat_id", Integer(), nullable=False, unique=True, default=None, primary_key=True)
2714 repository_id = Column("repository_id", Integer(), ForeignKey('repositories.repo_id'), nullable=False, unique=True, default=None)
2715 stat_on_revision = Column("stat_on_revision", Integer(), nullable=False)
2716 commit_activity = Column("commit_activity", LargeBinary(1000000), nullable=False)  # JSON data
2717 commit_activity_combined = Column("commit_activity_combined", LargeBinary(), nullable=False)  # JSON data
2718 languages = Column("languages", LargeBinary(1000000), nullable=False)  # JSON data
2719
2720 repository = relationship('Repository', single_parent=True)
2721
2722
2723 class UserFollowing(Base, BaseModel):
2724 __tablename__ = 'user_followings'
2725 __table_args__ = (
2726 UniqueConstraint('user_id', 'follows_repository_id'),
2727 UniqueConstraint('user_id', 'follows_user_id'),
2728 {'extend_existing': True, 'mysql_engine': 'InnoDB',
2729 'mysql_charset': 'utf8', 'sqlite_autoincrement': True}
2730 )
2731
2732 user_following_id = Column("user_following_id", Integer(), nullable=False, unique=True, default=None, primary_key=True)
2733 user_id = Column("user_id", Integer(), ForeignKey('users.user_id'), nullable=False, unique=None, default=None)
2734 follows_repo_id = Column("follows_repository_id", Integer(), ForeignKey('repositories.repo_id'), nullable=True, unique=None, default=None)
2735 follows_user_id = Column("follows_user_id", Integer(), ForeignKey('users.user_id'), nullable=True, unique=None, default=None)
2736 follows_from = Column('follows_from', DateTime(timezone=False), nullable=True, unique=None, default=datetime.datetime.now)
2737
2738 user = relationship('User', primaryjoin='User.user_id==UserFollowing.user_id')
2739
2740 follows_user = relationship('User', primaryjoin='User.user_id==UserFollowing.follows_user_id')
2741 follows_repository = relationship('Repository', order_by='Repository.repo_name')
2742
2743 @classmethod
2744 def get_repo_followers(cls, repo_id):
2745 return cls.query().filter(cls.follows_repo_id == repo_id)
2746
2747
2748 class CacheKey(Base, BaseModel):
2749 __tablename__ = 'cache_invalidation'
2750 __table_args__ = (
2751 UniqueConstraint('cache_key'),
2752 Index('key_idx', 'cache_key'),
2753 {'extend_existing': True, 'mysql_engine': 'InnoDB',
2754 'mysql_charset': 'utf8', 'sqlite_autoincrement': True},
2755 )
2756 CACHE_TYPE_ATOM = 'ATOM'
2757 CACHE_TYPE_RSS = 'RSS'
2758 CACHE_TYPE_README = 'README'
2759
2760 cache_id = Column("cache_id", Integer(), nullable=False, unique=True, default=None, primary_key=True)
2761 cache_key = Column("cache_key", String(255), nullable=True, unique=None, default=None)
2762 cache_args = Column("cache_args", String(255), nullable=True, unique=None, default=None)
2763 cache_active = Column("cache_active", Boolean(), nullable=True, unique=None, default=False)
2764
2765 def __init__(self, cache_key, cache_args=''):
2766 self.cache_key = cache_key
2767 self.cache_args = cache_args
2768 self.cache_active = False
2769
2770 def __unicode__(self):
2771 return u"<%s('%s:%s[%s]')>" % (
2772 self.__class__.__name__,
2773 self.cache_id, self.cache_key, self.cache_active)
2774
2775 def _cache_key_partition(self):
2776 prefix, repo_name, suffix = self.cache_key.partition(self.cache_args)
2777 return prefix, repo_name, suffix
2778
2779 def get_prefix(self):
2780 """
2781 Try to extract the prefix from an existing cache key. The key
2782 may consist of prefix, repo_name and suffix.
2783 """
2784 # this returns prefix, repo_name, suffix
2785 return self._cache_key_partition()[0]
2786
2787 def get_suffix(self):
2788 """
2789 Get the suffix that might have been used in ``get_cache_key`` to
2790 generate ``self.cache_key``. Only used for informational purposes
2791 in repo_edit.html.
2792 """
2793 # prefix, repo_name, suffix
2794 return self._cache_key_partition()[2]
2795
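The partition helpers above can be sketched standalone. This is a hedged example (not the model code itself): ``str.partition`` with ``cache_args`` (the repo name) as separator yields the ``(prefix, repo_name, suffix)`` triple; the key value here is a hypothetical one.

```python
# Hedged sketch of _cache_key_partition: str.partition splits the cache
# key around cache_args (the repo name), yielding prefix/name/suffix.
key = u'host-1_repo1_ATOM'
cache_args = u'repo1'
prefix, repo_name, suffix = key.partition(cache_args)
assert (prefix, repo_name, suffix) == (u'host-1_', u'repo1', u'_ATOM')
```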
2796 @classmethod
2797 def delete_all_cache(cls):
2798 """
2799 Delete all cache keys from database.
2800 Should only be run when all instances are down and all entries
2801 thus stale.
2802 """
2803 cls.query().delete()
2804 Session().commit()
2805
2806 @classmethod
2807 def get_cache_key(cls, repo_name, cache_type):
2808 """
2809 Generate a cache key for this process of a RhodeCode instance.
2810 The prefix will most likely be the process id, or an explicitly
2811 set ``instance_id`` from the .ini file.
2812 """
2814 import rhodecode
2815 prefix = safe_unicode(rhodecode.CONFIG.get('instance_id') or '')
2816
2817 repo_as_unicode = safe_unicode(repo_name)
2818 key = u'{}_{}'.format(repo_as_unicode, cache_type) \
2819 if cache_type else repo_as_unicode
2820
2821 return u'{}{}'.format(prefix, key)
2822
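The key layout produced by ``get_cache_key`` can be shown with a standalone sketch; ``make_cache_key`` is a hypothetical helper, and the real method reads the prefix from ``rhodecode.CONFIG['instance_id']``.

```python
# Hypothetical standalone version of get_cache_key's key construction:
# '<instance_id><repo_name>_<cache_type>', or just the repo name when
# no cache type is given.
def make_cache_key(repo_name, cache_type, instance_id=u''):
    key = u'{}_{}'.format(repo_name, cache_type) if cache_type else repo_name
    return u'{}{}'.format(instance_id, key)

assert make_cache_key(u'repo1', u'ATOM', u'host-1_') == u'host-1_repo1_ATOM'
assert make_cache_key(u'repo1', None) == u'repo1'
```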
2823 @classmethod
2824 def set_invalidate(cls, repo_name, delete=False):
2825 """
2826 Mark all caches of a repo as invalid in the database.
2827 """
2828
2829 try:
2830 qry = Session().query(cls).filter(cls.cache_args == repo_name)
2831 if delete:
2832 log.debug('cache objects deleted for repo %s',
2833 safe_str(repo_name))
2834 qry.delete()
2835 else:
2836 log.debug('cache objects marked as invalid for repo %s',
2837 safe_str(repo_name))
2838 qry.update({"cache_active": False})
2839
2840 Session().commit()
2841 except Exception:
2842 log.exception(
2843 'Cache key invalidation failed for repository %s',
2844 safe_str(repo_name))
2845 Session().rollback()
2846
2847 @classmethod
2848 def get_active_cache(cls, cache_key):
2849 inv_obj = cls.query().filter(cls.cache_key == cache_key).scalar()
2850 if inv_obj:
2851 return inv_obj
2852 return None
2853
2854 @classmethod
2855 def repo_context_cache(cls, compute_func, repo_name, cache_type,
2856 thread_scoped=False):
2857 """
2858 @cache_region('long_term')
2859 def _heavy_calculation(cache_key):
2860 return 'result'
2861
2862 cache_context = CacheKey.repo_context_cache(
2863 _heavy_calculation, repo_name, cache_type)
2864
2865 with cache_context as context:
2866 context.invalidate()
2867 computed = context.compute()
2868
2869 assert computed == 'result'
2870 """
2871 from rhodecode.lib import caches
2872 return caches.InvalidationContext(
2873 compute_func, repo_name, cache_type, thread_scoped=thread_scoped)
2874
2875
2876 class ChangesetComment(Base, BaseModel):
2877 __tablename__ = 'changeset_comments'
2878 __table_args__ = (
2879 Index('cc_revision_idx', 'revision'),
2880 {'extend_existing': True, 'mysql_engine': 'InnoDB',
2881 'mysql_charset': 'utf8', 'sqlite_autoincrement': True},
2882 )
2883
2884 COMMENT_OUTDATED = u'comment_outdated'
2885
2886 comment_id = Column('comment_id', Integer(), nullable=False, primary_key=True)
2887 repo_id = Column('repo_id', Integer(), ForeignKey('repositories.repo_id'), nullable=False)
2888 revision = Column('revision', String(40), nullable=True)
2889 pull_request_id = Column("pull_request_id", Integer(), ForeignKey('pull_requests.pull_request_id'), nullable=True)
2890 pull_request_version_id = Column("pull_request_version_id", Integer(), ForeignKey('pull_request_versions.pull_request_version_id'), nullable=True)
2891 line_no = Column('line_no', Unicode(10), nullable=True)
2892 hl_lines = Column('hl_lines', Unicode(512), nullable=True)
2893 f_path = Column('f_path', Unicode(1000), nullable=True)
2894 user_id = Column('user_id', Integer(), ForeignKey('users.user_id'), nullable=False)
2895 text = Column('text', UnicodeText().with_variant(UnicodeText(25000), 'mysql'), nullable=False)
2896 created_on = Column('created_on', DateTime(timezone=False), nullable=False, default=datetime.datetime.now)
2897 modified_at = Column('modified_at', DateTime(timezone=False), nullable=False, default=datetime.datetime.now)
2898 renderer = Column('renderer', Unicode(64), nullable=True)
2899 display_state = Column('display_state', Unicode(128), nullable=True)
2900
2901 author = relationship('User', lazy='joined')
2902 repo = relationship('Repository')
2903 status_change = relationship('ChangesetStatus', cascade="all, delete, delete-orphan")
2904 pull_request = relationship('PullRequest', lazy='joined')
2905 pull_request_version = relationship('PullRequestVersion')
2906
2907 @classmethod
2908 def get_users(cls, revision=None, pull_request_id=None):
2909 """
2910 Returns the users associated with this ChangesetComment,
2911 i.e. those who actually commented.
2912 
2913 :param revision:
2914 :param pull_request_id:
2915 """
2916 q = Session().query(User)\
2917 .join(ChangesetComment.author)
2918 if revision:
2919 q = q.filter(cls.revision == revision)
2920 elif pull_request_id:
2921 q = q.filter(cls.pull_request_id == pull_request_id)
2922 return q.all()
2923
2924 def render(self, mentions=False):
2925 from rhodecode.lib import helpers as h
2926 return h.render(self.text, renderer=self.renderer, mentions=mentions)
2927
2928 def __repr__(self):
2929 if self.comment_id:
2930 return '<DB:ChangesetComment #%s>' % self.comment_id
2931 else:
2932 return '<DB:ChangesetComment at %#x>' % id(self)
2933
2934
2935 class ChangesetStatus(Base, BaseModel):
2936 __tablename__ = 'changeset_statuses'
2937 __table_args__ = (
2938 Index('cs_revision_idx', 'revision'),
2939 Index('cs_version_idx', 'version'),
2940 UniqueConstraint('repo_id', 'revision', 'version'),
2941 {'extend_existing': True, 'mysql_engine': 'InnoDB',
2942 'mysql_charset': 'utf8', 'sqlite_autoincrement': True}
2943 )
2944 STATUS_NOT_REVIEWED = DEFAULT = 'not_reviewed'
2945 STATUS_APPROVED = 'approved'
2946 STATUS_REJECTED = 'rejected'
2947 STATUS_UNDER_REVIEW = 'under_review'
2948
2949 STATUSES = [
2950 (STATUS_NOT_REVIEWED, _("Not Reviewed")), # (no icon) and default
2951 (STATUS_APPROVED, _("Approved")),
2952 (STATUS_REJECTED, _("Rejected")),
2953 (STATUS_UNDER_REVIEW, _("Under Review")),
2954 ]
2955
2956 changeset_status_id = Column('changeset_status_id', Integer(), nullable=False, primary_key=True)
2957 repo_id = Column('repo_id', Integer(), ForeignKey('repositories.repo_id'), nullable=False)
2958 user_id = Column("user_id", Integer(), ForeignKey('users.user_id'), nullable=False, unique=None)
2959 revision = Column('revision', String(40), nullable=False)
2960 status = Column('status', String(128), nullable=False, default=DEFAULT)
2961 changeset_comment_id = Column('changeset_comment_id', Integer(), ForeignKey('changeset_comments.comment_id'))
2962 modified_at = Column('modified_at', DateTime(), nullable=False, default=datetime.datetime.now)
2963 version = Column('version', Integer(), nullable=False, default=0)
2964 pull_request_id = Column("pull_request_id", Integer(), ForeignKey('pull_requests.pull_request_id'), nullable=True)
2965
2966 author = relationship('User', lazy='joined')
2967 repo = relationship('Repository')
2968 comment = relationship('ChangesetComment', lazy='joined')
2969 pull_request = relationship('PullRequest', lazy='joined')
2970
2971 def __unicode__(self):
2972 return u"<%s('%s[%s]:%s')>" % (
2973 self.__class__.__name__,
2974 self.status, self.version, self.author
2975 )
2976
2977 @classmethod
2978 def get_status_lbl(cls, value):
2979 return dict(cls.STATUSES).get(value)
2980
2981 @property
2982 def status_lbl(self):
2983 return ChangesetStatus.get_status_lbl(self.status)
2984
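The status-label lookup above is a plain dict conversion of the ``STATUSES`` pairs. A minimal sketch (translation markers ``_()`` dropped for brevity):

```python
# Sketch of get_status_lbl: the (value, label) pairs become a dict;
# unknown status values resolve to None.
STATUSES = [
    ('not_reviewed', 'Not Reviewed'),
    ('approved', 'Approved'),
    ('rejected', 'Rejected'),
    ('under_review', 'Under Review'),
]
assert dict(STATUSES).get('approved') == 'Approved'
assert dict(STATUSES).get('bogus') is None
```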
2985
2986 class _PullRequestBase(BaseModel):
2987 """
2988 Common attributes of pull request and version entries.
2989 """
2990
2991 # .status values
2992 STATUS_NEW = u'new'
2993 STATUS_OPEN = u'open'
2994 STATUS_CLOSED = u'closed'
2995
2996 title = Column('title', Unicode(255), nullable=True)
2997 description = Column(
2998 'description', UnicodeText().with_variant(UnicodeText(10240), 'mysql'),
2999 nullable=True)
3000 # new/open/closed status of pull request (not approve/reject/etc)
3001 status = Column('status', Unicode(255), nullable=False, default=STATUS_NEW)
3002 created_on = Column(
3003 'created_on', DateTime(timezone=False), nullable=False,
3004 default=datetime.datetime.now)
3005 updated_on = Column(
3006 'updated_on', DateTime(timezone=False), nullable=False,
3007 default=datetime.datetime.now)
3008
3009 @declared_attr
3010 def user_id(cls):
3011 return Column(
3012 "user_id", Integer(), ForeignKey('users.user_id'), nullable=False,
3013 unique=None)
3014
3015 # 500 revisions max
3016 _revisions = Column(
3017 'revisions', UnicodeText().with_variant(UnicodeText(20500), 'mysql'))
3018
3019 @declared_attr
3020 def source_repo_id(cls):
3021 # TODO: dan: rename column to source_repo_id
3022 return Column(
3023 'org_repo_id', Integer(), ForeignKey('repositories.repo_id'),
3024 nullable=False)
3025
3026 source_ref = Column('org_ref', Unicode(255), nullable=False)
3027
3028 @declared_attr
3029 def target_repo_id(cls):
3030 # TODO: dan: rename column to target_repo_id
3031 return Column(
3032 'other_repo_id', Integer(), ForeignKey('repositories.repo_id'),
3033 nullable=False)
3034
3035 target_ref = Column('other_ref', Unicode(255), nullable=False)
3036
3037 # TODO: dan: rename column to last_merge_source_rev
3038 _last_merge_source_rev = Column(
3039 'last_merge_org_rev', String(40), nullable=True)
3040 # TODO: dan: rename column to last_merge_target_rev
3041 _last_merge_target_rev = Column(
3042 'last_merge_other_rev', String(40), nullable=True)
3043 _last_merge_status = Column('merge_status', Integer(), nullable=True)
3044 merge_rev = Column('merge_rev', String(40), nullable=True)
3045
3046 @hybrid_property
3047 def revisions(self):
3048 return self._revisions.split(':') if self._revisions else []
3049
3050 @revisions.setter
3051 def revisions(self, val):
3052 self._revisions = ':'.join(val)
3053
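The ``revisions`` hybrid property stores commit ids colon-joined in a single text column (capped around 500 revisions per the comment above). The round trip is simply:

```python
# Round trip of the revisions storage format: colon-joined text column,
# split back into a list on read; an empty column reads as [].
revisions = ['abc123', 'def456']
stored = ':'.join(revisions)
assert stored == 'abc123:def456'
assert stored.split(':') == revisions

empty = ''
assert (empty.split(':') if empty else []) == []
```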
3054 @declared_attr
3055 def author(cls):
3056 return relationship('User', lazy='joined')
3057
3058 @declared_attr
3059 def source_repo(cls):
3060 return relationship(
3061 'Repository',
3062 primaryjoin='%s.source_repo_id==Repository.repo_id' % cls.__name__)
3063
3064 @property
3065 def source_ref_parts(self):
3066 refs = self.source_ref.split(':')
3067 return Reference(refs[0], refs[1], refs[2])
3068
3069 @declared_attr
3070 def target_repo(cls):
3071 return relationship(
3072 'Repository',
3073 primaryjoin='%s.target_repo_id==Repository.repo_id' % cls.__name__)
3074
3075 @property
3076 def target_ref_parts(self):
3077 refs = self.target_ref.split(':')
3078 return Reference(refs[0], refs[1], refs[2])
3079
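The ``*_ref_parts`` properties unpack the ``org_ref``/``other_ref`` strings by splitting on ``:``. In this sketch ``Reference`` is assumed to be a ``(type, name, commit_id)`` namedtuple, and the ref string is a hypothetical value:

```python
from collections import namedtuple

# Assumed shape of the Reference tuple used by source_ref_parts and
# target_ref_parts; the stored ref string is 'type:name:commit_id'.
Reference = namedtuple('Reference', ['type', 'name', 'commit_id'])

refs = u'branch:default:1d2e3f4a'.split(':')
ref = Reference(refs[0], refs[1], refs[2])
assert ref.type == u'branch'
assert ref.name == u'default'
assert ref.commit_id == u'1d2e3f4a'
```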
3080
3081 class PullRequest(Base, _PullRequestBase):
3082 __tablename__ = 'pull_requests'
3083 __table_args__ = (
3084 {'extend_existing': True, 'mysql_engine': 'InnoDB',
3085 'mysql_charset': 'utf8', 'sqlite_autoincrement': True},
3086 )
3087
3088 pull_request_id = Column(
3089 'pull_request_id', Integer(), nullable=False, primary_key=True)
3090
3091 def __repr__(self):
3092 if self.pull_request_id:
3093 return '<DB:PullRequest #%s>' % self.pull_request_id
3094 else:
3095 return '<DB:PullRequest at %#x>' % id(self)
3096
3097 reviewers = relationship('PullRequestReviewers',
3098 cascade="all, delete, delete-orphan")
3099 statuses = relationship('ChangesetStatus')
3100 comments = relationship('ChangesetComment',
3101 cascade="all, delete, delete-orphan")
3102 versions = relationship('PullRequestVersion',
3103 cascade="all, delete, delete-orphan")
3104
3105 def is_closed(self):
3106 return self.status == self.STATUS_CLOSED
3107
3108 def get_api_data(self):
3109 from rhodecode.model.pull_request import PullRequestModel
3110 pull_request = self
3111 merge_status = PullRequestModel().merge_status(pull_request)
3112 data = {
3113 'pull_request_id': pull_request.pull_request_id,
3114 'url': url('pullrequest_show', repo_name=self.target_repo.repo_name,
3115 pull_request_id=self.pull_request_id,
3116 qualified=True),
3117 'title': pull_request.title,
3118 'description': pull_request.description,
3119 'status': pull_request.status,
3120 'created_on': pull_request.created_on,
3121 'updated_on': pull_request.updated_on,
3122 'commit_ids': pull_request.revisions,
3123 'review_status': pull_request.calculated_review_status(),
3124 'mergeable': {
3125 'status': merge_status[0],
3126 'message': unicode(merge_status[1]),
3127 },
3128 'source': {
3129 'clone_url': pull_request.source_repo.clone_url(),
3130 'repository': pull_request.source_repo.repo_name,
3131 'reference': {
3132 'name': pull_request.source_ref_parts.name,
3133 'type': pull_request.source_ref_parts.type,
3134 'commit_id': pull_request.source_ref_parts.commit_id,
3135 },
3136 },
3137 'target': {
3138 'clone_url': pull_request.target_repo.clone_url(),
3139 'repository': pull_request.target_repo.repo_name,
3140 'reference': {
3141 'name': pull_request.target_ref_parts.name,
3142 'type': pull_request.target_ref_parts.type,
3143 'commit_id': pull_request.target_ref_parts.commit_id,
3144 },
3145 },
3146 'author': pull_request.author.get_api_data(include_secrets=False,
3147 details='basic'),
3148 'reviewers': [
3149 {
3150 'user': reviewer.get_api_data(include_secrets=False,
3151 details='basic'),
3152 'review_status': st[0][1].status if st else 'not_reviewed',
3153 }
3154 for reviewer, st in pull_request.reviewers_statuses()
3155 ]
3156 }
3157
3158 return data
3159
3160 def __json__(self):
3161 return {
3162 'revisions': self.revisions,
3163 }
3164
3165 def calculated_review_status(self):
3166 # TODO: anderson: 13.05.15 Used only on templates/my_account_pullrequests.html
3167 # because it's tricky to use ChangesetStatusModel from there
3168 warnings.warn("Use calculated_review_status from ChangesetStatusModel", DeprecationWarning)
3169 from rhodecode.model.changeset_status import ChangesetStatusModel
3170 return ChangesetStatusModel().calculated_review_status(self)
3171
3172 def reviewers_statuses(self):
3173 warnings.warn("Use reviewers_statuses from ChangesetStatusModel", DeprecationWarning)
3174 from rhodecode.model.changeset_status import ChangesetStatusModel
3175 return ChangesetStatusModel().reviewers_statuses(self)
3176
3177
3178 class PullRequestVersion(Base, _PullRequestBase):
3179 __tablename__ = 'pull_request_versions'
3180 __table_args__ = (
3181 {'extend_existing': True, 'mysql_engine': 'InnoDB',
3182 'mysql_charset': 'utf8', 'sqlite_autoincrement': True},
3183 )
3184
3185 pull_request_version_id = Column(
3186 'pull_request_version_id', Integer(), nullable=False, primary_key=True)
3187 pull_request_id = Column(
3188 'pull_request_id', Integer(),
3189 ForeignKey('pull_requests.pull_request_id'), nullable=False)
3190 pull_request = relationship('PullRequest')
3191
3192 def __repr__(self):
3193 if self.pull_request_version_id:
3194 return '<DB:PullRequestVersion #%s>' % self.pull_request_version_id
3195 else:
3196 return '<DB:PullRequestVersion at %#x>' % id(self)
3197
3198
3199 class PullRequestReviewers(Base, BaseModel):
3200 __tablename__ = 'pull_request_reviewers'
3201 __table_args__ = (
3202 {'extend_existing': True, 'mysql_engine': 'InnoDB',
3203 'mysql_charset': 'utf8', 'sqlite_autoincrement': True},
3204 )
3205
3206 def __init__(self, user=None, pull_request=None):
3207 self.user = user
3208 self.pull_request = pull_request
3209
3210 pull_requests_reviewers_id = Column(
3211 'pull_requests_reviewers_id', Integer(), nullable=False,
3212 primary_key=True)
3213 pull_request_id = Column(
3214 "pull_request_id", Integer(),
3215 ForeignKey('pull_requests.pull_request_id'), nullable=False)
3216 user_id = Column(
3217 "user_id", Integer(), ForeignKey('users.user_id'), nullable=True)
3218
3219 user = relationship('User')
3220 pull_request = relationship('PullRequest')
3221
3222
3223 class Notification(Base, BaseModel):
3224 __tablename__ = 'notifications'
3225 __table_args__ = (
3226 Index('notification_type_idx', 'type'),
3227 {'extend_existing': True, 'mysql_engine': 'InnoDB',
3228 'mysql_charset': 'utf8', 'sqlite_autoincrement': True},
3229 )
3230
3231 TYPE_CHANGESET_COMMENT = u'cs_comment'
3232 TYPE_MESSAGE = u'message'
3233 TYPE_MENTION = u'mention'
3234 TYPE_REGISTRATION = u'registration'
3235 TYPE_PULL_REQUEST = u'pull_request'
3236 TYPE_PULL_REQUEST_COMMENT = u'pull_request_comment'
3237
3238 notification_id = Column('notification_id', Integer(), nullable=False, primary_key=True)
3239 subject = Column('subject', Unicode(512), nullable=True)
3240 body = Column('body', UnicodeText().with_variant(UnicodeText(50000), 'mysql'), nullable=True)
3241 created_by = Column("created_by", Integer(), ForeignKey('users.user_id'), nullable=True)
3242 created_on = Column('created_on', DateTime(timezone=False), nullable=False, default=datetime.datetime.now)
3243 type_ = Column('type', Unicode(255))
3244
3245 created_by_user = relationship('User')
3246 notifications_to_users = relationship('UserNotification', lazy='joined',
3247 cascade="all, delete, delete-orphan")
3248
3249 @property
3250 def recipients(self):
3251 return [x.user for x in UserNotification.query()\
3252 .filter(UserNotification.notification == self)\
3253 .order_by(UserNotification.user_id.asc()).all()]
3254
3255 @classmethod
3256 def create(cls, created_by, subject, body, recipients, type_=None):
3257 if type_ is None:
3258 type_ = Notification.TYPE_MESSAGE
3259
3260 notification = cls()
3261 notification.created_by_user = created_by
3262 notification.subject = subject
3263 notification.body = body
3264 notification.type_ = type_
3265 notification.created_on = datetime.datetime.now()
3266
3267 for u in recipients:
3268 assoc = UserNotification()
3269 assoc.notification = notification
3270
3271 # if created_by is inside recipients mark his notification
3272 # as read
3273 if u.user_id == created_by.user_id:
3274 assoc.read = True
3275
3276 u.notifications.append(assoc)
3277 Session().add(notification)
3278
3279 return notification
3280
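The recipient loop in ``Notification.create`` marks the creator's own copy as read immediately. A plain-data sketch of that logic (``build_associations`` is a hypothetical helper, not part of the model):

```python
# Hypothetical standalone version of the recipient loop: each recipient
# gets an association record; the creator's own record starts as read.
def build_associations(created_by_id, recipient_ids):
    return [{'user_id': rid, 'read': rid == created_by_id}
            for rid in recipient_ids]

assocs = build_associations(1, [1, 2, 3])
assert assocs[0]['read'] is True
assert assocs[1]['read'] is False
```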
3281 @property
3282 def description(self):
3283 from rhodecode.model.notification import NotificationModel
3284 return NotificationModel().make_description(self)
3285
3286
3287 class UserNotification(Base, BaseModel):
3288 __tablename__ = 'user_to_notification'
3289 __table_args__ = (
3290 UniqueConstraint('user_id', 'notification_id'),
3291 {'extend_existing': True, 'mysql_engine': 'InnoDB',
3292 'mysql_charset': 'utf8', 'sqlite_autoincrement': True}
3293 )
3294 user_id = Column('user_id', Integer(), ForeignKey('users.user_id'), primary_key=True)
3295 notification_id = Column("notification_id", Integer(), ForeignKey('notifications.notification_id'), primary_key=True)
3296 read = Column('read', Boolean, default=False)
3297 sent_on = Column('sent_on', DateTime(timezone=False), nullable=True, unique=None)
3298
3299 user = relationship('User', lazy="joined")
3300 notification = relationship('Notification', lazy="joined",
3301 order_by=lambda: Notification.created_on.desc(),)
3302
3303 def mark_as_read(self):
3304 self.read = True
3305 Session().add(self)
3306
3307
3308 class Gist(Base, BaseModel):
3309 __tablename__ = 'gists'
3310 __table_args__ = (
3311 Index('g_gist_access_id_idx', 'gist_access_id'),
3312 Index('g_created_on_idx', 'created_on'),
3313 {'extend_existing': True, 'mysql_engine': 'InnoDB',
3314 'mysql_charset': 'utf8', 'sqlite_autoincrement': True}
3315 )
3316 GIST_PUBLIC = u'public'
3317 GIST_PRIVATE = u'private'
3318 DEFAULT_FILENAME = u'gistfile1.txt'
3319
3320 ACL_LEVEL_PUBLIC = u'acl_public'
3321 ACL_LEVEL_PRIVATE = u'acl_private'
3322
3323 gist_id = Column('gist_id', Integer(), primary_key=True)
3324 gist_access_id = Column('gist_access_id', Unicode(250))
3325 gist_description = Column('gist_description', UnicodeText().with_variant(UnicodeText(1024), 'mysql'))
3326 gist_owner = Column('user_id', Integer(), ForeignKey('users.user_id'), nullable=True)
3327 gist_expires = Column('gist_expires', Float(53), nullable=False)
3328 gist_type = Column('gist_type', Unicode(128), nullable=False)
3329 created_on = Column('created_on', DateTime(timezone=False), nullable=False, default=datetime.datetime.now)
3330 modified_at = Column('modified_at', DateTime(timezone=False), nullable=False, default=datetime.datetime.now)
3331 acl_level = Column('acl_level', Unicode(128), nullable=True)
3332
3333 owner = relationship('User')
3334
3335 def __repr__(self):
3336 return '<Gist:[%s]%s>' % (self.gist_type, self.gist_access_id)
3337
3338 @classmethod
3339 def get_or_404(cls, id_):
3340 res = cls.query().filter(cls.gist_access_id == id_).scalar()
3341 if not res:
3342 raise HTTPNotFound
3343 return res
3344
3345 @classmethod
3346 def get_by_access_id(cls, gist_access_id):
3347 return cls.query().filter(cls.gist_access_id == gist_access_id).scalar()
3348
3349 def gist_url(self):
3350 import rhodecode
3351 alias_url = rhodecode.CONFIG.get('gist_alias_url')
3352 if alias_url:
3353 return alias_url.replace('{gistid}', self.gist_access_id)
3354
3355 return url('gist', gist_id=self.gist_access_id, qualified=True)
3356
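``gist_url`` above prefers a configured ``gist_alias_url`` template, substituting ``{gistid}`` via ``str.replace``. A sketch with a hypothetical alias value:

```python
# Sketch of the alias branch in gist_url: '{gistid}' in the configured
# template is replaced with the gist access id (example values only).
alias_url = u'https://gist.example.com/{gistid}'
gist_access_id = u'abcd12'
assert alias_url.replace(u'{gistid}', gist_access_id) == \
    u'https://gist.example.com/abcd12'
```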
3357 @classmethod
3358 def base_path(cls):
3359 """
3360 Returns the base path where all gists are stored
3361
3362 :param cls:
3363 """
3364 from rhodecode.model.gist import GIST_STORE_LOC
3365 q = Session().query(RhodeCodeUi)\
3366 .filter(RhodeCodeUi.ui_key == URL_SEP)
3367 q = q.options(FromCache("sql_cache_short", "repository_repo_path"))
3368 return os.path.join(q.one().ui_value, GIST_STORE_LOC)
3369
3370 def get_api_data(self):
3371 """
3372 Common function for generating gist related data for API
3373 """
3374 gist = self
3375 data = {
3376 'gist_id': gist.gist_id,
3377 'type': gist.gist_type,
3378 'access_id': gist.gist_access_id,
3379 'description': gist.gist_description,
3380 'url': gist.gist_url(),
3381 'expires': gist.gist_expires,
3382 'created_on': gist.created_on,
3383 'modified_at': gist.modified_at,
3384 'content': None,
3385 'acl_level': gist.acl_level,
3386 }
3387 return data
3388
3389 def __json__(self):
3390 data = dict()
3392 data.update(self.get_api_data())
3393 return data
3394 
3395 # SCM functions
3396 def scm_instance(self, **kwargs):
3397 full_repo_path = os.path.join(self.base_path(), self.gist_access_id)
3398 return get_vcs_instance(
3399 repo_path=safe_str(full_repo_path), create=False)
3400
3401
3402 class DbMigrateVersion(Base, BaseModel):
3403 __tablename__ = 'db_migrate_version'
3404 __table_args__ = (
3405 {'extend_existing': True, 'mysql_engine': 'InnoDB',
3406 'mysql_charset': 'utf8', 'sqlite_autoincrement': True},
3407 )
3408 repository_id = Column('repository_id', String(250), primary_key=True)
3409 repository_path = Column('repository_path', Text)
3410 version = Column('version', Integer)
3411
3412
3413 class ExternalIdentity(Base, BaseModel):
3414 __tablename__ = 'external_identities'
3415 __table_args__ = (
3416 Index('local_user_id_idx', 'local_user_id'),
3417 Index('external_id_idx', 'external_id'),
3418 {'extend_existing': True, 'mysql_engine': 'InnoDB',
3419 'mysql_charset': 'utf8'})
3420
3421 external_id = Column('external_id', Unicode(255), default=u'',
3422 primary_key=True)
3423 external_username = Column('external_username', Unicode(1024), default=u'')
3424 local_user_id = Column('local_user_id', Integer(),
3425 ForeignKey('users.user_id'), primary_key=True)
3426 provider_name = Column('provider_name', Unicode(255), default=u'',
3427 primary_key=True)
3428 access_token = Column('access_token', String(1024), default=u'')
3429 alt_token = Column('alt_token', String(1024), default=u'')
3430 token_secret = Column('token_secret', String(1024), default=u'')
3431
3432 @classmethod
3433 def by_external_id_and_provider(cls, external_id, provider_name,
3434 local_user_id=None):
3435 """
3436 Returns ExternalIdentity instance based on search params
3437 
3438 :param external_id:
3439 :param provider_name:
3440 :param local_user_id:
3441 :return: ExternalIdentity
3442 """
3442 query = cls.query()
3443 query = query.filter(cls.external_id == external_id)
3444 query = query.filter(cls.provider_name == provider_name)
3445 if local_user_id:
3446 query = query.filter(cls.local_user_id == local_user_id)
3447 return query.first()
3448
3449 @classmethod
3450 def user_by_external_id_and_provider(cls, external_id, provider_name):
3451 """
3452 Returns User instance based on search params
3453
3454 :param external_id:
3455 :param provider_name:
3456 :return: User
3457 """
3458 query = User.query()
3459 query = query.filter(cls.external_id == external_id)
3460 query = query.filter(cls.provider_name == provider_name)
3461 query = query.filter(User.user_id == cls.local_user_id)
3462 return query.first()
3463
3464 @classmethod
3465 def by_local_user_id(cls, local_user_id):
3466 """
3467 Returns all tokens for user
3468
3469 :param local_user_id:
3470 :return: ExternalIdentity
3471 """
3472 query = cls.query()
3473 query = query.filter(cls.local_user_id == local_user_id)
3474 return query
3475
3476
3477 class Integration(Base, BaseModel):
3478 __tablename__ = 'integrations'
3479 __table_args__ = (
3480 {'extend_existing': True, 'mysql_engine': 'InnoDB',
3481 'mysql_charset': 'utf8', 'sqlite_autoincrement': True}
3482 )
3483
3484 integration_id = Column('integration_id', Integer(), primary_key=True)
3485 integration_type = Column('integration_type', String(255))
3486 enabled = Column('enabled', Boolean(), nullable=False)
3487 name = Column('name', String(255), nullable=False)
3488 child_repos_only = Column('child_repos_only', Boolean(), nullable=False,
3489 default=False)
3490
3491 settings = Column(
3492 'settings_json', MutationObj.as_mutable(
3493 JsonType(dialect_map=dict(mysql=UnicodeText(16384)))))
3494 repo_id = Column(
3495 'repo_id', Integer(), ForeignKey('repositories.repo_id'),
3496 nullable=True, unique=None, default=None)
3497 repo = relationship('Repository', lazy='joined')
3498
3499 repo_group_id = Column(
3500 'repo_group_id', Integer(), ForeignKey('groups.group_id'),
3501 nullable=True, unique=None, default=None)
3502 repo_group = relationship('RepoGroup', lazy='joined')
3503
3504 @property
3505 def scope(self):
3506 if self.repo:
3507 return repr(self.repo)
3508 if self.repo_group:
3509 if self.child_repos_only:
3510 return repr(self.repo_group) + ' (child repos only)'
3511 else:
3512 return repr(self.repo_group) + ' (recursive)'
3513 if self.child_repos_only:
3514 return 'root_repos'
3515 return 'global'
3516
3517 def __repr__(self):
3518 return '<Integration(%r, %r)>' % (self.integration_type, self.scope)
3519
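The `scope` property above resolves in a fixed priority order: a bound repo wins, then a repo group (recursive unless `child_repos_only`), then the two global flavours. A standalone sketch of that resolution order, with plain strings standing in for the real `Repository`/`RepoGroup` objects:

```python
def integration_scope(repo, repo_group, child_repos_only):
    # Resolution order mirrors Integration.scope: repo first, then
    # repo group (child-only vs recursive), then root_repos/global.
    if repo:
        return repr(repo)
    if repo_group:
        if child_repos_only:
            return repr(repo_group) + ' (child repos only)'
        return repr(repo_group) + ' (recursive)'
    if child_repos_only:
        return 'root_repos'
    return 'global'

print(integration_scope(None, None, False))  # global
```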
3520
3521 class RepoReviewRuleUser(Base, BaseModel):
3522 __tablename__ = 'repo_review_rules_users'
3523 __table_args__ = (
3524 {'extend_existing': True, 'mysql_engine': 'InnoDB',
3525 'mysql_charset': 'utf8', 'sqlite_autoincrement': True,}
3526 )
3527 repo_review_rule_user_id = Column(
3528 'repo_review_rule_user_id', Integer(), primary_key=True)
3529 repo_review_rule_id = Column("repo_review_rule_id",
3530 Integer(), ForeignKey('repo_review_rules.repo_review_rule_id'))
3531 user_id = Column("user_id", Integer(), ForeignKey('users.user_id'),
3532 nullable=False)
3533 user = relationship('User')
3534
3535
3536 class RepoReviewRuleUserGroup(Base, BaseModel):
3537 __tablename__ = 'repo_review_rules_users_groups'
3538 __table_args__ = (
3539 {'extend_existing': True, 'mysql_engine': 'InnoDB',
3540 'mysql_charset': 'utf8', 'sqlite_autoincrement': True,}
3541 )
3542 repo_review_rule_users_group_id = Column(
3543 'repo_review_rule_users_group_id', Integer(), primary_key=True)
3544 repo_review_rule_id = Column("repo_review_rule_id",
3545 Integer(), ForeignKey('repo_review_rules.repo_review_rule_id'))
3546 users_group_id = Column("users_group_id", Integer(),
3547 ForeignKey('users_groups.users_group_id'), nullable=False)
3548 users_group = relationship('UserGroup')
3549
3550
3551 class RepoReviewRule(Base, BaseModel):
3552 __tablename__ = 'repo_review_rules'
3553 __table_args__ = (
3554 {'extend_existing': True, 'mysql_engine': 'InnoDB',
3555 'mysql_charset': 'utf8', 'sqlite_autoincrement': True,}
3556 )
3557
3558 repo_review_rule_id = Column(
3559 'repo_review_rule_id', Integer(), primary_key=True)
3560 repo_id = Column(
3561 "repo_id", Integer(), ForeignKey('repositories.repo_id'))
3562 repo = relationship('Repository', backref='review_rules')
3563
3564 _branch_pattern = Column("branch_pattern", UnicodeText().with_variant(UnicodeText(255), 'mysql'),
3565 default=u'*') # glob
3566 _file_pattern = Column("file_pattern", UnicodeText().with_variant(UnicodeText(255), 'mysql'),
3567 default=u'*') # glob
3568
3569 use_authors_for_review = Column("use_authors_for_review", Boolean(),
3570 nullable=False, default=False)
3571 rule_users = relationship('RepoReviewRuleUser')
3572 rule_user_groups = relationship('RepoReviewRuleUserGroup')
3573
3574 @hybrid_property
3575 def branch_pattern(self):
3576 return self._branch_pattern or '*'
3577
3578     def _validate_glob(self, value):
3579         re.compile('^' + glob2re(value) + '$')
3580
3581 @branch_pattern.setter
3582 def branch_pattern(self, value):
3583 self._validate_glob(value)
3584 self._branch_pattern = value or '*'
3585
3586 @hybrid_property
3587 def file_pattern(self):
3588 return self._file_pattern or '*'
3589
3590 @file_pattern.setter
3591 def file_pattern(self, value):
3592 self._validate_glob(value)
3593 self._file_pattern = value or '*'
3594
3595 def matches(self, branch, files_changed):
3596 """
3597 Check if this review rule matches a branch/files in a pull request
3598
3599 :param branch: branch name for the commit
3600 :param files_changed: list of file paths changed in the pull request
3601 """
3602
3603 branch = branch or ''
3604 files_changed = files_changed or []
3605
3606 branch_matches = True
3607 if branch:
3608 branch_regex = re.compile('^' + glob2re(self.branch_pattern) + '$')
3609 branch_matches = bool(branch_regex.search(branch))
3610
3611 files_matches = True
3612 if self.file_pattern != '*':
3613 files_matches = False
3614 file_regex = re.compile(glob2re(self.file_pattern))
3615 for filename in files_changed:
3616 if file_regex.search(filename):
3617 files_matches = True
3618 break
3619
3620 return branch_matches and files_matches
3621
3622 @property
3623 def review_users(self):
3624 """ Returns the users which this rule applies to """
3625
3626 users = set()
3627 users |= set([
3628 rule_user.user for rule_user in self.rule_users
3629 if rule_user.user.active])
3630 users |= set(
3631 member.user
3632 for rule_user_group in self.rule_user_groups
3633 for member in rule_user_group.users_group.members
3634 if member.user.active
3635 )
3636 return users
3637
3638 def __repr__(self):
3639 return '<RepoReviewerRule(id=%r, repo=%r)>' % (
3640 self.repo_review_rule_id, self.repo)
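The matching logic in `RepoReviewRule.matches` anchors the branch glob against the whole branch name, while a single matching changed file is enough on the file side. A standalone sketch of that behaviour, using the stdlib `fnmatch.translate` as a stand-in for the internal `glob2re` helper (so glob edge cases may differ from the real rule):

```python
import fnmatch
import re

def rule_matches(branch_pattern, file_pattern, branch, files_changed):
    # branch glob is anchored: the whole branch name must match
    branch_matches = True
    if branch:
        branch_regex = re.compile(fnmatch.translate(branch_pattern))
        branch_matches = bool(branch_regex.match(branch))

    # file pattern '*' short-circuits; otherwise any one changed
    # file matching the glob is enough
    files_matches = True
    if file_pattern != '*':
        file_regex = re.compile(fnmatch.translate(file_pattern))
        files_matches = any(
            file_regex.match(path) for path in files_changed or [])

    return branch_matches and files_matches

print(rule_matches('release/*', '*', 'release/1.0', []))  # True
```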
@@ -0,0 +1,3658 b''
1 # -*- coding: utf-8 -*-
2
3 # Copyright (C) 2010-2016 RhodeCode GmbH
4 #
5 # This program is free software: you can redistribute it and/or modify
6 # it under the terms of the GNU Affero General Public License, version 3
7 # (only), as published by the Free Software Foundation.
8 #
9 # This program is distributed in the hope that it will be useful,
10 # but WITHOUT ANY WARRANTY; without even the implied warranty of
11 # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
12 # GNU General Public License for more details.
13 #
14 # You should have received a copy of the GNU Affero General Public License
15 # along with this program. If not, see <http://www.gnu.org/licenses/>.
16 #
17 # This program is dual-licensed. If you wish to learn more about the
18 # RhodeCode Enterprise Edition, including its added features, Support services,
19 # and proprietary license terms, please see https://rhodecode.com/licenses/
20
21 """
22 Database Models for RhodeCode Enterprise
23 """
24
25 import re
26 import os
27 import sys
28 import time
29 import hashlib
30 import logging
31 import datetime
32 import warnings
33 import ipaddress
34 import functools
35 import traceback
36 import collections
37
38
39 from sqlalchemy import *
40 from sqlalchemy.exc import IntegrityError
41 from sqlalchemy.ext.declarative import declared_attr
42 from sqlalchemy.ext.hybrid import hybrid_property
43 from sqlalchemy.orm import (
44 relationship, joinedload, class_mapper, validates, aliased)
45 from sqlalchemy.sql.expression import true
46 from beaker.cache import cache_region, region_invalidate
47 from webob.exc import HTTPNotFound
48 from zope.cachedescriptors.property import Lazy as LazyProperty
49
50 from pylons import url
51 from pylons.i18n.translation import lazy_ugettext as _
52
53 from rhodecode.lib.vcs import get_backend, get_vcs_instance
54 from rhodecode.lib.vcs.utils.helpers import get_scm
55 from rhodecode.lib.vcs.exceptions import VCSError
56 from rhodecode.lib.vcs.backends.base import (
57 EmptyCommit, Reference, MergeFailureReason)
58 from rhodecode.lib.utils2 import (
59 str2bool, safe_str, get_commit_safe, safe_unicode, remove_prefix, md5_safe,
60 time_to_datetime, aslist, Optional, safe_int, get_clone_url, AttributeDict,
61 glob2re)
62 from rhodecode.lib.jsonalchemy import MutationObj, MutationList, JsonType, JSONDict
63 from rhodecode.lib.ext_json import json
64 from rhodecode.lib.caching_query import FromCache
65 from rhodecode.lib.encrypt import AESCipher
66
67 from rhodecode.model.meta import Base, Session
68
69 URL_SEP = '/'
70 log = logging.getLogger(__name__)
71
72 # =============================================================================
73 # BASE CLASSES
74 # =============================================================================
75
76 # this is propagated from .ini file rhodecode.encrypted_values.secret or
77 # beaker.session.secret if first is not set.
78 # and initialized at environment.py
79 ENCRYPTION_KEY = None
80
81 # used to sort permissions by types, '#' used here is not allowed to be in
82 # usernames, and it's very early in sorted string.printable table.
83 PERMISSION_TYPE_SORT = {
84 'admin': '####',
85 'write': '###',
86 'read': '##',
87 'none': '#',
88 }
89
90
91 def display_sort(obj):
92 """
93 Sort function used to sort permissions in .permissions() function of
94     Repository, RepoGroup, UserGroup. It also puts the default user in front
95 of all other resources
96 """
97
98 if obj.username == User.DEFAULT_USER:
99 return '#####'
100 prefix = PERMISSION_TYPE_SORT.get(obj.permission.split('.')[-1], '')
101 return prefix + obj.username
102
103
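`display_sort` relies on `'#'` sorting before every character allowed in usernames: more hashes sort earlier, so the five-hash key for the default user precedes admin rows, which precede write, read, and none, with alphabetical order inside each level. A minimal sketch of the trick:

```python
PERMISSION_TYPE_SORT = {
    'admin': '####',
    'write': '###',
    'read': '##',
    'none': '#',
}

def display_sort_key(username, permission):
    # '#' sorts before any username character, so more hashes sort
    # earlier: default (5 hashes) < admin < write < read < none
    if username == 'default':
        return '#####'
    prefix = PERMISSION_TYPE_SORT.get(permission.split('.')[-1], '')
    return prefix + username

rows = [('bob', 'repository.read'), ('alice', 'repository.admin'),
        ('default', 'repository.read'), ('carol', 'repository.write')]
ordered = [name for name, _ in sorted(rows, key=lambda r: display_sort_key(*r))]
print(ordered)  # ['default', 'alice', 'carol', 'bob']
```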
104 def _hash_key(k):
105 return md5_safe(k)
106
107
108 class EncryptedTextValue(TypeDecorator):
109 """
110 Special column for encrypted long text data, use like::
111
112 value = Column("encrypted_value", EncryptedValue(), nullable=False)
113
114     This column is intelligent: if the value is in unencrypted form it returns the
115 unencrypted form, but on save it always encrypts
116 """
117 impl = Text
118
119 def process_bind_param(self, value, dialect):
120 if not value:
121 return value
122 if value.startswith('enc$aes$') or value.startswith('enc$aes_hmac$'):
123             # protect against double encrypting if someone manually starts
124             # doing it
125 raise ValueError('value needs to be in unencrypted format, ie. '
126 'not starting with enc$aes')
127 return 'enc$aes_hmac$%s' % AESCipher(
128 ENCRYPTION_KEY, hmac=True).encrypt(value)
129
130 def process_result_value(self, value, dialect):
131 import rhodecode
132
133 if not value:
134 return value
135
136 parts = value.split('$', 3)
137 if not len(parts) == 3:
138 # probably not encrypted values
139 return value
140 else:
141 if parts[0] != 'enc':
142 # parts ok but without our header ?
143 return value
144 enc_strict_mode = str2bool(rhodecode.CONFIG.get(
145 'rhodecode.encrypted_values.strict') or True)
146 # at that stage we know it's our encryption
147 if parts[1] == 'aes':
148 decrypted_data = AESCipher(ENCRYPTION_KEY).decrypt(parts[2])
149 elif parts[1] == 'aes_hmac':
150 decrypted_data = AESCipher(
151 ENCRYPTION_KEY, hmac=True,
152 strict_verification=enc_strict_mode).decrypt(parts[2])
153 else:
154 raise ValueError(
155 'Encryption type part is wrong, must be `aes` '
156 'or `aes_hmac`, got `%s` instead' % (parts[1]))
157 return decrypted_data
158
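The `enc$<algo>$<payload>` envelope that `process_result_value` handles can be parsed independently of the cipher. A sketch of just the envelope logic, with the `AESCipher` decryption step itself left out:

```python
def parse_encrypted_envelope(value):
    # Returns (algo, payload) for values in the enc$<algo>$<payload>
    # envelope, or (None, value) for plain/legacy values that pass
    # through unchanged -- mirroring process_result_value's fallbacks.
    if not value:
        return None, value
    parts = value.split('$', 3)
    if len(parts) != 3 or parts[0] != 'enc':
        # probably not an encrypted value
        return None, value
    algo = parts[1]
    if algo not in ('aes', 'aes_hmac'):
        raise ValueError(
            'Encryption type part is wrong, must be `aes` or `aes_hmac`, '
            'got `%s` instead' % algo)
    return algo, parts[2]

print(parse_encrypted_envelope('enc$aes_hmac$deadbeef'))  # ('aes_hmac', 'deadbeef')
```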
159
160 class BaseModel(object):
161 """
162 Base Model for all classes
163 """
164
165 @classmethod
166 def _get_keys(cls):
167 """return column names for this model """
168 return class_mapper(cls).c.keys()
169
170 def get_dict(self):
171 """
172 return dict with keys and values corresponding
173 to this model data """
174
175 d = {}
176 for k in self._get_keys():
177 d[k] = getattr(self, k)
178
179 # also use __json__() if present to get additional fields
180 _json_attr = getattr(self, '__json__', None)
181 if _json_attr:
182 # update with attributes from __json__
183 if callable(_json_attr):
184 _json_attr = _json_attr()
185 for k, val in _json_attr.iteritems():
186 d[k] = val
187 return d
188
189 def get_appstruct(self):
190 """return list with keys and values tuples corresponding
191 to this model data """
192
193 l = []
194 for k in self._get_keys():
195 l.append((k, getattr(self, k),))
196 return l
197
198 def populate_obj(self, populate_dict):
199 """populate model with data from given populate_dict"""
200
201 for k in self._get_keys():
202 if k in populate_dict:
203 setattr(self, k, populate_dict[k])
204
205 @classmethod
206 def query(cls):
207 return Session().query(cls)
208
209 @classmethod
210 def get(cls, id_):
211 if id_:
212 return cls.query().get(id_)
213
214 @classmethod
215 def get_or_404(cls, id_):
216 try:
217 id_ = int(id_)
218 except (TypeError, ValueError):
219 raise HTTPNotFound
220
221 res = cls.query().get(id_)
222 if not res:
223 raise HTTPNotFound
224 return res
225
226 @classmethod
227 def getAll(cls):
228 # deprecated and left for backward compatibility
229 return cls.get_all()
230
231 @classmethod
232 def get_all(cls):
233 return cls.query().all()
234
235 @classmethod
236 def delete(cls, id_):
237 obj = cls.query().get(id_)
238 Session().delete(obj)
239
240 @classmethod
241 def identity_cache(cls, session, attr_name, value):
242 exist_in_session = []
243 for (item_cls, pkey), instance in session.identity_map.items():
244 if cls == item_cls and getattr(instance, attr_name) == value:
245 exist_in_session.append(instance)
246 if exist_in_session:
247 if len(exist_in_session) == 1:
248 return exist_in_session[0]
249 log.exception(
250 'multiple objects with attr %s and '
251 'value %s found with same name: %r',
252 attr_name, value, exist_in_session)
253
254 def __repr__(self):
255 if hasattr(self, '__unicode__'):
256 # python repr needs to return str
257 try:
258 return safe_str(self.__unicode__())
259 except UnicodeDecodeError:
260 pass
261 return '<DB:%s>' % (self.__class__.__name__)
262
263
264 class RhodeCodeSetting(Base, BaseModel):
265 __tablename__ = 'rhodecode_settings'
266 __table_args__ = (
267 UniqueConstraint('app_settings_name'),
268 {'extend_existing': True, 'mysql_engine': 'InnoDB',
269 'mysql_charset': 'utf8', 'sqlite_autoincrement': True}
270 )
271
272 SETTINGS_TYPES = {
273 'str': safe_str,
274 'int': safe_int,
275 'unicode': safe_unicode,
276 'bool': str2bool,
277 'list': functools.partial(aslist, sep=',')
278 }
279 DEFAULT_UPDATE_URL = 'https://rhodecode.com/api/v1/info/versions'
280 GLOBAL_CONF_KEY = 'app_settings'
281
282 app_settings_id = Column("app_settings_id", Integer(), nullable=False, unique=True, default=None, primary_key=True)
283 app_settings_name = Column("app_settings_name", String(255), nullable=True, unique=None, default=None)
284 _app_settings_value = Column("app_settings_value", String(4096), nullable=True, unique=None, default=None)
285 _app_settings_type = Column("app_settings_type", String(255), nullable=True, unique=None, default=None)
286
287 def __init__(self, key='', val='', type='unicode'):
288 self.app_settings_name = key
289 self.app_settings_type = type
290 self.app_settings_value = val
291
292 @validates('_app_settings_value')
293 def validate_settings_value(self, key, val):
294 assert type(val) == unicode
295 return val
296
297 @hybrid_property
298 def app_settings_value(self):
299 v = self._app_settings_value
300 _type = self.app_settings_type
301 if _type:
302 _type = self.app_settings_type.split('.')[0]
303 # decode the encrypted value
304 if 'encrypted' in self.app_settings_type:
305 cipher = EncryptedTextValue()
306 v = safe_unicode(cipher.process_result_value(v, None))
307
308 converter = self.SETTINGS_TYPES.get(_type) or \
309 self.SETTINGS_TYPES['unicode']
310 return converter(v)
311
312 @app_settings_value.setter
313 def app_settings_value(self, val):
314 """
315 Setter that will always make sure we use unicode in app_settings_value
316
317 :param val:
318 """
319 val = safe_unicode(val)
320 # encode the encrypted value
321 if 'encrypted' in self.app_settings_type:
322 cipher = EncryptedTextValue()
323 val = safe_unicode(cipher.process_bind_param(val, None))
324 self._app_settings_value = val
325
326 @hybrid_property
327 def app_settings_type(self):
328 return self._app_settings_type
329
330 @app_settings_type.setter
331 def app_settings_type(self, val):
332 if val.split('.')[0] not in self.SETTINGS_TYPES:
333 raise Exception('type must be one of %s got %s'
334 % (self.SETTINGS_TYPES.keys(), val))
335 self._app_settings_type = val
336
337 def __unicode__(self):
338 return u"<%s('%s:%s[%s]')>" % (
339 self.__class__.__name__,
340 self.app_settings_name, self.app_settings_value,
341 self.app_settings_type
342 )
343
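The `app_settings_value` getter picks a converter out of `SETTINGS_TYPES` using only the part of the type name before the first dot, so a type like `'bool.encrypted'` still converts as a bool. A sketch with plain stand-ins for RhodeCode's `safe_*`/`str2bool` helpers (the real helpers accept more input forms):

```python
def str2bool(value):
    # stand-in for rhodecode's str2bool helper
    return str(value).strip().lower() in ('true', 'yes', 'on', 'y', 't', '1')

SETTINGS_TYPES = {
    'str': str,
    'int': int,
    'unicode': str,  # stand-in: Python 3 has no separate unicode type
    'bool': str2bool,
    'list': lambda v: [chunk.strip() for chunk in v.split(',')],
}

def convert_setting(raw_value, type_name):
    # only the part before the dot selects the converter, mirroring
    # app_settings_type.split('.')[0]; unknown types fall back to unicode
    base_type = type_name.split('.')[0]
    converter = SETTINGS_TYPES.get(base_type) or SETTINGS_TYPES['unicode']
    return converter(raw_value)

print(convert_setting('true', 'bool.encrypted'))  # True
```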
344
345 class RhodeCodeUi(Base, BaseModel):
346 __tablename__ = 'rhodecode_ui'
347 __table_args__ = (
348 UniqueConstraint('ui_key'),
349 {'extend_existing': True, 'mysql_engine': 'InnoDB',
350 'mysql_charset': 'utf8', 'sqlite_autoincrement': True}
351 )
352
353 HOOK_REPO_SIZE = 'changegroup.repo_size'
354 # HG
355 HOOK_PRE_PULL = 'preoutgoing.pre_pull'
356 HOOK_PULL = 'outgoing.pull_logger'
357 HOOK_PRE_PUSH = 'prechangegroup.pre_push'
358 HOOK_PUSH = 'changegroup.push_logger'
359
360 # TODO: johbo: Unify way how hooks are configured for git and hg,
361 # git part is currently hardcoded.
362
363 # SVN PATTERNS
364 SVN_BRANCH_ID = 'vcs_svn_branch'
365 SVN_TAG_ID = 'vcs_svn_tag'
366
367 ui_id = Column(
368 "ui_id", Integer(), nullable=False, unique=True, default=None,
369 primary_key=True)
370 ui_section = Column(
371 "ui_section", String(255), nullable=True, unique=None, default=None)
372 ui_key = Column(
373 "ui_key", String(255), nullable=True, unique=None, default=None)
374 ui_value = Column(
375 "ui_value", String(255), nullable=True, unique=None, default=None)
376 ui_active = Column(
377 "ui_active", Boolean(), nullable=True, unique=None, default=True)
378
379 def __repr__(self):
380         return '<%s[%s]%s=>%s>' % (self.__class__.__name__, self.ui_section,
381 self.ui_key, self.ui_value)
382
383
384 class RepoRhodeCodeSetting(Base, BaseModel):
385 __tablename__ = 'repo_rhodecode_settings'
386 __table_args__ = (
387 UniqueConstraint(
388 'app_settings_name', 'repository_id',
389 name='uq_repo_rhodecode_setting_name_repo_id'),
390 {'extend_existing': True, 'mysql_engine': 'InnoDB',
391 'mysql_charset': 'utf8', 'sqlite_autoincrement': True}
392 )
393
394 repository_id = Column(
395 "repository_id", Integer(), ForeignKey('repositories.repo_id'),
396 nullable=False)
397 app_settings_id = Column(
398 "app_settings_id", Integer(), nullable=False, unique=True,
399 default=None, primary_key=True)
400 app_settings_name = Column(
401 "app_settings_name", String(255), nullable=True, unique=None,
402 default=None)
403 _app_settings_value = Column(
404 "app_settings_value", String(4096), nullable=True, unique=None,
405 default=None)
406 _app_settings_type = Column(
407 "app_settings_type", String(255), nullable=True, unique=None,
408 default=None)
409
410 repository = relationship('Repository')
411
412 def __init__(self, repository_id, key='', val='', type='unicode'):
413 self.repository_id = repository_id
414 self.app_settings_name = key
415 self.app_settings_type = type
416 self.app_settings_value = val
417
418 @validates('_app_settings_value')
419 def validate_settings_value(self, key, val):
420 assert type(val) == unicode
421 return val
422
423 @hybrid_property
424 def app_settings_value(self):
425 v = self._app_settings_value
426 type_ = self.app_settings_type
427 SETTINGS_TYPES = RhodeCodeSetting.SETTINGS_TYPES
428 converter = SETTINGS_TYPES.get(type_) or SETTINGS_TYPES['unicode']
429 return converter(v)
430
431 @app_settings_value.setter
432 def app_settings_value(self, val):
433 """
434 Setter that will always make sure we use unicode in app_settings_value
435
436 :param val:
437 """
438 self._app_settings_value = safe_unicode(val)
439
440 @hybrid_property
441 def app_settings_type(self):
442 return self._app_settings_type
443
444 @app_settings_type.setter
445 def app_settings_type(self, val):
446 SETTINGS_TYPES = RhodeCodeSetting.SETTINGS_TYPES
447 if val not in SETTINGS_TYPES:
448 raise Exception('type must be one of %s got %s'
449 % (SETTINGS_TYPES.keys(), val))
450 self._app_settings_type = val
451
452 def __unicode__(self):
453 return u"<%s('%s:%s:%s[%s]')>" % (
454 self.__class__.__name__, self.repository.repo_name,
455 self.app_settings_name, self.app_settings_value,
456 self.app_settings_type
457 )
458
459
460 class RepoRhodeCodeUi(Base, BaseModel):
461 __tablename__ = 'repo_rhodecode_ui'
462 __table_args__ = (
463 UniqueConstraint(
464 'repository_id', 'ui_section', 'ui_key',
465 name='uq_repo_rhodecode_ui_repository_id_section_key'),
466 {'extend_existing': True, 'mysql_engine': 'InnoDB',
467 'mysql_charset': 'utf8', 'sqlite_autoincrement': True}
468 )
469
470 repository_id = Column(
471 "repository_id", Integer(), ForeignKey('repositories.repo_id'),
472 nullable=False)
473 ui_id = Column(
474 "ui_id", Integer(), nullable=False, unique=True, default=None,
475 primary_key=True)
476 ui_section = Column(
477 "ui_section", String(255), nullable=True, unique=None, default=None)
478 ui_key = Column(
479 "ui_key", String(255), nullable=True, unique=None, default=None)
480 ui_value = Column(
481 "ui_value", String(255), nullable=True, unique=None, default=None)
482 ui_active = Column(
483 "ui_active", Boolean(), nullable=True, unique=None, default=True)
484
485 repository = relationship('Repository')
486
487 def __repr__(self):
488         return '<%s[%s:%s]%s=>%s>' % (
489 self.__class__.__name__, self.repository.repo_name,
490 self.ui_section, self.ui_key, self.ui_value)
491
492
493 class User(Base, BaseModel):
494 __tablename__ = 'users'
495 __table_args__ = (
496 UniqueConstraint('username'), UniqueConstraint('email'),
497 Index('u_username_idx', 'username'),
498 Index('u_email_idx', 'email'),
499 {'extend_existing': True, 'mysql_engine': 'InnoDB',
500 'mysql_charset': 'utf8', 'sqlite_autoincrement': True}
501 )
502 DEFAULT_USER = 'default'
503 DEFAULT_USER_EMAIL = 'anonymous@rhodecode.org'
504 DEFAULT_GRAVATAR_URL = 'https://secure.gravatar.com/avatar/{md5email}?d=identicon&s={size}'
505
506 user_id = Column("user_id", Integer(), nullable=False, unique=True, default=None, primary_key=True)
507 username = Column("username", String(255), nullable=True, unique=None, default=None)
508 password = Column("password", String(255), nullable=True, unique=None, default=None)
509 active = Column("active", Boolean(), nullable=True, unique=None, default=True)
510 admin = Column("admin", Boolean(), nullable=True, unique=None, default=False)
511 name = Column("firstname", String(255), nullable=True, unique=None, default=None)
512 lastname = Column("lastname", String(255), nullable=True, unique=None, default=None)
513 _email = Column("email", String(255), nullable=True, unique=None, default=None)
514 last_login = Column("last_login", DateTime(timezone=False), nullable=True, unique=None, default=None)
515 extern_type = Column("extern_type", String(255), nullable=True, unique=None, default=None)
516 extern_name = Column("extern_name", String(255), nullable=True, unique=None, default=None)
517 api_key = Column("api_key", String(255), nullable=True, unique=None, default=None)
518 inherit_default_permissions = Column("inherit_default_permissions", Boolean(), nullable=False, unique=None, default=True)
519 created_on = Column('created_on', DateTime(timezone=False), nullable=False, default=datetime.datetime.now)
520 _user_data = Column("user_data", LargeBinary(), nullable=True) # JSON data
521
522 user_log = relationship('UserLog')
523 user_perms = relationship('UserToPerm', primaryjoin="User.user_id==UserToPerm.user_id", cascade='all')
524
525 repositories = relationship('Repository')
526 repository_groups = relationship('RepoGroup')
527 user_groups = relationship('UserGroup')
528
529 user_followers = relationship('UserFollowing', primaryjoin='UserFollowing.follows_user_id==User.user_id', cascade='all')
530 followings = relationship('UserFollowing', primaryjoin='UserFollowing.user_id==User.user_id', cascade='all')
531
532 repo_to_perm = relationship('UserRepoToPerm', primaryjoin='UserRepoToPerm.user_id==User.user_id', cascade='all')
533 repo_group_to_perm = relationship('UserRepoGroupToPerm', primaryjoin='UserRepoGroupToPerm.user_id==User.user_id', cascade='all')
534 user_group_to_perm = relationship('UserUserGroupToPerm', primaryjoin='UserUserGroupToPerm.user_id==User.user_id', cascade='all')
535
536 group_member = relationship('UserGroupMember', cascade='all')
537
538 notifications = relationship('UserNotification', cascade='all')
539 # notifications assigned to this user
540 user_created_notifications = relationship('Notification', cascade='all')
541 # comments created by this user
542 user_comments = relationship('ChangesetComment', cascade='all')
543 # user profile extra info
544 user_emails = relationship('UserEmailMap', cascade='all')
545 user_ip_map = relationship('UserIpMap', cascade='all')
546 user_auth_tokens = relationship('UserApiKeys', cascade='all')
547 # gists
548 user_gists = relationship('Gist', cascade='all')
549 # user pull requests
550 user_pull_requests = relationship('PullRequest', cascade='all')
551 # external identities
552 extenal_identities = relationship(
553 'ExternalIdentity',
554 primaryjoin="User.user_id==ExternalIdentity.local_user_id",
555 cascade='all')
556
557 def __unicode__(self):
558 return u"<%s('id:%s:%s')>" % (self.__class__.__name__,
559 self.user_id, self.username)
560
561 @hybrid_property
562 def email(self):
563 return self._email
564
565 @email.setter
566 def email(self, val):
567 self._email = val.lower() if val else None
568
569 @property
570 def firstname(self):
571 # alias for future
572 return self.name
573
574 @property
575 def emails(self):
576 other = UserEmailMap.query().filter(UserEmailMap.user==self).all()
577 return [self.email] + [x.email for x in other]
578
579 @property
580 def auth_tokens(self):
581 return [self.api_key] + [x.api_key for x in self.extra_auth_tokens]
582
583 @property
584 def extra_auth_tokens(self):
585 return UserApiKeys.query().filter(UserApiKeys.user == self).all()
586
587 @property
588 def feed_token(self):
589 feed_tokens = UserApiKeys.query()\
590 .filter(UserApiKeys.user == self)\
591 .filter(UserApiKeys.role == UserApiKeys.ROLE_FEED)\
592 .all()
593 if feed_tokens:
594 return feed_tokens[0].api_key
595 else:
596 # use the main token so we don't end up with nothing...
597 return self.api_key
598
599 @classmethod
600 def extra_valid_auth_tokens(cls, user, role=None):
601 tokens = UserApiKeys.query().filter(UserApiKeys.user == user)\
602 .filter(or_(UserApiKeys.expires == -1,
603 UserApiKeys.expires >= time.time()))
604 if role:
605 tokens = tokens.filter(or_(UserApiKeys.role == role,
606 UserApiKeys.role == UserApiKeys.ROLE_ALL))
607 return tokens.all()
608
609 @property
610 def ip_addresses(self):
611 ret = UserIpMap.query().filter(UserIpMap.user == self).all()
612 return [x.ip_addr for x in ret]
613
614 @property
615 def username_and_name(self):
616 return '%s (%s %s)' % (self.username, self.firstname, self.lastname)
617
618 @property
619 def username_or_name_or_email(self):
620         full_name = self.full_name if self.full_name != ' ' else None
621 return self.username or full_name or self.email
622
623 @property
624 def full_name(self):
625 return '%s %s' % (self.firstname, self.lastname)
626
627 @property
628 def full_name_or_username(self):
629 return ('%s %s' % (self.firstname, self.lastname)
630 if (self.firstname and self.lastname) else self.username)
631
632 @property
633 def full_contact(self):
634 return '%s %s <%s>' % (self.firstname, self.lastname, self.email)
635
636 @property
637 def short_contact(self):
638 return '%s %s' % (self.firstname, self.lastname)
639
640 @property
641 def is_admin(self):
642 return self.admin
643
644 @property
645 def AuthUser(self):
646 """
647 Returns instance of AuthUser for this user
648 """
649 from rhodecode.lib.auth import AuthUser
650 return AuthUser(user_id=self.user_id, api_key=self.api_key,
651 username=self.username)
652
653 @hybrid_property
654 def user_data(self):
655 if not self._user_data:
656 return {}
657
658 try:
659 return json.loads(self._user_data)
660 except TypeError:
661 return {}
662
663 @user_data.setter
664 def user_data(self, val):
665 if not isinstance(val, dict):
666 raise Exception('user_data must be dict, got %s' % type(val))
667 try:
668 self._user_data = json.dumps(val)
669 except Exception:
670 log.error(traceback.format_exc())
671
672 @classmethod
673 def get_by_username(cls, username, case_insensitive=False,
674 cache=False, identity_cache=False):
675 session = Session()
676
677 if case_insensitive:
678 q = cls.query().filter(
679 func.lower(cls.username) == func.lower(username))
680 else:
681 q = cls.query().filter(cls.username == username)
682
683 if cache:
684 if identity_cache:
685 val = cls.identity_cache(session, 'username', username)
686 if val:
687 return val
688 else:
689 q = q.options(
690 FromCache("sql_cache_short",
691 "get_user_by_name_%s" % _hash_key(username)))
692
693 return q.scalar()
694
695 @classmethod
696 def get_by_auth_token(cls, auth_token, cache=False, fallback=True):
697 q = cls.query().filter(cls.api_key == auth_token)
698
699 if cache:
700 q = q.options(FromCache("sql_cache_short",
701 "get_auth_token_%s" % auth_token))
702 res = q.scalar()
703
704 if fallback and not res:
705             # fallback to additional keys
706 _res = UserApiKeys.query()\
707 .filter(UserApiKeys.api_key == auth_token)\
708 .filter(or_(UserApiKeys.expires == -1,
709 UserApiKeys.expires >= time.time()))\
710 .first()
711 if _res:
712 res = _res.user
713 return res
714
715 @classmethod
716 def get_by_email(cls, email, case_insensitive=False, cache=False):
717
718 if case_insensitive:
719 q = cls.query().filter(func.lower(cls.email) == func.lower(email))
720
721 else:
722 q = cls.query().filter(cls.email == email)
723
724 if cache:
725 q = q.options(FromCache("sql_cache_short",
726 "get_email_key_%s" % _hash_key(email)))
727
728 ret = q.scalar()
729 if ret is None:
730 q = UserEmailMap.query()
731 # try fetching in alternate email map
732 if case_insensitive:
733 q = q.filter(func.lower(UserEmailMap.email) == func.lower(email))
734 else:
735 q = q.filter(UserEmailMap.email == email)
736 q = q.options(joinedload(UserEmailMap.user))
737 if cache:
738 q = q.options(FromCache("sql_cache_short",
739 "get_email_map_key_%s" % email))
740 ret = getattr(q.scalar(), 'user', None)
741
742 return ret
743
744 @classmethod
745 def get_from_cs_author(cls, author):
746 """
747 Tries to get User objects out of commit author string
748
749 :param author:
750 """
751 from rhodecode.lib.helpers import email, author_name
752 # Valid email in the attribute passed, see if they're in the system
753 _email = email(author)
754 if _email:
755 user = cls.get_by_email(_email, case_insensitive=True)
756 if user:
757 return user
758 # Maybe we can match by username?
759 _author = author_name(author)
760 user = cls.get_by_username(_author, case_insensitive=True)
761 if user:
762 return user
763
764 def update_userdata(self, **kwargs):
765 usr = self
766 old = usr.user_data
767 old.update(**kwargs)
768 usr.user_data = old
769 Session().add(usr)
770         log.debug('updated userdata with %s', kwargs)
771
772 def update_lastlogin(self):
773 """Update user lastlogin"""
774 self.last_login = datetime.datetime.now()
775 Session().add(self)
776 log.debug('updated user %s lastlogin', self.username)
777
778 def update_lastactivity(self):
779 """Update user lastactivity"""
780 usr = self
781 old = usr.user_data
782 old.update({'last_activity': time.time()})
783 usr.user_data = old
784 Session().add(usr)
785 log.debug('updated user %s lastactivity', usr.username)
786
787 def update_password(self, new_password, change_api_key=False):
788         from rhodecode.lib.auth import get_crypt_password, generate_auth_token
789
790 self.password = get_crypt_password(new_password)
791 if change_api_key:
792 self.api_key = generate_auth_token(self.username)
793 Session().add(self)
794
795 @classmethod
796 def get_first_super_admin(cls):
797 user = User.query().filter(User.admin == true()).first()
798 if user is None:
799 raise Exception('FATAL: Missing administrative account!')
800 return user
801
802 @classmethod
803 def get_all_super_admins(cls):
804 """
805 Returns all admin accounts sorted by username
806 """
807 return User.query().filter(User.admin == true())\
808 .order_by(User.username.asc()).all()
809
810 @classmethod
811 def get_default_user(cls, cache=False):
812 user = User.get_by_username(User.DEFAULT_USER, cache=cache)
813 if user is None:
814 raise Exception('FATAL: Missing default account!')
815 return user
816
817 def _get_default_perms(self, user, suffix=''):
818 from rhodecode.model.permission import PermissionModel
819 return PermissionModel().get_default_perms(user.user_perms, suffix)
820
821 def get_default_perms(self, suffix=''):
822 return self._get_default_perms(self, suffix)
823
824 def get_api_data(self, include_secrets=False, details='full'):
825 """
826 Common function for generating user related data for API
827
828         :param include_secrets: By default secrets in the API data are replaced
829         by a placeholder value to prevent exposing them by accident. If this
830         data should be exposed, set this flag to ``True``.
831
832         :param details: Either 'basic' or 'full'. 'basic' returns only a subset
833         of the available user information: user_id, name and emails.
834 """
835 user = self
836 user_data = self.user_data
837 data = {
838 'user_id': user.user_id,
839 'username': user.username,
840 'firstname': user.name,
841 'lastname': user.lastname,
842 'email': user.email,
843 'emails': user.emails,
844 }
845 if details == 'basic':
846 return data
847
848 api_key_length = 40
849 api_key_replacement = '*' * api_key_length
850
851 extras = {
852 'api_key': api_key_replacement,
853 'api_keys': [api_key_replacement],
854 'active': user.active,
855 'admin': user.admin,
856 'extern_type': user.extern_type,
857 'extern_name': user.extern_name,
858 'last_login': user.last_login,
859 'ip_addresses': user.ip_addresses,
860 'language': user_data.get('language')
861 }
862 data.update(extras)
863
864 if include_secrets:
865 data['api_key'] = user.api_key
866 data['api_keys'] = user.auth_tokens
867 return data
868
869 def __json__(self):
870 data = {
871 'full_name': self.full_name,
872 'full_name_or_username': self.full_name_or_username,
873 'short_contact': self.short_contact,
874 'full_contact': self.full_contact,
875 }
876 data.update(self.get_api_data())
877 return data
878
879
880 class UserApiKeys(Base, BaseModel):
881 __tablename__ = 'user_api_keys'
882 __table_args__ = (
883 Index('uak_api_key_idx', 'api_key'),
884 Index('uak_api_key_expires_idx', 'api_key', 'expires'),
885 UniqueConstraint('api_key'),
886 {'extend_existing': True, 'mysql_engine': 'InnoDB',
887 'mysql_charset': 'utf8', 'sqlite_autoincrement': True}
888 )
889 __mapper_args__ = {}
890
891 # ApiKey role
892 ROLE_ALL = 'token_role_all'
893 ROLE_HTTP = 'token_role_http'
894 ROLE_VCS = 'token_role_vcs'
895 ROLE_API = 'token_role_api'
896 ROLE_FEED = 'token_role_feed'
897 ROLES = [ROLE_ALL, ROLE_HTTP, ROLE_VCS, ROLE_API, ROLE_FEED]
898
899 user_api_key_id = Column("user_api_key_id", Integer(), nullable=False, unique=True, default=None, primary_key=True)
900 user_id = Column("user_id", Integer(), ForeignKey('users.user_id'), nullable=True, unique=None, default=None)
901 api_key = Column("api_key", String(255), nullable=False, unique=True)
902 description = Column('description', UnicodeText().with_variant(UnicodeText(1024), 'mysql'))
903 expires = Column('expires', Float(53), nullable=False)
904 role = Column('role', String(255), nullable=True)
905 created_on = Column('created_on', DateTime(timezone=False), nullable=False, default=datetime.datetime.now)
906
907 user = relationship('User', lazy='joined')
908
909 @classmethod
910 def _get_role_name(cls, role):
911 return {
912 cls.ROLE_ALL: _('all'),
913 cls.ROLE_HTTP: _('http/web interface'),
914 cls.ROLE_VCS: _('vcs (git/hg/svn protocol)'),
915 cls.ROLE_API: _('api calls'),
916 cls.ROLE_FEED: _('feed access'),
917 }.get(role, role)
918
919 @property
920 def expired(self):
921 if self.expires == -1:
922 return False
923 return time.time() > self.expires
924
925 @property
926 def role_humanized(self):
927 return self._get_role_name(self.role)
928
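The `expired` property above boils down to a sentinel check against the stored unix timestamp; a minimal standalone sketch (function and constant names are hypothetical):

```python
import time

NEVER_EXPIRES = -1  # sentinel stored in UserApiKeys.expires

def token_expired(expires, now=None):
    # A token whose expires field is the -1 sentinel never expires;
    # otherwise it is expired once the current time passes the stamp.
    if expires == NEVER_EXPIRES:
        return False
    now = time.time() if now is None else now
    return now > expires
```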
929
930 class UserEmailMap(Base, BaseModel):
931 __tablename__ = 'user_email_map'
932 __table_args__ = (
933 Index('uem_email_idx', 'email'),
934 UniqueConstraint('email'),
935 {'extend_existing': True, 'mysql_engine': 'InnoDB',
936 'mysql_charset': 'utf8', 'sqlite_autoincrement': True}
937 )
938 __mapper_args__ = {}
939
940 email_id = Column("email_id", Integer(), nullable=False, unique=True, default=None, primary_key=True)
941 user_id = Column("user_id", Integer(), ForeignKey('users.user_id'), nullable=True, unique=None, default=None)
942 _email = Column("email", String(255), nullable=True, unique=False, default=None)
943 user = relationship('User', lazy='joined')
944
945 @validates('_email')
946 def validate_email(self, key, email):
947 # check if this email is not main one
948 main_email = Session().query(User).filter(User.email == email).scalar()
949 if main_email is not None:
950             raise AttributeError('email %s is present in user table' % email)
951 return email
952
953 @hybrid_property
954 def email(self):
955 return self._email
956
957 @email.setter
958 def email(self, val):
959 self._email = val.lower() if val else None
960
961
962 class UserIpMap(Base, BaseModel):
963 __tablename__ = 'user_ip_map'
964 __table_args__ = (
965 UniqueConstraint('user_id', 'ip_addr'),
966 {'extend_existing': True, 'mysql_engine': 'InnoDB',
967 'mysql_charset': 'utf8', 'sqlite_autoincrement': True}
968 )
969 __mapper_args__ = {}
970
971 ip_id = Column("ip_id", Integer(), nullable=False, unique=True, default=None, primary_key=True)
972 user_id = Column("user_id", Integer(), ForeignKey('users.user_id'), nullable=True, unique=None, default=None)
973 ip_addr = Column("ip_addr", String(255), nullable=True, unique=False, default=None)
974 active = Column("active", Boolean(), nullable=True, unique=None, default=True)
975 description = Column("description", String(10000), nullable=True, unique=None, default=None)
976 user = relationship('User', lazy='joined')
977
978 @classmethod
979 def _get_ip_range(cls, ip_addr):
980 net = ipaddress.ip_network(ip_addr, strict=False)
981 return [str(net.network_address), str(net.broadcast_address)]
982
983 def __json__(self):
984 return {
985 'ip_addr': self.ip_addr,
986 'ip_range': self._get_ip_range(self.ip_addr),
987 }
988
989 def __unicode__(self):
990 return u"<%s('user_id:%s=>%s')>" % (self.__class__.__name__,
991 self.user_id, self.ip_addr)
992
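`UserIpMap._get_ip_range` above expands an address or CIDR block into its boundary addresses via the stdlib `ipaddress` module; a standalone sketch (on Python 2, which this codebase targets, the `ipaddress` backport and unicode input would be needed):

```python
import ipaddress

def ip_range(ip_addr):
    # strict=False allows host bits to be set, e.g. '10.0.0.5/24';
    # a bare address is treated as a /32 (or /128) network.
    net = ipaddress.ip_network(ip_addr, strict=False)
    return [str(net.network_address), str(net.broadcast_address)]
```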
993 class UserLog(Base, BaseModel):
994 __tablename__ = 'user_logs'
995 __table_args__ = (
996 {'extend_existing': True, 'mysql_engine': 'InnoDB',
997 'mysql_charset': 'utf8', 'sqlite_autoincrement': True},
998 )
999 user_log_id = Column("user_log_id", Integer(), nullable=False, unique=True, default=None, primary_key=True)
1000 user_id = Column("user_id", Integer(), ForeignKey('users.user_id'), nullable=True, unique=None, default=None)
1001 username = Column("username", String(255), nullable=True, unique=None, default=None)
1002 repository_id = Column("repository_id", Integer(), ForeignKey('repositories.repo_id'), nullable=True)
1003 repository_name = Column("repository_name", String(255), nullable=True, unique=None, default=None)
1004 user_ip = Column("user_ip", String(255), nullable=True, unique=None, default=None)
1005 action = Column("action", Text().with_variant(Text(1200000), 'mysql'), nullable=True, unique=None, default=None)
1006 action_date = Column("action_date", DateTime(timezone=False), nullable=True, unique=None, default=None)
1007
1008 def __unicode__(self):
1009 return u"<%s('id:%s:%s')>" % (self.__class__.__name__,
1010 self.repository_name,
1011 self.action)
1012
1013 @property
1014 def action_as_day(self):
1015 return datetime.date(*self.action_date.timetuple()[:3])
1016
1017 user = relationship('User')
1018 repository = relationship('Repository', cascade='')
1019
1020
1021 class UserGroup(Base, BaseModel):
1022 __tablename__ = 'users_groups'
1023 __table_args__ = (
1024 {'extend_existing': True, 'mysql_engine': 'InnoDB',
1025 'mysql_charset': 'utf8', 'sqlite_autoincrement': True},
1026 )
1027
1028 users_group_id = Column("users_group_id", Integer(), nullable=False, unique=True, default=None, primary_key=True)
1029 users_group_name = Column("users_group_name", String(255), nullable=False, unique=True, default=None)
1030 user_group_description = Column("user_group_description", String(10000), nullable=True, unique=None, default=None)
1031 users_group_active = Column("users_group_active", Boolean(), nullable=True, unique=None, default=None)
1032 inherit_default_permissions = Column("users_group_inherit_default_permissions", Boolean(), nullable=False, unique=None, default=True)
1033 user_id = Column("user_id", Integer(), ForeignKey('users.user_id'), nullable=False, unique=False, default=None)
1034 created_on = Column('created_on', DateTime(timezone=False), nullable=False, default=datetime.datetime.now)
1035 _group_data = Column("group_data", LargeBinary(), nullable=True) # JSON data
1036
1037 members = relationship('UserGroupMember', cascade="all, delete, delete-orphan", lazy="joined")
1038 users_group_to_perm = relationship('UserGroupToPerm', cascade='all')
1039 users_group_repo_to_perm = relationship('UserGroupRepoToPerm', cascade='all')
1040 users_group_repo_group_to_perm = relationship('UserGroupRepoGroupToPerm', cascade='all')
1041 user_user_group_to_perm = relationship('UserUserGroupToPerm', cascade='all')
1042     user_group_user_group_to_perm = relationship('UserGroupUserGroupToPerm', primaryjoin="UserGroupUserGroupToPerm.target_user_group_id==UserGroup.users_group_id", cascade='all')
1043
1044 user = relationship('User')
1045
1046 @hybrid_property
1047 def group_data(self):
1048 if not self._group_data:
1049 return {}
1050
1051 try:
1052 return json.loads(self._group_data)
1053 except TypeError:
1054 return {}
1055
1056 @group_data.setter
1057 def group_data(self, val):
1058 try:
1059 self._group_data = json.dumps(val)
1060 except Exception:
1061 log.error(traceback.format_exc())
1062
1063 def __unicode__(self):
1064 return u"<%s('id:%s:%s')>" % (self.__class__.__name__,
1065 self.users_group_id,
1066 self.users_group_name)
1067
1068 @classmethod
1069 def get_by_group_name(cls, group_name, cache=False,
1070 case_insensitive=False):
1071 if case_insensitive:
1072 q = cls.query().filter(func.lower(cls.users_group_name) ==
1073 func.lower(group_name))
1074
1075 else:
1076 q = cls.query().filter(cls.users_group_name == group_name)
1077 if cache:
1078 q = q.options(FromCache(
1079 "sql_cache_short",
1080 "get_group_%s" % _hash_key(group_name)))
1081 return q.scalar()
1082
1083 @classmethod
1084 def get(cls, user_group_id, cache=False):
1085 user_group = cls.query()
1086 if cache:
1087 user_group = user_group.options(FromCache("sql_cache_short",
1088 "get_users_group_%s" % user_group_id))
1089 return user_group.get(user_group_id)
1090
1091 def permissions(self, with_admins=True, with_owner=True):
1092 q = UserUserGroupToPerm.query().filter(UserUserGroupToPerm.user_group == self)
1093 q = q.options(joinedload(UserUserGroupToPerm.user_group),
1094 joinedload(UserUserGroupToPerm.user),
1095 joinedload(UserUserGroupToPerm.permission),)
1096
1097         # get owners, admins and permissions. We rewrite the sqlalchemy
1098         # objects into plain AttributeDicts because the sqlalchemy session
1099         # holds a global reference, so changing one object would propagate
1100         # to all others. E.g. if an admin is also the owner, setting
1101         # admin_row on one record would otherwise change both objects.
1102 perm_rows = []
1103 for _usr in q.all():
1104 usr = AttributeDict(_usr.user.get_dict())
1105 usr.permission = _usr.permission.permission_name
1106 perm_rows.append(usr)
1107
1108 # filter the perm rows by 'default' first and then sort them by
1109 # admin,write,read,none permissions sorted again alphabetically in
1110 # each group
1111 perm_rows = sorted(perm_rows, key=display_sort)
1112
1113 _admin_perm = 'usergroup.admin'
1114 owner_row = []
1115 if with_owner:
1116 usr = AttributeDict(self.user.get_dict())
1117 usr.owner_row = True
1118 usr.permission = _admin_perm
1119 owner_row.append(usr)
1120
1121 super_admin_rows = []
1122 if with_admins:
1123 for usr in User.get_all_super_admins():
1124 # if this admin is also owner, don't double the record
1125                 if owner_row and usr.user_id == owner_row[0].user_id:
1126 owner_row[0].admin_row = True
1127 else:
1128 usr = AttributeDict(usr.get_dict())
1129 usr.admin_row = True
1130 usr.permission = _admin_perm
1131 super_admin_rows.append(usr)
1132
1133 return super_admin_rows + owner_row + perm_rows
1134
1135 def permission_user_groups(self):
1136 q = UserGroupUserGroupToPerm.query().filter(UserGroupUserGroupToPerm.target_user_group == self)
1137 q = q.options(joinedload(UserGroupUserGroupToPerm.user_group),
1138 joinedload(UserGroupUserGroupToPerm.target_user_group),
1139 joinedload(UserGroupUserGroupToPerm.permission),)
1140
1141 perm_rows = []
1142 for _user_group in q.all():
1143 usr = AttributeDict(_user_group.user_group.get_dict())
1144 usr.permission = _user_group.permission.permission_name
1145 perm_rows.append(usr)
1146
1147 return perm_rows
1148
1149 def _get_default_perms(self, user_group, suffix=''):
1150 from rhodecode.model.permission import PermissionModel
1151 return PermissionModel().get_default_perms(user_group.users_group_to_perm, suffix)
1152
1153 def get_default_perms(self, suffix=''):
1154 return self._get_default_perms(self, suffix)
1155
1156 def get_api_data(self, with_group_members=True, include_secrets=False):
1157 """
1158 :param include_secrets: See :meth:`User.get_api_data`, this parameter is
1159 basically forwarded.
1160
1161 """
1162 user_group = self
1163
1164 data = {
1165 'users_group_id': user_group.users_group_id,
1166 'group_name': user_group.users_group_name,
1167 'group_description': user_group.user_group_description,
1168 'active': user_group.users_group_active,
1169 'owner': user_group.user.username,
1170 }
1171 if with_group_members:
1172 users = []
1173 for user in user_group.members:
1174 user = user.user
1175 users.append(user.get_api_data(include_secrets=include_secrets))
1176 data['users'] = users
1177
1178 return data
1179
1180
1181 class UserGroupMember(Base, BaseModel):
1182 __tablename__ = 'users_groups_members'
1183 __table_args__ = (
1184 {'extend_existing': True, 'mysql_engine': 'InnoDB',
1185 'mysql_charset': 'utf8', 'sqlite_autoincrement': True},
1186 )
1187
1188 users_group_member_id = Column("users_group_member_id", Integer(), nullable=False, unique=True, default=None, primary_key=True)
1189 users_group_id = Column("users_group_id", Integer(), ForeignKey('users_groups.users_group_id'), nullable=False, unique=None, default=None)
1190 user_id = Column("user_id", Integer(), ForeignKey('users.user_id'), nullable=False, unique=None, default=None)
1191
1192 user = relationship('User', lazy='joined')
1193 users_group = relationship('UserGroup')
1194
1195 def __init__(self, gr_id='', u_id=''):
1196 self.users_group_id = gr_id
1197 self.user_id = u_id
1198
1199
1200 class RepositoryField(Base, BaseModel):
1201 __tablename__ = 'repositories_fields'
1202 __table_args__ = (
1203 UniqueConstraint('repository_id', 'field_key'), # no-multi field
1204 {'extend_existing': True, 'mysql_engine': 'InnoDB',
1205 'mysql_charset': 'utf8', 'sqlite_autoincrement': True},
1206 )
1207 PREFIX = 'ex_' # prefix used in form to not conflict with already existing fields
1208
1209 repo_field_id = Column("repo_field_id", Integer(), nullable=False, unique=True, default=None, primary_key=True)
1210 repository_id = Column("repository_id", Integer(), ForeignKey('repositories.repo_id'), nullable=False, unique=None, default=None)
1211 field_key = Column("field_key", String(250))
1212 field_label = Column("field_label", String(1024), nullable=False)
1213 field_value = Column("field_value", String(10000), nullable=False)
1214 field_desc = Column("field_desc", String(1024), nullable=False)
1215 field_type = Column("field_type", String(255), nullable=False, unique=None)
1216 created_on = Column('created_on', DateTime(timezone=False), nullable=False, default=datetime.datetime.now)
1217
1218 repository = relationship('Repository')
1219
1220 @property
1221 def field_key_prefixed(self):
1222         return '%s%s' % (self.PREFIX, self.field_key)
1223
1224 @classmethod
1225 def un_prefix_key(cls, key):
1226 if key.startswith(cls.PREFIX):
1227 return key[len(cls.PREFIX):]
1228 return key
1229
1230 @classmethod
1231 def get_by_key_name(cls, key, repo):
1232 row = cls.query()\
1233 .filter(cls.repository == repo)\
1234 .filter(cls.field_key == key).scalar()
1235 return row
1236
1237
1238 class Repository(Base, BaseModel):
1239 __tablename__ = 'repositories'
1240 __table_args__ = (
1241 Index('r_repo_name_idx', 'repo_name', mysql_length=255),
1242 {'extend_existing': True, 'mysql_engine': 'InnoDB',
1243 'mysql_charset': 'utf8', 'sqlite_autoincrement': True},
1244 )
1245 DEFAULT_CLONE_URI = '{scheme}://{user}@{netloc}/{repo}'
1246 DEFAULT_CLONE_URI_ID = '{scheme}://{user}@{netloc}/_{repoid}'
1247
1248 STATE_CREATED = 'repo_state_created'
1249 STATE_PENDING = 'repo_state_pending'
1250 STATE_ERROR = 'repo_state_error'
1251
1252 LOCK_AUTOMATIC = 'lock_auto'
1253 LOCK_API = 'lock_api'
1254 LOCK_WEB = 'lock_web'
1255 LOCK_PULL = 'lock_pull'
1256
1257 NAME_SEP = URL_SEP
1258
1259 repo_id = Column(
1260 "repo_id", Integer(), nullable=False, unique=True, default=None,
1261 primary_key=True)
1262 _repo_name = Column(
1263 "repo_name", Text(), nullable=False, default=None)
1264 _repo_name_hash = Column(
1265 "repo_name_hash", String(255), nullable=False, unique=True)
1266 repo_state = Column("repo_state", String(255), nullable=True)
1267
1268 clone_uri = Column(
1269 "clone_uri", EncryptedTextValue(), nullable=True, unique=False,
1270 default=None)
1271 repo_type = Column(
1272 "repo_type", String(255), nullable=False, unique=False, default=None)
1273 user_id = Column(
1274 "user_id", Integer(), ForeignKey('users.user_id'), nullable=False,
1275 unique=False, default=None)
1276 private = Column(
1277 "private", Boolean(), nullable=True, unique=None, default=None)
1278 enable_statistics = Column(
1279 "statistics", Boolean(), nullable=True, unique=None, default=True)
1280 enable_downloads = Column(
1281 "downloads", Boolean(), nullable=True, unique=None, default=True)
1282 description = Column(
1283 "description", String(10000), nullable=True, unique=None, default=None)
1284 created_on = Column(
1285 'created_on', DateTime(timezone=False), nullable=True, unique=None,
1286 default=datetime.datetime.now)
1287 updated_on = Column(
1288 'updated_on', DateTime(timezone=False), nullable=True, unique=None,
1289 default=datetime.datetime.now)
1290 _landing_revision = Column(
1291 "landing_revision", String(255), nullable=False, unique=False,
1292 default=None)
1293 enable_locking = Column(
1294 "enable_locking", Boolean(), nullable=False, unique=None,
1295 default=False)
1296 _locked = Column(
1297 "locked", String(255), nullable=True, unique=False, default=None)
1298 _changeset_cache = Column(
1299 "changeset_cache", LargeBinary(), nullable=True) # JSON data
1300
1301 fork_id = Column(
1302 "fork_id", Integer(), ForeignKey('repositories.repo_id'),
1303 nullable=True, unique=False, default=None)
1304 group_id = Column(
1305 "group_id", Integer(), ForeignKey('groups.group_id'), nullable=True,
1306 unique=False, default=None)
1307
1308 user = relationship('User', lazy='joined')
1309 fork = relationship('Repository', remote_side=repo_id, lazy='joined')
1310 group = relationship('RepoGroup', lazy='joined')
1311 repo_to_perm = relationship(
1312 'UserRepoToPerm', cascade='all',
1313 order_by='UserRepoToPerm.repo_to_perm_id')
1314 users_group_to_perm = relationship('UserGroupRepoToPerm', cascade='all')
1315 stats = relationship('Statistics', cascade='all', uselist=False)
1316
1317 followers = relationship(
1318 'UserFollowing',
1319 primaryjoin='UserFollowing.follows_repo_id==Repository.repo_id',
1320 cascade='all')
1321 extra_fields = relationship(
1322 'RepositoryField', cascade="all, delete, delete-orphan")
1323 logs = relationship('UserLog')
1324 comments = relationship(
1325 'ChangesetComment', cascade="all, delete, delete-orphan")
1326 pull_requests_source = relationship(
1327 'PullRequest',
1328 primaryjoin='PullRequest.source_repo_id==Repository.repo_id',
1329 cascade="all, delete, delete-orphan")
1330 pull_requests_target = relationship(
1331 'PullRequest',
1332 primaryjoin='PullRequest.target_repo_id==Repository.repo_id',
1333 cascade="all, delete, delete-orphan")
1334 ui = relationship('RepoRhodeCodeUi', cascade="all")
1335 settings = relationship('RepoRhodeCodeSetting', cascade="all")
1336 integrations = relationship('Integration',
1337 cascade="all, delete, delete-orphan")
1338
1339 def __unicode__(self):
1340 return u"<%s('%s:%s')>" % (self.__class__.__name__, self.repo_id,
1341 safe_unicode(self.repo_name))
1342
1343 @hybrid_property
1344 def landing_rev(self):
1345 # always should return [rev_type, rev]
1346 if self._landing_revision:
1347 _rev_info = self._landing_revision.split(':')
1348 if len(_rev_info) < 2:
1349 _rev_info.insert(0, 'rev')
1350 return [_rev_info[0], _rev_info[1]]
1351 return [None, None]
1352
1353 @landing_rev.setter
1354 def landing_rev(self, val):
1355 if ':' not in val:
1356 raise ValueError('value must be delimited with `:` and consist '
1357 'of <rev_type>:<rev>, got %s instead' % val)
1358 self._landing_revision = val
1359
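The `landing_rev` getter/setter pair above round-trips a `<rev_type>:<rev>` string; the parsing side can be sketched standalone (function name hypothetical):

```python
def parse_landing_rev(raw):
    # Stored form is '<rev_type>:<rev>'; legacy values without the
    # type prefix default to the 'rev' type.
    if not raw:
        return [None, None]
    parts = raw.split(':')
    if len(parts) < 2:
        parts.insert(0, 'rev')
    return [parts[0], parts[1]]
```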
1360 @hybrid_property
1361 def locked(self):
1362 if self._locked:
1363 user_id, timelocked, reason = self._locked.split(':')
1364             lock_values = [int(user_id), timelocked, reason]
1365 else:
1366 lock_values = [None, None, None]
1367 return lock_values
1368
1369 @locked.setter
1370 def locked(self, val):
1371 if val and isinstance(val, (list, tuple)):
1372 self._locked = ':'.join(map(str, val))
1373 else:
1374 self._locked = None
1375
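The `locked` property pair serialises lock state into a `user_id:time:reason` string; a minimal round-trip sketch (names hypothetical; as in the original, a reason containing `:` would break the naive split):

```python
def encode_lock(user_id, lock_time, reason):
    # Mirror of the locked setter: join the three fields with ':'.
    return ':'.join(map(str, [user_id, lock_time, reason]))

def decode_lock(raw):
    # Mirror of the locked getter: an empty value means 'not locked'.
    if not raw:
        return [None, None, None]
    user_id, lock_time, reason = raw.split(':')
    return [int(user_id), lock_time, reason]
```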
1376 @hybrid_property
1377 def changeset_cache(self):
1378 from rhodecode.lib.vcs.backends.base import EmptyCommit
1379 dummy = EmptyCommit().__json__()
1380 if not self._changeset_cache:
1381 return dummy
1382 try:
1383 return json.loads(self._changeset_cache)
1384 except TypeError:
1385 return dummy
1386 except Exception:
1387 log.error(traceback.format_exc())
1388 return dummy
1389
1390 @changeset_cache.setter
1391 def changeset_cache(self, val):
1392 try:
1393 self._changeset_cache = json.dumps(val)
1394 except Exception:
1395 log.error(traceback.format_exc())
1396
1397 @hybrid_property
1398 def repo_name(self):
1399 return self._repo_name
1400
1401 @repo_name.setter
1402 def repo_name(self, value):
1403 self._repo_name = value
1404 self._repo_name_hash = hashlib.sha1(safe_str(value)).hexdigest()
1405
1406 @classmethod
1407 def normalize_repo_name(cls, repo_name):
1408 """
1409 Normalizes os specific repo_name to the format internally stored inside
1410 database using URL_SEP
1411
1412 :param cls:
1413 :param repo_name:
1414 """
1415 return cls.NAME_SEP.join(repo_name.split(os.sep))
1416
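`normalize_repo_name` above just swaps the OS path separator for the canonical URL separator; a standalone sketch (URL_SEP assumed to be `/`, matching `NAME_SEP` usage elsewhere in the file):

```python
import os

URL_SEP = '/'  # assumed canonical separator stored in the database

def normalize_repo_name(repo_name):
    # Convert an OS-specific path (e.g. 'group\\repo' on Windows)
    # into the canonical 'group/repo' form stored in the database.
    return URL_SEP.join(repo_name.split(os.sep))
```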
1417 @classmethod
1418 def get_by_repo_name(cls, repo_name, cache=False, identity_cache=False):
1419 session = Session()
1420 q = session.query(cls).filter(cls.repo_name == repo_name)
1421
1422 if cache:
1423 if identity_cache:
1424 val = cls.identity_cache(session, 'repo_name', repo_name)
1425 if val:
1426 return val
1427 else:
1428 q = q.options(
1429 FromCache("sql_cache_short",
1430 "get_repo_by_name_%s" % _hash_key(repo_name)))
1431
1432 return q.scalar()
1433
1434 @classmethod
1435 def get_by_full_path(cls, repo_full_path):
1436 repo_name = repo_full_path.split(cls.base_path(), 1)[-1]
1437 repo_name = cls.normalize_repo_name(repo_name)
1438 return cls.get_by_repo_name(repo_name.strip(URL_SEP))
1439
1440 @classmethod
1441 def get_repo_forks(cls, repo_id):
1442 return cls.query().filter(Repository.fork_id == repo_id)
1443
1444 @classmethod
1445 def base_path(cls):
1446 """
1447         Returns base path where all repos are stored
1448
1449 :param cls:
1450 """
1451 q = Session().query(RhodeCodeUi)\
1452 .filter(RhodeCodeUi.ui_key == cls.NAME_SEP)
1453 q = q.options(FromCache("sql_cache_short", "repository_repo_path"))
1454 return q.one().ui_value
1455
1456 @classmethod
1457 def is_valid(cls, repo_name):
1458 """
1459 returns True if given repo name is a valid filesystem repository
1460
1461 :param cls:
1462 :param repo_name:
1463 """
1464 from rhodecode.lib.utils import is_valid_repo
1465
1466 return is_valid_repo(repo_name, cls.base_path())
1467
1468 @classmethod
1469 def get_all_repos(cls, user_id=Optional(None), group_id=Optional(None),
1470 case_insensitive=True):
1471 q = Repository.query()
1472
1473 if not isinstance(user_id, Optional):
1474 q = q.filter(Repository.user_id == user_id)
1475
1476 if not isinstance(group_id, Optional):
1477 q = q.filter(Repository.group_id == group_id)
1478
1479 if case_insensitive:
1480 q = q.order_by(func.lower(Repository.repo_name))
1481 else:
1482 q = q.order_by(Repository.repo_name)
1483 return q.all()
1484
1485 @property
1486 def forks(self):
1487 """
1488 Return forks of this repo
1489 """
1490 return Repository.get_repo_forks(self.repo_id)
1491
1492 @property
1493 def parent(self):
1494 """
1495 Returns fork parent
1496 """
1497 return self.fork
1498
1499 @property
1500 def just_name(self):
1501 return self.repo_name.split(self.NAME_SEP)[-1]
1502
1503 @property
1504 def groups_with_parents(self):
1505 groups = []
1506 if self.group is None:
1507 return groups
1508
1509 cur_gr = self.group
1510 groups.insert(0, cur_gr)
1511         while True:
1512 gr = getattr(cur_gr, 'parent_group', None)
1513 cur_gr = cur_gr.parent_group
1514 if gr is None:
1515 break
1516 groups.insert(0, gr)
1517
1518 return groups
1519
1520 @property
1521 def groups_and_repo(self):
1522 return self.groups_with_parents, self
1523
1524 @LazyProperty
1525 def repo_path(self):
1526 """
1527         Returns the full filesystem path for this repository, i.e. where it
1528         actually exists on disk
1529 """
1530 q = Session().query(RhodeCodeUi).filter(
1531 RhodeCodeUi.ui_key == self.NAME_SEP)
1532 q = q.options(FromCache("sql_cache_short", "repository_repo_path"))
1533 return q.one().ui_value
1534
1535 @property
1536 def repo_full_path(self):
1537 p = [self.repo_path]
1538 # we need to split the name by / since this is how we store the
1539 # names in the database, but that eventually needs to be converted
1540 # into a valid system path
1541 p += self.repo_name.split(self.NAME_SEP)
1542 return os.path.join(*map(safe_unicode, p))
1543
1544 @property
1545 def cache_keys(self):
1546 """
1547 Returns associated cache keys for that repo
1548 """
1549 return CacheKey.query()\
1550 .filter(CacheKey.cache_args == self.repo_name)\
1551 .order_by(CacheKey.cache_key)\
1552 .all()
1553
1554 def get_new_name(self, repo_name):
1555 """
1556         returns new full repository name based on assigned group and the new name
1557
1558         :param repo_name:
1559 """
1560 path_prefix = self.group.full_path_splitted if self.group else []
1561 return self.NAME_SEP.join(path_prefix + [repo_name])
1562
1563 @property
1564 def _config(self):
1565 """
1566 Returns db based config object.
1567 """
1568 from rhodecode.lib.utils import make_db_config
1569 return make_db_config(clear_session=False, repo=self)
1570
1571 def permissions(self, with_admins=True, with_owner=True):
1572 q = UserRepoToPerm.query().filter(UserRepoToPerm.repository == self)
1573 q = q.options(joinedload(UserRepoToPerm.repository),
1574 joinedload(UserRepoToPerm.user),
1575 joinedload(UserRepoToPerm.permission),)
1576
1577         # get owners, admins and permissions. We rewrite the sqlalchemy
1578         # objects into plain AttributeDicts because the sqlalchemy session
1579         # holds a global reference, so changing one object would propagate
1580         # to all others. E.g. if an admin is also the owner, setting
1581         # admin_row on one record would otherwise change both objects.
1582 perm_rows = []
1583 for _usr in q.all():
1584 usr = AttributeDict(_usr.user.get_dict())
1585 usr.permission = _usr.permission.permission_name
1586 perm_rows.append(usr)
1587
1588 # filter the perm rows by 'default' first and then sort them by
1589 # admin,write,read,none permissions sorted again alphabetically in
1590 # each group
1591 perm_rows = sorted(perm_rows, key=display_sort)
1592
1593 _admin_perm = 'repository.admin'
1594 owner_row = []
1595 if with_owner:
1596 usr = AttributeDict(self.user.get_dict())
1597 usr.owner_row = True
1598 usr.permission = _admin_perm
1599 owner_row.append(usr)
1600
1601 super_admin_rows = []
1602 if with_admins:
1603 for usr in User.get_all_super_admins():
1604 # if this admin is also owner, don't double the record
1605                 if owner_row and usr.user_id == owner_row[0].user_id:
1606 owner_row[0].admin_row = True
1607 else:
1608 usr = AttributeDict(usr.get_dict())
1609 usr.admin_row = True
1610 usr.permission = _admin_perm
1611 super_admin_rows.append(usr)
1612
1613 return super_admin_rows + owner_row + perm_rows
1614
1615 def permission_user_groups(self):
1616 q = UserGroupRepoToPerm.query().filter(
1617 UserGroupRepoToPerm.repository == self)
1618 q = q.options(joinedload(UserGroupRepoToPerm.repository),
1619 joinedload(UserGroupRepoToPerm.users_group),
1620 joinedload(UserGroupRepoToPerm.permission),)
1621
1622 perm_rows = []
1623 for _user_group in q.all():
1624 usr = AttributeDict(_user_group.users_group.get_dict())
1625 usr.permission = _user_group.permission.permission_name
1626 perm_rows.append(usr)
1627
1628 return perm_rows
1629
1630 def get_api_data(self, include_secrets=False):
1631 """
1632 Common function for generating repo api data
1633
1634 :param include_secrets: See :meth:`User.get_api_data`.
1635
1636 """
1637         # TODO: mikhail: This is an anti-pattern; we probably need to move
1638         # these methods to the model level.
1639 from rhodecode.model.settings import SettingsModel
1640
1641 repo = self
1642 _user_id, _time, _reason = self.locked
1643
1644 data = {
1645 'repo_id': repo.repo_id,
1646 'repo_name': repo.repo_name,
1647 'repo_type': repo.repo_type,
1648 'clone_uri': repo.clone_uri or '',
1649 'url': url('summary_home', repo_name=self.repo_name, qualified=True),
1650 'private': repo.private,
1651 'created_on': repo.created_on,
1652 'description': repo.description,
1653 'landing_rev': repo.landing_rev,
1654 'owner': repo.user.username,
1655 'fork_of': repo.fork.repo_name if repo.fork else None,
1656 'enable_statistics': repo.enable_statistics,
1657 'enable_locking': repo.enable_locking,
1658 'enable_downloads': repo.enable_downloads,
1659 'last_changeset': repo.changeset_cache,
1660 'locked_by': User.get(_user_id).get_api_data(
1661 include_secrets=include_secrets) if _user_id else None,
1662 'locked_date': time_to_datetime(_time) if _time else None,
1663 'lock_reason': _reason if _reason else None,
1664 }
1665
1666 # TODO: mikhail: should be per-repo settings here
1667 rc_config = SettingsModel().get_all_settings()
1668 repository_fields = str2bool(
1669 rc_config.get('rhodecode_repository_fields'))
1670 if repository_fields:
1671 for f in self.extra_fields:
1672 data[f.field_key_prefixed] = f.field_value
1673
1674 return data
1675
1676 @classmethod
1677 def lock(cls, repo, user_id, lock_time=None, lock_reason=None):
1678 if not lock_time:
1679 lock_time = time.time()
1680 if not lock_reason:
1681 lock_reason = cls.LOCK_AUTOMATIC
1682 repo.locked = [user_id, lock_time, lock_reason]
1683 Session().add(repo)
1684 Session().commit()
1685
1686 @classmethod
1687 def unlock(cls, repo):
1688 repo.locked = None
1689 Session().add(repo)
1690 Session().commit()
1691
1692 @classmethod
1693 def getlock(cls, repo):
1694 return repo.locked
1695
1696 def is_user_lock(self, user_id):
1697 if self.locked[0]:
1698 lock_user_id = safe_int(self.locked[0])
1699 user_id = safe_int(user_id)
1700 # both are ints, and they are equal
1701 return all([lock_user_id, user_id]) and lock_user_id == user_id
1702
1703 return False
1704
1705 def get_locking_state(self, action, user_id, only_when_enabled=True):
1706 """
1707 Checks locking on this repository. If locking is enabled and a lock
1708 is present, returns a tuple of (make_lock, locked, locked_by).
1709 make_lock has 3 states: None (do nothing), True (make a lock) and
1710 False (release a lock). This value is later propagated to the hooks,
1711 which do the actual locking. Think of it as a signal telling the hooks what to do.
1712
1713 """
1714 # TODO: johbo: This is part of the business logic and should be moved
1715 # into the RepositoryModel.
1716
1717 if action not in ('push', 'pull'):
1718 raise ValueError("Invalid action value: %s" % repr(action))
1719
1720 # defines whether a "locked" error should be raised to the user
1721 currently_locked = False
1722 # defines whether a new lock should be made; tri-state (None/True/False)
1723 make_lock = None
1724 repo = self
1725 user = User.get(user_id)
1726
1727 lock_info = repo.locked
1728
1729 if repo and (repo.enable_locking or not only_when_enabled):
1730 if action == 'push':
1731 # check if it's already locked; if it is, compare users
1732 locked_by_user_id = lock_info[0]
1733 if user.user_id == locked_by_user_id:
1734 log.debug(
1735 'Got `push` action from user %s, now unlocking', user)
1736 # unlock if we have push from user who locked
1737 make_lock = False
1738 else:
1739 # we're not the user who locked the repo; deny with the
1740 # code defined in settings (default is HTTP 423 Locked)
1741 log.debug('Repo %s is currently locked by %s', repo, user)
1742 currently_locked = True
1743 elif action == 'pull':
1744 # [0] user [1] date
1745 if lock_info[0] and lock_info[1]:
1746 log.debug('Repo %s is currently locked by %s', repo, user)
1747 currently_locked = True
1748 else:
1749 log.debug('Setting lock on repo %s by %s', repo, user)
1750 make_lock = True
1751
1752 else:
1753 log.debug('Repository %s does not have locking enabled', repo)
1754
1755 log.debug('FINAL locking values make_lock:%s,locked:%s,locked_by:%s',
1756 make_lock, currently_locked, lock_info)
1757
1758 from rhodecode.lib.auth import HasRepoPermissionAny
1759 perm_check = HasRepoPermissionAny('repository.write', 'repository.admin')
1760 if make_lock and not perm_check(repo_name=repo.repo_name, user=user):
1761 # if we don't have at least write permission we cannot make a lock
1762 log.debug('lock state reset back to FALSE due to lack '
1763 'of at least write permission')
1764 make_lock = False
1765
1766 return make_lock, currently_locked, lock_info
1767
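The tri-state decision in `get_locking_state` above can be hard to follow inside the ORM plumbing. Below is a minimal, standalone sketch of the same decision table; the function name and `lock_info` shape are illustrative, not RhodeCode API, and the unlocked-push case is treated as a no-op here:

```python
def locking_decision(action, lock_info, acting_user_id, locking_enabled=True):
    """Return (make_lock, currently_locked) for a push/pull action.

    make_lock is tri-state: None = do nothing, True = set a lock,
    False = release the lock.
    """
    make_lock = None
    currently_locked = False
    locked_by_user_id, lock_date = lock_info[0], lock_info[1]

    if not locking_enabled:
        return make_lock, currently_locked

    if action == 'push':
        if locked_by_user_id and acting_user_id == locked_by_user_id:
            make_lock = False          # pusher owns the lock: release it
        elif locked_by_user_id:
            currently_locked = True    # someone else holds the lock
    elif action == 'pull':
        if locked_by_user_id and lock_date:
            currently_locked = True    # already locked by a previous pull
        else:
            make_lock = True           # first pull takes the lock
    else:
        raise ValueError("Invalid action value: %r" % action)

    return make_lock, currently_locked
```

As in the model, the caller is expected to downgrade `make_lock` to `False` when the acting user lacks write permission.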
1768 @property
1769 def last_db_change(self):
1770 return self.updated_on
1771
1772 @property
1773 def clone_uri_hidden(self):
1774 clone_uri = self.clone_uri
1775 if clone_uri:
1776 import urlobject
1777 url_obj = urlobject.URLObject(clone_uri)
1778 if url_obj.password:
1779 clone_uri = url_obj.with_password('*****')
1780 return clone_uri
1781
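`clone_uri_hidden` above relies on the third-party `urlobject` package to mask the password component. The same idea can be sketched with only the standard library (function name is illustrative; written for Python 3's `urllib.parse`):

```python
from urllib.parse import urlsplit, urlunsplit

def hide_password(uri):
    """Replace the password component of a URI with asterisks."""
    parts = urlsplit(uri)
    if parts.password:
        # the password appears in the netloc as 'user:password@host'
        netloc = parts.netloc.replace(':' + parts.password, ':*****', 1)
        parts = parts._replace(netloc=netloc)
    return urlunsplit(parts)
```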
1782 def clone_url(self, **override):
1783 qualified_home_url = url('home', qualified=True)
1784
1785 uri_tmpl = None
1786 if 'with_id' in override:
1787 uri_tmpl = self.DEFAULT_CLONE_URI_ID
1788 del override['with_id']
1789
1790 if 'uri_tmpl' in override:
1791 uri_tmpl = override['uri_tmpl']
1792 del override['uri_tmpl']
1793
1794 # the template was not overridden via **override
1795 if not uri_tmpl:
1796 uri_tmpl = self.DEFAULT_CLONE_URI
1797 try:
1798 from pylons import tmpl_context as c
1799 uri_tmpl = c.clone_uri_tmpl
1800 except Exception:
1801 # this may be called outside of a request context,
1802 # i.e. without tmpl_context set up
1803 pass
1804
1805 return get_clone_url(uri_tmpl=uri_tmpl,
1806 qualifed_home_url=qualified_home_url,
1807 repo_name=self.repo_name,
1808 repo_id=self.repo_id, **override)
1809
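`clone_url` above delegates the actual expansion to `get_clone_url`. A rough sketch of what such an expansion might look like — the function name, placeholder set and substitution strategy here are assumptions for illustration, not the real helper:

```python
def expand_clone_url(uri_tmpl, scheme_netloc, repo_name, repo_id, user=''):
    """Expand a clone-URI template like '{scheme}://{user}@{netloc}/{repo}'."""
    scheme, netloc = scheme_netloc.split('://', 1)
    # '{repoid}' must be replaced before '{repo}', which is its prefix
    replacements = [
        ('{scheme}', scheme),
        ('{netloc}', netloc),
        ('{repoid}', str(repo_id)),
        ('{repo}', repo_name),
        ('{user}', user),
    ]
    url = uri_tmpl
    for placeholder, value in replacements:
        url = url.replace(placeholder, value)
    return url
```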
1810 def set_state(self, state):
1811 self.repo_state = state
1812 Session().add(self)
1813 # ==========================================================================
1814 # SCM PROPERTIES
1815 # ==========================================================================
1816
1817 def get_commit(self, commit_id=None, commit_idx=None, pre_load=None):
1818 return get_commit_safe(
1819 self.scm_instance(), commit_id, commit_idx, pre_load=pre_load)
1820
1821 def get_changeset(self, rev=None, pre_load=None):
1822 warnings.warn("Use get_commit", DeprecationWarning)
1823 commit_id = None
1824 commit_idx = None
1825 if isinstance(rev, basestring):
1826 commit_id = rev
1827 else:
1828 commit_idx = rev
1829 return self.get_commit(commit_id=commit_id, commit_idx=commit_idx,
1830 pre_load=pre_load)
1831
1832 def get_landing_commit(self):
1833 """
1834 Returns the landing commit or, if that doesn't exist, the tip
1835 """
1836 _rev_type, _rev = self.landing_rev
1837 commit = self.get_commit(_rev)
1838 if isinstance(commit, EmptyCommit):
1839 return self.get_commit()
1840 return commit
1841
1842 def update_commit_cache(self, cs_cache=None, config=None):
1843 """
1844 Update cache of last changeset for repository, keys should be::
1845
1846 short_id
1847 raw_id
1848 revision
1849 parents
1850 message
1851 date
1852 author
1853
1854 :param cs_cache:
1855 """
1856 from rhodecode.lib.vcs.backends.base import BaseChangeset
1857 if cs_cache is None:
1858 # use no-cache version here
1859 scm_repo = self.scm_instance(cache=False, config=config)
1860 if scm_repo:
1861 cs_cache = scm_repo.get_commit(
1862 pre_load=["author", "date", "message", "parents"])
1863 else:
1864 cs_cache = EmptyCommit()
1865
1866 if isinstance(cs_cache, BaseChangeset):
1867 cs_cache = cs_cache.__json__()
1868
1869 def is_outdated(new_cs_cache):
1870 if (new_cs_cache['raw_id'] != self.changeset_cache['raw_id'] or
1871 new_cs_cache['revision'] != self.changeset_cache['revision']):
1872 return True
1873 return False
1874
1875 # check if we already have the latest revision cached
1876 if is_outdated(cs_cache) or not self.changeset_cache:
1877 _default = datetime.datetime.fromtimestamp(0)
1878 last_change = cs_cache.get('date') or _default
1879 log.debug('updated repo %s with new cs cache %s',
1880 self.repo_name, cs_cache)
1881 self.updated_on = last_change
1882 self.changeset_cache = cs_cache
1883 Session().add(self)
1884 Session().commit()
1885 else:
1886 log.debug('Skipping update_commit_cache for repo:`%s`, '
1887 'cache already holds the latest commit', self.repo_name)
1888
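The cache update above only writes when the cached commit is stale, and the staleness check reduces to comparing `raw_id` and `revision`. Extracted as a standalone sketch (the two-argument form is illustrative; the model's nested helper closes over `self.changeset_cache`):

```python
def is_outdated(new_cache, old_cache):
    """True when the newly computed commit differs from the cached one."""
    if not old_cache:
        # nothing cached yet: always considered outdated
        return True
    return (new_cache['raw_id'] != old_cache['raw_id']
            or new_cache['revision'] != old_cache['revision'])
```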
1889 @property
1890 def tip(self):
1891 return self.get_commit('tip')
1892
1893 @property
1894 def author(self):
1895 return self.tip.author
1896
1897 @property
1898 def last_change(self):
1899 return self.scm_instance().last_change
1900
1901 def get_comments(self, revisions=None):
1902 """
1903 Returns comments for this repository grouped by revisions
1904
1905 :param revisions: filter query by revisions only
1906 """
1907 cmts = ChangesetComment.query()\
1908 .filter(ChangesetComment.repo == self)
1909 if revisions:
1910 cmts = cmts.filter(ChangesetComment.revision.in_(revisions))
1911 grouped = collections.defaultdict(list)
1912 for cmt in cmts.all():
1913 grouped[cmt.revision].append(cmt)
1914 return grouped
1915
1916 def statuses(self, revisions=None):
1917 """
1918 Returns statuses for this repository
1919
1920 :param revisions: list of revisions to get statuses for
1921 """
1922 statuses = ChangesetStatus.query()\
1923 .filter(ChangesetStatus.repo == self)\
1924 .filter(ChangesetStatus.version == 0)
1925
1926 if revisions:
1927 # Try doing the filtering in chunks to avoid hitting limits
1928 size = 500
1929 status_results = []
1930 for chunk in xrange(0, len(revisions), size):
1931 status_results += statuses.filter(
1932 ChangesetStatus.revision.in_(
1933 revisions[chunk: chunk+size])
1934 ).all()
1935 else:
1936 status_results = statuses.all()
1937
1938 grouped = {}
1939
1940 # maybe we have an open pull request without a status yet?
1941 stat = ChangesetStatus.STATUS_UNDER_REVIEW
1942 status_lbl = ChangesetStatus.get_status_lbl(stat)
1943 for pr in PullRequest.query().filter(PullRequest.source_repo == self).all():
1944 for rev in pr.revisions:
1945 pr_id = pr.pull_request_id
1946 pr_repo = pr.target_repo.repo_name
1947 grouped[rev] = [stat, status_lbl, pr_id, pr_repo]
1948
1949 for stat in status_results:
1950 pr_id = pr_repo = None
1951 if stat.pull_request:
1952 pr_id = stat.pull_request.pull_request_id
1953 pr_repo = stat.pull_request.target_repo.repo_name
1954 grouped[stat.revision] = [str(stat.status), stat.status_lbl,
1955 pr_id, pr_repo]
1956 return grouped
1957
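The revision filter in `statuses()` above is applied in 500-element slices to avoid database limits on the size of an SQL `IN (...)` clause. The slicing logic, as a generic helper (name is illustrative):

```python
def chunked(items, size=500):
    """Yield successive slices of `items` of at most `size` elements."""
    for start in range(0, len(items), size):
        yield items[start:start + size]
```

Each chunk is then passed to `ChangesetStatus.revision.in_(chunk)` and the partial results are concatenated.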
1958 # ==========================================================================
1959 # SCM CACHE INSTANCE
1960 # ==========================================================================
1961
1962 def scm_instance(self, **kwargs):
1963 import rhodecode
1964
1965 # Passing a config bypasses the cache; currently this is only
1966 # used by repo2dbmapper
1967 config = kwargs.pop('config', None)
1968 cache = kwargs.pop('cache', None)
1969 full_cache = str2bool(rhodecode.CONFIG.get('vcs_full_cache'))
1970 # if cache is NOT defined, use the global default; otherwise we have
1971 # full control over the cache behaviour
1972 if cache is None and full_cache and not config:
1973 return self._get_instance_cached()
1974 return self._get_instance(cache=bool(cache), config=config)
1975
1976 def _get_instance_cached(self):
1977 @cache_region('long_term')
1978 def _get_repo(cache_key):
1979 return self._get_instance()
1980
1981 invalidator_context = CacheKey.repo_context_cache(
1982 _get_repo, self.repo_name, None, thread_scoped=True)
1983
1984 with invalidator_context as context:
1985 context.invalidate()
1986 repo = context.compute()
1987
1988 return repo
1989
1990 def _get_instance(self, cache=True, config=None):
1991 config = config or self._config
1992 custom_wire = {
1993 'cache': cache # controls the vcs.remote cache
1994 }
1995
1996 repo = get_vcs_instance(
1997 repo_path=safe_str(self.repo_full_path),
1998 config=config,
1999 with_wire=custom_wire,
2000 create=False)
2001
2002 return repo
2003
2004 def __json__(self):
2005 return {'landing_rev': self.landing_rev}
2006
2007 def get_dict(self):
2008
2009 # Since we transformed `repo_name` into a hybrid property, we need to
2010 # keep compatibility with code that uses the `repo_name` field.
2011
2012 result = super(Repository, self).get_dict()
2013 result['repo_name'] = result.pop('_repo_name', None)
2014 return result
2015
2016
2017 class RepoGroup(Base, BaseModel):
2018 __tablename__ = 'groups'
2019 __table_args__ = (
2020 UniqueConstraint('group_name', 'group_parent_id'),
2021 CheckConstraint('group_id != group_parent_id'),
2022 {'extend_existing': True, 'mysql_engine': 'InnoDB',
2023 'mysql_charset': 'utf8', 'sqlite_autoincrement': True},
2024 )
2025 __mapper_args__ = {'order_by': 'group_name'}
2026
2027 CHOICES_SEPARATOR = '/' # used to generate select2 choices for nested groups
2028
2029 group_id = Column("group_id", Integer(), nullable=False, unique=True, default=None, primary_key=True)
2030 group_name = Column("group_name", String(255), nullable=False, unique=True, default=None)
2031 group_parent_id = Column("group_parent_id", Integer(), ForeignKey('groups.group_id'), nullable=True, unique=None, default=None)
2032 group_description = Column("group_description", String(10000), nullable=True, unique=None, default=None)
2033 enable_locking = Column("enable_locking", Boolean(), nullable=False, unique=None, default=False)
2034 user_id = Column("user_id", Integer(), ForeignKey('users.user_id'), nullable=False, unique=False, default=None)
2035 created_on = Column('created_on', DateTime(timezone=False), nullable=False, default=datetime.datetime.now)
2036
2037 repo_group_to_perm = relationship('UserRepoGroupToPerm', cascade='all', order_by='UserRepoGroupToPerm.group_to_perm_id')
2038 users_group_to_perm = relationship('UserGroupRepoGroupToPerm', cascade='all')
2039 parent_group = relationship('RepoGroup', remote_side=group_id)
2040 user = relationship('User')
2041 integrations = relationship('Integration',
2042 cascade="all, delete, delete-orphan")
2043
2044 def __init__(self, group_name='', parent_group=None):
2045 self.group_name = group_name
2046 self.parent_group = parent_group
2047
2048 def __unicode__(self):
2049 return u"<%s('id:%s:%s')>" % (self.__class__.__name__, self.group_id,
2050 self.group_name)
2051
2052 @classmethod
2053 def _generate_choice(cls, repo_group):
2054 from webhelpers.html import literal as _literal
2055 _name = lambda k: _literal(cls.CHOICES_SEPARATOR.join(k))
2056 return repo_group.group_id, _name(repo_group.full_path_splitted)
2057
2058 @classmethod
2059 def groups_choices(cls, groups=None, show_empty_group=True):
2060 if not groups:
2061 groups = cls.query().all()
2062
2063 repo_groups = []
2064 if show_empty_group:
2065 repo_groups = [('-1', u'-- %s --' % _('No parent'))]
2066
2067 repo_groups.extend([cls._generate_choice(x) for x in groups])
2068
2069 repo_groups = sorted(
2070 repo_groups, key=lambda t: t[1].split(cls.CHOICES_SEPARATOR)[0])
2071 return repo_groups
2072
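`groups_choices` above turns each group into a `(group_id, 'parent/child/...')` tuple, optionally prepends a "no parent" entry, and sorts by the top-level path segment. A standalone sketch of that shape (function name and the `(group_id, path_parts)` input form are illustrative):

```python
def group_choices(groups, separator='/', show_empty_group=True):
    """Build select2-style choices for nested groups.

    `groups` is an iterable of (group_id, path_parts) pairs, where
    path_parts is the group's full path split on the separator.
    """
    choices = [('-1', '-- No parent --')] if show_empty_group else []
    choices.extend(
        (group_id, separator.join(path_parts))
        for group_id, path_parts in groups)
    # sort by the top-level segment so siblings group together
    return sorted(choices, key=lambda t: t[1].split(separator)[0])
```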
2073 @classmethod
2074 def url_sep(cls):
2075 return URL_SEP
2076
2077 @classmethod
2078 def get_by_group_name(cls, group_name, cache=False, case_insensitive=False):
2079 if case_insensitive:
2080 gr = cls.query().filter(func.lower(cls.group_name)
2081 == func.lower(group_name))
2082 else:
2083 gr = cls.query().filter(cls.group_name == group_name)
2084 if cache:
2085 gr = gr.options(FromCache(
2086 "sql_cache_short",
2087 "get_group_%s" % _hash_key(group_name)))
2088 return gr.scalar()
2089
2090 @classmethod
2091 def get_all_repo_groups(cls, user_id=Optional(None), group_id=Optional(None),
2092 case_insensitive=True):
2093 q = RepoGroup.query()
2094
2095 if not isinstance(user_id, Optional):
2096 q = q.filter(RepoGroup.user_id == user_id)
2097
2098 if not isinstance(group_id, Optional):
2099 q = q.filter(RepoGroup.group_parent_id == group_id)
2100
2101 if case_insensitive:
2102 q = q.order_by(func.lower(RepoGroup.group_name))
2103 else:
2104 q = q.order_by(RepoGroup.group_name)
2105 return q.all()
2106
2107 @property
2108 def parents(self):
2109 parents_recursion_limit = 10
2110 groups = []
2111 if self.parent_group is None:
2112 return groups
2113 cur_gr = self.parent_group
2114 groups.insert(0, cur_gr)
2115 cnt = 0
2116 while 1:
2117 cnt += 1
2118 gr = getattr(cur_gr, 'parent_group', None)
2119 cur_gr = cur_gr.parent_group
2120 if gr is None:
2121 break
2122 if cnt == parents_recursion_limit:
2123 # this will prevent accidental infinite loops
2124 log.error('more than %s parents found for group %s, stopping '
2125 'recursive parent fetching', parents_recursion_limit, self)
2126 break
2127
2128 groups.insert(0, gr)
2129 return groups
2130
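The `parents` property above walks parent links upward and caps the walk at a fixed depth so a cyclic or corrupted parent chain cannot loop forever. The same bounded walk over a plain callable (names are illustrative):

```python
def collect_parents(node, get_parent, limit=10):
    """Return the ancestors of `node`, outermost first, stopping at `limit`."""
    parents = []
    current = get_parent(node)
    while current is not None and len(parents) < limit:
        parents.insert(0, current)  # prepend so the root ends up first
        current = get_parent(current)
    return parents
```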
2131 @property
2132 def children(self):
2133 return RepoGroup.query().filter(RepoGroup.parent_group == self)
2134
2135 @property
2136 def name(self):
2137 return self.group_name.split(RepoGroup.url_sep())[-1]
2138
2139 @property
2140 def full_path(self):
2141 return self.group_name
2142
2143 @property
2144 def full_path_splitted(self):
2145 return self.group_name.split(RepoGroup.url_sep())
2146
2147 @property
2148 def repositories(self):
2149 return Repository.query()\
2150 .filter(Repository.group == self)\
2151 .order_by(Repository.repo_name)
2152
2153 @property
2154 def repositories_recursive_count(self):
2155 cnt = self.repositories.count()
2156
2157 def children_count(group):
2158 cnt = 0
2159 for child in group.children:
2160 cnt += child.repositories.count()
2161 cnt += children_count(child)
2162 return cnt
2163
2164 return cnt + children_count(self)
2165
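`repositories_recursive_count` above is a depth-first sum over the group tree. With the tree flattened into plain dicts, the same computation looks like this (names and dict shapes are illustrative):

```python
def recursive_repo_count(group, children, repo_counts):
    """Sum repo_counts for `group` and all of its descendants.

    `children` maps a group to its child groups; `repo_counts` maps a
    group to its direct repository count.
    """
    total = repo_counts.get(group, 0)
    for child in children.get(group, []):
        total += recursive_repo_count(child, children, repo_counts)
    return total
```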
2166 def _recursive_objects(self, include_repos=True):
2167 all_ = []
2168
2169 def _get_members(root_gr):
2170 if include_repos:
2171 for r in root_gr.repositories:
2172 all_.append(r)
2173 childs = root_gr.children.all()
2174 if childs:
2175 for gr in childs:
2176 all_.append(gr)
2177 _get_members(gr)
2178
2179 _get_members(self)
2180 return [self] + all_
2181
2182 def recursive_groups_and_repos(self):
2183 """
2184 Recursively returns all groups, with the repositories in those groups
2185 """
2186 return self._recursive_objects()
2187
2188 def recursive_groups(self):
2189 """
2190 Returns all child groups of this group, including children of children
2191 """
2192 return self._recursive_objects(include_repos=False)
2193
2194 def get_new_name(self, group_name):
2195 """
2196 returns new full group name based on parent and new name
2197
2198 :param group_name:
2199 """
2200 path_prefix = (self.parent_group.full_path_splitted if
2201 self.parent_group else [])
2202 return RepoGroup.url_sep().join(path_prefix + [group_name])
2203
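`get_new_name` above simply prefixes the new leaf name with the parent group's path segments. As a tiny standalone sketch (function name is illustrative):

```python
def new_full_group_name(parent_path_parts, new_name, sep='/'):
    """Join a parent path (as a list of segments) with a new leaf name."""
    return sep.join(list(parent_path_parts) + [new_name])
```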
2204 def permissions(self, with_admins=True, with_owner=True):
2205 q = UserRepoGroupToPerm.query().filter(UserRepoGroupToPerm.group == self)
2206 q = q.options(joinedload(UserRepoGroupToPerm.group),
2207 joinedload(UserRepoGroupToPerm.user),
2208 joinedload(UserRepoGroupToPerm.permission),)
2209
2210 # get owners, admins and permissions. We do a trick of re-writing
2211 # objects from sqlalchemy to named-tuples because the sqlalchemy
2212 # session holds a global reference and changing one object propagates
2213 # the change to all others. This means that if an admin is also an
2214 # owner, an admin_row change would propagate to both objects
2215 perm_rows = []
2216 for _usr in q.all():
2217 usr = AttributeDict(_usr.user.get_dict())
2218 usr.permission = _usr.permission.permission_name
2219 perm_rows.append(usr)
2220
2221 # filter the perm rows by 'default' first and then sort them by
2222 # admin,write,read,none permissions sorted again alphabetically in
2223 # each group
2224 perm_rows = sorted(perm_rows, key=display_sort)
2225
2226 _admin_perm = 'group.admin'
2227 owner_row = []
2228 if with_owner:
2229 usr = AttributeDict(self.user.get_dict())
2230 usr.owner_row = True
2231 usr.permission = _admin_perm
2232 owner_row.append(usr)
2233
2234 super_admin_rows = []
2235 if with_admins:
2236 for usr in User.get_all_super_admins():
2237 # if this admin is also the owner, don't duplicate the record
2238 if owner_row and usr.user_id == owner_row[0].user_id:
2239 owner_row[0].admin_row = True
2240 else:
2241 usr = AttributeDict(usr.get_dict())
2242 usr.admin_row = True
2243 usr.permission = _admin_perm
2244 super_admin_rows.append(usr)
2245
2246 return super_admin_rows + owner_row + perm_rows
2247
2248 def permission_user_groups(self):
2249 q = UserGroupRepoGroupToPerm.query().filter(UserGroupRepoGroupToPerm.group == self)
2250 q = q.options(joinedload(UserGroupRepoGroupToPerm.group),
2251 joinedload(UserGroupRepoGroupToPerm.users_group),
2252 joinedload(UserGroupRepoGroupToPerm.permission),)
2253
2254 perm_rows = []
2255 for _user_group in q.all():
2256 usr = AttributeDict(_user_group.users_group.get_dict())
2257 usr.permission = _user_group.permission.permission_name
2258 perm_rows.append(usr)
2259
2260 return perm_rows
2261
2262 def get_api_data(self):
2263 """
2264 Common function for generating api data
2265
2266 """
2267 group = self
2268 data = {
2269 'group_id': group.group_id,
2270 'group_name': group.group_name,
2271 'group_description': group.group_description,
2272 'parent_group': group.parent_group.group_name if group.parent_group else None,
2273 'repositories': [x.repo_name for x in group.repositories],
2274 'owner': group.user.username,
2275 }
2276 return data
2277
2278
2279 class Permission(Base, BaseModel):
2280 __tablename__ = 'permissions'
2281 __table_args__ = (
2282 Index('p_perm_name_idx', 'permission_name'),
2283 {'extend_existing': True, 'mysql_engine': 'InnoDB',
2284 'mysql_charset': 'utf8', 'sqlite_autoincrement': True},
2285 )
2286 PERMS = [
2287 ('hg.admin', _('RhodeCode Super Administrator')),
2288
2289 ('repository.none', _('Repository no access')),
2290 ('repository.read', _('Repository read access')),
2291 ('repository.write', _('Repository write access')),
2292 ('repository.admin', _('Repository admin access')),
2293
2294 ('group.none', _('Repository group no access')),
2295 ('group.read', _('Repository group read access')),
2296 ('group.write', _('Repository group write access')),
2297 ('group.admin', _('Repository group admin access')),
2298
2299 ('usergroup.none', _('User group no access')),
2300 ('usergroup.read', _('User group read access')),
2301 ('usergroup.write', _('User group write access')),
2302 ('usergroup.admin', _('User group admin access')),
2303
2304 ('hg.repogroup.create.false', _('Repository Group creation disabled')),
2305 ('hg.repogroup.create.true', _('Repository Group creation enabled')),
2306
2307 ('hg.usergroup.create.false', _('User Group creation disabled')),
2308 ('hg.usergroup.create.true', _('User Group creation enabled')),
2309
2310 ('hg.create.none', _('Repository creation disabled')),
2311 ('hg.create.repository', _('Repository creation enabled')),
2312 ('hg.create.write_on_repogroup.true', _('Repository creation enabled with write permission to a repository group')),
2313 ('hg.create.write_on_repogroup.false', _('Repository creation disabled with write permission to a repository group')),
2314
2315 ('hg.fork.none', _('Repository forking disabled')),
2316 ('hg.fork.repository', _('Repository forking enabled')),
2317
2318 ('hg.register.none', _('Registration disabled')),
2319 ('hg.register.manual_activate', _('User Registration with manual account activation')),
2320 ('hg.register.auto_activate', _('User Registration with automatic account activation')),
2321
2322 ('hg.extern_activate.manual', _('Manual activation of external account')),
2323 ('hg.extern_activate.auto', _('Automatic activation of external account')),
2324
2325 ('hg.inherit_default_perms.false', _('Inherit object permissions from default user disabled')),
2326 ('hg.inherit_default_perms.true', _('Inherit object permissions from default user enabled')),
2327 ]
2328
2329 # definition of system default permissions for DEFAULT user
2330 DEFAULT_USER_PERMISSIONS = [
2331 'repository.read',
2332 'group.read',
2333 'usergroup.read',
2334 'hg.create.repository',
2335 'hg.repogroup.create.false',
2336 'hg.usergroup.create.false',
2337 'hg.create.write_on_repogroup.true',
2338 'hg.fork.repository',
2339 'hg.register.manual_activate',
2340 'hg.extern_activate.auto',
2341 'hg.inherit_default_perms.true',
2342 ]
2343
2344 # Weight defines which permissions are more important;
2345 # the higher the number, the more important the permission.
2346
2347 PERM_WEIGHTS = {
2348 'repository.none': 0,
2349 'repository.read': 1,
2350 'repository.write': 3,
2351 'repository.admin': 4,
2352
2353 'group.none': 0,
2354 'group.read': 1,
2355 'group.write': 3,
2356 'group.admin': 4,
2357
2358 'usergroup.none': 0,
2359 'usergroup.read': 1,
2360 'usergroup.write': 3,
2361 'usergroup.admin': 4,
2362
2363 'hg.repogroup.create.false': 0,
2364 'hg.repogroup.create.true': 1,
2365
2366 'hg.usergroup.create.false': 0,
2367 'hg.usergroup.create.true': 1,
2368
2369 'hg.fork.none': 0,
2370 'hg.fork.repository': 1,
2371 'hg.create.none': 0,
2372 'hg.create.repository': 1
2373 }
2374
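When a user receives the same kind of permission from several sources (direct grant, user group, defaults), the weights above let the strongest one win. A minimal resolver over a weight table shaped like `PERM_WEIGHTS` (the table here is a copied subset; the function name is illustrative):

```python
PERM_WEIGHTS = {
    'repository.none': 0,
    'repository.read': 1,
    'repository.write': 3,
    'repository.admin': 4,
}

def strongest_permission(perms, weights=PERM_WEIGHTS):
    """Pick the permission with the highest weight from `perms`."""
    return max(perms, key=lambda p: weights[p])
```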
2375 permission_id = Column("permission_id", Integer(), nullable=False, unique=True, default=None, primary_key=True)
2376 permission_name = Column("permission_name", String(255), nullable=True, unique=None, default=None)
2377 permission_longname = Column("permission_longname", String(255), nullable=True, unique=None, default=None)
2378
2379 def __unicode__(self):
2380 return u"<%s('%s:%s')>" % (
2381 self.__class__.__name__, self.permission_id, self.permission_name
2382 )
2383
2384 @classmethod
2385 def get_by_key(cls, key):
2386 return cls.query().filter(cls.permission_name == key).scalar()
2387
2388 @classmethod
2389 def get_default_repo_perms(cls, user_id, repo_id=None):
2390 q = Session().query(UserRepoToPerm, Repository, Permission)\
2391 .join((Permission, UserRepoToPerm.permission_id == Permission.permission_id))\
2392 .join((Repository, UserRepoToPerm.repository_id == Repository.repo_id))\
2393 .filter(UserRepoToPerm.user_id == user_id)
2394 if repo_id:
2395 q = q.filter(UserRepoToPerm.repository_id == repo_id)
2396 return q.all()
2397
2398 @classmethod
2399 def get_default_repo_perms_from_user_group(cls, user_id, repo_id=None):
2400 q = Session().query(UserGroupRepoToPerm, Repository, Permission)\
2401 .join(
2402 Permission,
2403 UserGroupRepoToPerm.permission_id == Permission.permission_id)\
2404 .join(
2405 Repository,
2406 UserGroupRepoToPerm.repository_id == Repository.repo_id)\
2407 .join(
2408 UserGroup,
2409 UserGroupRepoToPerm.users_group_id ==
2410 UserGroup.users_group_id)\
2411 .join(
2412 UserGroupMember,
2413 UserGroupRepoToPerm.users_group_id ==
2414 UserGroupMember.users_group_id)\
2415 .filter(
2416 UserGroupMember.user_id == user_id,
2417 UserGroup.users_group_active == true())
2418 if repo_id:
2419 q = q.filter(UserGroupRepoToPerm.repository_id == repo_id)
2420 return q.all()
2421
2422 @classmethod
2423 def get_default_group_perms(cls, user_id, repo_group_id=None):
2424 q = Session().query(UserRepoGroupToPerm, RepoGroup, Permission)\
2425 .join((Permission, UserRepoGroupToPerm.permission_id == Permission.permission_id))\
2426 .join((RepoGroup, UserRepoGroupToPerm.group_id == RepoGroup.group_id))\
2427 .filter(UserRepoGroupToPerm.user_id == user_id)
2428 if repo_group_id:
2429 q = q.filter(UserRepoGroupToPerm.group_id == repo_group_id)
2430 return q.all()
2431
2432 @classmethod
2433 def get_default_group_perms_from_user_group(
2434 cls, user_id, repo_group_id=None):
2435 q = Session().query(UserGroupRepoGroupToPerm, RepoGroup, Permission)\
2436 .join(
2437 Permission,
2438 UserGroupRepoGroupToPerm.permission_id ==
2439 Permission.permission_id)\
2440 .join(
2441 RepoGroup,
2442 UserGroupRepoGroupToPerm.group_id == RepoGroup.group_id)\
2443 .join(
2444 UserGroup,
2445 UserGroupRepoGroupToPerm.users_group_id ==
2446 UserGroup.users_group_id)\
2447 .join(
2448 UserGroupMember,
2449 UserGroupRepoGroupToPerm.users_group_id ==
2450 UserGroupMember.users_group_id)\
2451 .filter(
2452 UserGroupMember.user_id == user_id,
2453 UserGroup.users_group_active == true())
2454 if repo_group_id:
2455 q = q.filter(UserGroupRepoGroupToPerm.group_id == repo_group_id)
2456 return q.all()
2457
2458 @classmethod
2459 def get_default_user_group_perms(cls, user_id, user_group_id=None):
2460 q = Session().query(UserUserGroupToPerm, UserGroup, Permission)\
2461 .join((Permission, UserUserGroupToPerm.permission_id == Permission.permission_id))\
2462 .join((UserGroup, UserUserGroupToPerm.user_group_id == UserGroup.users_group_id))\
2463 .filter(UserUserGroupToPerm.user_id == user_id)
2464 if user_group_id:
2465 q = q.filter(UserUserGroupToPerm.user_group_id == user_group_id)
2466 return q.all()
2467
2468 @classmethod
2469 def get_default_user_group_perms_from_user_group(
2470 cls, user_id, user_group_id=None):
2471 TargetUserGroup = aliased(UserGroup, name='target_user_group')
2472 q = Session().query(UserGroupUserGroupToPerm, UserGroup, Permission)\
2473 .join(
2474 Permission,
2475 UserGroupUserGroupToPerm.permission_id ==
2476 Permission.permission_id)\
2477 .join(
2478 TargetUserGroup,
2479 UserGroupUserGroupToPerm.target_user_group_id ==
2480 TargetUserGroup.users_group_id)\
2481 .join(
2482 UserGroup,
2483 UserGroupUserGroupToPerm.user_group_id ==
2484 UserGroup.users_group_id)\
2485 .join(
2486 UserGroupMember,
2487 UserGroupUserGroupToPerm.user_group_id ==
2488 UserGroupMember.users_group_id)\
2489 .filter(
2490 UserGroupMember.user_id == user_id,
2491 UserGroup.users_group_active == true())
2492 if user_group_id:
2493 q = q.filter(
2494 UserGroupUserGroupToPerm.user_group_id == user_group_id)
2495
2496 return q.all()
2497
2498
2499 class UserRepoToPerm(Base, BaseModel):
2500 __tablename__ = 'repo_to_perm'
2501 __table_args__ = (
2502 UniqueConstraint('user_id', 'repository_id', 'permission_id'),
2503 {'extend_existing': True, 'mysql_engine': 'InnoDB',
2504 'mysql_charset': 'utf8', 'sqlite_autoincrement': True}
2505 )
2506 repo_to_perm_id = Column("repo_to_perm_id", Integer(), nullable=False, unique=True, default=None, primary_key=True)
2507 user_id = Column("user_id", Integer(), ForeignKey('users.user_id'), nullable=False, unique=None, default=None)
2508 permission_id = Column("permission_id", Integer(), ForeignKey('permissions.permission_id'), nullable=False, unique=None, default=None)
2509 repository_id = Column("repository_id", Integer(), ForeignKey('repositories.repo_id'), nullable=False, unique=None, default=None)
2510
2511 user = relationship('User')
2512 repository = relationship('Repository')
2513 permission = relationship('Permission')
2514
2515 @classmethod
2516 def create(cls, user, repository, permission):
2517 n = cls()
2518 n.user = user
2519 n.repository = repository
2520 n.permission = permission
2521 Session().add(n)
2522 return n
2523
2524 def __unicode__(self):
2525 return u'<%s => %s >' % (self.user, self.repository)
2526
2527
2528 class UserUserGroupToPerm(Base, BaseModel):
2529 __tablename__ = 'user_user_group_to_perm'
2530 __table_args__ = (
2531 UniqueConstraint('user_id', 'user_group_id', 'permission_id'),
2532 {'extend_existing': True, 'mysql_engine': 'InnoDB',
2533 'mysql_charset': 'utf8', 'sqlite_autoincrement': True}
2534 )
2535 user_user_group_to_perm_id = Column("user_user_group_to_perm_id", Integer(), nullable=False, unique=True, default=None, primary_key=True)
2536 user_id = Column("user_id", Integer(), ForeignKey('users.user_id'), nullable=False, unique=None, default=None)
2537 permission_id = Column("permission_id", Integer(), ForeignKey('permissions.permission_id'), nullable=False, unique=None, default=None)
2538 user_group_id = Column("user_group_id", Integer(), ForeignKey('users_groups.users_group_id'), nullable=False, unique=None, default=None)
2539
2540 user = relationship('User')
2541 user_group = relationship('UserGroup')
2542 permission = relationship('Permission')
2543
2544 @classmethod
2545 def create(cls, user, user_group, permission):
2546 n = cls()
2547 n.user = user
2548 n.user_group = user_group
2549 n.permission = permission
2550 Session().add(n)
2551 return n
2552
2553 def __unicode__(self):
2554 return u'<%s => %s >' % (self.user, self.user_group)
2555
2556
2557 class UserToPerm(Base, BaseModel):
2558 __tablename__ = 'user_to_perm'
2559 __table_args__ = (
2560 UniqueConstraint('user_id', 'permission_id'),
2561 {'extend_existing': True, 'mysql_engine': 'InnoDB',
2562 'mysql_charset': 'utf8', 'sqlite_autoincrement': True}
2563 )
2564 user_to_perm_id = Column("user_to_perm_id", Integer(), nullable=False, unique=True, default=None, primary_key=True)
2565 user_id = Column("user_id", Integer(), ForeignKey('users.user_id'), nullable=False, unique=None, default=None)
2566 permission_id = Column("permission_id", Integer(), ForeignKey('permissions.permission_id'), nullable=False, unique=None, default=None)
2567
2568 user = relationship('User')
2569 permission = relationship('Permission', lazy='joined')
2570
2571 def __unicode__(self):
2572 return u'<%s => %s >' % (self.user, self.permission)
2573
2574
2575 class UserGroupRepoToPerm(Base, BaseModel):
2576 __tablename__ = 'users_group_repo_to_perm'
2577 __table_args__ = (
2578 UniqueConstraint('repository_id', 'users_group_id', 'permission_id'),
2579 {'extend_existing': True, 'mysql_engine': 'InnoDB',
2580 'mysql_charset': 'utf8', 'sqlite_autoincrement': True}
2581 )
2582 users_group_to_perm_id = Column("users_group_to_perm_id", Integer(), nullable=False, unique=True, default=None, primary_key=True)
2583 users_group_id = Column("users_group_id", Integer(), ForeignKey('users_groups.users_group_id'), nullable=False, unique=None, default=None)
2584 permission_id = Column("permission_id", Integer(), ForeignKey('permissions.permission_id'), nullable=False, unique=None, default=None)
2585 repository_id = Column("repository_id", Integer(), ForeignKey('repositories.repo_id'), nullable=False, unique=None, default=None)
2586
2587 users_group = relationship('UserGroup')
2588 permission = relationship('Permission')
2589 repository = relationship('Repository')
2590
2591 @classmethod
2592 def create(cls, users_group, repository, permission):
2593 n = cls()
2594 n.users_group = users_group
2595 n.repository = repository
2596 n.permission = permission
2597 Session().add(n)
2598 return n
2599
2600 def __unicode__(self):
2601 return u'<UserGroupRepoToPerm:%s => %s >' % (self.users_group, self.repository)
2602
2603
2604 class UserGroupUserGroupToPerm(Base, BaseModel):
2605 __tablename__ = 'user_group_user_group_to_perm'
2606 __table_args__ = (
2607 UniqueConstraint('target_user_group_id', 'user_group_id', 'permission_id'),
2608 CheckConstraint('target_user_group_id != user_group_id'),
2609 {'extend_existing': True, 'mysql_engine': 'InnoDB',
2610 'mysql_charset': 'utf8', 'sqlite_autoincrement': True}
2611 )
2612 user_group_user_group_to_perm_id = Column("user_group_user_group_to_perm_id", Integer(), nullable=False, unique=True, default=None, primary_key=True)
2613 target_user_group_id = Column("target_user_group_id", Integer(), ForeignKey('users_groups.users_group_id'), nullable=False, unique=None, default=None)
2614 permission_id = Column("permission_id", Integer(), ForeignKey('permissions.permission_id'), nullable=False, unique=None, default=None)
2615 user_group_id = Column("user_group_id", Integer(), ForeignKey('users_groups.users_group_id'), nullable=False, unique=None, default=None)
2616
2617 target_user_group = relationship('UserGroup', primaryjoin='UserGroupUserGroupToPerm.target_user_group_id==UserGroup.users_group_id')
2618 user_group = relationship('UserGroup', primaryjoin='UserGroupUserGroupToPerm.user_group_id==UserGroup.users_group_id')
2619 permission = relationship('Permission')
2620
2621 @classmethod
2622 def create(cls, target_user_group, user_group, permission):
2623 n = cls()
2624 n.target_user_group = target_user_group
2625 n.user_group = user_group
2626 n.permission = permission
2627 Session().add(n)
2628 return n
2629
2630 def __unicode__(self):
2631 return u'<UserGroupUserGroup:%s => %s >' % (self.target_user_group, self.user_group)
2632
2633
2634 class UserGroupToPerm(Base, BaseModel):
2635 __tablename__ = 'users_group_to_perm'
2636 __table_args__ = (
2637 UniqueConstraint('users_group_id', 'permission_id',),
2638 {'extend_existing': True, 'mysql_engine': 'InnoDB',
2639 'mysql_charset': 'utf8', 'sqlite_autoincrement': True}
2640 )
2641 users_group_to_perm_id = Column("users_group_to_perm_id", Integer(), nullable=False, unique=True, default=None, primary_key=True)
2642 users_group_id = Column("users_group_id", Integer(), ForeignKey('users_groups.users_group_id'), nullable=False, unique=None, default=None)
2643 permission_id = Column("permission_id", Integer(), ForeignKey('permissions.permission_id'), nullable=False, unique=None, default=None)
2644
2645 users_group = relationship('UserGroup')
2646 permission = relationship('Permission')
2647
2648
2649 class UserRepoGroupToPerm(Base, BaseModel):
2650 __tablename__ = 'user_repo_group_to_perm'
2651 __table_args__ = (
2652 UniqueConstraint('user_id', 'group_id', 'permission_id'),
2653 {'extend_existing': True, 'mysql_engine': 'InnoDB',
2654 'mysql_charset': 'utf8', 'sqlite_autoincrement': True}
2655 )
2656
2657 group_to_perm_id = Column("group_to_perm_id", Integer(), nullable=False, unique=True, default=None, primary_key=True)
2658 user_id = Column("user_id", Integer(), ForeignKey('users.user_id'), nullable=False, unique=None, default=None)
2659 group_id = Column("group_id", Integer(), ForeignKey('groups.group_id'), nullable=False, unique=None, default=None)
2660 permission_id = Column("permission_id", Integer(), ForeignKey('permissions.permission_id'), nullable=False, unique=None, default=None)
2661
2662 user = relationship('User')
2663 group = relationship('RepoGroup')
2664 permission = relationship('Permission')
2665
2666 @classmethod
2667 def create(cls, user, repository_group, permission):
2668 n = cls()
2669 n.user = user
2670 n.group = repository_group
2671 n.permission = permission
2672 Session().add(n)
2673 return n
2674
2675
2676 class UserGroupRepoGroupToPerm(Base, BaseModel):
2677 __tablename__ = 'users_group_repo_group_to_perm'
2678 __table_args__ = (
2679 UniqueConstraint('users_group_id', 'group_id'),
2680 {'extend_existing': True, 'mysql_engine': 'InnoDB',
2681 'mysql_charset': 'utf8', 'sqlite_autoincrement': True}
2682 )
2683
2684 users_group_repo_group_to_perm_id = Column("users_group_repo_group_to_perm_id", Integer(), nullable=False, unique=True, default=None, primary_key=True)
2685 users_group_id = Column("users_group_id", Integer(), ForeignKey('users_groups.users_group_id'), nullable=False, unique=None, default=None)
2686 group_id = Column("group_id", Integer(), ForeignKey('groups.group_id'), nullable=False, unique=None, default=None)
2687 permission_id = Column("permission_id", Integer(), ForeignKey('permissions.permission_id'), nullable=False, unique=None, default=None)
2688
2689 users_group = relationship('UserGroup')
2690 permission = relationship('Permission')
2691 group = relationship('RepoGroup')
2692
2693 @classmethod
2694 def create(cls, user_group, repository_group, permission):
2695 n = cls()
2696 n.users_group = user_group
2697 n.group = repository_group
2698 n.permission = permission
2699 Session().add(n)
2700 return n
2701
2702 def __unicode__(self):
2703 return u'<UserGroupRepoGroupToPerm:%s => %s >' % (self.users_group, self.group)
2704
2705
2706 class Statistics(Base, BaseModel):
2707 __tablename__ = 'statistics'
2708 __table_args__ = (
2709 UniqueConstraint('repository_id'),
2710 {'extend_existing': True, 'mysql_engine': 'InnoDB',
2711 'mysql_charset': 'utf8', 'sqlite_autoincrement': True}
2712 )
2713 stat_id = Column("stat_id", Integer(), nullable=False, unique=True, default=None, primary_key=True)
2714 repository_id = Column("repository_id", Integer(), ForeignKey('repositories.repo_id'), nullable=False, unique=True, default=None)
2715 stat_on_revision = Column("stat_on_revision", Integer(), nullable=False)
2716 commit_activity = Column("commit_activity", LargeBinary(1000000), nullable=False)  # JSON data
2717 commit_activity_combined = Column("commit_activity_combined", LargeBinary(), nullable=False)  # JSON data
2718 languages = Column("languages", LargeBinary(1000000), nullable=False)  # JSON data
2719
2720 repository = relationship('Repository', single_parent=True)
2721
2722
2723 class UserFollowing(Base, BaseModel):
2724 __tablename__ = 'user_followings'
2725 __table_args__ = (
2726 UniqueConstraint('user_id', 'follows_repository_id'),
2727 UniqueConstraint('user_id', 'follows_user_id'),
2728 {'extend_existing': True, 'mysql_engine': 'InnoDB',
2729 'mysql_charset': 'utf8', 'sqlite_autoincrement': True}
2730 )
2731
2732 user_following_id = Column("user_following_id", Integer(), nullable=False, unique=True, default=None, primary_key=True)
2733 user_id = Column("user_id", Integer(), ForeignKey('users.user_id'), nullable=False, unique=None, default=None)
2734 follows_repo_id = Column("follows_repository_id", Integer(), ForeignKey('repositories.repo_id'), nullable=True, unique=None, default=None)
2735 follows_user_id = Column("follows_user_id", Integer(), ForeignKey('users.user_id'), nullable=True, unique=None, default=None)
2736 follows_from = Column('follows_from', DateTime(timezone=False), nullable=True, unique=None, default=datetime.datetime.now)
2737
2738 user = relationship('User', primaryjoin='User.user_id==UserFollowing.user_id')
2739
2740 follows_user = relationship('User', primaryjoin='User.user_id==UserFollowing.follows_user_id')
2741 follows_repository = relationship('Repository', order_by='Repository.repo_name')
2742
2743 @classmethod
2744 def get_repo_followers(cls, repo_id):
2745 return cls.query().filter(cls.follows_repo_id == repo_id)
2746
2747
2748 class CacheKey(Base, BaseModel):
2749 __tablename__ = 'cache_invalidation'
2750 __table_args__ = (
2751 UniqueConstraint('cache_key'),
2752 Index('key_idx', 'cache_key'),
2753 {'extend_existing': True, 'mysql_engine': 'InnoDB',
2754 'mysql_charset': 'utf8', 'sqlite_autoincrement': True},
2755 )
2756 CACHE_TYPE_ATOM = 'ATOM'
2757 CACHE_TYPE_RSS = 'RSS'
2758 CACHE_TYPE_README = 'README'
2759
2760 cache_id = Column("cache_id", Integer(), nullable=False, unique=True, default=None, primary_key=True)
2761 cache_key = Column("cache_key", String(255), nullable=True, unique=None, default=None)
2762 cache_args = Column("cache_args", String(255), nullable=True, unique=None, default=None)
2763 cache_active = Column("cache_active", Boolean(), nullable=True, unique=None, default=False)
2764
2765 def __init__(self, cache_key, cache_args=''):
2766 self.cache_key = cache_key
2767 self.cache_args = cache_args
2768 self.cache_active = False
2769
2770 def __unicode__(self):
2771 return u"<%s('%s:%s[%s]')>" % (
2772 self.__class__.__name__,
2773 self.cache_id, self.cache_key, self.cache_active)
2774
2775 def _cache_key_partition(self):
2776 prefix, repo_name, suffix = self.cache_key.partition(self.cache_args)
2777 return prefix, repo_name, suffix
2778
2779 def get_prefix(self):
2780 """
2781 Try to extract the prefix from an existing cache key. The key may
2782 consist of a prefix, repo_name and suffix.
2783 """
2784 # this returns prefix, repo_name, suffix
2785 return self._cache_key_partition()[0]
2786
2787 def get_suffix(self):
2788 """
2789 Get the suffix that might have been used in _get_cache_key to
2790 generate self.cache_key. Only used for informational purposes
2791 in repo_edit.html.
2792 """
2793 # prefix, repo_name, suffix
2794 return self._cache_key_partition()[2]
2795
2796 @classmethod
2797 def delete_all_cache(cls):
2798 """
2799 Delete all cache keys from the database.
2800 Should only be run when all instances are down and all entries
2801 are thus stale.
2802 """
2803 cls.query().delete()
2804 Session().commit()
2805
2806 @classmethod
2807 def get_cache_key(cls, repo_name, cache_type):
2808 """
2809 Generate a cache key for this RhodeCode instance.
2810 
2811 The prefix is most likely the process id, or an instance_id
2812 explicitly set in the .ini file.
2813 """
2814 import rhodecode
2815 prefix = safe_unicode(rhodecode.CONFIG.get('instance_id') or '')
2816
2817 repo_as_unicode = safe_unicode(repo_name)
2818 key = u'{}_{}'.format(repo_as_unicode, cache_type) \
2819 if cache_type else repo_as_unicode
2820
2821 return u'{}{}'.format(prefix, key)
2822
2823 @classmethod
2824 def set_invalidate(cls, repo_name, delete=False):
2825 """
2826 Mark all caches of a repo as invalid in the database.
2827 """
2828
2829 try:
2830 qry = Session().query(cls).filter(cls.cache_args == repo_name)
2831 if delete:
2832 log.debug('cache objects deleted for repo %s',
2833 safe_str(repo_name))
2834 qry.delete()
2835 else:
2836 log.debug('cache objects marked as invalid for repo %s',
2837 safe_str(repo_name))
2838 qry.update({"cache_active": False})
2839
2840 Session().commit()
2841 except Exception:
2842 log.exception(
2843 'Cache key invalidation failed for repository %s',
2844 safe_str(repo_name))
2845 Session().rollback()
2846
2847 @classmethod
2848 def get_active_cache(cls, cache_key):
2849 inv_obj = cls.query().filter(cls.cache_key == cache_key).scalar()
2850 if inv_obj:
2851 return inv_obj
2852 return None
2853
2854 @classmethod
2855 def repo_context_cache(cls, compute_func, repo_name, cache_type,
2856 thread_scoped=False):
2857 """
2858 @cache_region('long_term')
2859 def _heavy_calculation(cache_key):
2860 return 'result'
2861
2862 cache_context = CacheKey.repo_context_cache(
2863 _heavy_calculation, repo_name, cache_type)
2864
2865 with cache_context as context:
2866 context.invalidate()
2867 computed = context.compute()
2868
2869 assert computed == 'result'
2870 """
2871 from rhodecode.lib import caches
2872 return caches.InvalidationContext(
2873 compute_func, repo_name, cache_type, thread_scoped=thread_scoped)
2874
2875
2876 class ChangesetComment(Base, BaseModel):
2877 __tablename__ = 'changeset_comments'
2878 __table_args__ = (
2879 Index('cc_revision_idx', 'revision'),
2880 {'extend_existing': True, 'mysql_engine': 'InnoDB',
2881 'mysql_charset': 'utf8', 'sqlite_autoincrement': True},
2882 )
2883
2884 COMMENT_OUTDATED = u'comment_outdated'
2885
2886 comment_id = Column('comment_id', Integer(), nullable=False, primary_key=True)
2887 repo_id = Column('repo_id', Integer(), ForeignKey('repositories.repo_id'), nullable=False)
2888 revision = Column('revision', String(40), nullable=True)
2889 pull_request_id = Column("pull_request_id", Integer(), ForeignKey('pull_requests.pull_request_id'), nullable=True)
2890 pull_request_version_id = Column("pull_request_version_id", Integer(), ForeignKey('pull_request_versions.pull_request_version_id'), nullable=True)
2891 line_no = Column('line_no', Unicode(10), nullable=True)
2892 hl_lines = Column('hl_lines', Unicode(512), nullable=True)
2893 f_path = Column('f_path', Unicode(1000), nullable=True)
2894 user_id = Column('user_id', Integer(), ForeignKey('users.user_id'), nullable=False)
2895 text = Column('text', UnicodeText().with_variant(UnicodeText(25000), 'mysql'), nullable=False)
2896 created_on = Column('created_on', DateTime(timezone=False), nullable=False, default=datetime.datetime.now)
2897 modified_at = Column('modified_at', DateTime(timezone=False), nullable=False, default=datetime.datetime.now)
2898 renderer = Column('renderer', Unicode(64), nullable=True)
2899 display_state = Column('display_state', Unicode(128), nullable=True)
2900
2901 author = relationship('User', lazy='joined')
2902 repo = relationship('Repository')
2903 status_change = relationship('ChangesetStatus', cascade="all, delete, delete-orphan")
2904 pull_request = relationship('PullRequest', lazy='joined')
2905 pull_request_version = relationship('PullRequestVersion')
2906
2907 @classmethod
2908 def get_users(cls, revision=None, pull_request_id=None):
2909 """
2910 Returns user associated with this ChangesetComment. ie those
2911 who actually commented
2912
2913 :param cls:
2914 :param revision:
2915 """
2916 q = Session().query(User)\
2917 .join(ChangesetComment.author)
2918 if revision:
2919 q = q.filter(cls.revision == revision)
2920 elif pull_request_id:
2921 q = q.filter(cls.pull_request_id == pull_request_id)
2922 return q.all()
2923
2924 def render(self, mentions=False):
2925 from rhodecode.lib import helpers as h
2926 return h.render(self.text, renderer=self.renderer, mentions=mentions)
2927
2928 def __repr__(self):
2929 if self.comment_id:
2930 return '<DB:ChangesetComment #%s>' % self.comment_id
2931 else:
2932 return '<DB:ChangesetComment at %#x>' % id(self)
2933
2934
2935 class ChangesetStatus(Base, BaseModel):
2936 __tablename__ = 'changeset_statuses'
2937 __table_args__ = (
2938 Index('cs_revision_idx', 'revision'),
2939 Index('cs_version_idx', 'version'),
2940 UniqueConstraint('repo_id', 'revision', 'version'),
2941 {'extend_existing': True, 'mysql_engine': 'InnoDB',
2942 'mysql_charset': 'utf8', 'sqlite_autoincrement': True}
2943 )
2944 STATUS_NOT_REVIEWED = DEFAULT = 'not_reviewed'
2945 STATUS_APPROVED = 'approved'
2946 STATUS_REJECTED = 'rejected'
2947 STATUS_UNDER_REVIEW = 'under_review'
2948
2949 STATUSES = [
2950 (STATUS_NOT_REVIEWED, _("Not Reviewed")), # (no icon) and default
2951 (STATUS_APPROVED, _("Approved")),
2952 (STATUS_REJECTED, _("Rejected")),
2953 (STATUS_UNDER_REVIEW, _("Under Review")),
2954 ]
2955
2956 changeset_status_id = Column('changeset_status_id', Integer(), nullable=False, primary_key=True)
2957 repo_id = Column('repo_id', Integer(), ForeignKey('repositories.repo_id'), nullable=False)
2958 user_id = Column("user_id", Integer(), ForeignKey('users.user_id'), nullable=False, unique=None)
2959 revision = Column('revision', String(40), nullable=False)
2960 status = Column('status', String(128), nullable=False, default=DEFAULT)
2961 changeset_comment_id = Column('changeset_comment_id', Integer(), ForeignKey('changeset_comments.comment_id'))
2962 modified_at = Column('modified_at', DateTime(), nullable=False, default=datetime.datetime.now)
2963 version = Column('version', Integer(), nullable=False, default=0)
2964 pull_request_id = Column("pull_request_id", Integer(), ForeignKey('pull_requests.pull_request_id'), nullable=True)
2965
2966 author = relationship('User', lazy='joined')
2967 repo = relationship('Repository')
2968 comment = relationship('ChangesetComment', lazy='joined')
2969 pull_request = relationship('PullRequest', lazy='joined')
2970
2971 def __unicode__(self):
2972 return u"<%s('%s[%s]:%s')>" % (
2973 self.__class__.__name__,
2974 self.status, self.version, self.author
2975 )
2976
2977 @classmethod
2978 def get_status_lbl(cls, value):
2979 return dict(cls.STATUSES).get(value)
2980
2981 @property
2982 def status_lbl(self):
2983 return ChangesetStatus.get_status_lbl(self.status)
2984
2985
2986 class _PullRequestBase(BaseModel):
2987 """
2988 Common attributes of pull request and version entries.
2989 """
2990
2991 # .status values
2992 STATUS_NEW = u'new'
2993 STATUS_OPEN = u'open'
2994 STATUS_CLOSED = u'closed'
2995
2996 title = Column('title', Unicode(255), nullable=True)
2997 description = Column(
2998 'description', UnicodeText().with_variant(UnicodeText(10240), 'mysql'),
2999 nullable=True)
3000 # new/open/closed status of pull request (not approve/reject/etc)
3001 status = Column('status', Unicode(255), nullable=False, default=STATUS_NEW)
3002 created_on = Column(
3003 'created_on', DateTime(timezone=False), nullable=False,
3004 default=datetime.datetime.now)
3005 updated_on = Column(
3006 'updated_on', DateTime(timezone=False), nullable=False,
3007 default=datetime.datetime.now)
3008
3009 @declared_attr
3010 def user_id(cls):
3011 return Column(
3012 "user_id", Integer(), ForeignKey('users.user_id'), nullable=False,
3013 unique=None)
3014
3015 # 500 revisions max
3016 _revisions = Column(
3017 'revisions', UnicodeText().with_variant(UnicodeText(20500), 'mysql'))
3018
3019 @declared_attr
3020 def source_repo_id(cls):
3021 # TODO: dan: rename column to source_repo_id
3022 return Column(
3023 'org_repo_id', Integer(), ForeignKey('repositories.repo_id'),
3024 nullable=False)
3025
3026 source_ref = Column('org_ref', Unicode(255), nullable=False)
3027
3028 @declared_attr
3029 def target_repo_id(cls):
3030 # TODO: dan: rename column to target_repo_id
3031 return Column(
3032 'other_repo_id', Integer(), ForeignKey('repositories.repo_id'),
3033 nullable=False)
3034
3035 target_ref = Column('other_ref', Unicode(255), nullable=False)
3036
3037 # TODO: dan: rename column to last_merge_source_rev
3038 _last_merge_source_rev = Column(
3039 'last_merge_org_rev', String(40), nullable=True)
3040 # TODO: dan: rename column to last_merge_target_rev
3041 _last_merge_target_rev = Column(
3042 'last_merge_other_rev', String(40), nullable=True)
3043 _last_merge_status = Column('merge_status', Integer(), nullable=True)
3044 merge_rev = Column('merge_rev', String(40), nullable=True)
3045
3046 @hybrid_property
3047 def revisions(self):
3048 return self._revisions.split(':') if self._revisions else []
3049
3050 @revisions.setter
3051 def revisions(self, val):
3052 self._revisions = ':'.join(val)
3053
3054 @declared_attr
3055 def author(cls):
3056 return relationship('User', lazy='joined')
3057
3058 @declared_attr
3059 def source_repo(cls):
3060 return relationship(
3061 'Repository',
3062 primaryjoin='%s.source_repo_id==Repository.repo_id' % cls.__name__)
3063
3064 @property
3065 def source_ref_parts(self):
3066 refs = self.source_ref.split(':')
3067 return Reference(refs[0], refs[1], refs[2])
3068
3069 @declared_attr
3070 def target_repo(cls):
3071 return relationship(
3072 'Repository',
3073 primaryjoin='%s.target_repo_id==Repository.repo_id' % cls.__name__)
3074
3075 @property
3076 def target_ref_parts(self):
3077 refs = self.target_ref.split(':')
3078 return Reference(refs[0], refs[1], refs[2])
3079
3080
3081 class PullRequest(Base, _PullRequestBase):
3082 __tablename__ = 'pull_requests'
3083 __table_args__ = (
3084 {'extend_existing': True, 'mysql_engine': 'InnoDB',
3085 'mysql_charset': 'utf8', 'sqlite_autoincrement': True},
3086 )
3087
3088 pull_request_id = Column(
3089 'pull_request_id', Integer(), nullable=False, primary_key=True)
3090
3091 def __repr__(self):
3092 if self.pull_request_id:
3093 return '<DB:PullRequest #%s>' % self.pull_request_id
3094 else:
3095 return '<DB:PullRequest at %#x>' % id(self)
3096
3097 reviewers = relationship('PullRequestReviewers',
3098 cascade="all, delete, delete-orphan")
3099 statuses = relationship('ChangesetStatus')
3100 comments = relationship('ChangesetComment',
3101 cascade="all, delete, delete-orphan")
3102 versions = relationship('PullRequestVersion',
3103 cascade="all, delete, delete-orphan")
3104
3105 def is_closed(self):
3106 return self.status == self.STATUS_CLOSED
3107
3108 def get_api_data(self):
3109 from rhodecode.model.pull_request import PullRequestModel
3110 pull_request = self
3111 merge_status = PullRequestModel().merge_status(pull_request)
3112 data = {
3113 'pull_request_id': pull_request.pull_request_id,
3114 'url': url('pullrequest_show', repo_name=self.target_repo.repo_name,
3115 pull_request_id=self.pull_request_id,
3116 qualified=True),
3117 'title': pull_request.title,
3118 'description': pull_request.description,
3119 'status': pull_request.status,
3120 'created_on': pull_request.created_on,
3121 'updated_on': pull_request.updated_on,
3122 'commit_ids': pull_request.revisions,
3123 'review_status': pull_request.calculated_review_status(),
3124 'mergeable': {
3125 'status': merge_status[0],
3126 'message': unicode(merge_status[1]),
3127 },
3128 'source': {
3129 'clone_url': pull_request.source_repo.clone_url(),
3130 'repository': pull_request.source_repo.repo_name,
3131 'reference': {
3132 'name': pull_request.source_ref_parts.name,
3133 'type': pull_request.source_ref_parts.type,
3134 'commit_id': pull_request.source_ref_parts.commit_id,
3135 },
3136 },
3137 'target': {
3138 'clone_url': pull_request.target_repo.clone_url(),
3139 'repository': pull_request.target_repo.repo_name,
3140 'reference': {
3141 'name': pull_request.target_ref_parts.name,
3142 'type': pull_request.target_ref_parts.type,
3143 'commit_id': pull_request.target_ref_parts.commit_id,
3144 },
3145 },
3146 'author': pull_request.author.get_api_data(include_secrets=False,
3147 details='basic'),
3148 'reviewers': [
3149 {
3150 'user': reviewer.get_api_data(include_secrets=False,
3151 details='basic'),
3152 'reasons': reasons,
3153 'review_status': st[0][1].status if st else 'not_reviewed',
3154 }
3155 for reviewer, reasons, st in pull_request.reviewers_statuses()
3156 ]
3157 }
3158
3159 return data
3160
3161 def __json__(self):
3162 return {
3163 'revisions': self.revisions,
3164 }
3165
3166 def calculated_review_status(self):
3167 # TODO: anderson: 13.05.15 Used only on templates/my_account_pullrequests.html
3168 # because it's tricky to use ChangesetStatusModel from there
3169 warnings.warn("Use calculated_review_status from ChangesetStatusModel", DeprecationWarning)
3170 from rhodecode.model.changeset_status import ChangesetStatusModel
3171 return ChangesetStatusModel().calculated_review_status(self)
3172
3173 def reviewers_statuses(self):
3174 warnings.warn("Use reviewers_statuses from ChangesetStatusModel", DeprecationWarning)
3175 from rhodecode.model.changeset_status import ChangesetStatusModel
3176 return ChangesetStatusModel().reviewers_statuses(self)
3177
3178
3179 class PullRequestVersion(Base, _PullRequestBase):
3180 __tablename__ = 'pull_request_versions'
3181 __table_args__ = (
3182 {'extend_existing': True, 'mysql_engine': 'InnoDB',
3183 'mysql_charset': 'utf8', 'sqlite_autoincrement': True},
3184 )
3185
3186 pull_request_version_id = Column(
3187 'pull_request_version_id', Integer(), nullable=False, primary_key=True)
3188 pull_request_id = Column(
3189 'pull_request_id', Integer(),
3190 ForeignKey('pull_requests.pull_request_id'), nullable=False)
3191 pull_request = relationship('PullRequest')
3192
3193 def __repr__(self):
3194 if self.pull_request_version_id:
3195 return '<DB:PullRequestVersion #%s>' % self.pull_request_version_id
3196 else:
3197 return '<DB:PullRequestVersion at %#x>' % id(self)
3198
3199
3200 class PullRequestReviewers(Base, BaseModel):
3201 __tablename__ = 'pull_request_reviewers'
3202 __table_args__ = (
3203 {'extend_existing': True, 'mysql_engine': 'InnoDB',
3204 'mysql_charset': 'utf8', 'sqlite_autoincrement': True},
3205 )
3206
3207 def __init__(self, user=None, pull_request=None, reasons=None):
3208 self.user = user
3209 self.pull_request = pull_request
3210 self.reasons = reasons or []
3211
3212 @hybrid_property
3213 def reasons(self):
3214 if not self._reasons:
3215 return []
3216 return self._reasons
3217
3218 @reasons.setter
3219 def reasons(self, val):
3220 val = val or []
3221 if any(not isinstance(x, basestring) for x in val):
3222 raise Exception('invalid reasons type, must be list of strings')
3223 self._reasons = val
3224
3225 pull_requests_reviewers_id = Column(
3226 'pull_requests_reviewers_id', Integer(), nullable=False,
3227 primary_key=True)
3228 pull_request_id = Column(
3229 "pull_request_id", Integer(),
3230 ForeignKey('pull_requests.pull_request_id'), nullable=False)
3231 user_id = Column(
3232 "user_id", Integer(), ForeignKey('users.user_id'), nullable=True)
3233 _reasons = Column(
3234 'reason', MutationList.as_mutable(
3235 JsonType('list', dialect_map=dict(mysql=UnicodeText(16384)))))
3236
3237 user = relationship('User')
3238 pull_request = relationship('PullRequest')
3239
3240
3241 class Notification(Base, BaseModel):
3242 __tablename__ = 'notifications'
3243 __table_args__ = (
3244 Index('notification_type_idx', 'type'),
3245 {'extend_existing': True, 'mysql_engine': 'InnoDB',
3246 'mysql_charset': 'utf8', 'sqlite_autoincrement': True},
3247 )
3248
3249 TYPE_CHANGESET_COMMENT = u'cs_comment'
3250 TYPE_MESSAGE = u'message'
3251 TYPE_MENTION = u'mention'
3252 TYPE_REGISTRATION = u'registration'
3253 TYPE_PULL_REQUEST = u'pull_request'
3254 TYPE_PULL_REQUEST_COMMENT = u'pull_request_comment'
3255
3256 notification_id = Column('notification_id', Integer(), nullable=False, primary_key=True)
3257 subject = Column('subject', Unicode(512), nullable=True)
3258 body = Column('body', UnicodeText().with_variant(UnicodeText(50000), 'mysql'), nullable=True)
3259 created_by = Column("created_by", Integer(), ForeignKey('users.user_id'), nullable=True)
3260 created_on = Column('created_on', DateTime(timezone=False), nullable=False, default=datetime.datetime.now)
3261 type_ = Column('type', Unicode(255))
3262
3263 created_by_user = relationship('User')
3264 notifications_to_users = relationship('UserNotification', lazy='joined',
3265 cascade="all, delete, delete-orphan")
3266
3267 @property
3268 def recipients(self):
3269 return [x.user for x in UserNotification.query()\
3270 .filter(UserNotification.notification == self)\
3271 .order_by(UserNotification.user_id.asc()).all()]
3272
3273 @classmethod
3274 def create(cls, created_by, subject, body, recipients, type_=None):
3275 if type_ is None:
3276 type_ = Notification.TYPE_MESSAGE
3277
3278 notification = cls()
3279 notification.created_by_user = created_by
3280 notification.subject = subject
3281 notification.body = body
3282 notification.type_ = type_
3283 notification.created_on = datetime.datetime.now()
3284
3285 for u in recipients:
3286 assoc = UserNotification()
3287 assoc.notification = notification
3288
3289 # if created_by is among recipients, mark their notification
3290 # as read
3291 if u.user_id == created_by.user_id:
3292 assoc.read = True
3293
3294 u.notifications.append(assoc)
3295 Session().add(notification)
3296
3297 return notification
3298
3299 @property
3300 def description(self):
3301 from rhodecode.model.notification import NotificationModel
3302 return NotificationModel().make_description(self)
3303
3304
3305 class UserNotification(Base, BaseModel):
3306 __tablename__ = 'user_to_notification'
3307 __table_args__ = (
3308 UniqueConstraint('user_id', 'notification_id'),
3309 {'extend_existing': True, 'mysql_engine': 'InnoDB',
3310 'mysql_charset': 'utf8', 'sqlite_autoincrement': True}
3311 )
3312 user_id = Column('user_id', Integer(), ForeignKey('users.user_id'), primary_key=True)
3313 notification_id = Column("notification_id", Integer(), ForeignKey('notifications.notification_id'), primary_key=True)
3314 read = Column('read', Boolean, default=False)
3315 sent_on = Column('sent_on', DateTime(timezone=False), nullable=True, unique=None)
3316
3317 user = relationship('User', lazy="joined")
3318 notification = relationship('Notification', lazy="joined",
3319 order_by=lambda: Notification.created_on.desc(),)
3320
3321 def mark_as_read(self):
3322 self.read = True
3323 Session().add(self)
3324
3325
3326 class Gist(Base, BaseModel):
3327 __tablename__ = 'gists'
3328 __table_args__ = (
3329 Index('g_gist_access_id_idx', 'gist_access_id'),
3330 Index('g_created_on_idx', 'created_on'),
3331 {'extend_existing': True, 'mysql_engine': 'InnoDB',
3332 'mysql_charset': 'utf8', 'sqlite_autoincrement': True}
3333 )
3334 GIST_PUBLIC = u'public'
3335 GIST_PRIVATE = u'private'
3336 DEFAULT_FILENAME = u'gistfile1.txt'
3337
3338 ACL_LEVEL_PUBLIC = u'acl_public'
3339 ACL_LEVEL_PRIVATE = u'acl_private'
3340
3341 gist_id = Column('gist_id', Integer(), primary_key=True)
3342 gist_access_id = Column('gist_access_id', Unicode(250))
3343 gist_description = Column('gist_description', UnicodeText().with_variant(UnicodeText(1024), 'mysql'))
3344 gist_owner = Column('user_id', Integer(), ForeignKey('users.user_id'), nullable=True)
3345 gist_expires = Column('gist_expires', Float(53), nullable=False)
3346 gist_type = Column('gist_type', Unicode(128), nullable=False)
3347 created_on = Column('created_on', DateTime(timezone=False), nullable=False, default=datetime.datetime.now)
3348 modified_at = Column('modified_at', DateTime(timezone=False), nullable=False, default=datetime.datetime.now)
3349 acl_level = Column('acl_level', Unicode(128), nullable=True)
3350
3351 owner = relationship('User')
3352
3353 def __repr__(self):
3354 return '<Gist:[%s]%s>' % (self.gist_type, self.gist_access_id)
3355
3356 @classmethod
3357 def get_or_404(cls, id_):
3358 res = cls.query().filter(cls.gist_access_id == id_).scalar()
3359 if not res:
3360 raise HTTPNotFound
3361 return res
3362
3363 @classmethod
3364 def get_by_access_id(cls, gist_access_id):
3365 return cls.query().filter(cls.gist_access_id == gist_access_id).scalar()
3366
3367 def gist_url(self):
3368 import rhodecode
3369 alias_url = rhodecode.CONFIG.get('gist_alias_url')
3370 if alias_url:
3371 return alias_url.replace('{gistid}', self.gist_access_id)
3372
3373 return url('gist', gist_id=self.gist_access_id, qualified=True)
3374
3375 @classmethod
3376 def base_path(cls):
3377 """
3378 Returns the base path where all gists are stored
3379
3380 :param cls:
3381 """
3382 from rhodecode.model.gist import GIST_STORE_LOC
3383 q = Session().query(RhodeCodeUi)\
3384 .filter(RhodeCodeUi.ui_key == URL_SEP)
3385 q = q.options(FromCache("sql_cache_short", "repository_repo_path"))
3386 return os.path.join(q.one().ui_value, GIST_STORE_LOC)
3387
3388 def get_api_data(self):
3389 """
3390 Common function for generating gist related data for API
3391 """
3392 gist = self
3393 data = {
3394 'gist_id': gist.gist_id,
3395 'type': gist.gist_type,
3396 'access_id': gist.gist_access_id,
3397 'description': gist.gist_description,
3398 'url': gist.gist_url(),
3399 'expires': gist.gist_expires,
3400 'created_on': gist.created_on,
3401 'modified_at': gist.modified_at,
3402 'content': None,
3403 'acl_level': gist.acl_level,
3404 }
3405 return data
3406
3407 def __json__(self):
3408 data = dict(
3409 )
3410 data.update(self.get_api_data())
3411 return data
3412 # SCM functions
3413
3414 def scm_instance(self, **kwargs):
3415 full_repo_path = os.path.join(self.base_path(), self.gist_access_id)
3416 return get_vcs_instance(
3417 repo_path=safe_str(full_repo_path), create=False)
3418
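The `gist_url` method above substitutes the gist access id into an optional alias template before falling back to the default route. A minimal standalone sketch of that substitution (the alias template and fallback URL here are illustrative, not RhodeCode's actual config values):

```python
def build_gist_url(gist_access_id, alias_url=None):
    # If an alias template such as "https://gist.example.com/{gistid}" is
    # configured, substitute the access id into it; otherwise fall back to
    # a default gist route (hypothetical URL, for illustration only).
    if alias_url:
        return alias_url.replace('{gistid}', gist_access_id)
    return 'https://code.example.com/_admin/gists/%s' % gist_access_id

print(build_gist_url('abc123', 'https://gist.example.com/{gistid}'))
```

With an alias configured the template wins; without one the default route is used, matching the precedence in `gist_url`.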
3419
3420 class DbMigrateVersion(Base, BaseModel):
3421 __tablename__ = 'db_migrate_version'
3422 __table_args__ = (
3423 {'extend_existing': True, 'mysql_engine': 'InnoDB',
3424 'mysql_charset': 'utf8', 'sqlite_autoincrement': True},
3425 )
3426 repository_id = Column('repository_id', String(250), primary_key=True)
3427 repository_path = Column('repository_path', Text)
3428 version = Column('version', Integer)
3429
3430
3431 class ExternalIdentity(Base, BaseModel):
3432 __tablename__ = 'external_identities'
3433 __table_args__ = (
3434 Index('local_user_id_idx', 'local_user_id'),
3435 Index('external_id_idx', 'external_id'),
3436 {'extend_existing': True, 'mysql_engine': 'InnoDB',
3437 'mysql_charset': 'utf8'})
3438
3439 external_id = Column('external_id', Unicode(255), default=u'',
3440 primary_key=True)
3441 external_username = Column('external_username', Unicode(1024), default=u'')
3442 local_user_id = Column('local_user_id', Integer(),
3443 ForeignKey('users.user_id'), primary_key=True)
3444 provider_name = Column('provider_name', Unicode(255), default=u'',
3445 primary_key=True)
3446 access_token = Column('access_token', String(1024), default=u'')
3447 alt_token = Column('alt_token', String(1024), default=u'')
3448 token_secret = Column('token_secret', String(1024), default=u'')
3449
3450 @classmethod
3451 def by_external_id_and_provider(cls, external_id, provider_name,
3452 local_user_id=None):
3453 """
3454 Returns ExternalIdentity instance based on search params
3455
3456 :param external_id:
3457 :param provider_name:
3458 :return: ExternalIdentity
3459 """
3460 query = cls.query()
3461 query = query.filter(cls.external_id == external_id)
3462 query = query.filter(cls.provider_name == provider_name)
3463 if local_user_id:
3464 query = query.filter(cls.local_user_id == local_user_id)
3465 return query.first()
3466
3467 @classmethod
3468 def user_by_external_id_and_provider(cls, external_id, provider_name):
3469 """
3470 Returns User instance based on search params
3471
3472 :param external_id:
3473 :param provider_name:
3474 :return: User
3475 """
3476 query = User.query()
3477 query = query.filter(cls.external_id == external_id)
3478 query = query.filter(cls.provider_name == provider_name)
3479 query = query.filter(User.user_id == cls.local_user_id)
3480 return query.first()
3481
3482 @classmethod
3483 def by_local_user_id(cls, local_user_id):
3484 """
3485         Returns all external identity records (tokens) for a user
3486
3487 :param local_user_id:
3488 :return: ExternalIdentity
3489 """
3490 query = cls.query()
3491 query = query.filter(cls.local_user_id == local_user_id)
3492 return query
3493
3494
3495 class Integration(Base, BaseModel):
3496 __tablename__ = 'integrations'
3497 __table_args__ = (
3498 {'extend_existing': True, 'mysql_engine': 'InnoDB',
3499 'mysql_charset': 'utf8', 'sqlite_autoincrement': True}
3500 )
3501
3502 integration_id = Column('integration_id', Integer(), primary_key=True)
3503 integration_type = Column('integration_type', String(255))
3504 enabled = Column('enabled', Boolean(), nullable=False)
3505 name = Column('name', String(255), nullable=False)
3506 child_repos_only = Column('child_repos_only', Boolean(), nullable=False,
3507 default=False)
3508
3509 settings = Column(
3510 'settings_json', MutationObj.as_mutable(
3511 JsonType(dialect_map=dict(mysql=UnicodeText(16384)))))
3512 repo_id = Column(
3513 'repo_id', Integer(), ForeignKey('repositories.repo_id'),
3514 nullable=True, unique=None, default=None)
3515 repo = relationship('Repository', lazy='joined')
3516
3517 repo_group_id = Column(
3518 'repo_group_id', Integer(), ForeignKey('groups.group_id'),
3519 nullable=True, unique=None, default=None)
3520 repo_group = relationship('RepoGroup', lazy='joined')
3521
3522 @property
3523 def scope(self):
3524 if self.repo:
3525 return repr(self.repo)
3526 if self.repo_group:
3527 if self.child_repos_only:
3528 return repr(self.repo_group) + ' (child repos only)'
3529 else:
3530 return repr(self.repo_group) + ' (recursive)'
3531 if self.child_repos_only:
3532 return 'root_repos'
3533 return 'global'
3534
3535 def __repr__(self):
3536 return '<Integration(%r, %r)>' % (self.integration_type, self.scope)
3537
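The `scope` property above resolves four cases in a fixed precedence: a bound repository wins, then a bound repository group (qualified by `child_repos_only`), then the two global variants. The same decision table as a standalone function (a sketch; the argument names mirror the model's columns):

```python
def integration_scope(repo=None, repo_group=None, child_repos_only=False):
    # Mirrors the precedence in Integration.scope: repo beats repo_group,
    # and child_repos_only qualifies both the group and global cases.
    if repo:
        return repr(repo)
    if repo_group:
        suffix = ' (child repos only)' if child_repos_only else ' (recursive)'
        return repr(repo_group) + suffix
    return 'root_repos' if child_repos_only else 'global'
```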
3538
3539 class RepoReviewRuleUser(Base, BaseModel):
3540 __tablename__ = 'repo_review_rules_users'
3541 __table_args__ = (
3542 {'extend_existing': True, 'mysql_engine': 'InnoDB',
3543 'mysql_charset': 'utf8', 'sqlite_autoincrement': True,}
3544 )
3545 repo_review_rule_user_id = Column(
3546 'repo_review_rule_user_id', Integer(), primary_key=True)
3547 repo_review_rule_id = Column("repo_review_rule_id",
3548 Integer(), ForeignKey('repo_review_rules.repo_review_rule_id'))
3549 user_id = Column("user_id", Integer(), ForeignKey('users.user_id'),
3550 nullable=False)
3551 user = relationship('User')
3552
3553
3554 class RepoReviewRuleUserGroup(Base, BaseModel):
3555 __tablename__ = 'repo_review_rules_users_groups'
3556 __table_args__ = (
3557 {'extend_existing': True, 'mysql_engine': 'InnoDB',
3558 'mysql_charset': 'utf8', 'sqlite_autoincrement': True,}
3559 )
3560 repo_review_rule_users_group_id = Column(
3561 'repo_review_rule_users_group_id', Integer(), primary_key=True)
3562 repo_review_rule_id = Column("repo_review_rule_id",
3563 Integer(), ForeignKey('repo_review_rules.repo_review_rule_id'))
3564 users_group_id = Column("users_group_id", Integer(),
3565 ForeignKey('users_groups.users_group_id'), nullable=False)
3566 users_group = relationship('UserGroup')
3567
3568
3569 class RepoReviewRule(Base, BaseModel):
3570 __tablename__ = 'repo_review_rules'
3571 __table_args__ = (
3572 {'extend_existing': True, 'mysql_engine': 'InnoDB',
3573 'mysql_charset': 'utf8', 'sqlite_autoincrement': True,}
3574 )
3575
3576 repo_review_rule_id = Column(
3577 'repo_review_rule_id', Integer(), primary_key=True)
3578 repo_id = Column(
3579 "repo_id", Integer(), ForeignKey('repositories.repo_id'))
3580 repo = relationship('Repository', backref='review_rules')
3581
3582 _branch_pattern = Column("branch_pattern", UnicodeText().with_variant(UnicodeText(255), 'mysql'),
3583 default=u'*') # glob
3584 _file_pattern = Column("file_pattern", UnicodeText().with_variant(UnicodeText(255), 'mysql'),
3585 default=u'*') # glob
3586
3587 use_authors_for_review = Column("use_authors_for_review", Boolean(),
3588 nullable=False, default=False)
3589 rule_users = relationship('RepoReviewRuleUser')
3590 rule_user_groups = relationship('RepoReviewRuleUserGroup')
3591
3592 @hybrid_property
3593 def branch_pattern(self):
3594 return self._branch_pattern or '*'
3595
3596 def _validate_glob(self, value):
3597 re.compile('^' + glob2re(value) + '$')
3598
3599 @branch_pattern.setter
3600 def branch_pattern(self, value):
3601 self._validate_glob(value)
3602 self._branch_pattern = value or '*'
3603
3604 @hybrid_property
3605 def file_pattern(self):
3606 return self._file_pattern or '*'
3607
3608 @file_pattern.setter
3609 def file_pattern(self, value):
3610 self._validate_glob(value)
3611 self._file_pattern = value or '*'
3612
3613 def matches(self, branch, files_changed):
3614 """
3615 Check if this review rule matches a branch/files in a pull request
3616
3617 :param branch: branch name for the commit
3618 :param files_changed: list of file paths changed in the pull request
3619 """
3620
3621 branch = branch or ''
3622 files_changed = files_changed or []
3623
3624 branch_matches = True
3625 if branch:
3626 branch_regex = re.compile('^' + glob2re(self.branch_pattern) + '$')
3627 branch_matches = bool(branch_regex.search(branch))
3628
3629 files_matches = True
3630 if self.file_pattern != '*':
3631 files_matches = False
3632 file_regex = re.compile(glob2re(self.file_pattern))
3633 for filename in files_changed:
3634 if file_regex.search(filename):
3635 files_matches = True
3636 break
3637
3638 return branch_matches and files_matches
3639
3640 @property
3641 def review_users(self):
3642 """ Returns the users which this rule applies to """
3643
3644 users = set()
3645 users |= set([
3646 rule_user.user for rule_user in self.rule_users
3647 if rule_user.user.active])
3648 users |= set(
3649 member.user
3650 for rule_user_group in self.rule_user_groups
3651 for member in rule_user_group.users_group.members
3652 if member.user.active
3653 )
3654 return users
3655
3656 def __repr__(self):
3657         return '<RepoReviewRule(id=%r, repo=%r)>' % (
3658             self.repo_review_rule_id, self.repo)
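`matches()` above converts the stored glob patterns to anchored regexes via `glob2re`, an internal RhodeCode helper. A behaviourally similar sketch using the stdlib `fnmatch.translate` instead (note the semantics differ slightly: `fnmatch` anchors the whole string and lets `*` cross path separators, whereas the original uses `re.search` for file paths):

```python
import fnmatch
import re

def rule_matches(branch_pattern, file_pattern, branch, files_changed):
    # Branch must match the anchored glob; an empty branch matches anything,
    # as in RepoReviewRule.matches.
    branch_ok = True
    if branch:
        branch_ok = bool(re.match(fnmatch.translate(branch_pattern), branch))
    # Files match if any changed path matches the file glob;
    # '*' short-circuits to True.
    files_ok = True
    if file_pattern != '*':
        file_re = re.compile(fnmatch.translate(file_pattern))
        files_ok = any(file_re.match(f) for f in files_changed or [])
    return branch_ok and files_ok
```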
@@ -0,0 +1,35 b''
1 import logging
2 import datetime
3
4 from sqlalchemy import *
5 from sqlalchemy.exc import DatabaseError
6 from sqlalchemy.orm import relation, backref, class_mapper, joinedload
7 from sqlalchemy.orm.session import Session
8 from sqlalchemy.ext.declarative import declarative_base
9
10 from rhodecode.lib.dbmigrate.migrate import *
11 from rhodecode.lib.dbmigrate.migrate.changeset import *
12 from rhodecode.lib.utils2 import str2bool
13
14 from rhodecode.model.meta import Base
15 from rhodecode.model import meta
16 from rhodecode.lib.dbmigrate.versions import _reset_base, notify
17
18 log = logging.getLogger(__name__)
19
20
21 def upgrade(migrate_engine):
22 """
23 Upgrade operations go here.
24 Don't create your own engine; bind migrate_engine to your metadata
25 """
26 _reset_base(migrate_engine)
27 from rhodecode.lib.dbmigrate.schema import db_4_4_0_2
28
29 db_4_4_0_2.RepoReviewRule.__table__.create()
30 db_4_4_0_2.RepoReviewRuleUser.__table__.create()
31 db_4_4_0_2.RepoReviewRuleUserGroup.__table__.create()
32
33 def downgrade(migrate_engine):
34 meta = MetaData()
35 meta.bind = migrate_engine
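Each of these migration modules follows the same shape: `upgrade` re-binds the declarative base to the supplied engine and creates the new tables, while `downgrade` only rebinds metadata. A stdlib-only analogue of the create-on-upgrade step, using `sqlite3` in place of SQLAlchemy-Migrate (the DDL below is a simplified sketch of the review-rule tables, not their full schema):

```python
import sqlite3

def upgrade(conn):
    # Create the new review-rule tables if missing, mirroring the
    # Table.create() calls in the migration above (columns abridged).
    conn.execute(
        "CREATE TABLE IF NOT EXISTS repo_review_rules ("
        "repo_review_rule_id INTEGER PRIMARY KEY, repo_id INTEGER)")
    conn.execute(
        "CREATE TABLE IF NOT EXISTS repo_review_rules_users ("
        "repo_review_rule_user_id INTEGER PRIMARY KEY, "
        "repo_review_rule_id INTEGER, user_id INTEGER)")

conn = sqlite3.connect(':memory:')
upgrade(conn)
```

`CREATE TABLE IF NOT EXISTS` keeps the sketch idempotent; the real migration relies on the framework to track which versions have already run.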
@@ -0,0 +1,34 b''
1 import logging
2 import datetime
3
4 from sqlalchemy import *
5 from sqlalchemy.exc import DatabaseError
6 from sqlalchemy.orm import relation, backref, class_mapper, joinedload
7 from sqlalchemy.orm.session import Session
8 from sqlalchemy.ext.declarative import declarative_base
9
10 from rhodecode.lib.dbmigrate.migrate import *
11 from rhodecode.lib.dbmigrate.migrate.changeset import *
12 from rhodecode.lib.utils2 import str2bool
13
14 from rhodecode.model.meta import Base
15 from rhodecode.model import meta
16 from rhodecode.lib.dbmigrate.versions import _reset_base, notify
17
18 log = logging.getLogger(__name__)
19
20
21 def upgrade(migrate_engine):
22 """
23 Upgrade operations go here.
24 Don't create your own engine; bind migrate_engine to your metadata
25 """
26 _reset_base(migrate_engine)
27 from rhodecode.lib.dbmigrate.schema import db_4_5_0_0
28
29 db_4_5_0_0.PullRequestReviewers.reasons.create(
30 table=db_4_5_0_0.PullRequestReviewers.__table__)
31
32 def downgrade(migrate_engine):
33 meta = MetaData()
34 meta.bind = migrate_engine
@@ -0,0 +1,42 b''
1 import logging
2 import datetime
3
4 from sqlalchemy import *
5 from sqlalchemy.exc import DatabaseError
6 from sqlalchemy.orm import relation, backref, class_mapper, joinedload
7 from sqlalchemy.orm.session import Session
8 from sqlalchemy.ext.declarative import declarative_base
9
10 from rhodecode.lib.dbmigrate.migrate import *
11 from rhodecode.lib.dbmigrate.migrate.changeset import *
12 from rhodecode.lib.utils2 import str2bool
13
14 from rhodecode.model.meta import Base
15 from rhodecode.model import meta
16 from rhodecode.lib.dbmigrate.versions import _reset_base, notify
17
18 log = logging.getLogger(__name__)
19
20
21 def upgrade(migrate_engine):
22 """
23 Upgrade operations go here.
24 Don't create your own engine; bind migrate_engine to your metadata
25 """
26 _reset_base(migrate_engine)
27 from rhodecode.lib.dbmigrate.schema import db_4_5_0_0
28
29 fixups(db_4_5_0_0, meta.Session)
30
31 def downgrade(migrate_engine):
32 meta = MetaData()
33 meta.bind = migrate_engine
34
35 def fixups(models, _SESSION):
36 # ** create default permissions ** #
37 from rhodecode.model.permission import PermissionModel
38 PermissionModel(_SESSION()).create_permissions()
39
40 res = PermissionModel(_SESSION()).create_default_user_permissions(
41 models.User.DEFAULT_USER)
42 _SESSION().commit()
@@ -0,0 +1,31 b''
1 import logging
2
3 from sqlalchemy import Column, MetaData, Unicode
4
5 from rhodecode.lib.dbmigrate.versions import _reset_base
6
7 log = logging.getLogger(__name__)
8
9
10 def upgrade(migrate_engine):
11 """
12 Upgrade operations go here.
13 Don't create your own engine; bind migrate_engine to your metadata
14 """
15 _reset_base(migrate_engine)
16 from rhodecode.lib.dbmigrate.schema import db_4_5_0_0 as db
17
18 # Add shadow merge ref column to pull request table.
19 pr_table = db.PullRequest.__table__
20 pr_col = Column('shadow_merge_ref', Unicode(255), nullable=True)
21 pr_col.create(table=pr_table)
22
23 # Add shadow merge ref column to pull request version table.
24 pr_version_table = db.PullRequestVersion.__table__
25 pr_version_col = Column('shadow_merge_ref', Unicode(255), nullable=True)
26 pr_version_col.create(table=pr_version_table)
27
28
29 def downgrade(migrate_engine):
30 meta = MetaData()
31 meta.bind = migrate_engine
@@ -0,0 +1,27 b''
1 import logging
2
3 from sqlalchemy import Column, MetaData, Boolean
4
5 from rhodecode.lib.dbmigrate.versions import _reset_base
6
7 log = logging.getLogger(__name__)
8
9
10 def upgrade(migrate_engine):
11 """
12 Upgrade operations go here.
13 Don't create your own engine; bind migrate_engine to your metadata
14 """
15 _reset_base(migrate_engine)
16 from rhodecode.lib.dbmigrate.schema import db_4_5_0_0 as db
17
18 # Add personal column to RepoGroup table.
19 rg_table = db.RepoGroup.__table__
20 rg_col = Column(
21 'personal', Boolean(), nullable=True, unique=None, default=None)
22 rg_col.create(table=rg_table)
23
24
25 def downgrade(migrate_engine):
26 meta = MetaData()
27 meta.bind = migrate_engine
This diff has been collapsed as it changes many lines (1919 lines changed).
@@ -0,0 +1,1919 b''
1 #!/usr/bin/python2.4
2
3 from __future__ import division
4
5 """Diff Match and Patch
6
7 Copyright 2006 Google Inc.
8 http://code.google.com/p/google-diff-match-patch/
9
10 Licensed under the Apache License, Version 2.0 (the "License");
11 you may not use this file except in compliance with the License.
12 You may obtain a copy of the License at
13
14 http://www.apache.org/licenses/LICENSE-2.0
15
16 Unless required by applicable law or agreed to in writing, software
17 distributed under the License is distributed on an "AS IS" BASIS,
18 WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
19 See the License for the specific language governing permissions and
20 limitations under the License.
21 """
22
23 """Functions for diff, match and patch.
24
25 Computes the difference between two texts to create a patch.
26 Applies the patch onto another text, allowing for errors.
27 """
28
29 __author__ = 'fraser@google.com (Neil Fraser)'
30
31 import math
32 import re
33 import sys
34 import time
35 import urllib
36
37 class diff_match_patch:
38 """Class containing the diff, match and patch methods.
39
40 Also contains the behaviour settings.
41 """
42
43 def __init__(self):
44 """Inits a diff_match_patch object with default settings.
45 Redefine these in your program to override the defaults.
46 """
47
48 # Number of seconds to map a diff before giving up (0 for infinity).
49 self.Diff_Timeout = 1.0
50 # Cost of an empty edit operation in terms of edit characters.
51 self.Diff_EditCost = 4
52 # At what point is no match declared (0.0 = perfection, 1.0 = very loose).
53 self.Match_Threshold = 0.5
54 # How far to search for a match (0 = exact location, 1000+ = broad match).
55 # A match this many characters away from the expected location will add
56 # 1.0 to the score (0.0 is a perfect match).
57 self.Match_Distance = 1000
58 # When deleting a large block of text (over ~64 characters), how close do
59 # the contents have to be to match the expected contents. (0.0 = perfection,
60 # 1.0 = very loose). Note that Match_Threshold controls how closely the
61 # end points of a delete need to match.
62 self.Patch_DeleteThreshold = 0.5
63 # Chunk size for context length.
64 self.Patch_Margin = 4
65
66 # The number of bits in an int.
67 # Python has no maximum, thus to disable patch splitting set to 0.
68 # However to avoid long patches in certain pathological cases, use 32.
69 # Multiple short patches (using native ints) are much faster than long ones.
70 self.Match_MaxBits = 32
71
72 # DIFF FUNCTIONS
73
74 # The data structure representing a diff is an array of tuples:
75 # [(DIFF_DELETE, "Hello"), (DIFF_INSERT, "Goodbye"), (DIFF_EQUAL, " world.")]
76 # which means: delete "Hello", add "Goodbye" and keep " world."
77 DIFF_DELETE = -1
78 DIFF_INSERT = 1
79 DIFF_EQUAL = 0
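The comment above defines a diff as a list of `(op, text)` tuples. Either side can be reconstructed from such a list: keeping EQUAL and DELETE runs yields text1, keeping EQUAL and INSERT runs yields text2 (the full library exposes this as `diff_text1`/`diff_text2`). A small sketch:

```python
DIFF_DELETE, DIFF_INSERT, DIFF_EQUAL = -1, 1, 0

def diff_text1(diffs):
    # text1 is everything that was equal or deleted.
    return ''.join(t for op, t in diffs if op != DIFF_INSERT)

def diff_text2(diffs):
    # text2 is everything that was equal or inserted.
    return ''.join(t for op, t in diffs if op != DIFF_DELETE)

diffs = [(DIFF_DELETE, 'Hello'), (DIFF_INSERT, 'Goodbye'),
         (DIFF_EQUAL, ' world.')]
```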
80
81 def diff_main(self, text1, text2, checklines=True, deadline=None):
82 """Find the differences between two texts. Simplifies the problem by
83 stripping any common prefix or suffix off the texts before diffing.
84
85 Args:
86 text1: Old string to be diffed.
87 text2: New string to be diffed.
88 checklines: Optional speedup flag. If present and false, then don't run
89 a line-level diff first to identify the changed areas.
90 Defaults to true, which does a faster, slightly less optimal diff.
91 deadline: Optional time when the diff should be complete by. Used
92       internally for recursive calls. Users should set Diff_Timeout instead.
93
94 Returns:
95 Array of changes.
96 """
97 # Set a deadline by which time the diff must be complete.
98     if deadline is None:
99 # Unlike in most languages, Python counts time in seconds.
100 if self.Diff_Timeout <= 0:
101 deadline = sys.maxint
102 else:
103 deadline = time.time() + self.Diff_Timeout
104
105 # Check for null inputs.
106     if text1 is None or text2 is None:
107 raise ValueError("Null inputs. (diff_main)")
108
109 # Check for equality (speedup).
110 if text1 == text2:
111 if text1:
112 return [(self.DIFF_EQUAL, text1)]
113 return []
114
115 # Trim off common prefix (speedup).
116 commonlength = self.diff_commonPrefix(text1, text2)
117 commonprefix = text1[:commonlength]
118 text1 = text1[commonlength:]
119 text2 = text2[commonlength:]
120
121 # Trim off common suffix (speedup).
122 commonlength = self.diff_commonSuffix(text1, text2)
123 if commonlength == 0:
124 commonsuffix = ''
125 else:
126 commonsuffix = text1[-commonlength:]
127 text1 = text1[:-commonlength]
128 text2 = text2[:-commonlength]
129
130 # Compute the diff on the middle block.
131 diffs = self.diff_compute(text1, text2, checklines, deadline)
132
133 # Restore the prefix and suffix.
134 if commonprefix:
135 diffs[:0] = [(self.DIFF_EQUAL, commonprefix)]
136 if commonsuffix:
137 diffs.append((self.DIFF_EQUAL, commonsuffix))
138 self.diff_cleanupMerge(diffs)
139 return diffs
140
141 def diff_compute(self, text1, text2, checklines, deadline):
142 """Find the differences between two texts. Assumes that the texts do not
143 have any common prefix or suffix.
144
145 Args:
146 text1: Old string to be diffed.
147 text2: New string to be diffed.
148 checklines: Speedup flag. If false, then don't run a line-level diff
149 first to identify the changed areas.
150 If true, then run a faster, slightly less optimal diff.
151 deadline: Time when the diff should be complete by.
152
153 Returns:
154 Array of changes.
155 """
156 if not text1:
157 # Just add some text (speedup).
158 return [(self.DIFF_INSERT, text2)]
159
160 if not text2:
161 # Just delete some text (speedup).
162 return [(self.DIFF_DELETE, text1)]
163
164 if len(text1) > len(text2):
165 (longtext, shorttext) = (text1, text2)
166 else:
167 (shorttext, longtext) = (text1, text2)
168 i = longtext.find(shorttext)
169 if i != -1:
170 # Shorter text is inside the longer text (speedup).
171 diffs = [(self.DIFF_INSERT, longtext[:i]), (self.DIFF_EQUAL, shorttext),
172 (self.DIFF_INSERT, longtext[i + len(shorttext):])]
173 # Swap insertions for deletions if diff is reversed.
174 if len(text1) > len(text2):
175 diffs[0] = (self.DIFF_DELETE, diffs[0][1])
176 diffs[2] = (self.DIFF_DELETE, diffs[2][1])
177 return diffs
178
179 if len(shorttext) == 1:
180 # Single character string.
181 # After the previous speedup, the character can't be an equality.
182 return [(self.DIFF_DELETE, text1), (self.DIFF_INSERT, text2)]
183
184 # Check to see if the problem can be split in two.
185 hm = self.diff_halfMatch(text1, text2)
186 if hm:
187 # A half-match was found, sort out the return data.
188 (text1_a, text1_b, text2_a, text2_b, mid_common) = hm
189 # Send both pairs off for separate processing.
190 diffs_a = self.diff_main(text1_a, text2_a, checklines, deadline)
191 diffs_b = self.diff_main(text1_b, text2_b, checklines, deadline)
192 # Merge the results.
193 return diffs_a + [(self.DIFF_EQUAL, mid_common)] + diffs_b
194
195 if checklines and len(text1) > 100 and len(text2) > 100:
196 return self.diff_lineMode(text1, text2, deadline)
197
198 return self.diff_bisect(text1, text2, deadline)
199
200 def diff_lineMode(self, text1, text2, deadline):
201 """Do a quick line-level diff on both strings, then rediff the parts for
202 greater accuracy.
203 This speedup can produce non-minimal diffs.
204
205 Args:
206 text1: Old string to be diffed.
207 text2: New string to be diffed.
208 deadline: Time when the diff should be complete by.
209
210 Returns:
211 Array of changes.
212 """
213
214 # Scan the text on a line-by-line basis first.
215 (text1, text2, linearray) = self.diff_linesToChars(text1, text2)
216
217 diffs = self.diff_main(text1, text2, False, deadline)
218
219 # Convert the diff back to original text.
220 self.diff_charsToLines(diffs, linearray)
221 # Eliminate freak matches (e.g. blank lines)
222 self.diff_cleanupSemantic(diffs)
223
224 # Rediff any replacement blocks, this time character-by-character.
225 # Add a dummy entry at the end.
226 diffs.append((self.DIFF_EQUAL, ''))
227 pointer = 0
228 count_delete = 0
229 count_insert = 0
230 text_delete = ''
231 text_insert = ''
232 while pointer < len(diffs):
233 if diffs[pointer][0] == self.DIFF_INSERT:
234 count_insert += 1
235 text_insert += diffs[pointer][1]
236 elif diffs[pointer][0] == self.DIFF_DELETE:
237 count_delete += 1
238 text_delete += diffs[pointer][1]
239 elif diffs[pointer][0] == self.DIFF_EQUAL:
240 # Upon reaching an equality, check for prior redundancies.
241 if count_delete >= 1 and count_insert >= 1:
242 # Delete the offending records and add the merged ones.
243 a = self.diff_main(text_delete, text_insert, False, deadline)
244 diffs[pointer - count_delete - count_insert : pointer] = a
245 pointer = pointer - count_delete - count_insert + len(a)
246 count_insert = 0
247 count_delete = 0
248 text_delete = ''
249 text_insert = ''
250
251 pointer += 1
252
253 diffs.pop() # Remove the dummy entry at the end.
254
255 return diffs
256
257 def diff_bisect(self, text1, text2, deadline):
258 """Find the 'middle snake' of a diff, split the problem in two
259 and return the recursively constructed diff.
260 See Myers 1986 paper: An O(ND) Difference Algorithm and Its Variations.
261
262 Args:
263 text1: Old string to be diffed.
264 text2: New string to be diffed.
265 deadline: Time at which to bail if not yet complete.
266
267 Returns:
268 Array of diff tuples.
269 """
270
271 # Cache the text lengths to prevent multiple calls.
272 text1_length = len(text1)
273 text2_length = len(text2)
274 max_d = (text1_length + text2_length + 1) // 2
275 v_offset = max_d
276 v_length = 2 * max_d
277 v1 = [-1] * v_length
278 v1[v_offset + 1] = 0
279 v2 = v1[:]
280 delta = text1_length - text2_length
281 # If the total number of characters is odd, then the front path will
282 # collide with the reverse path.
283 front = (delta % 2 != 0)
284 # Offsets for start and end of k loop.
285 # Prevents mapping of space beyond the grid.
286 k1start = 0
287 k1end = 0
288 k2start = 0
289 k2end = 0
290 for d in xrange(max_d):
291 # Bail out if deadline is reached.
292 if time.time() > deadline:
293 break
294
295 # Walk the front path one step.
296 for k1 in xrange(-d + k1start, d + 1 - k1end, 2):
297 k1_offset = v_offset + k1
298 if k1 == -d or (k1 != d and
299 v1[k1_offset - 1] < v1[k1_offset + 1]):
300 x1 = v1[k1_offset + 1]
301 else:
302 x1 = v1[k1_offset - 1] + 1
303 y1 = x1 - k1
304 while (x1 < text1_length and y1 < text2_length and
305 text1[x1] == text2[y1]):
306 x1 += 1
307 y1 += 1
308 v1[k1_offset] = x1
309 if x1 > text1_length:
310 # Ran off the right of the graph.
311 k1end += 2
312 elif y1 > text2_length:
313 # Ran off the bottom of the graph.
314 k1start += 2
315 elif front:
316 k2_offset = v_offset + delta - k1
317 if k2_offset >= 0 and k2_offset < v_length and v2[k2_offset] != -1:
318 # Mirror x2 onto top-left coordinate system.
319 x2 = text1_length - v2[k2_offset]
320 if x1 >= x2:
321 # Overlap detected.
322 return self.diff_bisectSplit(text1, text2, x1, y1, deadline)
323
324 # Walk the reverse path one step.
325 for k2 in xrange(-d + k2start, d + 1 - k2end, 2):
326 k2_offset = v_offset + k2
327 if k2 == -d or (k2 != d and
328 v2[k2_offset - 1] < v2[k2_offset + 1]):
329 x2 = v2[k2_offset + 1]
330 else:
331 x2 = v2[k2_offset - 1] + 1
332 y2 = x2 - k2
333 while (x2 < text1_length and y2 < text2_length and
334 text1[-x2 - 1] == text2[-y2 - 1]):
335 x2 += 1
336 y2 += 1
337 v2[k2_offset] = x2
338 if x2 > text1_length:
339 # Ran off the left of the graph.
340 k2end += 2
341 elif y2 > text2_length:
342 # Ran off the top of the graph.
343 k2start += 2
344 elif not front:
345 k1_offset = v_offset + delta - k2
346 if k1_offset >= 0 and k1_offset < v_length and v1[k1_offset] != -1:
347 x1 = v1[k1_offset]
348 y1 = v_offset + x1 - k1_offset
349 # Mirror x2 onto top-left coordinate system.
350 x2 = text1_length - x2
351 if x1 >= x2:
352 # Overlap detected.
353 return self.diff_bisectSplit(text1, text2, x1, y1, deadline)
354
355 # Diff took too long and hit the deadline or
356 # number of diffs equals number of characters, no commonality at all.
357 return [(self.DIFF_DELETE, text1), (self.DIFF_INSERT, text2)]
358
359 def diff_bisectSplit(self, text1, text2, x, y, deadline):
360 """Given the location of the 'middle snake', split the diff in two parts
361 and recurse.
362
363 Args:
364 text1: Old string to be diffed.
365 text2: New string to be diffed.
366 x: Index of split point in text1.
367 y: Index of split point in text2.
368 deadline: Time at which to bail if not yet complete.
369
370 Returns:
371 Array of diff tuples.
372 """
373 text1a = text1[:x]
374 text2a = text2[:y]
375 text1b = text1[x:]
376 text2b = text2[y:]
377
378 # Compute both diffs serially.
379 diffs = self.diff_main(text1a, text2a, False, deadline)
380 diffsb = self.diff_main(text1b, text2b, False, deadline)
381
382 return diffs + diffsb
383
384 def diff_linesToChars(self, text1, text2):
385 """Split two texts into an array of strings. Reduce the texts to a string
386 of hashes where each Unicode character represents one line.
387
388 Args:
389 text1: First string.
390 text2: Second string.
391
392 Returns:
393 Three element tuple, containing the encoded text1, the encoded text2 and
394 the array of unique strings. The zeroth element of the array of unique
395 strings is intentionally blank.
396 """
397 lineArray = [] # e.g. lineArray[4] == "Hello\n"
398 lineHash = {} # e.g. lineHash["Hello\n"] == 4
399
400 # "\x00" is a valid character, but various debuggers don't like it.
401 # So we'll insert a junk entry to avoid generating a null character.
402 lineArray.append('')
403
404 def diff_linesToCharsMunge(text):
405 """Split a text into an array of strings. Reduce the texts to a string
406 of hashes where each Unicode character represents one line.
407 Modifies linearray and linehash through being a closure.
408
409 Args:
410 text: String to encode.
411
412 Returns:
413 Encoded string.
414 """
415 chars = []
416 # Walk the text, pulling out a substring for each line.
417       # text.split('\n') would temporarily double our memory footprint.
418 # Modifying text would create many large strings to garbage collect.
419 lineStart = 0
420 lineEnd = -1
421 while lineEnd < len(text) - 1:
422 lineEnd = text.find('\n', lineStart)
423 if lineEnd == -1:
424 lineEnd = len(text) - 1
425 line = text[lineStart:lineEnd + 1]
426 lineStart = lineEnd + 1
427
428 if line in lineHash:
429 chars.append(unichr(lineHash[line]))
430 else:
431 lineArray.append(line)
432 lineHash[line] = len(lineArray) - 1
433 chars.append(unichr(len(lineArray) - 1))
434 return "".join(chars)
435
436 chars1 = diff_linesToCharsMunge(text1)
437 chars2 = diff_linesToCharsMunge(text2)
438 return (chars1, chars2, lineArray)
439
440 def diff_charsToLines(self, diffs, lineArray):
441 """Rehydrate the text in a diff from a string of line hashes to real lines
442 of text.
443
444 Args:
445 diffs: Array of diff tuples.
446 lineArray: Array of unique strings.
447 """
448 for x in xrange(len(diffs)):
449 text = []
450 for char in diffs[x][1]:
451 text.append(lineArray[ord(char)])
452 diffs[x] = (diffs[x][0], "".join(text))
453
454 def diff_commonPrefix(self, text1, text2):
455 """Determine the common prefix of two strings.
456
457 Args:
458 text1: First string.
459 text2: Second string.
460
461 Returns:
462 The number of characters common to the start of each string.
463 """
464 # Quick check for common null cases.
465 if not text1 or not text2 or text1[0] != text2[0]:
466 return 0
467 # Binary search.
468 # Performance analysis: http://neil.fraser.name/news/2007/10/09/
469 pointermin = 0
470 pointermax = min(len(text1), len(text2))
471 pointermid = pointermax
472 pointerstart = 0
473 while pointermin < pointermid:
474 if text1[pointerstart:pointermid] == text2[pointerstart:pointermid]:
475 pointermin = pointermid
476 pointerstart = pointermin
477 else:
478 pointermax = pointermid
479 pointermid = (pointermax - pointermin) // 2 + pointermin
480 return pointermid
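`diff_commonPrefix` above uses a binary search over slice comparisons instead of a per-character walk (see the linked performance note): `pointermin` characters are known-common, everything at or beyond `pointermax` is known-divergent, and the midpoint slice test halves the gap each round. A straightforward character-walk reference that computes the same value, useful for sanity-checking:

```python
def common_prefix_naive(text1, text2):
    # Reference implementation: walk characters until they differ.
    n = min(len(text1), len(text2))
    for i in range(n):
        if text1[i] != text2[i]:
            return i
    return n
```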
481
482 def diff_commonSuffix(self, text1, text2):
483 """Determine the common suffix of two strings.
484
485 Args:
486 text1: First string.
487 text2: Second string.
488
489 Returns:
490 The number of characters common to the end of each string.
491 """
492 # Quick check for common null cases.
493 if not text1 or not text2 or text1[-1] != text2[-1]:
494 return 0
495 # Binary search.
496 # Performance analysis: http://neil.fraser.name/news/2007/10/09/
497 pointermin = 0
498 pointermax = min(len(text1), len(text2))
499 pointermid = pointermax
500 pointerend = 0
501 while pointermin < pointermid:
502 if (text1[-pointermid:len(text1) - pointerend] ==
503 text2[-pointermid:len(text2) - pointerend]):
504 pointermin = pointermid
505 pointerend = pointermin
506 else:
507 pointermax = pointermid
508 pointermid = (pointermax - pointermin) // 2 + pointermin
509 return pointermid
510
511 def diff_commonOverlap(self, text1, text2):
512 """Determine if the suffix of one string is the prefix of another.
513
514 Args:
515       text1: First string.
516       text2: Second string.
517
518 Returns:
519 The number of characters common to the end of the first
520 string and the start of the second string.
521 """
522 # Cache the text lengths to prevent multiple calls.
523 text1_length = len(text1)
524 text2_length = len(text2)
525 # Eliminate the null case.
526 if text1_length == 0 or text2_length == 0:
527 return 0
528 # Truncate the longer string.
529 if text1_length > text2_length:
530 text1 = text1[-text2_length:]
531 elif text1_length < text2_length:
532 text2 = text2[:text1_length]
533 text_length = min(text1_length, text2_length)
534 # Quick check for the worst case.
535 if text1 == text2:
536 return text_length
537
538 # Start by looking for a single character match
539 # and increase length until no match is found.
540 # Performance analysis: http://neil.fraser.name/news/2010/11/04/
541 best = 0
542 length = 1
543 while True:
544 pattern = text1[-length:]
545 found = text2.find(pattern)
546 if found == -1:
547 return best
548 length += found
549 if found == 0 or text1[-length:] == text2[:length]:
550 best = length
551 length += 1
552
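The overlap search above exploits `str.find`: each miss on the current suffix pattern jumps `length` forward by the distance to the next candidate, so it avoids testing every length. A standalone sketch (Python 3; name is ours):

```python
def common_overlap_length(text1, text2):
    """Length of the longest suffix of text1 that is a prefix of text2."""
    text1_length = len(text1)
    text2_length = len(text2)
    # Eliminate the null case.
    if text1_length == 0 or text2_length == 0:
        return 0
    # Truncate the longer string; only min(len) characters can overlap.
    if text1_length > text2_length:
        text1 = text1[-text2_length:]
    elif text1_length < text2_length:
        text2 = text2[:text1_length]
    if text1 == text2:
        return min(text1_length, text2_length)
    # Grow the candidate suffix, skipping ahead by where find() locates it.
    best = 0
    length = 1
    while True:
        pattern = text1[-length:]
        found = text2.find(pattern)
        if found == -1:
            return best
        length += found
        if found == 0 or text1[-length:] == text2[:length]:
            best = length
        length += 1

print(common_overlap_length("abcxxx", "xxxdef"))  # -> 3
```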
553 def diff_halfMatch(self, text1, text2):
554 """Do the two texts share a substring which is at least half the length of
555 the longer text?
556 This speedup can produce non-minimal diffs.
557
558 Args:
559 text1: First string.
560 text2: Second string.
561
562 Returns:
563 Five element Array, containing the prefix of text1, the suffix of text1,
564 the prefix of text2, the suffix of text2 and the common middle. Or None
565 if there was no match.
566 """
567 if self.Diff_Timeout <= 0:
568 # Don't risk returning a non-optimal diff if we have unlimited time.
569 return None
570 if len(text1) > len(text2):
571 (longtext, shorttext) = (text1, text2)
572 else:
573 (shorttext, longtext) = (text1, text2)
574 if len(longtext) < 4 or len(shorttext) * 2 < len(longtext):
575 return None # Pointless.
576
577 def diff_halfMatchI(longtext, shorttext, i):
578 """Does a substring of shorttext exist within longtext such that the
579 substring is at least half the length of longtext?
580 Closure, but does not reference any external variables.
581
582 Args:
583 longtext: Longer string.
584 shorttext: Shorter string.
585 i: Start index of quarter length substring within longtext.
586
587 Returns:
588 Five element Array, containing the prefix of longtext, the suffix of
589 longtext, the prefix of shorttext, the suffix of shorttext and the
590 common middle. Or None if there was no match.
591 """
592 seed = longtext[i:i + len(longtext) // 4]
593 best_common = ''
594 j = shorttext.find(seed)
595 while j != -1:
596 prefixLength = self.diff_commonPrefix(longtext[i:], shorttext[j:])
597 suffixLength = self.diff_commonSuffix(longtext[:i], shorttext[:j])
598 if len(best_common) < suffixLength + prefixLength:
599 best_common = (shorttext[j - suffixLength:j] +
600 shorttext[j:j + prefixLength])
601 best_longtext_a = longtext[:i - suffixLength]
602 best_longtext_b = longtext[i + prefixLength:]
603 best_shorttext_a = shorttext[:j - suffixLength]
604 best_shorttext_b = shorttext[j + prefixLength:]
605 j = shorttext.find(seed, j + 1)
606
607 if len(best_common) * 2 >= len(longtext):
608 return (best_longtext_a, best_longtext_b,
609 best_shorttext_a, best_shorttext_b, best_common)
610 else:
611 return None
612
613 # First check if the second quarter is the seed for a half-match.
614 hm1 = diff_halfMatchI(longtext, shorttext, (len(longtext) + 3) // 4)
615 # Check again based on the third quarter.
616 hm2 = diff_halfMatchI(longtext, shorttext, (len(longtext) + 1) // 2)
617 if not hm1 and not hm2:
618 return None
619 elif not hm2:
620 hm = hm1
621 elif not hm1:
622 hm = hm2
623 else:
624 # Both matched. Select the longest.
625 if len(hm1[4]) > len(hm2[4]):
626 hm = hm1
627 else:
628 hm = hm2
629
630 # A half-match was found, sort out the return data.
631 if len(text1) > len(text2):
632 (text1_a, text1_b, text2_a, text2_b, mid_common) = hm
633 else:
634 (text2_a, text2_b, text1_a, text1_b, mid_common) = hm
635 return (text1_a, text1_b, text2_a, text2_b, mid_common)
636
637 def diff_cleanupSemantic(self, diffs):
638 """Reduce the number of edits by eliminating semantically trivial
639 equalities.
640
641 Args:
642 diffs: Array of diff tuples.
643 """
644 changes = False
645 equalities = [] # Stack of indices where equalities are found.
646 lastequality = None # Always equal to diffs[equalities[-1]][1]
647 pointer = 0 # Index of current position.
648 # Number of chars that changed prior to the equality.
649 length_insertions1, length_deletions1 = 0, 0
650 # Number of chars that changed after the equality.
651 length_insertions2, length_deletions2 = 0, 0
652 while pointer < len(diffs):
653 if diffs[pointer][0] == self.DIFF_EQUAL: # Equality found.
654 equalities.append(pointer)
655 length_insertions1, length_insertions2 = length_insertions2, 0
656 length_deletions1, length_deletions2 = length_deletions2, 0
657 lastequality = diffs[pointer][1]
658 else: # An insertion or deletion.
659 if diffs[pointer][0] == self.DIFF_INSERT:
660 length_insertions2 += len(diffs[pointer][1])
661 else:
662 length_deletions2 += len(diffs[pointer][1])
663 # Eliminate an equality that is smaller or equal to the edits on both
664 # sides of it.
665 if (lastequality and (len(lastequality) <=
666 max(length_insertions1, length_deletions1)) and
667 (len(lastequality) <= max(length_insertions2, length_deletions2))):
668 # Duplicate record.
669 diffs.insert(equalities[-1], (self.DIFF_DELETE, lastequality))
670 # Change second copy to insert.
671 diffs[equalities[-1] + 1] = (self.DIFF_INSERT,
672 diffs[equalities[-1] + 1][1])
673 # Throw away the equality we just deleted.
674 equalities.pop()
675 # Throw away the previous equality (it needs to be reevaluated).
676 if len(equalities):
677 equalities.pop()
678 if len(equalities):
679 pointer = equalities[-1]
680 else:
681 pointer = -1
682 # Reset the counters.
683 length_insertions1, length_deletions1 = 0, 0
684 length_insertions2, length_deletions2 = 0, 0
685 lastequality = None
686 changes = True
687 pointer += 1
688
689 # Normalize the diff.
690 if changes:
691 self.diff_cleanupMerge(diffs)
692 self.diff_cleanupSemanticLossless(diffs)
693
694 # Find any overlaps between deletions and insertions.
695 # e.g: <del>abcxxx</del><ins>xxxdef</ins>
696 # -> <del>abc</del>xxx<ins>def</ins>
697 # e.g: <del>xxxabc</del><ins>defxxx</ins>
698 # -> <ins>def</ins>xxx<del>abc</del>
699 # Only extract an overlap if it is as big as the edit ahead or behind it.
700 pointer = 1
701 while pointer < len(diffs):
702 if (diffs[pointer - 1][0] == self.DIFF_DELETE and
703 diffs[pointer][0] == self.DIFF_INSERT):
704 deletion = diffs[pointer - 1][1]
705 insertion = diffs[pointer][1]
706 overlap_length1 = self.diff_commonOverlap(deletion, insertion)
707 overlap_length2 = self.diff_commonOverlap(insertion, deletion)
708 if overlap_length1 >= overlap_length2:
709 if (overlap_length1 >= len(deletion) / 2.0 or
710 overlap_length1 >= len(insertion) / 2.0):
711 # Overlap found. Insert an equality and trim the surrounding edits.
712 diffs.insert(pointer, (self.DIFF_EQUAL,
713 insertion[:overlap_length1]))
714 diffs[pointer - 1] = (self.DIFF_DELETE,
715 deletion[:len(deletion) - overlap_length1])
716 diffs[pointer + 1] = (self.DIFF_INSERT,
717 insertion[overlap_length1:])
718 pointer += 1
719 else:
720 if (overlap_length2 >= len(deletion) / 2.0 or
721 overlap_length2 >= len(insertion) / 2.0):
722 # Reverse overlap found.
723 # Insert an equality and swap and trim the surrounding edits.
724 diffs.insert(pointer, (self.DIFF_EQUAL, deletion[:overlap_length2]))
725 diffs[pointer - 1] = (self.DIFF_INSERT,
726 insertion[:len(insertion) - overlap_length2])
727 diffs[pointer + 1] = (self.DIFF_DELETE, deletion[overlap_length2:])
728 pointer += 1
729 pointer += 1
730 pointer += 1
731
732 def diff_cleanupSemanticLossless(self, diffs):
733 """Look for single edits surrounded on both sides by equalities
734 which can be shifted sideways to align the edit to a word boundary.
735 e.g: The c<ins>at c</ins>ame. -> The <ins>cat </ins>came.
736
737 Args:
738 diffs: Array of diff tuples.
739 """
740
741 def diff_cleanupSemanticScore(one, two):
742 """Given two strings, compute a score representing whether the
743 internal boundary falls on logical boundaries.
744 Scores range from 6 (best) to 0 (worst).
745 Closure, but does not reference any external variables.
746
747 Args:
748 one: First string.
749 two: Second string.
750
751 Returns:
752 The score.
753 """
754 if not one or not two:
755 # Edges are the best.
756 return 6
757
758 # Each port of this function behaves slightly differently due to
759 # subtle differences in each language's definition of things like
760 # 'whitespace'. Since this function's purpose is largely cosmetic,
761 # the choice has been made to use each language's native features
762 # rather than force total conformity.
763 char1 = one[-1]
764 char2 = two[0]
765 nonAlphaNumeric1 = not char1.isalnum()
766 nonAlphaNumeric2 = not char2.isalnum()
767 whitespace1 = nonAlphaNumeric1 and char1.isspace()
768 whitespace2 = nonAlphaNumeric2 and char2.isspace()
769 lineBreak1 = whitespace1 and (char1 == "\r" or char1 == "\n")
770 lineBreak2 = whitespace2 and (char2 == "\r" or char2 == "\n")
771 blankLine1 = lineBreak1 and self.BLANKLINEEND.search(one)
772 blankLine2 = lineBreak2 and self.BLANKLINESTART.match(two)
773
774 if blankLine1 or blankLine2:
775 # Five points for blank lines.
776 return 5
777 elif lineBreak1 or lineBreak2:
778 # Four points for line breaks.
779 return 4
780 elif nonAlphaNumeric1 and not whitespace1 and whitespace2:
781 # Three points for end of sentences.
782 return 3
783 elif whitespace1 or whitespace2:
784 # Two points for whitespace.
785 return 2
786 elif nonAlphaNumeric1 or nonAlphaNumeric2:
787 # One point for non-alphanumeric.
788 return 1
789 return 0
790
791 pointer = 1
792 # Intentionally ignore the first and last element (don't need checking).
793 while pointer < len(diffs) - 1:
794 if (diffs[pointer - 1][0] == self.DIFF_EQUAL and
795 diffs[pointer + 1][0] == self.DIFF_EQUAL):
796 # This is a single edit surrounded by equalities.
797 equality1 = diffs[pointer - 1][1]
798 edit = diffs[pointer][1]
799 equality2 = diffs[pointer + 1][1]
800
801 # First, shift the edit as far left as possible.
802 commonOffset = self.diff_commonSuffix(equality1, edit)
803 if commonOffset:
804 commonString = edit[-commonOffset:]
805 equality1 = equality1[:-commonOffset]
806 edit = commonString + edit[:-commonOffset]
807 equality2 = commonString + equality2
808
809 # Second, step character by character right, looking for the best fit.
810 bestEquality1 = equality1
811 bestEdit = edit
812 bestEquality2 = equality2
813 bestScore = (diff_cleanupSemanticScore(equality1, edit) +
814 diff_cleanupSemanticScore(edit, equality2))
815 while edit and equality2 and edit[0] == equality2[0]:
816 equality1 += edit[0]
817 edit = edit[1:] + equality2[0]
818 equality2 = equality2[1:]
819 score = (diff_cleanupSemanticScore(equality1, edit) +
820 diff_cleanupSemanticScore(edit, equality2))
821 # The >= encourages trailing rather than leading whitespace on edits.
822 if score >= bestScore:
823 bestScore = score
824 bestEquality1 = equality1
825 bestEdit = edit
826 bestEquality2 = equality2
827
828 if diffs[pointer - 1][1] != bestEquality1:
829 # We have an improvement, save it back to the diff.
830 if bestEquality1:
831 diffs[pointer - 1] = (diffs[pointer - 1][0], bestEquality1)
832 else:
833 del diffs[pointer - 1]
834 pointer -= 1
835 diffs[pointer] = (diffs[pointer][0], bestEdit)
836 if bestEquality2:
837 diffs[pointer + 1] = (diffs[pointer + 1][0], bestEquality2)
838 else:
839 del diffs[pointer + 1]
840 pointer -= 1
841 pointer += 1
842
843 # Define some regex patterns for matching boundaries.
844 BLANKLINEEND = re.compile(r"\n\r?\n$")
845 BLANKLINESTART = re.compile(r"^\r?\n\r?\n")
846
847 def diff_cleanupEfficiency(self, diffs):
848 """Reduce the number of edits by eliminating operationally trivial
849 equalities.
850
851 Args:
852 diffs: Array of diff tuples.
853 """
854 changes = False
855 equalities = [] # Stack of indices where equalities are found.
856 lastequality = None # Always equal to diffs[equalities[-1]][1]
857 pointer = 0 # Index of current position.
858 pre_ins = False # Is there an insertion operation before the last equality.
859 pre_del = False # Is there a deletion operation before the last equality.
860 post_ins = False # Is there an insertion operation after the last equality.
861 post_del = False # Is there a deletion operation after the last equality.
862 while pointer < len(diffs):
863 if diffs[pointer][0] == self.DIFF_EQUAL: # Equality found.
864 if (len(diffs[pointer][1]) < self.Diff_EditCost and
865 (post_ins or post_del)):
866 # Candidate found.
867 equalities.append(pointer)
868 pre_ins = post_ins
869 pre_del = post_del
870 lastequality = diffs[pointer][1]
871 else:
872 # Not a candidate, and can never become one.
873 equalities = []
874 lastequality = None
875
876 post_ins = post_del = False
877 else: # An insertion or deletion.
878 if diffs[pointer][0] == self.DIFF_DELETE:
879 post_del = True
880 else:
881 post_ins = True
882
883 # Five types to be split:
884 # <ins>A</ins><del>B</del>XY<ins>C</ins><del>D</del>
885 # <ins>A</ins>X<ins>C</ins><del>D</del>
886 # <ins>A</ins><del>B</del>X<ins>C</ins>
887 # <ins>A</ins>X<ins>C</ins><del>D</del>
888 # <ins>A</ins><del>B</del>X<del>C</del>
889
890 if lastequality and ((pre_ins and pre_del and post_ins and post_del) or
891 ((len(lastequality) < self.Diff_EditCost / 2) and
892 (pre_ins + pre_del + post_ins + post_del) == 3)):
893 # Duplicate record.
894 diffs.insert(equalities[-1], (self.DIFF_DELETE, lastequality))
895 # Change second copy to insert.
896 diffs[equalities[-1] + 1] = (self.DIFF_INSERT,
897 diffs[equalities[-1] + 1][1])
898 equalities.pop() # Throw away the equality we just deleted.
899 lastequality = None
900 if pre_ins and pre_del:
901 # No changes made which could affect previous entry, keep going.
902 post_ins = post_del = True
903 equalities = []
904 else:
905 if len(equalities):
906 equalities.pop() # Throw away the previous equality.
907 if len(equalities):
908 pointer = equalities[-1]
909 else:
910 pointer = -1
911 post_ins = post_del = False
912 changes = True
913 pointer += 1
914
915 if changes:
916 self.diff_cleanupMerge(diffs)
917
918 def diff_cleanupMerge(self, diffs):
919 """Reorder and merge like edit sections. Merge equalities.
920 Any edit section can move as long as it doesn't cross an equality.
921
922 Args:
923 diffs: Array of diff tuples.
924 """
925 diffs.append((self.DIFF_EQUAL, '')) # Add a dummy entry at the end.
926 pointer = 0
927 count_delete = 0
928 count_insert = 0
929 text_delete = ''
930 text_insert = ''
931 while pointer < len(diffs):
932 if diffs[pointer][0] == self.DIFF_INSERT:
933 count_insert += 1
934 text_insert += diffs[pointer][1]
935 pointer += 1
936 elif diffs[pointer][0] == self.DIFF_DELETE:
937 count_delete += 1
938 text_delete += diffs[pointer][1]
939 pointer += 1
940 elif diffs[pointer][0] == self.DIFF_EQUAL:
941 # Upon reaching an equality, check for prior redundancies.
942 if count_delete + count_insert > 1:
943 if count_delete != 0 and count_insert != 0:
944 # Factor out any common prefixes.
945 commonlength = self.diff_commonPrefix(text_insert, text_delete)
946 if commonlength != 0:
947 x = pointer - count_delete - count_insert - 1
948 if x >= 0 and diffs[x][0] == self.DIFF_EQUAL:
949 diffs[x] = (diffs[x][0], diffs[x][1] +
950 text_insert[:commonlength])
951 else:
952 diffs.insert(0, (self.DIFF_EQUAL, text_insert[:commonlength]))
953 pointer += 1
954 text_insert = text_insert[commonlength:]
955 text_delete = text_delete[commonlength:]
956 # Factor out any common suffixes.
957 commonlength = self.diff_commonSuffix(text_insert, text_delete)
958 if commonlength != 0:
959 diffs[pointer] = (diffs[pointer][0], text_insert[-commonlength:] +
960 diffs[pointer][1])
961 text_insert = text_insert[:-commonlength]
962 text_delete = text_delete[:-commonlength]
963 # Delete the offending records and add the merged ones.
964 if count_delete == 0:
965 diffs[pointer - count_insert : pointer] = [
966 (self.DIFF_INSERT, text_insert)]
967 elif count_insert == 0:
968 diffs[pointer - count_delete : pointer] = [
969 (self.DIFF_DELETE, text_delete)]
970 else:
971 diffs[pointer - count_delete - count_insert : pointer] = [
972 (self.DIFF_DELETE, text_delete),
973 (self.DIFF_INSERT, text_insert)]
974 pointer = pointer - count_delete - count_insert + 1
975 if count_delete != 0:
976 pointer += 1
977 if count_insert != 0:
978 pointer += 1
979 elif pointer != 0 and diffs[pointer - 1][0] == self.DIFF_EQUAL:
980 # Merge this equality with the previous one.
981 diffs[pointer - 1] = (diffs[pointer - 1][0],
982 diffs[pointer - 1][1] + diffs[pointer][1])
983 del diffs[pointer]
984 else:
985 pointer += 1
986
987 count_insert = 0
988 count_delete = 0
989 text_delete = ''
990 text_insert = ''
991
992 if diffs[-1][1] == '':
993 diffs.pop() # Remove the dummy entry at the end.
994
995 # Second pass: look for single edits surrounded on both sides by equalities
996 # which can be shifted sideways to eliminate an equality.
997 # e.g: A<ins>BA</ins>C -> <ins>AB</ins>AC
998 changes = False
999 pointer = 1
1000 # Intentionally ignore the first and last element (don't need checking).
1001 while pointer < len(diffs) - 1:
1002 if (diffs[pointer - 1][0] == self.DIFF_EQUAL and
1003 diffs[pointer + 1][0] == self.DIFF_EQUAL):
1004 # This is a single edit surrounded by equalities.
1005 if diffs[pointer][1].endswith(diffs[pointer - 1][1]):
1006 # Shift the edit over the previous equality.
1007 diffs[pointer] = (diffs[pointer][0],
1008 diffs[pointer - 1][1] +
1009 diffs[pointer][1][:-len(diffs[pointer - 1][1])])
1010 diffs[pointer + 1] = (diffs[pointer + 1][0],
1011 diffs[pointer - 1][1] + diffs[pointer + 1][1])
1012 del diffs[pointer - 1]
1013 changes = True
1014 elif diffs[pointer][1].startswith(diffs[pointer + 1][1]):
1015 # Shift the edit over the next equality.
1016 diffs[pointer - 1] = (diffs[pointer - 1][0],
1017 diffs[pointer - 1][1] + diffs[pointer + 1][1])
1018 diffs[pointer] = (diffs[pointer][0],
1019 diffs[pointer][1][len(diffs[pointer + 1][1]):] +
1020 diffs[pointer + 1][1])
1021 del diffs[pointer + 1]
1022 changes = True
1023 pointer += 1
1024
1025 # If shifts were made, the diff needs reordering and another shift sweep.
1026 if changes:
1027 self.diff_cleanupMerge(diffs)
1028
1029 def diff_xIndex(self, diffs, loc):
1030 """loc is a location in text1, compute and return the equivalent location
1031 in text2. e.g. "The cat" vs "The big cat", 1->1, 5->8
1032
1033 Args:
1034 diffs: Array of diff tuples.
1035 loc: Location within text1.
1036
1037 Returns:
1038 Location within text2.
1039 """
1040 chars1 = 0
1041 chars2 = 0
1042 last_chars1 = 0
1043 last_chars2 = 0
1044 for x in xrange(len(diffs)):
1045 (op, text) = diffs[x]
1046 if op != self.DIFF_INSERT: # Equality or deletion.
1047 chars1 += len(text)
1048 if op != self.DIFF_DELETE: # Equality or insertion.
1049 chars2 += len(text)
1050 if chars1 > loc: # Overshot the location.
1051 break
1052 last_chars1 = chars1
1053 last_chars2 = chars2
1054
1055 if len(diffs) != x and diffs[x][0] == self.DIFF_DELETE:
1056 # The location was deleted.
1057 return last_chars2
1058 # Add the remaining character length.
1059 return last_chars2 + (loc - last_chars1)
1060
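The location translation above can be exercised standalone. A Python 3 sketch with the same accumulator logic (names are ours; the op constants mirror the class attributes):

```python
DIFF_DELETE, DIFF_INSERT, DIFF_EQUAL = -1, 1, 0

def x_index(diffs, loc):
    """Map a location in text1 to the equivalent location in text2."""
    chars1 = chars2 = last_chars1 = last_chars2 = 0
    x = 0
    for x in range(len(diffs)):
        op, text = diffs[x]
        if op != DIFF_INSERT:   # Equality or deletion advances text1.
            chars1 += len(text)
        if op != DIFF_DELETE:   # Equality or insertion advances text2.
            chars2 += len(text)
        if chars1 > loc:        # Overshot the location.
            break
        last_chars1 = chars1
        last_chars2 = chars2
    if len(diffs) != x and diffs[x][0] == DIFF_DELETE:
        # The location falls inside deleted text.
        return last_chars2
    return last_chars2 + (loc - last_chars1)

# "The cat" -> "The big cat": index 4 ('c') maps to index 8.
diffs = [(DIFF_EQUAL, "The "), (DIFF_INSERT, "big "), (DIFF_EQUAL, "cat")]
print(x_index(diffs, 4))  # -> 8
```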
1061 def diff_prettyHtml(self, diffs):
1062 """Convert a diff array into a pretty HTML report.
1063
1064 Args:
1065 diffs: Array of diff tuples.
1066
1067 Returns:
1068 HTML representation.
1069 """
1070 html = []
1071 for (op, data) in diffs:
1072 text = (data.replace("&", "&amp;").replace("<", "&lt;")
1073 .replace(">", "&gt;").replace("\n", "&para;<br>"))
1074 if op == self.DIFF_INSERT:
1075 html.append("<ins style=\"background:#e6ffe6;\">%s</ins>" % text)
1076 elif op == self.DIFF_DELETE:
1077 html.append("<del style=\"background:#ffe6e6;\">%s</del>" % text)
1078 elif op == self.DIFF_EQUAL:
1079 html.append("<span>%s</span>" % text)
1080 return "".join(html)
1081
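The HTML report escapes `&`, `<`, `>` first, then substitutes the pilcrow entity for newlines, so markup in the diffed text cannot leak into the output. A standalone Python 3 sketch of the same rendering:

```python
DIFF_DELETE, DIFF_INSERT, DIFF_EQUAL = -1, 1, 0

def diff_pretty_html(diffs):
    """Render a diff array as an HTML fragment, escaping the payload."""
    html = []
    for op, data in diffs:
        # Escape HTML metacharacters, then make newlines visible.
        text = (data.replace("&", "&amp;").replace("<", "&lt;")
                .replace(">", "&gt;").replace("\n", "&para;<br>"))
        if op == DIFF_INSERT:
            html.append('<ins style="background:#e6ffe6;">%s</ins>' % text)
        elif op == DIFF_DELETE:
            html.append('<del style="background:#ffe6e6;">%s</del>' % text)
        else:
            html.append("<span>%s</span>" % text)
    return "".join(html)
```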
1082 def diff_text1(self, diffs):
1083 """Compute and return the source text (all equalities and deletions).
1084
1085 Args:
1086 diffs: Array of diff tuples.
1087
1088 Returns:
1089 Source text.
1090 """
1091 text = []
1092 for (op, data) in diffs:
1093 if op != self.DIFF_INSERT:
1094 text.append(data)
1095 return "".join(text)
1096
1097 def diff_text2(self, diffs):
1098 """Compute and return the destination text (all equalities and insertions).
1099
1100 Args:
1101 diffs: Array of diff tuples.
1102
1103 Returns:
1104 Destination text.
1105 """
1106 text = []
1107 for (op, data) in diffs:
1108 if op != self.DIFF_DELETE:
1109 text.append(data)
1110 return "".join(text)
1111
1112 def diff_levenshtein(self, diffs):
1113 """Compute the Levenshtein distance; the number of inserted, deleted or
1114 substituted characters.
1115
1116 Args:
1117 diffs: Array of diff tuples.
1118
1119 Returns:
1120 Number of changes.
1121 """
1122 levenshtein = 0
1123 insertions = 0
1124 deletions = 0
1125 for (op, data) in diffs:
1126 if op == self.DIFF_INSERT:
1127 insertions += len(data)
1128 elif op == self.DIFF_DELETE:
1129 deletions += len(data)
1130 elif op == self.DIFF_EQUAL:
1131 # A deletion and an insertion is one substitution.
1132 levenshtein += max(insertions, deletions)
1133 insertions = 0
1134 deletions = 0
1135 levenshtein += max(insertions, deletions)
1136 return levenshtein
1137
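The Levenshtein accumulator above counts a paired deletion and insertion between two equalities as `max(insertions, deletions)` substitutions. A Python 3 sketch of the same fold (names are ours):

```python
DIFF_DELETE, DIFF_INSERT, DIFF_EQUAL = -1, 1, 0

def diff_levenshtein(diffs):
    """Levenshtein distance implied by a diff: inserted, deleted or
    substituted characters."""
    levenshtein = insertions = deletions = 0
    for op, data in diffs:
        if op == DIFF_INSERT:
            insertions += len(data)
        elif op == DIFF_DELETE:
            deletions += len(data)
        else:
            # A deletion and an insertion between equalities overlap
            # into substitutions, so count the larger of the two runs.
            levenshtein += max(insertions, deletions)
            insertions = deletions = 0
    levenshtein += max(insertions, deletions)
    return levenshtein

print(diff_levenshtein([(DIFF_DELETE, "abc"), (DIFF_INSERT, "1234"),
                        (DIFF_EQUAL, "xyz")]))  # -> 4
```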
1138 def diff_toDelta(self, diffs):
1139 """Crush the diff into an encoded string which describes the operations
1140 required to transform text1 into text2.
1141 E.g. =3\t-2\t+ing -> Keep 3 chars, delete 2 chars, insert 'ing'.
1142 Operations are tab-separated. Inserted text is escaped using %xx notation.
1143
1144 Args:
1145 diffs: Array of diff tuples.
1146
1147 Returns:
1148 Delta text.
1149 """
1150 text = []
1151 for (op, data) in diffs:
1152 if op == self.DIFF_INSERT:
1153 # High ascii will raise UnicodeDecodeError. Use Unicode instead.
1154 data = data.encode("utf-8")
1155 text.append("+" + urllib.quote(data, "!~*'();/?:@&=+$,# "))
1156 elif op == self.DIFF_DELETE:
1157 text.append("-%d" % len(data))
1158 elif op == self.DIFF_EQUAL:
1159 text.append("=%d" % len(data))
1160 return "\t".join(text)
1161
1162 def diff_fromDelta(self, text1, delta):
1163 """Given the original text1, and an encoded string which describes the
1164 operations required to transform text1 into text2, compute the full diff.
1165
1166 Args:
1167 text1: Source string for the diff.
1168 delta: Delta text.
1169
1170 Returns:
1171 Array of diff tuples.
1172
1173 Raises:
1174 ValueError: If invalid input.
1175 """
1176 if type(delta) == unicode:
1177 # Deltas should be composed of a subset of ascii chars, Unicode not
1178 # required. If this encode raises UnicodeEncodeError, delta is invalid.
1179 delta = delta.encode("ascii")
1180 diffs = []
1181 pointer = 0 # Cursor in text1
1182 tokens = delta.split("\t")
1183 for token in tokens:
1184 if token == "":
1185 # Blank tokens are ok (from a trailing \t).
1186 continue
1187 # Each token begins with a one character parameter which specifies the
1188 # operation of this token (delete, insert, equality).
1189 param = token[1:]
1190 if token[0] == "+":
1191 param = urllib.unquote(param).decode("utf-8")
1192 diffs.append((self.DIFF_INSERT, param))
1193 elif token[0] == "-" or token[0] == "=":
1194 try:
1195 n = int(param)
1196 except ValueError:
1197 raise ValueError("Invalid number in diff_fromDelta: " + param)
1198 if n < 0:
1199 raise ValueError("Negative number in diff_fromDelta: " + param)
1200 text = text1[pointer : pointer + n]
1201 pointer += n
1202 if token[0] == "=":
1203 diffs.append((self.DIFF_EQUAL, text))
1204 else:
1205 diffs.append((self.DIFF_DELETE, text))
1206 else:
1207 # Anything else is an error.
1208 raise ValueError("Invalid diff operation in diff_fromDelta: " +
1209 token[0])
1210 if pointer != len(text1):
1211 raise ValueError(
1212 "Delta length (%d) does not equal source text length (%d)." %
1213 (pointer, len(text1)))
1214 return diffs
1215
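The delta format round-trips against the source text: equalities and deletions are stored only as lengths, and inserted text is percent-escaped. A Python 3 sketch of both directions (the listing itself is Python 2 and uses `urllib.quote`; Python 3 moves these to `urllib.parse`):

```python
from urllib.parse import quote, unquote

DIFF_DELETE, DIFF_INSERT, DIFF_EQUAL = -1, 1, 0

def to_delta(diffs):
    """Encode a diff as tab-separated tokens: =N keep, -N delete, +text insert."""
    text = []
    for op, data in diffs:
        if op == DIFF_INSERT:
            text.append("+" + quote(data, safe="!~*'();/?:@&=+$,# "))
        elif op == DIFF_DELETE:
            text.append("-%d" % len(data))
        else:
            text.append("=%d" % len(data))
    return "\t".join(text)

def from_delta(text1, delta):
    """Rebuild the full diff from the source text and an encoded delta."""
    diffs = []
    pointer = 0  # Cursor in text1.
    for token in delta.split("\t"):
        if not token:
            continue  # Blank tokens are ok (from a trailing \t).
        param = token[1:]
        if token[0] == "+":
            diffs.append((DIFF_INSERT, unquote(param)))
        elif token[0] in "-=":
            n = int(param)
            if n < 0:
                raise ValueError("Negative number in delta: " + param)
            text = text1[pointer:pointer + n]
            pointer += n
            diffs.append((DIFF_EQUAL if token[0] == "=" else DIFF_DELETE, text))
        else:
            raise ValueError("Invalid diff operation in delta: " + token[0])
    if pointer != len(text1):
        raise ValueError("Delta does not cover the source text.")
    return diffs

diffs = [(DIFF_EQUAL, "jump"), (DIFF_DELETE, "s"), (DIFF_INSERT, "ed")]
delta = to_delta(diffs)
print(delta)  # -> =4	-1	+ed
```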
1216 # MATCH FUNCTIONS
1217
1218 def match_main(self, text, pattern, loc):
1219 """Locate the best instance of 'pattern' in 'text' near 'loc'.
1220
1221 Args:
1222 text: The text to search.
1223 pattern: The pattern to search for.
1224 loc: The location to search around.
1225
1226 Returns:
1227 Best match index or -1.
1228 """
1229 # Check for null inputs.
1230 if text is None or pattern is None:
1231 raise ValueError("Null inputs. (match_main)")
1232
1233 loc = max(0, min(loc, len(text)))
1234 if text == pattern:
1235 # Shortcut (potentially not guaranteed by the algorithm)
1236 return 0
1237 elif not text:
1238 # Nothing to match.
1239 return -1
1240 elif text[loc:loc + len(pattern)] == pattern:
1241 # Perfect match at the perfect spot! (Includes case of null pattern)
1242 return loc
1243 else:
1244 # Do a fuzzy compare.
1245 match = self.match_bitap(text, pattern, loc)
1246 return match
1247
1248 def match_bitap(self, text, pattern, loc):
1249 """Locate the best instance of 'pattern' in 'text' near 'loc' using the
1250 Bitap algorithm.
1251
1252 Args:
1253 text: The text to search.
1254 pattern: The pattern to search for.
1255 loc: The location to search around.
1256
1257 Returns:
1258 Best match index or -1.
1259 """
1260 # Python doesn't have a maxint limit, so ignore this check.
1261 #if self.Match_MaxBits != 0 and len(pattern) > self.Match_MaxBits:
1262 # raise ValueError("Pattern too long for this application.")
1263
1264 # Initialise the alphabet.
1265 s = self.match_alphabet(pattern)
1266
1267 def match_bitapScore(e, x):
1268 """Compute and return the score for a match with e errors and x location.
1269 Accesses loc and pattern through being a closure.
1270
1271 Args:
1272 e: Number of errors in match.
1273 x: Location of match.
1274
1275 Returns:
1276 Overall score for match (0.0 = good, 1.0 = bad).
1277 """
1278 accuracy = float(e) / len(pattern)
1279 proximity = abs(loc - x)
1280 if not self.Match_Distance:
1281 # Dodge divide by zero error.
1282 return proximity and 1.0 or accuracy
1283 return accuracy + (proximity / float(self.Match_Distance))
1284
1285 # Highest score beyond which we give up.
1286 score_threshold = self.Match_Threshold
1287 # Is there a nearby exact match? (speedup)
1288 best_loc = text.find(pattern, loc)
1289 if best_loc != -1:
1290 score_threshold = min(match_bitapScore(0, best_loc), score_threshold)
1291 # What about in the other direction? (speedup)
1292 best_loc = text.rfind(pattern, loc + len(pattern))
1293 if best_loc != -1:
1294 score_threshold = min(match_bitapScore(0, best_loc), score_threshold)
1295
1296 # Initialise the bit arrays.
1297 matchmask = 1 << (len(pattern) - 1)
1298 best_loc = -1
1299
1300 bin_max = len(pattern) + len(text)
1301 # Empty initialization added to appease pychecker.
1302 last_rd = None
1303 for d in xrange(len(pattern)):
1304 # Scan for the best match each iteration allows for one more error.
1305 # Run a binary search to determine how far from 'loc' we can stray at
1306 # this error level.
1307 bin_min = 0
1308 bin_mid = bin_max
1309 while bin_min < bin_mid:
1310 if match_bitapScore(d, loc + bin_mid) <= score_threshold:
1311 bin_min = bin_mid
1312 else:
1313 bin_max = bin_mid
1314 bin_mid = (bin_max - bin_min) // 2 + bin_min
1315
1316 # Use the result from this iteration as the maximum for the next.
1317 bin_max = bin_mid
1318 start = max(1, loc - bin_mid + 1)
1319 finish = min(loc + bin_mid, len(text)) + len(pattern)
1320
1321 rd = [0] * (finish + 2)
1322 rd[finish + 1] = (1 << d) - 1
1323 for j in xrange(finish, start - 1, -1):
1324 if len(text) <= j - 1:
1325 # Out of range.
1326 charMatch = 0
1327 else:
1328 charMatch = s.get(text[j - 1], 0)
1329 if d == 0: # First pass: exact match.
1330 rd[j] = ((rd[j + 1] << 1) | 1) & charMatch
1331 else: # Subsequent passes: fuzzy match.
1332 rd[j] = (((rd[j + 1] << 1) | 1) & charMatch) | (
1333 ((last_rd[j + 1] | last_rd[j]) << 1) | 1) | last_rd[j + 1]
1334 if rd[j] & matchmask:
1335 score = match_bitapScore(d, j - 1)
1336 # This match will almost certainly be better than any existing match.
1337 # But check anyway.
1338 if score <= score_threshold:
1339 # Told you so.
1340 score_threshold = score
1341 best_loc = j - 1
1342 if best_loc > loc:
1343 # When passing loc, don't exceed our current distance from loc.
1344 start = max(1, 2 * loc - best_loc)
1345 else:
1346 # Already passed loc, downhill from here on in.
1347 break
1348 # No hope for a (better) match at greater error levels.
1349 if match_bitapScore(d + 1, loc) > score_threshold:
1350 break
1351 last_rd = rd
1352 return best_loc
1353
1354 def match_alphabet(self, pattern):
1355 """Initialise the alphabet for the Bitap algorithm.
1356
1357 Args:
1358 pattern: The text to encode.
1359
1360 Returns:
1361 Hash of character locations.
1362 """
1363 s = {}
1364 for char in pattern:
1365 s[char] = 0
1366 for i in xrange(len(pattern)):
1367 s[pattern[i]] |= 1 << (len(pattern) - i - 1)
1368 return s
1369
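The alphabet table maps each character of the pattern to a bitmask of its positions, with bit `len(pattern) - i - 1` set for an occurrence at index `i`; the Bitap scan above ANDs these masks against its state vector. A standalone Python 3 sketch:

```python
def match_alphabet(pattern):
    """Bitmask of positions per character, as used by the Bitap scan.
    Bit (len(pattern) - i - 1) is set when pattern[i] is that character."""
    s = {}
    for char in pattern:
        s[char] = 0
    for i in range(len(pattern)):
        s[pattern[i]] |= 1 << (len(pattern) - i - 1)
    return s

print(match_alphabet("abc"))  # -> {'a': 4, 'b': 2, 'c': 1}
```

Repeated characters OR their position bits together, so every occurrence is represented in one mask.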
1370 # PATCH FUNCTIONS
1371
1372 def patch_addContext(self, patch, text):
1373 """Increase the context until it is unique,
1374 but don't let the pattern expand beyond Match_MaxBits.
1375
1376 Args:
1377 patch: The patch to grow.
1378 text: Source text.
1379 """
1380 if len(text) == 0:
1381 return
1382 pattern = text[patch.start2 : patch.start2 + patch.length1]
1383 padding = 0
1384
1385 # Look for the first and last matches of pattern in text. If two different
1386 # matches are found, increase the pattern length.
1387 while (text.find(pattern) != text.rfind(pattern) and (self.Match_MaxBits ==
1388 0 or len(pattern) < self.Match_MaxBits - self.Patch_Margin -
1389 self.Patch_Margin)):
1390 padding += self.Patch_Margin
1391 pattern = text[max(0, patch.start2 - padding) :
1392 patch.start2 + patch.length1 + padding]
1393 # Add one chunk for good luck.
1394 padding += self.Patch_Margin
1395
1396 # Add the prefix.
1397 prefix = text[max(0, patch.start2 - padding) : patch.start2]
1398 if prefix:
1399 patch.diffs[:0] = [(self.DIFF_EQUAL, prefix)]
1400 # Add the suffix.
1401 suffix = text[patch.start2 + patch.length1 :
1402 patch.start2 + patch.length1 + padding]
1403 if suffix:
1404 patch.diffs.append((self.DIFF_EQUAL, suffix))
1405
1406 # Roll back the start points.
1407 patch.start1 -= len(prefix)
1408 patch.start2 -= len(prefix)
1409 # Extend lengths.
1410 patch.length1 += len(prefix) + len(suffix)
1411 patch.length2 += len(prefix) + len(suffix)
1412
1413 def patch_make(self, a, b=None, c=None):
1414 """Compute a list of patches to turn text1 into text2.
1415 Use diffs if provided, otherwise compute it ourselves.
1416 There are four ways to call this function, depending on what data is
1417 available to the caller:
1418 Method 1:
1419 a = text1, b = text2
1420 Method 2:
1421 a = diffs
1422 Method 3 (optimal):
1423 a = text1, b = diffs
1424 Method 4 (deprecated, use method 3):
1425 a = text1, b = text2, c = diffs
1426
1427 Args:
1428 a: text1 (methods 1,3,4) or Array of diff tuples for text1 to
1429 text2 (method 2).
1430 b: text2 (methods 1,4) or Array of diff tuples for text1 to
1431 text2 (method 3) or undefined (method 2).
1432 c: Array of diff tuples for text1 to text2 (method 4) or
1433 undefined (methods 1,2,3).
1434
1435 Returns:
1436 Array of Patch objects.
1437 """
1438 text1 = None
1439 diffs = None
1440 # Note that texts may arrive as 'str' or 'unicode'.
1441 if isinstance(a, basestring) and isinstance(b, basestring) and c is None:
1442 # Method 1: text1, text2
1443 # Compute diffs from text1 and text2.
1444 text1 = a
1445 diffs = self.diff_main(text1, b, True)
1446 if len(diffs) > 2:
1447 self.diff_cleanupSemantic(diffs)
1448 self.diff_cleanupEfficiency(diffs)
1449 elif isinstance(a, list) and b is None and c is None:
1450 # Method 2: diffs
1451 # Compute text1 from diffs.
1452 diffs = a
1453 text1 = self.diff_text1(diffs)
1454 elif isinstance(a, basestring) and isinstance(b, list) and c is None:
1455 # Method 3: text1, diffs
1456 text1 = a
1457 diffs = b
1458 elif (isinstance(a, basestring) and isinstance(b, basestring) and
1459 isinstance(c, list)):
1460 # Method 4: text1, text2, diffs
1461 # text2 is not used.
1462 text1 = a
1463 diffs = c
1464 else:
1465 raise ValueError("Unknown call format to patch_make.")
1466
1467 if not diffs:
1468 return [] # Get rid of the None case.
1469 patches = []
1470 patch = patch_obj()
1471 char_count1 = 0 # Number of characters into the text1 string.
1472 char_count2 = 0 # Number of characters into the text2 string.
1473 prepatch_text = text1 # Recreate the patches to determine context info.
1474 postpatch_text = text1
1475 for x in xrange(len(diffs)):
1476 (diff_type, diff_text) = diffs[x]
1477 if len(patch.diffs) == 0 and diff_type != self.DIFF_EQUAL:
1478 # A new patch starts here.
1479 patch.start1 = char_count1
1480 patch.start2 = char_count2
1481 if diff_type == self.DIFF_INSERT:
1482 # Insertion
1483 patch.diffs.append(diffs[x])
1484 patch.length2 += len(diff_text)
1485 postpatch_text = (postpatch_text[:char_count2] + diff_text +
1486 postpatch_text[char_count2:])
1487 elif diff_type == self.DIFF_DELETE:
1488 # Deletion.
1489 patch.length1 += len(diff_text)
1490 patch.diffs.append(diffs[x])
1491 postpatch_text = (postpatch_text[:char_count2] +
1492 postpatch_text[char_count2 + len(diff_text):])
1493 elif (diff_type == self.DIFF_EQUAL and
1494 len(diff_text) <= 2 * self.Patch_Margin and
1495 len(patch.diffs) != 0 and len(diffs) != x + 1):
1496 # Small equality inside a patch.
1497 patch.diffs.append(diffs[x])
1498 patch.length1 += len(diff_text)
1499 patch.length2 += len(diff_text)
1500
1501 if (diff_type == self.DIFF_EQUAL and
1502 len(diff_text) >= 2 * self.Patch_Margin):
1503 # Time for a new patch.
1504 if len(patch.diffs) != 0:
1505 self.patch_addContext(patch, prepatch_text)
1506 patches.append(patch)
1507 patch = patch_obj()
1508 # Unlike Unidiff, our patch lists have a rolling context.
1509 # http://code.google.com/p/google-diff-match-patch/wiki/Unidiff
1510 # Update prepatch text & pos to reflect the application of the
1511 # just completed patch.
1512 prepatch_text = postpatch_text
1513 char_count1 = char_count2
1514
1515 # Update the current character count.
1516 if diff_type != self.DIFF_INSERT:
1517 char_count1 += len(diff_text)
1518 if diff_type != self.DIFF_DELETE:
1519 char_count2 += len(diff_text)
1520
1521 # Pick up the leftover patch if not empty.
1522 if len(patch.diffs) != 0:
1523 self.patch_addContext(patch, prepatch_text)
1524 patches.append(patch)
1525 return patches
1526
1527 def patch_deepCopy(self, patches):
1528 """Given an array of patches, return another array that is identical.
1529
1530 Args:
1531 patches: Array of Patch objects.
1532
1533 Returns:
1534 Array of Patch objects.
1535 """
1536 patchesCopy = []
1537 for patch in patches:
1538 patchCopy = patch_obj()
1539 # No need to deep copy the tuples since they are immutable.
1540 patchCopy.diffs = patch.diffs[:]
1541 patchCopy.start1 = patch.start1
1542 patchCopy.start2 = patch.start2
1543 patchCopy.length1 = patch.length1
1544 patchCopy.length2 = patch.length2
1545 patchesCopy.append(patchCopy)
1546 return patchesCopy
1547
1548 def patch_apply(self, patches, text):
1549 """Merge a set of patches onto the text. Return a patched text, as well
1550 as a list of true/false values indicating which patches were applied.
1551
1552 Args:
1553 patches: Array of Patch objects.
1554 text: Old text.
1555
1556 Returns:
1557 Two element Array, containing the new text and an array of boolean values.
1558 """
1559 if not patches:
1560 return (text, [])
1561
1562 # Deep copy the patches so that no changes are made to originals.
1563 patches = self.patch_deepCopy(patches)
1564
1565 nullPadding = self.patch_addPadding(patches)
1566 text = nullPadding + text + nullPadding
1567 self.patch_splitMax(patches)
1568
1569 # delta keeps track of the offset between the expected and actual location
1570 # of the previous patch. If there are patches expected at positions 10 and
1571 # 20, but the first patch was found at 12, delta is 2 and the second patch
1572 # has an effective expected position of 22.
1573 delta = 0
1574 results = []
1575 for patch in patches:
1576 expected_loc = patch.start2 + delta
1577 text1 = self.diff_text1(patch.diffs)
1578 end_loc = -1
1579 if len(text1) > self.Match_MaxBits:
1580 # patch_splitMax will only provide an oversized pattern in the case of
1581 # a monster delete.
1582 start_loc = self.match_main(text, text1[:self.Match_MaxBits],
1583 expected_loc)
1584 if start_loc != -1:
1585 end_loc = self.match_main(text, text1[-self.Match_MaxBits:],
1586 expected_loc + len(text1) - self.Match_MaxBits)
1587 if end_loc == -1 or start_loc >= end_loc:
1588 # Can't find valid trailing context. Drop this patch.
1589 start_loc = -1
1590 else:
1591 start_loc = self.match_main(text, text1, expected_loc)
1592 if start_loc == -1:
1593 # No match found. :(
1594 results.append(False)
1595 # Subtract the delta for this failed patch from subsequent patches.
1596 delta -= patch.length2 - patch.length1
1597 else:
1598 # Found a match. :)
1599 results.append(True)
1600 delta = start_loc - expected_loc
1601 if end_loc == -1:
1602 text2 = text[start_loc : start_loc + len(text1)]
1603 else:
1604 text2 = text[start_loc : end_loc + self.Match_MaxBits]
1605 if text1 == text2:
1606 # Perfect match, just shove the replacement text in.
1607 text = (text[:start_loc] + self.diff_text2(patch.diffs) +
1608 text[start_loc + len(text1):])
1609 else:
1610 # Imperfect match.
1611 # Run a diff to get a framework of equivalent indices.
1612 diffs = self.diff_main(text1, text2, False)
1613 if (len(text1) > self.Match_MaxBits and
1614 self.diff_levenshtein(diffs) / float(len(text1)) >
1615 self.Patch_DeleteThreshold):
1616 # The end points match, but the content is unacceptably bad.
1617 results[-1] = False
1618 else:
1619 self.diff_cleanupSemanticLossless(diffs)
1620 index1 = 0
1621 for (op, data) in patch.diffs:
1622 if op != self.DIFF_EQUAL:
1623 index2 = self.diff_xIndex(diffs, index1)
1624 if op == self.DIFF_INSERT: # Insertion
1625 text = text[:start_loc + index2] + data + text[start_loc +
1626 index2:]
1627 elif op == self.DIFF_DELETE: # Deletion
1628 text = text[:start_loc + index2] + text[start_loc +
1629 self.diff_xIndex(diffs, index1 + len(data)):]
1630 if op != self.DIFF_DELETE:
1631 index1 += len(data)
1632 # Strip the padding off.
1633 text = text[len(nullPadding):-len(nullPadding)]
1634 return (text, results)
1635
1636 def patch_addPadding(self, patches):
1637 """Add some padding on text start and end so that edges can match
1638 something. Intended to be called only from within patch_apply.
1639
1640 Args:
1641 patches: Array of Patch objects.
1642
1643 Returns:
1644 The padding string added to each side.
1645 """
1646 paddingLength = self.Patch_Margin
1647 nullPadding = ""
1648 for x in xrange(1, paddingLength + 1):
1649 nullPadding += chr(x)
1650
1651 # Bump all the patches forward.
1652 for patch in patches:
1653 patch.start1 += paddingLength
1654 patch.start2 += paddingLength
1655
1656 # Add some padding on start of first diff.
1657 patch = patches[0]
1658 diffs = patch.diffs
1659 if not diffs or diffs[0][0] != self.DIFF_EQUAL:
1660 # Add nullPadding equality.
1661 diffs.insert(0, (self.DIFF_EQUAL, nullPadding))
1662 patch.start1 -= paddingLength # Should be 0.
1663 patch.start2 -= paddingLength # Should be 0.
1664 patch.length1 += paddingLength
1665 patch.length2 += paddingLength
1666 elif paddingLength > len(diffs[0][1]):
1667 # Grow first equality.
1668 extraLength = paddingLength - len(diffs[0][1])
1669 newText = nullPadding[len(diffs[0][1]):] + diffs[0][1]
1670 diffs[0] = (diffs[0][0], newText)
1671 patch.start1 -= extraLength
1672 patch.start2 -= extraLength
1673 patch.length1 += extraLength
1674 patch.length2 += extraLength
1675
1676 # Add some padding on end of last diff.
1677 patch = patches[-1]
1678 diffs = patch.diffs
1679 if not diffs or diffs[-1][0] != self.DIFF_EQUAL:
1680 # Add nullPadding equality.
1681 diffs.append((self.DIFF_EQUAL, nullPadding))
1682 patch.length1 += paddingLength
1683 patch.length2 += paddingLength
1684 elif paddingLength > len(diffs[-1][1]):
1685 # Grow last equality.
1686 extraLength = paddingLength - len(diffs[-1][1])
1687 newText = diffs[-1][1] + nullPadding[:extraLength]
1688 diffs[-1] = (diffs[-1][0], newText)
1689 patch.length1 += extraLength
1690 patch.length2 += extraLength
1691
1692 return nullPadding
1693
1694 def patch_splitMax(self, patches):
1695 """Look through the patches and break up any which are longer than the
1696 maximum limit of the match algorithm.
1697 Intended to be called only from within patch_apply.
1698
1699 Args:
1700 patches: Array of Patch objects.
1701 """
1702 patch_size = self.Match_MaxBits
1703 if patch_size == 0:
1704 # Python has the option of not splitting strings due to its ability
1705 # to handle integers of arbitrary precision.
1706 return
1707 for x in xrange(len(patches)):
1708 if patches[x].length1 <= patch_size:
1709 continue
1710 bigpatch = patches[x]
1711 # Remove the big old patch.
1712 del patches[x]
1713 x -= 1
1714 start1 = bigpatch.start1
1715 start2 = bigpatch.start2
1716 precontext = ''
1717 while len(bigpatch.diffs) != 0:
1718 # Create one of several smaller patches.
1719 patch = patch_obj()
1720 empty = True
1721 patch.start1 = start1 - len(precontext)
1722 patch.start2 = start2 - len(precontext)
1723 if precontext:
1724 patch.length1 = patch.length2 = len(precontext)
1725 patch.diffs.append((self.DIFF_EQUAL, precontext))
1726
1727 while (len(bigpatch.diffs) != 0 and
1728 patch.length1 < patch_size - self.Patch_Margin):
1729 (diff_type, diff_text) = bigpatch.diffs[0]
1730 if diff_type == self.DIFF_INSERT:
1731 # Insertions are harmless.
1732 patch.length2 += len(diff_text)
1733 start2 += len(diff_text)
1734 patch.diffs.append(bigpatch.diffs.pop(0))
1735 empty = False
1736 elif (diff_type == self.DIFF_DELETE and len(patch.diffs) == 1 and
1737 patch.diffs[0][0] == self.DIFF_EQUAL and
1738 len(diff_text) > 2 * patch_size):
1739 # This is a large deletion. Let it pass in one chunk.
1740 patch.length1 += len(diff_text)
1741 start1 += len(diff_text)
1742 empty = False
1743 patch.diffs.append((diff_type, diff_text))
1744 del bigpatch.diffs[0]
1745 else:
1746 # Deletion or equality. Only take as much as we can stomach.
1747 diff_text = diff_text[:patch_size - patch.length1 -
1748 self.Patch_Margin]
1749 patch.length1 += len(diff_text)
1750 start1 += len(diff_text)
1751 if diff_type == self.DIFF_EQUAL:
1752 patch.length2 += len(diff_text)
1753 start2 += len(diff_text)
1754 else:
1755 empty = False
1756
1757 patch.diffs.append((diff_type, diff_text))
1758 if diff_text == bigpatch.diffs[0][1]:
1759 del bigpatch.diffs[0]
1760 else:
1761 bigpatch.diffs[0] = (bigpatch.diffs[0][0],
1762 bigpatch.diffs[0][1][len(diff_text):])
1763
1764 # Compute the head context for the next patch.
1765 precontext = self.diff_text2(patch.diffs)
1766 precontext = precontext[-self.Patch_Margin:]
1767 # Append the end context for this patch.
1768 postcontext = self.diff_text1(bigpatch.diffs)[:self.Patch_Margin]
1769 if postcontext:
1770 patch.length1 += len(postcontext)
1771 patch.length2 += len(postcontext)
1772 if len(patch.diffs) != 0 and patch.diffs[-1][0] == self.DIFF_EQUAL:
1773 patch.diffs[-1] = (self.DIFF_EQUAL, patch.diffs[-1][1] +
1774 postcontext)
1775 else:
1776 patch.diffs.append((self.DIFF_EQUAL, postcontext))
1777
1778 if not empty:
1779 x += 1
1780 patches.insert(x, patch)
1781
1782 def patch_toText(self, patches):
1783 """Take a list of patches and return a textual representation.
1784
1785 Args:
1786 patches: Array of Patch objects.
1787
1788 Returns:
1789 Text representation of patches.
1790 """
1791 text = []
1792 for patch in patches:
1793 text.append(str(patch))
1794 return "".join(text)
1795
1796 def patch_fromText(self, textline):
1797 """Parse a textual representation of patches and return a list of patch
1798 objects.
1799
1800 Args:
1801 textline: Text representation of patches.
1802
1803 Returns:
1804 Array of Patch objects.
1805
1806 Raises:
1807 ValueError: If invalid input.
1808 """
1809 if type(textline) == unicode:
1810 # Patches should be composed of a subset of ascii chars, Unicode not
1811 # required. If this encode raises UnicodeEncodeError, patch is invalid.
1812 textline = textline.encode("ascii")
1813 patches = []
1814 if not textline:
1815 return patches
1816 text = textline.split('\n')
1817 while len(text) != 0:
1818 m = re.match("^@@ -(\d+),?(\d*) \+(\d+),?(\d*) @@$", text[0])
1819 if not m:
1820 raise ValueError("Invalid patch string: " + text[0])
1821 patch = patch_obj()
1822 patches.append(patch)
1823 patch.start1 = int(m.group(1))
1824 if m.group(2) == '':
1825 patch.start1 -= 1
1826 patch.length1 = 1
1827 elif m.group(2) == '0':
1828 patch.length1 = 0
1829 else:
1830 patch.start1 -= 1
1831 patch.length1 = int(m.group(2))
1832
1833 patch.start2 = int(m.group(3))
1834 if m.group(4) == '':
1835 patch.start2 -= 1
1836 patch.length2 = 1
1837 elif m.group(4) == '0':
1838 patch.length2 = 0
1839 else:
1840 patch.start2 -= 1
1841 patch.length2 = int(m.group(4))
1842
1843 del text[0]
1844
1845 while len(text) != 0:
1846 if text[0]:
1847 sign = text[0][0]
1848 else:
1849 sign = ''
1850 line = urllib.unquote(text[0][1:])
1851 line = line.decode("utf-8")
1852 if sign == '+':
1853 # Insertion.
1854 patch.diffs.append((self.DIFF_INSERT, line))
1855 elif sign == '-':
1856 # Deletion.
1857 patch.diffs.append((self.DIFF_DELETE, line))
1858 elif sign == ' ':
1859 # Minor equality.
1860 patch.diffs.append((self.DIFF_EQUAL, line))
1861 elif sign == '@':
1862 # Start of next patch.
1863 break
1864 elif sign == '':
1865 # Blank line? Whatever.
1866 pass
1867 else:
1868 # WTF?
1869 raise ValueError("Invalid patch mode: '%s'\n%s" % (sign, line))
1870 del text[0]
1871 return patches
1872
1873
1874 class patch_obj:
1875 """Class representing one patch operation.
1876 """
1877
1878 def __init__(self):
1879 """Initializes with an empty list of diffs.
1880 """
1881 self.diffs = []
1882 self.start1 = None
1883 self.start2 = None
1884 self.length1 = 0
1885 self.length2 = 0
1886
1887 def __str__(self):
1888 """Emmulate GNU diff's format.
1889 Header: @@ -382,8 +481,9 @@
1890 Indices are printed as 1-based, not 0-based.
1891
1892 Returns:
1893 The GNU diff string.
1894 """
1895 if self.length1 == 0:
1896 coords1 = str(self.start1) + ",0"
1897 elif self.length1 == 1:
1898 coords1 = str(self.start1 + 1)
1899 else:
1900 coords1 = str(self.start1 + 1) + "," + str(self.length1)
1901 if self.length2 == 0:
1902 coords2 = str(self.start2) + ",0"
1903 elif self.length2 == 1:
1904 coords2 = str(self.start2 + 1)
1905 else:
1906 coords2 = str(self.start2 + 1) + "," + str(self.length2)
1907 text = ["@@ -", coords1, " +", coords2, " @@\n"]
1908 # Escape the body of the patch with %xx notation.
1909 for (op, data) in self.diffs:
1910 if op == diff_match_patch.DIFF_INSERT:
1911 text.append("+")
1912 elif op == diff_match_patch.DIFF_DELETE:
1913 text.append("-")
1914 elif op == diff_match_patch.DIFF_EQUAL:
1915 text.append(" ")
1916 # High ascii will raise UnicodeDecodeError. Use Unicode instead.
1917 data = data.encode("utf-8")
1918 text.append(urllib.quote(data, "!~*'();/?:@&=+$,# ") + "\n")
1919 return "".join(text) No newline at end of file
@@ -0,0 +1,99 b''
1 # -*- coding: utf-8 -*-
2
3 # Copyright (C) 2016-2016 RhodeCode GmbH
4 #
5 # This program is free software: you can redistribute it and/or modify
6 # it under the terms of the GNU Affero General Public License, version 3
7 # (only), as published by the Free Software Foundation.
8 #
9 # This program is distributed in the hope that it will be useful,
10 # but WITHOUT ANY WARRANTY; without even the implied warranty of
11 # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
12 # GNU General Public License for more details.
13 #
14 # You should have received a copy of the GNU Affero General Public License
15 # along with this program. If not, see <http://www.gnu.org/licenses/>.
16 #
17 # This program is dual-licensed. If you wish to learn more about the
18 # RhodeCode Enterprise Edition, including its added features, Support services,
19 # and proprietary license terms, please see https://rhodecode.com/licenses/
20
21
22 import logging
23 from pyramid import httpexceptions
24 from pyramid.httpexceptions import HTTPError, HTTPInternalServerError
25 from pyramid.threadlocal import get_current_request
26
27 from rhodecode.lib.exceptions import VCSServerUnavailable
28 from rhodecode.lib.vcs.exceptions import VCSCommunicationError
29
30
31 log = logging.getLogger(__name__)
32
33
34 class PylonsErrorHandlingMiddleware(object):
35 """
36 This middleware is wrapped around the old pylons application to catch
37 errors and invoke our error handling view to return a proper error page.
38 """
39 def __init__(self, app, error_view, reraise=False):
40 self.app = app
41 self.error_view = error_view
42 self._reraise = reraise
43
44 def __call__(self, environ, start_response):
45 # We need to use the pyramid request here instead of creating a custom
46 # instance from the environ because this request may be passed to the
47 # error handler view which is a pyramid view and expects a pyramid
48 # request which has been processed by the pyramid router.
49 request = get_current_request()
50
51 response = self.handle_request(request)
52 return response(environ, start_response)
53
54 def is_http_error(self, response):
55 # webob type error responses
56 return (400 <= response.status_int <= 599)
57
58 def reraise(self):
59 return self._reraise
60
61 def handle_request(self, request):
62 """
63 Calls the underlying WSGI app (typically the old RhodeCode pylons app)
64 and returns the response if no error happened. In case of an error it
65 invokes the error handling view to return a proper error page as
66 response.
67
68 - old webob type exceptions get converted to pyramid exceptions
69 - pyramid exceptions are passed to the error handler view
70 """
71 try:
72 response = request.get_response(self.app)
73 if self.is_http_error(response):
74 response = webob_to_pyramid_http_response(response)
75 return self.error_view(response, request)
76 except HTTPError as e: # pyramid type exceptions
77 return self.error_view(e, request)
78 except Exception as e:
79 log.exception(e)
80
81 if self.reraise():
82 raise
83
84 if isinstance(e, VCSCommunicationError):
85 return self.error_view(VCSServerUnavailable(), request)
86
87 return self.error_view(HTTPInternalServerError(), request)
88
89 return response
90
91
92 def webob_to_pyramid_http_response(webob_response):
93 ResponseClass = httpexceptions.status_map[webob_response.status_int]
94 pyramid_response = ResponseClass(webob_response.status)
95 pyramid_response.status = webob_response.status
96 pyramid_response.headers.update(webob_response.headers)
97 if pyramid_response.headers['content-type'] == 'text/html':
98 pyramid_response.headers['content-type'] = 'text/html; charset=UTF-8'
99 return pyramid_response
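The control flow of `PylonsErrorHandlingMiddleware` can be sketched without pyramid or webob: wrap an app, route any 4xx/5xx response through the error view, and fall back to a 500-style error page on unexpected exceptions unless `reraise` is set. The `ErrorHandlingApp` class and the status-int app callables below are toy stand-ins for illustration, not the real WSGI types:

```python
def is_http_error(status_int):
    # Same check as PylonsErrorHandlingMiddleware.is_http_error:
    # any 4xx or 5xx status is routed to the error view.
    return 400 <= status_int <= 599

class ErrorHandlingApp(object):
    """Toy stand-in: the wrapped app returns a status int instead of
    a full response object."""
    def __init__(self, app, error_view, reraise=False):
        self.app = app
        self.error_view = error_view
        self.reraise = reraise

    def handle(self, request):
        try:
            status = self.app(request)
            if is_http_error(status):
                return self.error_view(status, request)
            return 'ok:%d' % status
        except Exception:
            if self.reraise:
                raise
            # Unexpected failure: serve a generic 500-style page.
            return self.error_view(500, request)

error_view = lambda status, request: 'error-page:%d' % status
app = ErrorHandlingApp(lambda req: 404 if req == '/missing' else 200, error_view)
boom = ErrorHandlingApp(lambda req: 1 // 0, error_view)

assert app.handle('/') == 'ok:200'
assert app.handle('/missing') == 'error-page:404'
assert boom.handle('/') == 'error-page:500'
```

In the real middleware the error branch additionally distinguishes `VCSCommunicationError` (mapped to `VCSServerUnavailable`) from other exceptions before falling back to `HTTPInternalServerError`.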
@@ -0,0 +1,640 b''
1 import os
2 import sys
3 import time
4 import platform
5 import pkg_resources
6 import logging
7 import string
8
9
10 log = logging.getLogger(__name__)
11
12
13 psutil = None
14
15 try:
16 # cygwin does not yet support psutil.
17 import psutil as psutil
18 except ImportError:
19 pass
20
21
22 _NA = 'NOT AVAILABLE'
23
24 STATE_OK = 'ok'
25 STATE_ERR = 'error'
26 STATE_WARN = 'warning'
27
28 STATE_OK_DEFAULT = {'message': '', 'type': STATE_OK}
29
30
31 # HELPERS
32 def percentage(part, whole):
33 whole = float(whole)
34 if whole > 0:
35 return round(100 * float(part) / whole, 1)
36 return 0.0
37
38
39 def get_storage_size(storage_path):
40 sizes = []
41 for file_ in os.listdir(storage_path):
42 storage_file = os.path.join(storage_path, file_)
43 if os.path.isfile(storage_file):
44 try:
45 sizes.append(os.path.getsize(storage_file))
46 except OSError:
47 log.exception('Failed to get size of storage file %s',
48 storage_file)
49 pass
50
51 return sum(sizes)
52
53
54 class SysInfoRes(object):
55 def __init__(self, value, state=STATE_OK_DEFAULT, human_value=None):
56 self.value = value
57 self.state = state
58 self.human_value = human_value or value
59
60 def __json__(self):
61 return {
62 'value': self.value,
63 'state': self.state,
64 'human_value': self.human_value,
65 }
66
67 def __str__(self):
68 return '<SysInfoRes({})>'.format(self.__json__())
69
70
71 class SysInfo(object):
72
73 def __init__(self, func_name, **kwargs):
74 self.func_name = func_name
75 self.value = _NA
76 self.state = None
77 self.kwargs = kwargs or {}
78
79 def __call__(self):
80 computed = self.compute(**self.kwargs)
81 if not isinstance(computed, SysInfoRes):
82 raise ValueError(
83 'computed value for {} is not instance of '
84 '{}, got {} instead'.format(
85 self.func_name, SysInfoRes, type(computed)))
86 return computed.__json__()
87
88 def __str__(self):
89 return '<SysInfo({})>'.format(self.func_name)
90
91 def compute(self, **kwargs):
92 return self.func_name(**kwargs)
93
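The `SysInfo`/`SysInfoRes` pair above implements a small contract: every metric function must hand back a `SysInfoRes`, which the wrapper then serializes via `__json__`, raising `ValueError` for anything else. A condensed sketch of that contract (`run_sysinfo` and `cpu_stub` are hypothetical names for illustration):

```python
STATE_OK = 'ok'
STATE_OK_DEFAULT = {'message': '', 'type': STATE_OK}

class SysInfoRes(object):
    # Same shape as the class above: raw value, state dict, display value.
    def __init__(self, value, state=STATE_OK_DEFAULT, human_value=None):
        self.value = value
        self.state = state
        self.human_value = human_value or value

    def __json__(self):
        return {'value': self.value, 'state': self.state,
                'human_value': self.human_value}

def run_sysinfo(func, **kwargs):
    # Mirrors SysInfo.__call__: the metric function must return a
    # SysInfoRes, which is then serialized to a plain dict.
    computed = func(**kwargs)
    if not isinstance(computed, SysInfoRes):
        raise ValueError('%s did not return a SysInfoRes' % func.__name__)
    return computed.__json__()

def cpu_stub():
    return SysInfoRes(value=12.5, human_value='12.5 %')

result = run_sysinfo(cpu_stub)
assert result['human_value'] == '12.5 %'
assert result['state']['type'] == STATE_OK
```

The type check is what keeps the JSON shape uniform across all the metric functions that follow.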
94
95 # SysInfo functions
96 def python_info():
97 value = dict(version=' '.join(platform._sys_version()),
98 executable=sys.executable)
99 return SysInfoRes(value=value)
100
101
102 def py_modules():
103 mods = dict([(p.project_name, p.version)
104 for p in pkg_resources.working_set])
105 value = sorted(mods.items(), key=lambda k: k[0].lower())
106 return SysInfoRes(value=value)
107
108
109 def platform_type():
110 from rhodecode.lib.utils import safe_unicode, generate_platform_uuid
111
112 value = dict(
113 name=safe_unicode(platform.platform()),
114 uuid=generate_platform_uuid()
115 )
116 return SysInfoRes(value=value)
117
118
119 def uptime():
120 from rhodecode.lib.helpers import age, time_to_datetime
121
122 value = dict(boot_time=0, uptime=0, text='')
123 state = STATE_OK_DEFAULT
124 if not psutil:
125 return SysInfoRes(value=value, state=state)
126
127 boot_time = psutil.boot_time()
128 value['boot_time'] = boot_time
129 value['uptime'] = time.time() - boot_time
130
131 human_value = value.copy()
132 human_value['boot_time'] = time_to_datetime(boot_time)
133 human_value['uptime'] = age(time_to_datetime(boot_time), show_suffix=False)
134 human_value['text'] = 'Server started {}'.format(
135 age(time_to_datetime(boot_time)))
136
137 return SysInfoRes(value=value, human_value=human_value)
138
139
140 def memory():
141 from rhodecode.lib.helpers import format_byte_size_binary
142 value = dict(available=0, used=0, used_real=0, cached=0, percent=0,
143 percent_used=0, free=0, inactive=0, active=0, shared=0,
144 total=0, buffers=0, text='')
145
146 state = STATE_OK_DEFAULT
147 if not psutil:
148 return SysInfoRes(value=value, state=state)
149
150 value.update(dict(psutil.virtual_memory()._asdict()))
151 value['used_real'] = value['total'] - value['available']
152 value['percent_used'] = psutil._common.usage_percent(
153 value['used_real'], value['total'], 1)
154
155 human_value = value.copy()
156 human_value['text'] = '%s/%s, %s%% used' % (
157 format_byte_size_binary(value['used_real']),
158 format_byte_size_binary(value['total']),
159 value['percent_used'],)
160
161 keys = value.keys()[::]
162 keys.pop(keys.index('percent'))
163 keys.pop(keys.index('percent_used'))
164 keys.pop(keys.index('text'))
165 for k in keys:
166 human_value[k] = format_byte_size_binary(value[k])
167
168 if state['type'] == STATE_OK and value['percent_used'] > 90:
169 msg = 'Critical: your available RAM memory is very low.'
170 state = {'message': msg, 'type': STATE_ERR}
171
172 elif state['type'] == STATE_OK and value['percent_used'] > 70:
173 msg = 'Warning: your available RAM memory is running low.'
174 state = {'message': msg, 'type': STATE_WARN}
175
176 return SysInfoRes(value=value, state=state, human_value=human_value)
177
178
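The `memory()` check above, and the `storage()`/`storage_inodes()` checks below, all share the same two-threshold state policy: above 90% usage is an error, above 70% a warning, otherwise the state stays OK. A minimal sketch of that policy (the `classify_usage` helper name and generic messages are illustrative, not part of the module):

```python
STATE_OK = 'ok'
STATE_ERR = 'error'
STATE_WARN = 'warning'

def classify_usage(percent_used, warn_at=70, err_at=90):
    # Same two-threshold policy as memory()/storage(): check the
    # critical threshold first, then the warning one.
    if percent_used > err_at:
        return {'message': 'Critical: usage is very high.', 'type': STATE_ERR}
    if percent_used > warn_at:
        return {'message': 'Warning: usage is high.', 'type': STATE_WARN}
    return {'message': '', 'type': STATE_OK}

assert classify_usage(95)['type'] == STATE_ERR
assert classify_usage(75)['type'] == STATE_WARN
assert classify_usage(40)['type'] == STATE_OK
```

Checking the error threshold before the warning threshold matters: testing `> 70` first would shadow the critical branch.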
179 def machine_load():
180 value = {'1_min': _NA, '5_min': _NA, '15_min': _NA, 'text': ''}
181 state = STATE_OK_DEFAULT
182 if not psutil:
183 return SysInfoRes(value=value, state=state)
184
185 # load averages
186 if hasattr(psutil.os, 'getloadavg'):
187 value.update(dict(
188 zip(['1_min', '5_min', '15_min'], psutil.os.getloadavg())))
189
190 human_value = value.copy()
191 human_value['text'] = '1min: {}, 5min: {}, 15min: {}'.format(
192 value['1_min'], value['5_min'], value['15_min'])
193
194 if state['type'] == STATE_OK and value['15_min'] > 5:
195 msg = 'Warning: your machine load is very high.'
196 state = {'message': msg, 'type': STATE_WARN}
197
198 return SysInfoRes(value=value, state=state, human_value=human_value)
199
200
201 def cpu():
202 value = 0
203 state = STATE_OK_DEFAULT
204
205 if not psutil:
206 return SysInfoRes(value=value, state=state)
207
208 value = psutil.cpu_percent(0.5)
209 human_value = '{} %'.format(value)
210 return SysInfoRes(value=value, state=state, human_value=human_value)
211
212
213 def storage():
214 from rhodecode.lib.helpers import format_byte_size_binary
215 from rhodecode.model.settings import VcsSettingsModel
216 path = VcsSettingsModel().get_repos_location()
217
218 value = dict(percent=0, used=0, total=0, path=path, text='')
219 state = STATE_OK_DEFAULT
220 if not psutil:
221 return SysInfoRes(value=value, state=state)
222
223 try:
224 value.update(dict(psutil.disk_usage(path)._asdict()))
225 except Exception as e:
226 log.exception('Failed to fetch disk info')
227 state = {'message': str(e), 'type': STATE_ERR}
228
229 human_value = value.copy()
230 human_value['used'] = format_byte_size_binary(value['used'])
231 human_value['total'] = format_byte_size_binary(value['total'])
232 human_value['text'] = "{}/{}, {}% used".format(
233 format_byte_size_binary(value['used']),
234 format_byte_size_binary(value['total']),
235 value['percent'])
236
237 if state['type'] == STATE_OK and value['percent'] > 90:
238 msg = 'Critical: your disk space is very low.'
239 state = {'message': msg, 'type': STATE_ERR}
240
241 elif state['type'] == STATE_OK and value['percent'] > 70:
242 msg = 'Warning: your disk space is running low.'
243 state = {'message': msg, 'type': STATE_WARN}
244
245 return SysInfoRes(value=value, state=state, human_value=human_value)
246
247
248 def storage_inodes():
249 from rhodecode.model.settings import VcsSettingsModel
250 path = VcsSettingsModel().get_repos_location()
251
252 value = dict(percent=0, free=0, used=0, total=0, path=path, text='')
253 state = STATE_OK_DEFAULT
254 if not psutil:
255 return SysInfoRes(value=value, state=state)
256
257 try:
258 i_stat = os.statvfs(path)
259
260 value['used'] = i_stat.f_ffree
261 value['free'] = i_stat.f_favail
262 value['total'] = i_stat.f_files
263 value['percent'] = percentage(value['used'], value['total'])
264 except Exception as e:
265 log.exception('Failed to fetch disk inodes info')
266 state = {'message': str(e), 'type': STATE_ERR}
267
268 human_value = value.copy()
269 human_value['text'] = "{}/{}, {}% used".format(
270 value['used'], value['total'], value['percent'])
271
272 if state['type'] == STATE_OK and value['percent'] > 90:
273 msg = 'Critical: your disk free inodes are very low.'
274 state = {'message': msg, 'type': STATE_ERR}
275
276 elif state['type'] == STATE_OK and value['percent'] > 70:
277 msg = 'Warning: your disk free inodes are running low.'
278 state = {'message': msg, 'type': STATE_WARN}
279
280 return SysInfoRes(value=value, state=state, human_value=human_value)
281
282
283 def storage_archives():
284 import rhodecode
285 from rhodecode.lib.utils import safe_str
286 from rhodecode.lib.helpers import format_byte_size_binary
287
288 msg = 'Enable this by setting ' \
289 'archive_cache_dir=/path/to/cache option in the .ini file'
290 path = safe_str(rhodecode.CONFIG.get('archive_cache_dir', msg))
291
292 value = dict(percent=0, used=0, total=0, items=0, path=path, text='')
293 state = STATE_OK_DEFAULT
294 try:
295 items_count = 0
296 used = 0
297 for root, dirs, files in os.walk(path):
298 if root == path:
299 items_count = len(files)
300
301 for f in files:
302 try:
303 used += os.path.getsize(os.path.join(root, f))
304 except OSError:
305 pass
306 value.update({
307 'percent': 100,
308 'used': used,
309 'total': used,
310 'items': items_count
311 })
312
313 except Exception as e:
314 log.exception('failed to fetch archive cache storage')
315 state = {'message': str(e), 'type': STATE_ERR}
316
317 human_value = value.copy()
318 human_value['used'] = format_byte_size_binary(value['used'])
319 human_value['total'] = format_byte_size_binary(value['total'])
320 human_value['text'] = "{} ({} items)".format(
321 human_value['used'], value['items'])
322
323 return SysInfoRes(value=value, state=state, human_value=human_value)
324
325
326 def storage_gist():
327 from rhodecode.model.gist import GIST_STORE_LOC
328 from rhodecode.model.settings import VcsSettingsModel
329 from rhodecode.lib.utils import safe_str
330 from rhodecode.lib.helpers import format_byte_size_binary
331 path = safe_str(os.path.join(
332 VcsSettingsModel().get_repos_location(), GIST_STORE_LOC))
333
334 # gist storage
335 value = dict(percent=0, used=0, total=0, items=0, path=path, text='')
336 state = STATE_OK_DEFAULT
337
338 try:
339 items_count = 0
340 used = 0
341 for root, dirs, files in os.walk(path):
342 if root == path:
343 items_count = len(dirs)
344
345 for f in files:
346 try:
347 used += os.path.getsize(os.path.join(root, f))
348 except OSError:
349 pass
350 value.update({
351 'percent': 100,
352 'used': used,
353 'total': used,
354 'items': items_count
355 })
356 except Exception as e:
357 log.exception('failed to fetch gist storage items')
358 state = {'message': str(e), 'type': STATE_ERR}
359
360 human_value = value.copy()
361 human_value['used'] = format_byte_size_binary(value['used'])
362 human_value['total'] = format_byte_size_binary(value['total'])
363 human_value['text'] = "{} ({} items)".format(
364 human_value['used'], value['items'])
365
366 return SysInfoRes(value=value, state=state, human_value=human_value)
367
368
369 def storage_temp():
370 import tempfile
371 from rhodecode.lib.helpers import format_byte_size_binary
372
373 path = tempfile.gettempdir()
374 value = dict(percent=0, used=0, total=0, items=0, path=path, text='')
375 state = STATE_OK_DEFAULT
376
377 if not psutil:
378 return SysInfoRes(value=value, state=state)
379
380 try:
381 value.update(dict(psutil.disk_usage(path)._asdict()))
382 except Exception as e:
383 log.exception('Failed to fetch temp dir info')
384 state = {'message': str(e), 'type': STATE_ERR}
385
386 human_value = value.copy()
387 human_value['used'] = format_byte_size_binary(value['used'])
388 human_value['total'] = format_byte_size_binary(value['total'])
389 human_value['text'] = "{}/{}, {}% used".format(
390 format_byte_size_binary(value['used']),
391 format_byte_size_binary(value['total']),
392 value['percent'])
393
394 return SysInfoRes(value=value, state=state, human_value=human_value)
395
396
397 def search_info():
398 import rhodecode
399 from rhodecode.lib.index import searcher_from_config
400
401 backend = rhodecode.CONFIG.get('search.module', '')
402 location = rhodecode.CONFIG.get('search.location', '')
403
404 try:
405 searcher = searcher_from_config(rhodecode.CONFIG)
406 searcher = searcher.__class__.__name__
407 except Exception:
408 searcher = None
409
410 value = dict(
411 backend=backend, searcher=searcher, location=location, text='')
412 state = STATE_OK_DEFAULT
413
414 human_value = value.copy()
415 human_value['text'] = "backend:`{}`".format(human_value['backend'])
416
417 return SysInfoRes(value=value, state=state, human_value=human_value)
418
419
420 def git_info():
421 from rhodecode.lib.vcs.backends import git
422 state = STATE_OK_DEFAULT
423 value = human_value = ''
424 try:
425 value = git.discover_git_version(raise_on_exc=True)
426 human_value = 'version reported from VCSServer: {}'.format(value)
427 except Exception as e:
428 state = {'message': str(e), 'type': STATE_ERR}
429
430 return SysInfoRes(value=value, state=state, human_value=human_value)
431
432
433 def hg_info():
434 from rhodecode.lib.vcs.backends import hg
435 state = STATE_OK_DEFAULT
436 value = human_value = ''
437 try:
438 value = hg.discover_hg_version(raise_on_exc=True)
439 human_value = 'version reported from VCSServer: {}'.format(value)
440 except Exception as e:
441 state = {'message': str(e), 'type': STATE_ERR}
442 return SysInfoRes(value=value, state=state, human_value=human_value)
443
444
445 def svn_info():
446 from rhodecode.lib.vcs.backends import svn
447 state = STATE_OK_DEFAULT
448 value = human_value = ''
449 try:
450 value = svn.discover_svn_version(raise_on_exc=True)
451 human_value = 'version reported from VCSServer: {}'.format(value)
452 except Exception as e:
453 state = {'message': str(e), 'type': STATE_ERR}
454 return SysInfoRes(value=value, state=state, human_value=human_value)
455
456
457 def vcs_backends():
458 import rhodecode
459 value = map(
460 string.strip, rhodecode.CONFIG.get('vcs.backends', '').split(','))
461 human_value = 'Enabled backends in order: {}'.format(','.join(value))
462 return SysInfoRes(value=value, human_value=human_value)
463
464
465 def vcs_server():
466 import rhodecode
467 from rhodecode.lib.vcs.backends import get_vcsserver_version
468
469 server_url = rhodecode.CONFIG.get('vcs.server')
470 enabled = rhodecode.CONFIG.get('vcs.server.enable')
471 protocol = rhodecode.CONFIG.get('vcs.server.protocol')
472 state = STATE_OK_DEFAULT
473 version = None
474
475 try:
476 version = get_vcsserver_version()
477 connection = 'connected'
478 except Exception as e:
479 connection = 'failed'
480 state = {'message': str(e), 'type': STATE_ERR}
481
482 value = dict(
483 url=server_url,
484 enabled=enabled,
485 protocol=protocol,
486 connection=connection,
487 version=version,
488 text='',
489 )
490
491 human_value = value.copy()
492 human_value['text'] = \
493 '{url}@ver:{ver} via {mode} mode, connection:{conn}'.format(
494 url=server_url, ver=version, mode=protocol, conn=connection)
495
496 return SysInfoRes(value=value, state=state, human_value=human_value)
497
498
499 def rhodecode_app_info():
500 import rhodecode
501 edition = rhodecode.CONFIG.get('rhodecode.edition')
502
503 value = dict(
504 rhodecode_version=rhodecode.__version__,
505 rhodecode_lib_path=os.path.abspath(rhodecode.__file__),
506 text=''
507 )
508 human_value = value.copy()
509 human_value['text'] = 'RhodeCode {edition}, version {ver}'.format(
510 edition=edition, ver=value['rhodecode_version']
511 )
512 return SysInfoRes(value=value, human_value=human_value)
513
514
515 def rhodecode_config():
516 import rhodecode
517 path = rhodecode.CONFIG.get('__file__')
518 rhodecode_ini_safe = rhodecode.CONFIG.copy()
519
520 blacklist = [
521 'rhodecode_license_key',
522 'routes.map',
523 'pylons.h',
524 'pylons.app_globals',
525 'pylons.environ_config',
526 'sqlalchemy.db1.url',
527 'channelstream.secret',
528 'beaker.session.secret',
529 'rhodecode.encrypted_values.secret',
530 'rhodecode_auth_github_consumer_key',
531 'rhodecode_auth_github_consumer_secret',
532 'rhodecode_auth_google_consumer_key',
533 'rhodecode_auth_google_consumer_secret',
534 'rhodecode_auth_bitbucket_consumer_secret',
535 'rhodecode_auth_bitbucket_consumer_key',
536 'rhodecode_auth_twitter_consumer_secret',
537 'rhodecode_auth_twitter_consumer_key',
538
539 'rhodecode_auth_twitter_secret',
540 'rhodecode_auth_github_secret',
541 'rhodecode_auth_google_secret',
542 'rhodecode_auth_bitbucket_secret',
543
544 'appenlight.api_key',
545 ('app_conf', 'sqlalchemy.db1.url')
546 ]
547 for k in blacklist:
548 if isinstance(k, tuple):
549 section, key = k
550 if section in rhodecode_ini_safe:
551 rhodecode_ini_safe[section] = '**OBFUSCATED**'
552 else:
553 rhodecode_ini_safe.pop(k, None)
554
555 # TODO: maybe put some CONFIG checks here ?
556 return SysInfoRes(value={'config': rhodecode_ini_safe, 'path': path})
557
558
559 def database_info():
560 import rhodecode
561 from sqlalchemy.engine import url as engine_url
562 from rhodecode.model.meta import Base as sql_base, Session
563 from rhodecode.model.db import DbMigrateVersion
564
565 state = STATE_OK_DEFAULT
566
567 db_migrate = DbMigrateVersion.query().filter(
568 DbMigrateVersion.repository_id == 'rhodecode_db_migrations').one()
569
570 db_url_obj = engine_url.make_url(rhodecode.CONFIG['sqlalchemy.db1.url'])
571
572 try:
573 engine = sql_base.metadata.bind
574 db_server_info = engine.dialect._get_server_version_info(
575 Session.connection(bind=engine))
576 db_version = '.'.join(map(str, db_server_info))
577 except Exception:
578 log.exception('failed to fetch db version')
579 db_version = 'UNKNOWN'
580
581 db_info = dict(
582 migrate_version=db_migrate.version,
583 type=db_url_obj.get_backend_name(),
584 version=db_version,
585 url=repr(db_url_obj)
586 )
587
588 human_value = db_info.copy()
589 human_value['url'] = "{} @ migration version: {}".format(
590 db_info['url'], db_info['migrate_version'])
591 human_value['version'] = "{} {}".format(db_info['type'], db_info['version'])
592 return SysInfoRes(value=db_info, state=state, human_value=human_value)
593
594
595 def server_info(environ):
596 import rhodecode
597 from rhodecode.lib.base import get_server_ip_addr, get_server_port
598
599 value = {
600 'server_ip': '%s:%s' % (
601 get_server_ip_addr(environ, log_errors=False),
602 get_server_port(environ)
603 ),
604 'server_id': rhodecode.CONFIG.get('instance_id'),
605 }
606 return SysInfoRes(value=value)
607
608
609 def get_system_info(environ):
610 environ = environ or {}
611 return {
612 'rhodecode_app': SysInfo(rhodecode_app_info)(),
613 'rhodecode_config': SysInfo(rhodecode_config)(),
614 'python': SysInfo(python_info)(),
615 'py_modules': SysInfo(py_modules)(),
616
617 'platform': SysInfo(platform_type)(),
618 'server': SysInfo(server_info, environ=environ)(),
619 'database': SysInfo(database_info)(),
620
621 'storage': SysInfo(storage)(),
622 'storage_inodes': SysInfo(storage_inodes)(),
623 'storage_archive': SysInfo(storage_archives)(),
624 'storage_gist': SysInfo(storage_gist)(),
625 'storage_temp': SysInfo(storage_temp)(),
626
627 'search': SysInfo(search_info)(),
628
629 'uptime': SysInfo(uptime)(),
630 'load': SysInfo(machine_load)(),
631 'cpu': SysInfo(cpu)(),
632 'memory': SysInfo(memory)(),
633
634 'vcs_backends': SysInfo(vcs_backends)(),
635 'vcs_server': SysInfo(vcs_server)(),
636
637 'git': SysInfo(git_info)(),
638 'hg': SysInfo(hg_info)(),
639 'svn': SysInfo(svn_info)(),
640 }
1 NO CONTENT: new file 100644
@@ -0,0 +1,128 b''
1 # -*- coding: utf-8 -*-
2
3 # Copyright (C) 2016-2016 RhodeCode GmbH
4 #
5 # This program is free software: you can redistribute it and/or modify
6 # it under the terms of the GNU Affero General Public License, version 3
7 # (only), as published by the Free Software Foundation.
8 #
9 # This program is distributed in the hope that it will be useful,
10 # but WITHOUT ANY WARRANTY; without even the implied warranty of
11 # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
12 # GNU General Public License for more details.
13 #
14 # You should have received a copy of the GNU Affero General Public License
15 # along with this program. If not, see <http://www.gnu.org/licenses/>.
16 #
17 # This program is dual-licensed. If you wish to learn more about the
18 # RhodeCode Enterprise Edition, including its added features, Support services,
19 # and proprietary license terms, please see https://rhodecode.com/licenses/
20
21
22 import mock
23 import pytest
24
25 from rhodecode.config.routing import ADMIN_PREFIX
26 from rhodecode.login.views import LoginView, CaptchaData
27 from rhodecode.model.settings import SettingsModel
28 from rhodecode.tests.utils import AssertResponse
29
30
31 class RhodeCodeSetting(object):
32 def __init__(self, name, value):
33 self.name = name
34 self.value = value
35
36 def __enter__(self):
37 from rhodecode.model.settings import SettingsModel
38 model = SettingsModel()
39 self.old_setting = model.get_setting_by_name(self.name)
40 model.create_or_update_setting(name=self.name, val=self.value)
41 return self
42
43 def __exit__(self, type, value, traceback):
44 model = SettingsModel()
45 if self.old_setting:
46 model.create_or_update_setting(
47 name=self.name, val=self.old_setting.app_settings_value)
48 else:
49 model.create_or_update_setting(name=self.name)
50
51
52 class TestRegisterCaptcha(object):
53
54 @pytest.mark.parametrize('private_key, public_key, expected', [
55 ('', '', CaptchaData(False, '', '')),
56 ('', 'pubkey', CaptchaData(False, '', 'pubkey')),
57 ('privkey', '', CaptchaData(True, 'privkey', '')),
58 ('privkey', 'pubkey', CaptchaData(True, 'privkey', 'pubkey')),
59 ])
60 def test_get_captcha_data(self, private_key, public_key, expected, db):
61 login_view = LoginView(mock.Mock(), mock.Mock())
62 with RhodeCodeSetting('captcha_private_key', private_key):
63 with RhodeCodeSetting('captcha_public_key', public_key):
64 captcha = login_view._get_captcha_data()
65 assert captcha == expected
66
67 @pytest.mark.parametrize('active', [False, True])
68 @mock.patch.object(LoginView, '_get_captcha_data')
69 def test_private_key_does_not_leak_to_html(
70 self, m_get_captcha_data, active, app):
71 captcha = CaptchaData(
72 active=active, private_key='PRIVATE_KEY', public_key='PUBLIC_KEY')
73 m_get_captcha_data.return_value = captcha
74
75 response = app.get(ADMIN_PREFIX + '/register')
76 assert 'PRIVATE_KEY' not in response
77
78 @pytest.mark.parametrize('active', [False, True])
79 @mock.patch.object(LoginView, '_get_captcha_data')
80 def test_register_view_renders_captcha(
81 self, m_get_captcha_data, active, app):
82 captcha = CaptchaData(
83 active=active, private_key='PRIVATE_KEY', public_key='PUBLIC_KEY')
84 m_get_captcha_data.return_value = captcha
85
86 response = app.get(ADMIN_PREFIX + '/register')
87
88 assertr = AssertResponse(response)
89 if active:
90 assertr.one_element_exists('#recaptcha_field')
91 else:
92 assertr.no_element_exists('#recaptcha_field')
93
94 @pytest.mark.parametrize('valid', [False, True])
95 @mock.patch('rhodecode.login.views.submit')
96 @mock.patch.object(LoginView, '_get_captcha_data')
97 def test_register_with_active_captcha(
98 self, m_get_captcha_data, m_submit, valid, app, csrf_token):
99 captcha = CaptchaData(
100 active=True, private_key='PRIVATE_KEY', public_key='PUBLIC_KEY')
101 m_get_captcha_data.return_value = captcha
102 m_response = mock.Mock()
103 m_response.is_valid = valid
104 m_submit.return_value = m_response
105
106 params = {
107 'csrf_token': csrf_token,
108 'email': 'pytest@example.com',
109 'firstname': 'pytest-firstname',
110 'lastname': 'pytest-lastname',
111 'password': 'secret',
112 'password_confirmation': 'secret',
113 'username': 'pytest',
114 }
115 response = app.post(ADMIN_PREFIX + '/register', params=params)
116
117 if valid:
118 # If we provided a valid captcha input we expect a successful
119 # registration and redirect to the login page.
120 assert response.status_int == 302
121 assert 'location' in response.headers
122 assert ADMIN_PREFIX + '/login' in response.headers['location']
123 else:
124 # If captcha input is invalid we expect to stay on the registration
125 # page with an error message displayed.
126 assertr = AssertResponse(response)
127 assert response.status_int == 200
128 assertr.one_element_exists('#recaptcha_field ~ span.error-message')
1 NO CONTENT: new file 100644, binary diff hidden
@@ -0,0 +1,7 b''
1 <link rel="import" href="../../../../../../bower_components/polymer/polymer.html">
2
3 <dom-module id="rhodecode-favicon">
4 <template>
5 </template>
6 <script src="rhodecode-favicon.js"></script>
7 </dom-module>
@@ -0,0 +1,20 b''
1 Polymer({
2 is: 'rhodecode-favicon',
3 properties: {
4 favicon: Object,
5 counter: {
6 type: Number,
7 observer: '_handleCounter'
8 }
9 },
10
11 ready: function () {
12 this.favicon = new Favico({
13 type: 'rectangle',
14 animation: 'none'
15 });
16 },
17 _handleCounter: function (newVal, oldVal) {
18 this.favicon.badge(this.counter);
19 }
20 });
@@ -0,0 +1,7 b''
1 <link rel="import" href="../../../../../../bower_components/polymer/polymer.html">
2 <dom-module id="rhodecode-legacy-js">
3 <template>
4 <script src="../../../scripts.js"></script>
5 </template>
6 <script src="rhodecode-legacy-js.js"></script>
7 </dom-module>
@@ -0,0 +1,3 b''
1 Polymer({
2 is: 'rhodecode-legacy-js',
3 });
@@ -0,0 +1,8 b''
1 // special file to catch translation strings NOT extracted by our extraction system
2 // we need to fix that; until then, we add these manual entries here to catch them
3
4 _gettext('Invite reviewers to this discussion');
5 _gettext('Close');
6 _gettext('Send');
7 _gettext('Switch to chat');
8 _gettext('Switch to comment');
@@ -0,0 +1,30 b''
1 # Copyright (C) 2016-2016 RhodeCode GmbH
2 #
3 # This program is free software: you can redistribute it and/or modify
4 # it under the terms of the GNU Affero General Public License, version 3
5 # (only), as published by the Free Software Foundation.
6 #
7 # This program is distributed in the hope that it will be useful,
8 # but WITHOUT ANY WARRANTY; without even the implied warranty of
9 # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
10 # GNU General Public License for more details.
11 #
12 # You should have received a copy of the GNU Affero General Public License
13 # along with this program. If not, see <http://www.gnu.org/licenses/>.
14 #
15 # This program is dual-licensed. If you wish to learn more about the
16 # RhodeCode Enterprise Edition, including its added features, Support services,
17 # and proprietary license terms, please see https://rhodecode.com/licenses/
18
19
20 from rhodecode.events.base import RhodecodeEvent
21 from rhodecode.translation import _
22
23
24 class ModDavSvnConfigChange(RhodecodeEvent):
25 """
26 This event will be triggered on every change of the mod dav svn
27 configuration.
28 """
29 name = 'mod-dav-svn-config-change'
30 display_name = _('Configuration for Apache mod_dav_svn changed.')
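The `ModDavSvnConfigChange` event above presumably participates in the usual publish/subscribe pattern: handlers register for the event's `name` and run whenever the mod_dav_svn configuration is rewritten. A hedged, self-contained sketch of that pattern (the registry, `subscribe`, and `trigger` below are illustrative stand-ins, not RhodeCode's real event API):

```python
# Simplified event registry; names and signatures are illustrative
# stand-ins for RhodeCode's actual event subsystem.
_subscribers = {}


class RhodecodeEvent(object):
    name = 'generic'


class ModDavSvnConfigChange(RhodecodeEvent):
    """Triggered on every change of the mod_dav_svn configuration."""
    name = 'mod-dav-svn-config-change'


def subscribe(event_cls, handler):
    # Register a handler under the event's symbolic name.
    _subscribers.setdefault(event_cls.name, []).append(handler)


def trigger(event):
    # Call every handler registered for this event, collect results.
    return [handler(event) for handler in _subscribers.get(event.name, [])]


# Example handler: regenerate the Apache config when the event fires.
def regenerate_config(event):
    return 'regenerated after %s' % event.name


subscribe(ModDavSvnConfigChange, regenerate_config)
```

Keying subscriptions on a string `name` rather than the class object lets unrelated components react to the config change without importing each other.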
@@ -0,0 +1,569 b''
1 <%def name="diff_line_anchor(filename, line, type)"><%
2 return '%s_%s_%i' % (h.safeid(filename), type, line)
3 %></%def>
4
5 <%def name="action_class(action)"><%
6 return {
7 '-': 'cb-deletion',
8 '+': 'cb-addition',
9 ' ': 'cb-context',
10 }.get(action, 'cb-empty')
11 %></%def>
12
13 <%def name="op_class(op_id)"><%
14 return {
15 DEL_FILENODE: 'deletion', # file deleted
16 BIN_FILENODE: 'warning' # binary diff hidden
17 }.get(op_id, 'addition')
18 %></%def>
19
20 <%def name="link_for(**kw)"><%
21 new_args = request.GET.mixed()
22 new_args.update(kw)
23 return h.url('', **new_args)
24 %></%def>
25
26 <%def name="render_diffset(diffset, commit=None,
27
28 # collapse all file diff entries when there are more than this amount of files in the diff
29 collapse_when_files_over=20,
30
31 # collapse lines in the diff when more than this amount of lines changed in the file diff
32 lines_changed_limit=500,
33
34 # add a ruler at to the output
35 ruler_at_chars=0,
36
37 # show inline comments
38 use_comments=False,
39
40 # disable new comments
41 disable_new_comments=False,
42
43 )">
44
45 %if use_comments:
46 <div id="cb-comments-inline-container-template" class="js-template">
47 ${inline_comments_container([])}
48 </div>
49 <div class="js-template" id="cb-comment-inline-form-template">
50 <div class="comment-inline-form ac">
51 %if c.rhodecode_user.username != h.DEFAULT_USER:
52 ${h.form('#', method='get')}
53 <div id="edit-container_{1}" class="clearfix">
54 <div class="comment-title pull-left">
55 ${_('Create a comment on line {1}.')}
56 </div>
57 <div class="comment-help pull-right">
58 ${(_('Comments parsed using %s syntax with %s support.') % (
59 ('<a href="%s">%s</a>' % (h.url('%s_help' % c.visual.default_renderer), c.visual.default_renderer.upper())),
60 ('<span class="tooltip" title="%s">@mention</span>' % _('Use @username inside this text to send notification to this RhodeCode user'))
61 )
62 )|n
63 }
64 </div>
65 <div style="clear: both"></div>
66 <textarea id="text_{1}" name="text" class="comment-block-ta ac-input"></textarea>
67 </div>
68 <div id="preview-container_{1}" class="clearfix" style="display: none;">
69 <div class="comment-help">
70 ${_('Comment preview')}
71 </div>
72 <div id="preview-box_{1}" class="preview-box"></div>
73 </div>
74 <div class="comment-footer">
75 <div class="action-buttons">
76 <input type="hidden" name="f_path" value="{0}">
77 <input type="hidden" name="line" value="{1}">
78 <button id="preview-btn_{1}" class="btn btn-secondary">${_('Preview')}</button>
79 <button id="edit-btn_{1}" class="btn btn-secondary" style="display: none;">${_('Edit')}</button>
80 ${h.submit('save', _('Comment'), class_='btn btn-success save-inline-form')}
81 </div>
82 <div class="comment-button">
83 <button type="button" class="cb-comment-cancel" onclick="return Rhodecode.comments.cancelComment(this);">
84 ${_('Cancel')}
85 </button>
86 </div>
87 ${h.end_form()}
88 </div>
89 %else:
90 ${h.form('', class_='inline-form comment-form-login', method='get')}
91 <div class="pull-left">
92 <div class="comment-help pull-right">
93 ${_('You need to be logged in to comment.')} <a href="${h.route_path('login', _query={'came_from': h.url.current()})}">${_('Login now')}</a>
94 </div>
95 </div>
96 <div class="comment-button pull-right">
97 <button type="button" class="cb-comment-cancel" onclick="return Rhodecode.comments.cancelComment(this);">
98 ${_('Cancel')}
99 </button>
100 </div>
101 <div class="clearfix"></div>
102 ${h.end_form()}
103 %endif
104 </div>
105 </div>
106
107 %endif
108 <%
109 collapse_all = len(diffset.files) > collapse_when_files_over
110 %>
111
112 %if c.diffmode == 'sideside':
113 <style>
114 .wrapper {
115 max-width: 1600px !important;
116 }
117 </style>
118 %endif
119 %if ruler_at_chars:
120 <style>
121 .diff table.cb .cb-content:after {
122 content: "";
123 border-left: 1px solid blue;
124 position: absolute;
125 top: 0;
126 height: 18px;
127 opacity: .2;
128 z-index: 10;
129 ## +5 to account for diff action (+/-)
130 left: ${ruler_at_chars + 5}ch;
131 }</style>
132 %endif
133 <div class="diffset ${disable_new_comments and 'diffset-comments-disabled'}">
134 <div class="diffset-heading ${diffset.limited_diff and 'diffset-heading-warning' or ''}">
135 %if commit:
136 <div class="pull-right">
137 <a class="btn tooltip" title="${_('Browse Files at revision {}').format(commit.raw_id)}" href="${h.url('files_home',repo_name=diffset.repo_name, revision=commit.raw_id, f_path='')}">
138 ${_('Browse Files')}
139 </a>
140 </div>
141 %endif
142 <h2 class="clearinner">
143 %if commit:
144 <a class="tooltip revision" title="${h.tooltip(commit.message)}" href="${h.url('changeset_home',repo_name=c.repo_name,revision=commit.raw_id)}">${'r%s:%s' % (commit.revision,h.short_id(commit.raw_id))}</a> -
145 ${h.age_component(commit.date)} -
146 %endif
147 %if diffset.limited_diff:
148 ${_('The requested commit is too big and content was truncated.')}
149
150 ${ungettext('%(num)s file changed.', '%(num)s files changed.', diffset.changed_files) % {'num': diffset.changed_files}}
151 <a href="${link_for(fulldiff=1)}" onclick="return confirm('${_("Showing a big diff might take some time and resources, continue?")}')">${_('Show full diff')}</a>
152 %else:
153 ${ungettext('%(num)s file changed: %(linesadd)s inserted, ''%(linesdel)s deleted',
154 '%(num)s files changed: %(linesadd)s inserted, %(linesdel)s deleted', diffset.changed_files) % {'num': diffset.changed_files, 'linesadd': diffset.lines_added, 'linesdel': diffset.lines_deleted}}
155 %endif
156 </h2>
157 </div>
158
159 %if not diffset.files:
160 <p class="empty_data">${_('No files')}</p>
161 %endif
162
163 <div class="filediffs">
164 %for i, filediff in enumerate(diffset.files):
165 <%
166 lines_changed = filediff['patch']['stats']['added'] + filediff['patch']['stats']['deleted']
167 over_lines_changed_limit = lines_changed > lines_changed_limit
168 %>
169 <input ${collapse_all and 'checked' or ''} class="filediff-collapse-state" id="filediff-collapse-${id(filediff)}" type="checkbox">
170 <div
171 class="filediff"
172 data-f-path="${filediff['patch']['filename']}"
173 id="a_${h.FID('', filediff['patch']['filename'])}">
174 <label for="filediff-collapse-${id(filediff)}" class="filediff-heading">
175 <div class="filediff-collapse-indicator"></div>
176 ${diff_ops(filediff)}
177 </label>
178 ${diff_menu(filediff, use_comments=use_comments)}
179 <table class="cb cb-diff-${c.diffmode} code-highlight ${over_lines_changed_limit and 'cb-collapsed' or ''}">
180 %if not filediff.hunks:
181 %for op_id, op_text in filediff['patch']['stats']['ops'].items():
182 <tr>
183 <td class="cb-text cb-${op_class(op_id)}" ${c.diffmode == 'unified' and 'colspan=3' or 'colspan=4'}>
184 %if op_id == DEL_FILENODE:
185 ${_('File was deleted')}
186 %elif op_id == BIN_FILENODE:
187 ${_('Binary file hidden')}
188 %else:
189 ${op_text}
190 %endif
191 </td>
192 </tr>
193 %endfor
194 %endif
195 %if over_lines_changed_limit:
196 <tr class="cb-warning cb-collapser">
197 <td class="cb-text" ${c.diffmode == 'unified' and 'colspan=4' or 'colspan=4'}>
198 ${_('This diff has been collapsed as it changes many lines, (%i lines changed)' % lines_changed)}
199 <a href="#" class="cb-expand"
200 onclick="$(this).closest('table').removeClass('cb-collapsed'); return false;">${_('Show them')}
201 </a>
202 <a href="#" class="cb-collapse"
203 onclick="$(this).closest('table').addClass('cb-collapsed'); return false;">${_('Hide them')}
204 </a>
205 </td>
206 </tr>
207 %endif
208 %if filediff.patch['is_limited_diff']:
209 <tr class="cb-warning cb-collapser">
210 <td class="cb-text" ${c.diffmode == 'unified' and 'colspan=4' or 'colspan=4'}>
211 ${_('The requested commit is too big and content was truncated.')} <a href="${link_for(fulldiff=1)}" onclick="return confirm('${_("Showing a big diff might take some time and resources, continue?")}')">${_('Show full diff')}</a>
212 </td>
213 </tr>
214 %endif
215 %for hunk in filediff.hunks:
216 <tr class="cb-hunk">
217 <td ${c.diffmode == 'unified' and 'colspan=3' or ''}>
218 ## TODO: dan: add ajax loading of more context here
219 ## <a href="#">
220 <i class="icon-more"></i>
221 ## </a>
222 </td>
223 <td ${c.diffmode == 'sideside' and 'colspan=5' or ''}>
224 @@
225 -${hunk.source_start},${hunk.source_length}
226 +${hunk.target_start},${hunk.target_length}
227 ${hunk.section_header}
228 </td>
229 </tr>
230 %if c.diffmode == 'unified':
231 ${render_hunk_lines_unified(hunk, use_comments=use_comments)}
232 %elif c.diffmode == 'sideside':
233 ${render_hunk_lines_sideside(hunk, use_comments=use_comments)}
234 %else:
235 <tr class="cb-line">
236 <td>unknown diff mode</td>
237 </tr>
238 %endif
239 %endfor
240 </table>
241 </div>
242 %endfor
243 </div>
244 </div>
245 </%def>
246
247 <%def name="diff_ops(filediff)">
248 <%
249 stats = filediff['patch']['stats']
250 from rhodecode.lib.diffs import NEW_FILENODE, DEL_FILENODE, \
251 MOD_FILENODE, RENAMED_FILENODE, CHMOD_FILENODE, BIN_FILENODE
252 %>
253 <span class="pill">
254 %if filediff.source_file_path and filediff.target_file_path:
255 %if filediff.source_file_path != filediff.target_file_path: # file was renamed
256 <strong>${filediff.target_file_path}</strong><del>${filediff.source_file_path}</del>
257 %else:
258 ## file was modified
259 <strong>${filediff.source_file_path}</strong>
260 %endif
261 %else:
262 %if filediff.source_file_path:
263 ## file was deleted
264 <strong>${filediff.source_file_path}</strong>
265 %else:
266 ## file was added
267 <strong>${filediff.target_file_path}</strong>
268 %endif
269 %endif
270 </span>
271 <span class="pill-group" style="float: left">
272 %if filediff.patch['is_limited_diff']:
273 <span class="pill tooltip" op="limited" title="The stats for this diff are not complete">limited diff</span>
274 %endif
275 %if RENAMED_FILENODE in stats['ops']:
276 <span class="pill" op="renamed">renamed</span>
277 %endif
278
279 %if NEW_FILENODE in stats['ops']:
280 <span class="pill" op="created">created</span>
281 %if filediff['target_mode'].startswith('120'):
282 <span class="pill" op="symlink">symlink</span>
283 %else:
284 <span class="pill" op="mode">${nice_mode(filediff['target_mode'])}</span>
285 %endif
286 %endif
287
288 %if DEL_FILENODE in stats['ops']:
289 <span class="pill" op="removed">removed</span>
290 %endif
291
292 %if CHMOD_FILENODE in stats['ops']:
293 <span class="pill" op="mode">
294 ${nice_mode(filediff['source_mode'])} ➡ ${nice_mode(filediff['target_mode'])}
295 </span>
296 %endif
297 </span>
298
299 <a class="pill filediff-anchor" href="#a_${h.FID('', filediff.patch['filename'])}"></a>
300
301 <span class="pill-group" style="float: right">
302 %if BIN_FILENODE in stats['ops']:
303 <span class="pill" op="binary">binary</span>
304 %if MOD_FILENODE in stats['ops']:
305 <span class="pill" op="modified">modified</span>
306 %endif
307 %endif
308 %if stats['added']:
309 <span class="pill" op="added">+${stats['added']}</span>
310 %endif
311 %if stats['deleted']:
312 <span class="pill" op="deleted">-${stats['deleted']}</span>
313 %endif
314 </span>
315
316 </%def>
317
318 <%def name="nice_mode(filemode)">
319 ${filemode.startswith('100') and filemode[3:] or filemode}
320 </%def>
321
322 <%def name="diff_menu(filediff, use_comments=False)">
323 <div class="filediff-menu">
324 %if filediff.diffset.source_ref:
325 %if filediff.patch['operation'] in ['D', 'M']:
326 <a
327 class="tooltip"
328 href="${h.url('files_home',repo_name=filediff.diffset.repo_name,f_path=filediff.source_file_path,revision=filediff.diffset.source_ref)}"
329 title="${h.tooltip(_('Show file at commit: %(commit_id)s') % {'commit_id': filediff.diffset.source_ref[:12]})}"
330 >
331 ${_('Show file before')}
332 </a>
333 %else:
334 <span
335 class="tooltip"
336 title="${h.tooltip(_('File no longer present at commit: %(commit_id)s') % {'commit_id': filediff.diffset.source_ref[:12]})}"
337 >
338 ${_('Show file before')}
339 </span>
340 %endif
341 %if filediff.patch['operation'] in ['A', 'M']:
342 <a
343 class="tooltip"
344 href="${h.url('files_home',repo_name=filediff.diffset.repo_name,f_path=filediff.target_file_path,revision=filediff.diffset.target_ref)}"
345 title="${h.tooltip(_('Show file at commit: %(commit_id)s') % {'commit_id': filediff.diffset.target_ref[:12]})}"
346 >
347 ${_('Show file after')}
348 </a>
349 %else:
350 <span
351 class="tooltip"
352 title="${h.tooltip(_('File no longer present at commit: %(commit_id)s') % {'commit_id': filediff.diffset.target_ref[:12]})}"
353 >
354 ${_('Show file after')}
355 </span>
356 %endif
357 <a
358 class="tooltip"
359 title="${h.tooltip(_('Raw diff'))}"
360 href="${h.url('files_diff_home',repo_name=filediff.diffset.repo_name,f_path=filediff.target_file_path,diff2=filediff.diffset.target_ref,diff1=filediff.diffset.source_ref,diff='raw')}"
361 >
362 ${_('Raw diff')}
363 </a>
364 <a
365 class="tooltip"
366 title="${h.tooltip(_('Download diff'))}"
367 href="${h.url('files_diff_home',repo_name=filediff.diffset.repo_name,f_path=filediff.target_file_path,diff2=filediff.diffset.target_ref,diff1=filediff.diffset.source_ref,diff='download')}"
368 >
369 ${_('Download diff')}
370 </a>
371
372 ## TODO: dan: refactor ignorews_url and context_url into the diff renderer same as diffmode=unified/sideside. Also use ajax to load more context (by clicking hunks)
373 %if hasattr(c, 'ignorews_url'):
374 ${c.ignorews_url(request.GET, h.FID('', filediff['patch']['filename']))}
375 %endif
376 %if hasattr(c, 'context_url'):
377 ${c.context_url(request.GET, h.FID('', filediff['patch']['filename']))}
378 %endif
379
380
381 %if use_comments:
382 <a href="#" onclick="return Rhodecode.comments.toggleComments(this);">
383 <span class="show-comment-button">${_('Show comments')}</span><span class="hide-comment-button">${_('Hide comments')}</span>
384 </a>
385 %endif
386 %endif
387 </div>
388 </%def>
389
390
391 <%namespace name="commentblock" file="/changeset/changeset_file_comment.html"/>
392 <%def name="inline_comments_container(comments)">
393 <div class="inline-comments">
394 %for comment in comments:
395 ${commentblock.comment_block(comment, inline=True)}
396 %endfor
397 <span onclick="return Rhodecode.comments.createComment(this)"
398 class="btn btn-secondary cb-comment-add-button">
399 ${_('Add another comment')}
400 </span>
401 </div>
402 </%def>
403
404
405 <%def name="render_hunk_lines_sideside(hunk, use_comments=False)">
406 %for i, line in enumerate(hunk.sideside):
407 <%
408 old_line_anchor, new_line_anchor = None, None
409 if line.original.lineno:
410 old_line_anchor = diff_line_anchor(hunk.filediff.source_file_path, line.original.lineno, 'o')
411 if line.modified.lineno:
412 new_line_anchor = diff_line_anchor(hunk.filediff.target_file_path, line.modified.lineno, 'n')
413 %>
414 <tr class="cb-line">
415 <td class="cb-data ${action_class(line.original.action)}"
416 data-line-number="${line.original.lineno}"
417 >
418 <div>
419 %if line.original.comments:
420 <i class="icon-comment" onclick="return Rhodecode.comments.toggleLineComments(this)"></i>
421 %endif
422 </div>
423 </td>
424 <td class="cb-lineno ${action_class(line.original.action)}"
425 data-line-number="${line.original.lineno}"
426 %if old_line_anchor:
427 id="${old_line_anchor}"
428 %endif
429 >
430 %if line.original.lineno:
431 <a name="${old_line_anchor}" href="#${old_line_anchor}">${line.original.lineno}</a>
432 %endif
433 </td>
434 <td class="cb-content ${action_class(line.original.action)}"
435 data-line-number="o${line.original.lineno}"
436 >
437 %if use_comments and line.original.lineno:
438 ${render_add_comment_button()}
439 %endif
440 <span class="cb-code">${line.original.action} ${line.original.content or '' | n}</span>
441 %if use_comments and line.original.lineno and line.original.comments:
442 ${inline_comments_container(line.original.comments)}
443 %endif
444 </td>
445 <td class="cb-data ${action_class(line.modified.action)}"
446 data-line-number="${line.modified.lineno}"
447 >
448 <div>
449 %if line.modified.comments:
450 <i class="icon-comment" onclick="return Rhodecode.comments.toggleLineComments(this)"></i>
451 %endif
452 </div>
453 </td>
454 <td class="cb-lineno ${action_class(line.modified.action)}"
455 data-line-number="${line.modified.lineno}"
456 %if new_line_anchor:
457 id="${new_line_anchor}"
458 %endif
459 >
460 %if line.modified.lineno:
461 <a name="${new_line_anchor}" href="#${new_line_anchor}">${line.modified.lineno}</a>
462 %endif
463 </td>
464 <td class="cb-content ${action_class(line.modified.action)}"
465 data-line-number="n${line.modified.lineno}"
466 >
467 %if use_comments and line.modified.lineno:
468 ${render_add_comment_button()}
469 %endif
470 <span class="cb-code">${line.modified.action} ${line.modified.content or '' | n}</span>
471 %if use_comments and line.modified.lineno and line.modified.comments:
472 ${inline_comments_container(line.modified.comments)}
473 %endif
474 </td>
475 </tr>
476 %endfor
477 </%def>
478
479
480 <%def name="render_hunk_lines_unified(hunk, use_comments=False)">
481 %for old_line_no, new_line_no, action, content, comments in hunk.unified:
482 <%
483 old_line_anchor, new_line_anchor = None, None
484 if old_line_no:
485 old_line_anchor = diff_line_anchor(hunk.filediff.source_file_path, old_line_no, 'o')
486 if new_line_no:
487 new_line_anchor = diff_line_anchor(hunk.filediff.target_file_path, new_line_no, 'n')
488 %>
489 <tr class="cb-line">
490 <td class="cb-data ${action_class(action)}">
491 <div>
492 %if comments:
493 <i class="icon-comment" onclick="return Rhodecode.comments.toggleLineComments(this)"></i>
494 %endif
495 </div>
496 </td>
497 <td class="cb-lineno ${action_class(action)}"
498 data-line-number="${old_line_no}"
499 %if old_line_anchor:
500 id="${old_line_anchor}"
501 %endif
502 >
503 %if old_line_anchor:
504 <a name="${old_line_anchor}" href="#${old_line_anchor}">${old_line_no}</a>
505 %endif
506 </td>
507 <td class="cb-lineno ${action_class(action)}"
508 data-line-number="${new_line_no}"
509 %if new_line_anchor:
510 id="${new_line_anchor}"
511 %endif
512 >
513 %if new_line_anchor:
514 <a name="${new_line_anchor}" href="#${new_line_anchor}">${new_line_no}</a>
515 %endif
516 </td>
517 <td class="cb-content ${action_class(action)}"
518 data-line-number="${new_line_no and 'n' or 'o'}${new_line_no or old_line_no}"
519 >
520 %if use_comments:
521 ${render_add_comment_button()}
522 %endif
523 <span class="cb-code">${action} ${content or '' | n}</span>
524 %if use_comments and comments:
525 ${inline_comments_container(comments)}
526 %endif
527 </td>
528 </tr>
529 %endfor
530 </%def>
531
532 <%def name="render_add_comment_button()">
533 <button
534 class="btn btn-small btn-primary cb-comment-box-opener"
535 onclick="return Rhodecode.comments.createComment(this)"
536 ><span>+</span></button>
537 </%def>
538
539 <%def name="render_diffset_menu()">
540 <div class="diffset-menu clearinner">
541 <div class="pull-right">
542 <div class="btn-group">
543 <a
544 class="btn ${c.diffmode == 'sideside' and 'btn-primary'} tooltip"
545 title="${_('View side by side')}"
546 href="${h.url_replace(diffmode='sideside')}">
547 <span>${_('Side by Side')}</span>
548 </a>
549 <a
550 class="btn ${c.diffmode == 'unified' and 'btn-primary'} tooltip"
551 title="${_('View unified')}" href="${h.url_replace(diffmode='unified')}">
552 <span>${_('Unified')}</span>
553 </a>
554 </div>
555 </div>
556 <div class="pull-left">
557 <div class="btn-group">
558 <a
559 class="btn"
560 href="#"
561 onclick="$('input[class=filediff-collapse-state]').prop('checked', false); return false">${_('Expand All')}</a>
562 <a
563 class="btn"
564 href="#"
565 onclick="$('input[class=filediff-collapse-state]').prop('checked', true); return false">${_('Collapse All')}</a>
566 </div>
567 </div>
568 </div>
569 </%def>
@@ -0,0 +1,68 b''
1 <%def name="render_line(line_num, tokens,
2 annotation=None,
3 bgcolor=None)">
4 <%
5 from rhodecode.lib.codeblocks import render_tokenstream
6 # avoid module lookup for performance
7 html_escape = h.html_escape
8 %>
9 <tr class="cb-line cb-line-fresh"
10 %if annotation:
11 data-revision="${annotation.revision}"
12 %endif
13 >
14 <td class="cb-lineno" id="L${line_num}">
15 <a data-line-no="${line_num}" href="#L${line_num}"></a>
16 </td>
17 <td class="cb-content cb-content-fresh"
18 %if bgcolor:
19 style="background: ${bgcolor}"
20 %endif
21 >
22 ## newline at end is necessary for highlight to work when line is empty
23 ## and for copy pasting code to work as expected
24 <span class="cb-code">${render_tokenstream(tokens)|n}${'\n'}</span>
25 </td>
26 </tr>
27 </%def>
28
29 <%def name="render_annotation_lines(annotation, lines, color_hasher)">
30 <%
31 rowspan = len(lines) + 1 # span the line's <tr> and annotation <tr>
32 %>
33 %if not annotation:
34 <tr class="cb-annotate">
35 <td class="cb-annotate-message" rowspan="${rowspan}"></td>
36 <td class="cb-annotate-revision" rowspan="${rowspan}"></td>
37 </tr>
38 %else:
39 <tr class="cb-annotate">
40 <td class="cb-annotate-info tooltip"
41 rowspan="${rowspan}"
42 title="Author: ${annotation.author | entity}<br>Date: ${annotation.date}<br>Message: ${annotation.message | entity}"
43 >
44 ${h.gravatar_with_user(annotation.author, 16) | n}
45 <strong class="cb-annotate-message">${
46 h.truncate(annotation.message, len(lines) * 30)
47 }</strong>
48 </td>
49 <td
50 class="cb-annotate-revision"
51 rowspan="${rowspan}"
52 data-revision="${annotation.revision}"
53 onclick="$('[data-revision=${annotation.revision}]').toggleClass('cb-line-fresh')"
54 style="background: ${color_hasher(annotation.raw_id)}">
55 <a href="${h.url('changeset_home',repo_name=c.repo_name,revision=annotation.raw_id)}">
56 r${annotation.revision}
57 </a>
58 </td>
59 </tr>
60 %endif
61
62 %for line_num, tokens in lines:
63 ${render_line(line_num, tokens,
64 bgcolor=color_hasher(annotation and annotation.raw_id or ''),
65 annotation=annotation,
66 )}
67 %endfor
68 </%def>
@@ -0,0 +1,75 b''
1 ## -*- coding: utf-8 -*-
2 <%inherit file="/debug_style/index.html"/>
3
4 <%def name="breadcrumbs_links()">
5 ${h.link_to(_('Style'), h.url('debug_style_home'))}
6 &raquo;
7 ${c.active}
8 </%def>
9
10
11 <%def name="real_main()">
12 <div class="box">
13 <div class="title">
14 ${self.breadcrumbs()}
15 </div>
16
17 <div class='sidebar-col-wrapper'>
18
19 ${self.sidebar()}
20
21 <div class="main-content">
22
23 <h3>Alert Messages</h3>
24 <p>
25 Alert messages are produced using the custom Polymer element
26 <code>rhodecode-toast</code> which is passed a message and level.
27 </p>
28
29 <div class="bs-example">
30 <p> There are four types of alert levels:</p>
31 <div class="alert alert-success">
32 "success" is used when an action is completed as expected<br/>
33 ex. updated settings, deletion of a repo/user
34 </div>
35 <div class="alert alert-warning">
36 "warning" is for notification of impending issues<br/>
37 ex. a gist which was updated elsewhere during editing, disk out of space
38 </div>
39 <div class="alert alert-error">
40 "error" should be used for unexpected results and actions which
41 are not successful<br/>
42 ex. a form not submitted, repo creation failure
43 </div>
44 <div class="alert alert-info">
45 "info" is used for non-critical information<br/>
46 ex. notification of new messages, invitations to chat
47 </div>
48 </div>
49
50 <p><br/>
51	 Whether singular or multiple, alerts are grouped into a dismissible
52 panel with a single "Close" button underneath.
53 </p>
54 <a class="btn btn-default" id="test-notification">Test Notification</a>
55
56 <script type="text/javascript">
57 $('#test-notification').on('click', function(e){
58 var levels = ['info', 'error', 'warning', 'success'];
59 var level = levels[Math.floor(Math.random()*levels.length)];
60 var payload = {
61 message: {
62 message: 'This is a test ' +level+ ' notification.',
63 level: level,
64 force: true
65 }
66 };
67 $.Topic('/notifications').publish(payload);
68 });
69 </script>
70
71 </div>
72 </div> <!-- .main-content -->
73 </div>
74 </div> <!-- .box -->
75 </%def>
@@ -0,0 +1,52 b''
1 # -*- coding: utf-8 -*-
2
3 # Copyright (C) 2010-2016 RhodeCode GmbH
4 #
5 # This program is free software: you can redistribute it and/or modify
6 # it under the terms of the GNU Affero General Public License, version 3
7 # (only), as published by the Free Software Foundation.
8 #
9 # This program is distributed in the hope that it will be useful,
10 # but WITHOUT ANY WARRANTY; without even the implied warranty of
11 # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
12 # GNU General Public License for more details.
13 #
14 # You should have received a copy of the GNU Affero General Public License
15 # along with this program. If not, see <http://www.gnu.org/licenses/>.
16 #
17 # This program is dual-licensed. If you wish to learn more about the
18 # RhodeCode Enterprise Edition, including its added features, Support services,
19 # and proprietary license terms, please see https://rhodecode.com/licenses/
20
21
22 import pytest
23 from rhodecode import events
24
25
26 @pytest.fixture
27 def repo_push_event(backend, user_regular):
28 commits = [
29 {'message': 'ancestor commit fixes #15'},
30 {'message': 'quick fixes'},
31 {'message': 'change that fixes #41, #2'},
32 {'message': 'this is because 5b23c3532 broke stuff'},
33 {'message': 'last commit'},
34 ]
35 commit_ids = backend.create_master_repo(commits).values()
36 repo = backend.create_repo()
37 scm_extras = {
38 'ip': '127.0.0.1',
39 'username': user_regular.username,
40 'action': '',
41 'repository': repo.repo_name,
42 'scm': repo.scm_instance().alias,
43 'config': '',
44 'server_url': 'http://example.com',
45 'make_lock': None,
46 'locked_by': [None],
47 'commit_ids': commit_ids,
48 }
49
50 return events.RepoPushEvent(repo_name=repo.repo_name,
51 pushed_commit_ids=commit_ids,
52 extras=scm_extras)
@@ -0,0 +1,93 b''
1 # -*- coding: utf-8 -*-
2
3 # Copyright (C) 2010-2016 RhodeCode GmbH
4 #
5 # This program is free software: you can redistribute it and/or modify
6 # it under the terms of the GNU Affero General Public License, version 3
7 # (only), as published by the Free Software Foundation.
8 #
9 # This program is distributed in the hope that it will be useful,
10 # but WITHOUT ANY WARRANTY; without even the implied warranty of
11 # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
12 # GNU General Public License for more details.
13 #
14 # You should have received a copy of the GNU Affero General Public License
15 # along with this program. If not, see <http://www.gnu.org/licenses/>.
16 #
17 # This program is dual-licensed. If you wish to learn more about the
18 # RhodeCode Enterprise Edition, including its added features, Support services,
19 # and proprietary license terms, please see https://rhodecode.com/licenses/
20
21 import pytest
22
23 from rhodecode import events
24 from rhodecode.lib.utils2 import AttributeDict
25 from rhodecode.integrations.types.webhook import WebhookHandler
26
27
28 @pytest.fixture
29 def base_data():
30 return {
31 'repo': {
32 'repo_name': 'foo',
33 'repo_type': 'hg',
34 'repo_id': '12',
35 'url': 'http://repo.url/foo'
36 }
37 }
38
39
40 def test_webhook_parse_url_invalid_event():
41 template_url = 'http://server.com/${repo_name}/build'
42 handler = WebhookHandler(template_url, 'secret_token')
43 with pytest.raises(ValueError) as err:
44 handler(events.RepoDeleteEvent(''), {})
45 assert str(err.value).startswith('event type not supported')
46
47
48 @pytest.mark.parametrize('template,expected_urls', [
49 ('http://server.com/${repo_name}/build', ['http://server.com/foo/build']),
50 ('http://server.com/${repo_name}/${repo_type}', ['http://server.com/foo/hg']),
51 ('http://${server}.com/${repo_name}/${repo_id}', ['http://${server}.com/foo/12']),
52 ('http://server.com/${branch}/build', ['http://server.com/${branch}/build']),
53 ])
54	 def test_webhook_parse_url_for_create_event(base_data, template, expected_urls):
55 handler = WebhookHandler(template, 'secret_token')
56 urls = handler(events.RepoCreateEvent(''), base_data)
57 assert urls == [(url, 'secret_token', base_data) for url in expected_urls]
58
59
60 @pytest.mark.parametrize('template,expected_urls', [
61 ('http://server.com/${repo_name}/${pull_request_id}', ['http://server.com/foo/999']),
62 ('http://server.com/${repo_name}/${pull_request_url}', ['http://server.com/foo/http://pr-url.com']),
63 ])
64	 def test_webhook_parse_url_for_pull_request_event(base_data, template, expected_urls):
65 base_data['pullrequest'] = {
66 'pull_request_id': 999,
67 'url': 'http://pr-url.com',
68 }
69 handler = WebhookHandler(template, 'secret_token')
70 urls = handler(events.PullRequestCreateEvent(
71 AttributeDict({'target_repo': 'foo'})), base_data)
72 assert urls == [(url, 'secret_token', base_data) for url in expected_urls]
73
74
75 @pytest.mark.parametrize('template,expected_urls', [
76 ('http://server.com/${branch}/build', ['http://server.com/stable/build',
77 'http://server.com/dev/build']),
78 ('http://server.com/${branch}/${commit_id}', ['http://server.com/stable/stable-xxx',
79 'http://server.com/stable/stable-yyy',
80 'http://server.com/dev/dev-xxx',
81 'http://server.com/dev/dev-yyy']),
82 ])
83	 def test_webhook_parse_url_for_push_event(pylonsapp, repo_push_event, base_data, template, expected_urls):
84 base_data['push'] = {
85 'branches': [{'name': 'stable'}, {'name': 'dev'}],
86 'commits': [{'branch': 'stable', 'raw_id': 'stable-xxx'},
87 {'branch': 'stable', 'raw_id': 'stable-yyy'},
88 {'branch': 'dev', 'raw_id': 'dev-xxx'},
89 {'branch': 'dev', 'raw_id': 'dev-yyy'}]
90 }
91 handler = WebhookHandler(template, 'secret_token')
92 urls = handler(repo_push_event, base_data)
93 assert urls == [(url, 'secret_token', base_data) for url in expected_urls]
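The URL expansion these tests pin down (known `${vars}` substituted, unknown ones such as `${branch}` passed through untouched for non-push events) behaves like Python's `string.Template` with `safe_substitute`. A minimal sketch under that assumption; the helper name `expand_webhook_url` is hypothetical and `WebhookHandler`'s internals may differ:

```python
from string import Template

def expand_webhook_url(template_url, data):
    # safe_substitute leaves unknown ${vars} intact instead of raising,
    # matching the ${branch} passthrough asserted in the create-event test
    return Template(template_url).safe_substitute(**data)

expand_webhook_url('http://server.com/${repo_name}/build', {'repo_name': 'foo'})
# → 'http://server.com/foo/build'
```

For push events the handler then expands the same template once per branch and per commit, which is why a single template can yield several URLs.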
@@ -0,0 +1,330 b''
1 # -*- coding: utf-8 -*-
2
3 # Copyright (C) 2016-2016 RhodeCode GmbH
4 #
5 # This program is free software: you can redistribute it and/or modify
6 # it under the terms of the GNU Affero General Public License, version 3
7 # (only), as published by the Free Software Foundation.
8 #
9 # This program is distributed in the hope that it will be useful,
10 # but WITHOUT ANY WARRANTY; without even the implied warranty of
11 # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
12 # GNU General Public License for more details.
13 #
14 # You should have received a copy of the GNU Affero General Public License
15 # along with this program. If not, see <http://www.gnu.org/licenses/>.
16 #
17 # This program is dual-licensed. If you wish to learn more about the
18 # RhodeCode Enterprise Edition, including its added features, Support services,
19 # and proprietary license terms, please see https://rhodecode.com/licenses/
20
21 import pytest
22
23 from rhodecode.lib.codeblocks import (
24 tokenize_string, split_token_stream, rollup_tokenstream,
25 render_tokenstream)
26 from pygments.lexers import get_lexer_by_name
27
28
29 class TestTokenizeString(object):
30
31 python_code = '''
32 import this
33
34 var = 6
35 print "this"
36
37 '''
38
39 def test_tokenize_as_python(self):
40 lexer = get_lexer_by_name('python')
41 tokens = list(tokenize_string(self.python_code, lexer))
42
43 assert tokens == [
44 ('', u'\n'),
45 ('', u' '),
46 ('kn', u'import'),
47 ('', u' '),
48 ('nn', u'this'),
49 ('', u'\n'),
50 ('', u'\n'),
51 ('', u' '),
52 ('n', u'var'),
53 ('', u' '),
54 ('o', u'='),
55 ('', u' '),
56 ('mi', u'6'),
57 ('', u'\n'),
58 ('', u' '),
59 ('k', u'print'),
60 ('', u' '),
61 ('s2', u'"'),
62 ('s2', u'this'),
63 ('s2', u'"'),
64 ('', u'\n'),
65 ('', u'\n'),
66 ('', u' ')
67 ]
68
69 def test_tokenize_as_text(self):
70 lexer = get_lexer_by_name('text')
71 tokens = list(tokenize_string(self.python_code, lexer))
72
73 assert tokens == [
74 ('',
75 u'\n import this\n\n var = 6\n print "this"\n\n ')
76 ]
77
78
79 class TestSplitTokenStream(object):
80
81 def test_split_token_stream(self):
82 lines = list(split_token_stream(
83 [('type1', 'some\ntext'), ('type2', 'more\n')]))
84
85 assert lines == [
86 [('type1', u'some')],
87 [('type1', u'text'), ('type2', u'more')],
88 [('type2', u'')],
89 ]
90
91 def test_split_token_stream_other_char(self):
92 lines = list(split_token_stream(
93 [('type1', 'some\ntext'), ('type2', 'more\n')],
94 split_string='m'))
95
96 assert lines == [
97 [('type1', 'so')],
98 [('type1', 'e\ntext'), ('type2', '')],
99 [('type2', 'ore\n')],
100 ]
101
102 def test_split_token_stream_without_char(self):
103 lines = list(split_token_stream(
104 [('type1', 'some\ntext'), ('type2', 'more\n')],
105 split_string='z'))
106
107 assert lines == [
108 [('type1', 'some\ntext'), ('type2', 'more\n')]
109 ]
110
111 def test_split_token_stream_single(self):
112 lines = list(split_token_stream(
113 [('type1', '\n')], split_string='\n'))
114
115 assert lines == [
116 [('type1', '')],
117 [('type1', '')],
118 ]
119
120 def test_split_token_stream_single_repeat(self):
121 lines = list(split_token_stream(
122 [('type1', '\n\n\n')], split_string='\n'))
123
124 assert lines == [
125 [('type1', '')],
126 [('type1', '')],
127 [('type1', '')],
128 [('type1', '')],
129 ]
130
131 def test_split_token_stream_multiple_repeat(self):
132 lines = list(split_token_stream(
133 [('type1', '\n\n'), ('type2', '\n\n')], split_string='\n'))
134
135 assert lines == [
136 [('type1', '')],
137 [('type1', '')],
138 [('type1', ''), ('type2', '')],
139 [('type2', '')],
140 [('type2', '')],
141 ]
142
143
144 class TestRollupTokens(object):
145
146 @pytest.mark.parametrize('tokenstream,output', [
147 ([],
148 []),
149 ([('A', 'hell'), ('A', 'o')], [
150 ('A', [
151 ('', 'hello')]),
152 ]),
153 ([('A', 'hell'), ('B', 'o')], [
154 ('A', [
155 ('', 'hell')]),
156 ('B', [
157 ('', 'o')]),
158 ]),
159 ([('A', 'hel'), ('A', 'lo'), ('B', ' '), ('A', 'there')], [
160 ('A', [
161 ('', 'hello')]),
162 ('B', [
163 ('', ' ')]),
164 ('A', [
165 ('', 'there')]),
166 ]),
167 ])
168 def test_rollup_tokenstream_without_ops(self, tokenstream, output):
169 assert list(rollup_tokenstream(tokenstream)) == output
170
171 @pytest.mark.parametrize('tokenstream,output', [
172 ([],
173 []),
174 ([('A', '', 'hell'), ('A', '', 'o')], [
175 ('A', [
176 ('', 'hello')]),
177 ]),
178 ([('A', '', 'hell'), ('B', '', 'o')], [
179 ('A', [
180 ('', 'hell')]),
181 ('B', [
182 ('', 'o')]),
183 ]),
184 ([('A', '', 'h'), ('B', '', 'e'), ('C', '', 'y')], [
185 ('A', [
186 ('', 'h')]),
187 ('B', [
188 ('', 'e')]),
189 ('C', [
190 ('', 'y')]),
191 ]),
192 ([('A', '', 'h'), ('A', '', 'e'), ('C', '', 'y')], [
193 ('A', [
194 ('', 'he')]),
195 ('C', [
196 ('', 'y')]),
197 ]),
198 ([('A', 'ins', 'h'), ('A', 'ins', 'e')], [
199 ('A', [
200 ('ins', 'he')
201 ]),
202 ]),
203 ([('A', 'ins', 'h'), ('A', 'del', 'e')], [
204 ('A', [
205 ('ins', 'h'),
206 ('del', 'e')
207 ]),
208 ]),
209 ([('A', 'ins', 'h'), ('B', 'del', 'e'), ('B', 'del', 'y')], [
210 ('A', [
211 ('ins', 'h'),
212 ]),
213 ('B', [
214 ('del', 'ey'),
215 ]),
216 ]),
217 ([('A', 'ins', 'h'), ('A', 'del', 'e'), ('B', 'del', 'y')], [
218 ('A', [
219 ('ins', 'h'),
220 ('del', 'e'),
221 ]),
222 ('B', [
223 ('del', 'y'),
224 ]),
225 ]),
226 ([('A', '', 'some'), ('A', 'ins', 'new'), ('A', '', 'name')], [
227 ('A', [
228 ('', 'some'),
229 ('ins', 'new'),
230 ('', 'name'),
231 ]),
232 ]),
233 ])
234 def test_rollup_tokenstream_with_ops(self, tokenstream, output):
235 assert list(rollup_tokenstream(tokenstream)) == output
236
237
238 class TestRenderTokenStream(object):
239
240 @pytest.mark.parametrize('tokenstream,output', [
241 (
242 [],
243 '',
244 ),
245 (
246 [('', '', u'')],
247 '<span></span>',
248 ),
249 (
250 [('', '', u'text')],
251 '<span>text</span>',
252 ),
253 (
254 [('A', '', u'')],
255 '<span class="A"></span>',
256 ),
257 (
258 [('A', '', u'hello')],
259 '<span class="A">hello</span>',
260 ),
261 (
262 [('A', '', u'hel'), ('A', '', u'lo')],
263 '<span class="A">hello</span>',
264 ),
265 (
266 [('A', '', u'two\n'), ('A', '', u'lines')],
267 '<span class="A">two\nlines</span>',
268 ),
269 (
270 [('A', '', u'\nthree\n'), ('A', '', u'lines')],
271 '<span class="A">\nthree\nlines</span>',
272 ),
273 (
274 [('', '', u'\n'), ('A', '', u'line')],
275 '<span>\n</span><span class="A">line</span>',
276 ),
277 (
278 [('', 'ins', u'\n'), ('A', '', u'line')],
279 '<span><ins>\n</ins></span><span class="A">line</span>',
280 ),
281 (
282 [('A', '', u'hel'), ('A', 'ins', u'lo')],
283 '<span class="A">hel<ins>lo</ins></span>',
284 ),
285 (
286 [('A', '', u'hel'), ('A', 'ins', u'l'), ('A', 'ins', u'o')],
287 '<span class="A">hel<ins>lo</ins></span>',
288 ),
289 (
290 [('A', '', u'hel'), ('A', 'ins', u'l'), ('A', 'del', u'o')],
291 '<span class="A">hel<ins>l</ins><del>o</del></span>',
292 ),
293 (
294 [('A', '', u'hel'), ('B', '', u'lo')],
295 '<span class="A">hel</span><span class="B">lo</span>',
296 ),
297 (
298 [('A', '', u'hel'), ('B', 'ins', u'lo')],
299 '<span class="A">hel</span><span class="B"><ins>lo</ins></span>',
300 ),
301 ])
302 def test_render_tokenstream_with_ops(self, tokenstream, output):
303 html = render_tokenstream(tokenstream)
304 assert html == output
305
306 @pytest.mark.parametrize('tokenstream,output', [
307 (
308 [('A', u'hel'), ('A', u'lo')],
309 '<span class="A">hello</span>',
310 ),
311 (
312 [('A', u'hel'), ('A', u'l'), ('A', u'o')],
313 '<span class="A">hello</span>',
314 ),
315 (
316 [('A', u'hel'), ('A', u'l'), ('A', u'o')],
317 '<span class="A">hello</span>',
318 ),
319 (
320 [('A', u'hel'), ('B', u'lo')],
321 '<span class="A">hel</span><span class="B">lo</span>',
322 ),
323 (
324 [('A', u'hel'), ('B', u'lo')],
325 '<span class="A">hel</span><span class="B">lo</span>',
326 ),
327 ])
328 def test_render_tokenstream_without_ops(self, tokenstream, output):
329 html = render_tokenstream(tokenstream)
330 assert html == output
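The splitting behaviour the `TestSplitTokenStream` cases above pin down — a token's text may span several lines, and a trailing separator yields one extra empty line — can be sketched as a small generator. This is an illustrative reimplementation, not the actual `rhodecode.lib.codeblocks` code:

```python
def split_token_stream(tokens, split_string='\n'):
    """Split a (token_type, text) stream into per-line token lists.

    Minimal sketch matching the behaviour asserted in the tests above;
    the real implementation may differ.
    """
    buffer = []
    for token_type, text in tokens:
        parts = text.split(split_string)
        for i, part in enumerate(parts):
            if i > 0:
                # each split point terminates the current line
                yield buffer
                buffer = []
            buffer.append((token_type, part))
    # the remainder after the last split point is the final line
    yield buffer
```

Because `str.split` on a string ending in the separator produces a trailing empty chunk, `[('type1', '\n')]` comes out as two lines of `[('type1', '')]`, exactly as `test_split_token_stream_single` expects.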
@@ -0,0 +1,15 b''
1 import py.test
2
3 from rhodecode.lib.system_info import get_system_info
4
5
6 def test_system_info(app):
7 info = get_system_info({})
8 assert info['load']['value']['15_min'] != 'NOT AVAILABLE'
9
10
11 def test_system_info_without_psutil(monkeypatch, app):
12 import rhodecode.lib.system_info
13 monkeypatch.setattr(rhodecode.lib.system_info, 'psutil', None)
14 info = get_system_info({})
15 assert info['load']['value']['15_min'] == 'NOT AVAILABLE'
@@ -0,0 +1,116 b''
1 # -*- coding: utf-8 -*-
2
3 # Copyright (C) 2016-2016 RhodeCode GmbH
4 #
5 # This program is free software: you can redistribute it and/or modify
6 # it under the terms of the GNU Affero General Public License, version 3
7 # (only), as published by the Free Software Foundation.
8 #
9 # This program is distributed in the hope that it will be useful,
10 # but WITHOUT ANY WARRANTY; without even the implied warranty of
11 # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
12 # GNU General Public License for more details.
13 #
14 # You should have received a copy of the GNU Affero General Public License
15 # along with this program. If not, see <http://www.gnu.org/licenses/>.
16 #
17 # This program is dual-licensed. If you wish to learn more about the
18 # RhodeCode Enterprise Edition, including its added features, Support services,
19 # and proprietary license terms, please see https://rhodecode.com/licenses/
20
21 import colander
22 import pytest
23
24 from rhodecode.model.validation_schema import types
25 from rhodecode.model.validation_schema.schemas import repo_group_schema
26
27
28 class TestRepoGroupSchema(object):
29
30 @pytest.mark.parametrize('given, expected', [
31 ('my repo', 'my-repo'),
32 (' hello world mike ', 'hello-world-mike'),
33
34 ('//group1/group2//', 'group1/group2'),
35 ('//group1///group2//', 'group1/group2'),
36 ('///group1/group2///group3', 'group1/group2/group3'),
37 ('word g1/group2///group3', 'word-g1/group2/group3'),
38
39 ('grou p1/gro;,,##up2//.../group3', 'grou-p1/group2/group3'),
40
41 ('group,,,/,,,/1/2/3', 'group/1/2/3'),
42 ('grou[]p1/gro;up2///gro up3', 'group1/group2/gro-up3'),
43 (u'grou[]p1/gro;up2///gro up3/ąć', u'group1/group2/gro-up3/ąć'),
44 ])
45 def test_deserialize_repo_name(self, app, user_admin, given, expected):
46 schema = repo_group_schema.RepoGroupSchema().bind()
47 assert schema.get('repo_group_name').deserialize(given) == expected
48
49 def test_deserialize(self, app, user_admin):
50 schema = repo_group_schema.RepoGroupSchema().bind(
51 user=user_admin
52 )
53
54 schema_data = schema.deserialize(dict(
55 repo_group_name='dupa',
56 repo_group_owner=user_admin.username
57 ))
58
59 assert schema_data['repo_group_name'] == 'dupa'
60 assert schema_data['repo_group'] == {
61 'repo_group_id': None,
62 'repo_group_name': types.RootLocation,
63 'repo_group_name_without_group': 'dupa'}
64
65 @pytest.mark.parametrize('given, err_key, expected_exc', [
66 ('xxx/dupa', 'repo_group', 'Parent repository group `xxx` does not exist'),
67 ('', 'repo_group_name', 'Name must start with a letter or number. Got ``'),
68 ])
69 def test_deserialize_with_bad_group_name(
70 self, app, user_admin, given, err_key, expected_exc):
71 schema = repo_group_schema.RepoGroupSchema().bind(
72 repo_type_options=['hg'],
73 user=user_admin
74 )
75
76 with pytest.raises(colander.Invalid) as excinfo:
77 schema.deserialize(dict(
78 repo_group_name=given,
79 repo_group_owner=user_admin.username
80 ))
81
82 assert excinfo.value.asdict()[err_key] == expected_exc
83
84 def test_deserialize_with_group_name(self, app, user_admin, test_repo_group):
85 schema = repo_group_schema.RepoGroupSchema().bind(
86 user=user_admin
87 )
88
89 full_name = test_repo_group.group_name + '/dupa'
90 schema_data = schema.deserialize(dict(
91 repo_group_name=full_name,
92 repo_group_owner=user_admin.username
93 ))
94
95 assert schema_data['repo_group_name'] == full_name
96 assert schema_data['repo_group'] == {
97 'repo_group_id': test_repo_group.group_id,
98 'repo_group_name': test_repo_group.group_name,
99 'repo_group_name_without_group': 'dupa'}
100
101 def test_deserialize_with_group_name_regular_user_no_perms(
102 self, app, user_regular, test_repo_group):
103 schema = repo_group_schema.RepoGroupSchema().bind(
104 user=user_regular
105 )
106
107 full_name = test_repo_group.group_name + '/dupa'
108 with pytest.raises(colander.Invalid) as excinfo:
109 schema.deserialize(dict(
110 repo_group_name=full_name,
111 repo_group_owner=user_regular.username
112 ))
113
114 expected = 'Parent repository group `{}` does not exist'.format(
115 test_repo_group.group_name)
116 assert excinfo.value.asdict()['repo_group'] == expected
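The `repo_group_name` expectations in the parametrized test above follow a consistent normalisation pattern: punctuation is dropped, whitespace becomes dashes, and empty path segments (including runs of slashes) are collapsed. A rough sketch reproducing it — the function name `slugify_path` is hypothetical, and the real schema uses its own colander preparers:

```python
import re

def slugify_path(value):
    # illustrative approximation, not RhodeCode's actual validator
    segments = []
    for segment in value.split('/'):
        # drop everything except word characters, whitespace and dashes
        segment = re.sub(r'[^\w\s-]', '', segment)
        # collapse inner whitespace runs to single dashes
        segment = re.sub(r'\s+', '-', segment.strip())
        if segment:
            segments.append(segment)
    return '/'.join(segments)
```

Dropping empty segments is what turns both `//group1/group2//` and the all-punctuation segment `...` into clean `group1/group2`-style paths.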
@@ -0,0 +1,128 b''
1 # -*- coding: utf-8 -*-
2
3 # Copyright (C) 2016-2016 RhodeCode GmbH
4 #
5 # This program is free software: you can redistribute it and/or modify
6 # it under the terms of the GNU Affero General Public License, version 3
7 # (only), as published by the Free Software Foundation.
8 #
9 # This program is distributed in the hope that it will be useful,
10 # but WITHOUT ANY WARRANTY; without even the implied warranty of
11 # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
12 # GNU General Public License for more details.
13 #
14 # You should have received a copy of the GNU Affero General Public License
15 # along with this program. If not, see <http://www.gnu.org/licenses/>.
16 #
17 # This program is dual-licensed. If you wish to learn more about the
18 # RhodeCode Enterprise Edition, including its added features, Support services,
19 # and proprietary license terms, please see https://rhodecode.com/licenses/
20
21 import colander
22 import pytest
23
24 from rhodecode.model.validation_schema import types
25 from rhodecode.model.validation_schema.schemas import repo_schema
26
27
28 class TestRepoSchema(object):
29
30 #TODO:
31 # test nested groups
32
33 @pytest.mark.parametrize('given, expected', [
34 ('my repo', 'my-repo'),
35 (' hello world mike ', 'hello-world-mike'),
36
37 ('//group1/group2//', 'group1/group2'),
38 ('//group1///group2//', 'group1/group2'),
39 ('///group1/group2///group3', 'group1/group2/group3'),
40 ('word g1/group2///group3', 'word-g1/group2/group3'),
41
42 ('grou p1/gro;,,##up2//.../group3', 'grou-p1/group2/group3'),
43
44 ('group,,,/,,,/1/2/3', 'group/1/2/3'),
45 ('grou[]p1/gro;up2///gro up3', 'group1/group2/gro-up3'),
46 (u'grou[]p1/gro;up2///gro up3/ąć', u'group1/group2/gro-up3/ąć'),
47 ])
48 def test_deserialize_repo_name(self, app, user_admin, given, expected):
49
50 schema = repo_schema.RepoSchema().bind()
51 assert expected == schema.get('repo_name').deserialize(given)
52
53 def test_deserialize(self, app, user_admin):
54 schema = repo_schema.RepoSchema().bind(
55 repo_type_options=['hg'],
56 user=user_admin
57 )
58
59 schema_data = schema.deserialize(dict(
60 repo_name='dupa',
61 repo_type='hg',
62 repo_owner=user_admin.username
63 ))
64
65 assert schema_data['repo_name'] == 'dupa'
66 assert schema_data['repo_group'] == {
67 'repo_group_id': None,
68 'repo_group_name': types.RootLocation,
69 'repo_name_without_group': 'dupa'}
70
71 @pytest.mark.parametrize('given, err_key, expected_exc', [
72	 ('xxx/dupa', 'repo_group', 'Repository group `xxx` does not exist'),
73 ('', 'repo_name', 'Name must start with a letter or number. Got ``'),
74 ])
75 def test_deserialize_with_bad_group_name(
76 self, app, user_admin, given, err_key, expected_exc):
77
78 schema = repo_schema.RepoSchema().bind(
79 repo_type_options=['hg'],
80 user=user_admin
81 )
82
83 with pytest.raises(colander.Invalid) as excinfo:
84 schema.deserialize(dict(
85 repo_name=given,
86 repo_type='hg',
87 repo_owner=user_admin.username
88 ))
89
90 assert excinfo.value.asdict()[err_key] == expected_exc
91
92 def test_deserialize_with_group_name(self, app, user_admin, test_repo_group):
93 schema = repo_schema.RepoSchema().bind(
94 repo_type_options=['hg'],
95 user=user_admin
96 )
97
98 full_name = test_repo_group.group_name + '/dupa'
99 schema_data = schema.deserialize(dict(
100 repo_name=full_name,
101 repo_type='hg',
102 repo_owner=user_admin.username
103 ))
104
105 assert schema_data['repo_name'] == full_name
106 assert schema_data['repo_group'] == {
107 'repo_group_id': test_repo_group.group_id,
108 'repo_group_name': test_repo_group.group_name,
109 'repo_name_without_group': 'dupa'}
110
111 def test_deserialize_with_group_name_regular_user_no_perms(
112 self, app, user_regular, test_repo_group):
113 schema = repo_schema.RepoSchema().bind(
114 repo_type_options=['hg'],
115 user=user_regular
116 )
117
118 full_name = test_repo_group.group_name + '/dupa'
119 with pytest.raises(colander.Invalid) as excinfo:
120 schema.deserialize(dict(
121 repo_name=full_name,
122 repo_type='hg',
123 repo_owner=user_regular.username
124 ))
125
126 expected = 'Repository group `{}` does not exist'.format(
127 test_repo_group.group_name)
128 assert excinfo.value.asdict()['repo_group'] == expected
@@ -0,0 +1,716 b''
1
2
3 ################################################################################
4 ## RHODECODE ENTERPRISE CONFIGURATION ##
5 # The %(here)s variable will be replaced with the parent directory of this file#
6 ################################################################################
7
8 [DEFAULT]
9 debug = true
10
11 ################################################################################
12 ## EMAIL CONFIGURATION ##
13 ## Uncomment and replace with the email address which should receive ##
14 ## any error reports after an application crash ##
15 ## Additionally these settings will be used by the RhodeCode mailing system ##
16 ################################################################################
17
18 ## prefix all emails subjects with given prefix, helps filtering out emails
19 #email_prefix = [RhodeCode]
20
21	 ## email FROM address from which all mails will be sent
22 #app_email_from = rhodecode-noreply@localhost
23
24 ## Uncomment and replace with the address which should receive any error report
25 ## note: using appenlight for error handling doesn't need this to be uncommented
26 #email_to = admin@localhost
27
28	 ## in case of application errors, send error emails from this address
29 #error_email_from = rhodecode_error@localhost
30
31	 ## additional error message to be sent in case of server crash
32 #error_message =
33
34
35 #smtp_server = mail.server.com
36 #smtp_username =
37 #smtp_password =
38 #smtp_port =
39 #smtp_use_tls = false
40 #smtp_use_ssl = true
41 ## Specify available auth parameters here (e.g. LOGIN PLAIN CRAM-MD5, etc.)
42 #smtp_auth =
43
44 [server:main]
45 ## COMMON ##
46 host = 0.0.0.0
47 port = 5000
48
49 ##################################
50 ## WAITRESS WSGI SERVER ##
51 ## Recommended for Development ##
52 ##################################
53
54 use = egg:waitress#main
55 ## number of worker threads
56 threads = 5
57 ## MAX BODY SIZE 100GB
58 max_request_body_size = 107374182400
59	 ## Use poll instead of select; fixes file descriptor limit problems.
60	 ## May not work on old Windows systems.
61 asyncore_use_poll = true
62
63
64 ##########################
65 ## GUNICORN WSGI SERVER ##
66 ##########################
67 ## run with gunicorn --log-config <inifile.ini> --paste <inifile.ini>
68
69 #use = egg:gunicorn#main
70 ## Sets the number of process workers. You must set `instance_id = *`
71 ## when this option is set to more than one worker, recommended
72	 ## value is (2 * NUMBER_OF_CPUS + 1), e.g. 2 CPUs = 5 workers
73 ## The `instance_id = *` must be set in the [app:main] section below
74 #workers = 2
75 ## number of threads for each of the worker, must be set to 1 for gevent
76	 ## generally recommended to be 1
77 #threads = 1
78 ## process name
79 #proc_name = rhodecode
80 ## type of worker class, one of sync, gevent
81	 ## for bigger setups, a worker class other than sync is recommended
82 #worker_class = sync
83 ## The maximum number of simultaneous clients. Valid only for Gevent
84 #worker_connections = 10
85 ## max number of requests that worker will handle before being gracefully
86 ## restarted, could prevent memory leaks
87 #max_requests = 1000
88 #max_requests_jitter = 30
89 ## amount of time a worker can spend handling a request before it
90 ## gets killed and restarted. Set to 6 hours
91 #timeout = 21600
92
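A quick sanity check of the worker formula in the comment above; `recommended_gunicorn_workers` is a hypothetical helper for illustration, not part of RhodeCode:

```python
import os

def recommended_gunicorn_workers(cpu_count=None):
    # formula from the comment above: (2 * NUMBER_OF_CPUS + 1)
    if cpu_count is None:
        cpu_count = os.cpu_count() or 1
    return 2 * cpu_count + 1

# eg. 2 CPUs -> 5 workers, matching the comment
print(recommended_gunicorn_workers(2))
```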
93 ## UWSGI ##
94 ## run with uwsgi --ini-paste-logged <inifile.ini>
95 #[uwsgi]
96 #socket = /tmp/uwsgi.sock
97 #master = true
98 #http = 127.0.0.1:5000
99
100 ## run as daemon and redirect all output to a file
101 #daemonize = ./uwsgi_rhodecode.log
102
103 ## master process PID
104 #pidfile = ./uwsgi_rhodecode.pid
105
106 ## stats server with workers statistics, use uwsgitop
107 ## for monitoring, `uwsgitop 127.0.0.1:1717`
108 #stats = 127.0.0.1:1717
109 #memory-report = true
110
111 ## log 5XX errors
112 #log-5xx = true
113
114 ## Set the socket listen queue size.
115 #listen = 256
116
117 ## gracefully reload workers after the specified number of handled requests
118 ## (avoids memory leaks).
119 #max-requests = 1000
120
121 ## enable large buffers
122 #buffer-size=65535
123
124 ## socket and http timeouts ##
125 #http-timeout=3600
126 #socket-timeout=3600
127
128 ## Log requests slower than the specified number of milliseconds.
129 #log-slow = 10
130
131 ## Exit if no app can be loaded.
132 #need-app = true
133
134 ## Set lazy mode (load apps in workers instead of master).
135 #lazy = true
136
137 ## scaling ##
138 ## set cheaper algorithm to use, if not set default will be used
139 #cheaper-algo = spare
140
141 ## minimum number of workers to keep at all times
142 #cheaper = 1
143
144 ## number of workers to spawn at startup
145 #cheaper-initial = 1
146
147 ## maximum number of workers that can be spawned
148 #workers = 4
149
150 ## how many workers should be spawned at a time
151 #cheaper-step = 1
152
153 ## prefix middleware for RhodeCode.
154 ## recommended when using a proxy setup.
155 ## allows running RhodeCode under a URL prefix on the server.
156 ## eg https://server.com/<prefix>. Enable `filter-with =` option below as well.
157 ## optionally set prefix like: `prefix = /<your-prefix>`
158 [filter:proxy-prefix]
159 use = egg:PasteDeploy#prefix
160 prefix = /
161
162 [app:main]
163 is_test = True
164 use = egg:rhodecode-enterprise-ce
165
166 ## enable proxy prefix middleware, defined above
167 #filter-with = proxy-prefix
168
169
170 ## RHODECODE PLUGINS ##
171 rhodecode.includes = rhodecode.api
172
173 # api prefix url
174 rhodecode.api.url = /_admin/api
175
176
177 ## END RHODECODE PLUGINS ##
178
179 ## encryption key used to encrypt social plugin tokens,
180 ## remote_urls with credentials etc, if not set it defaults to
181 ## `beaker.session.secret`
182 #rhodecode.encrypted_values.secret =
183
184 ## decryption strict mode (enabled by default). It controls if decryption raises
185 ## `SignatureVerificationError` in case of a wrong key or damaged encryption data.
186 #rhodecode.encrypted_values.strict = false
187
188 ## return gzipped responses from Rhodecode (static files/application)
189 gzip_responses = false
190
191 ## autogenerate javascript routes file on startup
192 generate_js_files = false
193
194 ## Optional Languages
195 ## en(default), be, de, es, fr, it, ja, pl, pt, ru, zh
196 lang = en
197
198 ## perform a full repository scan on each server start; this should be
199 ## set to false after the first startup to allow faster server restarts.
200 startup.import_repos = true
201
202 ## Uncomment and set this path to use archive download cache.
203 ## Once enabled, generated archives will be cached at this location
204 ## and served from the cache during subsequent requests for the same archive of
205 ## the repository.
206 #archive_cache_dir = /tmp/tarballcache
207
208 ## change this to unique ID for security
209 app_instance_uuid = rc-production
210
211 ## cut off limit for large diffs (size in bytes)
212 cut_off_limit_diff = 1024000
213 cut_off_limit_file = 256000
214
215 ## use cache version of scm repo everywhere
216 vcs_full_cache = false
217
218 ## force https in RhodeCode, fixes https redirects, assumes it's always https
219 ## Normally this is controlled by proper http flags sent from http server
220 force_https = false
221
222 ## use Strict-Transport-Security headers
223 use_htsts = false
224
225 ## number of commits stats will parse on each iteration
226 commit_parse_limit = 25
227
228 ## git rev filter option, --all is the default filter; if you need to
229 ## hide all refs in the changelog switch this to --branches --tags
230 git_rev_filter = --all
231
232 # Set to true if your repos are exposed using the dumb protocol
233 git_update_server_info = false
234
235 ## RSS/ATOM feed options
236 rss_cut_off_limit = 256000
237 rss_items_per_page = 10
238 rss_include_diff = false
239
240 ## gist URL alias, used to create nicer urls for gist. This should be an
241 ## url that does rewrites to _admin/gists/<gistid>.
242 ## example: http://gist.rhodecode.org/{gistid}. Empty means use the internal
243 ## RhodeCode url, ie. http[s]://rhodecode.server/_admin/gists/<gistid>
244 gist_alias_url =
245
246 ## List of controllers (using glob pattern syntax) that AUTH TOKENS could be
247 ## used for access.
248 ## Adding ?auth_token = <token> to the url authenticates this request as if it
249 ## came from the logged-in user who owns this authentication token.
250 ##
251 ## Syntax is <ControllerClass>:<function_pattern>.
252 ## To enable access to raw_files put `FilesController:raw`.
253 ## To enable access to patches add `ChangesetController:changeset_patch`.
254 ## The list should be "," separated and on a single line.
255 ##
256 ## Recommended controllers to enable:
257 # ChangesetController:changeset_patch,
258 # ChangesetController:changeset_raw,
259 # FilesController:raw,
260 # FilesController:archivefile,
261 # GistsController:*,
262 api_access_controllers_whitelist =
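The glob-pattern whitelist described above can be illustrated with a small sketch; `is_whitelisted` is a hypothetical helper using Python's `fnmatch`, not RhodeCode's actual implementation:

```python
from fnmatch import fnmatch

def is_whitelisted(controller, func, whitelist):
    # check a 'ControllerClass:function_pattern' list (glob syntax,
    # comma separated) as described in the comment above
    target = '%s:%s' % (controller, func)
    patterns = [p.strip() for p in whitelist.split(',') if p.strip()]
    return any(fnmatch(target, p) for p in patterns)

acl = 'ChangesetController:changeset_patch, FilesController:raw, GistsController:*'
print(is_whitelisted('GistsController', 'show', acl))         # glob matches all Gists functions
print(is_whitelisted('FilesController', 'archivefile', acl))  # not in the list
```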
263
264 ## default encoding used to convert from and to unicode
265 ## can be also a comma separated list of encoding in case of mixed encodings
266 default_encoding = UTF-8
267
268 ## instance-id prefix
269 ## a prefix key for this instance used for cache invalidation when running
270 ## multiple instances of rhodecode, make sure it's globally unique for
271 ## all running rhodecode instances. Leave empty if you don't use it
272 instance_id =
273
274 ## Fallback authentication plugin. Set this to a plugin ID to force the usage
275 ## of an authentication plugin even if it is disabled by its settings.
276 ## This could be useful if you are unable to log in to the system due to broken
277 ## authentication settings. Then you can enable e.g. the internal rhodecode auth
278 ## module to log in again and fix the settings.
279 ##
280 ## Available builtin plugin IDs (hash is part of the ID):
281 ## egg:rhodecode-enterprise-ce#rhodecode
282 ## egg:rhodecode-enterprise-ce#pam
283 ## egg:rhodecode-enterprise-ce#ldap
284 ## egg:rhodecode-enterprise-ce#jasig_cas
285 ## egg:rhodecode-enterprise-ce#headers
286 ## egg:rhodecode-enterprise-ce#crowd
287 #rhodecode.auth_plugin_fallback = egg:rhodecode-enterprise-ce#rhodecode
288
289 ## alternative return HTTP code for failed authentication. The default HTTP
290 ## response is 401 HTTPUnauthorized. Currently HG clients have trouble
291 ## handling that, causing a series of failed authentication calls.
292 ## Set this variable to 403 to return HTTPForbidden, or any other HTTP code.
293 ## This will be served instead of the default 401 on bad authentication.
294 auth_ret_code =
295
296 ## use special detection method when serving auth_ret_code, instead of serving
297 ## ret_code directly, use 401 initially (which triggers a credentials prompt)
298 ## and then serve auth_ret_code to clients
299 auth_ret_code_detection = false
300
301 ## locking return code. When repository is locked return this HTTP code. 2XX
302 ## codes don't break the transactions while 4XX codes do
303 lock_ret_code = 423
304
305 ## allows changing the repository location on the settings page
306 allow_repo_location_change = true
307
308 ## allows setting up custom hooks on the settings page
309 allow_custom_hooks_settings = true
310
311 ## generated license token; go to the license page in RhodeCode settings
312 ## to obtain a new token
313 license_token = abra-cada-bra1-rce3
314
315 ## supervisor connection uri, for managing supervisor and logs.
316 supervisor.uri =
317 ## supervisord group name/id that this RC instance should handle
318 supervisor.group_id = dev
319
320 ## Display extended labs settings
321 labs_settings_active = true
322
323 ####################################
324 ### CELERY CONFIG ####
325 ####################################
326 use_celery = false
327 broker.host = localhost
328 broker.vhost = rabbitmqhost
329 broker.port = 5672
330 broker.user = rabbitmq
331 broker.password = qweqwe
332
333 celery.imports = rhodecode.lib.celerylib.tasks
334
335 celery.result.backend = amqp
336 celery.result.dburi = amqp://
337 celery.result.serialier = json
338
339 #celery.send.task.error.emails = true
340 #celery.amqp.task.result.expires = 18000
341
342 celeryd.concurrency = 2
343 #celeryd.log.file = celeryd.log
344 celeryd.log.level = debug
345 celeryd.max.tasks.per.child = 1
346
347 ## tasks will never be sent to the queue, but executed locally instead.
348 celery.always.eager = false
349
350 ####################################
351 ### BEAKER CACHE ####
352 ####################################
353 ## default cache dir for templates. Putting this into a ramdisk
354 ## can boost performance, eg. %(here)s/data_ramdisk
355 cache_dir = %(here)s/data
356
357 ## locking and default file storage for Beaker. Putting this into a ramdisk
358 ## can boost performance, eg. %(here)s/data_ramdisk/cache/beaker_data
359 beaker.cache.data_dir = %(here)s/rc/data/cache/beaker_data
360 beaker.cache.lock_dir = %(here)s/rc/data/cache/beaker_lock
361
362 beaker.cache.regions = super_short_term, short_term, long_term, sql_cache_short, auth_plugins, repo_cache_long
363
364 beaker.cache.super_short_term.type = memory
365 beaker.cache.super_short_term.expire = 1
366 beaker.cache.super_short_term.key_length = 256
367
368 beaker.cache.short_term.type = memory
369 beaker.cache.short_term.expire = 60
370 beaker.cache.short_term.key_length = 256
371
372 beaker.cache.long_term.type = memory
373 beaker.cache.long_term.expire = 36000
374 beaker.cache.long_term.key_length = 256
375
376 beaker.cache.sql_cache_short.type = memory
377 beaker.cache.sql_cache_short.expire = 1
378 beaker.cache.sql_cache_short.key_length = 256
379
380 ## default is memory cache, configure only if required
381 ## using multi-node or multi-worker setup
382 #beaker.cache.auth_plugins.type = ext:database
383 #beaker.cache.auth_plugins.lock_dir = %(here)s/data/cache/auth_plugin_lock
384 #beaker.cache.auth_plugins.url = postgresql://postgres:secret@localhost/rhodecode
385 #beaker.cache.auth_plugins.url = mysql://root:secret@127.0.0.1/rhodecode
386 #beaker.cache.auth_plugins.sa.pool_recycle = 3600
387 #beaker.cache.auth_plugins.sa.pool_size = 10
388 #beaker.cache.auth_plugins.sa.max_overflow = 0
389
390 beaker.cache.repo_cache_long.type = memorylru_base
391 beaker.cache.repo_cache_long.max_items = 4096
392 beaker.cache.repo_cache_long.expire = 2592000
393
394 ## default is memorylru_base cache, configure only if required
395 ## using multi-node or multi-worker setup
396 #beaker.cache.repo_cache_long.type = ext:memcached
397 #beaker.cache.repo_cache_long.url = localhost:11211
398 #beaker.cache.repo_cache_long.expire = 1209600
399 #beaker.cache.repo_cache_long.key_length = 256
400
401 ####################################
402 ### BEAKER SESSION ####
403 ####################################
404
405 ## .session.type is the type of storage used for the session; currently allowed
406 ## types are file, ext:memcached, ext:database, and memory (default).
407 beaker.session.type = file
408 beaker.session.data_dir = %(here)s/rc/data/sessions/data
409
410 ## db based session, fast, and allows easy management over logged in users
411 #beaker.session.type = ext:database
412 #beaker.session.table_name = db_session
413 #beaker.session.sa.url = postgresql://postgres:secret@localhost/rhodecode
414 #beaker.session.sa.url = mysql://root:secret@127.0.0.1/rhodecode
415 #beaker.session.sa.pool_recycle = 3600
416 #beaker.session.sa.echo = false
417
418 beaker.session.key = rhodecode
419 beaker.session.secret = test-rc-uytcxaz
420 beaker.session.lock_dir = %(here)s/rc/data/sessions/lock
421
422 ## Secure encrypted cookie. Requires AES and AES python libraries
423 ## you must disable beaker.session.secret to use this
424 #beaker.session.encrypt_key = <key_for_encryption>
425 #beaker.session.validate_key = <validation_key>
426
427 ## sets session as invalid (also logging out the user) if it has not been
428 ## accessed for the given amount of time in seconds
429 beaker.session.timeout = 2592000
430 beaker.session.httponly = true
431 ## Path to use for the cookie.
432 #beaker.session.cookie_path = /<your-prefix>
433
434 ## uncomment for https secure cookie
435 beaker.session.secure = false
436
437 ## auto-save the session to avoid having to call .save()
438 beaker.session.auto = false
439
440 ## default cookie expiration time in seconds, set to `true` to expire the
441 ## cookie at browser close
442 #beaker.session.cookie_expires = 3600
443
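The large timeout values in this section are easier to read converted to days; a quick check with plain arithmetic (no RhodeCode code involved):

```python
# beaker.session.timeout above is 2592000 seconds
SESSION_TIMEOUT_SECONDS = 2592000
print(SESSION_TIMEOUT_SECONDS // (24 * 3600))  # 30 days
```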
444 ###################################
445 ## SEARCH INDEXING CONFIGURATION ##
446 ###################################
447 ## Full text search indexer is available in rhodecode-tools under
448 ## `rhodecode-tools index` command
449
450 ## WHOOSH backend, doesn't require additional services to run
451 ## it works well with a few dozen repos
452 search.module = rhodecode.lib.index.whoosh
453 search.location = %(here)s/data/index
454
455 ########################################
456 ### CHANNELSTREAM CONFIG ####
457 ########################################
458 ## channelstream enables persistent connections and live notification
459 ## in the system. It's also used by the chat system
460
461 channelstream.enabled = false
462 # location of channelstream server on the backend
463 channelstream.server = 127.0.0.1:9800
464 ## location of the channelstream server from outside world
465 ## most likely this would be a special backend URL on the http server that
466 ## handles websocket connections; see the nginx example config
467 channelstream.ws_url = ws://rhodecode.yourserver.com/_channelstream
468 channelstream.secret = secret
469 channelstream.history.location = %(here)s/channelstream_history
470
471
472 ###################################
473 ## APPENLIGHT CONFIG ##
474 ###################################
475
476 ## Appenlight is tailored to work with RhodeCode, see
477 ## http://appenlight.com for details on how to obtain an account
478
479 ## appenlight integration enabled
480 appenlight = false
481
482 appenlight.server_url = https://api.appenlight.com
483 appenlight.api_key = YOUR_API_KEY
484 #appenlight.transport_config = https://api.appenlight.com?threaded=1&timeout=5
485
486 # used for JS client
487 appenlight.api_public_key = YOUR_API_PUBLIC_KEY
488
489 ## TWEAK AMOUNT OF INFO SENT HERE
490
491 ## enables 404 error logging (default False)
492 appenlight.report_404 = false
493
494 ## time in seconds after request is considered being slow (default 1)
495 appenlight.slow_request_time = 1
496
497 ## record slow requests in application
498 ## (needs to be enabled for slow datastore recording and time tracking)
499 appenlight.slow_requests = true
500
501 ## enable hooking to application loggers
502 appenlight.logging = true
503
504 ## minimum log level for log capture
505 appenlight.logging.level = WARNING
506
507 ## send logs only from erroneous/slow requests
508 ## (saves API quota for intensive logging)
509 appenlight.logging_on_error = false
510
511 ## list of additional keywords that should be grabbed from the environ object
512 ## can be a string with a comma separated list of words in lowercase
513 ## (by default the client will always send the following info:
514 ## 'REMOTE_USER', 'REMOTE_ADDR', 'SERVER_NAME', 'CONTENT_TYPE' + all keys that
515 ## start with HTTP*); this list can be extended with additional keywords here
516 appenlight.environ_keys_whitelist =
517
518 ## list of keywords that should be blanked from the request object
519 ## can be a string with a comma separated list of words in lowercase
520 ## (by default the client will always blank keys that contain the following
521 ## words: 'password', 'passwd', 'pwd', 'auth_tkt', 'secret', 'csrf');
522 ## this list can be extended with additional keywords set here
523 appenlight.request_keys_blacklist =
524
525 ## list of namespaces that should be ignored when gathering log entries
526 ## can be a string with a comma separated list of namespaces
527 ## (by default the client ignores own entries: appenlight_client.client)
528 appenlight.log_namespace_blacklist =
529
530
531 ################################################################################
532 ## WARNING: *THE LINE BELOW MUST BE UNCOMMENTED ON A PRODUCTION ENVIRONMENT* ##
533 ## Debug mode will enable the interactive debugging tool, allowing ANYONE to ##
534 ## execute malicious code after an exception is raised. ##
535 ################################################################################
536 set debug = false
537
538
539 ##############
540 ## STYLING ##
541 ##############
542 debug_style = false
543
544 #########################################################
545 ### DB CONFIGS - EACH DB WILL HAVE ITS OWN CONFIG    ###
546 #########################################################
547 #sqlalchemy.db1.url = sqlite:///%(here)s/rhodecode_test.db
548 #sqlalchemy.db1.url = postgresql://postgres:qweqwe@localhost/rhodecode_test
549 #sqlalchemy.db1.url = mysql://root:qweqwe@localhost/rhodecode_test
550 sqlalchemy.db1.url = sqlite:///%(here)s/rhodecode_test.db
551
552 # see sqlalchemy docs for other advanced settings
553
554 ## print the sql statements to output
555 sqlalchemy.db1.echo = false
556 ## recycle the connections after this amount of seconds
557 sqlalchemy.db1.pool_recycle = 3600
558 sqlalchemy.db1.convert_unicode = true
559
560 ## the number of connections to keep open inside the connection pool.
561 ## 0 indicates no limit
562 #sqlalchemy.db1.pool_size = 5
563
564 ## the number of connections to allow in connection pool "overflow", that is
565 ## connections that can be opened above and beyond the pool_size setting,
566 ## which defaults to five.
567 #sqlalchemy.db1.max_overflow = 10
568
569
570 ##################
571 ### VCS CONFIG ###
572 ##################
573 vcs.server.enable = true
574 vcs.server = localhost:9901
575
576 ## Web server connectivity protocol, responsible for web based VCS operations
577 ## Available protocols are:
578 ## `pyro4` - using pyro4 server
579 ## `http` - using http-rpc backend
580 vcs.server.protocol = http
581
582 ## Push/Pull operations protocol, available options are:
583 ## `pyro4` - using pyro4 server
584 ## `rhodecode.lib.middleware.utils.scm_app_http` - Http based, recommended
585 ## `vcsserver.scm_app` - internal app (EE only)
586 vcs.scm_app_implementation = http
587
588 ## Push/Pull operations hooks protocol, available options are:
589 ## `pyro4` - using pyro4 server
590 ## `http` - using http-rpc backend
591 vcs.hooks.protocol = http
592
593 vcs.server.log_level = debug
594 ## Start VCSServer with this instance as a subprocess, useful for development
595 vcs.start_server = false
596
597 ## List of enabled VCS backends, available options are:
598 ## `hg` - mercurial
599 ## `git` - git
600 ## `svn` - subversion
601 vcs.backends = hg, git, svn
602
603 vcs.connection_timeout = 3600
604 ## Compatibility version when creating SVN repositories. Defaults to newest version when commented out.
605 ## Available options are: pre-1.4-compatible, pre-1.5-compatible, pre-1.6-compatible, pre-1.8-compatible
606 #vcs.svn.compatible_version = pre-1.8-compatible
607
608
609 ############################################################
610 ### Subversion proxy support (mod_dav_svn) ###
611 ### Maps RhodeCode repo groups into SVN paths for Apache ###
612 ############################################################
613 ## Enable or disable the config file generation.
614 svn.proxy.generate_config = false
615 ## Generate config file with `SVNListParentPath` set to `On`.
616 svn.proxy.list_parent_path = true
617 ## Set location and file name of generated config file.
618 svn.proxy.config_file_path = %(here)s/mod_dav_svn.conf
619 ## File system path to the directory containing the repositories served by
620 ## RhodeCode.
621 svn.proxy.parent_path_root = /path/to/repo_store
622 ## Used as a prefix to the <Location> block in the generated config file. In
623 ## most cases it should be set to `/`.
624 svn.proxy.location_root = /
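Under these settings, the auto-generated `mod_dav_svn.conf` maps repository groups to Apache `<Location>` blocks roughly like the following sketch (the `MyRepoGroup` name and the exact directives are illustrative assumptions, not verbatim generator output):

```apache
# illustrative shape of a generated block, not verbatim output
<Location "/MyRepoGroup">
  DAV svn
  SVNParentPath "/path/to/repo_store/MyRepoGroup"
  SVNListParentPath On
</Location>
```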
625
626
627 ################################
628 ### LOGGING CONFIGURATION ####
629 ################################
630 [loggers]
631 keys = root, routes, rhodecode, sqlalchemy, beaker, pyro4, templates
632
633 [handlers]
634 keys = console, console_sql
635
636 [formatters]
637 keys = generic, color_formatter, color_formatter_sql
638
639 #############
640 ## LOGGERS ##
641 #############
642 [logger_root]
643 level = NOTSET
644 handlers = console
645
646 [logger_routes]
647 level = DEBUG
648 handlers =
649 qualname = routes.middleware
650 ## "level = DEBUG" logs the route matched and routing variables.
651 propagate = 1
652
653 [logger_beaker]
654 level = DEBUG
655 handlers =
656 qualname = beaker.container
657 propagate = 1
658
659 [logger_pyro4]
660 level = DEBUG
661 handlers =
662 qualname = Pyro4
663 propagate = 1
664
665 [logger_templates]
666 level = INFO
667 handlers =
668 qualname = pylons.templating
669 propagate = 1
670
671 [logger_rhodecode]
672 level = DEBUG
673 handlers =
674 qualname = rhodecode
675 propagate = 1
676
677 [logger_sqlalchemy]
678 level = ERROR
679 handlers = console_sql
680 qualname = sqlalchemy.engine
681 propagate = 0
682
683 ##############
684 ## HANDLERS ##
685 ##############
686
687 [handler_console]
688 class = StreamHandler
689 args = (sys.stderr,)
690 level = DEBUG
691 formatter = generic
692
693 [handler_console_sql]
694 class = StreamHandler
695 args = (sys.stderr,)
696 level = WARN
697 formatter = generic
698
699 ################
700 ## FORMATTERS ##
701 ################
702
703 [formatter_generic]
704 class = rhodecode.lib.logging_formatter.Pyro4AwareFormatter
705 format = %(asctime)s.%(msecs)03d %(levelname)-5.5s [%(name)s] %(message)s
706 datefmt = %Y-%m-%d %H:%M:%S
707
708 [formatter_color_formatter]
709 class = rhodecode.lib.logging_formatter.ColorFormatter
710 format = %(asctime)s.%(msecs)03d %(levelname)-5.5s [%(name)s] %(message)s
711 datefmt = %Y-%m-%d %H:%M:%S
712
713 [formatter_color_formatter_sql]
714 class = rhodecode.lib.logging_formatter.ColorFormatterSql
715 format = %(asctime)s.%(msecs)03d %(levelname)-5.5s [%(name)s] %(message)s
716 datefmt = %Y-%m-%d %H:%M:%S
@@ -0,0 +1,83 b''
1 ################################################################################
2 # RhodeCode VCSServer with HTTP Backend - configuration #
3 # #
4 ################################################################################
5
6 [app:main]
7 use = egg:rhodecode-vcsserver
8
9 pyramid.default_locale_name = en
10 pyramid.includes =
11 pyramid.reload_templates = true
12
13 # default locale used by VCS systems
14 locale = en_US.UTF-8
15
16 # cache regions, please don't change
17 beaker.cache.regions = repo_object
18 beaker.cache.repo_object.type = memorylru
19 beaker.cache.repo_object.max_items = 100
20 # cache auto-expires after N seconds
21 beaker.cache.repo_object.expire = 300
22 beaker.cache.repo_object.enabled = true
23
24 [server:main]
25 use = egg:waitress#main
26 host = 127.0.0.1
27 port = 9900
28
29 ################################
30 ### LOGGING CONFIGURATION ####
31 ################################
32 [loggers]
33 keys = root, vcsserver, pyro4, beaker
34
35 [handlers]
36 keys = console
37
38 [formatters]
39 keys = generic
40
41 #############
42 ## LOGGERS ##
43 #############
44 [logger_root]
45 level = NOTSET
46 handlers = console
47
48 [logger_vcsserver]
49 level = DEBUG
50 handlers =
51 qualname = vcsserver
52 propagate = 1
53
54 [logger_beaker]
55 level = DEBUG
56 handlers =
57 qualname = beaker
58 propagate = 1
59
60 [logger_pyro4]
61 level = DEBUG
62 handlers =
63 qualname = Pyro4
64 propagate = 1
65
66
67 ##############
68 ## HANDLERS ##
69 ##############
70
71 [handler_console]
72 class = StreamHandler
73 args = (sys.stderr,)
74 level = INFO
75 formatter = generic
76
77 ################
78 ## FORMATTERS ##
79 ################
80
81 [formatter_generic]
82 format = %(asctime)s.%(msecs)03d %(levelname)-5.5s [%(name)s] %(message)s
83 datefmt = %Y-%m-%d %H:%M:%S
@@ -0,0 +1,79 b''
1 ################################################################################
2 # RhodeCode VCSServer - configuration #
3 # #
4 ################################################################################
5
6 [DEFAULT]
7 host = 127.0.0.1
8 port = 9900
9 locale = en_US.UTF-8
10 # number of worker threads; this should be set using the formula threadpool = N * 6,
11 # where N is the total number of RhodeCode Enterprise workers, eg. running 2 instances
12 # with 8 gunicorn workers each gives 2 * 8 * 6 = 96, so threadpool_size = 96
13 threadpool_size = 96
14 timeout = 0
15
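The threadpool sizing formula in the comment above can be checked with a quick calculation; `vcsserver_threadpool_size` is a hypothetical helper for illustration:

```python
def vcsserver_threadpool_size(instances, workers_per_instance, factor=6):
    # threadpool = N * 6, where N is the total number of RhodeCode
    # Enterprise workers (formula from the comment above)
    return instances * workers_per_instance * factor

# 2 instances with 8 gunicorn workers each -> 96, matching threadpool_size above
print(vcsserver_threadpool_size(2, 8))
```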
16 # cache regions, please don't change
17 beaker.cache.regions = repo_object
18 beaker.cache.repo_object.type = memorylru
19 beaker.cache.repo_object.max_items = 100
20 # cache auto-expires after N seconds
21 beaker.cache.repo_object.expire = 300
22 beaker.cache.repo_object.enabled = true
23
24
25 ################################
26 ### LOGGING CONFIGURATION ####
27 ################################
28 [loggers]
29 keys = root, vcsserver, pyro4, beaker
30
31 [handlers]
32 keys = console
33
34 [formatters]
35 keys = generic
36
37 #############
38 ## LOGGERS ##
39 #############
40 [logger_root]
41 level = NOTSET
42 handlers = console
43
44 [logger_vcsserver]
45 level = DEBUG
46 handlers =
47 qualname = vcsserver
48 propagate = 1
49
50 [logger_beaker]
51 level = DEBUG
52 handlers =
53 qualname = beaker
54 propagate = 1
55
56 [logger_pyro4]
57 level = DEBUG
58 handlers =
59 qualname = Pyro4
60 propagate = 1
61
62
63 ##############
64 ## HANDLERS ##
65 ##############
66
67 [handler_console]
68 class = StreamHandler
69 args = (sys.stderr,)
70 level = INFO
71 formatter = generic
72
73 ################
74 ## FORMATTERS ##
75 ################
76
77 [formatter_generic]
78 format = %(asctime)s.%(msecs)03d %(levelname)-5.5s [%(name)s] %(message)s
79 datefmt = %Y-%m-%d %H:%M:%S
@@ -1,5 +1,5 b''
1 1 [bumpversion]
2 current_version = 4.4.2
2 current_version = 4.5.0
3 3 message = release: Bump version {current_version} to {new_version}
4 4
5 5 [bumpversion:file:rhodecode/VERSION]
@@ -43,6 +43,7 b' syntax: regexp'
43 43 ^rhodecode/public/css/style-polymer.css$
44 44 ^rhodecode/public/js/rhodecode-components.html$
45 45 ^rhodecode/public/js/scripts.js$
46 ^rhodecode/public/js/rhodecode-components.js$
46 47 ^rhodecode/public/js/src/components/root-styles.gen.html$
47 48 ^rhodecode/public/js/vendors/webcomponentsjs/
48 49 ^rhodecode\.db$
@@ -4,26 +4,21 b' done = false'
4 4 [task:bump_version]
5 5 done = true
6 6
7 [task:rc_tools_pinned]
8 done = true
9
10 7 [task:fixes_on_stable]
11 done = true
12 8
13 9 [task:pip2nix_generated]
14 done = true
15 10
16 11 [task:changelog_updated]
17 done = true
18 12
19 13 [task:generate_api_docs]
20 done = true
14
15 [task:updated_translation]
21 16
22 17 [release]
23 state = prepared
24 version = 4.4.2
18 state = in_progress
19 version = 4.5.0
25 20
26 [task:updated_translation]
21 [task:rc_tools_pinned]
27 22
28 23 [task:generate_js_routes]
29 24
@@ -11,5 +11,5 b' module.exports = function(grunt) {'
11 11 grunt.loadNpmTasks('grunt-crisper');
12 12 grunt.loadNpmTasks('grunt-contrib-copy');
13 13
14 grunt.registerTask('default', ['less:production', 'less:components', 'concat:polymercss', 'copy','vulcanize', 'crisper', 'concat:dist']);
14 grunt.registerTask('default', ['less:production', 'less:components', 'concat:polymercss', 'copy', 'concat:dist', 'vulcanize', 'crisper']);
15 15 };
@@ -12,6 +12,10 b' permission notice:'
12 12 file:licenses/msgpack_license.txt
13 13 Copyright (c) 2009 - tornado
14 14 file:licenses/tornado_license.txt
15 Copyright (c) 2015 - pygments-markdown-lexer
16 file:licenses/pygments_markdown_lexer_license.txt
17 Copyright 2006 - diff_match_patch
18 file:licenses/diff_match_patch_license.txt
15 19
16 20 All licensed under the Apache License, Version 2.0 (the "License");
17 21 you may not use this file except in compliance with the License.
@@ -2,7 +2,6 b''
2 2 WEBPACK=./node_modules/webpack/bin/webpack.js
3 3 GRUNT=grunt
4 4 NODE_PATH=./node_modules
5 FLAKE8=flake8 setup.py pytest_pylons/ rhodecode/ --select=E124 --ignore=E711,E712,E510,E121,E122,E126,E127,E128,E501,F401 --max-line-length=100 --exclude=*rhodecode/lib/dbmigrate/*,*rhodecode/tests/*,*rhodecode/lib/vcs/utils/*
6 5 CI_PREFIX=enterprise
7 6
8 7 .PHONY: docs docs-clean ci-docs clean test test-clean test-lint test-only
@@ -25,13 +24,6 b' test: test-clean test-only'
25 24 test-clean:
26 25 rm -rf coverage.xml htmlcov junit.xml pylint.log result
27 26
28 test-lint:
29 if [ "$$IN_NIX_SHELL" = "1" ]; then \
30 $(FLAKE8); \
31 else \
32 $(FLAKE8) --format=pylint --exit-zero > pylint.log; \
33 fi
34
35 27 test-only:
36 28 PYTHONHASHSEED=random py.test -vv -r xw --cov=rhodecode --cov-report=term-missing --cov-report=html rhodecode/tests/
37 29
@@ -10,6 +10,8 b''
10 10 "paper-tooltip": "PolymerElements/paper-tooltip#^1.1.2",
11 11 "paper-toast": "PolymerElements/paper-toast#^1.3.0",
12 12 "paper-toggle-button": "PolymerElements/paper-toggle-button#^1.2.0",
13 "iron-ajax": "PolymerElements/iron-ajax#^1.4.3"
13 "iron-ajax": "PolymerElements/iron-ajax#^1.4.3",
14 "iron-autogrow-textarea": "PolymerElements/iron-autogrow-textarea#^1.0.13",
15 "iron-a11y-keys": "PolymerElements/iron-a11y-keys#^1.0.6"
14 16 }
15 17 }
@@ -1,7 +1,7 b''
1 1
2 2
3 3 ################################################################################
4 ## RHODECODE ENTERPRISE CONFIGURATION ##
4 ## RHODECODE COMMUNITY EDITION CONFIGURATION ##
5 5 # The %(here)s variable will be replaced with the parent directory of this file#
6 6 ################################################################################
7 7
@@ -64,7 +64,7 b' asyncore_use_poll = true'
64 64 ##########################
65 65 ## GUNICORN WSGI SERVER ##
66 66 ##########################
67 ## run with gunicorn --log-config <inifile.ini> --paste <inifile.ini>
67 ## run with gunicorn --log-config rhodecode.ini --paste rhodecode.ini
68 68
69 69 #use = egg:gunicorn#main
70 70 ## Sets the number of process workers. You must set `instance_id = *`
@@ -91,11 +91,13 b' asyncore_use_poll = true'
91 91 #timeout = 21600
92 92
93 93
94 ## prefix middleware for RhodeCode, disables force_https flag.
94 ## prefix middleware for RhodeCode.
95 95 ## recommended when using proxy setup.
96 96 ## allows to set RhodeCode under a prefix in server.
97 ## eg https://server.com/<prefix>. Enable `filter-with =` option below as well.
98 ## optionally set prefix like: `prefix = /<your-prefix>`
97 ## eg https://server.com/custom_prefix. Enable `filter-with =` option below as well.
98 ## And set your prefix like: `prefix = /custom_prefix`
99 ## be sure to also set beaker.session.cookie_path = /custom_prefix if you need
100 ## to make your cookies only work on prefix url
99 101 [filter:proxy-prefix]
100 102 use = egg:PasteDeploy#prefix
101 103 prefix = /
@@ -194,17 +196,17 b' rss_items_per_page = 10'
194 196 rss_include_diff = false
195 197
196 198 ## gist URL alias, used to create nicer urls for gist. This should be an
197 ## url that does rewrites to _admin/gists/<gistid>.
199 ## url that does rewrites to _admin/gists/{gistid}.
198 200 ## example: http://gist.rhodecode.org/{gistid}. Empty means use the internal
199 ## RhodeCode url, ie. http[s]://rhodecode.server/_admin/gists/<gistid>
201 ## RhodeCode url, ie. http[s]://rhodecode.server/_admin/gists/{gistid}
200 202 gist_alias_url =
201 203
202 204 ## List of controllers (using glob pattern syntax) that AUTH TOKENS could be
203 205 ## used for access.
204 ## Adding ?auth_token = <token> to the url authenticates this request as if it
206 ## Adding ?auth_token=TOKEN_HASH to the url authenticates this request as if it
205 207 ## came from the logged in user who owns this authentication token.
206 208 ##
207 ## Syntax is <ControllerClass>:<function_pattern>.
209 ## Syntax is ControllerClass:function_pattern.
208 210 ## To enable access to raw_files put `FilesController:raw`.
209 211 ## To enable access to patches add `ChangesetController:changeset_patch`.
210 212 ## The list should be "," separated and on a single line.
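The whitelist described above could look like the following (a sketch; the option name `api_access_controllers_whitelist` matches the shipped configuration, but verify it against your own ini file):

```ini
## allow auth-token access to raw files and changeset patches only
api_access_controllers_whitelist = FilesController:raw, ChangesetController:changeset_patch
```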
@@ -377,15 +379,15 b' beaker.session.lock_dir = %(here)s/data/'
377 379
378 380 ## Secure encrypted cookie. Requires AES and AES python libraries
379 381 ## you must disable beaker.session.secret to use this
380 #beaker.session.encrypt_key = <key_for_encryption>
381 #beaker.session.validate_key = <validation_key>
382 #beaker.session.encrypt_key = key_for_encryption
383 #beaker.session.validate_key = validation_key
382 384
383 385 ## sets the session as invalid (also logging out the user) if it hasn't been
384 386 ## accessed for a given amount of time in seconds
385 387 beaker.session.timeout = 2592000
386 388 beaker.session.httponly = true
387 ## Path to use for the cookie.
388 #beaker.session.cookie_path = /<your-prefix>
389 ## Path to use for the cookie. Set to prefix if you use prefix middleware
390 #beaker.session.cookie_path = /custom_prefix
389 391
390 392 ## uncomment for https secure cookie
391 393 beaker.session.secure = false
@@ -403,8 +405,8 b' beaker.session.auto = false'
403 405 ## Full text search indexer is available in rhodecode-tools under
404 406 ## `rhodecode-tools index` command
405 407
406 # WHOOSH Backend, doesn't require additional services to run
407 # it works good with few dozen repos
408 ## WHOOSH Backend, doesn't require additional services to run
409 ## it works good with few dozen repos
408 410 search.module = rhodecode.lib.index.whoosh
409 411 search.location = %(here)s/data/index
410 412
@@ -511,7 +513,7 b' sqlalchemy.db1.url = sqlite:///%(here)s/'
511 513
512 514 ## print the sql statements to output
513 515 sqlalchemy.db1.echo = false
514 ## recycle the connections after this ammount of seconds
516 ## recycle the connections after this amount of seconds
515 517 sqlalchemy.db1.pool_recycle = 3600
516 518 sqlalchemy.db1.convert_unicode = true
517 519
@@ -533,19 +535,19 b' vcs.server = localhost:9900'
533 535
534 536 ## Web server connectivity protocol, responsible for web based VCS operations
535 537 ## Available protocols are:
536 ## `pyro4` - using pyro4 server
537 ## `http` - using http-rpc backend
538 ## `pyro4` - use pyro4 server
539 ## `http` - use http-rpc backend (default)
538 540 vcs.server.protocol = http
539 541
540 542 ## Push/Pull operations protocol, available options are:
541 ## `pyro4` - using pyro4 server
542 ## `rhodecode.lib.middleware.utils.scm_app_http` - Http based, recommended
543 ## `vcsserver.scm_app` - internal app (EE only)
544 vcs.scm_app_implementation = rhodecode.lib.middleware.utils.scm_app_http
543 ## `pyro4` - use pyro4 server
544 ## `http` - use http-rpc backend (default)
545 ##
546 vcs.scm_app_implementation = http
545 547
546 548 ## Push/Pull operations hooks protocol, available options are:
547 ## `pyro4` - using pyro4 server
548 ## `http` - using http-rpc backend
549 ## `pyro4` - use pyro4 server
550 ## `http` - use http-rpc backend (default)
549 551 vcs.hooks.protocol = http
550 552
551 553 vcs.server.log_level = debug
@@ -574,12 +576,15 b' svn.proxy.generate_config = false'
574 576 svn.proxy.list_parent_path = true
575 577 ## Set location and file name of generated config file.
576 578 svn.proxy.config_file_path = %(here)s/mod_dav_svn.conf
577 ## File system path to the directory containing the repositories served by
578 ## RhodeCode.
579 svn.proxy.parent_path_root = /path/to/repo_store
580 ## Used as a prefix to the <Location> block in the generated config file. In
581 ## most cases it should be set to `/`.
579 ## Used as a prefix to the `Location` block in the generated config file.
580 ## In most cases it should be set to `/`.
582 581 svn.proxy.location_root = /
582 ## Command to reload the mod dav svn configuration on change.
583 ## Example: `/etc/init.d/apache2 reload`
584 #svn.proxy.reload_cmd = /etc/init.d/apache2 reload
585 ## If the timeout expires before the reload command finishes, the command will
586 ## be killed. Setting it to zero means no timeout. Defaults to 10 seconds.
587 #svn.proxy.reload_timeout = 10
583 588
584 589
585 590 ################################
@@ -1,7 +1,7 b''
1 1
2 2
3 3 ################################################################################
4 ## RHODECODE ENTERPRISE CONFIGURATION ##
4 ## RHODECODE COMMUNITY EDITION CONFIGURATION ##
5 5 # The %(here)s variable will be replaced with the parent directory of this file#
6 6 ################################################################################
7 7
@@ -64,7 +64,7 b' port = 5000'
64 64 ##########################
65 65 ## GUNICORN WSGI SERVER ##
66 66 ##########################
67 ## run with gunicorn --log-config <inifile.ini> --paste <inifile.ini>
67 ## run with gunicorn --log-config rhodecode.ini --paste rhodecode.ini
68 68
69 69 use = egg:gunicorn#main
70 70 ## Sets the number of process workers. You must set `instance_id = *`
@@ -91,11 +91,13 b' max_requests_jitter = 30'
91 91 timeout = 21600
92 92
93 93
94 ## prefix middleware for RhodeCode, disables force_https flag.
94 ## prefix middleware for RhodeCode.
95 95 ## recommended when using proxy setup.
96 96 ## allows running RhodeCode under a prefix on the server.
97 ## eg https://server.com/<prefix>. Enable `filter-with =` option below as well.
98 ## optionally set prefix like: `prefix = /<your-prefix>`
97 ## eg https://server.com/custom_prefix. Enable `filter-with =` option below as well.
98 ## And set your prefix like: `prefix = /custom_prefix`
99 ## be sure to also set beaker.session.cookie_path = /custom_prefix if you need
100 ## to make your cookies work only on the prefix url
99 101 [filter:proxy-prefix]
100 102 use = egg:PasteDeploy#prefix
101 103 prefix = /
@@ -168,17 +170,17 b' rss_items_per_page = 10'
168 170 rss_include_diff = false
169 171
170 172 ## gist URL alias, used to create nicer urls for gist. This should be an
171 ## url that does rewrites to _admin/gists/<gistid>.
173 ## url that does rewrites to _admin/gists/{gistid}.
172 174 ## example: http://gist.rhodecode.org/{gistid}. Empty means use the internal
173 ## RhodeCode url, ie. http[s]://rhodecode.server/_admin/gists/<gistid>
175 ## RhodeCode url, ie. http[s]://rhodecode.server/_admin/gists/{gistid}
174 176 gist_alias_url =
175 177
176 178 ## List of controllers (using glob pattern syntax) that AUTH TOKENS could be
177 179 ## used for access.
178 ## Adding ?auth_token = <token> to the url authenticates this request as if it
180 ## Adding ?auth_token=TOKEN_HASH to the url authenticates this request as if it
179 181 ## came from the logged in user who owns this authentication token.
180 182 ##
181 ## Syntax is <ControllerClass>:<function_pattern>.
183 ## Syntax is ControllerClass:function_pattern.
182 184 ## To enable access to raw_files put `FilesController:raw`.
183 185 ## To enable access to patches add `ChangesetController:changeset_patch`.
184 186 ## The list should be "," separated and on a single line.
@@ -351,15 +353,15 b' beaker.session.lock_dir = %(here)s/data/'
351 353
352 354 ## Secure encrypted cookie. Requires AES and AES python libraries
353 355 ## you must disable beaker.session.secret to use this
354 #beaker.session.encrypt_key = <key_for_encryption>
355 #beaker.session.validate_key = <validation_key>
356 #beaker.session.encrypt_key = key_for_encryption
357 #beaker.session.validate_key = validation_key
356 358
357 359 ## sets the session as invalid (also logging out the user) if it hasn't been
358 360 ## accessed for a given amount of time in seconds
359 361 beaker.session.timeout = 2592000
360 362 beaker.session.httponly = true
361 ## Path to use for the cookie.
362 #beaker.session.cookie_path = /<your-prefix>
363 ## Path to use for the cookie. Set to prefix if you use prefix middleware
364 #beaker.session.cookie_path = /custom_prefix
363 365
364 366 ## uncomment for https secure cookie
365 367 beaker.session.secure = false
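A hardened variant of the cookie settings above for an HTTPS-only deployment might look like this (a sketch; enable `secure` only once all traffic really goes over https, otherwise sessions will break):

```ini
## cookies unreadable from JavaScript and only sent over https
beaker.session.httponly = true
beaker.session.secure = true
```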
@@ -377,8 +379,8 b' beaker.session.auto = false'
377 379 ## Full text search indexer is available in rhodecode-tools under
378 380 ## `rhodecode-tools index` command
379 381
380 # WHOOSH Backend, doesn't require additional services to run
381 # it works good with few dozen repos
382 ## WHOOSH Backend, doesn't require additional services to run
383 ## it works good with few dozen repos
382 384 search.module = rhodecode.lib.index.whoosh
383 385 search.location = %(here)s/data/index
384 386
@@ -480,7 +482,7 b' sqlalchemy.db1.url = postgresql://postgr'
480 482
481 483 ## print the sql statements to output
482 484 sqlalchemy.db1.echo = false
483 ## recycle the connections after this ammount of seconds
485 ## recycle the connections after this amount of seconds
484 486 sqlalchemy.db1.pool_recycle = 3600
485 487 sqlalchemy.db1.convert_unicode = true
486 488
@@ -502,20 +504,20 b' vcs.server = localhost:9900'
502 504
503 505 ## Web server connectivity protocol, responsible for web based VCS operations
504 506 ## Available protocols are:
505 ## `pyro4` - using pyro4 server
506 ## `http` - using http-rpc backend
507 #vcs.server.protocol = http
507 ## `pyro4` - use pyro4 server
508 ## `http` - use http-rpc backend (default)
509 vcs.server.protocol = http
508 510
509 511 ## Push/Pull operations protocol, available options are:
510 ## `pyro4` - using pyro4 server
511 ## `rhodecode.lib.middleware.utils.scm_app_http` - Http based, recommended
512 ## `vcsserver.scm_app` - internal app (EE only)
513 #vcs.scm_app_implementation = rhodecode.lib.middleware.utils.scm_app_http
512 ## `pyro4` - use pyro4 server
513 ## `http` - use http-rpc backend (default)
514 ##
515 vcs.scm_app_implementation = http
514 516
515 517 ## Push/Pull operations hooks protocol, available options are:
516 ## `pyro4` - using pyro4 server
517 ## `http` - using http-rpc backend
518 #vcs.hooks.protocol = http
518 ## `pyro4` - use pyro4 server
519 ## `http` - use http-rpc backend (default)
520 vcs.hooks.protocol = http
519 521
520 522 vcs.server.log_level = info
521 523 ## Start VCSServer with this instance as a subprocess, useful for development
@@ -543,12 +545,15 b' svn.proxy.generate_config = false'
543 545 svn.proxy.list_parent_path = true
544 546 ## Set location and file name of generated config file.
545 547 svn.proxy.config_file_path = %(here)s/mod_dav_svn.conf
546 ## File system path to the directory containing the repositories served by
547 ## RhodeCode.
548 svn.proxy.parent_path_root = /path/to/repo_store
549 ## Used as a prefix to the <Location> block in the generated config file. In
550 ## most cases it should be set to `/`.
548 ## Used as a prefix to the `Location` block in the generated config file.
549 ## In most cases it should be set to `/`.
551 550 svn.proxy.location_root = /
551 ## Command to reload the mod dav svn configuration on change.
552 ## Example: `/etc/init.d/apache2 reload`
553 #svn.proxy.reload_cmd = /etc/init.d/apache2 reload
554 ## If the timeout expires before the reload command finishes, the command will
555 ## be killed. Setting it to zero means no timeout. Defaults to 10 seconds.
556 #svn.proxy.reload_timeout = 10
552 557
553 558
554 559 ################################
@@ -4,27 +4,55 b''
4 4 # derivation. For advanced tweaks to pimp up the development environment we use
5 5 # "shell.nix" so that it does not have to clutter this file.
6 6
7 { pkgs ? (import <nixpkgs> {})
8 , pythonPackages ? "python27Packages"
7 args@
8 { pythonPackages ? "python27Packages"
9 9 , pythonExternalOverrides ? self: super: {}
10 10 , doCheck ? true
11 , ...
11 12 }:
12 13
13 let pkgs_ = pkgs; in
14
15 14 let
16 pkgs = pkgs_.overridePackages (self: super: {
17 # Override subversion derivation to
18 # - activate python bindings
19 # - set version to 1.8
20 subversion = super.subversion18.override {
21 httpSupport = true;
22 pythonBindings = true;
23 python = self.python27Packages.python;
24 };
15
16 # Use nixpkgs from args or import them. We use this indirect approach
17 # through args to be able to use the name `pkgs` for our customized packages.
18 # Otherwise we will end up with an infinite recursion.
19 nixpkgs = args.pkgs or (import <nixpkgs> { });
20
21 # johbo: Interim bridge which allows us to build with the upcoming
22 # nixos.16.09 branch (unstable at the moment of writing this note) and the
23 # current stable nixos-16.03.
24 backwardsCompatibleFetchgit = { ... }@args:
25 let
26 origSources = nixpkgs.fetchgit args;
27 in
28 nixpkgs.lib.overrideDerivation origSources (oldAttrs: {
29 NIX_PREFETCH_GIT_CHECKOUT_HOOK = ''
30 find $out -name '.git*' -print0 | xargs -0 rm -rf
31 '';
32 });
33
34 # Create a customized version of nixpkgs which should be used throughout the
35 # rest of this file.
36 pkgs = nixpkgs.overridePackages (self: super: {
37 fetchgit = backwardsCompatibleFetchgit;
25 38 });
26 39
27 inherit (pkgs.lib) fix extends;
40 # Evaluates to the last segment of a file system path.
41 basename = path: with pkgs.lib; last (splitString "/" path);
42
43 # source code filter used as argument to builtins.filterSource.
44 src-filter = path: type: with pkgs.lib;
45 let
46 ext = last (splitString "." path);
47 in
48 !builtins.elem (basename path) [
49 ".git" ".hg" "__pycache__" ".eggs"
50 "bower_components" "node_modules"
51 "build" "data" "result" "tmp"] &&
52 !builtins.elem ext ["egg-info" "pyc"] &&
53 # TODO: johbo: This check is wrong, since "path" contains an absolute path,
54 # it would still be good to restore it since we want to ignore "result-*".
55 !hasPrefix "result" path;
28 56
29 57 basePythonPackages = with builtins; if isAttrs pythonPackages
30 58 then pythonPackages
@@ -34,25 +62,6 b' let'
34 62 pkgs.buildBowerComponents or
35 63 (import ./pkgs/backport-16.03-build-bower-components.nix { inherit pkgs; });
36 64
37 elem = builtins.elem;
38 basename = path: with pkgs.lib; last (splitString "/" path);
39 startsWith = prefix: full: let
40 actualPrefix = builtins.substring 0 (builtins.stringLength prefix) full;
41 in actualPrefix == prefix;
42
43 src-filter = path: type: with pkgs.lib;
44 let
45 ext = last (splitString "." path);
46 in
47 !elem (basename path) [
48 ".git" ".hg" "__pycache__" ".eggs"
49 "bower_components" "node_modules"
50 "build" "data" "result" "tmp"] &&
51 !elem ext ["egg-info" "pyc"] &&
52 # TODO: johbo: This check is wrong, since "path" contains an absolute path,
53 # it would still be good to restore it since we want to ignore "result-*".
54 !startsWith "result" path;
55
56 65 sources = pkgs.config.rc.sources or {};
57 66 version = builtins.readFile ./rhodecode/VERSION;
58 67 rhodecode-enterprise-ce-src = builtins.filterSource src-filter ./.;
@@ -147,18 +156,6 b' let'
147 156 then "${pkgs.glibcLocales}/lib/locale/locale-archive"
148 157 else "";
149 158
150 # Somewhat snappier setup of the development environment
151 # TODO: move into shell.nix
152 # TODO: think of supporting a stable path again, so that multiple shells
153 # can share it.
154 shellHook = ''
155 tmp_path=$(mktemp -d)
156 export PATH="$tmp_path/bin:$PATH"
157 export PYTHONPATH="$tmp_path/${self.python.sitePackages}:$PYTHONPATH"
158 mkdir -p $tmp_path/${self.python.sitePackages}
159 python setup.py develop --prefix $tmp_path --allow-hosts ""
160 '' + linkNodeAndBowerPackages;
161
162 159 preCheck = ''
163 160 export PATH="$out/bin:$PATH"
164 161 '';
@@ -226,16 +223,16 b' let'
226 223 rhodecode-testdata-src = sources.rhodecode-testdata or (
227 224 pkgs.fetchhg {
228 225 url = "https://code.rhodecode.com/upstream/rc_testdata";
229 rev = "v0.8.0";
230 sha256 = "0hy1ba134rq2f9si85yx7j4qhc9ky0hjzdk553s3q026i7km809m";
226 rev = "v0.9.0";
227 sha256 = "0k0ccb7cncd6mmzwckfbr6l7fsymcympwcm948qc3i0f0m6bbg1y";
231 228 });
232 229
233 230 # Apply all overrides and fix the final package set
234 myPythonPackagesUnfix =
231 myPythonPackagesUnfix = with pkgs.lib;
235 232 (extends pythonExternalOverrides
236 233 (extends pythonLocalOverrides
237 234 (extends pythonOverrides
238 235 pythonGeneratedPackages)));
239 myPythonPackages = (fix myPythonPackagesUnfix);
236 myPythonPackages = (pkgs.lib.fix myPythonPackagesUnfix);
240 237
241 238 in myPythonPackages.rhodecode-enterprise-ce
@@ -62,6 +62,24 b' 3. Select :guilabel:`Save`, and you will'
62 62
63 63 .. _md-rst:
64 64
65
66 Suppress license warnings or errors
67 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
68
69 If you are running at the maximum number of allowed users, RhodeCode will
70 display a warning message on pages when you are close to the license limits.
71 It is often not desired to show this all the time. Here is how you can
72 suppress the license messages.
73
74 1. From the |RCE| interface, select
75 :menuselection:`Admin --> Settings --> Global`
76 2. Select :guilabel:`Flash message filtering` from the drop-down menu.
77 3. Select :guilabel:`Save`, and you will no longer see the license message
78 once your page refreshes.
79
80 .. _admin-tricks-suppress-license-messages:
81
82
65 83 Markdown or RST Rendering
66 84 ^^^^^^^^^^^^^^^^^^^^^^^^^
67 85
@@ -7,10 +7,11 b' Use the following example to configure A'
7 7
8 8 .. code-block:: apache
9 9
10 <Location /<someprefix> > # Change <someprefix> into your chosen prefix
11 ProxyPass http://127.0.0.1:5000/<someprefix>
12 ProxyPassReverse http://127.0.0.1:5000/<someprefix>
13 SetEnvIf X-Url-Scheme https HTTPS=1
10 <Location /<someprefix>/ > # Change <someprefix> into your chosen prefix
11 ProxyPreserveHost On
12 ProxyPass "http://127.0.0.1:5000/"
13 ProxyPassReverse "http://127.0.0.1:5000/"
14 Header set X-Url-Scheme https env=HTTPS
14 15 </Location>
15 16
16 17 In addition to the regular Apache setup you will need to add the following
@@ -9,9 +9,14 b' Backup and Restore'
9 9 To snapshot an instance of |RCE|, and save its settings, you need to backup the
10 10 following parts of the system at the same time.
11 11
12 * The |repos| managed by the instance.
12 * The |repos| managed by the instance together with the stored Gists.
13 13 * The |RCE| database.
14 * Any configuration files or extensions that you've configured.
14 * Any configuration files or extensions that you've configured. In most
15 cases it's only the :file:`rhodecode.ini` file.
16 * Installer files such as those in `/opt/rhodecode` can be backed up, however
17 this is not required, since in case of a recovery the installer simply
18 re-creates them.
19
15 20
16 21 .. important::
17 22
@@ -29,11 +34,17 b' Repository Backup'
29 34 ^^^^^^^^^^^^^^^^^
30 35
31 36 To back up your |repos|, use the API to get a list of all |repos| managed,
32 and then clone them to your backup location.
37 and then clone them to your backup location. This is the safest backup option.
38 Backing up the storage directory directly could result in a backup of
39 partially committed files or commits (for example when a backup runs during a big push).
40 As an alternative you could use ``rsync`` or a simple ``cp`` command, provided
41 your instance is in read-only mode or stopped at that moment.
42
33 43
34 44 Use the ``get_repos`` method to list all your managed |repos|,
35 45 and use the ``clone_uri`` information that is returned. See the :ref:`api`
36 for more information.
46 for more information. Be sure to keep the structure or repositories with their
47 repository groups.
37 48
38 49 .. important::
39 50
@@ -54,13 +65,19 b' backup location:'
54 65 .. code-block:: bash
55 66
56 67 # For MySQL DBs
57 $ mysqldump -u <uname> -p <pass> db_name > mysql-db-backup
68 $ mysqldump -u <uname> -p <pass> rhodecode_db_name > mysql-db-backup
69 # MySQL restore command
70 $ mysql -u <uname> -p <pass> rhodecode_db_name < mysql-db-backup
58 71
59 72 # For PostgreSQL DBs
60 $ pg_dump dbname > postgresql-db-backup
73 $ PGPASSWORD=<pass> pg_dump rhodecode_db_name > postgresql-db-backup
74 # PosgreSQL restore
75 $ PGPASSWORD=<pass> psql -U <uname> -h localhost -d rhodecode_db_name -1 -f postgresql-db-backup
61 76
62 # For SQLlite
77 # For SQLite
63 78 $ sqlite3 rhodecode.db ‘.dump’ > sqlite-db-backup
79 # SQLite restore
80 $ cp sqlite-db-backup rhodecode.db
64 81
65 82
66 83 The default |RCE| SQLite database location is
@@ -75,13 +92,18 b' Configuration File Backup'
75 92 Depending on your setup, you could have a number of configuration files that
76 93 should be backed up. You may have some, or all of the configuration files
77 94 listed in the :ref:`config-rce-files` section. Ideally you should back these
78 up at the same time as the database and |repos|.
95 up at the same time as the database and |repos|. It depends on whether you need
96 related files such as logs or custom modules. We always recommend backing
97 those up.
79 98
80 99 Gist Backup
81 100 ^^^^^^^^^^^
82 101
83 To backup the gists on your |RCE| instance you can use the ``get_users`` and
84 ``get_gists`` API methods to fetch the gists for each user on the instance.
102 To back up the gists on your |RCE| instance you usually have to back up the
103 gist storage path. If this hasn't been changed, it's located inside
104 :file:`.rc_gist_store` and the metadata in :file:`.rc_gist_metadata`.
105 You can use the ``get_users`` and ``get_gists`` API methods to fetch the
106 gists for each user on the instance.
85 107
86 108 Extension Backups
87 109 ^^^^^^^^^^^^^^^^^
@@ -100,15 +122,17 b' the :ref:`indexing-ref` section.'
100 122 Restoration Steps
101 123 -----------------
102 124
103 To restore an instance of |RCE| from its backed up components, use the
104 following steps.
125 To restore an instance of |RCE| from its backed up components to a fresh
126 system, use the following steps.
105 127
106 1. Install a new instance of |RCE|.
107 2. Once installed, configure the instance to use the backed up
108 :file:`rhodecode.ini` file. Ensure this file points to the backed up
128 1. Install a new instance of |RCE|, using the SQLite option as the database.
129 2. Restore your database.
130 3. Once installed, replace the generated :file:`rhodecode.ini` with your
131 backed up version. Ensure this file points to the restored
109 132 database, see the :ref:`config-database` section.
110 3. Restart |RCE| and remap and rescan your |repos|, see the
111 :ref:`remap-rescan` section.
133 4. Restart |RCE| and remap and rescan your |repos| to verify filesystem access,
134 see the :ref:`remap-rescan` section.
135
112 136
113 137 Post Restoration Steps
114 138 ^^^^^^^^^^^^^^^^^^^^^^
@@ -22,8 +22,8 b' account permissions.'
22 22 .. code-block:: bash
23 23
24 24 # Open iShell from the terminal
25 $ .rccontrol/enterprise-5/profile/bin/paster \
26 ishell .rccontrol/enterprise-5/rhodecode.ini
25 $ .rccontrol/enterprise-1/profile/bin/paster \
26 ishell .rccontrol/enterprise-1/rhodecode.ini
27 27
28 28 .. code-block:: mysql
29 29
@@ -52,11 +52,12 b' following example to make changes to thi'
52 52 .. code-block:: mysql
53 53
54 54 # Use this example to enable global .hgrc access
55 In [4]: new_option = RhodeCodeUi()
56 In [5]: new_option.ui_section='web'
57 In [6]: new_option.ui_key='allow_push'
58 In [7]: new_option.ui_value='*'
59 In [8]: Session().add(new_option);Session().commit()
55 In [1]: new_option = RhodeCodeUi()
56 In [2]: new_option.ui_section='web'
57 In [3]: new_option.ui_key='allow_push'
58 In [4]: new_option.ui_value='*'
59 In [5]: Session().add(new_option);Session().commit()
60 In [6]: exit()
60 61
61 62 Manually Reset Password
62 63 ^^^^^^^^^^^^^^^^^^^^^^^
@@ -72,24 +73,47 b' Use the following code example to carry '
72 73 .. code-block:: bash
73 74
74 75 # starts the ishell interactive prompt
75 $ .rccontrol/enterprise-5/profile/bin/paster \
76 ishell .rccontrol/enterprise-5/rhodecode.ini
76 $ .rccontrol/enterprise-1/profile/bin/paster \
77 ishell .rccontrol/enterprise-1/rhodecode.ini
77 78
78 79 .. code-block:: mysql
79 80
80 from rhodecode.lib.auth import generate_auth_token
81 from rhodecode.lib.auth import get_crypt_password
82
81 In [1]: from rhodecode.lib.auth import generate_auth_token
82 In [2]: from rhodecode.lib.auth import get_crypt_password
83 83 # Enter the user name whose password you wish to change
84 my_user = 'USERNAME'
85 u = User.get_by_username(my_user)
86
84 In [3]: my_user = 'USERNAME'
85 In [4]: u = User.get_by_username(my_user)
87 86 # If this fails then the user does not exist
88 u.auth_token = generate_auth_token(my_user)
89
87 In [5]: u.auth_token = generate_auth_token(my_user)
90 88 # Set the new password
91 u.password = get_crypt_password('PASSWORD')
89 In [6]: u.password = get_crypt_password('PASSWORD')
90 In [7]: Session().add(u);Session().commit()
91 In [8]: exit()
92
93
94
95 Change user details
96 ^^^^^^^^^^^^^^^^^^^
97
98 If you need to manually change some of a user's details, use the following steps.
99
100 1. Navigate to your |RCE| install location.
101 2. Run the interactive ``ishell`` prompt.
102 3. Set new values for the user's attributes.
92 103
93 Session().add(u)
94 Session().commit()
95 exit
104 Use the following code example to carry out these steps.
105
106 .. code-block:: bash
107
108 # starts the ishell interactive prompt
109 $ .rccontrol/enterprise-1/profile/bin/paster \
110 ishell .rccontrol/enterprise-1/rhodecode.ini
111
112 .. code-block:: mysql
113
114 # Use this example to change email and username of LDAP user
115 In [1]: my_user = User.get_by_username('some_username')
116 In [2]: my_user.email = 'new_email@foobar.com'
117 In [3]: my_user.username = 'SomeUser'
118 In [4]: Session().add(my_user);Session().commit()
119 In [5]: exit()
@@ -36,7 +36,7 b' 1. On your local machine create the publ'
36 36 Your public key has been saved in /home/user/.ssh/id_rsa.pub.
37 37 The key fingerprint is:
38 38 02:82:38:95:e5:30:d2:ad:17:60:15:7f:94:17:9f:30 user@ubuntu
39 The key's randomart image is:
39 The key\'s randomart image is:
40 40 +--[ RSA 2048]----+
41 41
42 42 2. SFTP to your server, and copy the public key to the ``~/.ssh`` folder.
@@ -18,6 +18,7 b' The following are the most common system'
18 18
19 19 config-files-overview
20 20 vcs-server
21 svn-http
21 22 apache-config
22 23 nginx-config
23 24 backup-restore
@@ -18,7 +18,7 b' 1. Open ishell from the terminal and use'
18 18 2. Run the following commands, and ensure that |RCE| has write access to the
19 19 new directory:
20 20
21 .. code-block:: mysql
21 .. code-block:: bash
22 22
23 23 # Once logged into the database, use SQL to redirect
24 24 # the large files location
@@ -298,133 +298,7 b' For a more detailed explanation of the l'
298 298 format = %(asctime)s.%(msecs)03d %(levelname)-5.5s [%(name)s] %(message)s
299 299 datefmt = %Y-%m-%d %H:%M:%S
300 300
301 .. _svn-http:
302
303 |svn| With Write Over HTTP
304 ^^^^^^^^^^^^^^^^^^^^^^^^^^
305
306 To use |svn| with read/write support over the |svn| HTTP protocol, you have to
307 configure the HTTP |svn| backend.
308
309 Prerequisites
310 =============
311
312 - Enable HTTP support inside the admin VCS settings on your |RCE| instance
313 - You need to install the following tools on the machine that is running an
314 instance of |RCE|:
315 ``Apache HTTP Server`` and
316 ``mod_dav_svn``.
317
318
319 Using Ubuntu Distribution as an example you can run:
320
321 .. code-block:: bash
322
323 $ sudo apt-get install apache2 libapache2-mod-svn
324
325 Once installed you need to enable ``dav_svn``:
326
327 .. code-block:: bash
328
329 $ sudo a2enmod dav_svn
330
331 Configuring Apache Setup
332 ========================
333
334 .. tip::
335
336 It is recommended to run Apache on a port other than 80, due to possible
337 conflicts with other HTTP servers like nginx. To do this, set the
338 ``Listen`` parameter in the ``/etc/apache2/ports.conf`` file, for example
339 ``Listen 8090``.
340
341
342 .. warning::
343
344 Make sure your Apache instance which runs the mod_dav_svn module is
345 only accessible by RhodeCode. Otherwise everyone is able to browse
346 the repositories or run subversion operations (checkout/commit/etc.).
347
348 It is also recommended to run apache as the same user as |RCE|, otherwise
349 permission issues could occur. To do this edit the ``/etc/apache2/envvars``
350
351 .. code-block:: apache
352
353 export APACHE_RUN_USER=rhodecode
354 export APACHE_RUN_GROUP=rhodecode
355
356 1. To configure Apache, create and edit a virtual hosts file, for example
357 :file:`/etc/apache2/sites-available/default.conf`. Below is an example
358 how to use one with auto-generated config ```mod_dav_svn.conf```
359 from configured |RCE| instance.
360
361 .. code-block:: apache
362
363 <VirtualHost *:8080>
364 ServerAdmin rhodecode-admin@localhost
365 DocumentRoot /var/www/html
366 ErrorLog ${APACHE_LOG_DIR}/error.log
367 CustomLog ${APACHE_LOG_DIR}/access.log combined
368 Include /home/user/.rccontrol/enterprise-1/mod_dav_svn.conf
369 </VirtualHost>
370
371
372 2. Go to the :menuselection:`Admin --> Settings --> VCS` page, and
373 enable :guilabel:`Proxy Subversion HTTP requests`, and specify the
374 :guilabel:`Subversion HTTP Server URL`.
375
376 3. Open the |RCE| configuration file,
377 :file:`/home/{user}/.rccontrol/{instance-id}/rhodecode.ini`
378
379 4. Add the following configuration option in the ``[app:main]``
380 section if you don't have it yet.
381
382 This enables mapping of the created |RCE| repo groups into special |svn| paths.
383 Each time a new repository group is created, the system will update
384 the template file and create new mapping. Apache web server needs to be
385 reloaded to pick up the changes on this file.
386 It's recommended to add reload into a crontab so the changes can be picked
387 automatically once someone creates a repository group inside RhodeCode.
388
389
390 .. code-block:: ini
391
392 ##############################################
393 ### Subversion proxy support (mod_dav_svn) ###
394 ##############################################
395 ## Enable or disable the config file generation.
396 svn.proxy.generate_config = true
397 ## Generate config file with `SVNListParentPath` set to `On`.
398 svn.proxy.list_parent_path = true
399 ## Set location and file name of generated config file.
400 svn.proxy.config_file_path = %(here)s/mod_dav_svn.conf
401 ## File system path to the directory containing the repositories served by
402 ## RhodeCode.
403 svn.proxy.parent_path_root = /path/to/repo_store
404 ## Used as a prefix to the <Location> block in the generated config file. In
405 ## most cases it should be set to `/`.
406 svn.proxy.location_root = /
407
408
409 This would create a special template file called ```mod_dav_svn.conf```. We
410 used that file path in the apache config above inside the Include statement.
411
412
413 Using |svn|
414 ===========
415
416 Once |svn| has been enabled on your instance, you can use it with the
417 following examples. For more |svn| information, see the `Subversion Red Book`_
418
419 .. code-block:: bash
420
421 # To clone a repository
422 svn checkout http://my-svn-server.example.com/my-svn-repo
423
424 # svn commit
425 svn commit
426 301
427 302 .. _Subversion Red Book: http://svnbook.red-bean.com/en/1.7/svn-book.html#svn.ref.svn
428 303
429
430 .. _Ask Ubuntu: http://askubuntu.com/questions/162391/how-do-i-fix-my-locale-issue No newline at end of file
304 .. _Ask Ubuntu: http://askubuntu.com/questions/162391/how-do-i-fix-my-locale-issue
@@ -1,7 +1,7 b''
1 1 .. _deprecated-methods-ref:
2 2
3 3 deprecated methods
4 =================
4 ==================
5 5
6 6 changeset_comment
7 7 -----------------
@@ -1,7 +1,7 b''
1 1 .. _gist-methods-ref:
2 2
3 3 gist methods
4 =================
4 ============
5 5
6 6 create_gist
7 7 -----------
@@ -1,10 +1,10 b''
1 1 .. _license-methods-ref:
2 2
3 3 license methods
4 =================
4 ===============
5 5
6 6 get_license_info (EE only)
7 ----------------
7 --------------------------
8 8
9 9 .. py:function:: get_license_info(apiuser)
10 10
@@ -32,7 +32,7 b' get_license_info (EE only)'
32 32
33 33
34 34 set_license_key (EE only)
35 ---------------
35 -------------------------
36 36
37 37 .. py:function:: set_license_key(apiuser, key)
38 38
@@ -1,7 +1,7 b''
1 1 .. _pull-request-methods-ref:
2 2
3 3 pull_request methods
4 =================
4 ====================
5 5
6 6 close_pull_request
7 7 ------------------
@@ -103,6 +103,10 b' create_pull_request'
103 103 :type description: Optional(str)
104 104 :param reviewers: Set the new pull request reviewers list.
105 105 :type reviewers: Optional(list)
106 Accepts username strings or objects of the format:
107 {
108 'username': 'nick', 'reasons': ['original author']
109 }
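For illustration, a call mixing both reviewer formats can be sketched as a JSON-RPC payload. This is a sketch only: the envelope fields, token, repo names, and ref values are placeholders/assumptions; only the two reviewer shapes come from the description above.

```python
import json

# Hypothetical payload; token, repo names, and refs are placeholders.
payload = {
    "id": 1,
    "auth_token": "<api_token>",
    "method": "create_pull_request",
    "args": {
        "source_repo": "my-repo",
        "target_repo": "other-repo",
        "source_ref": "branch:feature-1:<commit_id>",
        "target_ref": "branch:default:<commit_id>",
        "title": "Add feature 1",
        # Reviewers may be plain usernames or objects with reasons:
        "reviewers": [
            "alice",
            {"username": "nick", "reasons": ["original author"]},
        ],
    },
}

body = json.dumps(payload)
```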
106 110
107 111
108 112 get_pull_request
@@ -165,6 +169,15 b' get_pull_request'
165 169 "commit_id": "<commit_id>",
166 170 }
167 171 },
172 "merge": {
173 "clone_url": "<clone_url>",
174 "reference":
175 {
176 "name": "<name>",
177 "type": "<type>",
178 "commit_id": "<commit_id>",
179 }
180 },
168 181 "author": <user_obj>,
169 182 "reviewers": [
170 183 ...
@@ -241,6 +254,15 b' get_pull_requests'
241 254 "commit_id": "<commit_id>",
242 255 }
243 256 },
257 "merge": {
258 "clone_url": "<clone_url>",
259 "reference":
260 {
261 "name": "<name>",
262 "type": "<type>",
263 "commit_id": "<commit_id>",
264 }
265 },
244 266 "author": <user_obj>,
245 267 "reviewers": [
246 268 ...
@@ -284,7 +306,12 b' merge_pull_request'
284 306 "executed": "<bool>",
285 307 "failure_reason": "<int>",
286 308 "merge_commit_id": "<merge_commit_id>",
287 "possible": "<bool>"
309 "possible": "<bool>",
310 "merge_ref": {
311 "commit_id": "<commit_id>",
312 "type": "<type>",
313 "name": "<name>"
314 }
288 315 },
289 316 "error": null
290 317
@@ -1,24 +1,25 b''
1 1 .. _repo-group-methods-ref:
2 2
3 3 repo_group methods
4 =================
4 ==================
5 5
6 6 create_repo_group
7 7 -----------------
8 8
9 .. py:function:: create_repo_group(apiuser, group_name, description=<Optional:''>, owner=<Optional:<OptionalAttr:apiuser>>, copy_permissions=<Optional:False>)
9 .. py:function:: create_repo_group(apiuser, group_name, owner=<Optional:<OptionalAttr:apiuser>>, description=<Optional:''>, copy_permissions=<Optional:False>)
10 10
11 11 Creates a repository group.
12 12
13 * If the repository group name contains "/", all the required repository
14 groups will be created.
13 * If the repository group name contains "/", the repository group will
14 be created inside a repository group or nested repository groups.
15 15
16 For example "foo/bar/baz" will create |repo| groups "foo" and "bar"
17 (with "foo" as parent). It will also create the "baz" repository
18 with "bar" as |repo| group.
16 For example "foo/bar/group1" will create a repository group called "group1"
17 inside group "foo/bar". You must have permissions to access and
18 write to the last repository group ("bar" in this example).
19 19
20 This command can only be run using an |authtoken| with admin
21 permissions.
20 This command can only be run using an |authtoken| with at least
21 permissions to create repository groups, or admin permissions to
22 parent repository groups.
22 23
23 24 :param apiuser: This is filled automatically from the |authtoken|.
24 25 :type apiuser: AuthUser
@@ -73,7 +74,7 b' delete_repo_group'
73 74
74 75 id : <id_given_in_input>
75 76 result : {
76 'msg': 'deleted repo group ID:<repogroupid> <repogroupname>
77 'msg': 'deleted repo group ID:<repogroupid> <repogroupname>'
77 78 'repo_group': null
78 79 }
79 80 error : null
@@ -325,13 +326,22 b' revoke_user_permission_from_repo_group'
325 326 update_repo_group
326 327 -----------------
327 328
328 .. py:function:: update_repo_group(apiuser, repogroupid, group_name=<Optional:''>, description=<Optional:''>, owner=<Optional:<OptionalAttr:apiuser>>, parent=<Optional:None>, enable_locking=<Optional:False>)
329 .. py:function:: update_repo_group(apiuser, repogroupid, group_name=<Optional:''>, description=<Optional:''>, owner=<Optional:<OptionalAttr:apiuser>>, enable_locking=<Optional:False>)
329 330
330 331 Updates repository group with the details given.
331 332
332 333 This command can only be run using an |authtoken| with admin
333 334 permissions.
334 335
336 * If the group_name contains "/", the repository group will be updated
337 and placed inside the matching repository group or nested repository groups
338
339 For example repogroupid=group-test group_name="foo/bar/group-test"
340 will update the repository group called "group-test" and place it
341 inside group "foo/bar".
342 You must have permissions to access and write to the last repository
343 group ("bar" in this example).
344
335 345 :param apiuser: This is filled automatically from the |authtoken|.
336 346 :type apiuser: AuthUser
337 347 :param repogroupid: Set the ID of repository group.
@@ -342,8 +352,6 b' update_repo_group'
342 352 :type description: str
343 353 :param owner: Set the |repo| group owner.
344 354 :type owner: str
345 :param parent: Set the |repo| group parent.
346 :type parent: str or int
347 355 :param enable_locking: Enable |repo| locking. The default is false.
348 356 :type enable_locking: bool
349 357
@@ -1,7 +1,7 b''
1 1 .. _repo-methods-ref:
2 2
3 3 repo methods
4 =================
4 ============
5 5
6 6 add_field_to_repo
7 7 -----------------
@@ -68,15 +68,16 b' create_repo'
68 68
69 69 Creates a repository.
70 70
71 * If the repository name contains "/", all the required repository
72 groups will be created.
71 * If the repository name contains "/", the repository will be created
72 inside a repository group or nested repository groups.
73 73
74 For example "foo/bar/baz" will create |repo| groups "foo" and "bar"
75 (with "foo" as parent). It will also create the "baz" repository
76 with "bar" as |repo| group.
74 For example "foo/bar/repo1" will create a |repo| called "repo1" inside
75 group "foo/bar". You must have permissions to access and write to
76 the last repository group ("bar" in this example).
77 77
78 78 This command can only be run using an |authtoken| with at least
79 write permissions to the |repo|.
79 permissions to create repositories, or write permissions to
80 parent repository groups.
80 81
81 82 :param apiuser: This is filled automatically from the |authtoken|.
82 83 :type apiuser: AuthUser
@@ -88,9 +89,9 b' create_repo'
88 89 :type owner: Optional(str)
89 90 :param description: Set the repository description.
90 91 :type description: Optional(str)
91 :param private:
92 :param private: Set the repository as private.
92 93 :type private: bool
93 :param clone_uri:
94 :param clone_uri: Set the clone URI.
94 95 :type clone_uri: str
95 96 :param landing_rev: <rev_type>:<rev>
96 97 :type landing_rev: str
@@ -125,7 +126,7 b' create_repo'
125 126 id : <id_given_in_input>
126 127 result : null
127 128 error : {
128 'failed to create repository `<repo_name>`
129 'failed to create repository `<repo_name>`'
129 130 }
130 131
131 132
@@ -164,25 +165,29 b' delete_repo'
164 165 fork_repo
165 166 ---------
166 167
167 .. py:function:: fork_repo(apiuser, repoid, fork_name, owner=<Optional:<OptionalAttr:apiuser>>, description=<Optional:''>, copy_permissions=<Optional:False>, private=<Optional:False>, landing_rev=<Optional:'rev:tip'>)
168 .. py:function:: fork_repo(apiuser, repoid, fork_name, owner=<Optional:<OptionalAttr:apiuser>>, description=<Optional:''>, private=<Optional:False>, clone_uri=<Optional:None>, landing_rev=<Optional:'rev:tip'>, copy_permissions=<Optional:False>)
168 169
169 170 Creates a fork of the specified |repo|.
170 171
171 * If using |RCE| with Celery this will immediately return a success
172 message, even though the fork will be created asynchronously.
172 * If the fork_name contains "/", the fork will be created inside
173 a repository group or nested repository groups.
173 174
174 This command can only be run using an |authtoken| with fork
175 permissions on the |repo|.
175 For example "foo/bar/fork-repo" will create a fork called "fork-repo"
176 inside group "foo/bar". You must have permissions to access and
177 write to the last repository group ("bar" in this example).
178
179 This command can only be run using an |authtoken| with at least read
180 permissions on the forked repo, and fork-creation permissions for the user.
176 181
177 182 :param apiuser: This is filled automatically from the |authtoken|.
178 183 :type apiuser: AuthUser
179 184 :param repoid: Set repository name or repository ID.
180 185 :type repoid: str or int
181 :param fork_name: Set the fork name.
186 :param fork_name: Set the fork name, including its repository group membership.
182 187 :type fork_name: str
183 188 :param owner: Set the fork owner.
184 189 :type owner: str
185 :param description: Set the fork descripton.
190 :param description: Set the fork description.
186 191 :type description: str
187 192 :param copy_permissions: Copy permissions from parent |repo|. The
188 193 default is False.
@@ -729,7 +734,7 b' lock'
729 734 id : <id_given_in_input>
730 735 result : null
731 736 error : {
732 'Error occurred locking repository `<reponame>`
737 'Error occurred locking repository `<reponame>`'
733 738 }
734 739
735 740
@@ -923,24 +928,31 b' strip'
923 928 update_repo
924 929 -----------
925 930
926 .. py:function:: update_repo(apiuser, repoid, name=<Optional:None>, owner=<Optional:<OptionalAttr:apiuser>>, group=<Optional:None>, fork_of=<Optional:None>, description=<Optional:''>, private=<Optional:False>, clone_uri=<Optional:None>, landing_rev=<Optional:'rev:tip'>, enable_statistics=<Optional:False>, enable_locking=<Optional:False>, enable_downloads=<Optional:False>, fields=<Optional:''>)
931 .. py:function:: update_repo(apiuser, repoid, repo_name=<Optional:None>, owner=<Optional:<OptionalAttr:apiuser>>, description=<Optional:''>, private=<Optional:False>, clone_uri=<Optional:None>, landing_rev=<Optional:'rev:tip'>, fork_of=<Optional:None>, enable_statistics=<Optional:False>, enable_locking=<Optional:False>, enable_downloads=<Optional:False>, fields=<Optional:''>)
927 932
928 933 Updates a repository with the given information.
929 934
930 935 This command can only be run using an |authtoken| with at least
931 write permissions to the |repo|.
936 admin permissions to the |repo|.
937
938 * If the repository name contains "/", the repository will be updated
939 and placed inside the matching repository group or nested repository groups
940
941 For example repoid=repo-test repo_name="foo/bar/repo-test" will update the
942 |repo| called "repo-test" and place it inside group "foo/bar".
943 You must have permissions to access and write to the last repository
944 group ("bar" in this example).
932 945
933 946 :param apiuser: This is filled automatically from the |authtoken|.
934 947 :type apiuser: AuthUser
935 948 :param repoid: repository name or repository ID.
936 949 :type repoid: str or int
937 :param name: Update the |repo| name.
938 :type name: str
950 :param repo_name: Update the |repo| name, including the
951 repository group it's in.
952 :type repo_name: str
939 953 :param owner: Set the |repo| owner.
940 954 :type owner: str
941 :param group: Set the |repo| group the |repo| belongs to.
942 :type group: str
943 :param fork_of: Set the master |repo| name.
955 :param fork_of: Set the |repo| as fork of another |repo|.
944 956 :type fork_of: str
945 957 :param description: Update the |repo| description.
946 958 :type description: str
@@ -948,16 +960,13 b' update_repo'
948 960 :type private: bool
949 961 :param clone_uri: Update the |repo| clone URI.
950 962 :type clone_uri: str
951 :param landing_rev: Set the |repo| landing revision. Default is
952 ``tip``.
963 :param landing_rev: Set the |repo| landing revision. Default is ``rev:tip``.
953 964 :type landing_rev: str
954 :param enable_statistics: Enable statistics on the |repo|,
955 (True | False).
965 :param enable_statistics: Enable statistics on the |repo|, (True | False).
956 966 :type enable_statistics: bool
957 967 :param enable_locking: Enable |repo| locking.
958 968 :type enable_locking: bool
959 :param enable_downloads: Enable downloads from the |repo|,
960 (True | False).
969 :param enable_downloads: Enable downloads from the |repo|, (True | False).
961 970 :type enable_downloads: bool
962 971 :param fields: Add extra fields to the |repo|. Use the following
963 972 example format: ``field_key=field_val,field_key2=fieldval2``.
@@ -1,7 +1,7 b''
1 1 .. _server-methods-ref:
2 2
3 3 server methods
4 =================
4 ==============
5 5
6 6 get_ip
7 7 ------
@@ -1,7 +1,7 b''
1 1 .. _user-group-methods-ref:
2 2
3 3 user_group methods
4 =================
4 ==================
5 5
6 6 add_user_to_user_group
7 7 ----------------------
@@ -1,12 +1,12 b''
1 1 .. _user-methods-ref:
2 2
3 3 user methods
4 =================
4 ============
5 5
6 6 create_user
7 7 -----------
8 8
9 .. py:function:: create_user(apiuser, username, email, password=<Optional:''>, firstname=<Optional:''>, lastname=<Optional:''>, active=<Optional:True>, admin=<Optional:False>, extern_name=<Optional:'rhodecode'>, extern_type=<Optional:'rhodecode'>, force_password_change=<Optional:False>)
9 .. py:function:: create_user(apiuser, username, email, password=<Optional:''>, firstname=<Optional:''>, lastname=<Optional:''>, active=<Optional:True>, admin=<Optional:False>, extern_name=<Optional:'rhodecode'>, extern_type=<Optional:'rhodecode'>, force_password_change=<Optional:False>, create_personal_repo_group=<Optional:None>)
10 10
11 11 Creates a new user and returns the new user object.
12 12
@@ -39,7 +39,8 b' create_user'
39 39 :param force_password_change: Force the new user to change password
40 40 on next login.
41 41 :type force_password_change: Optional(``True`` | ``False``)
42
42 :param create_personal_repo_group: Create a personal repo group for this user.
43 :type create_personal_repo_group: Optional(``True`` | ``False``)
43 44 Example output:
44 45
45 46 .. code-block:: bash
@@ -163,6 +164,7 b' get_user'
163 164 "usergroup.read",
164 165 "hg.repogroup.create.false",
165 166 "hg.create.none",
167 "hg.password_reset.enabled",
166 168 "hg.extern_activate.manual",
167 169 "hg.create.write_on_repogroup.false",
168 170 "hg.usergroup.create.false",
@@ -1,7 +1,7 b''
1 1
2 =====
3 API
4 =====
2 ===================
3 CONTRIBUTING TO API
4 ===================
5 5
6 6
7 7
@@ -130,7 +130,7 b' is a very small pencil which has to be c'
130 130 ticket.
131 131
132 132
133 .. figure:: images/redmine-description.png
133 .. figure:: ../images/redmine-description.png
134 134 :alt: Example of pencil to change the ticket description
135 135
136 136 Shows an example of the pencil which lets you change the description.
@@ -9,9 +9,6 b''
9 9 .. Avoid duplicating the quickstart instructions by importing the README
10 10 file.
11 11
12 .. include:: ../../../acceptance_tests/README.rst
13
14
15 12
16 13 Choices of technology and tools
17 14 ===============================
@@ -88,10 +88,10 b' let'
88 88 };
89 89
90 90 Sphinx = buildPythonPackage (rec {
91 name = "Sphinx-1.4.4";
91 name = "Sphinx-1.4.8";
92 92 src = fetchurl {
93 url = "https://pypi.python.org/packages/20/a2/72f44c84f6c4115e3fef58d36d657ec311d80196eab9fd5ec7bcde76143b/${name}.tar.gz";
94 md5 = "64ce2ec08d37ed56313a98232cbe2aee";
93 url = "https://pypi.python.org/packages/1f/f6/e54a7aad73e35232356103771ae76306dadd8546b024c646fbe75135571c/${name}.tar.gz";
94 md5 = "5ec718a4855917e149498bba91b74e67";
95 95 };
96 96 propagatedBuildInputs = [
97 97 docutils
@@ -20,8 +20,10 b' and commit files and |repos| while manag'
20 20 * Migration from existing databases.
21 21 * |RCM| SDK.
22 22 * Built-in analytics
23 * Built-in integrations including: Slack, Jenkins, Webhooks, Jira, Redmine, Hipchat
23 24 * Pluggable authentication system.
24 * Support for |LDAP|, Crowd, CAS, PAM.
25 * Support for AD, |LDAP|, Crowd, CAS, PAM.
26 * Support for external authentication via OAuth: Google, GitHub, Bitbucket, Twitter.
25 27 * Debug modes of operation.
26 28 * Private and public gists.
27 29 * Gists with limited lifetimes and within instance only sharing.
@@ -50,3 +50,4 b' See pages specific to each type of integ'
50 50 redmine
51 51 jira
52 52 webhook
53 email
@@ -7,6 +7,16 b' The Webhook integration allows you to PO'
7 7 or pull requests to a custom http endpoint as a json dict with details of the
8 8 event.
9 9
10 Starting from the 4.5.0 release, the webhook integration allows you to use
11 variables inside the URL. For example, in the URL
12 `https://server-example.com/${repo_name}`, ${repo_name} will be replaced
13 with the name of the repository the event was triggered from. Some
14 variables, like `${branch}`, will result in the webhook being called
15 multiple times when multiple branches are pushed.
16
17 Some variables, like `${pull_request_id}`, will be replaced only in
18 pull request related events.
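The ${var} replacement described above behaves like standard template substitution; a minimal sketch (the event keys and URLs are illustrative, not the actual integration code):

```python
from string import Template

# The ${var} placeholder syntax matches the webhook URL format above;
# the event payload keys here are illustrative.
url = Template('https://server-example.com/${repo_name}')
print(url.safe_substitute({'repo_name': 'my-repo'}))

# A ${branch} placeholder results in one webhook call per pushed branch:
branch_url = Template('https://server-example.com/${repo_name}/${branch}')
calls = [branch_url.safe_substitute(repo_name='my-repo', branch=b)
         for b in ('default', 'stable')]
```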
19
10 20 To create a webhook integration, select "webhook" in the integration settings
11 and use the url and key from your custom webhook. See
12 :ref:`creating-integrations` for additional instructions.
21 and use the URL and key from any custom webhook you previously created. See
22 :ref:`creating-integrations` for additional instructions.
@@ -7,38 +7,6 b' Release Date'
7 7 - 2016-08-12
8 8
9 9
10 General
11 ^^^^^^^
12
13 - Subversion: detect requests also based on magic path.
14 This adds subversion 1.9 support for SVN backend.
15 - Summary/changelog: unified how data is displayed for those pages.
16 * use consistent order of columns
17 * fix the link to commit status
18 * fix order of displaying comments
19 - Live-chat: refactor live chat system for code review based on
20 latest channelstream changes.
21 - SVN: Add template to generate the apache mod_dav_svn config for all
22 repository groups. Repository groups can now be automatically mapped to be
23 supported by SVN backend. Set `svn.proxy.generate_config = true` and similar
24 options found inside .ini config.
25 - Readme/markup: improved order of generating readme files. Fixes #4050
26 * we now use order based on default system renderer
27 * having multiple readme files will pick correct one as set renderer
28 - Api: add a max_file_bytes parameter to get_nodes so that large files
29 can be skipped.
30 - Auth-ldap: added flag to set debug mode for LDAP connections.
31 - Labs: moved rebase-merge option from labs settings into VCS settings.
32 - System: send platform type and version to upgrade endpoint when checking
33 for new versions.
34 - Packaging: update rhodecode-tools from 0.8.3 to 0.10.0
35 - Packaging: update codemirror from 5.4.0 to 5.11.0
36 - Packaging: updated pygments to 2.1.3
37 - Packaging: bumped supervisor to 3.3.0
38 - Packaging: bumped psycopg2 to 2.6.1
39 - Packaging: bumped mercurial to 3.8.4
40
41
42 10 New Features
43 11 ^^^^^^^^^^^^
44 12
@@ -64,6 +32,38 b' New Features'
64 32 onto comments you submitted while doing a code-review.
65 33
66 34
35 General
36 ^^^^^^^
37
38 - Subversion: detect requests also based on magic path.
39 This adds subversion 1.9 support for SVN backend.
40 - Summary/changelog: unified how data is displayed for those pages.
41 * use consistent order of columns
42 * fix the link to commit status
43 * fix order of displaying comments
44 - Live chat: refactor live chat system for code review based on
45 latest channelstream changes.
46 - SVN: Add template to generate the apache mod_dav_svn config for all
47 repository groups. Repository groups can now be automatically mapped to be
48 supported by SVN backend. Set `svn.proxy.generate_config = true` and similar
49 options found inside .ini config.
50 - Readme/markup: improved order of generating readme files. Fixes #4050
51 * we now use order based on default system renderer
52 * having multiple readme files will pick correct one as set renderer
53 - Api: add a max_file_bytes parameter to get_nodes so that large files
54 can be skipped.
55 - Auth-ldap: added flag to set debug mode for LDAP connections.
56 - Labs: moved rebase-merge option from labs settings into VCS settings.
57 - System: send platform type and version to upgrade endpoint when checking
58 for new versions.
59 - Packaging: update rhodecode-tools from 0.8.3 to 0.10.0
60 - Packaging: update codemirror from 5.4.0 to 5.11.0
61 - Packaging: updated pygments to 2.1.3
62 - Packaging: bumped supervisor to 3.3.0
63 - Packaging: bumped psycopg2 to 2.6.1
64 - Packaging: bumped mercurial to 3.8.4
65
66
67 67 Security
68 68 ^^^^^^^^
69 69
@@ -105,7 +105,7 b' Fixes'
105 105 support to gevent compatible handling.
106 106 - Diff2way: fixed unicode problem on non-ascii files.
107 107 - Full text search: whoosh schema uses now bigger ints, fixes #4035
108 - File-browser: optimized cached tree calculation, reduced load times by
108 - File browser: optimized cached tree calculation, reduced load times by
109 109 50% on complex file trees.
110 110 - Styling: #4086 fixing bug where long commit messages did not wrap in file view.
111 111 - SVN: Ignore the content length header from response, fixes #4112.
@@ -6,6 +6,27 b' Release Date'
6 6
7 7 - 2016-08-23
8 8
9
10 New Features
11 ^^^^^^^^^^^^
12
13
14
15 General
16 ^^^^^^^
17
18
19
20 Security
21 ^^^^^^^^
22
23
24
25 Performance
26 ^^^^^^^^^^^
27
28
29
9 30 Fixes
10 31 ^^^^^
11 32
@@ -7,18 +7,6 b' Release Date'
7 7 - 2016-09-16
8 8
9 9
10 General
11 ^^^^^^^
12
13 - UI: introduced Polymer webcomponents into core application. RhodeCode will
14 be now shipped together with Polymer framework webcomponents. Most of
15 dynamic UI components that require large amounts of interaction
16 will be done now with Polymer.
17 - live-notifications: use rhodecode-toast for live notifications instead of
18 toastr jquery plugin.
19 - Svn: moved svn http support out of labs settings. It's tested and stable now.
20
21
22 10 New Features
23 11 ^^^^^^^^^^^^
24 12
@@ -29,11 +17,11 b' New Features'
29 17 It will allow to configure exactly which projects use which integrations.
30 18 - Integrations: show branches/commits separately when posting push events
31 19 to hipchat/slack, fixes #4192.
32 - Pull-requests: summary page now shows update dates for pull request to
20 - Pull requests: summary page now shows update dates for pull requests to
33 21 more easily see which ones were recently updated.
34 22 - UI: hidden inline comments will be shown in side view when browsing the diffs
35 23 - Diffs: added inline comments toggle into pull requests diff view. #2884
36 - Live-chat: added summon reviewers functionality. You can now request
24 - Live chat: added summon reviewers functionality. You can now request
37 25 presence from online users into a chat for collaborative code-review.
38 26 This requires channelstream to be enabled.
39 27 - UX: added a static 502 page for gateway error. Once configured via
@@ -41,6 +29,18 b' New Features'
41 29 backend servers are offline. Fixes #4202.
42 30
43 31
32 General
33 ^^^^^^^
34
35 - UI: introduced Polymer webcomponents into core application. RhodeCode will
36 now be shipped together with the Polymer framework webcomponents. Most of
37 the dynamic UI components that require large amounts of interaction
38 will now be done with Polymer.
39 - Live notifications: use rhodecode-toast for live notifications instead of
40 toastr jquery plugin.
41 - Svn: moved svn http support out of labs settings. It's tested and stable now.
42
43
44 44 Security
45 45 ^^^^^^^^
46 46
@@ -67,12 +67,12 b' Fixes'
67 67 match rest of ui, fixes: #4200.
68 68 - UX: show multiple tags/branches in changelog/summary instead of
69 69 truncating them.
70 - My-account: fix test notifications for IE10+
70 - My account: fix test notifications for IE10+
71 71 - Vcs: change way refs are retrieved for git so same name branch/tags and
72 72 remotes can be supported, fixes #298.
73 73 - Lexers: added small extensions table to extend syntax highlighting for file
74 74 sources. Fixes #4227.
75 75 - Search: fix bug where file path link was wrong when the repository name was
76 76 in the file path, fixes #4228
77 - Fixed INT overflow bug
77 - Pagination: fixed INT overflow bug.
78 78 - Events: always send pushed commits in the correct order.
@@ -7,19 +7,18 b' Release Date'
7 7 - 2016-09-27
8 8
9 9
10 New Features
11 ^^^^^^^^^^^^
12
13
10 14 General
11 15 ^^^^^^^
12 16
13 - channelstream: auto-generate the url to channelstream server if it's not
17 - Channelstream: auto-generate the url to channelstream server if it's not
14 18 explicitly defined in the config. It allows using a relative server
15 19 without knowing its address upfront.
16 20
17 21
18 New Features
19 ^^^^^^^^^^^^
20
21
22
23 22 Security
24 23 ^^^^^^^^
25 24
@@ -34,7 +33,7 b' Fixes'
34 33 ^^^^^
35 34
36 35 - GIT: properly extract branches on events and submit them to integrations.
37 - Pullrequests: fix problems with unicode in auto-generated descriptions
36 - Pull requests: fix problems with unicode in auto-generated descriptions
38 37 - Gist: fixed bug in update functionality of Gists that auto changed them
39 38 to private.
40 39 - SVN: add proper escaping in the autogenerated svn mod_dav config
@@ -7,21 +7,21 b' Release Date'
7 7 - 2016-10-17
8 8
9 9
10 General
11 ^^^^^^^
12
13 - packaging: pinned against rhodecode-tools 0.10.1
14
15
16 10 New Features
17 11 ^^^^^^^^^^^^
18 12
19 13
20 14
15 General
16 ^^^^^^^
17
18 - Packaging: pinned against rhodecode-tools 0.10.1
19
20
21 21 Security
22 22 ^^^^^^^^
23 23
24 - integrations: fix 500 error on integrations page when delegated admin
24 - Integrations: fix 500 error on integrations page when delegated admin
25 25 tried to access integration page after adding some integrations.
26 26 Permission checks were too strict for delegated admins.
27 27
@@ -34,8 +34,8 b' Performance'
34 34 Fixes
35 35 ^^^^^
36 36
37 - vcsserver: make sure we correctly ping against bundled HG/GIT/SVN binaries.
37 - Vcsserver: make sure we correctly ping against bundled HG/GIT/SVN binaries.
38 38 This should fix a problem where system binaries could be used accidentally
39 39 by the RhodeCode.
40 - ldap: fixed email extraction issues. Empty email addresses from LDAP server
40 - LDAP: fixed email extraction issues. Empty email addresses from LDAP server
41 41 will no longer take precedence over those stored inside RhodeCode database.
@@ -9,6 +9,7 b' Release Notes'
9 9 .. toctree::
10 10 :maxdepth: 1
11 11
12 release-notes-4.5.0.rst
12 13 release-notes-4.4.2.rst
13 14 release-notes-4.4.1.rst
14 15 release-notes-4.4.0.rst
@@ -4,7 +4,7 b''
4 4 Scaling Best Practices
5 5 ======================
6 6
7 When deploying |RCE| at scale; 100s of users, multiple instances, CI servers,
7 When deploying |RCE| at scale; 1000s of users, multiple instances, CI servers,
8 8 there are a number of steps you can take to ensure you are getting the
9 9 most out of your system.
10 10
@@ -15,20 +15,23 b' You can configure multiple |RCE| instanc'
15 15 set of |repos|. This lets users work on an instance that has less traffic
16 16 than those being hit by CI servers. To configure this, use |RCC| to install
17 17 multiple instances and configure the database and |repos| connection. If you
18 do need to reset the database connection, see the
18 do need to reset/adjust the database connection, see the
19 19 :ref:`config-database` section.
20 20
21 Once configured, set your CI servers to use a particular instance and for
22 user specific instances you can configure loads balancing. See the
23 :ref:`nginx-ws-ref` section for examples.
21 You can then configure a load balancer to split the traffic between the
22 CI-dedicated instance and the instance that end users access.
23 See the :ref:`nginx-ws-ref` section for NGINX examples.
24 24
25 25 Switch to Database Sessions
26 26 ---------------------------
27 27
28 To increase database performance switch to database-based user sessions. In a
29 large scale deployment, we recommend switching from file-based
30 sessions to database-based user sessions. For configuration details, see the
31 :ref:`db-session-ref` section.
28 To increase |RCE| performance, switch from the default file-based sessions to
29 database-based ones. This way, your distributed instances do not need to
30 share file storage in order to use sessions.
31 Database-based sessions have an additional advantage over file-based
32 ones: they don't need periodic cleanup, as the session library
33 cleans them up automatically. For configuration details,
34 see the :ref:`db-session-ref` section.
32 35
33 36 Tuning |RCE|
34 37 ------------
@@ -6,7 +6,9 b''
6 6 },
7 7 "js": {
8 8 "src": "rhodecode/public/js/src",
9 "dest": "rhodecode/public/js"
9 "dest": "rhodecode/public/js",
10 "bower": "bower_components",
11 "node_modules": "node_modules"
10 12 }
11 13 },
12 14 "copy": {
@@ -34,7 +36,8 b''
34 36 "<%= dirs.js.src %>/bootstrap.js",
35 37 "<%= dirs.js.src %>/mousetrap.js",
36 38 "<%= dirs.js.src %>/moment.js",
37 "<%= dirs.js.src %>/appenlight-client-0.4.1.min.js",
39 "<%= dirs.js.node_modules %>/appenlight-client/appenlight-client.min.js",
40 "<%= dirs.js.node_modules %>/favico.js/favico-0.3.10.min.js",
38 41 "<%= dirs.js.src %>/i18n_utils.js",
39 42 "<%= dirs.js.src %>/deform.js",
40 43 "<%= dirs.js.src %>/plugins/jquery.pjax.js",
@@ -64,7 +67,6 b''
64 67 "<%= dirs.js.src %>/rhodecode/utils/ie.js",
65 68 "<%= dirs.js.src %>/rhodecode/utils/os.js",
66 69 "<%= dirs.js.src %>/rhodecode/utils/topics.js",
67 "<%= dirs.js.src %>/rhodecode/widgets/multiselect.js",
68 70 "<%= dirs.js.src %>/rhodecode/init.js",
69 71 "<%= dirs.js.src %>/rhodecode/codemirror.js",
70 72 "<%= dirs.js.src %>/rhodecode/comments.js",
@@ -144,7 +146,9 b''
144 146 "less:development",
145 147 "less:components",
146 148 "concat:polymercss",
147 "vulcanize"
149 "vulcanize",
150 "crisper",
151 "concat:dist"
148 152 ]
149 153 },
150 154 "js": {
@@ -13,6 +13,8 b''
13 13 "grunt-crisper": "^1.0.1",
14 14 "grunt-vulcanize": "^1.0.0",
15 15 "jshint": "^2.9.1-rc3",
16 "bower": "^1.7.9"
16 "bower": "^1.7.9",
17 "favico.js": "^0.3.10",
18 "appenlight-client": "git+https://git@github.com/AppEnlight/appenlight-client-js.git#0.5.0"
17 19 }
18 20 }
@@ -4,10 +4,12 b' buildEnv { name = "bower-env"; ignoreCol'
4 4 (fetchbower "polymer" "Polymer/polymer#1.6.1" "Polymer/polymer#^1.6.1" "09mm0jgk457gvwqlc155swch7gjr6fs3g7spnvhi6vh5b6518540")
5 5 (fetchbower "paper-button" "PolymerElements/paper-button#1.0.13" "PolymerElements/paper-button#^1.0.13" "0i3y153nqk06pn0gk282vyybnl3g1w3w41d5i9z659cgn27g3fvm")
6 6 (fetchbower "paper-spinner" "PolymerElements/paper-spinner#1.2.0" "PolymerElements/paper-spinner#^1.2.0" "1av1m6y81jw3hjhz1yqy3rwcgxarjzl58ldfn4q6sn51pgzngfqb")
7 (fetchbower "paper-tooltip" "PolymerElements/paper-tooltip#1.1.2" "PolymerElements/paper-tooltip#^1.1.2" "1j64nprcyk2d2bbl3qwjyr0lbjngm4wclpyfwgai1c4y6g6bigd2")
7 (fetchbower "paper-tooltip" "PolymerElements/paper-tooltip#1.1.3" "PolymerElements/paper-tooltip#^1.1.2" "0vmrm1n8k9sk9nvqy03q177axy22pia6i3j1gxbk72j3pqiqvg6k")
8 8 (fetchbower "paper-toast" "PolymerElements/paper-toast#1.3.0" "PolymerElements/paper-toast#^1.3.0" "0x9rqxsks5455s8pk4aikpp99ijdn6kxr9gvhwh99nbcqdzcxq1m")
9 9 (fetchbower "paper-toggle-button" "PolymerElements/paper-toggle-button#1.2.0" "PolymerElements/paper-toggle-button#^1.2.0" "0mphcng3ngspbpg4jjn0mb91nvr4xc1phq3qswib15h6sfww1b2w")
10 10 (fetchbower "iron-ajax" "PolymerElements/iron-ajax#1.4.3" "PolymerElements/iron-ajax#^1.4.3" "0m3dx27arwmlcp00b7n516sc5a51f40p9vapr1nvd57l3i3z0pzm")
11 (fetchbower "iron-autogrow-textarea" "PolymerElements/iron-autogrow-textarea#1.0.13" "PolymerElements/iron-autogrow-textarea#^1.0.13" "0zwhpl97vii1s8k0lgain8i9dnw29b0mxc5ixdscx9las13n2lqq")
12 (fetchbower "iron-a11y-keys" "PolymerElements/iron-a11y-keys#1.0.6" "PolymerElements/iron-a11y-keys#^1.0.6" "1xz3mgghfcxixq28sdb654iaxj4nyi1bzcwf77ydkms6fviqs9mv")
11 13 (fetchbower "iron-flex-layout" "PolymerElements/iron-flex-layout#1.3.1" "PolymerElements/iron-flex-layout#^1.0.0" "0nswv3ih3bhflgcd2wjfmddqswzgqxb2xbq65jk9w3rkj26hplbl")
12 14 (fetchbower "paper-behaviors" "PolymerElements/paper-behaviors#1.0.12" "PolymerElements/paper-behaviors#^1.0.0" "012bqk97awgz55cn7rm9g7cckrdhkqhls3zvp8l6nd4rdwcrdzq8")
13 15 (fetchbower "paper-material" "PolymerElements/paper-material#1.0.6" "PolymerElements/paper-material#^1.0.0" "0rljmknfdbm5aabvx9pk77754zckj3l127c3mvnmwkpkkr353xnh")
@@ -19,13 +21,13 b' buildEnv { name = "bower-env"; ignoreCol'
19 21 (fetchbower "iron-checked-element-behavior" "PolymerElements/iron-checked-element-behavior#1.0.5" "PolymerElements/iron-checked-element-behavior#^1.0.0" "0l0yy4ah454s8bzfv076s8by7h67zy9ni6xb932qwyhx8br6c1m7")
20 22 (fetchbower "promise-polyfill" "polymerlabs/promise-polyfill#1.0.1" "polymerlabs/promise-polyfill#^1.0.0" "045bj2caav3famr5hhxgs1dx7n08r4s46mlzwb313vdy17is38xb")
21 23 (fetchbower "iron-behaviors" "PolymerElements/iron-behaviors#1.0.17" "PolymerElements/iron-behaviors#^1.0.0" "021qvkmbk32jrrmmphpmwgby4bzi5jyf47rh1bxmq2ip07ly4bpr")
24 (fetchbower "iron-validatable-behavior" "PolymerElements/iron-validatable-behavior#1.1.1" "PolymerElements/iron-validatable-behavior#^1.0.0" "1yhxlvywhw2klbbgm3f3cmanxfxggagph4ii635zv0c13707wslv")
25 (fetchbower "iron-form-element-behavior" "PolymerElements/iron-form-element-behavior#1.0.6" "PolymerElements/iron-form-element-behavior#^1.0.0" "0rdhxivgkdhhz2yadgdbjfc70l555p3y83vjh8rfj5hr0asyn6q1")
26 (fetchbower "iron-a11y-keys-behavior" "polymerelements/iron-a11y-keys-behavior#1.1.9" "polymerelements/iron-a11y-keys-behavior#^1.0.0" "1imm4gc84qizihhbyhfa8lwjh3myhj837f79i5m04xjgwrjmkaf6")
22 27 (fetchbower "paper-ripple" "PolymerElements/paper-ripple#1.0.8" "PolymerElements/paper-ripple#^1.0.0" "0r9sq8ik7wwrw0qb82c3rw0c030ljwd3s466c9y4qbcrsbvfjnns")
23 28 (fetchbower "font-roboto" "PolymerElements/font-roboto#1.0.1" "PolymerElements/font-roboto#^1.0.1" "02jz43r0wkyr3yp7rq2rc08l5cwnsgca9fr54sr4rhsnl7cjpxrj")
24 29 (fetchbower "iron-meta" "PolymerElements/iron-meta#1.1.2" "PolymerElements/iron-meta#^1.0.0" "1wl4dx8fnsknw9z9xi8bpc4cy9x70c11x4zxwxnj73hf3smifppl")
25 30 (fetchbower "iron-resizable-behavior" "PolymerElements/iron-resizable-behavior#1.0.5" "PolymerElements/iron-resizable-behavior#^1.0.0" "1fd5zmbr2hax42vmcasncvk7lzi38fmb1kyii26nn8pnnjak7zkn")
26 31 (fetchbower "iron-selector" "PolymerElements/iron-selector#1.5.2" "PolymerElements/iron-selector#^1.0.0" "1ajv46llqzvahm5g6g75w7nfyjcslp53ji0wm96l2k94j87spv3r")
27 32 (fetchbower "web-animations-js" "web-animations/web-animations-js#2.2.2" "web-animations/web-animations-js#^2.2.0" "1izfvm3l67vwys0bqbhidi9rqziw2f8wv289386sc6jsxzgkzhga")
28 (fetchbower "iron-a11y-keys-behavior" "PolymerElements/iron-a11y-keys-behavior#1.1.7" "PolymerElements/iron-a11y-keys-behavior#^1.0.0" "070z46dbbz242002gmqrgy28x0y1fcqp9hnvbi05r3zphiqfx3l7")
29 (fetchbower "iron-validatable-behavior" "PolymerElements/iron-validatable-behavior#1.1.1" "PolymerElements/iron-validatable-behavior#^1.0.0" "1yhxlvywhw2klbbgm3f3cmanxfxggagph4ii635zv0c13707wslv")
30 (fetchbower "iron-form-element-behavior" "PolymerElements/iron-form-element-behavior#1.0.6" "PolymerElements/iron-form-element-behavior#^1.0.0" "0rdhxivgkdhhz2yadgdbjfc70l555p3y83vjh8rfj5hr0asyn6q1")
31 33 ]; }
@@ -103,6 +103,34 b' let'
103 103 sha1 = "a2e14ff85c2d6bf8c8080e5aa55129ebc6a2d320";
104 104 };
105 105 };
106 "bower-1.7.9" = {
107 name = "bower";
108 packageName = "bower";
109 version = "1.7.9";
110 src = fetchurl {
111 url = "https://registry.npmjs.org/bower/-/bower-1.7.9.tgz";
112 sha1 = "b7296c2393e0d75edaa6ca39648132dd255812b0";
113 };
114 };
115 "favico.js-0.3.10" = {
116 name = "favico.js";
117 packageName = "favico.js";
118 version = "0.3.10";
119 src = fetchurl {
120 url = "https://registry.npmjs.org/favico.js/-/favico.js-0.3.10.tgz";
121 sha1 = "80586e27a117f24a8d51c18a99bdc714d4339301";
122 };
123 };
124 "appenlight-client-git+https://git@github.com/AppEnlight/appenlight-client-js.git#0.5.0" = {
125 name = "appenlight-client";
126 packageName = "appenlight-client";
127 version = "0.5.0";
128 src = fetchgit {
129 url = "https://git@github.com/AppEnlight/appenlight-client-js.git";
130 rev = "b1d6853345dc3e96468b34537810b3eb77e0764f";
131 sha256 = "2ef00aef7dafdecdc1666d2e83fc190a796849985d04a8f0fad148d64aa4f8db";
132 };
133 };
106 134 "async-0.1.22" = {
107 135 name = "async";
108 136 packageName = "async";
@@ -301,13 +329,13 b' let'
301 329 sha1 = "fadd834b9683073da179b3eae6d9c0d15053f73e";
302 330 };
303 331 };
304 "inherits-2.0.1" = {
332 "inherits-2.0.3" = {
305 333 name = "inherits";
306 334 packageName = "inherits";
307 version = "2.0.1";
335 version = "2.0.3";
308 336 src = fetchurl {
309 url = "https://registry.npmjs.org/inherits/-/inherits-2.0.1.tgz";
310 sha1 = "b17d08d326b4423e568eff719f91b0b1cbdf69f1";
337 url = "https://registry.npmjs.org/inherits/-/inherits-2.0.3.tgz";
338 sha1 = "633c2c83e3da42a502f52466022480f4208261de";
311 339 };
312 340 };
313 341 "minimatch-0.3.0" = {
@@ -580,13 +608,13 b' let'
580 608 sha1 = "6cbfea22b3b830304e9a5fb371d54fa480c9d7cf";
581 609 };
582 610 };
583 "lodash-4.15.0" = {
611 "lodash-4.16.2" = {
584 612 name = "lodash";
585 613 packageName = "lodash";
586 version = "4.15.0";
614 version = "4.16.2";
587 615 src = fetchurl {
588 url = "https://registry.npmjs.org/lodash/-/lodash-4.15.0.tgz";
589 sha1 = "3162391d8f0140aa22cf8f6b3c34d6b7f63d3aa9";
616 url = "https://registry.npmjs.org/lodash/-/lodash-4.16.2.tgz";
617 sha1 = "3e626db827048a699281a8a125226326cfc0e652";
590 618 };
591 619 };
592 620 "errno-0.1.4" = {
@@ -598,13 +626,13 b' let'
598 626 sha1 = "b896e23a9e5e8ba33871fc996abd3635fc9a1c7d";
599 627 };
600 628 };
601 "graceful-fs-4.1.6" = {
629 "graceful-fs-4.1.8" = {
602 630 name = "graceful-fs";
603 631 packageName = "graceful-fs";
604 version = "4.1.6";
632 version = "4.1.8";
605 633 src = fetchurl {
606 url = "https://registry.npmjs.org/graceful-fs/-/graceful-fs-4.1.6.tgz";
607 sha1 = "514c38772b31bee2e08bedc21a0aeb3abf54c19e";
634 url = "https://registry.npmjs.org/graceful-fs/-/graceful-fs-4.1.8.tgz";
635 sha1 = "da3e11135eb2168bdd374532c4e2649751672890";
608 636 };
609 637 };
610 638 "image-size-0.5.0" = {
@@ -670,13 +698,13 b' let'
670 698 sha1 = "857fcabfc3397d2625b8228262e86aa7a011b05d";
671 699 };
672 700 };
673 "asap-2.0.4" = {
701 "asap-2.0.5" = {
674 702 name = "asap";
675 703 packageName = "asap";
676 version = "2.0.4";
704 version = "2.0.5";
677 705 src = fetchurl {
678 url = "https://registry.npmjs.org/asap/-/asap-2.0.4.tgz";
679 sha1 = "b391bf7f6bfbc65706022fec8f49c4b07fecf589";
706 url = "https://registry.npmjs.org/asap/-/asap-2.0.5.tgz";
707 sha1 = "522765b50c3510490e52d7dcfe085ef9ba96958f";
680 708 };
681 709 };
682 710 "gaze-0.5.2" = {
@@ -778,13 +806,13 b' let'
778 806 sha1 = "f197d6eaff34c9085577484b2864375b294f5697";
779 807 };
780 808 };
781 "dom5-1.3.3" = {
809 "dom5-1.3.6" = {
782 810 name = "dom5";
783 811 packageName = "dom5";
784 version = "1.3.3";
812 version = "1.3.6";
785 813 src = fetchurl {
786 url = "https://registry.npmjs.org/dom5/-/dom5-1.3.3.tgz";
787 sha1 = "07e514522c245c7aa8512aa3f9118e8bcab9f909";
814 url = "https://registry.npmjs.org/dom5/-/dom5-1.3.6.tgz";
815 sha1 = "a7088a9fc5f3b08dc9f6eda4c7abaeb241945e0d";
788 816 };
789 817 };
790 818 "array-back-1.0.3" = {
@@ -832,13 +860,13 b' let'
832 860 sha1 = "a2d6ce740d15f0d92b1b26763e2ce9c0e361fd98";
833 861 };
834 862 };
835 "typical-2.5.0" = {
863 "typical-2.6.0" = {
836 864 name = "typical";
837 865 packageName = "typical";
838 version = "2.5.0";
866 version = "2.6.0";
839 867 src = fetchurl {
840 url = "https://registry.npmjs.org/typical/-/typical-2.5.0.tgz";
841 sha1 = "81244918aa28180c9e602aa457173404be0604f1";
868 url = "https://registry.npmjs.org/typical/-/typical-2.6.0.tgz";
869 sha1 = "89d51554ab139848a65bcc2c8772f8fb450c40ed";
842 870 };
843 871 };
844 872 "ansi-escape-sequences-2.2.2" = {
@@ -958,22 +986,22 b' let'
958 986 sha1 = "a09136f72ec043d27c893707c2b159bfad7de93f";
959 987 };
960 988 };
961 "test-value-2.0.0" = {
989 "test-value-2.1.0" = {
962 990 name = "test-value";
963 991 packageName = "test-value";
964 version = "2.0.0";
992 version = "2.1.0";
965 993 src = fetchurl {
966 url = "https://registry.npmjs.org/test-value/-/test-value-2.0.0.tgz";
967 sha1 = "0d65c45ee0b48a757c4507a5e98ec2680a9db137";
994 url = "https://registry.npmjs.org/test-value/-/test-value-2.1.0.tgz";
995 sha1 = "11da6ff670f3471a73b625ca4f3fdcf7bb748291";
968 996 };
969 997 };
970 "@types/clone-0.1.29" = {
998 "@types/clone-0.1.30" = {
971 999 name = "@types/clone";
972 1000 packageName = "@types/clone";
973 version = "0.1.29";
1001 version = "0.1.30";
974 1002 src = fetchurl {
975 url = "https://registry.npmjs.org/@types/clone/-/clone-0.1.29.tgz";
976 sha1 = "65a0be88189ffddcd373e450aa6b68c9c83218b7";
1003 url = "https://registry.npmjs.org/@types/clone/-/clone-0.1.30.tgz";
1004 sha1 = "e7365648c1b42136a59c7d5040637b3b5c83b614";
977 1005 };
978 1006 };
979 1007 "@types/node-4.0.30" = {
@@ -985,13 +1013,13 b' let'
985 1013 sha1 = "553f490ed3030311620f88003e7abfc0edcb301e";
986 1014 };
987 1015 };
988 "@types/parse5-0.0.28" = {
1016 "@types/parse5-0.0.31" = {
989 1017 name = "@types/parse5";
990 1018 packageName = "@types/parse5";
991 version = "0.0.28";
1019 version = "0.0.31";
992 1020 src = fetchurl {
993 url = "https://registry.npmjs.org/@types/parse5/-/parse5-0.0.28.tgz";
994 sha1 = "2a38cb7145bb157688d4ad2c46944c6dffae3cc6";
1021 url = "https://registry.npmjs.org/@types/parse5/-/parse5-0.0.31.tgz";
1022 sha1 = "e827a493a443b156e1b582a2e4c3bdc0040f2ee7";
995 1023 };
996 1024 };
997 1025 "clone-1.0.2" = {
@@ -1012,13 +1040,13 b' let'
1012 1040 sha1 = "9b7f3b0de32be78dc2401b17573ccaf0f6f59d94";
1013 1041 };
1014 1042 };
1015 "@types/node-6.0.37" = {
1043 "@types/node-6.0.41" = {
1016 1044 name = "@types/node";
1017 1045 packageName = "@types/node";
1018 version = "6.0.37";
1046 version = "6.0.41";
1019 1047 src = fetchurl {
1020 url = "https://registry.npmjs.org/@types/node/-/node-6.0.37.tgz";
1021 sha1 = "a1e081f2ec60074113d3a1fbf11f35d304f30e39";
1048 url = "https://registry.npmjs.org/@types/node/-/node-6.0.41.tgz";
1049 sha1 = "578cf53aaec65887bcaf16792f8722932e8ff8ea";
1022 1050 };
1023 1051 };
1024 1052 "es6-promise-2.3.0" = {
@@ -1093,13 +1121,13 b' let'
1093 1121 sha1 = "5a5b53af4693110bebb0867aa3430dd3b70a1018";
1094 1122 };
1095 1123 };
1096 "espree-3.1.7" = {
1124 "espree-3.3.1" = {
1097 1125 name = "espree";
1098 1126 packageName = "espree";
1099 version = "3.1.7";
1127 version = "3.3.1";
1100 1128 src = fetchurl {
1101 url = "https://registry.npmjs.org/espree/-/espree-3.1.7.tgz";
1102 sha1 = "fd5deec76a97a5120a9cd3a7cb1177a0923b11d2";
1129 url = "https://registry.npmjs.org/espree/-/espree-3.3.1.tgz";
1130 sha1 = "42107376856738a65ff3b5877f3a58bd52497643";
1103 1131 };
1104 1132 };
1105 1133 "estraverse-3.1.0" = {
@@ -1183,13 +1211,13 b' let'
1183 1211 sha1 = "96e3b70d5779f6ad49cd032673d1c312767ba581";
1184 1212 };
1185 1213 };
1186 "optionator-0.8.1" = {
1214 "optionator-0.8.2" = {
1187 1215 name = "optionator";
1188 1216 packageName = "optionator";
1189 version = "0.8.1";
1217 version = "0.8.2";
1190 1218 src = fetchurl {
1191 url = "https://registry.npmjs.org/optionator/-/optionator-0.8.1.tgz";
1192 sha1 = "e31b4932cdd5fb862a8b0d10bc63d3ee1ec7d78b";
1219 url = "https://registry.npmjs.org/optionator/-/optionator-0.8.2.tgz";
1220 sha1 = "364c5e409d3f4d6301d6c0b4c05bba50180aeb64";
1193 1221 };
1194 1222 };
1195 1223 "source-map-0.2.0" = {
@@ -1246,13 +1274,31 b' let'
1246 1274 sha1 = "3b09924edf9f083c0490fdd4c0bc4421e04764ee";
1247 1275 };
1248 1276 };
1249 "fast-levenshtein-1.1.4" = {
1277 "fast-levenshtein-2.0.4" = {
1250 1278 name = "fast-levenshtein";
1251 1279 packageName = "fast-levenshtein";
1252 version = "1.1.4";
1280 version = "2.0.4";
1281 src = fetchurl {
1282 url = "https://registry.npmjs.org/fast-levenshtein/-/fast-levenshtein-2.0.4.tgz";
1283 sha1 = "e31e729eea62233c60a7bc9dce2bdcc88b4fffe3";
1284 };
1285 };
1286 "acorn-4.0.3" = {
1287 name = "acorn";
1288 packageName = "acorn";
1289 version = "4.0.3";
1253 1290 src = fetchurl {
1254 url = "https://registry.npmjs.org/fast-levenshtein/-/fast-levenshtein-1.1.4.tgz";
1255 sha1 = "e6a754cc8f15e58987aa9cbd27af66fd6f4e5af9";
1291 url = "https://registry.npmjs.org/acorn/-/acorn-4.0.3.tgz";
1292 sha1 = "1a3e850b428e73ba6b09d1cc527f5aaad4d03ef1";
1293 };
1294 };
1295 "acorn-jsx-3.0.1" = {
1296 name = "acorn-jsx";
1297 packageName = "acorn-jsx";
1298 version = "3.0.1";
1299 src = fetchurl {
1300 url = "https://registry.npmjs.org/acorn-jsx/-/acorn-jsx-3.0.1.tgz";
1301 sha1 = "afdf9488fb1ecefc8348f6fb22f464e32a58b36b";
1256 1302 };
1257 1303 };
1258 1304 "acorn-3.3.0" = {
@@ -1264,15 +1310,6 b' let'
1264 1310 sha1 = "45e37fb39e8da3f25baee3ff5369e2bb5f22017a";
1265 1311 };
1266 1312 };
1267 "acorn-jsx-3.0.1" = {
1268 name = "acorn-jsx";
1269 packageName = "acorn-jsx";
1270 version = "3.0.1";
1271 src = fetchurl {
1272 url = "https://registry.npmjs.org/acorn-jsx/-/acorn-jsx-3.0.1.tgz";
1273 sha1 = "afdf9488fb1ecefc8348f6fb22f464e32a58b36b";
1274 };
1275 };
1276 1313 "boxen-0.3.1" = {
1277 1314 name = "boxen";
1278 1315 packageName = "boxen";
@@ -1282,13 +1319,13 b' let'
1282 1319 sha1 = "a7d898243ae622f7abb6bb604d740a76c6a5461b";
1283 1320 };
1284 1321 };
1285 "configstore-2.0.0" = {
1322 "configstore-2.1.0" = {
1286 1323 name = "configstore";
1287 1324 packageName = "configstore";
1288 version = "2.0.0";
1325 version = "2.1.0";
1289 1326 src = fetchurl {
1290 url = "https://registry.npmjs.org/configstore/-/configstore-2.0.0.tgz";
1291 sha1 = "8d81e9cdfa73ebd0e06bc985147856b2f1c4e764";
1327 url = "https://registry.npmjs.org/configstore/-/configstore-2.1.0.tgz";
1328 sha1 = "737a3a7036e9886102aa6099e47bb33ab1aba1a1";
1292 1329 };
1293 1330 };
1294 1331 "is-npm-1.0.0" = {
@@ -1399,13 +1436,13 b' let'
1399 1436 sha1 = "ef9e31386f031a7f0d643af82fde50c457ef00cb";
1400 1437 };
1401 1438 };
1402 "dot-prop-2.4.0" = {
1439 "dot-prop-3.0.0" = {
1403 1440 name = "dot-prop";
1404 1441 packageName = "dot-prop";
1405 version = "2.4.0";
1442 version = "3.0.0";
1406 1443 src = fetchurl {
1407 url = "https://registry.npmjs.org/dot-prop/-/dot-prop-2.4.0.tgz";
1408 sha1 = "848e28f7f1d50740c6747ab3cb07670462b6f89c";
1444 url = "https://registry.npmjs.org/dot-prop/-/dot-prop-3.0.0.tgz";
1445 sha1 = "1b708af094a49c9a0e7dbcad790aba539dac1177";
1409 1446 };
1410 1447 };
1411 1448 "os-tmpdir-1.0.1" = {
@@ -1426,13 +1463,13 b' let'
1426 1463 sha1 = "83cf05c6d6458fc4d5ac6362ea325d92f2754217";
1427 1464 };
1428 1465 };
1429 "uuid-2.0.2" = {
1466 "uuid-2.0.3" = {
1430 1467 name = "uuid";
1431 1468 packageName = "uuid";
1432 version = "2.0.2";
1469 version = "2.0.3";
1433 1470 src = fetchurl {
1434 url = "https://registry.npmjs.org/uuid/-/uuid-2.0.2.tgz";
1435 sha1 = "48bd5698f0677e3c7901a1c46ef15b1643794726";
1471 url = "https://registry.npmjs.org/uuid/-/uuid-2.0.3.tgz";
1472 sha1 = "67e2e863797215530dff318e5bf9dcebfd47b21a";
1436 1473 };
1437 1474 };
1438 1475 "write-file-atomic-1.2.0" = {
@@ -1489,13 +1526,13 b' let'
1489 1526 sha1 = "56eb027d65b4d2dce6cb2e2d32c4d4afc9e1d707";
1490 1527 };
1491 1528 };
1492 "package-json-2.3.3" = {
1529 "package-json-2.4.0" = {
1493 1530 name = "package-json";
1494 1531 packageName = "package-json";
1495 version = "2.3.3";
1532 version = "2.4.0";
1496 1533 src = fetchurl {
1497 url = "https://registry.npmjs.org/package-json/-/package-json-2.3.3.tgz";
1498 sha1 = "14895311a963d18edf8801e06b67ea87795d15b9";
1534 url = "https://registry.npmjs.org/package-json/-/package-json-2.4.0.tgz";
1535 sha1 = "0d15bd67d1cbbddbb2ca222ff2edb86bcb31a8bb";
1499 1536 };
1500 1537 };
1501 1538 "got-5.6.0" = {
@@ -1507,13 +1544,13 b' let'
1507 1544 sha1 = "bb1d7ee163b78082bbc8eb836f3f395004ea6fbf";
1508 1545 };
1509 1546 };
1510 "rc-1.1.6" = {
1511 name = "rc";
1512 packageName = "rc";
1513 version = "1.1.6";
1547 "registry-auth-token-3.0.1" = {
1548 name = "registry-auth-token";
1549 packageName = "registry-auth-token";
1550 version = "3.0.1";
1514 1551 src = fetchurl {
1515 url = "https://registry.npmjs.org/rc/-/rc-1.1.6.tgz";
1516 sha1 = "43651b76b6ae53b5c802f1151fa3fc3b059969c9";
1552 url = "https://registry.npmjs.org/registry-auth-token/-/registry-auth-token-3.0.1.tgz";
1553 sha1 = "c3ee5ec585bce29f88bf41629a3944c71ed53e25";
1517 1554 };
1518 1555 };
1519 1556 "registry-url-3.1.0" = {
@@ -1651,13 +1688,13 b' let'
1651 1688 sha1 = "f38b0ae81d3747d628001f41dafc652ace671c0a";
1652 1689 };
1653 1690 };
1654 "unzip-response-1.0.0" = {
1691 "unzip-response-1.0.1" = {
1655 1692 name = "unzip-response";
1656 1693 packageName = "unzip-response";
1657 version = "1.0.0";
1694 version = "1.0.1";
1658 1695 src = fetchurl {
1659 url = "https://registry.npmjs.org/unzip-response/-/unzip-response-1.0.0.tgz";
1660 sha1 = "bfda54eeec658f00c2df4d4494b9dca0ca00f3e4";
1696 url = "https://registry.npmjs.org/unzip-response/-/unzip-response-1.0.1.tgz";
1697 sha1 = "4a73959f2989470fa503791cefb54e1dbbc68412";
1661 1698 };
1662 1699 };
1663 1700 "url-parse-lax-1.0.0" = {
@@ -1768,6 +1805,15 b' let'
1768 1805 sha1 = "d4f4562b0ce3696e41ac52d0e002e57a635dc6dc";
1769 1806 };
1770 1807 };
1808 "rc-1.1.6" = {
1809 name = "rc";
1810 packageName = "rc";
1811 version = "1.1.6";
1812 src = fetchurl {
1813 url = "https://registry.npmjs.org/rc/-/rc-1.1.6.tgz";
1814 sha1 = "43651b76b6ae53b5c802f1151fa3fc3b059969c9";
1815 };
1816 };
1771 1817 "ini-1.3.4" = {
1772 1818 name = "ini";
1773 1819 packageName = "ini";
@@ -1858,13 +1904,13 b' let'
1858 1904 sha1 = "3678bd8ab995057c07ade836ed2ef087da811d45";
1859 1905 };
1860 1906 };
1861 "glob-7.0.6" = {
1907 "glob-7.1.0" = {
1862 1908 name = "glob";
1863 1909 packageName = "glob";
1864 version = "7.0.6";
1910 version = "7.1.0";
1865 1911 src = fetchurl {
1866 url = "https://registry.npmjs.org/glob/-/glob-7.0.6.tgz";
1867 sha1 = "211bafaf49e525b8cd93260d14ab136152b3f57a";
1912 url = "https://registry.npmjs.org/glob/-/glob-7.1.0.tgz";
1913 sha1 = "36add856d746d0d99e4cc2797bba1ae2c67272fd";
1868 1914 };
1869 1915 };
1870 1916 "fs.realpath-1.0.0" = {
@@ -1885,13 +1931,13 b' let'
1885 1931 sha1 = "db3204cd5a9de2e6cd890b85c6e2f66bcf4f620a";
1886 1932 };
1887 1933 };
1888 "once-1.3.3" = {
1934 "once-1.4.0" = {
1889 1935 name = "once";
1890 1936 packageName = "once";
1891 version = "1.3.3";
1937 version = "1.4.0";
1892 1938 src = fetchurl {
1893 url = "https://registry.npmjs.org/once/-/once-1.3.3.tgz";
1894 sha1 = "b2e261557ce4c314ec8304f3fa82663e4297ca20";
1939 url = "https://registry.npmjs.org/once/-/once-1.4.0.tgz";
1940 sha1 = "583b1aa775961d4b113ac17d9c50baef9dd76bd1";
1895 1941 };
1896 1942 };
1897 1943 "wrappy-1.0.2" = {
@@ -2034,7 +2080,7 b' let'
2034 2080 (sources."grunt-contrib-less-1.4.0" // {
2035 2081 dependencies = [
2036 2082 sources."async-2.0.1"
2037 sources."lodash-4.15.0"
2083 sources."lodash-4.16.2"
2038 2084 ];
2039 2085 })
2040 2086 (sources."grunt-contrib-watch-0.6.1" // {
@@ -2062,6 +2108,9 b' let'
2062 2108 sources."lodash-3.7.0"
2063 2109 ];
2064 2110 })
2111 sources."bower-1.7.9"
2112 sources."favico.js-0.3.10"
2113 sources."appenlight-client-git+https://git@github.com/AppEnlight/appenlight-client-js.git#0.5.0"
2065 2114 sources."async-0.1.22"
2066 2115 sources."coffee-script-1.3.3"
2067 2116 sources."colors-0.6.2"
@@ -2097,7 +2146,7 b' let'
2097 2146 sources."underscore.string-2.3.3"
2098 2147 ];
2099 2148 })
2100 sources."inherits-2.0.1"
2149 sources."inherits-2.0.3"
2101 2150 sources."lru-cache-2.7.3"
2102 2151 sources."sigmund-1.0.1"
2103 2152 sources."graceful-fs-1.2.3"
@@ -2127,7 +2176,7 b' let'
2127 2176 sources."amdefine-1.0.0"
2128 2177 (sources."less-2.7.1" // {
2129 2178 dependencies = [
2130 sources."graceful-fs-4.1.6"
2179 sources."graceful-fs-4.1.8"
2131 2180 sources."source-map-0.5.6"
2132 2181 ];
2133 2182 })
@@ -2138,7 +2187,7 b' let'
2138 2187 sources."promise-7.1.1"
2139 2188 sources."prr-0.0.0"
2140 2189 sources."minimist-0.0.8"
2141 sources."asap-2.0.4"
2190 sources."asap-2.0.5"
2142 2191 sources."gaze-0.5.2"
2143 2192 sources."tiny-lr-fork-0.0.5"
2144 2193 (sources."globule-0.1.0" // {
@@ -2155,17 +2204,17 b' let'
2155 2204 })
2156 2205 sources."debug-0.7.4"
2157 2206 sources."command-line-args-2.1.6"
2158 sources."dom5-1.3.3"
2207 sources."dom5-1.3.6"
2159 2208 sources."array-back-1.0.3"
2160 2209 sources."command-line-usage-2.0.5"
2161 2210 sources."core-js-2.4.1"
2162 2211 sources."feature-detect-es6-1.3.1"
2163 2212 (sources."find-replace-1.0.2" // {
2164 2213 dependencies = [
2165 sources."test-value-2.0.0"
2214 sources."test-value-2.1.0"
2166 2215 ];
2167 2216 })
2168 sources."typical-2.5.0"
2217 sources."typical-2.6.0"
2169 2218 sources."ansi-escape-sequences-2.2.2"
2170 2219 sources."column-layout-2.1.4"
2171 2220 sources."wordwrapjs-1.2.1"
@@ -2182,11 +2231,11 b' let'
2182 2231 sources."object-tools-2.0.6"
2183 2232 sources."object-get-2.1.0"
2184 2233 sources."test-value-1.1.0"
2185 sources."@types/clone-0.1.29"
2234 sources."@types/clone-0.1.30"
2186 2235 sources."@types/node-4.0.30"
2187 (sources."@types/parse5-0.0.28" // {
2236 (sources."@types/parse5-0.0.31" // {
2188 2237 dependencies = [
2189 sources."@types/node-6.0.37"
2238 sources."@types/node-6.0.41"
2190 2239 ];
2191 2240 })
2192 2241 sources."clone-1.0.2"
@@ -2205,26 +2254,30 b' let'
2205 2254 sources."source-map-0.2.0"
2206 2255 ];
2207 2256 })
2208 sources."espree-3.1.7"
2257 sources."espree-3.3.1"
2209 2258 sources."estraverse-3.1.0"
2210 2259 sources."path-is-absolute-1.0.0"
2211 2260 sources."babel-runtime-6.11.6"
2212 2261 sources."regenerator-runtime-0.9.5"
2213 2262 sources."esutils-1.1.6"
2214 2263 sources."isarray-0.0.1"
2215 sources."optionator-0.8.1"
2264 sources."optionator-0.8.2"
2216 2265 sources."prelude-ls-1.1.2"
2217 2266 sources."deep-is-0.1.3"
2218 2267 sources."wordwrap-1.0.0"
2219 2268 sources."type-check-0.3.2"
2220 2269 sources."levn-0.3.0"
2221 sources."fast-levenshtein-1.1.4"
2222 sources."acorn-3.3.0"
2223 sources."acorn-jsx-3.0.1"
2270 sources."fast-levenshtein-2.0.4"
2271 sources."acorn-4.0.3"
2272 (sources."acorn-jsx-3.0.1" // {
2273 dependencies = [
2274 sources."acorn-3.3.0"
2275 ];
2276 })
2224 2277 sources."boxen-0.3.1"
2225 (sources."configstore-2.0.0" // {
2278 (sources."configstore-2.1.0" // {
2226 2279 dependencies = [
2227 sources."graceful-fs-4.1.6"
2280 sources."graceful-fs-4.1.8"
2228 2281 ];
2229 2282 })
2230 2283 sources."is-npm-1.0.0"
@@ -2239,13 +2292,13 b' let'
2239 2292 sources."number-is-nan-1.0.0"
2240 2293 sources."code-point-at-1.0.0"
2241 2294 sources."is-fullwidth-code-point-1.0.0"
2242 sources."dot-prop-2.4.0"
2295 sources."dot-prop-3.0.0"
2243 2296 sources."os-tmpdir-1.0.1"
2244 2297 sources."osenv-0.1.3"
2245 sources."uuid-2.0.2"
2298 sources."uuid-2.0.3"
2246 2299 (sources."write-file-atomic-1.2.0" // {
2247 2300 dependencies = [
2248 sources."graceful-fs-4.1.6"
2301 sources."graceful-fs-4.1.8"
2249 2302 ];
2250 2303 })
2251 2304 sources."xdg-basedir-2.0.0"
@@ -2253,13 +2306,9 b' let'
2253 2306 sources."os-homedir-1.0.1"
2254 2307 sources."imurmurhash-0.1.4"
2255 2308 sources."slide-1.1.6"
2256 sources."package-json-2.3.3"
2309 sources."package-json-2.4.0"
2257 2310 sources."got-5.6.0"
2258 (sources."rc-1.1.6" // {
2259 dependencies = [
2260 sources."minimist-1.2.0"
2261 ];
2262 })
2311 sources."registry-auth-token-3.0.1"
2263 2312 sources."registry-url-3.1.0"
2264 2313 sources."semver-5.3.0"
2265 2314 sources."create-error-class-3.0.2"
@@ -2279,7 +2328,7 b' let'
2279 2328 ];
2280 2329 })
2281 2330 sources."timed-out-2.0.0"
2282 sources."unzip-response-1.0.0"
2331 sources."unzip-response-1.0.1"
2283 2332 sources."url-parse-lax-1.0.0"
2284 2333 sources."capture-stack-trace-1.0.0"
2285 2334 sources."error-ex-1.3.0"
@@ -2291,11 +2340,16 b' let'
2291 2340 sources."string_decoder-0.10.31"
2292 2341 sources."util-deprecate-1.0.2"
2293 2342 sources."prepend-http-1.0.4"
2343 (sources."rc-1.1.6" // {
2344 dependencies = [
2345 sources."minimist-1.2.0"
2346 ];
2347 })
2294 2348 sources."ini-1.3.4"
2295 2349 sources."strip-json-comments-1.0.4"
2296 2350 (sources."cli-1.0.0" // {
2297 2351 dependencies = [
2298 sources."glob-7.0.6"
2352 sources."glob-7.1.0"
2299 2353 sources."minimatch-3.0.3"
2300 2354 ];
2301 2355 })
@@ -2308,7 +2362,7 b' let'
2308 2362 sources."shelljs-0.3.0"
2309 2363 sources."fs.realpath-1.0.0"
2310 2364 sources."inflight-1.0.5"
2311 sources."once-1.3.3"
2365 sources."once-1.4.0"
2312 2366 sources."wrappy-1.0.2"
2313 2367 sources."brace-expansion-1.1.6"
2314 2368 sources."balanced-match-0.4.2"
@@ -15,19 +15,6 b' let'
15 15 };
16 16 };
17 17
18 # johbo: Interim bridge which allows us to build with the upcoming
19 # nixos.16.09 branch (unstable at the moment of writing this note) and the
20 # current stable nixos-16.03.
21 backwardsCompatibleFetchgit = { ... }@args:
22 let
23 origSources = pkgs.fetchgit args;
24 in
25 pkgs.lib.overrideDerivation origSources (oldAttrs: {
26 NIX_PREFETCH_GIT_CHECKOUT_HOOK = ''
27 find $out -name '.git*' -print0 | xargs -0 rm -rf
28 '';
29 });
30
31 18 in
32 19
33 20 self: super: {
@@ -77,6 +64,9 b' self: super: {'
77 64 });
78 65
79 66 lxml = super.lxml.override (attrs: {
67 # johbo: On 16.09 we need this to compile on darwin, otherwise compilation
68 # fails on Darwin.
69 hardeningDisable = if pkgs.stdenv.isDarwin then [ "format" ] else null;
80 70 buildInputs = with self; [
81 71 pkgs.libxml2
82 72 pkgs.libxslt
@@ -110,7 +100,7 b' self: super: {'
110 100 });
111 101
112 102 py-gfm = super.py-gfm.override {
113 src = backwardsCompatibleFetchgit {
103 src = pkgs.fetchgit {
114 104 url = "https://code.rhodecode.com/upstream/py-gfm";
115 105 rev = "0d66a19bc16e3d49de273c0f797d4e4781e8c0f2";
116 106 sha256 = "0ryp74jyihd3ckszq31bml5jr3bciimhfp7va7kw6ld92930ksv3";
@@ -134,7 +124,7 b' self: super: {'
134 124
135 125 Pylons = super.Pylons.override (attrs: {
136 126 name = "Pylons-1.0.1-patch1";
137 src = backwardsCompatibleFetchgit {
127 src = pkgs.fetchgit {
138 128 url = "https://code.rhodecode.com/upstream/pylons";
139 129 rev = "707354ee4261b9c10450404fc9852ccea4fd667d";
140 130 sha256 = "b2763274c2780523a335f83a1df65be22ebe4ff413a7bc9e9288d23c1f62032e";
@@ -190,7 +180,8 b' self: super: {'
190 180 pkgs.openldap
191 181 pkgs.openssl
192 182 ];
193 NIX_CFLAGS_COMPILE = "-I${pkgs.cyrus_sasl}/include/sasl";
183 # TODO: johbo: Remove the "or" once we drop 16.03 support.
184 NIX_CFLAGS_COMPILE = "-I${pkgs.cyrus_sasl.dev or pkgs.cyrus_sasl}/include/sasl";
194 185 });
195 186
196 187 python-pam = super.python-pam.override (attrs:
@@ -431,6 +431,19 b''
431 431 license = [ pkgs.lib.licenses.psfl ];
432 432 };
433 433 };
434 backports.shutil-get-terminal-size = super.buildPythonPackage {
435 name = "backports.shutil-get-terminal-size-1.0.0";
436 buildInputs = with self; [];
437 doCheck = false;
438 propagatedBuildInputs = with self; [];
439 src = fetchurl {
440 url = "https://pypi.python.org/packages/ec/9c/368086faa9c016efce5da3e0e13ba392c9db79e3ab740b763fe28620b18b/backports.shutil_get_terminal_size-1.0.0.tar.gz";
441 md5 = "03267762480bd86b50580dc19dff3c66";
442 };
443 meta = {
444 license = [ pkgs.lib.licenses.mit ];
445 };
446 };
434 447 bottle = super.buildPythonPackage {
435 448 name = "bottle-0.12.8";
436 449 buildInputs = with self; [];
@@ -678,17 +691,17 b''
678 691 license = [ pkgs.lib.licenses.asl20 ];
679 692 };
680 693 };
681 flake8 = super.buildPythonPackage {
682 name = "flake8-2.4.1";
694 enum34 = super.buildPythonPackage {
695 name = "enum34-1.1.6";
683 696 buildInputs = with self; [];
684 697 doCheck = false;
685 propagatedBuildInputs = with self; [pyflakes pep8 mccabe];
698 propagatedBuildInputs = with self; [];
686 699 src = fetchurl {
687 url = "https://pypi.python.org/packages/8f/b5/9a73c66c7dba273bac8758398f060c008a25f3e84531063b42503b5d0a95/flake8-2.4.1.tar.gz";
688 md5 = "ed45d3db81a3b7c88bd63c6e37ca1d65";
700 url = "https://pypi.python.org/packages/bf/3e/31d502c25302814a7c2f1d3959d2a3b3f78e509002ba91aea64993936876/enum34-1.1.6.tar.gz";
701 md5 = "5f13a0841a61f7fc295c514490d120d0";
689 702 };
690 703 meta = {
691 license = [ pkgs.lib.licenses.mit ];
704 license = [ pkgs.lib.licenses.bsdOriginal ];
692 705 };
693 706 };
694 707 future = super.buildPythonPackage {
@@ -809,26 +822,39 b''
809 822 };
810 823 };
811 824 ipdb = super.buildPythonPackage {
812 name = "ipdb-0.8";
825 name = "ipdb-0.10.1";
813 826 buildInputs = with self; [];
814 827 doCheck = false;
815 propagatedBuildInputs = with self; [ipython];
828 propagatedBuildInputs = with self; [ipython setuptools];
816 829 src = fetchurl {
817 url = "https://pypi.python.org/packages/f0/25/d7dd430ced6cd8dc242a933c8682b5dbf32eb4011d82f87e34209e5ec845/ipdb-0.8.zip";
818 md5 = "96dca0712efa01aa5eaf6b22071dd3ed";
830 url = "https://pypi.python.org/packages/eb/0a/0a37dc19572580336ad3813792c0d18c8d7117c2d66fc63c501f13a7a8f8/ipdb-0.10.1.tar.gz";
831 md5 = "4aeab65f633ddc98ebdb5eebf08dc713";
819 832 };
820 833 meta = {
821 license = [ pkgs.lib.licenses.gpl1 ];
834 license = [ pkgs.lib.licenses.bsdOriginal ];
822 835 };
823 836 };
824 837 ipython = super.buildPythonPackage {
825 name = "ipython-3.1.0";
838 name = "ipython-5.1.0";
839 buildInputs = with self; [];
840 doCheck = false;
841 propagatedBuildInputs = with self; [setuptools decorator pickleshare simplegeneric traitlets prompt-toolkit Pygments pexpect backports.shutil-get-terminal-size pathlib2 pexpect];
842 src = fetchurl {
843 url = "https://pypi.python.org/packages/89/63/a9292f7cd9d0090a0f995e1167f3f17d5889dcbc9a175261719c513b9848/ipython-5.1.0.tar.gz";
844 md5 = "47c8122420f65b58784cb4b9b4af35e3";
845 };
846 meta = {
847 license = [ pkgs.lib.licenses.bsdOriginal ];
848 };
849 };
850 ipython-genutils = super.buildPythonPackage {
851 name = "ipython-genutils-0.1.0";
826 852 buildInputs = with self; [];
827 853 doCheck = false;
828 854 propagatedBuildInputs = with self; [];
829 855 src = fetchurl {
830 url = "https://pypi.python.org/packages/06/91/120c0835254c120af89f066afaabf81289bc2726c1fc3ca0555df6882f58/ipython-3.1.0.tar.gz";
831 md5 = "a749d90c16068687b0ec45a27e72ef8f";
856 url = "https://pypi.python.org/packages/71/b7/a64c71578521606edbbce15151358598f3dfb72a3431763edc2baf19e71f/ipython_genutils-0.1.0.tar.gz";
857 md5 = "9a8afbe0978adbcbfcb3b35b2d015a56";
832 858 };
833 859 meta = {
834 860 license = [ pkgs.lib.licenses.bsdOriginal ];
@@ -886,19 +912,6 b''
886 912 license = [ pkgs.lib.licenses.bsdOriginal ];
887 913 };
888 914 };
889 mccabe = super.buildPythonPackage {
890 name = "mccabe-0.3";
891 buildInputs = with self; [];
892 doCheck = false;
893 propagatedBuildInputs = with self; [];
894 src = fetchurl {
895 url = "https://pypi.python.org/packages/c9/2e/75231479e11a906b64ac43bad9d0bb534d00080b18bdca8db9da46e1faf7/mccabe-0.3.tar.gz";
896 md5 = "81640948ff226f8c12b3277059489157";
897 };
898 meta = {
899 license = [ { fullName = "Expat license"; } pkgs.lib.licenses.mit ];
900 };
901 };
902 915 meld3 = super.buildPythonPackage {
903 916 name = "meld3-1.0.2";
904 917 buildInputs = with self; [];
@@ -990,17 +1003,17 b''
990 1003 license = [ { fullName = "LGPL"; } { fullName = "GNU Library or Lesser General Public License (LGPL)"; } ];
991 1004 };
992 1005 };
993 pep8 = super.buildPythonPackage {
994 name = "pep8-1.5.7";
1006 pathlib2 = super.buildPythonPackage {
1007 name = "pathlib2-2.1.0";
995 1008 buildInputs = with self; [];
996 1009 doCheck = false;
997 propagatedBuildInputs = with self; [];
1010 propagatedBuildInputs = with self; [six];
998 1011 src = fetchurl {
999 url = "https://pypi.python.org/packages/8b/de/259f5e735897ada1683489dd514b2a1c91aaa74e5e6b68f80acf128a6368/pep8-1.5.7.tar.gz";
1000 md5 = "f6adbdd69365ecca20513c709f9b7c93";
1012 url = "https://pypi.python.org/packages/c9/27/8448b10d8440c08efeff0794adf7d0ed27adb98372c70c7b38f3947d4749/pathlib2-2.1.0.tar.gz";
1013 md5 = "38e4f58b4d69dfcb9edb49a54a8b28d2";
1001 1014 };
1002 1015 meta = {
1003 license = [ { fullName = "Expat license"; } pkgs.lib.licenses.mit ];
1016 license = [ pkgs.lib.licenses.mit ];
1004 1017 };
1005 1018 };
1006 1019 peppercorn = super.buildPythonPackage {
@@ -1016,14 +1029,53 b''
1016 1029 license = [ { fullName = "BSD-derived (http://www.repoze.org/LICENSE.txt)"; } ];
1017 1030 };
1018 1031 };
1032 pexpect = super.buildPythonPackage {
1033 name = "pexpect-4.2.1";
1034 buildInputs = with self; [];
1035 doCheck = false;
1036 propagatedBuildInputs = with self; [ptyprocess];
1037 src = fetchurl {
1038 url = "https://pypi.python.org/packages/e8/13/d0b0599099d6cd23663043a2a0bb7c61e58c6ba359b2656e6fb000ef5b98/pexpect-4.2.1.tar.gz";
1039 md5 = "3694410001a99dff83f0b500a1ca1c95";
1040 };
1041 meta = {
1042 license = [ pkgs.lib.licenses.isc { fullName = "ISC License (ISCL)"; } ];
1043 };
1044 };
1045 pickleshare = super.buildPythonPackage {
1046 name = "pickleshare-0.7.4";
1047 buildInputs = with self; [];
1048 doCheck = false;
1049 propagatedBuildInputs = with self; [pathlib2];
1050 src = fetchurl {
1051 url = "https://pypi.python.org/packages/69/fe/dd137d84daa0fd13a709e448138e310d9ea93070620c9db5454e234af525/pickleshare-0.7.4.tar.gz";
1052 md5 = "6a9e5dd8dfc023031f6b7b3f824cab12";
1053 };
1054 meta = {
1055 license = [ pkgs.lib.licenses.mit ];
1056 };
1057 };
1058 prompt-toolkit = super.buildPythonPackage {
1059 name = "prompt-toolkit-1.0.9";
1060 buildInputs = with self; [];
1061 doCheck = false;
1062 propagatedBuildInputs = with self; [six wcwidth];
1063 src = fetchurl {
1064 url = "https://pypi.python.org/packages/83/14/5ac258da6c530eca02852ee25c7a9ff3ca78287bb4c198d0d0055845d856/prompt_toolkit-1.0.9.tar.gz";
1065 md5 = "a39f91a54308fb7446b1a421c11f227c";
1066 };
1067 meta = {
1068 license = [ pkgs.lib.licenses.bsdOriginal ];
1069 };
1070 };
1019 1071 psutil = super.buildPythonPackage {
1020 name = "psutil-2.2.1";
1072 name = "psutil-4.3.1";
1021 1073 buildInputs = with self; [];
1022 1074 doCheck = false;
1023 1075 propagatedBuildInputs = with self; [];
1024 1076 src = fetchurl {
1025 url = "https://pypi.python.org/packages/df/47/ee54ef14dd40f8ce831a7581001a5096494dc99fe71586260ca6b531fe86/psutil-2.2.1.tar.gz";
1026 md5 = "1a2b58cd9e3a53528bb6148f0c4d5244";
1077 url = "https://pypi.python.org/packages/78/cc/f267a1371f229bf16db6a4e604428c3b032b823b83155bd33cef45e49a53/psutil-4.3.1.tar.gz";
1078 md5 = "199a366dba829c88bddaf5b41d19ddc0";
1027 1079 };
1028 1080 meta = {
1029 1081 license = [ pkgs.lib.licenses.bsdOriginal ];
@@ -1042,6 +1094,19 b''
1042 1094 license = [ pkgs.lib.licenses.zpt21 { fullName = "GNU Library or Lesser General Public License (LGPL)"; } { fullName = "LGPL with exceptions or ZPL"; } ];
1043 1095 };
1044 1096 };
1097 ptyprocess = super.buildPythonPackage {
1098 name = "ptyprocess-0.5.1";
1099 buildInputs = with self; [];
1100 doCheck = false;
1101 propagatedBuildInputs = with self; [];
1102 src = fetchurl {
1103 url = "https://pypi.python.org/packages/db/d7/b465161910f3d1cef593c5e002bff67e0384898f597f1a7fdc8db4c02bf6/ptyprocess-0.5.1.tar.gz";
1104 md5 = "94e537122914cc9ec9c1eadcd36e73a1";
1105 };
1106 meta = {
1107 license = [ ];
1108 };
1109 };
1045 1110 py = super.buildPythonPackage {
1046 1111 name = "py-1.4.29";
1047 1112 buildInputs = with self; [];
@@ -1120,6 +1185,19 b''
1120 1185 license = [ pkgs.lib.licenses.mit ];
1121 1186 };
1122 1187 };
1188 pygments-markdown-lexer = super.buildPythonPackage {
1189 name = "pygments-markdown-lexer-0.1.0.dev39";
1190 buildInputs = with self; [];
1191 doCheck = false;
1192 propagatedBuildInputs = with self; [Pygments];
1193 src = fetchurl {
1194 url = "https://pypi.python.org/packages/c3/12/674cdee66635d638cedb2c5d9c85ce507b7b2f91bdba29e482f1b1160ff6/pygments-markdown-lexer-0.1.0.dev39.zip";
1195 md5 = "6360fe0f6d1f896e35b7a0142ce6459c";
1196 };
1197 meta = {
1198 license = [ pkgs.lib.licenses.asl20 ];
1199 };
1200 };
1123 1201 pyparsing = super.buildPythonPackage {
1124 1202 name = "pyparsing-1.5.7";
1125 1203 buildInputs = with self; [];
@@ -1420,10 +1498,10 b''
1420 1498 };
1421 1499 };
1422 1500 rhodecode-enterprise-ce = super.buildPythonPackage {
1423 name = "rhodecode-enterprise-ce-4.4.2";
1424 buildInputs = with self; [WebTest configobj cssselect flake8 lxml mock pytest pytest-cov pytest-runner];
1501 name = "rhodecode-enterprise-ce-4.5.0";
1502 buildInputs = with self; [WebTest configobj cssselect lxml mock pytest pytest-cov pytest-runner pytest-sugar];
1425 1503 doCheck = true;
1426 propagatedBuildInputs = with self; [Babel Beaker FormEncode Mako Markdown MarkupSafe MySQL-python Paste PasteDeploy PasteScript Pygments Pylons Pyro4 Routes SQLAlchemy Tempita URLObject WebError WebHelpers WebHelpers2 WebOb WebTest Whoosh alembic amqplib anyjson appenlight-client authomatic backport-ipaddress celery channelstream colander decorator deform docutils gevent gunicorn infrae.cache ipython iso8601 kombu msgpack-python packaging psycopg2 py-gfm pycrypto pycurl pyparsing pyramid pyramid-debugtoolbar pyramid-mako pyramid-beaker pysqlite python-dateutil python-ldap python-memcached python-pam recaptcha-client repoze.lru requests simplejson waitress zope.cachedescriptors dogpile.cache dogpile.core psutil py-bcrypt];
1504 propagatedBuildInputs = with self; [Babel Beaker FormEncode Mako Markdown MarkupSafe MySQL-python Paste PasteDeploy PasteScript Pygments pygments-markdown-lexer Pylons Pyro4 Routes SQLAlchemy Tempita URLObject WebError WebHelpers WebHelpers2 WebOb WebTest Whoosh alembic amqplib anyjson appenlight-client authomatic backport-ipaddress celery channelstream colander decorator deform docutils gevent gunicorn infrae.cache ipython iso8601 kombu msgpack-python packaging psycopg2 py-gfm pycrypto pycurl pyparsing pyramid pyramid-debugtoolbar pyramid-mako pyramid-beaker pysqlite python-dateutil python-ldap python-memcached python-pam recaptcha-client repoze.lru requests simplejson subprocess32 waitress zope.cachedescriptors dogpile.cache dogpile.core psutil py-bcrypt];
1427 1505 src = ./.;
1428 1506 meta = {
1429 1507 license = [ { fullName = "AGPLv3, and Commercial License"; } ];
@@ -1494,6 +1572,19 b''
1494 1572 license = [ pkgs.lib.licenses.mit ];
1495 1573 };
1496 1574 };
1575 simplegeneric = super.buildPythonPackage {
1576 name = "simplegeneric-0.8.1";
1577 buildInputs = with self; [];
1578 doCheck = false;
1579 propagatedBuildInputs = with self; [];
1580 src = fetchurl {
1581 url = "https://pypi.python.org/packages/3d/57/4d9c9e3ae9a255cd4e1106bb57e24056d3d0709fc01b2e3e345898e49d5b/simplegeneric-0.8.1.zip";
1582 md5 = "f9c1fab00fd981be588fc32759f474e3";
1583 };
1584 meta = {
1585 license = [ pkgs.lib.licenses.zpt21 ];
1586 };
1587 };
1497 1588 simplejson = super.buildPythonPackage {
1498 1589 name = "simplejson-3.7.2";
1499 1590 buildInputs = with self; [];
@@ -1546,6 +1637,19 b''
1546 1637 license = [ { fullName = "BSD-derived (http://www.repoze.org/LICENSE.txt)"; } ];
1547 1638 };
1548 1639 };
1640 traitlets = super.buildPythonPackage {
1641 name = "traitlets-4.3.1";
1642 buildInputs = with self; [];
1643 doCheck = false;
1644 propagatedBuildInputs = with self; [ipython-genutils six decorator enum34];
1645 src = fetchurl {
1646 url = "https://pypi.python.org/packages/b1/d6/5b5aa6d5c474691909b91493da1e8972e309c9f01ecfe4aeafd272eb3234/traitlets-4.3.1.tar.gz";
1647 md5 = "dd0b1b6e5d31ce446d55a4b5e5083c98";
1648 };
1649 meta = {
1650 license = [ pkgs.lib.licenses.bsdOriginal ];
1651 };
1652 };
1549 1653 transifex-client = super.buildPythonPackage {
1550 1654 name = "transifex-client-0.10";
1551 1655 buildInputs = with self; [];
@@ -1637,6 +1741,19 b''
1637 1741 license = [ pkgs.lib.licenses.zpt21 ];
1638 1742 };
1639 1743 };
1744 wcwidth = super.buildPythonPackage {
1745 name = "wcwidth-0.1.7";
1746 buildInputs = with self; [];
1747 doCheck = false;
1748 propagatedBuildInputs = with self; [];
1749 src = fetchurl {
1750 url = "https://pypi.python.org/packages/55/11/e4a2bb08bb450fdbd42cc709dd40de4ed2c472cf0ccb9e64af22279c5495/wcwidth-0.1.7.tar.gz";
1751 md5 = "b3b6a0a08f0c8a34d1de8cf44150a4ad";
1752 };
1753 meta = {
1754 license = [ pkgs.lib.licenses.mit ];
1755 };
1756 };
1640 1757 ws4py = super.buildPythonPackage {
1641 1758 name = "ws4py-0.3.5";
1642 1759 buildInputs = with self; [];
@@ -1718,5 +1835,30 b''
1718 1835
1719 1836 ### Test requirements
1720 1837
1721
1838 pytest-sugar = super.buildPythonPackage {
1839 name = "pytest-sugar-0.7.1";
1840 buildInputs = with self; [];
1841 doCheck = false;
1842 propagatedBuildInputs = with self; [pytest termcolor];
1843 src = fetchurl {
1844 url = "https://pypi.python.org/packages/03/97/05d988b4fa870e7373e8ee4582408543b9ca2bd35c3c67b569369c6f9c49/pytest-sugar-0.7.1.tar.gz";
1845 md5 = "7400f7c11f3d572b2c2a3b60352d35fe";
1846 };
1847 meta = {
1848 license = [ pkgs.lib.licenses.bsdOriginal ];
1849 };
1850 };
1851 termcolor = super.buildPythonPackage {
1852 name = "termcolor-1.1.0";
1853 buildInputs = with self; [];
1854 doCheck = false;
1855 propagatedBuildInputs = with self; [];
1856 src = fetchurl {
1857 url = "https://pypi.python.org/packages/8a/48/a76be51647d0eb9f10e2a4511bf3ffb8cc1e6b14e9e4fab46173aa79f981/termcolor-1.1.0.tar.gz";
1858 md5 = "043e89644f8909d462fbbfa511c768df";
1859 };
1860 meta = {
1861 license = [ pkgs.lib.licenses.mit ];
1862 };
1863 };
1722 1864 }
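Each `fetchurl` source in the Nix expression above is pinned by an md5 checksum. A minimal Python sketch of how such a checksum pin can be verified — the function name and payload bytes are illustrative, not part of the actual build machinery:

```python
import hashlib


def matches_pin(data, expected_md5):
    """Return True when the archive bytes match the pinned md5 digest.

    Mirrors, for illustration only, the `md5 = "..."` pins used by the
    fetchurl calls in the Nix expression above.
    """
    return hashlib.md5(data).hexdigest() == expected_md5


# illustrative bytes standing in for a fetched tarball
payload = b"example-archive-contents"
pin = hashlib.md5(payload).hexdigest()
```

A mismatching digest (for example after tampering with the bytes) makes `matches_pin` return False, which is the failure mode Nix reports as a hash mismatch.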
@@ -1,9 +1,9 b''
1 1 [pytest]
2 2 testpaths = ./rhodecode
3 pylons_config = test.ini
4 vcsserver_protocol = pyro4
5 vcsserver_config = rhodecode/tests/vcsserver.ini
6 vcsserver_config_http = rhodecode/tests/vcsserver_pyramid.ini
3 pylons_config = rhodecode/tests/rhodecode.ini
4 vcsserver_protocol = http
5 vcsserver_config_pyro4 = rhodecode/tests/vcsserver_pyro4.ini
6 vcsserver_config_http = rhodecode/tests/vcsserver_http.ini
7 7 norecursedirs = tests/scripts
8 8 addopts = -k "not _BaseTest"
9 9 markers =
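The pytest.ini hunk above switches the default vcsserver protocol from pyro4 to http and renames the per-protocol config keys. A sketch of reading the new options with a standard INI parser, with the file content inlined for illustration:

```python
import configparser

# inlined copy of the relevant pytest.ini options from the hunk above
PYTEST_INI = """\
[pytest]
pylons_config = rhodecode/tests/rhodecode.ini
vcsserver_protocol = http
vcsserver_config_pyro4 = rhodecode/tests/vcsserver_pyro4.ini
vcsserver_config_http = rhodecode/tests/vcsserver_http.ini
"""

parser = configparser.ConfigParser()
parser.read_string(PYTEST_INI)

protocol = parser.get("pytest", "vcsserver_protocol")
# pick the backend config file matching the selected protocol
config_key = (
    "vcsserver_config_http" if protocol == "http" else "vcsserver_config_pyro4"
)
backend_ini = parser.get("pytest", config_key)
```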
@@ -1,5 +1,6 b''
1 1 Babel==1.3
2 2 Beaker==1.7.0
3 Chameleon==2.24
3 4 CProfileV==1.0.6
4 5 FormEncode==1.2.4
5 6 Jinja2==2.7.3
@@ -11,6 +12,7 b' Paste==2.0.2'
11 12 PasteDeploy==1.5.2
12 13 PasteScript==1.7.5
13 14 Pygments==2.1.3
15 pygments-markdown-lexer==0.1.0.dev39
14 16
15 17 # TODO: This version is not available on PyPI
16 18 # Pylons==1.0.2.dev20160108
@@ -62,7 +64,6 b' dogpile.cache==0.6.1'
62 64 dogpile.core==0.4.1
63 65 dulwich==0.12.0
64 66 ecdsa==0.11
65 flake8==2.4.1
66 67 future==0.14.3
67 68 futures==3.0.2
68 69 gevent==1.1.1
@@ -77,13 +78,12 b' gunicorn==19.6.0'
77 78 gnureadline==6.3.3
78 79 infrae.cache==1.0.1
79 80 invoke==0.13.0
80 ipdb==0.8
81 ipython==3.1.0
81 ipdb==0.10.1
82 ipython==5.1.0
82 83 iso8601==0.1.11
83 84 itsdangerous==0.24
84 85 kombu==1.5.1
85 86 lxml==3.4.4
86 mccabe==0.3
87 87 meld3==1.0.2
88 88 mock==1.0.1
89 89 msgpack-python==0.4.6
@@ -91,8 +91,7 b' nose==1.3.6'
91 91 objgraph==2.0.0
92 92 packaging==15.2
93 93 paramiko==1.15.1
94 pep8==1.5.7
95 psutil==2.2.1
94 psutil==4.3.1
96 95 psycopg2==2.6.1
97 96 py==1.4.29
98 97 py-bcrypt==0.4
@@ -141,6 +140,7 b' transifex-client==0.10'
141 140 translationstring==1.3
142 141 trollius==1.0.4
143 142 uWSGI==2.0.11.2
143 urllib3==1.16
144 144 venusian==1.0
145 145 waitress==0.8.9
146 146 wsgiref==0.1.2
@@ -1,1 +1,1 b''
1 4.4.2 No newline at end of file
1 4.5.0 No newline at end of file
@@ -51,7 +51,7 b' PYRAMID_SETTINGS = {}'
51 51 EXTENSIONS = {}
52 52
53 53 __version__ = ('.'.join((str(each) for each in VERSION[:3])))
54 __dbversion__ = 58 # defines current db version for migrations
54 __dbversion__ = 63 # defines current db version for migrations
55 55 __platform__ = platform.system()
56 56 __license__ = 'AGPLv3, and Commercial License'
57 57 __author__ = 'RhodeCode GmbH'
@@ -35,6 +35,9 b' def includeme(config):'
35 35 config.add_route(
36 36 name='admin_settings_open_source',
37 37 pattern=ADMIN_PREFIX + '/settings/open_source')
38 config.add_route(
39 name='admin_settings_vcs_svn_generate_cfg',
40 pattern=ADMIN_PREFIX + '/settings/vcs/svn_generate_cfg')
38 41
39 42 # Scan module for configuration decorators.
40 43 config.scan()
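The new route above hangs the svn config-generation endpoint off ADMIN_PREFIX. A dependency-free sketch of the pattern construction, assuming ADMIN_PREFIX is `'/_admin'` and using a plain dict in place of pyramid's route registry:

```python
# ADMIN_PREFIX value is an assumption for illustration
ADMIN_PREFIX = '/_admin'

routes = {}


def add_route(name, pattern):
    # minimal stand-in for pyramid's config.add_route
    routes[name] = pattern


add_route('admin_settings_open_source',
          ADMIN_PREFIX + '/settings/open_source')
add_route('admin_settings_vcs_svn_generate_cfg',
          ADMIN_PREFIX + '/settings/vcs/svn_generate_cfg')
```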
@@ -24,8 +24,11 b' import logging'
24 24 from pylons import tmpl_context as c
25 25 from pyramid.view import view_config
26 26
27 from rhodecode.lib.auth import LoginRequired, HasPermissionAllDecorator
27 from rhodecode.lib.auth import (
28 LoginRequired, HasPermissionAllDecorator, CSRFRequired)
28 29 from rhodecode.lib.utils import read_opensource_licenses
30 from rhodecode.svn_support.utils import generate_mod_dav_svn_config
31 from rhodecode.translation import _
29 32
30 33 from .navigation import navigation_list
31 34
@@ -53,3 +56,27 b' class AdminSettingsView(object):'
53 56 sorted(read_opensource_licenses().items(), key=lambda t: t[0]))
54 57
55 58 return {}
59
60 @LoginRequired()
61 @CSRFRequired()
62 @HasPermissionAllDecorator('hg.admin')
63 @view_config(
64 route_name='admin_settings_vcs_svn_generate_cfg',
65 request_method='POST', renderer='json')
66 def vcs_svn_generate_config(self):
67 try:
68 generate_mod_dav_svn_config(self.request.registry)
69 msg = {
70 'message': _('Apache configuration for Subversion generated.'),
71 'level': 'success',
72 }
73 except Exception:
74 log.exception(
75 'Exception while generating the Apache configuration for Subversion.')
76 msg = {
77 'message': _('Failed to generate the Apache configuration for Subversion.'),
78 'level': 'error',
79 }
80
81 data = {'message': msg}
82 return data
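The `vcs_svn_generate_config` view above converts any exception raised by the config generator into an error-level JSON message instead of propagating it. A framework-free sketch of that pattern — the helper names here are stand-ins for `generate_mod_dav_svn_config` and the translation function:

```python
def generate_svn_config_response(generate, translate=lambda s: s):
    """Run `generate` and return the JSON payload shape used by the view."""
    try:
        generate()
        msg = {
            'message': translate(
                'Apache configuration for Subversion generated.'),
            'level': 'success',
        }
    except Exception:
        # the real view also logs via log.exception() here
        msg = {
            'message': translate(
                'Failed to generate the Apache configuration for Subversion.'),
            'level': 'error',
        }
    return {'message': msg}


def boom():
    raise RuntimeError('simulated generator failure')


ok = generate_svn_config_response(lambda: None)
failed = generate_svn_config_response(boom)
```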
@@ -23,8 +23,11 b' import json'
23 23 import mock
24 24 import pytest
25 25
26 from rhodecode.lib.utils2 import safe_unicode
26 27 from rhodecode.lib.vcs import settings
28 from rhodecode.model.meta import Session
27 29 from rhodecode.model.repo import RepoModel
30 from rhodecode.model.user import UserModel
28 31 from rhodecode.tests import TEST_USER_ADMIN_LOGIN
29 32 from rhodecode.api.tests.utils import (
30 33 build_data, api_call, assert_ok, assert_error, crash)
@@ -36,29 +39,37 b' fixture = Fixture()'
36 39
37 40 @pytest.mark.usefixtures("testuser_api", "app")
38 41 class TestCreateRepo(object):
39 def test_api_create_repo(self, backend):
40 repo_name = 'api-repo-1'
42
43 @pytest.mark.parametrize('given, expected_name, expected_exc', [
44 ('api repo-1', 'api-repo-1', False),
45 ('api-repo 1-ąć', 'api-repo-1-ąć', False),
46 (u'unicode-ąć', u'unicode-ąć', False),
47 ('some repo v1.2', 'some-repo-v1.2', False),
48 ('v2.0', 'v2.0', False),
49 ])
50 def test_api_create_repo(self, backend, given, expected_name, expected_exc):
51
41 52 id_, params = build_data(
42 53 self.apikey,
43 54 'create_repo',
44 repo_name=repo_name,
55 repo_name=given,
45 56 owner=TEST_USER_ADMIN_LOGIN,
46 57 repo_type=backend.alias,
47 58 )
48 59 response = api_call(self.app, params)
49 60
50 repo = RepoModel().get_by_repo_name(repo_name)
51
52 assert repo is not None
53 61 ret = {
54 'msg': 'Created new repository `%s`' % (repo_name,),
62 'msg': 'Created new repository `%s`' % (expected_name,),
55 63 'success': True,
56 64 'task': None,
57 65 }
58 66 expected = ret
59 67 assert_ok(id_, expected, given=response.body)
60 68
61 id_, params = build_data(self.apikey, 'get_repo', repoid=repo_name)
69 repo = RepoModel().get_by_repo_name(safe_unicode(expected_name))
70 assert repo is not None
71
72 id_, params = build_data(self.apikey, 'get_repo', repoid=expected_name)
62 73 response = api_call(self.app, params)
63 74 body = json.loads(response.body)
64 75
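The tests above build JSON-RPC-style payloads with `build_data` and post them via `api_call`. A hedged sketch of what such a payload could look like — the exact field names used by `build_data` are an assumption here, not taken from this diff:

```python
import json


def build_payload(request_id, api_key, method, **kwargs):
    # illustrative stand-in for the tests' build_data helper;
    # field names are assumed, not the confirmed wire format
    return json.dumps({
        'id': request_id,
        'api_key': api_key,
        'method': method,
        'args': kwargs,
    })


payload = build_payload(1, 'secret', 'create_repo',
                        repo_name='api repo-1', repo_type='git')
decoded = json.loads(payload)
```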
@@ -66,7 +77,7 b' class TestCreateRepo(object):'
66 77 assert body['result']['enable_locking'] is False
67 78 assert body['result']['enable_statistics'] is False
68 79
69 fixture.destroy_repo(repo_name)
80 fixture.destroy_repo(safe_unicode(expected_name))
70 81
71 82 def test_api_create_restricted_repo_type(self, backend):
72 83 repo_name = 'api-repo-type-{0}'.format(backend.alias)
@@ -158,6 +169,21 b' class TestCreateRepo(object):'
158 169 fixture.destroy_repo(repo_name)
159 170 fixture.destroy_repo_group(repo_group_name)
160 171
172 def test_create_repo_in_group_that_doesnt_exist(self, backend, user_util):
173 repo_group_name = 'fake_group'
174
175 repo_name = '%s/api-repo-gr' % (repo_group_name,)
176 id_, params = build_data(
177 self.apikey, 'create_repo',
178 repo_name=repo_name,
179 owner=TEST_USER_ADMIN_LOGIN,
180 repo_type=backend.alias,)
181 response = api_call(self.app, params)
182
183 expected = {'repo_group': 'Repository group `{}` does not exist'.format(
184 repo_group_name)}
185 assert_error(id_, expected, given=response.body)
186
161 187 def test_api_create_repo_unknown_owner(self, backend):
162 188 repo_name = 'api-repo-2'
163 189 owner = 'i-dont-exist'
@@ -218,10 +244,48 b' class TestCreateRepo(object):'
218 244 owner=owner)
219 245 response = api_call(self.app, params)
220 246
221 expected = 'Only RhodeCode admin can specify `owner` param'
247 expected = 'Only RhodeCode super-admin can specify `owner` param'
222 248 assert_error(id_, expected, given=response.body)
223 249 fixture.destroy_repo(repo_name)
224 250
251 def test_api_create_repo_by_non_admin_no_parent_group_perms(self, backend):
252 repo_group_name = 'no-access'
253 fixture.create_repo_group(repo_group_name)
254 repo_name = 'no-access/api-repo'
255
256 id_, params = build_data(
257 self.apikey_regular, 'create_repo',
258 repo_name=repo_name,
259 repo_type=backend.alias)
260 response = api_call(self.app, params)
261
262 expected = {'repo_group': 'Repository group `{}` does not exist'.format(
263 repo_group_name)}
264 assert_error(id_, expected, given=response.body)
265 fixture.destroy_repo_group(repo_group_name)
266 fixture.destroy_repo(repo_name)
267
268 def test_api_create_repo_non_admin_no_permission_to_create_to_root_level(
269 self, backend, user_util):
270
271 regular_user = user_util.create_user()
272 regular_user_api_key = regular_user.api_key
273
274 usr = UserModel().get_by_username(regular_user.username)
275 usr.inherit_default_permissions = False
276 Session().add(usr)
277
278 repo_name = backend.new_repo_name()
279 id_, params = build_data(
280 regular_user_api_key, 'create_repo',
281 repo_name=repo_name,
282 repo_type=backend.alias)
283 response = api_call(self.app, params)
284 expected = {
285 "repo_name": "You do not have the permission to "
286 "store repositories in the root location."}
287 assert_error(id_, expected, given=response.body)
288
225 289 def test_api_create_repo_exists(self, backend):
226 290 repo_name = backend.repo_name
227 291 id_, params = build_data(
@@ -230,7 +294,9 b' class TestCreateRepo(object):'
230 294 owner=TEST_USER_ADMIN_LOGIN,
231 295 repo_type=backend.alias,)
232 296 response = api_call(self.app, params)
233 expected = "repo `%s` already exist" % (repo_name,)
297 expected = {
298 'unique_repo_name': 'Repository with name `{}` already exists'.format(
299 repo_name)}
234 300 assert_error(id_, expected, given=response.body)
235 301
236 302 @mock.patch.object(RepoModel, 'create', crash)
@@ -245,26 +311,40 b' class TestCreateRepo(object):'
245 311 expected = 'failed to create repository `%s`' % (repo_name,)
246 312 assert_error(id_, expected, given=response.body)
247 313
248 def test_create_repo_with_extra_slashes_in_name(self, backend, user_util):
249 existing_repo_group = user_util.create_repo_group()
250 dirty_repo_name = '//{}/repo_name//'.format(
251 existing_repo_group.group_name)
252 cleaned_repo_name = '{}/repo_name'.format(
253 existing_repo_group.group_name)
314 @pytest.mark.parametrize('parent_group, dirty_name, expected_name', [
315 (None, 'foo bar x', 'foo-bar-x'),
316 ('foo', '/foo//bar x', 'foo/bar-x'),
317 ('foo-bar', 'foo-bar //bar x', 'foo-bar/bar-x'),
318 ])
319 def test_create_repo_with_extra_slashes_in_name(
320 self, backend, parent_group, dirty_name, expected_name):
321
322 if parent_group:
323 gr = fixture.create_repo_group(parent_group)
324 assert gr.group_name == parent_group
254 325
255 326 id_, params = build_data(
256 327 self.apikey, 'create_repo',
257 repo_name=dirty_repo_name,
328 repo_name=dirty_name,
258 329 repo_type=backend.alias,
259 330 owner=TEST_USER_ADMIN_LOGIN,)
260 331 response = api_call(self.app, params)
261 repo = RepoModel().get_by_repo_name(cleaned_repo_name)
332 expected = {
333 "msg": "Created new repository `{}`".format(expected_name),
334 "task": None,
335 "success": True
336 }
337 assert_ok(id_, expected, response.body)
338
339 repo = RepoModel().get_by_repo_name(expected_name)
262 340 assert repo is not None
263 341
264 342 expected = {
265 'msg': 'Created new repository `%s`' % (cleaned_repo_name,),
343 'msg': 'Created new repository `%s`' % (expected_name,),
266 344 'success': True,
267 345 'task': None,
268 346 }
269 347 assert_ok(id_, expected, given=response.body)
270 fixture.destroy_repo(cleaned_repo_name)
348 fixture.destroy_repo(expected_name)
349 if parent_group:
350 fixture.destroy_repo_group(parent_group)
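The parametrized cases above imply a repo-name sanitizer that drops empty path segments and turns inner whitespace into dashes. A sketch that reproduces those test expectations — the real implementation is RhodeCode's own slug helper, which is not shown in this diff:

```python
import re


def clean_repo_name(name):
    """Illustrative repo-name cleaner matching the parametrized cases above.

    Splits on '/', drops empty path segments, and replaces whitespace runs
    inside each remaining segment with single dashes.
    """
    segments = []
    for segment in name.split('/'):
        segment = re.sub(r'\s+', '-', segment.strip())
        if segment:
            segments.append(segment)
    return '/'.join(segments)
```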
@@ -54,55 +54,10 b' class TestCreateRepoGroup(object):'
54 54 'repo_group': repo_group.get_api_data()
55 55 }
56 56 expected = ret
57 assert_ok(id_, expected, given=response.body)
58 fixture.destroy_repo_group(repo_group_name)
59
60 def test_api_create_repo_group_regular_user(self):
61 repo_group_name = 'api-repo-group'
62
63 usr = UserModel().get_by_username(self.TEST_USER_LOGIN)
64 usr.inherit_default_permissions = False
65 Session().add(usr)
66 UserModel().grant_perm(
67 self.TEST_USER_LOGIN, 'hg.repogroup.create.true')
68 Session().commit()
69
70 repo_group = RepoGroupModel.cls.get_by_group_name(repo_group_name)
71 assert repo_group is None
72
73 id_, params = build_data(
74 self.apikey_regular, 'create_repo_group',
75 group_name=repo_group_name,
76 owner=TEST_USER_ADMIN_LOGIN,)
77 response = api_call(self.app, params)
78
79 repo_group = RepoGroupModel.cls.get_by_group_name(repo_group_name)
80 assert repo_group is not None
81 ret = {
82 'msg': 'Created new repo group `%s`' % (repo_group_name,),
83 'repo_group': repo_group.get_api_data()
84 }
85 expected = ret
86 assert_ok(id_, expected, given=response.body)
87 fixture.destroy_repo_group(repo_group_name)
88 UserModel().revoke_perm(
89 self.TEST_USER_LOGIN, 'hg.repogroup.create.true')
90 usr = UserModel().get_by_username(self.TEST_USER_LOGIN)
91 usr.inherit_default_permissions = True
92 Session().add(usr)
93 Session().commit()
94
95 def test_api_create_repo_group_regular_user_no_permission(self):
96 repo_group_name = 'api-repo-group'
97
98 id_, params = build_data(
99 self.apikey_regular, 'create_repo_group',
100 group_name=repo_group_name,
101 owner=TEST_USER_ADMIN_LOGIN,)
102 response = api_call(self.app, params)
103
104 expected = "Access was denied to this resource."
105 assert_error(id_, expected, given=response.body)
57 try:
58 assert_ok(id_, expected, given=response.body)
59 finally:
60 fixture.destroy_repo_group(repo_group_name)
106 61
107 62 def test_api_create_repo_group_in_another_group(self):
108 63 repo_group_name = 'api-repo-group'
@@ -127,9 +82,11 b' class TestCreateRepoGroup(object):'
127 82 'repo_group': repo_group.get_api_data()
128 83 }
129 84 expected = ret
130 assert_ok(id_, expected, given=response.body)
131 fixture.destroy_repo_group(full_repo_group_name)
132 fixture.destroy_repo_group(repo_group_name)
85 try:
86 assert_ok(id_, expected, given=response.body)
87 finally:
88 fixture.destroy_repo_group(full_repo_group_name)
89 fixture.destroy_repo_group(repo_group_name)
133 90
134 91 def test_api_create_repo_group_in_another_group_not_existing(self):
135 92 repo_group_name = 'api-repo-group-no'
@@ -144,7 +101,10 b' class TestCreateRepoGroup(object):'
144 101 owner=TEST_USER_ADMIN_LOGIN,
145 102 copy_permissions=True)
146 103 response = api_call(self.app, params)
147 expected = 'repository group `%s` does not exist' % (repo_group_name,)
104 expected = {
105 'repo_group':
106 'Parent repository group `{}` does not exist'.format(
107 repo_group_name)}
148 108 assert_error(id_, expected, given=response.body)
149 109
150 110 def test_api_create_repo_group_that_exists(self):
@@ -159,9 +119,139 b' class TestCreateRepoGroup(object):'
159 119 group_name=repo_group_name,
160 120 owner=TEST_USER_ADMIN_LOGIN,)
161 121 response = api_call(self.app, params)
162 expected = 'repo group `%s` already exist' % (repo_group_name,)
122 expected = {
123 'unique_repo_group_name':
124 'Repository group with name `{}` already exists'.format(
125 repo_group_name)}
126 try:
127 assert_error(id_, expected, given=response.body)
128 finally:
129 fixture.destroy_repo_group(repo_group_name)
130
131 def test_api_create_repo_group_regular_user_wit_root_location_perms(
132 self, user_util):
133 regular_user = user_util.create_user()
134 regular_user_api_key = regular_user.api_key
135
136 repo_group_name = 'api-repo-group-by-regular-user'
137
138 usr = UserModel().get_by_username(regular_user.username)
139 usr.inherit_default_permissions = False
140 Session().add(usr)
141
142 UserModel().grant_perm(
143 regular_user.username, 'hg.repogroup.create.true')
144 Session().commit()
145
146 repo_group = RepoGroupModel.cls.get_by_group_name(repo_group_name)
147 assert repo_group is None
148
149 id_, params = build_data(
150 regular_user_api_key, 'create_repo_group',
151 group_name=repo_group_name)
152 response = api_call(self.app, params)
153
154 repo_group = RepoGroupModel.cls.get_by_group_name(repo_group_name)
155 assert repo_group is not None
156 expected = {
157 'msg': 'Created new repo group `%s`' % (repo_group_name,),
158 'repo_group': repo_group.get_api_data()
159 }
160 try:
161 assert_ok(id_, expected, given=response.body)
162 finally:
163 fixture.destroy_repo_group(repo_group_name)
164
165 def test_api_create_repo_group_regular_user_with_admin_perms_to_parent(
166 self, user_util):
167
168 repo_group_name = 'api-repo-group-parent'
169
170 repo_group = RepoGroupModel.cls.get_by_group_name(repo_group_name)
171 assert repo_group is None
172 # create the parent
173 fixture.create_repo_group(repo_group_name)
174
175 # user perms
176 regular_user = user_util.create_user()
177 regular_user_api_key = regular_user.api_key
178
179 usr = UserModel().get_by_username(regular_user.username)
180 usr.inherit_default_permissions = False
181 Session().add(usr)
182
183 RepoGroupModel().grant_user_permission(
184 repo_group_name, regular_user.username, 'group.admin')
185 Session().commit()
186
187 full_repo_group_name = repo_group_name + '/' + repo_group_name
188 id_, params = build_data(
189 regular_user_api_key, 'create_repo_group',
190 group_name=full_repo_group_name)
191 response = api_call(self.app, params)
192
193 repo_group = RepoGroupModel.cls.get_by_group_name(full_repo_group_name)
194 assert repo_group is not None
195 expected = {
196 'msg': 'Created new repo group `{}`'.format(full_repo_group_name),
197 'repo_group': repo_group.get_api_data()
198 }
199 try:
200 assert_ok(id_, expected, given=response.body)
201 finally:
202 fixture.destroy_repo_group(full_repo_group_name)
203 fixture.destroy_repo_group(repo_group_name)
204
205 def test_api_create_repo_group_regular_user_no_permission_to_create_to_root_level(self):
206 repo_group_name = 'api-repo-group'
207
208 id_, params = build_data(
209 self.apikey_regular, 'create_repo_group',
210 group_name=repo_group_name)
211 response = api_call(self.app, params)
212
213 expected = {
214 'repo_group':
215 u'You do not have the permission to store '
216 u'repository groups in the root location.'}
163 217 assert_error(id_, expected, given=response.body)
164 fixture.destroy_repo_group(repo_group_name)
218
219 def test_api_create_repo_group_regular_user_no_parent_group_perms(self):
220 repo_group_name = 'api-repo-group-regular-user'
221
222 repo_group = RepoGroupModel.cls.get_by_group_name(repo_group_name)
223 assert repo_group is None
224 # create the parent
225 fixture.create_repo_group(repo_group_name)
226
227 full_repo_group_name = repo_group_name+'/'+repo_group_name
228
229 id_, params = build_data(
230 self.apikey_regular, 'create_repo_group',
231 group_name=full_repo_group_name)
232 response = api_call(self.app, params)
233
234 expected = {
235 'repo_group':
236 'Parent repository group `{}` does not exist'.format(
237 repo_group_name)}
238 try:
239 assert_error(id_, expected, given=response.body)
240 finally:
241 fixture.destroy_repo_group(repo_group_name)
242
243 def test_api_create_repo_group_regular_user_no_permission_to_specify_owner(
244 self):
245 repo_group_name = 'api-repo-group'
246
247 id_, params = build_data(
248 self.apikey_regular, 'create_repo_group',
249 group_name=repo_group_name,
250 owner=TEST_USER_ADMIN_LOGIN,)
251 response = api_call(self.app, params)
252
253 expected = "Only RhodeCode super-admin can specify `owner` param"
254 assert_error(id_, expected, given=response.body)
165 255
166 256 @mock.patch.object(RepoGroupModel, 'create', crash)
167 257 def test_api_create_repo_group_exception_occurred(self):
@@ -28,6 +28,7 b' from rhodecode.tests import ('
28 28 from rhodecode.api.tests.utils import (
29 29 build_data, api_call, assert_ok, assert_error, jsonify, crash)
30 30 from rhodecode.tests.fixture import Fixture
31 from rhodecode.model.db import RepoGroup
31 32
32 33
33 34 # TODO: mikhail: remove fixture from here
@@ -145,6 +146,36 b' class TestCreateUser(object):'
145 146 finally:
146 147 fixture.destroy_user(usr.user_id)
147 148
149 def test_api_create_user_with_personal_repo_group(self):
150 username = 'test_new_api_user_personal_group'
151 email = username + "@foo.com"
152
153 id_, params = build_data(
154 self.apikey, 'create_user',
155 username=username,
156 email=email, extern_name='rhodecode',
157 create_personal_repo_group=True)
158 response = api_call(self.app, params)
159
160 usr = UserModel().get_by_username(username)
161 ret = {
162 'msg': 'created new user `%s`' % (username,),
163 'user': jsonify(usr.get_api_data(include_secrets=True)),
164 }
165
166 personal_group = RepoGroup.get_by_group_name(username)
167 assert personal_group
168 assert personal_group.personal == True
169 assert personal_group.user.username == username
170
171 try:
172 expected = ret
173 assert_ok(id_, expected, given=response.body)
174 finally:
175 fixture.destroy_repo_group(username)
176 fixture.destroy_user(usr.user_id)
177
178
148 179 @mock.patch.object(UserModel, 'create_or_update', crash)
149 180 def test_api_create_user_when_exception_happened(self):
150 181
@@ -24,6 +24,7 b' import pytest'
24 24
25 25 from rhodecode.model.meta import Session
26 26 from rhodecode.model.repo import RepoModel
27 from rhodecode.model.repo_group import RepoGroupModel
27 28 from rhodecode.model.user import UserModel
28 29 from rhodecode.tests import TEST_USER_ADMIN_LOGIN
29 30 from rhodecode.api.tests.utils import (
@@ -99,11 +100,35 b' class TestApiForkRepo(object):'
99 100 finally:
100 101 fixture.destroy_repo(fork_name)
101 102
103 def test_api_fork_repo_non_admin_into_group_no_permission(self, backend, user_util):
104 source_name = backend['minimal'].repo_name
105 repo_group = user_util.create_repo_group()
106 repo_group_name = repo_group.group_name
107 fork_name = '%s/api-repo-fork' % repo_group_name
108
109 id_, params = build_data(
110 self.apikey_regular, 'fork_repo',
111 repoid=source_name,
112 fork_name=fork_name)
113 response = api_call(self.app, params)
114
115 expected = {
116 'repo_group': 'Repository group `{}` does not exist'.format(
117 repo_group_name)}
118 try:
119 assert_error(id_, expected, given=response.body)
120 finally:
121 fixture.destroy_repo(fork_name)
122
102 123 def test_api_fork_repo_non_admin_into_group(self, backend, user_util):
103 124 source_name = backend['minimal'].repo_name
104 125 repo_group = user_util.create_repo_group()
105 126 fork_name = '%s/api-repo-fork' % repo_group.group_name
106 127
128 RepoGroupModel().grant_user_permission(
129 repo_group, self.TEST_USER_LOGIN, 'group.admin')
130 Session().commit()
131
107 132 id_, params = build_data(
108 133 self.apikey_regular, 'fork_repo',
109 134 repoid=source_name,
@@ -129,10 +154,11 b' class TestApiForkRepo(object):'
129 154 fork_name=fork_name,
130 155 owner=TEST_USER_ADMIN_LOGIN)
131 156 response = api_call(self.app, params)
132 expected = 'Only RhodeCode admin can specify `owner` param'
157 expected = 'Only RhodeCode super-admin can specify `owner` param'
133 158 assert_error(id_, expected, given=response.body)
134 159
135 def test_api_fork_repo_non_admin_no_permission_to_fork(self, backend):
160 def test_api_fork_repo_non_admin_no_permission_of_source_repo(
161 self, backend):
136 162 source_name = backend['minimal'].repo_name
137 163 RepoModel().grant_user_permission(repo=source_name,
138 164 user=self.TEST_USER_LOGIN,
@@ -147,19 +173,44 b' class TestApiForkRepo(object):'
147 173 assert_error(id_, expected, given=response.body)
148 174
149 175 def test_api_fork_repo_non_admin_no_permission_to_fork_to_root_level(
150 self, backend):
176 self, backend, user_util):
177
178 regular_user = user_util.create_user()
179 regular_user_api_key = regular_user.api_key
180 usr = UserModel().get_by_username(regular_user.username)
181 usr.inherit_default_permissions = False
182 Session().add(usr)
183 UserModel().grant_perm(regular_user.username, 'hg.fork.repository')
184
151 185 source_name = backend['minimal'].repo_name
186 fork_name = backend.new_repo_name()
187 id_, params = build_data(
188 regular_user_api_key, 'fork_repo',
189 repoid=source_name,
190 fork_name=fork_name)
191 response = api_call(self.app, params)
192 expected = {
193 "repo_name": "You do not have the permission to "
194 "store repositories in the root location."}
195 assert_error(id_, expected, given=response.body)
152 196
153 usr = UserModel().get_by_username(self.TEST_USER_LOGIN)
197 def test_api_fork_repo_non_admin_no_permission_to_fork(
198 self, backend, user_util):
199
200 regular_user = user_util.create_user()
201 regular_user_api_key = regular_user.api_key
202 usr = UserModel().get_by_username(regular_user.username)
154 203 usr.inherit_default_permissions = False
155 204 Session().add(usr)
156 205
206 source_name = backend['minimal'].repo_name
157 207 fork_name = backend.new_repo_name()
158 208 id_, params = build_data(
159 self.apikey_regular, 'fork_repo',
209 regular_user_api_key, 'fork_repo',
160 210 repoid=source_name,
161 211 fork_name=fork_name)
162 212 response = api_call(self.app, params)
213
163 214 expected = "Access was denied to this resource."
164 215 assert_error(id_, expected, given=response.body)
165 216
@@ -189,7 +240,9 b' class TestApiForkRepo(object):'
189 240 response = api_call(self.app, params)
190 241
191 242 try:
192 expected = "fork `%s` already exist" % (fork_name,)
243 expected = {
244 'unique_repo_name': 'Repository with name `{}` already exists'.format(
245 fork_name)}
193 246 assert_error(id_, expected, given=response.body)
194 247 finally:
195 248 fixture.destroy_repo(fork_repo.repo_name)
@@ -205,7 +258,9 b' class TestApiForkRepo(object):'
205 258 owner=TEST_USER_ADMIN_LOGIN)
206 259 response = api_call(self.app, params)
207 260
208 expected = "repo `%s` already exist" % (fork_name,)
261 expected = {
262 'unique_repo_name': 'Repository with name `{}` already exists'.format(
263 fork_name)}
209 264 assert_error(id_, expected, given=response.body)
210 265
211 266 @mock.patch.object(RepoModel, 'create_fork', crash)
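The fork tests above were updated because validation errors are no longer bare strings but dicts keyed by the failing field. A minimal sketch of the `assert_error` helper's assumed contract (the real one lives in `rhodecode.api.tests.utils`):

```python
import json

def assert_error(id_, expected, given):
    # Sketch, assuming the standard JSON-RPC error envelope: the response
    # echoes the request id, carries a null result, and for validation
    # failures the error payload is a dict keyed by the failing field.
    response = json.loads(given)
    assert response["id"] == id_
    assert response["result"] is None
    assert response["error"] == expected

# Example: forking into an invisible group now fails with a field-keyed dict.
body = json.dumps({
    "id": 1,
    "result": None,
    "error": {"repo_group": "Repository group `foo` does not exist"},
})
assert_error(1, {"repo_group": "Repository group `foo` does not exist"}, body)
```

The field-keyed shape lets clients map each message onto the offending parameter instead of string-matching a flat error.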
@@ -34,6 +34,7 b' pytestmark = pytest.mark.backends("git",'
34 34 class TestGetPullRequest(object):
35 35
36 36 def test_api_get_pull_request(self, pr_util):
37 from rhodecode.model.pull_request import PullRequestModel
37 38 pull_request = pr_util.create_pull_request(mergeable=True)
38 39 id_, params = build_data(
39 40 self.apikey, 'get_pull_request',
@@ -57,6 +58,8 b' class TestGetPullRequest(object):'
57 58 target_url = unicode(
58 59 pull_request.target_repo.clone_url()
59 60 .with_netloc('test.example.com:80'))
61 shadow_url = unicode(
62 PullRequestModel().get_shadow_clone_url(pull_request))
60 63 expected = {
61 64 'pull_request_id': pull_request.pull_request_id,
62 65 'url': pr_url,
@@ -89,15 +92,24 b' class TestGetPullRequest(object):'
89 92 'commit_id': pull_request.target_ref_parts.commit_id,
90 93 },
91 94 },
95 'merge': {
96 'clone_url': shadow_url,
97 'reference': {
98 'name': pull_request.shadow_merge_ref.name,
99 'type': pull_request.shadow_merge_ref.type,
100 'commit_id': pull_request.shadow_merge_ref.commit_id,
101 },
102 },
92 103 'author': pull_request.author.get_api_data(include_secrets=False,
93 104 details='basic'),
94 105 'reviewers': [
95 106 {
96 107 'user': reviewer.get_api_data(include_secrets=False,
97 108 details='basic'),
109 'reasons': reasons,
98 110 'review_status': st[0][1].status if st else 'not_reviewed',
99 111 }
100 for reviewer, st in pull_request.reviewers_statuses()
112 for reviewer, reasons, st in pull_request.reviewers_statuses()
101 113 ]
102 114 }
103 115 assert_ok(id_, expected, response.body)
@@ -45,8 +45,13 b' class TestGetServerInfo(object):'
45 45 expected['uptime'] = resp['result']['uptime']
46 46 expected['load'] = resp['result']['load']
47 47 expected['cpu'] = resp['result']['cpu']
48 expected['disk'] = resp['result']['disk']
49 expected['server_ip'] = '127.0.0.1:80'
48 expected['storage'] = resp['result']['storage']
49 expected['storage_temp'] = resp['result']['storage_temp']
50 expected['storage_inodes'] = resp['result']['storage_inodes']
51 expected['server'] = resp['result']['server']
52
53 expected['index_storage'] = resp['result']['index_storage']
54 expected['storage'] = resp['result']['storage']
50 55
51 56 assert_ok(id_, expected, given=response.body)
52 57
@@ -59,7 +64,21 b' class TestGetServerInfo(object):'
59 64 expected['uptime'] = resp['result']['uptime']
60 65 expected['load'] = resp['result']['load']
61 66 expected['cpu'] = resp['result']['cpu']
62 expected['disk'] = resp['result']['disk']
63 expected['server_ip'] = '127.0.0.1:80'
67 expected['storage'] = resp['result']['storage']
68 expected['storage_temp'] = resp['result']['storage_temp']
69 expected['storage_inodes'] = resp['result']['storage_inodes']
70 expected['server'] = resp['result']['server']
71
72 expected['index_storage'] = resp['result']['index_storage']
73 expected['storage'] = resp['result']['storage']
64 74
65 75 assert_ok(id_, expected, given=response.body)
76
77 def test_api_get_server_info_data_for_search_index_build(self):
78 id_, params = build_data(self.apikey, 'get_server_info')
79 response = api_call(self.app, params)
80 resp = response.json
81
82 # required by indexer
83 assert resp['result']['index_storage']
84 assert resp['result']['storage']
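The `get_server_info` hunks above replace the old `disk`/`server_ip` keys with a richer storage breakdown. A sketch of the assumed payload shape, with illustrative values only (not real output):

```python
# Hypothetical get_server_info result fragment; the concrete paths and
# percentages are made up for illustration.
result = {
    "storage": {"path": "/var/opt/repo_store", "percent": 20},
    "storage_temp": {"path": "/tmp", "percent": 5},
    "storage_inodes": {"percent": 1},
    "index_storage": {"path": "/var/opt/data/index"},
    "server": {"server_ip": "127.0.0.1"},
}

# The new indexer test only requires these two keys to be present and truthy.
required_by_indexer = {"index_storage", "storage"}
assert required_by_indexer <= set(result)
assert all(result[key] for key in required_by_indexer)
```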
@@ -32,42 +32,37 b' from rhodecode.api.tests.utils import ('
32 32 class TestMergePullRequest(object):
33 33 @pytest.mark.backends("git", "hg")
34 34 def test_api_merge_pull_request(self, pr_util, no_notifications):
35 pull_request = pr_util.create_pull_request()
36 pull_request_2 = PullRequestModel().create(
37 created_by=pull_request.author,
38 source_repo=pull_request.source_repo,
39 source_ref=pull_request.source_ref,
40 target_repo=pull_request.target_repo,
41 target_ref=pull_request.target_ref,
42 revisions=pull_request.revisions,
43 reviewers=(),
44 title=pull_request.title,
45 description=pull_request.description,
46 )
35 pull_request = pr_util.create_pull_request(mergeable=True)
47 36 author = pull_request.user_id
48 repo = pull_request_2.target_repo.repo_id
49 pull_request_2_id = pull_request_2.pull_request_id
50 pull_request_2_repo = pull_request_2.target_repo.repo_name
51 Session().commit()
37 repo = pull_request.target_repo.repo_id
38 pull_request_id = pull_request.pull_request_id
39 pull_request_repo = pull_request.target_repo.repo_name
52 40
53 41 id_, params = build_data(
54 42 self.apikey, 'merge_pull_request',
55 repoid=pull_request_2_repo,
56 pullrequestid=pull_request_2_id)
43 repoid=pull_request_repo,
44 pullrequestid=pull_request_id)
45
57 46 response = api_call(self.app, params)
58 47
48 # The above api call detaches the pull request DB object from the
49 # session because of an unconditional transaction rollback in our
50 # middleware. Therefore we need to add it back here if we want to use
51 # it.
52 Session().add(pull_request)
53
59 54 expected = {
60 55 'executed': True,
61 56 'failure_reason': 0,
62 'possible': True
57 'possible': True,
58 'merge_commit_id': pull_request.shadow_merge_ref.commit_id,
59 'merge_ref': pull_request.shadow_merge_ref._asdict()
63 60 }
64 61
65 62 response_json = response.json['result']
66 assert response_json['merge_commit_id']
67 response_json.pop('merge_commit_id')
68 63 assert response_json == expected
69 64
70 action = 'user_merged_pull_request:%d' % (pull_request_2_id, )
65 action = 'user_merged_pull_request:%d' % (pull_request_id, )
71 66 journal = UserLog.query()\
72 67 .filter(UserLog.user_id == author)\
73 68 .filter(UserLog.repository_id == repo)\
@@ -77,11 +72,11 b' class TestMergePullRequest(object):'
77 72
78 73 id_, params = build_data(
79 74 self.apikey, 'merge_pull_request',
80 repoid=pull_request_2_repo, pullrequestid=pull_request_2_id)
75 repoid=pull_request_repo, pullrequestid=pull_request_id)
81 76 response = api_call(self.app, params)
82 77
83 78 expected = 'pull request `%s` merge failed, pull request is closed' % (
84 pull_request_2_id)
79 pull_request_id)
85 80 assert_error(id_, expected, given=response.body)
86 81
87 82 @pytest.mark.backends("git", "hg")
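The merge test now expects `merge_commit_id` and `merge_ref` in the result; the API method reconstructs the former from the latter for backwards compatibility. The shim can be sketched with stand-in types (the real `MergeResponse` lives in RhodeCode's vcs layer):

```python
from collections import namedtuple

# Hypothetical stand-ins for the model's return types.
MergeRef = namedtuple("MergeRef", ["name", "type", "commit_id"])
MergeResponse = namedtuple(
    "MergeResponse", ["executed", "failure_reason", "possible", "merge_ref"])

def as_legacy_dict(merge_response):
    # Mirrors the backwards-compat shim in merge_pull_request: older
    # clients expect a top-level `merge_commit_id`, which now lives
    # inside the merge reference.
    data = merge_response._asdict()
    data["merge_commit_id"] = data["merge_ref"].commit_id
    return data

resp = MergeResponse(True, 0, True, MergeRef("pr-merge", "branch", "f" * 40))
data = as_legacy_dict(resp)
assert data["merge_commit_id"] == "f" * 40
```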
@@ -32,35 +32,60 b' fixture = Fixture()'
32 32
33 33 UPDATE_REPO_NAME = 'api_update_me'
34 34
35 class SAME_AS_UPDATES(object): """ Constant used for tests below """
35
36 class SAME_AS_UPDATES(object):
37 """ Constant used for tests below """
38
36 39
37 40 @pytest.mark.usefixtures("testuser_api", "app")
38 41 class TestApiUpdateRepo(object):
39 42
40 43 @pytest.mark.parametrize("updates, expected", [
41 ({'owner': TEST_USER_REGULAR_LOGIN}, SAME_AS_UPDATES),
42 ({'description': 'new description'}, SAME_AS_UPDATES),
43 ({'clone_uri': 'http://foo.com/repo'}, SAME_AS_UPDATES),
44 ({'clone_uri': None}, {'clone_uri': ''}),
45 ({'clone_uri': ''}, {'clone_uri': ''}),
46 ({'landing_rev': 'branch:master'}, {'landing_rev': ['branch','master']}),
47 ({'enable_statistics': True}, SAME_AS_UPDATES),
48 ({'enable_locking': True}, SAME_AS_UPDATES),
49 ({'enable_downloads': True}, SAME_AS_UPDATES),
50 ({'name': 'new_repo_name'}, {
44 ({'owner': TEST_USER_REGULAR_LOGIN},
45 SAME_AS_UPDATES),
46
47 ({'description': 'new description'},
48 SAME_AS_UPDATES),
49
50 ({'clone_uri': 'http://foo.com/repo'},
51 SAME_AS_UPDATES),
52
53 ({'clone_uri': None},
54 {'clone_uri': ''}),
55
56 ({'clone_uri': ''},
57 {'clone_uri': ''}),
58
59 ({'landing_rev': 'rev:tip'},
60 {'landing_rev': ['rev', 'tip']}),
61
62 ({'enable_statistics': True},
63 SAME_AS_UPDATES),
64
65 ({'enable_locking': True},
66 SAME_AS_UPDATES),
67
68 ({'enable_downloads': True},
69 SAME_AS_UPDATES),
70
71 ({'repo_name': 'new_repo_name'},
72 {
51 73 'repo_name': 'new_repo_name',
52 'url': 'http://test.example.com:80/new_repo_name',
53 }),
54 ({'group': 'test_group_for_update'}, {
55 'repo_name': 'test_group_for_update/%s' % UPDATE_REPO_NAME,
56 'url': 'http://test.example.com:80/test_group_for_update/%s' % UPDATE_REPO_NAME
57 }),
74 'url': 'http://test.example.com:80/new_repo_name'
75 }),
76
77 ({'repo_name': 'test_group_for_update/{}'.format(UPDATE_REPO_NAME),
78 '_group': 'test_group_for_update'},
79 {
80 'repo_name': 'test_group_for_update/{}'.format(UPDATE_REPO_NAME),
81 'url': 'http://test.example.com:80/test_group_for_update/{}'.format(UPDATE_REPO_NAME)
82 }),
58 83 ])
59 84 def test_api_update_repo(self, updates, expected, backend):
60 85 repo_name = UPDATE_REPO_NAME
61 86 repo = fixture.create_repo(repo_name, repo_type=backend.alias)
62 if updates.get('group'):
63 fixture.create_repo_group(updates['group'])
87 if updates.get('_group'):
88 fixture.create_repo_group(updates['_group'])
64 89
65 90 expected_api_data = repo.get_api_data(include_secrets=True)
66 91 if expected is SAME_AS_UPDATES:
@@ -68,15 +93,12 b' class TestApiUpdateRepo(object):'
68 93 else:
69 94 expected_api_data.update(expected)
70 95
71
72 96 id_, params = build_data(
73 97 self.apikey, 'update_repo', repoid=repo_name, **updates)
74 98 response = api_call(self.app, params)
75 99
76 if updates.get('name'):
77 repo_name = updates['name']
78 if updates.get('group'):
79 repo_name = '/'.join([updates['group'], repo_name])
100 if updates.get('repo_name'):
101 repo_name = updates['repo_name']
80 102
81 103 try:
82 104 expected = {
@@ -86,8 +108,8 b' class TestApiUpdateRepo(object):'
86 108 assert_ok(id_, expected, given=response.body)
87 109 finally:
88 110 fixture.destroy_repo(repo_name)
89 if updates.get('group'):
90 fixture.destroy_repo_group(updates['group'])
111 if updates.get('_group'):
112 fixture.destroy_repo_group(updates['_group'])
91 113
92 114 def test_api_update_repo_fork_of_field(self, backend):
93 115 master_repo = backend.create_repo()
@@ -118,19 +140,23 b' class TestApiUpdateRepo(object):'
118 140 id_, params = build_data(
119 141 self.apikey, 'update_repo', repoid=repo.repo_name, **updates)
120 142 response = api_call(self.app, params)
121 expected = 'repository `{}` does not exist'.format(master_repo_name)
143 expected = {
144 'repo_fork_of': 'Fork with id `{}` does not exists'.format(
145 master_repo_name)}
122 146 assert_error(id_, expected, given=response.body)
123 147
124 148 def test_api_update_repo_with_repo_group_not_existing(self):
125 149 repo_name = 'admin_owned'
150 fake_repo_group = 'test_group_for_update'
126 151 fixture.create_repo(repo_name)
127 updates = {'group': 'test_group_for_update'}
152 updates = {'repo_name': '{}/{}'.format(fake_repo_group, repo_name)}
128 153 id_, params = build_data(
129 154 self.apikey, 'update_repo', repoid=repo_name, **updates)
130 155 response = api_call(self.app, params)
131 156 try:
132 expected = 'repository group `%s` does not exist' % (
133 updates['group'],)
157 expected = {
158 'repo_group': 'Repository group `{}` does not exist'.format(fake_repo_group)
159 }
134 160 assert_error(id_, expected, given=response.body)
135 161 finally:
136 162 fixture.destroy_repo(repo_name)
@@ -30,22 +30,30 b' from rhodecode.api.tests.utils import ('
30 30
31 31 @pytest.mark.usefixtures("testuser_api", "app")
32 32 class TestApiUpdateRepoGroup(object):
33
33 34 def test_update_group_name(self, user_util):
34 35 new_group_name = 'new-group'
35 36 initial_name = self._update(user_util, group_name=new_group_name)
36 37 assert RepoGroupModel()._get_repo_group(initial_name) is None
37 assert RepoGroupModel()._get_repo_group(new_group_name) is not None
38 new_group = RepoGroupModel()._get_repo_group(new_group_name)
39 assert new_group is not None
40 assert new_group.full_path == new_group_name
38 41
39 def test_update_parent(self, user_util):
42 def test_update_group_name_change_parent(self, user_util):
43
40 44 parent_group = user_util.create_repo_group()
41 initial_name = self._update(user_util, parent=parent_group.name)
45 parent_group_name = parent_group.name
42 46
43 expected_group_name = '{}/{}'.format(parent_group.name, initial_name)
47 expected_group_name = '{}/{}'.format(parent_group_name, 'new-group')
48 initial_name = self._update(user_util, group_name=expected_group_name)
49
44 50 repo_group = RepoGroupModel()._get_repo_group(expected_group_name)
51
45 52 assert repo_group is not None
46 53 assert repo_group.group_name == expected_group_name
47 assert repo_group.name == initial_name
54 assert repo_group.full_path == expected_group_name
48 55 assert RepoGroupModel()._get_repo_group(initial_name) is None
56
49 57 new_path = os.path.join(
50 58 RepoGroupModel().repos_path, *repo_group.full_path_splitted)
51 59 assert os.path.exists(new_path)
@@ -67,15 +75,47 b' class TestApiUpdateRepoGroup(object):'
67 75 repo_group = RepoGroupModel()._get_repo_group(initial_name)
68 76 assert repo_group.user.username == owner
69 77
70 def test_api_update_repo_group_by_regular_user_no_permission(
71 self, backend):
72 repo = backend.create_repo()
73 repo_name = repo.repo_name
78 def test_update_group_name_conflict_with_existing(self, user_util):
79 group_1 = user_util.create_repo_group()
80 group_2 = user_util.create_repo_group()
81 repo_group_name_1 = group_1.group_name
82 repo_group_name_2 = group_2.group_name
74 83
75 84 id_, params = build_data(
76 self.apikey_regular, 'update_repo_group', repogroupid=repo_name)
85 self.apikey, 'update_repo_group', repogroupid=repo_group_name_1,
86 group_name=repo_group_name_2)
87 response = api_call(self.app, params)
88 expected = {
89 'unique_repo_group_name':
90 'Repository group with name `{}` already exists'.format(
91 repo_group_name_2)}
92 assert_error(id_, expected, given=response.body)
93
94 def test_api_update_repo_group_by_regular_user_no_permission(self, user_util):
95 temp_user = user_util.create_user()
96 temp_user_api_key = temp_user.api_key
97 parent_group = user_util.create_repo_group()
98 repo_group_name = parent_group.group_name
99 id_, params = build_data(
100 temp_user_api_key, 'update_repo_group', repogroupid=repo_group_name)
77 101 response = api_call(self.app, params)
78 expected = 'repository group `%s` does not exist' % (repo_name,)
102 expected = 'repository group `%s` does not exist' % (repo_group_name,)
103 assert_error(id_, expected, given=response.body)
104
105 def test_api_update_repo_group_regular_user_no_root_write_permissions(
106 self, user_util):
107 temp_user = user_util.create_user()
108 temp_user_api_key = temp_user.api_key
109 parent_group = user_util.create_repo_group(owner=temp_user.username)
110 repo_group_name = parent_group.group_name
111
112 id_, params = build_data(
113 temp_user_api_key, 'update_repo_group', repogroupid=repo_group_name,
114 group_name='at-root-level')
115 response = api_call(self.app, params)
116 expected = {
117 'repo_group': 'You do not have the permission to store '
118 'repository groups in the root location.'}
79 119 assert_error(id_, expected, given=response.body)
80 120
81 121 def _update(self, user_util, **kwargs):
@@ -89,7 +129,10 b' class TestApiUpdateRepoGroup(object):'
89 129 self.apikey, 'update_repo_group', repogroupid=initial_name,
90 130 **kwargs)
91 131 response = api_call(self.app, params)
92 ret = {
132
133 repo_group = RepoGroupModel.cls.get(repo_group.group_id)
134
135 expected = {
93 136 'msg': 'updated repository group ID:{} {}'.format(
94 137 repo_group.group_id, repo_group.group_name),
95 138 'repo_group': {
@@ -103,5 +146,5 b' class TestApiUpdateRepoGroup(object):'
103 146 if repo_group.parent_group else None)
104 147 }
105 148 }
106 assert_ok(id_, ret, given=response.body)
149 assert_ok(id_, expected, given=response.body)
107 150 return initial_name
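The renamed tests above reflect a semantic change: `group_name` passed to `update_repo_group` is now the full path, so re-parenting happens by renaming to `parent/child`. The assumed split rule can be sketched as:

```python
def parent_and_name(group_name):
    # Sketch of the rename semantics exercised above: `group_name` is the
    # full path; everything before the last "/" is the parent group.
    parent, _, name = group_name.rpartition("/")
    return (parent or None, name)

assert parent_and_name("new-group") == (None, "new-group")
assert parent_and_name("parent/new-group") == ("parent", "new-group")
```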
@@ -249,7 +249,7 b' class TestRepoAccess(object):'
249 249 fake_repo = Mock()
250 250 with self.repo_perm_patch as rmock:
251 251 rmock.return_value = repo_mock
252 assert utils.has_repo_permissions(
252 assert utils.validate_repo_permissions(
253 253 'fake_user', 'fake_repo_id', fake_repo,
254 254 ['perm1', 'perm2'])
255 255 rmock.assert_called_once_with(*['perm1', 'perm2'])
@@ -263,6 +263,6 b' class TestRepoAccess(object):'
263 263 with self.repo_perm_patch as rmock:
264 264 rmock.return_value = repo_mock
265 265 with pytest.raises(JSONRPCError) as excinfo:
266 utils.has_repo_permissions(
266 utils.validate_repo_permissions(
267 267 'fake_user', 'fake_repo_id', fake_repo, 'perms')
268 268 assert 'fake_repo_id' in excinfo
@@ -26,7 +26,8 b' import collections'
26 26 import logging
27 27
28 28 from rhodecode.api.exc import JSONRPCError
29 from rhodecode.lib.auth import HasPermissionAnyApi, HasRepoPermissionAnyApi
29 from rhodecode.lib.auth import HasPermissionAnyApi, HasRepoPermissionAnyApi, \
30 HasRepoGroupPermissionAnyApi
30 31 from rhodecode.lib.utils import safe_unicode
31 32 from rhodecode.controllers.utils import get_commit_from_ref_name
32 33 from rhodecode.lib.vcs.exceptions import RepositoryError
@@ -153,7 +154,7 b' def has_superadmin_permission(apiuser):'
153 154 return False
154 155
155 156
156 def has_repo_permissions(apiuser, repoid, repo, perms):
157 def validate_repo_permissions(apiuser, repoid, repo, perms):
157 158 """
158 159 Raise JsonRPCError if apiuser is not authorized or return True
159 160
@@ -170,6 +171,36 b' def has_repo_permissions(apiuser, repoid'
170 171 return True
171 172
172 173
174 def validate_repo_group_permissions(apiuser, repogroupid, repo_group, perms):
175 """
176 Raise JSONRPCError if apiuser is not authorized, otherwise return True
177
178 :param apiuser:
179 :param repogroupid: just the id of repository group
180 :param repo_group: instance of repo_group
181 :param perms:
182 """
183 if not HasRepoGroupPermissionAnyApi(*perms)(
184 user=apiuser, group_name=repo_group.group_name):
185 raise JSONRPCError(
186 'repository group `%s` does not exist' % repogroupid)
187
188 return True
189
190
191 def validate_set_owner_permissions(apiuser, owner):
192 if isinstance(owner, Optional):
193 owner = get_user_or_error(apiuser.user_id)
194 else:
195 if has_superadmin_permission(apiuser):
196 owner = get_user_or_error(owner)
197 else:
198 # forbid setting owner for non-admins
199 raise JSONRPCError(
200 'Only RhodeCode super-admin can specify `owner` param')
201 return owner
202
203
173 204 def get_user_or_error(userid):
174 205 """
175 206 Get user by id or name or return JsonRPCError if not found
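The new `validate_set_owner_permissions` helper centralises the owner-resolution rule several API methods shared. Its branching can be sketched with a hypothetical `Optional` marker (the real one is RhodeCode's API `Optional` wrapper):

```python
class Optional(object):
    # Sketch of the Optional default-marker used by the API layer.
    def __init__(self, default):
        self.default = default

def resolve_owner(owner, is_superadmin, apiuser_name="apiuser"):
    # Mirrors validate_set_owner_permissions: an Optional owner resolves
    # to the calling user; an explicit owner requires super-admin rights.
    if isinstance(owner, Optional):
        return apiuser_name
    if not is_superadmin:
        raise ValueError("Only RhodeCode super-admin can specify `owner` param")
    return owner

assert resolve_owner(Optional("apiuser"), False) == "apiuser"
assert resolve_owner("bob", True) == "bob"
```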
@@ -25,7 +25,7 b' from rhodecode.api import jsonrpc_method'
25 25 from rhodecode.api.utils import (
26 26 has_superadmin_permission, Optional, OAttr, get_repo_or_error,
27 27 get_pull_request_or_error, get_commit_or_error, get_user_or_error,
28 has_repo_permissions, resolve_ref_or_error)
28 validate_repo_permissions, resolve_ref_or_error)
29 29 from rhodecode.lib.auth import (HasRepoPermissionAnyApi)
30 30 from rhodecode.lib.base import vcs_operation_context
31 31 from rhodecode.lib.utils2 import str2bool
@@ -96,6 +96,15 b' def get_pull_request(request, apiuser, r'
96 96 "commit_id": "<commit_id>",
97 97 }
98 98 },
99 "merge": {
100 "clone_url": "<clone_url>",
101 "reference":
102 {
103 "name": "<name>",
104 "type": "<type>",
105 "commit_id": "<commit_id>",
106 }
107 },
99 108 "author": <user_obj>,
100 109 "reviewers": [
101 110 ...
@@ -178,6 +187,15 b' def get_pull_requests(request, apiuser, '
178 187 "commit_id": "<commit_id>",
179 188 }
180 189 },
190 "merge": {
191 "clone_url": "<clone_url>",
192 "reference":
193 {
194 "name": "<name>",
195 "type": "<type>",
196 "commit_id": "<commit_id>",
197 }
198 },
181 199 "author": <user_obj>,
182 200 "reviewers": [
183 201 ...
@@ -197,7 +215,7 b' def get_pull_requests(request, apiuser, '
197 215 if not has_superadmin_permission(apiuser):
198 216 _perms = (
199 217 'repository.admin', 'repository.write', 'repository.read',)
200 has_repo_permissions(apiuser, repoid, repo, _perms)
218 validate_repo_permissions(apiuser, repoid, repo, _perms)
201 219
202 220 status = Optional.extract(status)
203 221 pull_requests = PullRequestModel().get_all(repo, statuses=[status])
@@ -232,7 +250,12 b' def merge_pull_request(request, apiuser,'
232 250 "executed": "<bool>",
233 251 "failure_reason": "<int>",
234 252 "merge_commit_id": "<merge_commit_id>",
235 "possible": "<bool>"
253 "possible": "<bool>",
254 "merge_ref": {
255 "commit_id": "<commit_id>",
256 "type": "<type>",
257 "name": "<name>"
258 }
236 259 },
237 260 "error": null
238 261
@@ -260,13 +283,21 b' def merge_pull_request(request, apiuser,'
260 283 request.environ, repo_name=target_repo.repo_name,
261 284 username=apiuser.username, action='push',
262 285 scm=target_repo.repo_type)
263 data = PullRequestModel().merge(pull_request, apiuser, extras=extras)
264 if data.executed:
286 merge_response = PullRequestModel().merge(
287 pull_request, apiuser, extras=extras)
288 if merge_response.executed:
265 289 PullRequestModel().close_pull_request(
266 290 pull_request.pull_request_id, apiuser)
267 291
268 292 Session().commit()
269 return data
293
294 # In previous versions the merge response directly contained the merge
295 # commit id. It is now contained in the merge reference object. To be
296 # backwards compatible we have to extract it again.
297 merge_response = merge_response._asdict()
298 merge_response['merge_commit_id'] = merge_response['merge_ref'].commit_id
299
300 return merge_response
270 301
271 302
272 303 @jsonrpc_method()
@@ -463,12 +494,17 b' def create_pull_request('
463 494 :type description: Optional(str)
464 495 :param reviewers: Set the new pull request reviewers list.
465 496 :type reviewers: Optional(list)
497 Accepts username strings or objects of the format:
498 {
499 'username': 'nick', 'reasons': ['original author']
500 }
466 501 """
502
467 503 source = get_repo_or_error(source_repo)
468 504 target = get_repo_or_error(target_repo)
469 505 if not has_superadmin_permission(apiuser):
470 506 _perms = ('repository.admin', 'repository.write', 'repository.read',)
471 has_repo_permissions(apiuser, source_repo, source, _perms)
507 validate_repo_permissions(apiuser, source_repo, source, _perms)
472 508
473 509 full_source_ref = resolve_ref_or_error(source_ref, source)
474 510 full_target_ref = resolve_ref_or_error(target_ref, target)
@@ -490,12 +526,21 b' def create_pull_request('
490 526 if not ancestor:
491 527 raise JSONRPCError('no common ancestor found')
492 528
493 reviewer_names = Optional.extract(reviewers) or []
494 if not isinstance(reviewer_names, list):
529 reviewer_objects = Optional.extract(reviewers) or []
530 if not isinstance(reviewer_objects, list):
495 531 raise JSONRPCError('reviewers should be specified as a list')
496 532
497 reviewer_users = [get_user_or_error(n) for n in reviewer_names]
498 reviewer_ids = [u.user_id for u in reviewer_users]
533 reviewers_reasons = []
534 for reviewer_object in reviewer_objects:
535 reviewer_reasons = []
536 if isinstance(reviewer_object, (basestring, int)):
537 reviewer_username = reviewer_object
538 else:
539 reviewer_username = reviewer_object['username']
540 reviewer_reasons = reviewer_object.get('reasons', [])
541
542 user = get_user_or_error(reviewer_username)
543 reviewers_reasons.append((user.user_id, reviewer_reasons))
499 544
500 545 pull_request_model = PullRequestModel()
501 546 pull_request = pull_request_model.create(
@@ -506,7 +551,7 b' def create_pull_request('
506 551 target_ref=full_target_ref,
507 552 revisions=reversed(
508 553 [commit.raw_id for commit in reversed(commit_ranges)]),
509 reviewers=reviewer_ids,
554 reviewers=reviewers_reasons,
510 555 title=title,
511 556 description=Optional.extract(description)
512 557 )
@@ -585,12 +630,23 b' def update_pull_request('
585 630 'pull request `%s` update failed, pull request is closed' % (
586 631 pullrequestid,))
587 632
588 reviewer_names = Optional.extract(reviewers) or []
589 if not isinstance(reviewer_names, list):
633 reviewer_objects = Optional.extract(reviewers) or []
634 if not isinstance(reviewer_objects, list):
590 635 raise JSONRPCError('reviewers should be specified as a list')
591 636
592 reviewer_users = [get_user_or_error(n) for n in reviewer_names]
593 reviewer_ids = [u.user_id for u in reviewer_users]
637 reviewers_reasons = []
638 reviewer_ids = set()
639 for reviewer_object in reviewer_objects:
640 reviewer_reasons = []
641 if isinstance(reviewer_object, (int, basestring)):
642 reviewer_username = reviewer_object
643 else:
644 reviewer_username = reviewer_object['username']
645 reviewer_reasons = reviewer_object.get('reasons', [])
646
647 user = get_user_or_error(reviewer_username)
648 reviewer_ids.add(user.user_id)
649 reviewers_reasons.append((user.user_id, reviewer_reasons))
594 650
595 651 title = Optional.extract(title)
596 652 description = Optional.extract(description)
@@ -603,15 +659,15 b' def update_pull_request('
603 659 commit_changes = {"added": [], "common": [], "removed": []}
604 660 if str2bool(Optional.extract(update_commits)):
605 661 if PullRequestModel().has_valid_update_type(pull_request):
606 _version, _commit_changes = PullRequestModel().update_commits(
662 update_response = PullRequestModel().update_commits(
607 663 pull_request)
608 commit_changes = _commit_changes or commit_changes
664 commit_changes = update_response.changes or commit_changes
609 665 Session().commit()
610 666
611 667 reviewers_changes = {"added": [], "removed": []}
612 668 if reviewer_ids:
613 669 added_reviewers, removed_reviewers = \
614 PullRequestModel().update_reviewers(pull_request, reviewer_ids)
670 PullRequestModel().update_reviewers(pull_request, reviewers_reasons)
615 671
616 672 reviewers_changes['added'] = sorted(
617 673 [get_user_or_error(n).username for n in added_reviewers])
@@ -631,5 +687,5 b' def update_pull_request('
631 687 'updated_commits': commit_changes,
632 688 'updated_reviewers': reviewers_changes
633 689 }
690
634 691 return data
635
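Both `create_pull_request` and `update_pull_request` now accept reviewers either as plain usernames or as objects carrying review reasons. The normalisation loop added above can be sketched as (Python 3 `str` standing in for the Python 2 `basestring` check in the diff):

```python
def parse_reviewers(reviewer_objects):
    # Mirrors the reviewer normalisation above: each entry may be a plain
    # username / user-id, or a dict {'username': ..., 'reasons': [...]}.
    reviewers_reasons = []
    for obj in reviewer_objects:
        if isinstance(obj, (str, int)):
            username, reasons = obj, []
        else:
            username = obj["username"]
            reasons = obj.get("reasons", [])
        reviewers_reasons.append((username, reasons))
    return reviewers_reasons

assert parse_reviewers(
    ["nick", {"username": "ana", "reasons": ["original author"]}]
) == [("nick", []), ("ana", ["original author"])]
```

In the real method each username is then resolved to a user id via `get_user_or_error` before being handed to the model.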
@@ -21,29 +21,26 b''
21 21 import logging
22 22 import time
23 23
24 import colander
25
26 from rhodecode import BACKENDS
27 from rhodecode.api import jsonrpc_method, JSONRPCError, JSONRPCForbidden, json
24 import rhodecode
25 from rhodecode.api import (
26 jsonrpc_method, JSONRPCError, JSONRPCForbidden, JSONRPCValidationError)
28 27 from rhodecode.api.utils import (
29 28 has_superadmin_permission, Optional, OAttr, get_repo_or_error,
30 get_user_group_or_error, get_user_or_error, has_repo_permissions,
31 get_perm_or_error, store_update, get_repo_group_or_error, parse_args,
32 get_origin, build_commit_data)
33 from rhodecode.lib.auth import (
34 HasPermissionAnyApi, HasRepoGroupPermissionAnyApi,
35 HasUserGroupPermissionAnyApi)
29 get_user_group_or_error, get_user_or_error, validate_repo_permissions,
30 get_perm_or_error, parse_args, get_origin, build_commit_data,
31 validate_set_owner_permissions)
32 from rhodecode.lib.auth import HasPermissionAnyApi, HasUserGroupPermissionAnyApi
36 33 from rhodecode.lib.exceptions import StatusChangeOnClosedPullRequestError
37 from rhodecode.lib.utils import map_groups
38 34 from rhodecode.lib.utils2 import str2bool, time_to_datetime
35 from rhodecode.lib.ext_json import json
39 36 from rhodecode.model.changeset_status import ChangesetStatusModel
40 37 from rhodecode.model.comment import ChangesetCommentsModel
41 38 from rhodecode.model.db import (
42 39 Session, ChangesetStatus, RepositoryField, Repository)
43 40 from rhodecode.model.repo import RepoModel
44 from rhodecode.model.repo_group import RepoGroupModel
45 41 from rhodecode.model.scm import ScmModel, RepoList
46 42 from rhodecode.model.settings import SettingsModel, VcsSettingsModel
43 from rhodecode.model import validation_schema
47 44 from rhodecode.model.validation_schema.schemas import repo_schema
48 45
49 46 log = logging.getLogger(__name__)
@@ -177,6 +174,7 b' def get_repo(request, apiuser, repoid, c'
177 174
178 175 repo = get_repo_or_error(repoid)
179 176 cache = Optional.extract(cache)
177
180 178 include_secrets = False
181 179 if has_superadmin_permission(apiuser):
182 180 include_secrets = True
@@ -184,7 +182,7 b' def get_repo(request, apiuser, repoid, c'
184 182 # check if we have at least read permission for this repo !
185 183 _perms = (
186 184 'repository.admin', 'repository.write', 'repository.read',)
187 has_repo_permissions(apiuser, repoid, repo, _perms)
185 validate_repo_permissions(apiuser, repoid, repo, _perms)
188 186
189 187 permissions = []
190 188 for _user in repo.permissions():
@@ -292,7 +290,7 b' def get_repo_changeset(request, apiuser,'
292 290 if not has_superadmin_permission(apiuser):
293 291 _perms = (
294 292 'repository.admin', 'repository.write', 'repository.read',)
295 has_repo_permissions(apiuser, repoid, repo, _perms)
293 validate_repo_permissions(apiuser, repoid, repo, _perms)
296 294
297 295 changes_details = Optional.extract(details)
298 296 _changes_details_types = ['basic', 'extended', 'full']
@@ -355,7 +353,7 b' def get_repo_changesets(request, apiuser'
355 353 if not has_superadmin_permission(apiuser):
356 354 _perms = (
357 355 'repository.admin', 'repository.write', 'repository.read',)
358 has_repo_permissions(apiuser, repoid, repo, _perms)
356 validate_repo_permissions(apiuser, repoid, repo, _perms)
359 357
360 358 changes_details = Optional.extract(details)
361 359 _changes_details_types = ['basic', 'extended', 'full']
@@ -450,7 +448,7 b' def get_repo_nodes(request, apiuser, rep'
450 448 if not has_superadmin_permission(apiuser):
451 449 _perms = (
452 450 'repository.admin', 'repository.write', 'repository.read',)
453 has_repo_permissions(apiuser, repoid, repo, _perms)
451 validate_repo_permissions(apiuser, repoid, repo, _perms)
454 452
455 453 ret_type = Optional.extract(ret_type)
456 454 details = Optional.extract(details)
@@ -523,7 +521,7 b' def get_repo_refs(request, apiuser, repo'
523 521 repo = get_repo_or_error(repoid)
524 522 if not has_superadmin_permission(apiuser):
525 523 _perms = ('repository.admin', 'repository.write', 'repository.read',)
526 has_repo_permissions(apiuser, repoid, repo, _perms)
524 validate_repo_permissions(apiuser, repoid, repo, _perms)
527 525
528 526 try:
529 527 # check if repo is not empty by any chance, skip quicker if it is.
@@ -538,26 +536,30 b' def get_repo_refs(request, apiuser, repo'
538 536
539 537
540 538 @jsonrpc_method()
541 def create_repo(request, apiuser, repo_name, repo_type,
542 owner=Optional(OAttr('apiuser')), description=Optional(''),
543 private=Optional(False), clone_uri=Optional(None),
544 landing_rev=Optional('rev:tip'),
545 enable_statistics=Optional(False),
546 enable_locking=Optional(False),
547 enable_downloads=Optional(False),
548 copy_permissions=Optional(False)):
539 def create_repo(
540 request, apiuser, repo_name, repo_type,
541 owner=Optional(OAttr('apiuser')),
542 description=Optional(''),
543 private=Optional(False),
544 clone_uri=Optional(None),
545 landing_rev=Optional('rev:tip'),
546 enable_statistics=Optional(False),
547 enable_locking=Optional(False),
548 enable_downloads=Optional(False),
549 copy_permissions=Optional(False)):
549 550 """
550 551 Creates a repository.
551 552
552 * If the repository name contains "/", all the required repository
553 groups will be created.
 553 * If the repository name contains "/", the repository will be created
 554 inside a repository group or nested repository groups.
554 555
555 For example "foo/bar/baz" will create |repo| groups "foo" and "bar"
556 (with "foo" as parent). It will also create the "baz" repository
557 with "bar" as |repo| group.
 556 For example "foo/bar/repo1" will create a |repo| called "repo1" inside
 557 the group "foo/bar". You must have write permissions on the deepest
 558 repository group ("bar" in this example).
558 559
559 560 This command can only be run using an |authtoken| with at least
560 write permissions to the |repo|.
561 permissions to create repositories, or write permissions to
562 parent repository groups.
561 563
562 564 :param apiuser: This is filled automatically from the |authtoken|.
563 565 :type apiuser: AuthUser
@@ -569,9 +571,9 b' def create_repo(request, apiuser, repo_n'
569 571 :type owner: Optional(str)
570 572 :param description: Set the repository description.
571 573 :type description: Optional(str)
572 :param private:
 574 :param private: Set the repository as private.
573 575 :type private: bool
574 :param clone_uri:
 576 :param clone_uri: Set the clone URI.
575 577 :type clone_uri: str
576 578 :param landing_rev: <rev_type>:<rev>
577 579 :type landing_rev: str
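The `landing_rev` parameter documented above uses a `<rev_type>:<rev>` string such as `rev:tip` or `branch:default`. A small illustrative parser for that format (not RhodeCode's own code, just a sketch of the convention):

```python
def parse_landing_rev(value, default=('rev', 'tip')):
    """Split '<rev_type>:<rev>' (e.g. 'branch:default') into a tuple.

    Falls back to the given default when the value is empty or has no
    type prefix.
    """
    if not value or ':' not in value:
        return default
    rev_type, _, rev = value.partition(':')
    return rev_type, rev


print(parse_landing_rev('branch:default'))  # ('branch', 'default')
print(parse_landing_rev('rev:tip'))         # ('rev', 'tip')
```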
@@ -606,53 +608,17 b' def create_repo(request, apiuser, repo_n'
606 608 id : <id_given_in_input>
607 609 result : null
608 610 error : {
609 'failed to create repository `<repo_name>`
611 'failed to create repository `<repo_name>`'
610 612 }
611 613
612 614 """
613 schema = repo_schema.RepoSchema()
614 try:
615 data = schema.deserialize({
616 'repo_name': repo_name
617 })
618 except colander.Invalid as e:
619 raise JSONRPCError("Validation failed: %s" % (e.asdict(),))
620 repo_name = data['repo_name']
621 615
622 (repo_name_cleaned,
623 parent_group_name) = RepoGroupModel()._get_group_name_and_parent(
624 repo_name)
625
626 if not HasPermissionAnyApi(
627 'hg.admin', 'hg.create.repository')(user=apiuser):
628 # check if we have admin permission for this repo group if given !
629
630 if parent_group_name:
631 repogroupid = parent_group_name
632 repo_group = get_repo_group_or_error(parent_group_name)
616 owner = validate_set_owner_permissions(apiuser, owner)
633 617
634 _perms = ('group.admin',)
635 if not HasRepoGroupPermissionAnyApi(*_perms)(
636 user=apiuser, group_name=repo_group.group_name):
637 raise JSONRPCError(
638 'repository group `%s` does not exist' % (
639 repogroupid,))
640 else:
641 raise JSONRPCForbidden()
642
643 if not has_superadmin_permission(apiuser):
644 if not isinstance(owner, Optional):
645 # forbid setting owner for non-admins
646 raise JSONRPCError(
647 'Only RhodeCode admin can specify `owner` param')
648
649 if isinstance(owner, Optional):
650 owner = apiuser.user_id
651
652 owner = get_user_or_error(owner)
653
654 if RepoModel().get_by_repo_name(repo_name):
655 raise JSONRPCError("repo `%s` already exist" % repo_name)
618 description = Optional.extract(description)
619 copy_permissions = Optional.extract(copy_permissions)
620 clone_uri = Optional.extract(clone_uri)
621 landing_commit_ref = Optional.extract(landing_rev)
656 622
657 623 defs = SettingsModel().get_default_repo_settings(strip_prefix=True)
658 624 if isinstance(private, Optional):
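The `Optional(...)` defaults and `Optional.extract(...)` calls seen throughout these methods let the API distinguish "argument omitted" from "argument explicitly passed", so omitted arguments can fall back to instance-wide defaults. A simplified stand-in for that pattern (not RhodeCode's actual `Optional` class):

```python
class Optional(object):
    """Wrapper marking a default value the caller never supplied."""

    def __init__(self, default):
        self.default = default

    @classmethod
    def extract(cls, value):
        # Unwrap the default if the caller omitted the argument.
        return value.default if isinstance(value, cls) else value


def create_repo(repo_name, private=Optional(False)):
    if isinstance(private, Optional):
        # Caller omitted `private`: this is where instance-wide
        # defaults would be consulted before unwrapping.
        private = Optional.extract(private)
    return repo_name, private


print(create_repo('foo'))        # ('foo', False)
print(create_repo('foo', True))  # ('foo', True)
```

An explicit `False` and an omitted argument both end up as `False` here, but only the omitted case enters the `isinstance` branch, which is why the diff checks `isinstance(private, Optional)` before applying `defs.get(...)` defaults.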
@@ -666,32 +632,44 b' def create_repo(request, apiuser, repo_n'
666 632 if isinstance(enable_downloads, Optional):
667 633 enable_downloads = defs.get('repo_enable_downloads')
668 634
669 clone_uri = Optional.extract(clone_uri)
670 description = Optional.extract(description)
671 landing_rev = Optional.extract(landing_rev)
672 copy_permissions = Optional.extract(copy_permissions)
635 schema = repo_schema.RepoSchema().bind(
636 repo_type_options=rhodecode.BACKENDS.keys(),
 637 # the calling user
638 user=apiuser)
673 639
674 640 try:
675 # create structure of groups and return the last group
676 repo_group = map_groups(repo_name)
641 schema_data = schema.deserialize(dict(
642 repo_name=repo_name,
643 repo_type=repo_type,
644 repo_owner=owner.username,
645 repo_description=description,
646 repo_landing_commit_ref=landing_commit_ref,
647 repo_clone_uri=clone_uri,
648 repo_private=private,
649 repo_copy_permissions=copy_permissions,
650 repo_enable_statistics=enable_statistics,
651 repo_enable_downloads=enable_downloads,
652 repo_enable_locking=enable_locking))
653 except validation_schema.Invalid as err:
654 raise JSONRPCValidationError(colander_exc=err)
655
656 try:
677 657 data = {
678 'repo_name': repo_name_cleaned,
679 'repo_name_full': repo_name,
680 'repo_type': repo_type,
681 'repo_description': description,
682 658 'owner': owner,
683 'repo_private': private,
684 'clone_uri': clone_uri,
685 'repo_group': repo_group.group_id if repo_group else None,
686 'repo_landing_rev': landing_rev,
687 'enable_statistics': enable_statistics,
688 'enable_locking': enable_locking,
689 'enable_downloads': enable_downloads,
690 'repo_copy_permissions': copy_permissions,
659 'repo_name': schema_data['repo_group']['repo_name_without_group'],
660 'repo_name_full': schema_data['repo_name'],
661 'repo_group': schema_data['repo_group']['repo_group_id'],
662 'repo_type': schema_data['repo_type'],
663 'repo_description': schema_data['repo_description'],
664 'repo_private': schema_data['repo_private'],
665 'clone_uri': schema_data['repo_clone_uri'],
666 'repo_landing_rev': schema_data['repo_landing_commit_ref'],
667 'enable_statistics': schema_data['repo_enable_statistics'],
668 'enable_locking': schema_data['repo_enable_locking'],
669 'enable_downloads': schema_data['repo_enable_downloads'],
670 'repo_copy_permissions': schema_data['repo_copy_permissions'],
691 671 }
692 672
693 if repo_type not in BACKENDS.keys():
694 raise Exception("Invalid backend type %s" % repo_type)
695 673 task = RepoModel().create(form_data=data, cur_user=owner)
696 674 from celery.result import BaseAsyncResult
697 675 task_id = None
@@ -699,17 +677,17 b' def create_repo(request, apiuser, repo_n'
699 677 task_id = task.task_id
700 678 # no commit, it's done in RepoModel, or async via celery
701 679 return {
702 'msg': "Created new repository `%s`" % (repo_name,),
680 'msg': "Created new repository `%s`" % (schema_data['repo_name'],),
703 681 'success': True, # cannot return the repo data here since fork
704 # cann be done async
682 # can be done async
705 683 'task': task_id
706 684 }
707 685 except Exception:
708 686 log.exception(
709 687 u"Exception while trying to create the repository %s",
710 repo_name)
688 schema_data['repo_name'])
711 689 raise JSONRPCError(
712 'failed to create repository `%s`' % (repo_name,))
690 'failed to create repository `%s`' % (schema_data['repo_name'],))
713 691
714 692
715 693 @jsonrpc_method()
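After this change, `create_repo` funnels all input through a schema's `deserialize()` and converts validation failures into a `JSONRPCValidationError`. The real schema is colander-based (`repo_schema.RepoSchema`); the stand-in below only mimics the deserialize/`Invalid` shape to show the control flow, so its field checks and group-splitting logic are illustrative assumptions:

```python
class Invalid(Exception):
    """Mimics colander.Invalid: carries per-field error details."""

    def __init__(self, errors):
        super(Invalid, self).__init__(errors)
        self.errors = errors

    def asdict(self):
        return self.errors


class RepoSchema(object):
    def deserialize(self, data):
        errors = {}
        name = (data.get('repo_name') or '').strip('/')
        if not name:
            errors['repo_name'] = 'required'
        if data.get('repo_type') not in ('hg', 'git', 'svn'):
            errors['repo_type'] = 'unknown backend'
        if errors:
            raise Invalid(errors)
        # Split "foo/bar/repo1" into its group path and short name,
        # like the schema's repo_group sub-mapping in the diff.
        group, _, short_name = name.rpartition('/')
        data = dict(data, repo_name=name)
        data['repo_group'] = {'repo_name_without_group': short_name,
                              'repo_group_name': group or None}
        return data


schema_data = RepoSchema().deserialize(
    {'repo_name': 'foo/bar/repo1', 'repo_type': 'git'})
print(schema_data['repo_group']['repo_name_without_group'])  # repo1
```

Centralizing validation this way is what lets the diff delete the hand-rolled `map_groups` / duplicate-name / backend checks from the method body.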
@@ -735,7 +713,7 b' def add_field_to_repo(request, apiuser, '
735 713 repo = get_repo_or_error(repoid)
736 714 if not has_superadmin_permission(apiuser):
737 715 _perms = ('repository.admin',)
738 has_repo_permissions(apiuser, repoid, repo, _perms)
716 validate_repo_permissions(apiuser, repoid, repo, _perms)
739 717
740 718 label = Optional.extract(label) or key
741 719 description = Optional.extract(description)
@@ -778,7 +756,7 b' def remove_field_from_repo(request, apiu'
778 756 repo = get_repo_or_error(repoid)
779 757 if not has_superadmin_permission(apiuser):
780 758 _perms = ('repository.admin',)
781 has_repo_permissions(apiuser, repoid, repo, _perms)
759 validate_repo_permissions(apiuser, repoid, repo, _perms)
782 760
783 761 field = RepositoryField.get_by_key_name(key, repo)
784 762 if not field:
@@ -800,33 +778,38 b' def remove_field_from_repo(request, apiu'
800 778
801 779
802 780 @jsonrpc_method()
803 def update_repo(request, apiuser, repoid, name=Optional(None),
804 owner=Optional(OAttr('apiuser')),
805 group=Optional(None),
806 fork_of=Optional(None),
807 description=Optional(''), private=Optional(False),
808 clone_uri=Optional(None), landing_rev=Optional('rev:tip'),
809 enable_statistics=Optional(False),
810 enable_locking=Optional(False),
811 enable_downloads=Optional(False),
812 fields=Optional('')):
781 def update_repo(
782 request, apiuser, repoid, repo_name=Optional(None),
783 owner=Optional(OAttr('apiuser')), description=Optional(''),
784 private=Optional(False), clone_uri=Optional(None),
785 landing_rev=Optional('rev:tip'), fork_of=Optional(None),
786 enable_statistics=Optional(False),
787 enable_locking=Optional(False),
788 enable_downloads=Optional(False), fields=Optional('')):
813 789 """
814 790 Updates a repository with the given information.
815 791
816 792 This command can only be run using an |authtoken| with at least
817 write permissions to the |repo|.
793 admin permissions to the |repo|.
794
 795 * If the repository name contains "/", the repository will be updated
 796 and moved into a repository group or nested repository groups.
797
 798 For example repoid=repo-test repo_name="foo/bar/repo-test" will update
 799 the |repo| called "repo-test" and place it inside group "foo/bar".
 800 You must have write permissions on the deepest repository group
 801 ("bar" in this example).
818 802
819 803 :param apiuser: This is filled automatically from the |authtoken|.
820 804 :type apiuser: AuthUser
821 805 :param repoid: repository name or repository ID.
822 806 :type repoid: str or int
823 :param name: Update the |repo| name.
824 :type name: str
807 :param repo_name: Update the |repo| name, including the
808 repository group it's in.
809 :type repo_name: str
825 810 :param owner: Set the |repo| owner.
826 811 :type owner: str
827 :param group: Set the |repo| group the |repo| belongs to.
828 :type group: str
829 :param fork_of: Set the master |repo| name.
812 :param fork_of: Set the |repo| as fork of another |repo|.
830 813 :type fork_of: str
831 814 :param description: Update the |repo| description.
832 815 :type description: str
@@ -834,69 +817,115 b' def update_repo(request, apiuser, repoid'
834 817 :type private: bool
835 818 :param clone_uri: Update the |repo| clone URI.
836 819 :type clone_uri: str
837 :param landing_rev: Set the |repo| landing revision. Default is
838 ``tip``.
820 :param landing_rev: Set the |repo| landing revision. Default is ``rev:tip``.
839 821 :type landing_rev: str
840 :param enable_statistics: Enable statistics on the |repo|,
841 (True | False).
822 :param enable_statistics: Enable statistics on the |repo|, (True | False).
842 823 :type enable_statistics: bool
843 824 :param enable_locking: Enable |repo| locking.
844 825 :type enable_locking: bool
845 :param enable_downloads: Enable downloads from the |repo|,
846 (True | False).
826 :param enable_downloads: Enable downloads from the |repo|, (True | False).
847 827 :type enable_downloads: bool
848 828 :param fields: Add extra fields to the |repo|. Use the following
849 829 example format: ``field_key=field_val,field_key2=fieldval2``.
850 830 Escape ', ' with \,
851 831 :type fields: str
852 832 """
833
853 834 repo = get_repo_or_error(repoid)
835
854 836 include_secrets = False
855 if has_superadmin_permission(apiuser):
837 if not has_superadmin_permission(apiuser):
838 validate_repo_permissions(apiuser, repoid, repo, ('repository.admin',))
839 else:
856 840 include_secrets = True
857 else:
858 _perms = ('repository.admin',)
859 has_repo_permissions(apiuser, repoid, repo, _perms)
841
842 updates = dict(
843 repo_name=repo_name
844 if not isinstance(repo_name, Optional) else repo.repo_name,
845
846 fork_id=fork_of
847 if not isinstance(fork_of, Optional) else repo.fork.repo_name if repo.fork else None,
848
849 user=owner
850 if not isinstance(owner, Optional) else repo.user.username,
851
852 repo_description=description
853 if not isinstance(description, Optional) else repo.description,
854
855 repo_private=private
856 if not isinstance(private, Optional) else repo.private,
857
858 clone_uri=clone_uri
859 if not isinstance(clone_uri, Optional) else repo.clone_uri,
860
861 repo_landing_rev=landing_rev
862 if not isinstance(landing_rev, Optional) else repo._landing_revision,
863
864 repo_enable_statistics=enable_statistics
865 if not isinstance(enable_statistics, Optional) else repo.enable_statistics,
866
867 repo_enable_locking=enable_locking
868 if not isinstance(enable_locking, Optional) else repo.enable_locking,
869
870 repo_enable_downloads=enable_downloads
871 if not isinstance(enable_downloads, Optional) else repo.enable_downloads)
872
873 ref_choices, _labels = ScmModel().get_repo_landing_revs(repo=repo)
860 874
861 updates = {
862 # update function requires this.
863 'repo_name': repo.just_name
864 }
865 repo_group = group
866 if not isinstance(repo_group, Optional):
867 repo_group = get_repo_group_or_error(repo_group)
868 repo_group = repo_group.group_id
875 schema = repo_schema.RepoSchema().bind(
876 repo_type_options=rhodecode.BACKENDS.keys(),
877 repo_ref_options=ref_choices,
 879 # the calling user
879 user=apiuser,
880 old_values=repo.get_api_data())
881 try:
882 schema_data = schema.deserialize(dict(
883 # we save old value, users cannot change type
884 repo_type=repo.repo_type,
885
886 repo_name=updates['repo_name'],
887 repo_owner=updates['user'],
888 repo_description=updates['repo_description'],
889 repo_clone_uri=updates['clone_uri'],
890 repo_fork_of=updates['fork_id'],
891 repo_private=updates['repo_private'],
892 repo_landing_commit_ref=updates['repo_landing_rev'],
893 repo_enable_statistics=updates['repo_enable_statistics'],
894 repo_enable_downloads=updates['repo_enable_downloads'],
895 repo_enable_locking=updates['repo_enable_locking']))
896 except validation_schema.Invalid as err:
897 raise JSONRPCValidationError(colander_exc=err)
869 898
870 repo_fork_of = fork_of
871 if not isinstance(repo_fork_of, Optional):
872 repo_fork_of = get_repo_or_error(repo_fork_of)
873 repo_fork_of = repo_fork_of.repo_id
899 # save validated data back into the updates dict
900 validated_updates = dict(
901 repo_name=schema_data['repo_group']['repo_name_without_group'],
902 repo_group=schema_data['repo_group']['repo_group_id'],
903
904 user=schema_data['repo_owner'],
905 repo_description=schema_data['repo_description'],
906 repo_private=schema_data['repo_private'],
907 clone_uri=schema_data['repo_clone_uri'],
908 repo_landing_rev=schema_data['repo_landing_commit_ref'],
909 repo_enable_statistics=schema_data['repo_enable_statistics'],
910 repo_enable_locking=schema_data['repo_enable_locking'],
911 repo_enable_downloads=schema_data['repo_enable_downloads'],
912 )
913
914 if schema_data['repo_fork_of']:
915 fork_repo = get_repo_or_error(schema_data['repo_fork_of'])
916 validated_updates['fork_id'] = fork_repo.repo_id
917
918 # extra fields
919 fields = parse_args(Optional.extract(fields), key_prefix='ex_')
920 if fields:
921 validated_updates.update(fields)
874 922
875 923 try:
876 store_update(updates, name, 'repo_name')
877 store_update(updates, repo_group, 'repo_group')
878 store_update(updates, repo_fork_of, 'fork_id')
879 store_update(updates, owner, 'user')
880 store_update(updates, description, 'repo_description')
881 store_update(updates, private, 'repo_private')
882 store_update(updates, clone_uri, 'clone_uri')
883 store_update(updates, landing_rev, 'repo_landing_rev')
884 store_update(updates, enable_statistics, 'repo_enable_statistics')
885 store_update(updates, enable_locking, 'repo_enable_locking')
886 store_update(updates, enable_downloads, 'repo_enable_downloads')
887
888 # extra fields
889 fields = parse_args(Optional.extract(fields), key_prefix='ex_')
890 if fields:
891 updates.update(fields)
892
893 RepoModel().update(repo, **updates)
924 RepoModel().update(repo, **validated_updates)
894 925 Session().commit()
895 926 return {
896 'msg': 'updated repo ID:%s %s' % (
897 repo.repo_id, repo.repo_name),
898 'repository': repo.get_api_data(
899 include_secrets=include_secrets)
927 'msg': 'updated repo ID:%s %s' % (repo.repo_id, repo.repo_name),
928 'repository': repo.get_api_data(include_secrets=include_secrets)
900 929 }
901 930 except Exception:
902 931 log.exception(
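The `fields` parameter documented in `update_repo` packs extra repository fields into one string, `field_key=field_val,field_key2=fieldval2`, with `\,` escaping literal commas, and the diff feeds it through `parse_args(..., key_prefix='ex_')`. A sketch of such a parser; the real `parse_args` helper lives in RhodeCode, so this is only an illustration of the format:

```python
import re


def parse_fields(value, key_prefix='ex_'):
    """Parse 'k=v,k2=v2' into {'ex_k': 'v', ...}, honouring '\\,' escapes."""
    if not value:
        return {}
    # Split on commas that are not preceded by a backslash.
    pairs = re.split(r'(?<!\\),', value)
    result = {}
    for pair in pairs:
        key, _, val = pair.partition('=')
        result[key_prefix + key.strip()] = val.replace('\\,', ',')
    return result


print(parse_fields(r'ticket=123,note=a\,b'))
# {'ex_ticket': '123', 'ex_note': 'a,b'}
```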
@@ -908,26 +937,33 b' def update_repo(request, apiuser, repoid'
908 937 @jsonrpc_method()
909 938 def fork_repo(request, apiuser, repoid, fork_name,
910 939 owner=Optional(OAttr('apiuser')),
911 description=Optional(''), copy_permissions=Optional(False),
912 private=Optional(False), landing_rev=Optional('rev:tip')):
940 description=Optional(''),
941 private=Optional(False),
942 clone_uri=Optional(None),
943 landing_rev=Optional('rev:tip'),
944 copy_permissions=Optional(False)):
913 945 """
914 946 Creates a fork of the specified |repo|.
915 947
916 * If using |RCE| with Celery this will immediately return a success
917 message, even though the fork will be created asynchronously.
948 * If the fork_name contains "/", fork will be created inside
949 a repository group or nested repository groups
918 950
919 This command can only be run using an |authtoken| with fork
920 permissions on the |repo|.
951 For example "foo/bar/fork-repo" will create fork called "fork-repo"
952 inside group "foo/bar". You have to have permissions to access and
953 write to the last repository group ("bar" in this example)
954
955 This command can only be run using an |authtoken| with minimum
956 read permissions of the forked repo, create fork permissions for an user.
921 957
922 958 :param apiuser: This is filled automatically from the |authtoken|.
923 959 :type apiuser: AuthUser
924 960 :param repoid: Set repository name or repository ID.
925 961 :type repoid: str or int
926 :param fork_name: Set the fork name.
 962 :param fork_name: Set the fork name, including its repository group membership.
927 963 :type fork_name: str
928 964 :param owner: Set the fork owner.
929 965 :type owner: str
930 :param description: Set the fork descripton.
966 :param description: Set the fork description.
931 967 :type description: str
932 968 :param copy_permissions: Copy permissions from parent |repo|. The
933 969 default is False.
@@ -965,71 +1001,63 b' def fork_repo(request, apiuser, repoid, '
965 1001 error: null
966 1002
967 1003 """
968 if not has_superadmin_permission(apiuser):
969 if not HasPermissionAnyApi('hg.fork.repository')(user=apiuser):
970 raise JSONRPCForbidden()
971 1004
972 1005 repo = get_repo_or_error(repoid)
973 1006 repo_name = repo.repo_name
974 1007
975 (fork_name_cleaned,
976 parent_group_name) = RepoGroupModel()._get_group_name_and_parent(
977 fork_name)
978
979 1008 if not has_superadmin_permission(apiuser):
980 1009 # check if we have at least read permission for
981 1010 # this repo that we fork !
982 1011 _perms = (
983 1012 'repository.admin', 'repository.write', 'repository.read')
984 has_repo_permissions(apiuser, repoid, repo, _perms)
1013 validate_repo_permissions(apiuser, repoid, repo, _perms)
985 1014
986 if not isinstance(owner, Optional):
987 # forbid setting owner for non super admins
988 raise JSONRPCError(
989 'Only RhodeCode admin can specify `owner` param'
990 )
991 # check if we have a create.repo permission if not maybe the parent
992 # group permission
993 if not HasPermissionAnyApi('hg.create.repository')(user=apiuser):
994 if parent_group_name:
995 repogroupid = parent_group_name
996 repo_group = get_repo_group_or_error(parent_group_name)
1015 # check if the regular user has at least fork permissions as well
1016 if not HasPermissionAnyApi('hg.fork.repository')(user=apiuser):
1017 raise JSONRPCForbidden()
1018
1019 # check if user can set owner parameter
1020 owner = validate_set_owner_permissions(apiuser, owner)
997 1021
998 _perms = ('group.admin',)
999 if not HasRepoGroupPermissionAnyApi(*_perms)(
1000 user=apiuser, group_name=repo_group.group_name):
1001 raise JSONRPCError(
1002 'repository group `%s` does not exist' % (
1003 repogroupid,))
1004 else:
1005 raise JSONRPCForbidden()
1022 description = Optional.extract(description)
1023 copy_permissions = Optional.extract(copy_permissions)
1024 clone_uri = Optional.extract(clone_uri)
1025 landing_commit_ref = Optional.extract(landing_rev)
1026 private = Optional.extract(private)
1006 1027
1007 _repo = RepoModel().get_by_repo_name(fork_name)
1008 if _repo:
1009 type_ = 'fork' if _repo.fork else 'repo'
1010 raise JSONRPCError("%s `%s` already exist" % (type_, fork_name))
1011
1012 if isinstance(owner, Optional):
1013 owner = apiuser.user_id
1014
1015 owner = get_user_or_error(owner)
1028 schema = repo_schema.RepoSchema().bind(
1029 repo_type_options=rhodecode.BACKENDS.keys(),
 1030 # the calling user
1031 user=apiuser)
1016 1032
1017 1033 try:
1018 # create structure of groups and return the last group
1019 repo_group = map_groups(fork_name)
1020 form_data = {
1021 'repo_name': fork_name_cleaned,
1022 'repo_name_full': fork_name,
1023 'repo_group': repo_group.group_id if repo_group else None,
1024 'repo_type': repo.repo_type,
1025 'description': Optional.extract(description),
1026 'private': Optional.extract(private),
1027 'copy_permissions': Optional.extract(copy_permissions),
1028 'landing_rev': Optional.extract(landing_rev),
1034 schema_data = schema.deserialize(dict(
1035 repo_name=fork_name,
1036 repo_type=repo.repo_type,
1037 repo_owner=owner.username,
1038 repo_description=description,
1039 repo_landing_commit_ref=landing_commit_ref,
1040 repo_clone_uri=clone_uri,
1041 repo_private=private,
1042 repo_copy_permissions=copy_permissions))
1043 except validation_schema.Invalid as err:
1044 raise JSONRPCValidationError(colander_exc=err)
1045
1046 try:
1047 data = {
1029 1048 'fork_parent_id': repo.repo_id,
1049
1050 'repo_name': schema_data['repo_group']['repo_name_without_group'],
1051 'repo_name_full': schema_data['repo_name'],
1052 'repo_group': schema_data['repo_group']['repo_group_id'],
1053 'repo_type': schema_data['repo_type'],
1054 'description': schema_data['repo_description'],
1055 'private': schema_data['repo_private'],
1056 'copy_permissions': schema_data['repo_copy_permissions'],
1057 'landing_rev': schema_data['repo_landing_commit_ref'],
1030 1058 }
1031 1059
1032 task = RepoModel().create_fork(form_data, cur_user=owner)
1060 task = RepoModel().create_fork(data, cur_user=owner)
1033 1061 # no commit, it's done in RepoModel, or async via celery
1034 1062 from celery.result import BaseAsyncResult
1035 1063 task_id = None
@@ -1037,16 +1065,18 b' def fork_repo(request, apiuser, repoid, '
1037 1065 task_id = task.task_id
1038 1066 return {
1039 1067 'msg': 'Created fork of `%s` as `%s`' % (
1040 repo.repo_name, fork_name),
1068 repo.repo_name, schema_data['repo_name']),
1041 1069 'success': True, # cannot return the repo data here since fork
1042 1070 # can be done async
1043 1071 'task': task_id
1044 1072 }
1045 1073 except Exception:
1046 log.exception("Exception occurred while trying to fork a repo")
1074 log.exception(
1075 u"Exception while trying to create fork %s",
1076 schema_data['repo_name'])
1047 1077 raise JSONRPCError(
1048 1078 'failed to fork repository `%s` as `%s`' % (
1049 repo_name, fork_name))
1079 repo_name, schema_data['repo_name']))
1050 1080
1051 1081
1052 1082 @jsonrpc_method()
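From a client's point of view, the reworked `fork_repo` (like the other methods in this diff) is still called through RhodeCode's flat JSON-RPC envelope of `id`, `auth_token`, `method` and `args`. The sketch below only builds such a payload; the token value is a placeholder and nothing is sent over the wire:

```python
import json


def build_rpc_payload(method, auth_token, **args):
    """Assemble the JSON body for one RhodeCode API call."""
    return json.dumps({
        'id': 1,
        'auth_token': auth_token,  # placeholder token
        'method': method,
        'args': args,
    }, sort_keys=True)


payload = build_rpc_payload(
    'fork_repo', 'SECRET_TOKEN',
    repoid='foo/bar/repo1',
    fork_name='foo/bar/fork-repo',  # "/" places the fork in a repo group
    copy_permissions=False)
print(json.loads(payload)['method'])  # fork_repo
```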
@@ -1082,7 +1112,7 b' def delete_repo(request, apiuser, repoid'
1082 1112 repo = get_repo_or_error(repoid)
1083 1113 if not has_superadmin_permission(apiuser):
1084 1114 _perms = ('repository.admin',)
1085 has_repo_permissions(apiuser, repoid, repo, _perms)
1115 validate_repo_permissions(apiuser, repoid, repo, _perms)
1086 1116
1087 1117 try:
1088 1118 handle_forks = Optional.extract(forks)
@@ -1157,7 +1187,7 b' def invalidate_cache(request, apiuser, r'
1157 1187 repo = get_repo_or_error(repoid)
1158 1188 if not has_superadmin_permission(apiuser):
1159 1189 _perms = ('repository.admin', 'repository.write',)
1160 has_repo_permissions(apiuser, repoid, repo, _perms)
1190 validate_repo_permissions(apiuser, repoid, repo, _perms)
1161 1191
1162 1192 delete = Optional.extract(delete_keys)
1163 1193 try:
@@ -1228,7 +1258,7 b' def lock(request, apiuser, repoid, locke'
1228 1258 id : <id_given_in_input>
1229 1259 result : null
1230 1260 error : {
1231 'Error occurred locking repository `<reponame>`
1261 'Error occurred locking repository `<reponame>`'
1232 1262 }
1233 1263 """
1234 1264
@@ -1236,7 +1266,7 b' def lock(request, apiuser, repoid, locke'
1236 1266 if not has_superadmin_permission(apiuser):
1237 1267 # check if we have at least write permission for this repo !
1238 1268 _perms = ('repository.admin', 'repository.write',)
1239 has_repo_permissions(apiuser, repoid, repo, _perms)
1269 validate_repo_permissions(apiuser, repoid, repo, _perms)
1240 1270
1241 1271 # make sure normal user does not pass someone else userid,
1242 1272 # he is not allowed to do that
@@ -1347,7 +1377,7 b' def comment_commit('
1347 1377 repo = get_repo_or_error(repoid)
1348 1378 if not has_superadmin_permission(apiuser):
1349 1379 _perms = ('repository.read', 'repository.write', 'repository.admin')
1350 has_repo_permissions(apiuser, repoid, repo, _perms)
1380 validate_repo_permissions(apiuser, repoid, repo, _perms)
1351 1381
1352 1382 if isinstance(userid, Optional):
1353 1383 userid = apiuser.user_id
@@ -1438,7 +1468,7 b' def grant_user_permission(request, apius'
1438 1468 perm = get_perm_or_error(perm)
1439 1469 if not has_superadmin_permission(apiuser):
1440 1470 _perms = ('repository.admin',)
1441 has_repo_permissions(apiuser, repoid, repo, _perms)
1471 validate_repo_permissions(apiuser, repoid, repo, _perms)
1442 1472
1443 1473 try:
1444 1474
@@ -1492,7 +1522,7 b' def revoke_user_permission(request, apiu'
1492 1522 user = get_user_or_error(userid)
1493 1523 if not has_superadmin_permission(apiuser):
1494 1524 _perms = ('repository.admin',)
1495 has_repo_permissions(apiuser, repoid, repo, _perms)
1525 validate_repo_permissions(apiuser, repoid, repo, _perms)
1496 1526
1497 1527 try:
1498 1528 RepoModel().revoke_user_permission(repo=repo, user=user)
@@ -1560,7 +1590,7 b' def grant_user_group_permission(request,'
1560 1590 perm = get_perm_or_error(perm)
1561 1591 if not has_superadmin_permission(apiuser):
1562 1592 _perms = ('repository.admin',)
1563 has_repo_permissions(apiuser, repoid, repo, _perms)
1593 validate_repo_permissions(apiuser, repoid, repo, _perms)
1564 1594
1565 1595 user_group = get_user_group_or_error(usergroupid)
1566 1596 if not has_superadmin_permission(apiuser):
@@ -1625,7 +1655,7 b' def revoke_user_group_permission(request'
1625 1655 repo = get_repo_or_error(repoid)
1626 1656 if not has_superadmin_permission(apiuser):
1627 1657 _perms = ('repository.admin',)
1628 has_repo_permissions(apiuser, repoid, repo, _perms)
1658 validate_repo_permissions(apiuser, repoid, repo, _perms)
1629 1659
1630 1660 user_group = get_user_group_or_error(usergroupid)
1631 1661 if not has_superadmin_permission(apiuser):
@@ -1701,7 +1731,7 b' def pull(request, apiuser, repoid):'
1701 1731 repo = get_repo_or_error(repoid)
1702 1732 if not has_superadmin_permission(apiuser):
1703 1733 _perms = ('repository.admin',)
1704 has_repo_permissions(apiuser, repoid, repo, _perms)
1734 validate_repo_permissions(apiuser, repoid, repo, _perms)
1705 1735
1706 1736 try:
1707 1737 ScmModel().pull_changes(repo.repo_name, apiuser.username)
@@ -1764,7 +1794,7 b' def strip(request, apiuser, repoid, revi'
1764 1794 repo = get_repo_or_error(repoid)
1765 1795 if not has_superadmin_permission(apiuser):
1766 1796 _perms = ('repository.admin',)
1767 has_repo_permissions(apiuser, repoid, repo, _perms)
1797 validate_repo_permissions(apiuser, repoid, repo, _perms)
1768 1798
1769 1799 try:
1770 1800 ScmModel().strip(repo, revision, branch)
@@ -21,19 +21,18 b''
21 21
22 22 import logging
23 23
24 import colander
25
26 from rhodecode.api import jsonrpc_method, JSONRPCError, JSONRPCForbidden
24 from rhodecode.api import JSONRPCValidationError
25 from rhodecode.api import jsonrpc_method, JSONRPCError
27 26 from rhodecode.api.utils import (
28 27 has_superadmin_permission, Optional, OAttr, get_user_or_error,
29 store_update, get_repo_group_or_error,
30 get_perm_or_error, get_user_group_or_error, get_origin)
28 get_repo_group_or_error, get_perm_or_error, get_user_group_or_error,
29 get_origin, validate_repo_group_permissions, validate_set_owner_permissions)
31 30 from rhodecode.lib.auth import (
32 HasPermissionAnyApi, HasRepoGroupPermissionAnyApi,
33 HasUserGroupPermissionAnyApi)
34 from rhodecode.model.db import Session, RepoGroup
31 HasRepoGroupPermissionAnyApi, HasUserGroupPermissionAnyApi)
32 from rhodecode.model.db import Session
35 33 from rhodecode.model.repo_group import RepoGroupModel
36 34 from rhodecode.model.scm import RepoGroupList
35 from rhodecode.model import validation_schema
37 36 from rhodecode.model.validation_schema.schemas import repo_group_schema
38 37
39 38
@@ -142,21 +141,24 b' def get_repo_groups(request, apiuser):'
142 141
143 142
144 143 @jsonrpc_method()
145 def create_repo_group(request, apiuser, group_name, description=Optional(''),
146 owner=Optional(OAttr('apiuser')),
147 copy_permissions=Optional(False)):
144 def create_repo_group(
145 request, apiuser, group_name,
146 owner=Optional(OAttr('apiuser')),
147 description=Optional(''),
148 copy_permissions=Optional(False)):
148 149 """
149 150 Creates a repository group.
150 151
151 * If the repository group name contains "/", all the required repository
152 groups will be created.
 152 * If the repository group name contains "/", the repository group will
 153 be created inside a repository group or nested repository groups.
153 154
154 For example "foo/bar/baz" will create |repo| groups "foo" and "bar"
155 (with "foo" as parent). It will also create the "baz" repository
156 with "bar" as |repo| group.
 155 For example "foo/bar/group1" will create a repository group called
 156 "group1" inside the group "foo/bar". You must have write permissions
 157 on the deepest repository group ("bar" in this example).
157 158
158 This command can only be run using an |authtoken| with admin
159 permissions.
159 This command can only be run using an |authtoken| with at least
160 permissions to create repository groups, or admin permissions to
161 parent repository groups.
160 162
161 163 :param apiuser: This is filled automatically from the |authtoken|.
162 164 :type apiuser: AuthUser
@@ -193,72 +195,64 b' def create_repo_group(request, apiuser, '
193 195
194 196 """
195 197
196 schema = repo_group_schema.RepoGroupSchema()
197 try:
198 data = schema.deserialize({
199 'group_name': group_name
200 })
201 except colander.Invalid as e:
202 raise JSONRPCError("Validation failed: %s" % (e.asdict(),))
203 group_name = data['group_name']
198 owner = validate_set_owner_permissions(apiuser, owner)
204 199
205 if isinstance(owner, Optional):
206 owner = apiuser.user_id
207
208 group_description = Optional.extract(description)
200 description = Optional.extract(description)
209 201 copy_permissions = Optional.extract(copy_permissions)
210 202
211 # get by full name with parents, check if it already exist
212 if RepoGroup.get_by_group_name(group_name):
213 raise JSONRPCError("repo group `%s` already exist" % (group_name,))
214
215 (group_name_cleaned,
216 parent_group_name) = RepoGroupModel()._get_group_name_and_parent(
217 group_name)
203 schema = repo_group_schema.RepoGroupSchema().bind(
204 # the calling user
205 user=apiuser)
218 206
219 parent_group = None
220 if parent_group_name:
221 parent_group = get_repo_group_or_error(parent_group_name)
207 try:
208 schema_data = schema.deserialize(dict(
209 repo_group_name=group_name,
210 repo_group_owner=owner.username,
211 repo_group_description=description,
212 repo_group_copy_permissions=copy_permissions,
213 ))
214 except validation_schema.Invalid as err:
215 raise JSONRPCValidationError(colander_exc=err)
222 216
223 if not HasPermissionAnyApi(
224 'hg.admin', 'hg.repogroup.create.true')(user=apiuser):
225 # check if we have admin permission for this parent repo group !
226 # users without admin or hg.repogroup.create can only create other
227 # groups in groups they own so this is a required, but can be empty
228 parent_group = getattr(parent_group, 'group_name', '')
229 _perms = ('group.admin',)
230 if not HasRepoGroupPermissionAnyApi(*_perms)(
231 user=apiuser, group_name=parent_group):
232 raise JSONRPCForbidden()
217 validated_group_name = schema_data['repo_group_name']
233 218
234 219 try:
235 220 repo_group = RepoGroupModel().create(
236 group_name=group_name,
237 group_description=group_description,
238 221 owner=owner,
239 copy_permissions=copy_permissions)
222 group_name=validated_group_name,
223 group_description=schema_data['repo_group_description'],
224 copy_permissions=schema_data['repo_group_copy_permissions'])
240 225 Session().commit()
241 226 return {
242 'msg': 'Created new repo group `%s`' % group_name,
227 'msg': 'Created new repo group `%s`' % validated_group_name,
243 228 'repo_group': repo_group.get_api_data()
244 229 }
245 230 except Exception:
246 231 log.exception("Exception occurred while trying to create repo group")
247 232 raise JSONRPCError(
248 'failed to create repo group `%s`' % (group_name,))
233 'failed to create repo group `%s`' % (validated_group_name,))
249 234
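The refactored ``create_repo_group`` endpoint above validates all arguments through the bound schema. A minimal JSON-RPC payload exercising the nested-group case might look like the sketch below; the auth token and group path are illustrative placeholders, not values from this changeset:

```python
# Hypothetical JSON-RPC payload for create_repo_group; "group1" is created
# inside the existing group "foo/bar", so the caller needs write access
# to the last parent group ("bar").
payload = {
    "id": 1,
    "auth_token": "<auth_token>",
    "method": "create_repo_group",
    "args": {
        "group_name": "foo/bar/group1",
        "description": "nested group",
        "copy_permissions": False,
    },
}
```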
250 235
251 236 @jsonrpc_method()
252 237 def update_repo_group(
253 238 request, apiuser, repogroupid, group_name=Optional(''),
254 239 description=Optional(''), owner=Optional(OAttr('apiuser')),
255 parent=Optional(None), enable_locking=Optional(False)):
240 enable_locking=Optional(False)):
256 241 """
257 242 Updates repository group with the details given.
258 243
259 244 This command can only be run using an |authtoken| with admin
260 245 permissions.
261 246
247 * If the group_name contains "/", the repository group will be updated
248 accordingly and moved under the matching (possibly nested) parent group
249
250 For example repogroupid=group-test group_name="foo/bar/group-test"
251 will update the repository group called "group-test" and place it
252 inside the group "foo/bar".
253 You must have write access to the last repository
254 group ("bar" in this example).
255
262 256 :param apiuser: This is filled automatically from the |authtoken|.
263 257 :type apiuser: AuthUser
264 258 :param repogroupid: Set the ID of repository group.
@@ -269,29 +263,55 b' def update_repo_group('
269 263 :type description: str
270 264 :param owner: Set the |repo| group owner.
271 265 :type owner: str
272 :param parent: Set the |repo| group parent.
273 :type parent: str or int
274 266 :param enable_locking: Enable |repo| locking. The default is false.
275 267 :type enable_locking: bool
276 268 """
277 269
278 270 repo_group = get_repo_group_or_error(repogroupid)
271
279 272 if not has_superadmin_permission(apiuser):
280 # check if we have admin permission for this repo group !
281 _perms = ('group.admin',)
282 if not HasRepoGroupPermissionAnyApi(*_perms)(
283 user=apiuser, group_name=repo_group.group_name):
284 raise JSONRPCError(
285 'repository group `%s` does not exist' % (repogroupid,))
273 validate_repo_group_permissions(
274 apiuser, repogroupid, repo_group, ('group.admin',))
275
276 updates = dict(
277 group_name=group_name
278 if not isinstance(group_name, Optional) else repo_group.group_name,
279
280 group_description=description
281 if not isinstance(description, Optional) else repo_group.group_description,
282
283 user=owner
284 if not isinstance(owner, Optional) else repo_group.user.username,
285
286 enable_locking=enable_locking
287 if not isinstance(enable_locking, Optional) else repo_group.enable_locking
288 )
286 289
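The ``isinstance(..., Optional)`` fallbacks used to build ``updates`` above keep the stored value whenever the API caller omitted a parameter. The pattern can be sketched in isolation; the ``Optional`` class below is a simplified stand-in, not RhodeCode's actual marker type:

```python
class Optional(object):
    """Simplified stand-in for RhodeCode's Optional default marker."""
    def __init__(self, default):
        self.default = default

def pick(value, current):
    # keep the stored value when the API caller did not supply one
    return current if isinstance(value, Optional) else value

# caller supplied a new name but left the description untouched
updates = dict(
    group_name=pick("new-name", "old-name"),
    group_description=pick(Optional(''), "old description"),
)
```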
287 updates = {}
290 schema = repo_group_schema.RepoGroupSchema().bind(
291 # the calling user
292 user=apiuser,
293 old_values=repo_group.get_api_data())
294
288 295 try:
289 store_update(updates, group_name, 'group_name')
290 store_update(updates, description, 'group_description')
291 store_update(updates, owner, 'user')
292 store_update(updates, parent, 'group_parent_id')
293 store_update(updates, enable_locking, 'enable_locking')
294 repo_group = RepoGroupModel().update(repo_group, updates)
296 schema_data = schema.deserialize(dict(
297 repo_group_name=updates['group_name'],
298 repo_group_owner=updates['user'],
299 repo_group_description=updates['group_description'],
300 repo_group_enable_locking=updates['enable_locking'],
301 ))
302 except validation_schema.Invalid as err:
303 raise JSONRPCValidationError(colander_exc=err)
304
305 validated_updates = dict(
306 group_name=schema_data['repo_group']['repo_group_name_without_group'],
307 group_parent_id=schema_data['repo_group']['repo_group_id'],
308 user=schema_data['repo_group_owner'],
309 group_description=schema_data['repo_group_description'],
310 enable_locking=schema_data['repo_group_enable_locking'],
311 )
312
313 try:
314 RepoGroupModel().update(repo_group, validated_updates)
295 315 Session().commit()
296 316 return {
297 317 'msg': 'updated repository group ID:%s %s' % (
@@ -299,7 +319,9 b' def update_repo_group('
299 319 'repo_group': repo_group.get_api_data()
300 320 }
301 321 except Exception:
302 log.exception("Exception occurred while trying update repo group")
322 log.exception(
323 u"Exception occurred while trying to update repo group %s",
324 repogroupid)
303 325 raise JSONRPCError('failed to update repository group `%s`'
304 326 % (repogroupid,))
305 327
@@ -321,7 +343,7 b' def delete_repo_group(request, apiuser, '
321 343
322 344 id : <id_given_in_input>
323 345 result : {
324 'msg': 'deleted repo group ID:<repogroupid> <repogroupname>
346 'msg': 'deleted repo group ID:<repogroupid> <repogroupname>'
325 347 'repo_group': null
326 348 }
327 349 error : null
@@ -340,12 +362,9 b' def delete_repo_group(request, apiuser, '
340 362
341 363 repo_group = get_repo_group_or_error(repogroupid)
342 364 if not has_superadmin_permission(apiuser):
343 # check if we have admin permission for this repo group !
344 _perms = ('group.admin',)
345 if not HasRepoGroupPermissionAnyApi(*_perms)(
346 user=apiuser, group_name=repo_group.group_name):
347 raise JSONRPCError(
348 'repository group `%s` does not exist' % (repogroupid,))
365 validate_repo_group_permissions(
366 apiuser, repogroupid, repo_group, ('group.admin',))
367
349 368 try:
350 369 RepoGroupModel().delete(repo_group)
351 370 Session().commit()
@@ -408,12 +427,8 b' def grant_user_permission_to_repo_group('
408 427 repo_group = get_repo_group_or_error(repogroupid)
409 428
410 429 if not has_superadmin_permission(apiuser):
411 # check if we have admin permission for this repo group !
412 _perms = ('group.admin',)
413 if not HasRepoGroupPermissionAnyApi(*_perms)(
414 user=apiuser, group_name=repo_group.group_name):
415 raise JSONRPCError(
416 'repository group `%s` does not exist' % (repogroupid,))
430 validate_repo_group_permissions(
431 apiuser, repogroupid, repo_group, ('group.admin',))
417 432
418 433 user = get_user_or_error(userid)
419 434 perm = get_perm_or_error(perm, prefix='group.')
@@ -487,12 +502,8 b' def revoke_user_permission_from_repo_gro'
487 502 repo_group = get_repo_group_or_error(repogroupid)
488 503
489 504 if not has_superadmin_permission(apiuser):
490 # check if we have admin permission for this repo group !
491 _perms = ('group.admin',)
492 if not HasRepoGroupPermissionAnyApi(*_perms)(
493 user=apiuser, group_name=repo_group.group_name):
494 raise JSONRPCError(
495 'repository group `%s` does not exist' % (repogroupid,))
505 validate_repo_group_permissions(
506 apiuser, repogroupid, repo_group, ('group.admin',))
496 507
497 508 user = get_user_or_error(userid)
498 509 apply_to_children = Optional.extract(apply_to_children)
@@ -569,12 +580,8 b' def grant_user_group_permission_to_repo_'
569 580 perm = get_perm_or_error(perm, prefix='group.')
570 581 user_group = get_user_group_or_error(usergroupid)
571 582 if not has_superadmin_permission(apiuser):
572 # check if we have admin permission for this repo group !
573 _perms = ('group.admin',)
574 if not HasRepoGroupPermissionAnyApi(*_perms)(
575 user=apiuser, group_name=repo_group.group_name):
576 raise JSONRPCError(
577 'repository group `%s` does not exist' % (repogroupid,))
583 validate_repo_group_permissions(
584 apiuser, repogroupid, repo_group, ('group.admin',))
578 585
579 586 # check if we have at least read permission for this user group !
580 587 _perms = ('usergroup.read', 'usergroup.write', 'usergroup.admin',)
@@ -656,12 +663,8 b' def revoke_user_group_permission_from_re'
656 663 repo_group = get_repo_group_or_error(repogroupid)
657 664 user_group = get_user_group_or_error(usergroupid)
658 665 if not has_superadmin_permission(apiuser):
659 # check if we have admin permission for this repo group !
660 _perms = ('group.admin',)
661 if not HasRepoGroupPermissionAnyApi(*_perms)(
662 user=apiuser, group_name=repo_group.group_name):
663 raise JSONRPCError(
664 'repository group `%s` does not exist' % (repogroupid,))
666 validate_repo_group_permissions(
667 apiuser, repogroupid, repo_group, ('group.admin',))
665 668
666 669 # check if we have at least read permission for this user group !
667 670 _perms = ('usergroup.read', 'usergroup.write', 'usergroup.admin',)
@@ -60,7 +60,13 b' def get_server_info(request, apiuser):'
60 60 if not has_superadmin_permission(apiuser):
61 61 raise JSONRPCForbidden()
62 62
63 return ScmModel().get_server_info(request.environ)
63 server_info = ScmModel().get_server_info(request.environ)
64 # rhodecode-index requires these keys at the top level
65
66 server_info['index_storage'] = server_info['search']['value']['location']
67 server_info['storage'] = server_info['storage']['value']['path']
68
69 return server_info
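The change above flattens two nested values out of the server info dict so rhodecode-index can read them as top-level keys. With an assumed (simplified) shape for what ``get_server_info`` returns, the flattening reduces to:

```python
# assumed, simplified shape of the server info dict; the real one
# carries many more keys
server_info = {
    "search": {"value": {"location": "/var/opt/rc/index"}},
    "storage": {"value": {"path": "/var/opt/rc/repos"}},
}

# rhodecode-index expects these two flat keys
server_info["index_storage"] = server_info["search"]["value"]["location"]
server_info["storage"] = server_info["storage"]["value"]["path"]
```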
64 70
65 71
66 72 @jsonrpc_method()
@@ -25,7 +25,7 b' from rhodecode.api.utils import ('
25 25 Optional, OAttr, has_superadmin_permission, get_user_or_error, store_update)
26 26 from rhodecode.lib.auth import AuthUser, PasswordGenerator
27 27 from rhodecode.lib.exceptions import DefaultUserException
28 from rhodecode.lib.utils2 import safe_int
28 from rhodecode.lib.utils2 import safe_int, str2bool
29 29 from rhodecode.model.db import Session, User, Repository
30 30 from rhodecode.model.user import UserModel
31 31
@@ -81,6 +81,7 b' def get_user(request, apiuser, userid=Op'
81 81 "usergroup.read",
82 82 "hg.repogroup.create.false",
83 83 "hg.create.none",
84 "hg.password_reset.enabled",
84 85 "hg.extern_activate.manual",
85 86 "hg.create.write_on_repogroup.false",
86 87 "hg.usergroup.create.false",
@@ -154,7 +155,8 b' def create_user(request, apiuser, userna'
154 155 active=Optional(True), admin=Optional(False),
155 156 extern_name=Optional('rhodecode'),
156 157 extern_type=Optional('rhodecode'),
157 force_password_change=Optional(False)):
158 force_password_change=Optional(False),
159 create_personal_repo_group=Optional(None)):
158 160 """
159 161 Creates a new user and returns the new user object.
160 162
@@ -187,7 +189,8 b' def create_user(request, apiuser, userna'
187 189 :param force_password_change: Force the new user to change password
188 190 on next login.
189 191 :type force_password_change: Optional(``True`` | ``False``)
190
192 :param create_personal_repo_group: Create a personal repository group for this user.
193 :type create_personal_repo_group: Optional(``True`` | ``False``)
191 194 Example output:
192 195
193 196 .. code-block:: bash
@@ -229,6 +232,9 b' def create_user(request, apiuser, userna'
229 232 Optional.extract(extern_name) != 'rhodecode'):
230 233 # generate temporary password if user is external
231 234 password = PasswordGenerator().gen_password(length=16)
235 create_repo_group = Optional.extract(create_personal_repo_group)
236 if isinstance(create_repo_group, basestring):
237 create_repo_group = str2bool(create_repo_group)
232 238
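The ``str2bool`` normalisation above lets API callers pass string values like ``"true"`` for ``create_personal_repo_group``. A minimal sketch of the helper is shown below; the real implementation lives in ``rhodecode.lib.utils2`` and may accept additional spellings:

```python
def str2bool(value):
    # minimal sketch of the helper used above; the real implementation
    # in rhodecode.lib.utils2 may accept additional spellings
    if isinstance(value, str):
        return value.strip().lower() in ('true', 'yes', 'on', 'y', 't', '1')
    return bool(value)

# string value as it would arrive from a form/JSON payload
create_repo_group = str2bool("true")
```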
233 239 try:
234 240 user = UserModel().create_or_update(
@@ -242,6 +248,7 b' def create_user(request, apiuser, userna'
242 248 extern_type=Optional.extract(extern_type),
243 249 extern_name=Optional.extract(extern_name),
244 250 force_password_change=Optional.extract(force_password_change),
251 create_repo_group=create_repo_group
245 252 )
246 253 Session().commit()
247 254 return {
@@ -226,6 +226,15 b' class RhodeCodeAuthPluginBase(object):'
226 226 """
227 227 raise NotImplementedError("Not implemented in base class")
228 228
229 def get_url_slug(self):
230 """
231 Returns a slug which should be used when constructing URLs which refer
232 to this plugin. By default it returns the plugin name. If the name is
233 not suitable for using it in an URL the plugin should override this
234 method.
235 """
236 return self.name
237
229 238 @property
230 239 def is_headers_auth(self):
231 240 """
@@ -45,7 +45,7 b' class AuthnPluginResourceBase(AuthnResou'
45 45
46 46 def __init__(self, plugin):
47 47 self.plugin = plugin
48 self.__name__ = plugin.name
48 self.__name__ = plugin.get_url_slug()
49 49 self.display_name = plugin.get_display_name()
50 50
51 51
@@ -84,7 +84,7 b' class AuthnPluginViewBase(object):'
84 84
85 85 try:
86 86 valid_data = schema.deserialize(data)
87 except colander.Invalid, e:
87 except colander.Invalid as e:
88 88 # Display error message and display form again.
89 89 self.request.session.flash(
90 90 _('Errors exist when saving plugin settings. '
@@ -31,9 +31,9 b' from pyramid.authorization import ACLAut'
31 31 from pyramid.config import Configurator
32 32 from pyramid.settings import asbool, aslist
33 33 from pyramid.wsgi import wsgiapp
34 from pyramid.httpexceptions import HTTPError, HTTPInternalServerError, HTTPFound
34 from pyramid.httpexceptions import (
35 HTTPError, HTTPInternalServerError, HTTPFound)
35 36 from pyramid.events import ApplicationCreated
36 import pyramid.httpexceptions as httpexceptions
37 37 from pyramid.renderers import render_to_response
38 38 from routes.middleware import RoutesMiddleware
39 39 import routes.util
@@ -44,10 +44,10 b' from rhodecode.config import patches'
44 44 from rhodecode.config.routing import STATIC_FILE_PREFIX
45 45 from rhodecode.config.environment import (
46 46 load_environment, load_pyramid_environment)
47 from rhodecode.lib.exceptions import VCSServerUnavailable
48 from rhodecode.lib.vcs.exceptions import VCSCommunicationError
49 47 from rhodecode.lib.middleware import csrf
50 48 from rhodecode.lib.middleware.appenlight import wrap_in_appenlight_if_enabled
49 from rhodecode.lib.middleware.error_handling import (
50 PylonsErrorHandlingMiddleware)
51 51 from rhodecode.lib.middleware.https_fixup import HttpsFixup
52 52 from rhodecode.lib.middleware.vcs import VCSMiddleware
53 53 from rhodecode.lib.plugins.utils import register_rhodecode_plugin
@@ -186,53 +186,27 b' def make_not_found_view(config):'
186 186 pylons_app, appenlight_client = wrap_in_appenlight_if_enabled(
187 187 pylons_app, settings)
188 188
189 # The VCSMiddleware shall operate like a fallback if pyramid doesn't find
190 # a view to handle the request. Therefore we wrap it around the pylons app.
189 # The pylons app is executed inside the pyramid 404 exception handler.
190 # Exceptions raised inside it are not handled by pyramid again.
191 # Therefore we add a middleware that invokes the error handler in
192 # case of an exception or error response, so that proper error HTML
193 # pages are returned to the client.
194 reraise = (settings.get('debugtoolbar.enabled', False) or
195 rhodecode.disable_error_handler)
196 pylons_app = PylonsErrorHandlingMiddleware(
197 pylons_app, error_handler, reraise)
198
199 # The VCSMiddleware shall operate like a fallback if pyramid doesn't find a
200 # view to handle the request. Therefore it is wrapped around the pylons
201 # app. It has to be outside the error handling, otherwise error responses
202 # from the vcsserver are converted to HTML error pages. This confuses the
203 # command line tools and the user won't get a meaningful error message.
191 204 if vcs_server_enabled:
192 205 pylons_app = VCSMiddleware(
193 206 pylons_app, settings, appenlight_client, registry=config.registry)
194 207
195 pylons_app_as_view = wsgiapp(pylons_app)
196
197 def pylons_app_with_error_handler(context, request):
198 """
199 Handle exceptions from rc pylons app:
200
201 - old webob type exceptions get converted to pyramid exceptions
202 - pyramid exceptions are passed to the error handler view
203 """
204 def is_vcs_response(response):
205 return 'X-RhodeCode-Backend' in response.headers
206
207 def is_http_error(response):
208 # webob type error responses
209 return (400 <= response.status_int <= 599)
210
211 def is_error_handling_needed(response):
212 return is_http_error(response) and not is_vcs_response(response)
213
214 try:
215 response = pylons_app_as_view(context, request)
216 if is_error_handling_needed(response):
217 response = webob_to_pyramid_http_response(response)
218 return error_handler(response, request)
219 except HTTPError as e: # pyramid type exceptions
220 return error_handler(e, request)
221 except Exception as e:
222 log.exception(e)
223
224 if (settings.get('debugtoolbar.enabled', False) or
225 rhodecode.disable_error_handler):
226 raise
227
228 if isinstance(e, VCSCommunicationError):
229 return error_handler(VCSServerUnavailable(), request)
230
231 return error_handler(HTTPInternalServerError(), request)
232
233 return response
234
235 return pylons_app_with_error_handler
208 # Convert WSGI app to pyramid view and return it.
209 return wsgiapp(pylons_app)
236 210
237 211
238 212 def add_pylons_compat_data(registry, global_config, settings):
@@ -243,16 +217,6 b' def add_pylons_compat_data(registry, glo'
243 217 registry._pylons_compat_settings = settings
244 218
245 219
246 def webob_to_pyramid_http_response(webob_response):
247 ResponseClass = httpexceptions.status_map[webob_response.status_int]
248 pyramid_response = ResponseClass(webob_response.status)
249 pyramid_response.status = webob_response.status
250 pyramid_response.headers.update(webob_response.headers)
251 if pyramid_response.headers['content-type'] == 'text/html':
252 pyramid_response.headers['content-type'] = 'text/html; charset=UTF-8'
253 return pyramid_response
254
255
256 220 def error_handler(exception, request):
257 221 from rhodecode.model.settings import SettingsModel
258 222 from rhodecode.lib.utils2 import AttributeDict
@@ -466,10 +430,11 b' def _sanitize_vcs_settings(settings):'
466 430 """
467 431 _string_setting(settings, 'vcs.svn.compatible_version', '')
468 432 _string_setting(settings, 'git_rev_filter', '--all')
469 _string_setting(settings, 'vcs.hooks.protocol', 'pyro4')
433 _string_setting(settings, 'vcs.hooks.protocol', 'http')
434 _string_setting(settings, 'vcs.scm_app_implementation', 'http')
470 435 _string_setting(settings, 'vcs.server', '')
471 436 _string_setting(settings, 'vcs.server.log_level', 'debug')
472 _string_setting(settings, 'vcs.server.protocol', 'pyro4')
437 _string_setting(settings, 'vcs.server.protocol', 'http')
473 438 _bool_setting(settings, 'startup.import_repos', 'false')
474 439 _bool_setting(settings, 'vcs.hooks.direct_calls', 'false')
475 440 _bool_setting(settings, 'vcs.server.enable', 'true')
@@ -477,6 +442,13 b' def _sanitize_vcs_settings(settings):'
477 442 _list_setting(settings, 'vcs.backends', 'hg, git, svn')
478 443 _int_setting(settings, 'vcs.connection_timeout', 3600)
479 444
445 # Support legacy values of vcs.scm_app_implementation. Legacy
446 # configurations may use 'rhodecode.lib.middleware.utils.scm_app_http'
447 # which is now mapped to 'http'.
448 scm_app_impl = settings['vcs.scm_app_implementation']
449 if scm_app_impl == 'rhodecode.lib.middleware.utils.scm_app_http':
450 settings['vcs.scm_app_implementation'] = 'http'
451
480 452
481 453 def _int_setting(settings, name, default):
482 454 settings[name] = int(settings.get(name, default))
@@ -501,5 +473,8 b' def _list_setting(settings, name, defaul'
501 473 settings[name] = aslist(raw_value)
502 474
503 475
504 def _string_setting(settings, name, default):
505 settings[name] = settings.get(name, default).lower()
476 def _string_setting(settings, name, default, lower=True):
477 value = settings.get(name, default)
478 if lower:
479 value = value.lower()
480 settings[name] = value
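The new ``lower`` flag above makes it possible to keep case-preserving string settings while still lowercasing protocol-style values by default. A self-contained version of the sanitiser behaves like this:

```python
def _string_setting(settings, name, default, lower=True):
    # read the raw value, optionally normalise case, and write it back
    value = settings.get(name, default)
    if lower:
        value = value.lower()
    settings[name] = value

settings = {'vcs.server.protocol': 'HTTP'}
_string_setting(settings, 'vcs.server.protocol', 'http')
# version strings should keep their original case
_string_setting(settings, 'vcs.svn.compatible_version', 'Pre-1.9', lower=False)
```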
@@ -196,7 +196,7 b' def make_map(config):'
196 196 rmap.connect('user_autocomplete_data', '/_users', controller='home',
197 197 action='user_autocomplete_data', jsroute=True)
198 198 rmap.connect('user_group_autocomplete_data', '/_user_groups', controller='home',
199 action='user_group_autocomplete_data')
199 action='user_group_autocomplete_data', jsroute=True)
200 200
201 201 rmap.connect(
202 202 'user_profile', '/_profiles/{username}', controller='users',
@@ -305,7 +305,7 b' def make_map(config):'
305 305 m.connect('delete_user', '/users/{user_id}',
306 306 action='delete', conditions={'method': ['DELETE']})
307 307 m.connect('edit_user', '/users/{user_id}/edit',
308 action='edit', conditions={'method': ['GET']})
308 action='edit', conditions={'method': ['GET']}, jsroute=True)
309 309 m.connect('user', '/users/{user_id}',
310 310 action='show', conditions={'method': ['GET']})
311 311 m.connect('force_password_reset_user', '/users/{user_id}/password_reset',
@@ -389,7 +389,7 b' def make_map(config):'
389 389
390 390 m.connect('edit_user_group_members',
391 391 '/user_groups/{user_group_id}/edit/members', jsroute=True,
392 action='edit_members', conditions={'method': ['GET']})
392 action='user_group_members', conditions={'method': ['GET']})
393 393
394 394 # ADMIN PERMISSIONS ROUTES
395 395 with rmap.submapper(path_prefix=ADMIN_PREFIX,
@@ -699,6 +699,9 b' def make_map(config):'
699 699 rmap.connect('repo_refs_changelog_data', '/{repo_name}/refs-data-changelog',
700 700 controller='summary', action='repo_refs_changelog_data',
701 701 requirements=URL_NAME_REQUIREMENTS, jsroute=True)
702 rmap.connect('repo_default_reviewers_data', '/{repo_name}/default-reviewers',
703 controller='summary', action='repo_default_reviewers_data',
704 jsroute=True, requirements=URL_NAME_REQUIREMENTS)
702 705
703 706 rmap.connect('changeset_home', '/{repo_name}/changeset/{revision}',
704 707 controller='changeset', revision='tip', jsroute=True,
@@ -824,6 +827,10 b' def make_map(config):'
824 827 controller='admin/repos', action='repo_delete_svn_pattern',
825 828 conditions={'method': ['DELETE'], 'function': check_repo},
826 829 requirements=URL_NAME_REQUIREMENTS)
830 rmap.connect('repo_pullrequest_settings', '/{repo_name}/settings/pullrequest',
831 controller='admin/repos', action='repo_settings_pullrequest',
832 conditions={'method': ['GET', 'POST'], 'function': check_repo},
833 requirements=URL_NAME_REQUIREMENTS)
827 834
828 835 # still working url for backward compat.
829 836 rmap.connect('raw_changeset_home_depraced',
@@ -28,10 +28,9 b' import logging'
28 28
29 29 import formencode
30 30 import peppercorn
31 from formencode import htmlfill
32 31
33 32 from pylons import request, response, tmpl_context as c, url
34 from pylons.controllers.util import abort, redirect
33 from pylons.controllers.util import redirect
35 34 from pylons.i18n.translation import _
36 35 from webob.exc import HTTPNotFound, HTTPForbidden
37 36 from sqlalchemy.sql.expression import or_
@@ -45,7 +44,7 b' from rhodecode.lib import helpers as h'
45 44 from rhodecode.lib.base import BaseController, render
46 45 from rhodecode.lib.auth import LoginRequired, NotAnonymous
47 46 from rhodecode.lib.utils import jsonify
48 from rhodecode.lib.utils2 import safe_str, safe_int, time_to_datetime
47 from rhodecode.lib.utils2 import time_to_datetime
49 48 from rhodecode.lib.ext_json import json
50 49 from rhodecode.lib.vcs.exceptions import VCSError, NodeNotChangedError
51 50 from rhodecode.model import validation_schema
@@ -39,19 +39,20 b' from rhodecode.lib.auth import ('
39 39 LoginRequired, NotAnonymous, AuthUser, generate_auth_token)
40 40 from rhodecode.lib.base import BaseController, render
41 41 from rhodecode.lib.utils import jsonify
42 from rhodecode.lib.utils2 import safe_int, md5
42 from rhodecode.lib.utils2 import safe_int, md5, str2bool
43 43 from rhodecode.lib.ext_json import json
44 44
45 45 from rhodecode.model.validation_schema.schemas import user_schema
46 46 from rhodecode.model.db import (
47 Repository, PullRequest, PullRequestReviewers, UserEmailMap, User,
48 UserFollowing)
47 Repository, PullRequest, UserEmailMap, User, UserFollowing)
49 48 from rhodecode.model.forms import UserForm
50 49 from rhodecode.model.scm import RepoList
51 50 from rhodecode.model.user import UserModel
52 51 from rhodecode.model.repo import RepoModel
53 52 from rhodecode.model.auth_token import AuthTokenModel
54 53 from rhodecode.model.meta import Session
54 from rhodecode.model.pull_request import PullRequestModel
55 from rhodecode.model.comment import ChangesetCommentsModel
55 56
56 57 log = logging.getLogger(__name__)
57 58
@@ -289,25 +290,85 b' class MyAccountController(BaseController'
289 290 category='success')
290 291 return redirect(url('my_account_emails'))
291 292
293 def _extract_ordering(self, request):
294 column_index = safe_int(request.GET.get('order[0][column]'))
295 order_dir = request.GET.get('order[0][dir]', 'desc')
296 order_by = request.GET.get(
297 'columns[%s][data][sort]' % column_index, 'name_raw')
298 return order_by, order_dir
299
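The ``_extract_ordering`` helper above decodes DataTables-style server-side ordering parameters. With a plain dict standing in for ``request.GET`` (an assumption for this sketch), the logic is:

```python
def extract_ordering(params):
    # mirror of the helper above: read which column the grid sorts on,
    # the direction, and that column's sort key
    column_index = int(params.get('order[0][column]', 0))
    order_dir = params.get('order[0][dir]', 'desc')
    order_by = params.get(
        'columns[%s][data][sort]' % column_index, 'name_raw')
    return order_by, order_dir

order_by, order_dir = extract_ordering({
    'order[0][column]': '3',
    'order[0][dir]': 'asc',
    'columns[3][data][sort]': 'updated_on_raw',
})
```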
300 def _get_pull_requests_list(self, statuses):
301 start = safe_int(request.GET.get('start'), 0)
302 length = safe_int(request.GET.get('length'), c.visual.dashboard_items)
303 order_by, order_dir = self._extract_ordering(request)
304
305 pull_requests = PullRequestModel().get_im_participating_in(
306 user_id=c.rhodecode_user.user_id,
307 statuses=statuses,
308 offset=start, length=length, order_by=order_by,
309 order_dir=order_dir)
310
311 pull_requests_total_count = PullRequestModel().count_im_participating_in(
312 user_id=c.rhodecode_user.user_id, statuses=statuses)
313
314 from rhodecode.lib.utils import PartialRenderer
315 _render = PartialRenderer('data_table/_dt_elements.html')
316 data = []
317 for pr in pull_requests:
318 repo_id = pr.target_repo_id
319 comments = ChangesetCommentsModel().get_all_comments(
320 repo_id, pull_request=pr)
321 owned = pr.user_id == c.rhodecode_user.user_id
322 status = pr.calculated_review_status()
323
324 data.append({
325 'target_repo': _render('pullrequest_target_repo',
326 pr.target_repo.repo_name),
327 'name': _render('pullrequest_name',
328 pr.pull_request_id, pr.target_repo.repo_name,
329 short=True),
330 'name_raw': pr.pull_request_id,
331 'status': _render('pullrequest_status', status),
332 'title': _render(
333 'pullrequest_title', pr.title, pr.description),
334 'description': h.escape(pr.description),
335 'updated_on': _render('pullrequest_updated_on',
336 h.datetime_to_time(pr.updated_on)),
337 'updated_on_raw': h.datetime_to_time(pr.updated_on),
338 'created_on': _render('pullrequest_updated_on',
339 h.datetime_to_time(pr.created_on)),
340 'created_on_raw': h.datetime_to_time(pr.created_on),
341 'author': _render('pullrequest_author',
342 pr.author.full_contact, ),
343 'author_raw': pr.author.full_name,
344 'comments': _render('pullrequest_comments', len(comments)),
345 'comments_raw': len(comments),
346 'closed': pr.is_closed(),
347 'owned': owned
348 })
349 # json used to render the grid
350 data = ({
351 'data': data,
352 'recordsTotal': pull_requests_total_count,
353 'recordsFiltered': pull_requests_total_count,
354 })
355 return data
356
292 357 def my_account_pullrequests(self):
293 358 c.active = 'pullrequests'
294 359 self.__load_data()
295 c.show_closed = request.GET.get('pr_show_closed')
296
297 def _filter(pr):
298 s = sorted(pr, key=lambda o: o.created_on, reverse=True)
299 if not c.show_closed:
300 s = filter(lambda p: p.status != PullRequest.STATUS_CLOSED, s)
301 return s
360 c.show_closed = str2bool(request.GET.get('pr_show_closed'))
302 361
303 c.my_pull_requests = _filter(
304 PullRequest.query().filter(
305 PullRequest.user_id == c.rhodecode_user.user_id).all())
306 my_prs = [
307 x.pull_request for x in PullRequestReviewers.query().filter(
308 PullRequestReviewers.user_id == c.rhodecode_user.user_id).all()]
309 c.participate_in_pull_requests = _filter(my_prs)
310 return render('admin/my_account/my_account.html')
362 statuses = [PullRequest.STATUS_NEW, PullRequest.STATUS_OPEN]
363 if c.show_closed:
364 statuses += [PullRequest.STATUS_CLOSED]
365 data = self._get_pull_requests_list(statuses)
366 if not request.is_xhr:
367 c.data_participate = json.dumps(data['data'])
368 c.records_total_participate = data['recordsTotal']
369 return render('admin/my_account/my_account.html')
370 else:
371 return json.dumps(data)
311 372
312 373 def my_account_auth_tokens(self):
313 374 c.active = 'auth_tokens'
@@ -57,7 +57,7 b' class PermissionsController(BaseControll'
57 57 super(PermissionsController, self).__before__()
58 58
59 59 def __load_data(self):
60 PermissionModel().set_global_permission_choices(c, translator=_)
60 PermissionModel().set_global_permission_choices(c, gettext_translator=_)
61 61
62 62 @HasPermissionAllDecorator('hg.admin')
63 63 def permission_application(self):
@@ -92,6 +92,7 b' class PermissionsController(BaseControll'
92 92 self.__load_data()
93 93 _form = ApplicationPermissionsForm(
94 94 [x[0] for x in c.register_choices],
95 [x[0] for x in c.password_reset_choices],
95 96 [x[0] for x in c.extern_activate_choices])()
96 97
97 98 try:
@@ -160,6 +160,7 b' class ReposController(BaseRepoController'
160 160 self.__load_defaults()
161 161 form_result = {}
162 162 task_id = None
163 c.personal_repo_group = c.rhodecode_user.personal_repo_group
163 164 try:
164 165 # CanWriteToGroup validators checks permissions of this POST
165 166 form_result = RepoForm(repo_groups=c.repo_groups_choices,
@@ -173,8 +174,6 b' class ReposController(BaseRepoController'
173 174 if isinstance(task, BaseAsyncResult):
174 175 task_id = task.task_id
175 176 except formencode.Invalid as errors:
176 c.personal_repo_group = RepoGroup.get_by_group_name(
177 c.rhodecode_user.username)
178 177 return htmlfill.render(
179 178 render('admin/repos/repo_add.html'),
180 179 defaults=errors.value,
@@ -215,7 +214,7 b' class ReposController(BaseRepoController'
215 214 c.repo_groups = RepoGroup.groups_choices(groups=acl_groups)
216 215 c.repo_groups_choices = map(lambda k: unicode(k[0]), c.repo_groups)
217 216 choices, c.landing_revs = ScmModel().get_repo_landing_revs()
218 c.personal_repo_group = RepoGroup.get_by_group_name(c.rhodecode_user.username)
217 c.personal_repo_group = c.rhodecode_user.personal_repo_group
219 218 c.new_repo = repo_name_slug(new_repo)
220 219
221 220 ## apply the defaults from defaults page
@@ -299,9 +298,8 b' class ReposController(BaseRepoController'
299 298 repo_model = RepoModel()
300 299 changed_name = repo_name
301 300
301 c.personal_repo_group = c.rhodecode_user.personal_repo_group
302 302 # override the choices with extracted revisions !
303 c.personal_repo_group = RepoGroup.get_by_group_name(
304 c.rhodecode_user.username)
305 303 repo = Repository.get_by_repo_name(repo_name)
306 304 old_data = {
307 305 'repo_name': repo_name,
@@ -399,8 +397,7 b' class ReposController(BaseRepoController'
399 397
400 398 c.repo_fields = RepositoryField.query()\
401 399 .filter(RepositoryField.repository == c.repo_info).all()
402 c.personal_repo_group = RepoGroup.get_by_group_name(
403 c.rhodecode_user.username)
400 c.personal_repo_group = c.rhodecode_user.personal_repo_group
404 401 c.active = 'settings'
405 402 return htmlfill.render(
406 403 render('admin/repos/repo_edit.html'),
@@ -34,6 +34,7 b' import packaging.version'
34 34 from pylons import request, tmpl_context as c, url, config
35 35 from pylons.controllers.util import redirect
36 36 from pylons.i18n.translation import _, lazy_ugettext
37 from pyramid.threadlocal import get_current_registry
37 38 from webob.exc import HTTPBadRequest
38 39
39 40 import rhodecode
@@ -54,6 +55,7 b' from rhodecode.model.db import RhodeCode'
54 55 from rhodecode.model.forms import ApplicationSettingsForm, \
55 56 ApplicationUiSettingsForm, ApplicationVisualisationForm, \
56 57 LabsSettingsForm, IssueTrackerPatternsForm
58 from rhodecode.model.repo_group import RepoGroupModel
57 59
58 60 from rhodecode.model.scm import ScmModel
59 61 from rhodecode.model.notification import EmailNotificationModel
@@ -63,6 +65,7 b' from rhodecode.model.settings import ('
63 65 SettingsModel)
64 66
65 67 from rhodecode.model.supervisor import SupervisorModel, SUPERVISOR_MASTER
68 from rhodecode.svn_support.config_keys import generate_config
66 69
67 70
68 71 log = logging.getLogger(__name__)
@@ -134,6 +137,10 b' class SettingsController(BaseController)'
134 137 c.svn_branch_patterns = model.get_global_svn_branch_patterns()
135 138 c.svn_tag_patterns = model.get_global_svn_tag_patterns()
136 139
140 # TODO: Replace with request.registry after migrating to pyramid.
141 pyramid_settings = get_current_registry().settings
142 c.svn_proxy_generate_config = pyramid_settings[generate_config]
143
137 144 application_form = ApplicationUiSettingsForm()()
138 145
139 146 try:
@@ -186,6 +193,10 b' class SettingsController(BaseController)'
186 193 c.svn_branch_patterns = model.get_global_svn_branch_patterns()
187 194 c.svn_tag_patterns = model.get_global_svn_tag_patterns()
188 195
196 # TODO: Replace with request.registry after migrating to pyramid.
197 pyramid_settings = get_current_registry().settings
198 c.svn_proxy_generate_config = pyramid_settings[generate_config]
199
189 200 return htmlfill.render(
190 201 render('admin/settings/settings.html'),
191 202 defaults=self._form_defaults(),
@@ -235,6 +246,8 b' class SettingsController(BaseController)'
235 246 """POST /admin/settings/global: All items in the collection"""
236 247 # url('admin_settings_global')
237 248 c.active = 'global'
249 c.personal_repo_group_default_pattern = RepoGroupModel()\
250 .get_personal_group_name_pattern()
238 251 application_form = ApplicationSettingsForm()()
239 252 try:
240 253 form_result = application_form.to_python(dict(request.POST))
@@ -249,16 +262,18 b' class SettingsController(BaseController)'
249 262
250 263 try:
251 264 settings = [
252 ('title', 'rhodecode_title'),
253 ('realm', 'rhodecode_realm'),
254 ('pre_code', 'rhodecode_pre_code'),
255 ('post_code', 'rhodecode_post_code'),
256 ('captcha_public_key', 'rhodecode_captcha_public_key'),
257 ('captcha_private_key', 'rhodecode_captcha_private_key'),
265 ('title', 'rhodecode_title', 'unicode'),
266 ('realm', 'rhodecode_realm', 'unicode'),
267 ('pre_code', 'rhodecode_pre_code', 'unicode'),
268 ('post_code', 'rhodecode_post_code', 'unicode'),
269 ('captcha_public_key', 'rhodecode_captcha_public_key', 'unicode'),
270 ('captcha_private_key', 'rhodecode_captcha_private_key', 'unicode'),
271 ('create_personal_repo_group', 'rhodecode_create_personal_repo_group', 'bool'),
272 ('personal_repo_group_pattern', 'rhodecode_personal_repo_group_pattern', 'unicode'),
258 273 ]
259 for setting, form_key in settings:
274 for setting, form_key, type_ in settings:
260 275 sett = SettingsModel().create_or_update_setting(
261 setting, form_result[form_key])
276 setting, form_result[form_key], type_)
262 277 Session().add(sett)
263 278
264 279 Session().commit()
@@ -277,6 +292,8 b' class SettingsController(BaseController)'
277 292 """GET /admin/settings/global: All items in the collection"""
278 293 # url('admin_settings_global')
279 294 c.active = 'global'
295 c.personal_repo_group_default_pattern = RepoGroupModel()\
296 .get_personal_group_name_pattern()
280 297
281 298 return htmlfill.render(
282 299 render('admin/settings/settings.html'),
@@ -397,19 +414,20 b' class SettingsController(BaseController)'
397 414 settings_model = IssueTrackerSettingsModel()
398 415
399 416 form = IssueTrackerPatternsForm()().to_python(request.POST)
400 for uid in form['delete_patterns']:
401 settings_model.delete_entries(uid)
417 if form:
418 for uid in form.get('delete_patterns', []):
419 settings_model.delete_entries(uid)
402 420
403 for pattern in form['patterns']:
404 for setting, value, type_ in pattern:
405 sett = settings_model.create_or_update_setting(
406 setting, value, type_)
407 Session().add(sett)
421 for pattern in form.get('patterns', []):
422 for setting, value, type_ in pattern:
423 sett = settings_model.create_or_update_setting(
424 setting, value, type_)
425 Session().add(sett)
408 426
409 Session().commit()
427 Session().commit()
410 428
411 SettingsModel().invalidate_settings_cache()
412 h.flash(_('Updated issue tracker entries'), category='success')
429 SettingsModel().invalidate_settings_cache()
430 h.flash(_('Updated issue tracker entries'), category='success')
413 431 return redirect(url('admin_settings_issuetracker'))
414 432
415 433 @HasPermissionAllDecorator('hg.admin')
@@ -530,65 +548,93 b' class SettingsController(BaseController)'
530 548 """GET /admin/settings/system: All items in the collection"""
531 549 # url('admin_settings_system')
532 550 snapshot = str2bool(request.GET.get('snapshot'))
533 c.active = 'system'
551 defaults = self._form_defaults()
534 552
535 defaults = self._form_defaults()
536 c.rhodecode_ini = rhodecode.CONFIG
553 c.active = 'system'
537 554 c.rhodecode_update_url = defaults.get('rhodecode_update_url')
538 555 server_info = ScmModel().get_server_info(request.environ)
556
539 557 for key, val in server_info.iteritems():
540 558 setattr(c, key, val)
541 559
542 if c.disk['percent'] > 90:
543 h.flash(h.literal(_(
544 'Critical: your disk space is very low <b>%s%%</b> used' %
545 c.disk['percent'])), 'error')
546 elif c.disk['percent'] > 70:
547 h.flash(h.literal(_(
548 'Warning: your disk space is running low <b>%s%%</b> used' %
549 c.disk['percent'])), 'warning')
560 def val(name, subkey='human_value'):
561 return server_info[name][subkey]
562
563 def state(name):
564 return server_info[name]['state']
565
566 def val2(name):
567 val = server_info[name]['human_value']
568 state = server_info[name]['state']
569 return val, state
550 570
551 try:
552 c.uptime_age = h._age(
553 h.time_to_datetime(c.boot_time), False, show_suffix=False)
554 except TypeError:
555 c.uptime_age = c.boot_time
571 c.data_items = [
572 # update info
573 (_('Update info'), h.literal(
574 '<span class="link" id="check_for_update" >%s.</span>' % (
575 _('Check for updates')) +
576 '<br/> <span >%s.</span>' % (_('Note: please make sure this server can access `%s` for the update link to work') % c.rhodecode_update_url)
577 ), ''),
578
579 # RhodeCode specific
580 (_('RhodeCode Version'), val('rhodecode_app')['text'], state('rhodecode_app')),
581 (_('RhodeCode Server IP'), val('server')['server_ip'], state('server')),
582 (_('RhodeCode Server ID'), val('server')['server_id'], state('server')),
583 (_('RhodeCode Configuration'), val('rhodecode_config')['path'], state('rhodecode_config')),
584 ('', '', ''), # spacer
585
586 # Database
587 (_('Database'), val('database')['url'], state('database')),
588 (_('Database version'), val('database')['version'], state('database')),
589 ('', '', ''), # spacer
556 590
557 try:
558 c.system_memory = '%s/%s, %s%% (%s%%) used%s' % (
559 h.format_byte_size_binary(c.memory['used']),
560 h.format_byte_size_binary(c.memory['total']),
561 c.memory['percent2'],
562 c.memory['percent'],
563 ' %s' % c.memory['error'] if 'error' in c.memory else '')
564 except TypeError:
565 c.system_memory = 'NOT AVAILABLE'
591 # Platform/Python
592 (_('Platform'), val('platform')['name'], state('platform')),
593 (_('Platform UUID'), val('platform')['uuid'], state('platform')),
594 (_('Python version'), val('python')['version'], state('python')),
595 (_('Python path'), val('python')['executable'], state('python')),
596 ('', '', ''), # spacer
597
598 # Systems stats
599 (_('CPU'), val('cpu'), state('cpu')),
600 (_('Load'), val('load')['text'], state('load')),
601 (_('Memory'), val('memory')['text'], state('memory')),
602 (_('Uptime'), val('uptime')['text'], state('uptime')),
603 ('', '', ''), # spacer
604
605 # Repo storage
606 (_('Storage location'), val('storage')['path'], state('storage')),
607 (_('Storage info'), val('storage')['text'], state('storage')),
608 (_('Storage inodes'), val('storage_inodes')['text'], state('storage_inodes')),
566 609
567 rhodecode_ini_safe = rhodecode.CONFIG.copy()
568 blacklist = [
569 'rhodecode_license_key',
570 'routes.map',
571 'pylons.h',
572 'pylons.app_globals',
573 'pylons.environ_config',
574 'sqlalchemy.db1.url',
575 ('app_conf', 'sqlalchemy.db1.url')
610 (_('Gist storage location'), val('storage_gist')['path'], state('storage_gist')),
611 (_('Gist storage info'), val('storage_gist')['text'], state('storage_gist')),
612
613 (_('Archive cache storage location'), val('storage_archive')['path'], state('storage_archive')),
614 (_('Archive cache info'), val('storage_archive')['text'], state('storage_archive')),
615
616 (_('Temp storage location'), val('storage_temp')['path'], state('storage_temp')),
617 (_('Temp storage info'), val('storage_temp')['text'], state('storage_temp')),
618
619 (_('Search info'), val('search')['text'], state('search')),
620 (_('Search location'), val('search')['location'], state('search')),
621 ('', '', ''), # spacer
622
623 # VCS specific
624 (_('VCS Backends'), val('vcs_backends'), state('vcs_backends')),
625 (_('VCS Server'), val('vcs_server')['text'], state('vcs_server')),
626 (_('GIT'), val('git'), state('git')),
627 (_('HG'), val('hg'), state('hg')),
628 (_('SVN'), val('svn'), state('svn')),
629
576 630 ]
577 for k in blacklist:
578 if isinstance(k, tuple):
579 section, key = k
580 if section in rhodecode_ini_safe:
581 rhodecode_ini_safe[section].pop(key, None)
582 else:
583 rhodecode_ini_safe.pop(k, None)
584
585 c.rhodecode_ini_safe = rhodecode_ini_safe
586 631
587 632 # TODO: marcink, figure out how to allow only selected users to do this
588 c.allowed_to_snapshot = False
633 c.allowed_to_snapshot = c.rhodecode_user.admin
589 634
590 635 if snapshot:
591 636 if c.allowed_to_snapshot:
637 c.data_items.pop(0) # remove server info
592 638 return render('admin/settings/settings_system_snapshot.html')
593 639 else:
594 640 h.flash('You are not allowed to do this', category='warning')
@@ -25,6 +25,7 b' User Groups crud controller for pylons'
25 25 import logging
26 26 import formencode
27 27
28 import peppercorn
28 29 from formencode import htmlfill
29 30 from pylons import request, tmpl_context as c, url, config
30 31 from pylons.controllers.util import redirect
@@ -40,7 +41,7 b' from rhodecode.lib.utils import jsonify,'
40 41 from rhodecode.lib.utils2 import safe_unicode, str2bool, safe_int
41 42 from rhodecode.lib.auth import (
42 43 LoginRequired, NotAnonymous, HasUserGroupPermissionAnyDecorator,
43 HasPermissionAnyDecorator)
44 HasPermissionAnyDecorator, XHRRequired)
44 45 from rhodecode.lib.base import BaseController, render
45 46 from rhodecode.model.permission import PermissionModel
46 47 from rhodecode.model.scm import UserGroupList
@@ -64,18 +65,13 b' class UserGroupsController(BaseControlle'
64 65 def __before__(self):
65 66 super(UserGroupsController, self).__before__()
66 67 c.available_permissions = config['available_permissions']
67 PermissionModel().set_global_permission_choices(c, translator=_)
68 PermissionModel().set_global_permission_choices(c, gettext_translator=_)
68 69
69 70 def __load_data(self, user_group_id):
70 71 c.group_members_obj = [x.user for x in c.user_group.members]
71 72 c.group_members_obj.sort(key=lambda u: u.username.lower())
72
73 73 c.group_members = [(x.user_id, x.username) for x in c.group_members_obj]
74 74
75 c.available_members = [(x.user_id, x.username)
76 for x in User.query().all()]
77 c.available_members.sort(key=lambda u: u[1].lower())
78
79 75 def __load_defaults(self, user_group_id):
80 76 """
81 77 Load defaults settings for edit, and update
@@ -207,20 +203,21 b' class UserGroupsController(BaseControlle'
207 203 c.active = 'settings'
208 204 self.__load_data(user_group_id)
209 205
210 available_members = [safe_unicode(x[0]) for x in c.available_members]
211
212 206 users_group_form = UserGroupForm(
213 edit=True, old_data=c.user_group.get_dict(),
214 available_members=available_members, allow_disabled=True)()
207 edit=True, old_data=c.user_group.get_dict(), allow_disabled=True)()
215 208
216 209 try:
217 210 form_result = users_group_form.to_python(request.POST)
211 pstruct = peppercorn.parse(request.POST.items())
212 form_result['users_group_members'] = pstruct['user_group_members']
213
218 214 UserGroupModel().update(c.user_group, form_result)
219 gr = form_result['users_group_name']
215 updated_user_group = form_result['users_group_name']
220 216 action_logger(c.rhodecode_user,
221 'admin_updated_users_group:%s' % gr,
217 'admin_updated_users_group:%s' % updated_user_group,
222 218 None, self.ip_addr, self.sa)
223 h.flash(_('Updated user group %s') % gr, category='success')
219 h.flash(_('Updated user group %s') % updated_user_group,
220 category='success')
224 221 Session().commit()
225 222 except formencode.Invalid as errors:
226 223 defaults = errors.value
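The hunk above replaces the form-level member list with `peppercorn.parse(request.POST.items())`. As a rough sketch of what that parse does for the `user_group_members` field, here is a simplified, self-contained re-implementation of peppercorn's sequence grouping (illustrative only; the real controller uses the actual `peppercorn` package, and the field names here are hypothetical):

```python
# Simplified stand-in for peppercorn.parse: flat (name, value) form
# pairs between __start__/__end__ ':sequence' markers are collected
# into a list under the sequence's key.
def parse_sequences(items):
    result = {}
    current_key = None
    for name, value in items:
        if name == '__start__' and value.endswith(':sequence'):
            current_key = value.split(':', 1)[0]
            result[current_key] = []
        elif name == '__end__':
            current_key = None
        elif current_key is not None:
            # inside a sequence, the field name is ignored
            result[current_key].append(value)
        else:
            result[name] = value
    return result

items = [
    ('__start__', 'user_group_members:sequence'),
    ('member', 'alice'),
    ('member', 'bob'),
    ('__end__', 'user_group_members:sequence'),
]
pstruct = parse_sequences(items)
assert pstruct == {'user_group_members': ['alice', 'bob']}
```

This is why the new code can read `pstruct['user_group_members']` as a plain list after parsing the raw POST pairs.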
@@ -462,19 +459,29 b' class UserGroupsController(BaseControlle'
462 459 return render('admin/user_groups/user_group_edit.html')
463 460
464 461 @HasUserGroupPermissionAnyDecorator('usergroup.admin')
465 def edit_members(self, user_group_id):
462 @XHRRequired()
463 @jsonify
464 def user_group_members(self, user_group_id):
466 465 user_group_id = safe_int(user_group_id)
467 c.user_group = UserGroup.get_or_404(user_group_id)
468 c.active = 'members'
469 c.group_members_obj = sorted((x.user for x in c.user_group.members),
470 key=lambda u: u.username.lower())
466 user_group = UserGroup.get_or_404(user_group_id)
467 group_members_obj = sorted((x.user for x in user_group.members),
468 key=lambda u: u.username.lower())
471 469
472 group_members = [(x.user_id, x.username) for x in c.group_members_obj]
470 group_members = [
471 {
472 'id': user.user_id,
473 'first_name': user.name,
474 'last_name': user.lastname,
475 'username': user.username,
476 'icon_link': h.gravatar_url(user.email, 30),
477 'value_display': h.person(user.email),
478 'value': user.username,
479 'value_type': 'user',
480 'active': user.active,
481 }
482 for user in group_members_obj
483 ]
473 484
474 if request.is_xhr:
475 return jsonify(lambda *a, **k: {
476 'members': group_members
477 })
478
479 c.group_members = group_members
480 return render('admin/user_groups/user_group_edit.html')
485 return {
486 'members': group_members
487 }
@@ -45,12 +45,13 b' from rhodecode.model.db import ('
45 45 PullRequestReviewers, User, UserEmailMap, UserIpMap, RepoGroup)
46 46 from rhodecode.model.forms import (
47 47 UserForm, UserPermissionsForm, UserIndividualPermissionsForm)
48 from rhodecode.model.repo_group import RepoGroupModel
48 49 from rhodecode.model.user import UserModel
49 50 from rhodecode.model.meta import Session
50 51 from rhodecode.model.permission import PermissionModel
51 52 from rhodecode.lib.utils import action_logger
52 53 from rhodecode.lib.ext_json import json
53 from rhodecode.lib.utils2 import datetime_to_time, safe_int
54 from rhodecode.lib.utils2 import datetime_to_time, safe_int, AttributeDict
54 55
55 56 log = logging.getLogger(__name__)
56 57
@@ -73,7 +74,7 b' class UsersController(BaseController):'
73 74 ('ru', 'Russian (ru)'),
74 75 ('zh', 'Chinese (zh)'),
75 76 ]
76 PermissionModel().set_global_permission_choices(c, translator=_)
77 PermissionModel().set_global_permission_choices(c, gettext_translator=_)
77 78
78 79 @HasPermissionAllDecorator('hg.admin')
79 80 def index(self):
@@ -120,6 +121,16 b' class UsersController(BaseController):'
120 121 c.data = json.dumps(users_data)
121 122 return render('admin/users/users.html')
122 123
124 def _get_personal_repo_group_template_vars(self):
125 DummyUser = AttributeDict({
126 'username': '${username}',
127 'user_id': '${user_id}',
128 })
129 c.default_create_repo_group = RepoGroupModel() \
130 .get_default_create_personal_repo_group()
131 c.personal_repo_group_name = RepoGroupModel() \
132 .get_personal_group_name(DummyUser)
133
123 134 @HasPermissionAllDecorator('hg.admin')
124 135 @auth.CSRFRequired()
125 136 def create(self):
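The `_get_personal_repo_group_template_vars` helper above builds a `DummyUser` from `AttributeDict` so that `${username}`/`${user_id}` placeholders survive into the template. A minimal sketch of such an attribute-access dict (the real one lives in `rhodecode.lib.utils2`; this version is an assumption about its behavior, not the actual implementation):

```python
# Minimal AttributeDict: a dict whose keys are also readable as
# attributes, so dummy.username returns the '${username}' placeholder.
class AttributeDict(dict):
    def __getattr__(self, name):
        try:
            return self[name]
        except KeyError:
            raise AttributeError(name)

dummy = AttributeDict({
    'username': '${username}',
    'user_id': '${user_id}',
})
assert dummy.username == '${username}'
assert dummy.user_id == '${user_id}'
```

Because the placeholders pass through untouched, the personal-group name pattern can be rendered once server-side and substituted later on the client.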
@@ -143,6 +154,7 b' class UsersController(BaseController):'
143 154 % {'user_link': user_link}), category='success')
144 155 Session().commit()
145 156 except formencode.Invalid as errors:
157 self._get_personal_repo_group_template_vars()
146 158 return htmlfill.render(
147 159 render('admin/users/user_add.html'),
148 160 defaults=errors.value,
@@ -163,6 +175,7 b' class UsersController(BaseController):'
163 175 """GET /users/new: Form to create a new item"""
164 176 # url('new_user')
165 177 c.default_extern_type = auth_rhodecode.RhodeCodeAuthPlugin.name
178 self._get_personal_repo_group_template_vars()
166 179 return render('admin/users/user_add.html')
167 180
168 181 @HasPermissionAllDecorator('hg.admin')
@@ -339,22 +352,41 b' class UsersController(BaseController):'
339 352
340 353 user_id = safe_int(user_id)
341 354 c.user = User.get_or_404(user_id)
355 personal_repo_group = RepoGroup.get_user_personal_repo_group(
356 c.user.user_id)
357 if personal_repo_group:
358 return redirect(url('edit_user_advanced', user_id=user_id))
342 359
360 personal_repo_group_name = RepoGroupModel().get_personal_group_name(
361 c.user)
362 named_personal_group = RepoGroup.get_by_group_name(
363 personal_repo_group_name)
343 364 try:
344 desc = RepoGroupModel.PERSONAL_GROUP_DESC % {
345 'username': c.user.username}
346 if not RepoGroup.get_by_group_name(c.user.username):
347 RepoGroupModel().create(group_name=c.user.username,
348 group_description=desc,
349 owner=c.user.username)
350 365
351 msg = _('Created repository group `%s`' % (c.user.username,))
366 if named_personal_group and named_personal_group.user_id == c.user.user_id:
367 # migrate the same named group, and mark it as personal
368 named_personal_group.personal = True
369 Session().add(named_personal_group)
370 Session().commit()
371 msg = _('Linked repository group `%s` as personal') % (
372 personal_repo_group_name,)
352 373 h.flash(msg, category='success')
374 elif not named_personal_group:
375 RepoGroupModel().create_personal_repo_group(c.user)
376
377 msg = _('Created repository group `%s`') % (
378 personal_repo_group_name,)
379 h.flash(msg, category='success')
380 else:
381 msg = _('Repository group `%s` is already taken') % (
382 personal_repo_group_name,)
383 h.flash(msg, category='warning')
353 384 except Exception:
354 385 log.exception("Exception during repository group creation")
355 386 msg = _(
356 387 'An error occurred during repository group creation for user')
357 388 h.flash(msg, category='error')
389 Session().rollback()
358 390
359 391 return redirect(url('edit_user_advanced', user_id=user_id))
360 392
@@ -397,7 +429,9 b' class UsersController(BaseController):'
397 429
398 430 c.active = 'advanced'
399 431 c.perm_user = AuthUser(user_id=user_id, ip_addr=self.ip_addr)
400 c.personal_repo_group = RepoGroup.get_by_group_name(user.username)
432 c.personal_repo_group = c.perm_user.personal_repo_group
433 c.personal_repo_group_name = RepoGroupModel()\
434 .get_personal_group_name(user)
401 435 c.first_admin = User.get_first_super_admin()
402 436 defaults = user.get_dict()
403 437
@@ -32,7 +32,7 b' from pylons.i18n.translation import _'
32 32 from pylons.controllers.util import redirect
33 33
34 34 from rhodecode.lib import auth
35 from rhodecode.lib import diffs
35 from rhodecode.lib import diffs, codeblocks
36 36 from rhodecode.lib.auth import (
37 37 LoginRequired, HasRepoPermissionAnyDecorator, NotAnonymous)
38 38 from rhodecode.lib.base import BaseRepoController, render
@@ -43,7 +43,7 b' from rhodecode.lib.utils import action_l'
43 43 from rhodecode.lib.utils2 import safe_unicode
44 44 from rhodecode.lib.vcs.backends.base import EmptyCommit
45 45 from rhodecode.lib.vcs.exceptions import (
46 RepositoryError, CommitDoesNotExistError)
46 RepositoryError, CommitDoesNotExistError, NodeDoesNotExistError)
47 47 from rhodecode.model.db import ChangesetComment, ChangesetStatus
48 48 from rhodecode.model.changeset_status import ChangesetStatusModel
49 49 from rhodecode.model.comment import ChangesetCommentsModel
@@ -156,15 +156,24 b' class ChangesetController(BaseRepoContro'
156 156 c.ignorews_url = _ignorews_url
157 157 c.context_url = _context_url
158 158 c.fulldiff = fulldiff = request.GET.get('fulldiff')
159
160 # fetch global flags of ignore ws or context lines
161 context_lcl = get_line_ctx('', request.GET)
162 ign_whitespace_lcl = get_ignore_ws('', request.GET)
163
164 # diff_limit will cut off the whole diff if the limit is applied
165 # otherwise it will just hide the big files from the front-end
166 diff_limit = self.cut_off_limit_diff
167 file_limit = self.cut_off_limit_file
168
159 169 # get ranges of commit ids if preset
160 170 commit_range = commit_id_range.split('...')[:2]
161 enable_comments = True
171
162 172 try:
163 173 pre_load = ['affected_files', 'author', 'branch', 'date',
164 174 'message', 'parents']
165 175
166 176 if len(commit_range) == 2:
167 enable_comments = False
168 177 commits = c.rhodecode_repo.get_commits(
169 178 start_id=commit_range[0], end_id=commit_range[1],
170 179 pre_load=pre_load)
@@ -190,88 +199,78 b' class ChangesetController(BaseRepoContro'
190 199 c.lines_deleted = 0
191 200
192 201 c.commit_statuses = ChangesetStatus.STATUSES
193 c.comments = []
194 c.statuses = []
195 202 c.inline_comments = []
196 203 c.inline_cnt = 0
197 204 c.files = []
198 205
206 c.statuses = []
207 c.comments = []
208 if len(c.commit_ranges) == 1:
209 commit = c.commit_ranges[0]
210 c.comments = ChangesetCommentsModel().get_comments(
211 c.rhodecode_db_repo.repo_id,
212 revision=commit.raw_id)
213 c.statuses.append(ChangesetStatusModel().get_status(
214 c.rhodecode_db_repo.repo_id, commit.raw_id))
215 # comments from PR
216 statuses = ChangesetStatusModel().get_statuses(
217 c.rhodecode_db_repo.repo_id, commit.raw_id,
218 with_revisions=True)
219 prs = set(st.pull_request for st in statuses
220 if st.pull_request is not None)
221 # from associated statuses, check the pull requests, and
222 # show comments from them
223 for pr in prs:
224 c.comments.extend(pr.comments)
225
199 226 # Iterate over ranges (default commit view is always one commit)
200 227 for commit in c.commit_ranges:
201 if method == 'show':
202 c.statuses.extend([ChangesetStatusModel().get_status(
203 c.rhodecode_db_repo.repo_id, commit.raw_id)])
204
205 c.comments.extend(ChangesetCommentsModel().get_comments(
206 c.rhodecode_db_repo.repo_id,
207 revision=commit.raw_id))
208
209 # comments from PR
210 st = ChangesetStatusModel().get_statuses(
211 c.rhodecode_db_repo.repo_id, commit.raw_id,
212 with_revisions=True)
213
214 # from associated statuses, check the pull requests, and
215 # show comments from them
216
217 prs = set(x.pull_request for x in
218 filter(lambda x: x.pull_request is not None, st))
219 for pr in prs:
220 c.comments.extend(pr.comments)
221
222 inlines = ChangesetCommentsModel().get_inline_comments(
223 c.rhodecode_db_repo.repo_id, revision=commit.raw_id)
224 c.inline_comments.extend(inlines.iteritems())
225
226 228 c.changes[commit.raw_id] = []
227 229
228 230 commit2 = commit
229 231 commit1 = commit.parents[0] if commit.parents else EmptyCommit()
230 232
231 # fetch global flags of ignore ws or context lines
232 context_lcl = get_line_ctx('', request.GET)
233 ign_whitespace_lcl = get_ignore_ws('', request.GET)
234
235 233 _diff = c.rhodecode_repo.get_diff(
236 234 commit1, commit2,
237 235 ignore_whitespace=ign_whitespace_lcl, context=context_lcl)
236 diff_processor = diffs.DiffProcessor(
237 _diff, format='newdiff', diff_limit=diff_limit,
238 file_limit=file_limit, show_full_diff=fulldiff)
238 239
239 # diff_limit will cut off the whole diff if the limit is applied
240 # otherwise it will just hide the big files from the front-end
241 diff_limit = self.cut_off_limit_diff
242 file_limit = self.cut_off_limit_file
243
244 diff_processor = diffs.DiffProcessor(
245 _diff, format='gitdiff', diff_limit=diff_limit,
246 file_limit=file_limit, show_full_diff=fulldiff)
247 240 commit_changes = OrderedDict()
248 241 if method == 'show':
249 242 _parsed = diff_processor.prepare()
250 243 c.limited_diff = isinstance(_parsed, diffs.LimitedDiffContainer)
251 for f in _parsed:
252 c.files.append(f)
253 st = f['stats']
254 c.lines_added += st['added']
255 c.lines_deleted += st['deleted']
256 fid = h.FID(commit.raw_id, f['filename'])
257 diff = diff_processor.as_html(enable_comments=enable_comments,
258 parsed_lines=[f])
259 commit_changes[fid] = [
260 commit1.raw_id, commit2.raw_id,
261 f['operation'], f['filename'], diff, st, f]
244
245 _parsed = diff_processor.prepare()
246
247 def _node_getter(commit):
248 def get_node(fname):
249 try:
250 return commit.get_node(fname)
251 except NodeDoesNotExistError:
252 return None
253 return get_node
254
255 inline_comments = ChangesetCommentsModel().get_inline_comments(
256 c.rhodecode_db_repo.repo_id, revision=commit.raw_id)
257 c.inline_cnt += len(inline_comments)
258
259 diffset = codeblocks.DiffSet(
260 repo_name=c.repo_name,
261 source_node_getter=_node_getter(commit1),
262 target_node_getter=_node_getter(commit2),
263 comments=inline_comments
264 ).render_patchset(_parsed, commit1.raw_id, commit2.raw_id)
265 c.changes[commit.raw_id] = diffset
262 266 else:
263 267 # downloads/raw we only need RAW diff nothing else
264 268 diff = diff_processor.as_raw()
265 commit_changes[''] = [None, None, None, None, diff, None, None]
266 c.changes[commit.raw_id] = commit_changes
269 c.changes[commit.raw_id] = [None, None, None, None, diff, None, None]
267 270
268 271 # sort comments by how they were generated
269 272 c.comments = sorted(c.comments, key=lambda x: x.comment_id)
270 273
271 # count inline comments
272 for __, lines in c.inline_comments:
273 for comments in lines.values():
274 c.inline_cnt += len(comments)
275 274
276 275 if len(c.commit_ranges) == 1:
277 276 c.commit = c.commit_ranges[0]
@@ -31,13 +31,14 b' from pylons.i18n.translation import _'
31 31
32 32 from rhodecode.controllers.utils import parse_path_ref, get_commit_from_ref_name
33 33 from rhodecode.lib import helpers as h
34 from rhodecode.lib import diffs
34 from rhodecode.lib import diffs, codeblocks
35 35 from rhodecode.lib.auth import LoginRequired, HasRepoPermissionAnyDecorator
36 36 from rhodecode.lib.base import BaseRepoController, render
37 37 from rhodecode.lib.utils import safe_str
38 38 from rhodecode.lib.utils2 import safe_unicode, str2bool
39 39 from rhodecode.lib.vcs.exceptions import (
40 EmptyRepositoryError, RepositoryError, RepositoryRequirementError)
40 EmptyRepositoryError, RepositoryError, RepositoryRequirementError,
41 NodeDoesNotExistError)
41 42 from rhodecode.model.db import Repository, ChangesetStatus
42 43
43 44 log = logging.getLogger(__name__)
@@ -78,7 +79,7 b' class CompareController(BaseRepoControll'
78 79 def index(self, repo_name):
79 80 c.compare_home = True
80 81 c.commit_ranges = []
81 c.files = []
82 c.diffset = None
82 83 c.limited_diff = False
83 84 source_repo = c.rhodecode_db_repo.repo_name
84 85 target_repo = request.GET.get('target_repo', source_repo)
@@ -239,28 +240,24 b' class CompareController(BaseRepoControll'
239 240 commit1=source_commit, commit2=target_commit,
240 241 path1=source_path, path=target_path)
241 242 diff_processor = diffs.DiffProcessor(
242 txtdiff, format='gitdiff', diff_limit=diff_limit,
243 txtdiff, format='newdiff', diff_limit=diff_limit,
243 244 file_limit=file_limit, show_full_diff=c.fulldiff)
244 245 _parsed = diff_processor.prepare()
245 246
246 c.limited_diff = False
247 if isinstance(_parsed, diffs.LimitedDiffContainer):
248 c.limited_diff = True
247 def _node_getter(commit):
248 """ Returns a function that returns a node for a commit or None """
249 def get_node(fname):
250 try:
251 return commit.get_node(fname)
252 except NodeDoesNotExistError:
253 return None
254 return get_node
249 255
250 c.files = []
251 c.changes = {}
252 c.lines_added = 0
253 c.lines_deleted = 0
254 for f in _parsed:
255 st = f['stats']
256 if not st['binary']:
257 c.lines_added += st['added']
258 c.lines_deleted += st['deleted']
259 fid = h.FID('', f['filename'])
260 c.files.append([fid, f['operation'], f['filename'], f['stats'], f])
261 htmldiff = diff_processor.as_html(
262 enable_comments=False, parsed_lines=[f])
263 c.changes[fid] = [f['operation'], f['filename'], htmldiff, f]
256 c.diffset = codeblocks.DiffSet(
257 repo_name=source_repo.repo_name,
258 source_node_getter=_node_getter(source_commit),
259 target_node_getter=_node_getter(target_commit),
260 ).render_patchset(_parsed, source_ref, target_ref)
264 261
265 262 c.preview_mode = merge
266 263
@@ -36,6 +36,8 b' from webob.exc import HTTPNotFound, HTTP'
36 36 from rhodecode.controllers.utils import parse_path_ref
37 37 from rhodecode.lib import diffs, helpers as h, caches
38 38 from rhodecode.lib.compat import OrderedDict
39 from rhodecode.lib.codeblocks import (
40 filenode_as_lines_tokens, filenode_as_annotated_lines_tokens)
39 41 from rhodecode.lib.utils import jsonify, action_logger
40 42 from rhodecode.lib.utils2 import (
41 43 convert_line_endings, detect_mode, safe_str, str2bool)
@@ -221,9 +223,19 b' class FilesController(BaseRepoController'
221 223 c.file_author = True
222 224 c.file_tree = ''
223 225 if c.file.is_file():
224 c.renderer = (
225 c.renderer and h.renderer_from_filename(c.file.path))
226 c.file_source_page = 'true'
226 227 c.file_last_commit = c.file.last_commit
228 if c.file.size < self.cut_off_limit_file:
229 if c.annotate: # annotation has precedence over renderer
230 c.annotated_lines = filenode_as_annotated_lines_tokens(
231 c.file
232 )
233 else:
234 c.renderer = (
235 c.renderer and h.renderer_from_filename(c.file.path)
236 )
237 if not c.renderer:
238 c.lines = filenode_as_lines_tokens(c.file)
227 239
228 240 c.on_branch_head = self._is_valid_head(
229 241 commit_id, c.rhodecode_repo)
@@ -233,6 +245,7 b' class FilesController(BaseRepoController'
233 245 c.authors = [(h.email(author),
234 246 h.person(author, 'username_or_name_or_email'))]
235 247 else:
248 c.file_source_page = 'false'
236 249 c.authors = []
237 250 c.file_tree = self._get_tree_at_commit(
238 251 repo_name, c.commit.raw_id, f_path)
@@ -60,8 +60,7 b' class ForksController(BaseRepoController'
60 60 c.repo_groups_choices = map(lambda k: unicode(k[0]), c.repo_groups)
61 61 choices, c.landing_revs = ScmModel().get_repo_landing_revs()
62 62 c.landing_revs_choices = choices
63 c.personal_repo_group = RepoGroup.get_by_group_name(
64 c.rhodecode_user.username)
63 c.personal_repo_group = c.rhodecode_user.personal_repo_group
65 64
66 65 def __load_data(self, repo_name=None):
67 66 """
@@ -286,4 +286,3 b' class HomeController(BaseController):'
286 286 _user_groups = _user_groups
287 287
288 288 return {'suggestions': _user_groups}
289
@@ -22,6 +22,7 b''
22 22 pull requests controller for rhodecode for initializing pull requests
23 23 """
24 24
25 import peppercorn
25 26 import formencode
26 27 import logging
27 28
@@ -29,22 +30,26 b' from webob.exc import HTTPNotFound, HTTP'
29 30 from pylons import request, tmpl_context as c, url
30 31 from pylons.controllers.util import redirect
31 32 from pylons.i18n.translation import _
33 from pyramid.threadlocal import get_current_registry
32 34 from sqlalchemy.sql import func
33 35 from sqlalchemy.sql.expression import or_
34 36
35 37 from rhodecode import events
36 from rhodecode.lib import auth, diffs, helpers as h
38 from rhodecode.lib import auth, diffs, helpers as h, codeblocks
37 39 from rhodecode.lib.ext_json import json
38 40 from rhodecode.lib.base import (
39 41 BaseRepoController, render, vcs_operation_context)
40 42 from rhodecode.lib.auth import (
41 43 LoginRequired, HasRepoPermissionAnyDecorator, NotAnonymous,
42 44 HasAcceptedRepoType, XHRRequired)
45 from rhodecode.lib.channelstream import channelstream_request
46 from rhodecode.lib.compat import OrderedDict
43 47 from rhodecode.lib.utils import jsonify
44 48 from rhodecode.lib.utils2 import safe_int, safe_str, str2bool, safe_unicode
45 from rhodecode.lib.vcs.backends.base import EmptyCommit
49 from rhodecode.lib.vcs.backends.base import EmptyCommit, UpdateFailureReason
46 50 from rhodecode.lib.vcs.exceptions import (
47 EmptyRepositoryError, CommitDoesNotExistError, RepositoryRequirementError)
51 EmptyRepositoryError, CommitDoesNotExistError, RepositoryRequirementError,
52 NodeDoesNotExistError)
48 53 from rhodecode.lib.diffs import LimitedDiffContainer
49 54 from rhodecode.model.changeset_status import ChangesetStatusModel
50 55 from rhodecode.model.comment import ChangesetCommentsModel
@@ -61,7 +66,7 b' class PullrequestsController(BaseRepoCon'
61 66 def __before__(self):
62 67 super(PullrequestsController, self).__before__()
63 68
64 def _load_compare_data(self, pull_request, enable_comments=True):
69 def _load_compare_data(self, pull_request, inline_comments,
70 enable_comments=True):
65 70 """
66 71 Load context data needed for generating compare diff
67 72
@@ -114,6 +119,7 b' class PullrequestsController(BaseRepoCon'
114 119 except RepositoryRequirementError:
115 120 c.missing_requirements = True
116 121
122 c.changes = {}
117 123 c.missing_commits = False
118 124 if (c.missing_requirements or
119 125 isinstance(source_commit, EmptyCommit) or
@@ -123,11 +129,31 b' class PullrequestsController(BaseRepoCon'
123 129 else:
124 130 vcs_diff = PullRequestModel().get_diff(pull_request)
125 131 diff_processor = diffs.DiffProcessor(
126 vcs_diff, format='gitdiff', diff_limit=diff_limit,
132 vcs_diff, format='newdiff', diff_limit=diff_limit,
127 133 file_limit=file_limit, show_full_diff=c.fulldiff)
128 134 _parsed = diff_processor.prepare()
129 135
130 c.limited_diff = isinstance(_parsed, LimitedDiffContainer)
136 commit_changes = OrderedDict()
137 c.limited_diff = isinstance(_parsed, diffs.LimitedDiffContainer)
138
142 def _node_getter(commit):
143 def get_node(fname):
144 try:
145 return commit.get_node(fname)
146 except NodeDoesNotExistError:
147 return None
148 return get_node
149
150 c.diffset = codeblocks.DiffSet(
151 repo_name=c.repo_name,
152 source_node_getter=_node_getter(target_commit),
153 target_node_getter=_node_getter(source_commit),
154 comments=inline_comments
155 ).render_patchset(_parsed, target_commit.raw_id, source_commit.raw_id)
156
131 157
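The `_node_getter` helper above returns a closure that converts a missing-node exception into a `None` result. A standalone sketch of the same pattern, using a plain dict and `KeyError` as stand-ins for `commit.get_node` and `NodeDoesNotExistError`:

```python
# Generic "getter factory" closure: a lookup that raises is turned into
# a None result instead of propagating the exception.
def node_getter(tree):
    def get_node(fname):
        try:
            return tree[fname]
        except KeyError:
            return None
    return get_node

get_node = node_getter({'setup.py': 'file contents'})
found = get_node('setup.py')        # 'file contents'
missing = get_node('no-such-file')  # None, no exception raised
```

Binding the commit into the closure lets `DiffSet` fetch file nodes lazily, per filename, without carrying the commit object around.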
132 158 c.files = []
133 159 c.changes = {}
@@ -136,17 +162,17 b' class PullrequestsController(BaseRepoCon'
136 162 c.included_files = []
137 163 c.deleted_files = []
138 164
139 for f in _parsed:
140 st = f['stats']
141 c.lines_added += st['added']
142 c.lines_deleted += st['deleted']
165 # for f in _parsed:
166 # st = f['stats']
167 # c.lines_added += st['added']
168 # c.lines_deleted += st['deleted']
143 169
144 fid = h.FID('', f['filename'])
145 c.files.append([fid, f['operation'], f['filename'], f['stats']])
146 c.included_files.append(f['filename'])
147 html_diff = diff_processor.as_html(enable_comments=enable_comments,
148 parsed_lines=[f])
149 c.changes[fid] = [f['operation'], f['filename'], html_diff, f]
170 # fid = h.FID('', f['filename'])
171 # c.files.append([fid, f['operation'], f['filename'], f['stats']])
172 # c.included_files.append(f['filename'])
173 # html_diff = diff_processor.as_html(enable_comments=enable_comments,
174 # parsed_lines=[f])
175 # c.changes[fid] = [f['operation'], f['filename'], html_diff, f]
150 176
151 177 def _extract_ordering(self, request):
152 178 column_index = safe_int(request.GET.get('order[0][column]'))
@@ -295,10 +321,12 b' class PullrequestsController(BaseRepoCon'
295 321 redirect(url('pullrequest_home', repo_name=source_repo.repo_name))
296 322
297 323 default_target_repo = source_repo
298 if (source_repo.parent and
299 not source_repo.parent.scm_instance().is_empty()):
300 # change default if we have a parent repo
301 default_target_repo = source_repo.parent
324
325 if source_repo.parent:
326 parent_vcs_obj = source_repo.parent.scm_instance()
327 if parent_vcs_obj and not parent_vcs_obj.is_empty():
328 # change default if we have a parent repo
329 default_target_repo = source_repo.parent
302 330
303 331 target_repo_data = PullRequestModel().generate_repo_data(
304 332 default_target_repo)
@@ -360,7 +388,8 b' class PullrequestsController(BaseRepoCon'
360 388 add_parent = False
361 389 if repo.parent:
362 390 if filter_query in repo.parent.repo_name:
363 if not repo.parent.scm_instance().is_empty():
391 parent_vcs_obj = repo.parent.scm_instance()
392 if parent_vcs_obj and not parent_vcs_obj.is_empty():
364 393 add_parent = True
365 394
366 395 limit = 20 - 1 if add_parent else 20
@@ -397,8 +426,10 b' class PullrequestsController(BaseRepoCon'
397 426 if not repo:
398 427 raise HTTPNotFound
399 428
429 controls = peppercorn.parse(request.POST.items())
430
400 431 try:
401 _form = PullRequestForm(repo.repo_id)().to_python(request.POST)
432 _form = PullRequestForm(repo.repo_id)().to_python(controls)
402 433 except formencode.Invalid as errors:
403 434 if errors.error_dict.get('revisions'):
404 435 msg = 'Revisions: %s' % errors.error_dict['revisions']
@@ -417,7 +448,8 b' class PullrequestsController(BaseRepoCon'
417 448 target_repo = _form['target_repo']
418 449 target_ref = _form['target_ref']
419 450 commit_ids = _form['revisions'][::-1]
420 reviewers = _form['review_members']
451 reviewers = [
452 (r['user_id'], r['reasons']) for r in _form['review_members']]
421 453
422 454 # find the ancestor for this pr
423 455 source_db_repo = Repository.get_by_repo_name(_form['source_repo'])
@@ -476,8 +508,11 b' class PullrequestsController(BaseRepoCon'
476 508 allowed_to_update = PullRequestModel().check_user_update(
477 509 pull_request, c.rhodecode_user)
478 510 if allowed_to_update:
479 if 'reviewers_ids' in request.POST:
480 self._update_reviewers(pull_request_id)
511 controls = peppercorn.parse(request.POST.items())
512
513 if 'review_members' in controls:
514 self._update_reviewers(
515 pull_request_id, controls['review_members'])
481 516 elif str2bool(request.POST.get('update_commits', 'false')):
482 517 self._update_commits(pull_request)
483 518 elif str2bool(request.POST.get('close_pull_request', 'false')):
@@ -506,32 +541,51 b' class PullrequestsController(BaseRepoCon'
506 541 return
507 542
508 543 def _update_commits(self, pull_request):
509 try:
510 if PullRequestModel().has_valid_update_type(pull_request):
511 updated_version, changes = PullRequestModel().update_commits(
512 pull_request)
513 if updated_version:
514 msg = _(
515 u'Pull request updated to "{source_commit_id}" with '
516 u'{count_added} added, {count_removed} removed '
517 u'commits.'
518 ).format(
519 source_commit_id=pull_request.source_ref_parts.commit_id,
520 count_added=len(changes.added),
521 count_removed=len(changes.removed))
522 h.flash(msg, category='success')
523 else:
524 h.flash(_("Nothing changed in pull request."),
525 category='warning')
526 else:
527 msg = _(
528 u"Skipping update of pull request due to reference "
529 u"type: {reference_type}"
530 ).format(reference_type=pull_request.source_ref_parts.type)
531 h.flash(msg, category='warning')
532 except CommitDoesNotExistError:
533 h.flash(
534 _(u'Update failed due to missing commits.'), category='error')
544 resp = PullRequestModel().update_commits(pull_request)
545
546 if resp.executed:
547 msg = _(
548 u'Pull request updated to "{source_commit_id}" with '
549 u'{count_added} added, {count_removed} removed commits.')
550 msg = msg.format(
551 source_commit_id=pull_request.source_ref_parts.commit_id,
552 count_added=len(resp.changes.added),
553 count_removed=len(resp.changes.removed))
554 h.flash(msg, category='success')
555
556 registry = get_current_registry()
557 rhodecode_plugins = getattr(registry, 'rhodecode_plugins', {})
558 channelstream_config = rhodecode_plugins.get('channelstream', {})
559 if channelstream_config.get('enabled'):
560 message = msg + (
561 ' - <a onclick="window.location.reload()">'
562 '<strong>{}</strong></a>'.format(_('Reload page')))
563 channel = '/repo${}$/pr/{}'.format(
564 pull_request.target_repo.repo_name,
565 pull_request.pull_request_id
566 )
567 payload = {
568 'type': 'message',
569 'user': 'system',
570 'exclude_users': [request.user.username],
571 'channel': channel,
572 'message': {
573 'message': message,
574 'level': 'success',
575 'topic': '/notifications'
576 }
577 }
578 channelstream_request(
579 channelstream_config, [payload], '/message',
580 raise_exc=False)
581 else:
582 msg = PullRequestModel.UPDATE_STATUS_MESSAGES[resp.reason]
583 warning_reasons = [
584 UpdateFailureReason.NO_CHANGE,
585 UpdateFailureReason.WRONG_REF_TPYE,
586 ]
587 category = 'warning' if resp.reason in warning_reasons else 'error'
588 h.flash(msg, category=category)
535 589
536 590 @auth.CSRFRequired()
537 591 @LoginRequired()
@@ -601,11 +655,10 b' class PullrequestsController(BaseRepoCon'
601 655 merge_resp.failure_reason)
602 656 h.flash(msg, category='error')
603 657
604 def _update_reviewers(self, pull_request_id):
605 reviewers_ids = map(int, filter(
606 lambda v: v not in [None, ''],
607 request.POST.get('reviewers_ids', '').split(',')))
608 PullRequestModel().update_reviewers(pull_request_id, reviewers_ids)
658 def _update_reviewers(self, pull_request_id, review_members):
659 reviewers = [
660 (int(r['user_id']), r['reasons']) for r in review_members]
661 PullRequestModel().update_reviewers(pull_request_id, reviewers)
609 662 Session().commit()
610 663
611 664 def _reject_close(self, pull_request):
@@ -655,6 +708,10 b' class PullrequestsController(BaseRepoCon'
655 708 c.pull_request, c.rhodecode_user) and not c.pull_request.is_closed()
656 709 c.allowed_to_merge = PullRequestModel().check_user_merge(
657 710 c.pull_request, c.rhodecode_user) and not c.pull_request.is_closed()
711 c.shadow_clone_url = PullRequestModel().get_shadow_clone_url(
712 c.pull_request)
713 c.allowed_to_delete = PullRequestModel().check_user_delete(
714 c.pull_request, c.rhodecode_user) and not c.pull_request.is_closed()
658 715
659 716 cc_model = ChangesetCommentsModel()
660 717
@@ -669,23 +726,13 b' class PullrequestsController(BaseRepoCon'
669 726 c.pr_merge_status = False
670 727 # load compare data into template context
671 728 enable_comments = not c.pull_request.is_closed()
672 self._load_compare_data(c.pull_request, enable_comments=enable_comments)
673 729
674 # this is a hack to properly display links, when creating PR, the
675 # compare view and others uses different notation, and
676 # compare_commits.html renders links based on the target_repo.
677 # We need to swap that here to generate it properly on the html side
678 c.target_repo = c.source_repo
679 730
680 731 # inline comments
681 c.inline_cnt = 0
682 732 c.inline_comments = cc_model.get_inline_comments(
683 733 c.rhodecode_db_repo.repo_id,
684 pull_request=pull_request_id).items()
685 # count inline comments
686 for __, lines in c.inline_comments:
687 for comments in lines.values():
688 c.inline_cnt += len(comments)
734 pull_request=pull_request_id)
735 c.inline_cnt = len(c.inline_comments)
689 736
690 737 # outdated comments
691 738 c.outdated_cnt = 0
@@ -702,6 +749,15 b' class PullrequestsController(BaseRepoCon'
702 749 else:
703 750 c.outdated_comments = {}
704 751
752 self._load_compare_data(
753 c.pull_request, c.inline_comments, enable_comments=enable_comments)
754
755 # this is a hack to properly display links, when creating PR, the
756 # compare view and others uses different notation, and
757 # compare_commits.html renders links based on the target_repo.
758 # We need to swap that here to generate it properly on the html side
759 c.target_repo = c.source_repo
760
705 761 # comments
706 762 c.comments = cc_model.get_comments(c.rhodecode_db_repo.repo_id,
707 763 pull_request=pull_request_id)
@@ -251,6 +251,16 b' class SummaryController(BaseRepoControll'
251 251 }
252 252 return data
253 253
254 @LoginRequired()
255 @HasRepoPermissionAnyDecorator('repository.read', 'repository.write',
256 'repository.admin')
257 @jsonify
258 def repo_default_reviewers_data(self, repo_name):
259 return {
260 'reviewers': [utils.reviewer_as_json(
261 user=c.rhodecode_db_repo.user, reasons=None)]
262 }
263
254 264 @jsonify
255 265 def repo_refs_changelog_data(self, repo_name):
256 266 repo = c.rhodecode_repo
@@ -86,3 +86,21 b' def get_commit_from_ref_name(repo, ref_n'
86 86 '%s "%s" does not exist' % (ref_type, ref_name))
87 87
88 88 return repo_scm.get_commit(commit_id)
89
90
91 def reviewer_as_json(user, reasons):
92 """
93 Returns json struct of a reviewer for frontend
94
95 :param user: the reviewer
96 :param reasons: list of reasons (strings) why the user is a reviewer
97 """
98
99 return {
100 'user_id': user.user_id,
101 'reasons': reasons,
102 'username': user.username,
103 'firstname': user.firstname,
104 'lastname': user.lastname,
105 'gravatar_link': h.gravatar_url(user.email, 14),
106 }
@@ -48,6 +48,7 b' from rhodecode.events.base import Rhodec'
48 48
49 49 from rhodecode.events.user import ( # noqa
50 50 UserPreCreate,
51 UserPostCreate,
51 52 UserPreUpdate,
52 53 UserRegistered
53 54 )
@@ -51,6 +51,19 b' class UserPreCreate(RhodecodeEvent):'
51 51 self.user_data = user_data
52 52
53 53
54 @implementer(IUserPreCreate)
55 class UserPostCreate(RhodecodeEvent):
56 """
57 An instance of this class is emitted as an :term:`event` after a new user
58 object is created.
59 """
60 name = 'user-post-create'
61 display_name = lazy_ugettext('user post create')
62
63 def __init__(self, user_data):
64 self.user_data = user_data
65
66
54 67 @implementer(IUserPreUpdate)
55 68 class UserPreUpdate(RhodecodeEvent):
56 69 """
@@ -161,7 +161,7 b' class HipchatIntegrationType(Integration'
161 161 comment_text = data['comment']['text']
162 162 if len(comment_text) > 200:
163 163 comment_text = '{comment_text}<a href="{comment_url}">...<a/>'.format(
164 comment_text=comment_text[:200],
164 comment_text=h.html_escape(comment_text[:200]),
165 165 comment_url=data['comment']['url'],
166 166 )
167 167
@@ -179,8 +179,8 b' class HipchatIntegrationType(Integration'
179 179 number=data['pullrequest']['pull_request_id'],
180 180 pr_url=data['pullrequest']['url'],
181 181 pr_status=data['pullrequest']['status'],
182 pr_title=data['pullrequest']['title'],
183 comment_text=comment_text
182 pr_title=h.html_escape(data['pullrequest']['title']),
183 comment_text=h.html_escape(comment_text)
184 184 )
185 185 )
186 186
@@ -193,7 +193,7 b' class HipchatIntegrationType(Integration'
193 193 number=data['pullrequest']['pull_request_id'],
194 194 pr_url=data['pullrequest']['url'],
195 195 pr_status=data['pullrequest']['status'],
196 pr_title=data['pullrequest']['title'],
196 pr_title=h.html_escape(data['pullrequest']['title']),
197 197 )
198 198 )
199 199
@@ -206,21 +206,20 b' class HipchatIntegrationType(Integration'
206 206 }.get(event.__class__, str(event.__class__))
207 207
208 208 return ('Pull request <a href="{url}">#{number}</a> - {title} '
209 '{action} by {user}').format(
209 '{action} by <b>{user}</b>').format(
210 210 user=data['actor']['username'],
211 211 number=data['pullrequest']['pull_request_id'],
212 212 url=data['pullrequest']['url'],
213 title=data['pullrequest']['title'],
213 title=h.html_escape(data['pullrequest']['title']),
214 214 action=action
215 215 )
216 216
217 217 def format_repo_push_event(self, data):
218 218 branch_data = {branch['name']: branch
219 for branch in data['push']['branches']}
219 for branch in data['push']['branches']}
220 220
221 221 branches_commits = {}
222 222 for commit in data['push']['commits']:
223 log.critical(commit)
224 223 if commit['branch'] not in branches_commits:
225 224 branch_commits = {'branch': branch_data[commit['branch']],
226 225 'commits': []}
@@ -238,7 +237,7 b' class HipchatIntegrationType(Integration'
238 237 def format_repo_create_event(self, data):
239 238 return '<a href="{}">{}</a> ({}) repository created by <b>{}</b>'.format(
240 239 data['repo']['url'],
241 data['repo']['repo_name'],
240 h.html_escape(data['repo']['repo_name']),
242 241 data['repo']['repo_type'],
243 242 data['actor']['username'],
244 243 )
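The `h.html_escape` calls being added in the hunks above exist because user-controlled text (PR titles, comment bodies, repository names) is interpolated into HTML messages. Assuming `h.html_escape` behaves like the stdlib `html.escape` shown here, the effect is:

```python
import html

# Markup characters in user-supplied text are neutralized before the text
# is embedded in an HTML notification message.
title = '<script>alert(1)</script> & more'
escaped = html.escape(title)
# escaped == '&lt;script&gt;alert(1)&lt;/script&gt; &amp; more'
```

Without this step, a crafted PR title could inject markup (or script) into the HipChat/Slack notification payloads.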
@@ -173,7 +173,7 b' class SlackIntegrationType(IntegrationTy'
173 173
174 174 return (textwrap.dedent(
175 175 '''
176 {user} commented on pull request <{pr_url}|#{number}> - {pr_title}:
176 *{user}* commented on pull request <{pr_url}|#{number}> - {pr_title}:
177 177 >>> {comment_status}{comment_text}
178 178 ''').format(
179 179 comment_status=comment_status,
@@ -208,7 +208,7 b' class SlackIntegrationType(IntegrationTy'
208 208 }.get(event.__class__, str(event.__class__))
209 209
210 210 return ('Pull request <{url}|#{number}> - {title} '
211 '{action} by {user}').format(
211 '`{action}` by *{user}*').format(
212 212 user=data['actor']['username'],
213 213 number=data['pullrequest']['pull_request_id'],
214 214 url=data['pullrequest']['url'],
@@ -218,11 +218,10 b' class SlackIntegrationType(IntegrationTy'
218 218
219 219 def format_repo_push_event(self, data):
220 220 branch_data = {branch['name']: branch
221 for branch in data['push']['branches']}
221 for branch in data['push']['branches']}
222 222
223 223 branches_commits = {}
224 224 for commit in data['push']['commits']:
225 log.critical(commit)
226 225 if commit['branch'] not in branches_commits:
227 226 branch_commits = {'branch': branch_data[commit['branch']],
228 227 'commits': []}
@@ -19,13 +19,15 b''
19 19 # and proprietary license terms, please see https://rhodecode.com/licenses/
20 20
21 21 from __future__ import unicode_literals
22 import string
23 from collections import OrderedDict
22 24
23 25 import deform
24 26 import logging
25 27 import requests
26 28 import colander
27 29 from celery.task import task
28 from mako.template import Template
30 from requests.packages.urllib3.util.retry import Retry
29 31
30 32 from rhodecode import events
31 33 from rhodecode.translation import _
@@ -33,12 +35,127 b' from rhodecode.integrations.types.base i'
33 35
34 36 log = logging.getLogger(__name__)
35 37
38 # NOTE: updating this list requires updating common_vars in get_base_parsed_template
39 WEBHOOK_URL_VARS = [
40 'repo_name',
41 'repo_type',
42 'repo_id',
43 'repo_url',
44
45 # special attrs below that we handle, using multi-call
46 'branch',
47 'commit_id',
48
49 # pr events vars
50 'pull_request_id',
51 'pull_request_url',
52
53 ]
54 URL_VARS = ', '.join('${' + x + '}' for x in WEBHOOK_URL_VARS)
55
56
57 class WebhookHandler(object):
58 def __init__(self, template_url, secret_token):
59 self.template_url = template_url
60 self.secret_token = secret_token
61
62 def get_base_parsed_template(self, data):
63 """
64 initially parses the passed in template with some common variables
65 available on ALL calls
66 """
67 # note: make sure to update the `WEBHOOK_URL_VARS` if this changes
68 common_vars = {
69 'repo_name': data['repo']['repo_name'],
70 'repo_type': data['repo']['repo_type'],
71 'repo_id': data['repo']['repo_id'],
72 'repo_url': data['repo']['url'],
73 }
74
75 return string.Template(
76 self.template_url).safe_substitute(**common_vars)
77
78 def repo_push_event_handler(self, event, data):
79 url = self.get_base_parsed_template(data)
80 url_calls = []
81 branch_data = OrderedDict()
82 for obj in data['push']['branches']:
83 branch_data[obj['name']] = obj
84
85 branches_commits = OrderedDict()
86 for commit in data['push']['commits']:
87 if commit['branch'] not in branches_commits:
88 branch_commits = {'branch': branch_data[commit['branch']],
89 'commits': []}
90 branches_commits[commit['branch']] = branch_commits
91
92 branch_commits = branches_commits[commit['branch']]
93 branch_commits['commits'].append(commit)
94
95 if '${branch}' in url:
96 # call it multiple times, for each branch if used in variables
97 for branch, commit_ids in branches_commits.items():
98 branch_url = string.Template(url).safe_substitute(branch=branch)
99 # call further down for each commit if used
100 if '${commit_id}' in branch_url:
101 for commit_data in commit_ids['commits']:
102 commit_id = commit_data['raw_id']
103 commit_url = string.Template(branch_url).safe_substitute(
104 commit_id=commit_id)
105 # register per-commit call
106 log.debug(
107 'register webhook call(%s) to url %s', event, commit_url)
108 url_calls.append((commit_url, self.secret_token, data))
109
110 else:
111 # register per-branch call
112 log.debug(
113 'register webhook call(%s) to url %s', event, branch_url)
114 url_calls.append((branch_url, self.secret_token, data))
115
116 else:
117 log.debug(
118 'register webhook call(%s) to url %s', event, url)
119 url_calls.append((url, self.secret_token, data))
120
121 return url_calls
122
123 def repo_create_event_handler(self, event, data):
124 url = self.get_base_parsed_template(data)
125 log.debug(
126 'register webhook call(%s) to url %s', event, url)
127 return [(url, self.secret_token, data)]
128
129 def pull_request_event_handler(self, event, data):
130 url = self.get_base_parsed_template(data)
131 log.debug(
132 'register webhook call(%s) to url %s', event, url)
133 url = string.Template(url).safe_substitute(
134 pull_request_id=data['pullrequest']['pull_request_id'],
135 pull_request_url=data['pullrequest']['url'])
136 return [(url, self.secret_token, data)]
137
138 def __call__(self, event, data):
139 if isinstance(event, events.RepoPushEvent):
140 return self.repo_push_event_handler(event, data)
141 elif isinstance(event, events.RepoCreateEvent):
142 return self.repo_create_event_handler(event, data)
143 elif isinstance(event, events.PullRequestEvent):
144 return self.pull_request_event_handler(event, data)
145 else:
146 raise ValueError('event type not supported: %s' % event)
147
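The `WebhookHandler` above substitutes URL variables in several passes (repo-level first, then per branch, then per commit). This works because `string.Template.safe_substitute` leaves unknown placeholders intact instead of raising. A minimal standalone sketch (the template URL is a made-up example):

```python
import string

# Pass 1: repo-level variables; ${branch} and ${commit_id} survive intact.
template = 'http://example.invalid/${repo_name}/${branch}/${commit_id}'
first = string.Template(template).safe_substitute(repo_name='myrepo')

# Pass 2: branch-level variables; ${commit_id} is still available for a
# later per-commit pass.
second = string.Template(first).safe_substitute(branch='default')
# second == 'http://example.invalid/myrepo/default/${commit_id}'
```

A plain `substitute` call would raise `KeyError` on the first pass, which is why the handler uses `safe_substitute` throughout.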
36 148
37 149 class WebhookSettingsSchema(colander.Schema):
38 150 url = colander.SchemaNode(
39 151 colander.String(),
40 152 title=_('Webhook URL'),
41 description=_('URL of the webhook to receive POST event.'),
153 description=_(
154 'URL of the webhook to receive POST events. The following '
155 'variables can be used: {vars}. Some variables, such as '
156 '${{branch}} or ${{commit_id}}, trigger multiple calls; the '
157 'webhook is called once per unique object in the '
158 'data.').format(vars=URL_VARS),
42 159 missing=colander.required,
43 160 required=True,
44 161 validator=colander.url,
@@ -58,8 +175,6 b' class WebhookSettingsSchema(colander.Sch'
58 175 )
59 176
60 177
61
62
63 178 class WebhookIntegrationType(IntegrationTypeBase):
64 179 key = 'webhook'
65 180 display_name = _('Webhook')
@@ -104,14 +219,30 b' class WebhookIntegrationType(Integration'
104 219 return
105 220
106 221 data = event.as_dict()
107 post_to_webhook(data, self.settings)
222 template_url = self.settings['url']
223
224 handler = WebhookHandler(template_url, self.settings['secret_token'])
225 url_calls = handler(event, data)
226 log.debug('webhook: calling following urls: %s',
227 [x[0] for x in url_calls])
228 post_to_webhook(url_calls)
108 229
109 230
110 231 @task(ignore_result=True)
111 def post_to_webhook(data, settings):
112 log.debug('sending event:%s to webhook %s', data['name'], settings['url'])
113 resp = requests.post(settings['url'], json={
114 'token': settings['secret_token'],
115 'event': data
116 })
117 resp.raise_for_status() # raise exception on a failed request
232 def post_to_webhook(url_calls):
233 max_retries = 3
234 # retry failed requests up to max_retries times with backoff on
235 # server errors; one session is reused for all registered calls
236 retries = Retry(
237 total=max_retries,
238 backoff_factor=0.15,
239 status_forcelist=[500, 502, 503, 504])
240 req_session = requests.Session()
241 req_session.mount(
242 'http://', requests.adapters.HTTPAdapter(max_retries=retries))
243
244 for url, token, data in url_calls:
245 resp = req_session.post(url, json={
246 'token': token,
247 'event': data
248 })
249 resp.raise_for_status() # raise exception on a failed request
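The `backoff_factor=0.15` configured above controls how long urllib3 sleeps between retries. Assuming the standard urllib3 formula `backoff_factor * 2**(n - 1)` before the n-th consecutive retry (early urllib3 versions skip the sleep before the first retry, so treat these numbers as the general shape):

```python
# Sketch of the exponential backoff schedule for backoff_factor=0.15.
def backoff_times(backoff_factor, attempts):
    return [backoff_factor * (2 ** (n - 1)) for n in range(1, attempts + 1)]

schedule = backoff_times(0.15, 3)  # roughly [0.15, 0.3, 0.6] seconds
```

With `total=3` and `status_forcelist=[500, 502, 503, 504]`, a flapping endpoint gets three quick retries (well under two seconds total) before `raise_for_status` surfaces the failure.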
@@ -49,7 +49,7 b' from rhodecode.model.meta import Session'
49 49 from rhodecode.model.user import UserModel
50 50 from rhodecode.model.db import (
51 51 User, Repository, Permission, UserToPerm, UserGroupToPerm, UserGroupMember,
52 UserIpMap, UserApiKeys)
52 UserIpMap, UserApiKeys, RepoGroup)
53 53 from rhodecode.lib import caches
54 54 from rhodecode.lib.utils2 import safe_unicode, aslist, safe_str, md5
55 55 from rhodecode.lib.utils import (
@@ -983,6 +983,9 b' class AuthUser(object):'
983 983 inherit = self.inherit_default_permissions
984 984 return AuthUser.check_ip_allowed(self.user_id, self.ip_addr,
985 985 inherit_from_default=inherit)
986
987 @property
988 def personal_repo_group(self):
989 return RepoGroup.get_user_personal_repo_group(self.user_id)
986 989
987 990 @classmethod
988 991 def check_ip_allowed(cls, user_id, ip_addr, inherit_from_default):
@@ -163,7 +163,8 b' def get_access_path(environ):'
163 163
164 164
165 165 def vcs_operation_context(
166 environ, repo_name, username, action, scm, check_locking=True):
166 environ, repo_name, username, action, scm, check_locking=True,
167 is_shadow_repo=False):
167 168 """
168 169 Generate the context for a vcs operation, e.g. push or pull.
169 170
@@ -200,6 +201,7 b' def vcs_operation_context('
200 201 'locked_by': locked_by,
201 202 'server_url': utils2.get_server_url(environ),
202 203 'hooks': get_enabled_hook_classes(ui_settings),
204 'is_shadow_repo': is_shadow_repo,
203 205 }
204 206 return extras
205 207
@@ -363,6 +365,18 b' def attach_context_attributes(context, r'
363 365 # Fix this and remove it from base controller.
364 366 # context.repo_name = get_repo_slug(request) # can be empty
365 367
368 diffmode = 'sideside'
369 if request.GET.get('diffmode'):
370 if request.GET['diffmode'] == 'unified':
371 diffmode = 'unified'
372 elif request.session.get('diffmode'):
373 diffmode = request.session['diffmode']
374
375 context.diffmode = diffmode
376
377 if request.session.get('diffmode') != diffmode:
378 request.session['diffmode'] = diffmode
379
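The diffmode resolution added above can be sketched as a pure function: an explicit `diffmode=unified` request parameter wins, otherwise a value remembered in the session is reused, otherwise the default `'sideside'` applies, and the result is written back to the session for the next request:

```python
# Pure-function sketch of the diffmode resolution; dicts stand in for
# request.GET and request.session.
def resolve_diffmode(query, session):
    diffmode = 'sideside'
    if query.get('diffmode'):
        # an explicit query parameter overrides the session value
        if query['diffmode'] == 'unified':
            diffmode = 'unified'
    elif session.get('diffmode'):
        diffmode = session['diffmode']
    session['diffmode'] = diffmode  # persist for the next request
    return diffmode

from_query = resolve_diffmode({'diffmode': 'unified'}, {})    # 'unified'
from_session = resolve_diffmode({}, {'diffmode': 'unified'})  # 'unified'
fallback = resolve_diffmode({}, {})                           # 'sideside'
```

Note that an unrecognized query value deliberately falls back to `'sideside'` rather than consulting the session, mirroring the nested `if` in the original.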
366 380 context.csrf_token = auth.get_csrf_token()
367 381 context.backends = rhodecode.BACKENDS.keys()
368 382 context.backends.sort()
@@ -24,9 +24,10 b' import logging'
24 24 import threading
25 25
26 26 from beaker.cache import _cache_decorate, cache_regions, region_invalidate
27 from sqlalchemy.exc import IntegrityError
27 28
28 29 from rhodecode.lib.utils import safe_str, md5
29 from rhodecode.model.db import Session, CacheKey, IntegrityError
30 from rhodecode.model.db import Session, CacheKey
30 31
31 32 log = logging.getLogger(__name__)
32 33
@@ -18,16 +18,14 b''
18 18 # RhodeCode Enterprise Edition, including its added features, Support services,
19 19 # and proprietary license terms, please see https://rhodecode.com/licenses/
20 20
21 import hashlib
22 import itsdangerous
21 23 import logging
22 24 import os
23
24 import itsdangerous
25 25 import requests
26
27 26 from dogpile.core import ReadWriteMutex
28 27
29 28 import rhodecode.lib.helpers as h
30
31 29 from rhodecode.lib.auth import HasRepoPermissionAny
32 30 from rhodecode.lib.ext_json import json
33 31 from rhodecode.model.db import User
@@ -81,7 +79,7 b' def get_user_data(user_id):'
81 79 'username': user.username,
82 80 'first_name': user.name,
83 81 'last_name': user.lastname,
84 'icon_link': h.gravatar_url(user.email, 14),
82 'icon_link': h.gravatar_url(user.email, 60),
85 83 'display_name': h.person(user, 'username_or_name_or_email'),
86 84 'display_link': h.link_to_user(user),
87 85 'notifications': user.user_data.get('notification_status', True)
@@ -169,7 +167,9 b' def parse_channels_info(info_result, inc'
169 167
170 168
171 169 def log_filepath(history_location, channel_name):
172 filename = '{}.log'.format(channel_name.encode('hex'))
170 hasher = hashlib.sha256()
171 hasher.update(channel_name.encode('utf8'))
172 filename = '{}.log'.format(hasher.hexdigest())
173 173 filepath = os.path.join(history_location, filename)
174 174 return filepath
175 175
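The `log_filepath` change above replaces `channel_name.encode('hex')` with a SHA-256 digest: hashing maps arbitrary unicode channel names (which may contain `/` and `$`) to fixed-length, filesystem-safe names, whereas hex-encoding only worked for byte strings and grew with the name. A standalone version:

```python
import hashlib
import os

# Channel names are hashed so any unicode name yields a safe, fixed-length
# log file name.
def log_filepath(history_location, channel_name):
    hasher = hashlib.sha256()
    hasher.update(channel_name.encode('utf8'))
    filename = '{}.log'.format(hasher.hexdigest())
    return os.path.join(history_location, filename)

path = log_filepath('/tmp/history', u'/repo$demo$/pr/1')
basename = os.path.basename(path)  # 64 hex characters plus '.log'
```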
@@ -180,6 +180,8 b' class Action(object):'
180 180 UNMODIFIED = 'unmod'
181 181
182 182 CONTEXT = 'context'
183 OLD_NO_NL = 'old-no-nl'
184 NEW_NO_NL = 'new-no-nl'
183 185
184 186
185 187 class DiffProcessor(object):
@@ -227,7 +229,7 b' class DiffProcessor(object):'
227 229 self._parser = self._parse_gitdiff
228 230 else:
229 231 self.differ = self._highlight_line_udiff
230 self._parser = self._parse_udiff
232 self._parser = self._new_parse_gitdiff
231 233
232 234 def _copy_iterator(self):
233 235 """
@@ -491,9 +493,181 b' class DiffProcessor(object):'
491 493
492 494 return diff_container(sorted(_files, key=sorter))
493 495
494 def _parse_udiff(self, inline_diff=True):
495 raise NotImplementedError()
496
497 # FIXME: NEWDIFFS: dan: this replaces the old _escaper function
498 def _process_line(self, string):
499 """
500 Process a diff line, checks the diff limit
501
502 :param string:
503 """
504
505 self.cur_diff_size += len(string)
506
507 if not self.show_full_diff and (self.cur_diff_size > self.diff_limit):
508 raise DiffLimitExceeded('Diff Limit Exceeded')
509
510 return safe_unicode(string)
511
512 # FIXME: NEWDIFFS: dan: this replaces _parse_gitdiff
513 def _new_parse_gitdiff(self, inline_diff=True):
514 _files = []
515 diff_container = lambda arg: arg
516 for chunk in self._diff.chunks():
517 head = chunk.header
518 log.debug('parsing diff %r', head)
519
520 diff = imap(self._process_line, chunk.diff.splitlines(1))
521 raw_diff = chunk.raw
522 limited_diff = False
523 exceeds_limit = False
526 op = None
527 stats = {
528 'added': 0,
529 'deleted': 0,
530 'binary': False,
531 'old_mode': None,
532 'new_mode': None,
533 'ops': {},
534 }
535 if head['old_mode']:
536 stats['old_mode'] = head['old_mode']
537 if head['new_mode']:
538 stats['new_mode'] = head['new_mode']
539 if head['b_mode']:
540 stats['new_mode'] = head['b_mode']
541
542 if head['deleted_file_mode']:
543 op = OPS.DEL
544 stats['binary'] = True
545 stats['ops'][DEL_FILENODE] = 'deleted file'
546
547 elif head['new_file_mode']:
548 op = OPS.ADD
549 stats['binary'] = True
550 stats['old_mode'] = None
551 stats['new_mode'] = head['new_file_mode']
552 stats['ops'][NEW_FILENODE] = 'new file %s' % head['new_file_mode']
553 else: # modify operation, can be copy, rename or chmod
554
555 # CHMOD
556 if head['new_mode'] and head['old_mode']:
557 op = OPS.MOD
558 stats['binary'] = True
559 stats['ops'][CHMOD_FILENODE] = (
560 'modified file chmod %s => %s' % (
561 head['old_mode'], head['new_mode']))
562
563 # RENAME
564 if head['rename_from'] != head['rename_to']:
565 op = OPS.MOD
566 stats['binary'] = True
567 stats['renamed'] = (head['rename_from'], head['rename_to'])
568 stats['ops'][RENAMED_FILENODE] = (
569 'file renamed from %s to %s' % (
570 head['rename_from'], head['rename_to']))
571 # COPY
572 if head.get('copy_from') and head.get('copy_to'):
573 op = OPS.MOD
574 stats['binary'] = True
575 stats['copied'] = (head['copy_from'], head['copy_to'])
576 stats['ops'][COPIED_FILENODE] = (
577 'file copied from %s to %s' % (
578 head['copy_from'], head['copy_to']))
496 579
580 # If our new parsed headers didn't match anything fallback to
581 # old style detection
582 if op is None:
583 if not head['a_file'] and head['b_file']:
584 op = OPS.ADD
585 stats['binary'] = True
586 stats['new_file'] = True
587 stats['ops'][NEW_FILENODE] = 'new file'
588
589 elif head['a_file'] and not head['b_file']:
590 op = OPS.DEL
591 stats['binary'] = True
592 stats['ops'][DEL_FILENODE] = 'deleted file'
593
 594                 # it's not ADD nor DELETE
595 if op is None:
596 op = OPS.MOD
597 stats['binary'] = True
598 stats['ops'][MOD_FILENODE] = 'modified file'
599
600 # a real non-binary diff
601 if head['a_file'] or head['b_file']:
602 try:
603 raw_diff, chunks, _stats = self._new_parse_lines(diff)
604 stats['binary'] = False
605 stats['added'] = _stats[0]
606 stats['deleted'] = _stats[1]
607 # explicit mark that it's a modified file
608 if op == OPS.MOD:
609 stats['ops'][MOD_FILENODE] = 'modified file'
610 exceeds_limit = len(raw_diff) > self.file_limit
611
 612                         # changed from the _escaper function so we validate
 613                         # the size of each file instead of the whole diff.
 614                         # The diff will hide big files but still show small
 615                         # ones; from my tests big files are fairly safe to
 616                         # parse, but the browser is the bottleneck.
617 if not self.show_full_diff and exceeds_limit:
618 raise DiffLimitExceeded('File Limit Exceeded')
619
620 except DiffLimitExceeded:
621 diff_container = lambda _diff: \
622 LimitedDiffContainer(
623 self.diff_limit, self.cur_diff_size, _diff)
624
625 exceeds_limit = len(raw_diff) > self.file_limit
626 limited_diff = True
627 chunks = []
628
629 else: # GIT format binary patch, or possibly empty diff
630 if head['bin_patch']:
 631                 # the operation is already extracted; we simply mark
 632                 # it as a diff we won't show for binary files
633 stats['ops'][BIN_FILENODE] = 'binary diff hidden'
634 chunks = []
635
636 if chunks and not self.show_full_diff and op == OPS.DEL:
 637             # when not in full diff mode, hide deleted file contents
638 # TODO: anderson: if the view is not too big, there is no way
639 # to see the content of the file
640 chunks = []
641
642 chunks.insert(0, [{
643 'old_lineno': '',
644 'new_lineno': '',
645 'action': Action.CONTEXT,
646 'line': msg,
647 } for _op, msg in stats['ops'].iteritems()
648 if _op not in [MOD_FILENODE]])
649
650 original_filename = safe_unicode(head['a_path'])
651 _files.append({
652 'original_filename': original_filename,
653 'filename': safe_unicode(head['b_path']),
654 'old_revision': head['a_blob_id'],
655 'new_revision': head['b_blob_id'],
656 'chunks': chunks,
657 'raw_diff': safe_unicode(raw_diff),
658 'operation': op,
659 'stats': stats,
660 'exceeds_limit': exceeds_limit,
661 'is_limited_diff': limited_diff,
662 })
663
664
665 sorter = lambda info: {OPS.ADD: 0, OPS.MOD: 1,
666 OPS.DEL: 2}.get(info['operation'])
667
668 return diff_container(sorted(_files, key=sorter))
669
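The `sorter` lambda above orders the parsed file entries so additions come first, then modifications, then deletions. A minimal Python 3 sketch of the same dict-lookup sort key, using plain strings in place of the `OPS` enum:

```python
# Order parsed diff entries: additions first, then modifications,
# then deletions. Plain strings stand in for the OPS enum here.
files = [
    {'filename': 'old.txt', 'operation': 'del'},
    {'filename': 'changed.py', 'operation': 'mod'},
    {'filename': 'new.py', 'operation': 'add'},
]

def sorter(info):
    # Unknown operations sort last via the default value.
    return {'add': 0, 'mod': 1, 'del': 2}.get(info['operation'], 3)

ordered = [f['filename'] for f in sorted(files, key=sorter)]
print(ordered)  # ['new.py', 'changed.py', 'old.txt']
```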
670 # FIXME: NEWDIFFS: dan: this gets replaced by _new_parse_lines
497 671 def _parse_lines(self, diff):
498 672 """
 499  673         Parse the diff and return data for the template.
@@ -588,6 +762,107 b' class DiffProcessor(object):'
588 762 pass
589 763 return ''.join(raw_diff), chunks, stats
590 764
765 # FIXME: NEWDIFFS: dan: this replaces _parse_lines
766 def _new_parse_lines(self, diff):
767 """
 768         Parse the diff and return data for the template.
769 """
770
771 lineiter = iter(diff)
772 stats = [0, 0]
773 chunks = []
774 raw_diff = []
775
776 try:
777 line = lineiter.next()
778
779 while line:
780 raw_diff.append(line)
781 match = self._chunk_re.match(line)
782
783 if not match:
784 break
785
786 gr = match.groups()
787 (old_line, old_end,
788 new_line, new_end) = [int(x or 1) for x in gr[:-1]]
789
790 lines = []
791 hunk = {
792 'section_header': gr[-1],
793 'source_start': old_line,
794 'source_length': old_end,
795 'target_start': new_line,
796 'target_length': new_end,
797 'lines': lines,
798 }
799 chunks.append(hunk)
800
801 old_line -= 1
802 new_line -= 1
803
804 context = len(gr) == 5
805 old_end += old_line
806 new_end += new_line
807
808 line = lineiter.next()
809
810 while old_line < old_end or new_line < new_end:
811 command = ' '
812 if line:
813 command = line[0]
814
815 affects_old = affects_new = False
816
817 # ignore those if we don't expect them
818 if command in '#@':
819 continue
820 elif command == '+':
821 affects_new = True
822 action = Action.ADD
823 stats[0] += 1
824 elif command == '-':
825 affects_old = True
826 action = Action.DELETE
827 stats[1] += 1
828 else:
829 affects_old = affects_new = True
830 action = Action.UNMODIFIED
831
832 if not self._newline_marker.match(line):
833 old_line += affects_old
834 new_line += affects_new
835 lines.append({
836 'old_lineno': affects_old and old_line or '',
837 'new_lineno': affects_new and new_line or '',
838 'action': action,
839 'line': self._clean_line(line, command)
840 })
841 raw_diff.append(line)
842
843 line = lineiter.next()
844
845 if self._newline_marker.match(line):
846 # we need to append to lines, since this is not
847 # counted in the line specs of diff
848 if affects_old:
849 action = Action.OLD_NO_NL
850 elif affects_new:
851 action = Action.NEW_NO_NL
852 else:
853 raise Exception('invalid context for no newline')
854
855 lines.append({
856 'old_lineno': None,
857 'new_lineno': None,
858 'action': action,
859 'line': self._clean_line(line, command)
860 })
861
862 except StopIteration:
863 pass
864 return ''.join(raw_diff), chunks, stats
865
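`_new_parse_lines` recognises hunk boundaries with `self._chunk_re`, whose pattern is defined outside this changeset. The sketch below uses an assumed, typical unified-diff hunk-header regex to show what the `match.groups()` unpacking and the `int(x or 1)` default are doing:

```python
import re

# Illustrative unified-diff hunk-header pattern; the actual ``_chunk_re``
# on DiffProcessor is defined elsewhere, so treat this as an assumed
# equivalent, not the real attribute.
chunk_re = re.compile(
    r'^@@ -(\d+)(?:,(\d+))? \+(\d+)(?:,(\d+))?(?: @@(.*))?')

header = '@@ -588,6 +762,107 @@ class DiffProcessor(object):'
match = chunk_re.match(header)
# Missing lengths default to 1, mirroring ``int(x or 1)`` in the source.
old_line, old_len, new_line, new_len = [
    int(x or 1) for x in match.groups()[:4]]
print(old_line, old_len, new_line, new_len)  # 588 6 762 107
```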
591 866 def _safe_id(self, idstring):
592 867 """Make a string safe for including in an id attribute.
593 868
@@ -699,7 +974,7 b' class DiffProcessor(object):'
699 974 if enable_comments and change['action'] != Action.CONTEXT:
700 975 _html.append('''<a href="#"><span class="icon-comment-add"></span></a>''')
701 976
702 _html.append('''</span></td><td class="comment-toggle tooltip" title="Toggle Comments"><i class="icon-comment"></i></td>\n''')
977 _html.append('''</span></td><td class="comment-toggle tooltip" title="Toggle Comment Thread"><i class="icon-comment"></i></td>\n''')
703 978
704 979 ###########################################################
705 980 # OLD LINE NUMBER
@@ -67,7 +67,6 b' from webhelpers.html.tags import _set_in'
67 67 convert_boolean_attrs, NotGiven, _make_safe_id_component
68 68 from webhelpers2.number import format_byte_size
69 69
70 from rhodecode.lib.annotate import annotate_highlight
71 70 from rhodecode.lib.action_parser import action_parser
72 71 from rhodecode.lib.ext_json import json
73 72 from rhodecode.lib.utils import repo_name_slug, get_custom_lexer
@@ -108,6 +107,15 b' def pylons_url_current(*args, **kw):'
108 107 url.current = pylons_url_current
109 108
110 109
110 def url_replace(**qargs):
111 """ Returns the current request url while replacing query string args """
112
113 request = get_current_request()
114 new_args = request.GET.mixed()
115 new_args.update(qargs)
116 return url('', **new_args)
117
118
111 119 def asset(path, ver=None):
112 120 """
113 121 Helper to generate a static asset file path for rhodecode assets
@@ -125,17 +133,18 b' def asset(path, ver=None):'
125 133 'rhodecode:public/{}'.format(path), _query=query)
126 134
127 135
128 def html_escape(text, html_escape_table=None):
136 default_html_escape_table = {
137 ord('&'): u'&amp;',
138 ord('<'): u'&lt;',
139 ord('>'): u'&gt;',
140 ord('"'): u'&quot;',
141 ord("'"): u'&#39;',
142 }
143
144
145 def html_escape(text, html_escape_table=default_html_escape_table):
129 146 """Produce entities within text."""
130 if not html_escape_table:
131 html_escape_table = {
132 "&": "&amp;",
133 '"': "&quot;",
134 "'": "&apos;",
135 ">": "&gt;",
136 "<": "&lt;",
137 }
138 return "".join(html_escape_table.get(c, c) for c in text)
147 return text.translate(html_escape_table)
139 148
140 149
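The rewritten `html_escape` swaps the per-character generator join for a single `str.translate` pass over a table keyed by Unicode code points. A standalone Python 3 sketch of the same approach:

```python
# Escape table keyed by code points, as str.translate() expects.
default_html_escape_table = {
    ord('&'): '&amp;',
    ord('<'): '&lt;',
    ord('>'): '&gt;',
    ord('"'): '&quot;',
    ord("'"): '&#39;',
}

def html_escape(text, table=default_html_escape_table):
    """Produce HTML entities within text."""
    return text.translate(table)

print(html_escape('<a href="x">Tom & Jerry</a>'))
# &lt;a href=&quot;x&quot;&gt;Tom &amp; Jerry&lt;/a&gt;
```

Because `translate` is a single pass, the `&` inside a replacement such as `&amp;` is never re-escaped, which a naive repeated `str.replace` chain would have to guard against.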
141 150 def chop_at_smart(s, sub, inclusive=False, suffix_if_chopped=None):
@@ -500,6 +509,86 b' def get_matching_line_offsets(lines, ter'
500 509 return matching_lines
501 510
502 511
512 def hsv_to_rgb(h, s, v):
513 """ Convert hsv color values to rgb """
514
515 if s == 0.0:
516 return v, v, v
517 i = int(h * 6.0) # XXX assume int() truncates!
518 f = (h * 6.0) - i
519 p = v * (1.0 - s)
520 q = v * (1.0 - s * f)
521 t = v * (1.0 - s * (1.0 - f))
522 i = i % 6
523 if i == 0:
524 return v, t, p
525 if i == 1:
526 return q, v, p
527 if i == 2:
528 return p, v, t
529 if i == 3:
530 return p, q, v
531 if i == 4:
532 return t, p, v
533 if i == 5:
534 return v, p, q
535
536
537 def unique_color_generator(n=10000, saturation=0.10, lightness=0.95):
538 """
 539     Generator for getting n evenly distributed colors using
 540     the hsv color space and the golden ratio. It always returns colors in the same order
541
542 :param n: number of colors to generate
543 :param saturation: saturation of returned colors
544 :param lightness: lightness of returned colors
545 :returns: RGB tuple
546 """
547
548 golden_ratio = 0.618033988749895
549 h = 0.22717784590367374
550
551 for _ in xrange(n):
552 h += golden_ratio
553 h %= 1
554 HSV_tuple = [h, saturation, lightness]
555 RGB_tuple = hsv_to_rgb(*HSV_tuple)
556 yield map(lambda x: str(int(x * 256)), RGB_tuple)
557
558
559 def color_hasher(n=10000, saturation=0.10, lightness=0.95):
560 """
 561     Returns a function which, when called with an argument, returns a
 562     unique color for that argument, e.g.:
563
564 :param n: number of colors to generate
565 :param saturation: saturation of returned colors
566 :param lightness: lightness of returned colors
567 :returns: css RGB string
568
569 >>> color_hash = color_hasher()
570 >>> color_hash('hello')
571 'rgb(34, 12, 59)'
572 >>> color_hash('hello')
573 'rgb(34, 12, 59)'
574 >>> color_hash('other')
575 'rgb(90, 224, 159)'
576 """
577
578 color_dict = {}
579 cgenerator = unique_color_generator(
580 saturation=saturation, lightness=lightness)
581
582 def get_color_string(thing):
583 if thing in color_dict:
584 col = color_dict[thing]
585 else:
586 col = color_dict[thing] = cgenerator.next()
587 return "rgb(%s)" % (', '.join(col))
588
589 return get_color_string
590
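`unique_color_generator` steps the hue around the color wheel by the golden ratio, which keeps consecutive colors far apart while staying deterministic. A Python 3 sketch of the same idea, substituting the stdlib `colorsys` for the hand-rolled `hsv_to_rgb`:

```python
import colorsys

GOLDEN_RATIO = 0.618033988749895

def unique_colors(n, saturation=0.10, value=0.95):
    """Yield n deterministic, visually distinct ``rgb()`` strings."""
    h = 0.22717784590367374  # fixed starting hue, as in the source
    for _ in range(n):
        h = (h + GOLDEN_RATIO) % 1.0
        r, g, b = colorsys.hsv_to_rgb(h, saturation, value)
        yield 'rgb(%d, %d, %d)' % (int(r * 256), int(g * 256), int(b * 256))

colors = list(unique_colors(3))
print(colors)
```

Starting from a fixed hue makes the sequence reproducible, so the same input (for example a commit id hashed through `color_hasher`) always gets the same color across page loads.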
591
503 592 def get_lexer_safe(mimetype=None, filepath=None):
504 593 """
505 594 Tries to return a relevant pygments lexer using mimetype/filepath name,
@@ -536,92 +625,6 b' def pygmentize(filenode, **kwargs):'
536 625 CodeHtmlFormatter(**kwargs)))
537 626
538 627
539 def pygmentize_annotation(repo_name, filenode, **kwargs):
540 """
541 pygmentize function for annotation
542
543 :param filenode:
544 """
545
546 color_dict = {}
547
548 def gen_color(n=10000):
549 """generator for getting n of evenly distributed colors using
550 hsv color and golden ratio. It always return same order of colors
551
552 :returns: RGB tuple
553 """
554
555 def hsv_to_rgb(h, s, v):
556 if s == 0.0:
557 return v, v, v
558 i = int(h * 6.0) # XXX assume int() truncates!
559 f = (h * 6.0) - i
560 p = v * (1.0 - s)
561 q = v * (1.0 - s * f)
562 t = v * (1.0 - s * (1.0 - f))
563 i = i % 6
564 if i == 0:
565 return v, t, p
566 if i == 1:
567 return q, v, p
568 if i == 2:
569 return p, v, t
570 if i == 3:
571 return p, q, v
572 if i == 4:
573 return t, p, v
574 if i == 5:
575 return v, p, q
576
577 golden_ratio = 0.618033988749895
578 h = 0.22717784590367374
579
580 for _ in xrange(n):
581 h += golden_ratio
582 h %= 1
583 HSV_tuple = [h, 0.95, 0.95]
584 RGB_tuple = hsv_to_rgb(*HSV_tuple)
585 yield map(lambda x: str(int(x * 256)), RGB_tuple)
586
587 cgenerator = gen_color()
588
589 def get_color_string(commit_id):
590 if commit_id in color_dict:
591 col = color_dict[commit_id]
592 else:
593 col = color_dict[commit_id] = cgenerator.next()
594 return "color: rgb(%s)! important;" % (', '.join(col))
595
596 def url_func(repo_name):
597
598 def _url_func(commit):
599 author = commit.author
600 date = commit.date
601 message = tooltip(commit.message)
602
603 tooltip_html = ("<div style='font-size:0.8em'><b>Author:</b>"
604 " %s<br/><b>Date:</b> %s</b><br/><b>Message:"
605 "</b> %s<br/></div>")
606
607 tooltip_html = tooltip_html % (author, date, message)
608 lnk_format = '%5s:%s' % ('r%s' % commit.idx, commit.short_id)
609 uri = link_to(
610 lnk_format,
611 url('changeset_home', repo_name=repo_name,
612 revision=commit.raw_id),
613 style=get_color_string(commit.raw_id),
614 class_='tooltip',
615 title=tooltip_html
616 )
617
618 uri += '\n'
619 return uri
620 return _url_func
621
622 return literal(annotate_highlight(filenode, url_func(repo_name), **kwargs))
623
624
625 628 def is_following_repo(repo_name, user_id):
626 629 from rhodecode.model.scm import ScmModel
627 630 return ScmModel().is_following_repo(repo_name, user_id)
@@ -678,6 +681,29 b' class Flash(_Flash):'
678 681 session.save()
679 682 return messages
680 683
684 def json_alerts(self):
685 payloads = []
686 messages = flash.pop_messages()
687 if messages:
688 for message in messages:
689 subdata = {}
690 if hasattr(message.message, 'rsplit'):
691 flash_data = message.message.rsplit('|DELIM|', 1)
692 org_message = flash_data[0]
693 if len(flash_data) > 1:
694 subdata = json.loads(flash_data[1])
695 else:
696 org_message = message.message
697 payloads.append({
698 'message': {
699 'message': u'{}'.format(org_message),
700 'level': message.category,
701 'force': True,
702 'subdata': subdata
703 }
704 })
705 return json.dumps(payloads)
706
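`json_alerts` supports structured flash messages by splitting each message on a `|DELIM|` marker: the text before it is the human-readable message and the optional JSON after it becomes `subdata`. A minimal sketch of just that splitting step, with a hypothetical message payload:

```python
import json

def split_flash_message(raw):
    """Split 'text|DELIM|{json}' into (text, subdata dict)."""
    subdata = {}
    parts = raw.rsplit('|DELIM|', 1)
    if len(parts) > 1:
        subdata = json.loads(parts[1])
    return parts[0], subdata

# 'repo_id' here is a hypothetical subdata key for illustration only.
msg, subdata = split_flash_message(
    'Repository created|DELIM|{"repo_id": 42}')
print(msg, subdata)  # Repository created {'repo_id': 42}
```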
681 707 flash = Flash()
682 708
683 709 #==============================================================================
@@ -38,6 +38,13 b' from rhodecode.model.db import Repositor'
38 38 HookResponse = collections.namedtuple('HookResponse', ('status', 'output'))
39 39
40 40
41 def is_shadow_repo(extras):
42 """
43 Returns ``True`` if this is an action executed against a shadow repository.
44 """
45 return extras['is_shadow_repo']
46
47
41 48 def _get_scm_size(alias, root_path):
42 49
43 50 if not alias.startswith('.'):
@@ -85,7 +92,6 b' def pre_push(extras):'
85 92 """
86 93 usr = User.get_by_username(extras.username)
87 94
88
89 95 output = ''
90 96 if extras.locked_by[0] and usr.user_id != int(extras.locked_by[0]):
91 97 locked_by = User.get(extras.locked_by[0]).username
@@ -100,11 +106,12 b' def pre_push(extras):'
100 106 else:
101 107 raise _http_ret
102 108
103 # Calling hooks after checking the lock, for consistent behavior
104 pre_push_extension(repo_store_path=Repository.base_path(), **extras)
105
106 events.trigger(events.RepoPrePushEvent(repo_name=extras.repository,
107 extras=extras))
109 # Propagate to external components. This is done after checking the
110 # lock, for consistent behavior.
111 if not is_shadow_repo(extras):
112 pre_push_extension(repo_store_path=Repository.base_path(), **extras)
113 events.trigger(events.RepoPrePushEvent(
114 repo_name=extras.repository, extras=extras))
108 115
109 116 return HookResponse(0, output)
110 117
@@ -130,10 +137,12 b' def pre_pull(extras):'
130 137 else:
131 138 raise _http_ret
132 139
133 # Calling hooks after checking the lock, for consistent behavior
134 pre_pull_extension(**extras)
135 events.trigger(events.RepoPrePullEvent(repo_name=extras.repository,
136 extras=extras))
140 # Propagate to external components. This is done after checking the
141 # lock, for consistent behavior.
142 if not is_shadow_repo(extras):
143 pre_pull_extension(**extras)
144 events.trigger(events.RepoPrePullEvent(
145 repo_name=extras.repository, extras=extras))
137 146
138 147 return HookResponse(0, output)
139 148
@@ -144,14 +153,15 b' def post_pull(extras):'
144 153 action = 'pull'
145 154 action_logger(user, action, extras.repository, extras.ip, commit=True)
146 155
147 events.trigger(events.RepoPullEvent(repo_name=extras.repository,
148 extras=extras))
149 # extension hook call
150 post_pull_extension(**extras)
156 # Propagate to external components.
157 if not is_shadow_repo(extras):
158 post_pull_extension(**extras)
159 events.trigger(events.RepoPullEvent(
160 repo_name=extras.repository, extras=extras))
151 161
152 162 output = ''
153 163 # make lock is a tri state False, True, None. We only make lock on True
154 if extras.make_lock is True:
164 if extras.make_lock is True and not is_shadow_repo(extras):
155 165 Repository.lock(Repository.get_by_repo_name(extras.repository),
156 166 user.user_id,
157 167 lock_reason=Repository.LOCK_PULL)
@@ -179,19 +189,20 b' def post_push(extras):'
179 189 action_logger(
180 190 extras.username, action, extras.repository, extras.ip, commit=True)
181 191
182 events.trigger(events.RepoPushEvent(repo_name=extras.repository,
183 pushed_commit_ids=commit_ids,
184 extras=extras))
185
186 # extension hook call
187 post_push_extension(
188 repo_store_path=Repository.base_path(),
189 pushed_revs=commit_ids,
190 **extras)
192 # Propagate to external components.
193 if not is_shadow_repo(extras):
194 post_push_extension(
195 repo_store_path=Repository.base_path(),
196 pushed_revs=commit_ids,
197 **extras)
198 events.trigger(events.RepoPushEvent(
199 repo_name=extras.repository,
200 pushed_commit_ids=commit_ids,
201 extras=extras))
191 202
192 203 output = ''
193 204 # make lock is a tri state False, True, None. We only release lock on False
194 if extras.make_lock is False:
205 if extras.make_lock is False and not is_shadow_repo(extras):
195 206 Repository.unlock(Repository.get_by_repo_name(extras.repository))
196 207 msg = 'Released lock on repo `%s`\n' % extras.repository
197 208 output += msg
@@ -195,9 +195,8 b' class HttpHooksCallbackDaemon(ThreadedHo'
195 195 self._callback_thread = None
196 196
197 197
198 def prepare_callback_daemon(extras, protocol=None, use_direct_calls=False):
198 def prepare_callback_daemon(extras, protocol, use_direct_calls):
199 199 callback_daemon = None
200 protocol = protocol.lower() if protocol else None
201 200
202 201 if use_direct_calls:
203 202 callback_daemon = DummyHooksCallbackDaemon()
@@ -76,7 +76,8 b' class SimpleGit(simplevcs.SimpleVCS):'
76 76 return 'pull'
77 77
78 78 def _create_wsgi_app(self, repo_path, repo_name, config):
79 return self.scm_app.create_git_wsgi_app(repo_path, repo_name, config)
79 return self.scm_app.create_git_wsgi_app(
80 repo_path, repo_name, config)
80 81
81 82 def _create_config(self, extras, repo_name):
82 83 extras['git_update_server_info'] = utils2.str2bool(
@@ -71,7 +71,8 b' class SimpleHg(simplevcs.SimpleVCS):'
71 71 return 'pull'
72 72
73 73 def _create_wsgi_app(self, repo_path, repo_name, config):
74 return self.scm_app.create_hg_wsgi_app(repo_path, repo_name, config)
74 return self.scm_app.create_hg_wsgi_app(
75 repo_path, repo_name, config)
75 76
76 77 def _create_config(self, extras, repo_name):
77 78 config = utils.make_db_config(repo=repo_name)
@@ -89,10 +89,6 b' class SimpleSvnApp(object):'
89 89 if h.lower() not in self.IGNORED_HEADERS
90 90 ]
91 91
92 # Add custom response header to indicate that this is a VCS response
93 # and which backend is used.
94 headers.append(('X-RhodeCode-Backend', 'svn'))
95
96 92 return headers
97 93
98 94
@@ -26,6 +26,7 b" It's implemented with basic auth functio"
26 26 import os
27 27 import logging
28 28 import importlib
29 import re
29 30 from functools import wraps
30 31
31 32 from paste.httpheaders import REMOTE_USER, AUTH_TYPE
@@ -41,15 +42,16 b' from rhodecode.lib.exceptions import ('
41 42 NotAllowedToCreateUserError)
42 43 from rhodecode.lib.hooks_daemon import prepare_callback_daemon
43 44 from rhodecode.lib.middleware import appenlight
44 from rhodecode.lib.middleware.utils import scm_app
45 from rhodecode.lib.middleware.utils import scm_app, scm_app_http
45 46 from rhodecode.lib.utils import (
46 is_valid_repo, get_rhodecode_realm, get_rhodecode_base_path)
47 from rhodecode.lib.utils2 import safe_str, fix_PATH, str2bool
47 is_valid_repo, get_rhodecode_realm, get_rhodecode_base_path, SLUG_RE)
48 from rhodecode.lib.utils2 import safe_str, fix_PATH, str2bool, safe_unicode
48 49 from rhodecode.lib.vcs.conf import settings as vcs_settings
49 50 from rhodecode.lib.vcs.backends import base
50 51 from rhodecode.model import meta
51 from rhodecode.model.db import User, Repository
52 from rhodecode.model.db import User, Repository, PullRequest
52 53 from rhodecode.model.scm import ScmModel
54 from rhodecode.model.pull_request import PullRequestModel
53 55
54 56
55 57 log = logging.getLogger(__name__)
@@ -83,12 +85,26 b' class SimpleVCS(object):'
83 85
84 86 SCM = 'unknown'
85 87
88 acl_repo_name = None
89 url_repo_name = None
90 vcs_repo_name = None
91
 92     # We have to handle requests to shadow repositories differently from
 93     # requests to normal repositories, so we have to distinguish them. To
 94     # do this we use a regex which matches only on URLs pointing to shadow
 95     # repositories.
96 shadow_repo_re = re.compile(
97 '(?P<groups>(?:{slug_pat}/)*)' # repo groups
98 '(?P<target>{slug_pat})/' # target repo
99 'pull-request/(?P<pr_id>\d+)/' # pull request
100 'repository$' # shadow repo
101 .format(slug_pat=SLUG_RE.pattern))
102
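`shadow_repo_re` is what lets the handler tell shadow-repository URLs (pull-request checkouts) apart from ordinary repository paths. A standalone sketch of the same regex, substituting a simplified slug pattern for `SLUG_RE.pattern` (which lives in `rhodecode.lib.utils` and is not shown in this changeset):

```python
import re

# Simplified stand-in for SLUG_RE.pattern; the real pattern may differ.
slug_pat = r'[a-zA-Z0-9._-]+'

shadow_repo_re = re.compile(
    r'(?P<groups>(?:{slug_pat}/)*)'   # optional repo groups
    r'(?P<target>{slug_pat})/'        # target repo of the pull request
    r'pull-request/(?P<pr_id>\d+)/'   # pull request id
    r'repository$'                    # shadow repo marker
    .format(slug_pat=slug_pat))

m = shadow_repo_re.match('RepoGroup/MyRepo/pull-request/3/repository')
print(m.group('groups'), m.group('target'), m.group('pr_id'))
# RepoGroup/ MyRepo 3
```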
86 103 def __init__(self, application, config, registry):
87 104 self.registry = registry
88 105 self.application = application
89 106 self.config = config
90 107 # re-populated by specialized middleware
91 self.repo_name = None
92 108 self.repo_vcs_config = base.Config()
93 109
94 110 # base path of repo locations
@@ -101,16 +117,73 b' class SimpleVCS(object):'
101 117 auth_ret_code_detection)
102 118 self.ip_addr = '0.0.0.0'
103 119
120 def set_repo_names(self, environ):
121 """
122 This will populate the attributes acl_repo_name, url_repo_name,
 123         vcs_repo_name and is_shadow_repo. In case of requests to normal
 124         (non-shadow) repositories all names are equal. In case of requests to a
125 shadow repository the acl-name points to the target repo of the pull
126 request and the vcs-name points to the shadow repo file system path.
127 The url-name is always the URL used by the vcs client program.
128
129 Example in case of a shadow repo:
130 acl_repo_name = RepoGroup/MyRepo
131 url_repo_name = RepoGroup/MyRepo/pull-request/3/repository
 132             vcs_repo_name = /repo/base/path/RepoGroup/.__shadow_MyRepo_pr-3
133 """
134 # First we set the repo name from URL for all attributes. This is the
135 # default if handling normal (non shadow) repo requests.
136 self.url_repo_name = self._get_repository_name(environ)
137 self.acl_repo_name = self.vcs_repo_name = self.url_repo_name
138 self.is_shadow_repo = False
139
140 # Check if this is a request to a shadow repository.
141 match = self.shadow_repo_re.match(self.url_repo_name)
142 if match:
143 match_dict = match.groupdict()
144
145 # Build acl repo name from regex match.
146 acl_repo_name = safe_unicode('{groups}{target}'.format(
147 groups=match_dict['groups'] or '',
148 target=match_dict['target']))
149
150 # Retrieve pull request instance by ID from regex match.
151 pull_request = PullRequest.get(match_dict['pr_id'])
152
153 # Only proceed if we got a pull request and if acl repo name from
154 # URL equals the target repo name of the pull request.
155 if pull_request and (acl_repo_name ==
156 pull_request.target_repo.repo_name):
157 # Get file system path to shadow repository.
158 workspace_id = PullRequestModel()._workspace_id(pull_request)
159 target_vcs = pull_request.target_repo.scm_instance()
160 vcs_repo_name = target_vcs._get_shadow_repository_path(
161 workspace_id)
162
163 # Store names for later usage.
164 self.vcs_repo_name = vcs_repo_name
165 self.acl_repo_name = acl_repo_name
166 self.is_shadow_repo = True
167
168 log.debug('Repository names: %s', {
169 'acl_repo_name': self.acl_repo_name,
170 'url_repo_name': self.url_repo_name,
171 'vcs_repo_name': self.vcs_repo_name,
172 })
173
104 174 @property
105 175 def scm_app(self):
106 custom_implementation = self.config.get('vcs.scm_app_implementation')
107 if custom_implementation and custom_implementation != 'pyro4':
108 log.info(
109 "Using custom implementation of scm_app: %s",
110 custom_implementation)
176 custom_implementation = self.config['vcs.scm_app_implementation']
177 if custom_implementation == 'http':
178 log.info('Using HTTP implementation of scm app.')
179 scm_app_impl = scm_app_http
180 elif custom_implementation == 'pyro4':
181 log.info('Using Pyro implementation of scm app.')
182 scm_app_impl = scm_app
183 else:
184 log.info('Using custom implementation of scm_app: "{}"'.format(
185 custom_implementation))
111 186 scm_app_impl = importlib.import_module(custom_implementation)
112 else:
113 scm_app_impl = scm_app
114 187 return scm_app_impl
115 188
116 189 def _get_by_id(self, repo_name):
@@ -149,7 +222,7 b' class SimpleVCS(object):'
149 222 repo_name, db_repo.repo_type, scm_type)
150 223 return False
151 224
152 return is_valid_repo(repo_name, base_path, expect_scm=scm_type)
225 return is_valid_repo(repo_name, base_path, explicit_scm=scm_type)
153 226
154 227 def valid_and_active_user(self, user):
155 228 """
@@ -233,11 +306,11 b' class SimpleVCS(object):'
233 306 log.debug('User not allowed to proceed, %s', reason)
234 307 return HTTPNotAcceptable(reason)(environ, start_response)
235 308
236 if not self.repo_name:
237 log.warning('Repository name is empty: %s', self.repo_name)
309 if not self.url_repo_name:
310 log.warning('Repository name is empty: %s', self.url_repo_name)
238 311 # failed to get repo name, we fail now
239 312 return HTTPNotFound()(environ, start_response)
240 log.debug('Extracted repo name is %s', self.repo_name)
313 log.debug('Extracted repo name is %s', self.url_repo_name)
241 314
242 315 ip_addr = get_ip_addr(environ)
243 316 username = None
@@ -251,6 +324,15 b' class SimpleVCS(object):'
251 324 action = self._get_action(environ)
252 325
253 326 # ======================================================================
327 # Check if this is a request to a shadow repository of a pull request.
328 # In this case only pull action is allowed.
329 # ======================================================================
330 if self.is_shadow_repo and action != 'pull':
331 reason = 'Only pull action is allowed for shadow repositories.'
332 log.debug('User not allowed to proceed, %s', reason)
333 return HTTPNotAcceptable(reason)(environ, start_response)
334
335 # ======================================================================
254 336 # CHECK ANONYMOUS PERMISSION
255 337 # ======================================================================
256 338 if action in ['pull', 'push']:
@@ -259,7 +341,7 b' class SimpleVCS(object):'
259 341 if anonymous_user.active:
260 342 # ONLY check permissions if the user is activated
261 343 anonymous_perm = self._check_permission(
262 action, anonymous_user, self.repo_name, ip_addr)
344 action, anonymous_user, self.acl_repo_name, ip_addr)
263 345 else:
264 346 anonymous_perm = False
265 347
@@ -324,7 +406,7 b' class SimpleVCS(object):'
324 406
325 407 # check permissions for this repository
326 408 perm = self._check_permission(
327 action, user, self.repo_name, ip_addr)
409 action, user, self.acl_repo_name, ip_addr)
328 410 if not perm:
329 411 return HTTPForbidden()(environ, start_response)
330 412
@@ -332,30 +414,31 b' class SimpleVCS(object):'
332 414 # in hooks executed by rhodecode
333 415 check_locking = _should_check_locking(environ.get('QUERY_STRING'))
334 416 extras = vcs_operation_context(
335 environ, repo_name=self.repo_name, username=username,
336 action=action, scm=self.SCM,
337 check_locking=check_locking)
417 environ, repo_name=self.acl_repo_name, username=username,
418 action=action, scm=self.SCM, check_locking=check_locking,
419 is_shadow_repo=self.is_shadow_repo
420 )
338 421
339 422 # ======================================================================
340 423 # REQUEST HANDLING
341 424 # ======================================================================
342 str_repo_name = safe_str(self.repo_name)
343 repo_path = os.path.join(safe_str(self.basepath), str_repo_name)
425 repo_path = os.path.join(
426 safe_str(self.basepath), safe_str(self.vcs_repo_name))
344 427 log.debug('Repository path is %s', repo_path)
345 428
346 429 fix_PATH()
347 430
348 431 log.info(
349 432 '%s action on %s repo "%s" by "%s" from %s',
350 action, self.SCM, str_repo_name, safe_str(username), ip_addr)
433 action, self.SCM, safe_str(self.url_repo_name),
434 safe_str(username), ip_addr)
351 435
352 436 return self._generate_vcs_response(
353 environ, start_response, repo_path, self.repo_name, extras, action)
437 environ, start_response, repo_path, extras, action)
354 438
355 439 @initialize_generator
356 440 def _generate_vcs_response(
357 self, environ, start_response, repo_path, repo_name, extras,
358 action):
441 self, environ, start_response, repo_path, extras, action):
359 442 """
360 443 Returns a generator for the response content.
361 444
@@ -365,9 +448,9 b' class SimpleVCS(object):'
365 448 the first chunk is produced by the underlying WSGI application.
366 449 """
367 450 callback_daemon, extras = self._prepare_callback_daemon(extras)
368 config = self._create_config(extras, repo_name)
451 config = self._create_config(extras, self.acl_repo_name)
369 452 log.debug('HOOKS extras is %s', extras)
370 app = self._create_wsgi_app(repo_path, repo_name, config)
453 app = self._create_wsgi_app(repo_path, self.url_repo_name, config)
371 454
372 455 try:
373 456 with callback_daemon:
@@ -386,6 +469,8 b' class SimpleVCS(object):'
386 469 for chunk in response:
387 470 yield chunk
388 471 except Exception as exc:
472 # TODO: martinb: Exceptions are only raised in case of the Pyro4
473 # backend. Refactor this except block after dropping Pyro4 support.
389 474 # TODO: johbo: Improve "translating" back the exception.
390 475 if getattr(exc, '_vcs_kind', None) == 'repo_locked':
391 476 exc = HTTPLockedRC(*exc.args)
@@ -404,7 +489,7 b' class SimpleVCS(object):'
404 489 # invalidate cache on push
405 490 try:
406 491 if action == 'push':
407 self._invalidate_cache(repo_name)
492 self._invalidate_cache(self.url_repo_name)
408 493 finally:
409 494 meta.Session.remove()
410 495
@@ -39,12 +39,12 b' log = logging.getLogger(__name__)'
39 39
40 40 def create_git_wsgi_app(repo_path, repo_name, config):
41 41 url = _vcs_streaming_url() + 'git/'
42 return VcsHttpProxy(url, repo_path, repo_name, config, 'git')
42 return VcsHttpProxy(url, repo_path, repo_name, config)
43 43
44 44
45 45 def create_hg_wsgi_app(repo_path, repo_name, config):
46 46 url = _vcs_streaming_url() + 'hg/'
47 return VcsHttpProxy(url, repo_path, repo_name, config, 'hg')
47 return VcsHttpProxy(url, repo_path, repo_name, config)
48 48
49 49
50 50 def _vcs_streaming_url():
@@ -67,7 +67,7 b' class VcsHttpProxy(object):'
67 67 server as well.
68 68 """
69 69
70 def __init__(self, url, repo_path, repo_name, config, backend):
70 def __init__(self, url, repo_path, repo_name, config):
71 71 """
72 72 :param str url: The URL of the VCSServer to call.
73 73 """
@@ -75,14 +75,11 b' class VcsHttpProxy(object):'
75 75 self._repo_name = repo_name
76 76 self._repo_path = repo_path
77 77 self._config = config
78 self._backend = backend
79 78 log.debug(
80 79 "Creating VcsHttpProxy for repo %s, url %s",
81 80 repo_name, url)
82 81
83 82 def __call__(self, environ, start_response):
84 status = '200 OK'
85
86 83 config = msgpack.packb(self._config)
87 84 request = webob.request.Request(environ)
88 85 request_headers = request.headers
@@ -93,6 +90,7 b' class VcsHttpProxy(object):'
93 90 'X-RC-Path-Info': environ['PATH_INFO'],
94 91 # TODO: johbo: Avoid encoding and put this into payload?
95 92 'X-RC-Repo-Config': base64.b64encode(config),
93 'X-RC-Locked-Status-Code': rhodecode.CONFIG.get('lock_ret_code')
96 94 })
97 95
98 96 data = environ['wsgi.input'].read()
@@ -116,12 +114,11 b' class VcsHttpProxy(object):'
116 114 if not wsgiref.util.is_hop_by_hop(h)
117 115 ]
118 116
119 # Add custom response header to indicate that this is a VCS response
120 # and which backend is used.
121 response_headers.append(('X-RhodeCode-Backend', self._backend))
 117         # Build status argument for the start_response callable.
118 status = '{status_code} {reason_phrase}'.format(
119 status_code=response.status_code,
120 reason_phrase=response.reason)
122 121
123 # TODO: johbo: Better way to get the status including text?
124 status = str(response.status_code)
125 122 start_response(status, response_headers)
126 123 return _maybe_stream(response)
127 124
@@ -174,25 +174,31 b' class VCSMiddleware(object):'
174 174 # translate the _REPO_ID into real repo NAME for usage
175 175 # in middleware
176 176 environ['PATH_INFO'] = vcs_handler._get_by_id(environ['PATH_INFO'])
177 repo_name = vcs_handler._get_repository_name(environ)
177
178 # Set acl, url and vcs repo names.
179 vcs_handler.set_repo_names(environ)
178 180
179 181 # check for type, presence in database and on filesystem
180 182 if not vcs_handler.is_valid_and_existing_repo(
181 repo_name, vcs_handler.basepath, vcs_handler.SCM):
183 vcs_handler.acl_repo_name,
184 vcs_handler.basepath,
185 vcs_handler.SCM):
182 186 return HTTPNotFound()(environ, start_response)
183 187
184 188 # TODO: johbo: Needed for the Pyro4 backend and Mercurial only.
185 189 # Remove once we fully switched to the HTTP backend.
186 environ['REPO_NAME'] = repo_name
190 environ['REPO_NAME'] = vcs_handler.url_repo_name
187 191
188 # register repo_name and it's config back to the handler
189 vcs_handler.repo_name = repo_name
190 vcs_handler.repo_vcs_config = self.vcs_config(repo_name)
192 # register repo config back to the handler
193 vcs_handler.repo_vcs_config = self.vcs_config(
194 vcs_handler.acl_repo_name)
191 195
196 # Wrap handler in middlewares if they are enabled.
192 197 vcs_handler = self.wrap_in_gzip_if_enabled(
193 198 vcs_handler, self.config)
194 199 vcs_handler, _ = wrap_in_appenlight_if_enabled(
195 200 vcs_handler, self.config, self.appenlight_client)
201
196 202 return vcs_handler(environ, start_response)
197 203
198 204 return self.application(environ, start_response)
@@ -33,6 +33,7 b' import tempfile'
33 33 import traceback
34 34 import tarfile
35 35 import warnings
36 import hashlib
36 37 from os.path import join as jn
37 38
38 39 import paste
@@ -58,26 +59,21 b' log = logging.getLogger(__name__)'
58 59
59 60 REMOVED_REPO_PAT = re.compile(r'rm__\d{8}_\d{6}_\d{6}__.*')
60 61
61 _license_cache = None
62
62 # String which contains characters that are not allowed in slug names for
63 # repositories or repository groups. It is properly escaped to use it in
64 # regular expressions.
65 SLUG_BAD_CHARS = re.escape('`?=[]\;\'"<>,/~!@#$%^&*()+{}|:')
63 66
64 def recursive_replace(str_, replace=' '):
65 """
66 Recursive replace of given sign to just one instance
67
68 :param str_: given string
69 :param replace: char to find and replace multiple instances
67 # Regex that matches forbidden characters in repo/group slugs.
68 SLUG_BAD_CHAR_RE = re.compile('[{}]'.format(SLUG_BAD_CHARS))
70 69
71 Examples::
72 >>> recursive_replace("Mighty---Mighty-Bo--sstones",'-')
73 'Mighty-Mighty-Bo-sstones'
74 """
70 # Regex that matches allowed characters in repo/group slugs.
71 SLUG_GOOD_CHAR_RE = re.compile('[^{}]'.format(SLUG_BAD_CHARS))
75 72
76 if str_.find(replace * 2) == -1:
77 return str_
78 else:
79 str_ = str_.replace(replace * 2, replace)
80 return recursive_replace(str_, replace)
73 # Regex that matches whole repo/group slugs.
74 SLUG_RE = re.compile('[^{}]+'.format(SLUG_BAD_CHARS))
75
76 _license_cache = None
81 77
82 78
83 79 def repo_name_slug(value):
@@ -86,14 +82,12 b' def repo_name_slug(value):'
86 82 This function is called on each creation/modification
87 83 of repository to prevent bad names in repo
88 84 """
85 replacement_char = '-'
89 86
90 87 slug = remove_formatting(value)
91 slug = strip_tags(slug)
92
93 for c in """`?=[]\;'"<>,/~!@#$%^&*()+{}|: """:
94 slug = slug.replace(c, '-')
95 slug = recursive_replace(slug, '-')
96 slug = collapse(slug, '-')
88 slug = SLUG_BAD_CHAR_RE.sub('', slug)
89 slug = re.sub('[\s]+', '-', slug)
90 slug = collapse(slug, replacement_char)
97 91 return slug
98 92
99 93
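The rewritten `repo_name_slug` above drops forbidden characters via `SLUG_BAD_CHAR_RE`, turns whitespace runs into dashes, and collapses repeated dashes. A sketch of that pipeline, assuming the `SLUG_BAD_CHARS` definition from the earlier hunk; `remove_formatting`/`strip_tags` are skipped here and `collapse` is approximated with a regex, since their implementations are not part of this diff:

```python
import re

# Characters not allowed in repo/group slugs, escaped for use in a class.
SLUG_BAD_CHARS = re.escape('`?=[]\\;\'"<>,/~!@#$%^&*()+{}|:')
SLUG_BAD_CHAR_RE = re.compile('[{}]'.format(SLUG_BAD_CHARS))

def repo_name_slug(value):
    # Drop forbidden characters entirely.
    slug = SLUG_BAD_CHAR_RE.sub('', value)
    # Whitespace runs become a single dash.
    slug = re.sub(r'\s+', '-', slug)
    # Approximation of collapse(): squash repeated dashes.
    slug = re.sub(r'-{2,}', '-', slug)
    return slug

print(repo_name_slug('My  Repo / Name!'))  # My-Repo-Name
```

Note the change in behavior versus the removed code: bad characters are now deleted rather than each replaced with a dash.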
@@ -1009,3 +1003,17 b' def get_registry(request):'
1009 1003 return request.registry
1010 1004 except AttributeError:
1011 1005 return get_current_registry()
1006
1007
1008 def generate_platform_uuid():
1009 """
1010 Generates platform UUID based on its name
1011 """
1012 import platform
1013
1014 try:
1015 uuid_list = [platform.platform()]
1016 return hashlib.sha256(':'.join(uuid_list)).hexdigest()
1017 except Exception as e:
1018 log.error('Failed to generate host uuid: %s' % e)
1019 return 'UNDEFINED'
@@ -96,7 +96,7 b' def __get_lem(extra_mapping=None):'
96 96
97 97 def str2bool(_str):
98 98 """
99 returs True/False value from given string, it tries to translate the
99 returns True/False value from given string, it tries to translate the
100 100 string into boolean
101 101
102 102 :param _str: string value to translate into boolean
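Only the docstring of `str2bool` is touched here; its body is not shown in this diff. For context, a minimal sketch of such a string-to-boolean translator — the exact set of accepted truthy tokens is an assumption, not taken from the source:

```python
def str2bool(_str):
    # Pass booleans through, treat None as False, and accept a few
    # common truthy spellings; everything else maps to False.
    if _str is None:
        return False
    if _str in (True, False):
        return _str
    return str(_str).strip().lower() in ('t', 'true', 'y', 'yes', 'on', '1')

print(str2bool('Yes'))  # True
print(str2bool('0'))    # False
```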
@@ -893,3 +893,44 b' def get_routes_generator_for_server_url('
893 893 environ['wsgi.url_scheme'] = 'https'
894 894
895 895 return routes.util.URLGenerator(rhodecode.CONFIG['routes.map'], environ)
896
897
898 def glob2re(pat):
899 """
900 Translate a shell PATTERN to a regular expression.
901
902 There is no way to quote meta-characters.
903 """
904
905 i, n = 0, len(pat)
906 res = ''
907 while i < n:
908 c = pat[i]
909 i = i+1
910 if c == '*':
911 #res = res + '.*'
912 res = res + '[^/]*'
913 elif c == '?':
914 #res = res + '.'
915 res = res + '[^/]'
916 elif c == '[':
917 j = i
918 if j < n and pat[j] == '!':
919 j = j+1
920 if j < n and pat[j] == ']':
921 j = j+1
922 while j < n and pat[j] != ']':
923 j = j+1
924 if j >= n:
925 res = res + '\\['
926 else:
927 stuff = pat[i:j].replace('\\','\\\\')
928 i = j+1
929 if stuff[0] == '!':
930 stuff = '^' + stuff[1:]
931 elif stuff[0] == '^':
932 stuff = '\\' + stuff
933 res = '%s[%s]' % (res, stuff)
934 else:
935 res = res + re.escape(c)
936 return res + '\Z(?ms)'
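The `glob2re` helper added above translates a shell pattern into a regex in which `*` and `?` deliberately never cross a path separator. The sketch below reproduces it for a quick check, with one adjustment: the trailing `(?ms)` inline flags are moved to the front of the pattern, because trailing global flags are rejected by modern Python (3.11+):

```python
import re

def glob2re(pat):
    # Translate a shell PATTERN to a regex; '*' and '?' match within a
    # single path segment only, and '[...]' classes pass through.
    i, n = 0, len(pat)
    res = ''
    while i < n:
        c = pat[i]
        i += 1
        if c == '*':
            res += '[^/]*'
        elif c == '?':
            res += '[^/]'
        elif c == '[':
            j = i
            if j < n and pat[j] == '!':
                j += 1
            if j < n and pat[j] == ']':
                j += 1
            while j < n and pat[j] != ']':
                j += 1
            if j >= n:
                res += '\\['
            else:
                stuff = pat[i:j].replace('\\', '\\\\')
                i = j + 1
                if stuff[0] == '!':
                    stuff = '^' + stuff[1:]
                elif stuff[0] == '^':
                    stuff = '\\' + stuff
                res = '%s[%s]' % (res, stuff)
        else:
            res += re.escape(c)
    return '(?ms)' + res + r'\Z'

print(bool(re.match(glob2re('*.py'), 'setup.py')))    # True
print(bool(re.match(glob2re('*.py'), 'src/app.py')))  # False
```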
@@ -35,7 +35,7 b" VERSION = (0, 5, 0, 'dev')"
35 35
36 36 import atexit
37 37 import logging
38 import subprocess
38 import subprocess32
39 39 import time
40 40 import urlparse
41 41 from cStringIO import StringIO
@@ -46,7 +46,7 b' from Pyro4.errors import CommunicationEr'
46 46 from rhodecode.lib.vcs.conf import settings
47 47 from rhodecode.lib.vcs.backends import get_vcs_instance, get_backend
48 48 from rhodecode.lib.vcs.exceptions import (
49 VCSError, RepositoryError, CommitError)
49 VCSError, RepositoryError, CommitError, VCSCommunicationError)
50 50
51 51 log = logging.getLogger(__name__)
52 52
@@ -99,6 +99,7 b' def connect_pyro4(server_and_port):'
99 99 connection.Git = None
100 100 connection.Hg = None
101 101 connection.Svn = None
102 connection.Service = None
102 103
103 104
104 105 def connect_http(server_and_port):
@@ -108,11 +109,13 b' def connect_http(server_and_port):'
108 109 session_factory = client_http.ThreadlocalSessionFactory()
109 110
110 111 connection.Git = client_http.RepoMaker(
111 server_and_port, '/git', session_factory)
112 server_and_port, '/git', 'git', session_factory)
112 113 connection.Hg = client_http.RepoMaker(
113 server_and_port, '/hg', session_factory)
114 server_and_port, '/hg', 'hg', session_factory)
114 115 connection.Svn = client_http.RepoMaker(
115 server_and_port, '/svn', session_factory)
116 server_and_port, '/svn', 'svn', session_factory)
117 connection.Service = client_http.ServiceConnection(
118 server_and_port, '/_service', session_factory)
116 119
117 120 scm_app.HG_REMOTE_WSGI = client_http.VcsHttpProxy(
118 121 server_and_port, '/proxy/hg')
@@ -124,9 +127,10 b' def connect_http(server_and_port):'
124 127 connection.Git = None
125 128 connection.Hg = None
126 129 connection.Svn = None
130 connection.Service = None
127 131
128 132
129 def connect_vcs(server_and_port, protocol='pyro4'):
133 def connect_vcs(server_and_port, protocol):
130 134 """
131 135 Initializes the connection to the vcs server.
132 136
@@ -137,11 +141,13 b' def connect_vcs(server_and_port, protoco'
137 141 connect_pyro4(server_and_port)
138 142 elif protocol == 'http':
139 143 connect_http(server_and_port)
144 else:
145 raise Exception('Invalid vcs server protocol "{}"'.format(protocol))
140 146
141 147
142 148 # TODO: johbo: This function should be moved into our test suite, there is
143 149 # no reason to support starting the vcsserver in Enterprise itself.
144 def start_vcs_server(server_and_port, protocol='pyro4', log_level=None):
150 def start_vcs_server(server_and_port, protocol, log_level=None):
145 151 """
146 152 Starts the vcs server in a subprocess.
147 153 """
@@ -150,10 +156,12 b' def start_vcs_server(server_and_port, pr'
150 156 return _start_http_vcs_server(server_and_port, log_level)
151 157 elif protocol == 'pyro4':
152 158 return _start_pyro4_vcs_server(server_and_port, log_level)
159 else:
160 raise Exception('Invalid vcs server protocol "{}"'.format(protocol))
153 161
154 162
155 163 def _start_pyro4_vcs_server(server_and_port, log_level=None):
156 _try_to_shutdown_running_server(server_and_port)
164 _try_to_shutdown_running_server(server_and_port, protocol='pyro4')
157 165 host, port = server_and_port.rsplit(":", 1)
158 166 host = host.strip('[]')
159 167 args = [
@@ -161,7 +169,7 b' def _start_pyro4_vcs_server(server_and_p'
161 169 '--threadpool', '32']
162 170 if log_level:
163 171 args += ['--log-level', log_level]
164 proc = subprocess.Popen(args)
172 proc = subprocess32.Popen(args)
165 173
166 174 def cleanup_server_process():
167 175 proc.kill()
@@ -176,9 +184,9 b' def _start_http_vcs_server(server_and_po'
176 184
177 185 host, port = server_and_port.rsplit(":", 1)
178 186 args = [
179 'pserve', 'vcsserver/development_pyramid.ini',
187 'pserve', 'rhodecode/tests/vcsserver_http.ini',
180 188 'http_port=%s' % (port, ), 'http_host=%s' % (host, )]
181 proc = subprocess.Popen(args)
189 proc = subprocess32.Popen(args)
182 190
183 191 def cleanup_server_process():
184 192 proc.kill()
@@ -188,18 +196,22 b' def _start_http_vcs_server(server_and_po'
188 196 _wait_until_vcs_server_is_reachable(server)
189 197
190 198
191 def _wait_until_vcs_server_is_reachable(server):
192 while xrange(80): # max 40s of sleep
199 def _wait_until_vcs_server_is_reachable(server, timeout=40):
200 begin = time.time()
201 while (time.time() - begin) < timeout:
193 202 try:
194 203 server.ping()
195 break
196 except (CommunicationError, pycurl.error):
197 pass
204 return
205 except (VCSCommunicationError, CommunicationError, pycurl.error):
206 log.debug('VCSServer not started yet, retry to connect.')
198 207 time.sleep(0.5)
208 raise Exception(
209 'Starting the VCSServer failed or took more than {} '
210 'seconds.'.format(timeout))
199 211
200 212
201 def _try_to_shutdown_running_server(server_and_port):
202 server = create_vcsserver_proxy(server_and_port)
213 def _try_to_shutdown_running_server(server_and_port, protocol):
214 server = create_vcsserver_proxy(server_and_port, protocol)
203 215 try:
204 216 server.shutdown()
205 217 except (CommunicationError, pycurl.error):
@@ -207,15 +219,17 b' def _try_to_shutdown_running_server(serv'
207 219
208 220 # TODO: Not sure why this is important, but without it the following start
209 221 # of the server fails.
210 server = create_vcsserver_proxy(server_and_port)
222 server = create_vcsserver_proxy(server_and_port, protocol)
211 223 server.ping()
212 224
213 225
214 def create_vcsserver_proxy(server_and_port, protocol='pyro4'):
226 def create_vcsserver_proxy(server_and_port, protocol):
215 227 if protocol == 'pyro4':
216 228 return _create_vcsserver_proxy_pyro4(server_and_port)
217 229 elif protocol == 'http':
218 230 return _create_vcsserver_proxy_http(server_and_port)
231 else:
232 raise Exception('Invalid vcs server protocol "{}"'.format(protocol))
219 233
220 234
221 235 def _create_vcsserver_proxy_pyro4(server_and_port):
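The rewritten `_wait_until_vcs_server_is_reachable` above replaces the buggy `while xrange(80)` loop (always truthy, so effectively unbounded) with a real wall-clock timeout. The retry pattern can be sketched with an injectable ping callable, so it is testable without a running VCSServer:

```python
import time

def wait_until_reachable(ping, timeout=40, interval=0.01):
    # Poll ping() until it succeeds or the timeout elapses.
    begin = time.time()
    while (time.time() - begin) < timeout:
        try:
            ping()
            return
        except Exception:
            time.sleep(interval)
    raise Exception(
        'Starting the VCSServer failed or took more than {} '
        'seconds.'.format(timeout))

# Simulate a server that only answers on the third ping.
attempts = {'count': 0}
def flaky_ping():
    attempts['count'] += 1
    if attempts['count'] < 3:
        raise IOError('not up yet')

wait_until_reachable(flaky_ping, timeout=5)
print(attempts['count'])  # 3
```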
@@ -22,6 +22,7 b''
22 22 VCS Backends module
23 23 """
24 24
25 import os
25 26 import logging
26 27
27 28 from pprint import pformat
@@ -42,11 +43,21 b' def get_vcs_instance(repo_path, *args, *'
42 43 for the path it returns None. Arguments and keyword arguments are passed
43 44 to the vcs backend repository class.
44 45 """
46 from rhodecode.lib.utils2 import safe_str
47
48 explicit_vcs_alias = kwargs.pop('_vcs_alias', None)
45 49 try:
46 vcs_alias = get_scm(repo_path)[0]
50 vcs_alias = safe_str(explicit_vcs_alias or get_scm(repo_path)[0])
47 51 log.debug(
48 'Creating instance of %s repository from %s', vcs_alias, repo_path)
52 'Creating instance of %s repository from %s', vcs_alias,
53 safe_str(repo_path))
49 54 backend = get_backend(vcs_alias)
55
56 if explicit_vcs_alias:
57 # do final verification of existence of the path, this does the
58 # same as get_scm() call which we skip in explicit_vcs_alias
59 if not os.path.isdir(repo_path):
60 raise VCSError("Given path %s is not a directory" % repo_path)
50 61 except VCSError:
51 62 log.exception(
52 63 'Perhaps this repository is in db and not in '
@@ -76,3 +87,12 b' def get_supported_backends():'
76 87 Returns list of aliases of supported backends.
77 88 """
78 89 return settings.BACKENDS.keys()
90
91
92 def get_vcsserver_version():
93 from rhodecode.lib.vcs import connection
94 data = connection.Service.get_vcsserver_service_data()
95 if data and 'version' in data:
96 return data['version']
97
98 return None
@@ -53,7 +53,7 b' FILEMODE_EXECUTABLE = 0100755'
53 53 Reference = collections.namedtuple('Reference', ('type', 'name', 'commit_id'))
54 54 MergeResponse = collections.namedtuple(
55 55 'MergeResponse',
56 ('possible', 'executed', 'merge_commit_id', 'failure_reason'))
56 ('possible', 'executed', 'merge_ref', 'failure_reason'))
57 57
58 58
59 59 class MergeFailureReason(object):
@@ -94,8 +94,49 b' class MergeFailureReason(object):'
94 94 # The target repository is locked
95 95 TARGET_IS_LOCKED = 7
96 96
97 # Deprecated, use MISSING_TARGET_REF or MISSING_SOURCE_REF instead.
97 98 # An involved commit could not be found.
98 MISSING_COMMIT = 8
99 _DEPRECATED_MISSING_COMMIT = 8
100
101 # The target repo reference is missing.
102 MISSING_TARGET_REF = 9
103
104 # The source repo reference is missing.
105 MISSING_SOURCE_REF = 10
106
107 # The merge was not successful; there are conflicts related to sub
108 # repositories.
109 SUBREPO_MERGE_FAILED = 11
110
111
112 class UpdateFailureReason(object):
113 """
114 Enumeration with all the reasons why the pull request update could fail.
115
116 DO NOT change the number of the reasons, as they may be stored in the
117 database.
118
119 Changing the name of a reason is acceptable and encouraged to deprecate old
120 reasons.
121 """
122
123 # Everything went well.
124 NONE = 0
125
126 # An unexpected exception was raised. Check the logs for more details.
127 UNKNOWN = 1
128
129 # The pull request is up to date.
130 NO_CHANGE = 2
131
132 # The pull request has a reference type that is not supported for update.
133 WRONG_REF_TYPE = 3
134
135 # Update failed because the target reference is missing.
136 MISSING_TARGET_REF = 4
137
138 # Update failed because the source reference is missing.
139 MISSING_SOURCE_REF = 5
99 140
100 141
101 142 class BaseRepository(object):
@@ -393,9 +434,9 b' class BaseRepository(object):'
393 434 on top of the target instead of being merged.
394 435 """
395 436 if dry_run:
396 message = message or 'sample_message'
397 user_email = user_email or 'user@email.com'
398 user_name = user_name or 'user name'
437 message = message or 'dry_run_merge_message'
438 user_email = user_email or 'dry-run-merge@rhodecode.com'
439 user_name = user_name or 'Dry-Run User'
399 440 else:
400 441 if not user_name:
401 442 raise ValueError('user_name cannot be empty')
@@ -421,7 +462,7 b' class BaseRepository(object):'
421 462
422 463 def _merge_repo(self, shadow_repository_path, target_ref,
423 464 source_repo, source_ref, merge_message,
424 merger_name, merger_email, dry_run=False):
465 merger_name, merger_email, dry_run=False, use_rebase=False):
425 466 """Internal implementation of merge."""
426 467 raise NotImplementedError
427 468
@@ -628,7 +669,8 b' class BaseCommit(object):'
628 669 return u'%s:%s' % (self.idx, self.short_id)
629 670
630 671 def __eq__(self, other):
631 return self.raw_id == other.raw_id
672 same_instance = isinstance(other, self.__class__)
673 return same_instance and self.raw_id == other.raw_id
632 674
633 675 def __json__(self):
634 676 parents = []
@@ -860,7 +902,7 b' class BaseCommit(object):'
860 902
861 903 prefix = self._validate_archive_prefix(prefix)
862 904
863 mtime = mtime or time.time()
905 mtime = mtime or time.mktime(self.date.timetuple())
864 906
865 907 file_info = []
866 908 cur_rev = self.repository.get_commit(commit_id=self.raw_id)
@@ -33,7 +33,7 b' from rhodecode.lib.vcs.backends.git.inme'
33 33 log = logging.getLogger(__name__)
34 34
35 35
36 def discover_git_version():
36 def discover_git_version(raise_on_exc=False):
37 37 """
38 38 Returns the string as it was returned by running 'git --version'
39 39
@@ -44,4 +44,6 b' def discover_git_version():'
44 44 return connection.Git.discover_git_version()
45 45 except Exception:
46 46 log.warning("Failed to discover the Git version", exc_info=True)
47 if raise_on_exc:
48 raise
47 49 return ''
@@ -32,11 +32,13 b' class GitDiff(base.Diff):'
32 32 _header_re = re.compile(r"""
33 33 #^diff[ ]--git
34 34 [ ]"?a/(?P<a_path>.+?)"?[ ]"?b/(?P<b_path>.+?)"?\n
35 (?:^similarity[ ]index[ ](?P<similarity_index>\d+)%\n
36 ^rename[ ]from[ ](?P<rename_from>[^\r\n]+)\n
37 ^rename[ ]to[ ](?P<rename_to>[^\r\n]+)(?:\n|$))?
38 35 (?:^old[ ]mode[ ](?P<old_mode>\d+)\n
39 36 ^new[ ]mode[ ](?P<new_mode>\d+)(?:\n|$))?
37 (?:^similarity[ ]index[ ](?P<similarity_index>\d+)%(?:\n|$))?
38 (?:^rename[ ]from[ ](?P<rename_from>[^\r\n]+)\n
39 ^rename[ ]to[ ](?P<rename_to>[^\r\n]+)(?:\n|$))?
40 (?:^copy[ ]from[ ](?P<copy_from>[^\r\n]+)\n
41 ^copy[ ]to[ ](?P<copy_to>[^\r\n]+)(?:\n|$))?
40 42 (?:^new[ ]file[ ]mode[ ](?P<new_file_mode>.+)(?:\n|$))?
41 43 (?:^deleted[ ]file[ ]mode[ ](?P<deleted_file_mode>.+)(?:\n|$))?
42 44 (?:^index[ ](?P<a_blob_id>[0-9A-Fa-f]+)
@@ -36,11 +36,10 b' from rhodecode.lib.utils import safe_uni'
36 36 from rhodecode.lib.vcs import connection, path as vcspath
37 37 from rhodecode.lib.vcs.backends.base import (
38 38 BaseRepository, CollectionGenerator, Config, MergeResponse,
39 MergeFailureReason)
39 MergeFailureReason, Reference)
40 40 from rhodecode.lib.vcs.backends.git.commit import GitCommit
41 41 from rhodecode.lib.vcs.backends.git.diff import GitDiff
42 42 from rhodecode.lib.vcs.backends.git.inmemory import GitInMemoryCommit
43 from rhodecode.lib.vcs.conf import settings
44 43 from rhodecode.lib.vcs.exceptions import (
45 44 CommitDoesNotExistError, EmptyRepositoryError,
46 45 RepositoryError, TagAlreadyExistError, TagDoesNotExistError, VCSError)
@@ -865,18 +864,29 b' class GitRepository(BaseRepository):'
865 864 shadow_repo._checkout(pr_branch, create=True)
866 865 try:
867 866 shadow_repo._local_fetch(source_repo.path, source_ref.name)
868 except RepositoryError as e:
867 except RepositoryError:
869 868 log.exception('Failure when doing local fetch on git shadow repo')
870 869 return MergeResponse(
871 False, False, None, MergeFailureReason.MISSING_COMMIT)
870 False, False, None, MergeFailureReason.MISSING_SOURCE_REF)
872 871
873 merge_commit_id = None
872 merge_ref = None
874 873 merge_failure_reason = MergeFailureReason.NONE
875 874 try:
876 875 shadow_repo._local_merge(merge_message, merger_name, merger_email,
877 876 [source_ref.commit_id])
878 877 merge_possible = True
879 except RepositoryError as e:
878
879 # Need to reload repo to invalidate the cache, or otherwise we
880 # cannot retrieve the merge commit.
881 shadow_repo = GitRepository(shadow_repository_path)
882 merge_commit_id = shadow_repo.branches[pr_branch]
883
884 # Set a reference pointing to the merge commit. This reference may
885 # be used to easily identify the last successful merge commit in
886 # the shadow repository.
887 shadow_repo.set_refs('refs/heads/pr-merge', merge_commit_id)
888 merge_ref = Reference('branch', 'pr-merge', merge_commit_id)
889 except RepositoryError:
880 890 log.exception('Failure when doing local merge on git shadow repo')
881 891 merge_possible = False
882 892 merge_failure_reason = MergeFailureReason.MERGE_FAILED
@@ -887,11 +897,7 b' class GitRepository(BaseRepository):'
887 897 pr_branch, self.path, target_ref.name, enable_hooks=True,
888 898 rc_scm_data=self.config.get('rhodecode', 'RC_SCM_DATA'))
889 899 merge_succeeded = True
890 # Need to reload repo to invalidate the cache, or otherwise we
891 # cannot retrieve the merge commit.
892 shadow_repo = GitRepository(shadow_repository_path)
893 merge_commit_id = shadow_repo.branches[pr_branch]
894 except RepositoryError as e:
900 except RepositoryError:
895 901 log.exception(
896 902 'Failure when doing local push on git shadow repo')
897 903 merge_succeeded = False
@@ -900,7 +906,7 b' class GitRepository(BaseRepository):'
900 906 merge_succeeded = False
901 907
902 908 return MergeResponse(
903 merge_possible, merge_succeeded, merge_commit_id,
909 merge_possible, merge_succeeded, merge_ref,
904 910 merge_failure_reason)
905 911
906 912 def _get_shadow_repository_path(self, workspace_id):
@@ -21,7 +21,28 b''
21 21 """
22 22 HG module
23 23 """
24 import logging
24 25
26 from rhodecode.lib.vcs import connection
25 27 from rhodecode.lib.vcs.backends.hg.commit import MercurialCommit
26 28 from rhodecode.lib.vcs.backends.hg.inmemory import MercurialInMemoryCommit
27 29 from rhodecode.lib.vcs.backends.hg.repository import MercurialRepository
30
31
32 log = logging.getLogger(__name__)
33
34
35 def discover_hg_version(raise_on_exc=False):
36 """
37 Returns the string as it was returned by running 'hg --version'
38
39 It will return an empty string in case the connection is not initialized
40 or no vcsserver is available.
41 """
42 try:
43 return connection.Hg.discover_hg_version()
44 except Exception:
45 log.warning("Failed to discover the HG version", exc_info=True)
46 if raise_on_exc:
47 raise
48 return ''
@@ -38,13 +38,13 b' from rhodecode.lib.utils import safe_uni'
38 38 from rhodecode.lib.vcs import connection
39 39 from rhodecode.lib.vcs.backends.base import (
40 40 BaseRepository, CollectionGenerator, Config, MergeResponse,
41 MergeFailureReason)
41 MergeFailureReason, Reference)
42 42 from rhodecode.lib.vcs.backends.hg.commit import MercurialCommit
43 43 from rhodecode.lib.vcs.backends.hg.diff import MercurialDiff
44 44 from rhodecode.lib.vcs.backends.hg.inmemory import MercurialInMemoryCommit
45 45 from rhodecode.lib.vcs.exceptions import (
46 46 EmptyRepositoryError, RepositoryError, TagAlreadyExistError,
47 TagDoesNotExistError, CommitDoesNotExistError)
47 TagDoesNotExistError, CommitDoesNotExistError, SubrepoMergeError)
48 48
49 49 hexlify = binascii.hexlify
50 50 nullid = "\0" * 20
@@ -673,11 +673,16 b' class MercurialRepository(BaseRepository'
673 673 return MergeResponse(
674 674 False, False, None, MergeFailureReason.TARGET_IS_NOT_HEAD)
675 675
676 if (target_ref.type == 'branch' and
677 len(self._heads(target_ref.name)) != 1):
676 try:
677 if (target_ref.type == 'branch' and
678 len(self._heads(target_ref.name)) != 1):
679 return MergeResponse(
680 False, False, None,
681 MergeFailureReason.HG_TARGET_HAS_MULTIPLE_HEADS)
682 except CommitDoesNotExistError as e:
683 log.exception('Failure when looking up branch heads on hg target')
678 684 return MergeResponse(
679 False, False, None,
680 MergeFailureReason.HG_TARGET_HAS_MULTIPLE_HEADS)
685 False, False, None, MergeFailureReason.MISSING_TARGET_REF)
681 686
682 687 shadow_repo = self._get_shadow_instance(shadow_repository_path)
683 688
@@ -688,12 +693,12 b' class MercurialRepository(BaseRepository'
688 693 log.debug('Pulling in source reference %s', source_ref)
689 694 source_repo._validate_pull_reference(source_ref)
690 695 shadow_repo._local_pull(source_repo.path, source_ref)
691 except CommitDoesNotExistError as e:
696 except CommitDoesNotExistError:
692 697 log.exception('Failure when doing local pull on hg shadow repo')
693 698 return MergeResponse(
694 False, False, None, MergeFailureReason.MISSING_COMMIT)
699 False, False, None, MergeFailureReason.MISSING_SOURCE_REF)
695 700
696 merge_commit_id = None
701 merge_ref = None
697 702 merge_failure_reason = MergeFailureReason.NONE
698 703
699 704 try:
@@ -701,7 +706,18 b' class MercurialRepository(BaseRepository'
701 706 target_ref, merge_message, merger_name, merger_email,
702 707 source_ref, use_rebase=use_rebase)
703 708 merge_possible = True
704 except RepositoryError as e:
709
710 # Set a bookmark pointing to the merge commit. This bookmark may be
711 # used to easily identify the last successful merge commit in the
712 # shadow repository.
713 shadow_repo.bookmark('pr-merge', revision=merge_commit_id)
714 merge_ref = Reference('book', 'pr-merge', merge_commit_id)
715 except SubrepoMergeError:
716 log.exception(
717 'Subrepo merge error during local merge on hg shadow repo.')
718 merge_possible = False
719 merge_failure_reason = MergeFailureReason.SUBREPO_MERGE_FAILED
720 except RepositoryError:
705 721 log.exception('Failure when doing local merge on hg shadow repo')
706 722 merge_possible = False
707 723 merge_failure_reason = MergeFailureReason.MERGE_FAILED
@@ -737,12 +753,8 b' class MercurialRepository(BaseRepository'
737 753 else:
738 754 merge_succeeded = False
739 755
740 if dry_run:
741 merge_commit_id = None
742
743 756 return MergeResponse(
744 merge_possible, merge_succeeded, merge_commit_id,
745 merge_failure_reason)
757 merge_possible, merge_succeeded, merge_ref, merge_failure_reason)
746 758
747 759 def _get_shadow_instance(
748 760 self, shadow_repository_path, enable_hooks=False):
@@ -21,6 +21,27 b''
21 21 """
22 22 SVN module
23 23 """
24 import logging
24 25
26 from rhodecode.lib.vcs import connection
25 27 from rhodecode.lib.vcs.backends.svn.commit import SubversionCommit
26 28 from rhodecode.lib.vcs.backends.svn.repository import SubversionRepository
29
30
31 log = logging.getLogger(__name__)
32
33
34 def discover_svn_version(raise_on_exc=False):
35 """
36 Returns the string as it was returned by running 'svn --version'
37
38 It will return an empty string in case the connection is not initialized
39 or no vcsserver is available.
40 """
41 try:
42 return connection.Svn.discover_svn_version()
43 except Exception:
44 log.warning("Failed to discover the SVN version", exc_info=True)
45 if raise_on_exc:
46 raise
47 return ''
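The three `discover_{git,hg,svn}_version` helpers added in this changeset share one shape: try the remote call, log a warning on failure, re-raise only when asked, and otherwise fall back to an empty string. That shape can be factored into a generic sketch with an injected callable (the factored helper is an illustration, not part of the diff):

```python
import logging

log = logging.getLogger(__name__)

def discover_version(remote_call, raise_on_exc=False):
    # Swallow connection failures unless the caller opts into them.
    try:
        return remote_call()
    except Exception:
        log.warning("Failed to discover the version", exc_info=True)
        if raise_on_exc:
            raise
        return ''

def failing():
    raise IOError('no vcsserver')

print(discover_version(lambda: '2.7.4'))  # 2.7.4
print(repr(discover_version(failing)))    # ''
```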
@@ -25,15 +25,10 b' Provides the implementation of various c'
25 25
26 26 import copy
27 27 import logging
28 import threading
29 import urlparse
30 28 import uuid
31 29 import weakref
32 from urllib2 import URLError
33 30
34 import msgpack
35 31 import Pyro4
36 import requests
37 32 from pyramid.threadlocal import get_current_request
38 33 from Pyro4.errors import CommunicationError, ConnectionClosedError, DaemonError
39 34
@@ -43,137 +38,6 b' from rhodecode.lib.vcs.conf import setti'
43 38 log = logging.getLogger(__name__)
44 39
45 40
46 # TODO: mikhail: Keep it in sync with vcsserver's
47 # HTTPApplication.ALLOWED_EXCEPTIONS
48 EXCEPTIONS_MAP = {
49 'KeyError': KeyError,
50 'URLError': URLError,
51 }
52
53
54 class HTTPRepoMaker(object):
55 def __init__(self, server_and_port, backend_endpoint):
56 self.url = urlparse.urljoin(
57 'http://%s' % server_and_port, backend_endpoint)
58
59 def __call__(self, path, config, with_wire=None):
60 log.debug('HTTPRepoMaker call on %s', path)
61 return HTTPRemoteRepo(path, config, self.url, with_wire=with_wire)
62
63 def __getattr__(self, name):
64 def f(*args, **kwargs):
65 return self._call(name, *args, **kwargs)
66 return f
67
68 @exceptions.map_vcs_exceptions
69 def _call(self, name, *args, **kwargs):
70 payload = {
71 'id': str(uuid.uuid4()),
72 'method': name,
73 'params': {'args': args, 'kwargs': kwargs}
74 }
75 return _remote_call(self.url, payload, EXCEPTIONS_MAP)
76
77
78 class VcsHttpProxy(object):
79
80 CHUNK_SIZE = 16384
81
82 def __init__(self, server_and_port, backend_endpoint):
83 adapter = requests.adapters.HTTPAdapter(max_retries=5)
84 self.base_url = urlparse.urljoin(
85 'http://%s' % server_and_port, backend_endpoint)
86 self.session = requests.Session()
87 self.session.mount('http://', adapter)
88
89 def handle(self, environment, input_data, *args, **kwargs):
90 data = {
91 'environment': environment,
92 'input_data': input_data,
93 'args': args,
94 'kwargs': kwargs
95 }
96 result = self.session.post(
97 self.base_url, msgpack.packb(data), stream=True)
98 return self._get_result(result)
99
100 def _deserialize_and_raise(self, error):
101 exception = Exception(error['message'])
102 try:
103 exception._vcs_kind = error['_vcs_kind']
104 except KeyError:
105 pass
106 raise exception
107
108 def _iterate(self, result):
109 unpacker = msgpack.Unpacker()
110 for line in result.iter_content(chunk_size=self.CHUNK_SIZE):
111 unpacker.feed(line)
112 for chunk in unpacker:
113 yield chunk
114
115 def _get_result(self, result):
116 iterator = self._iterate(result)
117 error = iterator.next()
118 if error:
119 self._deserialize_and_raise(error)
120
121 status = iterator.next()
122 headers = iterator.next()
123
124 return iterator, status, headers
125
126
127 class HTTPRemoteRepo(object):
128 def __init__(self, path, config, url, with_wire=None):
129 self.url = url
130 self._wire = {
131 "path": path,
132 "config": config,
133 "context": str(uuid.uuid4()),
134 }
135 if with_wire:
136 self._wire.update(with_wire)
137
138 def __getattr__(self, name):
139 def f(*args, **kwargs):
140 return self._call(name, *args, **kwargs)
141 return f
142
143 @exceptions.map_vcs_exceptions
144 def _call(self, name, *args, **kwargs):
145 log.debug('Calling %s@%s', self.url, name)
146 # TODO: oliver: This is currently necessary pre-call since the
147 # config object is being changed for hooking scenarios
148 wire = copy.deepcopy(self._wire)
149 wire["config"] = wire["config"].serialize()
150 payload = {
151 'id': str(uuid.uuid4()),
152 'method': name,
153 'params': {'wire': wire, 'args': args, 'kwargs': kwargs}
154 }
155 return _remote_call(self.url, payload, EXCEPTIONS_MAP)
156
157 def __getitem__(self, key):
158 return self.revision(key)
159
160
161 def _remote_call(url, payload, exceptions_map):
162 response = requests.post(url, data=msgpack.packb(payload))
163 response = msgpack.unpackb(response.content)
164 error = response.get('error')
165 if error:
166 type_ = error.get('type', 'Exception')
167 exc = exceptions_map.get(type_, Exception)
168 exc = exc(error.get('message'))
169 try:
170 exc._vcs_kind = error['_vcs_kind']
171 except KeyError:
172 pass
173 raise exc
174 return response.get('result')
175
176
177 41 class RepoMaker(object):
178 42
179 43 def __init__(self, proxy_factory):
@@ -56,10 +56,11 b' EXCEPTIONS_MAP = {'
56 56
57 57 class RepoMaker(object):
58 58
59 def __init__(self, server_and_port, backend_endpoint, session_factory):
59 def __init__(self, server_and_port, backend_endpoint, backend_type, session_factory):
60 60 self.url = urlparse.urljoin(
61 61 'http://%s' % server_and_port, backend_endpoint)
62 62 self._session_factory = session_factory
63 self.backend_type = backend_type
63 64
64 65 def __call__(self, path, config, with_wire=None):
65 66 log.debug('RepoMaker call on %s', path)
@@ -77,6 +78,30 b' class RepoMaker(object):'
77 78 payload = {
78 79 'id': str(uuid.uuid4()),
79 80 'method': name,
81 'backend': self.backend_type,
82 'params': {'args': args, 'kwargs': kwargs}
83 }
84 return _remote_call(
85 self.url, payload, EXCEPTIONS_MAP, self._session_factory())
86
87
88 class ServiceConnection(object):
89 def __init__(self, server_and_port, backend_endpoint, session_factory):
90 self.url = urlparse.urljoin(
91 'http://%s' % server_and_port, backend_endpoint)
92 self._session_factory = session_factory
93
94 def __getattr__(self, name):
95 def f(*args, **kwargs):
96 return self._call(name, *args, **kwargs)
97
98 return f
99
100 @exceptions.map_vcs_exceptions
101 def _call(self, name, *args, **kwargs):
102 payload = {
103 'id': str(uuid.uuid4()),
104 'method': name,
80 105 'params': {'args': args, 'kwargs': kwargs}
81 106 }
82 107 return _remote_call(
@@ -178,7 +203,12 b' def _remote_call(url, payload, exception'
178 203 except pycurl.error as e:
179 204 raise exceptions.HttpVCSCommunicationError(e)
180 205
181 response = msgpack.unpackb(response.content)
206 try:
207 response = msgpack.unpackb(response.content)
208 except Exception:
 209 log.exception('Failed to decode response %r', response.content)
210 raise
211
182 212 error = response.get('error')
183 213 if error:
184 214 type_ = error.get('type', 'Exception')
@@ -25,9 +25,13 b' Holds connection for remote server.'
25 25
26 26 def _not_initialized(*args, **kwargs):
27 27 """Placeholder for objects which have to be initialized first."""
28 raise Exception("rhodecode.lib.vcs is not yet initialized")
28 raise Exception(
29 "rhodecode.lib.vcs is not yet initialized. "
30 "Make sure `vcs.server` is enabled in your configuration.")
29 31
30 32 # TODO: figure out a nice default value for these things
33 Service = _not_initialized
34
31 35 Git = _not_initialized
32 36 Hg = _not_initialized
33 37 Svn = _not_initialized
@@ -24,8 +24,7 b' Custom vcs exceptions module.'
24 24
25 25 import functools
26 26 import urllib2
27 import pycurl
28 from Pyro4.errors import CommunicationError
27
29 28
30 29 class VCSCommunicationError(Exception):
31 30 pass
@@ -129,6 +128,14 b' class NodeAlreadyRemovedError(Committing'
129 128 pass
130 129
131 130
131 class SubrepoMergeError(RepositoryError):
132 """
133 This happens if we try to merge a repository which contains subrepos and
 134 the subrepos cannot be merged. The subrepos themselves are not
 135 merged; only their references in the root repo are merged.
136 """
137
138
132 139 class ImproperArchiveTypeError(VCSError):
133 140 pass
134 141
@@ -157,6 +164,7 b' class UnhandledException(VCSError):'
157 164 # TODO: johbo: Define our own exception for this and stop abusing
158 165 # urllib's exception class.
159 166 'url_error': urllib2.URLError,
167 'subrepo_merge_error': SubrepoMergeError,
160 168 }
161 169
162 170
@@ -18,6 +18,7 b''
18 18 # RhodeCode Enterprise Edition, including its added features, Support services,
19 19 # and proprietary license terms, please see https://rhodecode.com/licenses/
20 20
21 import collections
21 22 import datetime
22 23 import formencode
23 24 import logging
@@ -30,6 +31,7 b' from recaptcha.client.captcha import sub'
30 31
31 32 from rhodecode.authentication.base import authenticate, HTTP_TYPE
32 33 from rhodecode.events import UserRegistered
34 from rhodecode.lib import helpers as h
33 35 from rhodecode.lib.auth import (
34 36 AuthUser, HasPermissionAnyDecorator, CSRFRequired)
35 37 from rhodecode.lib.base import get_ip_addr
@@ -46,6 +48,9 b' from rhodecode.translation import _'
46 48
47 49 log = logging.getLogger(__name__)
48 50
51 CaptchaData = collections.namedtuple(
52 'CaptchaData', 'active, private_key, public_key')
53
49 54
50 55 def _store_user_in_session(session, username, remember=False):
51 56 user = User.get_by_username(username, case_insensitive=True)
@@ -112,6 +117,14 b' class LoginView(object):'
112 117 'errors': {},
113 118 }
114 119
120 def _get_captcha_data(self):
121 settings = SettingsModel().get_all_settings()
122 private_key = settings.get('rhodecode_captcha_private_key')
123 public_key = settings.get('rhodecode_captcha_public_key')
124 active = bool(private_key)
125 return CaptchaData(
126 active=active, private_key=private_key, public_key=public_key)
127
115 128 @view_config(
116 129 route_name='login', request_method='GET',
117 130 renderer='rhodecode:templates/login.html')
@@ -159,7 +172,7 b' class LoginView(object):'
159 172 except formencode.Invalid as errors:
160 173 defaults = errors.value
161 174 # remove password from filling in form again
162 del defaults['password']
175 defaults.pop('password', None)
163 176 render_ctx = self._get_template_context()
164 177 render_ctx.update({
165 178 'errors': errors.error_dict,
@@ -191,10 +204,8 b' class LoginView(object):'
191 204 errors = errors or {}
192 205
193 206 settings = SettingsModel().get_all_settings()
194 captcha_public_key = settings.get('rhodecode_captcha_public_key')
195 captcha_private_key = settings.get('rhodecode_captcha_private_key')
196 captcha_active = bool(captcha_private_key)
197 207 register_message = settings.get('rhodecode_register_message') or ''
208 captcha = self._get_captcha_data()
198 209 auto_active = 'hg.register.auto_activate' in User.get_default_user()\
199 210 .AuthUser.permissions['global']
200 211
@@ -203,8 +214,8 b' class LoginView(object):'
203 214 'defaults': defaults,
204 215 'errors': errors,
205 216 'auto_active': auto_active,
206 'captcha_active': captcha_active,
207 'captcha_public_key': captcha_public_key,
217 'captcha_active': captcha.active,
218 'captcha_public_key': captcha.public_key,
208 219 'register_message': register_message,
209 220 })
210 221 return render_ctx
@@ -215,9 +226,7 b' class LoginView(object):'
215 226 route_name='register', request_method='POST',
216 227 renderer='rhodecode:templates/register.html')
217 228 def register_post(self):
218 captcha_private_key = SettingsModel().get_setting_by_name(
219 'rhodecode_captcha_private_key')
220 captcha_active = bool(captcha_private_key)
229 captcha = self._get_captcha_data()
221 230 auto_active = 'hg.register.auto_activate' in User.get_default_user()\
222 231 .AuthUser.permissions['global']
223 232
@@ -226,15 +235,15 b' class LoginView(object):'
226 235 form_result = register_form.to_python(self.request.params)
227 236 form_result['active'] = auto_active
228 237
229 if captcha_active:
238 if captcha.active:
230 239 response = submit(
231 240 self.request.params.get('recaptcha_challenge_field'),
232 241 self.request.params.get('recaptcha_response_field'),
233 private_key=captcha_private_key,
242 private_key=captcha.private_key,
234 243 remoteip=get_ip_addr(self.request.environ))
235 if captcha_active and not response.is_valid:
244 if not response.is_valid:
236 245 _value = form_result
237 _msg = _('bad captcha')
246 _msg = _('Bad captcha')
238 247 error_dict = {'recaptcha_field': _msg}
239 248 raise formencode.Invalid(_msg, _value, None,
240 249 error_dict=error_dict)
@@ -251,8 +260,8 b' class LoginView(object):'
251 260 raise HTTPFound(redirect_ro)
252 261
253 262 except formencode.Invalid as errors:
254 del errors.value['password']
255 del errors.value['password_confirmation']
263 errors.value.pop('password', None)
264 errors.value.pop('password_confirmation', None)
256 265 return self.register(
257 266 defaults=errors.value, errors=errors.error_dict)
258 267
@@ -268,14 +277,11 b' class LoginView(object):'
268 277 route_name='reset_password', request_method=('GET', 'POST'),
269 278 renderer='rhodecode:templates/password_reset.html')
270 279 def password_reset(self):
271 settings = SettingsModel().get_all_settings()
272 captcha_private_key = settings.get('rhodecode_captcha_private_key')
273 captcha_active = bool(captcha_private_key)
274 captcha_public_key = settings.get('rhodecode_captcha_public_key')
280 captcha = self._get_captcha_data()
275 281
276 282 render_ctx = {
277 'captcha_active': captcha_active,
278 'captcha_public_key': captcha_public_key,
283 'captcha_active': captcha.active,
284 'captcha_public_key': captcha.public_key,
279 285 'defaults': {},
280 286 'errors': {},
281 287 }
@@ -285,15 +291,21 b' class LoginView(object):'
285 291 try:
286 292 form_result = password_reset_form.to_python(
287 293 self.request.params)
288 if captcha_active:
294 if h.HasPermissionAny('hg.password_reset.disabled')():
 295 log.error('Failed attempt to reset password for %s.', form_result['email'])
296 self.session.flash(
297 _('Password reset has been disabled.'),
298 queue='error')
299 return HTTPFound(self.request.route_path('reset_password'))
300 if captcha.active:
289 301 response = submit(
290 302 self.request.params.get('recaptcha_challenge_field'),
291 303 self.request.params.get('recaptcha_response_field'),
292 private_key=captcha_private_key,
304 private_key=captcha.private_key,
293 305 remoteip=get_ip_addr(self.request.environ))
294 if captcha_active and not response.is_valid:
306 if not response.is_valid:
295 307 _value = form_result
296 _msg = _('bad captcha')
308 _msg = _('Bad captcha')
297 309 error_dict = {'recaptcha_field': _msg}
298 310 raise formencode.Invalid(_msg, _value, None,
299 311 error_dict=error_dict)
@@ -80,7 +80,7 b' class ChangesetStatusModel(BaseModel):'
80 80 """
81 81 votes = defaultdict(int)
82 82 reviewers_number = len(statuses_by_reviewers)
83 for user, statuses in statuses_by_reviewers:
83 for user, reasons, statuses in statuses_by_reviewers:
84 84 if statuses:
85 85 ver, latest = statuses[0]
86 86 votes[latest.status] += 1
@@ -254,7 +254,7 b' class ChangesetStatusModel(BaseModel):'
254 254 for x, y in (itertools.groupby(sorted(st, key=version),
255 255 version))]
256 256
257 pull_request_reviewers.append([o.user, st])
257 pull_request_reviewers.append((o.user, o.reasons, st))
258 258 return pull_request_reviewers
259 259
260 260 def calculated_review_status(self, pull_request, reviewers_statuses=None):
@@ -22,8 +22,8 b''
22 22 Database Models for RhodeCode Enterprise
23 23 """
24 24
25 import re
25 26 import os
26 import sys
27 27 import time
28 28 import hashlib
29 29 import logging
@@ -36,28 +36,25 b' import collections'
36 36
37 37
38 38 from sqlalchemy import *
39 from sqlalchemy.exc import IntegrityError
40 39 from sqlalchemy.ext.declarative import declared_attr
41 40 from sqlalchemy.ext.hybrid import hybrid_property
42 41 from sqlalchemy.orm import (
43 42 relationship, joinedload, class_mapper, validates, aliased)
44 43 from sqlalchemy.sql.expression import true
45 from beaker.cache import cache_region, region_invalidate
44 from beaker.cache import cache_region
46 45 from webob.exc import HTTPNotFound
47 46 from zope.cachedescriptors.property import Lazy as LazyProperty
48 47
49 48 from pylons import url
50 49 from pylons.i18n.translation import lazy_ugettext as _
51 50
52 from rhodecode.lib.vcs import get_backend, get_vcs_instance
53 from rhodecode.lib.vcs.utils.helpers import get_scm
54 from rhodecode.lib.vcs.exceptions import VCSError
55 from rhodecode.lib.vcs.backends.base import (
56 EmptyCommit, Reference, MergeFailureReason)
51 from rhodecode.lib.vcs import get_vcs_instance
52 from rhodecode.lib.vcs.backends.base import EmptyCommit, Reference
57 53 from rhodecode.lib.utils2 import (
58 str2bool, safe_str, get_commit_safe, safe_unicode, remove_prefix, md5_safe,
59 time_to_datetime, aslist, Optional, safe_int, get_clone_url, AttributeDict)
60 from rhodecode.lib.jsonalchemy import MutationObj, JsonType, JSONDict
54 str2bool, safe_str, get_commit_safe, safe_unicode, md5_safe,
55 time_to_datetime, aslist, Optional, safe_int, get_clone_url, AttributeDict,
56 glob2re)
57 from rhodecode.lib.jsonalchemy import MutationObj, MutationList, JsonType
61 58 from rhodecode.lib.ext_json import json
62 59 from rhodecode.lib.caching_query import FromCache
63 60 from rhodecode.lib.encrypt import AESCipher
@@ -1990,12 +1987,12 b' class Repository(Base, BaseModel):'
1990 1987 custom_wire = {
1991 1988 'cache': cache # controls the vcs.remote cache
1992 1989 }
1993
1994 1990 repo = get_vcs_instance(
1995 1991 repo_path=safe_str(self.repo_full_path),
1996 1992 config=config,
1997 1993 with_wire=custom_wire,
1998 create=False)
1994 create=False,
1995 _vcs_alias=self.repo_type)
1999 1996
2000 1997 return repo
2001 1998
@@ -2031,6 +2028,7 b' class RepoGroup(Base, BaseModel):'
2031 2028 enable_locking = Column("enable_locking", Boolean(), nullable=False, unique=None, default=False)
2032 2029 user_id = Column("user_id", Integer(), ForeignKey('users.user_id'), nullable=False, unique=False, default=None)
2033 2030 created_on = Column('created_on', DateTime(timezone=False), nullable=False, default=datetime.datetime.now)
2031 personal = Column('personal', Boolean(), nullable=True, unique=None, default=None)
2034 2032
2035 2033 repo_group_to_perm = relationship('UserRepoGroupToPerm', cascade='all', order_by='UserRepoGroupToPerm.group_to_perm_id')
2036 2034 users_group_to_perm = relationship('UserGroupRepoGroupToPerm', cascade='all')
@@ -2086,6 +2084,13 b' class RepoGroup(Base, BaseModel):'
2086 2084 return gr.scalar()
2087 2085
2088 2086 @classmethod
2087 def get_user_personal_repo_group(cls, user_id):
2088 user = User.get(user_id)
2089 return cls.query()\
2090 .filter(cls.personal == true())\
2091 .filter(cls.user == user).scalar()
2092
2093 @classmethod
2089 2094 def get_all_repo_groups(cls, user_id=Optional(None), group_id=Optional(None),
2090 2095 case_insensitive=True):
2091 2096 q = RepoGroup.query()
@@ -2317,6 +2322,10 b' class Permission(Base, BaseModel):'
2317 2322 ('hg.register.manual_activate', _('User Registration with manual account activation')),
2318 2323 ('hg.register.auto_activate', _('User Registration with automatic account activation')),
2319 2324
2325 ('hg.password_reset.enabled', _('Password reset enabled')),
2326 ('hg.password_reset.hidden', _('Password reset hidden')),
2327 ('hg.password_reset.disabled', _('Password reset disabled')),
2328
2320 2329 ('hg.extern_activate.manual', _('Manual activation of external account')),
2321 2330 ('hg.extern_activate.auto', _('Automatic activation of external account')),
2322 2331
@@ -2335,6 +2344,7 b' class Permission(Base, BaseModel):'
2335 2344 'hg.create.write_on_repogroup.true',
2336 2345 'hg.fork.repository',
2337 2346 'hg.register.manual_activate',
2347 'hg.password_reset.enabled',
2338 2348 'hg.extern_activate.auto',
2339 2349 'hg.inherit_default_perms.true',
2340 2350 ]
@@ -2919,6 +2929,10 b' class ChangesetComment(Base, BaseModel):'
2919 2929 q = q.filter(cls.pull_request_id == pull_request_id)
2920 2930 return q.all()
2921 2931
2932 @property
2933 def outdated(self):
2934 return self.display_state == self.COMMENT_OUTDATED
2935
2922 2936 def render(self, mentions=False):
2923 2937 from rhodecode.lib import helpers as h
2924 2938 return h.render(self.text, renderer=self.renderer, mentions=mentions)
@@ -3031,6 +3045,7 b' class _PullRequestBase(BaseModel):'
3031 3045 nullable=False)
3032 3046
3033 3047 target_ref = Column('other_ref', Unicode(255), nullable=False)
3048 _shadow_merge_ref = Column('shadow_merge_ref', Unicode(255), nullable=True)
3034 3049
3035 3050 # TODO: dan: rename column to last_merge_source_rev
3036 3051 _last_merge_source_rev = Column(
@@ -3061,8 +3076,7 b' class _PullRequestBase(BaseModel):'
3061 3076
3062 3077 @property
3063 3078 def source_ref_parts(self):
3064 refs = self.source_ref.split(':')
3065 return Reference(refs[0], refs[1], refs[2])
3079 return self.unicode_to_reference(self.source_ref)
3066 3080
3067 3081 @declared_attr
3068 3082 def target_repo(cls):
@@ -3072,8 +3086,36 b' class _PullRequestBase(BaseModel):'
3072 3086
3073 3087 @property
3074 3088 def target_ref_parts(self):
3075 refs = self.target_ref.split(':')
3076 return Reference(refs[0], refs[1], refs[2])
3089 return self.unicode_to_reference(self.target_ref)
3090
3091 @property
3092 def shadow_merge_ref(self):
3093 return self.unicode_to_reference(self._shadow_merge_ref)
3094
3095 @shadow_merge_ref.setter
3096 def shadow_merge_ref(self, ref):
3097 self._shadow_merge_ref = self.reference_to_unicode(ref)
3098
3099 def unicode_to_reference(self, raw):
3100 """
3101 Convert a unicode (or string) to a reference object.
3102 If unicode evaluates to False it returns None.
3103 """
3104 if raw:
3105 refs = raw.split(':')
3106 return Reference(*refs)
3107 else:
3108 return None
3109
3110 def reference_to_unicode(self, ref):
3111 """
3112 Convert a reference object to unicode.
3113 If reference is None it returns None.
3114 """
3115 if ref:
3116 return u':'.join(ref)
3117 else:
3118 return None
3077 3119
3078 3120
3079 3121 class PullRequest(Base, _PullRequestBase):
@@ -3107,11 +3149,21 b' class PullRequest(Base, _PullRequestBase'
3107 3149 from rhodecode.model.pull_request import PullRequestModel
3108 3150 pull_request = self
3109 3151 merge_status = PullRequestModel().merge_status(pull_request)
3152
3153 pull_request_url = url(
3154 'pullrequest_show', repo_name=self.target_repo.repo_name,
3155 pull_request_id=self.pull_request_id, qualified=True)
3156
3157 merge_data = {
3158 'clone_url': PullRequestModel().get_shadow_clone_url(pull_request),
3159 'reference': (
3160 pull_request.shadow_merge_ref._asdict()
3161 if pull_request.shadow_merge_ref else None),
3162 }
3163
3110 3164 data = {
3111 3165 'pull_request_id': pull_request.pull_request_id,
3112 'url': url('pullrequest_show', repo_name=self.target_repo.repo_name,
3113 pull_request_id=self.pull_request_id,
3114 qualified=True),
3166 'url': pull_request_url,
3115 3167 'title': pull_request.title,
3116 3168 'description': pull_request.description,
3117 3169 'status': pull_request.status,
@@ -3141,15 +3193,17 b' class PullRequest(Base, _PullRequestBase'
3141 3193 'commit_id': pull_request.target_ref_parts.commit_id,
3142 3194 },
3143 3195 },
3196 'merge': merge_data,
3144 3197 'author': pull_request.author.get_api_data(include_secrets=False,
3145 3198 details='basic'),
3146 3199 'reviewers': [
3147 3200 {
3148 3201 'user': reviewer.get_api_data(include_secrets=False,
3149 3202 details='basic'),
3203 'reasons': reasons,
3150 3204 'review_status': st[0][1].status if st else 'not_reviewed',
3151 3205 }
3152 for reviewer, st in pull_request.reviewers_statuses()
3206 for reviewer, reasons, st in pull_request.reviewers_statuses()
3153 3207 ]
3154 3208 }
3155 3209
@@ -3161,14 +3215,10 b' class PullRequest(Base, _PullRequestBase'
3161 3215 }
3162 3216
3163 3217 def calculated_review_status(self):
3164 # TODO: anderson: 13.05.15 Used only on templates/my_account_pullrequests.html
3165 # because it's tricky on how to use ChangesetStatusModel from there
3166 warnings.warn("Use calculated_review_status from ChangesetStatusModel", DeprecationWarning)
3167 3218 from rhodecode.model.changeset_status import ChangesetStatusModel
3168 3219 return ChangesetStatusModel().calculated_review_status(self)
3169 3220
3170 3221 def reviewers_statuses(self):
3171 warnings.warn("Use reviewers_statuses from ChangesetStatusModel", DeprecationWarning)
3172 3222 from rhodecode.model.changeset_status import ChangesetStatusModel
3173 3223 return ChangesetStatusModel().reviewers_statuses(self)
3174 3224
@@ -3201,9 +3251,23 b' class PullRequestReviewers(Base, BaseMod'
3201 3251 'mysql_charset': 'utf8', 'sqlite_autoincrement': True},
3202 3252 )
3203 3253
3204 def __init__(self, user=None, pull_request=None):
3254 def __init__(self, user=None, pull_request=None, reasons=None):
3205 3255 self.user = user
3206 3256 self.pull_request = pull_request
3257 self.reasons = reasons or []
3258
3259 @hybrid_property
3260 def reasons(self):
3261 if not self._reasons:
3262 return []
3263 return self._reasons
3264
3265 @reasons.setter
3266 def reasons(self, val):
3267 val = val or []
3268 if any(not isinstance(x, basestring) for x in val):
3269 raise Exception('invalid reasons type, must be list of strings')
3270 self._reasons = val
3207 3271
3208 3272 pull_requests_reviewers_id = Column(
3209 3273 'pull_requests_reviewers_id', Integer(), nullable=False,
@@ -3213,6 +3277,9 b' class PullRequestReviewers(Base, BaseMod'
3213 3277 ForeignKey('pull_requests.pull_request_id'), nullable=False)
3214 3278 user_id = Column(
3215 3279 "user_id", Integer(), ForeignKey('users.user_id'), nullable=True)
3280 _reasons = Column(
3281 'reason', MutationList.as_mutable(
3282 JsonType('list', dialect_map=dict(mysql=UnicodeText(16384)))))
3216 3283
3217 3284 user = relationship('User')
3218 3285 pull_request = relationship('PullRequest')
@@ -3514,3 +3581,125 b' class Integration(Base, BaseModel):'
3514 3581
3515 3582 def __repr__(self):
3516 3583 return '<Integration(%r, %r)>' % (self.integration_type, self.scope)
3584
3585
3586 class RepoReviewRuleUser(Base, BaseModel):
3587 __tablename__ = 'repo_review_rules_users'
3588 __table_args__ = (
3589 {'extend_existing': True, 'mysql_engine': 'InnoDB',
3590 'mysql_charset': 'utf8', 'sqlite_autoincrement': True,}
3591 )
3592 repo_review_rule_user_id = Column(
3593 'repo_review_rule_user_id', Integer(), primary_key=True)
3594 repo_review_rule_id = Column("repo_review_rule_id",
3595 Integer(), ForeignKey('repo_review_rules.repo_review_rule_id'))
3596 user_id = Column("user_id", Integer(), ForeignKey('users.user_id'),
3597 nullable=False)
3598 user = relationship('User')
3599
3600
3601 class RepoReviewRuleUserGroup(Base, BaseModel):
3602 __tablename__ = 'repo_review_rules_users_groups'
3603 __table_args__ = (
3604 {'extend_existing': True, 'mysql_engine': 'InnoDB',
3605 'mysql_charset': 'utf8', 'sqlite_autoincrement': True,}
3606 )
3607 repo_review_rule_users_group_id = Column(
3608 'repo_review_rule_users_group_id', Integer(), primary_key=True)
3609 repo_review_rule_id = Column("repo_review_rule_id",
3610 Integer(), ForeignKey('repo_review_rules.repo_review_rule_id'))
3611 users_group_id = Column("users_group_id", Integer(),
3612 ForeignKey('users_groups.users_group_id'), nullable=False)
3613 users_group = relationship('UserGroup')
3614
3615
3616 class RepoReviewRule(Base, BaseModel):
3617 __tablename__ = 'repo_review_rules'
3618 __table_args__ = (
3619 {'extend_existing': True, 'mysql_engine': 'InnoDB',
3620 'mysql_charset': 'utf8', 'sqlite_autoincrement': True,}
3621 )
3622
3623 repo_review_rule_id = Column(
3624 'repo_review_rule_id', Integer(), primary_key=True)
3625 repo_id = Column(
3626 "repo_id", Integer(), ForeignKey('repositories.repo_id'))
3627 repo = relationship('Repository', backref='review_rules')
3628
3629 _branch_pattern = Column("branch_pattern", UnicodeText().with_variant(UnicodeText(255), 'mysql'),
3630 default=u'*') # glob
3631 _file_pattern = Column("file_pattern", UnicodeText().with_variant(UnicodeText(255), 'mysql'),
3632 default=u'*') # glob
3633
3634 use_authors_for_review = Column("use_authors_for_review", Boolean(),
3635 nullable=False, default=False)
3636 rule_users = relationship('RepoReviewRuleUser')
3637 rule_user_groups = relationship('RepoReviewRuleUserGroup')
3638
3639 @hybrid_property
3640 def branch_pattern(self):
3641 return self._branch_pattern or '*'
3642
3643 def _validate_glob(self, value):
3644 re.compile('^' + glob2re(value) + '$')
3645
3646 @branch_pattern.setter
3647 def branch_pattern(self, value):
3648 self._validate_glob(value)
3649 self._branch_pattern = value or '*'
3650
3651 @hybrid_property
3652 def file_pattern(self):
3653 return self._file_pattern or '*'
3654
3655 @file_pattern.setter
3656 def file_pattern(self, value):
3657 self._validate_glob(value)
3658 self._file_pattern = value or '*'
3659
3660 def matches(self, branch, files_changed):
3661 """
3662 Check if this review rule matches a branch/files in a pull request
3663
3664 :param branch: branch name for the commit
3665 :param files_changed: list of file paths changed in the pull request
3666 """
3667
3668 branch = branch or ''
3669 files_changed = files_changed or []
3670
3671 branch_matches = True
3672 if branch:
3673 branch_regex = re.compile('^' + glob2re(self.branch_pattern) + '$')
3674 branch_matches = bool(branch_regex.search(branch))
3675
3676 files_matches = True
3677 if self.file_pattern != '*':
3678 files_matches = False
3679 file_regex = re.compile(glob2re(self.file_pattern))
3680 for filename in files_changed:
3681 if file_regex.search(filename):
3682 files_matches = True
3683 break
3684
3685 return branch_matches and files_matches
3686
3687 @property
3688 def review_users(self):
3689 """ Returns the users which this rule applies to """
3690
3691 users = set()
3692 users |= set([
3693 rule_user.user for rule_user in self.rule_users
3694 if rule_user.user.active])
3695 users |= set(
3696 member.user
3697 for rule_user_group in self.rule_user_groups
3698 for member in rule_user_group.users_group.members
3699 if member.user.active
3700 )
3701 return users
3702
3703 def __repr__(self):
3704 return '<RepoReviewerRule(id=%r, repo=%r)>' % (
3705 self.repo_review_rule_id, self.repo)
@@ -143,10 +143,8 b' def UserForm(edit=False, available_langu'
143 143 return _UserForm
144 144
145 145
146 def UserGroupForm(edit=False, old_data=None, available_members=None,
147 allow_disabled=False):
146 def UserGroupForm(edit=False, old_data=None, allow_disabled=False):
148 147 old_data = old_data or {}
149 available_members = available_members or []
150 148
151 149 class _UserGroupForm(formencode.Schema):
152 150 allow_extra_fields = True
@@ -162,10 +160,6 b' def UserGroupForm(edit=False, old_data=N'
162 160 users_group_active = v.StringBoolean(if_missing=False)
163 161
164 162 if edit:
165 users_group_members = v.OneOf(
166 available_members, hideList=False, testValueList=True,
167 if_missing=None, not_empty=False
168 )
169 163 # this is user group owner
170 164 user = All(
171 165 v.UnicodeString(not_empty=True),
@@ -347,6 +341,8 b' def ApplicationSettingsForm():'
347 341 rhodecode_post_code = v.UnicodeString(strip=True, min=1, not_empty=False)
348 342 rhodecode_captcha_public_key = v.UnicodeString(strip=True, min=1, not_empty=False)
349 343 rhodecode_captcha_private_key = v.UnicodeString(strip=True, min=1, not_empty=False)
344 rhodecode_create_personal_repo_group = v.StringBoolean(if_missing=False)
345 rhodecode_personal_repo_group_pattern = v.UnicodeString(strip=True, min=1, not_empty=False)
350 346
351 347 return _ApplicationSettingsForm
352 348
@@ -427,7 +423,8 b' def LabsSettingsForm():'
427 423 return _LabSettingsForm
428 424
429 425
430 def ApplicationPermissionsForm(register_choices, extern_activate_choices):
426 def ApplicationPermissionsForm(
427 register_choices, password_reset_choices, extern_activate_choices):
431 428 class _DefaultPermissionsForm(formencode.Schema):
432 429 allow_extra_fields = True
433 430 filter_extra_fields = True
@@ -435,6 +432,7 b' def ApplicationPermissionsForm(register_'
435 432 anonymous = v.StringBoolean(if_missing=False)
436 433 default_register = v.OneOf(register_choices)
437 434 default_register_message = v.UnicodeString()
435 default_password_reset = v.OneOf(password_reset_choices)
438 436 default_extern_activate = v.OneOf(extern_activate_choices)
439 437
440 438 return _DefaultPermissionsForm
@@ -519,7 +517,12 b' def UserExtraIpForm():'
519 517 return _UserExtraIpForm
520 518
521 519
520
522 521 def PullRequestForm(repo_id):
522 class ReviewerForm(formencode.Schema):
523 user_id = v.Int(not_empty=True)
524 reasons = All()
525
523 526 class _PullRequestForm(formencode.Schema):
524 527 allow_extra_fields = True
525 528 filter_extra_fields = True
@@ -531,8 +534,7 b' def PullRequestForm(repo_id):'
531 534 target_ref = v.UnicodeString(strip=True, required=True)
532 535 revisions = All(#v.NotReviewedRevisions(repo_id)(),
533 536 v.UniqueList()(not_empty=True))
534 review_members = v.UniqueList(convert=int)(not_empty=True)
535
537 review_members = formencode.ForEach(ReviewerForm())
536 538 pullrequest_title = v.UnicodeString(strip=True, required=True)
537 539 pullrequest_desc = v.UnicodeString(strip=True, required=False)
538 540
@@ -51,8 +51,8 b' class PermissionModel(BaseModel):'
51 51 'default_user_group_create': None,
52 52 'default_fork_create': None,
53 53 'default_inherit_default_permissions': None,
54
55 54 'default_register': None,
55 'default_password_reset': None,
56 56 'default_extern_activate': None,
57 57
58 58 # object permissions below
@@ -61,57 +61,63 b' class PermissionModel(BaseModel):'
61 61 'default_user_group_perm': None,
62 62 }
63 63
64 def set_global_permission_choices(self, c_obj, translator):
64 def set_global_permission_choices(self, c_obj, gettext_translator):
65
65 66 c_obj.repo_perms_choices = [
66 ('repository.none', translator('None'),),
67 ('repository.read', translator('Read'),),
68 ('repository.write', translator('Write'),),
69 ('repository.admin', translator('Admin'),)]
67 ('repository.none', gettext_translator('None'),),
68 ('repository.read', gettext_translator('Read'),),
69 ('repository.write', gettext_translator('Write'),),
70 ('repository.admin', gettext_translator('Admin'),)]
70 71
71 72 c_obj.group_perms_choices = [
72 ('group.none', translator('None'),),
73 ('group.read', translator('Read'),),
74 ('group.write', translator('Write'),),
75 ('group.admin', translator('Admin'),)]
73 ('group.none', gettext_translator('None'),),
74 ('group.read', gettext_translator('Read'),),
75 ('group.write', gettext_translator('Write'),),
76 ('group.admin', gettext_translator('Admin'),)]
76 77
77 78 c_obj.user_group_perms_choices = [
78 ('usergroup.none', translator('None'),),
79 ('usergroup.read', translator('Read'),),
80 ('usergroup.write', translator('Write'),),
81 ('usergroup.admin', translator('Admin'),)]
79 ('usergroup.none', gettext_translator('None'),),
80 ('usergroup.read', gettext_translator('Read'),),
81 ('usergroup.write', gettext_translator('Write'),),
82 ('usergroup.admin', gettext_translator('Admin'),)]
82 83
83 84 c_obj.register_choices = [
84 ('hg.register.none', translator('Disabled')),
85 ('hg.register.manual_activate', translator('Allowed with manual account activation')),
86 ('hg.register.auto_activate', translator('Allowed with automatic account activation')),]
85 ('hg.register.none', gettext_translator('Disabled')),
86 ('hg.register.manual_activate', gettext_translator('Allowed with manual account activation')),
87 ('hg.register.auto_activate', gettext_translator('Allowed with automatic account activation')),]
88
89 c_obj.password_reset_choices = [
90 ('hg.password_reset.enabled', gettext_translator('Allow password recovery')),
91 ('hg.password_reset.hidden', gettext_translator('Hide password recovery link')),
92 ('hg.password_reset.disabled', gettext_translator('Disable password recovery')),]
87 93
88 94 c_obj.extern_activate_choices = [
89 ('hg.extern_activate.manual', translator('Manual activation of external account')),
90 ('hg.extern_activate.auto', translator('Automatic activation of external account')),]
95 ('hg.extern_activate.manual', gettext_translator('Manual activation of external account')),
96 ('hg.extern_activate.auto', gettext_translator('Automatic activation of external account')),]
91 97
92 98 c_obj.repo_create_choices = [
93 ('hg.create.none', translator('Disabled')),
94 ('hg.create.repository', translator('Enabled'))]
99 ('hg.create.none', gettext_translator('Disabled')),
100 ('hg.create.repository', gettext_translator('Enabled'))]
95 101
96 102 c_obj.repo_create_on_write_choices = [
97 ('hg.create.write_on_repogroup.false', translator('Disabled')),
98 ('hg.create.write_on_repogroup.true', translator('Enabled'))]
103 ('hg.create.write_on_repogroup.false', gettext_translator('Disabled')),
104 ('hg.create.write_on_repogroup.true', gettext_translator('Enabled'))]
99 105
100 106 c_obj.user_group_create_choices = [
101 ('hg.usergroup.create.false', translator('Disabled')),
102 ('hg.usergroup.create.true', translator('Enabled'))]
107 ('hg.usergroup.create.false', gettext_translator('Disabled')),
108 ('hg.usergroup.create.true', gettext_translator('Enabled'))]
103 109
104 110 c_obj.repo_group_create_choices = [
105 ('hg.repogroup.create.false', translator('Disabled')),
106 ('hg.repogroup.create.true', translator('Enabled'))]
111 ('hg.repogroup.create.false', gettext_translator('Disabled')),
112 ('hg.repogroup.create.true', gettext_translator('Enabled'))]
107 113
108 114 c_obj.fork_choices = [
109 ('hg.fork.none', translator('Disabled')),
110 ('hg.fork.repository', translator('Enabled'))]
115 ('hg.fork.none', gettext_translator('Disabled')),
116 ('hg.fork.repository', gettext_translator('Enabled'))]
111 117
112 118 c_obj.inherit_default_permission_choices = [
113 ('hg.inherit_default_perms.false', translator('Disabled')),
114 ('hg.inherit_default_perms.true', translator('Enabled'))]
119 ('hg.inherit_default_perms.false', gettext_translator('Disabled')),
120 ('hg.inherit_default_perms.true', gettext_translator('Enabled'))]
115 121
116 122 def get_default_perms(self, object_perms, suffix):
117 123 defaults = {}
@@ -149,6 +155,9 b' class PermissionModel(BaseModel):'
149 155 if perm.permission.permission_name.startswith('hg.register.'):
150 156 defaults['default_register' + suffix] = perm.permission.permission_name
151 157
158 if perm.permission.permission_name.startswith('hg.password_reset.'):
159 defaults['default_password_reset' + suffix] = perm.permission.permission_name
160
152 161 if perm.permission.permission_name.startswith('hg.extern_activate.'):
153 162 defaults['default_extern_activate' + suffix] = perm.permission.permission_name
154 163
@@ -182,6 +191,7 b' class PermissionModel(BaseModel):'
182 191
183 192 # application perms
184 193 'default_register': 'hg.register.',
194 'default_password_reset': 'hg.password_reset.',
185 195 'default_extern_activate': 'hg.extern_activate.',
186 196
187 197 # object permissions below
@@ -383,6 +393,7 b' class PermissionModel(BaseModel):'
383 393 'default_user_group_perm',
384 394
385 395 'default_register',
396 'default_password_reset',
386 397 'default_extern_activate'])
387 398 self.sa.commit()
388 399 except (DatabaseError,):
@@ -404,6 +415,7 b' class PermissionModel(BaseModel):'
404 415 'default_user_group_perm',
405 416
406 417 'default_register',
418 'default_password_reset',
407 419 'default_extern_activate'])
408 420 self.sa.commit()
409 421 except (DatabaseError,):
@@ -429,6 +441,7 b' class PermissionModel(BaseModel):'
429 441 'default_inherit_default_permissions',
430 442
431 443 'default_register',
444 'default_password_reset',
432 445 'default_extern_activate'])
433 446
434 447 # overwrite default repo permissions
@@ -27,9 +27,11 b' from collections import namedtuple'
27 27 import json
28 28 import logging
29 29 import datetime
30 import urllib
30 31
31 32 from pylons.i18n.translation import _
32 33 from pylons.i18n.translation import lazy_ugettext
34 from sqlalchemy import or_
33 35
34 36 from rhodecode.lib import helpers as h, hooks_utils, diffs
35 37 from rhodecode.lib.compat import OrderedDict
@@ -39,7 +41,7 b' from rhodecode.lib.markup_renderer impor'
39 41 from rhodecode.lib.utils import action_logger
40 42 from rhodecode.lib.utils2 import safe_unicode, safe_str, md5_safe
41 43 from rhodecode.lib.vcs.backends.base import (
42 Reference, MergeResponse, MergeFailureReason)
44 Reference, MergeResponse, MergeFailureReason, UpdateFailureReason)
43 45 from rhodecode.lib.vcs.conf import settings as vcs_settings
44 46 from rhodecode.lib.vcs.exceptions import (
45 47 CommitDoesNotExistError, EmptyRepositoryError)
@@ -59,6 +61,12 b' from rhodecode.model.settings import Vcs'
59 61 log = logging.getLogger(__name__)
60 62
61 63
64 # Data structure to hold the response data when updating commits during a pull
65 # request update.
66 UpdateResponse = namedtuple(
67 'UpdateResponse', 'executed, reason, new, old, changes')
68
69
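A quick sketch of how the `UpdateResponse` namedtuple added here behaves; the `reason` value below is a stand-in string, whereas the real code passes an `UpdateFailureReason` member:

```python
from collections import namedtuple

# Same field layout as the UpdateResponse introduced in this change.
UpdateResponse = namedtuple(
    'UpdateResponse', 'executed, reason, new, old, changes')

# A "nothing to do" response, as returned when the source ref is unchanged.
resp = UpdateResponse(
    executed=False, reason='NO_CHANGE', new=None,
    old='<pull request>', changes=None)

print(resp.executed, resp.reason)
```

Callers can then branch on `resp.executed` instead of unpacking the old bare `(version, changes)` tuple.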
62 70 class PullRequestModel(BaseModel):
63 71
64 72 cls = PullRequest
@@ -88,9 +96,37 b' class PullRequestModel(BaseModel):'
88 96 MergeFailureReason.TARGET_IS_LOCKED: lazy_ugettext(
89 97 'This pull request cannot be merged because the target repository'
90 98 ' is locked.'),
91 MergeFailureReason.MISSING_COMMIT: lazy_ugettext(
99 MergeFailureReason._DEPRECATED_MISSING_COMMIT: lazy_ugettext(
92 100 'This pull request cannot be merged because the target or the '
93 101 'source reference is missing.'),
102 MergeFailureReason.MISSING_TARGET_REF: lazy_ugettext(
103 'This pull request cannot be merged because the target '
104 'reference is missing.'),
105 MergeFailureReason.MISSING_SOURCE_REF: lazy_ugettext(
106 'This pull request cannot be merged because the source '
107 'reference is missing.'),
108 MergeFailureReason.SUBREPO_MERGE_FAILED: lazy_ugettext(
109 'This pull request cannot be merged because of conflicts related '
110 'to sub repositories.'),
111 }
112
113 UPDATE_STATUS_MESSAGES = {
114 UpdateFailureReason.NONE: lazy_ugettext(
115 'Pull request update successful.'),
116 UpdateFailureReason.UNKNOWN: lazy_ugettext(
117 'Pull request update failed because of an unknown error.'),
118 UpdateFailureReason.NO_CHANGE: lazy_ugettext(
119 'No update needed because the source reference is already '
120 'up to date.'),
121 UpdateFailureReason.WRONG_REF_TPYE: lazy_ugettext(
122 'Pull request cannot be updated because the reference type is '
123 'not supported for an update.'),
124 UpdateFailureReason.MISSING_TARGET_REF: lazy_ugettext(
125 'This pull request cannot be updated because the target '
126 'reference is missing.'),
127 UpdateFailureReason.MISSING_SOURCE_REF: lazy_ugettext(
128 'This pull request cannot be updated because the source '
129 'reference is missing.'),
94 130 }
95 131
96 132 def __get_pull_request(self, pull_request):
@@ -116,6 +152,11 b' class PullRequestModel(BaseModel):'
116 152 owner = user.user_id == pull_request.user_id
117 153 return self.check_user_merge(pull_request, user, api) or owner
118 154
155 def check_user_delete(self, pull_request, user):
156 owner = user.user_id == pull_request.user_id
157 _perms = ('repository.admin',)
158 return self._check_perms(_perms, pull_request, user) or owner
159
119 160 def check_user_change_status(self, pull_request, user, api=False):
120 161 reviewer = user.user_id in [x.user_id for x in
121 162 pull_request.reviewers]
@@ -127,12 +168,16 b' class PullRequestModel(BaseModel):'
127 168 def _prepare_get_all_query(self, repo_name, source=False, statuses=None,
128 169 opened_by=None, order_by=None,
129 170 order_dir='desc'):
130 repo = self._get_repo(repo_name)
171 repo = None
172 if repo_name:
173 repo = self._get_repo(repo_name)
174
131 175 q = PullRequest.query()
176
132 177 # source or target
133 if source:
178 if repo and source:
134 179 q = q.filter(PullRequest.source_repo == repo)
135 else:
180 elif repo:
136 181 q = q.filter(PullRequest.target_repo == repo)
137 182
138 183 # closed,opened
@@ -147,7 +192,8 b' class PullRequestModel(BaseModel):'
147 192 order_map = {
148 193 'name_raw': PullRequest.pull_request_id,
149 194 'title': PullRequest.title,
150 'updated_on_raw': PullRequest.updated_on
195 'updated_on_raw': PullRequest.updated_on,
196 'target_repo': PullRequest.target_repo_id
151 197 }
152 198 if order_dir == 'asc':
153 199 q = q.order_by(order_map[order_by].asc())
@@ -305,6 +351,59 b' class PullRequestModel(BaseModel):'
305 351 PullRequestReviewers.user_id == user_id).all()
306 352 ]
307 353
354 def _prepare_participating_query(self, user_id=None, statuses=None,
355 order_by=None, order_dir='desc'):
356 q = PullRequest.query()
357 if user_id:
358 reviewers_subquery = Session().query(
359 PullRequestReviewers.pull_request_id).filter(
360 PullRequestReviewers.user_id == user_id).subquery()
361 user_filter = or_(
362 PullRequest.user_id == user_id,
363 PullRequest.pull_request_id.in_(reviewers_subquery)
364 )
365 q = PullRequest.query().filter(user_filter)
366
367 # closed,opened
368 if statuses:
369 q = q.filter(PullRequest.status.in_(statuses))
370
371 if order_by:
372 order_map = {
373 'name_raw': PullRequest.pull_request_id,
374 'title': PullRequest.title,
375 'updated_on_raw': PullRequest.updated_on,
376 'target_repo': PullRequest.target_repo_id
377 }
378 if order_dir == 'asc':
379 q = q.order_by(order_map[order_by].asc())
380 else:
381 q = q.order_by(order_map[order_by].desc())
382
383 return q
384
385 def count_im_participating_in(self, user_id=None, statuses=None):
386 q = self._prepare_participating_query(user_id, statuses=statuses)
387 return q.count()
388
389 def get_im_participating_in(
390 self, user_id=None, statuses=None, offset=0,
391 length=None, order_by=None, order_dir='desc'):
392 """
393 Get all pull requests that I'm participating in or have opened
394 """
395
396 q = self._prepare_participating_query(
397 user_id, statuses=statuses, order_by=order_by,
398 order_dir=order_dir)
399
400 if length:
401 pull_requests = q.limit(length).offset(offset).all()
402 else:
403 pull_requests = q.all()
404
405 return pull_requests
406
308 407 def get_versions(self, pull_request):
309 408 """
310 409 returns versions of the pull request sorted by ID descending
@@ -333,10 +432,18 b' class PullRequestModel(BaseModel):'
333 432 Session().add(pull_request)
334 433 Session().flush()
335 434
435 reviewer_ids = set()
336 436 # members / reviewers
337 for user_id in set(reviewers):
437 for reviewer_object in reviewers:
438 if isinstance(reviewer_object, tuple):
439 user_id, reasons = reviewer_object
440 else:
441 user_id, reasons = reviewer_object, []
442
338 443 user = self._get_user(user_id)
339 reviewer = PullRequestReviewers(user, pull_request)
444 reviewer_ids.add(user.user_id)
445
446 reviewer = PullRequestReviewers(user, pull_request, reasons)
340 447 Session().add(reviewer)
341 448
342 449 # Set approval status to "Under Review" for all commits which are
@@ -348,7 +455,7 b' class PullRequestModel(BaseModel):'
348 455 pull_request=pull_request
349 456 )
350 457
351 self.notify_reviewers(pull_request, reviewers)
458 self.notify_reviewers(pull_request, reviewer_ids)
352 459 self._trigger_pull_request_hook(
353 460 pull_request, created_by_user, 'create')
354 461
@@ -441,7 +548,7 b' class PullRequestModel(BaseModel):'
441 548 return merge_state
442 549
443 550 def _comment_and_close_pr(self, pull_request, user, merge_state):
444 pull_request.merge_rev = merge_state.merge_commit_id
551 pull_request.merge_rev = merge_state.merge_ref.commit_id
445 552 pull_request.updated_on = datetime.datetime.now()
446 553
447 554 ChangesetCommentsModel().create(
@@ -471,7 +578,6 b' class PullRequestModel(BaseModel):'
471 578 and return the new pull request version and the list
472 579 of commits processed by this update action
473 580 """
474
475 581 pull_request = self.__get_pull_request(pull_request)
476 582 source_ref_type = pull_request.source_ref_parts.type
477 583 source_ref_name = pull_request.source_ref_parts.name
@@ -481,13 +587,26 b' class PullRequestModel(BaseModel):'
481 587 log.debug(
482 588 "Skipping update of pull request %s due to ref type: %s",
483 589 pull_request, source_ref_type)
484 return (None, None)
590 return UpdateResponse(
591 executed=False,
592 reason=UpdateFailureReason.WRONG_REF_TPYE,
593 old=pull_request, new=None, changes=None)
485 594
486 595 source_repo = pull_request.source_repo.scm_instance()
487 source_commit = source_repo.get_commit(commit_id=source_ref_name)
596 try:
597 source_commit = source_repo.get_commit(commit_id=source_ref_name)
598 except CommitDoesNotExistError:
599 return UpdateResponse(
600 executed=False,
601 reason=UpdateFailureReason.MISSING_SOURCE_REF,
602 old=pull_request, new=None, changes=None)
603
488 604 if source_ref_id == source_commit.raw_id:
489 605 log.debug("Nothing changed in pull request %s", pull_request)
490 return (None, None)
606 return UpdateResponse(
607 executed=False,
608 reason=UpdateFailureReason.NO_CHANGE,
609 old=pull_request, new=None, changes=None)
491 610
492 611 # Finally there is a need for an update
493 612 pull_request_version = self._create_version_from_snapshot(pull_request)
@@ -498,10 +617,16 b' class PullRequestModel(BaseModel):'
498 617 target_ref_id = pull_request.target_ref_parts.commit_id
499 618 target_repo = pull_request.target_repo.scm_instance()
500 619
501 if target_ref_type in ('tag', 'branch', 'book'):
502 target_commit = target_repo.get_commit(target_ref_name)
503 else:
504 target_commit = target_repo.get_commit(target_ref_id)
620 try:
621 if target_ref_type in ('tag', 'branch', 'book'):
622 target_commit = target_repo.get_commit(target_ref_name)
623 else:
624 target_commit = target_repo.get_commit(target_ref_id)
625 except CommitDoesNotExistError:
626 return UpdateResponse(
627 executed=False,
628 reason=UpdateFailureReason.MISSING_TARGET_REF,
629 old=pull_request, new=None, changes=None)
505 630
506 631 # re-compute commit ids
507 632 old_commit_ids = set(pull_request.revisions)
@@ -570,7 +695,10 b' class PullRequestModel(BaseModel):'
570 695 Session().commit()
571 696 self._trigger_pull_request_hook(pull_request, pull_request.author,
572 697 'update')
573 return (pull_request_version, changes)
698
699 return UpdateResponse(
700 executed=True, reason=UpdateFailureReason.NONE,
701 old=pull_request, new=pull_request_version, changes=changes)
574 702
575 703 def _create_version_from_snapshot(self, pull_request):
576 704 version = PullRequestVersion()
@@ -588,6 +716,7 b' class PullRequestModel(BaseModel):'
588 716 version._last_merge_source_rev = pull_request._last_merge_source_rev
589 717 version._last_merge_target_rev = pull_request._last_merge_target_rev
590 718 version._last_merge_status = pull_request._last_merge_status
719 version.shadow_merge_ref = pull_request.shadow_merge_ref
591 720 version.merge_rev = pull_request.merge_rev
592 721
593 722 version.revisions = pull_request.revisions
@@ -711,8 +840,21 b' class PullRequestModel(BaseModel):'
711 840 pull_request.updated_on = datetime.datetime.now()
712 841 Session().add(pull_request)
713 842
714 def update_reviewers(self, pull_request, reviewers_ids):
715 reviewers_ids = set(reviewers_ids)
843 def update_reviewers(self, pull_request, reviewer_data):
844 """
845 Update the reviewers in the pull request
846
847 :param pull_request: the pr to update
848 :param reviewer_data: list of tuples [(user, ['reason1', 'reason2'])]
849 """
850
851 reviewers_reasons = {}
852 for user_id, reasons in reviewer_data:
853 if isinstance(user_id, (int, basestring)):
854 user_id = self._get_user(user_id).user_id
855 reviewers_reasons[user_id] = reasons
856
857 reviewers_ids = set(reviewers_reasons.keys())
716 858 pull_request = self.__get_pull_request(pull_request)
717 859 current_reviewers = PullRequestReviewers.query()\
718 860 .filter(PullRequestReviewers.pull_request ==
@@ -728,7 +870,8 b' class PullRequestModel(BaseModel):'
728 870 for uid in ids_to_add:
729 871 changed = True
730 872 _usr = self._get_user(uid)
731 reviewer = PullRequestReviewers(_usr, pull_request)
873 reasons = reviewers_reasons[uid]
874 reviewer = PullRequestReviewers(_usr, pull_request, reasons)
732 875 Session().add(reviewer)
733 876
734 877 self.notify_reviewers(pull_request, ids_to_add)
@@ -753,6 +896,18 b' class PullRequestModel(BaseModel):'
753 896 pull_request_id=pull_request.pull_request_id,
754 897 qualified=True)
755 898
899 def get_shadow_clone_url(self, pull_request):
900 """
901 Returns a qualified URL pointing to the shadow repository. If this
902 pull request is closed there is no shadow repository and ``None``
903 will be returned.
904 """
905 if pull_request.is_closed():
906 return None
907 else:
908 pr_url = urllib.unquote(self.get_url(pull_request))
909 return safe_unicode('{pr_url}/repository'.format(pr_url=pr_url))
910
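The string handling in `get_shadow_clone_url` can be sketched as follows; this is a Python 3 rendition (the diff targets Python 2, where the function lives at `urllib.unquote`), and the URL and helper name are illustrative only:

```python
from urllib.parse import unquote  # Python 2 spells this urllib.unquote


def shadow_clone_url(pr_url, closed=False):
    # Closed pull requests no longer have a shadow repository.
    if closed:
        return None
    # Undo percent-encoding before appending the shadow repo suffix.
    return u'{pr_url}/repository'.format(pr_url=unquote(pr_url))


print(shadow_clone_url('https://code.example.com/repo/pull-request/42'))
```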
756 911 def notify_reviewers(self, pull_request, reviewers_ids):
757 912 # notification to reviewers
758 913 if not reviewers_ids:
@@ -881,6 +1036,7 b' class PullRequestModel(BaseModel):'
881 1036
882 1037 try:
883 1038 resp = self._try_merge(pull_request)
1039 log.debug("Merge response: %s", resp)
884 1040 status = resp.possible, self.merge_status_message(
885 1041 resp.failure_reason)
886 1042 except NotImplementedError:
@@ -923,8 +1079,15 b' class PullRequestModel(BaseModel):'
923 1079 "Trying out if the pull request %s can be merged.",
924 1080 pull_request.pull_request_id)
925 1081 target_vcs = pull_request.target_repo.scm_instance()
926 target_ref = self._refresh_reference(
927 pull_request.target_ref_parts, target_vcs)
1082
1083 # Refresh the target reference.
1084 try:
1085 target_ref = self._refresh_reference(
1086 pull_request.target_ref_parts, target_vcs)
1087 except CommitDoesNotExistError:
1088 merge_state = MergeResponse(
1089 False, False, None, MergeFailureReason.MISSING_TARGET_REF)
1090 return merge_state
928 1091
929 1092 target_locked = pull_request.target_repo.locked
930 1093 if target_locked and target_locked[0]:
@@ -940,7 +1103,7 b' class PullRequestModel(BaseModel):'
940 1103 _last_merge_status == MergeFailureReason.NONE
941 1104 merge_state = MergeResponse(
942 1105 possible, False, None, pull_request._last_merge_status)
943 log.debug("Merge response: %s", merge_state)
1106
944 1107 return merge_state
945 1108
946 1109 def _refresh_reference(self, reference, vcs_repository):
@@ -969,13 +1132,13 b' class PullRequestModel(BaseModel):'
969 1132
970 1133 # Do not store the response if there was an unknown error.
971 1134 if merge_state.failure_reason != MergeFailureReason.UNKNOWN:
972 pull_request._last_merge_source_rev = pull_request.\
973 source_ref_parts.commit_id
1135 pull_request._last_merge_source_rev = \
1136 pull_request.source_ref_parts.commit_id
974 1137 pull_request._last_merge_target_rev = target_reference.commit_id
975 pull_request._last_merge_status = (
976 merge_state.failure_reason)
1138 pull_request._last_merge_status = merge_state.failure_reason
1139 pull_request.shadow_merge_ref = merge_state.merge_ref
977 1140 Session().add(pull_request)
978 Session().flush()
1141 Session().commit()
979 1142
980 1143 return merge_state
981 1144
@@ -148,6 +148,7 b' class RepoModel(BaseModel):'
148 148 qualified=True)
149 149
150 150 def get_users(self, name_contains=None, limit=20, only_active=True):
151
151 152 # TODO: mikhail: move this method to the UserModel.
152 153 query = self.sa.query(User)
153 154 if only_active:
@@ -171,8 +172,9 b' class RepoModel(BaseModel):'
171 172 'first_name': user.name,
172 173 'last_name': user.lastname,
173 174 'username': user.username,
174 'icon_link': h.gravatar_url(user.email, 14),
175 'value_display': h.person(user.email),
175 'email': user.email,
176 'icon_link': h.gravatar_url(user.email, 30),
177 'value_display': h.person(user),
176 178 'value': user.username,
177 179 'value_type': 'user',
178 180 'active': user.active,
@@ -252,9 +254,11 b' class RepoModel(BaseModel):'
252 254
253 255 def desc(desc):
254 256 if c.visual.stylify_metatags:
255 return h.urlify_text(h.escaped_stylize(h.truncate(desc, 60)))
257 desc = h.urlify_text(h.escaped_stylize(desc))
256 258 else:
257 return h.urlify_text(h.html_escape(h.truncate(desc, 60)))
259 desc = h.urlify_text(h.html_escape(desc))
260
261 return _render('repo_desc', desc)
258 262
259 263 def state(repo_state):
260 264 return _render("repo_state", repo_state)
@@ -373,11 +377,11 b' class RepoModel(BaseModel):'
373 377 log.debug('Updating repo %s with params:%s', cur_repo, kwargs)
374 378
375 379 update_keys = [
376 (1, 'repo_enable_downloads'),
377 380 (1, 'repo_description'),
378 (1, 'repo_enable_locking'),
379 381 (1, 'repo_landing_rev'),
380 382 (1, 'repo_private'),
383 (1, 'repo_enable_downloads'),
384 (1, 'repo_enable_locking'),
381 385 (1, 'repo_enable_statistics'),
382 386 (0, 'clone_uri'),
383 387 (0, 'fork_id')
@@ -23,13 +23,13 b''
23 23 repo group model for RhodeCode
24 24 """
25 25
26
26 import os
27 27 import datetime
28 28 import itertools
29 29 import logging
30 import os
31 30 import shutil
32 31 import traceback
32 import string
33 33
34 34 from zope.cachedescriptors.property import Lazy as LazyProperty
35 35
@@ -38,7 +38,7 b' from rhodecode.model import BaseModel'
38 38 from rhodecode.model.db import (
39 39 RepoGroup, UserRepoGroupToPerm, User, Permission, UserGroupRepoGroupToPerm,
40 40 UserGroup, Repository)
41 from rhodecode.model.settings import VcsSettingsModel
41 from rhodecode.model.settings import VcsSettingsModel, SettingsModel
42 42 from rhodecode.lib.caching_query import FromCache
43 43 from rhodecode.lib.utils2 import action_logger_generic
44 44
@@ -48,7 +48,8 b' log = logging.getLogger(__name__)'
48 48 class RepoGroupModel(BaseModel):
49 49
50 50 cls = RepoGroup
51 PERSONAL_GROUP_DESC = '[personal] repo group: owner `%(username)s`'
51 PERSONAL_GROUP_DESC = 'personal repo group of user `%(username)s`'
52 PERSONAL_GROUP_PATTERN = '${username}' # default
52 53
53 54 def _get_user_group(self, users_group):
54 55 return self._get_instance(UserGroup, users_group,
@@ -76,6 +77,39 b' class RepoGroupModel(BaseModel):'
76 77 "sql_cache_short", "get_repo_group_%s" % repo_group_name))
77 78 return repo.scalar()
78 79
80 def get_default_create_personal_repo_group(self):
81 value = SettingsModel().get_setting_by_name(
82 'create_personal_repo_group')
83 return value.app_settings_value if value else False
84
85 def get_personal_group_name_pattern(self):
86 value = SettingsModel().get_setting_by_name(
87 'personal_repo_group_pattern')
88 val = value.app_settings_value if value else None
89 group_template = val or self.PERSONAL_GROUP_PATTERN
90
91 group_template = group_template.lstrip('/')
92 return group_template
93
94 def get_personal_group_name(self, user):
95 template = self.get_personal_group_name_pattern()
96 return string.Template(template).safe_substitute(
97 username=user.username,
98 user_id=user.user_id,
99 )
100
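The personal-group pattern is expanded with `string.Template.safe_substitute`, which leaves unknown placeholders intact rather than raising; a small illustration with made-up user data:

```python
import string
from collections import namedtuple

User = namedtuple('User', 'username user_id')
user = User(username='jdoe', user_id=7)

# Default pattern, matching PERSONAL_GROUP_PATTERN above.
print(string.Template('${username}').safe_substitute(
    username=user.username, user_id=user.user_id))          # jdoe

# A nested custom pattern; the model strips a leading '/' first.
print(string.Template('u/${username}_${user_id}').safe_substitute(
    username=user.username, user_id=user.user_id))          # u/jdoe_7

# safe_substitute leaves unknown placeholders untouched.
print(string.Template('${team}/${username}').safe_substitute(
    username=user.username))                                # ${team}/jdoe
```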
101 def create_personal_repo_group(self, user, commit_early=True):
102 desc = self.PERSONAL_GROUP_DESC % {'username': user.username}
103 personal_repo_group_name = self.get_personal_group_name(user)
104
105 # create a new one
106 RepoGroupModel().create(
107 group_name=personal_repo_group_name,
108 group_description=desc,
109 owner=user.username,
110 personal=True,
111 commit_early=commit_early)
112
79 113 def _create_default_perms(self, new_group):
80 114 # create default permission
81 115 default_perm = 'group.read'
@@ -92,7 +126,8 b' class RepoGroupModel(BaseModel):'
92 126 repo_group_to_perm.user_id = def_user.user_id
93 127 return repo_group_to_perm
94 128
95 def _get_group_name_and_parent(self, group_name_full, repo_in_path=False):
129 def _get_group_name_and_parent(self, group_name_full, repo_in_path=False,
130 get_object=False):
96 131 """
97 132 Gets the group name and a parent group name from the given group name.
98 133 If repo_in_path is set to true, we assume the full path also includes
@@ -114,9 +149,13 b' class RepoGroupModel(BaseModel):'
114 149 if len(_parts) > 1:
115 150 parent_repo_group_name = _parts[0]
116 151
152 parent_group = None
117 153 if parent_repo_group_name:
118 154 parent_group = RepoGroup.get_by_group_name(parent_repo_group_name)
119 155
156 if get_object:
157 return group_name_cleaned, parent_repo_group_name, parent_group
158
120 159 return group_name_cleaned, parent_repo_group_name
121 160
122 161 def check_exist_filesystem(self, group_name, exc_on_failure=True):
@@ -125,7 +164,8 b' class RepoGroupModel(BaseModel):'
125 164
126 165 if os.path.isdir(create_path):
127 166 if exc_on_failure:
128 raise Exception('That directory already exists !')
167 abs_create_path = os.path.abspath(create_path)
168 raise Exception('Directory `{}` already exists!'.format(abs_create_path))
129 169 return False
130 170 return True
131 171
@@ -191,7 +231,7 b' class RepoGroupModel(BaseModel):'
191 231 shutil.move(rm_path, os.path.join(self.repos_path, _d))
192 232
193 233 def create(self, group_name, group_description, owner, just_db=False,
194 copy_permissions=False, commit_early=True):
234 copy_permissions=False, personal=None, commit_early=True):
195 235
196 236 (group_name_cleaned,
197 237 parent_group_name) = RepoGroupModel()._get_group_name_and_parent(group_name)
@@ -199,11 +239,18 b' class RepoGroupModel(BaseModel):'
199 239 parent_group = None
200 240 if parent_group_name:
201 241 parent_group = self._get_repo_group(parent_group_name)
242 if not parent_group:
243 # we tried to create a nested group, but the parent does not
244 # exist
245 raise ValueError(
246 'Parent group `%s` given in `%s` group name '
247 'does not exist yet.' % (parent_group_name, group_name))
202 248
203 # becase we are doing a cleanup, we need to check if such directory
204 # already exists. If we don't do that we can accidentally delete existing
205 # directory via cleanup that can cause data issues, since delete does a
206 # folder rename to special syntax later cleanup functions can delete this
249 # because we are doing a cleanup, we need to check if such a directory
250 # already exists. If we don't, we could accidentally delete an existing
251 # directory via cleanup, which can cause data issues: delete renames the
252 # folder to a special syntax that later cleanup functions are allowed
253 # to remove
207 254 cleanup_group = self.check_exist_filesystem(group_name,
208 255 exc_on_failure=False)
209 256 try:
@@ -213,6 +260,7 b' class RepoGroupModel(BaseModel):'
213 260 new_repo_group.group_description = group_description or group_name
214 261 new_repo_group.parent_group = parent_group
215 262 new_repo_group.group_name = group_name
263 new_repo_group.personal = personal
216 264
217 265 self.sa.add(new_repo_group)
218 266
@@ -614,11 +662,15 b' class RepoGroupModel(BaseModel):'
614 662 def repo_group_lnk(repo_group_name):
615 663 return _render('repo_group_name', repo_group_name)
616 664
617 def desc(desc):
665 def desc(desc, personal):
666 prefix = h.escaped_stylize(u'[personal] ') if personal else ''
667
618 668 if c.visual.stylify_metatags:
619 return h.urlify_text(h.escaped_stylize(h.truncate(desc, 60)))
669 desc = h.urlify_text(prefix + h.escaped_stylize(desc))
620 670 else:
621 return h.urlify_text(h.html_escape(h.truncate(desc, 60)))
671 desc = h.urlify_text(prefix + h.html_escape(desc))
672
673 return _render('repo_group_desc', desc)
622 674
623 675 def repo_group_actions(repo_group_id, repo_group_name, gr_count):
624 676 return _render(
@@ -637,7 +689,7 b' class RepoGroupModel(BaseModel):'
637 689 "menu": quick_menu(group.group_name),
638 690 "name": repo_group_lnk(group.group_name),
639 691 "name_raw": group.group_name,
640 "desc": desc(group.group_description),
692 "desc": desc(group.group_description, group.personal),
641 693 "top_level_repos": 0,
642 694 "owner": user_profile(group.user.username)
643 695 }
@@ -25,13 +25,11 b' Scm model for RhodeCode'
25 25 import os.path
26 26 import re
27 27 import sys
28 import time
29 28 import traceback
30 29 import logging
31 30 import cStringIO
32 31 import pkg_resources
33 32
34 import pylons
35 33 from pylons.i18n.translation import _
36 34 from sqlalchemy import func
37 35 from zope.cachedescriptors.property import Lazy as LazyProperty
@@ -50,12 +48,12 b' from rhodecode.lib.exceptions import Non'
50 48 from rhodecode.lib import hooks_utils, caches
51 49 from rhodecode.lib.utils import (
52 50 get_filesystem_repos, action_logger, make_db_config)
53 from rhodecode.lib.utils2 import (
54 safe_str, safe_unicode, get_server_url, md5)
51 from rhodecode.lib.utils2 import (safe_str, safe_unicode)
52 from rhodecode.lib.system_info import get_system_info
55 53 from rhodecode.model import BaseModel
56 54 from rhodecode.model.db import (
57 55 Repository, CacheKey, UserFollowing, UserLog, User, RepoGroup,
58 PullRequest, DbMigrateVersion)
56 PullRequest)
59 57 from rhodecode.model.settings import VcsSettingsModel
60 58
61 59 log = logging.getLogger(__name__)
@@ -764,11 +762,15 b' class ScmModel(BaseModel):'
764 762 :param repo:
765 763 """
766 764
767 hist_l = []
768 choices = []
769 765 repo = self._get_repo(repo)
770 hist_l.append(['rev:tip', _('latest tip')])
771 choices.append('rev:tip')
766
767 hist_l = [
768 ['rev:tip', _('latest tip')]
769 ]
770 choices = [
771 'rev:tip'
772 ]
773
772 774 if not repo:
773 775 return choices, hist_l
774 776
@@ -882,197 +884,8 b' class ScmModel(BaseModel):'
882 884 self.install_svn_hooks(repo)
883 885
884 886 def get_server_info(self, environ=None):
885 import platform
886 import rhodecode
887 import pkg_resources
888 from rhodecode.model.meta import Base as sql_base, Session
889 from sqlalchemy.engine import url
890 from rhodecode.lib.base import get_server_ip_addr, get_server_port
891 from rhodecode.lib.vcs.backends.git import discover_git_version
892 from rhodecode.model.gist import GIST_STORE_LOC
893
894 try:
895 # cygwin cannot have yet psutil support.
896 import psutil
897 except ImportError:
898 psutil = None
899
900 environ = environ or {}
901 _NA = 'NOT AVAILABLE'
902 _memory = _NA
903 _uptime = _NA
904 _boot_time = _NA
905 _cpu = _NA
906 _disk = dict(percent=0, used=0, total=0, error='')
907 _load = {'1_min': _NA, '5_min': _NA, '15_min': _NA}
908
909 model = VcsSettingsModel()
910 storage_path = model.get_repos_location()
911 gist_storage_path = os.path.join(storage_path, GIST_STORE_LOC)
912 archive_storage_path = rhodecode.CONFIG.get('archive_cache_dir', '')
913 search_index_storage_path = rhodecode.CONFIG.get('search.location', '')
914
915 if psutil:
916 # disk storage
917 try:
918 _disk = dict(psutil.disk_usage(storage_path)._asdict())
919 except Exception as e:
920 log.exception('Failed to fetch disk info')
921 _disk = {'percent': 0, 'used': 0, 'total': 0, 'error': str(e)}
922
923 # memory
924 _memory = dict(psutil.virtual_memory()._asdict())
925 _memory['percent2'] = psutil._common.usage_percent(
926 (_memory['total'] - _memory['free']),
927 _memory['total'], 1)
928
929 # load averages
930 if hasattr(psutil.os, 'getloadavg'):
931 _load = dict(zip(
932 ['1_min', '5_min', '15_min'], psutil.os.getloadavg()))
933 _uptime = time.time() - psutil.boot_time()
934 _boot_time = psutil.boot_time()
935 _cpu = psutil.cpu_percent(0.5)
936
937 mods = dict([(p.project_name, p.version)
938 for p in pkg_resources.working_set])
939
940 def get_storage_size(storage_path):
941 sizes = []
942 for file_ in os.listdir(storage_path):
943 storage_file = os.path.join(storage_path, file_)
944 if os.path.isfile(storage_file):
945 try:
946 sizes.append(os.path.getsize(storage_file))
947 except OSError:
948 log.exception('Failed to get size of storage file %s',
949 storage_file)
950 pass
951
952 return sum(sizes)
953
954 # archive cache storage
955 _disk_archive = {'percent': 0, 'used': 0, 'total': 0}
956 try:
957 archive_storage_path_exists = os.path.isdir(
958 archive_storage_path)
959 if archive_storage_path and archive_storage_path_exists:
960 used = get_storage_size(archive_storage_path)
961 _disk_archive.update({
962 'used': used,
963 'total': used,
964 })
965 except Exception as e:
966 log.exception('failed to fetch archive cache storage')
967 _disk_archive['error'] = str(e)
968
969 # search index storage
970 _disk_index = {'percent': 0, 'used': 0, 'total': 0}
971 try:
972 search_index_storage_path_exists = os.path.isdir(
973 search_index_storage_path)
974 if search_index_storage_path_exists:
975 used = get_storage_size(search_index_storage_path)
976 _disk_index.update({
977 'percent': 100,
978 'used': used,
979 'total': used,
980 })
981 except Exception as e:
982 log.exception('failed to fetch search index storage')
983 _disk_index['error'] = str(e)
984
985 # gist storage
986 _disk_gist = {'percent': 0, 'used': 0, 'total': 0, 'items': 0}
987 try:
988 items_count = 0
989 used = 0
990 for root, dirs, files in os.walk(safe_str(gist_storage_path)):
991 if root == gist_storage_path:
992 items_count = len(dirs)
993
994 for f in files:
995 try:
996 used += os.path.getsize(os.path.join(root, f))
997 except OSError:
998 pass
999 _disk_gist.update({
1000 'percent': 100,
1001 'used': used,
1002 'total': used,
1003 'items': items_count
1004 })
1005 except Exception as e:
1006 log.exception('failed to fetch gist storage items')
1007 _disk_gist['error'] = str(e)
1008
1009 # GIT info
1010 git_ver = discover_git_version()
1011
1012 # SVN info
1013 # TODO: johbo: Add discover_svn_version to replace this code.
1014 try:
1015 import svn.core
1016 svn_ver = svn.core.SVN_VERSION
1017 except ImportError:
1018 svn_ver = None
1019
1020 # DB stuff
1021 db_info = url.make_url(rhodecode.CONFIG['sqlalchemy.db1.url'])
1022 db_type = db_info.__to_string__()
1023 try:
1024 engine = sql_base.metadata.bind
1025 db_server_info = engine.dialect._get_server_version_info(
1026 Session.connection(bind=engine))
1027 db_version = '%s %s' % (db_info.drivername,
1028 '.'.join(map(str, db_server_info)))
1029 except Exception:
1030 log.exception('failed to fetch db version')
1031 db_version = '%s %s' % (db_info.drivername, '?')
1032
1033 db_migrate = DbMigrateVersion.query().filter(
1034 DbMigrateVersion.repository_id == 'rhodecode_db_migrations').one()
1035 db_migrate_version = db_migrate.version
1036
1037 info = {
1038 'py_version': ' '.join(platform._sys_version()),
1039 'py_path': sys.executable,
1040 'py_modules': sorted(mods.items(), key=lambda k: k[0].lower()),
1041
1042 'platform': safe_unicode(platform.platform()),
1043 'storage': storage_path,
1044 'archive_storage': archive_storage_path,
1045 'index_storage': search_index_storage_path,
1046 'gist_storage': gist_storage_path,
1047
1048
1049 'db_type': db_type,
1050 'db_version': db_version,
1051 'db_migrate_version': db_migrate_version,
1052
1053 'rhodecode_version': rhodecode.__version__,
1054 'rhodecode_config_ini': rhodecode.CONFIG.get('__file__'),
1055 'server_ip': '%s:%s' % (
1056 get_server_ip_addr(environ, log_errors=False),
1057 get_server_port(environ)
1058 ),
1059 'server_id': rhodecode.CONFIG.get('instance_id'),
1060
1061 'git_version': safe_unicode(git_ver),
1062 'hg_version': mods.get('mercurial'),
1063 'svn_version': svn_ver,
1064
1065 'uptime': _uptime,
1066 'boot_time': _boot_time,
1067 'load': _load,
1068 'cpu': _cpu,
1069 'memory': _memory,
1070 'disk': _disk,
1071 'disk_archive': _disk_archive,
1072 'disk_gist': _disk_gist,
1073 'disk_index': _disk_index,
1074 }
1075 return info
887 server_info = get_system_info(environ)
888 return server_info
1076 889
1077 890
1078 891 def _check_rhodecode_hook(hook_path):
@@ -35,7 +35,7 b' from sqlalchemy.sql.expression import tr'
35 35 from rhodecode import events
36 36 from rhodecode.lib.utils2 import (
37 37 safe_unicode, get_current_rhodecode_user, action_logger_generic,
38 AttributeDict)
38 AttributeDict, str2bool)
39 39 from rhodecode.lib.caching_query import FromCache
40 40 from rhodecode.model import BaseModel
41 41 from rhodecode.model.auth_token import AuthTokenModel
@@ -104,12 +104,13 b' class UserModel(BaseModel):'
104 104 'cur_user': cur_user
105 105 }
106 106
107 if 'create_repo_group' in form_data:
108 user_data['create_repo_group'] = str2bool(
109 form_data.get('create_repo_group'))
110
107 111 try:
108 if form_data.get('create_repo_group'):
109 user_data['create_repo_group'] = True
110 112 if form_data.get('password_change'):
111 113 user_data['force_password_change'] = True
112
113 114 return UserModel().create_or_update(**user_data)
114 115 except Exception:
115 116 log.error(traceback.format_exc())
@@ -177,7 +178,7 b' class UserModel(BaseModel):'
177 178 self, username, password, email, firstname='', lastname='',
178 179 active=True, admin=False, extern_type=None, extern_name=None,
179 180 cur_user=None, plugin=None, force_password_change=False,
180 allow_to_create_user=True, create_repo_group=False,
181 allow_to_create_user=True, create_repo_group=None,
181 182 updating_user_id=None, language=None, strict_creation_check=True):
182 183 """
183 184 Creates a new instance if not found, or updates current one
@@ -222,8 +223,8 b' class UserModel(BaseModel):'
222 223 # in case it's a plugin we don't care
223 224 if not plugin:
224 225
225 # first check if we gave crypted password back, and if it matches
226 # it's not password change
226 # first check if we gave crypted password back, and if it
227 # matches it's not password change
227 228 if new_user.password == password:
228 229 return False
229 230
@@ -233,6 +234,12 b' class UserModel(BaseModel):'
233 234
234 235 return False
235 236
237 # read settings on default personal repo group creation
238 if create_repo_group is None:
239 default_create_repo_group = RepoGroupModel()\
240 .get_default_create_personal_repo_group()
241 create_repo_group = default_create_repo_group
242
236 243 user_data = {
237 244 'username': username,
238 245 'password': password,
@@ -319,17 +326,16 b' class UserModel(BaseModel):'
319 326 self.sa.add(new_user)
320 327
321 328 if not edit and create_repo_group:
322 # create new group same as username, and make this user an owner
323 desc = RepoGroupModel.PERSONAL_GROUP_DESC % {'username': username}
324 RepoGroupModel().create(group_name=username,
325 group_description=desc,
326 owner=username, commit_early=False)
329 RepoGroupModel().create_personal_repo_group(
330 new_user, commit_early=False)
331
327 332 if not edit:
328 333 # add the RSS token
329 334 AuthTokenModel().create(username,
330 335 description='Generated feed token',
331 336 role=AuthTokenModel.cls.ROLE_FEED)
332 337 log_create_user(created_by=cur_user, **new_user.get_dict())
338 events.trigger(events.UserPostCreate(user_data))
333 339 return new_user
334 340 except (DatabaseError,):
335 341 log.error(traceback.format_exc())
@@ -189,18 +189,15 b' class UserGroupModel(BaseModel):'
189 189 self._log_user_changes('removed from', user_group, removed)
190 190
191 191 def _clean_members_data(self, members_data):
192 # TODO: anderson: this should be in the form validation but I couldn't
193 # make it work there as it conflicts with the other validator
194 192 if not members_data:
195 193 members_data = []
196 194
197 if isinstance(members_data, basestring):
198 new_members = [members_data]
199 else:
200 new_members = members_data
201
202 new_members = [int(uid) for uid in new_members]
203 return new_members
195 members = []
196 for user in members_data:
197 uid = int(user['member_user_id'])
198 if uid not in members and user['type'] in ['new', 'existing']:
199 members.append(uid)
200 return members
204 201
205 202 def update(self, user_group, form_data):
206 203 user_group = self._get_user_group(user_group)
@@ -21,7 +21,6 b''
21 21 import unicodedata
22 22
23 23
24
25 24 def strip_preparer(value):
26 25 """
27 26 strips given values using .strip() function
@@ -23,7 +23,7 b' import os'
23 23 import colander
24 24
25 25 from rhodecode.translation import _
26 from rhodecode.model.validation_schema import validators, preparers
26 from rhodecode.model.validation_schema import preparers
27 27
28 28
29 29 def nodes_to_sequence(nodes, colander_node=None):
@@ -181,5 +181,3 b' class GistSchema(colander.MappingSchema)'
181 181 validator=colander.OneOf([Gist.GIST_PRIVATE, Gist.GIST_PUBLIC]))
182 182
183 183 nodes = Nodes()
184
185
@@ -21,9 +21,220 b''
21 21
22 22 import colander
23 23
24
24 from rhodecode.translation import _
25 25 from rhodecode.model.validation_schema import validators, preparers, types
26 26
27 27
28 def get_group_and_repo(repo_name):
29 from rhodecode.model.repo_group import RepoGroupModel
30 return RepoGroupModel()._get_group_name_and_parent(
31 repo_name, get_object=True)
32
33
34 @colander.deferred
35 def deferred_can_write_to_group_validator(node, kw):
36 old_values = kw.get('old_values') or {}
37 request_user = kw.get('user')
38
39 def can_write_group_validator(node, value):
40 from rhodecode.lib.auth import (
41 HasPermissionAny, HasRepoGroupPermissionAny)
42 from rhodecode.model.repo_group import RepoGroupModel
43
44 messages = {
45 'invalid_parent_repo_group':
46 _(u"Parent repository group `{}` does not exist"),
47 # permissions denied we expose as not existing, to prevent
48 # resource discovery
49 'permission_denied_parent_group':
50 _(u"Parent repository group `{}` does not exist"),
51 'permission_denied_root':
52 _(u"You do not have the permission to store "
53 u"repository groups in the root location.")
54 }
55
56 value = value['repo_group_name']
57 parent_group_name = value
58
59 is_root_location = value is types.RootLocation
60
61 # NOT initialized validators, we must call them
62 can_create_repo_groups_at_root = HasPermissionAny(
63 'hg.admin', 'hg.repogroup.create.true')
64
65 if is_root_location:
66 if can_create_repo_groups_at_root(user=request_user):
 67 # we can create repo groups at the root level; no more checks
 68 # are required
69 return
70 else:
71 raise colander.Invalid(node, messages['permission_denied_root'])
72
73 # check if the parent repo group actually exists
74 parent_group = None
75 if parent_group_name:
76 parent_group = RepoGroupModel().get_by_group_name(parent_group_name)
77 if value and not parent_group:
78 raise colander.Invalid(
79 node, messages['invalid_parent_repo_group'].format(
80 parent_group_name))
81
82 # check if we have permissions to create new groups under
83 # parent repo group
84 # create repositories with write permission on group is set to true
85 create_on_write = HasPermissionAny(
86 'hg.create.write_on_repogroup.true')(user=request_user)
87
88 group_admin = HasRepoGroupPermissionAny('group.admin')(
89 parent_group_name, 'can write into group validator', user=request_user)
90 group_write = HasRepoGroupPermissionAny('group.write')(
91 parent_group_name, 'can write into group validator', user=request_user)
92
93 # creation by write access is currently disabled. Needs thinking if
94 # we want to allow this...
95 forbidden = not (group_admin or (group_write and create_on_write and 0))
96
97 if parent_group and forbidden:
98 msg = messages['permission_denied_parent_group'].format(
99 parent_group_name)
100 raise colander.Invalid(node, msg)
101
102 return can_write_group_validator
103
104
105 @colander.deferred
106 def deferred_repo_group_owner_validator(node, kw):
107
108 def repo_owner_validator(node, value):
109 from rhodecode.model.db import User
110 existing = User.get_by_username(value)
111 if not existing:
 112 msg = _(u'Repo group owner with id `{}` does not exist').format(
113 value)
114 raise colander.Invalid(node, msg)
115
116 return repo_owner_validator
117
118
119 @colander.deferred
120 def deferred_unique_name_validator(node, kw):
121 request_user = kw.get('user')
122 old_values = kw.get('old_values') or {}
123
124 def unique_name_validator(node, value):
125 from rhodecode.model.db import Repository, RepoGroup
126 name_changed = value != old_values.get('group_name')
127
128 existing = Repository.get_by_repo_name(value)
129 if name_changed and existing:
130 msg = _(u'Repository with name `{}` already exists').format(value)
131 raise colander.Invalid(node, msg)
132
133 existing_group = RepoGroup.get_by_group_name(value)
134 if name_changed and existing_group:
135 msg = _(u'Repository group with name `{}` already exists').format(
136 value)
137 raise colander.Invalid(node, msg)
138 return unique_name_validator
139
140
141 @colander.deferred
142 def deferred_repo_group_name_validator(node, kw):
143 return validators.valid_name_validator
144
145
146 class GroupType(colander.Mapping):
147 def _validate(self, node, value):
148 try:
149 return dict(repo_group_name=value)
150 except Exception as e:
151 raise colander.Invalid(
152 node, '"${val}" is not a mapping type: ${err}'.format(
153 val=value, err=e))
154
155 def deserialize(self, node, cstruct):
156 if cstruct is colander.null:
157 return cstruct
158
159 appstruct = super(GroupType, self).deserialize(node, cstruct)
160 validated_name = appstruct['repo_group_name']
161
 162 # inject group based on the already-deserialized data
163 (repo_group_name_without_group,
164 parent_group_name,
165 parent_group) = get_group_and_repo(validated_name)
166
167 appstruct['repo_group_name_without_group'] = repo_group_name_without_group
168 appstruct['repo_group_name'] = parent_group_name or types.RootLocation
169 if parent_group:
170 appstruct['repo_group_id'] = parent_group.group_id
171
172 return appstruct
173
174
175 class GroupSchema(colander.SchemaNode):
176 schema_type = GroupType
177 validator = deferred_can_write_to_group_validator
178 missing = colander.null
179
180
181 class RepoGroup(GroupSchema):
182 repo_group_name = colander.SchemaNode(
183 types.GroupNameType())
184 repo_group_id = colander.SchemaNode(
185 colander.String(), missing=None)
186 repo_group_name_without_group = colander.SchemaNode(
187 colander.String(), missing=None)
188
189
190 class RepoGroupAccessSchema(colander.MappingSchema):
191 repo_group = RepoGroup()
192
193
194 class RepoGroupNameUniqueSchema(colander.MappingSchema):
195 unique_repo_group_name = colander.SchemaNode(
196 colander.String(),
197 validator=deferred_unique_name_validator)
198
199
28 200 class RepoGroupSchema(colander.Schema):
29 group_name = colander.SchemaNode(types.GroupNameType())
201
202 repo_group_name = colander.SchemaNode(
203 types.GroupNameType(),
204 validator=deferred_repo_group_name_validator)
205
206 repo_group_owner = colander.SchemaNode(
207 colander.String(),
208 validator=deferred_repo_group_owner_validator)
209
210 repo_group_description = colander.SchemaNode(
211 colander.String(), missing='')
212
213 repo_group_copy_permissions = colander.SchemaNode(
214 types.StringBooleanType(),
215 missing=False)
216
217 repo_group_enable_locking = colander.SchemaNode(
218 types.StringBooleanType(),
219 missing=False)
220
221 def deserialize(self, cstruct):
222 """
 223 Custom deserialize that chains validation: schema fields first,
 224 then permissions, and uniqueness as the last step
225 """
226
227 appstruct = super(RepoGroupSchema, self).deserialize(cstruct)
228 validated_name = appstruct['repo_group_name']
229
230 # second pass to validate permissions to repo_group
231 second = RepoGroupAccessSchema().bind(**self.bindings)
232 appstruct_second = second.deserialize({'repo_group': validated_name})
233 # save result
234 appstruct['repo_group'] = appstruct_second['repo_group']
235
 236 # third pass to validate uniqueness
237 third = RepoGroupNameUniqueSchema().bind(**self.bindings)
238 third.deserialize({'unique_repo_group_name': validated_name})
239
240 return appstruct
@@ -20,8 +20,302 b''
20 20
21 21 import colander
22 22
23 from rhodecode.translation import _
23 24 from rhodecode.model.validation_schema import validators, preparers, types
24 25
26 DEFAULT_LANDING_REF = 'rev:tip'
27
28
29 def get_group_and_repo(repo_name):
30 from rhodecode.model.repo_group import RepoGroupModel
31 return RepoGroupModel()._get_group_name_and_parent(
32 repo_name, get_object=True)
33
34
35 @colander.deferred
36 def deferred_repo_type_validator(node, kw):
37 options = kw.get('repo_type_options', [])
38 return colander.OneOf([x for x in options])
39
40
41 @colander.deferred
42 def deferred_repo_owner_validator(node, kw):
43
44 def repo_owner_validator(node, value):
45 from rhodecode.model.db import User
46 existing = User.get_by_username(value)
47 if not existing:
 48 msg = _(u'Repo owner with id `{}` does not exist').format(value)
49 raise colander.Invalid(node, msg)
50
51 return repo_owner_validator
52
53
54 @colander.deferred
55 def deferred_landing_ref_validator(node, kw):
56 options = kw.get('repo_ref_options', [DEFAULT_LANDING_REF])
57 return colander.OneOf([x for x in options])
58
59
60 @colander.deferred
61 def deferred_fork_of_validator(node, kw):
62 old_values = kw.get('old_values') or {}
63
64 def fork_of_validator(node, value):
65 from rhodecode.model.db import Repository, RepoGroup
66 existing = Repository.get_by_repo_name(value)
67 if not existing:
 68 msg = _(u'Fork with id `{}` does not exist').format(value)
69 raise colander.Invalid(node, msg)
70 elif old_values['repo_name'] == existing.repo_name:
 71 msg = _(u'Cannot set the fork of '
 72 u'this repository to itself').format(value)
73 raise colander.Invalid(node, msg)
74
75 return fork_of_validator
76
77
78 @colander.deferred
79 def deferred_can_write_to_group_validator(node, kw):
80 request_user = kw.get('user')
81 old_values = kw.get('old_values') or {}
82
83 def can_write_to_group_validator(node, value):
84 """
85 Checks if given repo path is writable by user. This includes checks if
86 user is allowed to create repositories under root path or under
87 repo group paths
88 """
89
90 from rhodecode.lib.auth import (
91 HasPermissionAny, HasRepoGroupPermissionAny)
92 from rhodecode.model.repo_group import RepoGroupModel
93
94 messages = {
95 'invalid_repo_group':
96 _(u"Repository group `{}` does not exist"),
97 # permissions denied we expose as not existing, to prevent
98 # resource discovery
99 'permission_denied':
100 _(u"Repository group `{}` does not exist"),
101 'permission_denied_root':
102 _(u"You do not have the permission to store "
103 u"repositories in the root location.")
104 }
105
106 value = value['repo_group_name']
107
108 is_root_location = value is types.RootLocation
109 # NOT initialized validators, we must call them
110 can_create_repos_at_root = HasPermissionAny(
111 'hg.admin', 'hg.create.repository')
112
 113 # if value is the root location, we simply need to check if we can
 114 # write to the root location
115 if is_root_location:
116 if can_create_repos_at_root(user=request_user):
 117 # we can create repos at the root level; no more checks
 118 # are required
119 return
120 else:
121 # "fake" node name as repo_name, otherwise we oddly report
 122 # the error as if it was coming from repo_group
123 # however repo_group is empty when using root location.
124 node.name = 'repo_name'
125 raise colander.Invalid(node, messages['permission_denied_root'])
126
 127 # parent group does not exist? raise an error
128 repo_group = RepoGroupModel().get_by_group_name(value)
129 if value and not repo_group:
130 raise colander.Invalid(
131 node, messages['invalid_repo_group'].format(value))
132
133 gr_name = repo_group.group_name
134
135 # create repositories with write permission on group is set to true
136 create_on_write = HasPermissionAny(
137 'hg.create.write_on_repogroup.true')(user=request_user)
138
139 group_admin = HasRepoGroupPermissionAny('group.admin')(
140 gr_name, 'can write into group validator', user=request_user)
141 group_write = HasRepoGroupPermissionAny('group.write')(
142 gr_name, 'can write into group validator', user=request_user)
143
144 forbidden = not (group_admin or (group_write and create_on_write))
145
146 # TODO: handling of old values, and detecting no-change in path
147 # to skip permission checks in such cases. This only needs to be
148 # implemented if we use this schema in forms as well
149
150 # gid = (old_data['repo_group'].get('group_id')
151 # if (old_data and 'repo_group' in old_data) else None)
152 # value_changed = gid != safe_int(value)
153 # new = not old_data
154
 155 # do check if we changed the value; there's a case where someone
 156 # got write permissions to a repository revoked after creating it.
 157 # We don't need to check permission if the value of the group in
 158 # the form box didn't change
159 # if value_changed or new:
160 # # parent group need to be existing
161 # TODO: ENDS HERE
162
163 if repo_group and forbidden:
164 msg = messages['permission_denied'].format(value)
165 raise colander.Invalid(node, msg)
166
167 return can_write_to_group_validator
168
25 169
26 class RepoSchema(colander.Schema):
27 repo_name = colander.SchemaNode(types.GroupNameType())
170 @colander.deferred
171 def deferred_unique_name_validator(node, kw):
172 request_user = kw.get('user')
173 old_values = kw.get('old_values') or {}
174
175 def unique_name_validator(node, value):
176 from rhodecode.model.db import Repository, RepoGroup
177 name_changed = value != old_values.get('repo_name')
178
179 existing = Repository.get_by_repo_name(value)
180 if name_changed and existing:
181 msg = _(u'Repository with name `{}` already exists').format(value)
182 raise colander.Invalid(node, msg)
183
184 existing_group = RepoGroup.get_by_group_name(value)
185 if name_changed and existing_group:
186 msg = _(u'Repository group with name `{}` already exists').format(
187 value)
188 raise colander.Invalid(node, msg)
189 return unique_name_validator
190
191
192 @colander.deferred
193 def deferred_repo_name_validator(node, kw):
194 return validators.valid_name_validator
195
196
197 class GroupType(colander.Mapping):
198 def _validate(self, node, value):
199 try:
200 return dict(repo_group_name=value)
201 except Exception as e:
202 raise colander.Invalid(
203 node, '"${val}" is not a mapping type: ${err}'.format(
204 val=value, err=e))
205
206 def deserialize(self, node, cstruct):
207 if cstruct is colander.null:
208 return cstruct
209
210 appstruct = super(GroupType, self).deserialize(node, cstruct)
211 validated_name = appstruct['repo_group_name']
212
 213 # inject group based on the already-deserialized data
214 (repo_name_without_group,
215 parent_group_name,
216 parent_group) = get_group_and_repo(validated_name)
217
218 appstruct['repo_name_without_group'] = repo_name_without_group
219 appstruct['repo_group_name'] = parent_group_name or types.RootLocation
220 if parent_group:
221 appstruct['repo_group_id'] = parent_group.group_id
222
223 return appstruct
224
225
226 class GroupSchema(colander.SchemaNode):
227 schema_type = GroupType
228 validator = deferred_can_write_to_group_validator
229 missing = colander.null
230
231
232 class RepoGroup(GroupSchema):
233 repo_group_name = colander.SchemaNode(
234 types.GroupNameType())
235 repo_group_id = colander.SchemaNode(
236 colander.String(), missing=None)
237 repo_name_without_group = colander.SchemaNode(
238 colander.String(), missing=None)
239
240
241 class RepoGroupAccessSchema(colander.MappingSchema):
242 repo_group = RepoGroup()
243
244
245 class RepoNameUniqueSchema(colander.MappingSchema):
246 unique_repo_name = colander.SchemaNode(
247 colander.String(),
248 validator=deferred_unique_name_validator)
249
250
251 class RepoSchema(colander.MappingSchema):
252
253 repo_name = colander.SchemaNode(
254 types.RepoNameType(),
255 validator=deferred_repo_name_validator)
256
257 repo_type = colander.SchemaNode(
258 colander.String(),
259 validator=deferred_repo_type_validator)
260
261 repo_owner = colander.SchemaNode(
262 colander.String(),
263 validator=deferred_repo_owner_validator)
264
265 repo_description = colander.SchemaNode(
266 colander.String(), missing='')
267
268 repo_landing_commit_ref = colander.SchemaNode(
269 colander.String(),
270 validator=deferred_landing_ref_validator,
271 preparers=[preparers.strip_preparer],
272 missing=DEFAULT_LANDING_REF)
273
274 repo_clone_uri = colander.SchemaNode(
275 colander.String(),
276 validator=colander.All(colander.Length(min=1)),
277 preparers=[preparers.strip_preparer],
278 missing='')
279
280 repo_fork_of = colander.SchemaNode(
281 colander.String(),
282 validator=deferred_fork_of_validator,
283 missing=None)
284
285 repo_private = colander.SchemaNode(
286 types.StringBooleanType(),
287 missing=False)
288 repo_copy_permissions = colander.SchemaNode(
289 types.StringBooleanType(),
290 missing=False)
291 repo_enable_statistics = colander.SchemaNode(
292 types.StringBooleanType(),
293 missing=False)
294 repo_enable_downloads = colander.SchemaNode(
295 types.StringBooleanType(),
296 missing=False)
297 repo_enable_locking = colander.SchemaNode(
298 types.StringBooleanType(),
299 missing=False)
300
301 def deserialize(self, cstruct):
302 """
 303 Custom deserialize that chains validation: schema fields first,
 304 then permissions, and uniqueness as the last step
305 """
306
307 # first pass, to validate given data
308 appstruct = super(RepoSchema, self).deserialize(cstruct)
309 validated_name = appstruct['repo_name']
310
311 # second pass to validate permissions to repo_group
312 second = RepoGroupAccessSchema().bind(**self.bindings)
313 appstruct_second = second.deserialize({'repo_group': validated_name})
314 # save result
315 appstruct['repo_group'] = appstruct_second['repo_group']
316
 317 # third pass to validate uniqueness
318 third = RepoNameUniqueSchema().bind(**self.bindings)
319 third.deserialize({'unique_repo_name': validated_name})
320
321 return appstruct
@@ -33,8 +33,7 b' class SearchParamsSchema(colander.Mappin'
33 33 search_sort = colander.SchemaNode(
34 34 colander.String(),
35 35 missing='newfirst',
36 validator=colander.OneOf(
37 ['oldfirst', 'newfirst']))
36 validator=colander.OneOf(['oldfirst', 'newfirst']))
38 37 page_limit = colander.SchemaNode(
39 38 colander.Integer(),
40 39 missing=10,
@@ -18,22 +18,105 b''
18 18 # RhodeCode Enterprise Edition, including its added features, Support services,
19 19 # and proprietary license terms, please see https://rhodecode.com/licenses/
20 20
21 import re
22
21 23 import colander
24 from rhodecode.model.validation_schema import preparers
25 from rhodecode.model.db import User, UserGroup
26
27
28 class _RootLocation(object):
29 pass
30
31 RootLocation = _RootLocation()
32
33
 34 def _normalize(separator, path):
 35
 36 if not path:
 37 return ''
 38 elif path is colander.null:
 39 return colander.null
 40
 41 parts = path.split(separator)
 22 42
 23 from rhodecode.model.db import User, UserGroup
 43 def bad_parts(value):
 44 if not value:
 45 return False
 46 if re.match(r'^[.]+$', value):
 47 return False
 48
 49 return True
 50
 51 def slugify(value):
 52 value = preparers.slugify_preparer(value)
 53 value = re.sub(r'[.]{2,}', '.', value)
 54 return value
 55
 56 clean_parts = [slugify(item) for item in parts if item]
 57 path = filter(bad_parts, clean_parts)
 58 return separator.join(path)
59
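The `_normalize` helper above can be sketched standalone (omitting the RhodeCode-specific `slugify_preparer` step, which is assumed to slugify each segment):

```python
import re

def normalize_path(path, sep='/'):
    """Drop empty segments, collapse runs of dots, and remove
    dot-only segments, mirroring the `_normalize` helper above
    (minus the slugify step, which is RhodeCode-specific)."""
    cleaned = []
    for part in path.split(sep):
        if not part:
            continue
        part = re.sub(r'[.]{2,}', '.', part)  # '...' -> '.'
        if re.match(r'^[.]+$', part):         # segment made only of dots
            continue
        cleaned.append(part)
    return sep.join(cleaned)
```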
60
61 class RepoNameType(colander.String):
62 SEPARATOR = '/'
63
64 def deserialize(self, node, cstruct):
65 result = super(RepoNameType, self).deserialize(node, cstruct)
66 if cstruct is colander.null:
67 return colander.null
68 return self._normalize(result)
69
70 def _normalize(self, path):
71 return _normalize(self.SEPARATOR, path)
24 72
25 73
26 74 class GroupNameType(colander.String):
27 75 SEPARATOR = '/'
28 76
29 77 def deserialize(self, node, cstruct):
78 if cstruct is RootLocation:
79 return cstruct
80
30 81 result = super(GroupNameType, self).deserialize(node, cstruct)
31 return self._replace_extra_slashes(result)
82 if cstruct is colander.null:
83 return colander.null
84 return self._normalize(result)
85
86 def _normalize(self, path):
87 return _normalize(self.SEPARATOR, path)
88
89
90 class StringBooleanType(colander.String):
91 true_values = ['true', 't', 'yes', 'y', 'on', '1']
92 false_values = ['false', 'f', 'no', 'n', 'off', '0']
32 93
33 def _replace_extra_slashes(self, path):
34 path = path.split(self.SEPARATOR)
35 path = [item for item in path if item]
36 return self.SEPARATOR.join(path)
94 def serialize(self, node, appstruct):
95 if appstruct is colander.null:
96 return colander.null
97 if not isinstance(appstruct, bool):
98 raise colander.Invalid(node, '%r is not a boolean' % appstruct)
99
100 return appstruct and 'true' or 'false'
101
102 def deserialize(self, node, cstruct):
103 if cstruct is colander.null:
104 return colander.null
105
106 if isinstance(cstruct, bool):
107 return cstruct
108
109 if not isinstance(cstruct, basestring):
110 raise colander.Invalid(node, '%r is not a string' % cstruct)
111
112 value = cstruct.lower()
113 if value in self.true_values:
114 return True
115 elif value in self.false_values:
116 return False
117 else:
118 raise colander.Invalid(
119 node, '{} value cannot be translated to bool'.format(value))
37 120
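The coercion implemented by `StringBooleanType.deserialize` can be illustrated in plain Python (a sketch of the same logic, not the colander type itself):

```python
TRUE_VALUES = ('true', 't', 'yes', 'y', 'on', '1')
FALSE_VALUES = ('false', 'f', 'no', 'n', 'off', '0')

def parse_string_boolean(value):
    """Coerce a checkbox/string value to bool, mirroring the
    deserialize() logic of StringBooleanType above."""
    if isinstance(value, bool):
        return value
    if not isinstance(value, str):
        raise ValueError('%r is not a string' % (value,))
    lowered = value.lower()
    if lowered in TRUE_VALUES:
        return True
    if lowered in FALSE_VALUES:
        return False
    raise ValueError('%s value cannot be translated to bool' % value)
```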
38 121
39 122 class UserOrUserGroupType(colander.SchemaType):
@@ -1,9 +1,11 b''
1 1 import os
2 import re
2 3
3 4 import ipaddress
4 5 import colander
5 6
6 7 from rhodecode.translation import _
8 from rhodecode.lib.utils2 import glob2re
7 9
8 10
9 11 def ip_addr_validator(node, value):
@@ -13,3 +15,34 b' def ip_addr_validator(node, value):'
13 15 except ValueError:
14 16 msg = _(u'Please enter a valid IPv4 or IpV6 address')
15 17 raise colander.Invalid(node, msg)
18
19
20 class IpAddrValidator(object):
21 def __init__(self, strict=True):
22 self.strict = strict
23
24 def __call__(self, node, value):
25 try:
 26 # this raises a ValueError if the address is not IPv4 or IPv6
27 ipaddress.ip_network(value, strict=self.strict)
28 except ValueError:
 29 msg = _(u'Please enter a valid IPv4 or IPv6 address')
30 raise colander.Invalid(node, msg)
31
32
33 def glob_validator(node, value):
34 try:
35 re.compile('^' + glob2re(value) + '$')
36 except Exception:
37 msg = _(u'Invalid glob pattern')
38 raise colander.Invalid(node, msg)
39
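`glob2re` above is RhodeCode's own helper; the stdlib `fnmatch.translate` performs an analogous glob-to-regex conversion, so the validator's idea can be sketched with it (an illustrative substitute, not the actual RhodeCode implementation):

```python
import fnmatch
import re

def is_valid_glob(pattern):
    """Return True if the glob pattern compiles to a usable regex,
    using stdlib fnmatch.translate in place of RhodeCode's glob2re."""
    try:
        re.compile(fnmatch.translate(pattern))
        return True
    except re.error:
        return False
```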
40
41 def valid_name_validator(node, value):
42 from rhodecode.model.validation_schema import types
43 if value is types.RootLocation:
44 return
45
46 msg = _('Name must start with a letter or number. Got `{}`').format(value)
 47 if not re.match(r'^[a-zA-Z0-9]{1,}', value):
48 raise colander.Invalid(node, msg)
@@ -45,7 +45,8 b''
45 45 @font-family-sans-serif: "Helvetica Neue", Helvetica, Arial, sans-serif;
46 46 @font-family-serif: Georgia, "Times New Roman", Times, serif;
47 47 //** Default monospace fonts for `<code>`, `<kbd>`, and `<pre>`.
48 @font-family-monospace: Menlo, Monaco, Consolas, "Courier New", monospace;
48 @font-family-monospace: Consolas, "Liberation Mono", Menlo, Monaco, Courier, monospace;
49
49 50 @font-family-base: @font-family-sans-serif;
50 51
51 52 @font-size-base: 14px;
@@ -188,6 +188,14 b' input[type="button"] {'
188 188 padding: @padding * 1.2;
189 189 }
190 190
191 .btn-group {
192 display: inline-block;
193 .btn {
194 float: left;
195 margin: 0 0 0 -1px;
196 }
197 }
198
191 199 .btn-link {
192 200 background: transparent;
193 201 border: none;
@@ -326,15 +334,15 b' input[type="submit"],'
326 334 input[type="reset"] {
327 335 &.btn-danger {
328 336 &:extend(.btn-danger);
329
337
330 338 &:focus {
331 339 outline: 0;
332 340 }
333
341
334 342 &:hover {
335 343 &:extend(.btn-danger:hover);
336 344 }
337
345
338 346 &.btn-link {
339 347 &:extend(.btn-link);
340 348 color: @alert2;
@@ -391,6 +391,7 b' td.injected_diff{'
391 391 div.code-body {
392 392 max-width: 1124px;
393 393 overflow-x: auto;
394 overflow-y: hidden;
394 395 padding: 0;
395 396 }
396 397 div.diffblock {
@@ -505,20 +506,19 b' div.codeblock {'
505 506
506 507 th,
507 508 td {
508 padding: .5em !important;
509 border: @border-thickness solid @border-default-color !important;
509 padding: .5em;
510 border: @border-thickness solid @border-default-color;
510 511 }
511 512 }
512 513
513 514 table {
514 width: 0 !important;
515 border: 0px !important;
515 border: 0px;
516 516 margin: 0;
517 517 letter-spacing: normal;
518
519
518
519
520 520 td {
521 border: 0px !important;
521 border: 0px;
522 522 vertical-align: top;
523 523 }
524 524 }
@@ -568,7 +568,7 b' div.annotatediv { margin-left: 2px; marg'
568 568 .CodeMirror ::-moz-selection { background: @rchighlightblue; }
569 569
570 570 .code { display: block; border:0px !important; }
571 .code-highlight,
571 .code-highlight, /* TODO: dan: merge codehilite into code-highlight */
572 572 .codehilite {
573 573 .hll { background-color: #ffffcc }
574 574 .c { color: #408080; font-style: italic } /* Comment */
@@ -640,3 +640,537 b' pre.literal-block, .codehilite pre{'
640 640 .border-radius(@border-radius);
641 641 background-color: @grey7;
642 642 }
643
644
645 /* START NEW CODE BLOCK CSS */
646
647 @cb-line-height: 18px;
648 @cb-line-code-padding: 10px;
649 @cb-text-padding: 5px;
650
651 @pill-padding: 2px 7px;
652
653 input.filediff-collapse-state {
654 display: none;
655
656 &:checked + .filediff { /* file diff is collapsed */
657 .cb {
658 display: none
659 }
660 .filediff-collapse-indicator {
661 border-width: 9px 0 9px 15.6px;
662 border-color: transparent transparent transparent #ccc;
663 }
664 .filediff-menu {
665 display: none;
666 }
667 margin: -1px 0 0 0;
668 }
669
670 &+ .filediff { /* file diff is expanded */
671 .filediff-collapse-indicator {
672 border-width: 15.6px 9px 0 9px;
673 border-color: #ccc transparent transparent transparent;
674 }
675 .filediff-menu {
676 display: block;
677 }
678 margin: 20px 0;
679 &:nth-child(2) {
680 margin: 0;
681 }
682 }
683 }
684 .cs_files {
685 clear: both;
686 }
687
688 .diffset-menu {
689 margin-bottom: 20px;
690 }
691 .diffset {
692 margin: 20px auto;
693 .diffset-heading {
694 border: 1px solid @grey5;
695 margin-bottom: -1px;
696 // margin-top: 20px;
697 h2 {
698 margin: 0;
699 line-height: 38px;
700 padding-left: 10px;
701 }
702 .btn {
703 margin: 0;
704 }
705 background: @grey6;
706 display: block;
707 padding: 5px;
708 }
709 .diffset-heading-warning {
710 background: @alert3-inner;
711 border: 1px solid @alert3;
712 }
713 &.diffset-comments-disabled {
714 .cb-comment-box-opener, .comment-inline-form, .cb-comment-add-button {
715 display: none !important;
716 }
717 }
718 }
719
720 .pill {
721 display: block;
722 float: left;
723 padding: @pill-padding;
724 }
725 .pill-group {
726 .pill {
727 opacity: .8;
728 &:first-child {
729 border-radius: @border-radius 0 0 @border-radius;
730 }
731 &:last-child {
732 border-radius: 0 @border-radius @border-radius 0;
733 }
734 &:only-child {
735 border-radius: @border-radius;
736 }
737 }
738 }
739
740 .filediff {
741 border: 1px solid @grey5;
742
743 /* START OVERRIDES */
744 .code-highlight {
745 border: none; // TODO: remove this border from the global
746 // .code-highlight, it doesn't belong there
747 }
748 label {
749 margin: 0; // TODO: remove this margin definition from the global label
750 // styles, it doesn't belong there - if margins on labels
751 // are needed for a form they should be defined
752 // in the form's class
753 }
754 /* END OVERRIDES */
755
756 * {
757 box-sizing: border-box;
758 }
759 .filediff-anchor {
760 visibility: hidden;
761 }
762 &:hover {
763 .filediff-anchor {
764 visibility: visible;
765 }
766 }
767
768 .filediff-collapse-indicator {
769 width: 0;
770 height: 0;
771 border-style: solid;
772 float: left;
773 margin: 2px 2px 0 0;
774 cursor: pointer;
775 }
776
777 .filediff-heading {
778 background: @grey7;
779 cursor: pointer;
780 display: block;
781 padding: 5px 10px;
782 }
783 .filediff-heading:after {
784 content: "";
785 display: table;
786 clear: both;
787 }
788 .filediff-heading:hover {
789 background: #e1e9f4 !important;
790 }
791
792 .filediff-menu {
793 float: right;
794
795 &> a, &> span {
796 padding: 5px;
797 display: block;
798 float: left
799 }
800 }
801
802 .pill {
803 &[op="name"] {
804 background: none;
805 color: @grey2; // note: overridden by `color: white` below
806 opacity: 1;
807 color: white;
808 }
809 &[op="limited"] {
810 background: @grey2;
811 color: white;
812 }
813 &[op="binary"] {
814 background: @color7;
815 color: white;
816 }
817 &[op="modified"] {
818 background: @alert1;
819 color: white;
820 }
821 &[op="renamed"] {
822 background: @color4;
823 color: white;
824 }
825 &[op="mode"] {
826 background: @grey3;
827 color: white;
828 }
829 &[op="symlink"] {
830 background: @color8;
831 color: white;
832 }
833
834 &[op="added"] { /* added lines */
835 background: @alert1;
836 color: white;
837 }
838 &[op="deleted"] { /* deleted lines */
839 background: @alert2;
840 color: white;
841 }
842
843 &[op="created"] { /* created file */
844 background: @alert1;
845 color: white;
846 }
847 &[op="removed"] { /* deleted file */
848 background: @color5;
849 color: white;
850 }
851 }
852
853 .filediff-collapse-button, .filediff-expand-button {
854 cursor: pointer;
855 }
856 .filediff-collapse-button {
857 display: inline;
858 }
859 .filediff-expand-button {
860 display: none;
861 }
862 .filediff-collapsed .filediff-collapse-button {
863 display: none;
864 }
865 .filediff-collapsed .filediff-expand-button {
866 display: inline;
867 }
868
869 @comment-padding: 5px;
870
871 /**** COMMENTS ****/
872
873 .filediff-menu {
874 .show-comment-button {
875 display: none;
876 }
877 }
878 &.hide-comments {
879 .inline-comments {
880 display: none;
881 }
882 .filediff-menu {
883 .show-comment-button {
884 display: inline;
885 }
886 .hide-comment-button {
887 display: none;
888 }
889 }
890 }
891
892 .hide-line-comments {
893 .inline-comments {
894 display: none;
895 }
896 }
897 .inline-comments {
898 border-radius: @border-radius;
899 background: @grey6;
900 .comment {
901 margin: 0;
902 border-radius: @border-radius;
903 }
904 .comment-outdated {
905 opacity: 0.5;
906 }
907 .comment-inline {
908 background: white;
909 padding: (@comment-padding + 3px) @comment-padding;
910 border: @comment-padding solid @grey6;
911
912 .text {
913 border: none;
914 }
915 .meta {
916 border-bottom: 1px solid @grey6;
917 padding-bottom: 10px;
918 }
919 }
920 .comment-selected {
921 border-left: 6px solid @comment-highlight-color;
922 }
923 .comment-inline-form {
924 padding: @comment-padding;
925 display: none;
926 }
927 .cb-comment-add-button {
928 margin: @comment-padding;
929 }
930 /* hide add comment button when form is open */
931 .comment-inline-form-open ~ .cb-comment-add-button {
932 display: none;
933 }
934 .comment-inline-form-open {
935 display: block;
936 }
938 /* hide add comment button when a form exists but there are no comments */
938 .comment-inline-form:first-child + .cb-comment-add-button {
939 display: none;
940 }
942 /* hide add comment button when there are no comments and no form */
942 .cb-comment-add-button:first-child {
943 display: none;
944 }
946 /* hide add comment button when the only comment is being deleted */
946 .comment-deleting:first-child + .cb-comment-add-button {
947 display: none;
948 }
949 }
950 /**** END COMMENTS ****/
951
952 }
953
954
955 table.cb {
956 width: 100%;
957 border-collapse: collapse;
958
959 .cb-text {
960 padding: @cb-text-padding;
961 }
962 .cb-hunk {
963 padding: @cb-text-padding;
964 }
965 .cb-expand {
966 display: none;
967 }
968 .cb-collapse {
969 display: inline;
970 }
971 &.cb-collapsed {
972 .cb-line {
973 display: none;
974 }
975 .cb-expand {
976 display: inline;
977 }
978 .cb-collapse {
979 display: none;
980 }
981 }
982
983 /* intentionally general selector: .cb-line-selected must override it,
984 and both use !important because the td itself may have a random
985 background color generated by annotation blocks. TL;DR: if you change this,
986 make sure annotated-block selection and line selection in file view still work */
987 .cb-line-fresh .cb-content {
988 background: white !important;
989 }
990 .cb-warning {
991 background: #fff4dd;
992 }
993
994 &.cb-diff-sideside {
995 td {
996 &.cb-content {
997 width: 50%;
998 }
999 }
1000 }
1001
1002 tr {
1003 &.cb-annotate {
1004 border-top: 1px solid #eee;
1005
1006 &+ .cb-line {
1007 border-top: 1px solid #eee;
1008 }
1009
1010 &:first-child {
1011 border-top: none;
1012 &+ .cb-line {
1013 border-top: none;
1014 }
1015 }
1016 }
1017
1018 &.cb-hunk {
1019 font-family: @font-family-monospace;
1020 color: rgba(0, 0, 0, 0.3);
1021
1022 td {
1023 &:first-child {
1024 background: #edf2f9;
1025 }
1026 &:last-child {
1027 background: #f4f7fb;
1028 }
1029 }
1030 }
1031 }
1032
1033
1034 td {
1035 vertical-align: top;
1036 padding: 0;
1037
1038 &.cb-content {
1039 font-size: 12.35px;
1040
1041 &.cb-line-selected .cb-code {
1042 background: @comment-highlight-color !important;
1043 }
1044
1045 span.cb-code {
1046 line-height: @cb-line-height;
1047 padding-left: @cb-line-code-padding;
1048 padding-right: @cb-line-code-padding;
1049 display: block;
1050 white-space: pre-wrap;
1051 font-family: @font-family-monospace;
1052 word-break: break-word;
1053 .nonl {
1054 color: @color5;
1055 }
1056 }
1057
1058 &> button.cb-comment-box-opener {
1059
1060 padding: 2px 5px 1px 5px;
1061 margin-left: 0px;
1062 margin-top: -1px;
1063
1064 border-radius: @border-radius;
1065 position: absolute;
1066 display: none;
1067 }
1068 .cb-comment {
1069 margin-top: 10px;
1070 white-space: normal;
1071 }
1072 }
1073 &:hover {
1074 button.cb-comment-box-opener {
1075 display: block;
1076 }
1077 &+ td button.cb-comment-box-opener {
1078 display: block
1079 }
1080 }
1081
1082 &.cb-data {
1083 text-align: right;
1084 width: 30px;
1085 font-family: @font-family-monospace;
1086
1087 .icon-comment {
1088 cursor: pointer;
1089 }
1090 &.cb-line-selected > div {
1091 display: block;
1092 background: @comment-highlight-color !important;
1093 line-height: @cb-line-height;
1094 color: rgba(0, 0, 0, 0.3);
1095 }
1096 }
1097
1098 &.cb-lineno {
1099 padding: 0;
1100 width: 50px;
1101 color: rgba(0, 0, 0, 0.3);
1102 text-align: right;
1103 border-right: 1px solid #eee;
1104 font-family: @font-family-monospace;
1105
1106 a::before {
1107 content: attr(data-line-no);
1108 }
1109 &.cb-line-selected a {
1110 background: @comment-highlight-color !important;
1111 }
1112
1113 a {
1114 display: block;
1115 padding-right: @cb-line-code-padding;
1116 padding-left: @cb-line-code-padding;
1117 line-height: @cb-line-height;
1118 color: rgba(0, 0, 0, 0.3);
1119 }
1120 }
1121
1122 &.cb-empty {
1123 background: @grey7;
1124 }
1125
1126 ins {
1127 color: black;
1128 background: #a6f3a6;
1129 text-decoration: none;
1130 }
1131 del {
1132 color: black;
1133 background: #f8cbcb;
1134 text-decoration: none;
1135 }
1136 &.cb-addition {
1137 background: #ecffec;
1138
1139 &.blob-lineno {
1140 background: #ddffdd;
1141 }
1142 }
1143 &.cb-deletion {
1144 background: #ffecec;
1145
1146 &.blob-lineno {
1147 background: #ffdddd;
1148 }
1149 }
1150
1151 &.cb-annotate-info {
1152 width: 320px;
1153 min-width: 320px;
1154 max-width: 320px;
1155 padding: 5px 2px;
1156 font-size: 13px;
1157
1158 strong.cb-annotate-message {
1159 padding: 5px 0;
1160 white-space: pre-line;
1161 display: inline-block;
1162 }
1163 .rc-user {
1164 float: none;
1165 padding: 0 6px 0 17px;
1166 min-width: auto;
1167 min-height: auto;
1168 }
1169 }
1170
1171 &.cb-annotate-revision {
1172 cursor: pointer;
1173 text-align: right;
1174 }
1175 }
1176 }
@@ -379,6 +379,8 b' span.CodeMirror-selectedtext { backgroun'
379 379 }
380 380
381 381 .CodeMirror-hint-entry .gravatar {
382 height: @gravatar-size;
383 width: @gravatar-size;
382 384 margin-right: 4px;
383 385 }
384 386
@@ -90,28 +90,50 b''
90 90 height: 40px;
91 91 }
92 92
93 .deform-two-field-sequence .deform-seq-container .deform-seq-item label {
94 display: none;
95 }
96 .deform-two-field-sequence .deform-seq-container .deform-seq-item:first-child label {
97 display: block;
98 }
99 .deform-two-field-sequence .deform-seq-container .deform-seq-item .panel-heading {
100 display: none;
101 }
102 .deform-two-field-sequence .deform-seq-container .deform-seq-item.form-group {
103 margin: 0;
104 }
105 .deform-two-field-sequence .deform-seq-container .deform-seq-item .deform-seq-item-group .form-group {
106 width: 45%; padding: 0 2px; float: left; clear: none;
107 }
108 .deform-two-field-sequence .deform-seq-container .deform-seq-item .deform-seq-item-group > .panel {
109 padding: 0;
110 margin: 5px 0;
111 border: none;
112 }
113 .deform-two-field-sequence .deform-seq-container .deform-seq-item .deform-seq-item-group > .panel > .panel-body {
114 padding: 0;
93 .deform-full-field-sequence.control-inputs {
94 width: 100%;
115 95 }
116 96
97 .deform-table-sequence {
98 .deform-seq-container {
99 .deform-seq-item {
100 margin: 0;
101 label {
102 display: none;
103 }
104 .panel-heading {
105 display: none;
106 }
107 .deform-seq-item-group > .panel {
108 padding: 0;
109 margin: 5px 0;
110 border: none;
111 &> .panel-body {
112 padding: 0;
113 }
114 }
115 &:first-child label {
116 display: block;
117 }
118 }
119 }
120 }
121 .deform-table-2-sequence {
122 .deform-seq-container {
123 .deform-seq-item {
124 .form-group {
125 width: 45% !important; padding: 0 2px; float: left; clear: none;
126 }
127 }
128 }
129 }
130 .deform-table-3-sequence {
131 .deform-seq-container {
132 .deform-seq-item {
133 .form-group {
134 width: 30% !important; padding: 0 2px; float: left; clear: none;
135 }
136 }
137 }
138 }
117 139 }
@@ -13,6 +13,16 b' a { cursor: pointer; }'
13 13 }
14 14 }
15 15
16 .clearinner:after { /* clears all floating divs inside a block */
17 content: "";
18 display: table;
19 clear: both;
20 }
21
22 .js-template { /* mark a template for javascript use */
23 display: none;
24 }
25
16 26 .linebreak {
17 27 display: block;
18 28 }
@@ -188,6 +188,10 b''
188 188 line-height: 1.5em;
189 189 }
190 190 }
191
192 p.help-block {
193 margin-left: 0;
194 }
191 195 }
192 196
193 197 .user-menu.submenu {
@@ -142,6 +142,10 b' input.inline[type="file"] {'
142 142 color: @grey2;
143 143 }
144 144
145 .alert {
146 margin: @padding 0;
147 }
148
145 149 .error-branding {
146 150 font-family: @text-semibold;
147 151 color: @grey4;
@@ -435,6 +439,23 b' ul.auth_plugins {'
435 439 //--- MODULES ------------------//
436 440
437 441
442 // Server Announcement
443 #server-announcement {
444 width: 95%;
445 margin: @padding auto;
446 padding: @padding;
447 border-width: 2px;
448 border-style: solid;
449 .border-radius(2px);
450 font-family: @text-bold;
451
452 &.info { border-color: @alert4; background-color: @alert4-inner; }
453 &.warning { border-color: @alert3; background-color: @alert3-inner; }
454 &.error { border-color: @alert2; background-color: @alert2-inner; }
455 &.success { border-color: @alert1; background-color: @alert1-inner; }
456 &.neutral { border-color: @grey3; background-color: @grey6; }
457 }
458
438 459 // Fixed Sidebar Column
439 460 .sidebar-col-wrapper {
440 461 padding-left: @sidebar-all-width;
@@ -1173,6 +1194,8 b' table.integrations {'
1173 1194 }
1174 1195
1175 1196 img {
1197 height: @gravatar-size;
1198 width: @gravatar-size;
1176 1199 margin-right: 1em;
1177 1200 }
1178 1201
@@ -1261,6 +1284,9 b' table.integrations {'
1261 1284 width: 100%;
1262 1285 overflow: auto;
1263 1286 }
1287 .reviewer_reason {
1288 padding-left: 20px;
1289 }
1264 1290 .reviewer_status {
1265 1291 display: inline-block;
1266 1292 vertical-align: top;
@@ -1317,6 +1343,11 b' table.integrations {'
1317 1343 .pr-details-title {
1318 1344 padding-bottom: 8px;
1319 1345 border-bottom: @border-thickness solid @grey5;
1346
1347 .action_button.disabled {
1348 color: @grey4;
1349 cursor: inherit;
1350 }
1320 1351 .action_button {
1321 1352 color: @rcblue;
1322 1353 }
@@ -1332,7 +1363,34 b' table.integrations {'
1332 1363 margin-top: 0;
1333 1364 padding: 0;
1334 1365 list-style: outside none none;
1366
1367 img {
1368 height: @gravatar-size;
1369 width: @gravatar-size;
1370 margin-right: .5em;
1371 margin-left: 3px;
1372 }
1373
1374 .to-delete {
1375 .user {
1376 text-decoration: line-through;
1377 }
1378 }
1335 1379 }
1380
1381 // new entry in group_members
1382 .td-author-new-entry {
1383 background-color: rgba(red(@alert1), green(@alert1), blue(@alert1), 0.3);
1384 }
1385
1386 .usergroup_member_remove {
1387 width: 16px;
1388 margin-bottom: 10px;
1389 padding: 0;
1390 color: black !important;
1391 cursor: pointer;
1392 }
1393
1336 1394 .reviewer_ac .ac-input {
1337 1395 width: 92%;
1338 1396 margin-bottom: 1em;
@@ -5,6 +5,7 b''
5 5 @import 'variables';
6 6 @import 'buttons';
7 7 @import 'alerts';
8 @import 'type';
8 9
9 10 :root {
10 11 --primary-color: @rcblue;
@@ -1,10 +1,11 b''
1 1 @font-face {
2 2 font-family: 'rcicons';
3 src: url('../fonts/RCIcons/rcicons.eot?18742416');
4 src: url('../fonts/RCIcons/rcicons.eot?18742416#iefix') format('embedded-opentype'),
5 url('../fonts/RCIcons/rcicons.woff?18742416') format('woff'),
6 url('../fonts/RCIcons/rcicons.ttf?18742416') format('truetype'),
7 url('../fonts/RCIcons/rcicons.svg?18742416#rcicons') format('svg');
3 src: url('../fonts/RCIcons/rcicons.eot?88634534');
4 src: url('../fonts/RCIcons/rcicons.eot?88634534#iefix') format('embedded-opentype'),
5 url('../fonts/RCIcons/rcicons.woff2?88634534') format('woff2'),
6 url('../fonts/RCIcons/rcicons.woff?88634534') format('woff'),
7 url('../fonts/RCIcons/rcicons.ttf?88634534') format('truetype'),
8 url('../fonts/RCIcons/rcicons.svg?88634534#rcicons') format('svg');
8 9 font-weight: normal;
9 10 font-style: normal;
10 11 }
@@ -14,7 +15,7 b''
14 15 @media screen and (-webkit-min-device-pixel-ratio:0) {
15 16 @font-face {
16 17 font-family: 'rcicons';
17 src: url('../font/rcicons.svg?18742415#rcicons') format('svg');
18 src: url('../fonts/RCIcons/rcicons.svg?88634534#rcicons') format('svg');
18 19 }
19 20 }
20 21 */
@@ -35,7 +36,7 b''
35 36 /* For safety - reset parent styles, that can break glyph codes*/
36 37 font-variant: normal;
37 38 text-transform: none;
38
39
39 40 /* fix buttons height, for twitter bootstrap */
40 41 line-height: 1em;
41 42
@@ -46,40 +47,44 b''
46 47 /* you can be more comfortable with increased icons size */
47 48 /* font-size: 120%; */
48 49
50 /* Font smoothing. That was taken from TWBS */
51 -webkit-font-smoothing: antialiased;
52 -moz-osx-font-smoothing: grayscale;
53
49 54 /* Uncomment for 3D effect */
50 55 /* text-shadow: 1px 1px 1px rgba(127, 127, 127, 0.3); */
51 56 }
52 57
53 .icon-svn-transparent:before { content: '\e800'; } /* '' */
54 .icon-hg-transparent:before { content: '\e801'; } /* '' */
55 .icon-git-transparent:before { content: '\e802'; } /* '' */
56 .icon-svn:before { content: '\e803'; } /* '' */
57 .icon-hg:before { content: '\e804'; } /* '' */
58 .icon-git:before { content: '\e805'; } /* '' */
59 .icon-plus:before { content: '\e806'; } /* '' */
60 .icon-minus:before { content: '\e807'; } /* '' */
61 .icon-remove:before { content: '\e808'; } /* '' */
62 .icon-bookmark:before { content: '\e809'; } /* '' */
63 .icon-branch:before { content: '\e80a'; } /* '' */
64 .icon-tag:before { content: '\e80b'; } /* '' */
65 .icon-lock:before { content: '\e80c'; } /* '' */
66 .icon-unlock:before { content: '\e80d'; } /* '' */
58
59 .icon-git:before { content: '\e82a'; } /* '' */
60 .icon-hg:before { content: '\e82d'; } /* '' */
61 .icon-svn:before { content: '\e82e'; } /* '' */
62 .icon-plus:before { content: '\e813'; } /* '' */
63 .icon-minus:before { content: '\e814'; } /* '' */
64 .icon-remove:before { content: '\e815'; } /* '' */
65 .icon-bookmark:before { content: '\e803'; } /* '' */
66 .icon-branch:before { content: '\e804'; } /* '' */
67 .icon-merge:before { content: '\e833'; } /* '' */
68 .icon-tag:before { content: '\e805'; } /* '' */
69 .icon-lock:before { content: '\e806'; } /* '' */
70 .icon-unlock:before { content: '\e807'; } /* '' */
67 71 .icon-delete:before,
68 .icon-false:before { content: '\e80e'; } /* '' */
72 .icon-false:before { content: '\e800'; } /* '' */
69 73 .icon-ok:before,
70 .icon-true:before { content: '\e80f'; } /* '' */
71 .icon-comment:before { content: '\e810'; } /* '' */
72 .icon-feed:before { content: '\e811'; } /* '' */
73 .icon-right:before { content: '\e812'; } /* '' */
74 .icon-left:before { content: '\e813'; } /* '' */
75 .icon-arrow_down:before { content: '\e814'; } /* '' */
76 .icon-group:before { content: '\e815'; } /* '' */
77 .icon-folder:before { content: '\e816'; } /* '' */
78 .icon-fork:before { content: '\e817'; } /* '' */
79 .icon-more:before { content: '\e818'; } /* '' */
80 .icon-comment-add:before { content: '\e819'; } /* '' */
81
82
74 .icon-true:before { content: '\e801'; } /* '' */
75 .icon-feed:before { content: '\e808'; } /* '' */
76 .icon-left:before { content: '\e809'; } /* '' */
77 .icon-right:before { content: '\e80a'; } /* '' */
78 .icon-arrow_down:before { content: '\e80b'; } /* '' */
79 .icon-arrow_up:before { content: '\e832'; } /* '' */
80 .icon-group:before { content: '\e80f'; } /* '' */
81 .icon-folder:before { content: '\e810'; } /* '' */
82 .icon-fork:before { content: '\e811'; } /* '' */
83 .icon-more:before { content: '\e812'; } /* '' */
84 .icon-comment:before { content: '\e802'; } /* '' */
85 .icon-comment-add:before { content: '\e82f'; } /* '' */
86 .icon-comment-toggle:before { content: '\e830'; } /* '' */
87 .icon-rhodecode:before { content: '\e831'; } /* '' */
83 88 .icon-remove-sign:before { &:extend(.icon-remove:before); }
84 89 .icon-repo-private:before,
85 90 .icon-repo-lock:before { &:extend(.icon-lock:before); }
@@ -129,6 +129,13 b' table.dataTable {'
129 129
130 130 &.td-description {
131 131 min-width: 350px;
132
133 &.truncate, .truncate-wrap {
134 white-space: nowrap;
135 overflow: hidden;
136 text-overflow: ellipsis;
137 max-width: 450px;
138 }
132 139 }
133 140
134 141 &.td-componentname {
@@ -399,7 +399,6 b' dd {'
399 399 .dl-horizontal {
400 400
401 401 overflow: hidden;
402 margin-top: -5px;
403 402 margin-bottom: @space;
404 403
405 404 dt, dd {
1 NO CONTENT: modified file, binary diff hidden
@@ -6,32 +6,59 b''
6 6 <font id="rcicons" horiz-adv-x="1000" >
7 7 <font-face font-family="rcicons" font-weight="400" font-stretch="normal" units-per-em="1000" ascent="850" descent="-150" />
8 8 <missing-glyph horiz-adv-x="1000" />
9 <glyph glyph-name="rc-icon-svn-transparent" unicode="&#xe800;" d="m938 855h-871c-36 0-67-31-67-67v-871c0-36 31-67 67-67h871c36 0 67 31 67 67v871c0 36-31 67-67 67z m-5-936h-864v864h864v-864z m-762 312c8-2 15-5 22-7 7-3 14-3 21-3 15 0 29 3 41 8 12 4 21 12 31 21 9 10 14 19 19 31 5 12 7 24 7 36 0 9-2 16-5 24-2 7-7 11-9 16-5 5-10 10-15 12-4 2-12 5-16 7-5 3-12 5-17 7-5 3-9 5-14 8-5 2-7 4-10 9-2 2-5 7-5 12 0 5 0 9 3 12 2 5 2 7 7 9 2 3 7 5 10 8 4 2 9 2 14 2 5 0 12 0 14-2 5-3 7-5 12-5 2-3 7-5 10-5 2-2 4-2 7-2 2 0 4 0 7 2 2 2 5 2 7 7l17 24c-3 5-8 9-12 12-5 5-10 7-17 9-7 3-12 5-19 8-7 2-14 2-21 2-15 0-27-2-39-7s-21-12-28-19c-7-7-14-17-19-29-5-9-7-21-7-33 0-10 2-19 4-26 3-8 8-12 10-17 5-5 10-10 14-12 5-2 12-5 17-7 5-2 12-5 17-5 4-2 9-5 14-7s7-5 9-10c3-2 5-7 5-11 0-12-2-19-9-27-5-4-15-9-24-9-7 0-14 0-19 2-5 3-10 5-14 7-5 3-8 5-10 8-2 2-7 2-10 2-2 0-4 0-7-2-2-3-4-3-7-5l-21-31c5-5 9-10 14-14 7-5 14-8 21-10z m308-7l126 257h-48c-2 0-5 0-7 0-2 0-5-2-5-2-2 0-2-3-4-5-3-3-3-3-3-5l-59-143c-3-5-8-12-10-19-2-7-5-14-7-21 0 7-2 14-2 21-3 7-3 14-5 19l-26 143c0 5-3 7-5 10-3 2-7 2-12 2h-45l64-257h48z m207 150c0 2 0 5 0 9 0 3 0 8 0 10l90-160c3-2 5-4 7-7 3-2 8-2 12-2h31l31 257h-50l-16-145c0-3 0-7-3-10 0-5 0-7 0-12l-90 160c0 2-3 2-3 5s-4 0-4 0c-3 0-3 0-5 2-3 0-5 0-7 0h-31l-31-257h50l19 150z" horiz-adv-x="1000" />
10 <glyph glyph-name="rc-icon-hg-transparent" unicode="&#xe801;" d="m933 850h-871c-36 0-67-31-67-67v-871c0-36 31-67 67-67h871c36 0 67 31 67 67v871c0 36-31 67-67 67z m-2-933h-864v864h864v-864z m-605 421h93l-14-109h57l31 257h-57l-15-110h-92l14 110h-57l-31-257h57l14 109z m238-88c10-9 22-19 34-24 14-5 28-9 43-9 9 0 19 0 26 2 9 0 16 2 24 5 7 2 14 5 21 9 7 3 14 8 21 12l15 107h-86l-5-31c0-2 0-4 3-7 2-2 4-2 7-2h21l-5-43c-4-2-9-5-16-5-5-2-12-2-19-2-10 0-17 2-24 5-7 2-14 7-19 14-5 7-10 14-12 24 2 7 2 16 2 28 0 15 3 29 5 41 5 12 10 21 17 31s14 14 24 19c9 5 19 7 30 7 5 0 10 0 15 0 5 0 7-2 12-2 4-3 7-3 12-5 4-3 7-5 11-7 5-3 10-5 12-3 5 0 8 3 12 7l19 24c-4 5-9 10-14 15-5 4-12 9-19 11-7 3-14 8-24 8-9 2-19 2-31 2-14 0-26-2-38-5-12-2-24-7-33-14-10-7-19-14-29-24-9-9-16-19-21-28-7-12-10-24-14-36-3-12-5-26-5-41 0-16 2-31 7-45 5-17 12-28 21-38z" horiz-adv-x="1000" />
11 <glyph glyph-name="rc-icon-git-transparent" unicode="&#xe802;" d="m933 850h-885c-38 0-69-29-69-67v-866c0-36 31-67 66-67h886c38 0 67 31 67 67v866c2 38-29 67-65 67z m-4-931h-879v862h879v-862z m-269 307h57l26 212h67l7 45h-193l-5-45h67l-26-212z m-103 257h-59l-31-257h57l33 257z m-336-219c10-9 22-19 34-23 14-5 28-10 45-10 10 0 19 0 29 2 9 0 16 3 23 5 8 3 15 5 22 10 7 2 14 7 21 12l15 107h-89l-4-31c0-3 0-5 2-7s5-3 7-3h22l-5-43c-5-2-12-4-17-4-5-3-12-3-19-3-9 0-16 3-24 5-7 2-14 7-19 14-4 7-9 15-12 24-2 10-4 19-4 31 0 14 2 29 4 41 5 11 10 21 17 30 7 10 14 15 24 20 9 4 19 7 31 7 5 0 9 0 14 0 5 0 7-3 12-3 5-2 7-2 12-4 5-3 7-5 12-8 5-2 9-4 14-2 5 0 7 2 12 7l19 24c-5 5-9 9-14 14-5 5-12 7-19 12-7 3-15 7-24 7s-17-2-29-2c-14 0-26-2-38-5-12-2-24-7-33-14-10-7-19-14-29-22-9-9-16-19-21-28-7-12-12-24-14-36-3-12-5-26-5-40 0-17 2-31 7-46 5-16 12-28 21-38z" horiz-adv-x="1000" />
12 <glyph glyph-name="svn" unicode="&#xe803;" d="m934 850h-868c-36 0-66-30-66-66v-868c0-36 30-66 66-66h868c36 0 66 30 66 66v868c0 36-30 66-66 66z m-779-445c4-4 8-7 13-10 5-2 11-6 18-9 7-2 14-5 22-7 7-3 13-6 21-10 7-4 14-9 19-14 5-6 10-12 13-20 4-7 5-17 5-29 0-16-2-31-8-45-5-13-14-26-24-37-10-10-23-19-38-25-15-6-31-9-50-9-8 0-18 1-27 3-9 2-19 5-26 8-10 2-18 7-25 12-8 5-14 11-19 17l26 33c3 2 5 5 8 6 2 1 6 2 8 2 4 0 8-1 13-3 4-3 9-7 12-10 5-4 10-7 17-10 6-3 13-4 22-4 14 0 24 4 31 11 8 8 12 18 12 33 0 6-2 12-5 16-4 4-8 7-13 11-5 3-11 5-19 8-7 2-13 5-21 7-7 3-14 6-21 9-8 4-13 9-19 14-5 6-10 12-12 21-4 9-5 19-5 31 0 14 2 28 7 41 5 13 13 25 23 35 10 10 22 18 36 24 14 6 30 9 47 9 9 0 18-1 27-3 8-2 16-5 23-7 8-4 14-8 20-13 7-5 12-10 15-15l-18-31c-3-4-5-6-8-7-2-2-5-3-9-3-3 0-6 1-10 4s-7 5-11 7c-4 3-9 5-14 8-5 2-11 4-18 4-7 0-13-2-18-3-5-2-9-5-12-7-4-4-7-8-8-12-4-6-5-11-5-17 0-5 3-10 5-14z m318-215h-65l-79 319h57c7 0 12-1 15-5 4-3 7-6 7-11l33-177c3-7 4-15 5-23 2-9 3-18 4-27 3 9 6 18 10 27 4 8 8 16 11 23l75 177c2 2 3 3 4 6 1 2 4 4 6 5 3 1 5 2 8 4 2 1 5 1 9 1h57l-157-319z m430 0h-38c-5 0-10 1-14 3-3 1-6 5-10 8l-111 197c0-4 0-8-1-12 0-3-1-7-1-11l-23-186h-62l40 319h37c3 0 6 0 8 0 2 0 3-2 5-2 1-1 2-2 3-3 2-2 3-4 4-7l111-197c0 5 2 10 2 15 0 5 1 9 1 14l21 180h63l-35-318z" horiz-adv-x="1000" />
13 <glyph glyph-name="hg" unicode="&#xe804;" d="m935 851h-867c-37 0-67-30-67-66v-868c0-36 30-66 67-66h867c36 0 66 30 66 66v868c0 36-30 66-66 66z m-491-660h-71l16 135h-115l-16-135h-72l39 319h71l-16-135h115l16 135h72l-39-319z m180 104c4-11 9-20 15-29 6-7 14-13 22-17 9-4 19-6 29-6 9 0 16 1 24 2 7 1 14 4 20 6l6 53h-27c-4 0-8 1-10 4-3 2-3 5-3 8l5 39h106l-16-134c-9-6-16-11-25-15-9-3-17-7-27-10-10-2-20-5-30-6-10-1-22-2-34-2-20 0-38 3-54 11-16 7-30 17-41 30-11 12-20 27-28 45-6 17-10 36-10 56 0 18 3 35 7 50 3 16 10 31 17 44 8 14 16 26 26 36 10 11 23 20 35 28 13 7 27 13 42 17 15 4 30 6 46 6 14 0 26-1 36-3 11-3 21-7 30-10 9-4 16-9 24-15 6-5 12-12 19-18l-24-29c-4-5-9-8-14-8-5-2-11 0-16 3-5 4-10 7-15 9-5 3-9 5-14 6-5 2-10 3-15 4-5 1-11 1-17 1-14 0-27-2-38-8-11-7-21-14-30-25-9-12-15-24-19-39-5-15-7-31-7-50 0-10 1-23 5-34z" horiz-adv-x="1000" />
14 <glyph glyph-name="git" unicode="&#xe805;" d="m934 851h-868c-37 0-66-30-66-67v-869c0-36 29-65 66-65h869c36 0 66 30 66 66v868c0 37-30 67-67 67z m-738-557c4-11 9-20 15-29 7-7 14-14 23-17 9-4 19-7 29-7 8 0 16 2 23 3 8 1 14 4 20 6l7 53h-28c-4 0-7 1-10 3-2 3-4 5-2 9l5 40h106l-16-134c-9-6-17-11-25-15-9-3-18-7-28-10-10-2-20-5-30-6-11-2-22-2-35-2-20 0-37 3-54 11-16 7-30 17-41 30-11 12-20 27-27 45-7 17-10 36-10 56 0 18 2 35 6 50 4 16 10 31 17 44 8 14 17 26 27 36 10 11 22 20 35 28 12 7 26 13 41 17 15 4 30 6 46 6 14 0 26-1 38-3 11-3 21-7 30-10 8-4 16-9 23-15 8-5 13-12 19-18l-25-26c-4-5-9-9-14-9-5-1-11 0-16 4-5 4-10 6-15 9-5 2-9 5-14 6-5 1-10 2-15 4-5 1-11 1-17 1-14 0-26-3-38-9-11-6-21-14-30-25-8-11-15-24-18-39-5-15-8-31-8-50 1-13 3-26 6-37z m340-104h-71l39 319h71l-39-319z m348 263h-81l-33-263h-70l33 263h-83l8 56h233l-7-56z" horiz-adv-x="1000" />
15 <glyph glyph-name="plus" unicode="&#xe806;" d="m958 408v-120c0-17-12-34-33-34h-304c-17 0-33-12-33-33v-304c0-17-13-34-34-34h-121c-16 0-33 13-33 34v308c0 17-12 33-33 33h-296c-17 0-33 13-33 34v121c0 16 12 33 33 33h304c17 0 33 12 33 33v296c0 17 13 33 34 33h121c16 0 33-12 33-33v-304c0-17 12-33 33-33h304c13 4 25-13 25-30z" horiz-adv-x="1000" />
16 <glyph glyph-name="minus" unicode="&#xe807;" d="m942 258h-884c-8 0-16 9-16 17v154c0 4 8 13 16 13h888c8 0 17-9 17-17v-154c-5-4-13-13-21-13z" horiz-adv-x="1000" />
17 <glyph glyph-name="remove" unicode="&#xe808;" d="m942 688l-109 108c-12 12-33 12-45 0l-267-267c-13-12-33-12-46 0l-267 267c-12 12-33 12-45 0l-113-108c-12-13-12-34 0-46l267-267c12-12 12-33 0-46l-267-266c-12-13-12-34 0-46l108-109c17-16 38-16 50-4l267 267c13 12 33 12 46 0l267-267c12-12 33-12 45 0l109 109c12 12 12 33 0 45l-267 267c-12 13-12 33 0 46l267 267c12 12 12 33 0 50z" horiz-adv-x="1000" />
18 <glyph glyph-name="bookmark" unicode="&#xe809;" d="m767-96l-234 267c-8 12-25 12-33 0l-233-267c-17-17-42-4-42 25v842c0 17 13 33 25 33h533c13 0 25-16 25-33v-838c0-29-25-46-41-29z" horiz-adv-x="1000" />
19 <glyph glyph-name="branch" unicode="&#xe80a;" d="m829 579c0 67-54 121-121 121s-125-54-125-121c0-41 21-79 59-104-17-129-125-167-192-179v287c38 21 63 59 63 105 0 66-55 120-125 120s-121-54-121-120c0-46 25-84 62-105v-458c-37-25-62-62-62-108 0-67 54-121 121-121s120 54 120 121c0 46-25 83-62 104v50c58 8 154 29 225 100 50 50 79 117 87 196 46 25 71 66 71 112z m-441 150c20 0 41-16 41-41s-16-42-41-42c-21 0-42 17-42 42s21 41 42 41z m0-750c-21 0-42 17-42 42 0 21 17 42 42 42 20 0 41-17 41-42 0-25-16-42-41-42z m320 642c21 0 42-17 42-42s-17-41-42-41c-20 0-41 16-41 41s16 42 41 42z" horiz-adv-x="1000" />
20 <glyph glyph-name="tag" unicode="&#xe80b;" d="m433 800l-316 8c-38 0-71-33-71-70l8-317c0-17 9-33 21-46l442-442c25-25 71-25 96 0l308 309c25 25 25 71 0 96l-442 441c-8 13-25 17-46 21z m-83-225c0-37-29-67-67-67-37 0-66 30-66 67-4 38 25 67 62 67s71-29 71-67z" horiz-adv-x="1000" />
21 <glyph glyph-name="lock" unicode="&#xe80c;" d="m817 429h-50v109c0 8 0 12 0 20 0 5 0 9 0 13 0 4 0 4 0 8 0 9-4 17-4 25 0 4 0 9-5 9-4 8-4 16-8 25 0 0 0 0 0 4-4 8-8 21-12 29 0 4-5 4-5 4-4 8-8 13-12 21 0 4-4 4-4 8-4 4-9 13-13 17 0 4-4 4-4 8-8 9-12 13-21 21 0 0-4 0-4 4-8 4-12 13-21 17-4 0-4 4-8 4-8 4-13 8-21 13-4 0-4 4-8 4-9 4-17 8-29 12 0 0 0 0 0 0-9 4-21 4-30 9-4 0-4 0-8 0-4 0-8 0-8 0 0 0-4 0-4 0 0 0-5 0-5 0 0 0 0 0 0 0 0 0 0 0 0 0-4 0-8 0-8 0-4 0-4 0-8 0-4 0-4 0-9 0 0 0 0 0 0 0-4 0-8 0-8 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0-4 0-4 0 0 0-4 0-4 0-4 0-9 0-9 0-4 0-8 0-8 0-8 0-21-5-29-9 0 0 0 0 0 0-38-12-50-16-58-21-5 0-5-4-9-4-4 0-12-4-16-8-5-4-9-4-9-4-8-4-16-13-25-17 0 0-4 0-4-4-4-4-12-13-21-21 0 0-4-4-4-8-4 0-12-9-17-13 0-4-4-4-4-8-4-8-8-13-12-21 0 0-4-4-4-4-5-13-9-21-13-29 0-4 0-4-4-9-4-8-4-12-8-20 0-5 0-9-5-9 0-8-4-16-4-25 0-4 0-4 0-8 0-4 0-8 0-13 0-8 0-12 0-20v-109h-46c-16 0-29-12-29-29v-475c0-17 13-29 29-29h638c17 0 29 12 29 29v475c-4 17-17 29-33 29z m-459 109c0 66 50 125 113 137 0 0 4 0 4 0 8 0 17 0 21 4 0 0 4 0 4 0 0 0 0 0 0 0 8 0 17 0 21-4 0 0 4 0 4 0 63-12 113-71 113-137v-109h-280v109z" horiz-adv-x="1000" />
22 <glyph glyph-name="unlock" unicode="&#xe80d;" d="m817 429h-459v109c0 66 50 125 113 137 0 0 4 0 4 0 8 0 17 0 21 4 0 0 4 0 4 0 0 0 0 0 4 0 9 0 17 0 25-4 0 0 0 0 0 0 9 0 17-4 25-8 0 0 4 0 4-4 9-5 13-9 21-13 0 0 0 0 0 0 9-4 13-8 17-17 0 0 4-4 4-4 4-4 8-8 13-16 0 0 0-5 4-5 4-8 8-12 12-20 0 0 0-5 4-5 5-8 5-12 5-20 0 0 0-5 0-5 0 0 0 0 0-4 4-12 16-25 29-25h66c17 0 30 17 30 34 0 0 0 0 0 0 0 4 0 4 0 4 0 8-5 16-5 25 0 4 0 8-4 8-4 8-4 17-8 21 4 17 4 17 4 21-4 8-8 16-12 29 0 4-5 4-5 4-4 8-8 13-12 21 0 4-4 4-4 8-4 4-9 13-13 17 0 4-4 4-4 8-8 9-12 13-21 21 0 0-4 0-4 4-8 4-12 13-21 17-12-4-16-4-16-4-9 4-13 8-21 12-4 0-4 4-9 4-8 5-16 9-29 13 0 0 0 0 0 0-8 4-21 4-29 8-4 0-4 0-8 0-9 0-17 0-25 4-4 0-9 0-9 0-8 0-16 0-29 0 0 0-4 0-4 0-8 0-21-4-29-4-8 0-13-4-21-4 0 0-4 0-4 0-113-42-188-142-188-262v-109h-50c-16 0-29-12-29-29v-475c0-17 13-29 29-29h634c16 0 29 12 29 29v475c0 17-13 29-29 29z" horiz-adv-x="1000" />
23 <glyph glyph-name="delete" unicode="&#xe80e;" d="m500 804c-254 0-454-204-454-454 0-250 204-454 454-454s454 204 454 454c0 250-204 454-454 454z m267-491c0-17-17-34-34-34h-466c-21 0-34 13-34 34v75c0 16 17 33 34 33h466c21 0 34-13 34-33v-75z" horiz-adv-x="1000" />
24 <glyph glyph-name="ok" unicode="&#xe80f;" d="m500 804c-250 0-454-204-454-454 0-250 204-454 454-454s454 204 454 454c0 250-204 454-454 454z m283-346l-358-358c-12-12-33-12-46 0l-162 163c-13 12-13 33 0 45l50 50c12 13 33 13 46 0l66-66c13-13 34-13 46 0l263 262c12 13 33 13 45 0l50-50c13-16 13-37 0-46z" horiz-adv-x="1000" />
25 <glyph glyph-name="comment" unicode="&#xe810;" d="m42-13v601c0 87 71 154 154 154h600c87 0 154-71 154-154v-342c0-88-71-154-154-154h-583c-9 0-13-4-17-9l-113-112c-12-17-41-4-41 16z m158 684c-46 0-87-38-87-88v-487l54 54c4 4 12 8 16 8h617c46 0 88 38 88 88v342c0 45-38 87-88 87h-600z" horiz-adv-x="1000" />
26 <glyph glyph-name="feed" unicode="&#xe811;" d="m888 804h-780c-37 0-66-29-66-66v-780c0-37 29-66 66-66h780c37 0 66 29 66 66v780c0 37-29 66-66 66z m-638-787c-46 0-83 37-83 83 0 46 37 88 87 88 46 0 88-38 88-88-4-46-46-83-92-83z m292-13c-38 0-71 34-71 71 0 133-108 238-238 238-37 0-75 29-75 66 0 38 25 71 63 71l12 0c209 0 380-171 380-379 0-33-30-67-71-67z m237 0c-37 0-71 34-71 71 0 263-212 475-475 475-37 0-75 29-75 67 0 37 25 71 63 71l12 0c338 0 613-275 613-613 4-37-29-71-67-71z" horiz-adv-x="1000" />
27 <glyph glyph-name="right" unicode="&#xe812;" d="m308-96l-75 75c-12 13-12 29 0 42l309 308c12 13 12 29 0 42l-317 312c-12 13-12 30 0 42l75 75c13 13 29 13 42 0l433-433c13-13 13-29 0-42l-425-421c-8-12-29-12-42 0z" horiz-adv-x="1000" />
28 <glyph glyph-name="left" unicode="&#xe813;" d="m683 800l75-75c13-12 13-29 0-42l-308-308c-12-12-12-29 0-42l317-316c12-13 12-30 0-42l-75-75c-13-13-29-13-42 0l-433 433c-13 13-13 30 0 42l425 425c12 13 29 13 41 0z" horiz-adv-x="1000" />
29 <glyph glyph-name="arrow_down" unicode="&#xe814;" d="m950 542l-412-521c-17-21-50-21-67 0l-417 521c-21 25-4 71 34 71h829c37 0 58-46 33-71z" horiz-adv-x="1000" />
30 <glyph glyph-name="group" unicode="&#xe815;" d="m958 221v-17c0-4-4-8-8-8h-167-4c-16 21-41 37-79 46-37 4-58 16-71 25 9 8 17 12 34 16 45 9 62 21 66 38s9 25 0 37c-8 13-33 46-33 84 0 41 0 104 75 108h8 13c75-4 79-71 75-108 0-42-25-71-34-84-8-16-8-25 0-37s21-29 67-38c54-12 58-58 58-62 0 0 0 0 0 0z m-725-17c17 17 38 34 71 38 42 8 63 16 75 25-8 8-21 16-37 21-46 8-63 20-67 37-8 13-8 25 0 38 8 12 33 45 33 83 0 42 0 104-75 108h-12-8c-75-4-80-71-80-108 0-42 25-71 34-83 8-17 8-25 0-38-9-12-21-29-67-37-50-9-54-50-54-59 0 0 0 0 0 0v-16c0-5 4-9 8-9h167 12z m463 13c-75 12-96 37-108 58-13 21-13 42 0 63 12 25 50 70 54 133 4 62 0 167-121 175h-21-17c-116-4-120-113-120-175 4-63 41-113 54-133 12-25 12-42 0-63-13-21-38-46-109-58-75-13-87-84-87-92 0 0 0 0 0 0v-25c0-8 8-17 17-17h262 267c8 0 16 9 16 17v25c0 0 0 0 0 0 0 8-8 79-87 92z" horiz-adv-x="1000" />
31 <glyph glyph-name="folder" unicode="&#xe816;" d="m742 17h-659c-12 0-25 8-33 16-8 13-8 25 0 38l179 342c4 12 17 20 34 20h658c12 0 25-8 33-16 9-13 9-25 0-38l-179-341c-8-13-21-21-33-21z m-596 75h575l142 266h-580l-137-266z m-29-38h-75v550c0 21 16 38 37 38h246c8 0 21-4 25-13l54-54h325c21 0 38-17 38-37v-105h-75v67h-300c-9 0-17 4-25 8l-59 59h-191v-513z" horiz-adv-x="1000" />
32 <glyph glyph-name="fork" unicode="&#xe817;" d="m838 688c0 66-55 120-121 120-67 0-121-54-121-120 0-42 21-80 54-100-12-100-83-146-150-167-71 21-137 67-150 167 33 20 54 58 54 100 0 66-54 120-121 120s-120-54-120-120c0-46 29-88 66-109 13-129 88-225 213-271v-191c-38-21-63-59-63-100 0-67 54-121 121-121s121 54 121 121c0 46-25 83-58 104v192c120 45 200 137 212 270 33 17 63 59 63 105z m-555 37c21 0 42-17 42-42s-17-41-42-41-41 16-41 41 21 42 41 42z m217-750c-21 0-42 17-42 42 0 21 17 41 42 41s42-16 42-41c0-25-21-42-42-42z m217 750c21 0 41-17 41-42s-16-41-41-41-42 16-42 41 21 42 42 42z" horiz-adv-x="1000" />
33 <glyph glyph-name="more" unicode="&#xe818;" d="m592 408v-120c0-17-13-34-34-34h-120c-17 0-34 13-34 34v120c0 17 13 34 34 34h120c21 0 34-17 34-34z m-34 400h-120c-17 0-34-12-34-33v-121c0-16 13-33 34-33h120c17 0 34 12 34 33v121c0 21-13 33-34 33z m0-733h-120c-17 0-34-12-34-33v-121c0-17 13-34 34-34h120c17 0 34 13 34 34v121c0 16-13 33-34 33z" horiz-adv-x="1000" />
34 <glyph glyph-name="comment-add" unicode="&#xe819;" d="m48-12v593c0 86 69 155 154 155h596c85 0 154-69 154-155v-338c0-86-69-155-154-155h-584c-7 0-12-2-16-7l-110-110c-14-14-40-4-40 17z m154 679c-47 0-85-38-85-86v-486l52 53c5 4 10 7 17 7h612c47 0 85 38 85 86v340c0 48-38 86-85 86h-596z m479-284l0 48c0 5-5 10-10 10l-114 0c-5 0-9 4-9 9v114c0 5-5 10-10 10h-47c-5 0-10-5-10-10v-114c0-5-5-9-10-9h-114c-5 0-9-5-9-10v-48c0-4 4-9 9-9h114c5 0 10-5 10-10l0-114c0-5 5-9 10-9l47 0c5 0 10 4 10 9v114c0 5 4 10 9 10h114c5-3 10 2 10 9z" horiz-adv-x="1000" />
9 <glyph glyph-name="delete" unicode="&#xe800;" d="M515 758c-211 0-384-173-384-385 0-211 173-385 384-385s385 174 385 385c0 212-173 385-385 385z m227-416c0-15-11-27-30-27h-397c-15 0-30 12-30 27v62c0 15 11 27 30 27h397c15 0 30-12 30-27v-62z" horiz-adv-x="1000" />
10
11 <glyph glyph-name="ok" unicode="&#xe801;" d="M515 735c-211 0-384-173-384-385 0-211 173-385 384-385s385 174 385 385c0 212-173 385-385 385z m239-296l-304-304c-11-12-27-12-38 0l-139 138c-11 12-11 27 0 39l42 42c12 11 27 11 39 0l58-58c11-11 27-11 38 0l219 219c12 12 27 12 39 0l42-42c15-8 15-23 4-34z" horiz-adv-x="1000" />
12
13 <glyph glyph-name="comment" unicode="&#xe802;" d="M131 65v504c0 73 58 131 131 131h507c73 0 131-58 131-131v-288c0-73-58-131-131-131h-496c-4 0-11-4-15-8l-93-92c-11-15-34-4-34 15z m131 574c-39 0-73-31-73-74v-411l46 46c4 4 7 8 15 8h519c39 0 73 31 73 73v288c0 39-30 73-73 73h-507z" horiz-adv-x="1000" />
14
15 <glyph glyph-name="bookmark" unicode="&#xe803;" d="M780-140l-260 290c-10 10-25 10-35 0l-260-295c-20-20-45-5-45 30v930c-5 20 10 35 25 35h590c15 0 30-15 30-35v-925c0-35-30-50-45-30z" horiz-adv-x="1000" />
16
17 <glyph glyph-name="branch" unicode="&#xe804;" d="M875 600c0 76-58 134-134 134s-134-58-134-134c0-49 27-89 63-112-18-142-139-183-210-196v313c45 22 71 62 71 111 0 76-58 134-134 134s-134-58-134-134c0-49 27-94 67-116v-500c-40-22-67-67-67-116 0-76 58-134 134-134s134 58 134 134c0 49-26 94-67 116v58c63 9 166 31 246 112 58 58 89 129 98 214 40 22 67 67 67 116z m-478 161c27 0 45-18 45-45s-18-45-45-45-44 18-44 45 18 45 44 45z m0-822c-26 0-44 18-44 45s18 45 44 45 45-18 45-45-22-45-45-45z m344 706c27 0 45-18 45-45s-18-45-45-45-45 18-45 45 23 45 45 45z" horiz-adv-x="1000" />
18
19 <glyph glyph-name="tag" unicode="&#xe805;" d="M460 788l-366 8c-18 0-31-13-31-31l13-366c0-9 4-13 9-22l464-465c14-13 31-13 45 0l352 353c14 14 14 31 0 45l-468 469c-5 4-14 9-18 9z m-103-224c0-35-31-67-67-67-35 0-67 32-67 67 0 36 32 67 67 67s67-26 67-67z" horiz-adv-x="1000" />
20
21 <glyph glyph-name="lock" unicode="&#xe806;" d="M813 426h-54v107c0 9 0 13 0 18 0 4 0 9 0 13 0 5 0 5 0 9 0 9 0 14-4 23 0 4 0 4-5 9 0 8-4 13-9 22 0 0 0 4-4 4-5 9-9 18-14 27 0 0-4 5-4 5 4 8 0 17-5 22 0 4-4 4-4 9-5 4-9 13-14 18 0 0-4 4-4 4-9 9-13 14-22 22 0 0-5 0-5 5-4 4-13 9-22 13-5 0-5 5-9 5-4 4-13 9-18 9-4 0-4 4-9 4-9 5-18 9-27 9 0 0 0 0 0 0-9 5-17 5-31 9-4 0-4 0-9 0-4 0-4 0-9 0 0 0-4 0-4 0 0 0-5 0-5 0 0 0 0 0 0 0 0 0 0 0 0 0-4 0-9 0-9 0 0 0-4 0-4 0-4 0-4 0-9 0 0 0 0 0 0 0-4 0-9 0-9 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0-4 0-4 0 0 0-5 0-5 0-4 0-4 0-9 0-4 0-4 0-9 0-9 0-22-4-31-9 0 0 0 0 0 0-22 0-36-4-45-9-4 0-4-4-8-4-5-5-9-5-18-9 0-5-5-5-5-5-9-4-18-9-22-18 0 0-5 0-5-4-9-4-13-9-22-18 0 0-4-4-4-4-5-5-9-14-14-18 0-5-4-5-4-9-9-5-14-14-14-18-4 0-4-4-4-4-5-9-9-18-13-27 0 0 0-5-5-5 0-13-4-18-4-26 0-5 0-5-5-9 0-9-4-14-4-23 0-4 0-4 0-9 0-4 0-9 0-13 0-5 0-13 0-18v-107h-49c-18 0-27-13-27-27v-464c0-18 13-27 27-27h620c18 0 27 13 27 27v464c4 18-9 27-22 27z m-451 107c0 67 49 121 111 134 0 0 0 0 5 0 9 0 13 0 22 0 0 0 5 0 5 0 0 0 0 0 0 0 8 0 13 0 22 0 0 0 0 0 4 0 63-13 112-67 112-134v-107h-281v107z" horiz-adv-x="1000" />
22
23 <glyph glyph-name="unlock" unicode="&#xe807;" d="M781 415h-385v93c0 57 43 104 96 115 0 0 0 0 4 0 8 0 12 0 19 0 0 0 4 0 4 0 0 0 0 0 4 0 8 0 16 0 23-4 0 0 0 0 0 0 8 0 12-4 19-7 0 0 4 0 4-4 4-4 12-4 16-8 0 0 0 0 0 0 4-4 11-8 15-11 0 0 4-4 4-4 4-4 8-8 11-12 0 0 0 0 4-4 4-4 8-11 12-19 0 0 0-4 0-4 4-4 4-11 4-15 0 0 0-4 0-4 0 0 0 0 0-4 4-11 11-19 23-19h57c16 0 27 11 24 27 0 0 0 0 0 0 0 0 0 4 0 4 0 7-4 11-4 19 0 4 0 4-4 8 0 15 0 19-4 27 0 0 0 3-4 3-4 8-8 16-11 23 0 0-4 4-4 4-4 8-8 12-12 16 0 4-4 4-4 7-3 4-7 12-11 16 0 0-4 4-4 4-8 7-12 11-19 19 0 0-4 0-4 4-4 4-12 7-19 11-4 0-4 4-8 4-4 4-12 8-15 8-4 0-4 4-8 4-8 3-15 7-23 7 0 0 0 0 0 0-8 4-16 4-27 8-4 0-4 0-8 0-7 0-11 0-19 4-4 0-8 0-8 0-7 0-15 0-23 0 0 0-4 0-4 0-7 0-15-4-27-4-3 0-11-4-15-4 0 0-4 0-4 0-92-27-157-115-157-215v-93h-43c-15 0-23-11-23-23v-400c0-15 12-23 23-23h535c15 0 23 12 23 23v400c4 12-8 23-23 23z" horiz-adv-x="1000" />
24
25 <glyph glyph-name="feed" unicode="&#xe808;" d="M842 739h-653c-35 0-58-27-58-58v-658c0-31 27-58 58-58h657c31 0 58 27 58 58v658c-4 31-31 58-62 58z m-534-666c-39 0-73 31-73 73s30 73 73 73c38 0 73-30 73-73s-35-73-73-73z m246-11c-35 0-58 27-58 57 0 112-88 200-200 200-31 0-65 27-65 58 0 35 23 62 54 62l11 0c177 0 319-143 319-320-3-30-30-57-61-57z m196 0c-35 0-58 27-58 57 0 220-180 400-400 400-30 0-65 27-65 62 0 34 23 61 54 61l11 0c285 0 520-230 520-519 0-34-27-61-62-61z" horiz-adv-x="1000" />
26
27 <glyph glyph-name="left" unicode="&#xe809;" d="M692 773l70-69c11-12 11-27 0-39l-289-288c-11-12-11-27 0-38l296-297c12-11 12-27 0-38l-69-69c-11-12-27-12-38 0l-404 404c-12 11-12 26 0 38l396 396c11 12 27 12 38 0z" horiz-adv-x="1000" />
28
29 <glyph glyph-name="right" unicode="&#xe80a;" d="M339-65l-74 69c-11 11-11 27 0 38l289 289c11 11 11 27 0 38l-296 296c-12 12-12 27 0 39l69 69c12 12 27 12 38 0l404-404c12-11 12-27 0-38l-392-396c-12-12-31-12-38 0z" horiz-adv-x="1000" />
30
31 <glyph glyph-name="arrow_down" unicode="&#xe80b;" d="M704 454l-173-219c-8-8-23-8-27 0l-173 219c-8 11 0 31 15 31h346c12 0 20-20 12-31z" horiz-adv-x="1000" />
32
33 <glyph glyph-name="group" unicode="&#xe80f;" d="M962 219v-15c0-4-4-12-12-12h-161-4c-16 20-39 39-77 43-35 7-54 15-69 23 7 7 19 11 30 15 46 8 58 23 66 35 7 11 7 23 0 38-8 16-31 43-31 81 0 38 0 104 73 104h8 7c73-4 77-66 73-104 0-38-23-69-30-81-8-15-8-27 0-38 7-12 23-27 65-35 54-4 62-46 62-54 0 0 0 0 0 0z m-708-15c15 15 38 31 69 35 39 7 62 15 73 26-7 8-19 16-34 20-47 7-58 23-66 34-7 12-7 23 0 39 8 15 31 42 31 81 0 38 0 103-73 103h-8-11c-73-3-77-65-73-103 0-39 23-70 30-81 8-16 8-27 0-39-7-11-23-27-65-34-46-8-54-50-54-54 0 0 0 0 0 0v-16c0-3 4-11 12-11h161 8z m454 11c-73 12-96 35-108 54-11 20-11 39 0 62 12 23 50 69 54 131 4 61 0 165-119 169h-16-15c-119-4-123-104-119-166 4-61 38-107 54-130 11-23 11-43 0-62-12-19-35-42-108-54-73-11-85-80-85-88 0 0 0 0 0 0v-23c0-8 8-16 16-16h257 258c8 0 15 8 15 16v23c0 0 0 0 0 0-3 4-11 73-84 84z" horiz-adv-x="1000" />
34
35 <glyph glyph-name="folder" unicode="&#xe810;" d="M735 73h-604c-12 0-23 8-31 16-8 11-8 23 0 34l165 312c8 11 20 19 31 19h604c12 0 23-8 31-15 8-12 8-24 0-35l-166-312c-7-11-19-19-30-19z m-546 69h523l127 243h-520l-130-243z m-24-34h-69v504c0 19 16 34 35 34h223c8 0 19-4 23-11l50-50h304c19 0 34-16 34-35v-96h-69v61h-277c-7 0-19 4-23 12l-54 50h-177v-469z" horiz-adv-x="1000" />
36
37 <glyph glyph-name="fork" unicode="&#xe811;" d="M792 654c0 58-46 100-100 100-57 0-100-46-100-100 0-35 20-65 47-85-12-84-70-123-127-142-58 15-116 54-127 142 27 20 46 50 46 85 0 58-46 100-100 100s-108-42-108-100c0-39 23-73 54-89 12-107 77-188 181-226v-162c-31-19-50-50-50-88 0-58 46-101 100-101s100 47 100 101c0 38-19 69-50 88v162c104 38 169 119 181 226 30 20 53 50 53 89z m-465 35c19 0 35-16 35-35s-16-31-35-31-35 12-35 31 16 35 35 35z m181-635c-19 0-35 15-35 35s16 34 35 34c19 0 34-15 34-34s-15-35-34-35z m184 635c20 0 35-16 35-35s-15-31-35-31-34 16-34 35 15 31 34 31z" horiz-adv-x="1000" />
38
39 <glyph glyph-name="more" unicode="&#xe812;" d="M546 435h-100c-15 0-27-12-27-27v-100c0-16 12-27 27-27h100c16 0 27 11 27 27v100c0 15-11 27-27 27z m0 307h-100c-15 0-27-11-27-27v-100c0-15 12-26 27-26h100c16 0 27 11 27 26v100c0 16-11 27-27 27z m0-615h-100c-15 0-27-12-27-27v-100c0-15 12-27 27-27h100c16 0 27 12 27 27v100c0 15-11 27-27 27z" horiz-adv-x="1000" />
40
41 <glyph glyph-name="plus" unicode="&#xe813;" d="M873 404h-254c-15 0-27 11-27 27v254c0 15-11 27-27 27h-100c-15 0-26-12-26-27v-254c0-16-12-27-27-27h-254c-16 0-27-12-27-27v-100c0-15 11-27 27-27h254c15 0 27-11 27-27v-254c0-15 11-27 26-27h100c16 0 27 12 27 27v254c0 16 12 27 27 27h254c16 0 27 12 27 27v100c0 15-11 27-27 27z" horiz-adv-x="1000" />
42
43 <glyph glyph-name="minus" unicode="&#xe814;" d="M980 290h-960c-10 0-20 10-20 20v160c0 10 10 20 20 20h965c10 0 20-10 20-20v-165c-5-5-15-15-25-15z" horiz-adv-x="1000" />
44
45 <glyph glyph-name="remove" unicode="&#xe815;" d="M975 710l-115 115c-15 15-35 15-50 0l-285-285c-15-15-35-15-50 0l-285 285c-15 15-35 15-50 0l-115-115c-15-15-15-35 0-50l285-285c15-15 15-35 0-50l-285-285c-15-15-15-35 0-50l115-115c15-15 35-15 50 0l285 285c15 15 35 15 50 0l285-285c15-15 35-15 50 0l115 115c15 15 15 35 0 50l-285 285c-15 15-15 35 0 50l285 285c15 10 15 35 0 50z" horiz-adv-x="1000" />
46
47 <glyph glyph-name="git" unicode="&#xe82a;" d="M929 844h-858c-36 0-65-30-65-65v-857c0-36 30-65 65-65h857c36 0 65 30 65 65v857c1 35-29 65-64 65z m-729-549c4-11 9-20 14-27 6-8 14-14 22-18 9-4 19-6 29-6 9 0 16 1 24 2 7 2 14 4 20 7l6 51h-27c-4 0-8 1-10 4-2 1-3 5-3 7l5 39h105l-16-131c-8-7-16-12-25-15-9-4-18-8-28-10-10-3-18-5-30-7-10-1-21-2-33-2-20 0-38 4-54 11-16 8-30 18-41 30-12 13-20 28-27 45-6 18-10 36-10 56 0 18 3 34 7 50 3 17 10 30 17 44 8 14 16 25 26 36 10 12 22 20 34 28 13 7 26 14 41 17 15 4 30 7 47 7 13 0 25-2 36-4 11-3 21-6 29-10 8-4 16-9 22-14 6-5 13-11 18-16l-20-31c-4-5-9-8-14-9-5-1-10 0-16 4-5 3-10 6-14 8-5 3-9 5-14 7-5 1-10 2-15 3-5 2-11 2-17 2-14 0-27-3-38-9-11-6-21-14-29-25-8-10-15-24-18-38-5-15-7-31-7-48-1-14 2-27 4-38z m336-102h-71l39 315h71l-39-315z m343 258h-80l-33-258h-70l32 258h-80l7 57h231l-7-57z" horiz-adv-x="1000" />
48
49 <glyph glyph-name="hg" unicode="&#xe82d;" d="M927 841h-853c-36 0-65-29-65-65v-853c0-36 29-65 65-65h853c36 0 65 29 65 65v853c0 36-29 65-65 65z m-483-648h-70l16 133h-113l-17-133h-70l39 313h70l-16-132h113l16 132h71l-39-313z m177 101c3-11 8-20 14-27 7-8 14-14 23-18 8-4 18-6 28-6 9 0 16 1 23 3 7 1 14 3 20 6l6 51h-27c-4 0-7 1-9 3-3 3-3 6-3 9l5 39h104l-16-131c-8-6-16-11-25-15-9-5-18-8-27-11-9-2-19-4-30-6-10-1-21-2-33-2-19 0-37 4-53 11-16 7-30 17-41 29-11 13-20 28-26 45-7 17-10 35-10 55 0 17 2 34 6 50 4 15 10 30 17 43 7 14 16 26 26 36 10 11 22 20 34 28 13 7 27 13 41 17 14 4 30 7 46 7 13 0 25-2 36-4 11-3 20-6 29-10 8-4 16-9 23-14 7-5 13-11 18-17l-23-28c-4-5-8-8-13-9-5-1-11 0-16 3-5 4-10 7-14 9-5 3-9 5-14 6-4 2-9 3-14 4-5 1-11 1-17 1-14 0-27-3-38-8-11-6-21-14-29-25-8-10-15-23-19-38-5-15-7-31-7-49 0-13 2-26 5-37z" horiz-adv-x="1000" />
50
51 <glyph glyph-name="svn" unicode="&#xe82e;" d="M933 841h-852c-36 0-65-29-65-65v-853c0-36 29-65 65-65h852c36 0 66 29 66 65v853c0 36-30 65-66 65z m-765-438c3-4 7-7 13-10 5-3 11-6 18-8 7-3 14-5 21-8 7-3 14-6 21-10 7-4 13-9 18-14 5-6 10-13 13-20 3-8 5-18 5-29 0-16-3-30-8-44-6-14-14-26-24-37-10-10-22-18-37-24-15-7-31-10-49-10-9 0-18 1-27 3s-18 5-27 9c-8 4-16 8-23 13-7 5-13 10-18 17l25 32c2 3 4 5 7 6 3 2 6 3 9 3 4 0 8-2 12-4 3-3 8-6 12-10 5-3 10-6 16-9 6-3 14-4 23-4 13 0 23 3 30 10 7 7 10 18 10 32 0 6-1 12-4 16-4 4-8 7-13 10-5 3-11 6-18 8-7 2-14 5-21 7-7 3-14 6-21 9-6 4-12 8-18 14-5 6-9 13-12 21-3 8-5 18-5 30 0 14 3 28 8 40 5 13 12 25 22 35 10 9 22 17 36 23 14 6 29 9 47 9 9 0 17-1 26-3s16-4 23-8c7-3 14-7 20-11 6-5 11-10 15-15l-21-29c-2-3-5-6-7-7-3-2-6-3-9-3-3 0-7 1-10 3-3 3-7 5-11 8-4 2-9 4-14 7-5 2-12 3-19 3-6 0-12-1-17-3-5-2-9-4-12-8-4-3-6-7-8-11-1-5-2-10-2-15 0-5 2-10 5-14z m312-210h-64l-77 313h57c6 0 10-1 14-4 4-3 6-6 7-11l32-173c2-7 4-14 6-23 1-8 3-17 4-26 3 9 6 18 9 26 4 9 7 16 11 23l73 173c1 2 2 4 4 6 2 2 4 3 6 5 2 1 4 2 7 3 3 1 5 1 8 1h57l-154-313z m423 0h-37c-5 0-10 1-13 2-4 2-7 5-10 9l-109 194c-1-4-1-8-1-12-1-4-1-7-1-10l-22-183h-62l39 313h37c3 0 6 0 8 0 2 0 4-1 5-1 2-1 3-2 4-4s3-3 5-5l110-194c0 5 0 10 1 14 0 5 1 9 1 13l21 177h62l-38-313z" horiz-adv-x="1000" />
52
53 <glyph glyph-name="comment-add" unicode="&#xe82f;" d="M952 258v317c0 7 0 13 0 20-1 12-4 24-8 36-7 22-20 43-37 59-16 17-36 30-58 37-12 4-25 7-37 8-7 1-13 1-19 1h-576c-13 0-26 0-38-2-13-2-25-5-36-10-22-9-41-23-57-40-15-18-27-39-33-62-3-12-5-25-5-38-1-13 0-26 0-39v-557c0-9 5-17 13-21 6-3 15-3 21 0 3 1 5 3 7 5 2 2 4 5 7 7 4 5 9 9 14 14l28 28c9 10 19 19 28 29 9 9 19 18 28 27 4 5 8 9 14 10 2 1 5 1 8 1h567c13 0 25 0 38 2 24 4 47 13 66 27 19 13 35 31 46 51 12 22 19 46 19 71 1 6 0 13 0 19z m-69 307v-317c0-7 0-13 0-19-1-6-3-13-5-18-4-10-9-20-16-28-15-17-37-27-59-28-7 0-13 0-19 0h-576c-7 0-13 0-20 0-3 0-6 0-9-1-3-1-5-2-7-4-2-2-4-4-6-6-3-2-5-4-7-7-5-4-10-9-14-14-10-9-19-18-28-28v485c0 12 2 24 7 35 4 10 10 19 18 27 16 16 38 25 60 25h590c6 0 12 0 18-1 22-3 42-15 56-33 7-9 12-20 15-31 1-5 2-12 2-18 1-6 0-13 0-19z m-214-117h-131c-4 0-11 8-11 12v126c0 8-8 16-12 16h-50c-7 0-15-8-15-12v-130c0-4-8-12-12-12h-126c-8 0-16-8-16-11v-50c0-8 8-16 12-16h127c7 0 15-7 15-11v-127c0-8 8-15 11-15h50c8 0 16 7 16 11v127c0 8 7 15 11 15h127c8 0 15 8 15 12v50c0 7-7 15-11 15z" horiz-adv-x="1000" />
54
55 <glyph glyph-name="comment-toggle" unicode="&#xe830;" d="M798 736h-596c-85 0-154-69-154-155v-593c0-19 22-29 36-20 2 1 5 3 7 5l109 109c2 2 5 4 8 5 2 1 4 1 6 1h584c85 0 154 69 154 155v338c0 86-69 155-154 155z m-680-639v484c0 47 38 85 86 85h476c-86-84-504-511-509-515l-53-54z" horiz-adv-x="1000" />
56
57 <glyph glyph-name="rhodecode" unicode="&#xe831;" d="M175 633c-2-4-3-8-4-12-3-10-6-20-9-30-3-13-7-25-11-38-3-11-6-23-10-35-2-7-4-15-6-22-1-1-1-2-1-4 0 0 0 0 0 0 0-1 1-2 1-2 2-7 5-14 7-21 4-11 8-22 12-33 4-12 9-25 13-37 4-11 8-23 12-34 3-7 5-14 7-21 1-1 1-2 2-3 0-1 1-1 1-2 4-6 8-12 11-17 7-10 13-19 19-29 7-11 14-22 22-33 6-10 13-21 20-31 5-7 9-15 14-22 1-2 3-4 4-5 0-1 1-1 1-2 3-3 6-5 8-8 7-6 13-12 19-19 9-8 17-16 25-24 10-10 19-19 29-28 9-9 18-18 27-27 8-8 16-15 23-23 5-5 11-10 16-15 1-1 3-3 5-5 7-5 14-10 21-15 11-8 21-15 31-23 4-2 7-5 11-7 0-1 1-2 2-2 0 0 0-1 1 0 7 3 14 7 21 11 11 6 23 11 34 17 6 4 12 7 19 10 0 0 0 1 1 1 1 2 2 3 3 5 4 5 8 10 13 15 6 8 12 16 18 24 8 9 15 19 23 28 8 11 16 21 24 31 8 11 16 21 24 31 8 10 15 19 23 28 6 8 12 16 18 24 4 5 8 10 12 15 2 2 3 3 4 5 0 0 0 0 0 0-1 1-2 1-3 1-3 0-6 1-9 2-5 1-10 2-15 3-6 2-13 4-20 5-8 3-16 5-24 7-9 3-19 6-28 9-10 3-21 7-31 11-12 4-23 8-34 13-12 5-24 10-36 15-13 6-26 12-38 18-13 7-26 14-39 21-13 7-27 15-39 23-14 9-27 17-40 27-13 9-26 19-39 29-13 11-25 22-37 33-13 11-25 23-36 36-12 13-23 26-34 40-11 14-21 28-31 43-9 15-19 31-27 47 0 1 0 1 0 1z m-3 3c-1-5-3-9-4-14-3-10-7-21-10-32-4-12-8-25-12-37-3-10-6-20-9-30-1-3-2-6-3-8 0-1 0-2 0-2 1-5 3-10 5-15 3-10 6-20 10-30 4-12 8-24 12-37 4-12 8-24 12-36 3-9 6-18 9-26 1-3 1-5 2-8 0 0 1-1 1-2 2-4 5-8 7-12 5-10 10-19 15-28 6-11 12-23 19-34 6-11 12-22 18-33 4-8 8-15 12-23 1-2 2-4 4-6 4-5 8-9 13-14 7-8 15-16 23-24 9-10 18-20 26-29 9-9 17-18 25-26 5-6 10-11 15-17 2-1 3-3 5-5 5-5 11-11 17-17 9-8 17-17 26-26 9-9 18-18 27-27 7-7 14-14 21-21 2-2 3-3 5-5 0 0 1-1 1-1 0 0 1 0 1 0 11 2 22 3 32 5 9 1 17 2 26 3 0 0 1 1 1 1 1 1 2 3 4 4 4 5 9 10 13 15 7 8 14 15 20 22 9 9 17 18 25 27 9 10 18 20 27 30 8 9 17 19 26 29 8 9 16 17 24 26 7 7 13 15 20 22 4 5 8 9 13 14 1 1 2 2 3 3 0 1 1 1 1 1 0 1-3 1-3 1-3 1-7 2-10 3-4 2-9 3-14 5-6 2-13 4-19 6-8 3-16 6-24 9-9 3-18 7-27 11-10 4-20 8-30 13-11 5-22 10-33 16-12 5-23 11-35 18-12 6-24 13-36 20-12 8-25 16-37 24-13 8-25 17-37 26-13 10-25 19-38 29-12 11-24 
21-36 33-12 11-24 23-35 35-11 13-22 25-33 39-10 13-21 27-30 42-10 15-19 30-27 45-9 16-17 32-24 48z m-2 10c-1-4-2-8-3-11-1-10-3-19-5-29-2-12-5-25-7-37-3-13-5-26-8-39-1-10-3-20-5-30-1-5-2-10-3-15 0-1 0-1 0-2 1-3 2-5 3-8 3-9 7-19 10-29 4-12 9-25 13-37 4-11 8-22 11-33 2-5 4-11 6-16 0-1 1-2 1-2 1-3 2-5 4-7 4-9 8-18 13-27 6-12 11-23 17-35 6-11 11-22 17-33 3-7 7-14 11-21 0-2 1-3 1-4 1-1 2-1 2-2 5-6 9-11 14-17 8-9 15-18 22-27 9-10 17-20 26-30 7-9 15-18 22-27 5-6 10-11 15-17 0-1 1-2 2-3 0 0 0 0 0 0 1-1 2-1 3-2 7-4 14-9 21-14 10-7 21-14 31-22 11-7 21-14 31-20 6-4 12-9 18-13 3-2 7-5 10-8 10-8 19-16 29-24 7-5 13-11 20-17 1 0 1 0 1 1 1 1 2 2 3 3 4 4 8 8 12 13 6 6 12 13 18 20 8 8 16 17 23 25 9 10 18 19 26 29 9 9 18 19 27 29 9 9 18 19 26 28 8 9 16 17 23 26 6 6 13 13 19 20 4 4 7 8 11 12 1 2 3 3 4 4 0 0 0 0 0 0 0 0 0 0 0 0 0 0-3 1-3 1-3 1-6 2-9 2-5 2-10 4-15 5-6 2-12 5-18 7-8 3-16 6-24 9-8 4-17 8-26 12-10 4-20 9-30 13-11 6-22 11-32 16-12 6-23 13-35 19-12 7-23 14-35 21-13 8-25 16-37 24-12 8-25 17-37 26-12 9-24 19-36 29-13 10-25 21-36 32-12 11-24 22-35 34-12 12-22 25-33 38-11 13-21 26-30 40-10 14-19 29-28 44-9 15-17 30-24 46-4 10-8 20-12 30z m696 21c-2 4-5 7-9 9-5 3-11 5-16 8-11 4-23 9-34 13-23 8-47 14-71 20-24 6-49 12-74 17-25 6-50 10-76 15-25 4-50 8-75 10-13 1-26 1-39 0-6-1-13-2-19-3-6-1-13-2-19-3-26-4-51-9-77-14-25-5-50-10-75-16-25-5-49-12-74-20-12-4-24-9-36-13-6-3-11-5-17-8-3-1-6-2-9-3-3-1-5-2-8-4-4-2-7-5-10-9-1-2-2-4-3-7-1-3-2-6-3-9-4-11-7-24-9-36-5-24-7-49-6-74 0-13 1-25 3-38 1-12 3-25 5-37 5-25 12-50 20-74 4-12 8-24 13-36 4-11 10-22 15-33 10-21 21-42 34-62 12-20 25-39 39-58 14-19 28-37 43-55 16-18 33-36 50-54 17-18 34-35 52-52 18-17 36-33 55-49 10-7 19-15 28-22 10-8 19-15 28-23 2-2 5-4 7-7 0 0 1-1 1-1 0 0 1 0 1 0 0 0 1 0 2 0 4 4 9 8 14 11 9 8 19 15 28 23 18 15 36 31 54 47 18 16 35 32 52 49 17 16 33 34 49 51 16 18 32 37 47 56 14 18 28 38 41 57 13 20 25 40 36 61 11 21 21 43 30 65 9 22 17 45 23 68 6 23 11 47 13 70 3 24 3 49 2 73 0 3-1 6-2 9 0 3-1 6-2 9-1 6-2 12-3 18-3 11-5 22-8 
33-4 9-7 19-11 28-2 5 13-29 0 0z m-51-210c-7-25-15-49-26-73-9-22-19-44-30-64-12-21-24-40-37-59-13-19-27-38-42-56-15-18-31-36-47-54-17-18-34-36-52-53-18-17-37-34-56-50-9-7-19-15-28-23-1 0-2-1-2-1-1 0-2 0-2 0-1 0-2 1-2 1 0 0-1 1-1 1-2 1-3 3-4 4-4 3-9 7-14 11-2 1-4 3-7 5-2 2-4 4-6 6-5 4-9 8-14 13-9 8-18 17-27 25-10 9-18 18-27 27-4 4-8 9-13 13-4 4-8 9-12 13-17 17-33 35-49 53-15 18-30 37-44 57-14 20-27 40-38 61-12 21-23 42-32 64-10 22-18 45-25 67-2 6-4 12-5 19-1 6-2 12-3 19-2 13-4 26-4 39-1 13-2 26-1 39 0 12 1 25 3 37 0 4 0 7 1 10 0 1 0 3 0 5 1 1 1 3 1 4 0 3 1 5 2 7 0 2 0 3 0 4 0 1 1 2 2 3 1 0 2 1 3 2 0 0 1 0 1 1 1 0 2 0 3 1 3 1 6 2 9 3 6 2 12 5 18 7 23 8 47 16 72 23 12 3 24 6 37 9 6 1 12 2 18 4 7 1 13 2 19 4 26 5 51 9 77 13 13 1 26 3 39 5 3 0 7 1 10 1 1 1 3 1 4 1 2 0 3 0 4 0 6 0 12 0 17-1 1 0 2-1 4-1 1 0 3-1 4-1 3 0 6-1 9-1 7-1 13-2 19-3 13-2 25-4 38-6 25-4 51-8 76-13 25-6 50-12 75-19 12-4 24-8 37-12 6-2 12-4 18-6 3-1 6-2 9-3 1-1 3-1 4-2 1 0 1 0 2 0 1-1 2-1 2-1 2-2 3-2 4-4 1-1 1-1 1-2 0-1 0-2 0-2 1-1 1-2 1-3 2-6 3-13 4-19 1-7 2-13 3-20 0-3 0-6 0-9 0-1 0-2 0-2 0-1 0-2 0-3 1-1 1-3 1-5 5-23 7-48 5-72-1-13-3-26-6-38-8-29 8 35 0 0z m-197 0c-2 4-3 9-5 13 0 1 0 1 0 2-1 0 0 1-1 1 0 1 0 2 0 2 0 1-1 2-1 3-1 2-3 4-4 5-2 2-5 4-7 4 2 2 4 3 7 5 1 1 2 2 3 2 1 1 2 1 2 2 0 1 0 1 0 2 1 0 1 1 2 1 0 1 1 2 1 4 0 2 0 4 0 6 0 3 0 5 0 7 0 3 0 5-1 7-1 6-4 10-8 14-1 2-3 3-5 4-3 2-5 4-8 5-3 1-6 2-9 3-3 0-5 0-7 0-3 0-6 0-8 0-13 0-25 0-37 0-5 0-10 0-14 0-1 0-2 0-3 0 0 0 0-4 0-5 0-3 0-6 0-9 0-1 0-2 0-3 0-1 0-1 1-1 1 0 12 0 12 0 0-8 0-16 0-24 0-13 0-25 0-38 0-4 0-7 0-11 0 0 0-1 0-1-3 0-5 0-8 0-1 0-2 0-4 0 0 0-1 0-1-1 0-2 0-4 0-6 0-1 0-14 0-14 10 0 19 0 29 0 5 0 11 0 16 0 1 0 3 0 4 0 0 0 0 3 0 3 0 6 0 11 0 17 0 1 0 1-1 1-1 0-2 0-4 0-1 0-3 0-4 0 0 0-1-1-1 0 0 5 0 10 0 15 0 2 0 5 0 8 0 1 0 1 0 2 0 0 0 0 1 0 2 0 5 0 8 0 2 0 4 0 6 0 1 0 3 0 4 0 1 0 2-1 3-2 1-1 2-2 3-3 0-1 0-1 0-2 0-2 1-3 1-4 1-1 1-2 2-3 0-1 0-1 0-1 0-1 0-2 0-2 1-6 2-12 3-17 1-3 1-5 2-8 0 0 0-1 0-1 0 0 0 0 0 0 11 0 21 0 32 0 0 0 1 0 1 0 0 
0 0 0 0 0 0 3 0 5 0 8 0 5 0 10 0 15 0 0 0 0 0 0-1-2-1-4-2-5z m-26 53c0-2-3-3-4-4-1 0-1-1-1-1 0 0-1 0-1 0 0 0-1 0-1 0-1-1-1-2-2-2-1 0-2 0-3 0-2 0-4 0-5 0-1 0-2 0-2 0-3 0-7 0-10 0 0 0-4 0-4 0 0 9 0 19 0 28 0 0 13 0 14 0 4 0 8 0 12-1 1 0 3-1 4-2 1 0 2-1 3-2 1-2 2-4 3-6 0-1 0-1 0-2-1-1-1-1-1-2-1-1-1-1-1-2-1-1-1-2-1-4z m131-53c0 9 0 18 0 27 0 9-2 18-8 24-7 7-19 8-28 7-2 0-4 0-6-1-1 0-1-1-2-2-1 0-2 0-2-1-3-1-6-3-8-6 0 9 0 18 0 27 0 6 0 12 0 18 0 2 0 3 0 5 0 1 0 1 0 1-11 0-22 0-32 0-1 0-3 0-4 0 0 0 0-4 0-4 0-5 0-11 0-16 0 0 1 0 1 0 1 0 3 0 4 0 3 0 5 0 8 0 0 0 0-5 0-6 0-11 0-23 0-34 0-11 0-22 0-33 0-1 0-2 0-3 0-1 0-1-1-1-3 0-6 0-9 0-1 0-2 0-3 0 0 0 0-1 0-1 0-6 0-11 0-17 0-1 0-1 0-2 0 0 0 0 0 0 2 0 3 0 4 0 12 0 24 0 36 0 0 0 9 0 9 0 0 5 0 10 0 15 0 1 0 3 0 5 0 0-6 0-7 0 0 0-6 0-6 0 0 1 0 3 0 4 0 6 0 12 0 18 0 3 0 5 0 7 0 1 0 1 0 2 0 0 1 1 2 2 1 1 2 2 3 2 1 1 1 1 2 1 1 0 1 0 1 1 2 1 3 1 4 1 3 1 5 0 7-1 1-1 2-2 3-3 1-1 2-2 2-3 1-2 1-4 1-6 0-3 0-7 0-11 0-11 0-22 0-33 0-1 0-2 1-2 1 0 3 0 5 0 6 0 12 0 19 0 3 0 7 0 11 0 0 0 0 14 0 15 0 2 0 4 0 6-2-1-5-2-7-2z m7 122c-2 0-5 0-7 0-1 0-2 0-3 0 0 0 0 0 0 0-3-5-7-10-10-14-1-2-3-5-4-7 0 0-1-1-1-2 0 0 0-1 0-1 0-1 0-4 0-4 0 0 1 0 2 0 3 0 5 0 8 0 3 0 5 0 8 0 0 0 0-1 0-1 0-1 0-3 0-4 0-2 0-3 0-4 0-1 0-1 1-1 1 0 3 0 4 0 1 0 2 0 2 0 0 3 0 6 0 8 0 1 0 2 1 2 1 0 2 0 4 0 0 0 0 0 0 0 0 1 0 1 0 2 0 0 0 3 0 3-1 0-2 0-3 0-1 0-2 0-2 1 0 7 0 15 0 22z m-7-15c0-1 0-2 0-4 0-1 0-1 0-2 0 0 0-1 0-2 0 0-1 0-1 0-1 0-2 0-3 0-1 0-3 0-4 0 1 2 3 4 4 6 1 1 1 2 2 3 1 2 2 4 2 7 0-3 0-5 0-8z m41-2c-3 2-8 2-11 2-1 0-2 0-2 0 0 0 0 1 0 1 0 2 0 3 0 4 0 0 13 0 14 0 1 0 1 0 1 1 0 1 0 2 0 4 0 1 0 2 0 3 0 0-1 0-1 0-6 0-11 0-17 0-1 0-1 0-2 0 0 0 0-1 0-1-1-2-1-4-1-6-1-5-1-9-2-14 4 0 9 1 12-1 3-1 4-4 4-7-1-4-5-4-7-4-2 0-9-1-9 2-1-2-1-5-2-8 1 0 3-1 5-1 2-1 5-1 7-1 4 0 8 2 11 5 2 3 2 7 2 11 0 2 0 5 0 7 0 1-1 2-2 3 0 0 0 0 0 0-3 2 2 0 0 0z" horiz-adv-x="1000" />
58
59 <glyph glyph-name="up" unicode="&#xe832;" d="M687 254l-173 217c-8 9-22 9-29 0l-173-217c-9-12-1-29 15-29h345c16 0 24 17 15 29z" horiz-adv-x="1000" />
60
61 <glyph glyph-name="merge" unicode="&#xe833;" d="M200 110c0-72 58-131 130-131s130 59 130 131c0 45-24 86-60 109 18 139 133 179 202 190v-301c-38-23-65-64-65-112 0-72 59-130 130-130s130 58 130 130c0 48-26 89-65 112v487c39 23 65 64 65 112 0 72-58 130-130 130s-130-58-130-130c0-48 27-89 65-112v-55c-60-8-162-32-238-108-54-54-86-124-94-208-42-22-70-65-70-114z m468-158c-24 0-44 20-44 43s20 44 44 44c24 0 43-20 43-44s-19-43-43-43z m0 798c24 0 43-19 43-43s-20-43-43-43c-24 0-44 19-44 43s20 43 44 43z m-338-684c-24 0-43 20-43 43s19 44 43 44c24 0 43-20 43-44s-19-43-43-43z" horiz-adv-x="1000" />
35 62 </font>
36 63 </defs>
37 64 </svg> No newline at end of file
1 NO CONTENT: modified file, binary diff hidden
1 NO CONTENT: modified file, binary diff hidden
@@ -1,6 +1,18 b''
1 1 // Global keyboard bindings
2 2
3 3 function setRCMouseBindings(repoName, repoLandingRev) {
4
5 /** custom callback for suppressing mousetrap from firing */
6 Mousetrap.stopCallback = function(e, element) {
7 // if the element has the class "mousetrap" then no need to stop
8 if ((' ' + element.className + ' ').indexOf(' mousetrap ') > -1) {
9 return false;
10 }
11
12 // stop for input, select, and textarea
13 return element.tagName == 'INPUT' || element.tagName == 'SELECT' || element.tagName == 'TEXTAREA' || element.isContentEditable;
14 };
15
4 16 // general help "?"
5 17 Mousetrap.bind(['?'], function(e) {
6 18 $('#help_kb').modal({});
@@ -5,9 +5,13 b''
5 5 */
6 6 //JS translations map
7 7 var _TM = {
8 '(from usergroup {0})': '(from usergroup {0})',
8 9 'Add another comment': 'Add another comment',
10 'Close': 'Close',
9 11 'Comment text will be set automatically based on currently selected status ({0}) ...': 'Comment text will be set automatically based on currently selected status ({0}) ...',
12 'Delete this comment?': 'Delete this comment?',
10 13 'Follow': 'Follow',
14 'Invite reviewers to this discussion': 'Invite reviewers to this discussion',
11 15 'Loading ...': 'Loading ...',
12 16 'Loading failed': 'Loading failed',
13 17 'Loading more results...': 'Loading more results...',
@@ -32,6 +36,7 b' var _TM = {'
32 36 'Please enter {0} or more characters': 'Please enter {0} or more characters',
33 37 'Searching...': 'Searching...',
34 38 'Selection link': 'Selection link',
39 'Send': 'Send',
35 40 'Set status to Approved': 'Set status to Approved',
36 41 'Set status to Rejected': 'Set status to Rejected',
37 42 'Show more': 'Show more',
@@ -41,10 +46,14 b' var _TM = {'
41 46 'Status Review': 'Status Review',
42 47 'Stop following this repository': 'Stop following this repository',
43 48 'Submitting...': 'Submitting...',
49 'Switch to chat': 'Switch to chat',
50 'Switch to comment': 'Switch to comment',
51 'There are currently no open pull requests requiring your participation.': 'There are currently no open pull requests requiring your participation.',
44 52 'Unfollow': 'Unfollow',
45 53 'Updating...': 'Updating...',
46 54 'You can only select {0} item': 'You can only select {0} item',
47 55 'You can only select {0} items': 'You can only select {0} items',
56 'added manually by "{0}"': 'added manually by "{0}"',
48 57 'disabled': 'disabled',
49 58 'enabled': 'enabled',
50 59 'file': 'file',
@@ -5,9 +5,13 b''
5 5 */
6 6 //JS translations map
7 7 var _TM = {
8 '(from usergroup {0})': '(from usergroup {0})',
8 9 'Add another comment': 'Add another comment',
10 'Close': 'Close',
9 11 'Comment text will be set automatically based on currently selected status ({0}) ...': 'Comment text will be set automatically based on currently selected status ({0}) ...',
12 'Delete this comment?': 'Delete this comment?',
10 13 'Follow': 'Follow',
14 'Invite reviewers to this discussion': 'Invite reviewers to this discussion',
11 15 'Loading ...': 'Loading ...',
12 16 'Loading failed': 'Loading failed',
13 17 'Loading more results...': 'Loading more results...',
@@ -32,6 +36,7 b' var _TM = {'
32 36 'Please enter {0} or more characters': 'Please enter {0} or more characters',
33 37 'Searching...': 'Searching...',
34 38 'Selection link': 'Selection link',
39 'Send': 'Senden',
35 40 'Set status to Approved': 'Set status to Approved',
36 41 'Set status to Rejected': 'Set status to Rejected',
37 42 'Show more': 'Show more',
@@ -41,10 +46,14 b' var _TM = {'
41 46 'Status Review': 'Status Review',
42 47 'Stop following this repository': 'Stop following this repository',
43 48 'Submitting...': 'Submitting...',
49 'Switch to chat': 'Switch to chat',
50 'Switch to comment': 'Switch to comment',
51 'There are currently no open pull requests requiring your participation.': 'There are currently no open pull requests requiring your participation.',
44 52 'Unfollow': 'Unfollow',
45 53 'Updating...': 'Updating...',
46 54 'You can only select {0} item': 'You can only select {0} item',
47 55 'You can only select {0} items': 'You can only select {0} items',
56 'added manually by "{0}"': 'added manually by "{0}"',
48 57 'disabled': 'disabled',
49 58 'enabled': 'enabled',
50 59 'file': 'file',
@@ -5,9 +5,13 b''
5 5 */
6 6 //JS translations map
7 7 var _TM = {
8 '(from usergroup {0})': '(from usergroup {0})',
8 9 'Add another comment': 'Add another comment',
10 'Close': 'Close',
9 11 'Comment text will be set automatically based on currently selected status ({0}) ...': 'Comment text will be set automatically based on currently selected status ({0}) ...',
12 'Delete this comment?': 'Delete this comment?',
10 13 'Follow': 'Follow',
14 'Invite reviewers to this discussion': 'Invite reviewers to this discussion',
11 15 'Loading ...': 'Loading ...',
12 16 'Loading failed': 'Loading failed',
13 17 'Loading more results...': 'Loading more results...',
@@ -32,6 +36,7 b' var _TM = {'
32 36 'Please enter {0} or more characters': 'Please enter {0} or more characters',
33 37 'Searching...': 'Searching...',
34 38 'Selection link': 'Selection link',
39 'Send': 'Send',
35 40 'Set status to Approved': 'Set status to Approved',
36 41 'Set status to Rejected': 'Set status to Rejected',
37 42 'Show more': 'Show more',
@@ -41,10 +46,14 b' var _TM = {'
41 46 'Status Review': 'Status Review',
42 47 'Stop following this repository': 'Stop following this repository',
43 48 'Submitting...': 'Submitting...',
49 'Switch to chat': 'Switch to chat',
50 'Switch to comment': 'Switch to comment',
51 'There are currently no open pull requests requiring your participation.': 'There are currently no open pull requests requiring your participation.',
44 52 'Unfollow': 'Unfollow',
45 53 'Updating...': 'Updating...',
46 54 'You can only select {0} item': 'You can only select {0} item',
47 55 'You can only select {0} items': 'You can only select {0} items',
56 'added manually by "{0}"': 'added manually by "{0}"',
48 57 'disabled': 'disabled',
49 58 'enabled': 'enabled',
50 59 'file': 'file',
@@ -5,9 +5,13 b''
5 5 */
6 6 //JS translations map
7 7 var _TM = {
8 '(from usergroup {0})': '(from usergroup {0})',
8 9 'Add another comment': 'Add another comment',
10 'Close': 'Close',
9 11 'Comment text will be set automatically based on currently selected status ({0}) ...': 'Comment text will be set automatically based on currently selected status ({0}) ...',
12 'Delete this comment?': 'Delete this comment?',
10 13 'Follow': 'Follow',
14 'Invite reviewers to this discussion': 'Invite reviewers to this discussion',
11 15 'Loading ...': 'Loading ...',
12 16 'Loading failed': 'Loading failed',
13 17 'Loading more results...': 'Loading more results...',
@@ -32,6 +36,7 b' var _TM = {'
32 36 'Please enter {0} or more characters': 'Please enter {0} or more characters',
33 37 'Searching...': 'Searching...',
34 38 'Selection link': 'Selection link',
39 'Send': 'Send',
35 40 'Set status to Approved': 'Set status to Approved',
36 41 'Set status to Rejected': 'Set status to Rejected',
37 42 'Show more': 'Show more',
@@ -41,10 +46,14 b' var _TM = {'
41 46 'Status Review': 'Status Review',
42 47 'Stop following this repository': 'Stop following this repository',
43 48 'Submitting...': 'Submitting...',
49 'Switch to chat': 'Switch to chat',
50 'Switch to comment': 'Switch to comment',
51 'There are currently no open pull requests requiring your participation.': 'There are currently no open pull requests requiring your participation.',
44 52 'Unfollow': 'Unfollow',
45 53 'Updating...': 'Updating...',
46 54 'You can only select {0} item': 'You can only select {0} item',
47 55 'You can only select {0} items': 'You can only select {0} items',
56 'added manually by "{0}"': 'added manually by "{0}"',
48 57 'disabled': 'disabled',
49 58 'enabled': 'enabled',
50 59 'file': 'file',
@@ -5,9 +5,13 b''
5 5 */
6 6 //JS translations map
7 7 var _TM = {
8 '(from usergroup {0})': '(from usergroup {0})',
8 9 'Add another comment': 'Add another comment',
10 'Close': 'Close',
9 11 'Comment text will be set automatically based on currently selected status ({0}) ...': 'Comment text will be set automatically based on currently selected status ({0}) ...',
12 'Delete this comment?': 'Delete this comment?',
10 13 'Follow': 'Follow',
14 'Invite reviewers to this discussion': 'Invite reviewers to this discussion',
11 15 'Loading ...': 'Loading ...',
12 16 'Loading failed': 'Loading failed',
13 17 'Loading more results...': 'Loading more results...',
@@ -32,6 +36,7 b' var _TM = {'
32 36 'Please enter {0} or more characters': 'Please enter {0} or more characters',
33 37 'Searching...': 'Searching...',
34 38 'Selection link': 'Lien vers la sélection',
39 'Send': 'Envoyer',
35 40 'Set status to Approved': 'Set status to Approved',
36 41 'Set status to Rejected': 'Set status to Rejected',
37 42 'Show more': 'Show more',
@@ -41,10 +46,14 b' var _TM = {'
41 46 'Status Review': 'Status Review',
42 47 'Stop following this repository': 'Arrêter de suivre ce dépôt',
43 48 'Submitting...': 'Envoi…',
49 'Switch to chat': 'Switch to chat',
50 'Switch to comment': 'Switch to comment',
51 'There are currently no open pull requests requiring your participation.': 'There are currently no open pull requests requiring your participation.',
44 52 'Unfollow': 'Unfollow',
45 53 'Updating...': 'Updating...',
46 54 'You can only select {0} item': 'You can only select {0} item',
47 55 'You can only select {0} items': 'You can only select {0} items',
56 'added manually by "{0}"': 'added manually by "{0}"',
48 57 'disabled': 'Désactivé',
49 58 'enabled': 'enabled',
50 59 'file': 'file',
@@ -5,9 +5,13 b''
5 5 */
6 6 //JS translations map
7 7 var _TM = {
8 '(from usergroup {0})': '(from usergroup {0})',
8 9 'Add another comment': 'Aggiungi un altro commento',
10 'Close': 'Close',
9 11 'Comment text will be set automatically based on currently selected status ({0}) ...': 'Comment text will be set automatically based on currently selected status ({0}) ...',
12 'Delete this comment?': 'Delete this comment?',
10 13 'Follow': 'Segui',
14 'Invite reviewers to this discussion': 'Invite reviewers to this discussion',
11 15 'Loading ...': 'Caricamento ...',
12 16 'Loading failed': 'Loading failed',
13 17 'Loading more results...': 'Loading more results...',
@@ -32,6 +36,7 b' var _TM = {'
32 36 'Please enter {0} or more characters': 'Please enter {0} or more characters',
33 37 'Searching...': 'Searching...',
34 38 'Selection link': 'Collegamento selezione',
39 'Send': 'Invia',
35 40 'Set status to Approved': 'Set status to Approved',
36 41 'Set status to Rejected': 'Set status to Rejected',
37 42 'Show more': 'Mostra ancora',
@@ -41,10 +46,14 b' var _TM = {'
41 46 'Status Review': 'Status Review',
42 47 'Stop following this repository': 'Smetti di seguire il repository',
43 48 'Submitting...': 'Inoltro...',
49 'Switch to chat': 'Switch to chat',
50 'Switch to comment': 'Switch to comment',
51 'There are currently no open pull requests requiring your participation.': 'Al momento non ci sono richieste di PULL che richiedono il tuo intervento',
44 52 'Unfollow': 'Smetti di seguire',
45 53 'Updating...': 'Updating...',
46 54 'You can only select {0} item': 'You can only select {0} item',
47 55 'You can only select {0} items': 'You can only select {0} items',
56 'added manually by "{0}"': 'added manually by "{0}"',
48 57 'disabled': 'disabilitato',
49 58 'enabled': 'abilitato',
50 59 'file': 'file',
@@ -5,9 +5,13 b''
5 5 */
6 6 //JS translations map
7 7 var _TM = {
8 '(from usergroup {0})': '(from usergroup {0})',
8 9 'Add another comment': '別のコメントを追加',
10 'Close': 'Close',
9 11 'Comment text will be set automatically based on currently selected status ({0}) ...': 'Comment text will be set automatically based on currently selected status ({0}) ...',
12 'Delete this comment?': 'Delete this comment?',
10 13 'Follow': 'フォロー',
14 'Invite reviewers to this discussion': 'Invite reviewers to this discussion',
11 15 'Loading ...': '読み込み中...',
12 16 'Loading failed': 'Loading failed',
13 17 'Loading more results...': 'Loading more results...',
@@ -32,6 +36,7 b' var _TM = {'
32 36 'Please enter {0} or more characters': 'Please enter {0} or more characters',
33 37 'Searching...': 'Searching...',
34 38 'Selection link': 'セレクション・リンク',
39 'Send': '送信',
35 40 'Set status to Approved': 'Set status to Approved',
36 41 'Set status to Rejected': 'Set status to Rejected',
37 42 'Show more': 'もっと表示',
@@ -41,10 +46,14 b' var _TM = {'
41 46 'Status Review': 'Status Review',
42 47 'Stop following this repository': 'このリポジトリのフォローをやめる',
43 48 'Submitting...': '送信中...',
49 'Switch to chat': 'Switch to chat',
50 'Switch to comment': 'Switch to comment',
51 'There are currently no open pull requests requiring your participation.': 'There are currently no open pull requests requiring your participation.',
44 52 'Unfollow': 'アンフォロー',
45 53 'Updating...': 'Updating...',
46 54 'You can only select {0} item': 'You can only select {0} item',
47 55 'You can only select {0} items': 'You can only select {0} items',
56 'added manually by "{0}"': 'added manually by "{0}"',
48 57 'disabled': '無効',
49 58 'enabled': '有効',
50 59 'file': 'file',
@@ -1,7 +1,11 b''
1 1 // AUTO GENERATED FILE FOR Babel JS-GETTEXT EXTRACTORS, DO NOT CHANGE
2 _gettext('(from usergroup {0})');
2 3 _gettext('Add another comment');
4 _gettext('Close');
3 5 _gettext('Comment text will be set automatically based on currently selected status ({0}) ...');
6 _gettext('Delete this comment?');
4 7 _gettext('Follow');
8 _gettext('Invite reviewers to this discussion');
5 9 _gettext('Loading ...');
6 10 _gettext('Loading failed');
7 11 _gettext('Loading more results...');
@@ -26,6 +30,7 b''
26 30 _gettext('Please enter {0} or more characters');
27 31 _gettext('Searching...');
28 32 _gettext('Selection link');
33 _gettext('Send');
29 34 _gettext('Set status to Approved');
30 35 _gettext('Set status to Rejected');
31 36 _gettext('Show more');
@@ -35,10 +40,14 b''
35 40 _gettext('Status Review');
36 41 _gettext('Stop following this repository');
37 42 _gettext('Submitting...');
43 _gettext('Switch to chat');
44 _gettext('Switch to comment');
45 _gettext('There are currently no open pull requests requiring your participation.');
38 46 _gettext('Unfollow');
39 47 _gettext('Updating...');
40 48 _gettext('You can only select {0} item');
41 49 _gettext('You can only select {0} items');
50 _gettext('added manually by "{0}"');
42 51 _gettext('disabled');
43 52 _gettext('enabled');
44 53 _gettext('file');
@@ -5,9 +5,13 b''
5 5 */
6 6 //JS translations map
7 7 var _TM = {
8 '(from usergroup {0})': '(from usergroup {0})',
8 9 'Add another comment': 'Dodaj kolejny komentarz',
10 'Close': 'Zamknij',
9 11 'Comment text will be set automatically based on currently selected status ({0}) ...': 'Comment text will be set automatically based on currently selected status ({0}) ...',
12 'Delete this comment?': 'Delete this comment?',
10 13 'Follow': 'Obserwuj',
14 'Invite reviewers to this discussion': 'Invite reviewers to this discussion',
11 15 'Loading ...': 'Ładuję...',
12 16 'Loading failed': 'Loading failed',
13 17 'Loading more results...': 'Loading more results...',
@@ -32,6 +36,7 b' var _TM = {'
32 36 'Please enter {0} or more characters': 'Please enter {0} or more characters',
33 37 'Searching...': 'Searching...',
34 38 'Selection link': 'Wybór linku',
39 'Send': 'Wyślij',
35 40 'Set status to Approved': 'Set status to Approved',
36 41 'Set status to Rejected': 'Set status to Rejected',
37 42 'Show more': 'Pokaż więcej',
@@ -41,10 +46,14 b' var _TM = {'
41 46 'Status Review': 'Status Review',
42 47 'Stop following this repository': 'Zakończyć obserwację tego repozytorium',
43 48 'Submitting...': 'Przesyłanie...',
49 'Switch to chat': 'Switch to chat',
50 'Switch to comment': 'Switch to comment',
51 'There are currently no open pull requests requiring your participation.': 'There are currently no open pull requests requiring your participation.',
44 52 'Unfollow': 'Nie obserwuj',
45 53 'Updating...': 'Updating...',
46 54 'You can only select {0} item': 'You can only select {0} item',
47 55 'You can only select {0} items': 'You can only select {0} items',
56 'added manually by "{0}"': 'added manually by "{0}"',
48 57 'disabled': 'disabled',
49 58 'enabled': 'enabled',
50 59 'file': 'file',
@@ -5,9 +5,13 b''
5 5 */
6 6 //JS translations map
7 7 var _TM = {
8 '(from usergroup {0})': '(from usergroup {0})',
8 9 'Add another comment': 'Adicionar outro comentário',
10 'Close': 'Close',
9 11 'Comment text will be set automatically based on currently selected status ({0}) ...': 'Comment text will be set automatically based on currently selected status ({0}) ...',
12 'Delete this comment?': 'Delete this comment?',
10 13 'Follow': 'Seguir',
14 'Invite reviewers to this discussion': 'Invite reviewers to this discussion',
11 15 'Loading ...': 'Carregando...',
12 16 'Loading failed': 'Loading failed',
13 17 'Loading more results...': 'Loading more results...',
@@ -32,6 +36,7 b' var _TM = {'
32 36 'Please enter {0} or more characters': 'Please enter {0} or more characters',
33 37 'Searching...': 'Searching...',
34 38 'Selection link': 'Link da seleção',
39 'Send': 'Enviar',
35 40 'Set status to Approved': 'Set status to Approved',
36 41 'Set status to Rejected': 'Set status to Rejected',
37 42 'Show more': 'Mostrar mais',
@@ -41,10 +46,14 b' var _TM = {'
41 46 'Status Review': 'Status Review',
42 47 'Stop following this repository': 'Parar de seguir este repositório',
43 48 'Submitting...': 'Enviando...',
49 'Switch to chat': 'Switch to chat',
50 'Switch to comment': 'Switch to comment',
51 'There are currently no open pull requests requiring your participation.': 'There are currently no open pull requests requiring your participation.',
44 52 'Unfollow': 'Parar de seguir',
45 53 'Updating...': 'Updating...',
46 54 'You can only select {0} item': 'You can only select {0} item',
47 55 'You can only select {0} items': 'You can only select {0} items',
56 'added manually by "{0}"': 'added manually by "{0}"',
48 57 'disabled': 'desabilitado',
49 58 'enabled': 'enabled',
50 59 'file': 'file',
@@ -5,9 +5,13 b''
5 5 */
6 6 //JS translations map
7 7 var _TM = {
8 '(from usergroup {0})': '(from usergroup {0})',
8 9 'Add another comment': 'Добавить другой комментарий',
10 'Close': 'Close',
9 11 'Comment text will be set automatically based on currently selected status ({0}) ...': 'Comment text will be set automatically based on currently selected status ({0}) ...',
12 'Delete this comment?': 'Delete this comment?',
10 13 'Follow': 'Наблюдать',
14 'Invite reviewers to this discussion': 'Invite reviewers to this discussion',
11 15 'Loading ...': 'Загрузка...',
12 16 'Loading failed': 'Loading failed',
13 17 'Loading more results...': 'Loading more results...',
@@ -32,6 +36,7 b' var _TM = {'
32 36 'Please enter {0} or more characters': 'Please enter {0} or more characters',
33 37 'Searching...': 'Searching...',
34 38 'Selection link': 'Ссылка выбора',
39 'Send': 'Отправить',
35 40 'Set status to Approved': 'Set status to Approved',
36 41 'Set status to Rejected': 'Set status to Rejected',
37 42 'Show more': 'Показать еще',
@@ -41,10 +46,14 b' var _TM = {'
41 46 'Status Review': 'Status Review',
42 47 'Stop following this repository': 'Отменить наблюдение за репозиторием',
43 48 'Submitting...': 'Применение...',
49 'Switch to chat': 'Switch to chat',
50 'Switch to comment': 'Switch to comment',
51 'There are currently no open pull requests requiring your participation.': 'There are currently no open pull requests requiring your participation.',
44 52 'Unfollow': 'Не наблюдать',
45 53 'Updating...': 'Updating...',
46 54 'You can only select {0} item': 'You can only select {0} item',
47 55 'You can only select {0} items': 'You can only select {0} items',
56 'added manually by "{0}"': 'added manually by "{0}"',
48 57 'disabled': 'отключено',
49 58 'enabled': 'enabled',
50 59 'file': 'file',
@@ -5,9 +5,13 b''
5 5 */
6 6 //JS translations map
7 7 var _TM = {
8 '(from usergroup {0})': '(from usergroup {0})',
8 9 'Add another comment': 'Add another comment',
10 'Close': 'Close',
9 11 'Comment text will be set automatically based on currently selected status ({0}) ...': 'Comment text will be set automatically based on currently selected status ({0}) ...',
12 'Delete this comment?': 'Delete this comment?',
10 13 'Follow': 'Follow',
14 'Invite reviewers to this discussion': 'Invite reviewers to this discussion',
11 15 'Loading ...': 'Loading ...',
12 16 'Loading failed': 'Loading failed',
13 17 'Loading more results...': 'Loading more results...',
@@ -32,6 +36,7 b' var _TM = {'
32 36 'Please enter {0} or more characters': 'Please enter {0} or more characters',
33 37 'Searching...': 'Searching...',
34 38 'Selection link': '选择链接',
39 'Send': '发送',
35 40 'Set status to Approved': 'Set status to Approved',
36 41 'Set status to Rejected': 'Set status to Rejected',
37 42 'Show more': 'Show more',
@@ -41,10 +46,14 b' var _TM = {'
41 46 'Status Review': 'Status Review',
42 47 'Stop following this repository': '停止关注该版本库',
43 48 'Submitting...': '提交中……',
49 'Switch to chat': 'Switch to chat',
50 'Switch to comment': 'Switch to comment',
51 'There are currently no open pull requests requiring your participation.': 'There are currently no open pull requests requiring your participation.',
44 52 'Unfollow': 'Unfollow',
45 53 'Updating...': 'Updating...',
46 54 'You can only select {0} item': 'You can only select {0} item',
47 55 'You can only select {0} items': 'You can only select {0} items',
56 'added manually by "{0}"': 'added manually by "{0}"',
48 57 'disabled': '禁用',
49 58 'enabled': 'enabled',
50 59 'file': 'file',
@@ -14,7 +14,9 b' function registerRCRoutes() {'
14 14 // routes registration
15 15 pyroutes.register('home', '/', []);
16 16 pyroutes.register('user_autocomplete_data', '/_users', []);
17 pyroutes.register('user_group_autocomplete_data', '/_user_groups', []);
17 18 pyroutes.register('new_repo', '/_admin/create_repository', []);
19 pyroutes.register('edit_user', '/_admin/users/%(user_id)s/edit', ['user_id']);
18 20 pyroutes.register('edit_user_group_members', '/_admin/user_groups/%(user_group_id)s/edit/members', ['user_group_id']);
19 21 pyroutes.register('gists', '/_admin/gists', []);
20 22 pyroutes.register('new_gist', '/_admin/gists/new', []);
@@ -22,6 +24,7 b' function registerRCRoutes() {'
22 24 pyroutes.register('repo_stats', '/%(repo_name)s/repo_stats/%(commit_id)s', ['repo_name', 'commit_id']);
23 25 pyroutes.register('repo_refs_data', '/%(repo_name)s/refs-data', ['repo_name']);
24 26 pyroutes.register('repo_refs_changelog_data', '/%(repo_name)s/refs-data-changelog', ['repo_name']);
27 pyroutes.register('repo_default_reviewers_data', '/%(repo_name)s/default-reviewers', ['repo_name']);
25 28 pyroutes.register('changeset_home', '/%(repo_name)s/changeset/%(revision)s', ['repo_name', 'revision']);
26 29 pyroutes.register('edit_repo', '/%(repo_name)s/settings', ['repo_name']);
27 30 pyroutes.register('edit_repo_perms', '/%(repo_name)s/settings/permissions', ['repo_name']);
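The routes registered above pair a URL pattern containing `%(param)s` placeholders with the list of parameter names. The real `pyroutes` implementation is not part of this diff; the following is a minimal stand-alone sketch of how such registration and URL generation could work (the `register`/`url` function names here are assumptions for illustration):

```javascript
// Minimal sketch of pyroutes-style route registration and URL
// generation; not RhodeCode's actual pyroutes implementation.
var routes = {};

function register(name, pattern, params) {
  routes[name] = { pattern: pattern, params: params };
}

function url(name, args) {
  args = args || {};
  var route = routes[name];
  // substitute each %(param)s placeholder with the supplied argument
  return route.params.reduce(function (acc, p) {
    return acc.replace('%(' + p + ')s', encodeURIComponent(args[p]));
  }, route.pattern);
}

register('repo_refs_data', '/%(repo_name)s/refs-data', ['repo_name']);
register('edit_user', '/_admin/users/%(user_id)s/edit', ['user_id']);
```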
@@ -1,15 +1,18 b''
1 1 <link rel="import" href="../../../../../../bower_components/polymer/polymer.html">
2 2 <link rel="import" href="../channelstream-connection/channelstream-connection.html">
3 <link rel="import" href="../rhodecode-toast/rhodecode-toast.html">
4 <link rel="import" href="../rhodecode-favicon/rhodecode-favicon.html">
3 5
4 6 <dom-module id="rhodecode-app">
5 7 <template>
6 <rhodecode-toast id="notifications"></rhodecode-toast>
7 8 <channelstream-connection
8 9 id="channelstream-connection"
9 10 on-channelstream-listen-message="receivedMessage"
10 11 on-channelstream-connected="handleConnected"
11 12 on-channelstream-subscribed="handleSubscribed">
12 13 </channelstream-connection>
14 <rhodecode-favicon></rhodecode-favicon>
15 <rhodecode-toast id="notifications"></rhodecode-toast>
13 16 </template>
14 17 <script src="rhodecode-app.js"></script>
15 18 </dom-module>
@@ -3,16 +3,20 b' ccLog.setLevel(Logger.OFF);'
3 3
4 4 var rhodeCodeApp = Polymer({
5 5 is: 'rhodecode-app',
6 created: function () {
6 attached: function () {
7 7 ccLog.debug('rhodeCodeApp created');
8 8 $.Topic('/notifications').subscribe(this.handleNotifications.bind(this));
9
10 $.Topic('/plugins/__REGISTER__').subscribe(
11 this.kickoffChannelstreamPlugin.bind(this)
12 );
13
9 $.Topic('/favicon/update').subscribe(this.faviconUpdate.bind(this));
14 10 $.Topic('/connection_controller/subscribe').subscribe(
15 11 this.subscribeToChannelTopic.bind(this));
12 // this event can be used to coordinate plugins to do their
13 // initialization before channelstream is kicked off
14 $.Topic('/__MAIN_APP__').publish({});
15
16 for (var i = 0; i < alertMessagePayloads.length; i++) {
17 $.Topic('/notifications').publish(alertMessagePayloads[i]);
18 }
19 this.kickoffChannelstreamPlugin();
16 20 },
17 21
18 22 /** proxy to channelstream connection */
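The `attached` callback above wires everything through `$.Topic(...).subscribe/publish`. The actual `$.Topic` helper lives in RhodeCode's legacy JS and is not shown in this diff; a minimal sketch of the publish/subscribe pattern it implements, under that assumption, looks like:

```javascript
// Minimal topic-based pub/sub sketch, assumed to mirror the behavior
// relied on above (subscribe registers a callback; publish fans the
// payload out to every subscriber of that topic id).
var topics = {};

function Topic(id) {
  if (!topics[id]) {
    var callbacks = [];
    topics[id] = {
      subscribe: function (fn) { callbacks.push(fn); },
      publish: function (data) {
        callbacks.forEach(function (fn) { fn(data); });
      }
    };
  }
  return topics[id];
}
```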
@@ -24,6 +28,10 b' var rhodeCodeApp = Polymer({'
24 28 this.$['notifications'].handleNotification(data);
25 29 },
26 30
31 faviconUpdate: function (data) {
32 this.$$('rhodecode-favicon').counter = data.count;
33 },
34
27 35 /** opens connection to ws server */
28 36 kickoffChannelstreamPlugin: function (data) {
29 37 ccLog.debug('kickoffChannelstreamPlugin');
@@ -33,7 +41,7 b' var rhodeCodeApp = Polymer({'
33 41 channels.push(addChannels[i]);
34 42 }
35 43 if (window.CHANNELSTREAM_SETTINGS && CHANNELSTREAM_SETTINGS.enabled){
36 var channelstreamConnection = this.$['channelstream-connection'];
44 var channelstreamConnection = this.getChannelStreamConnection();
37 45 channelstreamConnection.connectUrl = CHANNELSTREAM_URLS.connect;
38 46 channelstreamConnection.subscribeUrl = CHANNELSTREAM_URLS.subscribe;
39 47 channelstreamConnection.websocketUrl = CHANNELSTREAM_URLS.ws + '/ws';
@@ -61,7 +69,7 b' var rhodeCodeApp = Polymer({'
61 69
62 70 /** subscribes users from channels in channelstream */
63 71 subscribeToChannelTopic: function (channels) {
64 var channelstreamConnection = this.$['channelstream-connection'];
72 var channelstreamConnection = this.getChannelStreamConnection();
65 73 var toSubscribe = channelstreamConnection.calculateSubscribe(channels);
66 74 ccLog.debug('subscribeToChannelTopic', toSubscribe);
67 75 if (toSubscribe.length > 0) {
@@ -96,7 +104,7 b' var rhodeCodeApp = Polymer({'
96 104 },
97 105
98 106 handleConnected: function (event) {
99 var channelstreamConnection = this.$['channelstream-connection'];
107 var channelstreamConnection = this.getChannelStreamConnection();
100 108 channelstreamConnection.set('channelsState',
101 109 event.detail.channels_info);
102 110 channelstreamConnection.set('userState', event.detail.state);
@@ -104,7 +112,7 b' var rhodeCodeApp = Polymer({'
104 112 this.propagageChannelsState();
105 113 },
106 114 handleSubscribed: function (event) {
107 var channelstreamConnection = this.$['channelstream-connection'];
115 var channelstreamConnection = this.getChannelStreamConnection();
108 116 var channelInfo = event.detail.channels_info;
109 117 var channelKeys = Object.keys(event.detail.channels_info);
110 118 for (var i = 0; i < channelKeys.length; i++) {
@@ -116,7 +124,7 b' var rhodeCodeApp = Polymer({'
116 124 },
117 125 /** propagates channel states on topics */
118 126 propagageChannelsState: function (event) {
119 var channelstreamConnection = this.$['channelstream-connection'];
127 var channelstreamConnection = this.getChannelStreamConnection();
120 128 var channel_data = channelstreamConnection.channelsState;
121 129 var channels = channelstreamConnection.channels;
122 130 for (var i = 0; i < channels.length; i++) {
@@ -21,7 +21,7 b''
21 21 </template>
22 22 </div>
23 23 <div class="toast-close">
24 <button on-tap="dismissNotifications" class="btn btn-default">{{_gettext('Close now')}}</button>
24 <button on-tap="dismissNotifications" class="btn btn-default">{{_gettext('Close')}}</button>
25 25 </div>
26 26 </paper-toast>
27 27 </template>
@@ -13,9 +13,11 b' Polymer({'
13 13 ],
14 14 _changedToasts: function(newValue, oldValue){
15 15 this.$['p-toast'].notifyResize();
16 $.Topic('/favicon/update').publish({count: this.toasts.length});
16 17 },
17 18 dismissNotifications: function(){
18 19 this.$['p-toast'].close();
20 $.Topic('/favicon/update').publish({count: 0});
19 21 },
20 22 handleClosed: function(){
21 23 this.splice('toasts', 0);
@@ -1,8 +1,9 b''
1 1 <!-- required for stamped out templates that might use common elements -->
2 <link rel="import" href="rhodecode-legacy-js/rhodecode-legacy-js.html">
2 3 <link rel="import" href="../../../../../bower_components/iron-ajax/iron-ajax.html">
3 4 <link rel="import" href="shared-styles.html">
4 5 <link rel="import" href="channelstream-connection/channelstream-connection.html">
5 <link rel="import" href="rhodecode-app/rhodecode-app.html">
6 6 <link rel="import" href="rhodecode-toast/rhodecode-toast.html">
7 7 <link rel="import" href="rhodecode-toggle/rhodecode-toggle.html">
8 8 <link rel="import" href="rhodecode-unsafe-html/rhodecode-unsafe-html.html">
9 <link rel="import" href="rhodecode-app/rhodecode-app.html">
@@ -221,14 +221,32 b' var formatSelect2SelectionRefs = functio'
221 221 };
222 222
223 223 // takes a given html element and scrolls it down offset pixels
224 function offsetScroll(element, offset){
225 setTimeout(function(){
224 function offsetScroll(element, offset) {
225 setTimeout(function() {
226 226 var location = element.offset().top;
227 227 // some browsers use body, some use html
228 228 $('html, body').animate({ scrollTop: (location - offset) });
229 229 }, 100);
230 230 }
231 231
232 // scroll an element `percent`% from the top of page in `time` ms
233 function scrollToElement(element, percent, time) {
234 percent = (percent === undefined ? 25 : percent);
235 time = (time === undefined ? 100 : time);
236
237 var $element = $(element);
238 var elOffset = $element.offset().top;
239 var elHeight = $element.height();
240 var windowHeight = $(window).height();
241 var offset = elOffset;
242 if (elHeight < windowHeight) {
243 offset = elOffset - ((windowHeight / (100 / percent)) - (elHeight / 2));
244 }
245 setTimeout(function() {
246 $('html, body').animate({ scrollTop: offset});
247 }, time);
248 }
249
232 250 /**
233 251 * global hooks after DOM is loaded
234 252 */
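The new `scrollToElement` helper computes a target offset so that, when the element fits in the viewport, its centre lands `percent`% down from the top of the window. Extracted as a pure function for illustration (the real helper also performs the animated scroll via jQuery):

```javascript
// Pure version of the offset calculation in scrollToElement above.
function computeScrollOffset(elOffset, elHeight, windowHeight, percent) {
  percent = (percent === undefined ? 25 : percent);
  if (elHeight < windowHeight) {
    // shift so the element's centre sits `percent`% from the top
    return elOffset - ((windowHeight / (100 / percent)) - (elHeight / 2));
  }
  // element taller than the viewport: scroll straight to its top
  return elOffset;
}
```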
@@ -266,7 +284,36 b' function offsetScroll(element, offset){'
266 284 }
267 285 });
268 286
269 $('.compare_view_files').on(
287 $('body').on( /* TODO: replace the $('.compare_view_files').on('click') below
288 when new diffs are integrated */
289 'click', '.cb-lineno a', function(event) {
290
291 if ($(this).attr('data-line-no') !== ""){
292 $('.cb-line-selected').removeClass('cb-line-selected');
293 var td = $(this).parent();
294 td.addClass('cb-line-selected'); // line number td
295 td.prev().addClass('cb-line-selected'); // line data td
296 td.next().addClass('cb-line-selected'); // line content td
297
298 // Replace URL without jumping to it if browser supports.
299 // Default otherwise
300 if (history.pushState) {
301 var new_location = location.href.rstrip('#');
302 if (location.hash) {
303 new_location = new_location.replace(location.hash, "");
304 }
305
306 // Make new anchor url
307 new_location = new_location + $(this).attr('href');
308 history.pushState(true, document.title, new_location);
309
310 return false;
311 }
312 }
313 });
314
315 $('.compare_view_files').on( /* TODO: replace this with .cb function above
316 when new diffs are integrated */
270 317 'click', 'tr.line .lineno a',function(event) {
271 318 if ($(this).text() != ""){
272 319 $('tr.line').removeClass('selected');
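The `history.pushState` branch above rewrites the address bar without jumping: it strips any existing hash from the current href and appends the clicked line's anchor. A pure sketch of that URL rewriting, assuming `rstrip` matches the string helper RhodeCode's legacy JS provides:

```javascript
// Stand-alone sketch of the URL rewriting in the pushState branch.
// `rstrip` is a local stand-in for the assumed String helper.
function rstrip(str, ch) {
  while (str.charAt(str.length - 1) === ch) {
    str = str.slice(0, -1);
  }
  return str;
}

function makeAnchorUrl(href, currentHash, newAnchor) {
  var newLocation = rstrip(href, '#');
  if (currentHash) {
    newLocation = newLocation.replace(currentHash, '');
  }
  // append the new line anchor to the hash-free URL
  return newLocation + newAnchor;
}
```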
@@ -365,24 +412,63 b' function offsetScroll(element, offset){'
365 412 // Select the line that comes from the url anchor
366 413 // At the time of development, Chrome didn't seem to support jquery's :target
367 414 // element, so I had to scroll manually
415
368 416 if (location.hash) {
369 417 var result = splitDelimitedHash(location.hash);
370 418 var loc = result.loc;
371 var remainder = result.remainder;
372 if (loc.length > 1){
373 var lineno = $(loc+'.lineno');
374 if (lineno.length > 0){
375 var tr = lineno.parents('tr.line');
376 tr.addClass('selected');
419 if (loc.length > 1) {
420
421 var highlightable_line_tds = [];
422
423 // source code line format
424 var page_highlights = loc.substring(
425 loc.indexOf('#') + 1).split('L');
377 426
378 tr[0].scrollIntoView();
427 if (page_highlights.length > 1) {
428 var highlight_ranges = page_highlights[1].split(",");
429 var h_lines = [];
430 for (var pos in highlight_ranges) {
431 var _range = highlight_ranges[pos].split('-');
432 if (_range.length === 2) {
433 var start = parseInt(_range[0]);
434 var end = parseInt(_range[1]);
435 if (start < end) {
436 for (var i = start; i <= end; i++) {
437 h_lines.push(i);
438 }
439 }
440 }
441 else {
442 h_lines.push(parseInt(highlight_ranges[pos]));
443 }
444 }
445 for (pos in h_lines) {
446 var line_td = $('td.cb-lineno#L' + h_lines[pos]);
447 if (line_td.length) {
448 highlightable_line_tds.push(line_td);
449 }
450 }
451 }
379 452
453 // now check a direct id reference (diff page)
454 if ($(loc).length && $(loc).hasClass('cb-lineno')) {
455 highlightable_line_tds.push($(loc));
456 }
457 $.each(highlightable_line_tds, function (i, $td) {
458 $td.addClass('cb-line-selected'); // line number td
459 $td.prev().addClass('cb-line-selected'); // line data
460 $td.next().addClass('cb-line-selected'); // line content
461 });
462
463 if (highlightable_line_tds.length) {
464 var $first_line_td = highlightable_line_tds[0];
465 scrollToElement($first_line_td);
380 466 $.Topic('/ui/plugins/code/anchor_focus').prepareOrPublish({
381 tr:tr,
382 remainder:remainder});
467 td: $first_line_td,
468 remainder: result.remainder
469 });
383 470 }
384 471 }
385 472 }
386
387 473 collapsableContent();
388 474 });
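The range-parsing loop added above turns an anchor like `#L3-5,8` into the list of line numbers to highlight. The same logic, extracted into a stand-alone function that mirrors the diff's parsing step by step:

```javascript
// Stand-alone version of the highlight-range parsing above:
// "#L3-5,8" -> [3, 4, 5, 8]
function parseHighlightLines(hash) {
  var lines = [];
  var parts = hash.substring(hash.indexOf('#') + 1).split('L');
  if (parts.length > 1) {
    parts[1].split(',').forEach(function (range) {
      var bounds = range.split('-');
      if (bounds.length === 2) {
        var start = parseInt(bounds[0], 10);
        var end = parseInt(bounds[1], 10);
        if (start < end) {
          // expand "start-end" into every line in the range
          for (var i = start; i <= end; i++) {
            lines.push(i);
          }
        }
      } else {
        lines.push(parseInt(bounds[0], 10));
      }
    });
  }
  return lines;
}
```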
@@ -26,6 +26,9 b' var initAppEnlight = function (config) {'
26 26 if (config.requestInfo){
27 27 AppEnlight.setRequestInfo(config.requestInfo);
28 28 }
29 if (config.tags){
30 AppEnlight.addGlobalTags(config.tags);
31 }
29 32 }
30 33 };
31 34
@@ -29,6 +29,178 b' cmLog.setLevel(Logger.OFF);'
29 29 //global cache for inline forms
30 30 var userHintsCache = {};
31 31
32 // global timer, used to cancel async loading
33 var CodeMirrorLoadUserHintTimer;
34
35 var escapeRegExChars = function(value) {
36 return value.replace(/[\-\[\]\/\{\}\(\)\*\+\?\.\\\^\$\|]/g, "\\$&");
37 };
38
39 /**
40 * Load hints from external source returns an array of objects in a format
41 * that hinting lib requires
42 * @returns {Array}
43 */
44 var CodeMirrorLoadUserHints = function(query, triggerHints) {
45 cmLog.debug('Loading mentions users via AJAX');
46 var _users = [];
47 $.ajax({
48 type: 'GET',
49 data: {query: query},
50 url: pyroutes.url('user_autocomplete_data'),
51 headers: {'X-PARTIAL-XHR': true},
52 async: true
53 })
54 .done(function(data) {
55 var tmpl = '<img class="gravatar" src="{0}"/>{1}';
56 $.each(data.suggestions, function(i) {
57 var userObj = data.suggestions[i];
58
59 if (userObj.username !== "default") {
60 _users.push({
61 text: userObj.username + " ",
62 org_text: userObj.username,
63 displayText: userObj.value_display, // search that field
64 // internal caches
65 _icon_link: userObj.icon_link,
66 _text: userObj.value_display,
67
68 render: function(elt, data, completion) {
69 var el = document.createElement('div');
70 el.className = "CodeMirror-hint-entry";
71 el.innerHTML = tmpl.format(
72 completion._icon_link, completion._text);
73 elt.appendChild(el);
74 }
75 });
76 }
77 });
78 cmLog.debug('Mention users loaded');
79 // set to global cache
80 userHintsCache[query] = _users;
81 triggerHints(userHintsCache[query]);
82 })
83 .fail(function(data, textStatus, xhr) {
84 alert("error processing request: " + textStatus);
85 });
86 };
87
88 /**
89 * filters the results based on the current context
90 * @param users
91 * @param context
92 * @returns {Array}
93 */
94 var CodeMirrorFilterUsers = function(users, context) {
95 var MAX_LIMIT = 10;
96 var filtered_users = [];
97 var curWord = context.string;
98
99 cmLog.debug('Filtering users based on query:', curWord);
100 $.each(users, function(i) {
101 var match = users[i];
102 var searchText = match.displayText;
103
104 if (!curWord ||
105 searchText.toLowerCase().lastIndexOf(curWord) !== -1) {
106 // reset state
107 match._text = match.displayText;
108 if (curWord) {
109 // do highlighting
110 var pattern = '(' + escapeRegExChars(curWord) + ')';
111 match._text = searchText.replace(
112 new RegExp(pattern, 'gi'), '<strong>$1<\/strong>');
113 }
114
115 filtered_users.push(match);
116 }
117 // to avoid returning too many results, cap the filtered list
118 if (filtered_users.length > MAX_LIMIT) {
119 return false;
120 }
121 });
122
123 return filtered_users;
124 };
125
126 var CodeMirrorMentionHint = function(editor, callback, options) {
127 var cur = editor.getCursor();
128 var curLine = editor.getLine(cur.line).slice(0, cur.ch);
129
130 // match on @ +1char
131 var tokenMatch = new RegExp(
132 '(^@| @)([a-zA-Z0-9]{1}[a-zA-Z0-9\-\_\.]*)$').exec(curLine);
133
134 var tokenStr = '';
135 if (tokenMatch !== null && tokenMatch.length > 0){
136 tokenStr = tokenMatch[0].strip();
137 } else {
138 // skip if we didn't match our token
139 return;
140 }
141
142 var context = {
143 start: (cur.ch - tokenStr.length) + 1,
144 end: cur.ch,
145 string: tokenStr.slice(1),
146 type: null
147 };
148
149 // case when we put the @ sign in front of a string,
150 // e.g. <@ we put it here>sometext; then we need to prepend to the text
151 if (context.end > cur.ch) {
152 context.start = context.start + 1; // account for the @ sign
153 context.end = cur.ch; // don't eat front part just append
154 context.string = context.string.slice(1, cur.ch - context.start);
155 }
156
157 cmLog.debug('Mention context', context);
158
159 var triggerHints = function(userHints){
160 return callback({
161 list: CodeMirrorFilterUsers(userHints, context),
162 from: CodeMirror.Pos(cur.line, context.start),
163 to: CodeMirror.Pos(cur.line, context.end)
164 });
165 };
166
167 var queryBasedHintsCache = undefined;
168 // if we have something in the cache, try to fetch the query based cache
169 if (userHintsCache !== {}){
170 queryBasedHintsCache = userHintsCache[context.string];
171 }
172
173 if (queryBasedHintsCache !== undefined) {
174 cmLog.debug('Users loaded from cache');
175 triggerHints(queryBasedHintsCache);
176 } else {
177 // this takes care for async loading, and then displaying results
178 // and also propagates the userHintsCache
179 window.clearTimeout(CodeMirrorLoadUserHintTimer);
180 CodeMirrorLoadUserHintTimer = setTimeout(function() {
181 CodeMirrorLoadUserHints(context.string, triggerHints);
182 }, 300);
183 }
184 };
185
186 var CodeMirrorCompleteAfter = function(cm, pred) {
187 var options = {
188 completeSingle: false,
189 async: true,
190 closeOnUnfocus: true
191 };
192 var cur = cm.getCursor();
193 setTimeout(function() {
194 if (!cm.state.completionActive) {
195 cmLog.debug('Trigger mentions hinting');
196 CodeMirror.showHint(cm, CodeMirror.hint.mentions, options);
197 }
198 }, 100);
199
200 // tell CodeMirror we didn't handle the key
201 // trick to trigger on a char but still complete it
202 return CodeMirror.Pass;
203 };
32 204
33 205 var initCodeMirror = function(textAreadId, resetUrl, focus, options) {
34 206 var ta = $('#' + textAreadId).get(0);
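The highlighting step inside `CodeMirrorFilterUsers` above first escapes the query so it is safe inside a `RegExp`, then wraps case-insensitive matches in `<strong>` tags for the hint dropdown. Isolated here for illustration:

```javascript
// Escape regex metacharacters, as in the hoisted helper above.
function escapeRegExChars(value) {
  return value.replace(/[\-\[\]\/\{\}\(\)\*\+\?\.\\\^\$\|]/g, "\\$&");
}

// Wrap every (case-insensitive) occurrence of `query` in <strong> tags.
function highlightMatch(displayText, query) {
  if (!query) {
    return displayText;
  }
  var pattern = '(' + escapeRegExChars(query) + ')';
  return displayText.replace(new RegExp(pattern, 'gi'), '<strong>$1</strong>');
}
```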
@@ -61,9 +233,6 b' var initCodeMirror = function(textAreadI'
61 233 var initCommentBoxCodeMirror = function(textAreaId, triggerActions){
62 234 var initialHeight = 100;
63 235
64 // global timer, used to cancel async loading
65 var loadUserHintTimer;
66
67 236 if (typeof userHintsCache === "undefined") {
68 237 userHintsCache = {};
69 238 cmLog.debug('Init empty cache for mentions');
@@ -72,96 +241,6 b' var initCommentBoxCodeMirror = function('
72 241 cmLog.debug('Element for textarea not found', textAreaId);
73 242 return;
74 243 }
75 var escapeRegExChars = function(value) {
76 return value.replace(/[\-\[\]\/\{\}\(\)\*\+\?\.\\\^\$\|]/g, "\\$&");
77 };
78 /**
79 * Load hints from external source returns an array of objects in a format
80 * that hinting lib requires
81 * @returns {Array}
82 */
83 var loadUserHints = function(query, triggerHints) {
84 cmLog.debug('Loading mentions users via AJAX');
85 var _users = [];
86 $.ajax({
87 type: 'GET',
88 data: {query: query},
89 url: pyroutes.url('user_autocomplete_data'),
90 headers: {'X-PARTIAL-XHR': true},
91 async: true
92 })
93 .done(function(data) {
94 var tmpl = '<img class="gravatar" src="{0}"/>{1}';
95 $.each(data.suggestions, function(i) {
96 var userObj = data.suggestions[i];
97
98 if (userObj.username !== "default") {
99 _users.push({
100 text: userObj.username + " ",
101 org_text: userObj.username,
102 displayText: userObj.value_display, // search that field
103 // internal caches
104 _icon_link: userObj.icon_link,
105 _text: userObj.value_display,
106
107 render: function(elt, data, completion) {
108 var el = document.createElement('div');
109 el.className = "CodeMirror-hint-entry";
110 el.innerHTML = tmpl.format(
111 completion._icon_link, completion._text);
112 elt.appendChild(el);
113 }
114 });
115 }
116 });
117 cmLog.debug('Mention users loaded');
118 // set to global cache
119 userHintsCache[query] = _users;
120 triggerHints(userHintsCache[query]);
121 })
122 .fail(function(data, textStatus, xhr) {
123 alert("error processing request: " + textStatus);
124 });
125 };
126
127 /**
128 * filters the results based on the current context
129 * @param users
130 * @param context
131 * @returns {Array}
132 */
133 var filterUsers = function(users, context) {
134 var MAX_LIMIT = 10;
135 var filtered_users = [];
136 var curWord = context.string;
137
138 cmLog.debug('Filtering users based on query:', curWord);
139 $.each(users, function(i) {
140 var match = users[i];
141 var searchText = match.displayText;
142
143 if (!curWord ||
144 searchText.toLowerCase().lastIndexOf(curWord) !== -1) {
145 // reset state
146 match._text = match.displayText;
147 if (curWord) {
148 // do highlighting
149 var pattern = '(' + escapeRegExChars(curWord) + ')';
150 match._text = searchText.replace(
151 new RegExp(pattern, 'gi'), '<strong>$1<\/strong>');
152 }
153
154 filtered_users.push(match);
155 }
156 // to not return to many results, use limit of filtered results
157 if (filtered_users.length > MAX_LIMIT) {
158 return false;
159 }
160 });
161
162 return filtered_users;
163 };
164
165 244 /**
166 245 * Filter action based on typed in text
167 246 * @param actions
@@ -200,25 +279,6 b' var initCommentBoxCodeMirror = function('
200 279 return filtered_actions;
201 280 };
202 281
203 var completeAfter = function(cm, pred) {
204 var options = {
205 completeSingle: false,
206 async: true,
207 closeOnUnfocus: true
208 };
209 var cur = cm.getCursor();
210 setTimeout(function() {
211 if (!cm.state.completionActive) {
212 cmLog.debug('Trigger mentions hinting');
213 CodeMirror.showHint(cm, CodeMirror.hint.mentions, options);
214 }
215 }, 100);
216
217 // tell CodeMirror we didn't handle the key
218 // trick to trigger on a char but still complete it
219 return CodeMirror.Pass;
220 };
221
222 282 var submitForm = function(cm, pred) {
223 283 $(cm.display.input.textarea.form).submit();
224 284 return CodeMirror.Pass;
@@ -238,7 +298,7 b' var initCommentBoxCodeMirror = function('
238 298 };
239 299
240 300 var extraKeys = {
241 "'@'": completeAfter,
301 "'@'": CodeMirrorCompleteAfter,
242 302 Tab: function(cm) {
243 303 // space indent instead of TABS
244 304 var spaces = new Array(cm.getOption("indentUnit") + 1).join(" ");
@@ -285,66 +345,6 b' var initCommentBoxCodeMirror = function('
285 345 self.setSize(null, height);
286 346 });
287 347
288 var mentionHint = function(editor, callback, options) {
289 var cur = editor.getCursor();
290 var curLine = editor.getLine(cur.line).slice(0, cur.ch);
291
292 // match on @ +1char
293 var tokenMatch = new RegExp(
294 '(^@| @)([a-zA-Z0-9]{1}[a-zA-Z0-9\-\_\.]*)$').exec(curLine);
295
296 var tokenStr = '';
297 if (tokenMatch !== null && tokenMatch.length > 0){
298 tokenStr = tokenMatch[0].strip();
299 } else {
300 // skip if we didn't match our token
301 return;
302 }
303
304 var context = {
305 start: (cur.ch - tokenStr.length) + 1,
306 end: cur.ch,
307 string: tokenStr.slice(1),
308 type: null
309 };
310
311 // case when we put the @sign in fron of a string,
312 // eg <@ we put it here>sometext then we need to prepend to text
313 if (context.end > cur.ch) {
314 context.start = context.start + 1; // we add to the @ sign
315 context.end = cur.ch; // don't eat front part just append
316 context.string = context.string.slice(1, cur.ch - context.start);
317 }
318
319 cmLog.debug('Mention context', context);
320
321 var triggerHints = function(userHints){
322 return callback({
323 list: filterUsers(userHints, context),
324 from: CodeMirror.Pos(cur.line, context.start),
325 to: CodeMirror.Pos(cur.line, context.end)
326 });
327 };
328
329 var queryBasedHintsCache = undefined;
330 // if we have something in the cache, try to fetch the query based cache
331 if (userHintsCache !== {}){
332 queryBasedHintsCache = userHintsCache[context.string];
333 }
334
335 if (queryBasedHintsCache !== undefined) {
336 cmLog.debug('Users loaded from cache');
337 triggerHints(queryBasedHintsCache);
338 } else {
339 // this takes care for async loading, and then displaying results
340 // and also propagates the userHintsCache
341 window.clearTimeout(loadUserHintTimer);
342 loadUserHintTimer = setTimeout(function() {
343 loadUserHints(context.string, triggerHints);
344 }, 300);
345 }
346 };
347
348 348 var actionHint = function(editor, options) {
349 349 var cur = editor.getCursor();
350 350 var curLine = editor.getLine(cur.line).slice(0, cur.ch);
@@ -408,7 +408,7 b' var initCommentBoxCodeMirror = function('
408 408 to: CodeMirror.Pos(cur.line, context.end)
409 409 };
410 410 };
411 CodeMirror.registerHelper("hint", "mentions", mentionHint);
411 CodeMirror.registerHelper("hint", "mentions", CodeMirrorMentionHint);
412 412 CodeMirror.registerHelper("hint", "actions", actionHint);
413 413 return cm;
414 414 };
@@ -323,7 +323,7 b' var bindToggleButtons = function() {'
323 323 };
324 324
325 325 var linkifyComments = function(comments) {
326
 326 /* TODO: dan: remove this - it should no longer be needed */
327 327 for (var i = 0; i < comments.length; i++) {
328 328 var comment_id = $(comments[i]).data('comment-id');
329 329 var prev_comment_id = $(comments[i - 1]).data('comment-id');
@@ -347,7 +347,7 b' var linkifyComments = function(comments)'
347 347 }
348 348
349 349 };
350
350
351 351 /**
352 352 * Iterates over all the inlines, and places them inside proper blocks of data
353 353 */
@@ -670,3 +670,226 b' var CommentForm = (function() {'
670 670
671 671 return CommentForm;
672 672 })();
673
674 var CommentsController = function() { /* comments controller */
675 var self = this;
676
677 this.cancelComment = function(node) {
678 var $node = $(node);
679 var $td = $node.closest('td');
680 $node.closest('.comment-inline-form').removeClass('comment-inline-form-open');
681 return false;
682 }
683 this.getLineNumber = function(node) {
684 var $node = $(node);
685 return $node.closest('td').attr('data-line-number');
686 }
687 this.scrollToComment = function(node, offset) {
688 if (!node) {
689 node = $('.comment-selected');
690 if (!node.length) {
 691 node = $('.comment-current');
692 }
693 }
 694 var $comment = $(node).closest('.comment-current');
 695 var $comments = $('.comment-current');
696
697 $('.comment-selected').removeClass('comment-selected');
698
699 var nextIdx = $('.comment-current').index($comment) + offset;
700 if (nextIdx >= $comments.length) {
701 nextIdx = 0;
702 }
703 var $next = $('.comment-current').eq(nextIdx);
704 var $cb = $next.closest('.cb');
705 $cb.removeClass('cb-collapsed')
706
707 var $filediffCollapseState = $cb.closest('.filediff').prev();
708 $filediffCollapseState.prop('checked', false);
709 $next.addClass('comment-selected');
710 scrollToElement($next);
711 return false;
712 }
713 this.nextComment = function(node) {
714 return self.scrollToComment(node, 1);
715 }
716 this.prevComment = function(node) {
717 return self.scrollToComment(node, -1);
718 }
719 this.deleteComment = function(node) {
720 if (!confirm(_gettext('Delete this comment?'))) {
721 return false;
722 }
723 var $node = $(node);
724 var $td = $node.closest('td');
725 var $comment = $node.closest('.comment');
726 var comment_id = $comment.attr('data-comment-id');
727 var url = AJAX_COMMENT_DELETE_URL.replace('__COMMENT_ID__', comment_id);
728 var postData = {
729 '_method': 'delete',
730 'csrf_token': CSRF_TOKEN
731 };
732
733 $comment.addClass('comment-deleting');
734 $comment.hide('fast');
735
736 var success = function(response) {
737 $comment.remove();
738 return false;
739 };
740 var failure = function(data, textStatus, xhr) {
741 alert("error processing request: " + textStatus);
742 $comment.show('fast');
743 $comment.removeClass('comment-deleting');
744 return false;
745 };
746 ajaxPOST(url, postData, success, failure);
747 }
748 this.toggleComments = function(node, show) {
749 var $filediff = $(node).closest('.filediff');
750 if (show === true) {
751 $filediff.removeClass('hide-comments');
752 } else if (show === false) {
753 $filediff.find('.hide-line-comments').removeClass('hide-line-comments');
754 $filediff.addClass('hide-comments');
755 } else {
756 $filediff.find('.hide-line-comments').removeClass('hide-line-comments');
757 $filediff.toggleClass('hide-comments');
758 }
759 return false;
760 }
761 this.toggleLineComments = function(node) {
762 self.toggleComments(node, true);
763 var $node = $(node);
764 $node.closest('tr').toggleClass('hide-line-comments');
765 }
766 this.createComment = function(node) {
767 var $node = $(node);
768 var $td = $node.closest('td');
769 var $form = $td.find('.comment-inline-form');
770
771 if (!$form.length) {
772 var tmpl = $('#cb-comment-inline-form-template').html();
773 var $filediff = $node.closest('.filediff');
774 $filediff.removeClass('hide-comments');
775 var f_path = $filediff.attr('data-f-path');
776 var lineno = self.getLineNumber(node);
777 tmpl = tmpl.format(f_path, lineno);
778 $form = $(tmpl);
779
780 var $comments = $td.find('.inline-comments');
781 if (!$comments.length) {
782 $comments = $(
783 $('#cb-comments-inline-container-template').html());
784 $td.append($comments);
785 }
786
787 $td.find('.cb-comment-add-button').before($form);
788
789 var pullRequestId = templateContext.pull_request_data.pull_request_id;
790 var commitId = templateContext.commit_data.commit_id;
791 var _form = $form[0];
792 var commentForm = new CommentForm(_form, commitId, pullRequestId, lineno, false);
793 var cm = commentForm.getCmInstance();
794
795 // set a CUSTOM submit handler for inline comments.
796 commentForm.setHandleFormSubmit(function(o) {
797 var text = commentForm.cm.getValue();
798
799 if (text === "") {
800 return;
801 }
802
803 if (lineno === undefined) {
804 alert('missing line !');
805 return;
806 }
807 if (f_path === undefined) {
808 alert('missing file path !');
809 return;
810 }
811
812 var excludeCancelBtn = false;
813 var submitEvent = true;
814 commentForm.setActionButtonsDisabled(true, excludeCancelBtn, submitEvent);
815 commentForm.cm.setOption("readOnly", true);
816 var postData = {
817 'text': text,
818 'f_path': f_path,
819 'line': lineno,
820 'csrf_token': CSRF_TOKEN
821 };
822 var submitSuccessCallback = function(json_data) {
823 $form.remove();
824 try {
825 var html = json_data.rendered_text;
826 var lineno = json_data.line_no;
827 var target_id = json_data.target_id;
828
829 $comments.find('.cb-comment-add-button').before(html);
830
831 } catch (e) {
832 console.error(e);
833 }
834
835
 836 // re-trigger the linkification of next/prev navigation
837 linkifyComments($('.inline-comment-injected'));
838 timeagoActivate();
839 bindDeleteCommentButtons();
840 commentForm.setActionButtonsDisabled(false);
841
842 };
843 var submitFailCallback = function(){
844 commentForm.resetCommentFormState(text)
845 };
846 commentForm.submitAjaxPOST(
847 commentForm.submitUrl, postData, submitSuccessCallback, submitFailCallback);
848 });
849
850 setTimeout(function() {
851 // callbacks
852 if (cm !== undefined) {
853 cm.focus();
854 }
855 }, 10);
856
857 $.Topic('/ui/plugins/code/comment_form_built').prepareOrPublish({
858 form: _form,
859 parent: $td[0],
860 lineno: lineno,
861 f_path: f_path}
862 );
863 }
864
865 $form.addClass('comment-inline-form-open');
866 }
867
868 this.renderInlineComments = function(file_comments) {
869 show_add_button = typeof show_add_button !== 'undefined' ? show_add_button : true;
870
871 for (var i = 0; i < file_comments.length; i++) {
872 var box = file_comments[i];
873
874 var target_id = $(box).attr('target_id');
875
876 // actually comments with line numbers
877 var comments = box.children;
878
879 for (var j = 0; j < comments.length; j++) {
880 var data = {
881 'rendered_text': comments[j].outerHTML,
882 'line_no': $(comments[j]).attr('line'),
883 'target_id': target_id
884 };
885 }
886 }
887
888 // since order of injection is random, we're now re-iterating
889 // from correct order and filling in links
890 linkifyComments($('.inline-comment-injected'));
891 bindDeleteCommentButtons();
892 firefoxAnchorFix();
893 };
894
895 } No newline at end of file
@@ -32,7 +32,7 b' var removeReviewMember = function(review'
32 32 var obj = $('#reviewer_{0}_name'.format(reviewer_id));
33 33 obj.addClass('to-delete');
34 34 // now delete the input
35 $('#reviewer_{0}_input'.format(reviewer_id)).remove();
35 $('#reviewer_{0} input'.format(reviewer_id)).remove();
36 36 }
37 37 }
38 38 else{
@@ -40,23 +40,39 b' var removeReviewMember = function(review'
40 40 }
41 41 };
42 42
43 var addReviewMember = function(id,fname,lname,nname,gravatar_link){
43 var addReviewMember = function(id, fname, lname, nname, gravatar_link, reasons) {
44 44 var members = $('#review_members').get(0);
45 var reasons_html = '';
46 var reasons_inputs = '';
 47 reasons = reasons || [];
48 if (reasons) {
49 for (var i = 0; i < reasons.length; i++) {
50 reasons_html += '<div class="reviewer_reason">- {0}</div>'.format(reasons[i]);
51 reasons_inputs += '<input type="hidden" name="reason" value="' + escapeHtml(reasons[i]) + '">';
52 }
53 }
45 54 var tmpl = '<li id="reviewer_{2}">'+
55 '<input type="hidden" name="__start__" value="reviewer:mapping">'+
46 56 '<div class="reviewer_status">'+
47 57 '<div class="flag_status not_reviewed pull-left reviewer_member_status"></div>'+
48 58 '</div>'+
49 59 '<img alt="gravatar" class="gravatar" src="{0}"/>'+
50 60 '<span class="reviewer_name user">{1}</span>'+
51 '<input type="hidden" value="{2}" name="review_members" />'+
61 reasons_html +
62 '<input type="hidden" name="user_id" value="{2}">'+
63 '<input type="hidden" name="__start__" value="reasons:sequence">'+
64 '{3}'+
65 '<input type="hidden" name="__end__" value="reasons:sequence">'+
52 66 '<div class="reviewer_member_remove action_button" onclick="removeReviewMember({2})">' +
53 67 '<i class="icon-remove-sign"></i>'+
54 68 '</div>'+
55 69 '</div>'+
70 '<input type="hidden" name="__end__" value="reviewer:mapping">'+
56 71 '</li>' ;
72
57 73 var displayname = "{0} ({1} {2})".format(
58 74 nname, escapeHtml(fname), escapeHtml(lname));
59 var element = tmpl.format(gravatar_link,displayname,id);
75 var element = tmpl.format(gravatar_link,displayname,id,reasons_inputs);
60 76 // check if we don't have this ID already in
61 77 var ids = [];
62 78 var _els = $('#review_members li').toArray();
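The `__start__`/`__end__` hidden inputs added to the reviewer template above follow the marker convention used by the peppercorn form library to encode nested mappings and sequences in flat form data. A minimal hand-rolled sketch of how such markers decode on the server side (the real decoding is presumably done by peppercorn itself; this parser only illustrates the convention):

```python
def parse_markers(fields):
    """Decode flat (name, value) pairs that use __start__/__end__
    markers into nested dicts and lists, peppercorn-style."""
    it = iter(fields)

    def _parse(it, kind):
        # kind is 'mapping' or 'sequence', taken from the opening marker
        result = {} if kind == 'mapping' else []
        for name, value in it:
            if name == '__start__':
                # marker value looks like 'reviewer:mapping' or
                # 'reasons:sequence'
                key, subkind = value.split(':')
                sub = _parse(it, subkind)
                if kind == 'mapping':
                    result[key] = sub
                else:
                    result.append(sub)
            elif name == '__end__':
                return result  # close the current container
            elif kind == 'mapping':
                result[name] = value
            else:
                result.append(value)
        return result

    return _parse(it, 'mapping')


# The inputs generated by addReviewMember() above would decode to:
decoded = parse_markers([
    ('__start__', 'reviewer:mapping'),
    ('user_id', '5'),
    ('__start__', 'reasons:sequence'),
    ('reason', 'added manually'),
    ('__end__', 'reasons:sequence'),
    ('__end__', 'reviewer:mapping'),
])
print(decoded)
```

This yields `{'reviewer': {'user_id': '5', 'reasons': ['added manually']}}`, i.e. one structured reviewer entry per `<li>` instead of the flat `review_members` inputs the old code serialized.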
@@ -74,7 +90,11 b' var _updatePullRequest = function(repo_n'
74 90 var url = pyroutes.url(
75 91 'pullrequest_update',
76 92 {"repo_name": repo_name, "pull_request_id": pull_request_id});
77 postData.csrf_token = CSRF_TOKEN;
93 if (typeof postData === 'string' ) {
94 postData += '&csrf_token=' + CSRF_TOKEN;
95 } else {
96 postData.csrf_token = CSRF_TOKEN;
97 }
78 98 var success = function(o) {
79 99 window.location.reload();
80 100 };
@@ -83,17 +103,9 b' var _updatePullRequest = function(repo_n'
83 103
84 104 var updateReviewers = function(reviewers_ids, repo_name, pull_request_id){
85 105 if (reviewers_ids === undefined){
86 var reviewers_ids = [];
87 var ids = $('#review_members input').toArray();
88 for(var i=0; i<ids.length;i++){
89 var id = ids[i].value
90 reviewers_ids.push(id);
91 }
106 var postData = '_method=put&' + $('#reviewers input').serialize();
107 _updatePullRequest(repo_name, pull_request_id, postData);
92 108 }
93 var postData = {
94 '_method':'put',
95 'reviewers_ids': reviewers_ids};
96 _updatePullRequest(repo_name, pull_request_id, postData);
97 109 };
98 110
99 111 /**
@@ -197,8 +209,10 b' var ReviewerAutoComplete = function(inpu'
197 209 formatResult: autocompleteFormatResult,
198 210 lookupFilter: autocompleteFilterResult,
199 211 onSelect: function(suggestion, data){
212 var msg = _gettext('added manually by "{0}"');
213 var reasons = [msg.format(templateContext.rhodecode_user.username)];
200 214 addReviewMember(data.id, data.first_name, data.last_name,
201 data.username, data.icon_link);
215 data.username, data.icon_link, reasons);
202 216 $('#'+input_id).val('');
203 217 }
204 218 });
@@ -20,6 +20,9 b''
20 20 * turns objects into GET query string
21 21 */
22 22 var toQueryString = function(o) {
23 if(typeof o === 'string') {
24 return o;
25 }
23 26 if(typeof o !== 'object') {
24 27 return false;
25 28 }
@@ -33,18 +36,22 b' var toQueryString = function(o) {'
33 36 /**
34 37 * ajax call wrappers
35 38 */
36 var ajaxGET = function(url, success) {
39 var ajaxGET = function(url, success, failure) {
37 40 var sUrl = url;
38 41 var request = $.ajax({url: sUrl, headers: {'X-PARTIAL-XHR': true}})
39 42 .done(function(data){
40 43 success(data);
41 44 })
42 .fail(function(data, textStatus, xhr){
43 alert("error processing request: " + textStatus);
45 .fail(function(data, textStatus, xhr) {
46 if (failure) {
47 failure(data, textStatus, xhr);
48 } else {
49 alert("error processing request: " + textStatus);
50 }
44 51 });
45 52 return request;
46 53 };
47 var ajaxPOST = function(url,postData,success) {
54 var ajaxPOST = function(url, postData, success, failure) {
48 55 var sUrl = url;
49 56 var postData = toQueryString(postData);
50 57 var request = $.ajax({type: 'POST', data: postData, url: sUrl,
@@ -53,7 +60,11 b' var ajaxPOST = function(url,postData,suc'
53 60 success(data);
54 61 })
55 62 .fail(function(data, textStatus, xhr){
56 alert("error processing request: " + textStatus);
63 if (failure) {
64 failure(data, textStatus, xhr);
65 } else {
66 alert("error processing request: " + textStatus);
67 }
57 68 });
58 69 return request;
59 70 };
@@ -1,3 +1,4 b''
1 /__MAIN_APP__ - launched when rhodecode-app element is attached to DOM
1 2 /plugins/__REGISTER__ - launched after the onDomReady() code from rhodecode.js is executed
2 3 /ui/plugins/code/anchor_focus - launched when rc starts to scroll on load to anchor on PR/Codeview
3 4 /ui/plugins/code/comment_form_built - launched when injectInlineForm() is executed and the form object is created
@@ -5,3 +6,4 b''
5 6 /connection_controller/subscribe - subscribes user to new channels
6 7 /connection_controller/presence - receives presence change messages
7 8 /connection_controller/channel_update - receives channel states
9 /favicon/update - notify state change for favicon
@@ -15,7 +15,7 b' import logging'
15 15 import optparse
16 16 import os
17 17 import re
18 import subprocess
18 import subprocess32
19 19 import sys
20 20 import textwrap
21 21 import threading
@@ -32,9 +32,9 b' from rhodecode.lib.compat import kill'
32 32
33 33
34 34 def make_web_build_callback(filename):
35 p = subprocess.Popen('make web-build', shell=True,
36 stdout=subprocess.PIPE,
37 stderr=subprocess.PIPE,
35 p = subprocess32.Popen('make web-build', shell=True,
36 stdout=subprocess32.PIPE,
37 stderr=subprocess32.PIPE,
38 38 cwd=os.path.dirname(os.path.dirname(__file__)))
39 39 stdout, stderr = p.communicate()
40 40 stdout = ''.join(stdout)
@@ -658,7 +658,7 b' class RcServerCommand(object):'
658 658 try:
659 659 try:
660 660 _turn_sigterm_into_systemexit()
661 proc = subprocess.Popen(args, env=new_environ)
661 proc = subprocess32.Popen(args, env=new_environ)
662 662 exit_code = proc.wait()
663 663 proc = None
664 664 except KeyboardInterrupt:
@@ -19,14 +19,21 b''
19 19 # and proprietary license terms, please see https://rhodecode.com/licenses/
20 20
21 21
22 import logging
22 23 import pylons
24 import Queue
25 import subprocess32
23 26
24 27 from pyramid.i18n import get_localizer
25 28 from pyramid.threadlocal import get_current_request
29 from threading import Thread
26 30
27 31 from rhodecode.translation import _ as tsf
28 32
29 33
34 log = logging.getLogger(__name__)
35
36
30 37 def add_renderer_globals(event):
31 38 # Put pylons stuff into the context. This will be removed as soon as
32 39 # migration to pyramid is finished.
@@ -68,3 +75,82 b' def scan_repositories_if_enabled(event):'
68 75 if vcs_server_enabled and import_on_startup:
69 76 repositories = ScmModel().repo_scan(get_rhodecode_base_path())
70 77 repo2db_mapper(repositories, remove_obsolete=False)
78
79
80 class Subscriber(object):
81 """
82 Base class for subscribers to the pyramid event system.
83 """
84 def __call__(self, event):
85 self.run(event)
86
87 def run(self, event):
88 raise NotImplementedError('Subclass has to implement this.')
89
90
91 class AsyncSubscriber(Subscriber):
92 """
93 Subscriber that handles the execution of events in a separate task to not
94 block the execution of the code which triggers the event. It puts the
95 received events into a queue from which the worker process takes them in
96 order.
97 """
98 def __init__(self):
99 self._stop = False
100 self._eventq = Queue.Queue()
101 self._worker = self.create_worker()
102 self._worker.start()
103
104 def __call__(self, event):
105 self._eventq.put(event)
106
107 def create_worker(self):
108 worker = Thread(target=self.do_work)
109 worker.daemon = True
110 return worker
111
112 def stop_worker(self):
 113 self._stop = True
114 self._eventq.put(None)
115 self._worker.join()
116
117 def do_work(self):
118 while not self._stop:
119 event = self._eventq.get()
120 if event is not None:
121 self.run(event)
122
123
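A self-contained usage sketch of the queue-backed pattern above (a stand-in for the real class in `rhodecode.subscribers`, using a sentinel to shut the worker down):

```python
import queue
import threading


class AsyncSubscriber(object):
    """Queue events and handle them on a daemon worker thread, so the
    code that fires the event is never blocked by the handler."""

    def __init__(self, handler):
        self._eventq = queue.Queue()
        self._handler = handler
        self._worker = threading.Thread(target=self._work, daemon=True)
        self._worker.start()

    def __call__(self, event):
        self._eventq.put(event)  # returns immediately for the caller

    def stop(self):
        self._eventq.put(None)   # sentinel wakes the worker up
        self._worker.join()

    def _work(self):
        while True:
            event = self._eventq.get()
            if event is None:
                break
            self._handler(event)


seen = []
sub = AsyncSubscriber(seen.append)
for i in range(3):
    sub(i)
sub.stop()
print(seen)  # events are handled in FIFO order: [0, 1, 2]
```

Because the queue preserves arrival order and a single worker drains it, handlers run strictly in the order the events were published, which is the property the docstring above promises.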
124 class AsyncSubprocessSubscriber(AsyncSubscriber):
125 """
126 Subscriber that uses the subprocess32 module to execute a command if an
127 event is received. Events are handled asynchronously.
128 """
129
130 def __init__(self, cmd, timeout=None):
131 super(AsyncSubprocessSubscriber, self).__init__()
132 self._cmd = cmd
133 self._timeout = timeout
134
135 def run(self, event):
136 cmd = self._cmd
137 timeout = self._timeout
138 log.debug('Executing command %s.', cmd)
139
140 try:
141 output = subprocess32.check_output(
142 cmd, timeout=timeout, stderr=subprocess32.STDOUT)
143 log.debug('Command finished %s', cmd)
144 if output:
145 log.debug('Command output: %s', output)
146 except subprocess32.TimeoutExpired as e:
147 log.exception('Timeout while executing command.')
148 if e.output:
149 log.error('Command output: %s', e.output)
150 except subprocess32.CalledProcessError as e:
151 log.exception('Error while executing command.')
152 if e.output:
153 log.error('Command output: %s', e.output)
 154 except Exception:
155 log.exception(
156 'Exception while executing command %s.', cmd)
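The timeout and error handling in `AsyncSubprocessSubscriber.run` can be exercised with the standard `subprocess` module, of which `subprocess32` is the Python 2 backport; `reload_cmd` below is a hypothetical example command:

```python
import logging
import subprocess

log = logging.getLogger(__name__)


def run_reload_command(cmd, timeout=None):
    """Run cmd, logging its output; swallow failures so a broken reload
    command cannot take the event pipeline down with it."""
    try:
        output = subprocess.check_output(
            cmd, timeout=timeout, stderr=subprocess.STDOUT)
        log.debug('Command finished %s', cmd)
        if output:
            log.debug('Command output: %s', output)
        return True
    except subprocess.TimeoutExpired as e:
        log.exception('Timeout while executing command %s', cmd)
        if e.output:
            log.error('Command output: %s', e.output)
    except subprocess.CalledProcessError as e:
        log.exception('Error while executing command %s', cmd)
        if e.output:
            log.error('Command output: %s', e.output)
    except Exception:
        log.exception('Exception while executing command %s', cmd)
    return False


# reload_cmd is a placeholder; the real command comes from the
# svn.proxy.reload_cmd setting.
reload_cmd = ['true']
```

Returning a boolean instead of raising keeps the subscriber fire-and-forget: a failed Apache reload is logged but never propagates back into the request that changed the configuration.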
@@ -20,10 +20,16 b''
20 20
21 21 import logging
22 22 import os
23 import shlex
23 24
24 from rhodecode import events
25 from rhodecode.lib.utils2 import str2bool
25 # Do not use `from rhodecode import events` here, it will be overridden by the
26 # events module in this package due to pythons import mechanism.
27 from rhodecode.events import RepoGroupEvent
28 from rhodecode.subscribers import AsyncSubprocessSubscriber
29 from rhodecode.config.middleware import (
30 _bool_setting, _string_setting, _int_setting)
26 31
32 from .events import ModDavSvnConfigChange
27 33 from .subscribers import generate_config_subscriber
28 34 from . import config_keys
29 35
@@ -36,34 +42,42 b' def includeme(config):'
36 42 _sanitize_settings_and_apply_defaults(settings)
37 43
38 44 if settings[config_keys.generate_config]:
39 config.add_subscriber(
40 generate_config_subscriber, events.RepoGroupEvent)
45 # Add subscriber to generate the Apache mod dav svn configuration on
46 # repository group events.
47 config.add_subscriber(generate_config_subscriber, RepoGroupEvent)
48
49 # If a reload command is set add a subscriber to execute it on
50 # configuration changes.
51 reload_cmd = shlex.split(settings[config_keys.reload_command])
52 if reload_cmd:
53 reload_timeout = settings[config_keys.reload_timeout] or None
54 reload_subscriber = AsyncSubprocessSubscriber(
55 cmd=reload_cmd, timeout=reload_timeout)
56 config.add_subscriber(reload_subscriber, ModDavSvnConfigChange)
41 57
42 58
43 59 def _sanitize_settings_and_apply_defaults(settings):
44 60 """
45 61 Set defaults, convert to python types and validate settings.
46 62 """
47 # Convert bool settings from string to bool.
48 settings[config_keys.generate_config] = str2bool(
49 settings.get(config_keys.generate_config, 'false'))
50 settings[config_keys.list_parent_path] = str2bool(
51 settings.get(config_keys.list_parent_path, 'true'))
63 _bool_setting(settings, config_keys.generate_config, 'false')
64 _bool_setting(settings, config_keys.list_parent_path, 'true')
65 _int_setting(settings, config_keys.reload_timeout, 10)
66 _string_setting(settings, config_keys.config_file_path, '', lower=False)
67 _string_setting(settings, config_keys.location_root, '/', lower=False)
68 _string_setting(settings, config_keys.reload_command, '', lower=False)
52 69
53 # Set defaults if key not present.
54 settings.setdefault(config_keys.config_file_path, None)
55 settings.setdefault(config_keys.location_root, '/')
56 settings.setdefault(config_keys.parent_path_root, None)
70 # Convert negative timeout values to zero.
71 if settings[config_keys.reload_timeout] < 0:
72 settings[config_keys.reload_timeout] = 0
57 73
58 # Append path separator to paths.
74 # Append path separator to location root.
59 75 settings[config_keys.location_root] = _append_path_sep(
60 76 settings[config_keys.location_root])
61 settings[config_keys.parent_path_root] = _append_path_sep(
62 settings[config_keys.parent_path_root])
63 77
64 78 # Validate settings.
65 79 if settings[config_keys.generate_config]:
66 assert settings[config_keys.config_file_path] is not None
80 assert len(settings[config_keys.config_file_path]) > 0
67 81
68 82
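A minimal sketch of what helpers like `_bool_setting`, `_int_setting` and `_string_setting` presumably do (the real implementations live in `rhodecode.config.middleware`; the exact signatures and accepted truthy strings here are assumptions inferred from the call sites above):

```python
def _bool_setting(settings, name, default):
    # .ini values arrive as strings; normalize to real booleans in place.
    value = settings.get(name, default)
    if isinstance(value, str):
        value = value.lower() in ('true', 'yes', 'on', '1')
    settings[name] = value


def _int_setting(settings, name, default):
    settings[name] = int(settings.get(name, default))


def _string_setting(settings, name, default, lower=True):
    value = settings.get(name, default)
    settings[name] = value.lower() if lower else value


# Mirrors _sanitize_settings_and_apply_defaults() above, with
# hypothetical raw values as they would come out of the .ini file.
settings = {
    'svn.proxy.generate_config': 'true',
    'svn.proxy.reload_timeout': '-5',
}
_bool_setting(settings, 'svn.proxy.generate_config', 'false')
_int_setting(settings, 'svn.proxy.reload_timeout', 10)
_string_setting(settings, 'svn.proxy.reload_cmd', '', lower=False)

# Convert negative timeout values to zero, as includeme() expects.
if settings['svn.proxy.reload_timeout'] < 0:
    settings['svn.proxy.reload_timeout'] = 0
```

After sanitization every key is present with a usable Python type, which is why the validation at the end of the real function can simply assert on `len(...)` instead of checking for `None`.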
69 83 def _append_path_sep(path):
@@ -25,4 +25,5 b" config_file_path = 'svn.proxy.config_fil"
25 25 generate_config = 'svn.proxy.generate_config'
26 26 list_parent_path = 'svn.proxy.list_parent_path'
27 27 location_root = 'svn.proxy.location_root'
28 parent_path_root = 'svn.proxy.parent_path_root'
28 reload_command = 'svn.proxy.reload_cmd'
29 reload_timeout = 'svn.proxy.reload_timeout'
@@ -18,14 +18,23 b''
18 18 # RhodeCode Enterprise Edition, including its added features, Support services,
19 19 # and proprietary license terms, please see https://rhodecode.com/licenses/
20 20
21 import logging
22
21 23
22 24 from .utils import generate_mod_dav_svn_config
23 25
24 26
27 log = logging.getLogger(__name__)
28
29
25 30 def generate_config_subscriber(event):
26 31 """
 27 32 Subscriber to the `rhodecode.events.RepoGroupEvent`. This triggers the
28 33 automatic generation of mod_dav_svn config file on repository group
29 34 changes.
30 35 """
31 generate_mod_dav_svn_config(event.request.registry.settings)
36 try:
37 generate_mod_dav_svn_config(event.request.registry)
38 except Exception:
39 log.exception(
40 'Exception while generating subversion mod_dav_svn configuration.')
@@ -36,6 +36,9 b''
36 36 # After changing this a stop and start of Apache is required (using restart
37 37 # doesn't work).
38 38
 39 # Fix https -> http downgrade with DAV. It requires a header downgrade for
 40 # the https -> http reverse proxy to work properly.
41 RequestHeader edit Destination ^https: http: early
39 42
40 43 <Location "${location_root|n}">
41 44 # The mod_dav_svn module takes the username from the apache request object.
@@ -19,15 +19,13 b''
19 19 # and proprietary license terms, please see https://rhodecode.com/licenses/
20 20
21 21
22 import codecs
23 22 import mock
23 import pytest
24 24 import re
25 import shutil
26 import tempfile
27 25
28 26 from pyramid import testing
29 27
30 from rhodecode.svn_support import config_keys, utils
28 from rhodecode.svn_support import utils
31 29
32 30
33 31 class TestModDavSvnConfig(object):
@@ -38,30 +36,12 b' class TestModDavSvnConfig(object):'
38 36 config = testing.setUp()
39 37 config.include('pyramid_mako')
40 38
41 # Temporary directory holding the generated config files.
42 cls.tempdir = tempfile.mkdtemp(suffix='pytest-mod-dav-svn')
43
44 39 cls.location_root = u'/location/root/çµäö'
45 40 cls.parent_path_root = u'/parent/path/çµäö'
46 cls._dummy_realm = u'Dummy Realm (äöüçµ)'
41 cls.realm = u'Dummy Realm (äöüçµ)'
47 42
48 43 @classmethod
49 def teardown_class(cls):
50 testing.tearDown()
51 shutil.rmtree(cls.tempdir, ignore_errors=True)
52
53 @classmethod
54 def get_settings(cls):
55 config_file_path = tempfile.mkstemp(
56 suffix='mod-dav-svn.conf', dir=cls.tempdir)[1]
57 return {
58 config_keys.config_file_path: config_file_path,
59 config_keys.location_root: cls.location_root,
60 config_keys.list_parent_path: True,
61 }
62
63 @classmethod
64 def get_repo_groups(cls, count=1):
44 def get_repo_group_mocks(cls, count=1):
65 45 repo_groups = []
66 46 for num in range(0, count):
67 47 full_path = u'/path/to/RepöGröúp-°µ {}'.format(num)
@@ -81,71 +61,37 b' class TestModDavSvnConfig(object):'
81 61 location=self.location_root, group_path=group_path)
82 62 assert len(re.findall(pattern, config)) == 1
83 63
84 @mock.patch('rhodecode.svn_support.utils.get_rhodecode_realm')
85 @mock.patch('rhodecode.svn_support.utils.RepoGroup')
86 def test_generate_mod_dav_svn_config(self, RepoGroupMock, GetRealmMock):
87 # Setup mock objects.
88 GetRealmMock.return_value = self._dummy_realm
89 num_groups = 3
90 RepoGroupMock.get_all_repo_groups.return_value = self.get_repo_groups(
91 count=num_groups)
92
93 # Execute the method under test.
94 settings = self.get_settings()
95 utils.generate_mod_dav_svn_config(
96 settings=settings, parent_path_root=self.parent_path_root)
97
98 # Read generated file.
99 path = settings[config_keys.config_file_path]
100 with codecs.open(path, 'r', encoding='utf-8') as f:
101 content = f.read()
102
64 def test_render_mod_dav_svn_config(self):
65 repo_groups = self.get_repo_group_mocks(count=10)
66 generated_config = utils._render_mod_dav_svn_config(
67 parent_path_root=self.parent_path_root,
68 list_parent_path=True,
69 location_root=self.location_root,
70 repo_groups=repo_groups,
71 realm=self.realm
72 )
103 73 # Assert that one location directive exists for each repository group.
104 for group in self.get_repo_groups(count=num_groups):
105 self.assert_group_location_directive(content, group.full_path)
74 for group in repo_groups:
75 self.assert_group_location_directive(
76 generated_config, group.full_path)
106 77
107 78 # Assert that the root location directive exists.
108 self.assert_root_location_directive(content)
109
110 @mock.patch('rhodecode.svn_support.utils.get_rhodecode_realm')
111 @mock.patch('rhodecode.svn_support.utils.RepoGroup')
112 def test_list_parent_path_on(self, RepoGroupMock, GetRealmMock):
113 # Setup mock objects.
114 GetRealmMock.return_value = self._dummy_realm
115 RepoGroupMock.get_all_repo_groups.return_value = self.get_repo_groups()
116
117 # Execute the method under test.
118 settings = self.get_settings()
119 settings[config_keys.list_parent_path] = True
120 utils.generate_mod_dav_svn_config(
121 settings=settings, parent_path_root=self.parent_path_root)
122
123 # Read generated file.
124 path = settings[config_keys.config_file_path]
125 with codecs.open(path, 'r', encoding='utf-8') as f:
126 content = f.read()
79 self.assert_root_location_directive(generated_config)
127 80
128 # Make assertions.
129 assert not re.search('SVNListParentPath\s+Off', content)
130 assert re.search('SVNListParentPath\s+On', content)
131
132 @mock.patch('rhodecode.svn_support.utils.get_rhodecode_realm')
133 @mock.patch('rhodecode.svn_support.utils.RepoGroup')
134 def test_list_parent_path_off(self, RepoGroupMock, GetRealmMock):
135 # Setup mock objects.
136 GetRealmMock.return_value = self._dummy_realm
137 RepoGroupMock.get_all_repo_groups.return_value = self.get_repo_groups()
81 @pytest.mark.parametrize('list_parent_path', [True, False])
82 def test_list_parent_path(self, list_parent_path):
83 generated_config = utils._render_mod_dav_svn_config(
84 parent_path_root=self.parent_path_root,
85 list_parent_path=list_parent_path,
86 location_root=self.location_root,
87 repo_groups=self.get_repo_group_mocks(count=10),
88 realm=self.realm
89 )
138 90
139 # Execute the method under test.
140 settings = self.get_settings()
141 settings[config_keys.list_parent_path] = False
142 utils.generate_mod_dav_svn_config(
143 settings=settings, parent_path_root=self.parent_path_root)
144
145 # Read generated file.
146 with open(settings[config_keys.config_file_path], 'r') as file_:
147 content = file_.read()
148
149 # Make assertions.
150 assert re.search('SVNListParentPath\s+Off', content)
151 assert not re.search('SVNListParentPath\s+On', content)
91 # Assert that the correct configuration directive is present.
92 if list_parent_path:
93 assert not re.search(r'SVNListParentPath\s+Off', generated_config)
94 assert re.search(r'SVNListParentPath\s+On', generated_config)
95 else:
96 assert re.search(r'SVNListParentPath\s+Off', generated_config)
97 assert not re.search(r'SVNListParentPath\s+On', generated_config)
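The hunk above folds two near-duplicate `list_parent_path` on/off tests into one parametrized test. A minimal, self-contained sketch of the same pattern — `render_mod_dav_svn_config` here is a toy stand-in for RhodeCode's real `utils._render_mod_dav_svn_config`, not its actual implementation:

```python
import re

def render_mod_dav_svn_config(list_parent_path):
    # Toy stand-in: emit only the directive the assertions inspect.
    directive = 'On' if list_parent_path else 'Off'
    return ('<Location /_server/>\n'
            '  SVNListParentPath %s\n'
            '</Location>\n' % directive)

def check_list_parent_path(list_parent_path):
    # Mirrors the parametrized assertions from the diff: exactly one of
    # "On"/"Off" must appear for a given flag value.
    config = render_mod_dav_svn_config(list_parent_path)
    if list_parent_path:
        assert re.search(r'SVNListParentPath\s+On', config)
        assert not re.search(r'SVNListParentPath\s+Off', config)
    else:
        assert re.search(r'SVNListParentPath\s+Off', config)
        assert not re.search(r'SVNListParentPath\s+On', config)

for flag in (True, False):
    check_list_parent_path(flag)
```

With pytest, the loop at the bottom would instead be `@pytest.mark.parametrize('list_parent_path', [True, False])` on the check function, exactly as the diff does.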
@@ -18,36 +18,44 b''
18 18 # RhodeCode Enterprise Edition, including its added features, Support services,
19 19 # and proprietary license terms, please see https://rhodecode.com/licenses/
20 20
21 import codecs
21 22 import logging
22 23 import os
23
24 24 from pyramid.renderers import render
25 25
26 from rhodecode.lib.utils import get_rhodecode_realm
26 from rhodecode.events import trigger
27 from rhodecode.lib.utils import get_rhodecode_realm, get_rhodecode_base_path
27 28 from rhodecode.model.db import RepoGroup
29
28 30 from . import config_keys
31 from .events import ModDavSvnConfigChange
29 32
30 33
31 34 log = logging.getLogger(__name__)
32 35
33 36
34 def generate_mod_dav_svn_config(settings):
37 def generate_mod_dav_svn_config(registry):
35 38 """
36 39 Generate the configuration file for use with subversion's mod_dav_svn
37 40 module. The configuration has to contain a <Location> block for each
38 41 available repository group because the mod_dav_svn module does not support
39 42 repositories organized in sub folders.
40 43 """
44 settings = registry.settings
41 45 config = _render_mod_dav_svn_config(
42 settings[config_keys.parent_path_root],
43 settings[config_keys.list_parent_path],
44 settings[config_keys.location_root],
45 RepoGroup.get_all_repo_groups())
46 parent_path_root=get_rhodecode_base_path(),
47 list_parent_path=settings[config_keys.list_parent_path],
48 location_root=settings[config_keys.location_root],
49 repo_groups=RepoGroup.get_all_repo_groups(),
50 realm=get_rhodecode_realm())
46 51 _write_mod_dav_svn_config(config, settings[config_keys.config_file_path])
47 52
53 # Trigger an event on mod dav svn configuration change.
54 trigger(ModDavSvnConfigChange(), registry)
55
48 56
49 57 def _render_mod_dav_svn_config(
50 parent_path_root, list_parent_path, location_root, repo_groups):
58 parent_path_root, list_parent_path, location_root, repo_groups, realm):
51 59 """
52 60 Render mod_dav_svn configuration to string.
53 61 """
@@ -63,7 +71,7 b' def _render_mod_dav_svn_config('
63 71 'parent_path_root': parent_path_root,
64 72 'repo_group_paths': repo_group_paths,
65 73 'svn_list_parent_path': list_parent_path,
66 'rhodecode_realm': get_rhodecode_realm(),
74 'rhodecode_realm': realm,
67 75 }
68 76
69 77 # Render the configuration template to string.
@@ -73,11 +81,7 b' def _render_mod_dav_svn_config('
73 81
74 82 def _write_mod_dav_svn_config(config, filepath):
75 83 """
76 Write mod_dav_svn config to file. Log on exceptions but do not raise.
84 Write mod_dav_svn config to file.
77 85 """
78 try:
79 with open(filepath, 'w') as file_:
80 file_.write(config)
81 except Exception:
82 log.exception(
83 'Can not write mod_dav_svn configuration to "%s"', filepath)
86 with codecs.open(filepath, 'w', encoding='utf-8') as f:
87 f.write(config)
@@ -24,12 +24,6 b''
24 24 </div>
25 25
26 26 <div class="table">
27 <div id="edit_error" class="flash_msg" style="display:none;">
28 <div class="alert alert-warning">
29 ${h.literal(_('Gist was updated since you started editing. Copy your changes and click %(here)s to reload the new version.')
30 % {'here': h.link_to('here',h.url('edit_gist', gist_id=c.gist.gist_access_id))})}
31 </div>
32 </div>
33 27
34 28 <div id="files_data">
35 29 ${h.secure_form(h.url('edit_gist', gist_id=c.gist.gist_access_id), method='post', id='eform')}
@@ -115,7 +109,6 b''
115 109 <script>
116 110 $('#update').on('click', function(e){
117 111 e.preventDefault();
118
119 112 // check for newer version.
120 113 $.ajax({
121 114 url: "${h.url('edit_gist_check_revision', gist_id=c.gist.gist_access_id)}",
@@ -126,8 +119,11 b''
126 119 type: 'GET',
127 120 success: function(data) {
128 121 if(data.success === false){
129 $('#edit_error').show();
130 window.scrollTo(0,0);
122 var message = '${h.literal(_('Gist was updated since you started editing. Copy your changes and click %(here)s to reload the new version.')
123 % {'here': h.link_to('here',h.url('edit_gist', gist_id=c.gist.gist_access_id))})}';
124 var alertMessage = [{"message": {
125 "message": message, "force": "true", "level": "warning"}}];
126 $.Topic('/notifications').publish(alertMessage[0]);
131 127 }
132 128 else{
133 129 $('#eform').submit();
@@ -53,23 +53,6 b''
53 53 $('#live-notifications')[0].checked = event.detail.response;
54 54 };
55 55
56 function checkBrowserStatus(){
57 var browserStatus = 'Unknown';
58
59 if (!("Notification" in window)) {
60 browserStatus = 'Not supported'
61 }
62 else if(Notification.permission === 'denied'){
63 browserStatus = 'Denied';
64 $('.flash_msg').append('<div class="alert alert-error">Notifications are blocked on browser level - you need to enable them in your browser settings.</div>')
65 }
66 else if(Notification.permission === 'granted'){
67 browserStatus = 'Allowed';
68 }
69
70 $('#browser-notification-status').text(browserStatus);
71 }
72
73 56 ctrlr.testNotifications = function(event){
74 57 var levels = ['info', 'error', 'warning', 'success'];
75 58 var level = levels[Math.floor(Math.random()*levels.length)];
@@ -11,123 +11,12 b''
11 11 </div>
12 12
13 13 <div class="panel panel-default">
14 <div class="panel-heading">
15 <h3 class="panel-title">${_('Pull Requests You Opened')}: ${len(c.my_pull_requests)}</h3>
16 </div>
17 <div class="panel-body">
18 <div class="pullrequestlist">
19 %if c.my_pull_requests:
20 <table class="rctable">
21 <thead>
22 <th class="td-status"></th>
23 <th>${_('Target Repo')}</th>
24 <th>${_('Author')}</th>
25 <th></th>
26 <th>${_('Title')}</th>
27 <th class="td-time">${_('Last Update')}</th>
28 <th></th>
29 </thead>
30 %for pull_request in c.my_pull_requests:
31 <tr class="${'closed' if pull_request.is_closed() else ''} prwrapper">
32 <td class="td-status">
33 <div class="${'flag_status %s' % pull_request.calculated_review_status()} pull-left"></div>
34 </td>
35 <td class="truncate-wrap td-componentname">
36 <div class="truncate">
37 ${h.link_to(pull_request.target_repo.repo_name,h.url('summary_home',repo_name=pull_request.target_repo.repo_name))}
38 </div>
39 </td>
40 <td class="user">
41 ${base.gravatar_with_user(pull_request.author.email, 16)}
42 </td>
43 <td class="td-message expand_commit" data-pr-id="m${pull_request.pull_request_id}" title="${_('Expand commit message')}">
44 <div class="show_more_col">
45 <i class="show_more"></i>&nbsp;
46 </div>
47 </td>
48 <td class="mid td-description">
49 <div class="log-container truncate-wrap">
50 <div class="message truncate" id="c-m${pull_request.pull_request_id}"><a href="${h.url('pullrequest_show',repo_name=pull_request.target_repo.repo_name,pull_request_id=pull_request.pull_request_id)}">#${pull_request.pull_request_id}: ${pull_request.title}</a>\
51 %if pull_request.is_closed():
52 &nbsp;(${_('Closed')})\
53 %endif
54 <br/>${pull_request.description}</div>
55 </div>
56 </td>
57 <td class="td-time">
58 ${h.age_component(pull_request.updated_on)}
59 </td>
60 <td class="td-action repolist_actions">
61 ${h.secure_form(url('pullrequest_delete', repo_name=pull_request.target_repo.repo_name, pull_request_id=pull_request.pull_request_id),method='delete')}
62 ${h.submit('remove_%s' % pull_request.pull_request_id, _('Delete'),
63 class_="btn btn-link btn-danger",onclick="return confirm('"+_('Confirm to delete this pull request')+"');")}
64 ${h.end_form()}
65 </td>
66 </tr>
67 %endfor
68 </table>
69 %else:
70 <h2><span class="empty_data">${_('You currently have no open pull requests.')}</span></h2>
71 %endif
14 <div class="panel-heading">
15 <h3 class="panel-title">${_('Pull Requests You Participate In')}: ${c.records_total_participate}</h3>
72 16 </div>
73 </div>
74 </div>
75
76 <div class="panel panel-default">
77 <div class="panel-heading">
78 <h3 class="panel-title">${_('Pull Requests You Participate In')}: ${len(c.participate_in_pull_requests)}</h3>
79 </div>
80
81 <div class="panel-body">
82 <div class="pullrequestlist">
83 %if c.participate_in_pull_requests:
84 <table class="rctable">
85 <thead>
86 <th class="td-status"></th>
87 <th>${_('Target Repo')}</th>
88 <th>${_('Author')}</th>
89 <th></th>
90 <th>${_('Title')}</th>
91 <th class="td-time">${_('Last Update')}</th>
92 </thead>
93 %for pull_request in c.participate_in_pull_requests:
94 <tr class="${'closed' if pull_request.is_closed() else ''} prwrapper">
95 <td class="td-status">
96 <div class="${'flag_status %s' % pull_request.calculated_review_status()} pull-left"></div>
97 </td>
98 <td class="truncate-wrap td-componentname">
99 <div class="truncate">
100 ${h.link_to(pull_request.target_repo.repo_name,h.url('summary_home',repo_name=pull_request.target_repo.repo_name))}
101 </div>
102 </td>
103 <td class="user">
104 ${base.gravatar_with_user(pull_request.author.email, 16)}
105 </td>
106 <td class="td-message expand_commit" data-pr-id="p${pull_request.pull_request_id}" title="${_('Expand commit message')}">
107 <div class="show_more_col">
108 <i class="show_more"></i>&nbsp;
109 </div>
110 </td>
111 <td class="mid td-description">
112 <div class="log-container truncate-wrap">
113 <div class="message truncate" id="c-p${pull_request.pull_request_id}"><a href="${h.url('pullrequest_show',repo_name=pull_request.target_repo.repo_name,pull_request_id=pull_request.pull_request_id)}">#${pull_request.pull_request_id}: ${pull_request.title}</a>\
114 %if pull_request.is_closed():
115 &nbsp;(${_('Closed')})\
116 %endif
117 <br/>${pull_request.description}</div>
118 </div>
119 </td>
120 <td class="td-time">
121 ${h.age_component(pull_request.updated_on)}
122 </td>
123 </tr>
124 %endfor
125 </table>
126 %else:
127 <h2 class="empty_data">${_('There are currently no open pull requests requiring your participation.')}</h2>
128 %endif
17 <div class="panel-body">
18 <table id="pull_request_list_table_participate" class="display"></table>
129 19 </div>
130 </div>
131 20 </div>
132 21
133 22 <script>
@@ -139,17 +28,51 b''
139 28 window.location = "${h.url('my_account_pullrequests')}";
140 29 }
141 30 });
142 $('.expand_commit').on('click',function(e){
143 var target_expand = $(this);
144 var cid = target_expand.data('prId');
31 $(document).ready(function() {
32
33 var columnsDefs = [
34 { data: {"_": "status",
35 "sort": "status"}, title: "", className: "td-status", orderable: false},
36 { data: {"_": "target_repo",
37 "sort": "target_repo"}, title: "${_('Target Repo')}", className: "td-targetrepo", orderable: false},
38 { data: {"_": "name",
39 "sort": "name_raw"}, title: "${_('Name')}", className: "td-componentname", "type": "num" },
40 { data: {"_": "author",
41 "sort": "author_raw"}, title: "${_('Author')}", className: "td-user", orderable: false },
42 { data: {"_": "title",
43 "sort": "title"}, title: "${_('Title')}", className: "td-description" },
44 { data: {"_": "comments",
45 "sort": "comments_raw"}, title: "", className: "td-comments", orderable: false},
46 { data: {"_": "updated_on",
47 "sort": "updated_on_raw"}, title: "${_('Last Update')}", className: "td-time" }
48 ];
145 49
146 if (target_expand.hasClass('open')){
147 $('#c-'+cid).css({'height': '2.75em', 'text-overflow': 'ellipsis', 'overflow':'hidden'});
148 target_expand.removeClass('open');
149 }
150 else {
151 $('#c-'+cid).css({'height': 'auto', 'text-overflow': 'initial', 'overflow':'visible'});
152 target_expand.addClass('open');
153 }
50 // participating object list
51 $('#pull_request_list_table_participate').DataTable({
52 data: ${c.data_participate|n},
53 processing: true,
54 serverSide: true,
55 deferLoading: ${c.records_total_participate},
56 ajax: "",
57 dom: 'tp',
58 pageLength: ${c.visual.dashboard_items},
59 order: [[ 2, "desc" ]],
60 columns: columnsDefs,
61 language: {
62 paginate: DEFAULT_GRID_PAGINATION,
63 emptyTable: _gettext("There are currently no open pull requests requiring your participation.")
64 },
65 "drawCallback": function( settings, json ) {
66 timeagoActivate();
67 },
68 "createdRow": function ( row, data, index ) {
69 if (data['closed']) {
70 $(row).addClass('closed');
71 }
72 if (data['owned']) {
73 $(row).addClass('owned');
74 }
75 }
76 });
154 77 });
155 78 </script>
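The template above now renders the "participate" list through a server-fed DataTable instead of a Mako loop. The row payload the `columnsDefs` mapping expects can be sketched like this — key names are taken from the `data` entries above (`*_raw` keys drive sorting, plain keys drive display), but the helper and its values are illustrative, not RhodeCode's real serializer:

```python
def pull_request_row(pr_id, target_repo, title, author, updated_on,
                     comments=0, closed=False, owned=False):
    # Hypothetical serializer producing one DataTables row; field names
    # mirror columnsDefs, "closed"/"owned" feed the createdRow callback
    # that adds the matching CSS classes.
    return {
        'status': '',  # review status flag, rendered client-side
        'target_repo': target_repo,
        'name': '#%d' % pr_id,
        'name_raw': pr_id,
        'author': author,
        'author_raw': author,
        'title': title,
        'comments': str(comments),
        'comments_raw': comments,
        'updated_on': updated_on,
        'updated_on_raw': updated_on,
        'closed': closed,
        'owned': owned,
    }

row = pull_request_row(42, 'group/repo', 'Fix login redirect', 'jdoe',
                       '2016-11-01', comments=3)
```

`order: [[2, "desc"]]` in the template sorts on column index 2, i.e. `name_raw`, which is why the numeric id is carried separately from the displayed `#42` string.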
@@ -29,6 +29,15 b''
29 29 </div>
30 30
31 31 <div class="field">
32 <div class="label label-select">
33 <label for="default_password_reset">${_('Password Reset')}:</label>
34 </div>
35 <div class="select">
36 ${h.select('default_password_reset','',c.password_reset_choices)}
37 </div>
38 </div>
39
40 <div class="field">
32 41 <div class="label label-textarea">
33 42 <label for="default_register_message">${_('Registration Page Message')}:</label>
34 43 </div>
@@ -66,6 +75,7 b''
66 75 };
67 76
68 77 $("#default_register").select2(select2Options);
78 $("#default_password_reset").select2(select2Options);
69 79 $("#default_extern_activate").select2(select2Options);
70 80 });
71 81 </script>
@@ -4,6 +4,7 b''
4 4 elems = [
5 5 (_('Owner'), lambda:base.gravatar_with_user(c.repo_group.user.email), '', ''),
6 6 (_('Created on'), h.format_date(c.repo_group.created_on), '', ''),
7 (_('Is Personal Group'), c.repo_group.personal or False, '', ''),
7 8
8 9 (_('Total repositories'), c.repo_group.repositories_recursive_count, '', ''),
9 10 (_('Top level repositories'), c.repo_group.repositories.count(), '', c.repo_group.repositories.all()),
@@ -42,9 +42,11 b''
42 42 </div>
43 43 <div class="select">
44 44 ${h.select('repo_group',request.GET.get('parent_group'),c.repo_groups,class_="medium")}
45 %if c.personal_repo_group:
46 <a style="padding: 4px" href="#" id="select_my_group" data-personal-group-id="${c.personal_repo_group.group_id}">${_('Select my personal group (%(repo_group_name)s)') % {'repo_group_name': c.personal_repo_group.group_name}}</a>
47 %endif
45 % if c.personal_repo_group:
46 <a class="btn" href="#" id="select_my_group" data-personal-group-id="${c.personal_repo_group.group_id}">
47 ${_('Select my personal group (%(repo_group_name)s)') % {'repo_group_name': c.personal_repo_group.group_name}}
48 </a>
49 % endif
48 50 <span class="help-block">${_('Optionally select a group to put this repository into.')}</span>
49 51 </div>
50 52 </div>
@@ -1,10 +1,6 b''
1 1 ## -*- coding: utf-8 -*-
2 2 <%inherit file="/base/base.html"/>
3 3
4 ## don't trigger flash messages on this page
5 <%def name="flash_msg()">
6 </%def>
7
8 4 <%def name="title()">
9 5 ${_('%s Creating repository') % c.repo_name}
10 6 %if c.rhodecode_name:
@@ -71,6 +71,22 b''
71 71 <li class="${'active' if c.active=='integrations' else ''}">
72 72 <a href="${h.route_path('repo_integrations_home', repo_name=c.repo_name)}">${_('Integrations')}</a>
73 73 </li>
74 ## TODO: dan: replace repo navigation with navlist registry like with
75 ## admin menu. First we must find a way to allow runtime configuration
76 ## of it, to account for the c.repo_info.repo_type != 'svn' call above
77 <%
78 reviewer_settings = False
79 try:
80 import rc_reviewers
81 reviewer_settings = True
82 except ImportError:
83 pass
84 %>
85 %if reviewer_settings:
86 <li class="${'active' if c.active=='reviewers' else ''}">
87 <a href="${h.route_path('repo_reviewers_home', repo_name=c.repo_name)}">${_('Reviewers')}</a>
88 </li>
89 %endif
74 90 </ul>
75 91 </div>
76 92
@@ -57,9 +57,11 b''
57 57 </div>
58 58 <div class="select">
59 59 ${h.select('repo_group','',c.repo_groups,class_="medium")}
60 %if c.personal_repo_group:
61 <a style="padding: 4px" href="#" id="select_my_group" data-personal-group-id="${c.personal_repo_group.group_id}">${_('Select my personal group (%(repo_group_name)s)') % {'repo_group_name': c.personal_repo_group.group_name}}</a>
62 %endif
60 % if c.personal_repo_group:
61 <a class="btn" href="#" id="select_my_group" data-personal-group-id="${c.personal_repo_group.group_id}">
62 ${_('Select my personal group (%(repo_group_name)s)') % {'repo_group_name': c.personal_repo_group.group_name}}
63 </a>
64 % endif
63 65 <p class="help-block">${_('Optionally select a group to put this repository into.')}</p>
64 66 </div>
65 67 </div>
@@ -30,6 +30,35 b''
30 30 </div>
31 31 </div>
32 32
33
34 <div class="panel panel-default">
35 <div class="panel-heading" id="personal-group-options">
36 <h3 class="panel-title">${_('Personal Repository Group')} <a class="permalink" href="#personal-group-options"></a></h3>
37 </div>
38 <div class="panel-body">
39 <div class="checkbox">
40 ${h.checkbox('rhodecode_create_personal_repo_group','True')}
41 <label for="rhodecode_create_personal_repo_group">${_('Create Personal Repository Group')}</label>
42 </div>
43 <span class="help-block">
44 ${_('Always create Personal Repository Groups for new users.')} <br/>
45 ${_('When creating new users from the add user form or API you can still turn this off via a checkbox or flag')}
46 </span>
47
48 <div class="label">
49 <label for="rhodecode_personal_repo_group_pattern">${_('Personal Repo Group Pattern')}</label>
50 </div>
51 <div class="field input">
52 ${h.text('rhodecode_personal_repo_group_pattern',size=60, placeholder=c.personal_repo_group_default_pattern)}
53 </div>
54 <span class="help-block">
55 ${_('Pattern used to create Personal Repository Groups. The prefix can be an existing repository group path, e.g. /u/${username}')} <br/>
56 ${_('Available variables are currently ${username} and ${user_id}')}
57 </span>
58 </div>
59 </div>
60
61
33 62 <div class="panel panel-default">
34 63 <div class="panel-heading" id="captcha-options">
35 64 <h3 class="panel-title">${_('Registration Captcha')} <a class="permalink" href="#captcha-options"></a></h3>
@@ -72,6 +101,7 b''
72 101 <option value="ga">Google Analytics</option>
73 102 <option value="clicky">Clicky</option>
74 103 <option value="server_announce">${_('Server Announcement')}</option>
104 <option value="flash_filtering">${_('Flash message filtering')}</option>
75 105 </select>
76 106 </div>
77 107 <div style="padding: 10px 0px"></div>
@@ -183,20 +213,40 b''
183 213 // important messages to all users of the RhodeCode Enterprise system.
184 214
185 215 $(document).ready(function(e){
186 // put your message below
216
217 // EDIT - put your message below
187 218 var message = "TYPE YOUR MESSAGE HERE";
188 219
189 $('#body').prepend(
190 ('<div class="flash_msg">'+
191 '<div class="alert alert-info">_MSG_'+
192 '</div></div>').replace('_MSG_', message)
220 // EDIT - choose "info"/"warning"/"error"/"success"/"neutral" as appropriate
221 var alert_level = "info";
222
223 $("#body").prepend(
224 ("<div id='server-announcement' class='"+alert_level+"'>_MSG_"+"</div>").replace("_MSG_", message)
193 225 )
194 226 })
195 227 </script>
196 228 </%text>
197 229 </script>
198 230
231 <script id="flash_filtering_tmpl" type='text/x-template'>
232 <%text filter="h">
233 <script>
234 // This filters out some flash messages before they are presented to the
235 // user, based on their contents. It can be used, for example, to filter
236 // out license warning/error messages.
199 237
238 var filteredMessages = [];
239 for(var i =0; i< alertMessagePayloads.length; i++){
240 if (typeof alertMessagePayloads[i].message.subdata.subtype !== 'undefined' &&
241 alertMessagePayloads[i].message.subdata.subtype.indexOf('rc_license') !== -1){
242 continue
243 }
244 filteredMessages.push(alertMessagePayloads[i]);
245 }
246 alertMessagePayloads = filteredMessages;
247 </script>
248 </%text>
249 </script>
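The flash-filtering template above drops license-related alerts client-side by skipping any payload whose `subdata.subtype` mentions `rc_license`. The same filtering logic, expressed in Python (the payload structure is inferred from the template; real payloads may carry more fields):

```python
def filter_license_messages(payloads):
    # Keep every alert whose message.subdata.subtype does not mention
    # 'rc_license', mirroring the JavaScript loop in the template.
    filtered = []
    for payload in payloads:
        subtype = payload['message'].get('subdata', {}).get('subtype', '')
        if 'rc_license' in subtype:
            continue
        filtered.append(payload)
    return filtered

payloads = [
    {'message': {'message': 'license expires soon',
                 'subdata': {'subtype': 'rc_license'}}},
    {'message': {'message': 'repo created', 'subdata': {}}},
]
assert len(filter_license_messages(payloads)) == 1
```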
200 250
201 251 <script>
202 252 var pre_cm = initCodeMirror('rhodecode_pre_code', '', false);
@@ -215,7 +265,8 b' var get_data = function(type, old){'
215 265 '#': old,
216 266 'ga': get_tmpl('ga'),
217 267 'clicky': get_tmpl('clicky'),
218 'server_announce': get_tmpl('server_announce')
268 'server_announce': get_tmpl('server_announce'),
269 'flash_filtering': get_tmpl('flash_filtering')
219 270 }[type]
220 271 };
221 272
@@ -1,41 +1,3 b''
1 <%
2 elems = [
3 ## general
4 (_('RhodeCode Enterprise version'), h.literal('%s <div class="link" id="check_for_update" >%s</div>' % (c.rhodecode_version, _('check for updates'))), ''),
5 (_('Upgrade info endpoint'), h.literal('%s <br/><span >%s.</span>' % (c.rhodecode_update_url, _('Note: please make sure this server can access this url'))), ''),
6 (_('Configuration INI file'), c.rhodecode_config_ini, ''),
7 ## systems stats
8 (_('RhodeCode Enterprise Server IP'), c.server_ip, ''),
9 (_('RhodeCode Enterprise Server ID'), c.server_id, ''),
10 (_('Platform'), c.platform, ''),
11 (_('Uptime'), c.uptime_age, ''),
12 (_('Storage location'), c.storage, ''),
13 (_('Storage disk space'), "%s/%s, %s%% used%s" % (h.format_byte_size_binary(c.disk['used']), h.format_byte_size_binary(c.disk['total']),(c.disk['percent']), ' %s' % c.disk['error'] if 'error' in c.disk else ''), ''),
14
15 (_('Search index storage'), c.index_storage, ''),
16 (_('Search index size'), "%s %s" % (h.format_byte_size_binary(c.disk_index['used']), ' %s' % c.disk_index['error'] if 'error' in c.disk_index else ''), ''),
17
18 (_('Gist storage'), c.gist_storage, ''),
19 (_('Gist storage size'), "%s (%s items)%s" % (h.format_byte_size_binary(c.disk_gist['used']),c.disk_gist['items'], ' %s' % c.disk_gist['error'] if 'error' in c.disk_gist else ''), ''),
20
21 (_('Archive cache'), h.literal('%s <br/><span >%s.</span>' % (c.archive_storage, _('Enable this by setting archive_cache_dir=/path/to/cache option in the .ini file'))), ''),
22 (_('Archive cache size'), "%s%s" % (h.format_byte_size_binary(c.disk_archive['used']), ' %s' % c.disk_archive['error'] if 'error' in c.disk_archive else ''), ''),
23
24 (_('System memory'), c.system_memory, ''),
25 (_('CPU'), '%s %%' %(c.cpu), ''),
26 (_('Load'), '1min: %s, 5min: %s, 15min: %s' %(c.load['1_min'],c.load['5_min'],c.load['15_min']), ''),
27
28 ## rhodecode stuff
29 (_('Python version'), c.py_version, ''),
30 (_('Python path'), c.py_path, ''),
31 (_('GIT version'), c.git_version, ''),
32 (_('HG version'), c.hg_version, ''),
33 (_('SVN version'), c.svn_version, ''),
34 (_('Database'), "%s @ version: %s" % (c.db_type, c.db_migrate_version), ''),
35 (_('Database version'), c.db_version, ''),
36
37 ]
38 %>
39 1
40 2 <div id="update_notice" style="display: none; margin: -40px 0px 20px 0px">
41 3 <div>${_('Checking for updates...')}</div>
@@ -46,15 +8,21 b''
46 8 <div class="panel-heading">
47 9 <h3 class="panel-title">${_('System Info')}</h3>
48 10 % if c.allowed_to_snapshot:
49 <a href="${url('admin_settings_system', snapshot=1)}" class="panel-edit">${_('create snapshot')}</a>
11 <a href="${url('admin_settings_system', snapshot=1)}" class="panel-edit">${_('create summary snapshot')}</a>
50 12 % endif
51 13 </div>
52 14 <div class="panel-body">
53 15 <dl class="dl-horizontal settings">
54 %for dt, dd, tt in elems:
55 <dt>${dt}:</dt>
56 <dd title="${tt}">${dd}</dd>
57 %endfor
16 % for dt, dd, warn in c.data_items:
17 <dt>${dt}${':' if dt else '---'}</dt>
18 <dd>${dd}${'' if dt else '---'}
19 % if warn and warn['message']:
20 <div class="alert-${warn['type']}">
21 <strong>${warn['message']}</strong>
22 </div>
23 % endif
24 </dd>
25 % endfor
58 26 </dl>
59 27 </div>
60 28 </div>
@@ -70,12 +38,12 b''
70 38 <col class='content'>
71 39 </colgroup>
72 40 <tbody>
73 %for key, value in c.py_modules:
41 % for key, value in c.py_modules['human_value']:
74 42 <tr>
75 43 <td>${key}</td>
76 44 <td>${value}</td>
77 45 </tr>
78 %endfor
46 % endfor
79 47 </tbody>
80 48 </table>
81 49 </div>
@@ -1,61 +1,26 b''
1 <%
2 elems = [
3 ## general
4 (_('RhodeCode Enterprise version'), c.rhodecode_version, ''),
5 (_('Upgrade info endpoint'), c.rhodecode_update_url, ''),
6 (_('Configuration INI file'), c.rhodecode_config_ini, ''),
7 ## systems stats
8 (_('RhodeCode Enterprise Server IP'), c.server_ip, ''),
9 (_('RhodeCode Enterprise Server ID'), c.server_id, ''),
10 (_('Platform'), c.platform, ''),
11 (_('Uptime'), c.uptime_age, ''),
12 (_('Storage location'), c.storage, ''),
13 (_('Storage disk space'), "%s/%s, %s%% used%s" % (h.format_byte_size_binary(c.disk['used']), h.format_byte_size_binary(c.disk['total']),(c.disk['percent']), ' %s' % c.disk['error'] if 'error' in c.disk else ''), ''),
14
15 (_('Search index storage'), c.index_storage, ''),
16 (_('Search index size'), "%s %s" % (h.format_byte_size_binary(c.disk_index['used']), ' %s' % c.disk_index['error'] if 'error' in c.disk_index else ''), ''),
17
18 (_('Gist storage'), c.gist_storage, ''),
19 (_('Gist storage size'), "%s (%s items)%s" % (h.format_byte_size_binary(c.disk_gist['used']),c.disk_gist['items'], ' %s' % c.disk_gist['error'] if 'error' in c.disk_gist else ''), ''),
20
21 (_('Archive cache'), c.archive_storage, ''),
22 (_('Archive cache size'), "%s%s" % (h.format_byte_size_binary(c.disk_archive['used']), ' %s' % c.disk_archive['error'] if 'error' in c.disk_archive else ''), ''),
23
24 (_('System memory'), c.system_memory, ''),
25 (_('CPU'), '%s %%' %(c.cpu), ''),
26 (_('Load'), '1min: %s, 5min: %s, 15min: %s' %(c.load['1_min'],c.load['5_min'],c.load['15_min']), ''),
27
28 ## rhodecode stuff
29 (_('Python version'), c.py_version, ''),
30 (_('Python path'), c.py_path, ''),
31 (_('GIT version'), c.git_version, ''),
32 (_('HG version'), c.hg_version, ''),
33 (_('SVN version'), c.svn_version, ''),
34 (_('Database'), "%s @ version: %s" % (c.db_type, c.db_migrate_version), ''),
35 (_('Database version'), c.db_version, ''),
36
37 ]
38 %>
39 1
40 2 <pre>
41 3 SYSTEM INFO
42 4 -----------
43 5
44 % for dt, dd, tt in elems:
45 ${dt}: ${dd}
6 % for dt, dd, warn in c.data_items:
7 ${dt.lower().replace(' ', '_')}${': '+dd if dt else '---'}
8 % if warn and warn['message']:
9 ALERT_${warn['type'].upper()} ${warn['message']}
10 % endif
46 11 % endfor
47 12
48 13 PYTHON PACKAGES
49 14 ---------------
50 15
51 % for key, value in c.py_modules:
16 % for key, value in c.py_modules['human_value']:
52 17 ${key}: ${value}
53 18 % endfor
54 19
55 20 SYSTEM SETTINGS
56 21 ---------------
57 22
58 % for key, value in sorted(c.rhodecode_ini_safe.items()):
23 % for key, value in sorted(c.rhodecode_config['human_value'].items()):
59 24 % if isinstance(value, dict):
60 25
61 26 % for key2, value2 in value.items():
@@ -50,5 +50,17 b''
50 50 if ($('.locked_input').children().hasClass('error-message')) {
51 51 unlockpath();
52 52 }
53 })
53
54 /* On click handler for the `Generate Apache Config` button. It sends a
55 POST request to trigger the (re)generation of the mod_dav_svn config. */
56 $('#vcs_svn_generate_cfg').on('click', function(event) {
57 event.preventDefault();
58 var url = "${h.route_path('admin_settings_vcs_svn_generate_cfg')}";
59 var jqxhr = $.post(url, {'csrf_token': CSRF_TOKEN});
60 jqxhr.done(function(data) {
61 $.Topic('/notifications').publish(data);
62 });
63 });
64
65 });
54 66 </script>
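The click handler above POSTs only the CSRF token to the `admin_settings_vcs_svn_generate_cfg` route and republishes the JSON response as a notification. A hedged sketch of building that same request in Python — the concrete URL path is an assumption standing in for whatever `h.route_path('admin_settings_vcs_svn_generate_cfg')` renders in a given deployment, and auth/session handling is omitted:

```python
from urllib import parse, request

def build_svn_cfg_request(base_url, csrf_token):
    # Build (but do not send) the POST that triggers mod_dav_svn config
    # regeneration. The path below is a placeholder, not the guaranteed
    # RhodeCode route; resolve it from the application's route registry.
    url = base_url.rstrip('/') + '/_admin/settings/vcs/svn_generate_cfg'
    data = parse.urlencode({'csrf_token': csrf_token}).encode('utf-8')
    return request.Request(url, data=data, method='POST')

req = build_svn_cfg_request('http://localhost:8080', 'secret-token')
```

Sending `req` with `urllib.request.urlopen` would then return the notification payload the template publishes to `$.Topic('/notifications')`.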
@@ -69,6 +69,8 b''
69 69 <tr><td>[featured] </td><td><span class="metatag" tag="featured">featured</span></td></tr>
70 70 <tr><td>[stale] </td><td><span class="metatag" tag="stale">stale</span></td></tr>
71 71 <tr><td>[dead] </td><td><span class="metatag" tag="dead">dead</span></td></tr>
72 <tr><td>[personal] </td><td><span class="metatag" tag="personal">personal</span></td></tr>
73
72 74 <tr><td>[lang =&gt; lang] </td><td><span class="metatag" tag="lang" >lang</span></td></tr>
73 75
74 76 <tr><td>[license =&gt; License] </td><td><span class="metatag" tag="license"><a href="http://www.opensource.org/licenses/License" >License</a></span></td></tr>
@@ -35,7 +35,6 b''
35 35 <li class="${'active' if c.active=='advanced' else ''}"><a href="${h.url('edit_user_group_advanced', user_group_id=c.user_group.users_group_id)}">${_('Advanced')}</a></li>
36 36 <li class="${'active' if c.active=='global_perms' else ''}"><a href="${h.url('edit_user_group_global_perms', user_group_id=c.user_group.users_group_id)}">${_('Global permissions')}</a></li>
37 37 <li class="${'active' if c.active=='perms_summary' else ''}"><a href="${h.url('edit_user_group_perms_summary', user_group_id=c.user_group.users_group_id)}">${_('Permissions summary')}</a></li>
38 <li class="${'active' if c.active=='members' else ''}"><a href="${h.url('edit_user_group_members', user_group_id=c.user_group.users_group_id)}">${_('Members')}</a></li>
39 38 </ul>
40 39 </div>
41 40
@@ -54,40 +54,53 b''
54 54 ${h.checkbox('users_group_active',value=True)}
55 55 </div>
56 56 </div>
57 <div class="field">
58 <div class="label">
59 <label for="users_group_active">${_('Search')}:</label>
60 ${h.text('from_user_group',
61 placeholder="user/usergroup",
62 class_="medium")}
57
58 <div class="field">
59 <div class="label label-checkbox">
60 <label for="user_group_add_members">${_('Add members')}:</label>
61 </div>
62 <div class="input">
63 ${h.text('user_group_add_members', placeholder="user/usergroup", class_="medium")}
63 64 </div>
64 <div class="select side-by-side-selector">
65 <div class="left-group">
66 <label class="text"><strong>${_('Chosen group members')}</strong></label>
67 ${h.select('users_group_members',[x[0] for x in c.group_members],c.group_members,multiple=True,size=8,)}
68 <div class="btn" id="remove_all_elements" >
69 ${_('Remove all elements')}
70 <i class="icon-chevron-right"></i>
71 </div>
72 </div>
73 <div class="middle-group">
74 <i id="add_element" class="icon-chevron-left"></i>
75 <br />
76 <i id="remove_element" class="icon-chevron-right"></i>
77 </div>
78 <div class="right-group">
79 <label class="text" >${_('Available users')}
80 </label>
81 ${h.select('available_members',[],c.available_members,multiple=True,size=8,)}
82 <div class="btn" id="add_all_elements" >
83 <i class="icon-chevron-left"></i>${_('Add all elements')}
84 </div>
85 </div>
86 </div>
87 </div>
88 <div class="buttons">
89 ${h.submit('Save',_('Save'),class_="btn")}
90 </div>
65 </div>
66
67 <input type="hidden" name="__start__" value="user_group_members:sequence"/>
68 <table id="group_members_placeholder" class="rctable group_members">
69 <tr>
70 <th>${_('Username')}</th>
71 <th>${_('Action')}</th>
72 </tr>
73
74 % if c.group_members_obj:
75 % for user in c.group_members_obj:
76 <tr>
77 <td id="member_user_${user.user_id}" class="td-author">
78 <div class="group_member">
79 ${base.gravatar(user.email, 16)}
80 <span class="username user">${h.link_to(h.person(user), h.url( 'edit_user',user_id=user.user_id))}</span>
81 <input type="hidden" name="__start__" value="member:mapping">
82 <input type="hidden" name="member_user_id" value="${user.user_id}">
83 <input type="hidden" name="type" value="existing" id="member_${user.user_id}">
84 <input type="hidden" name="__end__" value="member:mapping">
85 </div>
86 </td>
87 <td class="">
88 <div class="usergroup_member_remove action_button" onclick="removeUserGroupMember(${user.user_id}, true)" style="visibility: visible;">
89 <i class="icon-remove-sign"></i>
90 </div>
91 </td>
92 </tr>
93 % endfor
94
95 % else:
96 <tr><td colspan="2">${_('No members yet')}</td></tr>
97 % endif
98 </table>
99 <input type="hidden" name="__end__" value="user_group_members:sequence"/>
100
101 <div class="buttons">
102 ${h.submit('Save',_('Save'),class_="btn")}
103 </div>
91 104 </div>
92 105 </div>
93 106 ${h.end_form()}
@@ -95,15 +108,18 b''
95 108 </div>
96 109 <script>
97 110 $(document).ready(function(){
98 MultiSelectWidget('users_group_members','available_members','edit_users_group');
99
100 111 $("#group_parent_id").select2({
101 112 'containerCssClass': "drop-menu",
102 113 'dropdownCssClass': "drop-menu-dropdown",
103 114 'dropdownAutoWidth': true
104 115 });
105 116
106 $('#from_user_group').autocomplete({
117 removeUserGroupMember = function(userId){
118 $('#member_'+userId).val('remove');
119 $('#member_user_'+userId).addClass('to-delete');
120 };
121
122 $('#user_group_add_members').autocomplete({
107 123 serviceUrl: pyroutes.url('user_autocomplete_data'),
108 124 minChars:2,
109 125 maxHeight:400,
@@ -115,9 +131,37 b''
115 131 lookupFilter: autocompleteFilterResult,
116 132 onSelect: function(element, suggestion){
117 133
118 function preSelectUserIds(uids) {
119 $('#available_members').val(uids);
120 $('#users_group_members').val(uids);
134 function addMember(user, fromUserGroup) {
135 var gravatar = user.icon_link;
136 var username = user.value_display;
137 var userLink = pyroutes.url('edit_user', {"user_id": user.id});
138 var uid = user.id;
139
140 if (fromUserGroup) {
141 username = username + " " + _gettext('(from usergroup {0})').format(fromUserGroup);
142 }
143
144 var elem = $(
145 ('<tr>'+
146 '<td id="member_user_{6}" class="td-author td-author-new-entry">'+
147 '<div class="group_member">'+
148 '<img class="gravatar" src="{0}" height="16" width="16">'+
149 '<span class="username user"><a href="{1}">{2}</a></span>'+
150 '<input type="hidden" name="__start__" value="member:mapping">'+
151 '<input type="hidden" name="member_user_id" value="{3}">'+
152 '<input type="hidden" name="type" value="new" id="member_{4}">'+
153 '<input type="hidden" name="__end__" value="member:mapping">'+
154 '</div>'+
155 '</td>'+
156 '<td class="td-author-new-entry">'+
157 '<div class="usergroup_member_remove action_button" onclick="removeUserGroupMember({5}, true)" style="visibility: visible;">'+
158 '<i class="icon-remove-sign"></i>'+
159 '</div>'+
160 '</td>'+
161 '</tr>').format(gravatar, userLink, username,
162 uid, uid, uid, uid)
163 );
164 $('#group_members_placeholder').append(elem)
121 165 }
122 166
123 167 if (suggestion.value_type == 'user_group') {
@@ -125,20 +169,18 b''
125 169 pyroutes.url('edit_user_group_members',
126 170 {'user_group_id': suggestion.id}),
127 171 function(data) {
128 var uids = [];
129 172 $.each(data.members, function(idx, user) {
130 var userid = user[0],
131 username = user[1];
132 uids.push(userid.toString());
173 addMember(user, suggestion.value)
133 174 });
134 preSelectUserIds(uids)
135 175 }
136 176 );
137 177 } else if (suggestion.value_type == 'user') {
138 preSelectUserIds([suggestion.id.toString()]);
178 addMember(suggestion, null);
139 179 }
140 180 }
141 181 });
182
183
142 184 UsersAutoComplete('user', '${c.rhodecode_user.user_id}');
143 185 })
144 186 </script>
@@ -113,11 +113,14 b''
113 113
114 114 <div class="field">
115 115 <div class="label label-checkbox">
116 <label for="create_repo_group">${_('Add repository group')}:</label>
116 <label for="create_repo_group">${_('Add personal repository group')}:</label>
117 117 </div>
118 118 <div class="checkboxes">
119 ${h.checkbox('create_repo_group',value=True)}
120 <span class="help-block">${_('Add repository group with the same name as username. \nUser will be automatically set as this group owner.')}</span>
119 ${h.checkbox('create_repo_group',value=True, checked=c.default_create_repo_group)}
120 <span class="help-block">
121 ${_('New group will be created at: `/%(path)s`') % {'path': c.personal_repo_group_name}}<br/>
122 ${_('User will be automatically set as this group owner.')}
123 </span>
121 124 </div>
122 125 </div>
123 126
@@ -61,7 +61,11 b''
61 61 %if c.personal_repo_group:
62 62 <div class="panel-body-title-text">${_('Users personal repository group')} : ${h.link_to(c.personal_repo_group.group_name, url('repo_group_home', group_name=c.personal_repo_group.group_name))}</div>
63 63 %else:
64 <div class="panel-body-title-text">${_('This user currently does not have a personal repository group')}</div>
64 <div class="panel-body-title-text">
65 ${_('This user currently does not have a personal repository group')}
66 <br/>
67 ${_('New group will be created at: `/%(path)s`') % {'path': c.personal_repo_group_name}}
68 </div>
65 69 %endif
66 70 <button class="btn btn-default" type="submit" ${'disabled="disabled"' if c.personal_repo_group else ''}>
67 71 <i class="icon-folder-close"></i>
@@ -50,16 +50,18 b''
50 50 $.fn.dataTable.ext.search.push(
51 51 function( settings, data, dataIndex ) {
52 52 var query = $('#q_filter').val();
53 var username = data[1];
54 var email = data[2];
55 var first_name = data[3];
56 var last_name = data[4];
53
54 var username = data[0];
55 var email = data[1];
56 var first_name = data[2];
57 var last_name = data[3];
57 58
58 59 var query_str = username + " " +
59 60 email + " " +
60 61 first_name + " " +
61 62 last_name;
62 if((query_str).indexOf(query) !== -1){
63
64 if ((query_str).indexOf(query) !== -1) {
63 65 return true;
64 66 }
65 67 return false;
@@ -23,7 +23,6 b''
23 23
24 24 <!-- CONTENT -->
25 25 <div id="content" class="wrapper">
26 ${self.flash_msg()}
27 26 <div class="main">
28 27 ${next.main()}
29 28 </div>
@@ -62,10 +61,6 b''
62 61 <%def name="menu_bar_subnav()">
63 62 </%def>
64 63
65 <%def name="flash_msg()">
66 <%include file="/base/flash_msg.html"/>
67 </%def>
68
69 64 <%def name="breadcrumbs(class_='breadcrumbs')">
70 65 <div class="${class_}">
71 66 ${self.breadcrumbs_links()}
@@ -100,12 +95,12 b''
100 95 ${dd}
101 96 %endif
102 97 %if show_items:
103 <span class="btn-collapse" data-toggle="item-${h.md5(dt)[:6]}-details">${_('Show More')} </span>
98 <span class="btn-collapse" data-toggle="item-${h.md5_safe(dt)[:6]}-details">${_('Show More')} </span>
104 99 %endif
105 100 </dd>
106 101
107 102 %if show_items:
108 <div class="collapsable-content" data-toggle="item-${h.md5(dt)[:6]}-details" style="display: none">
103 <div class="collapsable-content" data-toggle="item-${h.md5_safe(dt)[:6]}-details" style="display: none">
109 104 %for item in show_items:
110 105 <dt></dt>
111 106 <dd>${item}</dd>
@@ -313,7 +308,9 b''
313 308 <div class="field">
314 309 <div class="label">
315 310 <label for="password">${_('Password')}:</label>
316 <span class="forgot_password">${h.link_to(_('(Forgot password?)'),h.route_path('reset_password'))}</span>
311 %if h.HasPermissionAny('hg.password_reset.enabled')():
312 <span class="forgot_password">${h.link_to(_('(Forgot password?)'),h.route_path('reset_password'), class_='pwd_reset')}</span>
313 %endif
317 314 </div>
318 315 <div class="input">
319 316 ${h.password('password',class_='focus',tabindex=2)}
@@ -341,6 +338,9 b''
341 338 <div class="">
342 339 <ol class="links">
343 340 <li>${h.link_to(_(u'My account'),h.url('my_account'))}</li>
341 % if c.rhodecode_user.personal_repo_group:
342 <li>${h.link_to(_(u'My personal group'), h.url('repo_group_home', group_name=c.rhodecode_user.personal_repo_group.group_name))}</li>
343 % endif
344 344 <li class="logout">
345 345 ${h.secure_form(h.route_path('logout'))}
346 346 ${h.submit('log_out', _(u'Sign Out'),class_="btn btn-primary")}
@@ -12,11 +12,15 b" if getattr(c, 'rhodecode_user', None) an"
12 12 c.template_context['rhodecode_user']['username'] = c.rhodecode_user.username
13 13 c.template_context['rhodecode_user']['email'] = c.rhodecode_user.email
14 14 c.template_context['rhodecode_user']['notification_status'] = c.rhodecode_user.get_instance().user_data.get('notification_status', True)
15 c.template_context['rhodecode_user']['first_name'] = c.rhodecode_user.name
16 c.template_context['rhodecode_user']['last_name'] = c.rhodecode_user.lastname
15 17
16 18 c.template_context['visual']['default_renderer'] = h.get_visual_attr(c, 'default_renderer')
17 19 %>
18 20 <html xmlns="http://www.w3.org/1999/xhtml">
19 21 <head>
22 <script src="${h.asset('js/vendors/webcomponentsjs/webcomponents-lite.min.js', ver=c.rhodecode_version_hash)}"></script>
23 <link rel="import" href="${h.asset('js/rhodecode-components.html', ver=c.rhodecode_version_hash)}">
20 24 <title>${self.title()}</title>
21 25 <meta http-equiv="Content-Type" content="text/html;charset=utf-8" />
22 26 <%def name="robots()">
@@ -64,10 +68,6 b" c.template_context['visual']['default_re"
64 68 && 'content' in document.createElement('template')
65 69 );
66 70 if (!webComponentsSupported) {
67 var e = document.createElement('script');
68 e.async = true;
69 e.src = '${h.asset('js/vendors/webcomponentsjs/webcomponents-lite.min.js', ver=c.rhodecode_version_hash)}';
70 document.head.appendChild(e);
71 71 } else {
72 72 onload();
73 73 }
@@ -108,22 +108,22 b" c.template_context['visual']['default_re"
108 108 ip: '${c.rhodecode_user.ip_addr}',
109 109 username: '${c.rhodecode_user.username}'
110 110 % endif
111 },
112 tags: {
113 rhodecode_version: '${c.rhodecode_version}',
114 rhodecode_edition: '${c.rhodecode_edition}'
111 115 }
112 116 };
117
113 118 </script>
114 119 <%include file="/base/plugins_base.html"/>
115 120 <!--[if lt IE 9]>
116 121 <script language="javascript" type="text/javascript" src="${h.asset('js/excanvas.min.js')}"></script>
117 122 <![endif]-->
118 123 <script language="javascript" type="text/javascript" src="${h.asset('js/rhodecode/routes.js', ver=c.rhodecode_version_hash)}"></script>
119 <script language="javascript" type="text/javascript" src="${h.asset('js/scripts.js', ver=c.rhodecode_version_hash)}"></script>
120 <script>
121 var e = document.createElement('script');
122 e.src = '${h.asset('js/rhodecode-components.js', ver=c.rhodecode_version_hash)}';
123 document.head.appendChild(e);
124 </script>
125 <link rel="import" href="${h.asset('js/rhodecode-components.html', ver=c.rhodecode_version_hash)}">
124 <script> var alertMessagePayloads = ${h.flash.json_alerts()|n}; </script>
126 125 ## avoid escaping the %N
126 <script language="javascript" type="text/javascript" src="${h.asset('js/rhodecode-components.js', ver=c.rhodecode_version_hash)}"></script>
127 127 <script>CodeMirror.modeURL = "${h.asset('') + 'js/mode/%N/%N.js?ver='+c.rhodecode_version_hash}";</script>
128 128
129 129
@@ -131,6 +131,13 b" c.template_context['visual']['default_re"
131 131 ${self.js_extra()}
132 132
133 133 <script type="text/javascript">
134 Rhodecode = (function() {
135 function _Rhodecode() {
136 this.comments = new CommentsController();
137 }
138 return new _Rhodecode();
139 })();
140
134 141 $(document).ready(function(){
135 142 show_more_event();
136 143 timeagoActivate();
@@ -145,11 +152,6 b" c.template_context['visual']['default_re"
145 152
146 153 <%def name="head_extra()"></%def>
147 154 ${self.head_extra()}
148 <script>
149 window.addEventListener("load", function(event) {
150 $.Topic('/plugins/__REGISTER__').prepareOrPublish({});
151 });
152 </script>
153 155 ## extra stuff
154 156 %if c.pre_code:
155 157 ${c.pre_code|n}
@@ -126,7 +126,7 b''
126 126
127 127 <div class="checkbox">
128 128 ${h.checkbox('rhodecode_hg_use_rebase_for_merging' + suffix, 'True', **kwargs)}
129 <label for="rhodecode_hg_use_rebase_for_merging{suffix}">${_('Use rebase as merge strategy')}</label>
129 <label for="rhodecode_hg_use_rebase_for_merging${suffix}">${_('Use rebase as merge strategy')}</label>
130 130 </div>
131 131 <div class="label">
132 132 <span class="help-block">${_('Use rebase instead of creating a merge commit when merging via web interface.')}</span>
@@ -147,10 +147,13 b''
147 147 <div class="field">
148 148 <div class="checkbox">
149 149 ${h.checkbox('vcs_svn_proxy_http_requests_enabled' + suffix, 'True', **kwargs)}
150 <label for="vcs_svn_proxy_http_requests_enabled{suffix}">${_('Proxy subversion HTTP requests')}</label>
150 <label for="vcs_svn_proxy_http_requests_enabled${suffix}">${_('Proxy subversion HTTP requests')}</label>
151 151 </div>
152 152 <div class="label">
153 <span class="help-block">${_('Subversion HTTP Support. Enables communication with SVN over HTTP protocol.')}</span>
153 <span class="help-block">
154 ${_('Subversion HTTP Support. Enables communication with SVN over HTTP protocol.')}
155 <a href="${h.url('enterprise_svn_setup')}" target="_blank">${_('SVN Protocol setup Documentation')}</a>.
156 </span>
154 157 </div>
155 158 </div>
156 159 <div class="field">
@@ -161,6 +164,11 b''
161 164 ${h.text('vcs_svn_proxy_http_server_url',size=59)}
162 165 </div>
163 166 </div>
167 % if c.svn_proxy_generate_config:
168 <div class="buttons">
169 <button class="btn btn-primary" id="vcs_svn_generate_cfg">${_('Generate Apache Config')}</button>
170 </div>
171 % endif
164 172 </div>
165 173 </div>
166 174 % endif
@@ -17,7 +17,7 b''
17 17
18 18 %if cs.merge:
19 19 <span class="mergetag">
20 ${_('merge')}
20 <i class="icon-merge"></i>${_('merge')}
21 21 </span>
22 22 %endif
23 23 <div class="message_history" title="${cs.message}">
@@ -42,12 +42,12 b''
42 42 <span id="parent_link">
43 43 <a href="#" title="${_('Parent Commit')}">${_('Parent')}</a>
44 44 </span>
45 |
45 |
46 46 <span id="child_link">
47 47 <a href="#" title="${_('Child Commit')}">${_('Child')}</a>
48 48 </span>
49 49 </div>
50
50
51 51 <div class="fieldset">
52 52 <div class="left-label">
53 53 ${_('Description')}:
@@ -59,7 +59,7 b''
59 59 </div>
60 60 </div>
61 61 </div>
62
62
63 63 %if c.statuses:
64 64 <div class="fieldset">
65 65 <div class="left-label">
@@ -73,20 +73,20 b''
73 73 </div>
74 74 </div>
75 75 %endif
76
76
77 77 <div class="fieldset">
78 78 <div class="left-label">
79 79 ${_('References')}:
80 80 </div>
81 81 <div class="right-content">
82 82 <div class="tags">
83
83
84 84 %if c.commit.merge:
85 85 <span class="mergetag tag">
86 ${_('merge')}
86 <i class="icon-merge"></i>${_('merge')}
87 87 </span>
88 88 %endif
89
89
90 90 %if h.is_hg(c.rhodecode_repo):
91 91 %for book in c.commit.bookmarks:
92 92 <span class="booktag tag" title="${_('Bookmark %s') % book}">
@@ -94,13 +94,13 b''
94 94 </span>
95 95 %endfor
96 96 %endif
97
97
98 98 %for tag in c.commit.tags:
99 99 <span class="tagtag tag" title="${_('Tag %s') % tag}">
100 100 <a href="${h.url('files_home',repo_name=c.repo_name,revision=c.commit.raw_id)}"><i class="icon-tag"></i>${tag}</a>
101 101 </span>
102 102 %endfor
103
103
104 104 %if c.commit.branch:
105 105 <span class="branchtag tag" title="${_('Branch %s') % c.commit.branch}">
106 106 <a href="${h.url('files_home',repo_name=c.repo_name,revision=c.commit.raw_id)}"><i class="icon-code-fork"></i>${h.shorter(c.commit.branch)}</a>
@@ -119,22 +119,22 b''
119 119 <a href="${h.url('changeset_raw_home',repo_name=c.repo_name,revision=c.commit.raw_id)}" class="tooltip" title="${h.tooltip(_('Raw diff'))}">
120 120 ${_('Raw Diff')}
121 121 </a>
122 |
122 |
123 123 <a href="${h.url('changeset_patch_home',repo_name=c.repo_name,revision=c.commit.raw_id)}" class="tooltip" title="${h.tooltip(_('Patch diff'))}">
124 124 ${_('Patch Diff')}
125 125 </a>
126 |
126 |
127 127 <a href="${h.url('changeset_download_home',repo_name=c.repo_name,revision=c.commit.raw_id,diff='download')}" class="tooltip" title="${h.tooltip(_('Download diff'))}">
128 128 ${_('Download Diff')}
129 129 </a>
130 |
130 |
131 131 ${c.ignorews_url(request.GET)}
132 |
132 |
133 133 ${c.context_url(request.GET)}
134 134 </div>
135 135 </div>
136 136 </div>
137
137
138 138 <div class="fieldset">
139 139 <div class="left-label">
140 140 ${_('Comments')}:
@@ -147,17 +147,16 b''
147 147 ${ungettext("%d Commit comment", "%d Commit comments", len(c.comments)) % len(c.comments)}
148 148 %endif
149 149 %if c.inline_cnt:
150 ## this is replaced with a proper link to first comment via JS linkifyComments() func
151 <a href="#inline-comments" id="inline-comments-counter">${ungettext("%d Inline Comment", "%d Inline Comments", c.inline_cnt) % c.inline_cnt}</a>
150 <a href="#" onclick="return Rhodecode.comments.nextComment();" id="inline-comments-counter">${ungettext("%d Inline Comment", "%d Inline Comments", c.inline_cnt) % c.inline_cnt}</a>
152 151 %else:
153 152 ${ungettext("%d Inline Comment", "%d Inline Comments", c.inline_cnt) % c.inline_cnt}
154 153 %endif
155 154 </div>
156 155 </div>
157 156 </div>
158
157
159 158 </div> <!-- end summary-detail -->
160
159
161 160 <div id="commit-stats" class="sidebar-right">
162 161 <div class="summary-detail-header">
163 162 <h4 class="item">
@@ -170,102 +169,24 b''
170 169 </div>
171 170 </div><!-- end sidebar -->
172 171 </div> <!-- end summary -->
173 <div class="cs_files_title">
174 <span class="cs_files_expand">
175 <span id="files_link"><a href="#" title="${_('Browse files at current commit')}">${_('Browse files')}</a></span> |
176
177 <span id="expand_all_files">${_('Expand All')}</span> | <span id="collapse_all_files">${_('Collapse All')}</span>
178 </span>
179 <h2>
180 ${diff_block.diff_summary_text(len(c.files), c.lines_added, c.lines_deleted, c.limited_diff)}
181 </h2>
172 <div class="cs_files">
173 <%namespace name="cbdiffs" file="/codeblocks/diffs.html"/>
174 ${cbdiffs.render_diffset_menu()}
175 ${cbdiffs.render_diffset(
176 c.changes[c.commit.raw_id], commit=c.commit, use_comments=True)}
182 177 </div>
183 </div>
184
185 <div class="cs_files">
186
187 %if not c.files:
188 <p class="empty_data">${_('No files')}</p>
189 %endif
190
191 <table class="compare_view_files commit_diff">
192 %for FID, (cs1, cs2, change, path, diff, stats, file) in c.changes[c.commit.raw_id].iteritems():
193 <tr class="cs_${change} collapse_file" fid="${FID}">
194 <td class="cs_icon_td">
195 <span class="collapse_file_icon" fid="${FID}"></span>
196 </td>
197 <td class="cs_icon_td">
198 <div class="flag_status not_reviewed hidden"></div>
199 </td>
200 <td class="cs_${change}" id="a_${FID}">
201 <div class="node">
202 <a href="#a_${FID}">
203 <i class="icon-file-${change.lower()}"></i>
204 ${h.safe_unicode(path)}
205 </a>
206 </div>
207 </td>
208 <td>
209 <div class="changes pull-right">${h.fancy_file_stats(stats)}</div>
210 <div class="comment-bubble pull-right" data-path="${path}">
211 <i class="icon-comment"></i>
212 </div>
213 </td>
214 </tr>
215 <tr fid="${FID}" id="diff_${FID}" class="diff_links">
216 <td></td>
217 <td></td>
218 <td class="cs_${change}">
219 ${diff_block.diff_menu(c.repo_name, h.safe_unicode(path), cs1, cs2, change, file)}
220 </td>
221 <td class="td-actions rc-form">
222 ${c.ignorews_url(request.GET, h.FID(cs2,path))} |
223 ${c.context_url(request.GET, h.FID(cs2,path))} |
224 <div data-comment-id="${h.FID(cs2,path)}" class="btn-link show-inline-comments comments-visible">
225 <span class="comments-show">${_('Show comments')}</span>
226 <span class="comments-hide">${_('Hide comments')}</span>
227 </div>
228 </td>
229 </tr>
230 <tr id="tr_${FID}">
231 <td></td>
232 <td></td>
233 <td class="injected_diff" colspan="2">
234 <div class="diff-container" id="${'diff-container-%s' % (id(change))}">
235 <div id="${FID}" class="diffblock margined comm">
236 <div class="code-body">
237 <div class="full_f_path" path="${h.safe_unicode(path)}"></div>
238 ${diff|n}
239 % if file and file["is_limited_diff"]:
240 % if file["exceeds_limit"]:
241 ${diff_block.file_message()}
242 % else:
243 <h5>${_('Diff was truncated. File content available only in full diff.')} <a href="${h.url.current(fulldiff=1, **request.GET.mixed())}" onclick="return confirm('${_("Showing a big diff might take some time and resources, continue?")}')">${_('Show full diff')}</a></h5>
244 % endif
245 % endif
246 </div>
247 </div>
248 </div>
249 </td>
250 </tr>
251 %endfor
252 </table>
253 </div>
254
255 % if c.limited_diff:
256 ${diff_block.changeset_message()}
257 % endif
258 178
259 179 ## template for inline comment form
260 180 <%namespace name="comment" file="/changeset/changeset_file_comment.html"/>
261 181 ${comment.comment_inline_form()}
262 182
263 ## render comments and inlines
183 ## render comments and inlines
264 184 ${comment.generate_comments()}
265 185
266 186 ## main comment form and it status
267 187 ${comment.comments(h.url('changeset_comment', repo_name=c.repo_name, revision=c.commit.raw_id),
268 188 h.commit_status(c.rhodecode_db_repo, c.commit.raw_id))}
189 </div>
269 190
270 191 ## FORM FOR MAKING JS ACTION AS CHANGESET COMMENTS
271 192 <script type="text/javascript">
@@ -388,7 +309,6 b''
388 309
389 310 // inject comments into their proper positions
390 311 var file_comments = $('.inline-comment-placeholder');
391 renderInlineComments(file_comments, true);
392 312 })
393 313 </script>
394 314
@@ -6,7 +6,14 b''
6 6 <%namespace name="base" file="/base/base.html"/>
7 7
8 8 <%def name="comment_block(comment, inline=False)">
9 <div class="comment ${'comment-inline' if inline else ''}" id="comment-${comment.comment_id}" line="${comment.line_no}" data-comment-id="${comment.comment_id}">
9 <div
10 class="comment
11 ${'comment-inline' if inline else ''}
12 ${'comment-outdated' if comment.outdated else 'comment-current'}"
13 "
14 id="comment-${comment.comment_id}"
15 line="${comment.line_no}"
16 data-comment-id="${comment.comment_id}">
10 17 <div class="meta">
11 18 <div class="author">
12 19 ${base.gravatar_with_user(comment.author.email, 16)}
@@ -46,24 +53,15 b''
46 53 ## only super-admin, repo admin OR comment owner can delete
47 54 %if not comment.pull_request or (comment.pull_request and not comment.pull_request.is_closed()):
48 55 %if h.HasPermissionAny('hg.admin')() or h.HasRepoPermissionAny('repository.admin')(c.repo_name) or comment.author.user_id == c.rhodecode_user.user_id:
49 <div onClick="deleteComment(${comment.comment_id})" class="delete-comment"> ${_('Delete')}</div>
50 %if inline:
51 <div class="comment-links-divider"> | </div>
56 ## TODO: dan: add edit comment here
57 <a onclick="return Rhodecode.comments.deleteComment(this);" class="delete-comment"> ${_('Delete')}</a> |
58 %if not comment.outdated:
59 <a onclick="return Rhodecode.comments.prevComment(this);" class="prev-comment"> ${_('Prev')}</a> |
60 <a onclick="return Rhodecode.comments.nextComment(this);" class="next-comment"> ${_('Next')}</a>
52 61 %endif
53 62 %endif
54 63 %endif
55 64
56 %if inline:
57
58 <div id="prev_c_${comment.comment_id}" class="comment-previous-link" title="${_('Previous comment')}">
59 <a class="arrow_comment_link disabled"><i class="icon-left"></i></a>
60 </div>
61
62 <div id="next_c_${comment.comment_id}" class="comment-next-link" title="${_('Next comment')}">
63 <a class="arrow_comment_link disabled"><i class="icon-right"></i></a>
64 </div>
65 %endif
66
67 65 </div>
68 66 </div>
69 67 <div class="text">
@@ -165,31 +163,8 b''
165 163 </%def>
166 164
167 165
168 ## generates inlines taken from c.comments var
169 <%def name="inlines(is_pull_request=False)">
170 %if is_pull_request:
171 <h2 id="comments">${ungettext("%d Pull Request Comment", "%d Pull Request Comments", len(c.comments)) % len(c.comments)}</h2>
172 %else:
173 <h2 id="comments">${ungettext("%d Commit Comment", "%d Commit Comments", len(c.comments)) % len(c.comments)}</h2>
174 %endif
175 %for path, lines_comments in c.inline_comments:
176 % for line, comments in lines_comments.iteritems():
177 <div style="display: none;" class="inline-comment-placeholder" path="${path}" target_id="${h.safeid(h.safe_unicode(path))}">
178 ## for each comment in particular line
179 %for comment in comments:
180 ${comment_block(comment, inline=True)}
181 %endfor
182 </div>
183 %endfor
184 %endfor
185
186 </%def>
187
188 ## generate inline comments and the main ones
166 ## generate main comments
189 167 <%def name="generate_comments(include_pull_request=False, is_pull_request=False)">
190 ## generate inlines for this changeset
191 ${inlines(is_pull_request)}
192
193 168 %for comment in c.comments:
194 169 <div id="comment-tr-${comment.comment_id}">
195 170 ## only render comments that are not from pull request, or from
@@ -49,78 +49,23 b''
49 49 ${self.breadcrumbs_links()}
50 50 </h2>
51 51 </div>
52
53 <div id="changeset_compare_view_content">
54 ##CS
55 <%include file="../compare/compare_commits.html"/>
56 ## FILES
57 <div class="cs_files_title">
58 <span class="cs_files_expand">
59 <span id="expand_all_files">${_('Expand All')}</span> | <span id="collapse_all_files">${_('Collapse All')}</span>
60 </span>
61 <h2>
62 ${diff_block.diff_summary_text(len(c.files), c.lines_added, c.lines_deleted, c.limited_diff)}
63 </h2>
64 </div>
65 </div>
66 52 </div>
67
68 <div class="cs_files">
69 <table class="compare_view_files">
53 <div id="changeset_compare_view_content">
54 ##CS
55 <%include file="../compare/compare_commits.html"/>
56 <div class="cs_files">
57 <%namespace name="cbdiffs" file="/codeblocks/diffs.html"/>
70 58 <%namespace name="comment" file="/changeset/changeset_file_comment.html"/>
71 59 <%namespace name="diff_block" file="/changeset/diff_block.html"/>
72 %for cs in c.commit_ranges:
73 <tr class="rctable">
74 <td colspan="4">
75 <a class="tooltip revision" title="${h.tooltip(cs.message)}" href="${h.url('changeset_home',repo_name=c.repo_name,revision=cs.raw_id)}">${'r%s:%s' % (cs.revision,h.short_id(cs.raw_id))}</a> |
76 ${h.age_component(cs.date)}
77 </td>
78 </tr>
79 %for FID, (cs1, cs2, change, path, diff, stats, file) in c.changes[cs.raw_id].iteritems():
80 <tr class="cs_${change} collapse_file" fid="${FID}">
81 <td class="cs_icon_td">
82 <span class="collapse_file_icon" fid="${FID}"></span>
83 </td>
84 <td class="cs_icon_td">
85 <div class="flag_status not_reviewed hidden"></div>
86 </td>
87 <td class="cs_${change}" id="a_${FID}">
88 <div class="node">
89 <a href="#a_${FID}">
90 <i class="icon-file-${change.lower()}"></i>
91 ${h.safe_unicode(path)}
92 </a>
93 </div>
94 </td>
95 <td>
96 <div class="changes">${h.fancy_file_stats(stats)}</div>
97 </td>
98 </tr>
99 <tr fid="${FID}" id="diff_${FID}" class="diff_links">
100 <td></td>
101 <td></td>
102 <td class="cs_${change}">
103 ${diff_block.diff_menu(c.repo_name, h.safe_unicode(path), cs1, cs2, change, file)}
104 </td>
105 <td class="td-actions rc-form"></td>
106 </tr>
107 <tr id="tr_${FID}">
108 <td></td>
109 <td></td>
110 <td class="injected_diff" colspan="2">
111 <div id="diff-container-${FID}" class="diff-container">
112 <div id="${FID}" class="diffblock margined comm">
113 <div class="code-body">
114 ${diff|n}
115 </div>
116 </div>
117 </div>
118 </td>
119 </tr>
120 %endfor
121 %endfor
122 </table>
60 ${cbdiffs.render_diffset_menu()}
61 %for commit in c.commit_ranges:
62 ${cbdiffs.render_diffset(
63 diffset=c.changes[commit.raw_id],
64 collapse_when_files_over=5,
65 commit=commit,
66 )}
67 %endfor
68
69 </div>
123 70 </div>
124 ## end summary detail
125
126 </%def> No newline at end of file
71 </%def>
@@ -15,6 +15,7 b''
15 15 </p>
16 16 %endif
17 17
18 <input type="hidden" name="__start__" value="revisions:sequence">
18 19 <table class="rctable compare_view_commits">
19 20 <tr>
20 21 <th>${_('Time')}</th>
@@ -66,6 +67,7 b''
66 67 </tr>
67 68 %endfor
68 69 </table>
70 <input type="hidden" name="__end__" value="revisions:sequence">
69 71 %endif
70 72 </div>
71 73
@@ -1,5 +1,6 b''
1 1 ## -*- coding: utf-8 -*-
2 2 <%inherit file="/base/base.html"/>
3 <%namespace name="cbdiffs" file="/codeblocks/diffs.html"/>
3 4
4 5 <%def name="title()">
5 6 %if c.compare_home:
@@ -53,7 +54,7 b''
53 54 <a id="btn-swap" class="btn btn-primary" href="${c.swap_url}"><i class="icon-refresh"></i> ${_('Swap')}</a>
54 55 %endif
55 56 <div id="compare_revs" class="btn btn-primary"><i class ="icon-loop"></i> ${_('Compare Commits')}</div>
56 %if c.files:
57 %if c.diffset and c.diffset.files:
57 58 <div id="compare_changeset_status_toggle" class="btn btn-primary">${_('Comment')}</div>
58 59 %endif
59 60 </div>
@@ -248,72 +249,8 b''
248 249 <div id="changeset_compare_view_content">
249 250 ##CS
250 251 <%include file="compare_commits.html"/>
251
252 ## FILES
253 <div class="cs_files_title">
254 <span class="cs_files_expand">
255 <span id="expand_all_files">${_('Expand All')}</span> | <span id="collapse_all_files">${_('Collapse All')}</span>
256 </span>
257 <h2>
258 ${diff_block.diff_summary_text(len(c.files), c.lines_added, c.lines_deleted, c.limited_diff)}
259 </h2>
260 </div>
261 <div class="cs_files">
262 %if not c.files:
263 <p class="empty_data">${_('No files')}</p>
264 %endif
265 <table class="compare_view_files">
266 <%namespace name="diff_block" file="/changeset/diff_block.html"/>
267 %for FID, change, path, stats, file in c.files:
268 <tr class="cs_${change} collapse_file" fid="${FID}">
269 <td class="cs_icon_td">
270 <span class="collapse_file_icon" fid="${FID}"></span>
271 </td>
272 <td class="cs_icon_td">
273 <div class="flag_status not_reviewed hidden"></div>
274 </td>
275 <td class="cs_${change}" id="a_${FID}">
276 <div class="node">
277 <a href="#a_${FID}">
278 <i class="icon-file-${change.lower()}"></i>
279 ${h.safe_unicode(path)}
280 </a>
281 </div>
282 </td>
283 <td>
284 <div class="changes pull-right">${h.fancy_file_stats(stats)}</div>
285 <div class="comment-bubble pull-right" data-path="${path}">
286 <i class="icon-comment"></i>
287 </div>
288 </td>
289 </tr>
290 <tr fid="${FID}" id="diff_${FID}" class="diff_links">
291 <td></td>
292 <td></td>
293 <td class="cs_${change}">
294 %if c.target_repo.repo_name == c.repo_name:
295 ${diff_block.diff_menu(c.repo_name, h.safe_unicode(path), c.source_ref, c.target_ref, change, file)}
296 %else:
297 ## this is slightly different case later, since the target repo can have this
298 ## file in target state than the source repo
299 ${diff_block.diff_menu(c.target_repo.repo_name, h.safe_unicode(path), c.source_ref, c.target_ref, change, file)}
300 %endif
301 </td>
302 <td class="td-actions rc-form">
303 </td>
304 </tr>
305 <tr id="tr_${FID}">
306 <td></td>
307 <td></td>
308 <td class="injected_diff" colspan="2">
309 ${diff_block.diff_block_simple([c.changes[FID]])}
310 </td>
311 </tr>
312 %endfor
313 </table>
314 % if c.limited_diff:
315 ${diff_block.changeset_message()}
316 % endif
252 ${cbdiffs.render_diffset_menu()}
253 ${cbdiffs.render_diffset(c.diffset)}
317 254 </div>
318 255 %endif
319 256 </div>
@@ -72,6 +72,10 b''
72 72 </div>
73 73 </%def>
74 74
75 <%def name="repo_desc(description)">
76 <div class="truncate-wrap">${description}</div>
77 </%def>
78
75 79 <%def name="last_change(last_change)">
76 80 ${h.age_component(last_change)}
77 81 </%def>
@@ -167,6 +171,10 b''
167 171 </div>
168 172 </%def>
169 173
174 <%def name="repo_group_desc(description)">
175 <div class="truncate-wrap">${description}</div>
176 </%def>
177
170 178 <%def name="repo_group_actions(repo_group_id, repo_group_name, gr_count)">
171 179 <div class="grid_edit">
172 180 <a href="${h.url('edit_repo_group',group_name=repo_group_name)}" title="${_('Edit')}">Edit</a>
@@ -271,6 +279,12 b''
271 279
272 280
273 281 ## PULL REQUESTS GRID RENDERERS
282
283 <%def name="pullrequest_target_repo(repo_name)">
284 <div class="truncate">
285 ${h.link_to(repo_name,h.url('summary_home',repo_name=repo_name))}
286 </div>
287 </%def>
274 288 <%def name="pullrequest_status(status)">
275 289 <div class="${'flag_status %s' % status} pull-left"></div>
276 290 </%def>
@@ -284,9 +298,13 b''
284 298 <i class="icon-comment icon-comment-colored"></i> ${comments_nr}
285 299 </%def>
286 300
287 <%def name="pullrequest_name(pull_request_id, target_repo_name)">
301 <%def name="pullrequest_name(pull_request_id, target_repo_name, short=False)">
288 302 <a href="${h.url('pullrequest_show',repo_name=target_repo_name,pull_request_id=pull_request_id)}">
289 ${_('Pull request #%(pr_number)s') % {'pr_number': pull_request_id,}}
303 % if short:
304 #${pull_request_id}
305 % else:
306 ${_('Pull request #%(pr_number)s') % {'pr_number': pull_request_id,}}
307 % endif
290 308 </a>
291 309 </%def>
292 310
@@ -545,12 +545,6 b''
545 545 </div>
546 546 </div>
547 547
548 <script>
549 $(document).ready(function(){
550 MultiSelectWidget('users_group_members','available_members','edit_users_group');
551 })
552 </script>
553
554 548 </div>
555 549
556 550 <div class='field'>
@@ -595,12 +589,6 b''
595 589 </div>
596 590 </div>
597 591
598 <script>
599 $(document).ready(function(){
600 MultiSelectWidget('users_group_members2','available_members','edit_users_group');
601 })
602 </script>
603
604 592 </div>
605 593
606 594 <div class='field'>
@@ -53,40 +53,46 b''
53 53
54 54 <table id="icons-list">
55 55 <tr class="row">
56 <td title="Code: 0xe800" class="the-icons span3"><i class="icon-plus"></i> <span class="i-name">icon-plus</span><span class="i-code">0xe800</span></td>
57 <td title="Code: 0xe801" class="the-icons span3"><i class="icon-minus"></i> <span class="i-name">icon-minus</span><span class="i-code">0xe801</span></td>
58 <td title="Code: 0xe802" class="the-icons span3"><i class="icon-remove"></i> <span class="i-name">icon-remove</span><span class="i-code">0xe802</span></td>
59 <td title="Code: 0xe803" class="the-icons span3"><i class="icon-bookmark"></i> <span class="i-name">icon-bookmark</span><span class="i-code">0xe803</span></td>
60 </tr>
61 <tr class="row">
62 <td title="Code: 0xe804" class="the-icons span3"><i class="icon-branch"></i> <span class="i-name">icon-branch</span><span class="i-code">0xe804</span></td>
63 <td title="Code: 0xe805" class="the-icons span3"><i class="icon-tag"></i> <span class="i-name">icon-tag</span><span class="i-code">0xe805</span></td>
64 <td title="Code: 0xe806" class="the-icons span3"><i class="icon-lock"></i> <span class="i-name">icon-lock</span><span class="i-code">0xe806</span></td>
65 <td title="Code: 0xe807" class="the-icons span3"><i class="icon-unlock"></i> <span class="i-name">icon-unlock</span><span class="i-code">0xe807</span></td>
56 <td title="Code: 0xe813" class="the-icons span3"><i class="icon-plus"></i> <span class="i-name">icon-plus</span> <span class="i-code">0xe813</span></td>
57 <td title="Code: 0xe814" class="the-icons span3"><i class="icon-minus"></i> <span class="i-name">icon-minus</span> <span class="i-code">0xe814</span></td>
58 <td title="Code: 0xe815" class="the-icons span3"><i class="icon-remove"></i> <span class="i-name">icon-remove</span> <span class="i-code">0xe815</span></td>
59 <td title="Code: 0xe811" class="the-icons span3"><i class="icon-fork"></i> <span class="i-name">icon-fork</span> <span class="i-code">0xe811</span></td>
60 <td title="Code: 0xe803" class="the-icons span3"><i class="icon-bookmark"></i> <span class="i-name">icon-bookmark</span> <span class="i-code">0xe803</span></td>
61 </tr>
62 <tr class="row">
63 <td title="Code: 0xe804" class="the-icons span3"><i class="icon-branch"></i> <span class="i-name">icon-branch</span> <span class="i-code">0xe804</span></td>
64 <td title="Code: 0xe833" class="the-icons span3"><i class="icon-merge"></i> <span class="i-name">icon-merge</span> <span class="i-code">0xe833</span></td>
65 <td title="Code: 0xe805" class="the-icons span3"><i class="icon-tag"></i> <span class="i-name">icon-tag</span> <span class="i-code">0xe805</span></td>
66 <td title="Code: 0xe806" class="the-icons span3"><i class="icon-lock"></i> <span class="i-name">icon-lock</span> <span class="i-code">0xe806</span></td>
67 <td title="Code: 0xe807" class="the-icons span3"><i class="icon-unlock"></i> <span class="i-name">icon-unlock</span> <span class="i-code">0xe807</span></td>
68 </tr>
69 <tr class="row">
70 <td title="Code: 0xe800" class="the-icons span3"><i class="icon-delete"></i> <span class="i-name">icon-delete</span> <span class="i-code">0xe800</span></td>
71 <td title="Code: 0xe800" class="the-icons span3"><i class="icon-false"></i> <span class="i-name">icon-false</span> <span class="i-code">0xe800</span></td>
72 <td title="Code: 0xe801" class="the-icons span3"><i class="icon-ok"></i> <span class="i-name">icon-ok</span> <span class="i-code">0xe801</span></td>
73 <td title="Code: 0xe801" class="the-icons span3"><i class="icon-true"></i> <span class="i-name">icon-true</span> <span class="i-code">0xe801</span></td>
74 <td title="Code: 0xe80f" class="the-icons span3"><i class="icon-group"></i> <span class="i-name">icon-group</span> <span class="i-code">0xe80f</span></td>
75 </tr>
76 <tr class="row">
77 <td title="Code: 0xe82d" class="the-icons span3"><i class="icon-hg"></i> <span class="i-name">icon-hg</span> <span class="i-code">0xe82d</span></td>
78 <td title="Code: 0xe82a" class="the-icons span3"><i class="icon-git"></i> <span class="i-name">icon-git</span> <span class="i-code">0xe82a</span></td>
79 <td title="Code: 0xe82e" class="the-icons span3"><i class="icon-svn"></i> <span class="i-name">icon-svn</span> <span class="i-code">0xe82e</span></td>
80 <td title="Code: 0xe810" class="the-icons span3"><i class="icon-folder"></i> <span class="i-name">icon-folder</span> <span class="i-code">0xe810</span></td>
81 <td title="Code: 0xe831" class="the-icons span3"><i class="icon-rhodecode"></i> <span class="i-name">icon-rhodecode</span> <span class="i-code">0xe831</span></td>
82 </tr>
83 <tr class="row">
84 <td title="Code: 0xe812" class="the-icons span3"><i class="icon-more"></i> <span class="i-name">icon-more</span> <span class="i-code">0xe812</span></td>
85 <td title="Code: 0xe802" class="the-icons span3"><i class="icon-comment"></i> <span class="i-name">icon-comment</span> <span class="i-code">0xe802</span></td>
86 <td title="Code: 0xe82f" class="the-icons span3"><i class="icon-comment-add"></i> <span class="i-name">icon-comment-add</span> <span class="i-code">0xe82f</span></td>
87 <td title="Code: 0xe830" class="the-icons span3"><i class="icon-comment-toggle"></i> <span class="i-name">icon-comment-toggle</span> <span class="i-code">0xe830</span></td>
88 <td title="Code: 0xe808" class="the-icons span3"><i class="icon-feed"></i> <span class="i-name">icon-feed</span> <span class="i-code">0xe808</span></td>
66 89 </tr>
67 90 <tr class="row">
68 <td title="Code: 0xe808" class="the-icons span3"><i class="icon-delete"></i> <span class="i-name">icon-delete</span><span class="i-code">0xe808</span></td>
69 <td title="Code: 0xe808" class="the-icons span3"><i class="icon-false"></i> <span class="i-name">icon-false</span><span class="i-code">0xe808</span></td>
70 <td title="Code: 0xe809" class="the-icons span3"><i class="icon-ok"></i> <span class="i-name">icon-ok</span><span class="i-code">0xe809</span></td>
71 <td title="Code: 0xe809" class="the-icons span3"><i class="icon-true"></i> <span class="i-name">icon-true</span><span class="i-code">0xe809</span></td>
72 </tr>
73 <tr class="row">
74 <td title="Code: 0xe80c" class="the-icons span3"><i class="icon-right"></i> <span class="i-name">icon-right</span><span class="i-code">0xe80c</span></td>
75 <td title="Code: 0xe80d" class="the-icons span3"><i class="icon-left"></i> <span class="i-name">icon-left</span><span class="i-code">0xe80d</span></td>
76 <td title="Code: 0xe80e" class="the-icons span3"><i class="icon-arrow_down"></i> <span class="i-name">icon-arrow_down</span><span class="i-code">0xe80e</span></td>
77 <td title="Code: 0xe80f" class="the-icons span3"><i class="icon-git"></i> <span class="i-name">icon-git</span><span class="i-code">0xe80f</span></td>
78 </tr>
79 <tr class="row">
80 <td title="Code: 0xe810" class="the-icons span3"><i class="icon-hg"></i> <span class="i-name">icon-hg</span><span class="i-code">0xe810</span></td>
81 <td title="Code: 0xe811" class="the-icons span3"><i class="icon-svn"></i> <span class="i-name">icon-svn</span><span class="i-code">0xe811</span></td>
82 <td title="Code: 0xe812" class="the-icons span3"><i class="icon-group"></i> <span class="i-name">icon-group</span><span class="i-code">0xe812</span></td>
83 <td title="Code: 0xe813" class="the-icons span3"><i class="icon-folder"></i> <span class="i-name">icon-folder</span><span class="i-code">0xe813</span></td>
84 </tr>
85 <tr class="row">
86 <td title="Code: 0xe814" class="the-icons span3"><i class="icon-fork"></i> <span class="i-name">icon-fork</span><span class="i-code">0xe814</span></td>
87 <td title="Code: 0xe815" class="the-icons span3"><i class="icon-more"></i> <span class="i-name">icon-more</span><span class="i-code">0xe815</span></td>
88 <td title="Code: 0xe80a" class="the-icons span3"><i class="icon-comment"></i> <span class="i-name">icon-comment</span><span class="i-code">0xe80a</span></td>
89 <td title="Code: 0xe80b" class="the-icons span3"><i class="icon-feed"></i> <span class="i-name">icon-feed</span><span class="i-code">0xe80b</span></td>
91 <td title="Code: 0xe80a" class="the-icons span3"><i class="icon-right"></i> <span class="i-name">icon-right</span> <span class="i-code">0xe80a</span></td>
92 <td title="Code: 0xe809" class="the-icons span3"><i class="icon-left"></i> <span class="i-name">icon-left</span> <span class="i-code">0xe809</span></td>
93 <td title="Code: 0xe80b" class="the-icons span3"><i class="icon-arrow_down"></i> <span class="i-name">icon-arrow_down</span> <span class="i-code">0xe80b</span></td>
94 <td title="Code: 0xe832" class="the-icons span3"><i class="icon-arrow_up"></i> <span class="i-name">icon-arrow_up</span> <span class="i-code">0xe832</span></td>
95 <td></td>
90 96 </tr>
91 97 </div>
92 98 </table>
@@ -56,6 +56,7 b''
56 56 <li class="${'active' if c.active=='forms' else ''}"><a href="${h.url('debug_style_template', t_path='forms.html')}">${_('Forms')}</a></li>
57 57 <li class="${'active' if c.active=='buttons' else ''}"><a href="${h.url('debug_style_template', t_path='buttons.html')}">${_('Buttons')}</a></li>
58 58 <li class="${'active' if c.active=='labels' else ''}"><a href="${h.url('debug_style_template', t_path='labels.html')}">${_('Labels')}</a></li>
59 <li class="${'active' if c.active=='alerts' else ''}"><a href="${h.url('debug_style_template', t_path='alerts.html')}">${_('Alerts')}</a></li>
59 60 <li class="${'active' if c.active=='tables' else ''}"><a href="${h.url('debug_style_template', t_path='tables.html')}">${_('Tables')}</a></li>
60 61 <li class="${'active' if c.active=='tables-wide' else ''}"><a href="${h.url('debug_style_template', t_path='tables-wide.html')}">${_('Tables wide')}</a></li>
61 62 <li class="${'active' if c.active=='collapsable-content' else ''}"><a href="${h.url('debug_style_template', t_path='collapsable-content.html')}">${_('Collapsable Content')}</a></li>
@@ -19,65 +19,45 b''
19 19 ${self.sidebar()}
20 20
21 21 <div class="main-content">
22
23 <h2>Label</h2>
24
22 <h2>Labels</h2>
23
25 24 <h3>Labels used for tags, branches and bookmarks</h3>
25
26 <div class="bs-example">
27 <ul class="metatag-list">
28 <li>
29 <span class="tagtag tag" title="Tag tip">
30 <a href="/fake-link"><i class="icon-tag"></i>tip</a>
31 </span>
32 </li>
33 <li>
34 <span class="branchtag tag" title="Branch default">
35 <a href="/fake-link"><i class="icon-code-fork"></i>default</a>
36 </span>
37 </li>
38 <li>
39 <span class="bookmarktag tag" title="Bookmark example">
40 <a href="/fake-link"><i class="icon-bookmark"></i>example</a>
41 </span>
42 </li>
43 </ul>
26 44
27 <ul>
28 <li>
29 <span class="tagtag tag" title="Tag tip">
30 <a href="/fake-link"><i class="icon-tag"></i>tip</a>
31 </span>
32 </li>
33 <li>
34 <span class="branchtag tag" title="Branch default">
35 <a href="/fake-link"><i class="icon-code-fork"></i>default</a>
36 </span>
37 </li>
38 <li>
39 <span class="bookmarktag tag" title="Bookmark example">
40 <a href="/fake-link"><i class="icon-bookmark"></i>example</a>
41 </span>
42 </li>
43 </ul>
45 </div>
44 46
45 47 <h3>Labels used in tables</h3>
46
47 <ul class="metatag-list">
48 <li>[default] <span class="metatag" tag="default">default</span></li>
49 <li>[featured] <span class="metatag" tag="featured">featured</span></li>
50 <li>[stale] <span class="metatag" tag="stale">stale</span></li>
51 <li>[dead] <span class="metatag" tag="dead">dead</span></li>
52 <li>[lang =&gt; lang] <span class="metatag" tag="lang">lang</span></li>
53 <li>[license =&gt; License] <span class="metatag" tag="license"><a href="http://www.opensource.org/licenses/License">License</a></span></li>
54 <li>[requires =&gt; Repo] <span class="metatag" tag="requires">requires =&gt; <a href="#">Repo</a></span></li>
55 <li>[recommends =&gt; Repo] <span class="metatag" tag="recommends">recommends =&gt; <a href="#">Repo</a></span></li>
56 <li>[see =&gt; URI] <span class="metatag" tag="see">see =&gt; <a href="#">URI</a></span></li>
57 </ul>
58
59 <h3>Flash Messages</h3>
60 <div class="flash_msg">
61
62 <div class="alert alert-dismissable alert-success">
63 <button type="button" class="close" data-dismiss="alert" aria-hidden="true">×</button>
64 Updated VCS settings
65 </div>
66 <div class="alert alert-dismissable alert-warning">
67 <button type="button" class="close" data-dismiss="alert" aria-hidden="true">×</button>
68 Updated VCS settings
69 </div>
70 <div class="alert alert-dismissable alert-error">
71 <button type="button" class="close" data-dismiss="alert" aria-hidden="true">×</button>
72 Updated VCS settings
73 </div>
74 <div class="alert alert-dismissable alert">
75 <button type="button" class="close" data-dismiss="alert" aria-hidden="true">×</button>
76 Updated VCS settings
77 </div>
78
79
80 </div>
48 <div class="bs-example">
49 <ul class="metatag-list">
50 <li>[default] <span class="metatag" tag="default">default</span></li>
51 <li>[featured] <span class="metatag" tag="featured">featured</span></li>
52 <li>[stale] <span class="metatag" tag="stale">stale</span></li>
53 <li>[dead] <span class="metatag" tag="dead">dead</span></li>
54 <li>[lang =&gt; lang] <span class="metatag" tag="lang">lang</span></li>
55 <li>[license =&gt; License] <span class="metatag" tag="license"><a href="http://www.opensource.org/licenses/License">License</a></span></li>
56 <li>[requires =&gt; Repo] <span class="metatag" tag="requires">requires =&gt; <a href="#">Repo</a></span></li>
57 <li>[recommends =&gt; Repo] <span class="metatag" tag="recommends">recommends =&gt; <a href="#">Repo</a></span></li>
58 <li>[see =&gt; URI] <span class="metatag" tag="see">see =&gt; <a href="#">URI</a></span></li>
59 </ul>
60 </div>
81 61 </div> <!-- .main-content -->
82 62 </div>
83 63 </div> <!-- .box -->
@@ -22,7 +22,7 b''
22 22 <script type="text/javascript" src="${h.asset('js/scripts.js', ver=c.rhodecode_version_hash)}"></script>
23 23 </head>
24 24 <body>
25 <%include file="/base/flash_msg.html"/>
25 <% messages = h.flash.pop_messages() %>
26 26
27 27 <div class="wrapper error_page">
28 28 <div class="sidebar">
@@ -35,6 +35,11 b''
35 35 </span><br/>
36 36 ${c.error_message} | <span class="error_message">${c.error_explanation}</span>
37 37 </h1>
38 % if messages:
39 % for message in messages:
40 <div class="alert alert-${message.category}">${message}</div>
41 % endfor
42 % endif
38 43 %if c.redirect_time:
39 44 <p>${_('You will be redirected to %s in %s seconds') % (c.redirect_module,c.redirect_time)}</p>
40 45 %endif
@@ -1,7 +1,7 b''
1 1 <%def name="refs(commit)">
2 2 %if commit.merge:
3 3 <span class="mergetag tag">
4 ${_('merge')}
4 <i class="icon-merge">${_('merge')}</i>
5 5 </span>
6 6 %endif
7 7
@@ -88,7 +88,7 b''
88 88 if (metadataRequest && metadataRequest.readyState != 4) {
89 89 metadataRequest.abort();
90 90 }
91 if (source_page) {
91 if (fileSourcePage) {
92 92 return false;
93 93 }
94 94
@@ -143,47 +143,9 b''
143 143 $(this).hide();
144 144 });
145 145
146
147 if (source_page) {
146 if (fileSourcePage) {
148 147 // variants for with source code, not tree view
149 148
150 if (location.href.indexOf('#') != -1) {
151 page_highlights = location.href.substring(location.href.indexOf('#') + 1).split('L');
152 if (page_highlights.length == 2) {
153 highlight_ranges = page_highlights[1].split(",");
154
155 var h_lines = [];
156 for (pos in highlight_ranges) {
157 var _range = highlight_ranges[pos].split('-');
158 if (_range.length == 2) {
159 var start = parseInt(_range[0]);
160 var end = parseInt(_range[1]);
161 if (start < end) {
162 for (var i = start; i <= end; i++) {
163 h_lines.push(i);
164 }
165 }
166 }
167 else {
168 h_lines.push(parseInt(highlight_ranges[pos]));
169 }
170 }
171
172 for (pos in h_lines) {
173 // @comment-highlight-color
174 $('#L' + h_lines[pos]).css('background-color', '#ffd887');
175 }
176
177 var _first_line = $('#L' + h_lines[0]).get(0);
178 if (_first_line) {
179 var line = $('#L' + h_lines[0]);
180 if (line.length > 0){
181 offsetScroll(line, 70);
182 }
183 }
184 }
185 }
186
187 149 // select code link event
188 150 $("#hlcode").mouseup(getSelectionLink);
189 151
@@ -47,7 +47,3 b''
47 47 </div>
48 48
49 49 </div>
50
51 <script>
52 var source_page = false;
53 </script>
@@ -27,17 +27,21 b''
27 27 %for cnt,node in enumerate(c.file):
28 28 <tr class="parity${cnt%2}">
29 29 <td class="td-componentname">
30 %if node.is_submodule():
30 % if node.is_submodule():
31 31 <span class="submodule-dir">
32 ${h.link_to_if(
33 node.url.startswith('http://') or node.url.startswith('https://'),
34 node.name, node.url)}
32 % if node.url.startswith('http://') or node.url.startswith('https://'):
33 <a href="${node.url}">
34 <i class="icon-folder browser-dir"></i>${node.name}
35 </a>
36 % else:
37 <i class="icon-folder browser-dir"></i>${node.name}
38 % endif
35 39 </span>
36 %else:
40 % else:
37 41 <a href="${h.url('files_home',repo_name=c.repo_name,revision=c.commit.raw_id,f_path=h.safe_unicode(node.path))}" class="pjax-link">
38 42 <i class="${'icon-file browser-file' if node.is_file() else 'icon-folder browser-dir'}"></i>${node.name}
39 43 </a>
40 %endif
44 % endif
41 45 </td>
42 46 %if node.is_file():
43 47 <td class="td-size" data-attr-name="size">
@@ -34,7 +34,10 b''
34 34 % endif
35 35
36 36 </div> <!--end summary-detail-->
37
37 <script>
38 // set the pageSource variable
39 var fileSourcePage = ${c.file_source_page};
40 </script>
38 41 % if c.file.is_dir():
39 42 <div id="commit-stats" class="sidebar-right">
40 43 <%include file='file_tree_author_box.html'/>
@@ -1,3 +1,4 b''
1 <%namespace name="sourceblock" file="/codeblocks/source.html"/>
1 2
2 3 <div id="codeblock" class="codeblock">
3 4 <div class="codeblock-header">
@@ -51,12 +52,22 b''
51 52 </div>
52 53 %else:
53 54 % if c.file.size < c.cut_off_limit:
54 %if c.annotate:
55 ${h.pygmentize_annotation(c.repo_name,c.file,linenos=True,anchorlinenos=True,lineanchors='L',cssclass="code-highlight")}
56 %elif c.renderer:
55 %if c.renderer and not c.annotate:
57 56 ${h.render(c.file.content, renderer=c.renderer)}
58 57 %else:
59 ${h.pygmentize(c.file,linenos=True,anchorlinenos=True,lineanchors='L',cssclass="code-highlight")}
58 <table class="cb codehilite">
59 %if c.annotate:
60 <% color_hasher = h.color_hasher() %>
61 %for annotation, lines in c.annotated_lines:
62 ${sourceblock.render_annotation_lines(annotation, lines, color_hasher)}
63 %endfor
64 %else:
65 %for line_num, tokens in enumerate(c.lines, 1):
66 ${sourceblock.render_line(line_num, tokens)}
67 %endfor
68 %endif
69 </table>
70 </div>
60 71 %endif
61 72 %else:
62 73 ${_('File is too big to display')} ${h.link_to(_('Show as raw'),
@@ -64,8 +75,4 b''
64 75 %endif
65 76 %endif
66 77 </div>
67 </div>
68
69 <script>
70 var source_page = true;
71 </script>
78 </div>
\ No newline at end of file
@@ -59,9 +59,11 b''
59 59 </div>
60 60 <div class="select">
61 61 ${h.select('repo_group','',c.repo_groups,class_="medium")}
62 %if c.personal_repo_group:
63 <a style="padding: 4px" href="#" id="select_my_group" data-personal-group-id="${c.personal_repo_group.group_id}">${_('Select my personal group (%(repo_group_name)s)') % {'repo_group_name': c.personal_repo_group.group_name}}</a>
64 %endif
62 % if c.personal_repo_group:
63 <a class="btn" href="#" id="select_my_group" data-personal-group-id="${c.personal_repo_group.group_id}">
64 ${_('Select my personal group (%(repo_group_name)s)') % {'repo_group_name': c.personal_repo_group.group_name}}
65 </a>
66 % endif
65 67 <span class="help-block">${_('Optionally select a group to put this repository into.')}</span>
66 68 </div>
67 69 </div>
@@ -29,7 +29,6 b''
29 29 </div>
30 30 <%block name="above_login_button" />
31 31 <div id="login" class="right-column">
32 <%include file="/base/flash_msg.html"/>
33 32 <!-- login -->
34 33 <div class="sign-in-title">
35 34 <h1>${_('Sign In')}</h1>
@@ -57,9 +56,21 b''
57 56 ${h.checkbox('remember', value=True, checked=defaults.get('remember'))}
58 57 <label class="checkbox" for="remember">${_('Remember me')}</label>
59 58
60 <p class="links">
61 ${h.link_to(_('Forgot your password?'), h.route_path('reset_password'))}
62 </p>
59 %if h.HasPermissionAny('hg.password_reset.enabled')():
60 <p class="links">
61 ${h.link_to(_('Forgot your password?'), h.route_path('reset_password'), class_='pwd_reset')}
62 </p>
63 %elif h.HasPermissionAny('hg.password_reset.hidden')():
64 <p class="help-block">
65 ${_('Password reset is disabled. Please contact ')}
66 % if c.visual.rhodecode_support_url:
67 <a href="${c.visual.rhodecode_support_url}" target="_blank">${_('Support')}</a>
68 ${_('or')}
69 % endif
70 ${_('an administrator if you need help.')}
71 </p>
72 %endif
73
63 74
64 75 ${h.submit('sign_in', _('Sign In'), class_="btn sign-in")}
65 76
@@ -2,7 +2,7 b''
2 2 <%inherit file="base/root.html"/>
3 3
4 4 <%def name="title()">
5 ${_('Create an Account')}
5 ${_('Reset Password')}
6 6 %if c.rhodecode_name:
7 7 &middot; ${h.branding(c.rhodecode_name)}
8 8 %endif
@@ -28,40 +28,52 b''
28 28 <img class="sign-in-image" src="${h.asset('images/sign-in.png')}" alt="RhodeCode"/>
29 29 </div>
30 30
31 <div id="register" class="right-column">
32 <%include file="/base/flash_msg.html"/>
33 <!-- login -->
34 <div class="sign-in-title">
35 <h1>${_('Reset your Password')}</h1>
36 <h4>${h.link_to(_("Go to the login page to sign in."), request.route_path('login'))}</h4>
31 %if h.HasPermissionAny('hg.password_reset.disabled')():
32 <div class="right-column">
33 <p>
34 ${_('Password reset is disabled. Please contact ')}
35 % if c.visual.rhodecode_support_url:
36 <a href="${c.visual.rhodecode_support_url}" target="_blank">${_('Support')}</a>
37 ${_('or')}
38 % endif
39 ${_('an administrator if you need help.')}
40 </p>
37 41 </div>
38 <div class="inner form">
39 ${h.form(request.route_path('reset_password'), needs_csrf_token=False)}
40 <label for="email">${_('Email Address')}:</label>
41 ${h.text('email', defaults.get('email'))}
42 %if 'email' in errors:
43 <span class="error-message">${errors.get('email')}</span>
44 <br />
45 %endif
46
47 %if captcha_active:
48 <div class="login-captcha"
49 <label for="email">${_('Captcha')}:</label>
50 ${h.hidden('recaptcha_field')}
51 <div id="recaptcha"></div>
52 %if 'recaptcha_field' in errors:
53 <span class="error-message">${errors.get('recaptcha_field')}</span>
42 %else:
43 <div id="register" class="right-column">
44 <!-- login -->
45 <div class="sign-in-title">
46 <h1>${_('Reset your Password')}</h1>
47 <h4>${h.link_to(_("Go to the login page to sign in."), request.route_path('login'))}</h4>
48 </div>
49 <div class="inner form">
50 ${h.form(request.route_path('reset_password'), needs_csrf_token=False)}
51 <label for="email">${_('Email Address')}:</label>
52 ${h.text('email', defaults.get('email'))}
53 %if 'email' in errors:
54 <span class="error-message">${errors.get('email')}</span>
54 55 <br />
55 56 %endif
56 </div>
57 %endif
58
59 ${h.submit('send', _('Send password reset email'), class_="btn sign-in")}
60 <div class="activation_msg">${_('Password reset link will be sent to matching email address')}</div>
61
62 ${h.end_form()}
57
58 %if captcha_active:
59 <div class="login-captcha"
60 <label for="email">${_('Captcha')}:</label>
61 ${h.hidden('recaptcha_field')}
62 <div id="recaptcha"></div>
63 %if 'recaptcha_field' in errors:
64 <span class="error-message">${errors.get('recaptcha_field')}</span>
65 <br />
66 %endif
67 </div>
68 %endif
69
70 ${h.submit('send', _('Send password reset email'), class_="btn sign-in")}
71 <div class="activation_msg">${_('Password reset link will be sent to matching email address')}</div>
72
73 ${h.end_form()}
74 </div>
63 75 </div>
64 </div>
76 %endif
65 77 </div>
66 78 </div>
67 79
@@ -111,7 +111,9 b''
111 111 </div>
112 112 <div id="reviewers" class="block-right pr-details-content reviewers">
113 113 ## members goes here, filled via JS based on initial selection !
114 <input type="hidden" name="__start__" value="review_members:sequence">
114 115 <ul id="review_members" class="group_members"></ul>
116 <input type="hidden" name="__end__" value="review_members:sequence">
115 117 <div id="add_reviewer_input" class='ac'>
116 118 <div class="reviewer_ac">
117 119 ${h.text('user', class_='ac-input', placeholder=_('Add reviewer'))}
@@ -439,13 +441,6 b''
439 441 };
440 442
441 443 var targetRepoChanged = function(repoData) {
442 // reset && add the reviewer based on selected repo
443 $('#review_members').html('');
444 addReviewMember(
445 repoData.user.user_id, repoData.user.firstname,
446 repoData.user.lastname, repoData.user.username,
447 repoData.user.gravatar_link);
448
449 444 // generate new DESC of target repo displayed next to select
450 445 $('#target_repo_desc').html(
451 446 "<strong>${_('Destination repository')}</strong>: {0}".format(repoData['description'])
@@ -488,10 +483,12 b''
488 483
489 484 $sourceRef.on('change', function(e){
490 485 loadRepoRefDiffPreview();
486 loadDefaultReviewers();
491 487 });
492 488
493 489 $targetRef.on('change', function(e){
494 490 loadRepoRefDiffPreview();
491 loadDefaultReviewers();
495 492 });
496 493
497 494 $targetRepo.on('change', function(e){
@@ -518,6 +515,36 b''
518 515
519 516 });
520 517
518 var loadDefaultReviewers = function() {
519 if (loadDefaultReviewers._currentRequest) {
520 loadDefaultReviewers._currentRequest.abort();
521 }
522 var url = pyroutes.url('repo_default_reviewers_data', {'repo_name': targetRepoName});
523
524 var sourceRepo = $sourceRepo.eq(0).val();
525 var sourceRef = $sourceRef.eq(0).val().split(':');
526 var targetRepo = $targetRepo.eq(0).val();
527 var targetRef = $targetRef.eq(0).val().split(':');
528 url += '?source_repo=' + sourceRepo;
529 url += '&source_ref=' + sourceRef[2];
530 url += '&target_repo=' + targetRepo;
531 url += '&target_ref=' + targetRef[2];
532
533 loadDefaultReviewers._currentRequest = $.get(url)
534 .done(function(data) {
535 loadDefaultReviewers._currentRequest = null;
536
537 // reset && add the reviewer based on selected repo
538 $('#review_members').html('');
539 for (var i = 0; i < data.reviewers.length; i++) {
540 var reviewer = data.reviewers[i];
541 addReviewMember(
542 reviewer.user_id, reviewer.firstname,
543 reviewer.lastname, reviewer.username,
544 reviewer.gravatar_link, reviewer.reasons);
545 }
546 });
547 };
521 548 prButtonLock(true, "${_('Please select origin and destination')}");
522 549
523 550 // auto-load on init, the target refs select2
@@ -532,6 +559,7 b''
532 559 // in case we have a pre-selected value, use it now
533 560 $sourceRef.select2('val', '${c.default_source_ref}');
534 561 loadRepoRefDiffPreview();
562 loadDefaultReviewers();
535 563 %endif
536 564
537 565 ReviewerAutoComplete('user');
@@ -47,8 +47,18 b''
47 47 <div class="pr-details-title">
48 48 ${_('Pull request #%s') % c.pull_request.pull_request_id} ${_('From')} ${h.format_date(c.pull_request.created_on)}
49 49 %if c.allowed_to_update:
50 <span id="open_edit_pullrequest" class="block-right action_button">${_('Edit')}</span>
51 <span id="close_edit_pullrequest" class="block-right action_button" style="display: none;">${_('Close')}</span>
50 <div id="delete_pullrequest" class="pull-right action_button ${'' if c.allowed_to_delete else 'disabled' }" style="clear:inherit;padding: 0">
51 % if c.allowed_to_delete:
52 ${h.secure_form(url('pullrequest_delete', repo_name=c.pull_request.target_repo.repo_name, pull_request_id=c.pull_request.pull_request_id),method='delete')}
53 ${h.submit('remove_%s' % c.pull_request.pull_request_id, _('Delete'),
54 class_="btn btn-link btn-danger",onclick="return confirm('"+_('Confirm to delete this pull request')+"');")}
55 ${h.end_form()}
56 % else:
57 ${_('Delete')}
58 % endif
59 </div>
60 <div id="open_edit_pullrequest" class="pull-right action_button">${_('Edit')}</div>
61 <div id="close_edit_pullrequest" class="pull-right action_button" style="display: none;padding: 0">${_('Cancel edit')}</div>
52 62 %endif
53 63 </div>
54 64
@@ -100,6 +110,25 b''
100 110 </div>
101 111 </div>
102 112 </div>
113
114 ## Link to the shadow repository.
115 %if not c.pull_request.is_closed() and c.pull_request.shadow_merge_ref:
116 <div class="field">
117 <div class="label-summary">
118 <label>Merge:</label>
119 </div>
120 <div class="input">
121 <div class="pr-mergeinfo">
122 %if h.is_hg(c.pull_request.target_repo):
123 <input type="text" value="hg clone -u ${c.pull_request.shadow_merge_ref.name} ${c.shadow_clone_url} pull-request-${c.pull_request.pull_request_id}" readonly="readonly">
124 %elif h.is_git(c.pull_request.target_repo):
125 <input type="text" value="git clone --branch ${c.pull_request.shadow_merge_ref.name} ${c.shadow_clone_url} pull-request-${c.pull_request.pull_request_id}" readonly="readonly">
126 %endif
127 </div>
128 </div>
129 </div>
130 %endif
131
103 132 <div class="field">
104 133 <div class="label-summary">
105 134 <label>${_('Review')}:</label>
@@ -188,17 +217,27 b''
188 217 </div>
189 218 <div id="reviewers" class="block-right pr-details-content reviewers">
190 219 ## members goes here !
220 <input type="hidden" name="__start__" value="review_members:sequence">
191 221 <ul id="review_members" class="group_members">
192 %for member,status in c.pull_request_reviewers:
222 %for member,reasons,status in c.pull_request_reviewers:
193 223 <li id="reviewer_${member.user_id}">
194 224 <div class="reviewers_member">
195 225 <div class="reviewer_status tooltip" title="${h.tooltip(h.commit_status_lbl(status[0][1].status if status else 'not_reviewed'))}">
196 226 <div class="${'flag_status %s' % (status[0][1].status if status else 'not_reviewed')} pull-left reviewer_member_status"></div>
197 227 </div>
198 228 <div id="reviewer_${member.user_id}_name" class="reviewer_name">
199 ${self.gravatar_with_user(member.email, 16)} <div class="reviewer">(${_('owner') if c.pull_request.user_id == member.user_id else _('reviewer')})</div>
229 ${self.gravatar_with_user(member.email, 16)}
200 230 </div>
201 <input id="reviewer_${member.user_id}_input" type="hidden" value="${member.user_id}" name="review_members" />
231 <input type="hidden" name="__start__" value="reviewer:mapping">
232 <input type="hidden" name="__start__" value="reasons:sequence">
233 %for reason in reasons:
234 <div class="reviewer_reason">- ${reason}</div>
235 <input type="hidden" name="reason" value="${reason}">
236
237 %endfor
238 <input type="hidden" name="__end__" value="reasons:sequence">
239 <input id="reviewer_${member.user_id}_input" type="hidden" value="${member.user_id}" name="user_id" />
240 <input type="hidden" name="__end__" value="reviewer:mapping">
202 241 %if c.allowed_to_update:
203 242 <div class="reviewer_member_remove action_button" onclick="removeReviewMember(${member.user_id}, true)" style="visibility: hidden;">
204 243 <i class="icon-remove-sign" ></i>
@@ -208,6 +247,7 b''
208 247 </li>
209 248 %endfor
210 249 </ul>
250 <input type="hidden" name="__end__" value="review_members:sequence">
211 251 %if not c.pull_request.is_closed():
212 252 <div id="add_reviewer_input" class='ac' style="display: none;">
213 253 %if c.allowed_to_update:
@@ -259,143 +299,18 b''
259 299 </div>
260 300 % if not c.missing_commits:
261 301 <%include file="/compare/compare_commits.html" />
262 ## FILES
263 <div class="cs_files_title">
264 <span class="cs_files_expand">
265 <span id="expand_all_files">${_('Expand All')}</span> | <span id="collapse_all_files">${_('Collapse All')}</span>
266 </span>
267 <h2>
268 ${diff_block.diff_summary_text(len(c.files), c.lines_added, c.lines_deleted, c.limited_diff)}
269 </h2>
270 </div>
271 % endif
272 302 <div class="cs_files">
273 %if not c.files and not c.missing_commits:
274 <span class="empty_data">${_('No files')}</span>
275 %endif
276 <table class="compare_view_files">
277 <%namespace name="diff_block" file="/changeset/diff_block.html"/>
278 %for FID, change, path, stats in c.files:
279 <tr class="cs_${change} collapse_file" fid="${FID}">
280 <td class="cs_icon_td">
281 <span class="collapse_file_icon" fid="${FID}"></span>
282 </td>
283 <td class="cs_icon_td">
284 <div class="flag_status not_reviewed hidden"></div>
285 </td>
286 <td class="cs_${change}" id="a_${FID}">
287 <div class="node">
288 <a href="#a_${FID}">
289 <i class="icon-file-${change.lower()}"></i>
290 ${h.safe_unicode(path)}
291 </a>
292 </div>
293 </td>
294 <td>
295 <div class="changes pull-right">${h.fancy_file_stats(stats)}</div>
296 <div class="comment-bubble pull-right" data-path="${path}">
297 <i class="icon-comment"></i>
298 </div>
299 </td>
300 </tr>
301 <tr fid="${FID}" id="diff_${FID}" class="diff_links">
302 <td></td>
303 <td></td>
304 <td class="cs_${change}">
305 %if c.target_repo.repo_name == c.repo_name:
306 ${diff_block.diff_menu(c.repo_name, h.safe_unicode(path), c.target_ref, c.source_ref, change)}
307 %else:
308 ## this is slightly different case later, since the other repo can have this
309 ## file in other state than the origin repo
310 ${diff_block.diff_menu(c.target_repo.repo_name, h.safe_unicode(path), c.target_ref, c.source_ref, change)}
311 %endif
312 </td>
313 <td class="td-actions rc-form">
314 <div data-comment-id="${FID}" class="btn-link show-inline-comments comments-visible">
315 <span class="comments-show">${_('Show comments')}</span>
316 <span class="comments-hide">${_('Hide comments')}</span>
317 </div>
318 </td>
319 </tr>
320 <tr id="tr_${FID}">
321 <td></td>
322 <td></td>
323 <td class="injected_diff" colspan="2">
324 ${diff_block.diff_block_simple([c.changes[FID]])}
325 </td>
326 </tr>
303 <%namespace name="cbdiffs" file="/codeblocks/diffs.html"/>
304 ${cbdiffs.render_diffset_menu()}
305 ${cbdiffs.render_diffset(
306 c.diffset, use_comments=True,
307 collapse_when_files_over=30,
308 disable_new_comments=c.pull_request.is_closed())}
327 309
328 ## Loop through inline comments
329 % if c.outdated_comments.get(path,False):
330 <tr class="outdated">
331 <td></td>
332 <td></td>
333 <td colspan="2">
334 <p>${_('Outdated Inline Comments')}:</p>
335 </td>
336 </tr>
337 <tr class="outdated">
338 <td></td>
339 <td></td>
340 <td colspan="2" class="outdated_comment_block">
341 % for line, comments in c.outdated_comments[path].iteritems():
342 <div class="inline-comment-placeholder" path="${path}" target_id="${h.safeid(h.safe_unicode(path))}">
343 % for co in comments:
344 ${comment.comment_block_outdated(co)}
345 % endfor
346 </div>
347 % endfor
348 </td>
349 </tr>
350 % endif
351 %endfor
352 ## Loop through inline comments for deleted files
353 %for path in c.deleted_files:
354 <tr class="outdated deleted">
355 <td></td>
356 <td></td>
357 <td>${path}</td>
358 </tr>
359 <tr class="outdated deleted">
360 <td></td>
361 <td></td>
362 <td>(${_('Removed')})</td>
363 </tr>
364 % if path in c.outdated_comments:
365 <tr class="outdated deleted">
366 <td></td>
367 <td></td>
368 <td colspan="2">
369 <p>${_('Outdated Inline Comments')}:</p>
370 </td>
371 </tr>
372 <tr class="outdated">
373 <td></td>
374 <td></td>
375 <td colspan="2" class="outdated_comment_block">
376 % for line, comments in c.outdated_comments[path].iteritems():
377 <div class="inline-comment-placeholder" path="${path}" target_id="${h.safeid(h.safe_unicode(path))}">
378 % for co in comments:
379 ${comment.comment_block_outdated(co)}
380 % endfor
381 </div>
382 % endfor
383 </td>
384 </tr>
385 % endif
386 %endfor
387 </table>
388 </div>
389 % if c.limited_diff:
390 <h5>${_('Commit was too big and was cut off...')} <a href="${h.url.current(fulldiff=1, **request.GET.mixed())}" onclick="return confirm('${_("Showing a huge diff might take some time and resources")}')">${_('Show full diff')}</a></h5>
310 </div>
391 311 % endif
392 </div>
393 312 </div>
394 313
395 % if c.limited_diff:
396 <p>${_('Commit was too big and was cut off...')} <a href="${h.url.current(fulldiff=1, **request.GET.mixed())}" onclick="return confirm('${_("Showing a huge diff might take some time and resources")}')">${_('Show full diff')}</a></p>
397 % endif
398
399 314 ## template for inline comment form
400 315 <%namespace name="comment" file="/changeset/changeset_file_comment.html"/>
401 316 ${comment.comment_inline_form()}
@@ -427,6 +342,7 b''
427 342 var PRDetails = {
428 343 editButton: $('#open_edit_pullrequest'),
429 344 closeButton: $('#close_edit_pullrequest'),
345 deleteButton: $('#delete_pullrequest'),
430 346 viewFields: $('#pr-desc, #pr-title'),
431 347 editFields: $('#pr-desc-edit, #pr-title-edit, #pr-save'),
432 348
@@ -439,11 +355,15 b''
439 355 edit: function(event) {
440 356 this.viewFields.hide();
441 357 this.editButton.hide();
358 this.deleteButton.hide();
359 this.closeButton.show();
442 360 this.editFields.show();
443 361 codeMirrorInstance.refresh();
444 362 },
445 363
446 364 view: function(event) {
365 this.editButton.show();
366 this.deleteButton.show();
447 367 this.editFields.hide();
448 368 this.closeButton.hide();
449 369 this.viewFields.show();
@@ -474,7 +394,7 b''
474 394 this.closeButton.hide();
475 395 this.addButton.hide();
476 396 this.removeButtons.css('visibility', 'hidden');
477 }
397 },
478 398 };
479 399
480 400 PRDetails.init();
@@ -569,16 +489,16 b''
569 489 $('.show-inline-comments').on('click', function(e){
570 490 var boxid = $(this).attr('data-comment-id');
571 491 var button = $(this);
572
492
573 493 if(button.hasClass("comments-visible")) {
574 494 $('#{0} .inline-comments'.format(boxid)).each(function(index){
575 495 $(this).hide();
576 })
496 });
577 497 button.removeClass("comments-visible");
578 498 } else {
579 499 $('#{0} .inline-comments'.format(boxid)).each(function(index){
580 500 $(this).show();
581 })
501 });
582 502 button.addClass("comments-visible");
583 503 }
584 504 });
@@ -29,7 +29,6 b''
29 29 </div>
30 30 <%block name="above_register_button" />
31 31 <div id="register" class="right-column">
32 <%include file="/base/flash_msg.html"/>
33 32 <!-- login -->
34 33 <div class="sign-in-title">
35 34 <h1>${_('Create an account')}</h1>
@@ -224,9 +224,11 b' def assert_session_flash(response=None, '
224 224 raise ValueError("Parameter msg is required.")
225 225
226 226 messages = flash.pop_messages()
227 msg = _eval_if_lazy(msg)
228
229 assert messages, 'unable to find message `%s` in empty flash list' % msg
227 230 message = messages[0]
228 231
229 msg = _eval_if_lazy(msg)
230 232 message_text = _eval_if_lazy(message.message)
231 233
232 234 if msg not in message_text:
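The hunk above reorders `assert_session_flash` so the expected message is evaluated before the flash queue is indexed: an empty queue now fails with a readable assertion instead of an `IndexError` on `messages[0]`. A standalone sketch of the reordered logic (this mirrors the helper's shape for illustration; it is not the RhodeCode implementation, and `_eval_if_lazy` here is a stand-in for the real lazy-string evaluator):

```python
def _eval_if_lazy(value):
    # Stand-in for the real helper: lazy translation strings become
    # plain strings on str(); everything else passes through.
    return str(value) if value is not None else value

def assert_session_flash(messages, msg):
    if msg is None:
        raise ValueError("Parameter msg is required.")
    # Evaluate the expected message first, so the guard below can
    # include it in its failure text.
    msg = _eval_if_lazy(msg)
    # Guard against an empty flash queue *before* indexing messages[0],
    # turning a bare IndexError into a descriptive assertion failure.
    assert messages, 'unable to find message `%s` in empty flash list' % msg
    message_text = _eval_if_lazy(messages[0])
    assert msg in message_text
```

The only behavioral change is ordering: the `msg` evaluation moves above the emptiness check so both failure paths report the message being searched for.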
@@ -115,10 +115,11 b' class TestSanitizeVcsSettings(object):'
115 115 _string_settings = [
116 116 ('vcs.svn.compatible_version', ''),
117 117 ('git_rev_filter', '--all'),
118 ('vcs.hooks.protocol', 'pyro4'),
118 ('vcs.hooks.protocol', 'http'),
119 ('vcs.scm_app_implementation', 'http'),
119 120 ('vcs.server', ''),
120 121 ('vcs.server.log_level', 'debug'),
121 ('vcs.server.protocol', 'pyro4'),
122 ('vcs.server.protocol', 'http'),
122 123 ]
123 124
124 125 _list_settings = [
@@ -27,7 +27,7 b' from rhodecode.model.pull_request import'
27 27 from rhodecode.tests import assert_session_flash
28 28
29 29
30 def test_merge_pull_request_renders_failure_reason(user_regular):
30 def test_merge_pull_request_renders_failure_reason(app, user_regular):
31 31 pull_request = mock.Mock()
32 32 controller = pullrequests.PullrequestsController()
33 33 model_patcher = mock.patch.multiple(
@@ -18,7 +18,7 b''
18 18 # RhodeCode Enterprise Edition, including its added features, Support services,
19 19 # and proprietary license terms, please see https://rhodecode.com/licenses/
20 20
21 from subprocess import Popen, PIPE
21 from subprocess32 import Popen, PIPE
22 22 import os
23 23 import shutil
24 24 import sys
@@ -45,6 +45,7 b' def scm_extras(user_regular, repo_stub):'
45 45 'make_lock': None,
46 46 'locked_by': [None],
47 47 'commit_ids': ['a' * 40] * 3,
48 'is_shadow_repo': False,
48 49 })
49 50 return extras
50 51
@@ -76,8 +76,8 b' class TestINI(object):'
76 76
77 77 for data in self.ini_params:
78 78 section, ini_params = data.items()[0]
79 key, val = ini_params.items()[0]
80 config[section][key] = val
79 for key, val in ini_params.items():
80 config[section][key] = val
81 81 with tempfile.NamedTemporaryFile(
82 82 prefix=self.new_path_prefix, suffix='.ini', dir=self._dir,
83 83 delete=False) as new_ini_file:
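The `TestINI` fix above replaces taking only the first key/value pair of each section with a loop over all of them. A minimal sketch of the corrected loop using the standard-library `configparser` (the `ini_params` shape mirrors the test helper; the section and key names here are illustrative):

```python
import configparser

# Same shape as the TestINI helper's ini_params: a list of
# one-section dicts, each mapping a section name to its overrides.
ini_params = [
    {'server:main': {'host': '127.0.0.1', 'port': '5000'}},
    {'app:main': {'debug': 'false'}},
]

config = configparser.ConfigParser()
for data in ini_params:
    section, params = list(data.items())[0]
    if not config.has_section(section):
        config.add_section(section)
    # The bug being fixed: only the first pair was applied before.
    # Iterating items() applies every override in the section.
    for key, val in params.items():
        config.set(section, key, val)
```

With the old single-pair version, a section carrying both `host` and `port` would silently drop one of them; the loop makes multi-key sections work.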
@@ -85,14 +85,13 b' class TestMyAccountController(TestContro'
85 85 def test_my_account_my_pullrequests(self, pr_util):
86 86 self.log_user()
87 87 response = self.app.get(url('my_account_pullrequests'))
88 response.mustcontain('You currently have no open pull requests.')
88 response.mustcontain('There are currently no open pull '
89 'requests requiring your participation.')
89 90
90 91 pr = pr_util.create_pull_request(title='TestMyAccountPR')
91 92 response = self.app.get(url('my_account_pullrequests'))
92 response.mustcontain('There are currently no open pull requests '
93 'requiring your participation')
94
95 response.mustcontain('#%s: TestMyAccountPR' % pr.pull_request_id)
93 response.mustcontain('"name_raw": %s' % pr.pull_request_id)
94 response.mustcontain('TestMyAccountPR')
96 95
97 96 def test_my_account_my_emails(self):
98 97 self.log_user()
@@ -40,22 +40,22 b' class TestAdminPermissionsController(Tes'
40 40 self.app.get(url('admin_permissions_application'))
41 41
42 42 @pytest.mark.parametrize(
43 'anonymous, default_register, default_register_message,'
43 'anonymous, default_register, default_register_message, default_password_reset,'
44 44 'default_extern_activate, expect_error, expect_form_error', [
45 (True, 'hg.register.none', '', 'hg.extern_activate.manual',
45 (True, 'hg.register.none', '', 'hg.password_reset.enabled', 'hg.extern_activate.manual',
46 46 False, False),
47 (True, 'hg.register.manual_activate', '', 'hg.extern_activate.auto',
47 (True, 'hg.register.manual_activate', '', 'hg.password_reset.enabled', 'hg.extern_activate.auto',
48 48 False, False),
49 (True, 'hg.register.auto_activate', '', 'hg.extern_activate.manual',
49 (True, 'hg.register.auto_activate', '', 'hg.password_reset.enabled', 'hg.extern_activate.manual',
50 50 False, False),
51 (True, 'hg.register.auto_activate', '', 'hg.extern_activate.manual',
51 (True, 'hg.register.auto_activate', '', 'hg.password_reset.enabled', 'hg.extern_activate.manual',
52 52 False, False),
53 (True, 'hg.register.XXX', '', 'hg.extern_activate.manual',
53 (True, 'hg.register.XXX', '', 'hg.password_reset.enabled', 'hg.extern_activate.manual',
54 54 False, True),
55 (True, '', '', '', True, False),
55 (True, '', '', 'hg.password_reset.enabled', '', True, False),
56 56 ])
57 57 def test_update_application_permissions(
58 self, anonymous, default_register, default_register_message,
58 self, anonymous, default_register, default_register_message, default_password_reset,
59 59 default_extern_activate, expect_error, expect_form_error):
60 60
61 61 self.log_user()
@@ -66,6 +66,7 b' class TestAdminPermissionsController(Tes'
66 66 'anonymous': anonymous,
67 67 'default_register': default_register,
68 68 'default_register_message': default_register_message,
69 'default_password_reset': default_password_reset,
69 70 'default_extern_activate': default_extern_activate,
70 71 }
71 72 response = self.app.post(url('admin_permissions_application'),
@@ -195,6 +195,8 b' class TestAdminSettingsGlobal:'
195 195 'rhodecode_post_code': '',
196 196 'rhodecode_captcha_private_key': '',
197 197 'rhodecode_captcha_public_key': '',
198 'rhodecode_create_personal_repo_group': False,
199 'rhodecode_personal_repo_group_pattern': '${username}',
198 200 }
199 201 params.update(settings)
200 202 response = self.app.post(url('admin_settings_global'), params=params)
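The new `rhodecode_personal_repo_group_pattern` default above uses a `${username}`-style placeholder. Assuming conventional template expansion (an illustration only; the actual substitution logic lives inside RhodeCode, not in this sketch), the behavior can be modeled with `string.Template`:

```python
from string import Template

# '${username}' is the default pattern from the settings form above;
# expanding it with a username yields the personal repo group path.
pattern = '${username}'
personal_group = Template(pattern).safe_substitute(username='jdoe')

# A nested pattern would expand the same way.
nested = Template('/u/${username}').safe_substitute(username='jdoe')
```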
@@ -494,6 +496,14 b' class TestAdminSettingsIssueTracker:'
494 496 response = self.app.get(url('admin_settings_issuetracker'))
495 497 assert response.status_code == 200
496 498
499 def test_add_empty_issuetracker_pattern(
500 self, request, autologin_user, csrf_token):
501 post_url = url('admin_settings_issuetracker_save')
502 post_data = {
503 'csrf_token': csrf_token
504 }
505 self.app.post(post_url, post_data, status=302)
506
497 507 def test_add_issuetracker_pattern(
498 508 self, request, autologin_user, csrf_token):
499 509 pattern = 'issuetracker_pat'
@@ -36,7 +36,7 b' class TestAdminUsersGroupsController(Tes'
36 36 def test_index(self):
37 37 self.log_user()
38 38 response = self.app.get(url('users_groups'))
39 response.status_int == 200
39 assert response.status_int == 200
40 40
41 41 def test_create(self):
42 42 self.log_user()
@@ -148,19 +148,21 b' class TestAdminUsersGroupsController(Tes'
148 148
149 149 fixture.destroy_user_group(users_group_name)
150 150
151 def test_edit(self):
151 def test_edit_autocomplete(self):
152 152 self.log_user()
153 153 ug = fixture.create_user_group(TEST_USER_GROUP, skip_if_exists=True)
154 154 response = self.app.get(
155 155 url('edit_users_group', user_group_id=ug.users_group_id))
156 156 fixture.destroy_user_group(TEST_USER_GROUP)
157 157
158 def test_edit_user_group_members(self):
158 def test_edit_user_group_autocomplete_members(self, xhr_header):
159 159 self.log_user()
160 160 ug = fixture.create_user_group(TEST_USER_GROUP, skip_if_exists=True)
161 161 response = self.app.get(
162 url('edit_user_group_members', user_group_id=ug.users_group_id))
163 response.mustcontain('No members yet')
162 url('edit_user_group_members', user_group_id=ug.users_group_id),
163 extra_environ=xhr_header)
164
165 assert response.body == '{"members": []}'
164 166 fixture.destroy_user_group(TEST_USER_GROUP)
165 167
166 168 def test_usergroup_escape(self):
@@ -181,7 +183,7 b' class TestAdminUsersGroupsController(Tes'
181 183 'csrf_token': self.csrf_token
182 184 }
183 185
184 response = self.app.post(url('users_groups'), data)
186 self.app.post(url('users_groups'), data)
185 187 response = self.app.get(url('users_groups'))
186 188
187 189 response.mustcontain(
@@ -190,3 +192,42 b' class TestAdminUsersGroupsController(Tes'
190 192 response.mustcontain(
191 193 '&lt;img src=&#34;/image2&#34; onload=&#34;'
192 194 'alert(&#39;Hello, World!&#39;);&#34;&gt;')
195
196 def test_update_members_from_user_ids(self, user_regular):
197 uid = user_regular.user_id
198 username = user_regular.username
199 self.log_user()
200
201 user_group = fixture.create_user_group('test_gr_ids')
202 assert user_group.members == []
203 assert user_group.user != user_regular
204 expected_active_state = not user_group.users_group_active
205
206 form_data = [
207 ('csrf_token', self.csrf_token),
208 ('_method', 'put'),
209 ('user', username),
210 ('users_group_name', 'changed_name'),
211 ('users_group_active', expected_active_state),
212 ('user_group_description', 'changed_description'),
213
214 ('__start__', 'user_group_members:sequence'),
215 ('__start__', 'member:mapping'),
216 ('member_user_id', uid),
217 ('type', 'existing'),
218 ('__end__', 'member:mapping'),
219 ('__end__', 'user_group_members:sequence'),
220 ]
221 ugid = user_group.users_group_id
222 self.app.post(url('update_users_group', user_group_id=ugid), form_data)
223
224 user_group = UserGroup.get(ugid)
225 assert user_group
226
227 assert user_group.members[0].user_id == uid
228 assert user_group.user_id == uid
229 assert 'changed_name' in user_group.users_group_name
230 assert 'changed_description' in user_group.user_group_description
231 assert user_group.users_group_active == expected_active_state
232
233 fixture.destroy_user_group(user_group)
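The `__start__`/`__end__` pairs in `form_data` above follow the deform/peppercorn convention for flattening nested sequences and mappings into ordered form fields. A minimal parser sketch showing how those markers nest back into Python structures (this illustrates the convention only; it is not the peppercorn library's implementation):

```python
def parse_peppercorn(fields):
    """Rebuild nested dicts/lists from flat (name, value) form pairs
    using '__start__'/'__end__' markers of the form 'name:kind'."""
    it = iter(fields)

    def parse_group(kind):
        result = [] if kind == 'sequence' else {}
        for name, value in it:
            if name == '__start__':
                sub_name, sub_kind = value.split(':')
                child = parse_group(sub_kind)
                if kind == 'sequence':
                    result.append(child)
                else:
                    result[sub_name] = child
            elif name == '__end__':
                # Close the current group and hand it to the parent.
                return result
            elif kind == 'sequence':
                result.append(value)
            else:
                result[name] = value
        return result

    # The top level is treated as an implicit mapping.
    return parse_group('mapping')

# The reviewer encoding used by the pull-request tests in this diff:
fields = [
    ('__start__', 'review_members:sequence'),
    ('__start__', 'reviewer:mapping'),
    ('user_id', '1'),
    ('__start__', 'reasons:sequence'),
    ('reason', 'Some reason'),
    ('__end__', 'reasons:sequence'),
    ('__end__', 'reviewer:mapping'),
    ('__end__', 'review_members:sequence'),
]
parsed = parse_peppercorn(fields)
```

This is why the tests switched from dict `params` to ordered tuple lists: the marker pairs only make sense when field order is preserved.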
@@ -82,8 +82,7 b' class TestChangelogController(TestContro'
82 82 assert expected_url in response.location
83 83 response = response.follow()
84 84 expected_warning = 'Branch {} is not found.'.format(branch)
85 assert_response = AssertResponse(response)
86 assert_response.element_contains('.alert-warning', expected_warning)
85 assert expected_warning in response.body
87 86
88 87 def assert_commits_on_page(self, response, indexes):
89 88 found_indexes = [int(idx) for idx in MATCH_HASH.findall(response.body)]
@@ -82,7 +82,7 b' class TestChangesetController(object):'
82 82 response.mustcontain('new file 100644')
83 83 response.mustcontain('Changed theme to ADC theme') # commit msg
84 84
85 self._check_diff_menus(response, right_menu=True)
85 self._check_new_diff_menus(response, right_menu=True)
86 86
87 87 def test_commit_range_page_different_ops(self, backend):
88 88 commit_id_range = {
@@ -104,14 +104,17 b' class TestChangesetController(object):'
104 104
105 105 response.mustcontain(_shorten_commit_id(commit_ids[0]))
106 106 response.mustcontain(_shorten_commit_id(commit_ids[1]))
107
107
108 108 # svn is special
109 109 if backend.alias == 'svn':
110 110 response.mustcontain('new file 10644')
111 response.mustcontain('34 files changed: 1184 inserted, 311 deleted')
111 response.mustcontain('1 file changed: 5 inserted, 1 deleted')
112 response.mustcontain('12 files changed: 236 inserted, 22 deleted')
113 response.mustcontain('21 files changed: 943 inserted, 288 deleted')
112 114 else:
113 115 response.mustcontain('new file 100644')
114 response.mustcontain('33 files changed: 1165 inserted, 308 deleted')
116 response.mustcontain('12 files changed: 222 inserted, 20 deleted')
117 response.mustcontain('21 files changed: 943 inserted, 288 deleted')
115 118
116 119 # files op files
117 120 response.mustcontain('File no longer present at commit: %s' %
@@ -119,7 +122,7 b' class TestChangesetController(object):'
119 122 response.mustcontain('Added docstrings to vcs.cli') # commit msg
120 123 response.mustcontain('Changed theme to ADC theme') # commit msg
121 124
122 self._check_diff_menus(response)
125 self._check_new_diff_menus(response)
123 126
124 127 def test_combined_compare_commit_page_different_ops(self, backend):
125 128 commit_id_range = {
@@ -146,7 +149,7 b' class TestChangesetController(object):'
146 149 # files op files
147 150 response.mustcontain('File no longer present at commit: %s' %
148 151 _shorten_commit_id(commit_ids[1]))
149
152
150 153 # svn is special
151 154 if backend.alias == 'svn':
152 155 response.mustcontain('new file 10644')
@@ -158,7 +161,7 b' class TestChangesetController(object):'
158 161 response.mustcontain('Added docstrings to vcs.cli') # commit msg
159 162 response.mustcontain('Changed theme to ADC theme') # commit msg
160 163
161 self._check_diff_menus(response)
164 self._check_new_diff_menus(response)
162 165
163 166 def test_changeset_range(self, backend):
164 167 self._check_changeset_range(
@@ -273,7 +276,7 b' Added a symlink'
273 276 """ + diffs['svn'],
274 277 }
275 278
276 def _check_diff_menus(self, response, right_menu=False):
 279     def _check_diff_menus(self, response, right_menu=False):
277 280 # diff menus
278 281 for elem in ['Show File', 'Unified Diff', 'Side-by-side Diff',
279 282 'Raw Diff', 'Download Diff']:
@@ -284,3 +287,16 b' Added a symlink'
284 287 for elem in ['Ignore whitespace', 'Increase context',
285 288 'Hide comments']:
286 289 response.mustcontain(elem)
290
291
 292     def _check_new_diff_menus(self, response, right_menu=False):
293 # diff menus
294 for elem in ['Show file before', 'Show file after',
295 'Raw diff', 'Download diff']:
296 response.mustcontain(elem)
297
298 # right pane diff menus
299 if right_menu:
300 for elem in ['Ignore whitespace', 'Increase context',
301 'Hide comments']:
302 response.mustcontain(elem)
@@ -109,11 +109,17 b' class TestCommitCommentsController(TestC'
109 109 # test DB
110 110 assert ChangesetComment.query().count() == 1
111 111 assert_comment_links(response, 0, ChangesetComment.query().count())
112 response.mustcontain(
113 '''class="inline-comment-placeholder" '''
114 '''path="vcs/web/simplevcs/views/repository.py" '''
115 '''target_id="vcswebsimplevcsviewsrepositorypy"'''
116 )
112
113 if backend.alias == 'svn':
114 response.mustcontain(
115 '''data-f-path="vcs/commands/summary.py" '''
116 '''id="a_c--ad05457a43f8"'''
117 )
118 else:
119 response.mustcontain(
120 '''data-f-path="vcs/backends/hg.py" '''
121 '''id="a_c--9c390eb52cd6"'''
122 )
117 123
118 124 assert Notification.query().count() == 1
119 125 assert ChangesetComment.query().count() == 1
@@ -271,7 +277,6 b' def assert_comment_links(response, comme'
271 277 inline_comments) % inline_comments
272 278 if inline_comments:
273 279 response.mustcontain(
274 '<a href="#inline-comments" '
275 'id="inline-comments-counter">%s</a>' % inline_comments_text)
280 'id="inline-comments-counter">%s</' % inline_comments_text)
276 281 else:
277 282 response.mustcontain(inline_comments_text)
@@ -20,6 +20,7 b''
20 20
21 21 import mock
22 22 import pytest
23 import lxml.html
23 24
24 25 from rhodecode.lib.vcs.backends.base import EmptyCommit
25 26 from rhodecode.lib.vcs.exceptions import RepositoryRequirementError
@@ -609,9 +610,12 b' class ComparePage(AssertResponse):'
609 610 """
610 611
611 612 def contains_file_links_and_anchors(self, files):
613 doc = lxml.html.fromstring(self.response.body)
612 614 for filename, file_id in files:
613 self.contains_one_link(filename, '#' + file_id)
614 615 self.contains_one_anchor(file_id)
616 diffblock = doc.cssselect('[data-f-path="%s"]' % filename)
617 assert len(diffblock) == 1
618 assert len(diffblock[0].cssselect('a[href="#%s"]' % file_id)) == 1
615 619
616 620 def contains_change_summary(self, files_changed, inserted, deleted):
617 621 template = (
@@ -81,7 +81,7 b' def _commit_change('
81 81 return commit
82 82
83 83
84
84
85 85 @pytest.mark.usefixtures("app")
86 86 class TestFilesController:
87 87
@@ -270,9 +270,9 b' class TestFilesController:'
270 270 annotate=True))
271 271
272 272 expected_revisions = {
273 'hg': 'r356:25213a5fbb04',
274 'git': 'r345:c994f0de03b2',
275 'svn': 'r208:209',
273 'hg': 'r356',
274 'git': 'r345',
275 'svn': 'r208',
276 276 }
277 277 response.mustcontain(expected_revisions[backend.alias])
278 278
@@ -56,7 +56,7 b' class TestHomeController(TestController)'
56 56
57 57 rhodecode_version_hash = c.rhodecode_version_hash
58 58 response.mustcontain('style.css?ver={0}'.format(rhodecode_version_hash))
59 response.mustcontain('scripts.js?ver={0}'.format(rhodecode_version_hash))
59 response.mustcontain('rhodecode-components.js?ver={0}'.format(rhodecode_version_hash))
60 60
61 61 def test_index_contains_backend_specific_details(self, backend):
62 62 self.log_user()
@@ -25,7 +25,8 b' import pytest'
25 25
26 26 from rhodecode.config.routing import ADMIN_PREFIX
27 27 from rhodecode.tests import (
28 assert_session_flash, url, HG_REPO, TEST_USER_ADMIN_LOGIN)
28 TestController, assert_session_flash, clear_all_caches, url,
29 HG_REPO, TEST_USER_ADMIN_LOGIN, TEST_USER_ADMIN_PASS)
29 30 from rhodecode.tests.fixture import Fixture
30 31 from rhodecode.tests.utils import AssertResponse, get_session_from_response
31 32 from rhodecode.lib.auth import check_password, generate_auth_token
@@ -39,6 +40,7 b' fixture = Fixture()'
39 40
40 41 # Hardcode URLs because we don't have a request object to use
41 42 # pyramids URL generation methods.
43 index_url = '/'
42 44 login_url = ADMIN_PREFIX + '/login'
43 45 logut_url = ADMIN_PREFIX + '/logout'
44 46 register_url = ADMIN_PREFIX + '/register'
@@ -517,3 +519,70 b' class TestLoginController:'
517 519 repo_name=HG_REPO, revision='tip',
518 520 api_key=new_auth_token.api_key),
519 521 status=302)
522
523
524 class TestPasswordReset(TestController):
525
526 @pytest.mark.parametrize(
527 'pwd_reset_setting, show_link, show_reset', [
528 ('hg.password_reset.enabled', True, True),
529 ('hg.password_reset.hidden', False, True),
530 ('hg.password_reset.disabled', False, False),
531 ])
532 def test_password_reset_settings(
533 self, pwd_reset_setting, show_link, show_reset):
534 clear_all_caches()
535 self.log_user(TEST_USER_ADMIN_LOGIN, TEST_USER_ADMIN_PASS)
536 params = {
537 'csrf_token': self.csrf_token,
538 'anonymous': 'True',
539 'default_register': 'hg.register.auto_activate',
540 'default_register_message': '',
541 'default_password_reset': pwd_reset_setting,
542 'default_extern_activate': 'hg.extern_activate.auto',
543 }
544 resp = self.app.post(url('admin_permissions_application'), params=params)
545 self.logout_user()
546
547 login_page = self.app.get(login_url)
548 asr_login = AssertResponse(login_page)
549 index_page = self.app.get(index_url)
550 asr_index = AssertResponse(index_page)
551
552 if show_link:
553 asr_login.one_element_exists('a.pwd_reset')
554 asr_index.one_element_exists('a.pwd_reset')
555 else:
556 asr_login.no_element_exists('a.pwd_reset')
557 asr_index.no_element_exists('a.pwd_reset')
558
559 pwdreset_page = self.app.get(pwd_reset_url)
560
561 asr_reset = AssertResponse(pwdreset_page)
562 if show_reset:
563 assert 'Send password reset email' in pwdreset_page
564 asr_reset.one_element_exists('#email')
565 asr_reset.one_element_exists('#send')
566 else:
567 assert 'Password reset is disabled.' in pwdreset_page
568 asr_reset.no_element_exists('#email')
569 asr_reset.no_element_exists('#send')
570
571 def test_password_form_disabled(self):
572 self.log_user(TEST_USER_ADMIN_LOGIN, TEST_USER_ADMIN_PASS)
573 params = {
574 'csrf_token': self.csrf_token,
575 'anonymous': 'True',
576 'default_register': 'hg.register.auto_activate',
577 'default_register_message': '',
578 'default_password_reset': 'hg.password_reset.disabled',
579 'default_extern_activate': 'hg.extern_activate.auto',
580 }
581 self.app.post(url('admin_permissions_application'), params=params)
582 self.logout_user()
583
584 pwdreset_page = self.app.post(
585 pwd_reset_url,
586 {'email': 'lisa@rhodecode.com',}
587 )
588 assert 'Password reset is disabled.' in pwdreset_page
@@ -30,6 +30,7 b' from rhodecode.model.db import ('
30 30 from rhodecode.model.meta import Session
31 31 from rhodecode.model.pull_request import PullRequestModel
32 32 from rhodecode.model.user import UserModel
33 from rhodecode.model.repo import RepoModel
33 34 from rhodecode.tests import assert_session_flash, url, TEST_USER_ADMIN_LOGIN
34 35 from rhodecode.tests.utils import AssertResponse
35 36
@@ -187,6 +188,8 b' class TestPullrequestsController:'
187 188 category='error')
188 189
189 190 def test_update_invalid_source_reference(self, pr_util, csrf_token):
191 from rhodecode.lib.vcs.backends.base import UpdateFailureReason
192
190 193 pull_request = pr_util.create_pull_request()
191 194 pull_request.source_ref = 'branch:invalid-branch:invalid-commit-id'
192 195 Session().add(pull_request)
@@ -201,9 +204,31 b' class TestPullrequestsController:'
201 204 params={'update_commits': 'true', '_method': 'put',
202 205 'csrf_token': csrf_token})
203 206
204 assert_session_flash(
205 response, u'Update failed due to missing commits.',
206 category='error')
207 expected_msg = PullRequestModel.UPDATE_STATUS_MESSAGES[
208 UpdateFailureReason.MISSING_SOURCE_REF]
209 assert_session_flash(response, expected_msg, category='error')
210
211 def test_missing_target_reference(self, pr_util, csrf_token):
212 from rhodecode.lib.vcs.backends.base import MergeFailureReason
213 pull_request = pr_util.create_pull_request(
214 approved=True, mergeable=True)
215 pull_request.target_ref = 'branch:invalid-branch:invalid-commit-id'
216 Session().add(pull_request)
217 Session().commit()
218
219 pull_request_id = pull_request.pull_request_id
220 pull_request_url = url(
221 controller='pullrequests', action='show',
222 repo_name=pull_request.target_repo.repo_name,
223 pull_request_id=str(pull_request_id))
224
225 response = self.app.get(pull_request_url)
226
227 assertr = AssertResponse(response)
228 expected_msg = PullRequestModel.MERGE_STATUS_MESSAGES[
229 MergeFailureReason.MISSING_TARGET_REF]
230 assertr.element_contains(
231 'span[data-role="merge-message"]', str(expected_msg))
207 232
208 233 def test_comment_and_close_pull_request(self, pr_util, csrf_token):
209 234 pull_request = pr_util.create_pull_request(approved=True)
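The updated tests above look the expected flash message up in `PullRequestModel.UPDATE_STATUS_MESSAGES`, keyed by `UpdateFailureReason`, instead of hard-coding the string. A sketch of that enum-keyed message-table pattern (class members and messages here are illustrative, except the `MISSING_SOURCE_REF` text, which matches the literal the old test asserted; the real table lives in `PullRequestModel`):

```python
class UpdateFailureReason(object):
    # Illustrative members; the real enum lives in
    # rhodecode.lib.vcs.backends.base.
    NONE = 0
    UNKNOWN = 1
    MISSING_SOURCE_REF = 2

UPDATE_STATUS_MESSAGES = {
    UpdateFailureReason.NONE: 'Pull request update successful.',
    UpdateFailureReason.UNKNOWN: 'Pull request update failed.',
    UpdateFailureReason.MISSING_SOURCE_REF:
        'Update failed due to missing commits.',
}

# Tests assert via the key, so message wording changes in one place
# no longer break the suite.
expected_msg = UPDATE_STATUS_MESSAGES[UpdateFailureReason.MISSING_SOURCE_REF]
```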
@@ -256,8 +281,8 b' class TestPullrequestsController:'
256 281 def test_comment_force_close_pull_request(self, pr_util, csrf_token):
257 282 pull_request = pr_util.create_pull_request()
258 283 pull_request_id = pull_request.pull_request_id
259 reviewers_ids = [1, 2]
260 PullRequestModel().update_reviewers(pull_request_id, reviewers_ids)
284 reviewers_data = [(1, ['reason']), (2, ['reason2'])]
285 PullRequestModel().update_reviewers(pull_request_id, reviewers_data)
261 286 author = pull_request.user_id
262 287 repo = pull_request.target_repo.repo_id
263 288 self.app.post(
@@ -288,28 +313,40 b' class TestPullrequestsController:'
288 313 commits = [
289 314 {'message': 'ancestor'},
290 315 {'message': 'change'},
316 {'message': 'change2'},
291 317 ]
292 318 commit_ids = backend.create_master_repo(commits)
293 319 target = backend.create_repo(heads=['ancestor'])
294 source = backend.create_repo(heads=['change'])
320 source = backend.create_repo(heads=['change2'])
295 321
296 322 response = self.app.post(
297 323 url(
298 324 controller='pullrequests',
299 325 action='create',
300 repo_name=source.repo_name),
301 params={
302 'source_repo': source.repo_name,
303 'source_ref': 'branch:default:' + commit_ids['change'],
304 'target_repo': target.repo_name,
305 'target_ref': 'branch:default:' + commit_ids['ancestor'],
306 'pullrequest_desc': 'Description',
307 'pullrequest_title': 'Title',
308 'review_members': '1',
309 'revisions': commit_ids['change'],
310 'user': '',
311 'csrf_token': csrf_token,
312 },
326 repo_name=source.repo_name
327 ),
328 [
329 ('source_repo', source.repo_name),
330 ('source_ref', 'branch:default:' + commit_ids['change2']),
331 ('target_repo', target.repo_name),
332 ('target_ref', 'branch:default:' + commit_ids['ancestor']),
333 ('pullrequest_desc', 'Description'),
334 ('pullrequest_title', 'Title'),
335 ('__start__', 'review_members:sequence'),
336 ('__start__', 'reviewer:mapping'),
337 ('user_id', '1'),
338 ('__start__', 'reasons:sequence'),
339 ('reason', 'Some reason'),
340 ('__end__', 'reasons:sequence'),
341 ('__end__', 'reviewer:mapping'),
342 ('__end__', 'review_members:sequence'),
343 ('__start__', 'revisions:sequence'),
344 ('revisions', commit_ids['change']),
345 ('revisions', commit_ids['change2']),
346 ('__end__', 'revisions:sequence'),
347 ('user', ''),
348 ('csrf_token', csrf_token),
349 ],
313 350 status=302)
314 351
315 352 location = response.headers['Location']
@@ -317,8 +354,8 b' class TestPullrequestsController:'
317 354 pull_request = PullRequest.get(pull_request_id)
318 355
319 356 # check that we have now both revisions
320 assert pull_request.revisions == [commit_ids['change']]
321 assert pull_request.source_ref == 'branch:default:' + commit_ids['change']
357 assert pull_request.revisions == [commit_ids['change2'], commit_ids['change']]
358 assert pull_request.source_ref == 'branch:default:' + commit_ids['change2']
322 359 expected_target_ref = 'branch:default:' + commit_ids['ancestor']
323 360 assert pull_request.target_ref == expected_target_ref
324 361
@@ -344,19 +381,29 b' class TestPullrequestsController:'
344 381 url(
345 382 controller='pullrequests',
346 383 action='create',
347 repo_name=source.repo_name),
348 params={
349 'source_repo': source.repo_name,
350 'source_ref': 'branch:default:' + commit_ids['change'],
351 'target_repo': target.repo_name,
352 'target_ref': 'branch:default:' + commit_ids['ancestor-child'],
353 'pullrequest_desc': 'Description',
354 'pullrequest_title': 'Title',
355 'review_members': '2',
356 'revisions': commit_ids['change'],
357 'user': '',
358 'csrf_token': csrf_token,
359 },
384 repo_name=source.repo_name
385 ),
386 [
387 ('source_repo', source.repo_name),
388 ('source_ref', 'branch:default:' + commit_ids['change']),
389 ('target_repo', target.repo_name),
390 ('target_ref', 'branch:default:' + commit_ids['ancestor-child']),
391 ('pullrequest_desc', 'Description'),
392 ('pullrequest_title', 'Title'),
393 ('__start__', 'review_members:sequence'),
394 ('__start__', 'reviewer:mapping'),
395 ('user_id', '2'),
396 ('__start__', 'reasons:sequence'),
397 ('reason', 'Some reason'),
398 ('__end__', 'reasons:sequence'),
399 ('__end__', 'reviewer:mapping'),
400 ('__end__', 'review_members:sequence'),
401 ('__start__', 'revisions:sequence'),
402 ('revisions', commit_ids['change']),
403 ('__end__', 'revisions:sequence'),
404 ('user', ''),
405 ('csrf_token', csrf_token),
406 ],
360 407 status=302)
361 408
362 409 location = response.headers['Location']
@@ -373,7 +420,8 b' class TestPullrequestsController:'
373 420 assert len(notifications.all()) == 1
374 421
375 422 # Change reviewers and check that a notification was made
376 PullRequestModel().update_reviewers(pull_request.pull_request_id, [1])
423 PullRequestModel().update_reviewers(
424 pull_request.pull_request_id, [(1, [])])
377 425 assert len(notifications.all()) == 2
378 426
379 427 def test_create_pull_request_stores_ancestor_commit_id(self, backend,
@@ -397,19 +445,29 b' class TestPullrequestsController:'
397 445 url(
398 446 controller='pullrequests',
399 447 action='create',
400 repo_name=source.repo_name),
401 params={
402 'source_repo': source.repo_name,
403 'source_ref': 'branch:default:' + commit_ids['change'],
404 'target_repo': target.repo_name,
405 'target_ref': 'branch:default:' + commit_ids['ancestor-child'],
406 'pullrequest_desc': 'Description',
407 'pullrequest_title': 'Title',
408 'review_members': '1',
409 'revisions': commit_ids['change'],
410 'user': '',
411 'csrf_token': csrf_token,
412 },
448 repo_name=source.repo_name
449 ),
450 [
451 ('source_repo', source.repo_name),
452 ('source_ref', 'branch:default:' + commit_ids['change']),
453 ('target_repo', target.repo_name),
454 ('target_ref', 'branch:default:' + commit_ids['ancestor-child']),
455 ('pullrequest_desc', 'Description'),
456 ('pullrequest_title', 'Title'),
457 ('__start__', 'review_members:sequence'),
458 ('__start__', 'reviewer:mapping'),
459 ('user_id', '1'),
460 ('__start__', 'reasons:sequence'),
461 ('reason', 'Some reason'),
462 ('__end__', 'reasons:sequence'),
463 ('__end__', 'reviewer:mapping'),
464 ('__end__', 'review_members:sequence'),
465 ('__start__', 'revisions:sequence'),
466 ('revisions', commit_ids['change']),
467 ('__end__', 'revisions:sequence'),
468 ('user', ''),
469 ('csrf_token', csrf_token),
470 ],
413 471 status=302)
414 472
415 473 location = response.headers['Location']
@@ -879,6 +937,97 b' class TestPullrequestsController:'
879 937 response.mustcontain(
880 938 "&lt;script&gt;alert(&#39;Hi!&#39;)&lt;/script&gt;")
881 939
940 @pytest.mark.parametrize('mergeable', [True, False])
941 def test_shadow_repository_link(
942 self, mergeable, pr_util, http_host_stub):
943 """
944 Check that the pull request summary page displays a link to the shadow
945 repository if the pull request is mergeable. If it is not mergeable,
946 the link should not be displayed.
947 """
948 pull_request = pr_util.create_pull_request(
949 mergeable=mergeable, enable_notifications=False)
950 target_repo = pull_request.target_repo.scm_instance()
951 pr_id = pull_request.pull_request_id
952 shadow_url = '{host}/{repo}/pull-request/{pr_id}/repository'.format(
953 host=http_host_stub, repo=target_repo.name, pr_id=pr_id)
954
955 response = self.app.get(url(
956 controller='pullrequests', action='show',
957 repo_name=target_repo.name,
958 pull_request_id=str(pr_id)))
959
960 assertr = AssertResponse(response)
961 if mergeable:
962 assertr.element_value_contains(
963 'div.pr-mergeinfo input', shadow_url)
964 assertr.element_value_contains(
965 'div.pr-mergeinfo input', 'pr-merge')
966 else:
967 assertr.no_element_exists('div.pr-mergeinfo')
968
969
970 @pytest.mark.usefixtures('app')
971 @pytest.mark.backends("git", "hg")
972 class TestPullrequestsControllerDelete(object):
973 def test_pull_request_delete_button_permissions_admin(
974 self, autologin_user, user_admin, pr_util):
975 pull_request = pr_util.create_pull_request(
976 author=user_admin.username, enable_notifications=False)
977
978 response = self.app.get(url(
979 controller='pullrequests', action='show',
980 repo_name=pull_request.target_repo.scm_instance().name,
981 pull_request_id=str(pull_request.pull_request_id)))
982
983 response.mustcontain('id="delete_pullrequest"')
984 response.mustcontain('Confirm to delete this pull request')
985
986 def test_pull_request_delete_button_permissions_owner(
987 self, autologin_regular_user, user_regular, pr_util):
988 pull_request = pr_util.create_pull_request(
989 author=user_regular.username, enable_notifications=False)
990
991 response = self.app.get(url(
992 controller='pullrequests', action='show',
993 repo_name=pull_request.target_repo.scm_instance().name,
994 pull_request_id=str(pull_request.pull_request_id)))
995
996 response.mustcontain('id="delete_pullrequest"')
997 response.mustcontain('Confirm to delete this pull request')
998
999 def test_pull_request_delete_button_permissions_forbidden(
1000 self, autologin_regular_user, user_regular, user_admin, pr_util):
1001 pull_request = pr_util.create_pull_request(
1002 author=user_admin.username, enable_notifications=False)
1003
1004 response = self.app.get(url(
1005 controller='pullrequests', action='show',
1006 repo_name=pull_request.target_repo.scm_instance().name,
1007 pull_request_id=str(pull_request.pull_request_id)))
1008 response.mustcontain(no=['id="delete_pullrequest"'])
1009 response.mustcontain(no=['Confirm to delete this pull request'])
1010
1011 def test_pull_request_delete_button_permissions_can_update_cannot_delete(
1012 self, autologin_regular_user, user_regular, user_admin, pr_util,
1013 user_util):
1014
1015 pull_request = pr_util.create_pull_request(
1016 author=user_admin.username, enable_notifications=False)
1017
1018 user_util.grant_user_permission_to_repo(
1019 pull_request.target_repo, user_regular,
1020 'repository.write')
1021
1022 response = self.app.get(url(
1023 controller='pullrequests', action='show',
1024 repo_name=pull_request.target_repo.scm_instance().name,
1025 pull_request_id=str(pull_request.pull_request_id)))
1026
1027 response.mustcontain('id="open_edit_pullrequest"')
1028 response.mustcontain('id="delete_pullrequest"')
1029 response.mustcontain(no=['Confirm to delete this pull request'])
1030
882 1031
883 1032 def assert_pull_request_status(pull_request, expected_status):
884 1033 status = ChangesetStatusModel().calculated_review_status(
@@ -19,41 +19,12 b''
19 19 # and proprietary license terms, please see https://rhodecode.com/licenses/
20 20
21 21 import pytest
22 import requests
23 from mock import Mock, patch
22 from mock import patch
24 23
25 24 from rhodecode import events
26 25 from rhodecode.model.db import Session, Integration
27 26 from rhodecode.integrations.types.slack import SlackIntegrationType
28 27
29 @pytest.fixture
30 def repo_push_event(backend, user_regular):
31 commits = [
32 {'message': 'ancestor commit fixes #15'},
33 {'message': 'quick fixes'},
34 {'message': 'change that fixes #41, #2'},
35 {'message': 'this is because 5b23c3532 broke stuff'},
36 {'message': 'last commit'},
37 ]
38 commit_ids = backend.create_master_repo(commits).values()
39 repo = backend.create_repo()
40 scm_extras = {
41 'ip': '127.0.0.1',
42 'username': user_regular.username,
43 'action': '',
44 'repository': repo.repo_name,
45 'scm': repo.scm_instance().alias,
46 'config': '',
47 'server_url': 'http://example.com',
48 'make_lock': None,
49 'locked_by': [None],
50 'commit_ids': commit_ids,
51 }
52
53 return events.RepoPushEvent(repo_name=repo.repo_name,
54 pushed_commit_ids=commit_ids,
55 extras=scm_extras)
56
57 28
58 29 @pytest.fixture
59 30 def slack_settings():
@@ -157,7 +157,6 b' class TestSimpleSvnApp(object):'
157 157 expected_headers = [
158 158 ('MS-Author-Via', 'DAV'),
159 159 ('SVN-Supported-Posts', 'create-txn-with-props'),
160 ('X-RhodeCode-Backend', 'svn'),
161 160 ]
162 161 response_headers = self.app._get_response_headers(headers)
163 162 assert sorted(response_headers) == sorted(expected_headers)
@@ -192,7 +191,6 b' class TestSimpleSvnApp(object):'
192 191 expected_response_headers = [
193 192 ('SVN-Supported-Posts', 'create-txn-with-props'),
194 193 ('MS-Author-Via', 'DAV'),
195 ('X-RhodeCode-Backend', 'svn'),
196 194 ]
197 195 request_mock.assert_called_once_with(
198 196 self.environment['REQUEST_METHOD'], expected_url,
@@ -28,7 +28,7 b' from rhodecode.lib.caching_query import '
28 28 from rhodecode.lib.hooks_daemon import DummyHooksCallbackDaemon
29 29 from rhodecode.lib.middleware import simplevcs
30 30 from rhodecode.lib.middleware.https_fixup import HttpsFixup
31 from rhodecode.lib.middleware.utils import scm_app
31 from rhodecode.lib.middleware.utils import scm_app_http
32 32 from rhodecode.model.db import User, _hash_key
33 33 from rhodecode.model.meta import Session
34 34 from rhodecode.tests import (
@@ -44,13 +44,15 b' class StubVCSController(simplevcs.Simple'
44 44
45 45 def __init__(self, *args, **kwargs):
46 46 super(StubVCSController, self).__init__(*args, **kwargs)
47 self.repo_name = HG_REPO
47 self._action = 'pull'
48 self._name = HG_REPO
49 self.set_repo_names(None)
48 50
49 51 def _get_repository_name(self, environ):
50 return HG_REPO
52 return self._name
51 53
52 54 def _get_action(self, environ):
53 return "pull"
55 return self._action
54 56
55 57 def _create_wsgi_app(self, repo_path, repo_name, config):
56 58 def fake_app(environ, start_response):
@@ -151,7 +153,7 b' def test_provides_traceback_for_appenlig'
151 153
152 154 def test_provides_utils_scm_app_as_scm_app_by_default(pylonsapp):
153 155 controller = StubVCSController(pylonsapp, pylonsapp.config, None)
154 assert controller.scm_app is scm_app
156 assert controller.scm_app is scm_app_http
155 157
156 158
157 159 def test_allows_to_override_scm_app_via_config(pylonsapp):
@@ -171,6 +173,158 b' def test_should_check_locking(query_stri'
171 173 assert result == expected
172 174
173 175
176 class TestShadowRepoRegularExpression(object):
177 pr_segment = 'pull-request'
178 shadow_segment = 'repository'
179
180 @pytest.mark.parametrize('url, expected', [
181 # repo with/without groups
182 ('My-Repo/{pr_segment}/1/{shadow_segment}', True),
183 ('Group/My-Repo/{pr_segment}/2/{shadow_segment}', True),
184 ('Group/Sub-Group/My-Repo/{pr_segment}/3/{shadow_segment}', True),
185 ('Group/Sub-Group1/Sub-Group2/My-Repo/{pr_segment}/3/{shadow_segment}', True),
186
187 # pull request ID
188 ('MyRepo/{pr_segment}/1/{shadow_segment}', True),
189 ('MyRepo/{pr_segment}/1234567890/{shadow_segment}', True),
190 ('MyRepo/{pr_segment}/-1/{shadow_segment}', False),
191 ('MyRepo/{pr_segment}/invalid/{shadow_segment}', False),
192
193 # unicode
194 (u'Sp€çîál-Repö/{pr_segment}/1/{shadow_segment}', True),
195 (u'Sp€çîál-Gröüp/Sp€çîál-Repö/{pr_segment}/1/{shadow_segment}', True),
196
197 # trailing/leading slash
198 ('/My-Repo/{pr_segment}/1/{shadow_segment}', False),
199 ('My-Repo/{pr_segment}/1/{shadow_segment}/', False),
200 ('/My-Repo/{pr_segment}/1/{shadow_segment}/', False),
201
202 # misc
203 ('My-Repo/{pr_segment}/1/{shadow_segment}/extra', False),
204 ('My-Repo/{pr_segment}/1/{shadow_segment}extra', False),
205 ])
206 def test_shadow_repo_regular_expression(self, url, expected):
207 from rhodecode.lib.middleware.simplevcs import SimpleVCS
208 url = url.format(
209 pr_segment=self.pr_segment,
210 shadow_segment=self.shadow_segment)
211 match_obj = SimpleVCS.shadow_repo_re.match(url)
212 assert (match_obj is not None) == expected
213
214
215 @pytest.mark.backends('git', 'hg')
216 class TestShadowRepoExposure(object):
217
218 def test_pull_on_shadow_repo_propagates_to_wsgi_app(self, pylonsapp):
219 """
220 Check that a pull action to a shadow repo is propagated to the
221 underlying wsgi app.
222 """
223 controller = StubVCSController(pylonsapp, pylonsapp.config, None)
224 controller._check_ssl = mock.Mock()
225 controller.is_shadow_repo = True
226 controller._action = 'pull'
227 controller.stub_response_body = 'dummy body value'
228 environ_stub = {
229 'HTTP_HOST': 'test.example.com',
230 'REQUEST_METHOD': 'GET',
231 'wsgi.url_scheme': 'http',
232 }
233
234 response = controller(environ_stub, mock.Mock())
235 response_body = ''.join(response)
236
237 # Assert that we got the response from the wsgi app.
238 assert response_body == controller.stub_response_body
239
240 def test_push_on_shadow_repo_raises(self, pylonsapp):
241 """
242 Check that a push action to a shadow repo is aborted.
243 """
244 controller = StubVCSController(pylonsapp, pylonsapp.config, None)
245 controller._check_ssl = mock.Mock()
246 controller.is_shadow_repo = True
247 controller._action = 'push'
248 controller.stub_response_body = 'dummy body value'
249 environ_stub = {
250 'HTTP_HOST': 'test.example.com',
251 'REQUEST_METHOD': 'GET',
252 'wsgi.url_scheme': 'http',
253 }
254
255 response = controller(environ_stub, mock.Mock())
256 response_body = ''.join(response)
257
258 assert response_body != controller.stub_response_body
259 # Assert that a 406 error is returned.
260 assert '406 Not Acceptable' in response_body
261
262 def test_set_repo_names_no_shadow(self, pylonsapp):
263 """
264 Check that the set_repo_names method sets all names to the one returned
265 by the _get_repository_name method on a request to a non-shadow repo.
266 """
267 environ_stub = {}
268 controller = StubVCSController(pylonsapp, pylonsapp.config, None)
269 controller._name = 'RepoGroup/MyRepo'
270 controller.set_repo_names(environ_stub)
271 assert not controller.is_shadow_repo
272 assert (controller.url_repo_name ==
273 controller.acl_repo_name ==
274 controller.vcs_repo_name ==
275 controller._get_repository_name(environ_stub))
276
277 def test_set_repo_names_with_shadow(self, pylonsapp, pr_util):
278 """
279 Check that the set_repo_names method sets correct names on a request
280 to a shadow repo.
281 """
282 from rhodecode.model.pull_request import PullRequestModel
283
284 pull_request = pr_util.create_pull_request()
285 shadow_url = '{target}/{pr_segment}/{pr_id}/{shadow_segment}'.format(
286 target=pull_request.target_repo.repo_name,
287 pr_id=pull_request.pull_request_id,
288 pr_segment=TestShadowRepoRegularExpression.pr_segment,
289 shadow_segment=TestShadowRepoRegularExpression.shadow_segment)
290 controller = StubVCSController(pylonsapp, pylonsapp.config, None)
291 controller._name = shadow_url
292 controller.set_repo_names({})
293
294 # Get file system path to shadow repo for assertions.
295 workspace_id = PullRequestModel()._workspace_id(pull_request)
296 target_vcs = pull_request.target_repo.scm_instance()
297 vcs_repo_name = target_vcs._get_shadow_repository_path(
298 workspace_id)
299
300 assert controller.vcs_repo_name == vcs_repo_name
301 assert controller.url_repo_name == shadow_url
302 assert controller.acl_repo_name == pull_request.target_repo.repo_name
303 assert controller.is_shadow_repo
304
305 def test_set_repo_names_with_shadow_but_missing_pr(
306 self, pylonsapp, pr_util):
307 """
308 Checks that the set_repo_names method enforces matching target repos
309 and pull request IDs.
310 """
311 pull_request = pr_util.create_pull_request()
312 shadow_url = '{target}/{pr_segment}/{pr_id}/{shadow_segment}'.format(
313 target=pull_request.target_repo.repo_name,
314 pr_id=999999999,
315 pr_segment=TestShadowRepoRegularExpression.pr_segment,
316 shadow_segment=TestShadowRepoRegularExpression.shadow_segment)
317 controller = StubVCSController(pylonsapp, pylonsapp.config, None)
318 controller._name = shadow_url
319 controller.set_repo_names({})
320
321 assert not controller.is_shadow_repo
322 assert (controller.url_repo_name ==
323 controller.acl_repo_name ==
324 controller.vcs_repo_name)
325
326
327 @pytest.mark.usefixtures('db')
174 328 @mock.patch.multiple(
175 329 'Pyro4.config', SERVERTYPE='multiplex', POLLTIMEOUT=0.01)
176 330 class TestGenerateVcsResponse:
@@ -245,7 +399,6 b' class TestGenerateVcsResponse:'
245 399 result = controller._generate_vcs_response(
246 400 environ={}, start_response=self.start_response,
247 401 repo_path='fake_repo_path',
248 repo_name='fake_repo_name',
249 402 extras={}, action='push')
250 403 self.controller = controller
251 404 return result
@@ -20,8 +20,7 b''
20 20
21 21 import mock
22 22 import pytest
23 import rhodecode
24 import rhodecode.lib.vcs.client as client
23
25 24
26 25 @pytest.mark.usefixtures('autologin_user', 'app')
27 26 def test_vcs_available_returns_summary_page(app, backend):
@@ -32,16 +31,31 b' def test_vcs_available_returns_summary_p'
32 31
33 32
34 33 @pytest.mark.usefixtures('autologin_user', 'app')
35 def test_vcs_unavailable_returns_vcs_error_page(app, backend):
34 def test_vcs_unavailable_returns_vcs_error_page(app, backend, app_settings):
35 from rhodecode.lib.vcs.exceptions import VCSCommunicationError
36 from rhodecode.lib.middleware.error_handling import (
37 PylonsErrorHandlingMiddleware)
38
39 # Depending on the VCSServer protocol in use, we have to patch a different
40 # RemoteRepo class to raise an exception. For this test it doesn't matter
41 # whether http or pyro4 is used; it only requires the exception to be raised.
42 vcs_protocol = app_settings['vcs.server.protocol']
43 if vcs_protocol == 'http':
44 from rhodecode.lib.vcs.client_http import RemoteRepo
45 elif vcs_protocol == 'pyro4':
46 from rhodecode.lib.vcs.client import RemoteRepo
47 else:
48 pytest.fail('Unknown VCS server protocol: "{}"'.format(vcs_protocol))
49
36 50 url = '/{repo_name}'.format(repo_name=backend.repo.repo_name)
37 51
38 try:
39 rhodecode.disable_error_handler = False
40 with mock.patch.object(client, '_get_proxy_method') as p:
41 p.side_effect = client.exceptions.PyroVCSCommunicationError()
52 # Patch remote repo to raise an exception instead of making an RPC.
53 with mock.patch.object(RemoteRepo, '__getattr__') as remote_mock:
54 remote_mock.side_effect = VCSCommunicationError()
55 # Patch pylons error handling middleware to not re-raise exceptions.
56 with mock.patch.object(PylonsErrorHandlingMiddleware, 'reraise') as r:
57 r.return_value = False
42 58 response = app.get(url, expect_errors=True)
43 finally:
44 rhodecode.disable_error_handler = True
45 59
46 60 assert response.status_code == 502
47 61 assert 'Could not connect to VCS Server' in response.body
@@ -33,7 +33,7 b' def vcs_http_app(vcsserver_http_echo_app'
33 33 """
34 34 git_url = vcsserver_http_echo_app.http_url + 'stream/git/'
35 35 vcs_http_proxy = scm_app_http.VcsHttpProxy(
36 git_url, 'stub_path', 'stub_name', None, 'stub_backend')
36 git_url, 'stub_path', 'stub_name', None)
37 37 app = webtest.TestApp(vcs_http_proxy)
38 38 return app
39 39
@@ -24,7 +24,7 b' Checking the chunked data transfer via H'
24 24
25 25 import os
26 26 import time
27 import subprocess
27 import subprocess32
28 28
29 29 import pytest
30 30 import requests
@@ -53,7 +53,7 b' def echo_app_chunking(request, available'
53 53 'rhodecode.tests.lib.middleware.utils.test_scm_app_http_chunking'
54 54 ':create_echo_app')
55 55 command = command.format(port=port)
56 proc = subprocess.Popen(command.split(' '), bufsize=0)
56 proc = subprocess32.Popen(command.split(' '), bufsize=0)
57 57 echo_app_url = 'http://localhost:' + str(port)
58 58
59 59 @request.addfinalizer
@@ -78,7 +78,7 b' def scm_app(request, available_port_fact'
78 78 command = command.format(port=port)
79 79 env = os.environ.copy()
80 80 env["RC_ECHO_URL"] = echo_app_chunking
81 proc = subprocess.Popen(command.split(' '), bufsize=0, env=env)
81 proc = subprocess32.Popen(command.split(' '), bufsize=0, env=env)
82 82 scm_app_url = 'http://localhost:' + str(port)
83 83 wait_for_url(scm_app_url)
84 84
@@ -133,4 +133,4 b' def create_scm_app():'
133 133 """
134 134 echo_app_url = os.environ["RC_ECHO_URL"]
135 135 return scm_app_http.VcsHttpProxy(
136 echo_app_url, 'stub_path', 'stub_name', None, 'stub_backend')
136 echo_app_url, 'stub_path', 'stub_name', None)
@@ -93,6 +93,8 b' def test_remote_app_caller():'
93 93 response = test_app.get('/path')
94 94
95 95 assert response.status == '200 OK'
96 assert response.headers.items() == [
97 ('Content-Type', 'text/plain'), ('Content-Length', '7')]
96 assert sorted(response.headers.items()) == sorted([
97 ('Content-Type', 'text/plain'),
98 ('Content-Length', '7'),
99 ])
98 100 assert response.body == 'content'
@@ -76,7 +76,7 b' def test_diffprocessor_as_html_with_comm'
76 76 expected_html = textwrap.dedent('''
77 77 <table class="code-difftable">
78 78 <tr class="line context">
79 <td class="add-comment-line"><span class="add-comment-content"></span></td><td class="comment-toggle tooltip" title="Toggle Comments"><i class="icon-comment"></i></td>
79 <td class="add-comment-line"><span class="add-comment-content"></span></td><td class="comment-toggle tooltip" title="Toggle Comment Thread"><i class="icon-comment"></i></td>
80 80 <td class="lineno old">...</td>
81 81 <td class="lineno new">...</td>
82 82 <td class="code no-comment">
@@ -85,7 +85,7 b' def test_diffprocessor_as_html_with_comm'
85 85 </td>
86 86 </tr>
87 87 <tr class="line unmod">
88 <td class="add-comment-line"><span class="add-comment-content"><a href="#"><span class="icon-comment-add"></span></a></span></td><td class="comment-toggle tooltip" title="Toggle Comments"><i class="icon-comment"></i></td>
88 <td class="add-comment-line"><span class="add-comment-content"><a href="#"><span class="icon-comment-add"></span></a></span></td><td class="comment-toggle tooltip" title="Toggle Comment Thread"><i class="icon-comment"></i></td>
89 89 <td id="setuppy_o2" class="lineno old"><a href="#setuppy_o2" class="tooltip"
90 90 title="Click to select line">2</a></td>
91 91 <td id="setuppy_n2" class="lineno new"><a href="#setuppy_n2" class="tooltip"
@@ -96,7 +96,7 b' def test_diffprocessor_as_html_with_comm'
96 96 </td>
97 97 </tr>
98 98 <tr class="line unmod">
99 <td class="add-comment-line"><span class="add-comment-content"><a href="#"><span class="icon-comment-add"></span></a></span></td><td class="comment-toggle tooltip" title="Toggle Comments"><i class="icon-comment"></i></td>
99 <td class="add-comment-line"><span class="add-comment-content"><a href="#"><span class="icon-comment-add"></span></a></span></td><td class="comment-toggle tooltip" title="Toggle Comment Thread"><i class="icon-comment"></i></td>
100 100 <td id="setuppy_o3" class="lineno old"><a href="#setuppy_o3" class="tooltip"
101 101 title="Click to select line">3</a></td>
102 102 <td id="setuppy_n3" class="lineno new"><a href="#setuppy_n3" class="tooltip"
@@ -107,7 +107,7 b' def test_diffprocessor_as_html_with_comm'
107 107 </td>
108 108 </tr>
109 109 <tr class="line unmod">
110 <td class="add-comment-line"><span class="add-comment-content"><a href="#"><span class="icon-comment-add"></span></a></span></td><td class="comment-toggle tooltip" title="Toggle Comments"><i class="icon-comment"></i></td>
110 <td class="add-comment-line"><span class="add-comment-content"><a href="#"><span class="icon-comment-add"></span></a></span></td><td class="comment-toggle tooltip" title="Toggle Comment Thread"><i class="icon-comment"></i></td>
111 111 <td id="setuppy_o4" class="lineno old"><a href="#setuppy_o4" class="tooltip"
112 112 title="Click to select line">4</a></td>
113 113 <td id="setuppy_n4" class="lineno new"><a href="#setuppy_n4" class="tooltip"
@@ -118,7 +118,7 b' def test_diffprocessor_as_html_with_comm'
118 118 </td>
119 119 </tr>
120 120 <tr class="line del">
121 <td class="add-comment-line"><span class="add-comment-content"><a href="#"><span class="icon-comment-add"></span></a></span></td><td class="comment-toggle tooltip" title="Toggle Comments"><i class="icon-comment"></i></td>
121 <td class="add-comment-line"><span class="add-comment-content"><a href="#"><span class="icon-comment-add"></span></a></span></td><td class="comment-toggle tooltip" title="Toggle Comment Thread"><i class="icon-comment"></i></td>
122 122 <td id="setuppy_o5" class="lineno old"><a href="#setuppy_o5" class="tooltip"
123 123 title="Click to select line">5</a></td>
124 124 <td class="lineno new"><a href="#setuppy_n" class="tooltip"
@@ -129,7 +129,7 b' def test_diffprocessor_as_html_with_comm'
129 129 </td>
130 130 </tr>
131 131 <tr class="line add">
132 <td class="add-comment-line"><span class="add-comment-content"><a href="#"><span class="icon-comment-add"></span></a></span></td><td class="comment-toggle tooltip" title="Toggle Comments"><i class="icon-comment"></i></td>
132 <td class="add-comment-line"><span class="add-comment-content"><a href="#"><span class="icon-comment-add"></span></a></span></td><td class="comment-toggle tooltip" title="Toggle Comment Thread"><i class="icon-comment"></i></td>
133 133 <td class="lineno old"><a href="#setuppy_o" class="tooltip"
134 134 title="Click to select line"></a></td>
135 135 <td id="setuppy_n5" class="lineno new"><a href="#setuppy_n5" class="tooltip"
@@ -140,7 +140,7 b' def test_diffprocessor_as_html_with_comm'
140 140 </td>
141 141 </tr>
142 142 <tr class="line unmod">
143 <td class="add-comment-line"><span class="add-comment-content"><a href="#"><span class="icon-comment-add"></span></a></span></td><td class="comment-toggle tooltip" title="Toggle Comments"><i class="icon-comment"></i></td>
143 <td class="add-comment-line"><span class="add-comment-content"><a href="#"><span class="icon-comment-add"></span></a></span></td><td class="comment-toggle tooltip" title="Toggle Comment Thread"><i class="icon-comment"></i></td>
144 144 <td id="setuppy_o6" class="lineno old"><a href="#setuppy_o6" class="tooltip"
145 145 title="Click to select line">6</a></td>
146 146 <td id="setuppy_n6" class="lineno new"><a href="#setuppy_n6" class="tooltip"
@@ -151,7 +151,7 b' def test_diffprocessor_as_html_with_comm'
151 151 </td>
152 152 </tr>
153 153 <tr class="line unmod">
154 <td class="add-comment-line"><span class="add-comment-content"><a href="#"><span class="icon-comment-add"></span></a></span></td><td class="comment-toggle tooltip" title="Toggle Comments"><i class="icon-comment"></i></td>
154 <td class="add-comment-line"><span class="add-comment-content"><a href="#"><span class="icon-comment-add"></span></a></span></td><td class="comment-toggle tooltip" title="Toggle Comment Thread"><i class="icon-comment"></i></td>
155 155 <td id="setuppy_o7" class="lineno old"><a href="#setuppy_o7" class="tooltip"
156 156 title="Click to select line">7</a></td>
157 157 <td id="setuppy_n7" class="lineno new"><a href="#setuppy_n7" class="tooltip"
@@ -162,7 +162,7 b' def test_diffprocessor_as_html_with_comm'
162 162 </td>
163 163 </tr>
164 164 <tr class="line unmod">
165 <td class="add-comment-line"><span class="add-comment-content"><a href="#"><span class="icon-comment-add"></span></a></span></td><td class="comment-toggle tooltip" title="Toggle Comments"><i class="icon-comment"></i></td>
165 <td class="add-comment-line"><span class="add-comment-content"><a href="#"><span class="icon-comment-add"></span></a></span></td><td class="comment-toggle tooltip" title="Toggle Comment Thread"><i class="icon-comment"></i></td>
166 166 <td id="setuppy_o8" class="lineno old"><a href="#setuppy_o8" class="tooltip"
167 167 title="Click to select line">8</a></td>
168 168 <td id="setuppy_n8" class="lineno new"><a href="#setuppy_n8" class="tooltip"
@@ -19,6 +19,7 b''
19 19 # and proprietary license terms, please see https://rhodecode.com/licenses/
20 20
21 21 import mock
22 import pytest
22 23
23 24 from rhodecode.lib import hooks_base, utils2
24 25
@@ -40,6 +41,7 b' def test_post_push_truncates_commits(use'
40 41 'make_lock': None,
41 42 'locked_by': [None],
42 43 'commit_ids': ['abcde12345' * 4] * 30000,
44 'is_shadow_repo': False,
43 45 }
44 46 extras = utils2.AttributeDict(extras)
45 47
@@ -51,3 +53,90 b' def test_post_push_truncates_commits(use'
51 53 hooks_base.action_logger.assert_called_with(
52 54 extras.username, expected_action, extras.repository, extras.ip,
53 55 commit=True)
56
57
58 def assert_called_with_mock(callable_, expected_mock_name):
59 mock_obj = callable_.call_args[0][0]
60 mock_name = mock_obj._mock_new_parent._mock_new_name
61 assert mock_name == expected_mock_name
62
63
64 @pytest.fixture
65 def hook_extras(user_regular, repo_stub):
66 extras = utils2.AttributeDict({
67 'ip': '127.0.0.1',
68 'username': user_regular.username,
69 'action': 'push',
70 'repository': repo_stub.repo_name,
71 'scm': '',
72 'config': '',
73 'server_url': 'http://example.com',
74 'make_lock': None,
75 'locked_by': [None],
76 'commit_ids': [],
77 'is_shadow_repo': False,
78 })
79 return extras
80
81
82 @pytest.mark.parametrize('func, extension, event', [
83 (hooks_base.pre_push, 'pre_push_extension', 'RepoPrePushEvent'),
84 (hooks_base.post_push, 'post_push_extension', 'RepoPushEvent'),
85 (hooks_base.pre_pull, 'pre_pull_extension', 'RepoPrePullEvent'),
86 (hooks_base.post_pull, 'post_pull_extension', 'RepoPullEvent'),
87 ])
88 def test_hooks_propagate(func, extension, event, hook_extras):
89 """
90 Tests that our hook code propagates to rhodecode extensions and triggers
91 the appropriate event.
92 """
93 extension_mock = mock.Mock()
94 events_mock = mock.Mock()
95 patches = {
96 'Repository': mock.Mock(),
97 'events': events_mock,
98 extension: extension_mock,
99 }
100
101 # Clear shadow repo flag.
102 hook_extras.is_shadow_repo = False
103
104 # Execute hook function.
105 with mock.patch.multiple(hooks_base, **patches):
106 func(hook_extras)
107
108 # Assert that the extension was called and the event was fired.
109 assert extension_mock.called
110 assert_called_with_mock(events_mock.trigger, event)
111
112
113 @pytest.mark.parametrize('func, extension, event', [
114 (hooks_base.pre_push, 'pre_push_extension', 'RepoPrePushEvent'),
115 (hooks_base.post_push, 'post_push_extension', 'RepoPushEvent'),
116 (hooks_base.pre_pull, 'pre_pull_extension', 'RepoPrePullEvent'),
117 (hooks_base.post_pull, 'post_pull_extension', 'RepoPullEvent'),
118 ])
119 def test_hooks_propagates_not_on_shadow(func, extension, event, hook_extras):
120 """
121 If hooks are called by a request to a shadow repo, we only want to run
122 our internal hooks code, not external ones like rhodecode extensions,
123 and we do not want to trigger an event.
124 """
125 extension_mock = mock.Mock()
126 events_mock = mock.Mock()
127 patches = {
128 'Repository': mock.Mock(),
129 'events': events_mock,
130 extension: extension_mock,
131 }
132
133 # Set shadow repo flag.
134 hook_extras.is_shadow_repo = True
135
136 # Execute hook function.
137 with mock.patch.multiple(hooks_base, **patches):
138 func(hook_extras)
139
140 # Assert that extensions are *not* called and event was *not* fired.
141 assert not extension_mock.called
142 assert not events_mock.trigger.called
@@ -318,18 +318,18 b' class TestHttpHooksCallbackDaemon(object'
318 318
319 319
320 320 class TestPrepareHooksDaemon(object):
321 def test_returns_dummy_hooks_callback_daemon_when_using_direct_calls(self):
321 @pytest.mark.parametrize('protocol', ('http', 'pyro4'))
322 def test_returns_dummy_hooks_callback_daemon_when_using_direct_calls(
323 self, protocol):
322 324 expected_extras = {'extra1': 'value1'}
323 325 callback, extras = hooks_daemon.prepare_callback_daemon(
324 expected_extras.copy(), use_direct_calls=True)
326 expected_extras.copy(), protocol=protocol, use_direct_calls=True)
325 327 assert isinstance(callback, hooks_daemon.DummyHooksCallbackDaemon)
326 328 expected_extras['hooks_module'] = 'rhodecode.lib.hooks_daemon'
327 329 assert extras == expected_extras
328 330
329 331 @pytest.mark.parametrize('protocol, expected_class', (
330 332 ('pyro4', hooks_daemon.Pyro4HooksCallbackDaemon),
331 ('Pyro4', hooks_daemon.Pyro4HooksCallbackDaemon),
332 ('HTTP', hooks_daemon.HttpHooksCallbackDaemon),
333 333 ('http', hooks_daemon.HttpHooksCallbackDaemon)
334 334 ))
335 335 def test_returns_real_hooks_callback_daemon_when_protocol_is_specified(
@@ -339,13 +339,30 b' class TestPrepareHooksDaemon(object):'
339 339 'hooks_protocol': protocol.lower()
340 340 }
341 341 callback, extras = hooks_daemon.prepare_callback_daemon(
342 expected_extras.copy(), protocol=protocol)
342 expected_extras.copy(), protocol=protocol, use_direct_calls=False)
343 343 assert isinstance(callback, expected_class)
344 344 hooks_uri = extras.pop('hooks_uri')
345 345 assert extras == expected_extras
346 346 if protocol.lower() == 'pyro4':
347 347 assert hooks_uri.startswith('PYRO')
348 348
349 @pytest.mark.parametrize('protocol', (
350 'invalid',
351 'Pyro4',
352 'Http',
353 'HTTP',
354 ))
355 def test_raises_on_invalid_protocol(self, protocol):
356 expected_extras = {
357 'extra1': 'value1',
358 'hooks_protocol': protocol.lower()
359 }
360 with pytest.raises(Exception):
361 callback, extras = hooks_daemon.prepare_callback_daemon(
362 expected_extras.copy(),
363 protocol=protocol,
364 use_direct_calls=False)
365
349 366
350 367 class MockRequest(object):
351 368 def __init__(self, request):
@@ -21,7 +21,22 b''
21 21 import pickle
22 22 import pytest
23 23
24 from rhodecode.lib.jsonalchemy import MutationDict, MutationList
24 from sqlalchemy import Column, String, create_engine
25 from sqlalchemy.orm import sessionmaker
26 from sqlalchemy.ext.declarative import declarative_base
27
28 from rhodecode.lib.jsonalchemy import (
29 MutationDict, MutationList, MutationObj, JsonType)
30
31
32 @pytest.fixture
33 def engine():
34 return create_engine('sqlite://')
35
36
37 @pytest.fixture
38 def session(engine):
39 return sessionmaker(bind=engine)()
25 40
26 41
27 42 def test_mutation_dict_is_picklable():
@@ -30,12 +45,14 b' def test_mutation_dict_is_picklable():'
30 45 loaded = pickle.loads(dumped)
31 46 assert loaded == mutation_dict
32 47
48
33 49 def test_mutation_list_is_picklable():
34 50 mutation_list = MutationList(['a', 'b', 'c'])
35 51 dumped = pickle.dumps(mutation_list)
36 52 loaded = pickle.loads(dumped)
37 53 assert loaded == mutation_list
38 54
55
39 56 def test_mutation_dict_with_lists_is_picklable():
40 57 mutation_dict = MutationDict({
41 58 'key': MutationList(['values', MutationDict({'key': 'value'})])
@@ -43,3 +60,48 b' def test_mutation_dict_with_lists_is_pic'
43 60 dumped = pickle.dumps(mutation_dict)
44 61 loaded = pickle.loads(dumped)
45 62 assert loaded == mutation_dict
63
64
65 def test_mutation_types_with_nullable(engine, session):
66 # TODO: dan: ideally want to make this parametrized python => sql tests eg:
67 # (MutationObj, 5) => '5'
68 # (MutationObj, {'a': 5}) => '{"a": 5}'
69 # (MutationObj, None) => 'null' <- think about if None is 'null' or NULL
70
71 Base = declarative_base()
72
73 class DummyModel(Base):
74 __tablename__ = 'some_table'
75 name = Column(String, primary_key=True)
76 json_list = Column(MutationList.as_mutable(JsonType('list')))
77 json_dict = Column(MutationDict.as_mutable(JsonType('dict')))
78 json_obj = Column(MutationObj.as_mutable(JsonType()))
79
80 Base.metadata.create_all(engine)
81
82 obj_nulls = DummyModel(name='nulls')
83 obj_stuff = DummyModel(
 84             name='stuff', json_list=[1, 2, 3], json_dict={'a': 5}, json_obj=9)
85
86 session.add(obj_nulls)
87 session.add(obj_stuff)
88 session.commit()
89 session.expire_all()
90
91 assert engine.execute(
92 "select * from some_table where name = 'nulls';").first() == (
93 (u'nulls', None, None, None)
94 )
95 ret_nulls = session.query(DummyModel).get('nulls')
96 assert ret_nulls.json_list == []
97 assert ret_nulls.json_dict == {}
98 assert ret_nulls.json_obj is None
99
100 assert engine.execute(
101 "select * from some_table where name = 'stuff';").first() == (
102 (u'stuff', u'[1, 2, 3]', u'{"a": 5}', u'9')
103 )
104 ret_stuff = session.query(DummyModel).get('stuff')
105 assert ret_stuff.json_list == [1, 2, 3]
106 assert ret_stuff.json_dict == {'a': 5}
107 assert ret_stuff.json_obj == 9
@@ -32,7 +32,7 b' import itertools'
32 32 import os
33 33 import pprint
34 34 import shutil
35 import subprocess
35 import subprocess32
36 36 import sys
37 37 import time
38 38
@@ -77,12 +77,12 b' def execute(*popenargs, **kwargs):'
77 77 input = kwargs.pop('stdin', None)
78 78 stdin = None
79 79 if input:
80 stdin = subprocess.PIPE
80 stdin = subprocess32.PIPE
81 81 #if 'stderr' not in kwargs:
82 # kwargs['stderr'] = subprocess.PIPE
82 # kwargs['stderr'] = subprocess32.PIPE
83 83 if 'stdout' in kwargs:
84 84 raise ValueError('stdout argument not allowed, it will be overridden.')
85 process = subprocess.Popen(stdin=stdin, stdout=subprocess.PIPE,
85 process = subprocess32.Popen(stdin=stdin, stdout=subprocess32.PIPE,
86 86 *popenargs, **kwargs)
87 87 output, error = process.communicate(input=input)
88 88 retcode = process.poll()
@@ -91,7 +91,7 b' def execute(*popenargs, **kwargs):'
91 91 if cmd is None:
92 92 cmd = popenargs[0]
93 93 print cmd, output, error
94 raise subprocess.CalledProcessError(retcode, cmd, output=output)
94 raise subprocess32.CalledProcessError(retcode, cmd, output=output)
95 95 return output
96 96
97 97
@@ -31,7 +31,7 b' To stop the script by press Ctrl-C'
31 31 import datetime
32 32 import os
33 33 import psutil
34 import subprocess
34 import subprocess32
35 35 import sys
36 36 import time
37 37 import traceback
@@ -66,7 +66,7 b' def dump_system():'
66 66
67 67
68 68 def count_dulwich_fds(proc):
69 p = subprocess.Popen(["lsof", "-p", proc.pid], stdout=subprocess.PIPE)
69 p = subprocess32.Popen(["lsof", "-p", proc.pid], stdout=subprocess32.PIPE)
70 70 out, err = p.communicate()
71 71
72 72 count = 0
@@ -117,7 +117,7 b' print "VCS - Ok"'
117 117
118 118 print "\nStarting RhodeCode..."
119 119 rc = psutil.Popen("RC_VCSSERVER_TEST_DISABLE=1 paster serve test.ini",
120 shell=True, stdin=subprocess.PIPE)
120 shell=True, stdin=subprocess32.PIPE)
121 121 time.sleep(1)
122 122 if not rc.is_running():
123 123 print "RC - Failed to start"
@@ -40,7 +40,7 b' import functools'
40 40 import logging
41 41 import os
42 42 import shutil
43 import subprocess
43 import subprocess32
44 44 import tempfile
45 45 import time
46 46 from itertools import chain
@@ -145,8 +145,8 b' class Repository(object):'
145 145
146 146 def _run(self, *args):
147 147 command = [self.BASE_COMMAND] + list(args)
148 process = subprocess.Popen(
149 command, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
148 process = subprocess32.Popen(
149 command, stdout=subprocess32.PIPE, stderr=subprocess32.PIPE)
150 150 return process.communicate()
151 151
152 152 def _create_file(self, name, size):
@@ -21,8 +21,6 b''
21 21 import colander
22 22 import pytest
23 23
24 from rhodecode.model import validation_schema
25
26 24 from rhodecode.integrations import integration_type_registry
27 25 from rhodecode.integrations.types.base import IntegrationTypeBase
28 26 from rhodecode.model.validation_schema.schemas.integration_schema import (
@@ -33,14 +31,12 b' from rhodecode.model.validation_schema.s'
33 31 @pytest.mark.usefixtures('app', 'autologin_user')
34 32 class TestIntegrationSchema(object):
35 33
36 def test_deserialize_integration_schema_perms(self, backend_random,
37 test_repo_group,
38 StubIntegrationType):
34 def test_deserialize_integration_schema_perms(
35 self, backend_random, test_repo_group, StubIntegrationType):
39 36
40 37 repo = backend_random.repo
41 38 repo_group = test_repo_group
42 39
43
44 40 empty_perms_dict = {
45 41 'global': [],
46 42 'repositories': {},
@@ -21,26 +21,82 b''
21 21 import colander
22 22 import pytest
23 23
24 from rhodecode.model.validation_schema.types import GroupNameType
24 from rhodecode.model.validation_schema.types import (
25 GroupNameType, RepoNameType, StringBooleanType)
25 26
26 27
27 28 class TestGroupNameType(object):
28 29 @pytest.mark.parametrize('given, expected', [
29 30 ('//group1/group2//', 'group1/group2'),
30 31 ('//group1///group2//', 'group1/group2'),
31 ('group1/group2///group3', 'group1/group2/group3')
32 ('group1/group2///group3', 'group1/group2/group3'),
32 33 ])
33 def test_replace_extra_slashes_cleans_up_extra_slashes(
34 self, given, expected):
35 type_ = GroupNameType()
36 result = type_._replace_extra_slashes(given)
34 def test_normalize_path(self, given, expected):
35 result = GroupNameType()._normalize(given)
37 36 assert result == expected
38 37
39 def test_deserialize_cleans_up_extra_slashes(self):
38 @pytest.mark.parametrize('given, expected', [
39 ('//group1/group2//', 'group1/group2'),
40 ('//group1///group2//', 'group1/group2'),
41 ('group1/group2///group3', 'group1/group2/group3'),
42 ('v1.2', 'v1.2'),
43 ('/v1.2', 'v1.2'),
44 ('.dirs', '.dirs'),
45 ('..dirs', '.dirs'),
46 ('./..dirs', '.dirs'),
47 ('dir/;name;/;[];/sub', 'dir/name/sub'),
48 (',/,/,d,,,', 'd'),
49 ('/;/#/,d,,,', 'd'),
50 ('long../../..name', 'long./.name'),
51 ('long../..name', 'long./.name'),
52 ('../', ''),
53 ('\'../"../', ''),
54 ('c,/,/..//./,c,,,/.d/../.........c', 'c/c/.d/.c'),
55 ('c,/,/..//./,c,,,', 'c/c'),
56 ('d../..d', 'd./.d'),
57 ('d../../d', 'd./d'),
58
59 ('d\;\./\,\./d', 'd./d'),
60 ('d\.\./\.\./d', 'd./d'),
61 ('d\.\./\..\../d', 'd./d'),
62 ])
63 def test_deserialize_clean_up_name(self, given, expected):
40 64 class TestSchema(colander.Schema):
41 field = colander.SchemaNode(GroupNameType())
65 field_group = colander.SchemaNode(GroupNameType())
66 field_repo = colander.SchemaNode(RepoNameType())
42 67
43 68 schema = TestSchema()
44 cleaned_data = schema.deserialize(
45 {'field': '//group1/group2///group3//'})
46 assert cleaned_data['field'] == 'group1/group2/group3'
69 cleaned_data = schema.deserialize({
70 'field_group': given,
71 'field_repo': given
72 })
73 assert cleaned_data['field_group'] == expected
74 assert cleaned_data['field_repo'] == expected
75
76
77 class TestStringBooleanType(object):
78
79 def _get_schema(self):
80 class Schema(colander.MappingSchema):
81 bools = colander.SchemaNode(StringBooleanType())
82 return Schema()
83
84 @pytest.mark.parametrize('given, expected', [
85 ('1', True),
86 ('yEs', True),
87 ('true', True),
88
89 ('0', False),
90 ('NO', False),
91 ('FALSE', False),
92
93 ])
94 def test_convert_type(self, given, expected):
95 schema = self._get_schema()
 96         result = schema.deserialize({'bools': given})
97 assert result['bools'] == expected
98
99 def test_try_convert_bad_type(self):
100 schema = self._get_schema()
101 with pytest.raises(colander.Invalid):
 102             schema.deserialize({'bools': 'boom'})
@@ -348,6 +348,7 b' class TestPermissions(object):'
348 348 'hg.create.none',
349 349 'hg.fork.none',
350 350 'hg.register.manual_activate',
351 'hg.password_reset.enabled',
351 352 'hg.extern_activate.auto',
352 353 'repository.read',
353 354 'group.read',
@@ -379,6 +380,7 b' class TestPermissions(object):'
379 380 'hg.create.repository',
380 381 'hg.fork.repository',
381 382 'hg.register.manual_activate',
383 'hg.password_reset.enabled',
382 384 'hg.extern_activate.auto',
383 385 'repository.read',
384 386 'group.read',
@@ -406,6 +408,7 b' class TestPermissions(object):'
406 408 'hg.create.none',
407 409 'hg.fork.none',
408 410 'hg.register.manual_activate',
411 'hg.password_reset.enabled',
409 412 'hg.extern_activate.auto',
410 413 'repository.read',
411 414 'group.read',
@@ -25,7 +25,8 b' import textwrap'
25 25 import rhodecode
26 26 from rhodecode.lib.utils2 import safe_unicode
27 27 from rhodecode.lib.vcs.backends import get_backend
28 from rhodecode.lib.vcs.backends.base import MergeResponse, MergeFailureReason
28 from rhodecode.lib.vcs.backends.base import (
29 MergeResponse, MergeFailureReason, Reference)
29 30 from rhodecode.lib.vcs.exceptions import RepositoryError
30 31 from rhodecode.lib.vcs.nodes import FileNode
31 32 from rhodecode.model.comment import ChangesetCommentsModel
@@ -115,7 +116,7 b' class TestPullRequestModel:'
115 116
116 117 def test_get_awaiting_my_review(self, pull_request):
117 118 PullRequestModel().update_reviewers(
118 pull_request, [pull_request.author])
119 pull_request, [(pull_request.author, ['author'])])
119 120 prs = PullRequestModel().get_awaiting_my_review(
120 121 pull_request.target_repo, user_id=pull_request.author.user_id)
121 122 assert isinstance(prs, list)
@@ -123,7 +124,7 b' class TestPullRequestModel:'
123 124
124 125 def test_count_awaiting_my_review(self, pull_request):
125 126 PullRequestModel().update_reviewers(
126 pull_request, [pull_request.author])
127 pull_request, [(pull_request.author, ['author'])])
127 128 pr_count = PullRequestModel().count_awaiting_my_review(
128 129 pull_request.target_repo, user_id=pull_request.author.user_id)
129 130 assert pr_count == 1
@@ -269,10 +270,10 b' class TestPullRequestModel:'
269 270
270 271 def test_merge(self, pull_request, merge_extras):
271 272 user = UserModel().get_by_username(TEST_USER_ADMIN_LOGIN)
273 merge_ref = Reference(
274 'type', 'name', '6126b7bfcc82ad2d3deaee22af926b082ce54cc6')
272 275 self.merge_mock.return_value = MergeResponse(
273 True, True,
274 '6126b7bfcc82ad2d3deaee22af926b082ce54cc6',
275 MergeFailureReason.NONE)
276 True, True, merge_ref, MergeFailureReason.NONE)
276 277
277 278 merge_extras['repository'] = pull_request.target_repo.repo_name
278 279 PullRequestModel().merge(
@@ -308,10 +309,10 b' class TestPullRequestModel:'
308 309
309 310 def test_merge_failed(self, pull_request, merge_extras):
310 311 user = UserModel().get_by_username(TEST_USER_ADMIN_LOGIN)
312 merge_ref = Reference(
313 'type', 'name', '6126b7bfcc82ad2d3deaee22af926b082ce54cc6')
311 314 self.merge_mock.return_value = MergeResponse(
312 False, False,
313 '6126b7bfcc82ad2d3deaee22af926b082ce54cc6',
314 MergeFailureReason.MERGE_FAILED)
315 False, False, merge_ref, MergeFailureReason.MERGE_FAILED)
315 316
316 317 merge_extras['repository'] = pull_request.target_repo.repo_name
317 318 PullRequestModel().merge(
@@ -359,6 +360,14 b' class TestPullRequestModel:'
359 360 pull_request, context=6)
360 361 assert 'file_1' in diff.raw
361 362
363 def test_generate_title_returns_unicode(self):
364 title = PullRequestModel().generate_pullrequest_title(
365 source='source-dummy',
366 source_ref='source-ref-dummy',
367 target='target-dummy',
368 )
369 assert type(title) == unicode
370
362 371
363 372 class TestIntegrationMerge(object):
364 373 @pytest.mark.parametrize('extra_config', (
@@ -449,6 +458,7 b' def merge_extras(user_regular):'
449 458 'locked_by': [None, None, None],
450 459 'server_url': 'http://test.example.com:5000',
451 460 'hooks': ['push', 'pull'],
461 'is_shadow_repo': False,
452 462 }
453 463 return extras
454 464
@@ -113,40 +113,20 b' def test_add_and_remove_user_from_group('
113 113 assert user_group.members == []
114 114
115 115
116 @pytest.mark.parametrize(
117 'data, expected', [
118 ("1", [1]), (["1", "2"], [1, 2])
119 ]
120 )
116 @pytest.mark.parametrize('data, expected', [
117 ([], []),
118 ([{"member_user_id": 1, "type": "new"}], [1]),
119 ([{"member_user_id": 1, "type": "new"},
120 {"member_user_id": 1, "type": "existing"}], [1]),
121 ([{"member_user_id": 1, "type": "new"},
122 {"member_user_id": 2, "type": "new"},
123 {"member_user_id": 3, "type": "remove"}], [1, 2])
124 ])
121 125 def test_clean_members_data(data, expected):
122 126 cleaned = UserGroupModel()._clean_members_data(data)
123 127 assert cleaned == expected
124 128
125 129
126 def test_update_members_from_user_ids(user_regular, user_util):
127 user_group = user_util.create_user_group()
128 assert user_group.members == []
129 assert user_group.user != user_regular
130 expected_active_state = not user_group.users_group_active
131
132 form_data = {
133 'users_group_members': str(user_regular.user_id),
134 'user': str(user_regular.username),
135 'users_group_name': 'changed_name',
136 'users_group_active': expected_active_state,
137 'user_group_description': 'changed_description'
138 }
139
140 UserGroupModel().update(user_group, form_data)
141 assert user_group.members[0].user_id == user_regular.user_id
142 assert user_group.user_id == user_regular.user_id
143 assert 'changed_name' in user_group.users_group_name
144 assert 'changed_description' in user_group.user_group_description
145 assert user_group.users_group_active == expected_active_state
146 # Ignore changes on the test
147 Session().rollback()
148
149
150 130 def _create_test_members():
151 131 members = []
152 132 for member_number in range(3):
@@ -28,7 +28,7 b' Base for test suite for making push/pull'
28 28 """
29 29
30 30 from os.path import join as jn
31 from subprocess import Popen, PIPE
31 from subprocess32 import Popen, PIPE
32 32 import logging
33 33 import os
34 34 import tempfile
@@ -29,7 +29,7 b' py.test config for test suite for making'
29 29
30 30 import ConfigParser
31 31 import os
32 import subprocess
32 import subprocess32
33 33 import tempfile
34 34 import textwrap
35 35 import pytest
@@ -164,7 +164,7 b' def rc_web_server('
164 164 print('Command: {}'.format(command))
165 165 print('Logfile: {}'.format(RC_LOG))
166 166
167 proc = subprocess.Popen(
167 proc = subprocess32.Popen(
168 168 command, bufsize=0, env=env, stdout=server_out, stderr=server_out)
169 169
170 170 wait_for_url(host_url, timeout=30)
@@ -26,7 +26,7 b' import re'
26 26 import pprint
27 27 import shutil
28 28 import socket
29 import subprocess
29 import subprocess32
30 30 import time
31 31 import uuid
32 32
@@ -38,11 +38,12 b' import requests'
38 38 from webtest.app import TestApp
39 39
40 40 import rhodecode
41 from rhodecode.lib.utils2 import AttributeDict
41 42 from rhodecode.model.changeset_status import ChangesetStatusModel
42 43 from rhodecode.model.comment import ChangesetCommentsModel
43 44 from rhodecode.model.db import (
44 45 PullRequest, Repository, RhodeCodeSetting, ChangesetStatus, RepoGroup,
45 UserGroup, RepoRhodeCodeUi, RepoRhodeCodeSetting, RhodeCodeUi, Integration)
46 UserGroup, RepoRhodeCodeUi, RepoRhodeCodeSetting, RhodeCodeUi)
46 47 from rhodecode.model.meta import Session
47 48 from rhodecode.model.pull_request import PullRequestModel
48 49 from rhodecode.model.repo import RepoModel
@@ -218,7 +219,7 b' def app(request, pylonsapp, http_environ'
218 219 return app
219 220
220 221
221 @pytest.fixture()
222 @pytest.fixture(scope='session')
222 223 def app_settings(pylonsapp, pylons_config):
223 224 """
224 225 Settings dictionary used to create the app.
@@ -234,6 +235,18 b' def app_settings(pylonsapp, pylons_confi'
234 235 return settings
235 236
236 237
238 @pytest.fixture(scope='session')
239 def db(app_settings):
240 """
241 Initializes the database connection.
242
243 It uses the same settings which are used to create the ``pylonsapp`` or
244 ``app`` fixtures.
245 """
246 from rhodecode.config.utils import initialize_database
247 initialize_database(app_settings)
248
249
237 250 LoginData = collections.namedtuple('LoginData', ('csrf_token', 'user'))
238 251
239 252
@@ -872,7 +885,7 b' class RepoServer(object):'
872 885 if vcsrepo.alias != 'svn':
873 886 raise TypeError("Backend %s not supported" % vcsrepo.alias)
874 887
875 proc = subprocess.Popen(
888 proc = subprocess32.Popen(
876 889 ['svnserve', '-d', '--foreground', '--listen-host', 'localhost',
877 890 '--root', vcsrepo.path])
878 891 self._cleanup_servers.append(proc)
@@ -1120,7 +1133,7 b' def user_util(request, pylonsapp):'
1120 1133 class UserUtility(object):
1121 1134
1122 1135 def __init__(self, test_name="test"):
1123 self._test_name = test_name
1136 self._test_name = self._sanitize_name(test_name)
1124 1137 self.fixture = Fixture()
1125 1138 self.repo_group_ids = []
1126 1139 self.user_ids = []
@@ -1133,6 +1146,11 b' class UserUtility(object):'
1133 1146 self.user_group_user_group_permission_ids = []
1134 1147 self.user_permissions = []
1135 1148
1149 def _sanitize_name(self, name):
1150 for char in ['[', ']']:
1151 name = name.replace(char, '_')
1152 return name
1153
1136 1154 def create_repo_group(
1137 1155 self, owner=TEST_USER_ADMIN_LOGIN, auto_cleanup=True):
1138 1156 group_name = "{prefix}_repogroup_{count}".format(
@@ -23,7 +23,7 b' import logging.config'
23 23 import os
24 24 import platform
25 25 import socket
26 import subprocess
26 import subprocess32
27 27 import time
28 28 from urllib2 import urlopen, URLError
29 29
@@ -48,6 +48,9 b' def _parse_json(value):'
48 48
49 49
50 50 def pytest_addoption(parser):
51 parser.addoption(
52 '--test-loglevel', dest='test_loglevel',
53 help="Set default Logging level for tests, warn (default), info, debug")
51 54 group = parser.getgroup('pylons')
52 55 group.addoption(
53 56 '--with-pylons', dest='pylons_config',
@@ -68,7 +71,7 b' def pytest_addoption(parser):'
68 71 '--without-vcsserver', dest='with_vcsserver', action='store_false',
69 72 help="Do not start the VCSServer in a background process.")
70 73 vcsgroup.addoption(
71 '--with-vcsserver', dest='vcsserver_config',
74 '--with-vcsserver', dest='vcsserver_config_pyro4',
72 75 help="Start the VCSServer with the specified config file.")
73 76 vcsgroup.addoption(
74 77 '--with-vcsserver-http', dest='vcsserver_config_http',
@@ -91,7 +94,7 b' def pytest_addoption(parser):'
91 94 "against an already running server and random ports cause "
92 95 "trouble."))
93 96 parser.addini(
94 'vcsserver_config',
97 'vcsserver_config_pyro4',
95 98 "Start the VCSServer with the specified config file.")
96 99 parser.addini(
97 100 'vcsserver_config_http',
@@ -134,7 +137,7 b' def vcsserver_factory(tmpdir_factory):'
134 137 Use this if you need a running vcsserver with a special configuration.
135 138 """
136 139
137 def factory(request, use_http=False, overrides=(), vcsserver_port=None):
140 def factory(request, use_http=True, overrides=(), vcsserver_port=None):
138 141
139 142 if vcsserver_port is None:
140 143 vcsserver_port = get_available_port()
@@ -151,7 +154,7 b' def vcsserver_factory(tmpdir_factory):'
151 154 overrides.append(platform_override)
152 155
153 156 option_name = (
154 'vcsserver_config_http' if use_http else 'vcsserver_config')
157 'vcsserver_config_http' if use_http else 'vcsserver_config_pyro4')
155 158 override_option_name = 'vcsserver_config_override'
156 159 config_file = get_config(
157 160 request.config, option_name=option_name,
@@ -183,10 +186,15 b' def _use_vcs_http_server(config):'
183 186 protocol = (
184 187 config.getoption(protocol_option) or
185 188 config.getini(protocol_option) or
186 'pyro4')
189 'http')
187 190 return protocol == 'http'
188 191
189 192
193 def _use_log_level(config):
194 level = config.getoption('test_loglevel') or 'warn'
195 return level.upper()
196
197
190 198 class VCSServer(object):
191 199 """
192 200 Represents a running VCSServer instance.
@@ -196,7 +204,7 b' class VCSServer(object):'
196 204
197 205 def start(self):
198 206 print("Starting the VCSServer: {}".format(self._args))
199 self.process = subprocess.Popen(self._args)
207 self.process = subprocess32.Popen(self._args)
200 208
201 209 def wait_until_ready(self, timeout=30):
202 210 raise NotImplementedError()
@@ -218,7 +226,8 b' class Pyro4VCSServer(VCSServer):'
218 226 self._args = args
219 227
220 228 def wait_until_ready(self, timeout=30):
221 remote_server = vcs.create_vcsserver_proxy(self.server_and_port)
229 remote_server = vcs.create_vcsserver_proxy(
230 self.server_and_port, 'pyro4')
222 231 start = time.time()
223 232 with remote_server:
224 233 while time.time() - start < timeout:
@@ -245,7 +254,7 b' class HttpVCSServer(VCSServer):'
245 254 config_data = configobj.ConfigObj(config_file)
246 255 self._config = config_data['server:main']
247 256
248 args = ['pserve', config_file, 'http_host=0.0.0.0']
257 args = ['pserve', config_file]
249 258 self._args = args
250 259
251 260 @property
@@ -254,7 +263,7 b' class HttpVCSServer(VCSServer):'
254 263 return template.format(**self._config)
255 264
256 265 def start(self):
257 self.process = subprocess.Popen(self._args)
266 self.process = subprocess32.Popen(self._args)
258 267
259 268 def wait_until_ready(self, timeout=30):
260 269 host = self._config['host']
@@ -280,12 +289,41 b' class HttpVCSServer(VCSServer):'
280 289 @pytest.fixture(scope='session')
281 290 def pylons_config(request, tmpdir_factory, rcserver_port, vcsserver_port):
282 291 option_name = 'pylons_config'
292 log_level = _use_log_level(request.config)
283 293
284 294 overrides = [
285 295 {'server:main': {'port': rcserver_port}},
286 {'app:main': {'vcs.server': 'localhost:%s' % vcsserver_port}}]
296 {'app:main': {
297 'vcs.server': 'localhost:%s' % vcsserver_port,
298 # johbo: We will always start the VCSServer on our own based on the
299 # fixtures of the test cases. For the test run it must always be
300 # off in the INI file.
301 'vcs.start_server': 'false',
302 }},
303
304 {'handler_console': {
305 'class ': 'StreamHandler',
306 'args ': '(sys.stderr,)',
307 'level': log_level,
308 }},
309
310 ]
287 311 if _use_vcs_http_server(request.config):
288 overrides.append({'app:main': {'vcs.server.protocol': 'http'}})
312 overrides.append({
313 'app:main': {
314 'vcs.server.protocol': 'http',
315 'vcs.scm_app_implementation': 'http',
316 'vcs.hooks.protocol': 'http',
317 }
318 })
319 else:
320 overrides.append({
321 'app:main': {
322 'vcs.server.protocol': 'pyro4',
323 'vcs.scm_app_implementation': 'pyro4',
324 'vcs.hooks.protocol': 'pyro4',
325 }
326 })
289 327
290 328 filename = get_config(
291 329 request.config, option_name=option_name,
@@ -345,6 +383,7 b' def available_port(available_port_factor'
345 383
346 384 @pytest.fixture(scope='session')
347 385 def pylonsapp(pylons_config, vcsserver, http_environ_session):
386 print "Using the RhodeCode configuration", pylons_config
348 387 logging.config.fileConfig(
349 388 pylons_config, disable_existing_loggers=False)
350 389 app = _setup_pylons_environment(pylons_config, http_environ_session)
@@ -30,10 +30,8 b' from os.path import join as jn'
30 30 from os.path import dirname as dn
31 31
32 32 from tempfile import _RandomNameSequence
33 from subprocess import Popen, PIPE
34
33 from subprocess32 import Popen, PIPE
35 34 from paste.deploy import appconfig
36 from pylons import config
37 35
38 36 from rhodecode.lib.utils import add_cache
39 37 from rhodecode.lib.utils2 import engine_from_config
@@ -42,7 +40,7 b' from rhodecode.model import init_model'
42 40 from rhodecode.model import meta
43 41 from rhodecode.model.db import User, Repository
44 42
45 from rhodecode.tests import TESTS_TMP_PATH, NEW_HG_REPO, HG_REPO
43 from rhodecode.tests import TESTS_TMP_PATH, HG_REPO
46 44 from rhodecode.config.environment import load_environment
47 45
48 46 rel_path = dn(dn(dn(dn(os.path.abspath(__file__)))))
@@ -22,7 +22,7 b' import threading'
22 22 import time
23 23 import logging
24 24 import os.path
25 import subprocess
25 import subprocess32
26 26 import urllib2
27 27 from urlparse import urlparse, parse_qsl
28 28 from urllib import unquote_plus
@@ -113,10 +113,10 b' def _load_svn_dump_into_repo(dump_name, '
113 113 integrated with the main repository once they stabilize more.
114 114 """
115 115 dump = rc_testdata.load_svn_dump(dump_name)
116 load_dump = subprocess.Popen(
116 load_dump = subprocess32.Popen(
117 117 ['svnadmin', 'load', repo_path],
118 stdin=subprocess.PIPE, stdout=subprocess.PIPE,
119 stderr=subprocess.PIPE)
118 stdin=subprocess32.PIPE, stdout=subprocess32.PIPE,
119 stderr=subprocess32.PIPE)
120 120 out, err = load_dump.communicate(dump)
121 121 if load_dump.returncode != 0:
122 122 log.error("Output of load_dump command: %s", out)
@@ -149,6 +149,10 b' class AssertResponse(object):'
149 149 element = self.get_element(css_selector)
150 150 assert expected_content in element.text_content()
151 151
152 def element_value_contains(self, css_selector, expected_content):
153 element = self.get_element(css_selector)
154 assert expected_content in element.value
155
152 156 def contains_one_link(self, link_text, href):
153 157 doc = fromstring(self.response.body)
154 158 sel = CSSSelector('a[href]')
@@ -70,7 +70,7 b' def stub_session_factory(stub_session):'
70 70
71 71 def test_repo_maker_uses_session_for_classmethods(stub_session_factory):
72 72 repo_maker = client_http.RepoMaker(
73 'server_and_port', 'endpoint', stub_session_factory)
73 'server_and_port', 'endpoint', 'test_dummy_scm', stub_session_factory)
74 74 repo_maker.example_call()
75 75 stub_session_factory().post.assert_called_with(
76 76 'http://server_and_port/endpoint', data=mock.ANY)
@@ -79,7 +79,7 b' def test_repo_maker_uses_session_for_cla'
79 79 def test_repo_maker_uses_session_for_instance_methods(
80 80 stub_session_factory, config):
81 81 repo_maker = client_http.RepoMaker(
82 'server_and_port', 'endpoint', stub_session_factory)
82 'server_and_port', 'endpoint', 'test_dummy_scm', stub_session_factory)
83 83 repo = repo_maker('stub_path', config)
84 84 repo.example_call()
85 85 stub_session_factory().post.assert_called_with(
@@ -465,6 +465,19 b' class TestCommits(BackendTestMixin):'
465 465 with pytest.raises(TypeError):
466 466 self.repo.get_commits(start_id=1, end_id=2)
467 467
468 def test_commit_equality(self):
469 commit1 = self.repo.get_commit(self.repo.commit_ids[0])
470 commit2 = self.repo.get_commit(self.repo.commit_ids[1])
471
472 assert commit1 == commit1
473 assert commit2 == commit2
474 assert commit1 != commit2
475 assert commit2 != commit1
476 assert commit1 != None
477 assert None != commit1
478 assert 1 != commit1
479 assert 'string' != commit1
480
468 481
469 482 @pytest.mark.parametrize("filename, expected", [
470 483 ("README.rst", False),
@@ -599,7 +599,7 b' TODO: To be written...'
599 599 'test user', 'test@rhodecode.com', 'merge message 1',
600 600 dry_run=False)
601 601 expected_merge_response = MergeResponse(
602 True, True, merge_response.merge_commit_id,
602 True, True, merge_response.merge_ref,
603 603 MergeFailureReason.NONE)
604 604 assert merge_response == expected_merge_response
605 605
@@ -611,14 +611,14 b' TODO: To be written...'
611 611 assert target_ref.commit_id in commit_ids
612 612
613 613 merge_commit = target_commits[-1]
614 assert merge_commit.raw_id == merge_response.merge_commit_id
614 assert merge_commit.raw_id == merge_response.merge_ref.commit_id
615 615 assert merge_commit.message.strip() == 'merge message 1'
616 616 assert merge_commit.author == 'test user <test@rhodecode.com>'
617 617
618 618 # Check the bookmark was updated in the target repo
619 619 assert (
620 620 target_repo.bookmarks[bookmark_name] ==
621 merge_response.merge_commit_id)
621 merge_response.merge_ref.commit_id)
622 622
623 623 def test_merge_source_is_bookmark(self, vcsbackend_hg):
624 624 target_repo = vcsbackend_hg.create_repo(number_of_commits=1)
@@ -643,7 +643,7 b' TODO: To be written...'
643 643 'test user', 'test@rhodecode.com', 'merge message 1',
644 644 dry_run=False)
645 645 expected_merge_response = MergeResponse(
646 True, True, merge_response.merge_commit_id,
646 True, True, merge_response.merge_ref,
647 647 MergeFailureReason.NONE)
648 648 assert merge_response == expected_merge_response
649 649
@@ -717,7 +717,7 b' TODO: To be written...'
717 717 dry_run=False, use_rebase=True)
718 718
719 719 expected_merge_response = MergeResponse(
720 True, True, merge_response.merge_commit_id,
720 True, True, merge_response.merge_ref,
721 721 MergeFailureReason.NONE)
722 722 assert merge_response == expected_merge_response
723 723
@@ -308,7 +308,7 b' class TestRepositoryMerge:'
308 308 'test user', 'test@rhodecode.com', 'merge message 1',
309 309 dry_run=False)
310 310 expected_merge_response = MergeResponse(
311 True, True, merge_response.merge_commit_id,
311 True, True, merge_response.merge_ref,
312 312 MergeFailureReason.NONE)
313 313 assert merge_response == expected_merge_response
314 314
@@ -320,45 +320,55 b' class TestRepositoryMerge:'
320 320 assert self.target_ref.commit_id in commit_ids
321 321
322 322 merge_commit = target_commits[-1]
323 assert merge_commit.raw_id == merge_response.merge_commit_id
323 assert merge_commit.raw_id == merge_response.merge_ref.commit_id
324 324 assert merge_commit.message.strip() == 'merge message 1'
325 325 assert merge_commit.author == 'test user <test@rhodecode.com>'
326 326
327 327 # We call it twice so to make sure we can handle updates
328 328 target_ref = Reference(
329 329 self.target_ref.type, self.target_ref.name,
330 merge_response.merge_commit_id)
330 merge_response.merge_ref.commit_id)
331 331
332 332 merge_response = target_repo.merge(
333 333 target_ref, self.source_repo, self.source_ref, self.workspace,
334 334 'test user', 'test@rhodecode.com', 'merge message 2',
335 335 dry_run=False)
336 336 expected_merge_response = MergeResponse(
337 True, True, merge_response.merge_commit_id,
337 True, True, merge_response.merge_ref,
338 338 MergeFailureReason.NONE)
339 339 assert merge_response == expected_merge_response
340 340
341 341 target_repo = backends.get_backend(
342 342 vcsbackend.alias)(self.target_repo.path)
343 merge_commit = target_repo.get_commit(merge_response.merge_commit_id)
343 merge_commit = target_repo.get_commit(
344 merge_response.merge_ref.commit_id)
344 345 assert merge_commit.message.strip() == 'merge message 1'
345 346 assert merge_commit.author == 'test user <test@rhodecode.com>'
346 347
347 348 def test_merge_success_dry_run(self, vcsbackend):
348 349 self.prepare_for_success(vcsbackend)
349 expected_merge_response = MergeResponse(
350 True, False, None, MergeFailureReason.NONE)
351 350
352 351 merge_response = self.target_repo.merge(
353 352 self.target_ref, self.source_repo, self.source_ref, self.workspace,
354 353 dry_run=True)
355 assert merge_response == expected_merge_response
356 354
357 355 # We call it twice to make sure we can handle updates
358 merge_response = self.target_repo.merge(
356 merge_response_update = self.target_repo.merge(
359 357 self.target_ref, self.source_repo, self.source_ref, self.workspace,
360 358 dry_run=True)
361 assert merge_response == expected_merge_response
359
360 # Multiple merges may differ in their commit id. Therefore we set the
361 # commit id to `None` before comparing the merge responses.
362 merge_response = merge_response._replace(
363 merge_ref=merge_response.merge_ref._replace(commit_id=None))
364 merge_response_update = merge_response_update._replace(
365 merge_ref=merge_response_update.merge_ref._replace(commit_id=None))
366
367 assert merge_response == merge_response_update
368 assert merge_response.possible is True
369 assert merge_response.executed is False
370 assert merge_response.merge_ref
371 assert merge_response.failure_reason is MergeFailureReason.NONE
362 372
363 373 @pytest.mark.parametrize('dry_run', [True, False])
364 374 def test_merge_conflict(self, vcsbackend, dry_run):
@@ -391,10 +401,10 b' class TestRepositoryMerge:'
391 401
392 402 assert merge_response == expected_merge_response
393 403
394 def test_merge_missing_commit(self, vcsbackend):
404 def test_merge_missing_source_reference(self, vcsbackend):
395 405 self.prepare_for_success(vcsbackend)
396 406 expected_merge_response = MergeResponse(
397 False, False, None, MergeFailureReason.MISSING_COMMIT)
407 False, False, None, MergeFailureReason.MISSING_SOURCE_REF)
398 408
399 409 source_ref = Reference(
400 410 self.source_ref.type, 'not_existing', self.source_ref.commit_id)
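The dry-run test above compares two merge responses after blanking the non-deterministic commit id via the namedtuple `_replace` method. A minimal stdlib-only sketch of that comparison pattern, using simplified stand-ins for the `MergeResponse` and `Reference` types (the real classes live in the VCS layer and carry more behavior):

```python
from collections import namedtuple

# Simplified stand-ins for the VCS layer types used in the tests above.
Reference = namedtuple('Reference', ['type', 'name', 'commit_id'])
MergeResponse = namedtuple(
    'MergeResponse', ['possible', 'executed', 'merge_ref', 'failure_reason'])

def normalize(response):
    # Blank out the non-deterministic commit id so two responses from
    # repeated dry runs compare equal on everything that matters.
    return response._replace(
        merge_ref=response.merge_ref._replace(commit_id=None))

first = MergeResponse(
    True, False, Reference('branch', 'default', 'abc123'), None)
second = MergeResponse(
    True, False, Reference('branch', 'default', 'def456'), None)

assert normalize(first) == normalize(second)
```

Because `_replace` returns a new tuple, the original responses stay untouched and can still be inspected for their real commit ids afterwards.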
@@ -20,7 +20,7 b''
20 20
21 21 import datetime
22 22 import os
23 import subprocess
23 import subprocess32
24 24
25 25 import pytest
26 26
@@ -86,8 +86,8 b' class TestGetScm:'
86 86 def test_get_two_scms_for_path(self, tmpdir):
87 87 multialias_repo_path = str(tmpdir)
88 88
89 subprocess.check_call(['hg', 'init', multialias_repo_path])
90 subprocess.check_call(['git', 'init', multialias_repo_path])
89 subprocess32.check_call(['hg', 'init', multialias_repo_path])
90 subprocess32.check_call(['git', 'init', multialias_repo_path])
91 91
92 92 with pytest.raises(VCSError):
93 93 get_scm(multialias_repo_path)
@@ -45,8 +45,8 b' def test_vcs_connections_have_a_timeout_'
45 45 proxy_objects = []
46 46 with pytest.raises(TimeoutError):
47 47 # TODO: johbo: Find a better way to set this number
48 while xrange(100):
49 server = create_vcsserver_proxy(server_and_port)
48 for number in xrange(100):
49 server = create_vcsserver_proxy(server_and_port, protocol='pyro4')
50 50 server.ping()
51 51 proxy_objects.append(server)
52 52
@@ -54,7 +54,7 b' def test_vcs_connections_have_a_timeout_'
54 54 def test_vcs_remote_calls_are_bound_by_timeout(pylonsapp, short_timeout):
55 55 server_and_port = pylonsapp.config['vcs.server']
56 56 with pytest.raises(TimeoutError):
57 server = create_vcsserver_proxy(server_and_port)
57 server = create_vcsserver_proxy(server_and_port, protocol='pyro4')
58 58 server.sleep(short_timeout + 1.0)
59 59
60 60
@@ -27,7 +27,7 b' import os'
27 27 import re
28 28 import sys
29 29
30 from subprocess import Popen
30 from subprocess32 import Popen
31 31
32 32
33 33 class VCSTestError(Exception):
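The hunks above swap the stdlib `subprocess` module for the `subprocess32` backport, which brings the Python 3.2 subprocess implementation (timeout support, safer fd handling) to Python 2. A common hedged import pattern, shown here as a sketch, falls back to the stdlib module when the backport is not installed:

```python
# Prefer the subprocess32 backport (timeout support on Python 2);
# fall back to the stdlib module if it is not installed.
try:
    import subprocess32 as subprocess
except ImportError:
    import subprocess

# Either module exposes the same check_call API used in the tests above.
subprocess.check_call(['true'])
```

This keeps call sites identical regardless of which module is resolved, which is why the diff can replace `subprocess.check_call` with `subprocess32.check_call` one-for-one.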
@@ -21,6 +21,7 b' from pyramid.i18n import TranslationStri'
21 21 # Create a translation string factory for the 'rhodecode' domain.
22 22 _ = TranslationStringFactory('rhodecode')
23 23
24
24 25 class LazyString(object):
25 26 def __init__(self, *args, **kw):
26 27 self.args = args
@@ -21,6 +21,7 b' keywords ='
21 21 lazy_ugettext
22 22 _ngettext
23 23 _gettext
24 gettext_translator
24 25
25 26 [init_catalog]
26 27 domain = rhodecode
@@ -51,6 +51,7 b' requirements = ['
51 51 'PasteDeploy',
52 52 'PasteScript',
53 53 'Pygments',
54 'pygments-markdown-lexer',
54 55 'Pylons',
55 56 'Pyro4',
56 57 'Routes',
@@ -101,6 +102,7 b' requirements = ['
101 102 'repoze.lru',
102 103 'requests',
103 104 'simplejson',
105 'subprocess32',
104 106 'waitress',
105 107 'zope.cachedescriptors',
106 108 'dogpile.cache',
@@ -117,12 +119,12 b' test_requirements = ['
117 119 'WebTest',
118 120 'configobj',
119 121 'cssselect',
120 'flake8',
121 122 'lxml',
122 123 'mock',
123 124 'pytest',
124 125 'pytest-cov',
125 126 'pytest-runner',
127 'pytest-sugar',
126 128 ]
127 129
128 130 setup_requirements = [
@@ -1,57 +1,83 b''
1 1 { pkgs ? (import <nixpkgs> {})
2 , vcsserverPath ? "./../rhodecode-vcsserver"
3 , vcsserverNix ? "shell.nix"
2 , pythonPackages ? "python27Packages"
4 3 , doCheck ? true
4 , sourcesOverrides ? {}
5 , doDevelopInstall ? true
5 6 }:
6 7
7 8 let
8
9 # Convert vcsserverPath to absolute path.
10 vcsserverAbsPath =
11 if pkgs.lib.strings.hasPrefix "/" vcsserverPath then
12 builtins.toPath "${vcsserverPath}"
13 else
14 builtins.toPath ("${builtins.getEnv "PWD"}/${vcsserverPath}");
9 # Get sources from config and update them with overrides.
10 sources = (pkgs.config.rc.sources or {}) // sourcesOverrides;
15 11
16 # Import vcsserver if nix file exists, otherwise set it to null.
17 vcsserver =
18 let
19 nixFile = "${vcsserverAbsPath}/${vcsserverNix}";
20 in
21 if pkgs.lib.pathExists "${nixFile}" then
22 builtins.trace
23 "Using local vcsserver from ${nixFile}"
24 import "${nixFile}" {inherit pkgs;}
25 else
26 null;
27
28 hasVcsserver = !isNull vcsserver;
29
30 enterprise = import ./default.nix {
31 inherit pkgs doCheck;
12 enterprise-ce = import ./default.nix {
13 inherit pkgs pythonPackages doCheck;
32 14 };
33 15
34 pythonPackages = enterprise.pythonPackages;
16 ce-pythonPackages = enterprise-ce.pythonPackages;
35 17
36 in enterprise.override (attrs: {
18 # This method looks up a path from `pkgs.config.rc.sources` and returns a
19 # shell script which does a `python setup.py develop` installation of it. If
20 # no path is found it will return an empty string.
21 optionalDevelopInstall = attributeName:
22 let
23 path = pkgs.lib.attrByPath [attributeName] null sources;
24 doIt = doDevelopInstall && path != null;
25 in
26 pkgs.lib.optionalString doIt (
27 builtins.trace "Develop install of ${attributeName} from ${path}" ''
28 echo "Develop install of '${attributeName}' from '${path}' [BEGIN]"
29 pushd ${path}
30 python setup.py develop --prefix $tmp_path --allow-hosts ""
31 popd
32 echo "Develop install of '${attributeName}' from '${path}' [DONE]"
33 '');
34
35 # This method looks up a path from `pkgs.config.rc.sources` and imports the
36 # default.nix file if it exists. It returns the list of build inputs. If no
37 # path is found it will return an empty list.
38 optionalDevelopInstallBuildInputs = attributeName:
39 let
40 path = pkgs.lib.attrByPath [attributeName] null sources;
41 nixFile = "${path}/default.nix";
42 doIt = doDevelopInstall && path != null && pkgs.lib.pathExists "${nixFile}";
43 derivate = import "${nixFile}" {
44 inherit doCheck pkgs pythonPackages;
45 };
46 in
47 pkgs.lib.lists.optionals doIt derivate.propagatedNativeBuildInputs;
48
49 developInstalls = [ "rhodecode-vcsserver" ];
50
51 in enterprise-ce.override (attrs: {
37 52 # Avoid that we dump any sources into the store when entering the shell and
38 53 # make development a little bit more convenient.
39 54 src = null;
40 55
41 56 buildInputs =
42 57 attrs.buildInputs ++
43 pkgs.lib.optionals (hasVcsserver) vcsserver.propagatedNativeBuildInputs ++
44 (with pythonPackages; [
58 pkgs.lib.lists.concatMap optionalDevelopInstallBuildInputs developInstalls ++
59 (with ce-pythonPackages; [
45 60 bumpversion
46 61 invoke
47 62 ipdb
48 63 ]);
49 64
50 shellHook = attrs.shellHook +
51 pkgs.lib.strings.optionalString (hasVcsserver) ''
52 # Setup the vcsserver development egg.
53 pushd ${vcsserverAbsPath}
54 python setup.py develop --prefix $tmp_path --allow-hosts ""
55 popd
56 '';
65 # Somewhat snappier setup of the development environment
66 # TODO: think of supporting a stable path again, so that multiple shells
67 # can share it.
68 preShellHook = enterprise-ce.linkNodeAndBowerPackages + ''
69 # Custom prompt to distinguish from other dev envs.
70 export PS1="\n\[\033[1;32m\][CE-shell:\w]$\[\033[0m\] "
71
72 # Setup a temporary directory.
73 tmp_path=$(mktemp -d)
74 export PATH="$tmp_path/bin:$PATH"
75 export PYTHONPATH="$tmp_path/${ce-pythonPackages.python.sitePackages}:$PYTHONPATH"
76 mkdir -p $tmp_path/${ce-pythonPackages.python.sitePackages}
77
78 # Develop installations
79 python setup.py develop --prefix $tmp_path --allow-hosts ""
80 echo "Additional develop installs"
81 '' + pkgs.lib.strings.concatMapStrings optionalDevelopInstall developInstalls;
82
57 83 })
@@ -1,214 +0,0 b''
1 # -*- coding: utf-8 -*-
2
3 # Copyright (C) 2011-2016 RhodeCode GmbH
4 #
5 # This program is free software: you can redistribute it and/or modify
6 # it under the terms of the GNU Affero General Public License, version 3
7 # (only), as published by the Free Software Foundation.
8 #
9 # This program is distributed in the hope that it will be useful,
10 # but WITHOUT ANY WARRANTY; without even the implied warranty of
11 # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
12 # GNU General Public License for more details.
13 #
14 # You should have received a copy of the GNU Affero General Public License
15 # along with this program. If not, see <http://www.gnu.org/licenses/>.
16 #
17 # This program is dual-licensed. If you wish to learn more about the
18 # RhodeCode Enterprise Edition, including its added features, Support services,
19 # and proprietary license terms, please see https://rhodecode.com/licenses/
20
21
22 """
23 Annotation library for usage in rhodecode, previously part of vcs
24 """
25
26 import StringIO
27
28 from pygments import highlight
29 from pygments.formatters import HtmlFormatter
30
31 from rhodecode.lib.vcs.exceptions import VCSError
32 from rhodecode.lib.vcs.nodes import FileNode
33
34
35 def annotate_highlight(
36 filenode, annotate_from_commit_func=None,
37 order=None, headers=None, **options):
38 """
39 Returns html portion containing annotated table with 3 columns: line
40 numbers, commit information and pygmentized line of code.
41
42 :param filenode: FileNode object
43 :param annotate_from_commit_func: function taking commit and
44 returning single annotate cell; needs break line at the end
45 :param order: ordered sequence of ``ls`` (line numbers column),
46 ``annotate`` (annotate column), ``code`` (code column); Default is
47 ``['ls', 'annotate', 'code']``
48 :param headers: dictionary with headers (keys are what's in ``order``
49 parameter)
50 """
51 from rhodecode.lib.helpers import get_lexer_for_filenode
52 options['linenos'] = True
53 formatter = AnnotateHtmlFormatter(
54 filenode=filenode, order=order, headers=headers,
55 annotate_from_commit_func=annotate_from_commit_func, **options)
56 lexer = get_lexer_for_filenode(filenode)
57 highlighted = highlight(filenode.content, lexer, formatter)
58 return highlighted
59
60
61 class AnnotateHtmlFormatter(HtmlFormatter):
62
63 def __init__(
64 self, filenode, annotate_from_commit_func=None,
65 order=None, **options):
66 """
67 If ``annotate_from_commit_func`` is passed, it should be a function
68 which returns string from the given commit. For example, we may pass
69 following function as ``annotate_from_commit_func``::
70
71 def commit_to_anchor(commit):
72 return '<a href="/commits/%s/">%s</a>\n' %\
73 (commit.id, commit.id)
74
75 :param annotate_from_commit_func: see above
76 :param order: (default: ``['ls', 'annotate', 'code']``); order of
77 columns;
78 :param options: standard Pygments HtmlFormatter options; there is one
79 extra option though, ``headers``. For instance we can pass::
80
81 formatter = AnnotateHtmlFormatter(filenode, headers={
82 'ls': '#',
83 'annotate': 'Annotate',
84 'code': 'Code',
85 })
86
87 """
88 super(AnnotateHtmlFormatter, self).__init__(**options)
89 self.annotate_from_commit_func = annotate_from_commit_func
90 self.order = order or ('ls', 'annotate', 'code')
91 headers = options.pop('headers', None)
92 if headers and not (
93 'ls' in headers and 'annotate' in headers and 'code' in headers):
94 raise ValueError(
95 "If headers option dict is specified it must "
96 "all 'ls', 'annotate' and 'code' keys")
97 self.headers = headers
98 if isinstance(filenode, FileNode):
99 self.filenode = filenode
100 else:
101 raise VCSError(
102 "This formatter expect FileNode parameter, not %r" %
103 type(filenode))
104
105 def annotate_from_commit(self, commit):
106 """
107 Returns full html line for single commit per annotated line.
108 """
109 if self.annotate_from_commit_func:
110 return self.annotate_from_commit_func(commit)
111 else:
112 return commit.id + '\n'
113
114 def _wrap_tablelinenos(self, inner):
115 dummyoutfile = StringIO.StringIO()
116 lncount = 0
117 for t, line in inner:
118 if t:
119 lncount += 1
120 dummyoutfile.write(line)
121
122 fl = self.linenostart
123 mw = len(str(lncount + fl - 1))
124 sp = self.linenospecial
125 st = self.linenostep
126 la = self.lineanchors
127 aln = self.anchorlinenos
128 if sp:
129 lines = []
130
131 for i in range(fl, fl + lncount):
132 if i % st == 0:
133 if i % sp == 0:
134 if aln:
135 lines.append('<a href="#%s-%d" class="special">'
136 '%*d</a>' %
137 (la, i, mw, i))
138 else:
139 lines.append('<span class="special">'
140 '%*d</span>' % (mw, i))
141 else:
142 if aln:
143 lines.append('<a href="#%s-%d">'
144 '%*d</a>' % (la, i, mw, i))
145 else:
146 lines.append('%*d' % (mw, i))
147 else:
148 lines.append('')
149 ls = '\n'.join(lines)
150 else:
151 lines = []
152 for i in range(fl, fl + lncount):
153 if i % st == 0:
154 if aln:
155 lines.append('<a href="#%s-%d">%*d</a>' \
156 % (la, i, mw, i))
157 else:
158 lines.append('%*d' % (mw, i))
159 else:
160 lines.append('')
161 ls = '\n'.join(lines)
162
163 cached = {}
164 annotate = []
165 for el in self.filenode.annotate:
166 commit_id = el[1]
167 if commit_id in cached:
168 result = cached[commit_id]
169 else:
170 commit = el[2]()
171 result = self.annotate_from_commit(commit)
172 cached[commit_id] = result
173 annotate.append(result)
174
175 annotate = ''.join(annotate)
176
177 # in case you wonder about the seemingly redundant <div> here:
178 # since the content in the other cell also is wrapped in a div,
179 # some browsers in some configurations seem to mess up the formatting.
180 '''
181 yield 0, ('<table class="%stable">' % self.cssclass +
182 '<tr><td class="linenos"><div class="linenodiv"><pre>' +
183 ls + '</pre></div></td>' +
184 '<td class="code">')
185 yield 0, dummyoutfile.getvalue()
186 yield 0, '</td></tr></table>'
187
188 '''
189 headers_row = []
190 if self.headers:
191 headers_row = ['<tr class="annotate-header">']
192 for key in self.order:
193 td = ''.join(('<td>', self.headers[key], '</td>'))
194 headers_row.append(td)
195 headers_row.append('</tr>')
196
197 body_row_start = ['<tr>']
198 for key in self.order:
199 if key == 'ls':
200 body_row_start.append(
201 '<td class="linenos"><div class="linenodiv"><pre>' +
202 ls + '</pre></div></td>')
203 elif key == 'annotate':
204 body_row_start.append(
205 '<td class="annotate"><div class="annotatediv"><pre>' +
206 annotate + '</pre></div></td>')
207 elif key == 'code':
208 body_row_start.append('<td class="code">')
209 yield 0, ('<table class="%stable">' % self.cssclass +
210 ''.join(headers_row) +
211 ''.join(body_row_start)
212 )
213 yield 0, dummyoutfile.getvalue()
214 yield 0, '</td></tr></table>'
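The removed formatter builds a three-column table: line numbers, a per-line annotation cell (typically the commit id), and the highlighted code. A stdlib-only sketch of that layout, independent of Pygments (the `annotate_table` helper and its arguments are hypothetical simplifications, not part of the removed API):

```python
# A stdlib-only sketch of the three-column layout the removed
# AnnotateHtmlFormatter produced: line numbers, per-line annotation
# (e.g. commit id), and the code itself. The real formatter renders
# the code column through Pygments instead of a plain <pre>.
def annotate_table(lines, commit_for_line, cssclass='annotate'):
    ls = '\n'.join('%d' % (i + 1) for i in range(len(lines)))
    ann = '\n'.join(commit_for_line(i) for i in range(len(lines)))
    code = '\n'.join(lines)
    return (
        '<table class="%stable"><tr>' % cssclass +
        '<td class="linenos"><pre>' + ls + '</pre></td>' +
        '<td class="annotate"><pre>' + ann + '</pre></td>' +
        '<td class="code"><pre>' + code + '</pre></td>' +
        '</tr></table>')

html = annotate_table(
    ['def add(a, b):', '    return a + b'],
    lambda i: 'deadbeef')
```

Note how the table class is built as ``cssclass + "table"``, mirroring the ``'<table class="%stable">' % self.cssclass`` expression in the removed `_wrap_tablelinenos`.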
This diff has been collapsed as it changes many lines (3,168 lines changed).
@@ -1,3168 +0,0 b''
1 /* Javascript plotting library for jQuery, version 0.8.3.
2
3 Copyright (c) 2007-2014 IOLA and Ole Laursen.
4 Licensed under the MIT license.
5
6 */
7
8 // first an inline dependency, jquery.colorhelpers.js, we inline it here
9 // for convenience
10
11 /* Plugin for jQuery for working with colors.
12 *
13 * Version 1.1.
14 *
15 * Inspiration from jQuery color animation plugin by John Resig.
16 *
17 * Released under the MIT license by Ole Laursen, October 2009.
18 *
19 * Examples:
20 *
21 * $.color.parse("#fff").scale('rgb', 0.25).add('a', -0.5).toString()
22 * var c = $.color.extract($("#mydiv"), 'background-color');
23 * console.log(c.r, c.g, c.b, c.a);
24 * $.color.make(100, 50, 25, 0.4).toString() // returns "rgba(100,50,25,0.4)"
25 *
26 * Note that .scale() and .add() return the same modified object
27 * instead of making a new one.
28 *
29 * V. 1.1: Fix error handling so e.g. parsing an empty string does
30 * produce a color rather than just crashing.
31 */
32 (function($){$.color={};$.color.make=function(r,g,b,a){var o={};o.r=r||0;o.g=g||0;o.b=b||0;o.a=a!=null?a:1;o.add=function(c,d){for(var i=0;i<c.length;++i)o[c.charAt(i)]+=d;return o.normalize()};o.scale=function(c,f){for(var i=0;i<c.length;++i)o[c.charAt(i)]*=f;return o.normalize()};o.toString=function(){if(o.a>=1){return"rgb("+[o.r,o.g,o.b].join(",")+")"}else{return"rgba("+[o.r,o.g,o.b,o.a].join(",")+")"}};o.normalize=function(){function clamp(min,value,max){return value<min?min:value>max?max:value}o.r=clamp(0,parseInt(o.r),255);o.g=clamp(0,parseInt(o.g),255);o.b=clamp(0,parseInt(o.b),255);o.a=clamp(0,o.a,1);return o};o.clone=function(){return $.color.make(o.r,o.b,o.g,o.a)};return o.normalize()};$.color.extract=function(elem,css){var c;do{c=elem.css(css).toLowerCase();if(c!=""&&c!="transparent")break;elem=elem.parent()}while(elem.length&&!$.nodeName(elem.get(0),"body"));if(c=="rgba(0, 0, 0, 0)")c="transparent";return $.color.parse(c)};$.color.parse=function(str){var res,m=$.color.make;if(res=/rgb\(\s*([0-9]{1,3})\s*,\s*([0-9]{1,3})\s*,\s*([0-9]{1,3})\s*\)/.exec(str))return m(parseInt(res[1],10),parseInt(res[2],10),parseInt(res[3],10));if(res=/rgba\(\s*([0-9]{1,3})\s*,\s*([0-9]{1,3})\s*,\s*([0-9]{1,3})\s*,\s*([0-9]+(?:\.[0-9]+)?)\s*\)/.exec(str))return m(parseInt(res[1],10),parseInt(res[2],10),parseInt(res[3],10),parseFloat(res[4]));if(res=/rgb\(\s*([0-9]+(?:\.[0-9]+)?)\%\s*,\s*([0-9]+(?:\.[0-9]+)?)\%\s*,\s*([0-9]+(?:\.[0-9]+)?)\%\s*\)/.exec(str))return m(parseFloat(res[1])*2.55,parseFloat(res[2])*2.55,parseFloat(res[3])*2.55);if(res=/rgba\(\s*([0-9]+(?:\.[0-9]+)?)\%\s*,\s*([0-9]+(?:\.[0-9]+)?)\%\s*,\s*([0-9]+(?:\.[0-9]+)?)\%\s*,\s*([0-9]+(?:\.[0-9]+)?)\s*\)/.exec(str))return m(parseFloat(res[1])*2.55,parseFloat(res[2])*2.55,parseFloat(res[3])*2.55,parseFloat(res[4]));if(res=/#([a-fA-F0-9]{2})([a-fA-F0-9]{2})([a-fA-F0-9]{2})/.exec(str))return 
m(parseInt(res[1],16),parseInt(res[2],16),parseInt(res[3],16));if(res=/#([a-fA-F0-9])([a-fA-F0-9])([a-fA-F0-9])/.exec(str))return m(parseInt(res[1]+res[1],16),parseInt(res[2]+res[2],16),parseInt(res[3]+res[3],16));var name=$.trim(str).toLowerCase();if(name=="transparent")return m(255,255,255,0);else{res=lookupColors[name]||[0,0,0];return m(res[0],res[1],res[2])}};var lookupColors={aqua:[0,255,255],azure:[240,255,255],beige:[245,245,220],black:[0,0,0],blue:[0,0,255],brown:[165,42,42],cyan:[0,255,255],darkblue:[0,0,139],darkcyan:[0,139,139],darkgrey:[169,169,169],darkgreen:[0,100,0],darkkhaki:[189,183,107],darkmagenta:[139,0,139],darkolivegreen:[85,107,47],darkorange:[255,140,0],darkorchid:[153,50,204],darkred:[139,0,0],darksalmon:[233,150,122],darkviolet:[148,0,211],fuchsia:[255,0,255],gold:[255,215,0],green:[0,128,0],indigo:[75,0,130],khaki:[240,230,140],lightblue:[173,216,230],lightcyan:[224,255,255],lightgreen:[144,238,144],lightgrey:[211,211,211],lightpink:[255,182,193],lightyellow:[255,255,224],lime:[0,255,0],magenta:[255,0,255],maroon:[128,0,0],navy:[0,0,128],olive:[128,128,0],orange:[255,165,0],pink:[255,192,203],purple:[128,0,128],violet:[128,0,128],red:[255,0,0],silver:[192,192,192],white:[255,255,255],yellow:[255,255,0]}})(jQuery);
33
34 // the actual Flot code
35 (function($) {
36
37 // Cache the prototype hasOwnProperty for faster access
38
39 var hasOwnProperty = Object.prototype.hasOwnProperty;
40
41 // A shim to provide 'detach' to jQuery versions prior to 1.4. Using a DOM
42 // operation produces the same effect as detach, i.e. removing the element
43 // without touching its jQuery data.
44
45 // Do not merge this into Flot 0.9, since it requires jQuery 1.4.4+.
46
47 if (!$.fn.detach) {
48 $.fn.detach = function() {
49 return this.each(function() {
50 if (this.parentNode) {
51 this.parentNode.removeChild( this );
52 }
53 });
54 };
55 }
56
57 ///////////////////////////////////////////////////////////////////////////
58 // The Canvas object is a wrapper around an HTML5 <canvas> tag.
59 //
60 // @constructor
61 // @param {string} cls List of classes to apply to the canvas.
62 // @param {element} container Element onto which to append the canvas.
63 //
64 // Requiring a container is a little iffy, but unfortunately canvas
65 // operations don't work unless the canvas is attached to the DOM.
66
67 function Canvas(cls, container) {
68
69 var element = container.children("." + cls)[0];
70
71 if (element == null) {
72
73 element = document.createElement("canvas");
74 element.className = cls;
75
76 $(element).css({ direction: "ltr", position: "absolute", left: 0, top: 0 })
77 .appendTo(container);
78
79 // If HTML5 Canvas isn't available, fall back to [Ex|Flash]canvas
80
81 if (!element.getContext) {
82 if (window.G_vmlCanvasManager) {
83 element = window.G_vmlCanvasManager.initElement(element);
84 } else {
85 throw new Error("Canvas is not available. If you're using IE with a fall-back such as Excanvas, then there's either a mistake in your conditional include, or the page has no DOCTYPE and is rendering in Quirks Mode.");
86 }
87 }
88 }
89
90 this.element = element;
91
92 var context = this.context = element.getContext("2d");
93
94 // Determine the screen's ratio of physical to device-independent
95 // pixels. This is the ratio between the canvas width that the browser
96 // advertises and the number of pixels actually present in that space.
97
98 // The iPhone 4, for example, has a device-independent width of 320px,
99 // but its screen is actually 640px wide. It therefore has a pixel
100 // ratio of 2, while most normal devices have a ratio of 1.
101
102 var devicePixelRatio = window.devicePixelRatio || 1,
103 backingStoreRatio =
104 context.webkitBackingStorePixelRatio ||
105 context.mozBackingStorePixelRatio ||
106 context.msBackingStorePixelRatio ||
107 context.oBackingStorePixelRatio ||
108 context.backingStorePixelRatio || 1;
109
110 this.pixelRatio = devicePixelRatio / backingStoreRatio;
111
112 // Size the canvas to match the internal dimensions of its container
113
114 this.resize(container.width(), container.height());
115
116 // Collection of HTML div layers for text overlaid onto the canvas
117
118 this.textContainer = null;
119 this.text = {};
120
121 // Cache of text fragments and metrics, so we can avoid expensively
122 // re-calculating them when the plot is re-rendered in a loop.
123
124 this._textCache = {};
125 }
126
127 // Resizes the canvas to the given dimensions.
128 //
129 // @param {number} width New width of the canvas, in pixels.
130 // @param {number} height New height of the canvas, in pixels.
131
132 Canvas.prototype.resize = function(width, height) {
133
134 if (width <= 0 || height <= 0) {
135 throw new Error("Invalid dimensions for plot, width = " + width + ", height = " + height);
136 }
137
138 var element = this.element,
139 context = this.context,
140 pixelRatio = this.pixelRatio;
141
142 // Resize the canvas, increasing its density based on the display's
143 // pixel ratio; basically giving it more pixels without increasing the
144 // size of its element, to take advantage of the fact that retina
145 // displays have that many more pixels in the same advertised space.
146
147 // Resizing should reset the state (excanvas seems to be buggy though)
148
149 if (this.width != width) {
150 element.width = width * pixelRatio;
151 element.style.width = width + "px";
152 this.width = width;
153 }
154
155 if (this.height != height) {
156 element.height = height * pixelRatio;
157 element.style.height = height + "px";
158 this.height = height;
159 }
160
161 // Save the context, so we can reset in case we get replotted. The
162 // restore ensure that we're really back at the initial state, and
163 // should be safe even if we haven't saved the initial state yet.
164
165 context.restore();
166 context.save();
167
168 // Scale the coordinate space to match the display density; so even though we
169 // may have twice as many pixels, we still want lines and other drawing to
170 // appear at the same size; the extra pixels will just make them crisper.
171
172 context.scale(pixelRatio, pixelRatio);
173 };
174
175 // Clears the entire canvas area, not including any overlaid HTML text
176
177 Canvas.prototype.clear = function() {
178 this.context.clearRect(0, 0, this.width, this.height);
179 };
180
181 // Finishes rendering the canvas, including managing the text overlay.
182
183 Canvas.prototype.render = function() {
184
185 var cache = this._textCache;
186
187 // For each text layer, add elements marked as active that haven't
188 // already been rendered, and remove those that are no longer active.
189
190 for (var layerKey in cache) {
191 if (hasOwnProperty.call(cache, layerKey)) {
192
193 var layer = this.getTextLayer(layerKey),
194 layerCache = cache[layerKey];
195
196 layer.hide();
197
198 for (var styleKey in layerCache) {
199 if (hasOwnProperty.call(layerCache, styleKey)) {
200 var styleCache = layerCache[styleKey];
201 for (var key in styleCache) {
202 if (hasOwnProperty.call(styleCache, key)) {
203
204 var positions = styleCache[key].positions;
205
206 for (var i = 0, position; position = positions[i]; i++) {
207 if (position.active) {
208 if (!position.rendered) {
209 layer.append(position.element);
210 position.rendered = true;
211 }
212 } else {
213 positions.splice(i--, 1);
214 if (position.rendered) {
215 position.element.detach();
216 }
217 }
218 }
219
220 if (positions.length == 0) {
221 delete styleCache[key];
222 }
223 }
224 }
225 }
226 }
227
228 layer.show();
229 }
230 }
231 };
232
233 // Creates (if necessary) and returns the text overlay container.
234 //
235 // @param {string} classes String of space-separated CSS classes used to
236 // uniquely identify the text layer.
237 // @return {object} The jQuery-wrapped text-layer div.
238
239 Canvas.prototype.getTextLayer = function(classes) {
240
241 var layer = this.text[classes];
242
243 // Create the text layer if it doesn't exist
244
245 if (layer == null) {
246
247 // Create the text layer container, if it doesn't exist
248
249 if (this.textContainer == null) {
250 this.textContainer = $("<div class='flot-text'></div>")
251 .css({
252 position: "absolute",
253 top: 0,
254 left: 0,
255 bottom: 0,
256 right: 0,
257 'font-size': "smaller",
258 color: "#545454"
259 })
260 .insertAfter(this.element);
261 }
262
263 layer = this.text[classes] = $("<div></div>")
264 .addClass(classes)
265 .css({
266 position: "absolute",
267 top: 0,
268 left: 0,
269 bottom: 0,
270 right: 0
271 })
272 .appendTo(this.textContainer);
273 }
274
275 return layer;
276 };
277
278 // Creates (if necessary) and returns a text info object.
279 //
280 // The object looks like this:
281 //
282 // {
283 // width: Width of the text's wrapper div.
284 // height: Height of the text's wrapper div.
285 // element: The jQuery-wrapped HTML div containing the text.
286 // positions: Array of positions at which this text is drawn.
287 // }
288 //
289 // The positions array contains objects that look like this:
290 //
291 // {
292 // active: Flag indicating whether the text should be visible.
293 // rendered: Flag indicating whether the text is currently visible.
294 // element: The jQuery-wrapped HTML div containing the text.
295 // x: X coordinate at which to draw the text.
296 // y: Y coordinate at which to draw the text.
297 // }
298 //
299 // Each position after the first receives a clone of the original element.
300 //
301 // The idea is that the width, height, and general 'identity' of the
302 // text is constant no matter where it is placed; the placements are a
303 // secondary property.
304 //
305 // Canvas maintains a cache of recently-used text info objects; getTextInfo
306 // either returns the cached element or creates a new entry.
307 //
308 // @param {string} layer A string of space-separated CSS classes uniquely
309 // identifying the layer containing this text.
310 // @param {string} text Text string to retrieve info for.
311 // @param {(string|object)=} font Either a string of space-separated CSS
312 // classes or a font-spec object, defining the text's font and style.
313 // @param {number=} angle Angle at which to rotate the text, in degrees.
314 // Angle is currently unused, it will be implemented in the future.
315 // @param {number=} width Maximum width of the text before it wraps.
316 // @return {object} a text info object.
317
318 Canvas.prototype.getTextInfo = function(layer, text, font, angle, width) {
319
320 var textStyle, layerCache, styleCache, info;
321
322 // Cast the value to a string, in case we were given a number or such
323
324 text = "" + text;
325
326 // If the font is a font-spec object, generate a CSS font definition
327
328 if (typeof font === "object") {
329 textStyle = font.style + " " + font.variant + " " + font.weight + " " + font.size + "px/" + font.lineHeight + "px " + font.family;
330 } else {
331 textStyle = font;
332 }
333
334 // Retrieve (or create) the cache for the text's layer and styles
335
336 layerCache = this._textCache[layer];
337
338 if (layerCache == null) {
339 layerCache = this._textCache[layer] = {};
340 }
341
342 styleCache = layerCache[textStyle];
343
344 if (styleCache == null) {
345 styleCache = layerCache[textStyle] = {};
346 }
347
348 info = styleCache[text];
349
350 // If we can't find a matching element in our cache, create a new one
351
352 if (info == null) {
353
354 var element = $("<div></div>").html(text)
355 .css({
356 position: "absolute",
357 'max-width': width,
358 top: -9999
359 })
360 .appendTo(this.getTextLayer(layer));
361
362 if (typeof font === "object") {
363 element.css({
364 font: textStyle,
365 color: font.color
366 });
367 } else if (typeof font === "string") {
368 element.addClass(font);
369 }
370
371 info = styleCache[text] = {
372 width: element.outerWidth(true),
373 height: element.outerHeight(true),
374 element: element,
375 positions: []
376 };
377
378 element.detach();
379 }
380
381 return info;
382 };

// Adds a text string to the canvas text overlay.
//
// The text isn't drawn immediately; it is marked as rendering, which will
// result in its addition to the canvas on the next render pass.
//
// @param {string} layer A string of space-separated CSS classes uniquely
// identifying the layer containing this text.
// @param {number} x X coordinate at which to draw the text.
// @param {number} y Y coordinate at which to draw the text.
// @param {string} text Text string to draw.
// @param {(string|object)=} font Either a string of space-separated CSS
// classes or a font-spec object, defining the text's font and style.
// @param {number=} angle Angle at which to rotate the text, in degrees.
// Angle is currently unused; it will be implemented in the future.
// @param {number=} width Maximum width of the text before it wraps.
// @param {string=} halign Horizontal alignment of the text; either "left",
// "center" or "right".
// @param {string=} valign Vertical alignment of the text; either "top",
// "middle" or "bottom".

Canvas.prototype.addText = function(layer, x, y, text, font, angle, width, halign, valign) {

    var info = this.getTextInfo(layer, text, font, angle, width),
        positions = info.positions;

    // Tweak the div's position to match the text's alignment

    if (halign == "center") {
        x -= info.width / 2;
    } else if (halign == "right") {
        x -= info.width;
    }

    if (valign == "middle") {
        y -= info.height / 2;
    } else if (valign == "bottom") {
        y -= info.height;
    }

    // Determine whether this text already exists at this position.
    // If so, mark it for inclusion in the next render pass.

    for (var i = 0, position; position = positions[i]; i++) {
        if (position.x == x && position.y == y) {
            position.active = true;
            return;
        }
    }

    // If the text doesn't exist at this position, create a new entry

    // For the very first position we'll re-use the original element,
    // while for subsequent ones we'll clone it.

    position = {
        active: true,
        rendered: false,
        element: positions.length ? info.element.clone() : info.element,
        x: x,
        y: y
    };

    positions.push(position);

    // Move the element to its final position within the container

    position.element.css({
        top: Math.round(y),
        left: Math.round(x),
        'text-align': halign // In case the text wraps
    });
};

// Removes one or more text strings from the canvas text overlay.
//
// If no parameters are given, all text within the layer is removed.
//
// Note that the text is not immediately removed; it is simply marked as
// inactive, which will result in its removal on the next render pass.
// This avoids the performance penalty for 'clear and redraw' behavior,
// where we potentially get rid of all text on a layer, but will likely
// add back most or all of it later, as when redrawing axes, for example.
//
// @param {string} layer A string of space-separated CSS classes uniquely
// identifying the layer containing this text.
// @param {number=} x X coordinate of the text.
// @param {number=} y Y coordinate of the text.
// @param {string=} text Text string to remove.
// @param {(string|object)=} font Either a string of space-separated CSS
// classes or a font-spec object, defining the text's font and style.
// @param {number=} angle Angle at which the text is rotated, in degrees.
// Angle is currently unused; it will be implemented in the future.

Canvas.prototype.removeText = function(layer, x, y, text, font, angle) {
    if (text == null) {
        var layerCache = this._textCache[layer];
        if (layerCache != null) {
            for (var styleKey in layerCache) {
                if (hasOwnProperty.call(layerCache, styleKey)) {
                    var styleCache = layerCache[styleKey];
                    for (var key in styleCache) {
                        if (hasOwnProperty.call(styleCache, key)) {
                            var positions = styleCache[key].positions;
                            for (var i = 0, position; position = positions[i]; i++) {
                                position.active = false;
                            }
                        }
                    }
                }
            }
        }
    } else {
        var positions = this.getTextInfo(layer, text, font, angle).positions;
        for (var i = 0, position; position = positions[i]; i++) {
            if (position.x == x && position.y == y) {
                position.active = false;
            }
        }
    }
};
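
// A hedged usage sketch of the text-overlay API above. It is illustrative
// only: the `surface` Canvas instance and the layer/class strings here are
// assumptions chosen for the example, not fixed names from this file.
//
//     surface.addText("flot-x-axis xAxis", 40, 120, "Jan",
//         "flot-tick-label tickLabel", null, 60, "center", "top");
//     surface.render();                         // text is drawn on this pass
//     surface.removeText("flot-x-axis xAxis");  // mark the layer's text inactive
//     surface.render();                         // text is removed on this pass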

///////////////////////////////////////////////////////////////////////////
// The top-level container for the entire plot.

function Plot(placeholder, data_, options_, plugins) {
    // data is of the form:
    // [ series1, series2 ... ]
    // where series is either just the data as [ [x1, y1], [x2, y2], ... ]
    // or { data: [ [x1, y1], [x2, y2], ... ], label: "some label", ... }

    var series = [],
        options = {
            // the color theme used for graphs
            colors: ["#edc240", "#afd8f8", "#cb4b4b", "#4da74d", "#9440ed"],
            legend: {
                show: true,
                noColumns: 1, // number of columns in legend table
                labelFormatter: null, // fn: string -> string
                labelBoxBorderColor: "#ccc", // border color for the little label boxes
                container: null, // container (as jQuery object) to put legend in, null means default on top of graph
                position: "ne", // position of default legend container within plot
                margin: 5, // distance from grid edge to default legend container within plot
                backgroundColor: null, // null means auto-detect
                backgroundOpacity: 0.85, // set to 0 to avoid background
                sorted: null // default to no legend sorting
            },
            xaxis: {
                show: null, // null = auto-detect, true = always, false = never
                position: "bottom", // or "top"
                mode: null, // null or "time"
                font: null, // null (derived from CSS in placeholder) or object like { size: 11, lineHeight: 13, style: "italic", weight: "bold", family: "sans-serif", variant: "small-caps" }
                color: null, // base color, labels, ticks
                tickColor: null, // possibly different color of ticks, e.g. "rgba(0,0,0,0.15)"
                transform: null, // null or f: number -> number to transform axis
                inverseTransform: null, // if transform is set, this should be the inverse function
                min: null, // min. value to show, null means set automatically
                max: null, // max. value to show, null means set automatically
                autoscaleMargin: null, // margin in % to add if auto-setting min/max
                ticks: null, // either [1, 3] or [[1, "a"], 3] or (fn: axis info -> ticks) or app. number of ticks for auto-ticks
                tickFormatter: null, // fn: number -> string
                labelWidth: null, // size of tick labels in pixels
                labelHeight: null,
                reserveSpace: null, // whether to reserve space even if axis isn't shown
                tickLength: null, // size in pixels of ticks, or "full" for whole line
                alignTicksWithAxis: null, // axis number or null for no sync
                tickDecimals: null, // no. of decimals, null means auto
                tickSize: null, // number or [number, "unit"]
                minTickSize: null // number or [number, "unit"]
            },
            yaxis: {
                autoscaleMargin: 0.02,
                position: "left" // or "right"
            },
            xaxes: [],
            yaxes: [],
            series: {
                points: {
                    show: false,
                    radius: 3,
                    lineWidth: 2, // in pixels
                    fill: true,
                    fillColor: "#ffffff",
                    symbol: "circle" // or callback
                },
                lines: {
                    // we don't put in show: false so we can see
                    // whether lines were actively disabled
                    lineWidth: 2, // in pixels
                    fill: false,
                    fillColor: null,
                    steps: false
                    // Omit 'zero', so we can later default its value to
                    // match that of the 'fill' option.
                },
                bars: {
                    show: false,
                    lineWidth: 2, // in pixels
                    barWidth: 1, // in units of the x axis
                    fill: true,
                    fillColor: null,
                    align: "left", // "left", "right", or "center"
                    horizontal: false,
                    zero: true
                },
                shadowSize: 3,
                highlightColor: null
            },
            grid: {
                show: true,
                aboveData: false,
                color: "#545454", // primary color used for outline and labels
                backgroundColor: null, // null for transparent, else color
                borderColor: null, // set if different from the grid color
                tickColor: null, // color for the ticks, e.g. "rgba(0,0,0,0.15)"
                margin: 0, // distance from the canvas edge to the grid
                labelMargin: 5, // in pixels
                axisMargin: 8, // in pixels
                borderWidth: 2, // in pixels
                minBorderMargin: null, // in pixels, null means taken from points radius
                markings: null, // array of ranges or fn: axes -> array of ranges
                markingsColor: "#f4f4f4",
                markingsLineWidth: 2,
                // interactive stuff
                clickable: false,
                hoverable: false,
                autoHighlight: true, // highlight in case mouse is near
                mouseActiveRadius: 10 // how far the mouse can be away to activate an item
            },
            interaction: {
                redrawOverlayInterval: 1000 / 60 // time between updates, -1 means in same flow
            },
            hooks: {}
        },
        surface = null, // the canvas for the plot itself
        overlay = null, // canvas for interactive stuff on top of plot
        eventHolder = null, // jQuery object that events should be bound to
        ctx = null, octx = null,
        xaxes = [], yaxes = [],
        plotOffset = { left: 0, right: 0, top: 0, bottom: 0 },
        plotWidth = 0, plotHeight = 0,
        hooks = {
            processOptions: [],
            processRawData: [],
            processDatapoints: [],
            processOffset: [],
            drawBackground: [],
            drawSeries: [],
            draw: [],
            bindEvents: [],
            drawOverlay: [],
            shutdown: []
        },
        plot = this;

    // public functions
    plot.setData = setData;
    plot.setupGrid = setupGrid;
    plot.draw = draw;
    plot.getPlaceholder = function() { return placeholder; };
    plot.getCanvas = function() { return surface.element; };
    plot.getPlotOffset = function() { return plotOffset; };
    plot.width = function () { return plotWidth; };
    plot.height = function () { return plotHeight; };
    plot.offset = function () {
        var o = eventHolder.offset();
        o.left += plotOffset.left;
        o.top += plotOffset.top;
        return o;
    };
    plot.getData = function () { return series; };
    plot.getAxes = function () {
        var res = {}, i;
        $.each(xaxes.concat(yaxes), function (_, axis) {
            if (axis)
                res[axis.direction + (axis.n != 1 ? axis.n : "") + "axis"] = axis;
        });
        return res;
    };
    plot.getXAxes = function () { return xaxes; };
    plot.getYAxes = function () { return yaxes; };
    plot.c2p = canvasToAxisCoords;
    plot.p2c = axisToCanvasCoords;
    plot.getOptions = function () { return options; };
    plot.highlight = highlight;
    plot.unhighlight = unhighlight;
    plot.triggerRedrawOverlay = triggerRedrawOverlay;
    plot.pointOffset = function(point) {
        return {
            left: parseInt(xaxes[axisNumber(point, "x") - 1].p2c(+point.x) + plotOffset.left, 10),
            top: parseInt(yaxes[axisNumber(point, "y") - 1].p2c(+point.y) + plotOffset.top, 10)
        };
    };
    plot.shutdown = shutdown;
    plot.destroy = function () {
        shutdown();
        placeholder.removeData("plot").empty();

        series = [];
        options = null;
        surface = null;
        overlay = null;
        eventHolder = null;
        ctx = null;
        octx = null;
        xaxes = [];
        yaxes = [];
        hooks = null;
        highlights = [];
        plot = null;
    };
    plot.resize = function () {
        var width = placeholder.width(),
            height = placeholder.height();
        surface.resize(width, height);
        overlay.resize(width, height);
    };
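
    // The functions exposed above form the object that $.plot returns; a
    // minimal, hedged usage sketch (the element id and data values are
    // illustrative, not taken from this file):
    //
    //     var plot = $.plot($("#placeholder"),
    //         [ { data: [[0, 3], [1, 8], [2, 5]], label: "series A" } ],
    //         { series: { lines: { show: true } } });
    //     var axes = plot.getAxes();   // e.g. axes.xaxis, axes.yaxis
    //     plot.resize();               // follow a placeholder size change
    //     plot.setupGrid();
    //     plot.draw();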

    // public attributes
    plot.hooks = hooks;

    // initialize
    initPlugins(plot);
    parseOptions(options_);
    setupCanvases();
    setData(data_);
    setupGrid();
    draw();
    bindEvents();

    function executeHooks(hook, args) {
        args = [plot].concat(args);
        for (var i = 0; i < hook.length; ++i)
            hook[i].apply(this, args);
    }

    function initPlugins() {

        // References to key classes, allowing plugins to modify them

        var classes = {
            Canvas: Canvas
        };

        for (var i = 0; i < plugins.length; ++i) {
            var p = plugins[i];
            p.init(plot, classes);
            if (p.options)
                $.extend(true, options, p.options);
        }
    }

    function parseOptions(opts) {

        $.extend(true, options, opts);

        // $.extend merges arrays rather than replacing them. When fewer
        // colors are provided than the default palette contains, we would
        // end up with those colors plus the remaining defaults, which is
        // not the expected behavior; avoid it by replacing them here.

        if (opts && opts.colors) {
            options.colors = opts.colors;
        }

        if (options.xaxis.color == null)
            options.xaxis.color = $.color.parse(options.grid.color).scale('a', 0.22).toString();
        if (options.yaxis.color == null)
            options.yaxis.color = $.color.parse(options.grid.color).scale('a', 0.22).toString();

        if (options.xaxis.tickColor == null) // grid.tickColor for back-compatibility
            options.xaxis.tickColor = options.grid.tickColor || options.xaxis.color;
        if (options.yaxis.tickColor == null) // grid.tickColor for back-compatibility
            options.yaxis.tickColor = options.grid.tickColor || options.yaxis.color;

        if (options.grid.borderColor == null)
            options.grid.borderColor = options.grid.color;
        if (options.grid.tickColor == null)
            options.grid.tickColor = $.color.parse(options.grid.color).scale('a', 0.22).toString();

        // Fill in defaults for axis options, including any unspecified
        // font-spec fields, if a font-spec was provided.

        // If no x/y axis options were provided, create one of each anyway,
        // since the rest of the code assumes that they exist.

        var i, axisOptions, axisCount,
            fontSize = placeholder.css("font-size"),
            fontSizeDefault = fontSize ? +fontSize.replace("px", "") : 13,
            fontDefaults = {
                style: placeholder.css("font-style"),
                size: Math.round(0.8 * fontSizeDefault),
                variant: placeholder.css("font-variant"),
                weight: placeholder.css("font-weight"),
                family: placeholder.css("font-family")
            };

        axisCount = options.xaxes.length || 1;
        for (i = 0; i < axisCount; ++i) {

            axisOptions = options.xaxes[i];
            if (axisOptions && !axisOptions.tickColor) {
                axisOptions.tickColor = axisOptions.color;
            }

            axisOptions = $.extend(true, {}, options.xaxis, axisOptions);
            options.xaxes[i] = axisOptions;

            if (axisOptions.font) {
                axisOptions.font = $.extend({}, fontDefaults, axisOptions.font);
                if (!axisOptions.font.color) {
                    axisOptions.font.color = axisOptions.color;
                }
                if (!axisOptions.font.lineHeight) {
                    axisOptions.font.lineHeight = Math.round(axisOptions.font.size * 1.15);
                }
            }
        }

        axisCount = options.yaxes.length || 1;
        for (i = 0; i < axisCount; ++i) {

            axisOptions = options.yaxes[i];
            if (axisOptions && !axisOptions.tickColor) {
                axisOptions.tickColor = axisOptions.color;
            }

            axisOptions = $.extend(true, {}, options.yaxis, axisOptions);
            options.yaxes[i] = axisOptions;

            if (axisOptions.font) {
                axisOptions.font = $.extend({}, fontDefaults, axisOptions.font);
                if (!axisOptions.font.color) {
                    axisOptions.font.color = axisOptions.color;
                }
                if (!axisOptions.font.lineHeight) {
                    axisOptions.font.lineHeight = Math.round(axisOptions.font.size * 1.15);
                }
            }
        }

        // backwards compatibility, to be removed in future
        if (options.xaxis.noTicks && options.xaxis.ticks == null)
            options.xaxis.ticks = options.xaxis.noTicks;
        if (options.yaxis.noTicks && options.yaxis.ticks == null)
            options.yaxis.ticks = options.yaxis.noTicks;
        if (options.x2axis) {
            options.xaxes[1] = $.extend(true, {}, options.xaxis, options.x2axis);
            options.xaxes[1].position = "top";
            // Override the inherit to allow the axis to auto-scale
            if (options.x2axis.min == null) {
                options.xaxes[1].min = null;
            }
            if (options.x2axis.max == null) {
                options.xaxes[1].max = null;
            }
        }
        if (options.y2axis) {
            options.yaxes[1] = $.extend(true, {}, options.yaxis, options.y2axis);
            options.yaxes[1].position = "right";
            // Override the inherit to allow the axis to auto-scale
            if (options.y2axis.min == null) {
                options.yaxes[1].min = null;
            }
            if (options.y2axis.max == null) {
                options.yaxes[1].max = null;
            }
        }
        if (options.grid.coloredAreas)
            options.grid.markings = options.grid.coloredAreas;
        if (options.grid.coloredAreasColor)
            options.grid.markingsColor = options.grid.coloredAreasColor;
        if (options.lines)
            $.extend(true, options.series.lines, options.lines);
        if (options.points)
            $.extend(true, options.series.points, options.points);
        if (options.bars)
            $.extend(true, options.series.bars, options.bars);
        if (options.shadowSize != null)
            options.series.shadowSize = options.shadowSize;
        if (options.highlightColor != null)
            options.series.highlightColor = options.highlightColor;

        // save options on axes for future reference
        for (i = 0; i < options.xaxes.length; ++i)
            getOrCreateAxis(xaxes, i + 1).options = options.xaxes[i];
        for (i = 0; i < options.yaxes.length; ++i)
            getOrCreateAxis(yaxes, i + 1).options = options.yaxes[i];

        // add hooks from options
        for (var n in hooks)
            if (options.hooks[n] && options.hooks[n].length)
                hooks[n] = hooks[n].concat(options.hooks[n]);

        executeHooks(hooks.processOptions, [options]);
    }

    function setData(d) {
        series = parseData(d);
        fillInSeriesOptions();
        processData();
    }

    function parseData(d) {
        var res = [];
        for (var i = 0; i < d.length; ++i) {
            var s = $.extend(true, {}, options.series);

            if (d[i].data != null) {
                s.data = d[i].data; // move the data instead of deep-copy
                delete d[i].data;

                $.extend(true, s, d[i]);

                d[i].data = s.data;
            }
            else
                s.data = d[i];
            res.push(s);
        }

        return res;
    }
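
    // The two input forms accepted by parseData above, side by side (the
    // data values and label are illustrative):
    //
    //     plot.setData([
    //         [[0, 1], [1, 2]],                            // raw data array
    //         { data: [[0, 2], [1, 4]], label: "doubled" } // full series object
    //     ]);
    //     plot.setupGrid();
    //     plot.draw();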

    function axisNumber(obj, coord) {
        var a = obj[coord + "axis"];
        if (typeof a == "object") // if we got a real axis, extract number
            a = a.n;
        if (typeof a != "number")
            a = 1; // default to first axis
        return a;
    }

    function allAxes() {
        // return flat array without annoying null entries
        return $.grep(xaxes.concat(yaxes), function (a) { return a; });
    }

    function canvasToAxisCoords(pos) {
        // return an object with x/y corresponding to all used axes
        var res = {}, i, axis;
        for (i = 0; i < xaxes.length; ++i) {
            axis = xaxes[i];
            if (axis && axis.used)
                res["x" + axis.n] = axis.c2p(pos.left);
        }

        for (i = 0; i < yaxes.length; ++i) {
            axis = yaxes[i];
            if (axis && axis.used)
                res["y" + axis.n] = axis.c2p(pos.top);
        }

        if (res.x1 !== undefined)
            res.x = res.x1;
        if (res.y1 !== undefined)
            res.y = res.y1;

        return res;
    }

    function axisToCanvasCoords(pos) {
        // get canvas coords from the first pair of x/y found in pos
        var res = {}, i, axis, key;

        for (i = 0; i < xaxes.length; ++i) {
            axis = xaxes[i];
            if (axis && axis.used) {
                key = "x" + axis.n;
                if (pos[key] == null && axis.n == 1)
                    key = "x";

                if (pos[key] != null) {
                    res.left = axis.p2c(pos[key]);
                    break;
                }
            }
        }

        for (i = 0; i < yaxes.length; ++i) {
            axis = yaxes[i];
            if (axis && axis.used) {
                key = "y" + axis.n;
                if (pos[key] == null && axis.n == 1)
                    key = "y";

                if (pos[key] != null) {
                    res.top = axis.p2c(pos[key]);
                    break;
                }
            }
        }

        return res;
    }

    function getOrCreateAxis(axes, number) {
        if (!axes[number - 1])
            axes[number - 1] = {
                n: number, // save the number for future reference
                direction: axes == xaxes ? "x" : "y",
                options: $.extend(true, {}, axes == xaxes ? options.xaxis : options.yaxis)
            };

        return axes[number - 1];
    }

    function fillInSeriesOptions() {

        var neededColors = series.length, maxIndex = -1, i;

        // Subtract the number of series that already have fixed colors or
        // color indexes from the number that we still need to generate.

        for (i = 0; i < series.length; ++i) {
            var sc = series[i].color;
            if (sc != null) {
                neededColors--;
                if (typeof sc == "number" && sc > maxIndex) {
                    maxIndex = sc;
                }
            }
        }

        // If any of the series have fixed color indexes, then we need to
        // generate at least as many colors as the highest index.

        if (neededColors <= maxIndex) {
            neededColors = maxIndex + 1;
        }

        // Generate all the colors, using first the option colors and then
        // variations on those colors once they're exhausted.

        var c, colors = [], colorPool = options.colors,
            colorPoolSize = colorPool.length, variation = 0;

        for (i = 0; i < neededColors; i++) {

            c = $.color.parse(colorPool[i % colorPoolSize] || "#666");

            // Each time we exhaust the colors in the pool we adjust
            // a scaling factor used to produce more variations on
            // those colors. The factor alternates negative/positive
            // to produce lighter/darker colors.

            // Reset the variation after every few cycles, or else
            // it will end up producing only white or black colors.

            if (i % colorPoolSize == 0 && i) {
                if (variation >= 0) {
                    if (variation < 0.5) {
                        variation = -variation - 0.2;
                    } else variation = 0;
                } else variation = -variation;
            }

            colors[i] = c.scale('rgb', 1 + variation);
        }

        // Finalize the series options, filling in their colors

        var colori = 0, s;
        for (i = 0; i < series.length; ++i) {
            s = series[i];

            // assign colors
            if (s.color == null) {
                s.color = colors[colori].toString();
                ++colori;
            }
            else if (typeof s.color == "number")
                s.color = colors[s.color].toString();

            // turn on lines automatically in case nothing is set
            if (s.lines.show == null) {
                var v, show = true;
                for (v in s)
                    if (s[v] && s[v].show) {
                        show = false;
                        break;
                    }
                if (show)
                    s.lines.show = true;
            }

            // If nothing was provided for lines.zero, default it to match
            // lines.fill, since areas by default should extend to zero.

            if (s.lines.zero == null) {
                s.lines.zero = !!s.lines.fill;
            }

            // setup axes
            s.xaxis = getOrCreateAxis(xaxes, axisNumber(s, "x"));
            s.yaxis = getOrCreateAxis(yaxes, axisNumber(s, "y"));
        }
    }

    function processData() {
        var topSentry = Number.POSITIVE_INFINITY,
            bottomSentry = Number.NEGATIVE_INFINITY,
            fakeInfinity = Number.MAX_VALUE,
            i, j, k, m, length,
            s, points, ps, x, y, axis, val, f, p,
            data, format;

        function updateAxis(axis, min, max) {
            if (min < axis.datamin && min != -fakeInfinity)
                axis.datamin = min;
            if (max > axis.datamax && max != fakeInfinity)
                axis.datamax = max;
        }

        $.each(allAxes(), function (_, axis) {
            // init axis
            axis.datamin = topSentry;
            axis.datamax = bottomSentry;
            axis.used = false;
        });

        for (i = 0; i < series.length; ++i) {
            s = series[i];
            s.datapoints = { points: [] };

            executeHooks(hooks.processRawData, [ s, s.data, s.datapoints ]);
        }

        // first pass: clean and copy data
        for (i = 0; i < series.length; ++i) {
            s = series[i];

            data = s.data;
            format = s.datapoints.format;

            if (!format) {
                format = [];
                // find out how to copy
                format.push({ x: true, number: true, required: true });
                format.push({ y: true, number: true, required: true });

                if (s.bars.show || (s.lines.show && s.lines.fill)) {
                    var autoscale = !!((s.bars.show && s.bars.zero) || (s.lines.show && s.lines.zero));
                    format.push({ y: true, number: true, required: false, defaultValue: 0, autoscale: autoscale });
                    if (s.bars.horizontal) {
                        delete format[format.length - 1].y;
                        format[format.length - 1].x = true;
                    }
                }

                s.datapoints.format = format;
            }

            if (s.datapoints.pointsize != null)
                continue; // already filled in

            s.datapoints.pointsize = format.length;

            ps = s.datapoints.pointsize;
            points = s.datapoints.points;

            var insertSteps = s.lines.show && s.lines.steps;
            s.xaxis.used = s.yaxis.used = true;

            for (j = k = 0; j < data.length; ++j, k += ps) {
                p = data[j];

                var nullify = p == null;
                if (!nullify) {
                    for (m = 0; m < ps; ++m) {
                        val = p[m];
                        f = format[m];

                        if (f) {
                            if (f.number && val != null) {
                                val = +val; // convert to number
                                if (isNaN(val))
                                    val = null;
                                else if (val == Infinity)
                                    val = fakeInfinity;
                                else if (val == -Infinity)
                                    val = -fakeInfinity;
                            }

                            if (val == null) {
                                if (f.required)
                                    nullify = true;

                                if (f.defaultValue != null)
                                    val = f.defaultValue;
                            }
                        }

                        points[k + m] = val;
                    }
                }

                if (nullify) {
                    for (m = 0; m < ps; ++m) {
                        val = points[k + m];
                        if (val != null) {
                            f = format[m];
                            // extract min/max info
                            if (f.autoscale !== false) {
                                if (f.x) {
                                    updateAxis(s.xaxis, val, val);
                                }
                                if (f.y) {
                                    updateAxis(s.yaxis, val, val);
                                }
                            }
                        }
                        points[k + m] = null;
                    }
                }
                else {
                    // a little bit of line specific stuff that
                    // perhaps shouldn't be here, but lacking
                    // better means...
                    if (insertSteps && k > 0
                        && points[k - ps] != null
                        && points[k - ps] != points[k]
                        && points[k - ps + 1] != points[k + 1]) {
                        // copy the point to make room for a middle point
                        for (m = 0; m < ps; ++m)
                            points[k + ps + m] = points[k + m];

                        // middle point has same y
                        points[k + 1] = points[k - ps + 1];

                        // we've added a point, better reflect that
                        k += ps;
                    }
                }
            }
        }

        // give the hooks a chance to run
        for (i = 0; i < series.length; ++i) {
            s = series[i];

            executeHooks(hooks.processDatapoints, [ s, s.datapoints ]);
        }

        // second pass: find datamax/datamin for auto-scaling
        for (i = 0; i < series.length; ++i) {
            s = series[i];
            points = s.datapoints.points;
            ps = s.datapoints.pointsize;
            format = s.datapoints.format;

            var xmin = topSentry, ymin = topSentry,
                xmax = bottomSentry, ymax = bottomSentry;

            for (j = 0; j < points.length; j += ps) {
                if (points[j] == null)
                    continue;

                for (m = 0; m < ps; ++m) {
                    val = points[j + m];
                    f = format[m];
                    if (!f || f.autoscale === false || val == fakeInfinity || val == -fakeInfinity)
                        continue;

                    if (f.x) {
                        if (val < xmin)
                            xmin = val;
                        if (val > xmax)
                            xmax = val;
                    }
                    if (f.y) {
                        if (val < ymin)
                            ymin = val;
                        if (val > ymax)
                            ymax = val;
                    }
                }
            }

            if (s.bars.show) {
                // make sure we got room for the bar on the dancing floor
                var delta;

                switch (s.bars.align) {
                    case "left":
                        delta = 0;
                        break;
                    case "right":
                        delta = -s.bars.barWidth;
                        break;
                    default:
                        delta = -s.bars.barWidth / 2;
                }

                if (s.bars.horizontal) {
                    ymin += delta;
                    ymax += delta + s.bars.barWidth;
                }
                else {
                    xmin += delta;
                    xmax += delta + s.bars.barWidth;
                }
            }

            updateAxis(s.xaxis, xmin, xmax);
            updateAxis(s.yaxis, ymin, ymax);
        }

        $.each(allAxes(), function (_, axis) {
            if (axis.datamin == topSentry)
                axis.datamin = null;
            if (axis.datamax == bottomSentry)
                axis.datamax = null;
        });
    }

    function setupCanvases() {

        // Make sure the placeholder is clear of everything except canvases
        // from a previous plot in this container that we'll try to re-use.

        placeholder.css("padding", 0) // padding messes up the positioning
            .children().filter(function () {
                return !$(this).hasClass("flot-overlay") && !$(this).hasClass('flot-base');
            }).remove();

        if (placeholder.css("position") == 'static')
            placeholder.css("position", "relative"); // for positioning labels and overlay

        surface = new Canvas("flot-base", placeholder);
        overlay = new Canvas("flot-overlay", placeholder); // overlay canvas for interactive features

        ctx = surface.context;
        octx = overlay.context;

        // define which element we're listening for events on
        eventHolder = $(overlay.element).unbind();

        // If we're re-using a plot object, shut down the old one

        var existing = placeholder.data("plot");

        if (existing) {
            existing.shutdown();
            overlay.clear();
        }

        // save in case we get replotted
        placeholder.data("plot", plot);
    }

    function bindEvents() {
        // bind events
        if (options.grid.hoverable) {
            eventHolder.mousemove(onMouseMove);

            // Use bind, rather than .mouseleave, because we officially
            // still support jQuery 1.2.6, which doesn't define a shortcut
            // for mouseenter or mouseleave. This was a bug/oversight that
            // was fixed somewhere around 1.3.x. We can return to using
            // .mouseleave when we drop support for 1.2.6.

            eventHolder.bind("mouseleave", onMouseLeave);
        }

        if (options.grid.clickable)
            eventHolder.click(onClick);

        executeHooks(hooks.bindEvents, [eventHolder]);
    }

    function shutdown() {
        if (redrawTimeout)
            clearTimeout(redrawTimeout);

        eventHolder.unbind("mousemove", onMouseMove);
        eventHolder.unbind("mouseleave", onMouseLeave);
        eventHolder.unbind("click", onClick);

        executeHooks(hooks.shutdown, [eventHolder]);
    }

    function setTransformationHelpers(axis) {
        // set helper functions on the axis, assumes plot area
        // has been computed already

        function identity(x) { return x; }

        var s, m, t = axis.options.transform || identity,
            it = axis.options.inverseTransform;

        // precompute how much the axis is scaling a point
        // in canvas space
        if (axis.direction == "x") {
            s = axis.scale = plotWidth / Math.abs(t(axis.max) - t(axis.min));
            m = Math.min(t(axis.max), t(axis.min));
        }
        else {
            s = axis.scale = plotHeight / Math.abs(t(axis.max) - t(axis.min));
            s = -s;
            m = Math.max(t(axis.max), t(axis.min));
        }

        // data point to canvas coordinate
        if (t == identity) // slight optimization
            axis.p2c = function (p) { return (p - m) * s; };
        else
            axis.p2c = function (p) { return (t(p) - m) * s; };
        // canvas coordinate to data point
        if (!it)
            axis.c2p = function (c) { return m + c / s; };
        else
            axis.c2p = function (c) { return it(m + c / s); };
    }

    function measureTickLabels(axis) {

        var opts = axis.options,
            ticks = axis.ticks || [],
            labelWidth = opts.labelWidth || 0,
            labelHeight = opts.labelHeight || 0,
            maxWidth = labelWidth || (axis.direction == "x" ? Math.floor(surface.width / (ticks.length || 1)) : null),
            legacyStyles = axis.direction + "Axis " + axis.direction + axis.n + "Axis",
            layer = "flot-" + axis.direction + "-axis flot-" + axis.direction + axis.n + "-axis " + legacyStyles,
            font = opts.font || "flot-tick-label tickLabel";

        for (var i = 0; i < ticks.length; ++i) {

            var t = ticks[i];

            if (!t.label)
                continue;

            var info = surface.getTextInfo(layer, t.label, font, null, maxWidth);

            labelWidth = Math.max(labelWidth, info.width);
            labelHeight = Math.max(labelHeight, info.height);
        }

        axis.labelWidth = opts.labelWidth || labelWidth;
        axis.labelHeight = opts.labelHeight || labelHeight;
    }

    function allocateAxisBoxFirstPhase(axis) {
        // find the bounding box of the axis by looking at label
        // widths/heights and ticks, make room by diminishing the
        // plotOffset; this first phase only looks at one
        // dimension per axis, the other dimension depends on the
        // other axes so will have to wait

        var lw = axis.labelWidth,
            lh = axis.labelHeight,
            pos = axis.options.position,
            isXAxis = axis.direction === "x",
            tickLength = axis.options.tickLength,
            axisMargin = options.grid.axisMargin,
            padding = options.grid.labelMargin,
            innermost = true,
            outermost = true,
            first = true,
            found = false;

        // Determine the axis's position in its direction and on its side

        $.each(isXAxis ? xaxes : yaxes, function(i, a) {
            if (a && (a.show || a.reserveSpace)) {
                if (a === axis) {
                    found = true;
                } else if (a.options.position === pos) {
                    if (found) {
                        outermost = false;
                    } else {
                        innermost = false;
                    }
                }
                if (!found) {
                    first = false;
                }
            }
        });

        // The outermost axis on each side has no margin

        if (outermost) {
            axisMargin = 0;
        }

        // The ticks for the first axis in each direction stretch across

        if (tickLength == null) {
            tickLength = first ? "full" : 5;
        }

        if (!isNaN(+tickLength))
            padding += +tickLength;

        if (isXAxis) {
            lh += padding;

            if (pos == "bottom") {
                plotOffset.bottom += lh + axisMargin;
                axis.box = { top: surface.height - plotOffset.bottom, height: lh };
            }
            else {
                axis.box = { top: plotOffset.top + axisMargin, height: lh };
                plotOffset.top += lh + axisMargin;
            }
        }
        else {
            lw += padding;

            if (pos == "left") {
                axis.box = { left: plotOffset.left + axisMargin, width: lw };
                plotOffset.left += lw + axisMargin;
            }
            else {
                plotOffset.right += lw + axisMargin;
                axis.box = { left: surface.width - plotOffset.right, width: lw };
            }
        }

        // save for future reference
        axis.position = pos;
        axis.tickLength = tickLength;
        axis.box.padding = padding;
        axis.innermost = innermost;
    }

    function allocateAxisBoxSecondPhase(axis) {
        // now that all axis boxes have been placed in one
        // dimension, we can set the remaining dimension coordinates
        if (axis.direction == "x") {
            axis.box.left = plotOffset.left - axis.labelWidth / 2;
            axis.box.width = surface.width - plotOffset.left - plotOffset.right + axis.labelWidth;
        }
        else {
            axis.box.top = plotOffset.top - axis.labelHeight / 2;
            axis.box.height = surface.height - plotOffset.bottom - plotOffset.top + axis.labelHeight;
        }
    }

    function adjustLayoutForThingsStickingOut() {
1526 // possibly adjust plot offset to ensure everything stays
1527 // inside the canvas and isn't clipped off
1528
1529 var minMargin = options.grid.minBorderMargin,
1530 axis, i;
1531
1532 // check values coming from the plot (FIXME: this should just
1533 // read a value from the series, otherwise it's impossible to
1534 // customize)
1535 if (minMargin == null) {
1536 minMargin = 0;
1537 for (i = 0; i < series.length; ++i)
1538 minMargin = Math.max(minMargin, 2 * (series[i].points.radius + series[i].points.lineWidth/2));
1539 }
1540
1541 var margins = {
1542 left: minMargin,
1543 right: minMargin,
1544 top: minMargin,
1545 bottom: minMargin
1546 };
1547
1548 // check axis labels; note we don't check the actual
1549 // labels but instead use the overall width/height to
1550 // avoid jumping around as much on replots
1551 $.each(allAxes(), function (_, axis) {
1552 if (axis.reserveSpace && axis.ticks && axis.ticks.length) {
1553 if (axis.direction === "x") {
1554 margins.left = Math.max(margins.left, axis.labelWidth / 2);
1555 margins.right = Math.max(margins.right, axis.labelWidth / 2);
1556 } else {
1557 margins.bottom = Math.max(margins.bottom, axis.labelHeight / 2);
1558 margins.top = Math.max(margins.top, axis.labelHeight / 2);
1559 }
1560 }
1561 });
1562
1563 plotOffset.left = Math.ceil(Math.max(margins.left, plotOffset.left));
1564 plotOffset.right = Math.ceil(Math.max(margins.right, plotOffset.right));
1565 plotOffset.top = Math.ceil(Math.max(margins.top, plotOffset.top));
1566 plotOffset.bottom = Math.ceil(Math.max(margins.bottom, plotOffset.bottom));
1567 }
1568
1569 function setupGrid() {
1570 var i, axes = allAxes(), showGrid = options.grid.show;
1571
1572 // Initialize the plot's offset from the edge of the canvas
1573
1574 for (var a in plotOffset) {
1575 var margin = options.grid.margin || 0;
1576 plotOffset[a] = typeof margin == "number" ? margin : margin[a] || 0;
1577 }
1578
1579 executeHooks(hooks.processOffset, [plotOffset]);
1580
1581 // If the grid is visible, add its border width to the offset
1582
1583 for (var a in plotOffset) {
1584 if(typeof(options.grid.borderWidth) == "object") {
1585 plotOffset[a] += showGrid ? options.grid.borderWidth[a] : 0;
1586 }
1587 else {
1588 plotOffset[a] += showGrid ? options.grid.borderWidth : 0;
1589 }
1590 }
1591
1592 $.each(axes, function (_, axis) {
1593 var axisOpts = axis.options;
1594 axis.show = axisOpts.show == null ? axis.used : axisOpts.show;
1595 axis.reserveSpace = axisOpts.reserveSpace == null ? axis.show : axisOpts.reserveSpace;
1596 setRange(axis);
1597 });
1598
1599 if (showGrid) {
1600
1601 var allocatedAxes = $.grep(axes, function (axis) {
1602 return axis.show || axis.reserveSpace;
1603 });
1604
1605 $.each(allocatedAxes, function (_, axis) {
1606 // make the ticks
1607 setupTickGeneration(axis);
1608 setTicks(axis);
1609 snapRangeToTicks(axis, axis.ticks);
1610 // find labelWidth/Height for axis
1611 measureTickLabels(axis);
1612 });
1613
1614 // with all dimensions calculated, we can compute the
1615 // axis bounding boxes, start from the outside
1616 // (reverse order)
1617 for (i = allocatedAxes.length - 1; i >= 0; --i)
1618 allocateAxisBoxFirstPhase(allocatedAxes[i]);
1619
1620 // make sure we've got enough space for things that
1621 // might stick out
1622 adjustLayoutForThingsStickingOut();
1623
1624 $.each(allocatedAxes, function (_, axis) {
1625 allocateAxisBoxSecondPhase(axis);
1626 });
1627 }
1628
1629 plotWidth = surface.width - plotOffset.left - plotOffset.right;
1630 plotHeight = surface.height - plotOffset.bottom - plotOffset.top;
1631
1632 // now that we have the proper plot dimensions, we can compute the scaling
1633 $.each(axes, function (_, axis) {
1634 setTransformationHelpers(axis);
1635 });
1636
1637 if (showGrid) {
1638 drawAxisLabels();
1639 }
1640
1641 insertLegend();
1642 }
1643
1644 function setRange(axis) {
1645 var opts = axis.options,
1646 min = +(opts.min != null ? opts.min : axis.datamin),
1647 max = +(opts.max != null ? opts.max : axis.datamax),
1648 delta = max - min;
1649
1650 if (delta == 0.0) {
1651 // degenerate case
1652 var widen = max == 0 ? 1 : 0.01;
1653
1654 if (opts.min == null)
1655 min -= widen;
1656 // always widen max if we couldn't widen min to ensure we
1657 // don't fall into min == max which doesn't work
1658 if (opts.max == null || opts.min != null)
1659 max += widen;
1660 }
1661 else {
1662 // consider autoscaling
1663 var margin = opts.autoscaleMargin;
1664 if (margin != null) {
1665 if (opts.min == null) {
1666 min -= delta * margin;
1667 // make sure we don't go below zero if all values
1668 // are positive
1669 if (min < 0 && axis.datamin != null && axis.datamin >= 0)
1670 min = 0;
1671 }
1672 if (opts.max == null) {
1673 max += delta * margin;
1674 if (max > 0 && axis.datamax != null && axis.datamax <= 0)
1675 max = 0;
1676 }
1677 }
1678 }
1679 axis.min = min;
1680 axis.max = max;
1681 }
1682
1683 function setupTickGeneration(axis) {
1684 var opts = axis.options;
1685
1686 // estimate number of ticks
1687 var noTicks;
1688 if (typeof opts.ticks == "number" && opts.ticks > 0)
1689 noTicks = opts.ticks;
1690 else
1691 // heuristic based on the model a*sqrt(x) fitted to
1692 // some data points that seemed reasonable
1693 noTicks = 0.3 * Math.sqrt(axis.direction == "x" ? surface.width : surface.height);
1694
1695 var delta = (axis.max - axis.min) / noTicks,
1696 dec = -Math.floor(Math.log(delta) / Math.LN10),
1697 maxDec = opts.tickDecimals;
1698
1699 if (maxDec != null && dec > maxDec) {
1700 dec = maxDec;
1701 }
1702
1703 var magn = Math.pow(10, -dec),
1704 norm = delta / magn, // norm is between 1.0 and 10.0
1705 size;
1706
1707 if (norm < 1.5) {
1708 size = 1;
1709 } else if (norm < 3) {
1710 size = 2;
1711 // special case for 2.5, requires an extra decimal
1712 if (norm > 2.25 && (maxDec == null || dec + 1 <= maxDec)) {
1713 size = 2.5;
1714 ++dec;
1715 }
1716 } else if (norm < 7.5) {
1717 size = 5;
1718 } else {
1719 size = 10;
1720 }
1721
1722 size *= magn;
1723
1724 if (opts.minTickSize != null && size < opts.minTickSize) {
1725 size = opts.minTickSize;
1726 }
1727
1728 axis.delta = delta;
1729 axis.tickDecimals = Math.max(0, maxDec != null ? maxDec : dec);
1730 axis.tickSize = opts.tickSize || size;
1731
1732 // Time mode was moved to a plug-in in 0.8, and since so many people use it
1733 // we'll add an especially friendly reminder to make sure they included it.
1734
1735 if (opts.mode == "time" && !axis.tickGenerator) {
1736 throw new Error("Time mode requires the flot.time plugin.");
1737 }
1738
1739 // Flot supports base-10 axes; any other mode is handled by a plug-in,
1740 // like flot.time.js.
1741
1742 if (!axis.tickGenerator) {
1743
1744 axis.tickGenerator = function (axis) {
1745
1746 var ticks = [],
1747 start = floorInBase(axis.min, axis.tickSize),
1748 i = 0,
1749 v = Number.NaN,
1750 prev;
1751
1752 do {
1753 prev = v;
1754 v = start + i * axis.tickSize;
1755 ticks.push(v);
1756 ++i;
1757 } while (v < axis.max && v != prev);
1758 return ticks;
1759 };
1760
1761 axis.tickFormatter = function (value, axis) {
1762
1763 var factor = axis.tickDecimals ? Math.pow(10, axis.tickDecimals) : 1;
1764 var formatted = "" + Math.round(value * factor) / factor;
1765
1766 // If tickDecimals was specified, ensure that we have exactly that
1767 // much precision; otherwise default to the value's own precision.
1768
1769 if (axis.tickDecimals != null) {
1770 var decimal = formatted.indexOf(".");
1771 var precision = decimal == -1 ? 0 : formatted.length - decimal - 1;
1772 if (precision < axis.tickDecimals) {
1773 return (precision ? formatted : formatted + ".") + ("" + factor).substr(1, axis.tickDecimals - precision);
1774 }
1775 }
1776
1777 return formatted;
1778 };
1779 }
1780
1781 if ($.isFunction(opts.tickFormatter))
1782 axis.tickFormatter = function (v, axis) { return "" + opts.tickFormatter(v, axis); };
1783
1784 if (opts.alignTicksWithAxis != null) {
1785 var otherAxis = (axis.direction == "x" ? xaxes : yaxes)[opts.alignTicksWithAxis - 1];
1786 if (otherAxis && otherAxis.used && otherAxis != axis) {
1787 // consider snapping min/max to outermost nice ticks
1788 var niceTicks = axis.tickGenerator(axis);
1789 if (niceTicks.length > 0) {
1790 if (opts.min == null)
1791 axis.min = Math.min(axis.min, niceTicks[0]);
1792 if (opts.max == null && niceTicks.length > 1)
1793 axis.max = Math.max(axis.max, niceTicks[niceTicks.length - 1]);
1794 }
1795
1796 axis.tickGenerator = function (axis) {
1797 // copy ticks, scaled to this axis
1798 var ticks = [], v, i;
1799 for (i = 0; i < otherAxis.ticks.length; ++i) {
1800 v = (otherAxis.ticks[i].v - otherAxis.min) / (otherAxis.max - otherAxis.min);
1801 v = axis.min + v * (axis.max - axis.min);
1802 ticks.push(v);
1803 }
1804 return ticks;
1805 };
1806
1807 // we might need an extra decimal since forced
1808 // ticks don't necessarily fit naturally
1809 if (!axis.mode && opts.tickDecimals == null) {
1810 var extraDec = Math.max(0, -Math.floor(Math.log(axis.delta) / Math.LN10) + 1),
1811 ts = axis.tickGenerator(axis);
1812
1813 // only proceed if the tick interval rounded
1814 // with an extra decimal doesn't give us a
1815 // zero at end
1816 if (!(ts.length > 1 && /\..*0$/.test((ts[1] - ts[0]).toFixed(extraDec))))
1817 axis.tickDecimals = extraDec;
1818 }
1819 }
1820 }
1821 }
1822
1823 function setTicks(axis) {
1824 var oticks = axis.options.ticks, ticks = [];
1825 if (oticks == null || (typeof oticks == "number" && oticks > 0))
1826 ticks = axis.tickGenerator(axis);
1827 else if (oticks) {
1828 if ($.isFunction(oticks))
1829 // generate the ticks
1830 ticks = oticks(axis);
1831 else
1832 ticks = oticks;
1833 }
1834
1835 // clean up/labelify the supplied ticks, copy them over
1836 var i, v;
1837 axis.ticks = [];
1838 for (i = 0; i < ticks.length; ++i) {
1839 var label = null;
1840 var t = ticks[i];
1841 if (typeof t == "object") {
1842 v = +t[0];
1843 if (t.length > 1)
1844 label = t[1];
1845 }
1846 else
1847 v = +t;
1848 if (label == null)
1849 label = axis.tickFormatter(v, axis);
1850 if (!isNaN(v))
1851 axis.ticks.push({ v: v, label: label });
1852 }
1853 }
1854
1855 function snapRangeToTicks(axis, ticks) {
1856 if (axis.options.autoscaleMargin && ticks.length > 0) {
1857 // snap to ticks
1858 if (axis.options.min == null)
1859 axis.min = Math.min(axis.min, ticks[0].v);
1860 if (axis.options.max == null && ticks.length > 1)
1861 axis.max = Math.max(axis.max, ticks[ticks.length - 1].v);
1862 }
1863 }
1864
1865 function draw() {
1866
1867 surface.clear();
1868
1869 executeHooks(hooks.drawBackground, [ctx]);
1870
1871 var grid = options.grid;
1872
1873 // draw background, if any
1874 if (grid.show && grid.backgroundColor)
1875 drawBackground();
1876
1877 if (grid.show && !grid.aboveData) {
1878 drawGrid();
1879 }
1880
1881 for (var i = 0; i < series.length; ++i) {
1882 executeHooks(hooks.drawSeries, [ctx, series[i]]);
1883 drawSeries(series[i]);
1884 }
1885
1886 executeHooks(hooks.draw, [ctx]);
1887
1888 if (grid.show && grid.aboveData) {
1889 drawGrid();
1890 }
1891
1892 surface.render();
1893
1894 // A draw implies that either the axes or data have changed, so we
1895 // should probably update the overlay highlights as well.
1896
1897 triggerRedrawOverlay();
1898 }
1899
1900 function extractRange(ranges, coord) {
1901 var axis, from, to, key, axes = allAxes();
1902
1903 for (var i = 0; i < axes.length; ++i) {
1904 axis = axes[i];
1905 if (axis.direction == coord) {
1906 key = coord + axis.n + "axis";
1907 if (!ranges[key] && axis.n == 1)
1908 key = coord + "axis"; // support x1axis as xaxis
1909 if (ranges[key]) {
1910 from = ranges[key].from;
1911 to = ranges[key].to;
1912 break;
1913 }
1914 }
1915 }
1916
1917 // backwards-compat stuff - to be removed in future
1918 if (!ranges[key]) {
1919 axis = coord == "x" ? xaxes[0] : yaxes[0];
1920 from = ranges[coord + "1"];
1921 to = ranges[coord + "2"];
1922 }
1923
1924 // auto-reverse as an added bonus
1925 if (from != null && to != null && from > to) {
1926 var tmp = from;
1927 from = to;
1928 to = tmp;
1929 }
1930
1931 return { from: from, to: to, axis: axis };
1932 }
1933
1934 function drawBackground() {
1935 ctx.save();
1936 ctx.translate(plotOffset.left, plotOffset.top);
1937
1938 ctx.fillStyle = getColorOrGradient(options.grid.backgroundColor, plotHeight, 0, "rgba(255, 255, 255, 0)");
1939 ctx.fillRect(0, 0, plotWidth, plotHeight);
1940 ctx.restore();
1941 }
1942
1943 function drawGrid() {
1944 var i, axes, bw, bc;
1945
1946 ctx.save();
1947 ctx.translate(plotOffset.left, plotOffset.top);
1948
1949 // draw markings
1950 var markings = options.grid.markings;
1951 if (markings) {
1952 if ($.isFunction(markings)) {
1953 axes = plot.getAxes();
1954 // xmin etc. is backwards compatibility, to be
1955 // removed in the future
1956 axes.xmin = axes.xaxis.min;
1957 axes.xmax = axes.xaxis.max;
1958 axes.ymin = axes.yaxis.min;
1959 axes.ymax = axes.yaxis.max;
1960
1961 markings = markings(axes);
1962 }
1963
1964 for (i = 0; i < markings.length; ++i) {
1965 var m = markings[i],
1966 xrange = extractRange(m, "x"),
1967 yrange = extractRange(m, "y");
1968
1969 // fill in missing
1970 if (xrange.from == null)
1971 xrange.from = xrange.axis.min;
1972 if (xrange.to == null)
1973 xrange.to = xrange.axis.max;
1974 if (yrange.from == null)
1975 yrange.from = yrange.axis.min;
1976 if (yrange.to == null)
1977 yrange.to = yrange.axis.max;
1978
1979 // clip
1980 if (xrange.to < xrange.axis.min || xrange.from > xrange.axis.max ||
1981 yrange.to < yrange.axis.min || yrange.from > yrange.axis.max)
1982 continue;
1983
1984 xrange.from = Math.max(xrange.from, xrange.axis.min);
1985 xrange.to = Math.min(xrange.to, xrange.axis.max);
1986 yrange.from = Math.max(yrange.from, yrange.axis.min);
1987 yrange.to = Math.min(yrange.to, yrange.axis.max);
1988
1989 var xequal = xrange.from === xrange.to,
1990 yequal = yrange.from === yrange.to;
1991
1992 if (xequal && yequal) {
1993 continue;
1994 }
1995
1996 // then draw
1997 xrange.from = Math.floor(xrange.axis.p2c(xrange.from));
1998 xrange.to = Math.floor(xrange.axis.p2c(xrange.to));
1999 yrange.from = Math.floor(yrange.axis.p2c(yrange.from));
2000 yrange.to = Math.floor(yrange.axis.p2c(yrange.to));
2001
2002 if (xequal || yequal) {
2003 var lineWidth = m.lineWidth || options.grid.markingsLineWidth,
2004 subPixel = lineWidth % 2 ? 0.5 : 0;
2005 ctx.beginPath();
2006 ctx.strokeStyle = m.color || options.grid.markingsColor;
2007 ctx.lineWidth = lineWidth;
2008 if (xequal) {
2009 ctx.moveTo(xrange.to + subPixel, yrange.from);
2010 ctx.lineTo(xrange.to + subPixel, yrange.to);
2011 } else {
2012 ctx.moveTo(xrange.from, yrange.to + subPixel);
2013 ctx.lineTo(xrange.to, yrange.to + subPixel);
2014 }
2015 ctx.stroke();
2016 } else {
2017 ctx.fillStyle = m.color || options.grid.markingsColor;
2018 ctx.fillRect(xrange.from, yrange.to,
2019 xrange.to - xrange.from,
2020 yrange.from - yrange.to);
2021 }
2022 }
2023 }
2024
2025 // draw the ticks
2026 axes = allAxes();
2027 bw = options.grid.borderWidth;
2028
2029 for (var j = 0; j < axes.length; ++j) {
2030 var axis = axes[j], box = axis.box,
2031 t = axis.tickLength, x, y, xoff, yoff;
2032 if (!axis.show || axis.ticks.length == 0)
2033 continue;
2034
2035 ctx.lineWidth = 1;
2036
2037 // find the edges
2038 if (axis.direction == "x") {
2039 x = 0;
2040 if (t == "full")
2041 y = (axis.position == "top" ? 0 : plotHeight);
2042 else
2043 y = box.top - plotOffset.top + (axis.position == "top" ? box.height : 0);
2044 }
2045 else {
2046 y = 0;
2047 if (t == "full")
2048 x = (axis.position == "left" ? 0 : plotWidth);
2049 else
2050 x = box.left - plotOffset.left + (axis.position == "left" ? box.width : 0);
2051 }
2052
2053 // draw tick bar
2054 if (!axis.innermost) {
2055 ctx.strokeStyle = axis.options.color;
2056 ctx.beginPath();
2057 xoff = yoff = 0;
2058 if (axis.direction == "x")
2059 xoff = plotWidth + 1;
2060 else
2061 yoff = plotHeight + 1;
2062
2063 if (ctx.lineWidth == 1) {
2064 if (axis.direction == "x") {
2065 y = Math.floor(y) + 0.5;
2066 } else {
2067 x = Math.floor(x) + 0.5;
2068 }
2069 }
2070
2071 ctx.moveTo(x, y);
2072 ctx.lineTo(x + xoff, y + yoff);
2073 ctx.stroke();
2074 }
2075
2076 // draw ticks
2077
2078 ctx.strokeStyle = axis.options.tickColor;
2079
2080 ctx.beginPath();
2081 for (i = 0; i < axis.ticks.length; ++i) {
2082 var v = axis.ticks[i].v;
2083
2084 xoff = yoff = 0;
2085
2086 if (isNaN(v) || v < axis.min || v > axis.max
2087 // skip those lying on the axes if we got a border
2088 || (t == "full"
2089 && ((typeof bw == "object" && bw[axis.position] > 0) || bw > 0)
2090 && (v == axis.min || v == axis.max)))
2091 continue;
2092
2093 if (axis.direction == "x") {
2094 x = axis.p2c(v);
2095 yoff = t == "full" ? -plotHeight : t;
2096
2097 if (axis.position == "top")
2098 yoff = -yoff;
2099 }
2100 else {
2101 y = axis.p2c(v);
2102 xoff = t == "full" ? -plotWidth : t;
2103
2104 if (axis.position == "left")
2105 xoff = -xoff;
2106 }
2107
2108 if (ctx.lineWidth == 1) {
2109 if (axis.direction == "x")
2110 x = Math.floor(x) + 0.5;
2111 else
2112 y = Math.floor(y) + 0.5;
2113 }
2114
2115 ctx.moveTo(x, y);
2116 ctx.lineTo(x + xoff, y + yoff);
2117 }
2118
2119 ctx.stroke();
2120 }
2121
2122
2123 // draw border
2124 if (bw) {
2125 // If either borderWidth or borderColor is an object, then draw the border
2126 // line by line instead of as one rectangle
2127 bc = options.grid.borderColor;
2128 if(typeof bw == "object" || typeof bc == "object") {
2129 if (typeof bw !== "object") {
2130 bw = {top: bw, right: bw, bottom: bw, left: bw};
2131 }
2132 if (typeof bc !== "object") {
2133 bc = {top: bc, right: bc, bottom: bc, left: bc};
2134 }
2135
2136 if (bw.top > 0) {
2137 ctx.strokeStyle = bc.top;
2138 ctx.lineWidth = bw.top;
2139 ctx.beginPath();
2140 ctx.moveTo(0 - bw.left, 0 - bw.top/2);
2141 ctx.lineTo(plotWidth, 0 - bw.top/2);
2142 ctx.stroke();
2143 }
2144
2145 if (bw.right > 0) {
2146 ctx.strokeStyle = bc.right;
2147 ctx.lineWidth = bw.right;
2148 ctx.beginPath();
2149 ctx.moveTo(plotWidth + bw.right / 2, 0 - bw.top);
2150 ctx.lineTo(plotWidth + bw.right / 2, plotHeight);
2151 ctx.stroke();
2152 }
2153
2154 if (bw.bottom > 0) {
2155 ctx.strokeStyle = bc.bottom;
2156 ctx.lineWidth = bw.bottom;
2157 ctx.beginPath();
2158 ctx.moveTo(plotWidth + bw.right, plotHeight + bw.bottom / 2);
2159 ctx.lineTo(0, plotHeight + bw.bottom / 2);
2160 ctx.stroke();
2161 }
2162
2163 if (bw.left > 0) {
2164 ctx.strokeStyle = bc.left;
2165 ctx.lineWidth = bw.left;
2166 ctx.beginPath();
2167 ctx.moveTo(0 - bw.left/2, plotHeight + bw.bottom);
2168 ctx.lineTo(0 - bw.left/2, 0);
2169 ctx.stroke();
2170 }
2171 }
2172 else {
2173 ctx.lineWidth = bw;
2174 ctx.strokeStyle = options.grid.borderColor;
2175 ctx.strokeRect(-bw/2, -bw/2, plotWidth + bw, plotHeight + bw);
2176 }
2177 }
2178
2179 ctx.restore();
2180 }
2181
2182 function drawAxisLabels() {
2183
2184 $.each(allAxes(), function (_, axis) {
2185 var box = axis.box,
2186 legacyStyles = axis.direction + "Axis " + axis.direction + axis.n + "Axis",
2187 layer = "flot-" + axis.direction + "-axis flot-" + axis.direction + axis.n + "-axis " + legacyStyles,
2188 font = axis.options.font || "flot-tick-label tickLabel",
2189 tick, x, y, halign, valign;
2190
2191 // Remove text before checking for axis.show and ticks.length;
2192 // otherwise plugins, like flot-tickrotor, that draw their own
2193 // tick labels will end up with both theirs and the defaults.
2194
2195 surface.removeText(layer);
2196
2197 if (!axis.show || axis.ticks.length == 0)
2198 return;
2199
2200 for (var i = 0; i < axis.ticks.length; ++i) {
2201
2202 tick = axis.ticks[i];
2203 if (!tick.label || tick.v < axis.min || tick.v > axis.max)
2204 continue;
2205
2206 if (axis.direction == "x") {
2207 halign = "center";
2208 x = plotOffset.left + axis.p2c(tick.v);
2209 if (axis.position == "bottom") {
2210 y = box.top + box.padding;
2211 } else {
2212 y = box.top + box.height - box.padding;
2213 valign = "bottom";
2214 }
2215 } else {
2216 valign = "middle";
2217 y = plotOffset.top + axis.p2c(tick.v);
2218 if (axis.position == "left") {
2219 x = box.left + box.width - box.padding;
2220 halign = "right";
2221 } else {
2222 x = box.left + box.padding;
2223 }
2224 }
2225
2226 surface.addText(layer, x, y, tick.label, font, null, null, halign, valign);
2227 }
2228 });
2229 }
2230
2231 function drawSeries(series) {
2232 if (series.lines.show)
2233 drawSeriesLines(series);
2234 if (series.bars.show)
2235 drawSeriesBars(series);
2236 if (series.points.show)
2237 drawSeriesPoints(series);
2238 }
2239
2240 function drawSeriesLines(series) {
2241 function plotLine(datapoints, xoffset, yoffset, axisx, axisy) {
2242 var points = datapoints.points,
2243 ps = datapoints.pointsize,
2244 prevx = null, prevy = null;
2245
2246 ctx.beginPath();
2247 for (var i = ps; i < points.length; i += ps) {
2248 var x1 = points[i - ps], y1 = points[i - ps + 1],
2249 x2 = points[i], y2 = points[i + 1];
2250
2251 if (x1 == null || x2 == null)
2252 continue;
2253
2254 // clip with ymin
2255 if (y1 <= y2 && y1 < axisy.min) {
2256 if (y2 < axisy.min)
2257 continue; // line segment is outside
2258 // compute new intersection point
2259 x1 = (axisy.min - y1) / (y2 - y1) * (x2 - x1) + x1;
2260 y1 = axisy.min;
2261 }
2262 else if (y2 <= y1 && y2 < axisy.min) {
2263 if (y1 < axisy.min)
2264 continue;
2265 x2 = (axisy.min - y1) / (y2 - y1) * (x2 - x1) + x1;
2266 y2 = axisy.min;
2267 }
2268
2269 // clip with ymax
2270 if (y1 >= y2 && y1 > axisy.max) {
2271 if (y2 > axisy.max)
2272 continue;
2273 x1 = (axisy.max - y1) / (y2 - y1) * (x2 - x1) + x1;
2274 y1 = axisy.max;
2275 }
2276 else if (y2 >= y1 && y2 > axisy.max) {
2277 if (y1 > axisy.max)
2278 continue;
2279 x2 = (axisy.max - y1) / (y2 - y1) * (x2 - x1) + x1;
2280 y2 = axisy.max;
2281 }
2282
2283 // clip with xmin
2284 if (x1 <= x2 && x1 < axisx.min) {
2285 if (x2 < axisx.min)
2286 continue;
2287 y1 = (axisx.min - x1) / (x2 - x1) * (y2 - y1) + y1;
2288 x1 = axisx.min;
2289 }
2290 else if (x2 <= x1 && x2 < axisx.min) {
2291 if (x1 < axisx.min)
2292 continue;
2293 y2 = (axisx.min - x1) / (x2 - x1) * (y2 - y1) + y1;
2294 x2 = axisx.min;
2295 }
2296
2297 // clip with xmax
2298 if (x1 >= x2 && x1 > axisx.max) {
2299 if (x2 > axisx.max)
2300 continue;
2301 y1 = (axisx.max - x1) / (x2 - x1) * (y2 - y1) + y1;
2302 x1 = axisx.max;
2303 }
2304 else if (x2 >= x1 && x2 > axisx.max) {
2305 if (x1 > axisx.max)
2306 continue;
2307 y2 = (axisx.max - x1) / (x2 - x1) * (y2 - y1) + y1;
2308 x2 = axisx.max;
2309 }
2310
2311 if (x1 != prevx || y1 != prevy)
2312 ctx.moveTo(axisx.p2c(x1) + xoffset, axisy.p2c(y1) + yoffset);
2313
2314 prevx = x2;
2315 prevy = y2;
2316 ctx.lineTo(axisx.p2c(x2) + xoffset, axisy.p2c(y2) + yoffset);
2317 }
2318 ctx.stroke();
2319 }
2320
2321 function plotLineArea(datapoints, axisx, axisy) {
2322 var points = datapoints.points,
2323 ps = datapoints.pointsize,
2324 bottom = Math.min(Math.max(0, axisy.min), axisy.max),
2325 i = 0, top, areaOpen = false,
2326 ypos = 1, segmentStart = 0, segmentEnd = 0;
2327
2328 // we process each segment in two turns, first in the
2329 // forward direction to sketch out the top, then once we
2330 // hit the end we go backwards to sketch the bottom
2331 while (true) {
2332 if (ps > 0 && i > points.length + ps)
2333 break;
2334
2335 i += ps; // ps is negative if going backwards
2336
2337 var x1 = points[i - ps],
2338 y1 = points[i - ps + ypos],
2339 x2 = points[i], y2 = points[i + ypos];
2340
2341 if (areaOpen) {
2342 if (ps > 0 && x1 != null && x2 == null) {
2343 // at turning point
2344 segmentEnd = i;
2345 ps = -ps;
2346 ypos = 2;
2347 continue;
2348 }
2349
2350 if (ps < 0 && i == segmentStart + ps) {
2351 // done with the reverse sweep
2352 ctx.fill();
2353 areaOpen = false;
2354 ps = -ps;
2355 ypos = 1;
2356 i = segmentStart = segmentEnd + ps;
2357 continue;
2358 }
2359 }
2360
2361 if (x1 == null || x2 == null)
2362 continue;
2363
2364 // clip x values
2365
2366 // clip with xmin
2367 if (x1 <= x2 && x1 < axisx.min) {
2368 if (x2 < axisx.min)
2369 continue;
2370 y1 = (axisx.min - x1) / (x2 - x1) * (y2 - y1) + y1;
2371 x1 = axisx.min;
2372 }
2373 else if (x2 <= x1 && x2 < axisx.min) {
2374 if (x1 < axisx.min)
2375 continue;
2376 y2 = (axisx.min - x1) / (x2 - x1) * (y2 - y1) + y1;
2377 x2 = axisx.min;
2378 }
2379
2380 // clip with xmax
2381 if (x1 >= x2 && x1 > axisx.max) {
2382 if (x2 > axisx.max)
2383 continue;
2384 y1 = (axisx.max - x1) / (x2 - x1) * (y2 - y1) + y1;
2385 x1 = axisx.max;
2386 }
2387 else if (x2 >= x1 && x2 > axisx.max) {
2388 if (x1 > axisx.max)
2389 continue;
2390 y2 = (axisx.max - x1) / (x2 - x1) * (y2 - y1) + y1;
2391 x2 = axisx.max;
2392 }
2393
2394 if (!areaOpen) {
2395 // open area
2396 ctx.beginPath();
2397 ctx.moveTo(axisx.p2c(x1), axisy.p2c(bottom));
2398 areaOpen = true;
2399 }
2400
2401 // first check the case where both endpoints are outside
2402 if (y1 >= axisy.max && y2 >= axisy.max) {
2403 ctx.lineTo(axisx.p2c(x1), axisy.p2c(axisy.max));
2404 ctx.lineTo(axisx.p2c(x2), axisy.p2c(axisy.max));
2405 continue;
2406 }
2407 else if (y1 <= axisy.min && y2 <= axisy.min) {
2408 ctx.lineTo(axisx.p2c(x1), axisy.p2c(axisy.min));
2409 ctx.lineTo(axisx.p2c(x2), axisy.p2c(axisy.min));
2410 continue;
2411 }
2412
2413 // else it's a bit more complicated, there might
2414 // be a flat maxed out rectangle first, then a
2415 // triangular cutout or reverse; to find these
2416 // keep track of the current x values
2417 var x1old = x1, x2old = x2;
2418
2419 // clip the y values; without shortcutting, we
2420 // go through all cases in turn
2421
2422 // clip with ymin
2423 if (y1 <= y2 && y1 < axisy.min && y2 >= axisy.min) {
2424 x1 = (axisy.min - y1) / (y2 - y1) * (x2 - x1) + x1;
2425 y1 = axisy.min;
2426 }
2427 else if (y2 <= y1 && y2 < axisy.min && y1 >= axisy.min) {
2428 x2 = (axisy.min - y1) / (y2 - y1) * (x2 - x1) + x1;
2429 y2 = axisy.min;
2430 }
2431
2432 // clip with ymax
2433 if (y1 >= y2 && y1 > axisy.max && y2 <= axisy.max) {
2434 x1 = (axisy.max - y1) / (y2 - y1) * (x2 - x1) + x1;
2435 y1 = axisy.max;
2436 }
2437 else if (y2 >= y1 && y2 > axisy.max && y1 <= axisy.max) {
2438 x2 = (axisy.max - y1) / (y2 - y1) * (x2 - x1) + x1;
2439 y2 = axisy.max;
2440 }
2441
2442 // if the x value was changed we got a rectangle
2443 // to fill
2444 if (x1 != x1old) {
2445 ctx.lineTo(axisx.p2c(x1old), axisy.p2c(y1));
2446 // it goes to (x1, y1), but we fill that below
2447 }
2448
2449 // fill triangular section; this sometimes results
2450 // in redundant points if (x1, y1) hasn't changed
2451 // from the previous lineTo, but we just ignore that
2452 ctx.lineTo(axisx.p2c(x1), axisy.p2c(y1));
2453 ctx.lineTo(axisx.p2c(x2), axisy.p2c(y2));
2454
2455 // fill the other rectangle if it's there
2456 if (x2 != x2old) {
2457 ctx.lineTo(axisx.p2c(x2), axisy.p2c(y2));
2458 ctx.lineTo(axisx.p2c(x2old), axisy.p2c(y2));
2459 }
2460 }
2461 }
2462
2463 ctx.save();
2464 ctx.translate(plotOffset.left, plotOffset.top);
2465 ctx.lineJoin = "round";
2466
2467 var lw = series.lines.lineWidth,
2468 sw = series.shadowSize;
2469 // FIXME: consider another form of shadow when filling is turned on
2470 if (lw > 0 && sw > 0) {
2471 // draw shadow as a thick and thin line with transparency
2472 ctx.lineWidth = sw;
2473 ctx.strokeStyle = "rgba(0,0,0,0.1)";
2474 // position shadow at an angle from the middle of the line
2475 var angle = Math.PI/18;
2476 plotLine(series.datapoints, Math.sin(angle) * (lw/2 + sw/2), Math.cos(angle) * (lw/2 + sw/2), series.xaxis, series.yaxis);
2477 ctx.lineWidth = sw/2;
2478 plotLine(series.datapoints, Math.sin(angle) * (lw/2 + sw/4), Math.cos(angle) * (lw/2 + sw/4), series.xaxis, series.yaxis);
2479 }
2480
2481 ctx.lineWidth = lw;
2482 ctx.strokeStyle = series.color;
2483 var fillStyle = getFillStyle(series.lines, series.color, 0, plotHeight);
2484 if (fillStyle) {
2485 ctx.fillStyle = fillStyle;
2486 plotLineArea(series.datapoints, series.xaxis, series.yaxis);
2487 }
2488
2489 if (lw > 0)
2490 plotLine(series.datapoints, 0, 0, series.xaxis, series.yaxis);
2491 ctx.restore();
2492 }
2493
2494 function drawSeriesPoints(series) {
2495 function plotPoints(datapoints, radius, fillStyle, offset, shadow, axisx, axisy, symbol) {
2496 var points = datapoints.points, ps = datapoints.pointsize;
2497
2498 for (var i = 0; i < points.length; i += ps) {
2499 var x = points[i], y = points[i + 1];
2500 if (x == null || x < axisx.min || x > axisx.max || y < axisy.min || y > axisy.max)
2501 continue;
2502
2503 ctx.beginPath();
2504 x = axisx.p2c(x);
2505 y = axisy.p2c(y) + offset;
2506 if (symbol == "circle")
2507 ctx.arc(x, y, radius, 0, shadow ? Math.PI : Math.PI * 2, false);
2508 else
2509 symbol(ctx, x, y, radius, shadow);
2510 ctx.closePath();
2511
2512 if (fillStyle) {
2513 ctx.fillStyle = fillStyle;
2514 ctx.fill();
2515 }
2516 ctx.stroke();
2517 }
2518 }
2519
2520 ctx.save();
2521 ctx.translate(plotOffset.left, plotOffset.top);
2522
2523 var lw = series.points.lineWidth,
2524 sw = series.shadowSize,
2525 radius = series.points.radius,
2526 symbol = series.points.symbol;
2527
2528 // If the user sets the line width to 0, we change it to a very
2529 // small value. A line width of 0 seems to force the default of 1.
2530 // Doing the conditional here allows the shadow setting to still be
2531 // optional even with a lineWidth of 0.
2532
2533 if (lw == 0)
2534 lw = 0.0001;
2535
2536 if (lw > 0 && sw > 0) {
2537 // draw shadow in two steps
2538 var w = sw / 2;
2539 ctx.lineWidth = w;
2540 ctx.strokeStyle = "rgba(0,0,0,0.1)";
2541 plotPoints(series.datapoints, radius, null, w + w/2, true,
2542 series.xaxis, series.yaxis, symbol);
2543
2544 ctx.strokeStyle = "rgba(0,0,0,0.2)";
2545 plotPoints(series.datapoints, radius, null, w/2, true,
2546 series.xaxis, series.yaxis, symbol);
2547 }
2548
2549 ctx.lineWidth = lw;
2550 ctx.strokeStyle = series.color;
2551 plotPoints(series.datapoints, radius,
2552 getFillStyle(series.points, series.color), 0, false,
2553 series.xaxis, series.yaxis, symbol);
2554 ctx.restore();
2555 }
2556
2557 function drawBar(x, y, b, barLeft, barRight, fillStyleCallback, axisx, axisy, c, horizontal, lineWidth) {
2558 var left, right, bottom, top,
2559 drawLeft, drawRight, drawTop, drawBottom,
2560 tmp;
2561
2562 // in horizontal mode, we start the bar from the left
2563 // instead of from the bottom so it appears to be
2564 // horizontal rather than vertical
2565 if (horizontal) {
2566 drawBottom = drawRight = drawTop = true;
2567 drawLeft = false;
2568 left = b;
2569 right = x;
2570 top = y + barLeft;
2571 bottom = y + barRight;
2572
2573 // account for negative bars
2574 if (right < left) {
2575 tmp = right;
2576 right = left;
2577 left = tmp;
2578 drawLeft = true;
2579 drawRight = false;
2580 }
2581 }
2582 else {
2583 drawLeft = drawRight = drawTop = true;
2584 drawBottom = false;
2585 left = x + barLeft;
2586 right = x + barRight;
2587 bottom = b;
2588 top = y;
2589
2590 // account for negative bars
2591 if (top < bottom) {
2592 tmp = top;
2593 top = bottom;
2594 bottom = tmp;
2595 drawBottom = true;
2596 drawTop = false;
2597 }
2598 }
2599
2600 // clip
2601 if (right < axisx.min || left > axisx.max ||
2602 top < axisy.min || bottom > axisy.max)
2603 return;
2604
2605 if (left < axisx.min) {
2606 left = axisx.min;
2607 drawLeft = false;
2608 }
2609
2610 if (right > axisx.max) {
2611 right = axisx.max;
2612 drawRight = false;
2613 }
2614
2615 if (bottom < axisy.min) {
2616 bottom = axisy.min;
2617 drawBottom = false;
2618 }
2619
2620 if (top > axisy.max) {
2621 top = axisy.max;
2622 drawTop = false;
2623 }
2624
2625 left = axisx.p2c(left);
2626 bottom = axisy.p2c(bottom);
2627 right = axisx.p2c(right);
2628 top = axisy.p2c(top);
2629
2630 // fill the bar
2631 if (fillStyleCallback) {
2632 c.fillStyle = fillStyleCallback(bottom, top);
2633 c.fillRect(left, top, right - left, bottom - top);
2634 }
2635
2636 // draw outline
2637 if (lineWidth > 0 && (drawLeft || drawRight || drawTop || drawBottom)) {
2638 c.beginPath();
2639
2640 // FIXME: inline moveTo is buggy with excanvas
2641 c.moveTo(left, bottom);
2642 if (drawLeft)
2643 c.lineTo(left, top);
2644 else
2645 c.moveTo(left, top);
2646 if (drawTop)
2647 c.lineTo(right, top);
2648 else
2649 c.moveTo(right, top);
2650 if (drawRight)
2651 c.lineTo(right, bottom);
2652 else
2653 c.moveTo(right, bottom);
2654 if (drawBottom)
2655 c.lineTo(left, bottom);
2656 else
2657 c.moveTo(left, bottom);
2658 c.stroke();
2659 }
2660 }
2661
2662 function drawSeriesBars(series) {
2663 function plotBars(datapoints, barLeft, barRight, fillStyleCallback, axisx, axisy) {
2664 var points = datapoints.points, ps = datapoints.pointsize;
2665
2666 for (var i = 0; i < points.length; i += ps) {
2667 if (points[i] == null)
2668 continue;
2669 drawBar(points[i], points[i + 1], points[i + 2], barLeft, barRight, fillStyleCallback, axisx, axisy, ctx, series.bars.horizontal, series.bars.lineWidth);
2670 }
2671 }
2672
2673 ctx.save();
2674 ctx.translate(plotOffset.left, plotOffset.top);
2675
2676 // FIXME: figure out a way to add shadows (for instance along the right edge)
2677 ctx.lineWidth = series.bars.lineWidth;
2678 ctx.strokeStyle = series.color;
2679
2680 var barLeft;
2681
2682 switch (series.bars.align) {
2683 case "left":
2684 barLeft = 0;
2685 break;
2686 case "right":
2687 barLeft = -series.bars.barWidth;
2688 break;
2689 default:
2690 barLeft = -series.bars.barWidth / 2;
2691 }
2692
2693 var fillStyleCallback = series.bars.fill ? function (bottom, top) { return getFillStyle(series.bars, series.color, bottom, top); } : null;
2694 plotBars(series.datapoints, barLeft, barLeft + series.bars.barWidth, fillStyleCallback, series.xaxis, series.yaxis);
2695 ctx.restore();
2696 }
2697
2698 function getFillStyle(filloptions, seriesColor, bottom, top) {
2699 var fill = filloptions.fill;
2700 if (!fill)
2701 return null;
2702
2703 if (filloptions.fillColor)
2704 return getColorOrGradient(filloptions.fillColor, bottom, top, seriesColor);
2705
2706 var c = $.color.parse(seriesColor);
2707 c.a = typeof fill == "number" ? fill : 0.4;
2708 c.normalize();
2709 return c.toString();
2710 }
2711
2712 function insertLegend() {
2713
2714 if (options.legend.container != null) {
2715 $(options.legend.container).html("");
2716 } else {
2717 placeholder.find(".legend").remove();
2718 }
2719
2720 if (!options.legend.show) {
2721 return;
2722 }
2723
2724 var fragments = [], entries = [], rowStarted = false,
2725 lf = options.legend.labelFormatter, s, label;
2726
2727 // Build a list of legend entries, with each having a label and a color
2728
2729 for (var i = 0; i < series.length; ++i) {
2730 s = series[i];
2731 if (s.label) {
2732 label = lf ? lf(s.label, s) : s.label;
2733 if (label) {
2734 entries.push({
2735 label: label,
2736 color: s.color
2737 });
2738 }
2739 }
2740 }
2741
2742 // Sort the legend using either the default or a custom comparator
2743
2744 if (options.legend.sorted) {
2745 if ($.isFunction(options.legend.sorted)) {
2746 entries.sort(options.legend.sorted);
2747 } else if (options.legend.sorted == "reverse") {
2748 entries.reverse();
2749 } else {
2750 var ascending = options.legend.sorted != "descending";
2751 entries.sort(function(a, b) {
2752 return a.label == b.label ? 0 : (
2753 (a.label < b.label) != ascending ? 1 : -1 // Logical XOR
2754 );
2755 });
2756 }
2757 }
2758
2759 // Generate markup for the list of entries, in their final order
2760
2761 for (var i = 0; i < entries.length; ++i) {
2762
2763 var entry = entries[i];
2764
2765 if (i % options.legend.noColumns == 0) {
2766 if (rowStarted)
2767 fragments.push('</tr>');
2768 fragments.push('<tr>');
2769 rowStarted = true;
2770 }
2771
2772 fragments.push(
2773 '<td class="legendColorBox"><div style="border:1px solid ' + options.legend.labelBoxBorderColor + ';padding:1px"><div style="width:4px;height:0;border:5px solid ' + entry.color + ';overflow:hidden"></div></div></td>' +
2774 '<td class="legendLabel">' + entry.label + '</td>'
2775 );
2776 }
2777
2778 if (rowStarted)
2779 fragments.push('</tr>');
2780
2781 if (fragments.length == 0)
2782 return;
2783
2784 var table = '<table style="font-size:smaller;color:' + options.grid.color + '">' + fragments.join("") + '</table>';
2785 if (options.legend.container != null)
2786 $(options.legend.container).html(table);
2787 else {
2788 var pos = "",
2789 p = options.legend.position,
2790 m = options.legend.margin;
2791 if (m[0] == null)
2792 m = [m, m];
2793 if (p.charAt(0) == "n")
2794 pos += 'top:' + (m[1] + plotOffset.top) + 'px;';
2795 else if (p.charAt(0) == "s")
2796 pos += 'bottom:' + (m[1] + plotOffset.bottom) + 'px;';
2797 if (p.charAt(1) == "e")
2798 pos += 'right:' + (m[0] + plotOffset.right) + 'px;';
2799 else if (p.charAt(1) == "w")
2800 pos += 'left:' + (m[0] + plotOffset.left) + 'px;';
2801 var legend = $('<div class="legend">' + table.replace('style="', 'style="position:absolute;' + pos +';') + '</div>').appendTo(placeholder);
2802 if (options.legend.backgroundOpacity != 0.0) {
2803 // put in the transparent background
2804 // separately to avoid blended labels and
2805 // label boxes
2806 var c = options.legend.backgroundColor;
2807 if (c == null) {
2808 c = options.grid.backgroundColor;
2809 if (c && typeof c == "string")
2810 c = $.color.parse(c);
2811 else
2812 c = $.color.extract(legend, 'background-color');
2813 c.a = 1;
2814 c = c.toString();
2815 }
2816 var div = legend.children();
2817 $('<div style="position:absolute;width:' + div.width() + 'px;height:' + div.height() + 'px;' + pos +'background-color:' + c + ';"> </div>').prependTo(legend).css('opacity', options.legend.backgroundOpacity);
2818 }
2819 }
2820 }
2821
2822
2823 // interactive features
2824
2825 var highlights = [],
2826 redrawTimeout = null;
2827
2828 // returns the data item the mouse is over, or null if none is found
2829 function findNearbyItem(mouseX, mouseY, seriesFilter) {
2830 var maxDistance = options.grid.mouseActiveRadius,
2831 smallestDistance = maxDistance * maxDistance + 1,
2832 item = null, foundPoint = false, i, j, ps;
2833
2834 for (i = series.length - 1; i >= 0; --i) {
2835 if (!seriesFilter(series[i]))
2836 continue;
2837
2838 var s = series[i],
2839 axisx = s.xaxis,
2840 axisy = s.yaxis,
2841 points = s.datapoints.points,
2842 mx = axisx.c2p(mouseX), // precompute some stuff to make the loop faster
2843 my = axisy.c2p(mouseY),
2844 maxx = maxDistance / axisx.scale,
2845 maxy = maxDistance / axisy.scale;
2846
2847 ps = s.datapoints.pointsize;
2848 // with inverse transforms, we can't use the maxx/maxy
2849 // optimization, sadly
2850 if (axisx.options.inverseTransform)
2851 maxx = Number.MAX_VALUE;
2852 if (axisy.options.inverseTransform)
2853 maxy = Number.MAX_VALUE;
2854
2855 if (s.lines.show || s.points.show) {
2856 for (j = 0; j < points.length; j += ps) {
2857 var x = points[j], y = points[j + 1];
2858 if (x == null)
2859 continue;
2860
2861 // For points and lines, the cursor must be within a
2862 // certain distance to the data point
2863 if (x - mx > maxx || x - mx < -maxx ||
2864 y - my > maxy || y - my < -maxy)
2865 continue;
2866
2867 // We have to calculate distances in pixels, not in
2868 // data units, because the scales of the axes may be different
2869 var dx = Math.abs(axisx.p2c(x) - mouseX),
2870 dy = Math.abs(axisy.p2c(y) - mouseY),
2871 dist = dx * dx + dy * dy; // we save the sqrt
2872
2873 // with the strict "<", the first point found at the smallest
2874 // distance wins ties; the series loop above runs topmost-first
2875 if (dist < smallestDistance) {
2876 smallestDistance = dist;
2877 item = [i, j / ps];
2878 }
2879 }
2880 }
2881
2882 if (s.bars.show && !item) { // no other point can be nearby
2883
2884 var barLeft, barRight;
2885
2886 switch (s.bars.align) {
2887 case "left":
2888 barLeft = 0;
2889 break;
2890 case "right":
2891 barLeft = -s.bars.barWidth;
2892 break;
2893 default:
2894 barLeft = -s.bars.barWidth / 2;
2895 }
2896
2897 barRight = barLeft + s.bars.barWidth;
2898
2899 for (j = 0; j < points.length; j += ps) {
2900 var x = points[j], y = points[j + 1], b = points[j + 2];
2901 if (x == null)
2902 continue;
2903
2904 // for a bar graph, the cursor must be inside the bar
2905 if (series[i].bars.horizontal ?
2906 (mx <= Math.max(b, x) && mx >= Math.min(b, x) &&
2907 my >= y + barLeft && my <= y + barRight) :
2908 (mx >= x + barLeft && mx <= x + barRight &&
2909 my >= Math.min(b, y) && my <= Math.max(b, y)))
2910 item = [i, j / ps];
2911 }
2912 }
2913 }
2914
2915 if (item) {
2916 i = item[0];
2917 j = item[1];
2918 ps = series[i].datapoints.pointsize;
2919
2920 return { datapoint: series[i].datapoints.points.slice(j * ps, (j + 1) * ps),
2921 dataIndex: j,
2922 series: series[i],
2923 seriesIndex: i };
2924 }
2925
2926 return null;
2927 }
2928
2929 function onMouseMove(e) {
2930 if (options.grid.hoverable)
2931 triggerClickHoverEvent("plothover", e,
2932 function (s) { return s["hoverable"] != false; });
2933 }
2934
2935 function onMouseLeave(e) {
2936 if (options.grid.hoverable)
2937 triggerClickHoverEvent("plothover", e,
2938 function (s) { return false; });
2939 }
2940
2941 function onClick(e) {
2942 triggerClickHoverEvent("plotclick", e,
2943 function (s) { return s["clickable"] != false; });
2944 }
2945
2946 // trigger click or hover event (they send the same parameters
2947 // so we share their code)
2948 function triggerClickHoverEvent(eventname, event, seriesFilter) {
2949 var offset = eventHolder.offset(),
2950 canvasX = event.pageX - offset.left - plotOffset.left,
2951 canvasY = event.pageY - offset.top - plotOffset.top,
2952 pos = canvasToAxisCoords({ left: canvasX, top: canvasY });
2953
2954 pos.pageX = event.pageX;
2955 pos.pageY = event.pageY;
2956
2957 var item = findNearbyItem(canvasX, canvasY, seriesFilter);
2958
2959 if (item) {
2960 // fill in mouse pos for any listeners out there
2961 item.pageX = parseInt(item.series.xaxis.p2c(item.datapoint[0]) + offset.left + plotOffset.left, 10);
2962 item.pageY = parseInt(item.series.yaxis.p2c(item.datapoint[1]) + offset.top + plotOffset.top, 10);
2963 }
2964
2965 if (options.grid.autoHighlight) {
2966 // clear auto-highlights
2967 for (var i = 0; i < highlights.length; ++i) {
2968 var h = highlights[i];
2969 if (h.auto == eventname &&
2970 !(item && h.series == item.series &&
2971 h.point[0] == item.datapoint[0] &&
2972 h.point[1] == item.datapoint[1]))
2973 unhighlight(h.series, h.point);
2974 }
2975
2976 if (item)
2977 highlight(item.series, item.datapoint, eventname);
2978 }
2979
2980 placeholder.trigger(eventname, [ pos, item ]);
2981 }
2982
2983 function triggerRedrawOverlay() {
2984 var t = options.interaction.redrawOverlayInterval;
2985 if (t == -1) { // skip event queue
2986 drawOverlay();
2987 return;
2988 }
2989
2990 if (!redrawTimeout)
2991 redrawTimeout = setTimeout(drawOverlay, t);
2992 }
2993
2994 function drawOverlay() {
2995 redrawTimeout = null;
2996
2997 // draw highlights
2998 octx.save();
2999 overlay.clear();
3000 octx.translate(plotOffset.left, plotOffset.top);
3001
3002 var i, hi;
3003 for (i = 0; i < highlights.length; ++i) {
3004 hi = highlights[i];
3005
3006 if (hi.series.bars.show)
3007 drawBarHighlight(hi.series, hi.point);
3008 else
3009 drawPointHighlight(hi.series, hi.point);
3010 }
3011 octx.restore();
3012
3013 executeHooks(hooks.drawOverlay, [octx]);
3014 }
3015
3016 function highlight(s, point, auto) {
3017 if (typeof s == "number")
3018 s = series[s];
3019
3020 if (typeof point == "number") {
3021 var ps = s.datapoints.pointsize;
3022 point = s.datapoints.points.slice(ps * point, ps * (point + 1));
3023 }
3024
3025 var i = indexOfHighlight(s, point);
3026 if (i == -1) {
3027 highlights.push({ series: s, point: point, auto: auto });
3028
3029 triggerRedrawOverlay();
3030 }
3031 else if (!auto)
3032 highlights[i].auto = false;
3033 }
3034
3035 function unhighlight(s, point) {
3036 if (s == null && point == null) {
3037 highlights = [];
3038 triggerRedrawOverlay();
3039 return;
3040 }
3041
3042 if (typeof s == "number")
3043 s = series[s];
3044
3045 if (typeof point == "number") {
3046 var ps = s.datapoints.pointsize;
3047 point = s.datapoints.points.slice(ps * point, ps * (point + 1));
3048 }
3049
3050 var i = indexOfHighlight(s, point);
3051 if (i != -1) {
3052 highlights.splice(i, 1);
3053
3054 triggerRedrawOverlay();
3055 }
3056 }
3057
3058 function indexOfHighlight(s, p) {
3059 for (var i = 0; i < highlights.length; ++i) {
3060 var h = highlights[i];
3061 if (h.series == s && h.point[0] == p[0]
3062 && h.point[1] == p[1])
3063 return i;
3064 }
3065 return -1;
3066 }
3067
3068 function drawPointHighlight(series, point) {
3069 var x = point[0], y = point[1],
3070 axisx = series.xaxis, axisy = series.yaxis,
3071 highlightColor = (typeof series.highlightColor === "string") ? series.highlightColor : $.color.parse(series.color).scale('a', 0.5).toString();
3072
3073 if (x < axisx.min || x > axisx.max || y < axisy.min || y > axisy.max)
3074 return;
3075
3076 var pointRadius = series.points.radius + series.points.lineWidth / 2;
3077 octx.lineWidth = pointRadius;
3078 octx.strokeStyle = highlightColor;
3079 var radius = 1.5 * pointRadius;
3080 x = axisx.p2c(x);
3081 y = axisy.p2c(y);
3082
3083 octx.beginPath();
3084 if (series.points.symbol == "circle")
3085 octx.arc(x, y, radius, 0, 2 * Math.PI, false);
3086 else
3087 series.points.symbol(octx, x, y, radius, false);
3088 octx.closePath();
3089 octx.stroke();
3090 }
3091
3092 function drawBarHighlight(series, point) {
3093 var highlightColor = (typeof series.highlightColor === "string") ? series.highlightColor : $.color.parse(series.color).scale('a', 0.5).toString(),
3094 fillStyle = highlightColor,
3095 barLeft;
3096
3097 switch (series.bars.align) {
3098 case "left":
3099 barLeft = 0;
3100 break;
3101 case "right":
3102 barLeft = -series.bars.barWidth;
3103 break;
3104 default:
3105 barLeft = -series.bars.barWidth / 2;
3106 }
3107
3108 octx.lineWidth = series.bars.lineWidth;
3109 octx.strokeStyle = highlightColor;
3110
3111 drawBar(point[0], point[1], point[2] || 0, barLeft, barLeft + series.bars.barWidth,
3112 function () { return fillStyle; }, series.xaxis, series.yaxis, octx, series.bars.horizontal, series.bars.lineWidth);
3113 }
3114
3115 function getColorOrGradient(spec, bottom, top, defaultColor) {
3116 if (typeof spec == "string")
3117 return spec;
3118 else {
3119 // assume this is a gradient spec; IE currently only
3120 // supports a simple vertical gradient properly, so that's
3121 // what we support too
3122 var gradient = ctx.createLinearGradient(0, top, 0, bottom);
3123
3124 for (var i = 0, l = spec.colors.length; i < l; ++i) {
3125 var c = spec.colors[i];
3126 if (typeof c != "string") {
3127 var co = $.color.parse(defaultColor);
3128 if (c.brightness != null)
3129 co = co.scale('rgb', c.brightness);
3130 if (c.opacity != null)
3131 co.a *= c.opacity;
3132 c = co.toString();
3133 }
3134 gradient.addColorStop(i / (l - 1), c);
3135 }
3136
3137 return gradient;
3138 }
3139 }
3140 }
3141
3142 // Add the plot function to the top level of the jQuery object
3143
3144 $.plot = function(placeholder, data, options) {
3145 //var t0 = new Date();
3146 var plot = new Plot($(placeholder), data, options, $.plot.plugins);
3147 //(window.console ? console.log : alert)("time used (msecs): " + ((new Date()).getTime() - t0.getTime()));
3148 return plot;
3149 };
3150
3151 $.plot.version = "0.8.3";
3152
3153 $.plot.plugins = [];
3154
3155 // Also add the plot function as a chainable property
3156
3157 $.fn.plot = function(data, options) {
3158 return this.each(function() {
3159 $.plot(this, data, options);
3160 });
3161 };
3162
3163 // round to nearby lower multiple of base
3164 function floorInBase(n, base) {
3165 return base * Math.floor(n / base);
3166 }
3167
3168 })(jQuery);
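
/* Usage sketch (illustrative, not part of Flot itself): assumes jQuery and
   this file are loaded, and that "#placeholder" is a div with an explicit
   width and height.

       var data = [ [0, 3], [4, 8], [8, 5], [9, 13] ];   // [x, y] pairs
       var plot = $.plot("#placeholder", [ data ], {
           series: { lines: { show: true }, points: { show: true } }
       });

       // or via the chainable form added above:
       $("#placeholder").plot([ data ], { series: { bars: { show: true } } });
*/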
@@ -1,360 +0,0 b''
1 /* Flot plugin for selecting regions of a plot.
2
3 Copyright (c) 2007-2014 IOLA and Ole Laursen.
4 Licensed under the MIT license.
5
6 The plugin supports these options:
7
8 selection: {
9 mode: null or "x" or "y" or "xy",
10 color: color,
11 shape: "round" or "miter" or "bevel",
12 minSize: number of pixels
13 }
14
15 Selection support is enabled by setting the mode to one of "x", "y" or "xy".
16 In "x" mode, the user will only be able to specify the x range, similarly for
17 "y" mode. For "xy", the selection becomes a rectangle where both ranges can be
18 specified. "color" is the color of the selection (if you need to change the color
19 later on, you can get to it with plot.getOptions().selection.color). "shape"
20 is the shape of the corners of the selection.
21
22 "minSize" is the minimum size a selection can be in pixels. This value can
23 be customized to determine the smallest size a selection can be and still
24 have the selection rectangle be displayed. When customizing this value, the
25 fact that it refers to pixels, not axis units, must be taken into account.
26 Thus, for example, if there is a bar graph in time mode with BarWidth set to 1
27 minute, setting "minSize" to 1 will not make the minimum selection size 1
28 minute, but rather 1 pixel. Note also that setting "minSize" to 0 will prevent
29 "plotunselected" events from being fired when the user clicks the mouse without
30 dragging.
31
32 When selection support is enabled, a "plotselected" event will be emitted on
33 the DOM element you passed into the plot function. The event handler gets a
34 parameter with the ranges selected on the axes, like this:
35
36 placeholder.bind( "plotselected", function( event, ranges ) {
37 alert("You selected " + ranges.xaxis.from + " to " + ranges.xaxis.to)
38 // similar for yaxis - with multiple axes, the extra ones are in
39 // x2axis, x3axis, ...
40 });
41
42 The "plotselected" event is only fired when the user has finished making the
43 selection. A "plotselecting" event is fired during the process with the same
44 parameters as the "plotselected" event, in case you want to know what's
45 happening while it's happening.
46
47 A "plotunselected" event with no arguments is emitted when the user clicks the
48 mouse to remove the selection. As stated above, setting "minSize" to 0 will
49 disable this behavior.
50
51 The plugin also adds the following methods to the plot object:
52
53 - setSelection( ranges, preventEvent )
54
55 Set the selection rectangle. The passed-in ranges object has the same form
56 as the one returned in the "plotselected" event. If the selection mode is
57 "x", you should put in an xaxis range; if the mode is "y", a yaxis range;
58 and both an xaxis and a yaxis range if the selection mode is "xy", like
59 this:
60
61 setSelection({ xaxis: { from: 0, to: 10 }, yaxis: { from: 40, to: 60 } });
62
63 setSelection will trigger the "plotselected" event when called. If you don't
64 want that to happen, e.g. if you're inside a "plotselected" handler, pass
65 true as the second parameter. If you are using multiple axes, you can
66 specify the ranges on any of those, e.g. as x2axis/x3axis/... instead of
67 xaxis, the plugin picks the first one it sees.
68
69 - clearSelection( preventEvent )
70
71 Clear the selection rectangle. Pass in true to avoid getting a
72 "plotunselected" event.
73
74 - getSelection()
75
76 Returns the current selection in the same format as the "plotselected"
77 event. If there's currently no selection, the function returns null.
78
79 */
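
/* Usage sketch for the methods above (illustrative; assumes a plot already
   created with selection: { mode: "x" }):

       // programmatically select an x range without firing "plotselected":
       plot.setSelection({ xaxis: { from: 2, to: 6 } }, true);

       // later, remove the rectangle without firing "plotunselected":
       plot.clearSelection(true);

       // read the current selection (null if none):
       var ranges = plot.getSelection();
*/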
80
81 (function ($) {
82 function init(plot) {
83 var selection = {
84 first: { x: -1, y: -1}, second: { x: -1, y: -1},
85 show: false,
86 active: false
87 };
88
89 // FIXME: The drag handling implemented here should be
90 // abstracted out, there's some similar code from a library in
91 // the navigation plugin, this should be massaged a bit to fit
92 // the Flot cases here better and reused. Doing this would
93 // make this plugin much slimmer.
94 var savedhandlers = {};
95
96 var mouseUpHandler = null;
97
98 function onMouseMove(e) {
99 if (selection.active) {
100 updateSelection(e);
101
102 plot.getPlaceholder().trigger("plotselecting", [ getSelection() ]);
103 }
104 }
105
106 function onMouseDown(e) {
107 if (e.which != 1) // only accept left-click
108 return;
109
110 // cancel out any text selections
111 document.body.focus();
112
113 // prevent text selection and drag in old-school browsers
114 if (document.onselectstart !== undefined && savedhandlers.onselectstart == null) {
115 savedhandlers.onselectstart = document.onselectstart;
116 document.onselectstart = function () { return false; };
117 }
118 if (document.ondrag !== undefined && savedhandlers.ondrag == null) {
119 savedhandlers.ondrag = document.ondrag;
120 document.ondrag = function () { return false; };
121 }
122
123 setSelectionPos(selection.first, e);
124
125 selection.active = true;
126
127 // this is a bit silly, but we have to use a closure to be
128 // able to whack the same handler again
129 mouseUpHandler = function (e) { onMouseUp(e); };
130
131 $(document).one("mouseup", mouseUpHandler);
132 }
133
134 function onMouseUp(e) {
135 mouseUpHandler = null;
136
137 // revert drag stuff for old-school browsers
138 if (document.onselectstart !== undefined)
139 document.onselectstart = savedhandlers.onselectstart;
140 if (document.ondrag !== undefined)
141 document.ondrag = savedhandlers.ondrag;
142
143 // no more dragging
144 selection.active = false;
145 updateSelection(e);
146
147 if (selectionIsSane())
148 triggerSelectedEvent();
149 else {
150 // this counts as a clear
151 plot.getPlaceholder().trigger("plotunselected", [ ]);
152 plot.getPlaceholder().trigger("plotselecting", [ null ]);
153 }
154
155 return false;
156 }
157
158 function getSelection() {
159 if (!selectionIsSane())
160 return null;
161
162 if (!selection.show) return null;
163
164 var r = {}, c1 = selection.first, c2 = selection.second;
165 $.each(plot.getAxes(), function (name, axis) {
166 if (axis.used) {
167 var p1 = axis.c2p(c1[axis.direction]), p2 = axis.c2p(c2[axis.direction]);
168 r[name] = { from: Math.min(p1, p2), to: Math.max(p1, p2) };
169 }
170 });
171 return r;
172 }
173
174 function triggerSelectedEvent() {
175 var r = getSelection();
176
177 plot.getPlaceholder().trigger("plotselected", [ r ]);
178
179 // backwards-compat stuff, to be removed in future
180 if (r.xaxis && r.yaxis)
181 plot.getPlaceholder().trigger("selected", [ { x1: r.xaxis.from, y1: r.yaxis.from, x2: r.xaxis.to, y2: r.yaxis.to } ]);
182 }
183
184 function clamp(min, value, max) {
185 return value < min ? min: (value > max ? max: value);
186 }
187
188 function setSelectionPos(pos, e) {
189 var o = plot.getOptions();
190 var offset = plot.getPlaceholder().offset();
191 var plotOffset = plot.getPlotOffset();
192 pos.x = clamp(0, e.pageX - offset.left - plotOffset.left, plot.width());
193 pos.y = clamp(0, e.pageY - offset.top - plotOffset.top, plot.height());
194
195 if (o.selection.mode == "y")
196 pos.x = pos == selection.first ? 0 : plot.width();
197
198 if (o.selection.mode == "x")
199 pos.y = pos == selection.first ? 0 : plot.height();
200 }
201
202 function updateSelection(pos) {
203 if (pos.pageX == null)
204 return;
205
206 setSelectionPos(selection.second, pos);
207 if (selectionIsSane()) {
208 selection.show = true;
209 plot.triggerRedrawOverlay();
210 }
211 else
212 clearSelection(true);
213 }
214
215 function clearSelection(preventEvent) {
216 if (selection.show) {
217 selection.show = false;
218 plot.triggerRedrawOverlay();
219 if (!preventEvent)
220 plot.getPlaceholder().trigger("plotunselected", [ ]);
221 }
222 }
223
224 // function taken from markings support in Flot
225 function extractRange(ranges, coord) {
226 var axis, from, to, key, axes = plot.getAxes();
227
228 for (var k in axes) {
229 axis = axes[k];
230 if (axis.direction == coord) {
231 key = coord + axis.n + "axis";
232 if (!ranges[key] && axis.n == 1)
233 key = coord + "axis"; // support x1axis as xaxis
234 if (ranges[key]) {
235 from = ranges[key].from;
236 to = ranges[key].to;
237 break;
238 }
239 }
240 }
241
242 // backwards-compat stuff - to be removed in future
243 if (!ranges[key]) {
244 axis = coord == "x" ? plot.getXAxes()[0] : plot.getYAxes()[0];
245 from = ranges[coord + "1"];
246 to = ranges[coord + "2"];
247 }
248
249 // auto-reverse as an added bonus
250 if (from != null && to != null && from > to) {
251 var tmp = from;
252 from = to;
253 to = tmp;
254 }
255
256 return { from: from, to: to, axis: axis };
257 }
258
259 function setSelection(ranges, preventEvent) {
260 var axis, range, o = plot.getOptions();
261
262 if (o.selection.mode == "y") {
263 selection.first.x = 0;
264 selection.second.x = plot.width();
265 }
266 else {
267 range = extractRange(ranges, "x");
268
269 selection.first.x = range.axis.p2c(range.from);
270 selection.second.x = range.axis.p2c(range.to);
271 }
272
273 if (o.selection.mode == "x") {
274 selection.first.y = 0;
275 selection.second.y = plot.height();
276 }
277 else {
278 range = extractRange(ranges, "y");
279
280 selection.first.y = range.axis.p2c(range.from);
281 selection.second.y = range.axis.p2c(range.to);
282 }
283
284 selection.show = true;
285 plot.triggerRedrawOverlay();
286 if (!preventEvent && selectionIsSane())
287 triggerSelectedEvent();
288 }
289
290 function selectionIsSane() {
291 var minSize = plot.getOptions().selection.minSize;
292 return Math.abs(selection.second.x - selection.first.x) >= minSize &&
293 Math.abs(selection.second.y - selection.first.y) >= minSize;
294 }
295
296 plot.clearSelection = clearSelection;
297 plot.setSelection = setSelection;
298 plot.getSelection = getSelection;
299
300 plot.hooks.bindEvents.push(function(plot, eventHolder) {
301 var o = plot.getOptions();
302 if (o.selection.mode != null) {
303 eventHolder.mousemove(onMouseMove);
304 eventHolder.mousedown(onMouseDown);
305 }
306 });
307
308
309 plot.hooks.drawOverlay.push(function (plot, ctx) {
310 // draw selection
311 if (selection.show && selectionIsSane()) {
312 var plotOffset = plot.getPlotOffset();
313 var o = plot.getOptions();
314
315 ctx.save();
316 ctx.translate(plotOffset.left, plotOffset.top);
317
318 var c = $.color.parse(o.selection.color);
319
320 ctx.strokeStyle = c.scale('a', 0.8).toString();
321 ctx.lineWidth = 1;
322 ctx.lineJoin = o.selection.shape;
323 ctx.fillStyle = c.scale('a', 0.4).toString();
324
325 var x = Math.min(selection.first.x, selection.second.x) + 0.5,
326 y = Math.min(selection.first.y, selection.second.y) + 0.5,
327 w = Math.abs(selection.second.x - selection.first.x) - 1,
328 h = Math.abs(selection.second.y - selection.first.y) - 1;
329
330 ctx.fillRect(x, y, w, h);
331 ctx.strokeRect(x, y, w, h);
332
333 ctx.restore();
334 }
335 });
336
337 plot.hooks.shutdown.push(function (plot, eventHolder) {
338 eventHolder.unbind("mousemove", onMouseMove);
339 eventHolder.unbind("mousedown", onMouseDown);
340
341 if (mouseUpHandler)
342 $(document).unbind("mouseup", mouseUpHandler);
343 });
344
345 }
346
347 $.plot.plugins.push({
348 init: init,
349 options: {
350 selection: {
351 mode: null, // one of null, "x", "y" or "xy"
352 color: "#e8cfac",
353 shape: "round", // one of "round", "miter", or "bevel"
354 minSize: 5 // minimum number of pixels
355 }
356 },
357 name: 'selection',
358 version: '1.1'
359 });
360 })(jQuery);
@@ -1,432 +0,0 b''
1 /* Pretty handling of time axes.
2
3 Copyright (c) 2007-2014 IOLA and Ole Laursen.
4 Licensed under the MIT license.
5
6 Set axis.mode to "time" to enable. See the section "Time series data" in
7 API.txt for details.
8
9 */
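
/* Example configuration (illustrative): data timestamps are JavaScript
   milliseconds since the epoch, rendered in UTC unless "timezone" says
   otherwise.

       $.plot("#placeholder", [ data ], {
           xaxis: {
               mode: "time",
               timeformat: "%Y/%m/%d",
               timezone: "browser"   // use the client's local time zone
           }
       });
*/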
10
11 (function($) {
12
13 var options = {
14 xaxis: {
15 timezone: null, // "browser" for the client's local time, or a timezone name for timezone-js
16 timeformat: null, // format string to use
17 twelveHourClock: false, // use 12-hour instead of 24-hour time in time mode
18 monthNames: null // list of names of months
19 }
20 };
21
22 // round to nearby lower multiple of base
23
24 function floorInBase(n, base) {
25 return base * Math.floor(n / base);
26 }
27
28 // Returns a string with the date d formatted according to fmt.
29 // A subset of the Open Group's strftime format is supported.
30
31 function formatDate(d, fmt, monthNames, dayNames) {
32
33 if (typeof d.strftime == "function") {
34 return d.strftime(fmt);
35 }
36
37 var leftPad = function(n, pad) {
38 n = "" + n;
39 pad = "" + (pad == null ? "0" : pad);
40 return n.length == 1 ? pad + n : n;
41 };
42
43 var r = [];
44 var escape = false;
45 var hours = d.getHours();
46 var isAM = hours < 12;
47
48 if (monthNames == null) {
49 monthNames = ["Jan", "Feb", "Mar", "Apr", "May", "Jun", "Jul", "Aug", "Sep", "Oct", "Nov", "Dec"];
50 }
51
52 if (dayNames == null) {
53 dayNames = ["Sun", "Mon", "Tue", "Wed", "Thu", "Fri", "Sat"];
54 }
55
56 var hours12;
57
58 if (hours > 12) {
59 hours12 = hours - 12;
60 } else if (hours == 0) {
61 hours12 = 12;
62 } else {
63 hours12 = hours;
64 }
65
66 for (var i = 0; i < fmt.length; ++i) {
67
68 var c = fmt.charAt(i);
69
70 if (escape) {
71 switch (c) {
72 case 'a': c = "" + dayNames[d.getDay()]; break;
73 case 'b': c = "" + monthNames[d.getMonth()]; break;
74 case 'd': c = leftPad(d.getDate()); break;
75 case 'e': c = leftPad(d.getDate(), " "); break;
76 case 'h': // For back-compat with 0.7; remove in 1.0
77 case 'H': c = leftPad(hours); break;
78 case 'I': c = leftPad(hours12); break;
79 case 'l': c = leftPad(hours12, " "); break;
80 case 'm': c = leftPad(d.getMonth() + 1); break;
81 case 'M': c = leftPad(d.getMinutes()); break;
82 // quarters not in Open Group's strftime specification
83 case 'q':
84 c = "" + (Math.floor(d.getMonth() / 3) + 1); break;
85 case 'S': c = leftPad(d.getSeconds()); break;
86 case 'y': c = leftPad(d.getFullYear() % 100); break;
87 case 'Y': c = "" + d.getFullYear(); break;
88 case 'p': c = (isAM) ? ("" + "am") : ("" + "pm"); break;
89 case 'P': c = (isAM) ? ("" + "AM") : ("" + "PM"); break;
90 case 'w': c = "" + d.getDay(); break;
91 }
92 r.push(c);
93 escape = false;
94 } else {
95 if (c == "%") {
96 escape = true;
97 } else {
98 r.push(c);
99 }
100 }
101 }
102
103 return r.join("");
104 }
105
106 // To have a consistent view of time-based data independent of which time
107 // zone the client happens to be in we need a date-like object independent
108 // of time zones. This is done through a wrapper that only calls the UTC
109 // versions of the accessor methods.
110
111 function makeUtcWrapper(d) {
112
113 function addProxyMethod(sourceObj, sourceMethod, targetObj, targetMethod) {
114 sourceObj[sourceMethod] = function() {
115 return targetObj[targetMethod].apply(targetObj, arguments);
116 };
117 };
118
119 var utc = {
120 date: d
121 };
122
123 // support strftime, if found
124
125 if (d.strftime != undefined) {
126 addProxyMethod(utc, "strftime", d, "strftime");
127 }
128
129 addProxyMethod(utc, "getTime", d, "getTime");
130 addProxyMethod(utc, "setTime", d, "setTime");
131
132 var props = ["Date", "Day", "FullYear", "Hours", "Milliseconds", "Minutes", "Month", "Seconds"];
133
134 for (var p = 0; p < props.length; p++) {
135 addProxyMethod(utc, "get" + props[p], d, "getUTC" + props[p]);
136 addProxyMethod(utc, "set" + props[p], d, "setUTC" + props[p]);
137 }
138
139 return utc;
140 };
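The wrapper above works by proxying each accessor to its UTC counterpart. The same pattern can be exercised standalone; this is a simplified sketch (plain `Date` only, no `strftime` or timezone-js support), not the full `makeUtcWrapper`:

```javascript
// Forward get*/set* calls on a plain object to the UTC accessors of a
// wrapped Date, so reads are identical in every client time zone.
function makeUtcProxy(d) {
    var utc = { date: d };
    ["Hours", "Minutes", "FullYear", "Month", "Date"].forEach(function (p) {
        utc["get" + p] = function () { return d["getUTC" + p](); };
        utc["set" + p] = function (v) { return d["setUTC" + p](v); };
    });
    return utc;
}

// 2020-01-01 05:30 UTC: the proxy reports 5 hours in every locale.
var wrapped = makeUtcProxy(new Date(Date.UTC(2020, 0, 1, 5, 30)));
```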
141
142 // select time zone strategy. This returns a date-like object tied to the
143 // desired timezone
144
145 function dateGenerator(ts, opts) {
146 if (opts.timezone == "browser") {
147 return new Date(ts);
148 } else if (!opts.timezone || opts.timezone == "utc") {
149 return makeUtcWrapper(new Date(ts));
150 } else if (typeof timezoneJS != "undefined" && typeof timezoneJS.Date != "undefined") {
151 var d = new timezoneJS.Date();
152 // timezone-js is fickle, so be sure to set the time zone before
153 // setting the time.
154 d.setTimezone(opts.timezone);
155 d.setTime(ts);
156 return d;
157 } else {
158 return makeUtcWrapper(new Date(ts));
159 }
160 }
161
162 // map of approximate sizes of time units in milliseconds
163
164 var timeUnitSize = {
165 "second": 1000,
166 "minute": 60 * 1000,
167 "hour": 60 * 60 * 1000,
168 "day": 24 * 60 * 60 * 1000,
169 "month": 30 * 24 * 60 * 60 * 1000,
170 "quarter": 3 * 30 * 24 * 60 * 60 * 1000,
171 "year": 365.2425 * 24 * 60 * 60 * 1000
172 };
173
174 // the allowed tick sizes, after 1 year we use
175 // an integer algorithm
176
177 var baseSpec = [
178 [1, "second"], [2, "second"], [5, "second"], [10, "second"],
179 [30, "second"],
180 [1, "minute"], [2, "minute"], [5, "minute"], [10, "minute"],
181 [30, "minute"],
182 [1, "hour"], [2, "hour"], [4, "hour"],
183 [8, "hour"], [12, "hour"],
184 [1, "day"], [2, "day"], [3, "day"],
185 [0.25, "month"], [0.5, "month"], [1, "month"],
186 [2, "month"]
187 ];
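The spec table above is later scanned for the first entry whose size best matches the axis delta: the loop stops once the delta falls below the midpoint between entry i and entry i + 1. A standalone sketch of that selection (sub-hour entries only, with unit sizes taken from `timeUnitSize` above):

```javascript
var unitMs = { second: 1000, minute: 60 * 1000, hour: 60 * 60 * 1000 };
var spec = [
    [1, "second"], [2, "second"], [5, "second"], [10, "second"], [30, "second"],
    [1, "minute"], [2, "minute"], [5, "minute"], [10, "minute"], [30, "minute"],
    [1, "hour"], [2, "hour"]
];

// Pick the first spec entry "close enough" to delta: stop when delta
// is below the midpoint between this entry's size and the next one's.
function pickTickSize(delta) {
    var i;
    for (i = 0; i < spec.length - 1; i++) {
        var cur = spec[i][0] * unitMs[spec[i][1]];
        var next = spec[i + 1][0] * unitMs[spec[i + 1][1]];
        if (delta < (cur + next) / 2) {
            break;
        }
    }
    return spec[i];
}

// A delta of ~83 minutes selects 1-hour ticks.
var chosen = pickTickSize(5000000);
```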
188
189 // we don't know which variant(s) we'll need yet, but generating both is
190 // cheap
191
192 var specMonths = baseSpec.concat([[3, "month"], [6, "month"],
193 [1, "year"]]);
194 var specQuarters = baseSpec.concat([[1, "quarter"], [2, "quarter"],
195 [1, "year"]]);
196
197 function init(plot) {
198 plot.hooks.processOptions.push(function (plot, options) {
199 $.each(plot.getAxes(), function(axisName, axis) {
200
201 var opts = axis.options;
202
203 if (opts.mode == "time") {
204 axis.tickGenerator = function(axis) {
205
206 var ticks = [];
207 var d = dateGenerator(axis.min, opts);
208 var minSize = 0;
209
210 // enable quarter ticks if quarters are mentioned
211 // in either of these options
212
213 var spec = (opts.tickSize && opts.tickSize[1] ===
214 "quarter") ||
215 (opts.minTickSize && opts.minTickSize[1] ===
216 "quarter") ? specQuarters : specMonths;
217
218 if (opts.minTickSize != null) {
219 if (typeof opts.tickSize == "number") {
220 minSize = opts.tickSize;
221 } else {
222 minSize = opts.minTickSize[0] * timeUnitSize[opts.minTickSize[1]];
223 }
224 }
225
226 for (var i = 0; i < spec.length - 1; ++i) {
227 if (axis.delta < (spec[i][0] * timeUnitSize[spec[i][1]]
228 + spec[i + 1][0] * timeUnitSize[spec[i + 1][1]]) / 2
229 && spec[i][0] * timeUnitSize[spec[i][1]] >= minSize) {
230 break;
231 }
232 }
233
234 var size = spec[i][0];
235 var unit = spec[i][1];
236
237 // special-case the possibility of several years
238
239 if (unit == "year") {
240
241 // if given a minTickSize in years, just use it,
242 // ensuring that it's an integer
243
244 if (opts.minTickSize != null && opts.minTickSize[1] == "year") {
245 size = Math.floor(opts.minTickSize[0]);
246 } else {
247
248 var magn = Math.pow(10, Math.floor(Math.log(axis.delta / timeUnitSize.year) / Math.LN10));
249 var norm = (axis.delta / timeUnitSize.year) / magn;
250
251 if (norm < 1.5) {
252 size = 1;
253 } else if (norm < 3) {
254 size = 2;
255 } else if (norm < 7.5) {
256 size = 5;
257 } else {
258 size = 10;
259 }
260
261 size *= magn;
262 }
263
264 // minimum size for years is 1
265
266 if (size < 1) {
267 size = 1;
268 }
269 }
270
271 axis.tickSize = opts.tickSize || [size, unit];
272 var tickSize = axis.tickSize[0];
273 unit = axis.tickSize[1];
274
275 var step = tickSize * timeUnitSize[unit];
276
277 if (unit == "second") {
278 d.setSeconds(floorInBase(d.getSeconds(), tickSize));
279 } else if (unit == "minute") {
280 d.setMinutes(floorInBase(d.getMinutes(), tickSize));
281 } else if (unit == "hour") {
282 d.setHours(floorInBase(d.getHours(), tickSize));
283 } else if (unit == "month") {
284 d.setMonth(floorInBase(d.getMonth(), tickSize));
285 } else if (unit == "quarter") {
286 d.setMonth(3 * floorInBase(d.getMonth() / 3,
287 tickSize));
288 } else if (unit == "year") {
289 d.setFullYear(floorInBase(d.getFullYear(), tickSize));
290 }
291
292 // reset smaller components
293
294 d.setMilliseconds(0);
295
296 if (step >= timeUnitSize.minute) {
297 d.setSeconds(0);
298 }
299 if (step >= timeUnitSize.hour) {
300 d.setMinutes(0);
301 }
302 if (step >= timeUnitSize.day) {
303 d.setHours(0);
304 }
305 if (step >= timeUnitSize.day * 4) {
306 d.setDate(1);
307 }
308 if (step >= timeUnitSize.month * 2) {
309 d.setMonth(floorInBase(d.getMonth(), 3));
310 }
311 if (step >= timeUnitSize.quarter * 2) {
312 d.setMonth(floorInBase(d.getMonth(), 6));
313 }
314 if (step >= timeUnitSize.year) {
315 d.setMonth(0);
316 }
317
318 var carry = 0;
319 var v = Number.NaN;
320 var prev;
321
322 do {
323
324 prev = v;
325 v = d.getTime();
326 ticks.push(v);
327
328 if (unit == "month" || unit == "quarter") {
329 if (tickSize < 1) {
330
331 // a bit complicated - we'll divide the
332 // month/quarter up but we need to take
333 // care of fractions so we don't end up in
334 // the middle of a day
335
336 d.setDate(1);
337 var start = d.getTime();
338 d.setMonth(d.getMonth() +
339 (unit == "quarter" ? 3 : 1));
340 var end = d.getTime();
341 d.setTime(v + carry * timeUnitSize.hour + (end - start) * tickSize);
342 carry = d.getHours();
343 d.setHours(0);
344 } else {
345 d.setMonth(d.getMonth() +
346 tickSize * (unit == "quarter" ? 3 : 1));
347 }
348 } else if (unit == "year") {
349 d.setFullYear(d.getFullYear() + tickSize);
350 } else {
351 d.setTime(v + step);
352 }
353 } while (v < axis.max && v != prev);
354
355 return ticks;
356 };
357
358 axis.tickFormatter = function (v, axis) {
359
360 var d = dateGenerator(v, axis.options);
361
362 // first check global format
363
364 if (opts.timeformat != null) {
365 return formatDate(d, opts.timeformat, opts.monthNames, opts.dayNames);
366 }
367
368 // possibly use quarters if quarters are mentioned in
369 // any of these places
370
371 var useQuarters = (axis.options.tickSize &&
372 axis.options.tickSize[1] == "quarter") ||
373 (axis.options.minTickSize &&
374 axis.options.minTickSize[1] == "quarter");
375
376 var t = axis.tickSize[0] * timeUnitSize[axis.tickSize[1]];
377 var span = axis.max - axis.min;
378 var suffix = (opts.twelveHourClock) ? " %p" : "";
379 var hourCode = (opts.twelveHourClock) ? "%I" : "%H";
380 var fmt;
381
382 if (t < timeUnitSize.minute) {
383 fmt = hourCode + ":%M:%S" + suffix;
384 } else if (t < timeUnitSize.day) {
385 if (span < 2 * timeUnitSize.day) {
386 fmt = hourCode + ":%M" + suffix;
387 } else {
388 fmt = "%b %d " + hourCode + ":%M" + suffix;
389 }
390 } else if (t < timeUnitSize.month) {
391 fmt = "%b %d";
392 } else if ((useQuarters && t < timeUnitSize.quarter) ||
393 (!useQuarters && t < timeUnitSize.year)) {
394 if (span < timeUnitSize.year) {
395 fmt = "%b";
396 } else {
397 fmt = "%b %Y";
398 }
399 } else if (useQuarters && t < timeUnitSize.year) {
400 if (span < timeUnitSize.year) {
401 fmt = "Q%q";
402 } else {
403 fmt = "Q%q %Y";
404 }
405 } else {
406 fmt = "%Y";
407 }
408
409 var rt = formatDate(d, fmt, opts.monthNames, opts.dayNames);
410
411 return rt;
412 };
413 }
414 });
415 });
416 }
417
418 $.plot.plugins.push({
419 init: init,
420 options: options,
421 name: 'time',
422 version: '1.0'
423 });
424
425 // Time-axis support used to be in Flot core, which exposed the
426 // formatDate function on the plot object. Various plugins depend
427 // on the function, so we need to re-expose it here.
428
429 $.plot.formatDate = formatDate;
430 $.plot.dateGenerator = dateGenerator;
431
432 })(jQuery);
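The `tickFormatter` in the plugin above chooses a strftime pattern from the tick size `t` and the visible span. Stripped of the quarter and twelve-hour branches, the decision can be sketched as a pure function (unit sizes copied from `timeUnitSize`):

```javascript
var ms = {
    minute: 60 * 1000,
    day: 24 * 60 * 60 * 1000,
    month: 30 * 24 * 60 * 60 * 1000,
    year: 365.2425 * 24 * 60 * 60 * 1000
};

// Choose a time format the way the tickFormatter above does,
// simplified here to the 24-hour, non-quarter branches.
function chooseFormat(t, span) {
    if (t < ms.minute) { return "%H:%M:%S"; }
    if (t < ms.day) { return span < 2 * ms.day ? "%H:%M" : "%b %d %H:%M"; }
    if (t < ms.month) { return "%b %d"; }
    if (t < ms.year) { return span < ms.year ? "%b" : "%b %Y"; }
    return "%Y";
}

// 30-minute ticks over a single day show hour:minute only.
var fmt = chooseFormat(30 * ms.minute, ms.day);
```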
@@ -1,488 +0,0 b''
1 /*
2 * jquery.flot.tooltip
3 *
4 * description: easy-to-use tooltips for Flot charts
5 * version: 0.8.4
6 * authors: Krzysztof Urbas @krzysu [myviews.pl],Evan Steinkerchner @Roundaround
7 * website: https://github.com/krzysu/flot.tooltip
8 *
9 * build on 2014-08-06
10 * released under MIT License, 2012
11 */
12 (function ($) {
13 // plugin options, default values
14 var defaultOptions = {
15 tooltip: false,
16 tooltipOpts: {
17 id: "flotTip",
18 content: "%s | X: %x | Y: %y",
19 // allowed templates are:
20 // %s -> series label,
21 // %lx -> x axis label (requires flot-axislabels plugin https://github.com/markrcote/flot-axislabels),
22 // %ly -> y axis label (requires flot-axislabels plugin https://github.com/markrcote/flot-axislabels),
23 // %x -> X value,
24 // %y -> Y value,
25 // %x.2 -> precision of X value,
26 // %p -> percent
27 xDateFormat: null,
28 yDateFormat: null,
29 monthNames: null,
30 dayNames: null,
31 shifts: {
32 x: 10,
33 y: 20
34 },
35 defaultTheme: true,
36 lines: false,
37
38 // callbacks
39 onHover: function (flotItem, $tooltipEl) {},
40
41 $compat: false
42 }
43 };
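The default `content` template above is expanded by plain string substitution (done in `stringFormat` further down). A minimal sketch of how `"%s | X: %x | Y: %y"` expands for one data point, ignoring precision suffixes and the other placeholders:

```javascript
// Expand the default tooltip template for a single point.
// Mirrors the %s/%x/%y substitutions, without precision handling.
function expandTemplate(template, seriesLabel, x, y) {
    return template
        .replace(/%s/, seriesLabel)
        .replace(/%x/, String(x))
        .replace(/%y/, String(y));
}

var tip = expandTemplate("%s | X: %x | Y: %y", "CPU load", 12, 0.75);
// tip === "CPU load | X: 12 | Y: 0.75"
```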
44
45 // object
46 var FlotTooltip = function (plot) {
47 // variables
48 this.tipPosition = {x: 0, y: 0};
49
50 this.init(plot);
51 };
52
53 // main plugin function
54 FlotTooltip.prototype.init = function (plot) {
55 var that = this;
56
57 // detect other flot plugins
58 var plotPluginsLength = $.plot.plugins.length;
59 this.plotPlugins = [];
60
61 if (plotPluginsLength) {
62 for (var p = 0; p < plotPluginsLength; p++) {
63 this.plotPlugins.push($.plot.plugins[p].name);
64 }
65 }
66
67 plot.hooks.bindEvents.push(function (plot, eventHolder) {
68
69 // get plot options
70 that.plotOptions = plot.getOptions();
71
72 // if not enabled return
73 if (that.plotOptions.tooltip === false || typeof that.plotOptions.tooltip === 'undefined') return;
74
75 // shortcut to access tooltip options
76 that.tooltipOptions = that.plotOptions.tooltipOpts;
77
78 if (that.tooltipOptions.$compat) {
79 that.wfunc = 'width';
80 that.hfunc = 'height';
81 } else {
82 that.wfunc = 'innerWidth';
83 that.hfunc = 'innerHeight';
84 }
85
86 // create tooltip DOM element
87 var $tip = that.getDomElement();
88
89 // bind event
90 $( plot.getPlaceholder() ).bind("plothover", plothover);
91
92 $(eventHolder).bind('mousemove', mouseMove);
93 });
94
95 plot.hooks.shutdown.push(function (plot, eventHolder){
96 $(plot.getPlaceholder()).unbind("plothover", plothover);
97 $(eventHolder).unbind("mousemove", mouseMove);
98 });
99
100 function mouseMove(e){
101 var pos = {};
102 pos.x = e.pageX;
103 pos.y = e.pageY;
104 plot.setTooltipPosition(pos);
105 }
106
107 function plothover(event, pos, item) {
108 // Simple distance formula.
109 var lineDistance = function (p1x, p1y, p2x, p2y) {
110 return Math.sqrt((p2x - p1x) * (p2x - p1x) + (p2y - p1y) * (p2y - p1y));
111 };
112
113 // Here is some voodoo magic for determining the distance to a line from a given point {x, y}.
114 var dotLineLength = function (x, y, x0, y0, x1, y1, o) {
115 if (o && !(o =
116 function (x, y, x0, y0, x1, y1) {
117 if (typeof x0 !== 'undefined') return { x: x0, y: y };
118 else if (typeof y0 !== 'undefined') return { x: x, y: y0 };
119
120 var left,
121 tg = -1 / ((y1 - y0) / (x1 - x0));
122
123 return {
124 x: left = (x1 * (x * tg - y + y0) + x0 * (x * -tg + y - y1)) / (tg * (x1 - x0) + y0 - y1),
125 y: tg * left - tg * x + y
126 };
127 } (x, y, x0, y0, x1, y1),
128 o.x >= Math.min(x0, x1) && o.x <= Math.max(x0, x1) && o.y >= Math.min(y0, y1) && o.y <= Math.max(y0, y1))
129 ) {
130 var l1 = lineDistance(x, y, x0, y0), l2 = lineDistance(x, y, x1, y1);
131 return l1 > l2 ? l2 : l1;
132 } else {
133 var a = y0 - y1, b = x1 - x0, c = x0 * y1 - y0 * x1;
134 return Math.abs(a * x + b * y + c) / Math.sqrt(a * a + b * b);
135 }
136 };
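The helper above mixes segment clamping with the classic point-to-line formula; its `else` branch is the standard perpendicular distance |ax + by + c| / sqrt(a² + b²). Both pieces can be checked standalone:

```javascript
// Euclidean distance between two points, as in lineDistance above.
function lineDistance(p1x, p1y, p2x, p2y) {
    return Math.sqrt((p2x - p1x) * (p2x - p1x) + (p2y - p1y) * (p2y - p1y));
}

// Perpendicular distance from (x, y) to the infinite line through
// (x0, y0) and (x1, y1): the else branch of dotLineLength.
function pointToLine(x, y, x0, y0, x1, y1) {
    var a = y0 - y1, b = x1 - x0, c = x0 * y1 - y0 * x1;
    return Math.abs(a * x + b * y + c) / Math.sqrt(a * a + b * b);
}

// (0, 5) is 5 units above the line through (0, 0) and (10, 0),
// and (3, 4) is 5 units from the origin (a 3-4-5 triangle).
var dLine = pointToLine(0, 5, 0, 0, 10, 0);
var dPoint = lineDistance(0, 0, 3, 4);
```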
137
138 if (item) {
139 plot.showTooltip(item, pos);
140 } else if (that.plotOptions.series.lines.show && that.tooltipOptions.lines === true) {
141 var maxDistance = that.plotOptions.grid.mouseActiveRadius;
142
143 var closestTrace = {
144 distance: maxDistance + 1
145 };
146
147 $.each(plot.getData(), function (i, series) {
148 var xBeforeIndex = 0,
149 xAfterIndex = -1;
150
151 // Our search here assumes our data is sorted via the x-axis.
152 // TODO: Improve efficiency somehow - search smaller sets of data.
153 for (var j = 1; j < series.data.length; j++) {
154 if (series.data[j - 1][0] <= pos.x && series.data[j][0] >= pos.x) {
155 xBeforeIndex = j - 1;
156 xAfterIndex = j;
157 }
158 }
159
160 if (xAfterIndex === -1) {
161 plot.hideTooltip();
162 return;
163 }
164
165 var pointPrev = { x: series.data[xBeforeIndex][0], y: series.data[xBeforeIndex][1] },
166 pointNext = { x: series.data[xAfterIndex][0], y: series.data[xAfterIndex][1] };
167
168 var distToLine = dotLineLength(series.xaxis.p2c(pos.x), series.yaxis.p2c(pos.y), series.xaxis.p2c(pointPrev.x),
169 series.yaxis.p2c(pointPrev.y), series.xaxis.p2c(pointNext.x), series.yaxis.p2c(pointNext.y), false);
170
171 if (distToLine < closestTrace.distance) {
172
173 var closestIndex = lineDistance(pointPrev.x, pointPrev.y, pos.x, pos.y) <
174 lineDistance(pos.x, pos.y, pointNext.x, pointNext.y) ? xBeforeIndex : xAfterIndex;
175
176 var pointSize = series.datapoints.pointsize;
177
178 // Calculate the point on the line vertically closest to our cursor.
179 var pointOnLine = [
180 pos.x,
181 pointPrev.y + ((pointNext.y - pointPrev.y) * ((pos.x - pointPrev.x) / (pointNext.x - pointPrev.x)))
182 ];
183
184 var item = {
185 datapoint: pointOnLine,
186 dataIndex: closestIndex,
187 series: series,
188 seriesIndex: i
189 };
190
191 closestTrace = {
192 distance: distToLine,
193 item: item
194 };
195 }
196 });
197
198 if (closestTrace.distance < maxDistance + 1)
199 plot.showTooltip(closestTrace.item, pos);
200 else
201 plot.hideTooltip();
202 } else {
203 plot.hideTooltip();
204 }
205 }
206
207 // Quick little function for setting the tooltip position.
208 plot.setTooltipPosition = function (pos) {
209 var $tip = that.getDomElement();
210
211 var totalTipWidth = $tip.outerWidth() + that.tooltipOptions.shifts.x;
212 var totalTipHeight = $tip.outerHeight() + that.tooltipOptions.shifts.y;
213 if ((pos.x - $(window).scrollLeft()) > ($(window)[that.wfunc]() - totalTipWidth)) {
214 pos.x -= totalTipWidth;
215 }
216 if ((pos.y - $(window).scrollTop()) > ($(window)[that.hfunc]() - totalTipHeight)) {
217 pos.y -= totalTipHeight;
218 }
219 that.tipPosition.x = pos.x;
220 that.tipPosition.y = pos.y;
221 };
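`setTooltipPosition` above flips the tip to the other side of the cursor when it would overflow the viewport. The clamping rule can be sketched without jQuery, with window metrics passed in as plain numbers (the helper name and argument list here are illustrative, not part of the plugin):

```javascript
// Keep a tooltip inside the viewport: if the tip would cross the right
// or bottom edge, shift it left/up by its own extent plus the shift gap.
function clampTipPosition(pos, tipW, tipH, shiftX, shiftY, winW, winH, scrollX, scrollY) {
    var x = pos.x, y = pos.y;
    if ((x - scrollX) > (winW - (tipW + shiftX))) {
        x -= tipW + shiftX;
    }
    if ((y - scrollY) > (winH - (tipH + shiftY))) {
        y -= tipH + shiftY;
    }
    return { x: x, y: y };
}

// Near the right edge of an 800x600 window the tip flips to the left.
var flipped = clampTipPosition({ x: 790, y: 100 }, 120, 40, 10, 20, 800, 600, 0, 0);
```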
222
223 // Quick little function for showing the tooltip.
224 plot.showTooltip = function (target, position) {
225 var $tip = that.getDomElement();
226
227 // convert tooltip content template to real tipText
228 var tipText = that.stringFormat(that.tooltipOptions.content, target);
229
230 $tip.html(tipText);
231 plot.setTooltipPosition({ x: position.pageX, y: position.pageY });
232 $tip.css({
233 left: that.tipPosition.x + that.tooltipOptions.shifts.x,
234 top: that.tipPosition.y + that.tooltipOptions.shifts.y
235 }).show();
236
237 // run callback
238 if (typeof that.tooltipOptions.onHover === 'function') {
239 that.tooltipOptions.onHover(target, $tip);
240 }
241 };
242
243 // Quick little function for hiding the tooltip.
244 plot.hideTooltip = function () {
245 that.getDomElement().hide().html('');
246 };
247 };
248
249 /**
250 * get or create tooltip DOM element
251 * @return jQuery object
252 */
253 FlotTooltip.prototype.getDomElement = function () {
254 var $tip = $('#' + this.tooltipOptions.id);
255
256 if( $tip.length === 0 ){
257 $tip = $('<div />').attr('id', this.tooltipOptions.id);
258 $tip.appendTo('body').hide().css({position: 'absolute'});
259
260 if(this.tooltipOptions.defaultTheme) {
261 $tip.css({
262 'background': '#fff',
263 'z-index': '1040',
264 'padding': '0.4em 0.6em',
265 'border-radius': '0.5em',
266 'font-size': '0.8em',
267 'border': '1px solid #111',
268 'display': 'none',
269 'white-space': 'nowrap'
270 });
271 }
272 }
273
274 return $tip;
275 };
276
277 /**
278 * core function, create tooltip content
279 * @param {string} content - template with tooltip content
280 * @param {object} item - Flot item
281 * @return {string} real tooltip content for current item
282 */
283 FlotTooltip.prototype.stringFormat = function (content, item) {
284
285 var percentPattern = /%p\.{0,1}(\d{0,})/;
286 var seriesPattern = /%s/;
287 var xLabelPattern = /%lx/; // requires flot-axislabels plugin https://github.com/markrcote/flot-axislabels, will be ignored if plugin isn't loaded
288 var yLabelPattern = /%ly/; // requires flot-axislabels plugin https://github.com/markrcote/flot-axislabels, will be ignored if plugin isn't loaded
289 var xPattern = /%x\.{0,1}(\d{0,})/;
290 var yPattern = /%y\.{0,1}(\d{0,})/;
291 var xPatternWithoutPrecision = "%x";
292 var yPatternWithoutPrecision = "%y";
293 var customTextPattern = "%ct";
294
295 var x, y, customText, p;
296
297 // for threshold plugin we need to read data from different place
298 if (typeof item.series.threshold !== "undefined") {
299 x = item.datapoint[0];
300 y = item.datapoint[1];
301 customText = item.datapoint[2];
302 } else if (typeof item.series.lines !== "undefined" && item.series.lines.steps) {
303 x = item.series.datapoints.points[item.dataIndex * 2];
304 y = item.series.datapoints.points[item.dataIndex * 2 + 1];
305 // TODO: where to find custom text in this variant?
306 customText = "";
307 } else {
308 x = item.series.data[item.dataIndex][0];
309 y = item.series.data[item.dataIndex][1];
310 customText = item.series.data[item.dataIndex][2];
311 }
312
313 // I think this is only in case of threshold plugin
314 if (item.series.label === null && item.series.originSeries) {
315 item.series.label = item.series.originSeries.label;
316 }
317
318 // if it is a function callback get the content string
319 if (typeof(content) === 'function') {
320 content = content(item.series.label, x, y, item);
321 }
322
323 // percent match for pie charts and stacked percent
324 if (typeof (item.series.percent) !== 'undefined') {
325 p = item.series.percent;
326 } else if (typeof (item.series.percents) !== 'undefined') {
327 p = item.series.percents[item.dataIndex];
328 }
329 if (typeof p === 'number') {
330 content = this.adjustValPrecision(percentPattern, content, p);
331 }
332
333 // series match
334 if (typeof(item.series.label) !== 'undefined') {
335 content = content.replace(seriesPattern, item.series.label);
336 } else {
337 //remove %s if label is undefined
338 content = content.replace(seriesPattern, "");
339 }
340
341 // x axis label match
342 if (this.hasAxisLabel('xaxis', item)) {
343 content = content.replace(xLabelPattern, item.series.xaxis.options.axisLabel);
344 } else {
345 //remove %lx if axis label is undefined or axislabels plugin not present
346 content = content.replace(xLabelPattern, "");
347 }
348
349 // y axis label match
350 if (this.hasAxisLabel('yaxis', item)) {
351 content = content.replace(yLabelPattern, item.series.yaxis.options.axisLabel);
352 } else {
353 //remove %ly if axis label is undefined or axislabels plugin not present
354 content = content.replace(yLabelPattern, "");
355 }
356
357 // time mode axes with custom dateFormat
358 if (this.isTimeMode('xaxis', item) && this.isXDateFormat(item)) {
359 content = content.replace(xPattern, this.timestampToDate(x, this.tooltipOptions.xDateFormat, item.series.xaxis.options));
360 }
361 if (this.isTimeMode('yaxis', item) && this.isYDateFormat(item)) {
362 content = content.replace(yPattern, this.timestampToDate(y, this.tooltipOptions.yDateFormat, item.series.yaxis.options));
363 }
364
365 // set precision if defined
366 if (typeof x === 'number') {
367 content = this.adjustValPrecision(xPattern, content, x);
368 }
369 if (typeof y === 'number') {
370 content = this.adjustValPrecision(yPattern, content, y);
371 }
372
373 // change x from number to given label, if given
374 if (typeof item.series.xaxis.ticks !== 'undefined') {
375
376 var ticks;
377 if (this.hasRotatedXAxisTicks(item)) {
378 // xaxis.ticks will be an empty array if tickRotor is being used, but the values are available in rotatedTicks
379 ticks = 'rotatedTicks';
380 } else {
381 ticks = 'ticks';
382 }
383
384 // see https://github.com/krzysu/flot.tooltip/issues/65
385 var tickIndex = item.dataIndex + item.seriesIndex;
386
387 if (item.series.xaxis[ticks].length > tickIndex && !this.isTimeMode('xaxis', item)) {
388 var valueX = (this.isCategoriesMode('xaxis', item)) ? item.series.xaxis[ticks][tickIndex].label : item.series.xaxis[ticks][tickIndex].v;
389 if (valueX === x) {
390 content = content.replace(xPattern, item.series.xaxis[ticks][tickIndex].label);
391 }
392 }
393 }
394
395 // change y from number to given label, if given
396 if (typeof item.series.yaxis.ticks !== 'undefined') {
397 for (var index in item.series.yaxis.ticks) {
398 if (item.series.yaxis.ticks.hasOwnProperty(index)) {
399 var valueY = (this.isCategoriesMode('yaxis', item)) ? item.series.yaxis.ticks[index].label : item.series.yaxis.ticks[index].v;
400 if (valueY === y) {
401 content = content.replace(yPattern, item.series.yaxis.ticks[index].label);
402 }
403 }
404 }
405 }
406
407 // if no value customization, use tickFormatter by default
408 if (typeof item.series.xaxis.tickFormatter !== 'undefined') {
409 // escape "$" in the formatted value so String.replace treats it literally
410 content = content.replace(xPatternWithoutPrecision, item.series.xaxis.tickFormatter(x, item.series.xaxis).replace(/\$/g, '$$'));
411 }
412 if (typeof item.series.yaxis.tickFormatter !== 'undefined') {
413 // escape "$" in the formatted value so String.replace treats it literally
414 content = content.replace(yPatternWithoutPrecision, item.series.yaxis.tickFormatter(y, item.series.yaxis).replace(/\$/g, '$$'));
415 }
416
417 if (customText)
418 content = content.replace(customTextPattern, customText);
419
420 return content;
421 };
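The precision suffix handled by `adjustValPrecision` below (`%y.2` and friends) is a regex capture fed to `toFixed`. The pattern/replace round trip can be exercised standalone; `Number()` is added here for safety, whereas the original passes the captured string straight through:

```javascript
// Replace "%y" / "%y.N" in a template with a value, honouring the
// optional precision suffix the way adjustValPrecision does.
var yPattern = /%y\.{0,1}(\d{0,})/;

function formatY(content, value) {
    var m = content.match(yPattern);
    if (m !== null && m[1] !== '') {
        return content.replace(yPattern, value.toFixed(Number(m[1])));
    }
    return content.replace(yPattern, String(value));
}

var withPrecision = formatY("Y: %y.2", 3.14159);  // "Y: 3.14"
var plain = formatY("Y: %y", 0.5);                // "Y: 0.5"
```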
422
423 // helpers just for readability
424 FlotTooltip.prototype.isTimeMode = function (axisName, item) {
425 return (typeof item.series[axisName].options.mode !== 'undefined' && item.series[axisName].options.mode === 'time');
426 };
427
428 FlotTooltip.prototype.isXDateFormat = function (item) {
429 return (typeof this.tooltipOptions.xDateFormat !== 'undefined' && this.tooltipOptions.xDateFormat !== null);
430 };
431
432 FlotTooltip.prototype.isYDateFormat = function (item) {
433 return (typeof this.tooltipOptions.yDateFormat !== 'undefined' && this.tooltipOptions.yDateFormat !== null);
434 };
435
436 FlotTooltip.prototype.isCategoriesMode = function (axisName, item) {
437 return (typeof item.series[axisName].options.mode !== 'undefined' && item.series[axisName].options.mode === 'categories');
438 };
439
440 // convert a timestamp to a formatted date string via Flot's date helpers
441 FlotTooltip.prototype.timestampToDate = function (tmst, dateFormat, options) {
442 var theDate = $.plot.dateGenerator(tmst, options);
443 return $.plot.formatDate(theDate, dateFormat, this.tooltipOptions.monthNames, this.tooltipOptions.dayNames);
444 };
445
446 // apply the optional precision suffix (e.g. "%y.2") to a matched value
447 FlotTooltip.prototype.adjustValPrecision = function (pattern, content, value) {
448
449 var precision;
450 var matchResult = content.match(pattern);
451 if( matchResult !== null ) {
452 if(RegExp.$1 !== '') {
453 precision = RegExp.$1;
454 value = value.toFixed(precision);
455
456 // only replace the content if a precision was given; otherwise the tickFormatter is used
457 content = content.replace(pattern, value);
458 }
459 }
460 return content;
461 };
462
463 // other plugins detection below
464
465 // check if flot-axislabels plugin (https://github.com/markrcote/flot-axislabels) is used and that an axis label is given
466 FlotTooltip.prototype.hasAxisLabel = function (axisName, item) {
467 // note: $.inArray takes (value, array)
468 return ($.inArray('axisLabels', this.plotPlugins) !== -1 && typeof item.series[axisName].options.axisLabel !== 'undefined' && item.series[axisName].options.axisLabel.length > 0);
468 };
469
470 // check whether flot-tickRotor, a plugin which allows rotation of X-axis ticks, is being used
471 FlotTooltip.prototype.hasRotatedXAxisTicks = function (item) {
472 return ($.inArray('tickRotor', this.plotPlugins) !== -1 && typeof item.series.xaxis.rotatedTicks !== 'undefined');
473 };
474
475 // plugin entry point
476 var init = function (plot) {
477 new FlotTooltip(plot);
478 };
479
480 // define Flot plugin
481 $.plot.plugins.push({
482 init: init,
483 options: defaultOptions,
484 name: 'tooltip',
485 version: '0.8.4'
486 });
487
488 })(jQuery);
@@ -1,2 +0,0 b''
1 /*! appenlight-client 29-01-2016 */
2 !function(a){"use strict";var b={version:"0.4.1",options:{apiKey:""},errorReportBuffer:[],slowReportBuffer:[],logBuffer:[],requestInfo:null,init:function(b){var c=this;"undefined"==typeof b.server&&(b.server="https://api.appenlight.com"),"undefined"==typeof b.apiKey&&(b.apiKey="undefined"),"undefined"==typeof b.protocol_version&&(b.protocol_version="0.5"),("undefined"==typeof b.windowOnError||0==b.windowOnError)&&(TraceKit.collectWindowErrors=!1),"undefined"==typeof b.sendInterval&&(b.sendInterval=1e3),"undefined"==typeof b.tracekitRemoteFetching&&(b.tracekitRemoteFetching=!0),"undefined"==typeof b.tracekitContextLines&&(b.tracekitContextLines=11),b.sendInterval>=1e3&&this.createSendInterval(b.sendInterval),this.options=b,this.requestInfo={url:a.location.href},this.reportsEndpoint=b.server+"/api/reports?public_api_key="+this.options.apiKey+"&protocol_version="+this.options.protocol_version,this.logsEndpoint=b.server+"/api/logs?public_api_key="+this.options.apiKey+"&protocol_version="+this.options.protocol_version,TraceKit.remoteFetching=b.tracekitRemoteFetching,TraceKit.linesOfContext=b.tracekitContextLines,TraceKit.report.subscribe(function(a){c.handleError(a)})},createSendInterval:function(a){var b=this;this.send_iv=setInterval(function(){b.sendReports(),b.sendLogs()},a)},setRequestInfo:function(a){for(var b in a)this.requestInfo[b]=a[b]},grabError:function(a){try{TraceKit.report(a)}catch(b){if(a!==b)throw b}},handleError:function(b){if("stack"==b.mode)var c=b.name+": "+b.message;else var c=b.message;var d={client:"javascript",language:"javascript",error:c,occurences:1,priority:5,server:"",http_status:500,request:{},traceback:[]};if(d.user_agent=a.navigator.userAgent,d.start_time=(new Date).toJSON(),null!=this.requestInfo)for(var e in this.requestInfo)d[e]=this.requestInfo[e];"undefined"!=typeof d.request_id&&d.request_id||(d.request_id=this.genUUID4());for(var f=b.stack.reverse().slice(-100),e=0;e<f.length;e++){var g="";try{if(f[e].context)for(var 
h=0;h<f[e].context.length;h++){var i=f[e].context[h];g+=i.length>300?"<minified-context>":i,g+="\n"}}catch(j){}var k={cline:g,file:f[e].url,fn:f[e].func,line:f[e].line,vars:[]};d.traceback.push(k)}d.traceback.length>0&&(d.traceback[d.traceback.length-1].cline=g+"\n"+c),this.errorReportBuffer.push(d)},log:function(b,c,d,e){if("undefined"==typeof d)var d=a.location.pathname;if("undefined"==typeof e)var e=null;this.logBuffer.push({log_level:b.toUpperCase(),message:c,date:(new Date).toJSON(),namespace:d}),null!=this.requestInfo&&"undefined"!=typeof this.requestInfo.server&&(this.logBuffer[this.logBuffer.length-1].server=this.requestInfo.server)},genUUID4:function(){return"xxxxxxxx-xxxx-4xxx-yxxx-xxxxxxxxxxxx".replace(/[xy]/g,function(a){var b=16*Math.random()|0,c="x"==a?b:3&b|8;return c.toString(16)})},sendReports:function(){if(this.errorReportBuffer.length<1)return!0;var a=this.errorReportBuffer;return this.submitData(this.reportsEndpoint,a),this.errorReportBuffer=[],!0},sendLogs:function(){if(this.logBuffer.length<1)return!0;var a=this.logBuffer;return this.submitData(this.logsEndpoint,a),this.logBuffer=[],!0},submitData:function(b,c){var d=new a.XMLHttpRequest;!d&&a.ActiveXObject&&(d=new a.ActiveXObject("Microsoft.XMLHTTP")),d.open("POST",b,!0),d.setRequestHeader("Content-Type","application/json"),d.send(JSON.stringify(c))}};a.AppEnlight=b,"function"==typeof define&&define.amd&&define("appenlight",[],function(){return b})}(window),function(a,b){function c(a,b){return Object.prototype.hasOwnProperty.call(a,b)}function d(a){return"undefined"==typeof a}if(a){var e={},f=a.TraceKit,g=[].slice,h="?";e.noConflict=function(){return a.TraceKit=f,e},e.wrap=function(a){function b(){try{return a.apply(this,arguments)}catch(b){throw e.report(b),b}}return b},e.report=function(){function b(a){i(),n.push(a)}function d(a){for(var b=n.length-1;b>=0;--b)n[b]===a&&n.splice(b,1)}function f(a,b){var d=null;if(!b||e.collectWindowErrors){for(var f in 
n)if(c(n,f))try{n[f].apply(null,[a].concat(g.call(arguments,2)))}catch(h){d=h}if(d)throw d}}function h(a,b,c,d,g){var h=null;if(q)e.computeStackTrace.augmentStackTraceWithInitialElement(q,b,c,a),j();else if(g)h=e.computeStackTrace(g),f(h,!0);else{var i={url:b,line:c,column:d};i.func=e.computeStackTrace.guessFunctionName(i.url,i.line),i.context=e.computeStackTrace.gatherContext(i.url,i.line),h={mode:"onerror",message:a,stack:[i]},f(h,!0)}return l?l.apply(this,arguments):!1}function i(){m!==!0&&(l=a.onerror,a.onerror=h,m=!0)}function j(){var a=q,b=o;o=null,q=null,p=null,f.apply(null,[a,!1].concat(b))}function k(b){if(q){if(p===b)return;j()}var c=e.computeStackTrace(b);throw q=c,p=b,o=g.call(arguments,1),a.setTimeout(function(){p===b&&j()},c.incomplete?2e3:0),b}var l,m,n=[],o=null,p=null,q=null;return k.subscribe=b,k.unsubscribe=d,k}(),e.computeStackTrace=function(){function b(b){if(!e.remoteFetching)return"";try{var c=function(){try{return new a.XMLHttpRequest}catch(b){return new a.ActiveXObject("Microsoft.XMLHTTP")}},d=c();return d.open("GET",b,!1),d.send(""),d.responseText}catch(f){return""}}function f(a){if("string"!=typeof a)return[];if(!c(w,a)){var d="",e="";try{e=document.domain}catch(f){}-1!==a.indexOf(e)&&(d=b(a)),w[a]=d?d.split("\n"):[]}return w[a]}function g(a,b){var c,e=/function ([^(]*)\(([^)]*)\)/,g=/['"]?([0-9A-Za-z$_]+)['"]?\s*[:=]\s*(function|eval|new Function)/,i="",j=10,k=f(a);if(!k.length)return h;for(var l=0;j>l;++l)if(i=k[b-l]+i,!d(i)){if(c=g.exec(i))return c[1];if(c=e.exec(i))return c[1]}return h}function i(a,b){var c=f(a);if(!c.length)return null;var g=[],h=Math.floor(e.linesOfContext/2),i=h+e.linesOfContext%2,j=Math.max(0,b-h-1),k=Math.min(c.length,b+i-1);b-=1;for(var l=j;k>l;++l)d(c[l])||g.push(c[l]);return g.length>0?g:null}function j(a){return a.replace(/[\-\[\]{}()*+?.,\\\^$|#]/g,"\\$&")}function k(a){return 
j(a).replace("<","(?:<|&lt;)").replace(">","(?:>|&gt;)").replace("&","(?:&|&amp;)").replace('"','(?:"|&quot;)').replace(/\s+/g,"\\s+")}function l(a,b){for(var c,d,e=0,g=b.length;g>e;++e)if((c=f(b[e])).length&&(c=c.join("\n"),d=a.exec(c)))return{url:b[e],line:c.substring(0,d.index).split("\n").length,column:d.index-c.lastIndexOf("\n",d.index)-1};return null}function m(a,b,c){var d,e=f(b),g=new RegExp("\\b"+j(a)+"\\b");return c-=1,e&&e.length>c&&(d=g.exec(e[c]))?d.index:null}function n(b){if(!d(document)){for(var c,e,f,g,h=[a.location.href],i=document.getElementsByTagName("script"),m=""+b,n=/^function(?:\s+([\w$]+))?\s*\(([\w\s,]*)\)\s*\{\s*(\S[\s\S]*\S)\s*\}\s*$/,o=/^function on([\w$]+)\s*\(event\)\s*\{\s*(\S[\s\S]*\S)\s*\}\s*$/,p=0;p<i.length;++p){var q=i[p];q.src&&h.push(q.src)}if(f=n.exec(m)){var r=f[1]?"\\s+"+f[1]:"",s=f[2].split(",").join("\\s*,\\s*");c=j(f[3]).replace(/;$/,";?"),e=new RegExp("function"+r+"\\s*\\(\\s*"+s+"\\s*\\)\\s*{\\s*"+c+"\\s*}")}else e=new RegExp(j(m).replace(/\s+/g,"\\s+"));if(g=l(e,h))return g;if(f=o.exec(m)){var t=f[1];if(c=k(f[2]),e=new RegExp("on"+t+"=[\\'\"]\\s*"+c+"\\s*[\\'\"]","i"),g=l(e,h[0]))return g;if(e=new RegExp(c),g=l(e,h))return g}return null}}function o(a){if(!a.stack)return null;for(var b,c,e=/^\s*at (.*?) 
?\(((?:file|https?|blob|chrome-extension|native|eval).*?)(?::(\d+))?(?::(\d+))?\)?\s*$/i,f=/^\s*(.*?)(?:\((.*?)\))?@?((?:file|https?|blob|chrome|\[).*?)(?::(\d+))?(?::(\d+))?\s*$/i,j=/^\s*at (?:((?:\[object object\])?.+) )?\(?((?:ms-appx|https?|blob):.*?):(\d+)(?::(\d+))?\)?\s*$/i,k=a.stack.split("\n"),l=[],n=/^(.*) is undefined$/.exec(a.message),o=0,p=k.length;p>o;++o){if(b=e.exec(k[o])){var q=b[2]&&-1!==b[2].indexOf("native");c={url:q?null:b[2],func:b[1]||h,args:q?[b[2]]:[],line:b[3]?+b[3]:null,column:b[4]?+b[4]:null}}else if(b=j.exec(k[o]))c={url:b[2],func:b[1]||h,args:[],line:+b[3],column:b[4]?+b[4]:null};else{if(!(b=f.exec(k[o])))continue;c={url:b[3],func:b[1]||h,args:b[2]?b[2].split(","):[],line:b[4]?+b[4]:null,column:b[5]?+b[5]:null}}!c.func&&c.line&&(c.func=g(c.url,c.line)),c.line&&(c.context=i(c.url,c.line)),l.push(c)}return l.length?(l[0]&&l[0].line&&!l[0].column&&n?l[0].column=m(n[1],l[0].url,l[0].line):l[0].column||d(a.columnNumber)||(l[0].column=a.columnNumber+1),{mode:"stack",name:a.name,message:a.message,stack:l}):null}function p(a){var b=a.stacktrace;if(b){for(var c,d=/ line (\d+).*script (?:in )?(\S+)(?:: in function (\S+))?$/i,e=/ line (\d+), column (\d+)\s*(?:in (?:<anonymous function: ([^>]+)>|([^\)]+))\((.*)\))? 
in (.*):\s*$/i,f=b.split("\n"),h=[],j=0;j<f.length;j+=2){var k=null;if((c=d.exec(f[j]))?k={url:c[2],line:+c[1],column:null,func:c[3],args:[]}:(c=e.exec(f[j]))&&(k={url:c[6],line:+c[1],column:+c[2],func:c[3]||c[4],args:c[5]?c[5].split(","):[]}),k){if(!k.func&&k.line&&(k.func=g(k.url,k.line)),k.line)try{k.context=i(k.url,k.line)}catch(l){}k.context||(k.context=[f[j+1]]),h.push(k)}}return h.length?{mode:"stacktrace",name:a.name,message:a.message,stack:h}:null}}function q(b){var d=b.message.split("\n");if(d.length<4)return null;var e,h=/^\s*Line (\d+) of linked script ((?:file|https?|blob)\S+)(?:: in function (\S+))?\s*$/i,j=/^\s*Line (\d+) of inline#(\d+) script in ((?:file|https?|blob)\S+)(?:: in function (\S+))?\s*$/i,m=/^\s*Line (\d+) of function script\s*$/i,n=[],o=document.getElementsByTagName("script"),p=[];for(var q in o)c(o,q)&&!o[q].src&&p.push(o[q]);for(var r=2;r<d.length;r+=2){var s=null;if(e=h.exec(d[r]))s={url:e[2],func:e[3],args:[],line:+e[1],column:null};else if(e=j.exec(d[r])){s={url:e[3],func:e[4],args:[],line:+e[1],column:null};var t=+e[1],u=p[e[2]-1];if(u){var v=f(s.url);if(v){v=v.join("\n");var w=v.indexOf(u.innerText);w>=0&&(s.line=t+v.substring(0,w).split("\n").length)}}}else if(e=m.exec(d[r])){var x=a.location.href.replace(/#.*$/,""),y=new RegExp(k(d[r+1])),z=l(y,[x]);s={url:x,func:"",args:[],line:z?z.line:e[1],column:null}}if(s){s.func||(s.func=g(s.url,s.line));var A=i(s.url,s.line),B=A?A[Math.floor(A.length/2)]:null;A&&B.replace(/^\s*/,"")===d[r+1].replace(/^\s*/,"")?s.context=A:s.context=[d[r+1]],n.push(s)}}return n.length?{mode:"multiline",name:b.name,message:d[0],stack:n}:null}function r(a,b,c,d){var e={url:b,line:c};if(e.url&&e.line){a.incomplete=!1,e.func||(e.func=g(e.url,e.line)),e.context||(e.context=i(e.url,e.line));var f=/ '([^']+)' /.exec(d);if(f&&(e.column=m(f[1],e.url,e.line)),a.stack.length>0&&a.stack[0].url===e.url){if(a.stack[0].line===e.line)return!1;if(!a.stack[0].line&&a.stack[0].func===e.func)return 
a.stack[0].line=e.line,a.stack[0].context=e.context,!1}return a.stack.unshift(e),a.partial=!0,!0}return a.incomplete=!0,!1}function s(a,b){for(var c,d,f,i=/function\s+([_$a-zA-Z\xA0-\uFFFF][_$a-zA-Z0-9\xA0-\uFFFF]*)?\s*\(/i,j=[],k={},l=!1,o=s.caller;o&&!l;o=o.caller)if(o!==t&&o!==e.report){if(d={url:null,func:h,args:[],line:null,column:null},o.name?d.func=o.name:(c=i.exec(o.toString()))&&(d.func=c[1]),"undefined"==typeof d.func)try{d.func=c.input.substring(0,c.input.indexOf("{"))}catch(p){}if(f=n(o)){d.url=f.url,d.line=f.line,d.func===h&&(d.func=g(d.url,d.line));var q=/ '([^']+)' /.exec(a.message||a.description);q&&(d.column=m(q[1],f.url,f.line))}k[""+o]?l=!0:k[""+o]=!0,j.push(d)}b&&j.splice(0,b);var u={mode:"callers",name:a.name,message:a.message,stack:j};return r(u,a.sourceURL||a.fileName,a.line||a.lineNumber,a.message||a.description),u}function t(a,b){var c=null;b=null==b?0:+b;try{if(c=p(a))return c}catch(d){if(v)throw d}try{if(c=o(a))return c}catch(d){if(v)throw d}try{if(c=q(a))return c}catch(d){if(v)throw d}try{if(c=s(a,b+1))return c}catch(d){if(v)throw d}return{mode:"failed"}}function u(a){a=(null==a?0:+a)+1;try{throw new Error}catch(b){return t(b,a+1)}}var v=!1,w={};return t.augmentStackTraceWithInitialElement=r,t.guessFunctionName=g,t.gatherContext=i,t.ofCaller=u,t.getSource=f,t}(),e.extendToAsynchronousCallbacks=function(){var b=function(b){var c=a[b];a[b]=function(){var a=g.call(arguments),b=a[0];return"function"==typeof b&&(a[0]=e.wrap(b)),c.apply?c.apply(this,a):c(a[0],a[1])}};b("setTimeout"),b("setInterval")},e.remoteFetching||(e.remoteFetching=!0),e.collectWindowErrors||(e.collectWindowErrors=!0),(!e.linesOfContext||e.linesOfContext<1)&&(e.linesOfContext=11),a.TraceKit=e}}("undefined"!=typeof window?window:global); No newline at end of file
@@ -1,132 +0,0 b''
1 // # Copyright (C) 2010-2016 RhodeCode GmbH
2 // #
3 // # This program is free software: you can redistribute it and/or modify
4 // # it under the terms of the GNU Affero General Public License, version 3
5 // # (only), as published by the Free Software Foundation.
6 // #
7 // # This program is distributed in the hope that it will be useful,
8 // # but WITHOUT ANY WARRANTY; without even the implied warranty of
9 // # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
10 // # GNU General Public License for more details.
11 // #
12 // # You should have received a copy of the GNU Affero General Public License
13 // # along with this program. If not, see <http://www.gnu.org/licenses/>.
14 // #
15 // # This program is dual-licensed. If you wish to learn more about the
16 // # RhodeCode Enterprise Edition, including its added features, Support services,
17 // # and proprietary license terms, please see https://rhodecode.com/licenses/
18
19 /**
20 * Multi select widget
21 */
22 var MultiSelectWidget = function(selected_id, available_id, form_id){
23 // definition of containers ID's
24 var selected_container = selected_id;
25 var available_container = available_id;
26 // temp container for selected storage.
27 var cache = [];
28 var av_cache = [];
29 var c = $('#'+selected_container).get(0);
30 var ac = $('#'+available_container).get(0);
31 // get only selected options for further fulfillment
32 for (var i = 0; node = c.options[i]; i++){
33 if (node.selected){
34 // push selected to my temp storage left overs :)
35 cache.push(node);
36 }
37 }
38 // get all available options to cache
39 for (i = 0; node = ac.options[i]; i++){
40 // push selected to my temp storage left overs :)
41 av_cache.push(node);
42 }
43 // fill available only with those not in chosen
44 ac.options.length = 0;
45 tmp_cache = [];
46
47 for (i = 0; node = av_cache[i]; i++){
48 var add = true;
49 for (var i2 = 0; node_2 = cache[i2]; i2++){
50 if (node.value === node_2.value){
51 add=false;
52 break;
53 }
54 }
55 if(add){
56 tmp_cache.push(new Option(node.text, node.value, false, false));
57 }
58 }
59 for (i = 0; node = tmp_cache[i]; i++){
60 ac.options[i] = node;
61 }
62 function prompts_action_callback(e){
63 var chosen = $('#'+selected_container).get(0);
64 var available = $('#'+available_container).get(0);
65 // get checked and unchecked options from field
66 function get_checked(from_field){
67 // temp container for storage.
68 var sel_cache = [];
69 var oth_cache = [];
70
71 for (i = 0; node = from_field.options[i]; i++){
72 if(node.selected){
73 // push selected fields :)
74 sel_cache.push(node);
75 }
76 else {
77 oth_cache.push(node);
78 }
79 }
80 return [sel_cache,oth_cache];
81 }
82 // fill the field with given options
83 function fill_with(field,options){
84 // clear first
85 field.options.length=0;
86 for (var i = 0; node = options[i]; i++){
87 field.options[i] = new Option(node.text, node.value, false, false);
88 }
89 }
90 // adds to current field
91 function add_to(field,options){
92 for (i = 0; node = options[i]; i++){
93 field.appendChild(new Option(node.text, node.value, false, false));
94 }
95 }
96 // add action
97 if (this.id ==='add_element'){
98 var c = get_checked(available);
99 add_to(chosen,c[0]);
100 fill_with(available,c[1]);
101 }
102 // remove action
103 if (this.id ==='remove_element'){
104 c = get_checked(chosen);
105 add_to(available,c[0]);
106 fill_with(chosen,c[1]);
107 }
108 // add all elements
109 if(this.id === 'add_all_elements'){
110 for (i=0; node = available.options[i]; i++){
111 chosen.appendChild(new Option(node.text, node.value, false, false));
112 }
113 available.options.length = 0;
114 }
115 // remove all elements
116 if (this.id === 'remove_all_elements'){
117 for (i=0; node = chosen.options[i]; i++){
118 available.appendChild(new Option(node.text, node.value, false, false));
119 }
120 chosen.options.length = 0;
121 }
122 }
123 $('#add_element, #remove_element, #add_all_elements, #remove_all_elements').click(prompts_action_callback);
124 if (form_id !== undefined) {
125 $('#'+form_id).submit(function(){
126 var chosen = $('#'+selected_container).get(0);
127 for (i = 0; i < chosen.options.length; i++) {
128 chosen.options[i].selected = 'selected';
129 }
130 });
131 }
132 };
@@ -1,30 +0,0 b''
1 <%namespace name="base" file="/base/base.html"/>
2
3 <div class="panel panel-default">
4 <div class="panel-heading">
5 <h3 class="panel-title">${_('Members of User Group: %s') % c.user_group.users_group_name}</h3>
6 </div>
7 <div class="panel-body">
8 % if c.group_members_obj:
9 <table class="rctable group_members">
10 <tr>
11 <th>Username</th>
12 <th>Name</th>
13 </tr>
14 %for user in c.group_members_obj:
15 <tr>
16 <td class="td-author">
17 <div class="group_member">
18 ${base.gravatar(user.email, 16)}
19 <span class="username user">${h.link_to(user.username, h.url( 'edit_user',user_id=user.user_id))}</span>
20 </div>
21 </td>
22 <td class="fullname">${user.full_name}</td>
23 </tr>
24 %endfor
25 </table>
26 %else:
27 <p class="empty_data">${_('No members yet')}</p>
28 %endif
29 </div>
30 </div>
@@ -1,16 +0,0 b''
1 <div class="flash_msg">
2 <% messages = h.flash.pop_messages() %>
3 % if messages:
4 % for message in messages:
5 <div class="alert alert-dismissable alert-${message.category}">
6 <button type="button" class="close" data-dismiss="alert" aria-hidden="true">&times;</button>
7 ${message}
8 </div>
9 % endfor
10 % endif
11 <script>
12 if (typeof jQuery != 'undefined') {
13 $(".alert").alert();
14 }
15 </script>
16 </div>
@@ -1,93 +0,0 b''
1 ################################################################################
2 # RhodeCode VCSServer - configuration #
3 # #
4 ################################################################################
5
6 [DEFAULT]
7 host = 127.0.0.1
8 port = 9901
9 locale = en_US.UTF-8
10 # number of worker threads, this should be set based on a formula threadpool=N*6
11 # where N is number of RhodeCode Enterprise workers, eg. running 2 instances
12 # 8 gunicorn workers each would be 2 * 8 * 6 = 96, threadpool_size = 96
13 threadpool_size = 96
14 timeout = 0
15
16 # cache regions, please don't change
17 beaker.cache.regions = repo_object
18 beaker.cache.repo_object.type = memorylru
19 beaker.cache.repo_object.max_items = 100
20 # cache auto-expires after N seconds
21 beaker.cache.repo_object.expire = 300
22 beaker.cache.repo_object.enabled = true
23
24
25 ################################
26 ### LOGGING CONFIGURATION ####
27 ################################
28 [loggers]
29 keys = root, vcsserver, pyro4, beaker
30
31 [handlers]
32 keys = console
33
34 [formatters]
35 keys = generic
36
37 #############
38 ## LOGGERS ##
39 #############
40 [logger_root]
41 level = NOTSET
42 handlers = console
43
44 [logger_vcsserver]
45 level = DEBUG
46 handlers =
47 qualname = vcsserver
48 propagate = 1
49
50 [logger_beaker]
51 level = DEBUG
52 handlers =
53 qualname = beaker
54 propagate = 1
55
56 [logger_pyro4]
57 level = DEBUG
58 handlers =
59 qualname = Pyro4
60 propagate = 1
61
62
63 ##############
64 ## HANDLERS ##
65 ##############
66
67 [handler_console]
68 class = StreamHandler
69 args = (sys.stderr,)
70 level = INFO
71 formatter = generic
72
73 [handler_file]
74 class = FileHandler
75 args = ('vcsserver.log', 'a',)
76 level = DEBUG
77 formatter = generic
78
79 [handler_file_rotating]
80 class = logging.handlers.TimedRotatingFileHandler
81 # 'D', 5 - rotate every 5 days
82 # you can set 'h', 'midnight'
83 args = ('vcsserver.log', 'D', 5, 10,)
84 level = DEBUG
85 formatter = generic
86
87 ################
88 ## FORMATTERS ##
89 ################
90
91 [formatter_generic]
92 format = %(asctime)s.%(msecs)03d %(levelname)-5.5s [%(name)s] %(message)s
93 datefmt = %Y-%m-%d %H:%M:%S
@@ -1,70 +0,0 b''
1 [app:main]
2 use = egg:rhodecode-vcsserver
3 pyramid.reload_templates = true
4 pyramid.default_locale_name = en
5 pyramid.includes =
6 # cache regions, please don't change
7 beaker.cache.regions = repo_object
8 beaker.cache.repo_object.type = memorylru
9 beaker.cache.repo_object.max_items = 100
10 # cache auto-expires after N seconds
11 beaker.cache.repo_object.expire = 300
12 beaker.cache.repo_object.enabled = true
13 locale = en_US.UTF-8
14
15
16 [server:main]
17 use = egg:waitress#main
18 host = 0.0.0.0
19 port = %(http_port)s
20
21
22 ################################
23 ### LOGGING CONFIGURATION ####
24 ################################
25 [loggers]
26 keys = root, vcsserver, beaker
27
28 [handlers]
29 keys = console
30
31 [formatters]
32 keys = generic
33
34 #############
35 ## LOGGERS ##
36 #############
37 [logger_root]
38 level = NOTSET
39 handlers = console
40
41 [logger_vcsserver]
42 level = DEBUG
43 handlers =
44 qualname = vcsserver
45 propagate = 1
46
47 [logger_beaker]
48 level = DEBUG
49 handlers =
50 qualname = beaker
51 propagate = 1
52
53
54 ##############
55 ## HANDLERS ##
56 ##############
57
58 [handler_console]
59 class = StreamHandler
60 args = (sys.stderr,)
61 level = DEBUG
62 formatter = generic
63
64 ################
65 ## FORMATTERS ##
66 ################
67
68 [formatter_generic]
69 format = %(asctime)s.%(msecs)03d %(levelname)-5.5s [%(name)s] %(message)s
70 datefmt = %Y-%m-%d %H:%M:%S
This diff has been collapsed as it changes many lines, (716 lines changed)
@@ -1,716 +0,0 b''
1
2
3 ################################################################################
4 ## RHODECODE ENTERPRISE CONFIGURATION ##
5 # The %(here)s variable will be replaced with the parent directory of this file#
6 ################################################################################
7
8 [DEFAULT]
9 debug = true
10
11 ################################################################################
12 ## EMAIL CONFIGURATION ##
13 ## Uncomment and replace with the email address which should receive ##
14 ## any error reports after an application crash ##
15 ## Additionally these settings will be used by the RhodeCode mailing system ##
16 ################################################################################
17
18 ## prefix all emails subjects with given prefix, helps filtering out emails
19 #email_prefix = [RhodeCode]
20
21 ## email FROM address all mails will be sent
22 #app_email_from = rhodecode-noreply@localhost
23
24 ## Uncomment and replace with the address which should receive any error report
25 ## note: using appenlight for error handling doesn't need this to be uncommented
26 #email_to = admin@localhost
27
28 ## in case of Application errors, send an error email from
29 #error_email_from = rhodecode_error@localhost
30
31 ## additional error message to be sent in case of server crash
32 #error_message =
33
34
35 #smtp_server = mail.server.com
36 #smtp_username =
37 #smtp_password =
38 #smtp_port =
39 #smtp_use_tls = false
40 #smtp_use_ssl = true
41 ## Specify available auth parameters here (e.g. LOGIN PLAIN CRAM-MD5, etc.)
42 #smtp_auth =
43
44 [server:main]
45 ## COMMON ##
46 host = 0.0.0.0
47 port = 5000
48
49 ##################################
50 ## WAITRESS WSGI SERVER ##
51 ## Recommended for Development ##
52 ##################################
53
54 use = egg:waitress#main
55 ## number of worker threads
56 threads = 5
57 ## MAX BODY SIZE 100GB
58 max_request_body_size = 107374182400
59 ## Use poll instead of select, fixes file descriptors limits problems.
60 ## May not work on old windows systems.
61 asyncore_use_poll = true
62
63
64 ##########################
65 ## GUNICORN WSGI SERVER ##
66 ##########################
67 ## run with gunicorn --log-config <inifile.ini> --paste <inifile.ini>
68
69 #use = egg:gunicorn#main
70 ## Sets the number of process workers. You must set `instance_id = *`
71 ## when this option is set to more than one worker, recommended
72 ## value is (2 * NUMBER_OF_CPUS + 1), eg 2CPU = 5 workers
73 ## The `instance_id = *` must be set in the [app:main] section below
74 #workers = 2
75 ## number of threads for each worker, must be set to 1 for gevent
76 ## generally recommended to be 1
77 #threads = 1
78 ## process name
79 #proc_name = rhodecode
80 ## type of worker class, one of sync, gevent
81 ## for bigger setups it is recommended to use a worker class other than sync
82 #worker_class = sync
83 ## The maximum number of simultaneous clients. Valid only for Gevent
84 #worker_connections = 10
85 ## max number of requests that worker will handle before being gracefully
86 ## restarted, could prevent memory leaks
87 #max_requests = 1000
88 #max_requests_jitter = 30
89 ## amount of time a worker can spend handling a request before it
90 ## gets killed and restarted. Set to 6hrs
91 #timeout = 21600
92
93 ## UWSGI ##
94 ## run with uwsgi --ini-paste-logged <inifile.ini>
95 #[uwsgi]
96 #socket = /tmp/uwsgi.sock
97 #master = true
98 #http = 127.0.0.1:5000
99
100 ## set as daemon and redirect all output to file
101 #daemonize = ./uwsgi_rhodecode.log
102
103 ## master process PID
104 #pidfile = ./uwsgi_rhodecode.pid
105
106 ## stats server with workers statistics, use uwsgitop
107 ## for monitoring, `uwsgitop 127.0.0.1:1717`
108 #stats = 127.0.0.1:1717
109 #memory-report = true
110
111 ## log 5XX errors
112 #log-5xx = true
113
114 ## Set the socket listen queue size.
115 #listen = 256
116
117 ## Gracefully Reload workers after the specified amount of managed requests
118 ## (avoid memory leaks).
119 #max-requests = 1000
120
121 ## enable large buffers
122 #buffer-size=65535
123
124 ## socket and http timeouts ##
125 #http-timeout=3600
126 #socket-timeout=3600
127
128 ## Log requests slower than the specified number of milliseconds.
129 #log-slow = 10
130
131 ## Exit if no app can be loaded.
132 #need-app = true
133
134 ## Set lazy mode (load apps in workers instead of master).
135 #lazy = true
136
137 ## scaling ##
138 ## set cheaper algorithm to use, if not set default will be used
139 #cheaper-algo = spare
140
141 ## minimum number of workers to keep at all times
142 #cheaper = 1
143
144 ## number of workers to spawn at startup
145 #cheaper-initial = 1
146
147 ## maximum number of workers that can be spawned
148 #workers = 4
149
150 ## how many workers should be spawned at a time
151 #cheaper-step = 1
152
153 ## prefix middleware for RhodeCode, disables force_https flag.
154 ## recommended when using proxy setup.
155 ## allows to set RhodeCode under a prefix in server.
156 ## eg https://server.com/<prefix>. Enable `filter-with =` option below as well.
157 ## optionally set prefix like: `prefix = /<your-prefix>`
158 [filter:proxy-prefix]
159 use = egg:PasteDeploy#prefix
160 prefix = /
161
162 [app:main]
163 is_test = True
164 use = egg:rhodecode-enterprise-ce
165
166 ## enable proxy prefix middleware, defined above
167 #filter-with = proxy-prefix
168
169
170 ## RHODECODE PLUGINS ##
171 rhodecode.includes = rhodecode.api
172
173 # api prefix url
174 rhodecode.api.url = /_admin/api
175
176
177 ## END RHODECODE PLUGINS ##
178
179 ## encryption key used to encrypt social plugin tokens,
180 ## remote_urls with credentials etc, if not set it defaults to
181 ## `beaker.session.secret`
182 #rhodecode.encrypted_values.secret =
183
184 ## decryption strict mode (enabled by default). It controls if decryption raises
185 ## `SignatureVerificationError` in case of wrong key, or damaged encryption data.
186 #rhodecode.encrypted_values.strict = false
187
188 ## return gzipped responses from Rhodecode (static files/application)
189 gzip_responses = false
190
191 ## autogenerate javascript routes file on startup
192 generate_js_files = false
193
194 ## Optional Languages
195 ## en(default), be, de, es, fr, it, ja, pl, pt, ru, zh
196 lang = en
197
198 ## perform a full repository scan on each server start, this should be
199 ## set to false after first startup, to allow faster server restarts.
200 startup.import_repos = true
201
202 ## Uncomment and set this path to use archive download cache.
203 ## Once enabled, generated archives will be cached at this location
204 ## and served from the cache during subsequent requests for the same archive of
205 ## the repository.
206 #archive_cache_dir = /tmp/tarballcache
207
208 ## change this to unique ID for security
209 app_instance_uuid = rc-production
210
211 ## cut off limit for large diffs (size in bytes)
212 cut_off_limit_diff = 1024000
213 cut_off_limit_file = 256000
214
215 ## use cache version of scm repo everywhere
216 vcs_full_cache = false
217
218 ## force https in RhodeCode, fixes https redirects, assumes it's always https
219 ## Normally this is controlled by proper http flags sent from http server
220 force_https = false
221
222 ## use Strict-Transport-Security headers
223 use_htsts = false
224
225 ## number of commits stats will parse on each iteration
226 commit_parse_limit = 25
227
228 ## git rev filter option, --all is the default filter, if you need to
229 ## hide all refs in changelog switch this to --branches --tags
230 git_rev_filter = --all
231
232 # Set to true if your repos are exposed using the dumb protocol
233 git_update_server_info = false
234
235 ## RSS/ATOM feed options
236 rss_cut_off_limit = 256000
237 rss_items_per_page = 10
238 rss_include_diff = false
239
240 ## gist URL alias, used to create nicer urls for gist. This should be an
241 ## url that does rewrites to _admin/gists/<gistid>.
242 ## example: http://gist.rhodecode.org/{gistid}. Empty means use the internal
243 ## RhodeCode url, ie. http[s]://rhodecode.server/_admin/gists/<gistid>
244 gist_alias_url =
245
246 ## List of controllers (using glob pattern syntax) that AUTH TOKENS could be
247 ## used for access.
248 ## Adding ?auth_token = <token> to the url authenticates this request as if it
249 ## came from the logged in user who owns this authentication token.
250 ##
251 ## Syntax is <ControllerClass>:<function_pattern>.
252 ## To enable access to raw_files put `FilesController:raw`.
253 ## To enable access to patches add `ChangesetController:changeset_patch`.
254 ## The list should be "," separated and on a single line.
255 ##
256 ## Recommended controllers to enable:
257 # ChangesetController:changeset_patch,
258 # ChangesetController:changeset_raw,
259 # FilesController:raw,
260 # FilesController:archivefile,
261 # GistsController:*,
262 api_access_controllers_whitelist =
263
264 ## default encoding used to convert from and to unicode
265 ## can be also a comma separated list of encoding in case of mixed encodings
266 default_encoding = UTF-8
267
268 ## instance-id prefix
269 ## a prefix key for this instance used for cache invalidation when running
270 ## multiple instances of rhodecode, make sure it's globally unique for
271 ## all running rhodecode instances. Leave empty if you don't use it
272 instance_id =
273
274 ## Fallback authentication plugin. Set this to a plugin ID to force the usage
275 ## of an authentication plugin also if it is disabled by it's settings.
276 ## This could be useful if you are unable to log in to the system due to broken
277 ## authentication settings. Then you can enable e.g. the internal rhodecode auth
278 ## module to log in again and fix the settings.
279 ##
280 ## Available builtin plugin IDs (hash is part of the ID):
281 ## egg:rhodecode-enterprise-ce#rhodecode
282 ## egg:rhodecode-enterprise-ce#pam
283 ## egg:rhodecode-enterprise-ce#ldap
284 ## egg:rhodecode-enterprise-ce#jasig_cas
285 ## egg:rhodecode-enterprise-ce#headers
286 ## egg:rhodecode-enterprise-ce#crowd
287 #rhodecode.auth_plugin_fallback = egg:rhodecode-enterprise-ce#rhodecode
288
289 ## alternative return HTTP header for failed authentication. Default HTTP
290 ## response is 401 HTTPUnauthorized. Currently HG clients have troubles with
291 ## handling that causing a series of failed authentication calls.
292 ## Set this variable to 403 to return HTTPForbidden, or any other HTTP code
293 ## This will be served instead of the default 401 on bad authentication
294 auth_ret_code =
295
296 ## use special detection method when serving auth_ret_code, instead of serving
297 ## ret_code directly, use 401 initially (Which triggers credentials prompt)
298 ## and then serve auth_ret_code to clients
299 auth_ret_code_detection = false
300
301 ## locking return code. When repository is locked return this HTTP code. 2XX
302 ## codes don't break the transactions while 4XX codes do
303 lock_ret_code = 423
304
305 ## allows to change the repository location in settings page
306 allow_repo_location_change = true
307
308 ## allows to setup custom hooks in settings page
309 allow_custom_hooks_settings = true
310
311 ## generated license token, goto license page in RhodeCode settings to obtain
312 ## new token
313 license_token = abra-cada-bra1-rce3
314
315 ## supervisor connection uri, for managing supervisor and logs.
316 supervisor.uri =
317 ## supervisord group name/id we only want this RC instance to handle
318 supervisor.group_id = dev
319
320 ## Display extended labs settings
321 labs_settings_active = true
322
323 ####################################
324 ### CELERY CONFIG ####
325 ####################################
326 use_celery = false
327 broker.host = localhost
328 broker.vhost = rabbitmqhost
329 broker.port = 5672
330 broker.user = rabbitmq
331 broker.password = qweqwe
332
333 celery.imports = rhodecode.lib.celerylib.tasks
334
335 celery.result.backend = amqp
336 celery.result.dburi = amqp://
337 celery.result.serialier = json
338
339 #celery.send.task.error.emails = true
340 #celery.amqp.task.result.expires = 18000
341
342 celeryd.concurrency = 2
343 #celeryd.log.file = celeryd.log
344 celeryd.log.level = debug
345 celeryd.max.tasks.per.child = 1
346
347 ## tasks will never be sent to the queue, but executed locally instead.
348 celery.always.eager = false
349
350 ####################################
351 ### BEAKER CACHE ####
352 ####################################
353 # default cache dir for templates. Putting this into a ramdisk
354 ## can boost performance, eg. %(here)s/data_ramdisk
355 cache_dir = %(here)s/data
356
357 ## locking and default file storage for Beaker. Putting this into a ramdisk
358 ## can boost performance, eg. %(here)s/data_ramdisk/cache/beaker_data
359 beaker.cache.data_dir = %(here)s/rc/data/cache/beaker_data
360 beaker.cache.lock_dir = %(here)s/rc/data/cache/beaker_lock
361
362 beaker.cache.regions = super_short_term, short_term, long_term, sql_cache_short, auth_plugins, repo_cache_long
363
364 beaker.cache.super_short_term.type = memory
365 beaker.cache.super_short_term.expire = 1
366 beaker.cache.super_short_term.key_length = 256
367
368 beaker.cache.short_term.type = memory
369 beaker.cache.short_term.expire = 60
370 beaker.cache.short_term.key_length = 256
371
372 beaker.cache.long_term.type = memory
373 beaker.cache.long_term.expire = 36000
374 beaker.cache.long_term.key_length = 256
375
376 beaker.cache.sql_cache_short.type = memory
377 beaker.cache.sql_cache_short.expire = 1
378 beaker.cache.sql_cache_short.key_length = 256
379
380 ## default is memory cache, configure only if required
381 ## using multi-node or multi-worker setup
382 #beaker.cache.auth_plugins.type = ext:database
383 #beaker.cache.auth_plugins.lock_dir = %(here)s/data/cache/auth_plugin_lock
384 #beaker.cache.auth_plugins.url = postgresql://postgres:secret@localhost/rhodecode
385 #beaker.cache.auth_plugins.url = mysql://root:secret@127.0.0.1/rhodecode
386 #beaker.cache.auth_plugins.sa.pool_recycle = 3600
387 #beaker.cache.auth_plugins.sa.pool_size = 10
388 #beaker.cache.auth_plugins.sa.max_overflow = 0
389
390 beaker.cache.repo_cache_long.type = memorylru_base
391 beaker.cache.repo_cache_long.max_items = 4096
392 beaker.cache.repo_cache_long.expire = 2592000
393
394 ## default is memorylru_base cache, configure only if required
395 ## using multi-node or multi-worker setup
396 #beaker.cache.repo_cache_long.type = ext:memcached
397 #beaker.cache.repo_cache_long.url = localhost:11211
398 #beaker.cache.repo_cache_long.expire = 1209600
399 #beaker.cache.repo_cache_long.key_length = 256
400
401 ####################################
402 ### BEAKER SESSION ####
403 ####################################
404
405 ## .session.type is type of storage options for the session, current allowed
406 ## types are file, ext:memcached, ext:database, and memory (default).
407 beaker.session.type = file
408 beaker.session.data_dir = %(here)s/rc/data/sessions/data
409
410 ## db based session, fast, and allows easy management over logged in users
411 #beaker.session.type = ext:database
412 #beaker.session.table_name = db_session
413 #beaker.session.sa.url = postgresql://postgres:secret@localhost/rhodecode
414 #beaker.session.sa.url = mysql://root:secret@127.0.0.1/rhodecode
415 #beaker.session.sa.pool_recycle = 3600
416 #beaker.session.sa.echo = false
417
418 beaker.session.key = rhodecode
419 beaker.session.secret = test-rc-uytcxaz
420 beaker.session.lock_dir = %(here)s/rc/data/sessions/lock
421
422 ## Secure encrypted cookie. Requires AES and AES python libraries
423 ## you must disable beaker.session.secret to use this
424 #beaker.session.encrypt_key = <key_for_encryption>
425 #beaker.session.validate_key = <validation_key>
426
427 ## sets session as invalid (also logging out the user) if it has not been
428 ## accessed for a given amount of time in seconds
429 beaker.session.timeout = 2592000
430 beaker.session.httponly = true
431 ## Path to use for the cookie.
432 #beaker.session.cookie_path = /<your-prefix>
433
434 ## uncomment for https secure cookie
435 beaker.session.secure = false
436
437 ## auto save the session to avoid calling .save() explicitly
438 beaker.session.auto = false
439
440 ## default cookie expiration time in seconds, set to `true` to set expire
441 ## at browser close
442 #beaker.session.cookie_expires = 3600
443
444 ###################################
445 ## SEARCH INDEXING CONFIGURATION ##
446 ###################################
447 ## Full text search indexer is available in rhodecode-tools under
448 ## `rhodecode-tools index` command
449
450 # WHOOSH Backend, doesn't require additional services to run
451 # it works well with a few dozen repos
452 search.module = rhodecode.lib.index.whoosh
453 search.location = %(here)s/data/index
454
455 ########################################
456 ### CHANNELSTREAM CONFIG ####
457 ########################################
458 ## channelstream enables persistent connections and live notification
459 ## in the system. It's also used by the chat system
460
461 channelstream.enabled = false
462 # location of channelstream server on the backend
463 channelstream.server = 127.0.0.1:9800
464 ## location of the channelstream server from outside world
465 ## most likely this would be an http server special backend URL, that handles
466 ## websocket connections; see nginx example for config
467 channelstream.ws_url = ws://rhodecode.yourserver.com/_channelstream
468 channelstream.secret = secret
469 channelstream.history.location = %(here)s/channelstream_history
470
471
472 ###################################
473 ## APPENLIGHT CONFIG ##
474 ###################################
475
476 ## Appenlight is tailored to work with RhodeCode, see
477 ## http://appenlight.com for details on how to obtain an account
478
479 ## appenlight integration enabled
480 appenlight = false
481
482 appenlight.server_url = https://api.appenlight.com
483 appenlight.api_key = YOUR_API_KEY
484 #appenlight.transport_config = https://api.appenlight.com?threaded=1&timeout=5
485
486 # used for JS client
487 appenlight.api_public_key = YOUR_API_PUBLIC_KEY
488
489 ## TWEAK AMOUNT OF INFO SENT HERE
490
491 ## enables 404 error logging (default False)
492 appenlight.report_404 = false
493
494 ## time in seconds after request is considered being slow (default 1)
495 appenlight.slow_request_time = 1
496
497 ## record slow requests in application
498 ## (needs to be enabled for slow datastore recording and time tracking)
499 appenlight.slow_requests = true
500
501 ## enable hooking to application loggers
502 appenlight.logging = true
503
504 ## minimum log level for log capture
505 appenlight.logging.level = WARNING
506
507 ## send logs only from erroneous/slow requests
508 ## (saves API quota for intensive logging)
509 appenlight.logging_on_error = false
510
511 ## list of additional keywords that should be grabbed from the environ object;
512 ## can be a string with a comma separated list of words in lowercase
513 ## (by default the client will always send the following info:
514 ## 'REMOTE_USER', 'REMOTE_ADDR', 'SERVER_NAME', 'CONTENT_TYPE' + all keys that
515 ## start with HTTP*); this list can be extended with additional keywords here
516 appenlight.environ_keys_whitelist =
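## Example (hypothetical keys, shown only to illustrate the comma separated
## lowercase format described above; adjust to your environment):
#appenlight.environ_keys_whitelist = http_user_agent, http_referer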
517
518 ## list of keywords that should be blanked from the request object;
519 ## can be a string with a comma separated list of words in lowercase
520 ## (by default the client will always blank keys that contain the following words:
521 ## 'password', 'passwd', 'pwd', 'auth_tkt', 'secret', 'csrf');
522 ## this list can be extended with additional keywords set here
523 appenlight.request_keys_blacklist =
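## Example (hypothetical keys, shown only to illustrate the format; any key
## containing these words would be blanked in addition to the defaults above):
#appenlight.request_keys_blacklist = private_token, session_id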
524
525 ## list of namespaces that should be ignored when gathering log entries;
526 ## can be a string with a comma separated list of namespaces
527 ## (by default the client ignores its own entries: appenlight_client.client)
528 appenlight.log_namespace_blacklist =
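## Example (hypothetical namespaces, shown only to illustrate the format;
## log entries from these logger namespaces would not be sent):
#appenlight.log_namespace_blacklist = sqlalchemy.pool, requests.packages.urllib3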
529
530
531 ################################################################################
532 ## WARNING: *THE LINE BELOW MUST BE UNCOMMENTED ON A PRODUCTION ENVIRONMENT* ##
533 ## Debug mode will enable the interactive debugging tool, allowing ANYONE to ##
534 ## execute malicious code after an exception is raised. ##
535 ################################################################################
536 set debug = false
537
538
539 ##############
540 ## STYLING ##
541 ##############
542 debug_style = false
543
544 #########################################################
545 ### DB CONFIGS - EACH DB WILL HAVE ITS OWN CONFIG    ###
546 #########################################################
547 #sqlalchemy.db1.url = sqlite:///%(here)s/rhodecode_test.db
548 #sqlalchemy.db1.url = postgresql://postgres:qweqwe@localhost/rhodecode_test
549 #sqlalchemy.db1.url = mysql://root:qweqwe@localhost/rhodecode_test
550 sqlalchemy.db1.url = sqlite:///%(here)s/rhodecode_test.db
551
552 # see sqlalchemy docs for other advanced settings
553
554 ## print the sql statements to output
555 sqlalchemy.db1.echo = false
556 ## recycle the connections after this amount of seconds
557 sqlalchemy.db1.pool_recycle = 3600
558 sqlalchemy.db1.convert_unicode = true
559
560 ## the number of connections to keep open inside the connection pool.
561 ## 0 indicates no limit
562 #sqlalchemy.db1.pool_size = 5
563
564 ## the number of connections to allow in connection pool "overflow", that is
565 ## connections that can be opened above and beyond the pool_size setting,
566 ## which defaults to five.
567 #sqlalchemy.db1.max_overflow = 10
568
569
570 ##################
571 ### VCS CONFIG ###
572 ##################
573 vcs.server.enable = true
574 vcs.server = localhost:9901
575
576 ## Web server connectivity protocol, responsible for web-based VCS operations
577 ## Available protocols are:
578 ## `pyro4` - using pyro4 server
579 ## `http` - using http-rpc backend
580 vcs.server.protocol = pyro4
581
582 ## Push/Pull operations protocol, available options are:
583 ## `pyro4` - using pyro4 server
584 ## `rhodecode.lib.middleware.utils.scm_app_http` - HTTP based, recommended
585 ## `vcsserver.scm_app` - internal app (EE only)
586 vcs.scm_app_implementation = pyro4
587
588 ## Push/Pull operations hooks protocol, available options are:
589 ## `pyro4` - using pyro4 server
590 ## `http` - using http-rpc backend
591 vcs.hooks.protocol = pyro4
592
593 vcs.server.log_level = debug
594 ## Start VCSServer with this instance as a subprocess, useful for development
595 vcs.start_server = true
596
597 ## List of enabled VCS backends, available options are:
598 ## `hg` - mercurial
599 ## `git` - git
600 ## `svn` - subversion
601 vcs.backends = hg, git, svn
602
603 vcs.connection_timeout = 3600
604 ## Compatibility version when creating SVN repositories. Defaults to newest version when commented out.
605 ## Available options are: pre-1.4-compatible, pre-1.5-compatible, pre-1.6-compatible, pre-1.8-compatible
606 #vcs.svn.compatible_version = pre-1.8-compatible
607
608
609 ############################################################
610 ### Subversion proxy support (mod_dav_svn) ###
611 ### Maps RhodeCode repo groups into SVN paths for Apache ###
612 ############################################################
613 ## Enable or disable the config file generation.
614 svn.proxy.generate_config = false
615 ## Generate config file with `SVNListParentPath` set to `On`.
616 svn.proxy.list_parent_path = true
617 ## Set location and file name of generated config file.
618 svn.proxy.config_file_path = %(here)s/mod_dav_svn.conf
619 ## File system path to the directory containing the repositories served by
620 ## RhodeCode.
621 svn.proxy.parent_path_root = /path/to/repo_store
622 ## Used as a prefix to the <Location> block in the generated config file. In
623 ## most cases it should be set to `/`.
624 svn.proxy.location_root = /
625
626
627 ################################
628 ### LOGGING CONFIGURATION ####
629 ################################
630 [loggers]
631 keys = root, routes, rhodecode, sqlalchemy, beaker, pyro4, templates
632
633 [handlers]
634 keys = console, console_sql
635
636 [formatters]
637 keys = generic, color_formatter, color_formatter_sql
638
639 #############
640 ## LOGGERS ##
641 #############
642 [logger_root]
643 level = NOTSET
644 handlers = console
645
646 [logger_routes]
647 level = DEBUG
648 handlers =
649 qualname = routes.middleware
650 ## "level = DEBUG" logs the route matched and routing variables.
651 propagate = 1
652
653 [logger_beaker]
654 level = DEBUG
655 handlers =
656 qualname = beaker.container
657 propagate = 1
658
659 [logger_pyro4]
660 level = DEBUG
661 handlers =
662 qualname = Pyro4
663 propagate = 1
664
665 [logger_templates]
666 level = INFO
667 handlers =
668 qualname = pylons.templating
669 propagate = 1
670
671 [logger_rhodecode]
672 level = DEBUG
673 handlers =
674 qualname = rhodecode
675 propagate = 1
676
677 [logger_sqlalchemy]
678 level = ERROR
679 handlers = console_sql
680 qualname = sqlalchemy.engine
681 propagate = 0
682
683 ##############
684 ## HANDLERS ##
685 ##############
686
687 [handler_console]
688 class = StreamHandler
689 args = (sys.stderr,)
690 level = DEBUG
691 formatter = generic
692
693 [handler_console_sql]
694 class = StreamHandler
695 args = (sys.stderr,)
696 level = WARN
697 formatter = generic
698
699 ################
700 ## FORMATTERS ##
701 ################
702
703 [formatter_generic]
704 class = rhodecode.lib.logging_formatter.Pyro4AwareFormatter
705 format = %(asctime)s.%(msecs)03d %(levelname)-5.5s [%(name)s] %(message)s
706 datefmt = %Y-%m-%d %H:%M:%S
707
708 [formatter_color_formatter]
709 class = rhodecode.lib.logging_formatter.ColorFormatter
710 format = %(asctime)s.%(msecs)03d %(levelname)-5.5s [%(name)s] %(message)s
711 datefmt = %Y-%m-%d %H:%M:%S
712
713 [formatter_color_formatter_sql]
714 class = rhodecode.lib.logging_formatter.ColorFormatterSql
715 format = %(asctime)s.%(msecs)03d %(levelname)-5.5s [%(name)s] %(message)s
716 datefmt = %Y-%m-%d %H:%M:%S