release: Merge default into stable for release preparation
Author: super-admin
Revision: r4770:a5598123 (merge into stable)

The requested changes are too big and the content below was truncated.

@@ -0,0 +1,58 @@
+|RCE| 4.27.0 |RNS|
+------------------
+
+Release Date
+^^^^^^^^^^^^
+
+- 2022-09-01
+
+
+New Features
+^^^^^^^^^^^^
+
+
+- Git: allow overriding the default master branch with an env variable.
+- VCS settings: when setting the landing ref from the UI, switch the default Git branch too.
+  This allows having non-default branches set as the default.
+- Users: enable full edit mode for super-admins.
+  Super-admins are now allowed to edit even LDAP/OAuth user attributes.
+  This lets users change their names via external auth for rename cases.
+
+
+General
+^^^^^^^
+
+- Commits/summary: unify fetching of the remote attribute on the summary and commits pages so both show the data consistently.
+- Caches: updated logging and some timings.
+- Caches: allow regional per-repo caches, and invalidate caches via a remote call.
+- Caches: invalidate the cache on the remote side from repo settings alongside the local cache.
+- Git: allow setting a symbolic ref for repos without a master branch.
+- Sys-info: added helpers for faster loading of sys-info data.
+- Debugging: the debug = true flag now prints the associated exception and logs right on the error page.
+
+
+Security
+^^^^^^^^
+
+- Healthcheck: added authentication because DB information is exposed.
+- Security: fixed self-XSS on the archive download page.
+
+
+Performance
+^^^^^^^^^^^
+
+
+
+Fixes
+^^^^^
+
+- Drafts: show TODO drafts properly in edge cases.
+- Files: fixed unicode problems in the specially encoded paths handler.
+- Caches: fixed an issue with an exception when handling non-ASCII cache keys.
+- Git: handle the archive generation crash case when submodules are present.
+
+
+Upgrade notes
+^^^^^^^^^^^^^
+
+- Scheduled release 4.27.0.
@@ -1,5 +1,5 @@
 [bumpversion]
-current_version = 4.26.0
+current_version = 4.27.0
 message = release: Bump version {current_version} to {new_version}
 
 [bumpversion:file:rhodecode/VERSION]
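The `[bumpversion]` block above drives the version bump: bumpversion rewrites `current_version` and commits with the templated `message`. A minimal sketch of that message substitution (the helper name is ours; the real tool also rewrites the version strings inside the files listed in `[bumpversion:file:...]` sections):

```python
# Sketch of how bumpversion fills its commit-message template.
# Only the {current_version}/{new_version} substitution is shown;
# the actual tool also bumps version strings inside tracked files.

MESSAGE_TEMPLATE = "release: Bump version {current_version} to {new_version}"

def render_commit_message(template: str, current: str, new: str) -> str:
    """Fill the bumpversion message template with both version strings."""
    return template.format(current_version=current, new_version=new)

print(render_commit_message(MESSAGE_TEMPLATE, "4.26.0", "4.27.0"))
# release: Bump version 4.26.0 to 4.27.0
```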
@@ -1,34 +1,28 @@
 [DEFAULT]
 done = false
 
 [task:bump_version]
 done = true
 
 [task:rc_tools_pinned]
-done = true
 
 [task:fixes_on_stable]
-done = true
 
 [task:pip2nix_generated]
-done = true
 
 [task:changelog_updated]
-done = true
 
 [task:generate_api_docs]
-done = true
 
 [task:updated_translation]
-done = true
 
 [release]
-state = prepared
-version = 4.26.0
+state = in_progress
+version = 4.27.0
 
 [task:generate_js_routes]
 
 [task:updated_trial_license]
 
 [task:generate_oss_licenses]
 
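Note that this file declares `done = false` under `[DEFAULT]`, so deleting a task's `done = true` line (as this change does for several tasks) is enough to re-open it: `configparser` falls back to the DEFAULT value. A small sketch with a trimmed copy of the file (the `pending_tasks` helper name is our own):

```python
# Demonstrates the [DEFAULT] fallback used by the release-task file:
# a task section with no explicit 'done' key inherits done = false.
import configparser

RELEASE_TASKS = """\
[DEFAULT]
done = false

[task:bump_version]
done = true

[task:rc_tools_pinned]

[release]
state = in_progress
version = 4.27.0
"""

def pending_tasks(text: str) -> list:
    """Return task sections whose 'done' flag resolves to false."""
    cfg = configparser.ConfigParser()
    cfg.read_string(text)
    return [section for section in cfg.sections()
            if section.startswith("task:")
            and not cfg.getboolean(section, "done")]

print(pending_tasks(RELEASE_TASKS))
# ['task:rc_tools_pinned']
```

Emptying a section rather than deleting it keeps the checklist structure intact while marking the task as not done.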
@@ -1,172 +1,172 @@
 .. _system-overview-ref:
 
 System Overview
 ===============
 
 Latest Version
 --------------
 
 * |release| on Unix and Windows systems.
 
 System Architecture
 -------------------
 
 The following diagram shows a typical production architecture.
 
 .. image:: ../images/architecture-diagram.png
    :align: center
 
 Supported Operating Systems
 ---------------------------
 
 Linux
 ^^^^^
 
 * Ubuntu 14.04+
 * CentOS 6.2, 7 and 8
 * RHEL 6.2, 7 and 8
 * Debian 7.8
 * RedHat Fedora
 * Arch Linux
 * SUSE Linux
 
 Windows
 ^^^^^^^
 
 * Windows Vista Ultimate 64bit
 * Windows 7 Ultimate 64bit
 * Windows 8 Professional 64bit
 * Windows 8.1 Enterprise 64bit
 * Windows Server 2008 64bit
 * Windows Server 2008-R2 64bit
 * Windows Server 2012 64bit
 
 Supported Databases
 -------------------
 
 * SQLite
 * MySQL
 * MariaDB
 * PostgreSQL
 
 Supported Browsers
 ------------------
 
 * Chrome
 * Safari
 * Firefox
 * Internet Explorer 10 & 11
 
 System Requirements
 -------------------
 
 |RCE| performs best on machines with ultra-fast hard disks. Generally disk
 performance is more important than CPU performance. In a corporate production
 environment handling 1000s of users and |repos| you should deploy on a 12+
 core 64GB RAM server. In short, the more RAM the better.
 
 
 For example:
 
 - for team of 1 - 5 active users you can run on 1GB RAM machine with 1CPU
 - above 250 active users, |RCE| needs at least 8GB of memory.
   Number of CPUs is less important, but recommended to have at least 2-3 CPUs
 
 
 .. _config-rce-files:
 
 Configuration Files
 -------------------
 
 * :file:`/home/{user}/.rccontrol/{instance-id}/rhodecode.ini`
 * :file:`/home/{user}/.rccontrol/{instance-id}/search_mapping.ini`
 * :file:`/home/{user}/.rccontrol/{vcsserver-id}/vcsserver.ini`
 * :file:`/home/{user}/.rccontrol/supervisor/supervisord.ini`
 * :file:`/home/{user}/.rccontrol.ini`
 * :file:`/home/{user}/.rhoderc`
 * :file:`/home/{user}/.rccontrol/cache/MANIFEST`
 
 For more information, see the :ref:`config-files` section.
 
 Log Files
 ---------
 
 * :file:`/home/{user}/.rccontrol/{instance-id}/enterprise.log`
 * :file:`/home/{user}/.rccontrol/{vcsserver-id}/vcsserver.log`
 * :file:`/home/{user}/.rccontrol/supervisor/supervisord.log`
 * :file:`/tmp/rccontrol.log`
 * :file:`/tmp/rhodecode_tools.log`
 
 Storage Files
 -------------
 
 * :file:`/home/{user}/.rccontrol/{instance-id}/data/index/{index-file.toc}`
 * :file:`/home/{user}/repos/.rc_gist_store`
 * :file:`/home/{user}/.rccontrol/{instance-id}/rhodecode.db`
 * :file:`/opt/rhodecode/store/{unique-hash}`
 
 Default Repositories Location
 -----------------------------
 
 * :file:`/home/{user}/repos`
 
 Connection Methods
 ------------------
 
 * HTTPS
 * SSH
 * |RCE| API
 
 Internationalization Support
 ----------------------------
 
 Currently available in the following languages, see `Transifex`_ for the
 latest details. If you want a new language added, please contact us. To
 configure your language settings, see the :ref:`set-lang` section.
 
 .. hlist::
 
    * Belorussian
    * Chinese
    * French
    * German
    * Italian
    * Japanese
    * Portuguese
    * Polish
    * Russian
    * Spanish
 
 Licencing Information
 ---------------------
 
 * See licencing information `here`_
 
 Peer-to-peer Failover Support
 -----------------------------
 
 * Yes
 
 Additional Binaries
 -------------------
 
 * Yes, see :ref:`rhodecode-nix-ref` for full details.
 
 Remote Connectivity
 -------------------
 
 * Available
 
 Executable Files
 ----------------
 
 Windows: :file:`RhodeCode-installer-{version}.exe`
 
 Deprecated Support
 ------------------
 
 - Internet Explorer 8 support deprecated since version 3.7.0.
 - Internet Explorer 9 support deprecated since version 3.8.0.
 
 .. _here: https://rhodecode.com/licenses/
-.. _Transifex: https://www.transifex.com/projects/p/RhodeCode/
+.. _Transifex: https://explore.transifex.com/rhodecode/RhodeCode/
@@ -1,157 +1,158 @@
 .. _rhodecode-release-notes-ref:
 
 Release Notes
 =============
 
 |RCE| 4.x Versions
 ------------------
 
 .. toctree::
    :maxdepth: 1
 
+   release-notes-4.27.0.rst
    release-notes-4.26.0.rst
    release-notes-4.25.2.rst
    release-notes-4.25.1.rst
    release-notes-4.25.0.rst
    release-notes-4.24.1.rst
    release-notes-4.24.0.rst
    release-notes-4.23.2.rst
    release-notes-4.23.1.rst
    release-notes-4.23.0.rst
    release-notes-4.22.0.rst
    release-notes-4.21.0.rst
    release-notes-4.20.1.rst
    release-notes-4.20.0.rst
    release-notes-4.19.3.rst
    release-notes-4.19.2.rst
    release-notes-4.19.1.rst
    release-notes-4.19.0.rst
    release-notes-4.18.3.rst
    release-notes-4.18.2.rst
    release-notes-4.18.1.rst
    release-notes-4.18.0.rst
    release-notes-4.17.4.rst
    release-notes-4.17.3.rst
    release-notes-4.17.2.rst
    release-notes-4.17.1.rst
    release-notes-4.17.0.rst
    release-notes-4.16.2.rst
    release-notes-4.16.1.rst
    release-notes-4.16.0.rst
    release-notes-4.15.2.rst
    release-notes-4.15.1.rst
    release-notes-4.15.0.rst
    release-notes-4.14.1.rst
    release-notes-4.14.0.rst
    release-notes-4.13.3.rst
    release-notes-4.13.2.rst
    release-notes-4.13.1.rst
    release-notes-4.13.0.rst
    release-notes-4.12.4.rst
    release-notes-4.12.3.rst
    release-notes-4.12.2.rst
    release-notes-4.12.1.rst
    release-notes-4.12.0.rst
    release-notes-4.11.6.rst
    release-notes-4.11.5.rst
    release-notes-4.11.4.rst
    release-notes-4.11.3.rst
    release-notes-4.11.2.rst
    release-notes-4.11.1.rst
    release-notes-4.11.0.rst
    release-notes-4.10.6.rst
    release-notes-4.10.5.rst
    release-notes-4.10.4.rst
    release-notes-4.10.3.rst
    release-notes-4.10.2.rst
    release-notes-4.10.1.rst
    release-notes-4.10.0.rst
    release-notes-4.9.1.rst
    release-notes-4.9.0.rst
    release-notes-4.8.0.rst
    release-notes-4.7.2.rst
    release-notes-4.7.1.rst
    release-notes-4.7.0.rst
    release-notes-4.6.1.rst
    release-notes-4.6.0.rst
    release-notes-4.5.2.rst
    release-notes-4.5.1.rst
    release-notes-4.5.0.rst
    release-notes-4.4.2.rst
    release-notes-4.4.1.rst
    release-notes-4.4.0.rst
    release-notes-4.3.1.rst
    release-notes-4.3.0.rst
    release-notes-4.2.1.rst
    release-notes-4.2.0.rst
    release-notes-4.1.2.rst
    release-notes-4.1.1.rst
    release-notes-4.1.0.rst
    release-notes-4.0.1.rst
    release-notes-4.0.0.rst
 
 |RCE| 3.x Versions
 ------------------
 
 .. toctree::
    :maxdepth: 1
 
    release-notes-3.8.4.rst
    release-notes-3.8.3.rst
    release-notes-3.8.2.rst
    release-notes-3.8.1.rst
    release-notes-3.8.0.rst
    release-notes-3.7.1.rst
    release-notes-3.7.0.rst
    release-notes-3.6.1.rst
    release-notes-3.6.0.rst
    release-notes-3.5.2.rst
    release-notes-3.5.1.rst
    release-notes-3.5.0.rst
    release-notes-3.4.1.rst
    release-notes-3.4.0.rst
    release-notes-3.3.4.rst
    release-notes-3.3.3.rst
    release-notes-3.3.2.rst
    release-notes-3.3.1.rst
    release-notes-3.3.0.rst
    release-notes-3.2.3.rst
    release-notes-3.2.2.rst
    release-notes-3.2.1.rst
    release-notes-3.2.0.rst
    release-notes-3.1.1.rst
    release-notes-3.1.0.rst
    release-notes-3.0.2.rst
    release-notes-3.0.1.rst
    release-notes-3.0.0.rst
 
 |RCE| 2.x Versions
 ------------------
 
 .. toctree::
    :maxdepth: 1
 
    release-notes-2.2.8.rst
    release-notes-2.2.7.rst
    release-notes-2.2.6.rst
    release-notes-2.2.5.rst
    release-notes-2.2.4.rst
    release-notes-2.2.3.rst
    release-notes-2.2.2.rst
    release-notes-2.2.1.rst
    release-notes-2.2.0.rst
    release-notes-2.1.0.rst
    release-notes-2.0.2.rst
    release-notes-2.0.1.rst
    release-notes-2.0.0.rst
 
 |RCE| 1.x Versions
 ------------------
 
 .. toctree::
    :maxdepth: 1
 
    release-notes-1.7.2.rst
    release-notes-1.7.1.rst
    release-notes-1.7.0.rst
    release-notes-1.6.0.rst
@@ -1,12 +1,12 @@
-diff -rup rhodecode-tools-1.4.0-orig/setup.py rhodecode-tools-1.4.0/setup.py
---- rhodecode-tools-1.4.0/setup-orig.py 2021-03-11 12:34:45.000000000 +0100
-+++ rhodecode-tools-1.4.0/setup.py 2021-03-11 12:34:56.000000000 +0100
+diff -rup rhodecode-tools-1.4.1-orig/setup.py rhodecode-tools-1.4.1/setup.py
+--- rhodecode-tools-1.4.1/setup-orig.py 2021-03-11 12:34:45.000000000 +0100
++++ rhodecode-tools-1.4.1/setup.py 2021-03-11 12:34:56.000000000 +0100
 @@ -69,7 +69,7 @@ def _get_requirements(req_filename, excl
 
 
 # requirements extract
 -setup_requirements = ['pytest-runner']
 +setup_requirements = ['pytest-runner==5.1.0']
 install_requirements = _get_requirements(
     'requirements.txt', exclude=['setuptools'])
-test_requirements = _get_requirements('requirements_test.txt')
\ No newline at end of file
+test_requirements = _get_requirements('requirements_test.txt')
@@ -1,2520 +1,2520 @@
 # Generated by pip2nix 0.8.0.dev1
 # See https://github.com/johbo/pip2nix
 
 { pkgs, fetchurl, fetchgit, fetchhg }:
 
 self: super: {
   "alembic" = super.buildPythonPackage {
     name = "alembic-1.4.2";
     doCheck = false;
     propagatedBuildInputs = [
       self."sqlalchemy"
       self."mako"
       self."python-editor"
       self."python-dateutil"
     ];
     src = fetchurl {
       url = "https://files.pythonhosted.org/packages/60/1e/cabc75a189de0fbb2841d0975243e59bde8b7822bacbb95008ac6fe9ad47/alembic-1.4.2.tar.gz";
       sha256 = "1gsdrzx9h7wfva200qvvsc9sn4w79mk2vs0bbnzjhxi1jw2b0nh3";
     };
     meta = {
       license = [ pkgs.lib.licenses.mit ];
     };
   };
   "amqp" = super.buildPythonPackage {
     name = "amqp-2.5.2";
     doCheck = false;
     propagatedBuildInputs = [
       self."vine"
     ];
     src = fetchurl {
       url = "https://files.pythonhosted.org/packages/92/1d/433541994a5a69f4ad2fff39746ddbb0bdedb0ea0d85673eb0db68a7edd9/amqp-2.5.2.tar.gz";
       sha256 = "13dhhfxjrqcjybnq4zahg92mydhpg2l76nxcmq7d560687wsxwbp";
     };
     meta = {
       license = [ pkgs.lib.licenses.bsdOriginal ];
     };
   };
   "apispec" = super.buildPythonPackage {
     name = "apispec-1.0.0";
     doCheck = false;
     propagatedBuildInputs = [
       self."PyYAML"
     ];
     src = fetchurl {
       url = "https://files.pythonhosted.org/packages/67/15/346c04988dd67d36007e28145504c520491930c878b1f484a97b27a8f497/apispec-1.0.0.tar.gz";
       sha256 = "1712w1anvqrvadjjpvai84vbaygaxabd3zz5lxihdzwzs4gvi9sp";
     };
     meta = {
       license = [ pkgs.lib.licenses.mit ];
     };
   };
   "appenlight-client" = super.buildPythonPackage {
     name = "appenlight-client-0.6.26";
     doCheck = false;
     propagatedBuildInputs = [
       self."webob"
       self."requests"
       self."six"
     ];
     src = fetchurl {
       url = "https://files.pythonhosted.org/packages/2e/56/418fc10379b96e795ee39a15e69a730c222818af04c3821fa354eaa859ec/appenlight_client-0.6.26.tar.gz";
       sha256 = "0s9xw3sb8s3pk73k78nnq4jil3q4mk6bczfa1fmgfx61kdxl2712";
     };
     meta = {
       license = [ pkgs.lib.licenses.bsdOriginal ];
     };
   };
   "asn1crypto" = super.buildPythonPackage {
     name = "asn1crypto-0.24.0";
     doCheck = false;
     src = fetchurl {
       url = "https://files.pythonhosted.org/packages/fc/f1/8db7daa71f414ddabfa056c4ef792e1461ff655c2ae2928a2b675bfed6b4/asn1crypto-0.24.0.tar.gz";
       sha256 = "0jaf8rf9dx1lf23xfv2cdd5h52f1qr3w8k63985bc35g3d220p4x";
     };
     meta = {
       license = [ pkgs.lib.licenses.mit ];
     };
   };
   "atomicwrites" = super.buildPythonPackage {
     name = "atomicwrites-1.3.0";
     doCheck = false;
     src = fetchurl {
       url = "https://files.pythonhosted.org/packages/ec/0f/cd484ac8820fed363b374af30049adc8fd13065720fd4f4c6be8a2309da7/atomicwrites-1.3.0.tar.gz";
       sha256 = "19ngcscdf3jsqmpcxn6zl5b6anmsajb6izp1smcd1n02midl9abm";
     };
     meta = {
       license = [ pkgs.lib.licenses.mit ];
     };
   };
   "attrs" = super.buildPythonPackage {
     name = "attrs-19.3.0";
     doCheck = false;
     src = fetchurl {
       url = "https://files.pythonhosted.org/packages/98/c3/2c227e66b5e896e15ccdae2e00bbc69aa46e9a8ce8869cc5fa96310bf612/attrs-19.3.0.tar.gz";
       sha256 = "0wky4h28n7xnr6xv69p9z6kv8bzn50d10c3drmd9ds8gawbcxdzp";
     };
     meta = {
       license = [ pkgs.lib.licenses.mit ];
     };
   };
   "babel" = super.buildPythonPackage {
     name = "babel-1.3";
     doCheck = false;
     propagatedBuildInputs = [
       self."pytz"
     ];
     src = fetchurl {
       url = "https://files.pythonhosted.org/packages/33/27/e3978243a03a76398c384c83f7ca879bc6e8f1511233a621fcada135606e/Babel-1.3.tar.gz";
       sha256 = "0bnin777lc53nxd1hp3apq410jj5wx92n08h7h4izpl4f4sx00lz";
     };
     meta = {
       license = [ pkgs.lib.licenses.bsdOriginal ];
     };
   };
   "backports.shutil-get-terminal-size" = super.buildPythonPackage {
     name = "backports.shutil-get-terminal-size-1.0.0";
     doCheck = false;
     src = fetchurl {
       url = "https://files.pythonhosted.org/packages/ec/9c/368086faa9c016efce5da3e0e13ba392c9db79e3ab740b763fe28620b18b/backports.shutil_get_terminal_size-1.0.0.tar.gz";
       sha256 = "107cmn7g3jnbkp826zlj8rrj19fam301qvaqf0f3905f5217lgki";
     };
     meta = {
       license = [ pkgs.lib.licenses.mit ];
     };
   };
   "beaker" = super.buildPythonPackage {
     name = "beaker-1.9.1";
     doCheck = false;
     propagatedBuildInputs = [
       self."funcsigs"
     ];
     src = fetchurl {
       url = "https://files.pythonhosted.org/packages/ca/14/a626188d0d0c7b55dd7cf1902046c2743bd392a7078bb53073e13280eb1e/Beaker-1.9.1.tar.gz";
       sha256 = "08arsn61r255lhz6hcpn2lsiqpg30clla805ysx06wmbhvb6w9rj";
     };
     meta = {
       license = [ pkgs.lib.licenses.bsdOriginal ];
     };
   };
   "beautifulsoup4" = super.buildPythonPackage {
     name = "beautifulsoup4-4.6.3";
     doCheck = false;
     src = fetchurl {
       url = "https://files.pythonhosted.org/packages/88/df/86bffad6309f74f3ff85ea69344a078fc30003270c8df6894fca7a3c72ff/beautifulsoup4-4.6.3.tar.gz";
       sha256 = "041dhalzjciw6qyzzq7a2k4h1yvyk76xigp35hv5ibnn448ydy4h";
     };
     meta = {
       license = [ pkgs.lib.licenses.mit ];
     };
   };
   "billiard" = super.buildPythonPackage {
     name = "billiard-3.6.1.0";
     doCheck = false;
     src = fetchurl {
       url = "https://files.pythonhosted.org/packages/68/1d/2aea8fbb0b1e1260a8a2e77352de2983d36d7ac01207cf14c2b9c6cc860e/billiard-3.6.1.0.tar.gz";
       sha256 = "09hzy3aqi7visy4vmf4xiish61n0rq5nd3iwjydydps8yrs9r05q";
     };
     meta = {
       license = [ pkgs.lib.licenses.bsdOriginal ];
     };
   };
   "bleach" = super.buildPythonPackage {
     name = "bleach-3.1.3";
     doCheck = false;
     propagatedBuildInputs = [
       self."six"
       self."webencodings"
     ];
     src = fetchurl {
       url = "https://files.pythonhosted.org/packages/de/09/5267f8577a92487ed43bc694476c4629c6eca2e3c93fcf690a26bfe39e1d/bleach-3.1.3.tar.gz";
       sha256 = "0al437aw4p2xp83az5hhlrp913nsf0cg6kg4qj3fjhv4wakxipzq";
     };
     meta = {
       license = [ pkgs.lib.licenses.asl20 ];
     };
   };
   "bumpversion" = super.buildPythonPackage {
     name = "bumpversion-0.5.3";
     doCheck = false;
     src = fetchurl {
       url = "https://files.pythonhosted.org/packages/14/41/8c9da3549f8e00c84f0432c3a8cf8ed6898374714676aab91501d48760db/bumpversion-0.5.3.tar.gz";
       sha256 = "0zn7694yfipxg35ikkfh7kvgl2fissha3dnqad2c5bvsvmrwhi37";
     };
     meta = {
       license = [ pkgs.lib.licenses.mit ];
     };
   };
   "cachetools" = super.buildPythonPackage {
     name = "cachetools-3.1.1";
     doCheck = false;
     src = fetchurl {
       url = "https://files.pythonhosted.org/packages/ae/37/7fd45996b19200e0cb2027a0b6bef4636951c4ea111bfad36c71287247f6/cachetools-3.1.1.tar.gz";
       sha256 = "16m69l6n6y1r1y7cklm92rr7v69ldig2n3lbl3j323w5jz7d78lf";
     };
     meta = {
       license = [ pkgs.lib.licenses.mit ];
197 };
197 };
198 };
198 };
199 "celery" = super.buildPythonPackage {
199 "celery" = super.buildPythonPackage {
200 name = "celery-4.3.0";
200 name = "celery-4.3.0";
201 doCheck = false;
201 doCheck = false;
202 propagatedBuildInputs = [
202 propagatedBuildInputs = [
203 self."pytz"
203 self."pytz"
204 self."billiard"
204 self."billiard"
205 self."kombu"
205 self."kombu"
206 self."vine"
206 self."vine"
207 ];
207 ];
208 src = fetchurl {
208 src = fetchurl {
209 url = "https://files.pythonhosted.org/packages/a2/4b/d020836f751617e907e84753a41c92231cd4b673ff991b8ee9da52361323/celery-4.3.0.tar.gz";
209 url = "https://files.pythonhosted.org/packages/a2/4b/d020836f751617e907e84753a41c92231cd4b673ff991b8ee9da52361323/celery-4.3.0.tar.gz";
210 sha256 = "1y8y0gbgkwimpxqnxq2rm5qz2vy01fvjiybnpm00y5rzd2m34iac";
210 sha256 = "1y8y0gbgkwimpxqnxq2rm5qz2vy01fvjiybnpm00y5rzd2m34iac";
211 };
211 };
212 meta = {
212 meta = {
213 license = [ pkgs.lib.licenses.bsdOriginal ];
213 license = [ pkgs.lib.licenses.bsdOriginal ];
214 };
214 };
215 };
215 };
216 "certifi" = super.buildPythonPackage {
216 "certifi" = super.buildPythonPackage {
217 name = "certifi-2020.4.5.1";
217 name = "certifi-2020.4.5.1";
218 doCheck = false;
218 doCheck = false;
219 src = fetchurl {
219 src = fetchurl {
220 url = "https://files.pythonhosted.org/packages/b8/e2/a3a86a67c3fc8249ed305fc7b7d290ebe5e4d46ad45573884761ef4dea7b/certifi-2020.4.5.1.tar.gz";
220 url = "https://files.pythonhosted.org/packages/b8/e2/a3a86a67c3fc8249ed305fc7b7d290ebe5e4d46ad45573884761ef4dea7b/certifi-2020.4.5.1.tar.gz";
221 sha256 = "06b5gfs7wmmipln8f3z928d2mmx2j4b3x7pnqmj6cvmyfh8v7z2i";
221 sha256 = "06b5gfs7wmmipln8f3z928d2mmx2j4b3x7pnqmj6cvmyfh8v7z2i";
222 };
222 };
223 meta = {
223 meta = {
224 license = [ pkgs.lib.licenses.mpl20 { fullName = "Mozilla Public License 2.0 (MPL 2.0)"; } ];
224 license = [ pkgs.lib.licenses.mpl20 { fullName = "Mozilla Public License 2.0 (MPL 2.0)"; } ];
225 };
225 };
226 };
226 };
227 "cffi" = super.buildPythonPackage {
227 "cffi" = super.buildPythonPackage {
228 name = "cffi-1.12.3";
228 name = "cffi-1.12.3";
229 doCheck = false;
229 doCheck = false;
230 propagatedBuildInputs = [
230 propagatedBuildInputs = [
231 self."pycparser"
231 self."pycparser"
232 ];
232 ];
233 src = fetchurl {
233 src = fetchurl {
234 url = "https://files.pythonhosted.org/packages/93/1a/ab8c62b5838722f29f3daffcc8d4bd61844aa9b5f437341cc890ceee483b/cffi-1.12.3.tar.gz";
234 url = "https://files.pythonhosted.org/packages/93/1a/ab8c62b5838722f29f3daffcc8d4bd61844aa9b5f437341cc890ceee483b/cffi-1.12.3.tar.gz";
235 sha256 = "0x075521fxwv0mfp4cqzk7lvmw4n94bjw601qkcv314z5s182704";
235 sha256 = "0x075521fxwv0mfp4cqzk7lvmw4n94bjw601qkcv314z5s182704";
236 };
236 };
237 meta = {
237 meta = {
238 license = [ pkgs.lib.licenses.mit ];
238 license = [ pkgs.lib.licenses.mit ];
239 };
239 };
240 };
240 };
241 "chameleon" = super.buildPythonPackage {
241 "chameleon" = super.buildPythonPackage {
242 name = "chameleon-2.24";
242 name = "chameleon-2.24";
243 doCheck = false;
243 doCheck = false;
244 src = fetchurl {
244 src = fetchurl {
245 url = "https://files.pythonhosted.org/packages/5a/9e/637379ffa13c5172b5c0e704833ffea6bf51cec7567f93fd6e903d53ed74/Chameleon-2.24.tar.gz";
245 url = "https://files.pythonhosted.org/packages/5a/9e/637379ffa13c5172b5c0e704833ffea6bf51cec7567f93fd6e903d53ed74/Chameleon-2.24.tar.gz";
246 sha256 = "0ykqr7syxfa6h9adjfnsv1gdsca2xzm22vmic8859n0f0j09abj5";
246 sha256 = "0ykqr7syxfa6h9adjfnsv1gdsca2xzm22vmic8859n0f0j09abj5";
247 };
247 };
248 meta = {
248 meta = {
249 license = [ { fullName = "BSD-like (http://repoze.org/license.html)"; } ];
249 license = [ { fullName = "BSD-like (http://repoze.org/license.html)"; } ];
250 };
250 };
251 };
251 };
252 "channelstream" = super.buildPythonPackage {
252 "channelstream" = super.buildPythonPackage {
253 name = "channelstream-0.6.14";
253 name = "channelstream-0.6.14";
254 doCheck = false;
254 doCheck = false;
255 propagatedBuildInputs = [
255 propagatedBuildInputs = [
256 self."gevent"
256 self."gevent"
257 self."ws4py"
257 self."ws4py"
258 self."marshmallow"
258 self."marshmallow"
259 self."python-dateutil"
259 self."python-dateutil"
260 self."pyramid"
260 self."pyramid"
261 self."pyramid-jinja2"
261 self."pyramid-jinja2"
262 self."pyramid-apispec"
262 self."pyramid-apispec"
263 self."itsdangerous"
263 self."itsdangerous"
264 self."requests"
264 self."requests"
265 self."six"
265 self."six"
266 ];
266 ];
267 src = fetchurl {
267 src = fetchurl {
268 url = "https://files.pythonhosted.org/packages/d4/2d/86d6757ccd06ce673ee224123471da3d45251d061da7c580bfc259bad853/channelstream-0.6.14.tar.gz";
268 url = "https://files.pythonhosted.org/packages/d4/2d/86d6757ccd06ce673ee224123471da3d45251d061da7c580bfc259bad853/channelstream-0.6.14.tar.gz";
269 sha256 = "0qgy5j3rj6c8cslzidh32glhkrhbbdxjc008y69v8a0y3zyaz2d3";
269 sha256 = "0qgy5j3rj6c8cslzidh32glhkrhbbdxjc008y69v8a0y3zyaz2d3";
270 };
270 };
271 meta = {
271 meta = {
272 license = [ pkgs.lib.licenses.bsdOriginal ];
272 license = [ pkgs.lib.licenses.bsdOriginal ];
273 };
273 };
274 };
274 };
275 "chardet" = super.buildPythonPackage {
275 "chardet" = super.buildPythonPackage {
276 name = "chardet-3.0.4";
276 name = "chardet-3.0.4";
277 doCheck = false;
277 doCheck = false;
278 src = fetchurl {
278 src = fetchurl {
279 url = "https://files.pythonhosted.org/packages/fc/bb/a5768c230f9ddb03acc9ef3f0d4a3cf93462473795d18e9535498c8f929d/chardet-3.0.4.tar.gz";
279 url = "https://files.pythonhosted.org/packages/fc/bb/a5768c230f9ddb03acc9ef3f0d4a3cf93462473795d18e9535498c8f929d/chardet-3.0.4.tar.gz";
280 sha256 = "1bpalpia6r5x1kknbk11p1fzph56fmmnp405ds8icksd3knr5aw4";
280 sha256 = "1bpalpia6r5x1kknbk11p1fzph56fmmnp405ds8icksd3knr5aw4";
281 };
281 };
282 meta = {
282 meta = {
283 license = [ { fullName = "LGPL"; } { fullName = "GNU Library or Lesser General Public License (LGPL)"; } ];
283 license = [ { fullName = "LGPL"; } { fullName = "GNU Library or Lesser General Public License (LGPL)"; } ];
284 };
284 };
285 };
285 };
286 "click" = super.buildPythonPackage {
286 "click" = super.buildPythonPackage {
287 name = "click-7.0";
287 name = "click-7.0";
288 doCheck = false;
288 doCheck = false;
289 src = fetchurl {
289 src = fetchurl {
290 url = "https://files.pythonhosted.org/packages/f8/5c/f60e9d8a1e77005f664b76ff8aeaee5bc05d0a91798afd7f53fc998dbc47/Click-7.0.tar.gz";
290 url = "https://files.pythonhosted.org/packages/f8/5c/f60e9d8a1e77005f664b76ff8aeaee5bc05d0a91798afd7f53fc998dbc47/Click-7.0.tar.gz";
291 sha256 = "1mzjixd4vjbjvzb6vylki9w1556a9qmdh35kzmq6cign46av952v";
291 sha256 = "1mzjixd4vjbjvzb6vylki9w1556a9qmdh35kzmq6cign46av952v";
292 };
292 };
293 meta = {
293 meta = {
294 license = [ pkgs.lib.licenses.bsdOriginal ];
294 license = [ pkgs.lib.licenses.bsdOriginal ];
295 };
295 };
296 };
296 };
297 "colander" = super.buildPythonPackage {
297 "colander" = super.buildPythonPackage {
298 name = "colander-1.7.0";
298 name = "colander-1.7.0";
299 doCheck = false;
299 doCheck = false;
300 propagatedBuildInputs = [
300 propagatedBuildInputs = [
301 self."translationstring"
301 self."translationstring"
302 self."iso8601"
302 self."iso8601"
303 self."enum34"
303 self."enum34"
304 ];
304 ];
305 src = fetchurl {
305 src = fetchurl {
306 url = "https://files.pythonhosted.org/packages/db/e4/74ab06f54211917b41865cafc987ce511e35503de48da9bfe9358a1bdc3e/colander-1.7.0.tar.gz";
306 url = "https://files.pythonhosted.org/packages/db/e4/74ab06f54211917b41865cafc987ce511e35503de48da9bfe9358a1bdc3e/colander-1.7.0.tar.gz";
307 sha256 = "1wl1bqab307lbbcjx81i28s3yl6dlm4rf15fxawkjb6j48x1cn6p";
307 sha256 = "1wl1bqab307lbbcjx81i28s3yl6dlm4rf15fxawkjb6j48x1cn6p";
308 };
308 };
309 meta = {
309 meta = {
310 license = [ { fullName = "BSD-derived (http://www.repoze.org/LICENSE.txt)"; } ];
310 license = [ { fullName = "BSD-derived (http://www.repoze.org/LICENSE.txt)"; } ];
311 };
311 };
312 };
312 };
313 "configobj" = super.buildPythonPackage {
313 "configobj" = super.buildPythonPackage {
314 name = "configobj-5.0.6";
314 name = "configobj-5.0.6";
315 doCheck = false;
315 doCheck = false;
316 propagatedBuildInputs = [
316 propagatedBuildInputs = [
317 self."six"
317 self."six"
318 ];
318 ];
319 src = fetchurl {
319 src = fetchurl {
320 url = "https://code.rhodecode.com/upstream/configobj/artifacts/download/0-012de99a-b1e1-4f64-a5c0-07a98a41b324.tar.gz?md5=6a513f51fe04b2c18cf84c1395a7c626";
320 url = "https://code.rhodecode.com/upstream/configobj/artifacts/download/0-012de99a-b1e1-4f64-a5c0-07a98a41b324.tar.gz?md5=6a513f51fe04b2c18cf84c1395a7c626";
321 sha256 = "0kqfrdfr14mw8yd8qwq14dv2xghpkjmd3yjsy8dfcbvpcc17xnxp";
321 sha256 = "0kqfrdfr14mw8yd8qwq14dv2xghpkjmd3yjsy8dfcbvpcc17xnxp";
322 };
322 };
323 meta = {
323 meta = {
324 license = [ pkgs.lib.licenses.bsdOriginal ];
324 license = [ pkgs.lib.licenses.bsdOriginal ];
325 };
325 };
326 };
326 };
327 "configparser" = super.buildPythonPackage {
327 "configparser" = super.buildPythonPackage {
328 name = "configparser-4.0.2";
328 name = "configparser-4.0.2";
329 doCheck = false;
329 doCheck = false;
330 src = fetchurl {
330 src = fetchurl {
331 url = "https://files.pythonhosted.org/packages/16/4f/48975536bd488d3a272549eb795ac4a13a5f7fcdc8995def77fbef3532ee/configparser-4.0.2.tar.gz";
331 url = "https://files.pythonhosted.org/packages/16/4f/48975536bd488d3a272549eb795ac4a13a5f7fcdc8995def77fbef3532ee/configparser-4.0.2.tar.gz";
332 sha256 = "1priacxym85yjcf68hh38w55nqswaxp71ryjyfdk222kg9l85ln7";
332 sha256 = "1priacxym85yjcf68hh38w55nqswaxp71ryjyfdk222kg9l85ln7";
333 };
333 };
334 meta = {
334 meta = {
335 license = [ pkgs.lib.licenses.mit ];
335 license = [ pkgs.lib.licenses.mit ];
336 };
336 };
337 };
337 };
338 "contextlib2" = super.buildPythonPackage {
338 "contextlib2" = super.buildPythonPackage {
339 name = "contextlib2-0.6.0.post1";
339 name = "contextlib2-0.6.0.post1";
340 doCheck = false;
340 doCheck = false;
341 src = fetchurl {
341 src = fetchurl {
342 url = "https://files.pythonhosted.org/packages/02/54/669207eb72e3d8ae8b38aa1f0703ee87a0e9f88f30d3c0a47bebdb6de242/contextlib2-0.6.0.post1.tar.gz";
342 url = "https://files.pythonhosted.org/packages/02/54/669207eb72e3d8ae8b38aa1f0703ee87a0e9f88f30d3c0a47bebdb6de242/contextlib2-0.6.0.post1.tar.gz";
343 sha256 = "0bhnr2ac7wy5l85ji909gyljyk85n92w8pdvslmrvc8qih4r1x01";
343 sha256 = "0bhnr2ac7wy5l85ji909gyljyk85n92w8pdvslmrvc8qih4r1x01";
344 };
344 };
345 meta = {
345 meta = {
346 license = [ pkgs.lib.licenses.psfl ];
346 license = [ pkgs.lib.licenses.psfl ];
347 };
347 };
348 };
348 };
349 "cov-core" = super.buildPythonPackage {
349 "cov-core" = super.buildPythonPackage {
350 name = "cov-core-1.15.0";
350 name = "cov-core-1.15.0";
351 doCheck = false;
351 doCheck = false;
352 propagatedBuildInputs = [
352 propagatedBuildInputs = [
353 self."coverage"
353 self."coverage"
354 ];
354 ];
355 src = fetchurl {
355 src = fetchurl {
356 url = "https://files.pythonhosted.org/packages/4b/87/13e75a47b4ba1be06f29f6d807ca99638bedc6b57fa491cd3de891ca2923/cov-core-1.15.0.tar.gz";
356 url = "https://files.pythonhosted.org/packages/4b/87/13e75a47b4ba1be06f29f6d807ca99638bedc6b57fa491cd3de891ca2923/cov-core-1.15.0.tar.gz";
357 sha256 = "0k3np9ymh06yv1ib96sb6wfsxjkqhmik8qfsn119vnhga9ywc52a";
357 sha256 = "0k3np9ymh06yv1ib96sb6wfsxjkqhmik8qfsn119vnhga9ywc52a";
358 };
358 };
359 meta = {
359 meta = {
360 license = [ pkgs.lib.licenses.mit ];
360 license = [ pkgs.lib.licenses.mit ];
361 };
361 };
362 };
362 };
363 "coverage" = super.buildPythonPackage {
363 "coverage" = super.buildPythonPackage {
364 name = "coverage-4.5.4";
364 name = "coverage-4.5.4";
365 doCheck = false;
365 doCheck = false;
366 src = fetchurl {
366 src = fetchurl {
367 url = "https://files.pythonhosted.org/packages/85/d5/818d0e603685c4a613d56f065a721013e942088047ff1027a632948bdae6/coverage-4.5.4.tar.gz";
367 url = "https://files.pythonhosted.org/packages/85/d5/818d0e603685c4a613d56f065a721013e942088047ff1027a632948bdae6/coverage-4.5.4.tar.gz";
368 sha256 = "0p0j4di6h8k6ica7jwwj09azdcg4ycxq60i9qsskmsg94cd9yzg0";
368 sha256 = "0p0j4di6h8k6ica7jwwj09azdcg4ycxq60i9qsskmsg94cd9yzg0";
369 };
369 };
370 meta = {
370 meta = {
371 license = [ pkgs.lib.licenses.asl20 ];
371 license = [ pkgs.lib.licenses.asl20 ];
372 };
372 };
373 };
373 };
374 "cryptography" = super.buildPythonPackage {
374 "cryptography" = super.buildPythonPackage {
375 name = "cryptography-2.6.1";
375 name = "cryptography-2.6.1";
376 doCheck = false;
376 doCheck = false;
377 propagatedBuildInputs = [
377 propagatedBuildInputs = [
378 self."asn1crypto"
378 self."asn1crypto"
379 self."six"
379 self."six"
380 self."cffi"
380 self."cffi"
381 self."enum34"
381 self."enum34"
382 self."ipaddress"
382 self."ipaddress"
383 ];
383 ];
384 src = fetchurl {
384 src = fetchurl {
385 url = "https://files.pythonhosted.org/packages/07/ca/bc827c5e55918ad223d59d299fff92f3563476c3b00d0a9157d9c0217449/cryptography-2.6.1.tar.gz";
385 url = "https://files.pythonhosted.org/packages/07/ca/bc827c5e55918ad223d59d299fff92f3563476c3b00d0a9157d9c0217449/cryptography-2.6.1.tar.gz";
386 sha256 = "19iwz5avym5zl6jrrrkym1rdaa9h61j20ph4cswsqgv8xg5j3j16";
386 sha256 = "19iwz5avym5zl6jrrrkym1rdaa9h61j20ph4cswsqgv8xg5j3j16";
387 };
387 };
388 meta = {
388 meta = {
389 license = [ pkgs.lib.licenses.bsdOriginal { fullName = "BSD or Apache License, Version 2.0"; } pkgs.lib.licenses.asl20 ];
389 license = [ pkgs.lib.licenses.bsdOriginal { fullName = "BSD or Apache License, Version 2.0"; } pkgs.lib.licenses.asl20 ];
390 };
390 };
391 };
391 };
392 "cssselect" = super.buildPythonPackage {
392 "cssselect" = super.buildPythonPackage {
393 name = "cssselect-1.0.3";
393 name = "cssselect-1.0.3";
394 doCheck = false;
394 doCheck = false;
395 src = fetchurl {
395 src = fetchurl {
396 url = "https://files.pythonhosted.org/packages/52/ea/f31e1d2e9eb130fda2a631e22eac369dc644e8807345fbed5113f2d6f92b/cssselect-1.0.3.tar.gz";
396 url = "https://files.pythonhosted.org/packages/52/ea/f31e1d2e9eb130fda2a631e22eac369dc644e8807345fbed5113f2d6f92b/cssselect-1.0.3.tar.gz";
397 sha256 = "011jqa2jhmydhi0iz4v1w3cr540z5zas8g2bw8brdw4s4b2qnv86";
397 sha256 = "011jqa2jhmydhi0iz4v1w3cr540z5zas8g2bw8brdw4s4b2qnv86";
398 };
398 };
399 meta = {
399 meta = {
400 license = [ pkgs.lib.licenses.bsdOriginal ];
400 license = [ pkgs.lib.licenses.bsdOriginal ];
401 };
401 };
402 };
402 };
403 "cssutils" = super.buildPythonPackage {
403 "cssutils" = super.buildPythonPackage {
404 name = "cssutils-1.0.2";
404 name = "cssutils-1.0.2";
405 doCheck = false;
405 doCheck = false;
406 src = fetchurl {
406 src = fetchurl {
407 url = "https://files.pythonhosted.org/packages/5c/0b/c5f29d29c037e97043770b5e7c740b6252993e4b57f029b3cd03c78ddfec/cssutils-1.0.2.tar.gz";
407 url = "https://files.pythonhosted.org/packages/5c/0b/c5f29d29c037e97043770b5e7c740b6252993e4b57f029b3cd03c78ddfec/cssutils-1.0.2.tar.gz";
408 sha256 = "1bxchrbqzapwijap0yhlxdil1w9bmwvgx77aizlkhc2mcxjg1z52";
408 sha256 = "1bxchrbqzapwijap0yhlxdil1w9bmwvgx77aizlkhc2mcxjg1z52";
409 };
409 };
410 meta = {
410 meta = {
411 license = [ { fullName = "GNU Library or Lesser General Public License (LGPL)"; } { fullName = "LGPL 2.1 or later, see also http://cthedot.de/cssutils/"; } ];
411 license = [ { fullName = "GNU Library or Lesser General Public License (LGPL)"; } { fullName = "LGPL 2.1 or later, see also http://cthedot.de/cssutils/"; } ];
412 };
412 };
413 };
413 };
414 "decorator" = super.buildPythonPackage {
414 "decorator" = super.buildPythonPackage {
415 name = "decorator-4.1.2";
415 name = "decorator-4.1.2";
416 doCheck = false;
416 doCheck = false;
417 src = fetchurl {
417 src = fetchurl {
418 url = "https://files.pythonhosted.org/packages/bb/e0/f6e41e9091e130bf16d4437dabbac3993908e4d6485ecbc985ef1352db94/decorator-4.1.2.tar.gz";
418 url = "https://files.pythonhosted.org/packages/bb/e0/f6e41e9091e130bf16d4437dabbac3993908e4d6485ecbc985ef1352db94/decorator-4.1.2.tar.gz";
419 sha256 = "1d8npb11kxyi36mrvjdpcjij76l5zfyrz2f820brf0l0rcw4vdkw";
419 sha256 = "1d8npb11kxyi36mrvjdpcjij76l5zfyrz2f820brf0l0rcw4vdkw";
420 };
420 };
421 meta = {
421 meta = {
422 license = [ pkgs.lib.licenses.bsdOriginal { fullName = "new BSD License"; } ];
422 license = [ pkgs.lib.licenses.bsdOriginal { fullName = "new BSD License"; } ];
423 };
423 };
424 };
424 };
425 "deform" = super.buildPythonPackage {
425 "deform" = super.buildPythonPackage {
426 name = "deform-2.0.8";
426 name = "deform-2.0.8";
427 doCheck = false;
427 doCheck = false;
428 propagatedBuildInputs = [
428 propagatedBuildInputs = [
429 self."chameleon"
429 self."chameleon"
430 self."colander"
430 self."colander"
431 self."iso8601"
431 self."iso8601"
432 self."peppercorn"
432 self."peppercorn"
433 self."translationstring"
433 self."translationstring"
434 self."zope.deprecation"
434 self."zope.deprecation"
435 ];
435 ];
436 src = fetchurl {
436 src = fetchurl {
437 url = "https://files.pythonhosted.org/packages/21/d0/45fdf891a82722c02fc2da319cf2d1ae6b5abf9e470ad3762135a895a868/deform-2.0.8.tar.gz";
437 url = "https://files.pythonhosted.org/packages/21/d0/45fdf891a82722c02fc2da319cf2d1ae6b5abf9e470ad3762135a895a868/deform-2.0.8.tar.gz";
438 sha256 = "0wbjv98sib96649aqaygzxnrkclyy50qij2rha6fn1i4c86bfdl9";
438 sha256 = "0wbjv98sib96649aqaygzxnrkclyy50qij2rha6fn1i4c86bfdl9";
439 };
439 };
440 meta = {
440 meta = {
441 license = [ { fullName = "Repoze Public License"; } { fullName = "BSD-derived (http://www.repoze.org/LICENSE.txt)"; } ];
441 license = [ { fullName = "Repoze Public License"; } { fullName = "BSD-derived (http://www.repoze.org/LICENSE.txt)"; } ];
442 };
442 };
443 };
443 };
444 "defusedxml" = super.buildPythonPackage {
444 "defusedxml" = super.buildPythonPackage {
445 name = "defusedxml-0.6.0";
445 name = "defusedxml-0.6.0";
446 doCheck = false;
446 doCheck = false;
447 src = fetchurl {
447 src = fetchurl {
448 url = "https://files.pythonhosted.org/packages/a4/5f/f8aa58ca0cf01cbcee728abc9d88bfeb74e95e6cb4334cfd5bed5673ea77/defusedxml-0.6.0.tar.gz";
448 url = "https://files.pythonhosted.org/packages/a4/5f/f8aa58ca0cf01cbcee728abc9d88bfeb74e95e6cb4334cfd5bed5673ea77/defusedxml-0.6.0.tar.gz";
449 sha256 = "1xbp8fivl3wlbyg2jrvs4lalaqv1xp9a9f29p75wdx2s2d6h717n";
449 sha256 = "1xbp8fivl3wlbyg2jrvs4lalaqv1xp9a9f29p75wdx2s2d6h717n";
450 };
450 };
451 meta = {
451 meta = {
452 license = [ pkgs.lib.licenses.psfl ];
452 license = [ pkgs.lib.licenses.psfl ];
453 };
453 };
454 };
454 };
455 "dm.xmlsec.binding" = super.buildPythonPackage {
455 "dm.xmlsec.binding" = super.buildPythonPackage {
456 name = "dm.xmlsec.binding-1.3.7";
456 name = "dm.xmlsec.binding-1.3.7";
457 doCheck = false;
457 doCheck = false;
458 propagatedBuildInputs = [
458 propagatedBuildInputs = [
459 self."setuptools"
459 self."setuptools"
460 self."lxml"
460 self."lxml"
461 ];
461 ];
462 src = fetchurl {
462 src = fetchurl {
463 url = "https://files.pythonhosted.org/packages/2c/9e/7651982d50252692991acdae614af821fd6c79bc8dcd598ad71d55be8fc7/dm.xmlsec.binding-1.3.7.tar.gz";
463 url = "https://files.pythonhosted.org/packages/2c/9e/7651982d50252692991acdae614af821fd6c79bc8dcd598ad71d55be8fc7/dm.xmlsec.binding-1.3.7.tar.gz";
464 sha256 = "03jjjscx1pz2nc0dwiw9nia02qbz1c6f0f9zkyr8fmvys2n5jkb3";
464 sha256 = "03jjjscx1pz2nc0dwiw9nia02qbz1c6f0f9zkyr8fmvys2n5jkb3";
465 };
465 };
466 meta = {
466 meta = {
467 license = [ pkgs.lib.licenses.bsdOriginal ];
467 license = [ pkgs.lib.licenses.bsdOriginal ];
468 };
468 };
469 };
469 };
470 "docutils" = super.buildPythonPackage {
470 "docutils" = super.buildPythonPackage {
471 name = "docutils-0.16";
471 name = "docutils-0.16";
472 doCheck = false;
472 doCheck = false;
473 src = fetchurl {
473 src = fetchurl {
474 url = "https://files.pythonhosted.org/packages/2f/e0/3d435b34abd2d62e8206171892f174b180cd37b09d57b924ca5c2ef2219d/docutils-0.16.tar.gz";
474 url = "https://files.pythonhosted.org/packages/2f/e0/3d435b34abd2d62e8206171892f174b180cd37b09d57b924ca5c2ef2219d/docutils-0.16.tar.gz";
475 sha256 = "1z3qliszqca9m719q3qhdkh0ghh90g500avzdgi7pl77x5h3mpn2";
475 sha256 = "1z3qliszqca9m719q3qhdkh0ghh90g500avzdgi7pl77x5h3mpn2";
476 };
476 };
477 meta = {
477 meta = {
478 license = [ pkgs.lib.licenses.bsdOriginal pkgs.lib.licenses.publicDomain pkgs.lib.licenses.gpl1 { fullName = "public domain, Python, 2-Clause BSD, GPL 3 (see COPYING.txt)"; } pkgs.lib.licenses.psfl ];
478 license = [ pkgs.lib.licenses.bsdOriginal pkgs.lib.licenses.publicDomain pkgs.lib.licenses.gpl1 { fullName = "public domain, Python, 2-Clause BSD, GPL 3 (see COPYING.txt)"; } pkgs.lib.licenses.psfl ];
479 };
479 };
480 };
480 };
481 "dogpile.cache" = super.buildPythonPackage {
481 "dogpile.cache" = super.buildPythonPackage {
482 name = "dogpile.cache-0.9.0";
482 name = "dogpile.cache-0.9.0";
483 doCheck = false;
483 doCheck = false;
484 propagatedBuildInputs = [
484 propagatedBuildInputs = [
485 self."decorator"
485 self."decorator"
486 ];
486 ];
487 src = fetchurl {
487 src = fetchurl {
488 url = "https://files.pythonhosted.org/packages/ac/6a/9ac405686a94b7f009a20a50070a5786b0e1aedc707b88d40d0c4b51a82e/dogpile.cache-0.9.0.tar.gz";
488 url = "https://files.pythonhosted.org/packages/ac/6a/9ac405686a94b7f009a20a50070a5786b0e1aedc707b88d40d0c4b51a82e/dogpile.cache-0.9.0.tar.gz";
489 sha256 = "0sr1fn6b4k5bh0cscd9yi8csqxvj4ngzildav58x5p694mc86j5k";
489 sha256 = "0sr1fn6b4k5bh0cscd9yi8csqxvj4ngzildav58x5p694mc86j5k";
490 };
490 };
491 meta = {
491 meta = {
492 license = [ pkgs.lib.licenses.bsdOriginal ];
492 license = [ pkgs.lib.licenses.bsdOriginal ];
493 };
493 };
494 };
494 };
495 "dogpile.core" = super.buildPythonPackage {
495 "dogpile.core" = super.buildPythonPackage {
496 name = "dogpile.core-0.4.1";
496 name = "dogpile.core-0.4.1";
497 doCheck = false;
497 doCheck = false;
498 src = fetchurl {
498 src = fetchurl {
499 url = "https://files.pythonhosted.org/packages/0e/77/e72abc04c22aedf874301861e5c1e761231c288b5de369c18be8f4b5c9bb/dogpile.core-0.4.1.tar.gz";
499 url = "https://files.pythonhosted.org/packages/0e/77/e72abc04c22aedf874301861e5c1e761231c288b5de369c18be8f4b5c9bb/dogpile.core-0.4.1.tar.gz";
500 sha256 = "0xpdvg4kr1isfkrh1rfsh7za4q5a5s6l2kf9wpvndbwf3aqjyrdy";
500 sha256 = "0xpdvg4kr1isfkrh1rfsh7za4q5a5s6l2kf9wpvndbwf3aqjyrdy";
501 };
501 };
502 meta = {
502 meta = {
503 license = [ pkgs.lib.licenses.bsdOriginal ];
503 license = [ pkgs.lib.licenses.bsdOriginal ];
504 };
504 };
505 };
505 };
506 "ecdsa" = super.buildPythonPackage {
506 "ecdsa" = super.buildPythonPackage {
507 name = "ecdsa-0.13.2";
507 name = "ecdsa-0.13.2";
508 doCheck = false;
508 doCheck = false;
509 src = fetchurl {
509 src = fetchurl {
510 url = "https://files.pythonhosted.org/packages/51/76/139bf6e9b7b6684d5891212cdbd9e0739f2bfc03f380a1a6ffa700f392ac/ecdsa-0.13.2.tar.gz";
510 url = "https://files.pythonhosted.org/packages/51/76/139bf6e9b7b6684d5891212cdbd9e0739f2bfc03f380a1a6ffa700f392ac/ecdsa-0.13.2.tar.gz";
511 sha256 = "116qaq7bh4lcynzi613960jhsnn19v0kmsqwahiwjfj14gx4y0sw";
511 sha256 = "116qaq7bh4lcynzi613960jhsnn19v0kmsqwahiwjfj14gx4y0sw";
512 };
512 };
513 meta = {
513 meta = {
514 license = [ pkgs.lib.licenses.mit ];
514 license = [ pkgs.lib.licenses.mit ];
515 };
515 };
516 };
516 };
517 "elasticsearch" = super.buildPythonPackage {
517 "elasticsearch" = super.buildPythonPackage {
518 name = "elasticsearch-6.3.1";
518 name = "elasticsearch-6.3.1";
519 doCheck = false;
519 doCheck = false;
520 propagatedBuildInputs = [
520 propagatedBuildInputs = [
521 self."urllib3"
521 self."urllib3"
522 ];
522 ];
523 src = fetchurl {
523 src = fetchurl {
524 url = "https://files.pythonhosted.org/packages/9d/ce/c4664e8380e379a9402ecfbaf158e56396da90d520daba21cfa840e0eb71/elasticsearch-6.3.1.tar.gz";
524 url = "https://files.pythonhosted.org/packages/9d/ce/c4664e8380e379a9402ecfbaf158e56396da90d520daba21cfa840e0eb71/elasticsearch-6.3.1.tar.gz";
525 sha256 = "12y93v0yn7a4xmf969239g8gb3l4cdkclfpbk1qc8hx5qkymrnma";
525 sha256 = "12y93v0yn7a4xmf969239g8gb3l4cdkclfpbk1qc8hx5qkymrnma";
526 };
526 };
527 meta = {
527 meta = {
528 license = [ pkgs.lib.licenses.asl20 ];
528 license = [ pkgs.lib.licenses.asl20 ];
529 };
529 };
530 };
530 };
531 "elasticsearch-dsl" = super.buildPythonPackage {
531 "elasticsearch-dsl" = super.buildPythonPackage {
532 name = "elasticsearch-dsl-6.3.1";
532 name = "elasticsearch-dsl-6.3.1";
533 doCheck = false;
533 doCheck = false;
534 propagatedBuildInputs = [
534 propagatedBuildInputs = [
535 self."six"
535 self."six"
536 self."python-dateutil"
536 self."python-dateutil"
537 self."elasticsearch"
537 self."elasticsearch"
538 self."ipaddress"
538 self."ipaddress"
539 ];
539 ];
540 src = fetchurl {
540 src = fetchurl {
541 url = "https://files.pythonhosted.org/packages/4c/0d/1549f50c591db6bb4e66cbcc8d34a6e537c3d89aa426b167c244fd46420a/elasticsearch-dsl-6.3.1.tar.gz";
541 url = "https://files.pythonhosted.org/packages/4c/0d/1549f50c591db6bb4e66cbcc8d34a6e537c3d89aa426b167c244fd46420a/elasticsearch-dsl-6.3.1.tar.gz";
542 sha256 = "1gh8a0shqi105k325hgwb9avrpdjh0mc6mxwfg9ba7g6lssb702z";
542 sha256 = "1gh8a0shqi105k325hgwb9avrpdjh0mc6mxwfg9ba7g6lssb702z";
543 };
543 };
544 meta = {
544 meta = {
545 license = [ pkgs.lib.licenses.asl20 ];
545 license = [ pkgs.lib.licenses.asl20 ];
546 };
546 };
547 };
547 };
548 "elasticsearch1" = super.buildPythonPackage {
548 "elasticsearch1" = super.buildPythonPackage {
549 name = "elasticsearch1-1.10.0";
549 name = "elasticsearch1-1.10.0";
550 doCheck = false;
550 doCheck = false;
551 propagatedBuildInputs = [
551 propagatedBuildInputs = [
552 self."urllib3"
552 self."urllib3"
553 ];
553 ];
554 src = fetchurl {
554 src = fetchurl {
555 url = "https://files.pythonhosted.org/packages/a6/eb/73e75f9681fa71e3157b8ee878534235d57f24ee64f0e77f8d995fb57076/elasticsearch1-1.10.0.tar.gz";
555 url = "https://files.pythonhosted.org/packages/a6/eb/73e75f9681fa71e3157b8ee878534235d57f24ee64f0e77f8d995fb57076/elasticsearch1-1.10.0.tar.gz";
556 sha256 = "0g89444kd5zwql4vbvyrmi2m6l6dcj6ga98j4hqxyyyz6z20aki2";
556 sha256 = "0g89444kd5zwql4vbvyrmi2m6l6dcj6ga98j4hqxyyyz6z20aki2";
557 };
557 };
558 meta = {
558 meta = {
559 license = [ pkgs.lib.licenses.asl20 ];
559 license = [ pkgs.lib.licenses.asl20 ];
560 };
560 };
561 };
561 };
562 "elasticsearch1-dsl" = super.buildPythonPackage {
562 "elasticsearch1-dsl" = super.buildPythonPackage {
563 name = "elasticsearch1-dsl-0.0.12";
563 name = "elasticsearch1-dsl-0.0.12";
564 doCheck = false;
564 doCheck = false;
565 propagatedBuildInputs = [
565 propagatedBuildInputs = [
566 self."six"
566 self."six"
567 self."python-dateutil"
567 self."python-dateutil"
568 self."elasticsearch1"
568 self."elasticsearch1"
569 ];
569 ];
570 src = fetchurl {
570 src = fetchurl {
571 url = "https://files.pythonhosted.org/packages/eb/9d/785342775cb10eddc9b8d7457d618a423b4f0b89d8b2b2d1bc27190d71db/elasticsearch1-dsl-0.0.12.tar.gz";
571 url = "https://files.pythonhosted.org/packages/eb/9d/785342775cb10eddc9b8d7457d618a423b4f0b89d8b2b2d1bc27190d71db/elasticsearch1-dsl-0.0.12.tar.gz";
572 sha256 = "0ig1ly39v93hba0z975wnhbmzwj28w6w1sqlr2g7cn5spp732bhk";
572 sha256 = "0ig1ly39v93hba0z975wnhbmzwj28w6w1sqlr2g7cn5spp732bhk";
573 };
573 };
574 meta = {
574 meta = {
575 license = [ pkgs.lib.licenses.asl20 ];
575 license = [ pkgs.lib.licenses.asl20 ];
576 };
576 };
577 };
577 };
578 "elasticsearch2" = super.buildPythonPackage {
578 "elasticsearch2" = super.buildPythonPackage {
579 name = "elasticsearch2-2.5.1";
579 name = "elasticsearch2-2.5.1";
580 doCheck = false;
580 doCheck = false;
581 propagatedBuildInputs = [
581 propagatedBuildInputs = [
582 self."urllib3"
582 self."urllib3"
583 ];
583 ];
584 src = fetchurl {
584 src = fetchurl {
585 url = "https://files.pythonhosted.org/packages/f6/09/f9b24aa6b1120bea371cd57ef6f57c7694cf16660469456a8be6c2bdbe22/elasticsearch2-2.5.1.tar.gz";
585 url = "https://files.pythonhosted.org/packages/f6/09/f9b24aa6b1120bea371cd57ef6f57c7694cf16660469456a8be6c2bdbe22/elasticsearch2-2.5.1.tar.gz";
      sha256 = "19k2znpjfyp0hrq73cz7pjyj289040xpsxsm0xhh4jfh6y551g7k";
    };
    meta = {
      license = [ pkgs.lib.licenses.asl20 ];
    };
  };
  "entrypoints" = super.buildPythonPackage {
    name = "entrypoints-0.2.2";
    doCheck = false;
    propagatedBuildInputs = [
      self."configparser"
    ];
    src = fetchurl {
      url = "https://code.rhodecode.com/upstream/entrypoints/artifacts/download/0-8e9ee9e4-c4db-409c-b07e-81568fd1832d.tar.gz?md5=3a027b8ff1d257b91fe257de6c43357d";
      sha256 = "0qih72n2myclanplqipqxpgpj9d2yhff1pz5d02zq1cfqyd173w5";
    };
    meta = {
      license = [ pkgs.lib.licenses.mit ];
    };
  };
  "enum34" = super.buildPythonPackage {
    name = "enum34-1.1.10";
    doCheck = false;
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/11/c4/2da1f4952ba476677a42f25cd32ab8aaf0e1c0d0e00b89822b835c7e654c/enum34-1.1.10.tar.gz";
      sha256 = "0j7ji699fwswm4vg6w1v07fkbf8dkzdm6gfh88jvs5nqgr3sgrnc";
    };
    meta = {
      license = [ pkgs.lib.licenses.bsdOriginal ];
    };
  };
  "formencode" = super.buildPythonPackage {
    name = "formencode-1.2.4";
    doCheck = false;
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/8e/59/0174271a6f004512e0201188593e6d319db139d14cb7490e488bbb078015/FormEncode-1.2.4.tar.gz";
      sha256 = "1fgy04sdy4yry5xcjls3x3xy30dqwj58ycnkndim819jx0788w42";
    };
    meta = {
      license = [ pkgs.lib.licenses.psfl ];
    };
  };
  "funcsigs" = super.buildPythonPackage {
    name = "funcsigs-1.0.2";
    doCheck = false;
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/94/4a/db842e7a0545de1cdb0439bb80e6e42dfe82aaeaadd4072f2263a4fbed23/funcsigs-1.0.2.tar.gz";
      sha256 = "0l4g5818ffyfmfs1a924811azhjj8ax9xd1cffr1mzd3ycn0zfx7";
    };
    meta = {
      license = [ { fullName = "ASL"; } pkgs.lib.licenses.asl20 ];
    };
  };
  "functools32" = super.buildPythonPackage {
    name = "functools32-3.2.3.post2";
    doCheck = false;
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/c5/60/6ac26ad05857c601308d8fb9e87fa36d0ebf889423f47c3502ef034365db/functools32-3.2.3-2.tar.gz";
      sha256 = "0v8ya0b58x47wp216n1zamimv4iw57cxz3xxhzix52jkw3xks9gn";
    };
    meta = {
      license = [ pkgs.lib.licenses.psfl ];
    };
  };
  "future" = super.buildPythonPackage {
    name = "future-0.14.3";
    doCheck = false;
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/83/80/8ef3a11a15f8eaafafa0937b20c1b3f73527e69ab6b3fa1cf94a5a96aabb/future-0.14.3.tar.gz";
      sha256 = "1savk7jx7hal032f522c5ajhh8fra6gmnadrj9adv5qxi18pv1b2";
    };
    meta = {
      license = [ { fullName = "OSI Approved"; } pkgs.lib.licenses.mit ];
    };
  };
  "futures" = super.buildPythonPackage {
    name = "futures-3.0.2";
    doCheck = false;
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/f8/e7/fc0fcbeb9193ba2d4de00b065e7fd5aecd0679e93ce95a07322b2b1434f4/futures-3.0.2.tar.gz";
      sha256 = "0mz2pbgxbc2nbib1szifi07whjbfs4r02pv2z390z7p410awjgyw";
    };
    meta = {
      license = [ pkgs.lib.licenses.bsdOriginal ];
    };
  };
  "gevent" = super.buildPythonPackage {
    name = "gevent-1.5.0";
    doCheck = false;
    propagatedBuildInputs = [
      self."greenlet"
    ];
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/5a/79/2c63d385d017b5dd7d70983a463dfd25befae70c824fedb857df6e72eff2/gevent-1.5.0.tar.gz";
      sha256 = "0aac3d4vhv5n4rsb6cqzq0d1xx9immqz4fmpddw35yxkwdc450dj";
    };
    meta = {
      license = [ pkgs.lib.licenses.mit ];
    };
  };
  "gnureadline" = super.buildPythonPackage {
    name = "gnureadline-6.3.8";
    doCheck = false;
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/50/64/86085c823cd78f9df9d8e33dce0baa71618016f8860460b82cf6610e1eb3/gnureadline-6.3.8.tar.gz";
      sha256 = "0ddhj98x2nv45iz4aadk4b9m0b1kpsn1xhcbypn5cd556knhiqjq";
    };
    meta = {
      license = [ { fullName = "GNU General Public License v3 (GPLv3)"; } pkgs.lib.licenses.gpl1 ];
    };
  };
  "gprof2dot" = super.buildPythonPackage {
    name = "gprof2dot-2017.9.19";
    doCheck = false;
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/9d/36/f977122502979f3dfb50704979c9ed70e6b620787942b089bf1af15f5aba/gprof2dot-2017.9.19.tar.gz";
      sha256 = "17ih23ld2nzgc3xwgbay911l6lh96jp1zshmskm17n1gg2i7mg6f";
    };
    meta = {
      license = [ { fullName = "GNU Lesser General Public License v3 or later (LGPLv3+)"; } { fullName = "LGPL"; } ];
    };
  };
  "greenlet" = super.buildPythonPackage {
    name = "greenlet-0.4.15";
    doCheck = false;
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/f8/e8/b30ae23b45f69aa3f024b46064c0ac8e5fcb4f22ace0dca8d6f9c8bbe5e7/greenlet-0.4.15.tar.gz";
      sha256 = "1g4g1wwc472ds89zmqlpyan3fbnzpa8qm48z3z1y6mlk44z485ll";
    };
    meta = {
      license = [ pkgs.lib.licenses.mit ];
    };
  };
  "gunicorn" = super.buildPythonPackage {
    name = "gunicorn-19.9.0";
    doCheck = false;
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/47/52/68ba8e5e8ba251e54006a49441f7ccabca83b6bef5aedacb4890596c7911/gunicorn-19.9.0.tar.gz";
      sha256 = "1wzlf4xmn6qjirh5w81l6i6kqjnab1n1qqkh7zsj1yb6gh4n49ps";
    };
    meta = {
      license = [ pkgs.lib.licenses.mit ];
    };
  };
  "hupper" = super.buildPythonPackage {
    name = "hupper-1.10.2";
    doCheck = false;
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/41/24/ea90fef04706e54bd1635c05c50dc9cf87cda543c59303a03e7aa7dda0ce/hupper-1.10.2.tar.gz";
      sha256 = "0am0p6g5cz6xmcaf04xq8q6dzdd9qz0phj6gcmpsckf2mcyza61q";
    };
    meta = {
      license = [ pkgs.lib.licenses.mit ];
    };
  };
  "idna" = super.buildPythonPackage {
    name = "idna-2.8";
    doCheck = false;
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/ad/13/eb56951b6f7950cadb579ca166e448ba77f9d24efc03edd7e55fa57d04b7/idna-2.8.tar.gz";
      sha256 = "01rlkigdxg17sf9yar1jl8n18ls59367wqh59hnawlyg53vb6my3";
    };
    meta = {
      license = [ pkgs.lib.licenses.bsdOriginal { fullName = "BSD-like"; } ];
    };
  };
  "importlib-metadata" = super.buildPythonPackage {
    name = "importlib-metadata-1.6.0";
    doCheck = false;
    propagatedBuildInputs = [
      self."zipp"
      self."pathlib2"
      self."contextlib2"
      self."configparser"
    ];
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/b4/1b/baab42e3cd64c9d5caac25a9d6c054f8324cdc38975a44d600569f1f7158/importlib_metadata-1.6.0.tar.gz";
      sha256 = "07icyggasn38yv2swdrd8z6i0plazmc9adavsdkbqqj91j53ll9l";
    };
    meta = {
      license = [ pkgs.lib.licenses.asl20 ];
    };
  };
  "infrae.cache" = super.buildPythonPackage {
    name = "infrae.cache-1.0.1";
    doCheck = false;
    propagatedBuildInputs = [
      self."beaker"
      self."repoze.lru"
    ];
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/bb/f0/e7d5e984cf6592fd2807dc7bc44a93f9d18e04e6a61f87fdfb2622422d74/infrae.cache-1.0.1.tar.gz";
      sha256 = "1dvqsjn8vw253wz9d1pz17j79mf4bs53dvp2qxck2qdp1am1njw4";
    };
    meta = {
      license = [ pkgs.lib.licenses.zpl21 ];
    };
  };
  "invoke" = super.buildPythonPackage {
    name = "invoke-0.13.0";
    doCheck = false;
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/47/bf/d07ef52fa1ac645468858bbac7cb95b246a972a045e821493d17d89c81be/invoke-0.13.0.tar.gz";
      sha256 = "0794vhgxfmkh0vzkkg5cfv1w82g3jc3xr18wim29far9qpx9468s";
    };
    meta = {
      license = [ pkgs.lib.licenses.bsdOriginal ];
    };
  };
  "ipaddress" = super.buildPythonPackage {
    name = "ipaddress-1.0.23";
    doCheck = false;
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/b9/9a/3e9da40ea28b8210dd6504d3fe9fe7e013b62bf45902b458d1cdc3c34ed9/ipaddress-1.0.23.tar.gz";
      sha256 = "1qp743h30s04m3cg3yk3fycad930jv17q7dsslj4mfw0jlvf1y5p";
    };
    meta = {
      license = [ pkgs.lib.licenses.psfl ];
    };
  };
  "ipdb" = super.buildPythonPackage {
    name = "ipdb-0.13.2";
    doCheck = false;
    propagatedBuildInputs = [
      self."setuptools"
      self."ipython"
    ];
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/2c/bb/a3e1a441719ebd75c6dac8170d3ddba884b7ee8a5c0f9aefa7297386627a/ipdb-0.13.2.tar.gz";
      sha256 = "0jcd849rx30y3wcgzsqbn06v0yjlzvb9x3076q0yxpycdwm1ryvp";
    };
    meta = {
      license = [ pkgs.lib.licenses.bsdOriginal ];
    };
  };
  "ipython" = super.buildPythonPackage {
    name = "ipython-5.1.0";
    doCheck = false;
    propagatedBuildInputs = [
      self."setuptools"
      self."decorator"
      self."pickleshare"
      self."simplegeneric"
      self."traitlets"
      self."prompt-toolkit"
      self."pygments"
      self."pexpect"
      self."backports.shutil-get-terminal-size"
      self."pathlib2"
      self."pexpect"
    ];
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/89/63/a9292f7cd9d0090a0f995e1167f3f17d5889dcbc9a175261719c513b9848/ipython-5.1.0.tar.gz";
      sha256 = "0qdrf6aj9kvjczd5chj1my8y2iq09am9l8bb2a1334a52d76kx3y";
    };
    meta = {
      license = [ pkgs.lib.licenses.bsdOriginal ];
    };
  };
  "ipython-genutils" = super.buildPythonPackage {
    name = "ipython-genutils-0.2.0";
    doCheck = false;
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/e8/69/fbeffffc05236398ebfcfb512b6d2511c622871dca1746361006da310399/ipython_genutils-0.2.0.tar.gz";
      sha256 = "1a4bc9y8hnvq6cp08qs4mckgm6i6ajpndp4g496rvvzcfmp12bpb";
    };
    meta = {
      license = [ pkgs.lib.licenses.bsdOriginal ];
    };
  };
  "iso8601" = super.buildPythonPackage {
    name = "iso8601-0.1.12";
    doCheck = false;
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/45/13/3db24895497345fb44c4248c08b16da34a9eb02643cea2754b21b5ed08b0/iso8601-0.1.12.tar.gz";
      sha256 = "10nyvvnrhw2w3p09v1ica4lgj6f4g9j3kkfx17qmraiq3w7b5i29";
    };
    meta = {
      license = [ pkgs.lib.licenses.mit ];
    };
  };
  "isodate" = super.buildPythonPackage {
    name = "isodate-0.6.0";
    doCheck = false;
    propagatedBuildInputs = [
      self."six"
    ];
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/b1/80/fb8c13a4cd38eb5021dc3741a9e588e4d1de88d895c1910c6fc8a08b7a70/isodate-0.6.0.tar.gz";
      sha256 = "1n7jkz68kk5pwni540pr5zdh99bf6ywydk1p5pdrqisrawylldif";
    };
    meta = {
      license = [ pkgs.lib.licenses.bsdOriginal ];
    };
  };
  "itsdangerous" = super.buildPythonPackage {
    name = "itsdangerous-1.1.0";
    doCheck = false;
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/68/1a/f27de07a8a304ad5fa817bbe383d1238ac4396da447fa11ed937039fa04b/itsdangerous-1.1.0.tar.gz";
      sha256 = "068zpbksq5q2z4dckh2k1zbcq43ay74ylqn77rni797j0wyh66rj";
    };
    meta = {
      license = [ pkgs.lib.licenses.bsdOriginal ];
    };
  };
  "jinja2" = super.buildPythonPackage {
    name = "jinja2-2.9.6";
    doCheck = false;
    propagatedBuildInputs = [
      self."markupsafe"
    ];
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/90/61/f820ff0076a2599dd39406dcb858ecb239438c02ce706c8e91131ab9c7f1/Jinja2-2.9.6.tar.gz";
      sha256 = "1zzrkywhziqffrzks14kzixz7nd4yh2vc0fb04a68vfd2ai03anx";
    };
    meta = {
      license = [ pkgs.lib.licenses.bsdOriginal ];
    };
  };
  "jsonschema" = super.buildPythonPackage {
    name = "jsonschema-2.6.0";
    doCheck = false;
    propagatedBuildInputs = [
      self."functools32"
    ];
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/58/b9/171dbb07e18c6346090a37f03c7e74410a1a56123f847efed59af260a298/jsonschema-2.6.0.tar.gz";
      sha256 = "00kf3zmpp9ya4sydffpifn0j0mzm342a2vzh82p6r0vh10cg7xbg";
    };
    meta = {
      license = [ pkgs.lib.licenses.mit ];
    };
  };
  "jupyter-client" = super.buildPythonPackage {
    name = "jupyter-client-5.0.0";
    doCheck = false;
    propagatedBuildInputs = [
      self."traitlets"
      self."jupyter-core"
      self."pyzmq"
      self."python-dateutil"
    ];
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/e5/6f/65412ed462202b90134b7e761b0b7e7f949e07a549c1755475333727b3d0/jupyter_client-5.0.0.tar.gz";
      sha256 = "0nxw4rqk4wsjhc87gjqd7pv89cb9dnimcfnmcmp85bmrvv1gjri7";
    };
    meta = {
      license = [ pkgs.lib.licenses.bsdOriginal ];
    };
  };
  "jupyter-core" = super.buildPythonPackage {
    name = "jupyter-core-4.5.0";
    doCheck = false;
    propagatedBuildInputs = [
      self."traitlets"
    ];
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/4a/de/ff4ca734656d17ebe0450807b59d728f45277e2e7f4b82bc9aae6cb82961/jupyter_core-4.5.0.tar.gz";
      sha256 = "1xr4pbghwk5hayn5wwnhb7z95380r45p79gf5if5pi1akwg7qvic";
    };
    meta = {
      license = [ pkgs.lib.licenses.bsdOriginal ];
    };
  };
  "kombu" = super.buildPythonPackage {
    name = "kombu-4.6.6";
    doCheck = false;
    propagatedBuildInputs = [
      self."amqp"
      self."importlib-metadata"
    ];
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/20/e6/bc2d9affba6138a1dc143f77fef253e9e08e238fa7c0688d917c09005e96/kombu-4.6.6.tar.gz";
      sha256 = "11mxpcy8mg1l35bgbhba70v29bydr2hrhdbdlb4lg98m3m5vaq0p";
    };
    meta = {
      license = [ pkgs.lib.licenses.bsdOriginal ];
    };
  };
  "lxml" = super.buildPythonPackage {
    name = "lxml-4.2.5";
    doCheck = false;
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/4b/20/ddf5eb3bd5c57582d2b4652b4bbcf8da301bdfe5d805cb94e805f4d7464d/lxml-4.2.5.tar.gz";
      sha256 = "0zw0y9hs0nflxhl9cs6ipwwh53szi3w2x06wl0k9cylyqac0cwin";
    };
    meta = {
      license = [ pkgs.lib.licenses.bsdOriginal ];
    };
  };
  "mako" = super.buildPythonPackage {
    name = "mako-1.1.0";
    doCheck = false;
    propagatedBuildInputs = [
      self."markupsafe"
    ];
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/b0/3c/8dcd6883d009f7cae0f3157fb53e9afb05a0d3d33b3db1268ec2e6f4a56b/Mako-1.1.0.tar.gz";
      sha256 = "0jqa3qfpykyn4fmkn0kh6043sfls7br8i2bsdbccazcvk9cijsd3";
    };
    meta = {
      license = [ pkgs.lib.licenses.mit ];
    };
  };
  "markdown" = super.buildPythonPackage {
    name = "markdown-2.6.11";
    doCheck = false;
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/b3/73/fc5c850f44af5889192dff783b7b0d8f3fe8d30b65c8e3f78f8f0265fecf/Markdown-2.6.11.tar.gz";
      sha256 = "108g80ryzykh8bj0i7jfp71510wrcixdi771lf2asyghgyf8cmm8";
    };
    meta = {
      license = [ pkgs.lib.licenses.bsdOriginal ];
    };
  };
  "markupsafe" = super.buildPythonPackage {
    name = "markupsafe-1.1.1";
    doCheck = false;
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/b9/2e/64db92e53b86efccfaea71321f597fa2e1b2bd3853d8ce658568f7a13094/MarkupSafe-1.1.1.tar.gz";
      sha256 = "0sqipg4fk7xbixqd8kq6rlkxj664d157bdwbh93farcphf92x1r9";
    };
    meta = {
      license = [ pkgs.lib.licenses.bsdOriginal pkgs.lib.licenses.bsd3 ];
    };
  };
  "marshmallow" = super.buildPythonPackage {
    name = "marshmallow-2.18.0";
    doCheck = false;
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/ad/0b/5799965d1c6d5f608d684e2c0dce8a828e0309a3bfe8327d9418a89f591c/marshmallow-2.18.0.tar.gz";
      sha256 = "1g0aafpjn7yaxq06yndy8c7rs9n42adxkqq1ayhlr869pr06d3lm";
    };
    meta = {
      license = [ pkgs.lib.licenses.mit ];
    };
  };
  "mistune" = super.buildPythonPackage {
    name = "mistune-0.8.4";
    doCheck = false;
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/2d/a4/509f6e7783ddd35482feda27bc7f72e65b5e7dc910eca4ab2164daf9c577/mistune-0.8.4.tar.gz";
      sha256 = "0vkmsh0x480rni51lhyvigfdf06b9247z868pk3bal1wnnfl58sr";
    };
    meta = {
      license = [ pkgs.lib.licenses.bsdOriginal ];
    };
  };
  "mock" = super.buildPythonPackage {
    name = "mock-3.0.5";
    doCheck = false;
    propagatedBuildInputs = [
      self."six"
      self."funcsigs"
    ];
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/2e/ab/4fe657d78b270aa6a32f027849513b829b41b0f28d9d8d7f8c3d29ea559a/mock-3.0.5.tar.gz";
      sha256 = "1hrp6j0yrx2xzylfv02qa8kph661m6yq4p0mc8fnimch9j4psrc3";
    };
    meta = {
      license = [ pkgs.lib.licenses.bsdOriginal { fullName = "OSI Approved :: BSD License"; } ];
    };
  };
  "more-itertools" = super.buildPythonPackage {
    name = "more-itertools-5.0.0";
    doCheck = false;
    propagatedBuildInputs = [
      self."six"
    ];
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/dd/26/30fc0d541d9fdf55faf5ba4b0fd68f81d5bd2447579224820ad525934178/more-itertools-5.0.0.tar.gz";
      sha256 = "1r12cm6mcdwdzz7d47a6g4l437xsvapdlgyhqay3i2nrlv03da9q";
    };
    meta = {
      license = [ pkgs.lib.licenses.mit ];
    };
  };
  "msgpack-python" = super.buildPythonPackage {
    name = "msgpack-python-0.5.6";
    doCheck = false;
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/8a/20/6eca772d1a5830336f84aca1d8198e5a3f4715cd1c7fc36d3cc7f7185091/msgpack-python-0.5.6.tar.gz";
      sha256 = "16wh8qgybmfh4pjp8vfv78mdlkxfmcasg78lzlnm6nslsfkci31p";
    };
    meta = {
      license = [ pkgs.lib.licenses.asl20 ];
    };
  };
  "mysql-python" = super.buildPythonPackage {
    name = "mysql-python-1.2.5";
    doCheck = false;
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/a5/e9/51b544da85a36a68debe7a7091f068d802fc515a3a202652828c73453cad/MySQL-python-1.2.5.zip";
      sha256 = "0x0c2jg0bb3pp84njaqiic050qkyd7ymwhfvhipnimg58yv40441";
    };
    meta = {
      license = [ pkgs.lib.licenses.gpl1 ];
    };
  };
  "nbconvert" = super.buildPythonPackage {
    name = "nbconvert-5.3.1";
    doCheck = false;
    propagatedBuildInputs = [
      self."mistune"
      self."jinja2"
      self."pygments"
      self."traitlets"
      self."jupyter-core"
      self."nbformat"
      self."entrypoints"
1096 self."entrypoints"
1097 self."bleach"
1097 self."bleach"
1098 self."pandocfilters"
1098 self."pandocfilters"
1099 self."testpath"
1099 self."testpath"
1100 ];
1100 ];
1101 src = fetchurl {
1101 src = fetchurl {
1102 url = "https://files.pythonhosted.org/packages/b9/a4/d0a0938ad6f5eeb4dea4e73d255c617ef94b0b2849d51194c9bbdb838412/nbconvert-5.3.1.tar.gz";
1102 url = "https://files.pythonhosted.org/packages/b9/a4/d0a0938ad6f5eeb4dea4e73d255c617ef94b0b2849d51194c9bbdb838412/nbconvert-5.3.1.tar.gz";
1103 sha256 = "1f9dkvpx186xjm4xab0qbph588mncp4vqk3fmxrsnqs43mks9c8j";
1103 sha256 = "1f9dkvpx186xjm4xab0qbph588mncp4vqk3fmxrsnqs43mks9c8j";
1104 };
1104 };
1105 meta = {
1105 meta = {
1106 license = [ pkgs.lib.licenses.bsdOriginal ];
1106 license = [ pkgs.lib.licenses.bsdOriginal ];
1107 };
1107 };
1108 };
1108 };
1109 "nbformat" = super.buildPythonPackage {
1109 "nbformat" = super.buildPythonPackage {
1110 name = "nbformat-4.4.0";
1110 name = "nbformat-4.4.0";
1111 doCheck = false;
1111 doCheck = false;
1112 propagatedBuildInputs = [
1112 propagatedBuildInputs = [
1113 self."ipython-genutils"
1113 self."ipython-genutils"
1114 self."traitlets"
1114 self."traitlets"
1115 self."jsonschema"
1115 self."jsonschema"
1116 self."jupyter-core"
1116 self."jupyter-core"
1117 ];
1117 ];
1118 src = fetchurl {
1118 src = fetchurl {
1119 url = "https://files.pythonhosted.org/packages/6e/0e/160754f7ae3e984863f585a3743b0ed1702043a81245907c8fae2d537155/nbformat-4.4.0.tar.gz";
1119 url = "https://files.pythonhosted.org/packages/6e/0e/160754f7ae3e984863f585a3743b0ed1702043a81245907c8fae2d537155/nbformat-4.4.0.tar.gz";
1120 sha256 = "00nlf08h8yc4q73nphfvfhxrcnilaqanb8z0mdy6nxk0vzq4wjgp";
1120 sha256 = "00nlf08h8yc4q73nphfvfhxrcnilaqanb8z0mdy6nxk0vzq4wjgp";
1121 };
1121 };
1122 meta = {
1122 meta = {
1123 license = [ pkgs.lib.licenses.bsdOriginal ];
1123 license = [ pkgs.lib.licenses.bsdOriginal ];
1124 };
1124 };
1125 };
1125 };
1126 "packaging" = super.buildPythonPackage {
1126 "packaging" = super.buildPythonPackage {
1127 name = "packaging-20.3";
1127 name = "packaging-20.3";
1128 doCheck = false;
1128 doCheck = false;
1129 propagatedBuildInputs = [
1129 propagatedBuildInputs = [
1130 self."pyparsing"
1130 self."pyparsing"
1131 self."six"
1131 self."six"
1132 ];
1132 ];
1133 src = fetchurl {
1133 src = fetchurl {
1134 url = "https://files.pythonhosted.org/packages/65/37/83e3f492eb52d771e2820e88105f605335553fe10422cba9d256faeb1702/packaging-20.3.tar.gz";
1134 url = "https://files.pythonhosted.org/packages/65/37/83e3f492eb52d771e2820e88105f605335553fe10422cba9d256faeb1702/packaging-20.3.tar.gz";
1135 sha256 = "18xpablq278janh03bai9xd4kz9b0yfp6vflazn725ns9x3jna9w";
1135 sha256 = "18xpablq278janh03bai9xd4kz9b0yfp6vflazn725ns9x3jna9w";
1136 };
1136 };
1137 meta = {
1137 meta = {
1138 license = [ pkgs.lib.licenses.bsdOriginal { fullName = "BSD or Apache License, Version 2.0"; } pkgs.lib.licenses.asl20 ];
1138 license = [ pkgs.lib.licenses.bsdOriginal { fullName = "BSD or Apache License, Version 2.0"; } pkgs.lib.licenses.asl20 ];
1139 };
1139 };
1140 };
1140 };
1141 "pandocfilters" = super.buildPythonPackage {
1141 "pandocfilters" = super.buildPythonPackage {
1142 name = "pandocfilters-1.4.2";
1142 name = "pandocfilters-1.4.2";
1143 doCheck = false;
1143 doCheck = false;
1144 src = fetchurl {
1144 src = fetchurl {
1145 url = "https://files.pythonhosted.org/packages/4c/ea/236e2584af67bb6df960832731a6e5325fd4441de001767da328c33368ce/pandocfilters-1.4.2.tar.gz";
1145 url = "https://files.pythonhosted.org/packages/4c/ea/236e2584af67bb6df960832731a6e5325fd4441de001767da328c33368ce/pandocfilters-1.4.2.tar.gz";
1146 sha256 = "1a8d9b7s48gmq9zj0pmbyv2sivn5i7m6mybgpkk4jm5vd7hp1pdk";
1146 sha256 = "1a8d9b7s48gmq9zj0pmbyv2sivn5i7m6mybgpkk4jm5vd7hp1pdk";
1147 };
1147 };
1148 meta = {
1148 meta = {
1149 license = [ pkgs.lib.licenses.bsdOriginal ];
1149 license = [ pkgs.lib.licenses.bsdOriginal ];
1150 };
1150 };
1151 };
1151 };
1152 "paste" = super.buildPythonPackage {
1152 "paste" = super.buildPythonPackage {
1153 name = "paste-3.4.0";
1153 name = "paste-3.4.0";
1154 doCheck = false;
1154 doCheck = false;
1155 propagatedBuildInputs = [
1155 propagatedBuildInputs = [
1156 self."six"
1156 self."six"
1157 ];
1157 ];
1158 src = fetchurl {
1158 src = fetchurl {
1159 url = "https://files.pythonhosted.org/packages/79/4a/45821b71dd40000507549afd1491546afad8279c0a87527c88776a794158/Paste-3.4.0.tar.gz";
1159 url = "https://files.pythonhosted.org/packages/79/4a/45821b71dd40000507549afd1491546afad8279c0a87527c88776a794158/Paste-3.4.0.tar.gz";
1160 sha256 = "16sichvhyci1gaarkjs35mai8vphh7b244qm14hj1isw38nx4c03";
1160 sha256 = "16sichvhyci1gaarkjs35mai8vphh7b244qm14hj1isw38nx4c03";
1161 };
1161 };
1162 meta = {
1162 meta = {
1163 license = [ pkgs.lib.licenses.mit ];
1163 license = [ pkgs.lib.licenses.mit ];
1164 };
1164 };
1165 };
1165 };
1166 "pastedeploy" = super.buildPythonPackage {
1166 "pastedeploy" = super.buildPythonPackage {
1167 name = "pastedeploy-2.1.0";
1167 name = "pastedeploy-2.1.0";
1168 doCheck = false;
1168 doCheck = false;
1169 src = fetchurl {
1169 src = fetchurl {
1170 url = "https://files.pythonhosted.org/packages/c4/e9/972a1c20318b3ae9edcab11a6cef64308fbae5d0d45ab52c6f8b2b8f35b8/PasteDeploy-2.1.0.tar.gz";
1170 url = "https://files.pythonhosted.org/packages/c4/e9/972a1c20318b3ae9edcab11a6cef64308fbae5d0d45ab52c6f8b2b8f35b8/PasteDeploy-2.1.0.tar.gz";
1171 sha256 = "16qsq5y6mryslmbp5pn35x4z8z3ndp5rpgl42h226879nrw9hmg7";
1171 sha256 = "16qsq5y6mryslmbp5pn35x4z8z3ndp5rpgl42h226879nrw9hmg7";
1172 };
1172 };
1173 meta = {
1173 meta = {
1174 license = [ pkgs.lib.licenses.mit ];
1174 license = [ pkgs.lib.licenses.mit ];
1175 };
1175 };
1176 };
1176 };
1177 "pastescript" = super.buildPythonPackage {
1177 "pastescript" = super.buildPythonPackage {
1178 name = "pastescript-3.2.0";
1178 name = "pastescript-3.2.0";
1179 doCheck = false;
1179 doCheck = false;
1180 propagatedBuildInputs = [
1180 propagatedBuildInputs = [
1181 self."paste"
1181 self."paste"
1182 self."pastedeploy"
1182 self."pastedeploy"
1183 self."six"
1183 self."six"
1184 ];
1184 ];
1185 src = fetchurl {
1185 src = fetchurl {
1186 url = "https://files.pythonhosted.org/packages/ff/47/45c6f5a3cb8f5abf786fea98dbb8d02400a55768a9b623afb7df12346c61/PasteScript-3.2.0.tar.gz";
1186 url = "https://files.pythonhosted.org/packages/ff/47/45c6f5a3cb8f5abf786fea98dbb8d02400a55768a9b623afb7df12346c61/PasteScript-3.2.0.tar.gz";
1187 sha256 = "1b3jq7xh383nvrrlblk05m37345bv97xrhx77wshllba3h7mq3wv";
1187 sha256 = "1b3jq7xh383nvrrlblk05m37345bv97xrhx77wshllba3h7mq3wv";
1188 };
1188 };
1189 meta = {
1189 meta = {
1190 license = [ pkgs.lib.licenses.mit ];
1190 license = [ pkgs.lib.licenses.mit ];
1191 };
1191 };
1192 };
1192 };
1193 "pathlib2" = super.buildPythonPackage {
1193 "pathlib2" = super.buildPythonPackage {
1194 name = "pathlib2-2.3.5";
1194 name = "pathlib2-2.3.5";
1195 doCheck = false;
1195 doCheck = false;
1196 propagatedBuildInputs = [
1196 propagatedBuildInputs = [
1197 self."six"
1197 self."six"
1198 self."scandir"
1198 self."scandir"
1199 ];
1199 ];
1200 src = fetchurl {
1200 src = fetchurl {
1201 url = "https://files.pythonhosted.org/packages/94/d8/65c86584e7e97ef824a1845c72bbe95d79f5b306364fa778a3c3e401b309/pathlib2-2.3.5.tar.gz";
1201 url = "https://files.pythonhosted.org/packages/94/d8/65c86584e7e97ef824a1845c72bbe95d79f5b306364fa778a3c3e401b309/pathlib2-2.3.5.tar.gz";
1202 sha256 = "0s4qa8c082fdkb17izh4mfgwrjd1n5pya18wvrbwqdvvb5xs9nbc";
1202 sha256 = "0s4qa8c082fdkb17izh4mfgwrjd1n5pya18wvrbwqdvvb5xs9nbc";
1203 };
1203 };
1204 meta = {
1204 meta = {
1205 license = [ pkgs.lib.licenses.mit ];
1205 license = [ pkgs.lib.licenses.mit ];
1206 };
1206 };
1207 };
1207 };
1208 "peppercorn" = super.buildPythonPackage {
1208 "peppercorn" = super.buildPythonPackage {
1209 name = "peppercorn-0.6";
1209 name = "peppercorn-0.6";
1210 doCheck = false;
1210 doCheck = false;
1211 src = fetchurl {
1211 src = fetchurl {
1212 url = "https://files.pythonhosted.org/packages/e4/77/93085de7108cdf1a0b092ff443872a8f9442c736d7ddebdf2f27627935f4/peppercorn-0.6.tar.gz";
1212 url = "https://files.pythonhosted.org/packages/e4/77/93085de7108cdf1a0b092ff443872a8f9442c736d7ddebdf2f27627935f4/peppercorn-0.6.tar.gz";
1213 sha256 = "1ip4bfwcpwkq9hz2dai14k2cyabvwrnvcvrcmzxmqm04g8fnimwn";
1213 sha256 = "1ip4bfwcpwkq9hz2dai14k2cyabvwrnvcvrcmzxmqm04g8fnimwn";
1214 };
1214 };
1215 meta = {
1215 meta = {
1216 license = [ { fullName = "BSD-derived (http://www.repoze.org/LICENSE.txt)"; } ];
1216 license = [ { fullName = "BSD-derived (http://www.repoze.org/LICENSE.txt)"; } ];
1217 };
1217 };
1218 };
1218 };
1219 "pexpect" = super.buildPythonPackage {
1219 "pexpect" = super.buildPythonPackage {
1220 name = "pexpect-4.8.0";
1220 name = "pexpect-4.8.0";
1221 doCheck = false;
1221 doCheck = false;
1222 propagatedBuildInputs = [
1222 propagatedBuildInputs = [
1223 self."ptyprocess"
1223 self."ptyprocess"
1224 ];
1224 ];
1225 src = fetchurl {
1225 src = fetchurl {
1226 url = "https://files.pythonhosted.org/packages/e5/9b/ff402e0e930e70467a7178abb7c128709a30dfb22d8777c043e501bc1b10/pexpect-4.8.0.tar.gz";
1226 url = "https://files.pythonhosted.org/packages/e5/9b/ff402e0e930e70467a7178abb7c128709a30dfb22d8777c043e501bc1b10/pexpect-4.8.0.tar.gz";
1227 sha256 = "032cg337h8awydgypz6f4wx848lw8dyrj4zy988x0lyib4ws8rgw";
1227 sha256 = "032cg337h8awydgypz6f4wx848lw8dyrj4zy988x0lyib4ws8rgw";
1228 };
1228 };
1229 meta = {
1229 meta = {
1230 license = [ pkgs.lib.licenses.isc { fullName = "ISC License (ISCL)"; } ];
1230 license = [ pkgs.lib.licenses.isc { fullName = "ISC License (ISCL)"; } ];
1231 };
1231 };
1232 };
1232 };
1233 "pickleshare" = super.buildPythonPackage {
1233 "pickleshare" = super.buildPythonPackage {
1234 name = "pickleshare-0.7.5";
1234 name = "pickleshare-0.7.5";
1235 doCheck = false;
1235 doCheck = false;
1236 propagatedBuildInputs = [
1236 propagatedBuildInputs = [
1237 self."pathlib2"
1237 self."pathlib2"
1238 ];
1238 ];
1239 src = fetchurl {
1239 src = fetchurl {
1240 url = "https://files.pythonhosted.org/packages/d8/b6/df3c1c9b616e9c0edbc4fbab6ddd09df9535849c64ba51fcb6531c32d4d8/pickleshare-0.7.5.tar.gz";
1240 url = "https://files.pythonhosted.org/packages/d8/b6/df3c1c9b616e9c0edbc4fbab6ddd09df9535849c64ba51fcb6531c32d4d8/pickleshare-0.7.5.tar.gz";
1241 sha256 = "1jmghg3c53yp1i8cm6pcrm280ayi8621rwyav9fac7awjr3kss47";
1241 sha256 = "1jmghg3c53yp1i8cm6pcrm280ayi8621rwyav9fac7awjr3kss47";
1242 };
1242 };
1243 meta = {
1243 meta = {
1244 license = [ pkgs.lib.licenses.mit ];
1244 license = [ pkgs.lib.licenses.mit ];
1245 };
1245 };
1246 };
1246 };
1247 "plaster" = super.buildPythonPackage {
1247 "plaster" = super.buildPythonPackage {
1248 name = "plaster-1.0";
1248 name = "plaster-1.0";
1249 doCheck = false;
1249 doCheck = false;
1250 propagatedBuildInputs = [
1250 propagatedBuildInputs = [
1251 self."setuptools"
1251 self."setuptools"
1252 ];
1252 ];
1253 src = fetchurl {
1253 src = fetchurl {
1254 url = "https://files.pythonhosted.org/packages/37/e1/56d04382d718d32751017d32f351214384e529b794084eee20bb52405563/plaster-1.0.tar.gz";
1254 url = "https://files.pythonhosted.org/packages/37/e1/56d04382d718d32751017d32f351214384e529b794084eee20bb52405563/plaster-1.0.tar.gz";
1255 sha256 = "1hy8k0nv2mxq94y5aysk6hjk9ryb4bsd13g83m60hcyzxz3wflc3";
1255 sha256 = "1hy8k0nv2mxq94y5aysk6hjk9ryb4bsd13g83m60hcyzxz3wflc3";
1256 };
1256 };
1257 meta = {
1257 meta = {
1258 license = [ pkgs.lib.licenses.mit ];
1258 license = [ pkgs.lib.licenses.mit ];
1259 };
1259 };
1260 };
1260 };
1261 "plaster-pastedeploy" = super.buildPythonPackage {
1261 "plaster-pastedeploy" = super.buildPythonPackage {
1262 name = "plaster-pastedeploy-0.7";
1262 name = "plaster-pastedeploy-0.7";
1263 doCheck = false;
1263 doCheck = false;
1264 propagatedBuildInputs = [
1264 propagatedBuildInputs = [
1265 self."pastedeploy"
1265 self."pastedeploy"
1266 self."plaster"
1266 self."plaster"
1267 ];
1267 ];
1268 src = fetchurl {
1268 src = fetchurl {
1269 url = "https://files.pythonhosted.org/packages/99/69/2d3bc33091249266a1bd3cf24499e40ab31d54dffb4a7d76fe647950b98c/plaster_pastedeploy-0.7.tar.gz";
1269 url = "https://files.pythonhosted.org/packages/99/69/2d3bc33091249266a1bd3cf24499e40ab31d54dffb4a7d76fe647950b98c/plaster_pastedeploy-0.7.tar.gz";
1270 sha256 = "1zg7gcsvc1kzay1ry5p699rg2qavfsxqwl17mqxzr0gzw6j9679r";
1270 sha256 = "1zg7gcsvc1kzay1ry5p699rg2qavfsxqwl17mqxzr0gzw6j9679r";
1271 };
1271 };
1272 meta = {
1272 meta = {
1273 license = [ pkgs.lib.licenses.mit ];
1273 license = [ pkgs.lib.licenses.mit ];
1274 };
1274 };
1275 };
1275 };
1276 "pluggy" = super.buildPythonPackage {
1276 "pluggy" = super.buildPythonPackage {
1277 name = "pluggy-0.13.1";
1277 name = "pluggy-0.13.1";
1278 doCheck = false;
1278 doCheck = false;
1279 propagatedBuildInputs = [
1279 propagatedBuildInputs = [
1280 self."importlib-metadata"
1280 self."importlib-metadata"
1281 ];
1281 ];
1282 src = fetchurl {
1282 src = fetchurl {
1283 url = "https://files.pythonhosted.org/packages/f8/04/7a8542bed4b16a65c2714bf76cf5a0b026157da7f75e87cc88774aa10b14/pluggy-0.13.1.tar.gz";
1283 url = "https://files.pythonhosted.org/packages/f8/04/7a8542bed4b16a65c2714bf76cf5a0b026157da7f75e87cc88774aa10b14/pluggy-0.13.1.tar.gz";
1284 sha256 = "1c35qyhvy27q9ih9n899f3h4sdnpgq027dbiilly2qb5cvgarchm";
1284 sha256 = "1c35qyhvy27q9ih9n899f3h4sdnpgq027dbiilly2qb5cvgarchm";
1285 };
1285 };
1286 meta = {
1286 meta = {
1287 license = [ pkgs.lib.licenses.mit ];
1287 license = [ pkgs.lib.licenses.mit ];
1288 };
1288 };
1289 };
1289 };
1290 "premailer" = super.buildPythonPackage {
1290 "premailer" = super.buildPythonPackage {
1291 name = "premailer-3.6.1";
1291 name = "premailer-3.6.1";
1292 doCheck = false;
1292 doCheck = false;
1293 propagatedBuildInputs = [
1293 propagatedBuildInputs = [
1294 self."lxml"
1294 self."lxml"
1295 self."cssselect"
1295 self."cssselect"
1296 self."cssutils"
1296 self."cssutils"
1297 self."requests"
1297 self."requests"
1298 self."cachetools"
1298 self."cachetools"
1299 ];
1299 ];
1300 src = fetchurl {
1300 src = fetchurl {
1301 url = "https://files.pythonhosted.org/packages/62/da/2f43cdf9d3d79c80c4856a12389a1f257d65fe9ccc44bc6b4383c8a18e33/premailer-3.6.1.tar.gz";
1301 url = "https://files.pythonhosted.org/packages/62/da/2f43cdf9d3d79c80c4856a12389a1f257d65fe9ccc44bc6b4383c8a18e33/premailer-3.6.1.tar.gz";
1302 sha256 = "08pshx7a110k4ll20x0xhpvyn3kkipkrbgxjjn7ncdxs54ihdhgw";
1302 sha256 = "08pshx7a110k4ll20x0xhpvyn3kkipkrbgxjjn7ncdxs54ihdhgw";
1303 };
1303 };
1304 meta = {
1304 meta = {
1305 license = [ pkgs.lib.licenses.psfl { fullName = "Python"; } ];
1305 license = [ pkgs.lib.licenses.psfl { fullName = "Python"; } ];
1306 };
1306 };
1307 };
1307 };
1308 "prompt-toolkit" = super.buildPythonPackage {
1308 "prompt-toolkit" = super.buildPythonPackage {
1309 name = "prompt-toolkit-1.0.18";
1309 name = "prompt-toolkit-1.0.18";
1310 doCheck = false;
1310 doCheck = false;
1311 propagatedBuildInputs = [
1311 propagatedBuildInputs = [
1312 self."six"
1312 self."six"
1313 self."wcwidth"
1313 self."wcwidth"
1314 ];
1314 ];
1315 src = fetchurl {
1315 src = fetchurl {
1316 url = "https://files.pythonhosted.org/packages/c5/64/c170e5b1913b540bf0c8ab7676b21fdd1d25b65ddeb10025c6ca43cccd4c/prompt_toolkit-1.0.18.tar.gz";
1316 url = "https://files.pythonhosted.org/packages/c5/64/c170e5b1913b540bf0c8ab7676b21fdd1d25b65ddeb10025c6ca43cccd4c/prompt_toolkit-1.0.18.tar.gz";
1317 sha256 = "09h1153wgr5x2ny7ds0w2m81n3bb9j8hjb8sjfnrg506r01clkyx";
1317 sha256 = "09h1153wgr5x2ny7ds0w2m81n3bb9j8hjb8sjfnrg506r01clkyx";
1318 };
1318 };
1319 meta = {
1319 meta = {
1320 license = [ pkgs.lib.licenses.bsdOriginal ];
1320 license = [ pkgs.lib.licenses.bsdOriginal ];
1321 };
1321 };
1322 };
1322 };
1323 "psutil" = super.buildPythonPackage {
1323 "psutil" = super.buildPythonPackage {
1324 name = "psutil-5.7.0";
1324 name = "psutil-5.7.0";
1325 doCheck = false;
1325 doCheck = false;
1326 src = fetchurl {
1326 src = fetchurl {
1327 url = "https://files.pythonhosted.org/packages/c4/b8/3512f0e93e0db23a71d82485ba256071ebef99b227351f0f5540f744af41/psutil-5.7.0.tar.gz";
1327 url = "https://files.pythonhosted.org/packages/c4/b8/3512f0e93e0db23a71d82485ba256071ebef99b227351f0f5540f744af41/psutil-5.7.0.tar.gz";
1328 sha256 = "03jykdi3dgf1cdal9bv4fq9zjvzj9l9bs99gi5ar81sdl5nc2pk8";
1328 sha256 = "03jykdi3dgf1cdal9bv4fq9zjvzj9l9bs99gi5ar81sdl5nc2pk8";
1329 };
1329 };
1330 meta = {
1330 meta = {
1331 license = [ pkgs.lib.licenses.bsdOriginal ];
1331 license = [ pkgs.lib.licenses.bsdOriginal ];
1332 };
1332 };
1333 };
1333 };
1334 "psycopg2" = super.buildPythonPackage {
1334 "psycopg2" = super.buildPythonPackage {
1335 name = "psycopg2-2.8.4";
1335 name = "psycopg2-2.8.4";
1336 doCheck = false;
1336 doCheck = false;
1337 src = fetchurl {
1337 src = fetchurl {
1338 url = "https://files.pythonhosted.org/packages/84/d7/6a93c99b5ba4d4d22daa3928b983cec66df4536ca50b22ce5dcac65e4e71/psycopg2-2.8.4.tar.gz";
1338 url = "https://files.pythonhosted.org/packages/84/d7/6a93c99b5ba4d4d22daa3928b983cec66df4536ca50b22ce5dcac65e4e71/psycopg2-2.8.4.tar.gz";
1339 sha256 = "1djvh98pi4hjd8rxbq8qzc63bg8v78k33yg6pl99wak61b6fb67q";
1339 sha256 = "1djvh98pi4hjd8rxbq8qzc63bg8v78k33yg6pl99wak61b6fb67q";
1340 };
1340 };
1341 meta = {
1341 meta = {
1342 license = [ pkgs.lib.licenses.zpl21 { fullName = "GNU Library or Lesser General Public License (LGPL)"; } { fullName = "LGPL with exceptions or ZPL"; } ];
1342 license = [ pkgs.lib.licenses.zpl21 { fullName = "GNU Library or Lesser General Public License (LGPL)"; } { fullName = "LGPL with exceptions or ZPL"; } ];
1343 };
1343 };
1344 };
1344 };
1345 "ptyprocess" = super.buildPythonPackage {
1345 "ptyprocess" = super.buildPythonPackage {
1346 name = "ptyprocess-0.6.0";
1346 name = "ptyprocess-0.6.0";
1347 doCheck = false;
1347 doCheck = false;
1348 src = fetchurl {
1348 src = fetchurl {
1349 url = "https://files.pythonhosted.org/packages/7d/2d/e4b8733cf79b7309d84c9081a4ab558c89d8c89da5961bf4ddb050ca1ce0/ptyprocess-0.6.0.tar.gz";
1349 url = "https://files.pythonhosted.org/packages/7d/2d/e4b8733cf79b7309d84c9081a4ab558c89d8c89da5961bf4ddb050ca1ce0/ptyprocess-0.6.0.tar.gz";
1350 sha256 = "1h4lcd3w5nrxnsk436ar7fwkiy5rfn5wj2xwy9l0r4mdqnf2jgwj";
1350 sha256 = "1h4lcd3w5nrxnsk436ar7fwkiy5rfn5wj2xwy9l0r4mdqnf2jgwj";
1351 };
1351 };
1352 meta = {
1352 meta = {
1353 license = [ ];
1353 license = [ ];
1354 };
1354 };
1355 };
1355 };
1356 "py" = super.buildPythonPackage {
1356 "py" = super.buildPythonPackage {
1357 name = "py-1.8.0";
1357 name = "py-1.8.0";
1358 doCheck = false;
1358 doCheck = false;
1359 src = fetchurl {
1359 src = fetchurl {
1360 url = "https://files.pythonhosted.org/packages/f1/5a/87ca5909f400a2de1561f1648883af74345fe96349f34f737cdfc94eba8c/py-1.8.0.tar.gz";
1360 url = "https://files.pythonhosted.org/packages/f1/5a/87ca5909f400a2de1561f1648883af74345fe96349f34f737cdfc94eba8c/py-1.8.0.tar.gz";
1361 sha256 = "0lsy1gajva083pzc7csj1cvbmminb7b4l6a0prdzyb3fd829nqyw";
1361 sha256 = "0lsy1gajva083pzc7csj1cvbmminb7b4l6a0prdzyb3fd829nqyw";
1362 };
1362 };
1363 meta = {
1363 meta = {
1364 license = [ pkgs.lib.licenses.mit ];
1364 license = [ pkgs.lib.licenses.mit ];
1365 };
1365 };
1366 };
1366 };
1367 "py-bcrypt" = super.buildPythonPackage {
1367 "py-bcrypt" = super.buildPythonPackage {
1368 name = "py-bcrypt-0.4";
1368 name = "py-bcrypt-0.4";
1369 doCheck = false;
1369 doCheck = false;
1370 src = fetchurl {
1370 src = fetchurl {
1371 url = "https://files.pythonhosted.org/packages/68/b1/1c3068c5c4d2e35c48b38dcc865301ebfdf45f54507086ac65ced1fd3b3d/py-bcrypt-0.4.tar.gz";
1371 url = "https://files.pythonhosted.org/packages/68/b1/1c3068c5c4d2e35c48b38dcc865301ebfdf45f54507086ac65ced1fd3b3d/py-bcrypt-0.4.tar.gz";
1372 sha256 = "0y6smdggwi5s72v6p1nn53dg6w05hna3d264cq6kas0lap73p8az";
1372 sha256 = "0y6smdggwi5s72v6p1nn53dg6w05hna3d264cq6kas0lap73p8az";
1373 };
1373 };
1374 meta = {
1374 meta = {
1375 license = [ pkgs.lib.licenses.bsdOriginal ];
1375 license = [ pkgs.lib.licenses.bsdOriginal ];
1376 };
1376 };
1377 };
1377 };
1378 "py-gfm" = super.buildPythonPackage {
1378 "py-gfm" = super.buildPythonPackage {
1379 name = "py-gfm-0.1.4";
1379 name = "py-gfm-0.1.4";
1380 doCheck = false;
1380 doCheck = false;
1381 propagatedBuildInputs = [
1381 propagatedBuildInputs = [
1382 self."setuptools"
1382 self."setuptools"
1383 self."markdown"
1383 self."markdown"
1384 ];
1384 ];
1385 src = fetchurl {
1385 src = fetchurl {
1386 url = "https://files.pythonhosted.org/packages/06/ee/004a03a1d92bb386dae44f6dd087db541bc5093374f1637d4d4ae5596cc2/py-gfm-0.1.4.tar.gz";
1386 url = "https://files.pythonhosted.org/packages/06/ee/004a03a1d92bb386dae44f6dd087db541bc5093374f1637d4d4ae5596cc2/py-gfm-0.1.4.tar.gz";
1387 sha256 = "0zip06g2isivx8fzgqd4n9qzsa22c25jas1rsb7m2rnjg72m0rzg";
1387 sha256 = "0zip06g2isivx8fzgqd4n9qzsa22c25jas1rsb7m2rnjg72m0rzg";
1388 };
1388 };
1389 meta = {
1389 meta = {
1390 license = [ pkgs.lib.licenses.bsdOriginal ];
1390 license = [ pkgs.lib.licenses.bsdOriginal ];
1391 };
1391 };
1392 };
1392 };
1393 "pyasn1" = super.buildPythonPackage {
1393 "pyasn1" = super.buildPythonPackage {
1394 name = "pyasn1-0.4.8";
1394 name = "pyasn1-0.4.8";
1395 doCheck = false;
1395 doCheck = false;
1396 src = fetchurl {
1396 src = fetchurl {
1397 url = "https://files.pythonhosted.org/packages/a4/db/fffec68299e6d7bad3d504147f9094830b704527a7fc098b721d38cc7fa7/pyasn1-0.4.8.tar.gz";
1397 url = "https://files.pythonhosted.org/packages/a4/db/fffec68299e6d7bad3d504147f9094830b704527a7fc098b721d38cc7fa7/pyasn1-0.4.8.tar.gz";
1398 sha256 = "1fnhbi3rmk47l9851gbik0flfr64vs5j0hbqx24cafjap6gprxxf";
1398 sha256 = "1fnhbi3rmk47l9851gbik0flfr64vs5j0hbqx24cafjap6gprxxf";
1399 };
1399 };
1400 meta = {
1400 meta = {
1401 license = [ pkgs.lib.licenses.bsdOriginal ];
1401 license = [ pkgs.lib.licenses.bsdOriginal ];
1402 };
1402 };
1403 };
1403 };
1404 "pyasn1-modules" = super.buildPythonPackage {
1404 "pyasn1-modules" = super.buildPythonPackage {
1405 name = "pyasn1-modules-0.2.6";
1405 name = "pyasn1-modules-0.2.6";
1406 doCheck = false;
1406 doCheck = false;
1407 propagatedBuildInputs = [
1407 propagatedBuildInputs = [
1408 self."pyasn1"
1408 self."pyasn1"
1409 ];
1409 ];
1410 src = fetchurl {
1410 src = fetchurl {
1411 url = "https://files.pythonhosted.org/packages/f1/a9/a1ef72a0e43feff643cf0130a08123dea76205e7a0dda37e3efb5f054a31/pyasn1-modules-0.2.6.tar.gz";
1411 url = "https://files.pythonhosted.org/packages/f1/a9/a1ef72a0e43feff643cf0130a08123dea76205e7a0dda37e3efb5f054a31/pyasn1-modules-0.2.6.tar.gz";
1412 sha256 = "08hph9j1r018drnrny29l7dl2q0cin78csswrhwrh8jmq61pmha3";
1412 sha256 = "08hph9j1r018drnrny29l7dl2q0cin78csswrhwrh8jmq61pmha3";
1413 };
1413 };
1414 meta = {
1414 meta = {
1415 license = [ pkgs.lib.licenses.bsdOriginal pkgs.lib.licenses.bsd2 ];
1415 license = [ pkgs.lib.licenses.bsdOriginal pkgs.lib.licenses.bsd2 ];
1416 };
1416 };
1417 };
1417 };
1418 "pycparser" = super.buildPythonPackage {
1418 "pycparser" = super.buildPythonPackage {
1419 name = "pycparser-2.20";
1419 name = "pycparser-2.20";
1420 doCheck = false;
1420 doCheck = false;
1421 src = fetchurl {
1421 src = fetchurl {
1422 url = "https://files.pythonhosted.org/packages/0f/86/e19659527668d70be91d0369aeaa055b4eb396b0f387a4f92293a20035bd/pycparser-2.20.tar.gz";
1422 url = "https://files.pythonhosted.org/packages/0f/86/e19659527668d70be91d0369aeaa055b4eb396b0f387a4f92293a20035bd/pycparser-2.20.tar.gz";
1423 sha256 = "1w0m3xvlrzq4lkbvd1ngfm8mdw64r1yxy6n7djlw6qj5d0km6ird";
1423 sha256 = "1w0m3xvlrzq4lkbvd1ngfm8mdw64r1yxy6n7djlw6qj5d0km6ird";
1424 };
1424 };
1425 meta = {
1425 meta = {
1426 license = [ pkgs.lib.licenses.bsdOriginal ];
1426 license = [ pkgs.lib.licenses.bsdOriginal ];
1427 };
1427 };
1428 };
1428 };
1429 "pycrypto" = super.buildPythonPackage {
1429 "pycrypto" = super.buildPythonPackage {
1430 name = "pycrypto-2.6.1";
1430 name = "pycrypto-2.6.1";
1431 doCheck = false;
1431 doCheck = false;
1432 src = fetchurl {
1432 src = fetchurl {
1433 url = "https://files.pythonhosted.org/packages/60/db/645aa9af249f059cc3a368b118de33889219e0362141e75d4eaf6f80f163/pycrypto-2.6.1.tar.gz";
1433 url = "https://files.pythonhosted.org/packages/60/db/645aa9af249f059cc3a368b118de33889219e0362141e75d4eaf6f80f163/pycrypto-2.6.1.tar.gz";
1434 sha256 = "0g0ayql5b9mkjam8hym6zyg6bv77lbh66rv1fyvgqb17kfc1xkpj";
1434 sha256 = "0g0ayql5b9mkjam8hym6zyg6bv77lbh66rv1fyvgqb17kfc1xkpj";
1435 };
1435 };
1436 meta = {
1436 meta = {
1437 license = [ pkgs.lib.licenses.publicDomain ];
1437 license = [ pkgs.lib.licenses.publicDomain ];
1438 };
1438 };
1439 };
1439 };
1440 "pycurl" = super.buildPythonPackage {
1440 "pycurl" = super.buildPythonPackage {
1441 name = "pycurl-7.43.0.3";
1441 name = "pycurl-7.43.0.3";
1442 doCheck = false;
1442 doCheck = false;
1443 src = fetchurl {
1443 src = fetchurl {
1444 url = "https://files.pythonhosted.org/packages/ac/b3/0f3979633b7890bab6098d84c84467030b807a1e2b31f5d30103af5a71ca/pycurl-7.43.0.3.tar.gz";
1444 url = "https://files.pythonhosted.org/packages/ac/b3/0f3979633b7890bab6098d84c84467030b807a1e2b31f5d30103af5a71ca/pycurl-7.43.0.3.tar.gz";
1445 sha256 = "13nsvqhvnmnvfk75s8iynqsgszyv06cjp4drd3psi7zpbh63623g";
1445 sha256 = "13nsvqhvnmnvfk75s8iynqsgszyv06cjp4drd3psi7zpbh63623g";
1446 };
1446 };
1447 meta = {
1447 meta = {
1448 license = [ pkgs.lib.licenses.mit { fullName = "LGPL/MIT"; } { fullName = "GNU Library or Lesser General Public License (LGPL)"; } ];
1448 license = [ pkgs.lib.licenses.mit { fullName = "LGPL/MIT"; } { fullName = "GNU Library or Lesser General Public License (LGPL)"; } ];
1449 };
1449 };
1450 };
1450 };
1451 "pygments" = super.buildPythonPackage {
1451 "pygments" = super.buildPythonPackage {
1452 name = "pygments-2.4.2";
1452 name = "pygments-2.4.2";
1453 doCheck = false;
1453 doCheck = false;
1454 src = fetchurl {
1454 src = fetchurl {
1455 url = "https://files.pythonhosted.org/packages/7e/ae/26808275fc76bf2832deb10d3a3ed3107bc4de01b85dcccbe525f2cd6d1e/Pygments-2.4.2.tar.gz";
1455 url = "https://files.pythonhosted.org/packages/7e/ae/26808275fc76bf2832deb10d3a3ed3107bc4de01b85dcccbe525f2cd6d1e/Pygments-2.4.2.tar.gz";
1456 sha256 = "15v2sqm5g12bqa0c7wikfh9ck2nl97ayizy1hpqhmws5gqalq748";
1456 sha256 = "15v2sqm5g12bqa0c7wikfh9ck2nl97ayizy1hpqhmws5gqalq748";
1457 };
1457 };
1458 meta = {
1458 meta = {
1459 license = [ pkgs.lib.licenses.bsdOriginal ];
1459 license = [ pkgs.lib.licenses.bsdOriginal ];
1460 };
1460 };
1461 };
1461 };
1462 "pymysql" = super.buildPythonPackage {
1462 "pymysql" = super.buildPythonPackage {
1463 name = "pymysql-0.8.1";
1463 name = "pymysql-0.8.1";
1464 doCheck = false;
1464 doCheck = false;
1465 src = fetchurl {
1465 src = fetchurl {
1466 url = "https://files.pythonhosted.org/packages/44/39/6bcb83cae0095a31b6be4511707fdf2009d3e29903a55a0494d3a9a2fac0/PyMySQL-0.8.1.tar.gz";
1466 url = "https://files.pythonhosted.org/packages/44/39/6bcb83cae0095a31b6be4511707fdf2009d3e29903a55a0494d3a9a2fac0/PyMySQL-0.8.1.tar.gz";
1467 sha256 = "0a96crz55bw4h6myh833skrli7b0ck89m3x673y2z2ryy7zrpq9l";
1467 sha256 = "0a96crz55bw4h6myh833skrli7b0ck89m3x673y2z2ryy7zrpq9l";
1468 };
1468 };
1469 meta = {
1469 meta = {
1470 license = [ pkgs.lib.licenses.mit ];
1470 license = [ pkgs.lib.licenses.mit ];
1471 };
1471 };
1472 };
1472 };
1473 "pyotp" = super.buildPythonPackage {
1473 "pyotp" = super.buildPythonPackage {
1474 name = "pyotp-2.3.0";
1474 name = "pyotp-2.3.0";
1475 doCheck = false;
1475 doCheck = false;
1476 src = fetchurl {
1476 src = fetchurl {
1477 url = "https://files.pythonhosted.org/packages/f7/15/395c4945ea6bc37e8811280bb675615cb4c2b2c1cd70bdc43329da91a386/pyotp-2.3.0.tar.gz";
1477 url = "https://files.pythonhosted.org/packages/f7/15/395c4945ea6bc37e8811280bb675615cb4c2b2c1cd70bdc43329da91a386/pyotp-2.3.0.tar.gz";
1478 sha256 = "18d13ikra1iq0xyfqfm72zhgwxi2qi9ps6z1a6zmqp4qrn57wlzw";
1478 sha256 = "18d13ikra1iq0xyfqfm72zhgwxi2qi9ps6z1a6zmqp4qrn57wlzw";
1479 };
1479 };
1480 meta = {
1480 meta = {
      license = [ pkgs.lib.licenses.mit ];
    };
  };
  "pyparsing" = super.buildPythonPackage {
    name = "pyparsing-2.4.7";
    doCheck = false;
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/c1/47/dfc9c342c9842bbe0036c7f763d2d6686bcf5eb1808ba3e170afdb282210/pyparsing-2.4.7.tar.gz";
      sha256 = "1hgc8qrbq1ymxbwfbjghv01fm3fbpjwpjwi0bcailxxzhf3yq0y2";
    };
    meta = {
      license = [ pkgs.lib.licenses.mit ];
    };
  };
  "pyramid" = super.buildPythonPackage {
    name = "pyramid-1.10.4";
    doCheck = false;
    propagatedBuildInputs = [
      self."hupper"
      self."plaster"
      self."plaster-pastedeploy"
      self."setuptools"
      self."translationstring"
      self."venusian"
      self."webob"
      self."zope.deprecation"
      self."zope.interface"
      self."repoze.lru"
    ];
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/c2/43/1ae701c9c6bb3a434358e678a5e72c96e8aa55cf4cb1d2fa2041b5dd38b7/pyramid-1.10.4.tar.gz";
      sha256 = "0rkxs1ajycg2zh1c94xlmls56mx5m161sn8112skj0amza6cn36q";
    };
    meta = {
      license = [ { fullName = "Repoze Public License"; } { fullName = "BSD-derived (http://www.repoze.org/LICENSE.txt)"; } ];
    };
  };
  "pyramid-debugtoolbar" = super.buildPythonPackage {
    name = "pyramid-debugtoolbar-4.6.1";
    doCheck = false;
    propagatedBuildInputs = [
      self."pyramid"
      self."pyramid-mako"
      self."repoze.lru"
      self."pygments"
      self."ipaddress"
    ];
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/99/f6/b8603f82c18275be293921bc3a2184205056ca505747bf64ab8a0c08e124/pyramid_debugtoolbar-4.6.1.tar.gz";
      sha256 = "185z7q8n959ga5331iczwra2iljwkidfx4qn6bbd7vm3rm4w6llv";
    };
    meta = {
      license = [ { fullName = "Repoze Public License"; } pkgs.lib.licenses.bsdOriginal ];
    };
  };
  "pyramid-jinja2" = super.buildPythonPackage {
    name = "pyramid-jinja2-2.7";
    doCheck = false;
    propagatedBuildInputs = [
      self."pyramid"
      self."zope.deprecation"
      self."jinja2"
      self."markupsafe"
    ];
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/d8/80/d60a7233823de22ce77bd864a8a83736a1fe8b49884b08303a2e68b2c853/pyramid_jinja2-2.7.tar.gz";
      sha256 = "1sz5s0pp5jqhf4w22w9527yz8hgdi4mhr6apd6vw1gm5clghh8aw";
    };
    meta = {
      license = [ { fullName = "Repoze Public License"; } { fullName = "BSD-derived (http://www.repoze.org/LICENSE.txt)"; } ];
    };
  };
  "pyramid-apispec" = super.buildPythonPackage {
    name = "pyramid-apispec-0.3.2";
    doCheck = false;
    propagatedBuildInputs = [
      self."apispec"
    ];
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/2a/30/1dea5d81ea635449572ba60ec3148310d75ae4530c3c695f54b0991bb8c7/pyramid_apispec-0.3.2.tar.gz";
      sha256 = "0ffrcqp9dkykivhfcq0v9lgy6w0qhwl6x78925vfjmayly9r8da0";
    };
    meta = {
      license = [ pkgs.lib.licenses.bsdOriginal ];
    };
  };
  "pyramid-mailer" = super.buildPythonPackage {
    name = "pyramid-mailer-0.15.1";
    doCheck = false;
    propagatedBuildInputs = [
      self."pyramid"
      self."repoze.sendmail"
      self."transaction"
    ];
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/a0/f2/6febf5459dff4d7e653314d575469ad2e11b9d2af2c3606360e1c67202f2/pyramid_mailer-0.15.1.tar.gz";
      sha256 = "16vg8jb203jgb7b0hd6wllfqvp542qh2ry1gjai2m6qpv5agy2pc";
    };
    meta = {
      license = [ pkgs.lib.licenses.bsdOriginal ];
    };
  };
  "pyramid-mako" = super.buildPythonPackage {
    name = "pyramid-mako-1.1.0";
    doCheck = false;
    propagatedBuildInputs = [
      self."pyramid"
      self."mako"
    ];
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/63/7b/5e2af68f675071a6bad148c1c393928f0ef5fcd94e95cbf53b89d6471a83/pyramid_mako-1.1.0.tar.gz";
      sha256 = "1qj0m091mnii86j2q1d82yir22nha361rvhclvg3s70z8iiwhrh0";
    };
    meta = {
      license = [ { fullName = "Repoze Public License"; } { fullName = "BSD-derived (http://www.repoze.org/LICENSE.txt)"; } ];
    };
  };
  "pysqlite" = super.buildPythonPackage {
    name = "pysqlite-2.8.3";
    doCheck = false;
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/42/02/981b6703e3c83c5b25a829c6e77aad059f9481b0bbacb47e6e8ca12bd731/pysqlite-2.8.3.tar.gz";
      sha256 = "1424gwq9sil2ffmnizk60q36vydkv8rxs6m7xs987kz8cdc37lqp";
    };
    meta = {
      license = [ { fullName = "zlib/libpng License"; } { fullName = "zlib/libpng license"; } ];
    };
  };
  "pytest" = super.buildPythonPackage {
    name = "pytest-4.6.5";
    doCheck = false;
    propagatedBuildInputs = [
      self."py"
      self."six"
      self."packaging"
      self."attrs"
      self."atomicwrites"
      self."pluggy"
      self."importlib-metadata"
      self."wcwidth"
      self."funcsigs"
      self."pathlib2"
      self."more-itertools"
    ];
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/2a/c6/1d1f32f6a5009900521b12e6560fb6b7245b0d4bc3fb771acd63d10e30e1/pytest-4.6.5.tar.gz";
      sha256 = "0iykwwfp4h181nd7rsihh2120b0rkawlw7rvbl19sgfspncr3hwg";
    };
    meta = {
      license = [ pkgs.lib.licenses.mit ];
    };
  };
  "pytest-cov" = super.buildPythonPackage {
    name = "pytest-cov-2.7.1";
    doCheck = false;
    propagatedBuildInputs = [
      self."pytest"
      self."coverage"
    ];
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/bb/0f/3db7ff86801883b21d5353b258c994b1b8e2abbc804e2273b8d0fd19004b/pytest-cov-2.7.1.tar.gz";
      sha256 = "0filvmmyqm715azsl09ql8hy2x7h286n6d8z5x42a1wpvvys83p0";
    };
    meta = {
      license = [ pkgs.lib.licenses.bsdOriginal pkgs.lib.licenses.mit ];
    };
  };
  "pytest-profiling" = super.buildPythonPackage {
    name = "pytest-profiling-1.7.0";
    doCheck = false;
    propagatedBuildInputs = [
      self."six"
      self."pytest"
      self."gprof2dot"
    ];
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/39/70/22a4b33739f07f1732a63e33bbfbf68e0fa58cfba9d200e76d01921eddbf/pytest-profiling-1.7.0.tar.gz";
      sha256 = "0abz9gi26jpcfdzgsvwad91555lpgdc8kbymicmms8k2fqa8z4wk";
    };
    meta = {
      license = [ pkgs.lib.licenses.mit ];
    };
  };
  "pytest-runner" = super.buildPythonPackage {
    name = "pytest-runner-5.1";
    doCheck = false;
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/d9/6d/4b41a74b31720e25abd4799be72d54811da4b4d0233e38b75864dcc1f7ad/pytest-runner-5.1.tar.gz";
      sha256 = "0ykfcnpp8c22winj63qzc07l5axwlc9ikl8vn05sc32gv3417815";
    };
    meta = {
      license = [ pkgs.lib.licenses.mit ];
    };
  };
  "pytest-sugar" = super.buildPythonPackage {
    name = "pytest-sugar-0.9.2";
    doCheck = false;
    propagatedBuildInputs = [
      self."pytest"
      self."termcolor"
      self."packaging"
    ];
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/55/59/f02f78d1c80f7e03e23177f60624c8106d4f23d124c921df103f65692464/pytest-sugar-0.9.2.tar.gz";
      sha256 = "1asq7yc4g8bx2sn7yy974mhc9ywvaihasjab4inkirdwn9s7mn7w";
    };
    meta = {
      license = [ pkgs.lib.licenses.bsdOriginal ];
    };
  };
  "pytest-timeout" = super.buildPythonPackage {
    name = "pytest-timeout-1.3.3";
    doCheck = false;
    propagatedBuildInputs = [
      self."pytest"
    ];
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/13/48/7a166eaa29c1dca6cc253e3ba5773ff2e4aa4f567c1ea3905808e95ac5c1/pytest-timeout-1.3.3.tar.gz";
      sha256 = "1cczcjhw4xx5sjkhxlhc5c1bkr7x6fcyx12wrnvwfckshdvblc2a";
    };
    meta = {
      license = [ pkgs.lib.licenses.mit { fullName = "DFSG approved"; } ];
    };
  };
  "python-dateutil" = super.buildPythonPackage {
    name = "python-dateutil-2.8.1";
    doCheck = false;
    propagatedBuildInputs = [
      self."six"
    ];
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/be/ed/5bbc91f03fa4c839c4c7360375da77f9659af5f7086b7a7bdda65771c8e0/python-dateutil-2.8.1.tar.gz";
      sha256 = "0g42w7k5007iv9dam6gnja2ry8ydwirh99mgdll35s12pyfzxsvk";
    };
    meta = {
      license = [ pkgs.lib.licenses.bsdOriginal pkgs.lib.licenses.asl20 { fullName = "Dual License"; } ];
    };
  };
  "python-editor" = super.buildPythonPackage {
    name = "python-editor-1.0.4";
    doCheck = false;
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/0a/85/78f4a216d28343a67b7397c99825cff336330893f00601443f7c7b2f2234/python-editor-1.0.4.tar.gz";
      sha256 = "0yrjh8w72ivqxi4i7xsg5b1vz15x8fg51xra7c3bgfyxqnyadzai";
    };
    meta = {
      license = [ pkgs.lib.licenses.asl20 { fullName = "Apache"; } ];
    };
  };
  "python-ldap" = super.buildPythonPackage {
    name = "python-ldap-3.2.0";
    doCheck = false;
    propagatedBuildInputs = [
      self."pyasn1"
      self."pyasn1-modules"
    ];
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/ea/93/596f875e003c770447f4b99267820a0c769dd2dc3ae3ed19afe460fcbad0/python-ldap-3.2.0.tar.gz";
      sha256 = "13nvrhp85yr0jyxixcjj012iw8l9wynxxlykm9j3alss6waln73x";
    };
    meta = {
      license = [ pkgs.lib.licenses.psfl ];
    };
  };
  "python-memcached" = super.buildPythonPackage {
    name = "python-memcached-1.59";
    doCheck = false;
    propagatedBuildInputs = [
      self."six"
    ];
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/90/59/5faf6e3cd8a568dd4f737ddae4f2e54204fd8c51f90bf8df99aca6c22318/python-memcached-1.59.tar.gz";
      sha256 = "0kvyapavbirk2x3n1jx4yb9nyigrj1s3x15nm3qhpvhkpqvqdqm2";
    };
    meta = {
      license = [ pkgs.lib.licenses.psfl ];
    };
  };
  "python-pam" = super.buildPythonPackage {
    name = "python-pam-1.8.4";
    doCheck = false;
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/01/16/544d01cae9f28e0292dbd092b6b8b0bf222b528f362ee768a5bed2140111/python-pam-1.8.4.tar.gz";
      sha256 = "16whhc0vr7gxsbzvsnq65nq8fs3wwmx755cavm8kkczdkz4djmn8";
    };
    meta = {
      license = [ { fullName = "License :: OSI Approved :: MIT License"; } pkgs.lib.licenses.mit ];
    };
  };
  "python-saml" = super.buildPythonPackage {
    name = "python-saml-2.4.2";
    doCheck = false;
    propagatedBuildInputs = [
      self."dm.xmlsec.binding"
      self."isodate"
      self."defusedxml"
    ];
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/79/a8/a6611017e0883102fd5e2b73c9d90691b8134e38247c04ee1531d3dc647c/python-saml-2.4.2.tar.gz";
      sha256 = "0dls4hwvf13yg7x5yfjrghbywg8g38vn5vr0rsf70hli3ydbfm43";
    };
    meta = {
      license = [ pkgs.lib.licenses.mit ];
    };
  };
  "pytz" = super.buildPythonPackage {
    name = "pytz-2019.3";
    doCheck = false;
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/82/c3/534ddba230bd4fbbd3b7a3d35f3341d014cca213f369a9940925e7e5f691/pytz-2019.3.tar.gz";
      sha256 = "1ghrk1wg45d3nymj7bf4zj03n3bh64xmczhk4pfi577hdkdhcb5h";
    };
    meta = {
      license = [ pkgs.lib.licenses.mit ];
    };
  };
  "pyzmq" = super.buildPythonPackage {
    name = "pyzmq-14.6.0";
    doCheck = false;
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/8a/3b/5463d5a9d712cd8bbdac335daece0d69f6a6792da4e3dd89956c0db4e4e6/pyzmq-14.6.0.tar.gz";
      sha256 = "1frmbjykvhmdg64g7sn20c9fpamrsfxwci1nhhg8q7jgz5pq0ikp";
    };
    meta = {
      license = [ pkgs.lib.licenses.bsdOriginal { fullName = "LGPL+BSD"; } { fullName = "GNU Library or Lesser General Public License (LGPL)"; } ];
    };
  };
  "PyYAML" = super.buildPythonPackage {
    name = "PyYAML-5.3.1";
    doCheck = false;
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/64/c2/b80047c7ac2478f9501676c988a5411ed5572f35d1beff9cae07d321512c/PyYAML-5.3.1.tar.gz";
      sha256 = "0pb4zvkfxfijkpgd1b86xjsqql97ssf1knbd1v53wkg1qm9cgsmq";
    };
    meta = {
      license = [ pkgs.lib.licenses.mit ];
    };
  };
  "regex" = super.buildPythonPackage {
    name = "regex-2020.9.27";
    doCheck = false;
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/93/8c/17f45cdfb39b13d4b5f909e4b4c2917abcbdef9c0036919a0399769148cf/regex-2020.9.27.tar.gz";
      sha256 = "179ngfzwbsjvn5vhyzdahvmg0f7acahkwwy9bpjy1pv08bm2mwx6";
    };
    meta = {
      license = [ pkgs.lib.licenses.psfl ];
    };
  };
  "redis" = super.buildPythonPackage {
    name = "redis-3.5.3";
    doCheck = false;
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/b3/17/1e567ff78c83854e16b98694411fe6e08c3426af866ad11397cddceb80d3/redis-3.5.3.tar.gz";
      sha256 = "0e7e0cfca8660dea8b7d5cd8c4f6c5e29e11f31158c0b0ae91a397f00e5a05a2";
    };
    meta = {
      license = [ pkgs.lib.licenses.mit ];
    };
  };
  "repoze.lru" = super.buildPythonPackage {
    name = "repoze.lru-0.7";
    doCheck = false;
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/12/bc/595a77c4b5e204847fdf19268314ef59c85193a9dc9f83630fc459c0fee5/repoze.lru-0.7.tar.gz";
      sha256 = "0xzz1aw2smy8hdszrq8yhnklx6w1r1mf55061kalw3iq35gafa84";
    };
    meta = {
      license = [ { fullName = "Repoze Public License"; } { fullName = "BSD-derived (http://www.repoze.org/LICENSE.txt)"; } ];
    };
  };
  "repoze.sendmail" = super.buildPythonPackage {
    name = "repoze.sendmail-4.4.1";
    doCheck = false;
    propagatedBuildInputs = [
      self."setuptools"
      self."zope.interface"
      self."transaction"
    ];
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/12/4e/8ef1fd5c42765d712427b9c391419a77bd48877886d2cbc5e9f23c8cad9b/repoze.sendmail-4.4.1.tar.gz";
      sha256 = "096ln02jr2afk7ab9j2czxqv2ryqq7m86ah572nqplx52iws73ks";
    };
    meta = {
      license = [ pkgs.lib.licenses.zpl21 ];
    };
  };
  "requests" = super.buildPythonPackage {
    name = "requests-2.22.0";
    doCheck = false;
    propagatedBuildInputs = [
      self."chardet"
      self."idna"
      self."urllib3"
      self."certifi"
    ];
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/01/62/ddcf76d1d19885e8579acb1b1df26a852b03472c0e46d2b959a714c90608/requests-2.22.0.tar.gz";
      sha256 = "1d5ybh11jr5sm7xp6mz8fyc7vrp4syifds91m7sj60xalal0gq0i";
    };
    meta = {
      license = [ pkgs.lib.licenses.asl20 ];
    };
  };
  "rhodecode-enterprise-ce" = super.buildPythonPackage {
    name = "rhodecode-enterprise-ce-4.27.0";
    buildInputs = [
      self."pytest"
      self."py"
      self."pytest-cov"
      self."pytest-sugar"
      self."pytest-runner"
      self."pytest-profiling"
      self."pytest-timeout"
      self."gprof2dot"
      self."mock"
      self."cov-core"
      self."coverage"
      self."webtest"
      self."beautifulsoup4"
      self."configobj"
    ];
    doCheck = true;
    propagatedBuildInputs = [
      self."amqp"
      self."babel"
      self."beaker"
      self."bleach"
      self."celery"
      self."channelstream"
      self."click"
      self."colander"
      self."configobj"
      self."cssselect"
      self."cryptography"
      self."decorator"
      self."deform"
      self."docutils"
      self."dogpile.cache"
      self."dogpile.core"
      self."formencode"
      self."future"
      self."futures"
      self."infrae.cache"
      self."iso8601"
      self."itsdangerous"
      self."kombu"
      self."lxml"
      self."mako"
      self."markdown"
      self."markupsafe"
      self."msgpack-python"
      self."pyotp"
      self."packaging"
      self."pathlib2"
      self."paste"
      self."pastedeploy"
      self."pastescript"
      self."peppercorn"
      self."premailer"
      self."psutil"
      self."py-bcrypt"
      self."pycurl"
      self."pycrypto"
      self."pygments"
      self."pyparsing"
      self."pyramid-debugtoolbar"
      self."pyramid-mako"
      self."pyramid"
      self."pyramid-mailer"
      self."python-dateutil"
      self."python-ldap"
      self."python-memcached"
      self."python-pam"
      self."python-saml"
      self."pytz"
      self."tzlocal"
      self."pyzmq"
      self."py-gfm"
      self."regex"
      self."redis"
      self."repoze.lru"
      self."requests"
      self."routes"
      self."simplejson"
      self."six"
      self."sqlalchemy"
      self."sshpubkeys"
      self."subprocess32"
      self."supervisor"
      self."translationstring"
      self."urllib3"
      self."urlobject"
      self."venusian"
      self."weberror"
      self."webhelpers2"
      self."webob"
      self."whoosh"
      self."wsgiref"
      self."zope.cachedescriptors"
      self."zope.deprecation"
      self."zope.event"
      self."zope.interface"
      self."mysql-python"
      self."pymysql"
      self."pysqlite"
      self."psycopg2"
      self."nbconvert"
      self."nbformat"
      self."jupyter-client"
      self."jupyter-core"
      self."alembic"
      self."invoke"
      self."bumpversion"
      self."gevent"
      self."greenlet"
      self."gunicorn"
      self."waitress"
      self."ipdb"
      self."ipython"
      self."rhodecode-tools"
      self."appenlight-client"
      self."pytest"
      self."py"
      self."pytest-cov"
      self."pytest-sugar"
      self."pytest-runner"
      self."pytest-profiling"
      self."pytest-timeout"
      self."gprof2dot"
      self."mock"
      self."cov-core"
      self."coverage"
      self."webtest"
      self."beautifulsoup4"
    ];
    src = ./.;
    meta = {
      license = [ { fullName = "Affero GNU General Public License v3 or later (AGPLv3+)"; } { fullName = "AGPLv3, and Commercial License"; } ];
    };
  };
  "rhodecode-tools" = super.buildPythonPackage {
    name = "rhodecode-tools-1.4.1";
    doCheck = false;
    propagatedBuildInputs = [
      self."click"
      self."future"
      self."six"
      self."mako"
      self."markupsafe"
      self."requests"
      self."urllib3"
      self."whoosh"
      self."elasticsearch"
      self."elasticsearch-dsl"
      self."elasticsearch2"
      self."elasticsearch1-dsl"
    ];
    src = fetchurl {
      url = "https://code.rhodecode.com/rhodecode-tools-ce/artifacts/download/0-d9ea7914-e475-44af-a80a-7e32edc17e2f.tar.gz?sha256=6e5aaac455b4a0b2dee013a1241b367e2991e345fda6ed0f4a8c66c941a19184";
      sha256 = "6e5aaac455b4a0b2dee013a1241b367e2991e345fda6ed0f4a8c66c941a19184";
    };
    meta = {
      license = [ { fullName = "Apache 2.0 and Proprietary"; } ];
    };
  };
  "routes" = super.buildPythonPackage {
    name = "routes-2.4.1";
    doCheck = false;
    propagatedBuildInputs = [
      self."six"
      self."repoze.lru"
    ];
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/33/38/ea827837e68d9c7dde4cff7ec122a93c319f0effc08ce92a17095576603f/Routes-2.4.1.tar.gz";
      sha256 = "1zamff3m0kc4vyfniyhxpkkcqv1rrgnmh37ykxv34nna1ws47vi6";
    };
    meta = {
      license = [ pkgs.lib.licenses.mit ];
    };
  };
  "scandir" = super.buildPythonPackage {
    name = "scandir-1.10.0";
    doCheck = false;
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/df/f5/9c052db7bd54d0cbf1bc0bb6554362bba1012d03e5888950a4f5c5dadc4e/scandir-1.10.0.tar.gz";
      sha256 = "1bkqwmf056pkchf05ywbnf659wqlp6lljcdb0y88wr9f0vv32ijd";
    };
    meta = {
      license = [ pkgs.lib.licenses.bsdOriginal { fullName = "New BSD License"; } ];
    };
  };
  "setproctitle" = super.buildPythonPackage {
    name = "setproctitle-1.1.10";
    doCheck = false;
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/5a/0d/dc0d2234aacba6cf1a729964383e3452c52096dc695581248b548786f2b3/setproctitle-1.1.10.tar.gz";
      sha256 = "163kplw9dcrw0lffq1bvli5yws3rngpnvrxrzdw89pbphjjvg0v2";
    };
    meta = {
      license = [ pkgs.lib.licenses.bsdOriginal ];
    };
  };
  "setuptools" = super.buildPythonPackage {
    name = "setuptools-44.1.0";
    doCheck = false;
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/ed/7b/bbf89ca71e722b7f9464ebffe4b5ee20a9e5c9a555a56e2d3914bb9119a6/setuptools-44.1.0.zip";
      sha256 = "1jja896zvd1ppccnjbhkgagxbwchgq6vfamp6qn1hvywq6q9cjkr";
    };
    meta = {
      license = [ pkgs.lib.licenses.mit ];
    };
  };
  "setuptools-scm" = super.buildPythonPackage {
    name = "setuptools-scm-3.5.0";
    doCheck = false;
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/b2/f7/60a645aae001a2e06cf4b8db2fba9d9f36b8fd378f10647e3e218b61b74b/setuptools_scm-3.5.0.tar.gz";
      sha256 = "5bdf21a05792903cafe7ae0c9501182ab52497614fa6b1750d9dbae7b60c1a87";
    };
    meta = {
      license = [ pkgs.lib.licenses.psfl ];
    };
  };
  "simplegeneric" = super.buildPythonPackage {
    name = "simplegeneric-0.8.1";
    doCheck = false;
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/3d/57/4d9c9e3ae9a255cd4e1106bb57e24056d3d0709fc01b2e3e345898e49d5b/simplegeneric-0.8.1.zip";
      sha256 = "0wwi1c6md4vkbcsfsf8dklf3vr4mcdj4mpxkanwgb6jb1432x5yw";
    };
    meta = {
      license = [ pkgs.lib.licenses.zpl21 ];
    };
  };
  "simplejson" = super.buildPythonPackage {
    name = "simplejson-3.16.0";
    doCheck = false;
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/e3/24/c35fb1c1c315fc0fffe61ea00d3f88e85469004713dab488dee4f35b0aff/simplejson-3.16.0.tar.gz";
      sha256 = "19cws1syk8jzq2pw43878dv6fjkb0ifvjpx0i9aajix6kc9jkwxi";
    };
    meta = {
      license = [ { fullName = "Academic Free License (AFL)"; } pkgs.lib.licenses.mit ];
    };
  };
  "six" = super.buildPythonPackage {
    name = "six-1.11.0";
    doCheck = false;
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/16/d8/bc6316cf98419719bd59c91742194c111b6f2e85abac88e496adefaf7afe/six-1.11.0.tar.gz";
      sha256 = "1scqzwc51c875z23phj48gircqjgnn3af8zy2izjwmnlxrxsgs3h";
    };
    meta = {
      license = [ pkgs.lib.licenses.mit ];
    };
  };
  "sqlalchemy" = super.buildPythonPackage {
    name = "sqlalchemy-1.3.15";
    doCheck = false;
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/8c/30/4134e726dd5ed13728ff814fa91fc01c447ad8700504653fe99d91fdd34b/SQLAlchemy-1.3.15.tar.gz";
      sha256 = "0iglkvymfp35zm5pxy5kzqvcv96kkas0chqdx7xpla86sspa9k64";
    };
    meta = {
      license = [ pkgs.lib.licenses.mit ];
    };
  };
  "sshpubkeys" = super.buildPythonPackage {
    name = "sshpubkeys-3.1.0";
    doCheck = false;
    propagatedBuildInputs = [
      self."cryptography"
      self."ecdsa"
    ];
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/00/23/f7508a12007c96861c3da811992f14283d79c819d71a217b3e12d5196649/sshpubkeys-3.1.0.tar.gz";
      sha256 = "105g2li04nm1hb15a2y6hm9m9k7fbrkd5l3gy12w3kgcmsf3k25k";
    };
    meta = {
      license = [ pkgs.lib.licenses.bsdOriginal ];
    };
  };
  "subprocess32" = super.buildPythonPackage {
    name = "subprocess32-3.5.4";
    doCheck = false;
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/32/c8/564be4d12629b912ea431f1a50eb8b3b9d00f1a0b1ceff17f266be190007/subprocess32-3.5.4.tar.gz";
      sha256 = "17f7mvwx2271s1wrl0qac3wjqqnrqag866zs3qc8v5wp0k43fagb";
    };
    meta = {
      license = [ pkgs.lib.licenses.psfl ];
    };
  };
  "supervisor" = super.buildPythonPackage {
    name = "supervisor-4.1.0";
    doCheck = false;
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/de/87/ee1ad8fa533a4b5f2c7623f4a2b585d3c1947af7bed8e65bc7772274320e/supervisor-4.1.0.tar.gz";
      sha256 = "10q36sa1jqljyyyl7cif52akpygl5kmlqq9x91hmx53f8zh6zj1d";
    };
    meta = {
      license = [ { fullName = "BSD-derived (http://www.repoze.org/LICENSE.txt)"; } ];
    };
  };
  "tempita" = super.buildPythonPackage {
    name = "tempita-0.5.2";
    doCheck = false;
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/56/c8/8ed6eee83dbddf7b0fc64dd5d4454bc05e6ccaafff47991f73f2894d9ff4/Tempita-0.5.2.tar.gz";
      sha256 = "177wwq45slfyajd8csy477bmdmzipyw0dm7i85k3akb7m85wzkna";
    };
    meta = {
      license = [ pkgs.lib.licenses.mit ];
    };
  };
  "termcolor" = super.buildPythonPackage {
    name = "termcolor-1.1.0";
    doCheck = false;
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/8a/48/a76be51647d0eb9f10e2a4511bf3ffb8cc1e6b14e9e4fab46173aa79f981/termcolor-1.1.0.tar.gz";
      sha256 = "0fv1vq14rpqwgazxg4981904lfyp84mnammw7y046491cv76jv8x";
    };
    meta = {
      license = [ pkgs.lib.licenses.mit ];
    };
  };
  "testpath" = super.buildPythonPackage {
    name = "testpath-0.4.4";
    doCheck = false;
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/2c/b3/5d57205e896d8998d77ad12aa42ebce75cd97d8b9a97d00ba078c4c9ffeb/testpath-0.4.4.tar.gz";
      sha256 = "0zpcmq22dz79ipvvsfnw1ykpjcaj6xyzy7ws77s5b5ql3hka7q30";
    };
    meta = {
      license = [ ];
    };
  };
  "traitlets" = super.buildPythonPackage {
    name = "traitlets-4.3.3";
    doCheck = false;
    propagatedBuildInputs = [
      self."ipython-genutils"
      self."six"
      self."decorator"
      self."enum34"
    ];
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/75/b0/43deb021bc943f18f07cbe3dac1d681626a48997b7ffa1e7fb14ef922b21/traitlets-4.3.3.tar.gz";
      sha256 = "1xsrwgivpkxlbr4dfndfsi098s29yqgswgjc1qqn69yxklvfw8yh";
    };
    meta = {
      license = [ pkgs.lib.licenses.bsdOriginal ];
    };
  };
  "transaction" = super.buildPythonPackage {
    name = "transaction-2.4.0";
    doCheck = false;
    propagatedBuildInputs = [
      self."zope.interface"
    ];
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/9d/7d/0e8af0d059e052b9dcf2bb5a08aad20ae3e238746bdd3f8701a60969b363/transaction-2.4.0.tar.gz";
      sha256 = "17wz1y524ca07vr03yddy8dv0gbscs06dbdywmllxv5rc725jq3j";
    };
    meta = {
      license = [ pkgs.lib.licenses.zpl21 ];
    };
  };
  "translationstring" = super.buildPythonPackage {
    name = "translationstring-1.3";
    doCheck = false;
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/5e/eb/bee578cc150b44c653b63f5ebe258b5d0d812ddac12497e5f80fcad5d0b4/translationstring-1.3.tar.gz";
      sha256 = "0bdpcnd9pv0131dl08h4zbcwmgc45lyvq3pa224xwan5b3x4rr2f";
    };
    meta = {
      license = [ { fullName = "BSD-like (http://repoze.org/license.html)"; } ];
    };
  };
  "tzlocal" = super.buildPythonPackage {
    name = "tzlocal-1.5.1";
    doCheck = false;
    propagatedBuildInputs = [
      self."pytz"
    ];
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/cb/89/e3687d3ed99bc882793f82634e9824e62499fdfdc4b1ae39e211c5b05017/tzlocal-1.5.1.tar.gz";
      sha256 = "0kiciwiqx0bv0fbc913idxibc4ygg4cb7f8rcpd9ij2shi4bigjf";
    };
    meta = {
      license = [ pkgs.lib.licenses.mit ];
    };
  };
  "urllib3" = super.buildPythonPackage {
    name = "urllib3-1.25.2";
    doCheck = false;
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/9a/8b/ea6d2beb2da6e331e9857d0a60b79ed4f72dcbc4e2c7f2d2521b0480fda2/urllib3-1.25.2.tar.gz";
      sha256 = "1nq2k4pss1ihsjh02r41sqpjpm5rfqkjfysyq7g7n2i1p7c66c55";
    };
    meta = {
      license = [ pkgs.lib.licenses.mit ];
    };
  };
  "urlobject" = super.buildPythonPackage {
    name = "urlobject-2.4.3";
    doCheck = false;
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/e2/b8/1d0a916f4b34c4618846e6da0e4eeaa8fcb4a2f39e006434fe38acb74b34/URLObject-2.4.3.tar.gz";
      sha256 = "1ahc8ficzfvr2avln71immfh4ls0zyv6cdaa5xmkdj5rd87f5cj7";
    };
    meta = {
      license = [ pkgs.lib.licenses.publicDomain ];
    };
  };
  "venusian" = super.buildPythonPackage {
    name = "venusian-1.2.0";
    doCheck = false;
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/7e/6f/40a9d43ac77cb51cb62be5b5662d170f43f8037bdc4eab56336c4ca92bb7/venusian-1.2.0.tar.gz";
      sha256 = "0ghyx66g8ikx9nx1mnwqvdcqm11i1vlq0hnvwl50s48bp22q5v34";
    };
    meta = {
      license = [ { fullName = "BSD-derived (http://www.repoze.org/LICENSE.txt)"; } ];
    };
  };
  "vine" = super.buildPythonPackage {
    name = "vine-1.3.0";
    doCheck = false;
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/1c/e1/79fb8046e607dd6c2ad05c9b8ebac9d0bd31d086a08f02699e96fc5b3046/vine-1.3.0.tar.gz";
      sha256 = "11ydsbhl1vabndc2r979dv61s6j2b0giq6dgvryifvq1m7bycghk";
    };
    meta = {
      license = [ pkgs.lib.licenses.bsdOriginal ];
    };
  };
  "waitress" = super.buildPythonPackage {
    name = "waitress-1.3.1";
    doCheck = false;
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/a6/e6/708da7bba65898e5d759ade8391b1077e49d07be0b0223c39f5be04def56/waitress-1.3.1.tar.gz";
      sha256 = "1iysl8ka3l4cdrr0r19fh1cv28q41mwpvgsb81ji7k4shkb0k3i7";
    };
    meta = {
      license = [ pkgs.lib.licenses.zpl21 ];
    };
  };
  "wcwidth" = super.buildPythonPackage {
    name = "wcwidth-0.1.9";
    doCheck = false;
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/25/9d/0acbed6e4a4be4fc99148f275488580968f44ddb5e69b8ceb53fc9df55a0/wcwidth-0.1.9.tar.gz";
      sha256 = "1wf5ycjx8s066rdvr0fgz4xds9a8zhs91c4jzxvvymm1c8l8cwzf";
    };
    meta = {
      license = [ pkgs.lib.licenses.mit ];
    };
  };
  "webencodings" = super.buildPythonPackage {
    name = "webencodings-0.5.1";
    doCheck = false;
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/0b/02/ae6ceac1baeda530866a85075641cec12989bd8d31af6d5ab4a3e8c92f47/webencodings-0.5.1.tar.gz";
      sha256 = "08qrgrc4hrximb2gqnl69g01s93rhf2842jfxdjljc1dbwj1qsmk";
    };
    meta = {
      license = [ pkgs.lib.licenses.bsdOriginal ];
    };
  };
  "weberror" = super.buildPythonPackage {
    name = "weberror-0.13.1";
    doCheck = false;
    propagatedBuildInputs = [
      self."webob"
      self."tempita"
      self."pygments"
      self."paste"
    ];
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/07/0a/09ca5eb0fab5c0d17b380026babe81c96ecebb13f2b06c3203432dd7be72/WebError-0.13.1.tar.gz";
      sha256 = "0r4qvnf2r92gfnpa1kwygh4j2x6j3axg2i4an6hyxwg2gpaqp7y1";
    };
    meta = {
      license = [ pkgs.lib.licenses.mit ];
    };
  };
  "webhelpers2" = super.buildPythonPackage {
    name = "webhelpers2-2.0";
    doCheck = false;
    propagatedBuildInputs = [
      self."markupsafe"
      self."six"
    ];
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/ff/30/56342c6ea522439e3662427c8d7b5e5b390dff4ff2dc92d8afcb8ab68b75/WebHelpers2-2.0.tar.gz";
      sha256 = "0aphva1qmxh83n01p53f5fd43m4srzbnfbz5ajvbx9aj2aipwmcs";
    };
    meta = {
      license = [ pkgs.lib.licenses.mit ];
    };
  };
  "webob" = super.buildPythonPackage {
    name = "webob-1.8.5";
    doCheck = false;
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/9d/1a/0c89c070ee2829c934cb6c7082287c822e28236a4fcf90063e6be7c35532/WebOb-1.8.5.tar.gz";
      sha256 = "11khpzaxc88q31v25ic330gsf56fwmbdc9b30br8mvp0fmwspah5";
    };
    meta = {
      license = [ pkgs.lib.licenses.mit ];
    };
  };
  "webtest" = super.buildPythonPackage {
    name = "webtest-2.0.34";
    doCheck = false;
    propagatedBuildInputs = [
      self."six"
      self."webob"
      self."waitress"
      self."beautifulsoup4"
    ];
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/2c/74/a0e63feee438735d628631e2b70d82280276a930637ac535479e5fad9427/WebTest-2.0.34.tar.gz";
      sha256 = "0x1y2c8z4fmpsny4hbp6ka37si2g10r5r2jwxhvv5mx7g3blq4bi";
    };
    meta = {
      license = [ pkgs.lib.licenses.mit ];
    };
  };
  "whoosh" = super.buildPythonPackage {
    name = "whoosh-2.7.4";
    doCheck = false;
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/25/2b/6beed2107b148edc1321da0d489afc4617b9ed317ef7b72d4993cad9b684/Whoosh-2.7.4.tar.gz";
      sha256 = "10qsqdjpbc85fykc1vgcs8xwbgn4l2l52c8d83xf1q59pwyn79bw";
    };
    meta = {
      license = [ pkgs.lib.licenses.bsdOriginal pkgs.lib.licenses.bsd2 ];
    };
  };
  "ws4py" = super.buildPythonPackage {
    name = "ws4py-0.5.1";
    doCheck = false;
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/53/20/4019a739b2eefe9282d3822ef6a225250af964b117356971bd55e274193c/ws4py-0.5.1.tar.gz";
      sha256 = "10slbbf2jm4hpr92jx7kh7mhf48sjl01v2w4d8z3f1p0ybbp7l19";
    };
    meta = {
      license = [ pkgs.lib.licenses.bsdOriginal ];
    };
  };
  "wsgiref" = super.buildPythonPackage {
    name = "wsgiref-0.1.2";
    doCheck = false;
    src = fetchurl {
      url = "https://files.pythonhosted.org/packages/41/9e/309259ce8dff8c596e8c26df86dbc4e848b9249fd36797fd60be456f03fc/wsgiref-0.1.2.zip";
      sha256 = "0y8fyjmpq7vwwm4x732w97qbkw78rjwal5409k04cw4m03411rn7";
    };
    meta = {
      license = [ { fullName = "PSF or ZPL"; } ];
    };
  };
2446 "zipp" = super.buildPythonPackage {
2446 "zipp" = super.buildPythonPackage {
2447 name = "zipp-1.2.0";
2447 name = "zipp-1.2.0";
2448 doCheck = false;
2448 doCheck = false;
2449 propagatedBuildInputs = [
2449 propagatedBuildInputs = [
2450 self."contextlib2"
2450 self."contextlib2"
2451 ];
2451 ];
2452 src = fetchurl {
2452 src = fetchurl {
2453 url = "https://files.pythonhosted.org/packages/78/08/d52f0ea643bc1068d6dc98b412f4966a9b63255d20911a23ac3220c033c4/zipp-1.2.0.tar.gz";
2453 url = "https://files.pythonhosted.org/packages/78/08/d52f0ea643bc1068d6dc98b412f4966a9b63255d20911a23ac3220c033c4/zipp-1.2.0.tar.gz";
2454 sha256 = "1c91lnv1bxjimh8as27hz7bghsjkkbxn1d37xq7in9c82iai0167";
2454 sha256 = "1c91lnv1bxjimh8as27hz7bghsjkkbxn1d37xq7in9c82iai0167";
2455 };
2455 };
2456 meta = {
2456 meta = {
2457 license = [ pkgs.lib.licenses.mit ];
2457 license = [ pkgs.lib.licenses.mit ];
2458 };
2458 };
2459 };
2459 };
2460 "zope.cachedescriptors" = super.buildPythonPackage {
2460 "zope.cachedescriptors" = super.buildPythonPackage {
2461 name = "zope.cachedescriptors-4.3.1";
2461 name = "zope.cachedescriptors-4.3.1";
2462 doCheck = false;
2462 doCheck = false;
2463 propagatedBuildInputs = [
2463 propagatedBuildInputs = [
2464 self."setuptools"
2464 self."setuptools"
2465 ];
2465 ];
2466 src = fetchurl {
2466 src = fetchurl {
2467 url = "https://files.pythonhosted.org/packages/2f/89/ebe1890cc6d3291ebc935558fa764d5fffe571018dbbee200e9db78762cb/zope.cachedescriptors-4.3.1.tar.gz";
2467 url = "https://files.pythonhosted.org/packages/2f/89/ebe1890cc6d3291ebc935558fa764d5fffe571018dbbee200e9db78762cb/zope.cachedescriptors-4.3.1.tar.gz";
2468 sha256 = "0jhr3m5p74c6r7k8iv0005b8bfsialih9d7zl5vx38rf5xq1lk8z";
2468 sha256 = "0jhr3m5p74c6r7k8iv0005b8bfsialih9d7zl5vx38rf5xq1lk8z";
2469 };
2469 };
2470 meta = {
2470 meta = {
2471 license = [ pkgs.lib.licenses.zpl21 ];
2471 license = [ pkgs.lib.licenses.zpl21 ];
2472 };
2472 };
2473 };
2473 };
2474 "zope.deprecation" = super.buildPythonPackage {
2474 "zope.deprecation" = super.buildPythonPackage {
2475 name = "zope.deprecation-4.4.0";
2475 name = "zope.deprecation-4.4.0";
2476 doCheck = false;
2476 doCheck = false;
2477 propagatedBuildInputs = [
2477 propagatedBuildInputs = [
2478 self."setuptools"
2478 self."setuptools"
2479 ];
2479 ];
2480 src = fetchurl {
2480 src = fetchurl {
2481 url = "https://files.pythonhosted.org/packages/34/da/46e92d32d545dd067b9436279d84c339e8b16de2ca393d7b892bc1e1e9fd/zope.deprecation-4.4.0.tar.gz";
2481 url = "https://files.pythonhosted.org/packages/34/da/46e92d32d545dd067b9436279d84c339e8b16de2ca393d7b892bc1e1e9fd/zope.deprecation-4.4.0.tar.gz";
2482 sha256 = "1pz2cv7gv9y1r3m0bdv7ks1alagmrn5msm5spwdzkb2by0w36i8d";
2482 sha256 = "1pz2cv7gv9y1r3m0bdv7ks1alagmrn5msm5spwdzkb2by0w36i8d";
2483 };
2483 };
2484 meta = {
2484 meta = {
2485 license = [ pkgs.lib.licenses.zpl21 ];
2485 license = [ pkgs.lib.licenses.zpl21 ];
2486 };
2486 };
2487 };
2487 };
2488 "zope.event" = super.buildPythonPackage {
2488 "zope.event" = super.buildPythonPackage {
2489 name = "zope.event-4.4";
2489 name = "zope.event-4.4";
2490 doCheck = false;
2490 doCheck = false;
2491 propagatedBuildInputs = [
2491 propagatedBuildInputs = [
2492 self."setuptools"
2492 self."setuptools"
2493 ];
2493 ];
2494 src = fetchurl {
2494 src = fetchurl {
2495 url = "https://files.pythonhosted.org/packages/4c/b2/51c0369adcf5be2334280eed230192ab3b03f81f8efda9ddea6f65cc7b32/zope.event-4.4.tar.gz";
2495 url = "https://files.pythonhosted.org/packages/4c/b2/51c0369adcf5be2334280eed230192ab3b03f81f8efda9ddea6f65cc7b32/zope.event-4.4.tar.gz";
2496 sha256 = "1ksbc726av9xacml6jhcfyn828hlhb9xlddpx6fcvnlvmpmpvhk9";
2496 sha256 = "1ksbc726av9xacml6jhcfyn828hlhb9xlddpx6fcvnlvmpmpvhk9";
2497 };
2497 };
2498 meta = {
2498 meta = {
2499 license = [ pkgs.lib.licenses.zpl21 ];
2499 license = [ pkgs.lib.licenses.zpl21 ];
2500 };
2500 };
2501 };
2501 };
2502 "zope.interface" = super.buildPythonPackage {
2502 "zope.interface" = super.buildPythonPackage {
2503 name = "zope.interface-4.6.0";
2503 name = "zope.interface-4.6.0";
2504 doCheck = false;
2504 doCheck = false;
2505 propagatedBuildInputs = [
2505 propagatedBuildInputs = [
2506 self."setuptools"
2506 self."setuptools"
2507 ];
2507 ];
2508 src = fetchurl {
2508 src = fetchurl {
2509 url = "https://files.pythonhosted.org/packages/4e/d0/c9d16bd5b38de44a20c6dc5d5ed80a49626fafcb3db9f9efdc2a19026db6/zope.interface-4.6.0.tar.gz";
2509 url = "https://files.pythonhosted.org/packages/4e/d0/c9d16bd5b38de44a20c6dc5d5ed80a49626fafcb3db9f9efdc2a19026db6/zope.interface-4.6.0.tar.gz";
2510 sha256 = "1rgh2x3rcl9r0v0499kf78xy86rnmanajf4ywmqb943wpk50sg8v";
2510 sha256 = "1rgh2x3rcl9r0v0499kf78xy86rnmanajf4ywmqb943wpk50sg8v";
2511 };
2511 };
2512 meta = {
2512 meta = {
2513 license = [ pkgs.lib.licenses.zpl21 ];
2513 license = [ pkgs.lib.licenses.zpl21 ];
2514 };
2514 };
2515 };
2515 };
2516
2516
2517 ### Test requirements
2517 ### Test requirements
2518
2518
2519
2519
2520 }
2520 }
@@ -1,124 +1,124 b''
## dependencies

amqp==2.5.2
babel==1.3
beaker==1.9.1
bleach==3.1.3
celery==4.3.0
channelstream==0.6.14
click==7.0
colander==1.7.0
# our custom configobj
https://code.rhodecode.com/upstream/configobj/artifacts/download/0-012de99a-b1e1-4f64-a5c0-07a98a41b324.tar.gz?md5=6a513f51fe04b2c18cf84c1395a7c626#egg=configobj==5.0.6
cssselect==1.0.3
cryptography==2.6.1
decorator==4.1.2
deform==2.0.8
docutils==0.16.0
dogpile.cache==0.9.0
dogpile.core==0.4.1
formencode==1.2.4
future==0.14.3
futures==3.0.2
infrae.cache==1.0.1
iso8601==0.1.12
itsdangerous==1.1.0
kombu==4.6.6
lxml==4.2.5
mako==1.1.0
markdown==2.6.11
markupsafe==1.1.1
msgpack-python==0.5.6
pyotp==2.3.0
packaging==20.3
pathlib2==2.3.5
paste==3.4.0
pastedeploy==2.1.0
pastescript==3.2.0
peppercorn==0.6
premailer==3.6.1
psutil==5.7.0
py-bcrypt==0.4
pycurl==7.43.0.3
pycrypto==2.6.1
pygments==2.4.2
pyparsing==2.4.7
pyramid-debugtoolbar==4.6.1
pyramid-mako==1.1.0
pyramid==1.10.4
pyramid_mailer==0.15.1
python-dateutil==2.8.1
python-ldap==3.2.0
python-memcached==1.59
python-pam==1.8.4
python-saml==2.4.2
pytz==2019.3
tzlocal==1.5.1
pyzmq==14.6.0
py-gfm==0.1.4
regex==2020.9.27
redis==3.5.3
repoze.lru==0.7
requests==2.22.0
routes==2.4.1
simplejson==3.16.0
six==1.11.0
sqlalchemy==1.3.15
sshpubkeys==3.1.0
subprocess32==3.5.4
supervisor==4.1.0
translationstring==1.3
urllib3==1.25.2
urlobject==2.4.3
venusian==1.2.0
weberror==0.13.1
webhelpers2==2.0
webob==1.8.5
whoosh==2.7.4
wsgiref==0.1.2
zope.cachedescriptors==4.3.1
zope.deprecation==4.4.0
zope.event==4.4.0
zope.interface==4.6.0

# DB drivers
mysql-python==1.2.5
pymysql==0.8.1
pysqlite==2.8.3
psycopg2==2.8.4

# IPYTHON RENDERING
# entrypoints backport, pypi version doesn't support egg installs
https://code.rhodecode.com/upstream/entrypoints/artifacts/download/0-8e9ee9e4-c4db-409c-b07e-81568fd1832d.tar.gz?md5=3a027b8ff1d257b91fe257de6c43357d#egg=entrypoints==0.2.2.rhodecode-upstream1
nbconvert==5.3.1
nbformat==4.4.0
jupyter-client==5.0.0
jupyter-core==4.5.0

## cli tools
alembic==1.4.2
invoke==0.13.0
bumpversion==0.5.3

## http servers
gevent==1.5.0
greenlet==0.4.15
gunicorn==19.9.0
waitress==1.3.1

## debug
ipdb==0.13.2
ipython==5.1.0

## rhodecode-tools, special case, use file://PATH.tar.gz#egg=rhodecode-tools==X.Y.Z, to test local version
-https://code.rhodecode.com/rhodecode-tools-ce/artifacts/download/0-ed54e749-2ef5-4bc7-ae7f-7900e3c2aa15.tar.gz?sha256=76f024bad3a1e55fdb3d64f13f5b77ff21a12fee699918de2110fe21effd5a3a#egg=rhodecode-tools==1.4.0
+https://code.rhodecode.com/rhodecode-tools-ce/artifacts/download/0-d9ea7914-e475-44af-a80a-7e32edc17e2f.tar.gz?sha256=6e5aaac455b4a0b2dee013a1241b367e2991e345fda6ed0f4a8c66c941a19184#egg=rhodecode-tools==1.4.1


## appenlight
appenlight-client==0.6.26

## test related requirements
-r requirements_test.txt

## uncomment to add the debug libraries
#-r requirements_debug.txt
@@ -1,1 +1,1 b''
-4.26.0
+4.27.0
\ No newline at end of file
@@ -1,816 +1,821 b''
# -*- coding: utf-8 -*-

# Copyright (C) 2016-2020 RhodeCode GmbH
#
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU Affero General Public License, version 3
# (only), as published by the Free Software Foundation.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU Affero General Public License
# along with this program. If not, see <http://www.gnu.org/licenses/>.
#
# This program is dual-licensed. If you wish to learn more about the
# RhodeCode Enterprise Edition, including its added features, Support services,
# and proprietary license terms, please see https://rhodecode.com/licenses/

import time
import logging
import operator

from pyramid import compat
from pyramid.httpexceptions import HTTPFound, HTTPForbidden, HTTPBadRequest

from rhodecode.lib import helpers as h, diffs, rc_cache
from rhodecode.lib.utils2 import (
    StrictAttributeDict, str2bool, safe_int, datetime_to_time, safe_unicode)
from rhodecode.lib.markup_renderer import MarkupRenderer, relative_links
from rhodecode.lib.vcs.backends.base import EmptyCommit
from rhodecode.lib.vcs.exceptions import RepositoryRequirementError
from rhodecode.model import repo
from rhodecode.model import repo_group
from rhodecode.model import user_group
from rhodecode.model import user
from rhodecode.model.db import User
from rhodecode.model.scm import ScmModel
from rhodecode.model.settings import VcsSettingsModel, IssueTrackerSettingsModel
from rhodecode.model.repo import ReadmeFinder

log = logging.getLogger(__name__)


ADMIN_PREFIX = '/_admin'
STATIC_FILE_PREFIX = '/_static'

URL_NAME_REQUIREMENTS = {
    # group names can have a slash in them, but they must not end with a slash
    'group_name': r'.*?[^/]',
    'repo_group_name': r'.*?[^/]',
    # repo names can have a slash in them, but they must not end with a slash
    'repo_name': r'.*?[^/]',
    # file path eats up everything at the end
    'f_path': r'.*',
    # reference types
    'source_ref_type': r'(branch|book|tag|rev|\%\(source_ref_type\)s)',
    'target_ref_type': r'(branch|book|tag|rev|\%\(target_ref_type\)s)',
}


def add_route_with_slash(config, name, pattern, **kw):
    config.add_route(name, pattern, **kw)
    if not pattern.endswith('/'):
        config.add_route(name + '_slash', pattern + '/', **kw)


def add_route_requirements(route_path, requirements=None):
    """
    Adds regex requirements to pyramid routes using a mapping dict
    e.g::
        add_route_requirements('{repo_name}/settings')
    """
    requirements = requirements or URL_NAME_REQUIREMENTS
    for key, regex in requirements.items():
        route_path = route_path.replace('{%s}' % key, '{%s:%s}' % (key, regex))
    return route_path
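

# NOTE: illustrative sketch, not part of the original module. Assuming the
# URL_NAME_REQUIREMENTS mapping above, this shows what the docstring example
# expands to: each `{name}` placeholder becomes a pyramid `{name:regex}`
# pattern carrying an inline regex requirement.
def _demo_add_route_requirements():
    expanded = add_route_requirements('{repo_name}/settings')
    assert expanded == '{repo_name:.*?[^/]}/settings'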


def get_format_ref_id(repo):
    """Returns a `repo` specific reference formatter function"""
    if h.is_svn(repo):
        return _format_ref_id_svn
    else:
        return _format_ref_id


def _format_ref_id(name, raw_id):
    """Default formatting of a given reference `name`"""
    return name


def _format_ref_id_svn(name, raw_id):
    """Special way of formatting a reference for Subversion including path"""
    return '%s@%s' % (name, raw_id)


class TemplateArgs(StrictAttributeDict):
    pass


class BaseAppView(object):

    def __init__(self, context, request):
        self.request = request
        self.context = context
        self.session = request.session
        if not hasattr(request, 'user'):
            # NOTE(marcink): edge case, we ended up in matched route
            # but probably of web-app context, e.g API CALL/VCS CALL
            if hasattr(request, 'vcs_call') or hasattr(request, 'rpc_method'):
                log.warning('Unable to process request `%s` in this scope', request)
                raise HTTPBadRequest()

        self._rhodecode_user = request.user  # auth user
        self._rhodecode_db_user = self._rhodecode_user.get_instance()
        self._maybe_needs_password_change(
            request.matched_route.name, self._rhodecode_db_user)

    def _maybe_needs_password_change(self, view_name, user_obj):

        dont_check_views = [
            'channelstream_connect'
        ]
        if view_name in dont_check_views:
            return

        log.debug('Checking if user %s needs password change on view %s',
                  user_obj, view_name)

        skip_user_views = [
            'logout', 'login',
            'my_account_password', 'my_account_password_update'
        ]

        if not user_obj:
            return

        if user_obj.username == User.DEFAULT_USER:
            return

        now = time.time()
        should_change = user_obj.user_data.get('force_password_change')
        change_after = safe_int(should_change) or 0
        if should_change and now > change_after:
            log.debug('User %s requires password change', user_obj)
            h.flash('You are required to change your password', 'warning',
                    ignore_duplicate=True)

            if view_name not in skip_user_views:
                raise HTTPFound(
                    self.request.route_path('my_account_password'))

    def _log_creation_exception(self, e, repo_name):
        _ = self.request.translate
        reason = None
        if len(e.args) == 2:
            reason = e.args[1]

        if reason == 'INVALID_CERTIFICATE':
            log.exception(
                'Exception creating a repository: invalid certificate')
            msg = (_('Error creating repository %s: invalid certificate')
                   % repo_name)
        else:
            log.exception("Exception creating a repository")
            msg = (_('Error creating repository %s')
                   % repo_name)
        return msg

    def _get_local_tmpl_context(self, include_app_defaults=True):
        c = TemplateArgs()
        c.auth_user = self.request.user
        # TODO(marcink): migrate the usage of c.rhodecode_user to c.auth_user
        c.rhodecode_user = self.request.user

        if include_app_defaults:
            from rhodecode.lib.base import attach_context_attributes
            attach_context_attributes(c, self.request, self.request.user.user_id)

        c.is_super_admin = c.auth_user.is_admin

        c.can_create_repo = c.is_super_admin
        c.can_create_repo_group = c.is_super_admin
        c.can_create_user_group = c.is_super_admin

        c.is_delegated_admin = False

        if not c.auth_user.is_default and not c.is_super_admin:
            c.can_create_repo = h.HasPermissionAny('hg.create.repository')(
                user=self.request.user)
            repositories = c.auth_user.repositories_admin or c.can_create_repo

            c.can_create_repo_group = h.HasPermissionAny('hg.repogroup.create.true')(
                user=self.request.user)
            repository_groups = c.auth_user.repository_groups_admin or c.can_create_repo_group

            c.can_create_user_group = h.HasPermissionAny('hg.usergroup.create.true')(
                user=self.request.user)
            user_groups = c.auth_user.user_groups_admin or c.can_create_user_group
            # delegated admin can create, or manage some objects
            c.is_delegated_admin = repositories or repository_groups or user_groups
        return c

    def _get_template_context(self, tmpl_args, **kwargs):

        local_tmpl_args = {
            'defaults': {},
            'errors': {},
            'c': tmpl_args
        }
        local_tmpl_args.update(kwargs)
        return local_tmpl_args

    def load_default_context(self):
        """
        example:

        def load_default_context(self):
            c = self._get_local_tmpl_context()
            c.custom_var = 'foobar'

            return c
        """
        raise NotImplementedError('Needs implementation in view class')


class RepoAppView(BaseAppView):

    def __init__(self, context, request):
        super(RepoAppView, self).__init__(context, request)
        self.db_repo = request.db_repo
        self.db_repo_name = self.db_repo.repo_name
        self.db_repo_pull_requests = ScmModel().get_pull_requests(self.db_repo)
        self.db_repo_artifacts = ScmModel().get_artifacts(self.db_repo)
        self.db_repo_patterns = IssueTrackerSettingsModel(repo=self.db_repo)

    def _handle_missing_requirements(self, error):
        log.error(
            'Requirements are missing for repository %s: %s',
            self.db_repo_name, safe_unicode(error))

    def _get_local_tmpl_context(self, include_app_defaults=True):
        _ = self.request.translate
        c = super(RepoAppView, self)._get_local_tmpl_context(
            include_app_defaults=include_app_defaults)

        # register common vars for this type of view
        c.rhodecode_db_repo = self.db_repo
        c.repo_name = self.db_repo_name
        c.repository_pull_requests = self.db_repo_pull_requests
        c.repository_artifacts = self.db_repo_artifacts
        c.repository_is_user_following = ScmModel().is_following_repo(
            self.db_repo_name, self._rhodecode_user.user_id)
        self.path_filter = PathFilter(None)

        c.repository_requirements_missing = {}
        try:
            self.rhodecode_vcs_repo = self.db_repo.scm_instance()
            # NOTE(marcink):
            # comparison to None since if it's an object __bool__ is expensive to
            # calculate
            if self.rhodecode_vcs_repo is not None:
                path_perms = self.rhodecode_vcs_repo.get_path_permissions(
                    c.auth_user.username)
                self.path_filter = PathFilter(path_perms)
        except RepositoryRequirementError as e:
            c.repository_requirements_missing = {'error': str(e)}
            self._handle_missing_requirements(e)
            self.rhodecode_vcs_repo = None

        c.path_filter = self.path_filter  # used by atom_feed_entry.mako

        if self.rhodecode_vcs_repo is None:
            # unable to fetch this repo as vcs instance, report back to user
            h.flash(_(
                "The repository `%(repo_name)s` cannot be loaded in filesystem. "
                "Please check if it exists, or is not damaged.") %
                {'repo_name': c.repo_name},
                category='error', ignore_duplicate=True)
            if c.repository_requirements_missing:
                route = self.request.matched_route.name
                if route.startswith(('edit_repo', 'repo_summary')):
                    # allow summary and edit repo on missing requirements
                    return c

                raise HTTPFound(
                    h.route_path('repo_summary', repo_name=self.db_repo_name))

            else:  # redirect if we don't show missing requirements
                raise HTTPFound(h.route_path('home'))

        c.has_origin_repo_read_perm = False
        if self.db_repo.fork:
            c.has_origin_repo_read_perm = h.HasRepoPermissionAny(
                'repository.write', 'repository.read', 'repository.admin')(
                self.db_repo.fork.repo_name, 'summary fork link')

        return c

    def _get_f_path_unchecked(self, matchdict, default=None):
        """
        Should only be used by redirects, everything else should call _get_f_path
        """
        f_path = matchdict.get('f_path')
        if f_path:
            # fix for multiple initial slashes that causes errors for GIT
            return f_path.lstrip('/')

        return default

    def _get_f_path(self, matchdict, default=None):
        f_path_match = self._get_f_path_unchecked(matchdict, default)
        return self.path_filter.assert_path_permissions(f_path_match)

    def _get_general_setting(self, target_repo, settings_key, default=False):
        settings_model = VcsSettingsModel(repo=target_repo)
        settings = settings_model.get_general_settings()
319 settings = settings_model.get_general_settings()
320 return settings.get(settings_key, default)
320 return settings.get(settings_key, default)
321
321
322 def _get_repo_setting(self, target_repo, settings_key, default=False):
322 def _get_repo_setting(self, target_repo, settings_key, default=False):
323 settings_model = VcsSettingsModel(repo=target_repo)
323 settings_model = VcsSettingsModel(repo=target_repo)
324 settings = settings_model.get_repo_settings_inherited()
324 settings = settings_model.get_repo_settings_inherited()
325 return settings.get(settings_key, default)
325 return settings.get(settings_key, default)
326
326
327 def _get_readme_data(self, db_repo, renderer_type, commit_id=None, path='/'):
327 def _get_readme_data(self, db_repo, renderer_type, commit_id=None, path='/'):
328 log.debug('Looking for README file at path %s', path)
328 log.debug('Looking for README file at path %s', path)
329 if commit_id:
329 if commit_id:
330 landing_commit_id = commit_id
330 landing_commit_id = commit_id
331 else:
331 else:
332 landing_commit = db_repo.get_landing_commit()
332 landing_commit = db_repo.get_landing_commit()
333 if isinstance(landing_commit, EmptyCommit):
333 if isinstance(landing_commit, EmptyCommit):
334 return None, None
334 return None, None
335 landing_commit_id = landing_commit.raw_id
335 landing_commit_id = landing_commit.raw_id
336
336
337 cache_namespace_uid = 'cache_repo.{}'.format(db_repo.repo_id)
337 cache_namespace_uid = 'cache_repo.{}'.format(db_repo.repo_id)
338 region = rc_cache.get_or_create_region('cache_repo', cache_namespace_uid)
338 region = rc_cache.get_or_create_region('cache_repo', cache_namespace_uid)
339 start = time.time()
339 start = time.time()
340
340
341 @region.conditional_cache_on_arguments(namespace=cache_namespace_uid)
341 @region.conditional_cache_on_arguments(namespace=cache_namespace_uid)
342 def generate_repo_readme(repo_id, _commit_id, _repo_name, _readme_search_path, _renderer_type):
342 def generate_repo_readme(repo_id, _commit_id, _repo_name, _readme_search_path, _renderer_type):
343 readme_data = None
343 readme_data = None
344 readme_filename = None
344 readme_filename = None
345
345
346 commit = db_repo.get_commit(_commit_id)
346 commit = db_repo.get_commit(_commit_id)
347 log.debug("Searching for a README file at commit %s.", _commit_id)
347 log.debug("Searching for a README file at commit %s.", _commit_id)
348 readme_node = ReadmeFinder(_renderer_type).search(commit, path=_readme_search_path)
348 readme_node = ReadmeFinder(_renderer_type).search(commit, path=_readme_search_path)
349
349
350 if readme_node:
350 if readme_node:
351 log.debug('Found README node: %s', readme_node)
351 log.debug('Found README node: %s', readme_node)
352 relative_urls = {
352 relative_urls = {
353 'raw': h.route_path(
353 'raw': h.route_path(
354 'repo_file_raw', repo_name=_repo_name,
354 'repo_file_raw', repo_name=_repo_name,
355 commit_id=commit.raw_id, f_path=readme_node.path),
355 commit_id=commit.raw_id, f_path=readme_node.path),
356 'standard': h.route_path(
356 'standard': h.route_path(
357 'repo_files', repo_name=_repo_name,
357 'repo_files', repo_name=_repo_name,
358 commit_id=commit.raw_id, f_path=readme_node.path),
358 commit_id=commit.raw_id, f_path=readme_node.path),
359 }
359 }
360 readme_data = self._render_readme_or_none(commit, readme_node, relative_urls)
360 readme_data = self._render_readme_or_none(commit, readme_node, relative_urls)
361 readme_filename = readme_node.unicode_path
361 readme_filename = readme_node.unicode_path
362
362
363 return readme_data, readme_filename
363 return readme_data, readme_filename
364
364
365 readme_data, readme_filename = generate_repo_readme(
365 readme_data, readme_filename = generate_repo_readme(
366 db_repo.repo_id, landing_commit_id, db_repo.repo_name, path, renderer_type,)
366 db_repo.repo_id, landing_commit_id, db_repo.repo_name, path, renderer_type,)
367 compute_time = time.time() - start
367 compute_time = time.time() - start
368 log.debug('Repo README for path %s generated and computed in %.4fs',
368 log.debug('Repo README for path %s generated and computed in %.4fs',
369 path, compute_time)
369 path, compute_time)
370 return readme_data, readme_filename
370 return readme_data, readme_filename
371
371
372 def _render_readme_or_none(self, commit, readme_node, relative_urls):
372 def _render_readme_or_none(self, commit, readme_node, relative_urls):
373 log.debug('Found README file `%s` rendering...', readme_node.path)
373 log.debug('Found README file `%s` rendering...', readme_node.path)
374 renderer = MarkupRenderer()
374 renderer = MarkupRenderer()
375 try:
375 try:
376 html_source = renderer.render(
376 html_source = renderer.render(
377 readme_node.content, filename=readme_node.path)
377 readme_node.content, filename=readme_node.path)
378 if relative_urls:
378 if relative_urls:
379 return relative_links(html_source, relative_urls)
379 return relative_links(html_source, relative_urls)
380 return html_source
380 return html_source
381 except Exception:
381 except Exception:
382 log.exception(
382 log.exception(
383 "Exception while trying to render the README")
383 "Exception while trying to render the README")
384
384
385 def get_recache_flag(self):
385 def get_recache_flag(self):
386 for flag_name in ['force_recache', 'force-recache', 'no-cache']:
386 for flag_name in ['force_recache', 'force-recache', 'no-cache']:
387 flag_val = self.request.GET.get(flag_name)
387 flag_val = self.request.GET.get(flag_name)
388 if str2bool(flag_val):
388 if str2bool(flag_val):
389 return True
389 return True
390 return False
390 return False
391
391
392 def get_commit_preload_attrs(cls):
393 pre_load = ['author', 'branch', 'date', 'message', 'parents',
394 'obsolete', 'phase', 'hidden']
395 return pre_load
396
392
397
class PathFilter(object):

    # Expects an instance of BasePathPermissionChecker or None
    def __init__(self, permission_checker):
        self.permission_checker = permission_checker

    def assert_path_permissions(self, path):
        if self.path_access_allowed(path):
            return path
        raise HTTPForbidden()

    def path_access_allowed(self, path):
        log.debug('Checking ACL permissions for PathFilter for `%s`', path)
        if self.permission_checker:
            has_access = path and self.permission_checker.has_access(path)
            log.debug('ACL Permissions checker enabled, ACL Check has_access: %s', has_access)
            return has_access

        log.debug('ACL permissions checker not enabled, skipping...')
        return True

    def filter_patchset(self, patchset):
        if not self.permission_checker or not patchset:
            return patchset, False
        had_filtered = False
        filtered_patchset = []
        for patch in patchset:
            filename = patch.get('filename', None)
            if not filename or self.permission_checker.has_access(filename):
                filtered_patchset.append(patch)
            else:
                had_filtered = True
        if had_filtered:
            if isinstance(patchset, diffs.LimitedDiffContainer):
                filtered_patchset = diffs.LimitedDiffContainer(
                    patchset.diff_limit, patchset.cur_diff_size, filtered_patchset)
            return filtered_patchset, True
        else:
            return patchset, False

    def render_patchset_filtered(self, diffset, patchset, source_ref=None, target_ref=None):
        filtered_patchset, has_hidden_changes = self.filter_patchset(patchset)
        result = diffset.render_patchset(
            filtered_patchset, source_ref=source_ref, target_ref=target_ref)
        result.has_hidden_changes = has_hidden_changes
        return result

    def get_raw_patch(self, diff_processor):
        if self.permission_checker is None:
            return diff_processor.as_raw()
        elif self.permission_checker.has_full_access:
            return diff_processor.as_raw()
        else:
            return '# Repository has user-specific filters, raw patch generation is disabled.'

    @property
    def is_enabled(self):
        return self.permission_checker is not None

class RepoGroupAppView(BaseAppView):
    def __init__(self, context, request):
        super(RepoGroupAppView, self).__init__(context, request)
        self.db_repo_group = request.db_repo_group
        self.db_repo_group_name = self.db_repo_group.group_name

    def _get_local_tmpl_context(self, include_app_defaults=True):
        _ = self.request.translate
        c = super(RepoGroupAppView, self)._get_local_tmpl_context(
            include_app_defaults=include_app_defaults)
        c.repo_group = self.db_repo_group
        return c

    def _revoke_perms_on_yourself(self, form_result):
        _updates = filter(lambda u: self._rhodecode_user.user_id == int(u[0]),
                          form_result['perm_updates'])
        _additions = filter(lambda u: self._rhodecode_user.user_id == int(u[0]),
                            form_result['perm_additions'])
        _deletions = filter(lambda u: self._rhodecode_user.user_id == int(u[0]),
                            form_result['perm_deletions'])
        admin_perm = 'group.admin'
        if _updates and _updates[0][1] != admin_perm or \
                _additions and _additions[0][1] != admin_perm or \
                _deletions and _deletions[0][1] != admin_perm:
            return True
        return False

class UserGroupAppView(BaseAppView):
    def __init__(self, context, request):
        super(UserGroupAppView, self).__init__(context, request)
        self.db_user_group = request.db_user_group
        self.db_user_group_name = self.db_user_group.users_group_name


class UserAppView(BaseAppView):
    def __init__(self, context, request):
        super(UserAppView, self).__init__(context, request)
        self.db_user = request.db_user
        self.db_user_id = self.db_user.user_id

        _ = self.request.translate
        if not request.db_user_supports_default:
            if self.db_user.username == User.DEFAULT_USER:
                h.flash(_("Editing user `{}` is disabled.".format(
                    User.DEFAULT_USER)), category='warning')
                raise HTTPFound(h.route_path('users'))

class DataGridAppView(object):
    """
    Common class to have re-usable grid rendering components
    """

    def _extract_ordering(self, request, column_map=None):
        column_map = column_map or {}
        column_index = safe_int(request.GET.get('order[0][column]'))
        order_dir = request.GET.get(
            'order[0][dir]', 'desc')
        order_by = request.GET.get(
            'columns[%s][data][sort]' % column_index, 'name_raw')

        # translate datatable to DB columns
        order_by = column_map.get(order_by) or order_by

        search_q = request.GET.get('search[value]')
        return search_q, order_by, order_dir

    def _extract_chunk(self, request):
        start = safe_int(request.GET.get('start'), 0)
        length = safe_int(request.GET.get('length'), 25)
        draw = safe_int(request.GET.get('draw'))
        return draw, start, length

    def _get_order_col(self, order_by, model):
        if isinstance(order_by, compat.string_types):
            try:
                return operator.attrgetter(order_by)(model)
            except AttributeError:
                return None
        else:
            return order_by

class BaseReferencesView(RepoAppView):
    """
    Base for reference view for branches, tags and bookmarks.
    """
    def load_default_context(self):
        c = self._get_local_tmpl_context()
        return c

    def load_refs_context(self, ref_items, partials_template):
        _render = self.request.get_partial_renderer(partials_template)
        pre_load = ["author", "date", "message", "parents"]

        is_svn = h.is_svn(self.rhodecode_vcs_repo)
        is_hg = h.is_hg(self.rhodecode_vcs_repo)

        format_ref_id = get_format_ref_id(self.rhodecode_vcs_repo)

        closed_refs = {}
        if is_hg:
            closed_refs = self.rhodecode_vcs_repo.branches_closed

        data = []
        for ref_name, commit_id in ref_items:
            commit = self.rhodecode_vcs_repo.get_commit(
                commit_id=commit_id, pre_load=pre_load)
            closed = ref_name in closed_refs

            # TODO: johbo: Unify generation of reference links
            use_commit_id = '/' in ref_name or is_svn

            if use_commit_id:
                files_url = h.route_path(
                    'repo_files',
                    repo_name=self.db_repo_name,
                    f_path=ref_name if is_svn else '',
                    commit_id=commit_id,
                    _query=dict(at=ref_name)
                )

            else:
                files_url = h.route_path(
                    'repo_files',
                    repo_name=self.db_repo_name,
                    f_path=ref_name if is_svn else '',
                    commit_id=ref_name,
                    _query=dict(at=ref_name)
                )

            data.append({
                "name": _render('name', ref_name, files_url, closed),
                "name_raw": ref_name,
                "date": _render('date', commit.date),
                "date_raw": datetime_to_time(commit.date),
                "author": _render('author', commit.author),
                "commit": _render(
                    'commit', commit.message, commit.raw_id, commit.idx),
                "commit_raw": commit.idx,
                "compare": _render(
                    'compare', format_ref_id(ref_name, commit.raw_id)),
            })

        return data

class RepoRoutePredicate(object):
    def __init__(self, val, config):
        self.val = val

    def text(self):
        return 'repo_route = %s' % self.val

    phash = text

    def __call__(self, info, request):
        if hasattr(request, 'vcs_call'):
            # skip vcs calls
            return

        repo_name = info['match']['repo_name']
        repo_model = repo.RepoModel()

        by_name_match = repo_model.get_by_repo_name(repo_name, cache=False)

        def redirect_if_creating(route_info, db_repo):
            skip_views = ['edit_repo_advanced_delete']
            route = route_info['route']
            # we should skip delete view so we can actually "remove" repositories
            # if they get stuck in creating state.
            if route.name in skip_views:
                return

            if db_repo.repo_state in [repo.Repository.STATE_PENDING]:
                repo_creating_url = request.route_path(
                    'repo_creating', repo_name=db_repo.repo_name)
                raise HTTPFound(repo_creating_url)

        if by_name_match:
            # register this as request object we can re-use later
            request.db_repo = by_name_match
            redirect_if_creating(info, by_name_match)
            return True

        by_id_match = repo_model.get_repo_by_id(repo_name)
        if by_id_match:
            request.db_repo = by_id_match
            redirect_if_creating(info, by_id_match)
            return True

        return False

class RepoForbidArchivedRoutePredicate(object):
    def __init__(self, val, config):
        self.val = val

    def text(self):
        return 'repo_forbid_archived = %s' % self.val

    phash = text

    def __call__(self, info, request):
        _ = request.translate
        rhodecode_db_repo = request.db_repo

        log.debug(
            '%s checking archived flag for repo %s',
            self.__class__.__name__, rhodecode_db_repo.repo_name)

        if rhodecode_db_repo.archived:
            log.warning('Current view is not supported for archived repo: %s',
                        rhodecode_db_repo.repo_name)

            h.flash(
                h.literal(_('Action not supported for archived repository.')),
                category='warning')
            summary_url = request.route_path(
                'repo_summary', repo_name=rhodecode_db_repo.repo_name)
            raise HTTPFound(summary_url)
        return True

class RepoTypeRoutePredicate(object):
    def __init__(self, val, config):
        self.val = val or ['hg', 'git', 'svn']

    def text(self):
        return 'repo_accepted_type = %s' % self.val

    phash = text

    def __call__(self, info, request):
        if hasattr(request, 'vcs_call'):
            # skip vcs calls
            return

        rhodecode_db_repo = request.db_repo

        log.debug(
            '%s checking repo type for %s in %s',
            self.__class__.__name__, rhodecode_db_repo.repo_type, self.val)

        if rhodecode_db_repo.repo_type in self.val:
            return True
        else:
            log.warning('Current view is not supported for repo type: %s',
                        rhodecode_db_repo.repo_type)
            return False

class RepoGroupRoutePredicate(object):
    def __init__(self, val, config):
        self.val = val

    def text(self):
        return 'repo_group_route = %s' % self.val

    phash = text

    def __call__(self, info, request):
        if hasattr(request, 'vcs_call'):
            # skip vcs calls
            return

        repo_group_name = info['match']['repo_group_name']
        repo_group_model = repo_group.RepoGroupModel()
        by_name_match = repo_group_model.get_by_group_name(repo_group_name, cache=False)

        if by_name_match:
            # register this as request object we can re-use later
            request.db_repo_group = by_name_match
            return True

        return False

class UserGroupRoutePredicate(object):
    def __init__(self, val, config):
        self.val = val

    def text(self):
        return 'user_group_route = %s' % self.val

    phash = text

    def __call__(self, info, request):
        if hasattr(request, 'vcs_call'):
            # skip vcs calls
            return

        user_group_id = info['match']['user_group_id']
        user_group_model = user_group.UserGroup()
        by_id_match = user_group_model.get(user_group_id, cache=False)

        if by_id_match:
            # register this as request object we can re-use later
            request.db_user_group = by_id_match
            return True

        return False

class UserRoutePredicateBase(object):
    supports_default = None

    def __init__(self, val, config):
        self.val = val

    def text(self):
        raise NotImplementedError()

    def __call__(self, info, request):
        if hasattr(request, 'vcs_call'):
            # skip vcs calls
            return

        user_id = info['match']['user_id']
        user_model = user.User()
        by_id_match = user_model.get(user_id, cache=False)

        if by_id_match:
            # register this as request object we can re-use later
            request.db_user = by_id_match
            request.db_user_supports_default = self.supports_default
            return True

        return False

class UserRoutePredicate(UserRoutePredicateBase):
    supports_default = False

    def text(self):
        return 'user_route = %s' % self.val

    phash = text


class UserRouteWithDefaultPredicate(UserRoutePredicateBase):
    supports_default = True

    def text(self):
        return 'user_with_default_route = %s' % self.val

    phash = text

802 def includeme(config):
807 def includeme(config):
803 config.add_route_predicate(
808 config.add_route_predicate(
804 'repo_route', RepoRoutePredicate)
809 'repo_route', RepoRoutePredicate)
805 config.add_route_predicate(
810 config.add_route_predicate(
806 'repo_accepted_types', RepoTypeRoutePredicate)
811 'repo_accepted_types', RepoTypeRoutePredicate)
807 config.add_route_predicate(
812 config.add_route_predicate(
808 'repo_forbid_when_archived', RepoForbidArchivedRoutePredicate)
813 'repo_forbid_when_archived', RepoForbidArchivedRoutePredicate)
809 config.add_route_predicate(
814 config.add_route_predicate(
810 'repo_group_route', RepoGroupRoutePredicate)
815 'repo_group_route', RepoGroupRoutePredicate)
811 config.add_route_predicate(
816 config.add_route_predicate(
812 'user_group_route', UserGroupRoutePredicate)
817 'user_group_route', UserGroupRoutePredicate)
813 config.add_route_predicate(
818 config.add_route_predicate(
814 'user_route_with_default', UserRouteWithDefaultPredicate)
819 'user_route_with_default', UserRouteWithDefaultPredicate)
815 config.add_route_predicate(
820 config.add_route_predicate(
816 'user_route', UserRoutePredicate)
821 'user_route', UserRoutePredicate)
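The predicate classes above follow Pyramid's custom route predicate protocol: `text()` (aliased as `phash`) identifies the predicate in introspection output, and `__call__(info, request)` decides whether the route matches, optionally stashing data on the request for later reuse. The contract can be illustrated standalone, without Pyramid itself — the sketch below uses a stubbed `info` dict and a dummy request object, and is not RhodeCode code:

```python
class ExamplePredicate(object):
    """Minimal stand-in that follows the Pyramid route predicate protocol."""

    def __init__(self, val, config):
        # ``val`` is the value given in ``config.add_route(..., example=val)``
        self.val = val

    def text(self):
        # human-readable description, shown e.g. by ``proutes``/introspection
        return 'example = %s' % self.val

    # predicates are hashed by their text for route-configuration conflicts
    phash = text

    def __call__(self, info, request):
        # ``info['match']`` holds the matched URL placeholders; here we only
        # accept routes whose ``user_id`` placeholder is purely numeric
        return info['match']['user_id'].isdigit()


class DummyRequest(object):
    """Stub standing in for a real pyramid.request.Request."""


pred = ExamplePredicate(True, config=None)
assert pred.text() == 'example = True'
assert pred({'match': {'user_id': '42'}}, DummyRequest()) is True
assert pred({'match': {'user_id': 'abc'}}, DummyRequest()) is False
```

In the real code, `includeme()` registers each predicate under a name (e.g. `user_route`), after which any `config.add_route(...)` call can pass that name as a keyword argument to attach the predicate to the route.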
# -*- coding: utf-8 -*-

# Copyright (C) 2016-2020 RhodeCode GmbH
#
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU Affero General Public License, version 3
# (only), as published by the Free Software Foundation.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU Affero General Public License
# along with this program. If not, see <http://www.gnu.org/licenses/>.
#
# This program is dual-licensed. If you wish to learn more about the
# RhodeCode Enterprise Edition, including its added features, Support services,
# and proprietary license terms, please see https://rhodecode.com/licenses/

import logging
import os
import urllib2

import rhodecode
from rhodecode.apps._base import BaseAppView
from rhodecode.apps._base.navigation import navigation_list
from rhodecode.lib import helpers as h
from rhodecode.lib.auth import (LoginRequired, HasPermissionAllDecorator)
from rhodecode.lib.utils2 import str2bool
from rhodecode.lib import system_info
from rhodecode.model.update import UpdateModel

log = logging.getLogger(__name__)


class AdminSystemInfoSettingsView(BaseAppView):
    def load_default_context(self):
        c = self._get_local_tmpl_context()
        return c

    def get_env_data(self):
        black_list = [
            'NIX_LDFLAGS',
            'NIX_CFLAGS_COMPILE',
            'propagatedBuildInputs',
            'propagatedNativeBuildInputs',
            'postInstall',
            'buildInputs',
            'buildPhase',
            'preShellHook',
            'preCheck',
            'preBuild',
            'postShellHook',
            'postFixup',
            'postCheck',
            'nativeBuildInputs',
            'installPhase',
            'installCheckPhase',
            'checkPhase',
            'configurePhase',
            'shellHook'
        ]
        secret_list = [
            'RHODECODE_USER_PASS'
        ]

        for k, v in sorted(os.environ.items()):
            if k in black_list:
                continue
            if k in secret_list:
                v = '*****'
            yield k, v

    @LoginRequired()
    @HasPermissionAllDecorator('hg.admin')
    def settings_system_info(self):
        _ = self.request.translate
        c = self.load_default_context()

        c.active = 'system'
        c.navlist = navigation_list(self.request)

        # TODO(marcink), figure out how to allow only selected users to do this
        c.allowed_to_snapshot = self._rhodecode_user.admin

        snapshot = str2bool(self.request.params.get('snapshot'))

        c.rhodecode_update_url = UpdateModel().get_update_url()
        c.env_data = self.get_env_data()
        server_info = system_info.get_system_info(self.request.environ)

        for key, val in server_info.items():
            setattr(c, key, val)

        def val(name, subkey='human_value'):
            return server_info[name][subkey]

        def state(name):
            return server_info[name]['state']

        def val2(name):
            val = server_info[name]['human_value']
            state = server_info[name]['state']
            return val, state

        update_info_msg = _('Note: please make sure this server can '
                            'access `${url}` for the update link to work',
                            mapping=dict(url=c.rhodecode_update_url))
        version = UpdateModel().get_stored_version()
        is_outdated = UpdateModel().is_outdated(
            rhodecode.__version__, version)

        update_state = {}
        if is_outdated:
            update_state = {
                'type': 'warning',
                'message': 'New version available: {}'.format(version)
            }

        c.data_items = [
            # update info
            (_('Update info'), h.literal(
                '<span class="link" id="check_for_update" >%s.</span>' % (
                    _('Check for updates')) +
                '<br/> <span >%s.</span>' % (update_info_msg)
            ), ''),

            # RhodeCode specific
            (_('RhodeCode Version'), val('rhodecode_app')['text'], state('rhodecode_app')),
            (_('Latest version'), version, update_state),
            (_('RhodeCode Base URL'), val('rhodecode_config')['config'].get('app.base_url'), state('rhodecode_config')),
            (_('RhodeCode Server IP'), val('server')['server_ip'], state('server')),
            (_('RhodeCode Server ID'), val('server')['server_id'], state('server')),
            (_('RhodeCode Configuration'), val('rhodecode_config')['path'], state('rhodecode_config')),
            (_('RhodeCode Certificate'), val('rhodecode_config')['cert_path'], state('rhodecode_config')),
            (_('Workers'), val('rhodecode_config')['config']['server:main'].get('workers', '?'), state('rhodecode_config')),
            (_('Worker Type'), val('rhodecode_config')['config']['server:main'].get('worker_class', 'sync'), state('rhodecode_config')),
            ('', '', ''),  # spacer

            # Database
            (_('Database'), val('database')['url'], state('database')),
            (_('Database version'), val('database')['version'], state('database')),
            ('', '', ''),  # spacer

            # Platform/Python
            (_('Platform'), val('platform')['name'], state('platform')),
            (_('Platform UUID'), val('platform')['uuid'], state('platform')),
            (_('Lang'), val('locale'), state('locale')),
            (_('Python version'), val('python')['version'], state('python')),
            (_('Python path'), val('python')['executable'], state('python')),
            ('', '', ''),  # spacer

            # Systems stats
            (_('CPU'), val('cpu')['text'], state('cpu')),
            (_('Load'), val('load')['text'], state('load')),
            (_('Memory'), val('memory')['text'], state('memory')),
            (_('Uptime'), val('uptime')['text'], state('uptime')),
            ('', '', ''),  # spacer

            # ulimit
            (_('Ulimit'), val('ulimit')['text'], state('ulimit')),

            # Repo storage
            (_('Storage location'), val('storage')['path'], state('storage')),
            (_('Storage info'), val('storage')['text'], state('storage')),
            (_('Storage inodes'), val('storage_inodes')['text'], state('storage_inodes')),

            (_('Gist storage location'), val('storage_gist')['path'], state('storage_gist')),
            (_('Gist storage info'), val('storage_gist')['text'], state('storage_gist')),

            (_('Archive cache storage location'), val('storage_archive')['path'], state('storage_archive')),
            (_('Archive cache info'), val('storage_archive')['text'], state('storage_archive')),

            (_('Temp storage location'), val('storage_temp')['path'], state('storage_temp')),
            (_('Temp storage info'), val('storage_temp')['text'], state('storage_temp')),

            (_('Search info'), val('search')['text'], state('search')),
            (_('Search location'), val('search')['location'], state('search')),
            ('', '', ''),  # spacer

            # VCS specific
            (_('VCS Backends'), val('vcs_backends'), state('vcs_backends')),
            (_('VCS Server'), val('vcs_server')['text'], state('vcs_server')),
            (_('GIT'), val('git'), state('git')),
            (_('HG'), val('hg'), state('hg')),
            (_('SVN'), val('svn'), state('svn')),
        ]

        c.vcsserver_data_items = [
            (k, v) for k, v in (val('vcs_server_config') or {}).items()
        ]

        if snapshot:
            if c.allowed_to_snapshot:
                c.data_items.pop(0)  # remove server info
                self.request.override_renderer = 'admin/settings/settings_system_snapshot.mako'
            else:
                h.flash('You are not allowed to do this', category='warning')
        return self._get_template_context(c)

    @LoginRequired()
    @HasPermissionAllDecorator('hg.admin')
    def settings_system_info_check_update(self):
        _ = self.request.translate
        c = self.load_default_context()

        update_url = UpdateModel().get_update_url()

        _err = lambda s: '<div style="color:#ff8888; padding:4px 0px">{}</div>'.format(s)
        try:
            data = UpdateModel().get_update_data(update_url)
        except urllib2.URLError as e:
            log.exception("Exception contacting upgrade server")
            self.request.override_renderer = 'string'
            return _err('Failed to contact upgrade server: %r' % e)
        except ValueError:
            log.exception("Bad data sent from update server")
            self.request.override_renderer = 'string'
            return _err('Bad data sent from update server')

        latest = data['versions'][0]

        c.update_url = update_url
        c.latest_data = latest
        c.latest_ver = latest['version']
        c.cur_ver = rhodecode.__version__
        c.should_upgrade = False

        is_outdated = UpdateModel().is_outdated(c.cur_ver, c.latest_ver)
        if is_outdated:
            c.should_upgrade = True
        c.important_notices = latest['general']
        UpdateModel().store_version(latest['version'])
        return self._get_template_context(c)
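Both views above delegate the "is a newer release available?" decision to `UpdateModel().is_outdated(current, latest)`. A naive dotted-version comparison along the following lines conveys the idea; this is a hypothetical sketch, not the actual `UpdateModel` implementation, and it simply ignores any non-numeric release suffix:

```python
def is_outdated(current, latest):
    """Return True when ``latest`` is a newer dotted version than ``current``.

    Hypothetical sketch: only the leading numeric dotted part is compared,
    so suffixes such as '4.27.0rc1' are truncated to '4.27.0'.
    """
    def as_tuple(version):
        parts = []
        for chunk in version.split('.'):
            digits = ''
            for ch in chunk:
                if ch.isdigit():
                    digits += ch
                else:
                    break  # stop at the first non-digit, e.g. 'rc1'
            if not digits:
                break
            parts.append(int(digits))
        # tuples compare element-wise, which matches version ordering
        return tuple(parts)

    return as_tuple(latest) > as_tuple(current)


assert is_outdated('4.26.2', '4.27.0') is True
assert is_outdated('4.27.0', '4.27.0') is False
assert is_outdated('4.27.1', '4.27.0') is False
```

Comparing tuples of integers, rather than raw strings, avoids the classic pitfall where `'4.9.0' > '4.27.0'` lexicographically.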
@@ -1,1318 +1,1322 b''
1 # -*- coding: utf-8 -*-
1 # -*- coding: utf-8 -*-
2
2
3 # Copyright (C) 2016-2020 RhodeCode GmbH
3 # Copyright (C) 2016-2020 RhodeCode GmbH
4 #
4 #
5 # This program is free software: you can redistribute it and/or modify
5 # This program is free software: you can redistribute it and/or modify
6 # it under the terms of the GNU Affero General Public License, version 3
6 # it under the terms of the GNU Affero General Public License, version 3
7 # (only), as published by the Free Software Foundation.
7 # (only), as published by the Free Software Foundation.
8 #
8 #
9 # This program is distributed in the hope that it will be useful,
9 # This program is distributed in the hope that it will be useful,
10 # but WITHOUT ANY WARRANTY; without even the implied warranty of
10 # but WITHOUT ANY WARRANTY; without even the implied warranty of
11 # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
11 # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
12 # GNU General Public License for more details.
12 # GNU General Public License for more details.
13 #
13 #
14 # You should have received a copy of the GNU Affero General Public License
14 # You should have received a copy of the GNU Affero General Public License
15 # along with this program. If not, see <http://www.gnu.org/licenses/>.
15 # along with this program. If not, see <http://www.gnu.org/licenses/>.
16 #
16 #
17 # This program is dual-licensed. If you wish to learn more about the
17 # This program is dual-licensed. If you wish to learn more about the
18 # RhodeCode Enterprise Edition, including its added features, Support services,
18 # RhodeCode Enterprise Edition, including its added features, Support services,
19 # and proprietary license terms, please see https://rhodecode.com/licenses/
19 # and proprietary license terms, please see https://rhodecode.com/licenses/
20
20
21 import logging
21 import logging
22 import datetime
22 import datetime
23 import formencode
23 import formencode
24 import formencode.htmlfill
24 import formencode.htmlfill
25
25
26 from pyramid.httpexceptions import HTTPFound
26 from pyramid.httpexceptions import HTTPFound
27 from pyramid.renderers import render
27 from pyramid.renderers import render
28 from pyramid.response import Response
28 from pyramid.response import Response
29
29
30 from rhodecode import events
30 from rhodecode import events
31 from rhodecode.apps._base import BaseAppView, DataGridAppView, UserAppView
31 from rhodecode.apps._base import BaseAppView, DataGridAppView, UserAppView
32 from rhodecode.apps.ssh_support import SshKeyFileChangeEvent
32 from rhodecode.apps.ssh_support import SshKeyFileChangeEvent
33 from rhodecode.authentication.base import get_authn_registry, RhodeCodeExternalAuthPlugin
33 from rhodecode.authentication.base import get_authn_registry, RhodeCodeExternalAuthPlugin
34 from rhodecode.authentication.plugins import auth_rhodecode
34 from rhodecode.authentication.plugins import auth_rhodecode
35 from rhodecode.events import trigger
35 from rhodecode.events import trigger
36 from rhodecode.model.db import true, UserNotice
36 from rhodecode.model.db import true, UserNotice
37
37
38 from rhodecode.lib import audit_logger, rc_cache, auth
38 from rhodecode.lib import audit_logger, rc_cache, auth
39 from rhodecode.lib.exceptions import (
39 from rhodecode.lib.exceptions import (
40 UserCreationError, UserOwnsReposException, UserOwnsRepoGroupsException,
40 UserCreationError, UserOwnsReposException, UserOwnsRepoGroupsException,
41 UserOwnsUserGroupsException, UserOwnsPullRequestsException,
41 UserOwnsUserGroupsException, UserOwnsPullRequestsException,
42 UserOwnsArtifactsException, DefaultUserException)
42 UserOwnsArtifactsException, DefaultUserException)
43 from rhodecode.lib.ext_json import json
43 from rhodecode.lib.ext_json import json
44 from rhodecode.lib.auth import (
44 from rhodecode.lib.auth import (
45 LoginRequired, HasPermissionAllDecorator, CSRFRequired)
45 LoginRequired, HasPermissionAllDecorator, CSRFRequired)
46 from rhodecode.lib import helpers as h
46 from rhodecode.lib import helpers as h
47 from rhodecode.lib.helpers import SqlPage
47 from rhodecode.lib.helpers import SqlPage
48 from rhodecode.lib.utils2 import safe_int, safe_unicode, AttributeDict
48 from rhodecode.lib.utils2 import safe_int, safe_unicode, AttributeDict
49 from rhodecode.model.auth_token import AuthTokenModel
49 from rhodecode.model.auth_token import AuthTokenModel
50 from rhodecode.model.forms import (
50 from rhodecode.model.forms import (
51 UserForm, UserIndividualPermissionsForm, UserPermissionsForm,
51 UserForm, UserIndividualPermissionsForm, UserPermissionsForm,
52 UserExtraEmailForm, UserExtraIpForm)
52 UserExtraEmailForm, UserExtraIpForm)
53 from rhodecode.model.permission import PermissionModel
53 from rhodecode.model.permission import PermissionModel
54 from rhodecode.model.repo_group import RepoGroupModel
54 from rhodecode.model.repo_group import RepoGroupModel
55 from rhodecode.model.ssh_key import SshKeyModel
55 from rhodecode.model.ssh_key import SshKeyModel
56 from rhodecode.model.user import UserModel
56 from rhodecode.model.user import UserModel
57 from rhodecode.model.user_group import UserGroupModel
57 from rhodecode.model.user_group import UserGroupModel
58 from rhodecode.model.db import (
58 from rhodecode.model.db import (
59 or_, coalesce,IntegrityError, User, UserGroup, UserIpMap, UserEmailMap,
59 or_, coalesce,IntegrityError, User, UserGroup, UserIpMap, UserEmailMap,
60 UserApiKeys, UserSshKeys, RepoGroup)
60 UserApiKeys, UserSshKeys, RepoGroup)
61 from rhodecode.model.meta import Session
61 from rhodecode.model.meta import Session
62
62
63 log = logging.getLogger(__name__)
63 log = logging.getLogger(__name__)
64
64
65
65
66 class AdminUsersView(BaseAppView, DataGridAppView):
66 class AdminUsersView(BaseAppView, DataGridAppView):
67
67
68 def load_default_context(self):
68 def load_default_context(self):
69 c = self._get_local_tmpl_context()
69 c = self._get_local_tmpl_context()
70 return c
70 return c
71
71
72 @LoginRequired()
72 @LoginRequired()
73 @HasPermissionAllDecorator('hg.admin')
73 @HasPermissionAllDecorator('hg.admin')
74 def users_list(self):
74 def users_list(self):
75 c = self.load_default_context()
75 c = self.load_default_context()
76 return self._get_template_context(c)
76 return self._get_template_context(c)
77
77
78 @LoginRequired()
78 @LoginRequired()
79 @HasPermissionAllDecorator('hg.admin')
79 @HasPermissionAllDecorator('hg.admin')
80 def users_list_data(self):
80 def users_list_data(self):
81 self.load_default_context()
81 self.load_default_context()
82 column_map = {
82 column_map = {
83 'first_name': 'name',
83 'first_name': 'name',
84 'last_name': 'lastname',
84 'last_name': 'lastname',
85 }
85 }
86 draw, start, limit = self._extract_chunk(self.request)
86 draw, start, limit = self._extract_chunk(self.request)
87 search_q, order_by, order_dir = self._extract_ordering(
87 search_q, order_by, order_dir = self._extract_ordering(
88 self.request, column_map=column_map)
88 self.request, column_map=column_map)
89 _render = self.request.get_partial_renderer(
89 _render = self.request.get_partial_renderer(
90 'rhodecode:templates/data_table/_dt_elements.mako')
90 'rhodecode:templates/data_table/_dt_elements.mako')
91
91
92 def user_actions(user_id, username):
92 def user_actions(user_id, username):
93 return _render("user_actions", user_id, username)
93 return _render("user_actions", user_id, username)
94
94
95 users_data_total_count = User.query()\
95 users_data_total_count = User.query()\
96 .filter(User.username != User.DEFAULT_USER) \
96 .filter(User.username != User.DEFAULT_USER) \
97 .count()
97 .count()
98
98
99 users_data_total_inactive_count = User.query()\
99 users_data_total_inactive_count = User.query()\
100 .filter(User.username != User.DEFAULT_USER) \
100 .filter(User.username != User.DEFAULT_USER) \
101 .filter(User.active != true())\
101 .filter(User.active != true())\
102 .count()
102 .count()
103
103
104 # json generate
104 # json generate
105 base_q = User.query().filter(User.username != User.DEFAULT_USER)
105 base_q = User.query().filter(User.username != User.DEFAULT_USER)
106 base_inactive_q = base_q.filter(User.active != true())
106 base_inactive_q = base_q.filter(User.active != true())
107
107
108 if search_q:
108 if search_q:
109 like_expression = u'%{}%'.format(safe_unicode(search_q))
109 like_expression = u'%{}%'.format(safe_unicode(search_q))
110 base_q = base_q.filter(or_(
110 base_q = base_q.filter(or_(
111 User.username.ilike(like_expression),
111 User.username.ilike(like_expression),
112 User._email.ilike(like_expression),
112 User._email.ilike(like_expression),
113 User.name.ilike(like_expression),
113 User.name.ilike(like_expression),
114 User.lastname.ilike(like_expression),
114 User.lastname.ilike(like_expression),
115 ))
115 ))
116 base_inactive_q = base_q.filter(User.active != true())
116 base_inactive_q = base_q.filter(User.active != true())
117
117
118 users_data_total_filtered_count = base_q.count()
118 users_data_total_filtered_count = base_q.count()
119 users_data_total_filtered_inactive_count = base_inactive_q.count()
119 users_data_total_filtered_inactive_count = base_inactive_q.count()
120
120
121 sort_col = getattr(User, order_by, None)
121 sort_col = getattr(User, order_by, None)
122 if sort_col:
122 if sort_col:
123 if order_dir == 'asc':
123 if order_dir == 'asc':
124 # handle null values properly to order by NULL last
124 # handle null values properly to order by NULL last
125 if order_by in ['last_activity']:
125 if order_by in ['last_activity']:
126 sort_col = coalesce(sort_col, datetime.date.max)
126 sort_col = coalesce(sort_col, datetime.date.max)
127 sort_col = sort_col.asc()
127 sort_col = sort_col.asc()
128 else:
128 else:
129 # handle null values properly to order by NULL last
129 # handle null values properly to order by NULL last
130 if order_by in ['last_activity']:
130 if order_by in ['last_activity']:
131 sort_col = coalesce(sort_col, datetime.date.min)
131 sort_col = coalesce(sort_col, datetime.date.min)
132 sort_col = sort_col.desc()
132 sort_col = sort_col.desc()
133
133
134 base_q = base_q.order_by(sort_col)
134 base_q = base_q.order_by(sort_col)
135 base_q = base_q.offset(start).limit(limit)
135 base_q = base_q.offset(start).limit(limit)
136
136
137 users_list = base_q.all()
137 users_list = base_q.all()
138
138
139 users_data = []
139 users_data = []
140 for user in users_list:
140 for user in users_list:
141 users_data.append({
141 users_data.append({
142 "username": h.gravatar_with_user(self.request, user.username),
142 "username": h.gravatar_with_user(self.request, user.username),
143 "email": user.email,
143 "email": user.email,
144 "first_name": user.first_name,
144 "first_name": user.first_name,
145 "last_name": user.last_name,
145 "last_name": user.last_name,
146 "last_login": h.format_date(user.last_login),
146 "last_login": h.format_date(user.last_login),
147 "last_activity": h.format_date(user.last_activity),
147 "last_activity": h.format_date(user.last_activity),
148 "active": h.bool2icon(user.active),
148 "active": h.bool2icon(user.active),
149 "active_raw": user.active,
149 "active_raw": user.active,
150 "admin": h.bool2icon(user.admin),
150 "admin": h.bool2icon(user.admin),
151 "extern_type": user.extern_type,
151 "extern_type": user.extern_type,
152 "extern_name": user.extern_name,
152 "extern_name": user.extern_name,
153 "action": user_actions(user.user_id, user.username),
153 "action": user_actions(user.user_id, user.username),
154 })
154 })
155 data = ({
155 data = ({
156 'draw': draw,
156 'draw': draw,
157 'data': users_data,
157 'data': users_data,
158 'recordsTotal': users_data_total_count,
158 'recordsTotal': users_data_total_count,
159 'recordsFiltered': users_data_total_filtered_count,
159 'recordsFiltered': users_data_total_filtered_count,
160 'recordsTotalInactive': users_data_total_inactive_count,
160 'recordsTotalInactive': users_data_total_inactive_count,
161 'recordsFilteredInactive': users_data_total_filtered_inactive_count
161 'recordsFilteredInactive': users_data_total_filtered_inactive_count
162 })
162 })
163
163
164 return data
164 return data
165
165
166 def _set_personal_repo_group_template_vars(self, c_obj):
166 def _set_personal_repo_group_template_vars(self, c_obj):
167 DummyUser = AttributeDict({
167 DummyUser = AttributeDict({
168 'username': '${username}',
168 'username': '${username}',
169 'user_id': '${user_id}',
169 'user_id': '${user_id}',
170 })
170 })
171 c_obj.default_create_repo_group = RepoGroupModel() \
171 c_obj.default_create_repo_group = RepoGroupModel() \
172 .get_default_create_personal_repo_group()
172 .get_default_create_personal_repo_group()
173 c_obj.personal_repo_group_name = RepoGroupModel() \
173 c_obj.personal_repo_group_name = RepoGroupModel() \
174 .get_personal_group_name(DummyUser)
174 .get_personal_group_name(DummyUser)
175
175
176 @LoginRequired()
176 @LoginRequired()
177 @HasPermissionAllDecorator('hg.admin')
177 @HasPermissionAllDecorator('hg.admin')
178 def users_new(self):
178 def users_new(self):
179 _ = self.request.translate
179 _ = self.request.translate
180 c = self.load_default_context()
180 c = self.load_default_context()
181 c.default_extern_type = auth_rhodecode.RhodeCodeAuthPlugin.uid
181 c.default_extern_type = auth_rhodecode.RhodeCodeAuthPlugin.uid
182 self._set_personal_repo_group_template_vars(c)
182 self._set_personal_repo_group_template_vars(c)
183 return self._get_template_context(c)
183 return self._get_template_context(c)

    @LoginRequired()
    @HasPermissionAllDecorator('hg.admin')
    @CSRFRequired()
    def users_create(self):
        _ = self.request.translate
        c = self.load_default_context()
        c.default_extern_type = auth_rhodecode.RhodeCodeAuthPlugin.uid
        user_model = UserModel()
        user_form = UserForm(self.request.translate)()
        try:
            form_result = user_form.to_python(dict(self.request.POST))
            user = user_model.create(form_result)
            Session().flush()
            creation_data = user.get_api_data()
            username = form_result['username']

            audit_logger.store_web(
                'user.create', action_data={'data': creation_data},
                user=c.rhodecode_user)

            user_link = h.link_to(
                h.escape(username),
                h.route_path('user_edit', user_id=user.user_id))
            h.flash(h.literal(_('Created user %(user_link)s')
                              % {'user_link': user_link}), category='success')
            Session().commit()
        except formencode.Invalid as errors:
            self._set_personal_repo_group_template_vars(c)
            data = render(
                'rhodecode:templates/admin/users/user_add.mako',
                self._get_template_context(c), self.request)
            html = formencode.htmlfill.render(
                data,
                defaults=errors.value,
                errors=errors.error_dict or {},
                prefix_error=False,
                encoding="UTF-8",
                force_defaults=False
            )
            return Response(html)
        except UserCreationError as e:
            h.flash(e, 'error')
        except Exception:
            log.exception("Exception creation of user")
            h.flash(_('Error occurred during creation of user %s')
                    % self.request.POST.get('username'), category='error')
        raise HTTPFound(h.route_path('users'))


class UsersView(UserAppView):
    ALLOW_SCOPED_TOKENS = False
    """
    This view has an alternative version inside EE; if modified, please take
    a look in there as well.
    """

    def get_auth_plugins(self):
        valid_plugins = []
        authn_registry = get_authn_registry(self.request.registry)
        for plugin in authn_registry.get_plugins_for_authentication():
            if isinstance(plugin, RhodeCodeExternalAuthPlugin):
                valid_plugins.append(plugin)
            elif plugin.name == 'rhodecode':
                valid_plugins.append(plugin)

        # extend our choices if the user has a bound plugin which isn't
        # enabled at the moment
        extern_type = self.db_user.extern_type
        if extern_type not in [x.uid for x in valid_plugins]:
            try:
                plugin = authn_registry.get_plugin_by_uid(extern_type)
                if plugin:
                    valid_plugins.append(plugin)

            except Exception:
                log.exception(
                    'Could not extend user plugins with `{}`'.format(extern_type))
        return valid_plugins

    def load_default_context(self):
        req = self.request

        c = self._get_local_tmpl_context()
        c.allow_scoped_tokens = self.ALLOW_SCOPED_TOKENS
        c.allowed_languages = [
            ('en', 'English (en)'),
            ('de', 'German (de)'),
            ('fr', 'French (fr)'),
            ('it', 'Italian (it)'),
            ('ja', 'Japanese (ja)'),
            ('pl', 'Polish (pl)'),
            ('pt', 'Portuguese (pt)'),
            ('ru', 'Russian (ru)'),
            ('zh', 'Chinese (zh)'),
        ]

        c.allowed_extern_types = [
            (x.uid, x.get_display_name()) for x in self.get_auth_plugins()
        ]
        perms = req.registry.settings.get('available_permissions')
        if not perms:
            # inject info about available permissions
            auth.set_available_permissions(req.registry.settings)

        c.available_permissions = req.registry.settings['available_permissions']
        PermissionModel().set_global_permission_choices(
            c, gettext_translator=req.translate)

        return c

    @LoginRequired()
    @HasPermissionAllDecorator('hg.admin')
    @CSRFRequired()
    def user_update(self):
        _ = self.request.translate
        c = self.load_default_context()

        user_id = self.db_user_id
        c.user = self.db_user

        c.active = 'profile'
        c.extern_type = c.user.extern_type
        c.extern_name = c.user.extern_name
        c.perm_user = c.user.AuthUser(ip_addr=self.request.remote_addr)
        available_languages = [x[0] for x in c.allowed_languages]
        _form = UserForm(self.request.translate, edit=True,
                         available_languages=available_languages,
                         old_data={'user_id': user_id,
                                   'email': c.user.email})()

        c.edit_mode = self.request.POST.get('edit') == '1'
        form_result = {}
        old_values = c.user.get_api_data()
        try:
            form_result = _form.to_python(dict(self.request.POST))
            skip_attrs = ['extern_name']
            # TODO: plugin should define if username can be updated

            if c.extern_type != "rhodecode" and not c.edit_mode:
                # forbid updating username for external accounts
                skip_attrs.append('username')

            UserModel().update_user(
                user_id, skip_attrs=skip_attrs, **form_result)

            audit_logger.store_web(
                'user.edit', action_data={'old_data': old_values},
                user=c.rhodecode_user)

            Session().commit()
            h.flash(_('User updated successfully'), category='success')
        except formencode.Invalid as errors:
            data = render(
                'rhodecode:templates/admin/users/user_edit.mako',
                self._get_template_context(c), self.request)
            html = formencode.htmlfill.render(
                data,
                defaults=errors.value,
                errors=errors.error_dict or {},
                prefix_error=False,
                encoding="UTF-8",
                force_defaults=False
            )
            return Response(html)
        except UserCreationError as e:
            h.flash(e, 'error')
        except Exception:
            log.exception("Exception updating user")
            h.flash(_('Error occurred during update of user %s')
                    % form_result.get('username'), category='error')
        raise HTTPFound(h.route_path('user_edit', user_id=user_id))

    @LoginRequired()
    @HasPermissionAllDecorator('hg.admin')
    @CSRFRequired()
    def user_delete(self):
        _ = self.request.translate
        c = self.load_default_context()
        c.user = self.db_user

        _repos = c.user.repositories
        _repo_groups = c.user.repository_groups
        _user_groups = c.user.user_groups
        _pull_requests = c.user.user_pull_requests
        _artifacts = c.user.artifacts

        handle_repos = None
        handle_repo_groups = None
        handle_user_groups = None
        handle_pull_requests = None
        handle_artifacts = None

        # flash message helpers; the message depends on the handle mode,
        # either detach or delete
        def set_handle_flash_repos():
            handle = handle_repos
            if handle == 'detach':
                h.flash(_('Detached %s repositories') % len(_repos),
                        category='success')
            elif handle == 'delete':
                h.flash(_('Deleted %s repositories') % len(_repos),
                        category='success')

        def set_handle_flash_repo_groups():
            handle = handle_repo_groups
            if handle == 'detach':
                h.flash(_('Detached %s repository groups') % len(_repo_groups),
                        category='success')
            elif handle == 'delete':
                h.flash(_('Deleted %s repository groups') % len(_repo_groups),
                        category='success')

        def set_handle_flash_user_groups():
            handle = handle_user_groups
            if handle == 'detach':
                h.flash(_('Detached %s user groups') % len(_user_groups),
                        category='success')
            elif handle == 'delete':
                h.flash(_('Deleted %s user groups') % len(_user_groups),
                        category='success')

        def set_handle_flash_pull_requests():
            handle = handle_pull_requests
            if handle == 'detach':
                h.flash(_('Detached %s pull requests') % len(_pull_requests),
                        category='success')
            elif handle == 'delete':
                h.flash(_('Deleted %s pull requests') % len(_pull_requests),
                        category='success')

        def set_handle_flash_artifacts():
            handle = handle_artifacts
            if handle == 'detach':
                h.flash(_('Detached %s artifacts') % len(_artifacts),
                        category='success')
            elif handle == 'delete':
                h.flash(_('Deleted %s artifacts') % len(_artifacts),
                        category='success')

        handle_user = User.get_first_super_admin()
        handle_user_id = safe_int(self.request.POST.get('detach_user_id'))
        if handle_user_id:
            # NOTE(marcink): we get new owner for objects...
            handle_user = User.get_or_404(handle_user_id)

        if _repos and self.request.POST.get('user_repos'):
            handle_repos = self.request.POST['user_repos']

        if _repo_groups and self.request.POST.get('user_repo_groups'):
            handle_repo_groups = self.request.POST['user_repo_groups']

        if _user_groups and self.request.POST.get('user_user_groups'):
            handle_user_groups = self.request.POST['user_user_groups']

        if _pull_requests and self.request.POST.get('user_pull_requests'):
            handle_pull_requests = self.request.POST['user_pull_requests']

        if _artifacts and self.request.POST.get('user_artifacts'):
            handle_artifacts = self.request.POST['user_artifacts']

        old_values = c.user.get_api_data()

        try:

            UserModel().delete(
                c.user,
                handle_repos=handle_repos,
                handle_repo_groups=handle_repo_groups,
                handle_user_groups=handle_user_groups,
                handle_pull_requests=handle_pull_requests,
                handle_artifacts=handle_artifacts,
                handle_new_owner=handle_user
            )

            audit_logger.store_web(
                'user.delete', action_data={'old_data': old_values},
                user=c.rhodecode_user)

            Session().commit()
            set_handle_flash_repos()
            set_handle_flash_repo_groups()
            set_handle_flash_user_groups()
            set_handle_flash_pull_requests()
            set_handle_flash_artifacts()
            username = h.escape(old_values['username'])
            h.flash(_('Successfully deleted user `{}`').format(username), category='success')
        except (UserOwnsReposException, UserOwnsRepoGroupsException,
                UserOwnsUserGroupsException, UserOwnsPullRequestsException,
                UserOwnsArtifactsException, DefaultUserException) as e:
            h.flash(e, category='warning')
        except Exception:
            log.exception("Exception during deletion of user")
            h.flash(_('An error occurred during deletion of user'),
                    category='error')
        raise HTTPFound(h.route_path('users'))

    @LoginRequired()
    @HasPermissionAllDecorator('hg.admin')
    def user_edit(self):
        _ = self.request.translate
        c = self.load_default_context()
        c.user = self.db_user

        c.active = 'profile'
        c.extern_type = c.user.extern_type
        c.extern_name = c.user.extern_name
        c.perm_user = c.user.AuthUser(ip_addr=self.request.remote_addr)
        c.edit_mode = self.request.GET.get('edit') == '1'

        defaults = c.user.get_dict()
        defaults.update({'language': c.user.user_data.get('language')})

        data = render(
            'rhodecode:templates/admin/users/user_edit.mako',
            self._get_template_context(c), self.request)
        html = formencode.htmlfill.render(
            data,
            defaults=defaults,
            encoding="UTF-8",
            force_defaults=False
        )
        return Response(html)

    @LoginRequired()
    @HasPermissionAllDecorator('hg.admin')
    def user_edit_advanced(self):
        _ = self.request.translate
        c = self.load_default_context()

        user_id = self.db_user_id
        c.user = self.db_user

        c.detach_user = User.get_first_super_admin()
        detach_user_id = safe_int(self.request.GET.get('detach_user_id'))
        if detach_user_id:
            c.detach_user = User.get_or_404(detach_user_id)

        c.active = 'advanced'
        c.personal_repo_group = RepoGroup.get_user_personal_repo_group(user_id)
        c.personal_repo_group_name = RepoGroupModel()\
            .get_personal_group_name(c.user)

        c.user_to_review_rules = sorted(
            (x.user for x in c.user.user_review_rules),
            key=lambda u: u.username.lower())

        defaults = c.user.get_dict()

        # Interim workaround if the user participated in any pull requests
        # as a reviewer.
        has_review = len(c.user.reviewer_pull_requests)
        c.can_delete_user = not has_review
        c.can_delete_user_message = ''
        inactive_link = h.link_to(
            'inactive', h.route_path('user_edit', user_id=user_id, _anchor='active'))
        if has_review == 1:
            c.can_delete_user_message = h.literal(_(
                'The user participates as reviewer in {} pull request and '
                'cannot be deleted. \nYou can set the user to '
                '"{}" instead of deleting it.').format(
                has_review, inactive_link))
        elif has_review:
            c.can_delete_user_message = h.literal(_(
                'The user participates as reviewer in {} pull requests and '
                'cannot be deleted. \nYou can set the user to '
                '"{}" instead of deleting it.').format(
                has_review, inactive_link))

        data = render(
            'rhodecode:templates/admin/users/user_edit.mako',
            self._get_template_context(c), self.request)
        html = formencode.htmlfill.render(
            data,
            defaults=defaults,
            encoding="UTF-8",
            force_defaults=False
        )
        return Response(html)

    @LoginRequired()
    @HasPermissionAllDecorator('hg.admin')
    def user_edit_global_perms(self):
        _ = self.request.translate
        c = self.load_default_context()
        c.user = self.db_user

        c.active = 'global_perms'

        c.default_user = User.get_default_user()
        defaults = c.user.get_dict()
        defaults.update(c.default_user.get_default_perms(suffix='_inherited'))
        defaults.update(c.default_user.get_default_perms())
        defaults.update(c.user.get_default_perms())

        data = render(
            'rhodecode:templates/admin/users/user_edit.mako',
            self._get_template_context(c), self.request)
        html = formencode.htmlfill.render(
            data,
            defaults=defaults,
            encoding="UTF-8",
            force_defaults=False
        )
        return Response(html)

    @LoginRequired()
    @HasPermissionAllDecorator('hg.admin')
    @CSRFRequired()
    def user_edit_global_perms_update(self):
        _ = self.request.translate
        c = self.load_default_context()

        user_id = self.db_user_id
        c.user = self.db_user

        c.active = 'global_perms'
        try:
            # first stage that verifies the checkbox
            _form = UserIndividualPermissionsForm(self.request.translate)
            form_result = _form.to_python(dict(self.request.POST))
            inherit_perms = form_result['inherit_default_permissions']
            c.user.inherit_default_permissions = inherit_perms
            Session().add(c.user)

            if not inherit_perms:
                # only update the individual ones if we uncheck the flag
                _form = UserPermissionsForm(
                    self.request.translate,
                    [x[0] for x in c.repo_create_choices],
                    [x[0] for x in c.repo_create_on_write_choices],
                    [x[0] for x in c.repo_group_create_choices],
                    [x[0] for x in c.user_group_create_choices],
                    [x[0] for x in c.fork_choices],
                    [x[0] for x in c.inherit_default_permission_choices])()

                form_result = _form.to_python(dict(self.request.POST))
                form_result.update({'perm_user_id': c.user.user_id})

                PermissionModel().update_user_permissions(form_result)

            # TODO(marcink): implement global permissions
            # audit_log.store_web('user.edit.permissions')

            Session().commit()

            h.flash(_('User global permissions updated successfully'),
                    category='success')

        except formencode.Invalid as errors:
            data = render(
                'rhodecode:templates/admin/users/user_edit.mako',
                self._get_template_context(c), self.request)
            html = formencode.htmlfill.render(
                data,
                defaults=errors.value,
                errors=errors.error_dict or {},
                prefix_error=False,
                encoding="UTF-8",
                force_defaults=False
            )
            return Response(html)
        except Exception:
            log.exception("Exception during permissions saving")
            h.flash(_('An error occurred during permissions saving'),
                    category='error')

        affected_user_ids = [user_id]
        PermissionModel().trigger_permission_flush(affected_user_ids)
        raise HTTPFound(h.route_path('user_edit_global_perms', user_id=user_id))

    @LoginRequired()
    @HasPermissionAllDecorator('hg.admin')
    @CSRFRequired()
    def user_enable_force_password_reset(self):
        _ = self.request.translate
        c = self.load_default_context()

        user_id = self.db_user_id
        c.user = self.db_user

        try:
            c.user.update_userdata(force_password_change=True)

            msg = _('Force password change enabled for user')
            audit_logger.store_web('user.edit.password_reset.enabled',
                                   user=c.rhodecode_user)

            Session().commit()
            h.flash(msg, category='success')
        except Exception:
            log.exception("Exception during password reset for user")
            h.flash(_('An error occurred during password reset for user'),
                    category='error')

        raise HTTPFound(h.route_path('user_edit_advanced', user_id=user_id))

    @LoginRequired()
    @HasPermissionAllDecorator('hg.admin')
    @CSRFRequired()
    def user_disable_force_password_reset(self):
        _ = self.request.translate
        c = self.load_default_context()

        user_id = self.db_user_id
        c.user = self.db_user

        try:
            c.user.update_userdata(force_password_change=False)

            msg = _('Force password change disabled for user')
            audit_logger.store_web(
                'user.edit.password_reset.disabled',
                user=c.rhodecode_user)

            Session().commit()
            h.flash(msg, category='success')
        except Exception:
            log.exception("Exception during password reset for user")
            h.flash(_('An error occurred during password reset for user'),
                    category='error')

        raise HTTPFound(h.route_path('user_edit_advanced', user_id=user_id))

    @LoginRequired()
    @HasPermissionAllDecorator('hg.admin')
    @CSRFRequired()
    def user_notice_dismiss(self):
        _ = self.request.translate
        c = self.load_default_context()

        user_id = self.db_user_id
        c.user = self.db_user
        user_notice_id = safe_int(self.request.POST.get('notice_id'))
        notice = UserNotice().query()\
            .filter(UserNotice.user_id == user_id)\
            .filter(UserNotice.user_notice_id == user_notice_id)\
            .scalar()
        read = False
        if notice:
            notice.notice_read = True
            Session().add(notice)
            Session().commit()
            read = True

        return {'notice': user_notice_id, 'read': read}

    @LoginRequired()
    @HasPermissionAllDecorator('hg.admin')
    @CSRFRequired()
    def user_create_personal_repo_group(self):
        """
        Create personal repository group for this user
        """
        from rhodecode.model.repo_group import RepoGroupModel

        _ = self.request.translate
        c = self.load_default_context()

        user_id = self.db_user_id
        c.user = self.db_user

        personal_repo_group = RepoGroup.get_user_personal_repo_group(
            c.user.user_id)
        if personal_repo_group:
            raise HTTPFound(h.route_path('user_edit_advanced', user_id=user_id))

        personal_repo_group_name = RepoGroupModel().get_personal_group_name(c.user)
        named_personal_group = RepoGroup.get_by_group_name(
            personal_repo_group_name)
        try:

            if named_personal_group and named_personal_group.user_id == c.user.user_id:
                # migrate the same named group, and mark it as personal
                named_personal_group.personal = True
                Session().add(named_personal_group)
                Session().commit()
                # interpolate after the `_()` lookup so the translation
                # catalog is queried with the raw template string
                msg = _('Linked repository group `%s` as personal') % (
                    personal_repo_group_name,)
                h.flash(msg, category='success')
            elif not named_personal_group:
                RepoGroupModel().create_personal_repo_group(c.user)

                msg = _('Created repository group `%s`') % (
                    personal_repo_group_name,)
                h.flash(msg, category='success')
            else:
                msg = _('Repository group `%s` is already taken') % (
                    personal_repo_group_name,)
                h.flash(msg, category='warning')
        except Exception:
            log.exception("Exception during repository group creation")
            msg = _(
                'An error occurred during repository group creation for user')
            h.flash(msg, category='error')
            Session().rollback()

        raise HTTPFound(h.route_path('user_edit_advanced', user_id=user_id))

    @LoginRequired()
    @HasPermissionAllDecorator('hg.admin')
    def auth_tokens(self):
        _ = self.request.translate
        c = self.load_default_context()
        c.user = self.db_user

        c.active = 'auth_tokens'

        c.lifetime_values = AuthTokenModel.get_lifetime_values(translator=_)
        c.role_values = [
            (x, AuthTokenModel.cls._get_role_name(x))
            for x in AuthTokenModel.cls.ROLES]
        c.role_options = [(c.role_values, _("Role"))]
        c.user_auth_tokens = AuthTokenModel().get_auth_tokens(
            c.user.user_id, show_expired=True)
        c.role_vcs = AuthTokenModel.cls.ROLE_VCS
        return self._get_template_context(c)

    @LoginRequired()
    @HasPermissionAllDecorator('hg.admin')
    def auth_tokens_view(self):
        _ = self.request.translate
        c = self.load_default_context()
        c.user = self.db_user

        auth_token_id = self.request.POST.get('auth_token_id')

        if auth_token_id:
            token = UserApiKeys.get_or_404(auth_token_id)

            return {
                'auth_token': token.api_key
            }

    def maybe_attach_token_scope(self, token):
        # implemented in EE edition
        pass

    @LoginRequired()
    @HasPermissionAllDecorator('hg.admin')
    @CSRFRequired()
    def auth_tokens_add(self):
        _ = self.request.translate
        c = self.load_default_context()

        user_id = self.db_user_id
        c.user = self.db_user

        user_data = c.user.get_api_data()
        lifetime = safe_int(self.request.POST.get('lifetime'), -1)
        description = self.request.POST.get('description')
        role = self.request.POST.get('role')

        token = UserModel().add_auth_token(
            user=c.user.user_id,
            lifetime_minutes=lifetime, role=role, description=description,
            scope_callback=self.maybe_attach_token_scope)
        token_data = token.get_api_data()

        audit_logger.store_web(
            'user.edit.token.add', action_data={
                'data': {'token': token_data, 'user': user_data}},
            user=self._rhodecode_user, )
        Session().commit()

        h.flash(_("Auth token successfully created"), category='success')
        return HTTPFound(h.route_path('edit_user_auth_tokens', user_id=user_id))

    @LoginRequired()
    @HasPermissionAllDecorator('hg.admin')
    @CSRFRequired()
    def auth_tokens_delete(self):
        _ = self.request.translate
        c = self.load_default_context()

        user_id = self.db_user_id
        c.user = self.db_user

        user_data = c.user.get_api_data()

        del_auth_token = self.request.POST.get('del_auth_token')

        if del_auth_token:
            token = UserApiKeys.get_or_404(del_auth_token)
            token_data = token.get_api_data()

            AuthTokenModel().delete(del_auth_token, c.user.user_id)
            audit_logger.store_web(
                'user.edit.token.delete', action_data={
                    'data': {'token': token_data, 'user': user_data}},
                user=self._rhodecode_user,)
            Session().commit()
            h.flash(_("Auth token successfully deleted"), category='success')

        return HTTPFound(h.route_path('edit_user_auth_tokens', user_id=user_id))

    @LoginRequired()
    @HasPermissionAllDecorator('hg.admin')
    def ssh_keys(self):
        _ = self.request.translate
        c = self.load_default_context()
        c.user = self.db_user

        c.active = 'ssh_keys'
        c.default_key = self.request.GET.get('default_key')
        c.user_ssh_keys = SshKeyModel().get_ssh_keys(c.user.user_id)
        return self._get_template_context(c)

    @LoginRequired()
    @HasPermissionAllDecorator('hg.admin')
    def ssh_keys_generate_keypair(self):
        _ = self.request.translate
        c = self.load_default_context()

        c.user = self.db_user

        c.active = 'ssh_keys_generate'
        comment = 'RhodeCode-SSH {}'.format(c.user.email or '')
        private_format = self.request.GET.get('private_format') \
            or SshKeyModel.DEFAULT_PRIVATE_KEY_FORMAT
        c.private, c.public = SshKeyModel().generate_keypair(
            comment=comment, private_format=private_format)

        return self._get_template_context(c)

    @LoginRequired()
    @HasPermissionAllDecorator('hg.admin')
    @CSRFRequired()
    def ssh_keys_add(self):
        _ = self.request.translate
        c = self.load_default_context()

        user_id = self.db_user_id
        c.user = self.db_user

        user_data = c.user.get_api_data()
        key_data = self.request.POST.get('key_data')
        description = self.request.POST.get('description')

        fingerprint = 'unknown'
        try:
            if not key_data:
                raise ValueError('Please add a valid public key')

            key = SshKeyModel().parse_key(key_data.strip())
            fingerprint = key.hash_md5()

            ssh_key = SshKeyModel().create(
                c.user.user_id, fingerprint, key.keydata, description)
            ssh_key_data = ssh_key.get_api_data()

            audit_logger.store_web(
                'user.edit.ssh_key.add', action_data={
                    'data': {'ssh_key': ssh_key_data, 'user': user_data}},
                user=self._rhodecode_user, )
            Session().commit()

            # Trigger an event on change of keys.
            trigger(SshKeyFileChangeEvent(), self.request.registry)

            h.flash(_("Ssh Key successfully created"), category='success')

        except IntegrityError:
            log.exception("Exception during ssh key saving")
            err = 'Such key with fingerprint `{}` already exists, ' \
                  'please use a different one'.format(fingerprint)
            h.flash(_('An error occurred during ssh key saving: {}').format(err),
                    category='error')
        except Exception as e:
            log.exception("Exception during ssh key saving")
            h.flash(_('An error occurred during ssh key saving: {}').format(e),
                    category='error')

        return HTTPFound(
            h.route_path('edit_user_ssh_keys', user_id=user_id))

    @LoginRequired()
    @HasPermissionAllDecorator('hg.admin')
    @CSRFRequired()
    def ssh_keys_delete(self):
        _ = self.request.translate
        c = self.load_default_context()

        user_id = self.db_user_id
        c.user = self.db_user

        user_data = c.user.get_api_data()

        del_ssh_key = self.request.POST.get('del_ssh_key')

        if del_ssh_key:
            ssh_key = UserSshKeys.get_or_404(del_ssh_key)
            ssh_key_data = ssh_key.get_api_data()

            SshKeyModel().delete(del_ssh_key, c.user.user_id)
            audit_logger.store_web(
                'user.edit.ssh_key.delete', action_data={
                    'data': {'ssh_key': ssh_key_data, 'user': user_data}},
                user=self._rhodecode_user,)
            Session().commit()
            # Trigger an event on change of keys.
            trigger(SshKeyFileChangeEvent(), self.request.registry)
            h.flash(_("Ssh key successfully deleted"), category='success')

        return HTTPFound(h.route_path('edit_user_ssh_keys', user_id=user_id))

    @LoginRequired()
    @HasPermissionAllDecorator('hg.admin')
    def emails(self):
        _ = self.request.translate
        c = self.load_default_context()
        c.user = self.db_user

        c.active = 'emails'
        c.user_email_map = UserEmailMap.query() \
            .filter(UserEmailMap.user == c.user).all()

        return self._get_template_context(c)

    @LoginRequired()
    @HasPermissionAllDecorator('hg.admin')
    @CSRFRequired()
    def emails_add(self):
        _ = self.request.translate
        c = self.load_default_context()

        user_id = self.db_user_id
        c.user = self.db_user

        email = self.request.POST.get('new_email')
        user_data = c.user.get_api_data()
        try:

            form = UserExtraEmailForm(self.request.translate)()
            data = form.to_python({'email': email})
            email = data['email']

            UserModel().add_extra_email(c.user.user_id, email)
            audit_logger.store_web(
                'user.edit.email.add',
                action_data={'email': email, 'user': user_data},
                user=self._rhodecode_user)
            Session().commit()
            h.flash(_("Added new email address `%s` for user account") % email,
                    category='success')
        except formencode.Invalid as error:
            h.flash(h.escape(error.error_dict['email']), category='error')
        except IntegrityError:
            log.warning("Email %s already exists", email)
            h.flash(_('Email `{}` is already registered for another user.').format(email),
                    category='error')
        except Exception:
            log.exception("Exception during email saving")
            h.flash(_('An error occurred during email saving'),
                    category='error')
        raise HTTPFound(h.route_path('edit_user_emails', user_id=user_id))

    @LoginRequired()
    @HasPermissionAllDecorator('hg.admin')
    @CSRFRequired()
    def emails_delete(self):
        _ = self.request.translate
        c = self.load_default_context()

        user_id = self.db_user_id
        c.user = self.db_user

        email_id = self.request.POST.get('del_email_id')
        user_model = UserModel()

        email = UserEmailMap.query().get(email_id).email
        user_data = c.user.get_api_data()
        user_model.delete_extra_email(c.user.user_id, email_id)
        audit_logger.store_web(
            'user.edit.email.delete',
            action_data={'email': email, 'user': user_data},
            user=self._rhodecode_user)
        Session().commit()
        h.flash(_("Removed email address from user account"),
                category='success')
        raise HTTPFound(h.route_path('edit_user_emails', user_id=user_id))

    @LoginRequired()
    @HasPermissionAllDecorator('hg.admin')
    def ips(self):
        _ = self.request.translate
        c = self.load_default_context()
        c.user = self.db_user

        c.active = 'ips'
        c.user_ip_map = UserIpMap.query() \
            .filter(UserIpMap.user == c.user).all()

        c.inherit_default_ips = c.user.inherit_default_permissions
        c.default_user_ip_map = UserIpMap.query() \
            .filter(UserIpMap.user == User.get_default_user()).all()

        return self._get_template_context(c)

    @LoginRequired()
    @HasPermissionAllDecorator('hg.admin')
    @CSRFRequired()
    # NOTE(marcink): this view is allowed for default users, as we can
    # edit their IP white list
    def ips_add(self):
        _ = self.request.translate
        c = self.load_default_context()

        user_id = self.db_user_id
        c.user = self.db_user

        user_model = UserModel()
        desc = self.request.POST.get('description')
        try:
            ip_list = user_model.parse_ip_range(
                self.request.POST.get('new_ip'))
        except Exception as e:
            ip_list = []
            log.exception("Exception during ip saving")
            # interpolate after the `_()` lookup so the translation
            # catalog is queried with the raw template string
            h.flash(_('An error occurred during ip saving: %s') % (e,),
                    category='error')
        added = []
        user_data = c.user.get_api_data()
        for ip in ip_list:
            try:
                form = UserExtraIpForm(self.request.translate)()
                data = form.to_python({'ip': ip})
                ip = data['ip']

                user_model.add_extra_ip(c.user.user_id, ip, desc)
                audit_logger.store_web(
                    'user.edit.ip.add',
                    action_data={'ip': ip, 'user': user_data},
                    user=self._rhodecode_user)
                Session().commit()
                added.append(ip)
            except formencode.Invalid as error:
                msg = error.error_dict['ip']
                h.flash(msg, category='error')
            except Exception:
                log.exception("Exception during ip saving")
                h.flash(_('An error occurred during ip saving'),
                        category='error')
        if added:
            # report only the entries that were actually stored
            h.flash(
                _("Added ips %s to user whitelist") % (', '.join(added), ),
                category='success')
        if 'default_user' in self.request.POST:
            # when editing the global IP list we do it for the 'DEFAULT' user
            raise HTTPFound(h.route_path('admin_permissions_ips'))
        raise HTTPFound(h.route_path('edit_user_ips', user_id=user_id))

    @LoginRequired()
    @HasPermissionAllDecorator('hg.admin')
    @CSRFRequired()
    # NOTE(marcink): this view is allowed for default users, as we can
    # edit their IP white list
    def ips_delete(self):
        _ = self.request.translate
        c = self.load_default_context()

        user_id = self.db_user_id
        c.user = self.db_user

        ip_id = self.request.POST.get('del_ip_id')
        user_model = UserModel()
        user_data = c.user.get_api_data()
        ip = UserIpMap.query().get(ip_id).ip_addr
        user_model.delete_extra_ip(c.user.user_id, ip_id)
        audit_logger.store_web(
            'user.edit.ip.delete', action_data={'ip': ip, 'user': user_data},
            user=self._rhodecode_user)
        Session().commit()
        h.flash(_("Removed ip address from user whitelist"), category='success')

        if 'default_user' in self.request.POST:
            # when editing the global IP list we do it for the 'DEFAULT' user
            raise HTTPFound(h.route_path('admin_permissions_ips'))
        raise HTTPFound(h.route_path('edit_user_ips', user_id=user_id))

    @LoginRequired()
    @HasPermissionAllDecorator('hg.admin')
    def groups_management(self):
        c = self.load_default_context()
        c.user = self.db_user
        c.data = c.user.group_member

        groups = [UserGroupModel.get_user_groups_as_dict(group.users_group)
                  for group in c.user.group_member]
        c.groups = json.dumps(groups)
        c.active = 'groups'

        return self._get_template_context(c)

    @LoginRequired()
    @HasPermissionAllDecorator('hg.admin')
    @CSRFRequired()
    def groups_management_updates(self):
        _ = self.request.translate
        c = self.load_default_context()

        user_id = self.db_user_id
        c.user = self.db_user

        user_groups = set(self.request.POST.getall('users_group_id'))
        user_groups_objects = []

        for ugid in user_groups:
            user_groups_objects.append(
                UserGroupModel().get_group(safe_int(ugid)))
        user_group_model = UserGroupModel()
        added_to_groups, removed_from_groups = \
            user_group_model.change_groups(c.user, user_groups_objects)

        user_data = c.user.get_api_data()
        for user_group_id in added_to_groups:
            user_group = UserGroup.get(user_group_id)
            old_values = user_group.get_api_data()
            audit_logger.store_web(
                'user_group.edit.member.add',
                action_data={'user': user_data, 'old_data': old_values},
                user=self._rhodecode_user)

        for user_group_id in removed_from_groups:
            user_group = UserGroup.get(user_group_id)
            old_values = user_group.get_api_data()
            audit_logger.store_web(
                'user_group.edit.member.delete',
1207 action_data={'user': user_data, 'old_data': old_values},
1211 action_data={'user': user_data, 'old_data': old_values},
1208 user=self._rhodecode_user)
1212 user=self._rhodecode_user)
1209
1213
1210 Session().commit()
1214 Session().commit()
1211 c.active = 'user_groups_management'
1215 c.active = 'user_groups_management'
1212 h.flash(_("Groups successfully changed"), category='success')
1216 h.flash(_("Groups successfully changed"), category='success')
1213
1217
1214 return HTTPFound(h.route_path(
1218 return HTTPFound(h.route_path(
1215 'edit_user_groups_management', user_id=user_id))
1219 'edit_user_groups_management', user_id=user_id))
1216
1220
1217 @LoginRequired()
1221 @LoginRequired()
1218 @HasPermissionAllDecorator('hg.admin')
1222 @HasPermissionAllDecorator('hg.admin')
1219 def user_audit_logs(self):
1223 def user_audit_logs(self):
1220 _ = self.request.translate
1224 _ = self.request.translate
1221 c = self.load_default_context()
1225 c = self.load_default_context()
1222 c.user = self.db_user
1226 c.user = self.db_user
1223
1227
1224 c.active = 'audit'
1228 c.active = 'audit'
1225
1229
1226 p = safe_int(self.request.GET.get('page', 1), 1)
1230 p = safe_int(self.request.GET.get('page', 1), 1)
1227
1231
1228 filter_term = self.request.GET.get('filter')
1232 filter_term = self.request.GET.get('filter')
1229 user_log = UserModel().get_user_log(c.user, filter_term)
1233 user_log = UserModel().get_user_log(c.user, filter_term)
1230
1234
1231 def url_generator(page_num):
1235 def url_generator(page_num):
1232 query_params = {
1236 query_params = {
1233 'page': page_num
1237 'page': page_num
1234 }
1238 }
1235 if filter_term:
1239 if filter_term:
1236 query_params['filter'] = filter_term
1240 query_params['filter'] = filter_term
1237 return self.request.current_route_path(_query=query_params)
1241 return self.request.current_route_path(_query=query_params)
1238
1242
1239 c.audit_logs = SqlPage(
1243 c.audit_logs = SqlPage(
1240 user_log, page=p, items_per_page=10, url_maker=url_generator)
1244 user_log, page=p, items_per_page=10, url_maker=url_generator)
1241 c.filter_term = filter_term
1245 c.filter_term = filter_term
1242 return self._get_template_context(c)
1246 return self._get_template_context(c)
1243
1247
1244 @LoginRequired()
1248 @LoginRequired()
1245 @HasPermissionAllDecorator('hg.admin')
1249 @HasPermissionAllDecorator('hg.admin')
1246 def user_audit_logs_download(self):
1250 def user_audit_logs_download(self):
1247 _ = self.request.translate
1251 _ = self.request.translate
1248 c = self.load_default_context()
1252 c = self.load_default_context()
1249 c.user = self.db_user
1253 c.user = self.db_user
1250
1254
1251 user_log = UserModel().get_user_log(c.user, filter_term=None)
1255 user_log = UserModel().get_user_log(c.user, filter_term=None)
1252
1256
1253 audit_log_data = {}
1257 audit_log_data = {}
1254 for entry in user_log:
1258 for entry in user_log:
1255 audit_log_data[entry.user_log_id] = entry.get_dict()
1259 audit_log_data[entry.user_log_id] = entry.get_dict()
1256
1260
1257 response = Response(json.dumps(audit_log_data, indent=4))
1261 response = Response(json.dumps(audit_log_data, indent=4))
1258 response.content_disposition = str(
1262 response.content_disposition = str(
1259 'attachment; filename=%s' % 'user_{}_audit_logs.json'.format(c.user.user_id))
1263 'attachment; filename=%s' % 'user_{}_audit_logs.json'.format(c.user.user_id))
1260 response.content_type = 'application/json'
1264 response.content_type = 'application/json'
1261
1265
1262 return response
1266 return response
1263
1267
1264 @LoginRequired()
1268 @LoginRequired()
1265 @HasPermissionAllDecorator('hg.admin')
1269 @HasPermissionAllDecorator('hg.admin')
1266 def user_perms_summary(self):
1270 def user_perms_summary(self):
1267 _ = self.request.translate
1271 _ = self.request.translate
1268 c = self.load_default_context()
1272 c = self.load_default_context()
1269 c.user = self.db_user
1273 c.user = self.db_user
1270
1274
1271 c.active = 'perms_summary'
1275 c.active = 'perms_summary'
1272 c.perm_user = c.user.AuthUser(ip_addr=self.request.remote_addr)
1276 c.perm_user = c.user.AuthUser(ip_addr=self.request.remote_addr)
1273
1277
1274 return self._get_template_context(c)
1278 return self._get_template_context(c)
1275
1279
1276 @LoginRequired()
1280 @LoginRequired()
1277 @HasPermissionAllDecorator('hg.admin')
1281 @HasPermissionAllDecorator('hg.admin')
1278 def user_perms_summary_json(self):
1282 def user_perms_summary_json(self):
1279 self.load_default_context()
1283 self.load_default_context()
1280 perm_user = self.db_user.AuthUser(ip_addr=self.request.remote_addr)
1284 perm_user = self.db_user.AuthUser(ip_addr=self.request.remote_addr)
1281
1285
1282 return perm_user.permissions
1286 return perm_user.permissions
1283
1287
1284 @LoginRequired()
1288 @LoginRequired()
1285 @HasPermissionAllDecorator('hg.admin')
1289 @HasPermissionAllDecorator('hg.admin')
1286 def user_caches(self):
1290 def user_caches(self):
1287 _ = self.request.translate
1291 _ = self.request.translate
1288 c = self.load_default_context()
1292 c = self.load_default_context()
1289 c.user = self.db_user
1293 c.user = self.db_user
1290
1294
1291 c.active = 'caches'
1295 c.active = 'caches'
1292 c.perm_user = c.user.AuthUser(ip_addr=self.request.remote_addr)
1296 c.perm_user = c.user.AuthUser(ip_addr=self.request.remote_addr)
1293
1297
1294 cache_namespace_uid = 'cache_user_auth.{}'.format(self.db_user.user_id)
1298 cache_namespace_uid = 'cache_user_auth.{}'.format(self.db_user.user_id)
1295 c.region = rc_cache.get_or_create_region('cache_perms', cache_namespace_uid)
1299 c.region = rc_cache.get_or_create_region('cache_perms', cache_namespace_uid)
1296 c.backend = c.region.backend
1300 c.backend = c.region.backend
1297 c.user_keys = sorted(c.region.backend.list_keys(prefix=cache_namespace_uid))
1301 c.user_keys = sorted(c.region.backend.list_keys(prefix=cache_namespace_uid))
1298
1302
1299 return self._get_template_context(c)
1303 return self._get_template_context(c)
1300
1304
1301 @LoginRequired()
1305 @LoginRequired()
1302 @HasPermissionAllDecorator('hg.admin')
1306 @HasPermissionAllDecorator('hg.admin')
1303 @CSRFRequired()
1307 @CSRFRequired()
1304 def user_caches_update(self):
1308 def user_caches_update(self):
1305 _ = self.request.translate
1309 _ = self.request.translate
1306 c = self.load_default_context()
1310 c = self.load_default_context()
1307 c.user = self.db_user
1311 c.user = self.db_user
1308
1312
1309 c.active = 'caches'
1313 c.active = 'caches'
1310 c.perm_user = c.user.AuthUser(ip_addr=self.request.remote_addr)
1314 c.perm_user = c.user.AuthUser(ip_addr=self.request.remote_addr)
1311
1315
1312 cache_namespace_uid = 'cache_user_auth.{}'.format(self.db_user.user_id)
1316 cache_namespace_uid = 'cache_user_auth.{}'.format(self.db_user.user_id)
1313 del_keys = rc_cache.clear_cache_namespace('cache_perms', cache_namespace_uid)
1317 del_keys = rc_cache.clear_cache_namespace('cache_perms', cache_namespace_uid)
1314
1318
1315 h.flash(_("Deleted {} cache keys").format(del_keys), category='success')
1319 h.flash(_("Deleted {} cache keys").format(del_keys), category='success')
1316
1320
1317 return HTTPFound(h.route_path(
1321 return HTTPFound(h.route_path(
1318 'edit_user_caches', user_id=c.user.user_id))
1322 'edit_user_caches', user_id=c.user.user_id))
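The `user_caches` / `user_caches_update` views above rely on per-user cache namespaces: every key is prefixed with `cache_user_auth.<user_id>`, so listing and clearing one user's cached permissions never touches another user's entries. A minimal standalone sketch of that prefix-scoped invalidation pattern (a hypothetical in-memory store, not RhodeCode's `rc_cache` API):

```python
# Illustrative only: mimics list_keys(prefix=...) and namespace clearing
# as used by the admin "caches" views, without any real cache backend.

class NamespacedCache:
    def __init__(self):
        self._store = {}

    def set(self, key, value):
        self._store[key] = value

    def list_keys(self, prefix=''):
        # keys are namespaced, so a prefix scan isolates one user
        return sorted(k for k in self._store if k.startswith(prefix))

    def clear_namespace(self, prefix):
        doomed = self.list_keys(prefix=prefix)
        for key in doomed:
            del self._store[key]
        return len(doomed)  # number of deleted keys, as flashed in the UI


cache = NamespacedCache()
cache.set('cache_user_auth.1:perms', {'repo': 'read'})
cache.set('cache_user_auth.1:ip', '10.0.0.1')
cache.set('cache_user_auth.2:perms', {'repo': 'admin'})

deleted = cache.clear_namespace('cache_user_auth.1')
```

Clearing user 1's namespace removes both of that user's keys and leaves user 2's entry intact.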
@@ -1,83 +1,93 b''
 # -*- coding: utf-8 -*-
 
 # Copyright (C) 2011-2020 RhodeCode GmbH
 #
 # This program is free software: you can redistribute it and/or modify
 # it under the terms of the GNU Affero General Public License, version 3
 # (only), as published by the Free Software Foundation.
 #
 # This program is distributed in the hope that it will be useful,
 # but WITHOUT ANY WARRANTY; without even the implied warranty of
 # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
 # GNU General Public License for more details.
 #
 # You should have received a copy of the GNU Affero General Public License
 # along with this program. If not, see <http://www.gnu.org/licenses/>.
 #
 # This program is dual-licensed. If you wish to learn more about the
 # RhodeCode Enterprise Edition, including its added features, Support services,
 # and proprietary license terms, please see https://rhodecode.com/licenses/
 
 import os
 import logging
 
 from pyramid.httpexceptions import HTTPFound
 
 
 from rhodecode.apps._base import RepoAppView
 from rhodecode.lib.auth import (
     LoginRequired, HasRepoPermissionAnyDecorator, CSRFRequired)
 from rhodecode.lib import helpers as h, rc_cache
 from rhodecode.lib import system_info
 from rhodecode.model.meta import Session
 from rhodecode.model.scm import ScmModel
 
 log = logging.getLogger(__name__)
 
 
 class RepoCachesView(RepoAppView):
     def load_default_context(self):
         c = self._get_local_tmpl_context()
         return c
 
     @LoginRequired()
     @HasRepoPermissionAnyDecorator('repository.admin')
     def repo_caches(self):
         c = self.load_default_context()
         c.active = 'caches'
         cached_diffs_dir = c.rhodecode_db_repo.cached_diffs_dir
         c.cached_diff_count = len(c.rhodecode_db_repo.cached_diffs())
         c.cached_diff_size = 0
         if os.path.isdir(cached_diffs_dir):
             c.cached_diff_size = system_info.get_storage_size(cached_diffs_dir)
         c.shadow_repos = c.rhodecode_db_repo.shadow_repos()
 
         cache_namespace_uid = 'cache_repo.{}'.format(self.db_repo.repo_id)
         c.region = rc_cache.get_or_create_region('cache_repo', cache_namespace_uid)
         c.backend = c.region.backend
         c.repo_keys = sorted(c.region.backend.list_keys(prefix=cache_namespace_uid))
 
         return self._get_template_context(c)
 
     @LoginRequired()
     @HasRepoPermissionAnyDecorator('repository.admin')
     @CSRFRequired()
     def repo_caches_purge(self):
         _ = self.request.translate
         c = self.load_default_context()
         c.active = 'caches'
+        invalidated = 0
 
         try:
             ScmModel().mark_for_invalidation(self.db_repo_name, delete=True)
-
             Session().commit()
-
-            h.flash(_('Cache invalidation successful'),
-                    category='success')
+            invalidated += 1
         except Exception:
             log.exception("Exception during cache invalidation")
             h.flash(_('An error occurred during cache invalidation'),
                     category='error')
 
+        try:
+            invalidated += 1
+            self.rhodecode_vcs_repo.vcsserver_invalidate_cache(delete=True)
+        except Exception:
+            log.exception("Exception during vcsserver cache invalidation")
+            h.flash(_('An error occurred during vcsserver cache invalidation'),
+                    category='error')
+
+        if invalidated:
+            h.flash(_('Cache invalidation successful. Stages {}/2').format(invalidated),
+                    category='success')
+
         raise HTTPFound(h.route_path(
-            'edit_repo_caches', repo_name=self.db_repo_name))
\ No newline at end of file
+            'edit_repo_caches', repo_name=self.db_repo_name))
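The `repo_caches_purge` hunk above now invalidates caches in two independent stages (local, then vcsserver): each stage runs in its own `try`/`except`, failures are logged and flashed without aborting the other stage, and a counter drives the final "Stages {}/2" message. A standalone sketch of that pattern (the `purge_caches` helper is hypothetical, not part of RhodeCode):

```python
# Illustrative multi-stage purge: attempt every stage, survive failures,
# and report how many stages completed.
import logging

log = logging.getLogger(__name__)


def purge_caches(stages):
    """Run each (name, callable) stage; return (succeeded, failed_names)."""
    succeeded, failed = 0, []
    for name, stage in stages:
        try:
            stage()
            succeeded += 1
        except Exception:
            # one failing stage must not prevent the others from running
            log.exception("Exception during %s cache invalidation", name)
            failed.append(name)
    return succeeded, failed


ran, failed = purge_caches([
    ('local', lambda: None),       # stage 1: succeeds
    ('vcsserver', lambda: 1 / 0),  # stage 2: simulated remote failure
])
```

With a failing second stage this reports one completed stage, mirroring a partial "Stages 1/2" outcome in the UI. (Note that the actual hunk increments its counter *before* the vcsserver call, so a remote failure there still counts toward the total; the sketch counts only completed stages.)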
@@ -1,358 +1,355 b''
 # -*- coding: utf-8 -*-
 
 # Copyright (C) 2010-2020 RhodeCode GmbH
 #
 # This program is free software: you can redistribute it and/or modify
 # it under the terms of the GNU Affero General Public License, version 3
 # (only), as published by the Free Software Foundation.
 #
 # This program is distributed in the hope that it will be useful,
 # but WITHOUT ANY WARRANTY; without even the implied warranty of
 # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
 # GNU General Public License for more details.
 #
 # You should have received a copy of the GNU Affero General Public License
 # along with this program. If not, see <http://www.gnu.org/licenses/>.
 #
 # This program is dual-licensed. If you wish to learn more about the
 # RhodeCode Enterprise Edition, including its added features, Support services,
 # and proprietary license terms, please see https://rhodecode.com/licenses/
 
 
 import logging
 
 from pyramid.httpexceptions import HTTPNotFound, HTTPFound
 
 from pyramid.renderers import render
 from pyramid.response import Response
 
 from rhodecode.apps._base import RepoAppView
 import rhodecode.lib.helpers as h
 from rhodecode.lib.auth import (
     LoginRequired, HasRepoPermissionAnyDecorator)
 
 from rhodecode.lib.ext_json import json
 from rhodecode.lib.graphmod import _colored, _dagwalker
 from rhodecode.lib.helpers import RepoPage
 from rhodecode.lib.utils2 import safe_int, safe_str, str2bool, safe_unicode
 from rhodecode.lib.vcs.exceptions import (
     RepositoryError, CommitDoesNotExistError,
     CommitError, NodeDoesNotExistError, EmptyRepositoryError)
 
 log = logging.getLogger(__name__)
 
 DEFAULT_CHANGELOG_SIZE = 20
 
 
 class RepoChangelogView(RepoAppView):
 
     def _get_commit_or_redirect(self, commit_id, redirect_after=True):
         """
         This is a safe way to get commit. If an error occurs it redirects to
         tip with proper message
 
         :param commit_id: id of commit to fetch
         :param redirect_after: toggle redirection
         """
         _ = self.request.translate
 
         try:
             return self.rhodecode_vcs_repo.get_commit(commit_id)
         except EmptyRepositoryError:
             if not redirect_after:
                 return None
 
             h.flash(h.literal(
                 _('There are no commits yet')), category='warning')
             raise HTTPFound(
                 h.route_path('repo_summary', repo_name=self.db_repo_name))
 
         except (CommitDoesNotExistError, LookupError):
             msg = _('No such commit exists for this repository')
             h.flash(msg, category='error')
             raise HTTPNotFound()
         except RepositoryError as e:
             h.flash(h.escape(safe_str(e)), category='error')
             raise HTTPNotFound()
 
     def _graph(self, repo, commits, prev_data=None, next_data=None):
         """
         Generates a DAG graph for repo
 
         :param repo: repo instance
         :param commits: list of commits
         """
         if not commits:
             return json.dumps([]), json.dumps([])
 
         def serialize(commit, parents=True):
             data = dict(
                 raw_id=commit.raw_id,
                 idx=commit.idx,
                 branch=None,
             )
             if parents:
                 data['parents'] = [
                     serialize(x, parents=False) for x in commit.parents]
             return data
 
         prev_data = prev_data or []
         next_data = next_data or []
 
         current = [serialize(x) for x in commits]
         commits = prev_data + current + next_data
 
         dag = _dagwalker(repo, commits)
 
         data = [[commit_id, vtx, edges, branch]
                 for commit_id, vtx, edges, branch in _colored(dag)]
         return json.dumps(data), json.dumps(current)
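The `serialize` closure in `_graph` above flattens each commit into a small JSON-safe dict, serializing parents only one level deep (the `parents=False` recursion stops grandparents from being included). A hypothetical stand-in operating on plain dicts instead of real commit objects shows the same shape:

```python
# Illustrative only: real commits expose raw_id/idx/parents as
# attributes; dicts are used here so the sketch is self-contained.
import json


def serialize(commit, parents=True):
    data = dict(raw_id=commit['raw_id'], idx=commit['idx'], branch=None)
    if parents:
        # one level deep: parents of parents are deliberately omitted
        data['parents'] = [
            serialize(p, parents=False) for p in commit.get('parents', [])]
    return data


root = {'raw_id': 'a' * 12, 'idx': 0, 'parents': []}
child = {'raw_id': 'b' * 12, 'idx': 1, 'parents': [root]}

payload = serialize(child)
graph_json = json.dumps([payload])  # what the template ultimately receives
```

The top-level entry carries a `parents` list; each parent entry does not, which keeps the graph payload linear in the number of commits on the page.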
 
     def _check_if_valid_branch(self, branch_name, repo_name, f_path):
         if branch_name not in self.rhodecode_vcs_repo.branches_all:
             h.flash(u'Branch {} is not found.'.format(h.escape(safe_unicode(branch_name))),
                     category='warning')
             redirect_url = h.route_path(
                 'repo_commits_file', repo_name=repo_name,
                 commit_id=branch_name, f_path=f_path or '')
             raise HTTPFound(redirect_url)
 
     def _load_changelog_data(
             self, c, collection, page, chunk_size, branch_name=None,
             dynamic=False, f_path=None, commit_id=None):
 
         def url_generator(page_num):
             query_params = {
                 'page': page_num
             }
 
             if branch_name:
                 query_params.update({
                     'branch': branch_name
                 })
 
             if f_path:
                 # changelog for file
                 return h.route_path(
                     'repo_commits_file',
                     repo_name=c.rhodecode_db_repo.repo_name,
                     commit_id=commit_id, f_path=f_path,
                     _query=query_params)
             else:
                 return h.route_path(
                     'repo_commits',
                     repo_name=c.rhodecode_db_repo.repo_name, _query=query_params)
 
         c.total_cs = len(collection)
         c.showing_commits = min(chunk_size, c.total_cs)
         c.pagination = RepoPage(collection, page=page, item_count=c.total_cs,
                                 items_per_page=chunk_size, url_maker=url_generator)
 
         c.next_page = c.pagination.next_page
         c.prev_page = c.pagination.previous_page
 
         if dynamic:
             if self.request.GET.get('chunk') != 'next':
                 c.next_page = None
             if self.request.GET.get('chunk') != 'prev':
                 c.prev_page = None
 
         page_commit_ids = [x.raw_id for x in c.pagination]
         c.comments = c.rhodecode_db_repo.get_comments(page_commit_ids)
         c.statuses = c.rhodecode_db_repo.statuses(page_commit_ids)
 
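`_load_changelog_data` hands `RepoPage` a `url_generator` callback: the paginator calls it with a page number and gets back a link that preserves the active filters (branch, file path). A self-contained sketch of that closure pattern, using `urllib.parse` in place of `h.route_path` (the `make_url_generator` helper is hypothetical):

```python
# Illustrative pagination URL builder: the closure captures the active
# filters so every generated page link keeps them.
from urllib.parse import urlencode


def make_url_generator(base_path, branch_name=None):
    def url_generator(page_num):
        query_params = {'page': page_num}
        if branch_name:
            query_params['branch'] = branch_name
        return '{}?{}'.format(base_path, urlencode(query_params))
    return url_generator


gen = make_url_generator('/repo/commits', branch_name='stable')
page_two = gen(2)           # branch filter survives page changes
plain = make_url_generator('/repo/commits')(1)
```

Each paginator instance gets its own generator, so a file-history page and a branch-filtered changelog can coexist without sharing query state.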
     def load_default_context(self):
         c = self._get_local_tmpl_context(include_app_defaults=True)
 
         c.rhodecode_repo = self.rhodecode_vcs_repo
 
         return c
 
-    def _get_preload_attrs(self):
-        pre_load = ['author', 'branch', 'date', 'message', 'parents',
-                    'obsolete', 'phase', 'hidden']
-        return pre_load
-
     @LoginRequired()
     @HasRepoPermissionAnyDecorator(
         'repository.read', 'repository.write', 'repository.admin')
     def repo_changelog(self):
         c = self.load_default_context()
 
         commit_id = self.request.matchdict.get('commit_id')
         f_path = self._get_f_path(self.request.matchdict)
         show_hidden = str2bool(self.request.GET.get('evolve'))
 
         chunk_size = 20
 
         c.branch_name = branch_name = self.request.GET.get('branch') or ''
         c.book_name = book_name = self.request.GET.get('bookmark') or ''
         c.f_path = f_path
         c.commit_id = commit_id
         c.show_hidden = show_hidden
 
         hist_limit = safe_int(self.request.GET.get('limit')) or None
 
         p = safe_int(self.request.GET.get('page', 1), 1)
 
         c.selected_name = branch_name or book_name
         if not commit_id and branch_name:
             self._check_if_valid_branch(branch_name, self.db_repo_name, f_path)
 
         c.changelog_for_path = f_path
-        pre_load = self._get_preload_attrs()
+        pre_load = self.get_commit_preload_attrs()
 
         partial_xhr = self.request.environ.get('HTTP_X_PARTIAL_XHR')
 
         try:
             if f_path:
                 log.debug('generating changelog for path %s', f_path)
                 # get the history for the file !
                 base_commit = self.rhodecode_vcs_repo.get_commit(commit_id)
 
                 try:
                     collection = base_commit.get_path_history(
                         f_path, limit=hist_limit, pre_load=pre_load)
                     if collection and partial_xhr:
                         # for ajax call we remove first one since we're looking
                         # at it right now in the context of a file commit
                         collection.pop(0)
                 except (NodeDoesNotExistError, CommitError):
                     # this node is not present at tip!
                     try:
                         commit = self._get_commit_or_redirect(commit_id)
                         collection = commit.get_path_history(f_path)
                     except RepositoryError as e:
                         h.flash(safe_str(e), category='warning')
                         redirect_url = h.route_path(
                             'repo_commits', repo_name=self.db_repo_name)
                         raise HTTPFound(redirect_url)
                 collection = list(reversed(collection))
             else:
                 collection = self.rhodecode_vcs_repo.get_commits(
                     branch_name=branch_name, show_hidden=show_hidden,
                     pre_load=pre_load, translate_tags=False)
 
             self._load_changelog_data(
                 c, collection, p, chunk_size, c.branch_name,
                 f_path=f_path, commit_id=commit_id)
 
         except EmptyRepositoryError as e:
             h.flash(h.escape(safe_str(e)), category='warning')
             raise HTTPFound(
                 h.route_path('repo_summary', repo_name=self.db_repo_name))
         except HTTPFound:
             raise
         except (RepositoryError, CommitDoesNotExistError, Exception) as e:
             log.exception(safe_str(e))
             h.flash(h.escape(safe_str(e)), category='error')
 
             if commit_id:
                 # from single commit page, we redirect to main commits
                 raise HTTPFound(
                     h.route_path('repo_commits', repo_name=self.db_repo_name))
             else:
                 # otherwise we redirect to summary
                 raise HTTPFound(
                     h.route_path('repo_summary', repo_name=self.db_repo_name))
 
+
+
         if partial_xhr or self.request.environ.get('HTTP_X_PJAX'):
             # case when loading dynamic file history in file view
             # loading from ajax, we don't want the first result, it's popped
             # in the code above
             html = render(
                 'rhodecode:templates/commits/changelog_file_history.mako',
                 self._get_template_context(c), self.request)
             return Response(html)
 
         commit_ids = []
         if not f_path:
             # only load graph data when not in file history mode
             commit_ids = c.pagination
 
         c.graph_data, c.graph_commits = self._graph(
             self.rhodecode_vcs_repo, commit_ids)
 
         return self._get_template_context(c)
 
     @LoginRequired()
     @HasRepoPermissionAnyDecorator(
         'repository.read', 'repository.write', 'repository.admin')
     def repo_commits_elements(self):
278 def repo_commits_elements(self):
282 c = self.load_default_context()
279 c = self.load_default_context()
283 commit_id = self.request.matchdict.get('commit_id')
280 commit_id = self.request.matchdict.get('commit_id')
284 f_path = self._get_f_path(self.request.matchdict)
281 f_path = self._get_f_path(self.request.matchdict)
285 show_hidden = str2bool(self.request.GET.get('evolve'))
282 show_hidden = str2bool(self.request.GET.get('evolve'))
286
283
287 chunk_size = 20
284 chunk_size = 20
288 hist_limit = safe_int(self.request.GET.get('limit')) or None
285 hist_limit = safe_int(self.request.GET.get('limit')) or None
289
286
290 def wrap_for_error(err):
287 def wrap_for_error(err):
291 html = '<tr>' \
288 html = '<tr>' \
292 '<td colspan="9" class="alert alert-error">ERROR: {}</td>' \
289 '<td colspan="9" class="alert alert-error">ERROR: {}</td>' \
293 '</tr>'.format(err)
290 '</tr>'.format(err)
294 return Response(html)
291 return Response(html)
295
292
296 c.branch_name = branch_name = self.request.GET.get('branch') or ''
293 c.branch_name = branch_name = self.request.GET.get('branch') or ''
297 c.book_name = book_name = self.request.GET.get('bookmark') or ''
294 c.book_name = book_name = self.request.GET.get('bookmark') or ''
298 c.f_path = f_path
295 c.f_path = f_path
299 c.commit_id = commit_id
296 c.commit_id = commit_id
300 c.show_hidden = show_hidden
297 c.show_hidden = show_hidden
301
298
302 c.selected_name = branch_name or book_name
299 c.selected_name = branch_name or book_name
303 if branch_name and branch_name not in self.rhodecode_vcs_repo.branches_all:
300 if branch_name and branch_name not in self.rhodecode_vcs_repo.branches_all:
304 return wrap_for_error(
301 return wrap_for_error(
305 safe_str('Branch: {} is not valid'.format(branch_name)))
302 safe_str('Branch: {} is not valid'.format(branch_name)))
306
303
307 pre_load = self._get_preload_attrs()
304 pre_load = self.get_commit_preload_attrs()
308
305
309 if f_path:
306 if f_path:
310 try:
307 try:
311 base_commit = self.rhodecode_vcs_repo.get_commit(commit_id)
308 base_commit = self.rhodecode_vcs_repo.get_commit(commit_id)
312 except (RepositoryError, CommitDoesNotExistError, Exception) as e:
309 except (RepositoryError, CommitDoesNotExistError, Exception) as e:
313 log.exception(safe_str(e))
310 log.exception(safe_str(e))
314 raise HTTPFound(
311 raise HTTPFound(
315 h.route_path('repo_commits', repo_name=self.db_repo_name))
312 h.route_path('repo_commits', repo_name=self.db_repo_name))
316
313
317 collection = base_commit.get_path_history(
314 collection = base_commit.get_path_history(
318 f_path, limit=hist_limit, pre_load=pre_load)
315 f_path, limit=hist_limit, pre_load=pre_load)
319 collection = list(reversed(collection))
316 collection = list(reversed(collection))
320 else:
317 else:
321 collection = self.rhodecode_vcs_repo.get_commits(
318 collection = self.rhodecode_vcs_repo.get_commits(
322 branch_name=branch_name, show_hidden=show_hidden, pre_load=pre_load,
319 branch_name=branch_name, show_hidden=show_hidden, pre_load=pre_load,
323 translate_tags=False)
320 translate_tags=False)
324
321
325 p = safe_int(self.request.GET.get('page', 1), 1)
322 p = safe_int(self.request.GET.get('page', 1), 1)
326 try:
323 try:
327 self._load_changelog_data(
324 self._load_changelog_data(
328 c, collection, p, chunk_size, dynamic=True,
325 c, collection, p, chunk_size, dynamic=True,
329 f_path=f_path, commit_id=commit_id)
326 f_path=f_path, commit_id=commit_id)
330 except EmptyRepositoryError as e:
327 except EmptyRepositoryError as e:
331 return wrap_for_error(safe_str(e))
328 return wrap_for_error(safe_str(e))
332 except (RepositoryError, CommitDoesNotExistError, Exception) as e:
329 except (RepositoryError, CommitDoesNotExistError, Exception) as e:
333 log.exception('Failed to fetch commits')
330 log.exception('Failed to fetch commits')
334 return wrap_for_error(safe_str(e))
331 return wrap_for_error(safe_str(e))
335
332
336 prev_data = None
333 prev_data = None
337 next_data = None
334 next_data = None
338
335
339 try:
336 try:
340 prev_graph = json.loads(self.request.POST.get('graph') or '{}')
337 prev_graph = json.loads(self.request.POST.get('graph') or '{}')
341 except json.JSONDecodeError:
338 except json.JSONDecodeError:
342 prev_graph = {}
339 prev_graph = {}
343
340
344 if self.request.GET.get('chunk') == 'prev':
341 if self.request.GET.get('chunk') == 'prev':
345 next_data = prev_graph
342 next_data = prev_graph
346 elif self.request.GET.get('chunk') == 'next':
343 elif self.request.GET.get('chunk') == 'next':
347 prev_data = prev_graph
344 prev_data = prev_graph
348
345
349 commit_ids = []
346 commit_ids = []
350 if not f_path:
347 if not f_path:
351 # only load graph data when not in file history mode
348 # only load graph data when not in file history mode
352 commit_ids = c.pagination
349 commit_ids = c.pagination
353
350
354 c.graph_data, c.graph_commits = self._graph(
351 c.graph_data, c.graph_commits = self._graph(
355 self.rhodecode_vcs_repo, commit_ids,
352 self.rhodecode_vcs_repo, commit_ids,
356 prev_data=prev_data, next_data=next_data)
353 prev_data=prev_data, next_data=next_data)
357
354
358 return self._get_template_context(c)
355 return self._get_template_context(c)
@@ -1,1581 +1,1581 b''
1 # -*- coding: utf-8 -*-
1 # -*- coding: utf-8 -*-
2
2
3 # Copyright (C) 2011-2020 RhodeCode GmbH
3 # Copyright (C) 2011-2020 RhodeCode GmbH
4 #
4 #
5 # This program is free software: you can redistribute it and/or modify
5 # This program is free software: you can redistribute it and/or modify
6 # it under the terms of the GNU Affero General Public License, version 3
6 # it under the terms of the GNU Affero General Public License, version 3
7 # (only), as published by the Free Software Foundation.
7 # (only), as published by the Free Software Foundation.
8 #
8 #
9 # This program is distributed in the hope that it will be useful,
9 # This program is distributed in the hope that it will be useful,
10 # but WITHOUT ANY WARRANTY; without even the implied warranty of
10 # but WITHOUT ANY WARRANTY; without even the implied warranty of
11 # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
11 # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
12 # GNU General Public License for more details.
12 # GNU General Public License for more details.
13 #
13 #
14 # You should have received a copy of the GNU Affero General Public License
14 # You should have received a copy of the GNU Affero General Public License
15 # along with this program. If not, see <http://www.gnu.org/licenses/>.
15 # along with this program. If not, see <http://www.gnu.org/licenses/>.
16 #
16 #
17 # This program is dual-licensed. If you wish to learn more about the
17 # This program is dual-licensed. If you wish to learn more about the
18 # RhodeCode Enterprise Edition, including its added features, Support services,
18 # RhodeCode Enterprise Edition, including its added features, Support services,
19 # and proprietary license terms, please see https://rhodecode.com/licenses/
19 # and proprietary license terms, please see https://rhodecode.com/licenses/
20
20
21 import itertools
21 import itertools
22 import logging
22 import logging
23 import os
23 import os
24 import shutil
24 import shutil
25 import tempfile
25 import tempfile
26 import collections
26 import collections
27 import urllib
27 import urllib
28 import pathlib2
28 import pathlib2
29
29
30 from pyramid.httpexceptions import HTTPNotFound, HTTPBadRequest, HTTPFound
30 from pyramid.httpexceptions import HTTPNotFound, HTTPBadRequest, HTTPFound
31
31
32 from pyramid.renderers import render
32 from pyramid.renderers import render
33 from pyramid.response import Response
33 from pyramid.response import Response
34
34
35 import rhodecode
35 import rhodecode
36 from rhodecode.apps._base import RepoAppView
36 from rhodecode.apps._base import RepoAppView
37
37
38
38
39 from rhodecode.lib import diffs, helpers as h, rc_cache
39 from rhodecode.lib import diffs, helpers as h, rc_cache
40 from rhodecode.lib import audit_logger
40 from rhodecode.lib import audit_logger
41 from rhodecode.lib.view_utils import parse_path_ref
41 from rhodecode.lib.view_utils import parse_path_ref
42 from rhodecode.lib.exceptions import NonRelativePathError
42 from rhodecode.lib.exceptions import NonRelativePathError
43 from rhodecode.lib.codeblocks import (
43 from rhodecode.lib.codeblocks import (
44 filenode_as_lines_tokens, filenode_as_annotated_lines_tokens)
44 filenode_as_lines_tokens, filenode_as_annotated_lines_tokens)
45 from rhodecode.lib.utils2 import (
45 from rhodecode.lib.utils2 import (
46 convert_line_endings, detect_mode, safe_str, str2bool, safe_int, sha1, safe_unicode)
46 convert_line_endings, detect_mode, safe_str, str2bool, safe_int, sha1, safe_unicode)
47 from rhodecode.lib.auth import (
47 from rhodecode.lib.auth import (
48 LoginRequired, HasRepoPermissionAnyDecorator, CSRFRequired)
48 LoginRequired, HasRepoPermissionAnyDecorator, CSRFRequired)
49 from rhodecode.lib.vcs import path as vcspath
49 from rhodecode.lib.vcs import path as vcspath
50 from rhodecode.lib.vcs.backends.base import EmptyCommit
50 from rhodecode.lib.vcs.backends.base import EmptyCommit
51 from rhodecode.lib.vcs.conf import settings
51 from rhodecode.lib.vcs.conf import settings
52 from rhodecode.lib.vcs.nodes import FileNode
52 from rhodecode.lib.vcs.nodes import FileNode
53 from rhodecode.lib.vcs.exceptions import (
53 from rhodecode.lib.vcs.exceptions import (
54 RepositoryError, CommitDoesNotExistError, EmptyRepositoryError,
54 RepositoryError, CommitDoesNotExistError, EmptyRepositoryError,
55 ImproperArchiveTypeError, VCSError, NodeAlreadyExistsError,
55 ImproperArchiveTypeError, VCSError, NodeAlreadyExistsError,
56 NodeDoesNotExistError, CommitError, NodeError)
56 NodeDoesNotExistError, CommitError, NodeError)
57
57
58 from rhodecode.model.scm import ScmModel
58 from rhodecode.model.scm import ScmModel
59 from rhodecode.model.db import Repository
59 from rhodecode.model.db import Repository
60
60
61 log = logging.getLogger(__name__)
61 log = logging.getLogger(__name__)
62
62
63
63
64 class RepoFilesView(RepoAppView):
64 class RepoFilesView(RepoAppView):
65
65
66 @staticmethod
66 @staticmethod
67 def adjust_file_path_for_svn(f_path, repo):
67 def adjust_file_path_for_svn(f_path, repo):
68 """
68 """
69 Computes the relative path of `f_path`.
69 Computes the relative path of `f_path`.
70
70
71 This is mainly based on prefix matching of the recognized tags and
71 This is mainly based on prefix matching of the recognized tags and
72 branches in the underlying repository.
72 branches in the underlying repository.
73 """
73 """
74 tags_and_branches = itertools.chain(
74 tags_and_branches = itertools.chain(
75 repo.branches.iterkeys(),
75 repo.branches.iterkeys(),
76 repo.tags.iterkeys())
76 repo.tags.iterkeys())
77 tags_and_branches = sorted(tags_and_branches, key=len, reverse=True)
77 tags_and_branches = sorted(tags_and_branches, key=len, reverse=True)
78
78
79 for name in tags_and_branches:
79 for name in tags_and_branches:
80 if f_path.startswith('{}/'.format(name)):
80 if f_path.startswith('{}/'.format(name)):
81 f_path = vcspath.relpath(f_path, name)
81 f_path = vcspath.relpath(f_path, name)
82 break
82 break
83 return f_path
83 return f_path
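The prefix matching in `adjust_file_path_for_svn` can be sketched standalone. This is a hedged approximation: it assumes `vcspath.relpath` behaves like `posixpath.relpath`, and the branch/tag names below are made up for illustration.

```python
import posixpath

def adjust_file_path_for_svn(f_path, tags_and_branches):
    # Try the longest names first so 'branches/stable' wins over 'branches'
    for name in sorted(tags_and_branches, key=len, reverse=True):
        if f_path.startswith(name + '/'):
            # strip the recognized tag/branch prefix from the path
            return posixpath.relpath(f_path, name)
    return f_path

print(adjust_file_path_for_svn('branches/stable/setup.py',
                               ['trunk', 'branches/stable']))  # -> setup.py
```

Sorting by length descending matters: without it, a shorter name that happens to be a prefix of a longer one could match first and leave part of the branch name in the returned path.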
84
84
85 def load_default_context(self):
85 def load_default_context(self):
86 c = self._get_local_tmpl_context(include_app_defaults=True)
86 c = self._get_local_tmpl_context(include_app_defaults=True)
87 c.rhodecode_repo = self.rhodecode_vcs_repo
87 c.rhodecode_repo = self.rhodecode_vcs_repo
88 c.enable_downloads = self.db_repo.enable_downloads
88 c.enable_downloads = self.db_repo.enable_downloads
89 return c
89 return c
90
90
91 def _ensure_not_locked(self, commit_id='tip'):
91 def _ensure_not_locked(self, commit_id='tip'):
92 _ = self.request.translate
92 _ = self.request.translate
93
93
94 repo = self.db_repo
94 repo = self.db_repo
95 if repo.enable_locking and repo.locked[0]:
95 if repo.enable_locking and repo.locked[0]:
96 h.flash(_('This repository has been locked by %s on %s')
96 h.flash(_('This repository has been locked by %s on %s')
97 % (h.person_by_id(repo.locked[0]),
97 % (h.person_by_id(repo.locked[0]),
98 h.format_date(h.time_to_datetime(repo.locked[1]))),
98 h.format_date(h.time_to_datetime(repo.locked[1]))),
99 'warning')
99 'warning')
100 files_url = h.route_path(
100 files_url = h.route_path(
101 'repo_files:default_path',
101 'repo_files:default_path',
102 repo_name=self.db_repo_name, commit_id=commit_id)
102 repo_name=self.db_repo_name, commit_id=commit_id)
103 raise HTTPFound(files_url)
103 raise HTTPFound(files_url)
104
104
105 def forbid_non_head(self, is_head, f_path, commit_id='tip', json_mode=False):
105 def forbid_non_head(self, is_head, f_path, commit_id='tip', json_mode=False):
106 _ = self.request.translate
106 _ = self.request.translate
107
107
108 if not is_head:
108 if not is_head:
109 message = _('Cannot modify file. '
109 message = _('Cannot modify file. '
110 'Given commit `{}` is not head of a branch.').format(commit_id)
110 'Given commit `{}` is not head of a branch.').format(commit_id)
111 h.flash(message, category='warning')
111 h.flash(message, category='warning')
112
112
113 if json_mode:
113 if json_mode:
114 return message
114 return message
115
115
116 files_url = h.route_path(
116 files_url = h.route_path(
117 'repo_files', repo_name=self.db_repo_name, commit_id=commit_id,
117 'repo_files', repo_name=self.db_repo_name, commit_id=commit_id,
118 f_path=f_path)
118 f_path=f_path)
119 raise HTTPFound(files_url)
119 raise HTTPFound(files_url)
120
120
121 def check_branch_permission(self, branch_name, commit_id='tip', json_mode=False):
121 def check_branch_permission(self, branch_name, commit_id='tip', json_mode=False):
122 _ = self.request.translate
122 _ = self.request.translate
123
123
124 rule, branch_perm = self._rhodecode_user.get_rule_and_branch_permission(
124 rule, branch_perm = self._rhodecode_user.get_rule_and_branch_permission(
125 self.db_repo_name, branch_name)
125 self.db_repo_name, branch_name)
126 if branch_perm and branch_perm not in ['branch.push', 'branch.push_force']:
126 if branch_perm and branch_perm not in ['branch.push', 'branch.push_force']:
127 message = _('Branch `{}` changes forbidden by rule {}.').format(
127 message = _('Branch `{}` changes forbidden by rule {}.').format(
128 h.escape(branch_name), h.escape(rule))
128 h.escape(branch_name), h.escape(rule))
129 h.flash(message, 'warning')
129 h.flash(message, 'warning')
130
130
131 if json_mode:
131 if json_mode:
132 return message
132 return message
133
133
134 files_url = h.route_path(
134 files_url = h.route_path(
135 'repo_files:default_path', repo_name=self.db_repo_name, commit_id=commit_id)
135 'repo_files:default_path', repo_name=self.db_repo_name, commit_id=commit_id)
136
136
137 raise HTTPFound(files_url)
137 raise HTTPFound(files_url)
138
138
139 def _get_commit_and_path(self):
139 def _get_commit_and_path(self):
140 default_commit_id = self.db_repo.landing_ref_name
140 default_commit_id = self.db_repo.landing_ref_name
141 default_f_path = '/'
141 default_f_path = '/'
142
142
143 commit_id = self.request.matchdict.get(
143 commit_id = self.request.matchdict.get(
144 'commit_id', default_commit_id)
144 'commit_id', default_commit_id)
145 f_path = self._get_f_path(self.request.matchdict, default_f_path)
145 f_path = self._get_f_path(self.request.matchdict, default_f_path)
146 return commit_id, f_path
146 return commit_id, f_path
147
147
148 def _get_default_encoding(self, c):
148 def _get_default_encoding(self, c):
149 enc_list = getattr(c, 'default_encodings', [])
149 enc_list = getattr(c, 'default_encodings', [])
150 return enc_list[0] if enc_list else 'UTF-8'
150 return enc_list[0] if enc_list else 'UTF-8'
151
151
152 def _get_commit_or_redirect(self, commit_id, redirect_after=True):
152 def _get_commit_or_redirect(self, commit_id, redirect_after=True):
153 """
153 """
154 This is a safe way to get a commit. If an error occurs, it redirects to
154 This is a safe way to get a commit. If an error occurs, it redirects to
155 tip with a proper message.
155 tip with a proper message.
156
156
157 :param commit_id: id of commit to fetch
157 :param commit_id: id of commit to fetch
158 :param redirect_after: toggle redirection
158 :param redirect_after: toggle redirection
159 """
159 """
160 _ = self.request.translate
160 _ = self.request.translate
161
161
162 try:
162 try:
163 return self.rhodecode_vcs_repo.get_commit(commit_id)
163 return self.rhodecode_vcs_repo.get_commit(commit_id)
164 except EmptyRepositoryError:
164 except EmptyRepositoryError:
165 if not redirect_after:
165 if not redirect_after:
166 return None
166 return None
167
167
168 _url = h.route_path(
168 _url = h.route_path(
169 'repo_files_add_file',
169 'repo_files_add_file',
170 repo_name=self.db_repo_name, commit_id=0, f_path='')
170 repo_name=self.db_repo_name, commit_id=0, f_path='')
171
171
172 if h.HasRepoPermissionAny(
172 if h.HasRepoPermissionAny(
173 'repository.write', 'repository.admin')(self.db_repo_name):
173 'repository.write', 'repository.admin')(self.db_repo_name):
174 add_new = h.link_to(
174 add_new = h.link_to(
175 _('Click here to add a new file.'), _url, class_="alert-link")
175 _('Click here to add a new file.'), _url, class_="alert-link")
176 else:
176 else:
177 add_new = ""
177 add_new = ""
178
178
179 h.flash(h.literal(
179 h.flash(h.literal(
180 _('There are no files yet. %s') % add_new), category='warning')
180 _('There are no files yet. %s') % add_new), category='warning')
181 raise HTTPFound(
181 raise HTTPFound(
182 h.route_path('repo_summary', repo_name=self.db_repo_name))
182 h.route_path('repo_summary', repo_name=self.db_repo_name))
183
183
184 except (CommitDoesNotExistError, LookupError) as e:
184 except (CommitDoesNotExistError, LookupError) as e:
185 msg = _('No such commit exists for this repository. Commit: {}').format(commit_id)
185 msg = _('No such commit exists for this repository. Commit: {}').format(commit_id)
186 h.flash(msg, category='error')
186 h.flash(msg, category='error')
187 raise HTTPNotFound()
187 raise HTTPNotFound()
188 except RepositoryError as e:
188 except RepositoryError as e:
189 h.flash(h.escape(safe_str(e)), category='error')
189 h.flash(h.escape(safe_str(e)), category='error')
190 raise HTTPNotFound()
190 raise HTTPNotFound()
191
191
192 def _get_filenode_or_redirect(self, commit_obj, path):
192 def _get_filenode_or_redirect(self, commit_obj, path):
193 """
193 """
194 Returns file_node. If an error occurs or the given path is a directory,
194 Returns file_node. If an error occurs or the given path is a directory,
195 it redirects to the top-level path.
195 it redirects to the top-level path.
196 """
196 """
197 _ = self.request.translate
197 _ = self.request.translate
198
198
199 try:
199 try:
200 file_node = commit_obj.get_node(path)
200 file_node = commit_obj.get_node(path)
201 if file_node.is_dir():
201 if file_node.is_dir():
202 raise RepositoryError('The given path is a directory')
202 raise RepositoryError('The given path is a directory')
203 except CommitDoesNotExistError:
203 except CommitDoesNotExistError:
204 log.exception('No such commit exists for this repository')
204 log.exception('No such commit exists for this repository')
205 h.flash(_('No such commit exists for this repository'), category='error')
205 h.flash(_('No such commit exists for this repository'), category='error')
206 raise HTTPNotFound()
206 raise HTTPNotFound()
207 except RepositoryError as e:
207 except RepositoryError as e:
208 log.warning('Repository error while fetching filenode `%s`. Err:%s', path, e)
208 log.warning('Repository error while fetching filenode `%s`. Err:%s', path, e)
209 h.flash(h.escape(safe_str(e)), category='error')
209 h.flash(h.escape(safe_str(e)), category='error')
210 raise HTTPNotFound()
210 raise HTTPNotFound()
211
211
212 return file_node
212 return file_node
213
213
214 def _is_valid_head(self, commit_id, repo, landing_ref):
214 def _is_valid_head(self, commit_id, repo, landing_ref):
215 branch_name = sha_commit_id = ''
215 branch_name = sha_commit_id = ''
216 is_head = False
216 is_head = False
217 log.debug('Checking if commit_id `%s` is a head for %s.', commit_id, repo)
217 log.debug('Checking if commit_id `%s` is a head for %s.', commit_id, repo)
218
218
219 for _branch_name, branch_commit_id in repo.branches.items():
219 for _branch_name, branch_commit_id in repo.branches.items():
220 # simple case: we passed in a branch name, it's a HEAD
220 # simple case: we passed in a branch name, it's a HEAD
221 if commit_id == _branch_name:
221 if commit_id == _branch_name:
222 is_head = True
222 is_head = True
223 branch_name = _branch_name
223 branch_name = _branch_name
224 sha_commit_id = branch_commit_id
224 sha_commit_id = branch_commit_id
225 break
225 break
226 # case when we pass in full sha commit_id, which is a head
226 # case when we pass in full sha commit_id, which is a head
227 elif commit_id == branch_commit_id:
227 elif commit_id == branch_commit_id:
228 is_head = True
228 is_head = True
229 branch_name = _branch_name
229 branch_name = _branch_name
230 sha_commit_id = branch_commit_id
230 sha_commit_id = branch_commit_id
231 break
231 break
232
232
233 if h.is_svn(repo) and not repo.is_empty():
233 if h.is_svn(repo) and not repo.is_empty():
234 # Note: Subversion only has one head.
234 # Note: Subversion only has one head.
235 if commit_id == repo.get_commit(commit_idx=-1).raw_id:
235 if commit_id == repo.get_commit(commit_idx=-1).raw_id:
236 is_head = True
236 is_head = True
237 return branch_name, sha_commit_id, is_head
237 return branch_name, sha_commit_id, is_head
238
238
239 # branches were checked; now we only need to try to get the branch/commit_sha
239 # branches were checked; now we only need to try to get the branch/commit_sha
240 if repo.is_empty():
240 if repo.is_empty():
241 is_head = True
241 is_head = True
242 branch_name = landing_ref
242 branch_name = landing_ref
243 sha_commit_id = EmptyCommit().raw_id
243 sha_commit_id = EmptyCommit().raw_id
244 else:
244 else:
245 commit = repo.get_commit(commit_id=commit_id)
245 commit = repo.get_commit(commit_id=commit_id)
246 if commit:
246 if commit:
247 branch_name = commit.branch
247 branch_name = commit.branch
248 sha_commit_id = commit.raw_id
248 sha_commit_id = commit.raw_id
249
249
250 return branch_name, sha_commit_id, is_head
250 return branch_name, sha_commit_id, is_head
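The branch scan at the top of `_is_valid_head` boils down to checking whether the given id names a branch or equals a branch head. A minimal sketch, simplified to ignore the SVN and empty-repository special cases handled later in the method:

```python
def find_head(commit_id, branches):
    """branches: mapping of branch name -> head commit sha."""
    for name, sha in branches.items():
        # either a branch name was passed in, or the full sha of a branch head
        if commit_id == name or commit_id == sha:
            return name, sha, True
    return '', '', False

branches = {'default': 'abc123', 'stable': 'def456'}
print(find_head('stable', branches))  # -> ('stable', 'def456', True)
```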
251
251
252 def _get_tree_at_commit(self, c, commit_id, f_path, full_load=False, at_rev=None):
252 def _get_tree_at_commit(self, c, commit_id, f_path, full_load=False, at_rev=None):
253
253
254 repo_id = self.db_repo.repo_id
254 repo_id = self.db_repo.repo_id
255 force_recache = self.get_recache_flag()
255 force_recache = self.get_recache_flag()
256
256
257 cache_seconds = safe_int(
257 cache_seconds = safe_int(
258 rhodecode.CONFIG.get('rc_cache.cache_repo.expiration_time'))
258 rhodecode.CONFIG.get('rc_cache.cache_repo.expiration_time'))
259 cache_on = not force_recache and cache_seconds > 0
259 cache_on = not force_recache and cache_seconds > 0
260 log.debug(
260 log.debug(
261 'Computing FILE TREE for repo_id %s commit_id `%s` and path `%s`'
261 'Computing FILE TREE for repo_id %s commit_id `%s` and path `%s`'
262 'with caching: %s[TTL: %ss]' % (
262 'with caching: %s[TTL: %ss]' % (
263 repo_id, commit_id, f_path, cache_on, cache_seconds or 0))
263 repo_id, commit_id, f_path, cache_on, cache_seconds or 0))
264
264
265 cache_namespace_uid = 'cache_repo.{}'.format(repo_id)
265 cache_namespace_uid = 'cache_repo.{}'.format(repo_id)
266 region = rc_cache.get_or_create_region('cache_repo', cache_namespace_uid)
266 region = rc_cache.get_or_create_region('cache_repo', cache_namespace_uid)
267
267
268 @region.conditional_cache_on_arguments(namespace=cache_namespace_uid, condition=cache_on)
268 @region.conditional_cache_on_arguments(namespace=cache_namespace_uid, condition=cache_on)
269 def compute_file_tree(ver, _name_hash, _repo_id, _commit_id, _f_path, _full_load, _at_rev):
269 def compute_file_tree(ver, _name_hash, _repo_id, _commit_id, _f_path, _full_load, _at_rev):
270 log.debug('Generating cached file tree at ver:%s for repo_id: %s, %s, %s',
270 log.debug('Generating cached file tree at ver:%s for repo_id: %s, %s, %s',
271 ver, _repo_id, _commit_id, _f_path)
271 ver, _repo_id, _commit_id, _f_path)
272
272
273 c.full_load = _full_load
273 c.full_load = _full_load
274 return render(
274 return render(
275 'rhodecode:templates/files/files_browser_tree.mako',
275 'rhodecode:templates/files/files_browser_tree.mako',
276 self._get_template_context(c), self.request, _at_rev)
276 self._get_template_context(c), self.request, _at_rev)
277
277
278 return compute_file_tree(
278 return compute_file_tree(
279 rc_cache.FILE_TREE_CACHE_VER, self.db_repo.repo_name_hash,
279 rc_cache.FILE_TREE_CACHE_VER, self.db_repo.repo_name_hash,
280 self.db_repo.repo_id, commit_id, f_path, full_load, at_rev)
280 self.db_repo.repo_id, commit_id, f_path, full_load, at_rev)
281
281
282 def _get_archive_spec(self, fname):
282 def _get_archive_spec(self, fname):
283 log.debug('Detecting archive spec for: `%s`', fname)
283 log.debug('Detecting archive spec for: `%s`', fname)
284
284
285 fileformat = None
285 fileformat = None
286 ext = None
286 ext = None
287 content_type = None
287 content_type = None
288 for a_type, content_type, extension in settings.ARCHIVE_SPECS:
288 for a_type, content_type, extension in settings.ARCHIVE_SPECS:
289
289
290 if fname.endswith(extension):
290 if fname.endswith(extension):
291 fileformat = a_type
291 fileformat = a_type
292 log.debug('archive is of type: %s', fileformat)
292 log.debug('archive is of type: %s', fileformat)
293 ext = extension
293 ext = extension
294 break
294 break
295
295
296 if not fileformat:
296 if not fileformat:
297 raise ValueError()
297 raise ValueError()
298
298
299 # the left-over part of the whole fname is the commit id
299 # the left-over part of the whole fname is the commit id
300 commit_id = fname[:-len(ext)]
300 commit_id = fname[:-len(ext)]
301
301
302 return commit_id, ext, fileformat, content_type
302 return commit_id, ext, fileformat, content_type
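The extension-driven detection in `_get_archive_spec` can be illustrated in isolation. The spec tuples below are an illustrative subset only; the real values live in `rhodecode.lib.vcs.conf.settings.ARCHIVE_SPECS` and may differ.

```python
# Illustrative subset of (fileformat, content_type, extension) specs
ARCHIVE_SPECS = [
    ('tbz2', 'application/x-bzip2', '.tar.bz2'),
    ('tgz', 'application/x-gzip', '.tar.gz'),
    ('zip', 'application/zip', '.zip'),
]

def get_archive_spec(fname):
    for a_type, content_type, extension in ARCHIVE_SPECS:
        if fname.endswith(extension):
            # whatever precedes the extension is the commit id
            return fname[:-len(extension)], extension, a_type, content_type
    raise ValueError('Unknown archive type for: `{}`'.format(fname))

print(get_archive_spec('deadbeef.tar.gz'))
```

The caller turns the `ValueError` into a flash message, which is why the detection itself does not render anything.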
303
303
304 def create_pure_path(self, *parts):
304 def create_pure_path(self, *parts):
305 # Split paths and sanitize them, removing any ../ etc
305 # Split paths and sanitize them, removing any ../ etc
306 sanitized_path = [
306 sanitized_path = [
307 x for x in pathlib2.PurePath(*parts).parts
307 x for x in pathlib2.PurePath(*parts).parts
308 if x not in ['.', '..']]
308 if x not in ['.', '..']]
309
309
310 pure_path = pathlib2.PurePath(*sanitized_path)
310 pure_path = pathlib2.PurePath(*sanitized_path)
311 return pure_path
311 return pure_path
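The sanitization in `create_pure_path` can be shown standalone. The original uses the `pathlib2` backport; this sketch uses the stdlib `PurePosixPath` for a deterministic separator, which is an assumption, not the exact original import.

```python
from pathlib import PurePosixPath  # original code uses the pathlib2 backport

def create_pure_path(*parts):
    # Split the parts and drop any '.' / '..' segments, neutralizing traversal
    sanitized = [p for p in PurePosixPath(*parts).parts if p not in ('.', '..')]
    return PurePosixPath(*sanitized)

print(create_pure_path('archive', '..', '..', 'etc', 'passwd'))  # -> archive/etc/passwd
```

Filtering the already-split `parts` (rather than the raw strings) is what makes this robust: a single argument like `'../../etc'` is decomposed first, so its `..` components are still removed.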
312
312
313 def _is_lf_enabled(self, target_repo):
313 def _is_lf_enabled(self, target_repo):
314 lf_enabled = False
314 lf_enabled = False
315
315
316 lf_key_for_vcs_map = {
316 lf_key_for_vcs_map = {
317 'hg': 'extensions_largefiles',
317 'hg': 'extensions_largefiles',
318 'git': 'vcs_git_lfs_enabled'
318 'git': 'vcs_git_lfs_enabled'
319 }
319 }
320
320
321 lf_key_for_vcs = lf_key_for_vcs_map.get(target_repo.repo_type)
321 lf_key_for_vcs = lf_key_for_vcs_map.get(target_repo.repo_type)
322
322
323 if lf_key_for_vcs:
323 if lf_key_for_vcs:
324 lf_enabled = self._get_repo_setting(target_repo, lf_key_for_vcs)
324 lf_enabled = self._get_repo_setting(target_repo, lf_key_for_vcs)
325
325
326 return lf_enabled
326 return lf_enabled
327
327
328 def _get_archive_name(self, db_repo_name, commit_sha, ext, subrepos=False, path_sha='', with_hash=True):
328 def _get_archive_name(self, db_repo_name, commit_sha, ext, subrepos=False, path_sha='', with_hash=True):
329 # original, backward-compatible archive name
329 # original, backward-compatible archive name
330 clean_name = safe_str(db_repo_name.replace('/', '_'))
330 clean_name = safe_str(db_repo_name.replace('/', '_'))
331
331
332 # e.g. vcsserver.zip
332 # e.g. vcsserver.zip
333 # e.g. vcsserver-abcdefgh.zip
333 # e.g. vcsserver-abcdefgh.zip
334 # e.g. vcsserver-abcdefgh-defghijk.zip
334 # e.g. vcsserver-abcdefgh-defghijk.zip
335 archive_name = '{}{}{}{}{}{}'.format(
335 archive_name = '{}{}{}{}{}{}'.format(
336 clean_name,
336 clean_name,
337 '-sub' if subrepos else '',
337 '-sub' if subrepos else '',
338 commit_sha,
338 commit_sha,
339 '-{}'.format('plain') if not with_hash else '',
339 '-{}'.format('plain') if not with_hash else '',
340 '-{}'.format(path_sha) if path_sha else '',
340 '-{}'.format(path_sha) if path_sha else '',
341 ext)
341 ext)
342 return archive_name
342 return archive_name
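The filename assembly in `_get_archive_name` is easy to trace with a standalone sketch. Note the assumption, matching the caller later in this file, that `commit_sha` already carries its leading dash (e.g. `'-abcdefgh'`):

```python
def get_archive_name(db_repo_name, commit_sha, ext,
                     subrepos=False, path_sha='', with_hash=True):
    # '/' in nested repo-group names would break the filename
    clean_name = db_repo_name.replace('/', '_')
    return '{}{}{}{}{}{}'.format(
        clean_name,
        '-sub' if subrepos else '',
        commit_sha,  # assumed to include the leading '-', as the caller builds it
        '-plain' if not with_hash else '',
        '-{}'.format(path_sha) if path_sha else '',
        ext)

print(get_archive_name('grp/vcsserver', '-abcdefgh', '.zip'))  # -> grp_vcsserver-abcdefgh.zip
```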
343
343
    @LoginRequired()
    @HasRepoPermissionAnyDecorator(
        'repository.read', 'repository.write', 'repository.admin')
    def repo_archivefile(self):
        # archive cache config
        from rhodecode import CONFIG
        _ = self.request.translate
        self.load_default_context()
        default_at_path = '/'
        fname = self.request.matchdict['fname']
        subrepos = self.request.GET.get('subrepos') == 'true'
        with_hash = str2bool(self.request.GET.get('with_hash', '1'))
        at_path = self.request.GET.get('at_path') or default_at_path

        if not self.db_repo.enable_downloads:
            return Response(_('Downloads disabled'))

        try:
            commit_id, ext, fileformat, content_type = \
                self._get_archive_spec(fname)
        except ValueError:
            return Response(_('Unknown archive type for: `{}`').format(
                h.escape(fname)))

        try:
            commit = self.rhodecode_vcs_repo.get_commit(commit_id)
        except CommitDoesNotExistError:
            return Response(_('Unknown commit_id {}').format(
                h.escape(commit_id)))
        except EmptyRepositoryError:
            return Response(_('Empty repository'))

        # we used a ref or a shortened id; redirect the client to use the
        # explicit full hash
        if commit_id != commit.raw_id:
            fname = '{}{}'.format(commit.raw_id, ext)
            raise HTTPFound(self.request.current_route_path(fname=fname))

        try:
            at_path = commit.get_node(at_path).path or default_at_path
        except Exception:
            return Response(_('No node at path {} for this repository').format(h.escape(at_path)))

        # path sha is part of subdir
        path_sha = ''
        if at_path != default_at_path:
            path_sha = sha1(at_path)[:8]
        short_sha = '-{}'.format(safe_str(commit.short_id))
        # used for cache etc.
        archive_name = self._get_archive_name(
            self.db_repo_name, commit_sha=short_sha, ext=ext, subrepos=subrepos,
            path_sha=path_sha, with_hash=with_hash)

        if not with_hash:
            short_sha = ''
            path_sha = ''

        # what the end client gets served
        response_archive_name = self._get_archive_name(
            self.db_repo_name, commit_sha=short_sha, ext=ext, subrepos=subrepos,
            path_sha=path_sha, with_hash=with_hash)
        # remove the extension from our archive directory name
        archive_dir_name = response_archive_name[:-len(ext)]

        use_cached_archive = False
        archive_cache_dir = CONFIG.get('archive_cache_dir')
        archive_cache_enabled = archive_cache_dir and not self.request.GET.get('no_cache')
        cached_archive_path = None

        if archive_cache_enabled:
            # check if it's ok to write, and re-create the archive cache dir
            if not os.path.isdir(CONFIG['archive_cache_dir']):
                os.makedirs(CONFIG['archive_cache_dir'])

            cached_archive_path = os.path.join(
                CONFIG['archive_cache_dir'], archive_name)
            if os.path.isfile(cached_archive_path):
                log.debug('Found cached archive in %s', cached_archive_path)
                fd, archive = None, cached_archive_path
                use_cached_archive = True
            else:
                log.debug('Archive %s is not yet cached', archive_name)

        # generate a new archive, as the previous one was not found in the cache
        if not use_cached_archive:
            _dir = os.path.abspath(archive_cache_dir) if archive_cache_dir else None
            fd, archive = tempfile.mkstemp(dir=_dir)
            log.debug('Creating new temp archive in %s', archive)
            try:
                commit.archive_repo(archive, archive_dir_name=archive_dir_name,
                                    kind=fileformat, subrepos=subrepos,
                                    archive_at_path=at_path)
            except ImproperArchiveTypeError:
                return _('Unknown archive type')
            if archive_cache_enabled:
                # if we generated the archive and have the cache enabled,
                # let's store it for future use
                log.debug('Storing new archive in %s', cached_archive_path)
                shutil.move(archive, cached_archive_path)
                archive = cached_archive_path

        # store download action
        audit_logger.store_web(
            'repo.archive.download', action_data={
                'user_agent': self.request.user_agent,
                'archive_name': archive_name,
                'archive_spec': fname,
                'archive_cached': use_cached_archive},
            user=self._rhodecode_user,
            repo=self.db_repo,
            commit=True
        )

        def get_chunked_archive(archive_path):
            with open(archive_path, 'rb') as stream:
                while True:
                    data = stream.read(16 * 1024)
                    if not data:
                        if fd:  # fd means we used a temporary file
                            os.close(fd)
                        if not archive_cache_enabled:
                            log.debug('Destroying temp archive %s', archive_path)
                            os.remove(archive_path)
                        break
                    yield data

        response = Response(app_iter=get_chunked_archive(archive))
        response.content_disposition = str('attachment; filename=%s' % response_archive_name)
        response.content_type = str(content_type)

        return response

    def _get_file_node(self, commit_id, f_path):
        if commit_id not in ['', None, 'None', '0' * 12, '0' * 40]:
            commit = self.rhodecode_vcs_repo.get_commit(commit_id=commit_id)
            try:
                node = commit.get_node(f_path)
                if node.is_dir():
                    raise NodeError('%s path is a %s not a file'
                                    % (node, type(node)))
            except NodeDoesNotExistError:
                commit = EmptyCommit(
                    commit_id=commit_id,
                    idx=commit.idx,
                    repo=commit.repository,
                    alias=commit.repository.alias,
                    message=commit.message,
                    author=commit.author,
                    date=commit.date)
                node = FileNode(f_path, '', commit=commit)
        else:
            commit = EmptyCommit(
                repo=self.rhodecode_vcs_repo,
                alias=self.rhodecode_vcs_repo.alias)
            node = FileNode(f_path, '', commit=commit)
        return node

    @LoginRequired()
    @HasRepoPermissionAnyDecorator(
        'repository.read', 'repository.write', 'repository.admin')
    def repo_files_diff(self):
        c = self.load_default_context()
        f_path = self._get_f_path(self.request.matchdict)
        diff1 = self.request.GET.get('diff1', '')
        diff2 = self.request.GET.get('diff2', '')

        path1, diff1 = parse_path_ref(diff1, default_path=f_path)

        ignore_whitespace = str2bool(self.request.GET.get('ignorews'))
        line_context = self.request.GET.get('context', 3)

        if not any((diff1, diff2)):
            h.flash(
                'Need query parameter "diff1" or "diff2" to generate a diff.',
                category='error')
            raise HTTPBadRequest()

        c.action = self.request.GET.get('diff')
        if c.action not in ['download', 'raw']:
            compare_url = h.route_path(
                'repo_compare',
                repo_name=self.db_repo_name,
                source_ref_type='rev',
                source_ref=diff1,
                target_repo=self.db_repo_name,
                target_ref_type='rev',
                target_ref=diff2,
                _query=dict(f_path=f_path))
            # redirect to the new view if we render the diff
            raise HTTPFound(compare_url)

        try:
            node1 = self._get_file_node(diff1, path1)
            node2 = self._get_file_node(diff2, f_path)
        except (RepositoryError, NodeError):
            log.exception("Exception while trying to get node from repository")
            raise HTTPFound(
                h.route_path('repo_files', repo_name=self.db_repo_name,
                             commit_id='tip', f_path=f_path))

        if all(isinstance(node.commit, EmptyCommit)
               for node in (node1, node2)):
            raise HTTPNotFound()

        c.commit_1 = node1.commit
        c.commit_2 = node2.commit

        if c.action == 'download':
            _diff = diffs.get_gitdiff(node1, node2,
                                      ignore_whitespace=ignore_whitespace,
                                      context=line_context)
            diff = diffs.DiffProcessor(_diff, format='gitdiff')

            response = Response(self.path_filter.get_raw_patch(diff))
            response.content_type = 'text/plain'
            response.content_disposition = (
                'attachment; filename=%s_%s_vs_%s.diff' % (f_path, diff1, diff2)
            )
            charset = self._get_default_encoding(c)
            if charset:
                response.charset = charset
            return response

        elif c.action == 'raw':
            _diff = diffs.get_gitdiff(node1, node2,
                                      ignore_whitespace=ignore_whitespace,
                                      context=line_context)
            diff = diffs.DiffProcessor(_diff, format='gitdiff')

            response = Response(self.path_filter.get_raw_patch(diff))
            response.content_type = 'text/plain'
            charset = self._get_default_encoding(c)
            if charset:
                response.charset = charset
            return response

        # in case we ever end up here
        raise HTTPNotFound()

    @LoginRequired()
    @HasRepoPermissionAnyDecorator(
        'repository.read', 'repository.write', 'repository.admin')
    def repo_files_diff_2way_redirect(self):
        """
        Kept only to make OLD links work
        """
        f_path = self._get_f_path_unchecked(self.request.matchdict)
        diff1 = self.request.GET.get('diff1', '')
        diff2 = self.request.GET.get('diff2', '')

        if not any((diff1, diff2)):
            h.flash(
                'Need query parameter "diff1" or "diff2" to generate a diff.',
                category='error')
            raise HTTPBadRequest()

        compare_url = h.route_path(
            'repo_compare',
            repo_name=self.db_repo_name,
            source_ref_type='rev',
            source_ref=diff1,
            target_ref_type='rev',
            target_ref=diff2,
            _query=dict(f_path=f_path, diffmode='sideside',
                        target_repo=self.db_repo_name,))
        raise HTTPFound(compare_url)

    @LoginRequired()
    def repo_files_default_commit_redirect(self):
        """
        Special page that redirects to the landing page of files, based on
        the default commit of the repository.
        """
        c = self.load_default_context()
        ref_name = c.rhodecode_db_repo.landing_ref_name
        landing_url = h.repo_files_by_ref_url(
            c.rhodecode_db_repo.repo_name,
            c.rhodecode_db_repo.repo_type,
            f_path='',
            ref_name=ref_name,
            commit_id='tip',
            query=dict(at=ref_name)
        )

        raise HTTPFound(landing_url)

    @LoginRequired()
    @HasRepoPermissionAnyDecorator(
        'repository.read', 'repository.write', 'repository.admin')
    def repo_files(self):
        c = self.load_default_context()

        view_name = getattr(self.request.matched_route, 'name', None)

        c.annotate = view_name == 'repo_files:annotated'
        # default is false, but .rst/.md files are auto-rendered later; we can
        # overwrite auto rendering by setting this GET flag
        c.renderer = view_name == 'repo_files:rendered' or \
            not self.request.GET.get('no-render', False)

        commit_id, f_path = self._get_commit_and_path()

        c.commit = self._get_commit_or_redirect(commit_id)
        c.branch = self.request.GET.get('branch', None)
        c.f_path = f_path
        at_rev = self.request.GET.get('at')

        # prev link
        try:
            prev_commit = c.commit.prev(c.branch)
            c.prev_commit = prev_commit
            c.url_prev = h.route_path(
                'repo_files', repo_name=self.db_repo_name,
                commit_id=prev_commit.raw_id, f_path=f_path)
            if c.branch:
                c.url_prev += '?branch=%s' % c.branch
        except (CommitDoesNotExistError, VCSError):
            c.url_prev = '#'
            c.prev_commit = EmptyCommit()

        # next link
        try:
            next_commit = c.commit.next(c.branch)
            c.next_commit = next_commit
            c.url_next = h.route_path(
                'repo_files', repo_name=self.db_repo_name,
                commit_id=next_commit.raw_id, f_path=f_path)
            if c.branch:
                c.url_next += '?branch=%s' % c.branch
        except (CommitDoesNotExistError, VCSError):
            c.url_next = '#'
            c.next_commit = EmptyCommit()

        # files or dirs
        try:
            c.file = c.commit.get_node(f_path)
            c.file_author = True
            c.file_tree = ''

            # load file content
            if c.file.is_file():
                c.lf_node = {}

                has_lf_enabled = self._is_lf_enabled(self.db_repo)
                if has_lf_enabled:
                    c.lf_node = c.file.get_largefile_node()

                c.file_source_page = 'true'
                c.file_last_commit = c.file.last_commit

                c.file_size_too_big = c.file.size > c.visual.cut_off_limit_file

                if not (c.file_size_too_big or c.file.is_binary):
                    if c.annotate:  # annotation has precedence over renderer
                        c.annotated_lines = filenode_as_annotated_lines_tokens(
                            c.file
                        )
                    else:
                        c.renderer = (
                            c.renderer and h.renderer_from_filename(c.file.path)
                        )
                        if not c.renderer:
                            c.lines = filenode_as_lines_tokens(c.file)

                _branch_name, _sha_commit_id, is_head = \
                    self._is_valid_head(commit_id, self.rhodecode_vcs_repo,
                                        landing_ref=self.db_repo.landing_ref_name)
                c.on_branch_head = is_head

                branch = c.commit.branch if (
                    c.commit.branch and '/' not in c.commit.branch) else None
                c.branch_or_raw_id = branch or c.commit.raw_id
                c.branch_name = c.commit.branch or h.short_id(c.commit.raw_id)

                author = c.file_last_commit.author
                c.authors = [[
                    h.email(author),
                    h.person(author, 'username_or_name_or_email'),
                    1
                ]]

            else:  # load tree content at path
                c.file_source_page = 'false'
                c.authors = []
                # this loads a simple tree without metadata, to speed things up;
                # later, via ajax, we call repo_nodetree_full and fetch the whole tree
                c.file_tree = self._get_tree_at_commit(c, c.commit.raw_id, f_path, at_rev=at_rev)

                c.readme_data, c.readme_file = \
                    self._get_readme_data(self.db_repo, c.visual.default_renderer,
                                          c.commit.raw_id, f_path)

        except RepositoryError as e:
            h.flash(h.escape(safe_str(e)), category='error')
            raise HTTPNotFound()

        if self.request.environ.get('HTTP_X_PJAX'):
            html = render('rhodecode:templates/files/files_pjax.mako',
                          self._get_template_context(c), self.request)
        else:
            html = render('rhodecode:templates/files/files.mako',
                          self._get_template_context(c), self.request)
        return Response(html)

    @HasRepoPermissionAnyDecorator(
        'repository.read', 'repository.write', 'repository.admin')
    def repo_files_annotated_previous(self):
        self.load_default_context()

        commit_id, f_path = self._get_commit_and_path()
        commit = self._get_commit_or_redirect(commit_id)
        prev_commit_id = commit.raw_id
        line_anchor = self.request.GET.get('line_anchor')
        is_file = False
        try:
            _file = commit.get_node(f_path)
            is_file = _file.is_file()
        except (NodeDoesNotExistError, CommitDoesNotExistError, VCSError):
            pass

        if is_file:
            history = commit.get_path_history(f_path)
            prev_commit_id = history[1].raw_id \
                if len(history) > 1 else prev_commit_id
        prev_url = h.route_path(
            'repo_files:annotated', repo_name=self.db_repo_name,
            commit_id=prev_commit_id, f_path=f_path,
            _anchor='L{}'.format(line_anchor))

        raise HTTPFound(prev_url)

    @LoginRequired()
    @HasRepoPermissionAnyDecorator(
        'repository.read', 'repository.write', 'repository.admin')
    def repo_nodetree_full(self):
        """
        Returns rendered html of a file tree that contains the commit date,
        author and commit_id for the specified combination of
        repo, commit_id and file path
        """
        c = self.load_default_context()

        commit_id, f_path = self._get_commit_and_path()
        commit = self._get_commit_or_redirect(commit_id)
        try:
            dir_node = commit.get_node(f_path)
        except RepositoryError as e:
            return Response('error: {}'.format(h.escape(safe_str(e))))

        if dir_node.is_file():
            return Response('')

        c.file = dir_node
        c.commit = commit
        at_rev = self.request.GET.get('at')

        html = self._get_tree_at_commit(
            c, commit.raw_id, dir_node.path, full_load=True, at_rev=at_rev)

        return Response(html)

    def _get_attachement_headers(self, f_path):
        f_name = safe_str(f_path.split(Repository.NAME_SEP)[-1])
        safe_path = f_name.replace('"', '\\"')
        encoded_path = urllib.quote(f_name)

        return "attachment; " \
               "filename=\"{}\"; " \
               "filename*=UTF-8\'\'{}".format(safe_path, encoded_path)
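
    # Illustrative result of _get_attachement_headers above, for a hypothetical
    # non-ascii filename 'résumé.txt' (plain RFC 6266 filename plus the
    # percent-encoded RFC 5987 filename* form):
    #   attachment; filename="résumé.txt"; filename*=UTF-8''r%C3%A9sum%C3%A9.txt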

    @LoginRequired()
    @HasRepoPermissionAnyDecorator(
        'repository.read', 'repository.write', 'repository.admin')
    def repo_file_raw(self):
        """
        Action for show as raw; some mimetypes are "rendered",
        such as images and icons.
        """
        c = self.load_default_context()

        commit_id, f_path = self._get_commit_and_path()
        commit = self._get_commit_or_redirect(commit_id)
        file_node = self._get_filenode_or_redirect(commit, f_path)

        raw_mimetype_mapping = {
            # map original mimetype to a mimetype used for "show as raw"
            # you can also provide a content-disposition to override the
            # default "attachment" disposition.
            # orig_type: (new_type, new_dispo)

            # show images inline:
            # Do not re-add SVG: it is unsafe and permits XSS attacks. One can
            # for example render an SVG with javascript inside or even render
            # HTML.
            'image/x-icon': ('image/x-icon', 'inline'),
            'image/png': ('image/png', 'inline'),
            'image/gif': ('image/gif', 'inline'),
            'image/jpeg': ('image/jpeg', 'inline'),
            'application/pdf': ('application/pdf', 'inline'),
        }

        mimetype = file_node.mimetype
        try:
            mimetype, disposition = raw_mimetype_mapping[mimetype]
        except KeyError:
            # we don't know anything special about this, handle it safely
            if file_node.is_binary:
                # do the same as download raw for binary files
                mimetype, disposition = 'application/octet-stream', 'attachment'
            else:
                # do not just use the original mimetype, but force text/plain,
                # otherwise it would serve text/html and that might be unsafe.
                # Note: the underlying vcs library fakes a text/plain mimetype if
                # the mimetype can not be determined and it thinks it is not
                # binary. This might lead to erroneous text display in some
                # cases, but helps in other cases, like with text files
                # without extension.
                mimetype, disposition = 'text/plain', 'inline'
861
861
862 if disposition == 'attachment':
862 if disposition == 'attachment':
863 disposition = self._get_attachement_headers(f_path)
863 disposition = self._get_attachement_headers(f_path)
864
864
865 stream_content = file_node.stream_bytes()
865 stream_content = file_node.stream_bytes()
866
866
867 response = Response(app_iter=stream_content)
867 response = Response(app_iter=stream_content)
868 response.content_disposition = disposition
868 response.content_disposition = disposition
869 response.content_type = mimetype
869 response.content_type = mimetype
870
870
871 charset = self._get_default_encoding(c)
871 charset = self._get_default_encoding(c)
872 if charset:
872 if charset:
873 response.charset = charset
873 response.charset = charset
874
874
875 return response
875 return response
876
876
    @LoginRequired()
    @HasRepoPermissionAnyDecorator(
        'repository.read', 'repository.write', 'repository.admin')
    def repo_file_download(self):
        c = self.load_default_context()

        commit_id, f_path = self._get_commit_and_path()
        commit = self._get_commit_or_redirect(commit_id)
        file_node = self._get_filenode_or_redirect(commit, f_path)

        if self.request.GET.get('lf'):
            # only if the `lf` GET flag is passed do we download this file
            # as LFS/Largefile
            lf_node = file_node.get_largefile_node()
            if lf_node:
                # overwrite our pointer with the REAL large-file
                file_node = lf_node

        disposition = self._get_attachement_headers(f_path)

        stream_content = file_node.stream_bytes()

        response = Response(app_iter=stream_content)
        response.content_disposition = disposition
        response.content_type = file_node.mimetype

        charset = self._get_default_encoding(c)
        if charset:
            response.charset = charset

        return response

    def _get_nodelist_at_commit(self, repo_name, repo_id, commit_id, f_path):

        cache_seconds = safe_int(
            rhodecode.CONFIG.get('rc_cache.cache_repo.expiration_time'))
        cache_on = cache_seconds > 0
        log.debug(
            'Computing FILE SEARCH for repo_id %s commit_id `%s` and path `%s` '
            'with caching: %s[TTL: %ss]' % (
                repo_id, commit_id, f_path, cache_on, cache_seconds or 0))

        cache_namespace_uid = 'cache_repo.{}'.format(repo_id)
        region = rc_cache.get_or_create_region('cache_repo', cache_namespace_uid)

        @region.conditional_cache_on_arguments(namespace=cache_namespace_uid, condition=cache_on)
        def compute_file_search(_name_hash, _repo_id, _commit_id, _f_path):
            log.debug('Generating cached nodelist for repo_id:%s, %s, %s',
                      _repo_id, commit_id, f_path)
            try:
                _d, _f = ScmModel().get_quick_filter_nodes(repo_name, _commit_id, _f_path)
            except (RepositoryError, CommitDoesNotExistError, Exception) as e:
                log.exception(safe_str(e))
                h.flash(h.escape(safe_str(e)), category='error')
                raise HTTPFound(h.route_path(
                    'repo_files', repo_name=self.db_repo_name,
                    commit_id='tip', f_path='/'))

            return _d + _f

        result = compute_file_search(self.db_repo.repo_name_hash, self.db_repo.repo_id,
                                     commit_id, f_path)
        return filter(lambda n: self.path_filter.path_access_allowed(n['name']), result)

    @LoginRequired()
    @HasRepoPermissionAnyDecorator(
        'repository.read', 'repository.write', 'repository.admin')
    def repo_nodelist(self):
        self.load_default_context()

        commit_id, f_path = self._get_commit_and_path()
        commit = self._get_commit_or_redirect(commit_id)

        metadata = self._get_nodelist_at_commit(
            self.db_repo_name, self.db_repo.repo_id, commit.raw_id, f_path)
        return {'nodes': metadata}

    def _create_references(self, branches_or_tags, symbolic_reference, f_path, ref_type):
        items = []
        for name, commit_id in branches_or_tags.items():
            sym_ref = symbolic_reference(commit_id, name, f_path, ref_type)
            items.append((sym_ref, name, ref_type))
        return items

    def _symbolic_reference(self, commit_id, name, f_path, ref_type):
        return commit_id

    def _symbolic_reference_svn(self, commit_id, name, f_path, ref_type):
        return commit_id

        # NOTE(dan): old code we used in "diff" mode compare,
        # kept commented out since it is unreachable after the return above:
        # new_f_path = vcspath.join(name, f_path)
        # return u'%s@%s' % (new_f_path, commit_id)

    def _get_node_history(self, commit_obj, f_path, commits=None):
        """
        Get commit history for the given node.

        :param commit_obj: commit to calculate history from
        :param f_path: path of the node to calculate history for
        :param commits: if passed, don't calculate history but take the
            commits defined in this list
        """
        _ = self.request.translate

        # calculate history based on tip
        tip = self.rhodecode_vcs_repo.get_commit()
        if commits is None:
            pre_load = ["author", "branch"]
            try:
                commits = tip.get_path_history(f_path, pre_load=pre_load)
            except (NodeDoesNotExistError, CommitError):
                # this node is not present at tip!
                commits = commit_obj.get_path_history(f_path, pre_load=pre_load)

        history = []
        commits_group = ([], _("Changesets"))
        for commit in commits:
            branch = ' (%s)' % commit.branch if commit.branch else ''
            n_desc = 'r%s:%s%s' % (commit.idx, commit.short_id, branch)
            commits_group[0].append((commit.raw_id, n_desc, 'sha'))
        history.append(commits_group)

        symbolic_reference = self._symbolic_reference

        if self.rhodecode_vcs_repo.alias == 'svn':
            adjusted_f_path = RepoFilesView.adjust_file_path_for_svn(
                f_path, self.rhodecode_vcs_repo)
            if adjusted_f_path != f_path:
                log.debug(
                    'Recognized svn tag or branch in file "%s", using svn '
                    'specific symbolic references', f_path)
                f_path = adjusted_f_path
                symbolic_reference = self._symbolic_reference_svn

        branches = self._create_references(
            self.rhodecode_vcs_repo.branches, symbolic_reference, f_path, 'branch')
        branches_group = (branches, _("Branches"))

        tags = self._create_references(
            self.rhodecode_vcs_repo.tags, symbolic_reference, f_path, 'tag')
        tags_group = (tags, _("Tags"))

        history.append(branches_group)
        history.append(tags_group)

        return history, commits

    @LoginRequired()
    @HasRepoPermissionAnyDecorator(
        'repository.read', 'repository.write', 'repository.admin')
    def repo_file_history(self):
        self.load_default_context()

        commit_id, f_path = self._get_commit_and_path()
        commit = self._get_commit_or_redirect(commit_id)
        file_node = self._get_filenode_or_redirect(commit, f_path)

        if file_node.is_file():
            file_history, _hist = self._get_node_history(commit, f_path)

            res = []
            for section_items, section in file_history:
                items = []
                for obj_id, obj_text, obj_type in section_items:
                    at_rev = ''
                    if obj_type in ['branch', 'bookmark', 'tag']:
                        at_rev = obj_text
                    entry = {
                        'id': obj_id,
                        'text': obj_text,
                        'type': obj_type,
                        'at_rev': at_rev
                    }

                    items.append(entry)

                res.append({
                    'text': section,
                    'children': items
                })

            data = {
                'more': False,
                'results': res
            }
            return data

        log.warning('Cannot fetch history for directory')
        raise HTTPBadRequest()

    @LoginRequired()
    @HasRepoPermissionAnyDecorator(
        'repository.read', 'repository.write', 'repository.admin')
    def repo_file_authors(self):
        c = self.load_default_context()

        commit_id, f_path = self._get_commit_and_path()
        commit = self._get_commit_or_redirect(commit_id)
        file_node = self._get_filenode_or_redirect(commit, f_path)

        if not file_node.is_file():
            raise HTTPBadRequest()

        c.file_last_commit = file_node.last_commit
        if self.request.GET.get('annotate') == '1':
            # use _hist from annotation if annotation mode is on
            commit_ids = set(x[1] for x in file_node.annotate)
            _hist = (
                self.rhodecode_vcs_repo.get_commit(commit_id)
                for commit_id in commit_ids)
        else:
            _f_history, _hist = self._get_node_history(commit, f_path)
        c.file_author = False

        unique = collections.OrderedDict()
        for commit in _hist:
            author = commit.author
            if author not in unique:
                unique[author] = [
                    h.email(author),
                    h.person(author, 'username_or_name_or_email'),
                    1  # counter
                ]
            else:
                # increase counter
                unique[author][2] += 1

        c.authors = [val for val in unique.values()]

        return self._get_template_context(c)

    @LoginRequired()
    @HasRepoPermissionAnyDecorator('repository.write', 'repository.admin')
    def repo_files_check_head(self):
        self.load_default_context()

        commit_id, f_path = self._get_commit_and_path()
        _branch_name, _sha_commit_id, is_head = \
            self._is_valid_head(commit_id, self.rhodecode_vcs_repo,
                                landing_ref=self.db_repo.landing_ref_name)

        new_path = self.request.POST.get('path')
        operation = self.request.POST.get('operation')
        path_exist = ''

        if new_path and operation in ['create', 'upload']:
            new_f_path = os.path.join(f_path.lstrip('/'), new_path)
            try:
                commit_obj = self.rhodecode_vcs_repo.get_commit(commit_id)
                # NOTE(dan): construct whole path without leading /
                file_node = commit_obj.get_node(new_f_path)
                if file_node is not None:
                    path_exist = new_f_path
            except EmptyRepositoryError:
                pass
            except Exception:
                pass

        return {
            'branch': _branch_name,
            'sha': _sha_commit_id,
            'is_head': is_head,
            'path_exists': path_exist
        }

    @LoginRequired()
    @HasRepoPermissionAnyDecorator('repository.write', 'repository.admin')
    def repo_files_remove_file(self):
        _ = self.request.translate
        c = self.load_default_context()
        commit_id, f_path = self._get_commit_and_path()

        self._ensure_not_locked()
        _branch_name, _sha_commit_id, is_head = \
            self._is_valid_head(commit_id, self.rhodecode_vcs_repo,
                                landing_ref=self.db_repo.landing_ref_name)

        self.forbid_non_head(is_head, f_path)
        self.check_branch_permission(_branch_name)

        c.commit = self._get_commit_or_redirect(commit_id)
        c.file = self._get_filenode_or_redirect(c.commit, f_path)

        c.default_message = _(
            'Deleted file {} via RhodeCode Enterprise').format(f_path)
        c.f_path = f_path

        return self._get_template_context(c)

    @LoginRequired()
    @HasRepoPermissionAnyDecorator('repository.write', 'repository.admin')
    @CSRFRequired()
    def repo_files_delete_file(self):
        _ = self.request.translate

        c = self.load_default_context()
        commit_id, f_path = self._get_commit_and_path()

        self._ensure_not_locked()
        _branch_name, _sha_commit_id, is_head = \
            self._is_valid_head(commit_id, self.rhodecode_vcs_repo,
                                landing_ref=self.db_repo.landing_ref_name)

        self.forbid_non_head(is_head, f_path)
        self.check_branch_permission(_branch_name)

        c.commit = self._get_commit_or_redirect(commit_id)
        c.file = self._get_filenode_or_redirect(c.commit, f_path)

        c.default_message = _(
            'Deleted file {} via RhodeCode Enterprise').format(f_path)
        c.f_path = f_path
        node_path = f_path
        author = self._rhodecode_db_user.full_contact
        message = self.request.POST.get('message') or c.default_message
        try:
            nodes = {
                node_path: {
                    'content': ''
                }
            }
            ScmModel().delete_nodes(
                user=self._rhodecode_db_user.user_id, repo=self.db_repo,
                message=message,
                nodes=nodes,
                parent_commit=c.commit,
                author=author,
            )

            h.flash(
                _('Successfully deleted file `{}`').format(
                    h.escape(f_path)), category='success')
        except Exception:
            log.exception('Error during commit operation')
            h.flash(_('Error occurred during commit'), category='error')
        raise HTTPFound(
            h.route_path('repo_commit', repo_name=self.db_repo_name,
                         commit_id='tip'))

    @LoginRequired()
    @HasRepoPermissionAnyDecorator('repository.write', 'repository.admin')
    def repo_files_edit_file(self):
        _ = self.request.translate
        c = self.load_default_context()
        commit_id, f_path = self._get_commit_and_path()

        self._ensure_not_locked()
        _branch_name, _sha_commit_id, is_head = \
            self._is_valid_head(commit_id, self.rhodecode_vcs_repo,
                                landing_ref=self.db_repo.landing_ref_name)

        self.forbid_non_head(is_head, f_path, commit_id=commit_id)
        self.check_branch_permission(_branch_name, commit_id=commit_id)

        c.commit = self._get_commit_or_redirect(commit_id)
        c.file = self._get_filenode_or_redirect(c.commit, f_path)

        if c.file.is_binary:
            files_url = h.route_path(
                'repo_files',
                repo_name=self.db_repo_name,
                commit_id=c.commit.raw_id, f_path=f_path)
            raise HTTPFound(files_url)

        c.default_message = _('Edited file {} via RhodeCode Enterprise').format(f_path)
        c.f_path = f_path

        return self._get_template_context(c)

    @LoginRequired()
    @HasRepoPermissionAnyDecorator('repository.write', 'repository.admin')
    @CSRFRequired()
    def repo_files_update_file(self):
        _ = self.request.translate
        c = self.load_default_context()
        commit_id, f_path = self._get_commit_and_path()

        self._ensure_not_locked()

        c.commit = self._get_commit_or_redirect(commit_id)
        c.file = self._get_filenode_or_redirect(c.commit, f_path)

        if c.file.is_binary:
            raise HTTPFound(h.route_path('repo_files', repo_name=self.db_repo_name,
                                         commit_id=c.commit.raw_id, f_path=f_path))

        _branch_name, _sha_commit_id, is_head = \
            self._is_valid_head(commit_id, self.rhodecode_vcs_repo,
                                landing_ref=self.db_repo.landing_ref_name)

        self.forbid_non_head(is_head, f_path, commit_id=commit_id)
        self.check_branch_permission(_branch_name, commit_id=commit_id)

        c.default_message = _('Edited file {} via RhodeCode Enterprise').format(f_path)
        c.f_path = f_path

        old_content = c.file.content
        sl = old_content.splitlines(True)
        first_line = sl[0] if sl else ''

        r_post = self.request.POST
        # line endings: 0 - Unix, 1 - Mac, 2 - DOS
        line_ending_mode = detect_mode(first_line, 0)
        content = convert_line_endings(r_post.get('content', ''), line_ending_mode)

        message = r_post.get('message') or c.default_message
        org_node_path = c.file.unicode_path
        filename = r_post['filename']

        root_path = c.file.dir_path
        pure_path = self.create_pure_path(root_path, filename)
        node_path = safe_unicode(bytes(pure_path))

        default_redirect_url = h.route_path('repo_commit', repo_name=self.db_repo_name,
                                            commit_id=commit_id)
        if content == old_content and node_path == org_node_path:
            h.flash(_('No changes detected on {}').format(h.escape(org_node_path)),
                    category='warning')
            raise HTTPFound(default_redirect_url)

        try:
            mapping = {
                org_node_path: {
                    'org_filename': org_node_path,
                    'filename': node_path,
                    'content': content,
                    'lexer': '',
                    'op': 'mod',
                    'mode': c.file.mode
                }
            }

            commit = ScmModel().update_nodes(
                user=self._rhodecode_db_user.user_id,
                repo=self.db_repo,
                message=message,
                nodes=mapping,
                parent_commit=c.commit,
            )

            h.flash(_('Successfully committed changes to file `{}`').format(
                h.escape(f_path)), category='success')
            default_redirect_url = h.route_path(
                'repo_commit', repo_name=self.db_repo_name, commit_id=commit.raw_id)

        except Exception:
            log.exception('Error occurred during commit')
            h.flash(_('Error occurred during commit'), category='error')

        raise HTTPFound(default_redirect_url)

1330 @LoginRequired()
1330 @LoginRequired()
1331 @HasRepoPermissionAnyDecorator('repository.write', 'repository.admin')
1331 @HasRepoPermissionAnyDecorator('repository.write', 'repository.admin')
1332 def repo_files_add_file(self):
1332 def repo_files_add_file(self):
1333 _ = self.request.translate
1333 _ = self.request.translate
1334 c = self.load_default_context()
1334 c = self.load_default_context()
1335 commit_id, f_path = self._get_commit_and_path()
1335 commit_id, f_path = self._get_commit_and_path()
1336
1336
1337 self._ensure_not_locked()
1337 self._ensure_not_locked()
1338
1338
1339 c.commit = self._get_commit_or_redirect(commit_id, redirect_after=False)
1339 c.commit = self._get_commit_or_redirect(commit_id, redirect_after=False)
1340 if c.commit is None:
1340 if c.commit is None:
1341 c.commit = EmptyCommit(alias=self.rhodecode_vcs_repo.alias)
1341 c.commit = EmptyCommit(alias=self.rhodecode_vcs_repo.alias)
1342
1342
1343 if self.rhodecode_vcs_repo.is_empty():
1343 if self.rhodecode_vcs_repo.is_empty():
1344 # for empty repository we cannot check for current branch, we rely on
1344 # for empty repository we cannot check for current branch, we rely on
1345 # c.commit.branch instead
1345 # c.commit.branch instead
1346 _branch_name, _sha_commit_id, is_head = c.commit.branch, '', True
1346 _branch_name, _sha_commit_id, is_head = c.commit.branch, '', True
1347 else:
1347 else:
1348 _branch_name, _sha_commit_id, is_head = \
1348 _branch_name, _sha_commit_id, is_head = \
1349 self._is_valid_head(commit_id, self.rhodecode_vcs_repo,
1349 self._is_valid_head(commit_id, self.rhodecode_vcs_repo,
1350 landing_ref=self.db_repo.landing_ref_name)
1350 landing_ref=self.db_repo.landing_ref_name)
1351
1351
1352 self.forbid_non_head(is_head, f_path, commit_id=commit_id)
1352 self.forbid_non_head(is_head, f_path, commit_id=commit_id)
1353 self.check_branch_permission(_branch_name, commit_id=commit_id)
1353 self.check_branch_permission(_branch_name, commit_id=commit_id)
1354
1354
1355 c.default_message = (_('Added file via RhodeCode Enterprise'))
1355 c.default_message = (_('Added file via RhodeCode Enterprise'))
1356 c.f_path = f_path.lstrip('/') # ensure not relative path
1356 c.f_path = f_path.lstrip('/') # ensure not relative path
1357
1357
1358 return self._get_template_context(c)
1358 return self._get_template_context(c)
1359
1359
1360 @LoginRequired()
1360 @LoginRequired()
1361 @HasRepoPermissionAnyDecorator('repository.write', 'repository.admin')
1361 @HasRepoPermissionAnyDecorator('repository.write', 'repository.admin')
1362 @CSRFRequired()
1362 @CSRFRequired()
1363 def repo_files_create_file(self):
1363 def repo_files_create_file(self):
1364 _ = self.request.translate
1364 _ = self.request.translate
1365 c = self.load_default_context()
1365 c = self.load_default_context()
1366 commit_id, f_path = self._get_commit_and_path()
1366 commit_id, f_path = self._get_commit_and_path()
1367
1367
1368 self._ensure_not_locked()
1368 self._ensure_not_locked()
1369
1369
1370 c.commit = self._get_commit_or_redirect(commit_id, redirect_after=False)
1370 c.commit = self._get_commit_or_redirect(commit_id, redirect_after=False)
1371 if c.commit is None:
1371 if c.commit is None:
1372 c.commit = EmptyCommit(alias=self.rhodecode_vcs_repo.alias)
1372 c.commit = EmptyCommit(alias=self.rhodecode_vcs_repo.alias)
1373
1373
1374 # calculate redirect URL
1374 # calculate redirect URL
1375 if self.rhodecode_vcs_repo.is_empty():
1375 if self.rhodecode_vcs_repo.is_empty():
1376 default_redirect_url = h.route_path(
1376 default_redirect_url = h.route_path(
1377 'repo_summary', repo_name=self.db_repo_name)
1377 'repo_summary', repo_name=self.db_repo_name)
1378 else:
1378 else:
1379 default_redirect_url = h.route_path(
1379 default_redirect_url = h.route_path(
1380 'repo_commit', repo_name=self.db_repo_name, commit_id='tip')
1380 'repo_commit', repo_name=self.db_repo_name, commit_id='tip')
1381
1381
1382 if self.rhodecode_vcs_repo.is_empty():
1382 if self.rhodecode_vcs_repo.is_empty():
1383 # for empty repository we cannot check for current branch, we rely on
1383 # for empty repository we cannot check for current branch, we rely on
1384 # c.commit.branch instead
1384 # c.commit.branch instead
1385 _branch_name, _sha_commit_id, is_head = c.commit.branch, '', True
1385 _branch_name, _sha_commit_id, is_head = c.commit.branch, '', True
1386 else:
1386 else:
1387 _branch_name, _sha_commit_id, is_head = \
1387 _branch_name, _sha_commit_id, is_head = \
1388 self._is_valid_head(commit_id, self.rhodecode_vcs_repo,
1388 self._is_valid_head(commit_id, self.rhodecode_vcs_repo,
1389 landing_ref=self.db_repo.landing_ref_name)
1389 landing_ref=self.db_repo.landing_ref_name)
1390
1390
1391 self.forbid_non_head(is_head, f_path, commit_id=commit_id)
1391 self.forbid_non_head(is_head, f_path, commit_id=commit_id)
1392 self.check_branch_permission(_branch_name, commit_id=commit_id)
1392 self.check_branch_permission(_branch_name, commit_id=commit_id)
1393
1393
1394 c.default_message = (_('Added file via RhodeCode Enterprise'))
1394 c.default_message = (_('Added file via RhodeCode Enterprise'))
1395 c.f_path = f_path
1395 c.f_path = f_path
1396
1396
1397 r_post = self.request.POST
1397 r_post = self.request.POST
1398 message = r_post.get('message') or c.default_message
1398 message = r_post.get('message') or c.default_message
1399 filename = r_post.get('filename')
1399 filename = r_post.get('filename')
1400 unix_mode = 0
1400 unix_mode = 0
1401 content = convert_line_endings(r_post.get('content', ''), unix_mode)
1401 content = convert_line_endings(r_post.get('content', ''), unix_mode)
1402
1402
1403 if not filename:
1403 if not filename:
1404 # If there's no commit, redirect to repo summary
1404 # If there's no commit, redirect to repo summary
1405 if type(c.commit) is EmptyCommit:
1405 if type(c.commit) is EmptyCommit:
1406 redirect_url = h.route_path(
1406 redirect_url = h.route_path(
1407 'repo_summary', repo_name=self.db_repo_name)
1407 'repo_summary', repo_name=self.db_repo_name)
1408 else:
1408 else:
1409 redirect_url = default_redirect_url
1409 redirect_url = default_redirect_url
1410 h.flash(_('No filename specified'), category='warning')
1410 h.flash(_('No filename specified'), category='warning')
1411 raise HTTPFound(redirect_url)
1411 raise HTTPFound(redirect_url)
1412
1412
1413 root_path = f_path
1413 root_path = f_path
1414 pure_path = self.create_pure_path(root_path, filename)
1414 pure_path = self.create_pure_path(root_path, filename)
1415 node_path = safe_unicode(bytes(pure_path).lstrip('/'))
1415 node_path = safe_unicode(bytes(pure_path).lstrip('/'))
1416
1416
1417 author = self._rhodecode_db_user.full_contact
1417 author = self._rhodecode_db_user.full_contact
1418 nodes = {
1418 nodes = {
1419 node_path: {
1419 node_path: {
1420 'content': content
1420 'content': content
1421 }
1421 }
1422 }
1422 }
1423
1423
1424 try:
1424 try:
1425
1425
1426 commit = ScmModel().create_nodes(
1426 commit = ScmModel().create_nodes(
1427 user=self._rhodecode_db_user.user_id,
1427 user=self._rhodecode_db_user.user_id,
1428 repo=self.db_repo,
1428 repo=self.db_repo,
1429 message=message,
1429 message=message,
1430 nodes=nodes,
1430 nodes=nodes,
1431 parent_commit=c.commit,
1431 parent_commit=c.commit,
1432 author=author,
1432 author=author,
1433 )
1433 )
1434
1434
1435 h.flash(_('Successfully committed new file `{}`').format(
1435 h.flash(_('Successfully committed new file `{}`').format(
1436 h.escape(node_path)), category='success')
1436 h.escape(node_path)), category='success')
1437
1437
1438 default_redirect_url = h.route_path(
1438 default_redirect_url = h.route_path(
1439 'repo_commit', repo_name=self.db_repo_name, commit_id=commit.raw_id)
1439 'repo_commit', repo_name=self.db_repo_name, commit_id=commit.raw_id)
1440
1440
1441 except NonRelativePathError:
1441 except NonRelativePathError:
1442 log.exception('Non Relative path found')
1442 log.exception('Non Relative path found')
1443 h.flash(_('The location specified must be a relative path and must not '
1443 h.flash(_('The location specified must be a relative path and must not '
1444 'contain .. in the path'), category='warning')
1444 'contain .. in the path'), category='warning')
1445 raise HTTPFound(default_redirect_url)
1445 raise HTTPFound(default_redirect_url)
1446 except (NodeError, NodeAlreadyExistsError) as e:
1446 except (NodeError, NodeAlreadyExistsError) as e:
1447 h.flash(h.escape(safe_str(e)), category='error')
1447 h.flash(h.escape(safe_str(e)), category='error')
1448 except Exception:
1448 except Exception:
1449 log.exception('Error occurred during commit')
1449 log.exception('Error occurred during commit')
1450 h.flash(_('Error occurred during commit'), category='error')
1450 h.flash(_('Error occurred during commit'), category='error')
1451
1451
1452 raise HTTPFound(default_redirect_url)
1452 raise HTTPFound(default_redirect_url)
1453
1453
1454 @LoginRequired()
1454 @LoginRequired()
1455 @HasRepoPermissionAnyDecorator('repository.write', 'repository.admin')
1455 @HasRepoPermissionAnyDecorator('repository.write', 'repository.admin')
1456 @CSRFRequired()
1456 @CSRFRequired()
1457 def repo_files_upload_file(self):
1457 def repo_files_upload_file(self):
1458 _ = self.request.translate
1458 _ = self.request.translate
1459 c = self.load_default_context()
1459 c = self.load_default_context()
1460 commit_id, f_path = self._get_commit_and_path()
1460 commit_id, f_path = self._get_commit_and_path()
1461
1461
1462 self._ensure_not_locked()
1462 self._ensure_not_locked()
1463
1463
1464 c.commit = self._get_commit_or_redirect(commit_id, redirect_after=False)
1464 c.commit = self._get_commit_or_redirect(commit_id, redirect_after=False)
1465 if c.commit is None:
1465 if c.commit is None:
1466 c.commit = EmptyCommit(alias=self.rhodecode_vcs_repo.alias)
1466 c.commit = EmptyCommit(alias=self.rhodecode_vcs_repo.alias)
1467
1467
1468 # calculate redirect URL
1468 # calculate redirect URL
1469 if self.rhodecode_vcs_repo.is_empty():
1469 if self.rhodecode_vcs_repo.is_empty():
1470 default_redirect_url = h.route_path(
1470 default_redirect_url = h.route_path(
1471 'repo_summary', repo_name=self.db_repo_name)
1471 'repo_summary', repo_name=self.db_repo_name)
1472 else:
1472 else:
1473 default_redirect_url = h.route_path(
1473 default_redirect_url = h.route_path(
1474 'repo_commit', repo_name=self.db_repo_name, commit_id='tip')
1474 'repo_commit', repo_name=self.db_repo_name, commit_id='tip')
1475
1475
1476 if self.rhodecode_vcs_repo.is_empty():
1476 if self.rhodecode_vcs_repo.is_empty():
1477 # for empty repository we cannot check for current branch, we rely on
1477 # for empty repository we cannot check for current branch, we rely on
1478 # c.commit.branch instead
1478 # c.commit.branch instead
1479 _branch_name, _sha_commit_id, is_head = c.commit.branch, '', True
1479 _branch_name, _sha_commit_id, is_head = c.commit.branch, '', True
1480 else:
1480 else:
1481 _branch_name, _sha_commit_id, is_head = \
1481 _branch_name, _sha_commit_id, is_head = \
1482 self._is_valid_head(commit_id, self.rhodecode_vcs_repo,
1482 self._is_valid_head(commit_id, self.rhodecode_vcs_repo,
1483 landing_ref=self.db_repo.landing_ref_name)
1483 landing_ref=self.db_repo.landing_ref_name)
1484
1484
1485 error = self.forbid_non_head(is_head, f_path, json_mode=True)
1485 error = self.forbid_non_head(is_head, f_path, json_mode=True)
1486 if error:
1486 if error:
1487 return {
1487 return {
1488 'error': error,
1488 'error': error,
1489 'redirect_url': default_redirect_url
1489 'redirect_url': default_redirect_url
1490 }
1490 }
1491 error = self.check_branch_permission(_branch_name, json_mode=True)
1491 error = self.check_branch_permission(_branch_name, json_mode=True)
1492 if error:
1492 if error:
1493 return {
1493 return {
1494 'error': error,
1494 'error': error,
1495 'redirect_url': default_redirect_url
1495 'redirect_url': default_redirect_url
1496 }
1496 }
1497
1497
1498 c.default_message = (_('Uploaded file via RhodeCode Enterprise'))
1498 c.default_message = (_('Uploaded file via RhodeCode Enterprise'))
1499 c.f_path = f_path
1499 c.f_path = f_path
1500
1500
1501 r_post = self.request.POST
1501 r_post = self.request.POST
1502
1502
1503 message = c.default_message
1503 message = c.default_message
1504 user_message = r_post.getall('message')
1504 user_message = r_post.getall('message')
1505 if isinstance(user_message, list) and user_message:
1505 if isinstance(user_message, list) and user_message:
1506 # we take the first from duplicated results if it's not empty
1506 # we take the first from duplicated results if it's not empty
1507 message = user_message[0] if user_message[0] else message
1507 message = user_message[0] if user_message[0] else message
1508
1508
1509 nodes = {}
1509 nodes = {}
1510
1510
1511 for file_obj in r_post.getall('files_upload') or []:
1511 for file_obj in r_post.getall('files_upload') or []:
1512 content = file_obj.file
1512 content = file_obj.file
1513 filename = file_obj.filename
1513 filename = file_obj.filename
1514
1514
1515 root_path = f_path
1515 root_path = f_path
1516 pure_path = self.create_pure_path(root_path, filename)
1516 pure_path = self.create_pure_path(root_path, filename)
1517 node_path = safe_unicode(bytes(pure_path).lstrip('/'))
1517 node_path = safe_unicode(bytes(pure_path).lstrip('/'))
1518
1518
1519 nodes[node_path] = {
1519 nodes[node_path] = {
1520 'content': content
1520 'content': content
1521 }
1521 }
1522
1522
1523 if not nodes:
1523 if not nodes:
1524 error = 'missing files'
1524 error = 'missing files'
1525 return {
1525 return {
1526 'error': error,
1526 'error': error,
1527 'redirect_url': default_redirect_url
1527 'redirect_url': default_redirect_url
1528 }
1528 }
1529
1529
1530 author = self._rhodecode_db_user.full_contact
1530 author = self._rhodecode_db_user.full_contact
1531
1531
1532 try:
1532 try:
1533 commit = ScmModel().create_nodes(
1533 commit = ScmModel().create_nodes(
1534 user=self._rhodecode_db_user.user_id,
1534 user=self._rhodecode_db_user.user_id,
1535 repo=self.db_repo,
1535 repo=self.db_repo,
1536 message=message,
1536 message=message,
1537 nodes=nodes,
1537 nodes=nodes,
1538 parent_commit=c.commit,
1538 parent_commit=c.commit,
1539 author=author,
1539 author=author,
1540 )
1540 )
            if len(nodes) == 1:
                flash_message = _('Successfully committed 1 new file')
            else:
                flash_message = _('Successfully committed {} new files').format(len(nodes))

            h.flash(flash_message, category='success')

            default_redirect_url = h.route_path(
                'repo_commit', repo_name=self.db_repo_name, commit_id=commit.raw_id)

        except NonRelativePathError:
            log.exception('Non Relative path found')
            error = _('The location specified must be a relative path and must not '
                      'contain .. in the path')
            h.flash(error, category='warning')

            return {
                'error': error,
                'redirect_url': default_redirect_url
            }
        except (NodeError, NodeAlreadyExistsError) as e:
            error = h.escape(safe_str(e))
            h.flash(error, category='error')

            return {
                'error': error,
                'redirect_url': default_redirect_url
            }
        except Exception:
            log.exception('Error occurred during commit')
            error = _('Error occurred during commit')
            h.flash(error, category='error')
            return {
                'error': error,
                'redirect_url': default_redirect_url
            }

        return {
            'error': None,
            'redirect_url': default_redirect_url
        }
@@ -1,289 +1,290 b''
# -*- coding: utf-8 -*-

# Copyright (C) 2011-2020 RhodeCode GmbH
#
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU Affero General Public License, version 3
# (only), as published by the Free Software Foundation.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU Affero General Public License
# along with this program. If not, see <http://www.gnu.org/licenses/>.
#
# This program is dual-licensed. If you wish to learn more about the
# RhodeCode Enterprise Edition, including its added features, Support services,
# and proprietary license terms, please see https://rhodecode.com/licenses/

import logging
import string
import time

import rhodecode

from rhodecode.lib.view_utils import get_format_ref_id
from rhodecode.apps._base import RepoAppView
from rhodecode.config.conf import (LANGUAGES_EXTENSIONS_MAP)
from rhodecode.lib import helpers as h, rc_cache
from rhodecode.lib.utils2 import safe_str, safe_int
from rhodecode.lib.auth import LoginRequired, HasRepoPermissionAnyDecorator
from rhodecode.lib.ext_json import json
from rhodecode.lib.vcs.backends.base import EmptyCommit
from rhodecode.lib.vcs.exceptions import (
    CommitError, EmptyRepositoryError, CommitDoesNotExistError)
from rhodecode.model.db import Statistics, CacheKey, User
from rhodecode.model.meta import Session
from rhodecode.model.scm import ScmModel

log = logging.getLogger(__name__)


class RepoSummaryView(RepoAppView):

    def load_default_context(self):
        c = self._get_local_tmpl_context(include_app_defaults=True)
        c.rhodecode_repo = None
        if not c.repository_requirements_missing:
            c.rhodecode_repo = self.rhodecode_vcs_repo
        return c

    def _load_commits_context(self, c):
        p = safe_int(self.request.GET.get('page'), 1)
        size = safe_int(self.request.GET.get('size'), 10)

        def url_generator(page_num):
            query_params = {
                'page': page_num,
                'size': size
            }
            return h.route_path(
                'repo_summary_commits',
                repo_name=c.rhodecode_db_repo.repo_name, _query=query_params)

-        pre_load = ['author', 'branch', 'date', 'message']
+        pre_load = self.get_commit_preload_attrs()
+
        try:
            collection = self.rhodecode_vcs_repo.get_commits(
                pre_load=pre_load, translate_tags=False)
        except EmptyRepositoryError:
            collection = self.rhodecode_vcs_repo

        c.repo_commits = h.RepoPage(
            collection, page=p, items_per_page=size, url_maker=url_generator)
        page_ids = [x.raw_id for x in c.repo_commits]
        c.comments = self.db_repo.get_comments(page_ids)
        c.statuses = self.db_repo.statuses(page_ids)

    def _prepare_and_set_clone_url(self, c):
        username = ''
        if self._rhodecode_user.username != User.DEFAULT_USER:
            username = safe_str(self._rhodecode_user.username)

        _def_clone_uri = c.clone_uri_tmpl
        _def_clone_uri_id = c.clone_uri_id_tmpl
        _def_clone_uri_ssh = c.clone_uri_ssh_tmpl

        c.clone_repo_url = self.db_repo.clone_url(
            user=username, uri_tmpl=_def_clone_uri)
        c.clone_repo_url_id = self.db_repo.clone_url(
            user=username, uri_tmpl=_def_clone_uri_id)
        c.clone_repo_url_ssh = self.db_repo.clone_url(
            uri_tmpl=_def_clone_uri_ssh, ssh=True)

    @LoginRequired()
    @HasRepoPermissionAnyDecorator(
        'repository.read', 'repository.write', 'repository.admin')
    def summary_commits(self):
        c = self.load_default_context()
        self._prepare_and_set_clone_url(c)
        self._load_commits_context(c)
        return self._get_template_context(c)

    @LoginRequired()
    @HasRepoPermissionAnyDecorator(
        'repository.read', 'repository.write', 'repository.admin')
    def summary(self):
        c = self.load_default_context()

        # Prepare the clone URL
        self._prepare_and_set_clone_url(c)

        # If enabled, get statistics data
        c.show_stats = bool(self.db_repo.enable_statistics)

        stats = Session().query(Statistics) \
            .filter(Statistics.repository == self.db_repo) \
            .scalar()

        c.stats_percentage = 0

        if stats and stats.languages:
            c.no_data = False is self.db_repo.enable_statistics
            lang_stats_d = json.loads(stats.languages)

            # Sort first by decreasing count and second by the file extension,
            # so we have a consistent output.
            lang_stats_items = sorted(lang_stats_d.iteritems(),
                                      key=lambda k: (-k[1], k[0]))[:10]
            lang_stats = [(x, {"count": y,
                               "desc": LANGUAGES_EXTENSIONS_MAP.get(x)})
                          for x, y in lang_stats_items]

            c.trending_languages = json.dumps(lang_stats)
        else:
            c.no_data = True
            c.trending_languages = json.dumps({})

        scm_model = ScmModel()
        c.enable_downloads = self.db_repo.enable_downloads
        c.repository_followers = scm_model.get_followers(self.db_repo)
        c.repository_forks = scm_model.get_forks(self.db_repo)

        # first interaction with the VCS instance after here...
        if c.repository_requirements_missing:
            self.request.override_renderer = \
                'rhodecode:templates/summary/missing_requirements.mako'
            return self._get_template_context(c)

        c.readme_data, c.readme_file = \
            self._get_readme_data(self.db_repo, c.visual.default_renderer)

        # loads the summary commits template context
        self._load_commits_context(c)

        return self._get_template_context(c)

    @LoginRequired()
    @HasRepoPermissionAnyDecorator(
        'repository.read', 'repository.write', 'repository.admin')
    def repo_stats(self):
        show_stats = bool(self.db_repo.enable_statistics)
        repo_id = self.db_repo.repo_id

        landing_commit = self.db_repo.get_landing_commit()
        if isinstance(landing_commit, EmptyCommit):
            return {'size': 0, 'code_stats': {}}

        cache_seconds = safe_int(rhodecode.CONFIG.get('rc_cache.cache_repo.expiration_time'))
        cache_on = cache_seconds > 0

        log.debug(
            'Computing REPO STATS for repo_id %s commit_id `%s` '
            'with caching: %s[TTL: %ss]' % (
                repo_id, landing_commit, cache_on, cache_seconds or 0))

        cache_namespace_uid = 'cache_repo.{}'.format(repo_id)
        region = rc_cache.get_or_create_region('cache_repo', cache_namespace_uid)

        @region.conditional_cache_on_arguments(namespace=cache_namespace_uid,
                                               condition=cache_on)
        def compute_stats(repo_id, commit_id, _show_stats):
            code_stats = {}
            size = 0
            try:
                commit = self.db_repo.get_commit(commit_id)

                for node in commit.get_filenodes_generator():
                    size += node.size
                    if not _show_stats:
                        continue
                    ext = string.lower(node.extension)
                    ext_info = LANGUAGES_EXTENSIONS_MAP.get(ext)
                    if ext_info:
                        if ext in code_stats:
                            code_stats[ext]['count'] += 1
                        else:
                            code_stats[ext] = {"count": 1, "desc": ext_info}
            except (EmptyRepositoryError, CommitDoesNotExistError):
                pass
            return {'size': h.format_byte_size_binary(size),
                    'code_stats': code_stats}

        stats = compute_stats(self.db_repo.repo_id, landing_commit.raw_id, show_stats)
        return stats

    @LoginRequired()
    @HasRepoPermissionAnyDecorator(
        'repository.read', 'repository.write', 'repository.admin')
    def repo_refs_data(self):
        _ = self.request.translate
        self.load_default_context()

        repo = self.rhodecode_vcs_repo
        refs_to_create = [
            (_("Branch"), repo.branches, 'branch'),
            (_("Tag"), repo.tags, 'tag'),
            (_("Bookmark"), repo.bookmarks, 'book'),
        ]
        res = self._create_reference_data(repo, self.db_repo_name, refs_to_create)
        data = {
            'more': False,
            'results': res
        }
        return data

    @LoginRequired()
    @HasRepoPermissionAnyDecorator(
        'repository.read', 'repository.write', 'repository.admin')
    def repo_refs_changelog_data(self):
        _ = self.request.translate
        self.load_default_context()

        repo = self.rhodecode_vcs_repo

        refs_to_create = [
            (_("Branches"), repo.branches, 'branch'),
            (_("Closed branches"), repo.branches_closed, 'branch_closed'),
            # TODO: enable when vcs can handle bookmarks filters
            # (_("Bookmarks"), repo.bookmarks, "book"),
        ]
        res = self._create_reference_data(
            repo, self.db_repo_name, refs_to_create)
        data = {
247 'more': False,
248 'more': False,
248 'results': res
249 'results': res
249 }
250 }
250 return data
251 return data
251
252
252 def _create_reference_data(self, repo, full_repo_name, refs_to_create):
253 def _create_reference_data(self, repo, full_repo_name, refs_to_create):
253 format_ref_id = get_format_ref_id(repo)
254 format_ref_id = get_format_ref_id(repo)
254
255
255 result = []
256 result = []
256 for title, refs, ref_type in refs_to_create:
257 for title, refs, ref_type in refs_to_create:
257 if refs:
258 if refs:
258 result.append({
259 result.append({
259 'text': title,
260 'text': title,
260 'children': self._create_reference_items(
261 'children': self._create_reference_items(
261 repo, full_repo_name, refs, ref_type,
262 repo, full_repo_name, refs, ref_type,
262 format_ref_id),
263 format_ref_id),
263 })
264 })
264 return result
265 return result
265
266
266 def _create_reference_items(self, repo, full_repo_name, refs, ref_type, format_ref_id):
267 def _create_reference_items(self, repo, full_repo_name, refs, ref_type, format_ref_id):
267 result = []
268 result = []
268 is_svn = h.is_svn(repo)
269 is_svn = h.is_svn(repo)
269 for ref_name, raw_id in refs.iteritems():
270 for ref_name, raw_id in refs.iteritems():
270 files_url = self._create_files_url(
271 files_url = self._create_files_url(
271 repo, full_repo_name, ref_name, raw_id, is_svn)
272 repo, full_repo_name, ref_name, raw_id, is_svn)
272 result.append({
273 result.append({
273 'text': ref_name,
274 'text': ref_name,
274 'id': format_ref_id(ref_name, raw_id),
275 'id': format_ref_id(ref_name, raw_id),
275 'raw_id': raw_id,
276 'raw_id': raw_id,
276 'type': ref_type,
277 'type': ref_type,
277 'files_url': files_url,
278 'files_url': files_url,
278 'idx': 0,
279 'idx': 0,
279 })
280 })
280 return result
281 return result
281
282
282 def _create_files_url(self, repo, full_repo_name, ref_name, raw_id, is_svn):
283 def _create_files_url(self, repo, full_repo_name, ref_name, raw_id, is_svn):
283 use_commit_id = '/' in ref_name or is_svn
284 use_commit_id = '/' in ref_name or is_svn
284 return h.route_path(
285 return h.route_path(
285 'repo_files',
286 'repo_files',
286 repo_name=full_repo_name,
287 repo_name=full_repo_name,
287 f_path=ref_name if is_svn else '',
288 f_path=ref_name if is_svn else '',
288 commit_id=raw_id if use_commit_id else ref_name,
289 commit_id=raw_id if use_commit_id else ref_name,
289 _query=dict(at=ref_name))
290 _query=dict(at=ref_name))
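Editor's note on the hunk above: `_create_files_url` pins a ref to its raw commit id whenever the ref name contains a slash (e.g. `feature/x`) or the repo is SVN, because such a name would be ambiguous inside a URL path. A minimal standalone sketch of that selection logic follows; `route_path` here is a hypothetical stand-in for the real `h.route_path` helper, used only so the example runs on its own:

```python
# Hypothetical stand-in for h.route_path, just enough to render a files URL.
def route_path(route, **parts):
    query = parts.pop('_query', {})
    qs = '&'.join('%s=%s' % kv for kv in sorted(query.items()))
    return '/%(repo_name)s/files/%(commit_id)s/%(f_path)s' % parts + ('?' + qs if qs else '')


def files_url(full_repo_name, ref_name, raw_id, is_svn=False):
    # Refs with '/' in the name (e.g. "feature/x") and SVN paths must be
    # addressed by raw commit id; plain refs can be used directly.
    use_commit_id = '/' in ref_name or is_svn
    return route_path(
        'repo_files',
        repo_name=full_repo_name,
        f_path=ref_name if is_svn else '',
        commit_id=raw_id if use_commit_id else ref_name,
        _query=dict(at=ref_name))
```

For a plain branch this yields a readable URL such as `/repo/files/master/?at=master`, while a slashed branch name falls back to the immutable commit id.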
@@ -1,782 +1,785 b''
# -*- coding: utf-8 -*-

# Copyright (C) 2010-2020 RhodeCode GmbH
#
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU Affero General Public License, version 3
# (only), as published by the Free Software Foundation.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU Affero General Public License
# along with this program. If not, see <http://www.gnu.org/licenses/>.
#
# This program is dual-licensed. If you wish to learn more about the
# RhodeCode Enterprise Edition, including its added features, Support services,
# and proprietary license terms, please see https://rhodecode.com/licenses/

import os
import sys
import logging
import collections
import tempfile
import time

from paste.gzipper import make_gzip_middleware
import pyramid.events
from pyramid.wsgi import wsgiapp
from pyramid.authorization import ACLAuthorizationPolicy
from pyramid.config import Configurator
from pyramid.settings import asbool, aslist
from pyramid.httpexceptions import (
    HTTPException, HTTPError, HTTPInternalServerError, HTTPFound, HTTPNotFound)
from pyramid.renderers import render_to_response

from rhodecode.model import meta
from rhodecode.config import patches
from rhodecode.config import utils as config_utils
from rhodecode.config.environment import load_pyramid_environment

import rhodecode.events
from rhodecode.lib.middleware.vcs import VCSMiddleware
from rhodecode.lib.request import Request
from rhodecode.lib.vcs import VCSCommunicationError
from rhodecode.lib.exceptions import VCSServerUnavailable
from rhodecode.lib.middleware.appenlight import wrap_in_appenlight_if_enabled
from rhodecode.lib.middleware.https_fixup import HttpsFixup
from rhodecode.lib.plugins.utils import register_rhodecode_plugin
from rhodecode.lib.utils2 import aslist as rhodecode_aslist, AttributeDict
from rhodecode.lib.exc_tracking import store_exception
from rhodecode.subscribers import (
    scan_repositories_if_enabled, write_js_routes_if_enabled,
    write_metadata_if_needed, write_usage_data)


log = logging.getLogger(__name__)


def is_http_error(response):
    # error which should have traceback
    return response.status_code > 499


def should_load_all():
    """
    Returns if all application components should be loaded. In some cases it's
    desired to skip apps loading for faster shell script execution
    """
    ssh_cmd = os.environ.get('RC_CMD_SSH_WRAPPER')
    if ssh_cmd:
        return False

    return True


def make_pyramid_app(global_config, **settings):
    """
    Constructs the WSGI application based on Pyramid.

    Specials:

    * The application can also be integrated like a plugin via the call to
      `includeme`. This is accompanied with the other utility functions which
      are called. Changing this should be done with great care to not break
      cases when these fragments are assembled from another place.

    """

    # Allows to use format style "{ENV_NAME}" placeholders in the configuration. It
    # will be replaced by the value of the environment variable "NAME" in this case.
    start_time = time.time()
    log.info('Pyramid app config starting')

    debug = asbool(global_config.get('debug'))
    if debug:
        enable_debug()

    environ = {'ENV_{}'.format(key): value for key, value in os.environ.items()}

    global_config = _substitute_values(global_config, environ)
    settings = _substitute_values(settings, environ)

    sanitize_settings_and_apply_defaults(global_config, settings)

    config = Configurator(settings=settings)

    # Apply compatibility patches
    patches.inspect_getargspec()

    load_pyramid_environment(global_config, settings)

    # Static file view comes first
    includeme_first(config)

    includeme(config)

    pyramid_app = config.make_wsgi_app()
    pyramid_app = wrap_app_in_wsgi_middlewares(pyramid_app, config)
    pyramid_app.config = config

    config.configure_celery(global_config['__file__'])

    # creating the app uses a connection - return it after we are done
    meta.Session.remove()
    total_time = time.time() - start_time
    log.info('Pyramid app `%s` created and configured in %.2fs',
             pyramid_app.func_name, total_time)

    return pyramid_app


def not_found_view(request):
    """
    This creates the view which should be registered as not-found-view to
    pyramid.
    """

    if not getattr(request, 'vcs_call', None):
        # handle like regular case with our error_handler
        return error_handler(HTTPNotFound(), request)

    # handle not found view as a vcs call
    settings = request.registry.settings
    ae_client = getattr(request, 'ae_client', None)
    vcs_app = VCSMiddleware(
        HTTPNotFound(), request.registry, settings,
        appenlight_client=ae_client)

    return wsgiapp(vcs_app)(None, request)


def error_handler(exception, request):
    import rhodecode
    from rhodecode.lib import helpers
    from rhodecode.lib.utils2 import str2bool

    rhodecode_title = rhodecode.CONFIG.get('rhodecode_title') or 'RhodeCode'

    base_response = HTTPInternalServerError()
    # prefer original exception for the response since it may have headers set
    if isinstance(exception, HTTPException):
        base_response = exception
    elif isinstance(exception, VCSCommunicationError):
        base_response = VCSServerUnavailable()

    if is_http_error(base_response):
        log.exception(
            'error occurred handling this request for path: %s', request.path)

    error_explanation = base_response.explanation or str(base_response)
    if base_response.status_code == 404:
        error_explanation += " Optionally you don't have permission to access this page."
    c = AttributeDict()
    c.error_message = base_response.status
    c.error_explanation = error_explanation
    c.visual = AttributeDict()

    c.visual.rhodecode_support_url = (
        request.registry.settings.get('rhodecode_support_url') or
        request.route_url('rhodecode_support')
    )
    c.redirect_time = 0
    c.rhodecode_name = rhodecode_title
    if not c.rhodecode_name:
        c.rhodecode_name = 'Rhodecode'

    c.causes = []
    if is_http_error(base_response):
        c.causes.append('Server is overloaded.')
        c.causes.append('Server database connection is lost.')
        c.causes.append('Server expected unhandled error.')

    if hasattr(base_response, 'causes'):
        c.causes = base_response.causes

    c.messages = helpers.flash.pop_messages(request=request)

    exc_info = sys.exc_info()
    c.exception_id = id(exc_info)
    c.show_exception_id = isinstance(base_response, VCSServerUnavailable) \
        or base_response.status_code > 499
    c.exception_id_url = request.route_url(
        'admin_settings_exception_tracker_show', exception_id=c.exception_id)

    if c.show_exception_id:
        store_exception(c.exception_id, exc_info)
        c.exception_debug = str2bool(rhodecode.CONFIG.get('debug'))
        c.exception_config_ini = rhodecode.CONFIG.get('__file__')

    response = render_to_response(
        '/errors/error_document.mako', {'c': c, 'h': helpers}, request=request,
        response=base_response)

    return response


def includeme_first(config):
    # redirect automatic browser favicon.ico requests to correct place
    def favicon_redirect(context, request):
        return HTTPFound(
            request.static_path('rhodecode:public/images/favicon.ico'))

    config.add_view(favicon_redirect, route_name='favicon')
    config.add_route('favicon', '/favicon.ico')

    def robots_redirect(context, request):
        return HTTPFound(
            request.static_path('rhodecode:public/robots.txt'))

    config.add_view(robots_redirect, route_name='robots')
    config.add_route('robots', '/robots.txt')

    config.add_static_view(
        '_static/deform', 'deform:static')
    config.add_static_view(
        '_static/rhodecode', path='rhodecode:public', cache_max_age=3600 * 24)


def includeme(config, auth_resources=None):
    from rhodecode.lib.celerylib.loader import configure_celery
    log.debug('Initializing main includeme from %s', os.path.basename(__file__))
    settings = config.registry.settings
    config.set_request_factory(Request)

    # plugin information
    config.registry.rhodecode_plugins = collections.OrderedDict()

    config.add_directive(
        'register_rhodecode_plugin', register_rhodecode_plugin)

    config.add_directive('configure_celery', configure_celery)

    if asbool(settings.get('appenlight', 'false')):
        config.include('appenlight_client.ext.pyramid_tween')

    load_all = should_load_all()

    # Includes which are required. The application would fail without them.
    config.include('pyramid_mako')
    config.include('rhodecode.lib.rc_beaker')
    config.include('rhodecode.lib.rc_cache')
    config.include('rhodecode.apps._base.navigation')
    config.include('rhodecode.apps._base.subscribers')
    config.include('rhodecode.tweens')
    config.include('rhodecode.authentication')

    if load_all:
        ce_auth_resources = [
            'rhodecode.authentication.plugins.auth_crowd',
            'rhodecode.authentication.plugins.auth_headers',
            'rhodecode.authentication.plugins.auth_jasig_cas',
            'rhodecode.authentication.plugins.auth_ldap',
            'rhodecode.authentication.plugins.auth_pam',
            'rhodecode.authentication.plugins.auth_rhodecode',
            'rhodecode.authentication.plugins.auth_token',
        ]

        # load CE authentication plugins

        if auth_resources:
            ce_auth_resources.extend(auth_resources)

        for resource in ce_auth_resources:
            config.include(resource)

    # Auto discover authentication plugins and include their configuration.
    if asbool(settings.get('auth_plugin.import_legacy_plugins', 'true')):
        from rhodecode.authentication import discover_legacy_plugins
        discover_legacy_plugins(config)

    # apps
    if load_all:
        config.include('rhodecode.api')
        config.include('rhodecode.apps._base')
        config.include('rhodecode.apps.hovercards')
        config.include('rhodecode.apps.ops')
        config.include('rhodecode.apps.channelstream')
        config.include('rhodecode.apps.file_store')
        config.include('rhodecode.apps.admin')
        config.include('rhodecode.apps.login')
        config.include('rhodecode.apps.home')
        config.include('rhodecode.apps.journal')

        config.include('rhodecode.apps.repository')
        config.include('rhodecode.apps.repo_group')
        config.include('rhodecode.apps.user_group')
        config.include('rhodecode.apps.search')
        config.include('rhodecode.apps.user_profile')
        config.include('rhodecode.apps.user_group_profile')
        config.include('rhodecode.apps.my_account')
        config.include('rhodecode.apps.gist')

        config.include('rhodecode.apps.svn_support')
        config.include('rhodecode.apps.ssh_support')
        config.include('rhodecode.apps.debug_style')

    if load_all:
        config.include('rhodecode.integrations')

    config.add_route('rhodecode_support', 'https://rhodecode.com/help/', static=True)
    config.add_translation_dirs('rhodecode:i18n/')
    settings['default_locale_name'] = settings.get('lang', 'en')

    # Add subscribers.
    if load_all:
        config.add_subscriber(scan_repositories_if_enabled,
                              pyramid.events.ApplicationCreated)
        config.add_subscriber(write_metadata_if_needed,
                              pyramid.events.ApplicationCreated)
        config.add_subscriber(write_usage_data,
                              pyramid.events.ApplicationCreated)
        config.add_subscriber(write_js_routes_if_enabled,
                              pyramid.events.ApplicationCreated)

    # request custom methods
    config.add_request_method(
        'rhodecode.lib.partial_renderer.get_partial_renderer',
        'get_partial_renderer')

    config.add_request_method(
        'rhodecode.lib.request_counter.get_request_counter',
        'request_count')

    config.add_request_method(
        'rhodecode.lib._vendor.statsd.get_statsd_client',
        'statsd', reify=True)

    # Set the authorization policy.
    authz_policy = ACLAuthorizationPolicy()
    config.set_authorization_policy(authz_policy)

    # Set the default renderer for HTML templates to mako.
    config.add_mako_renderer('.html')

    config.add_renderer(
        name='json_ext',
        factory='rhodecode.lib.ext_json_renderer.pyramid_ext_json')

    config.add_renderer(
        name='string_html',
        factory='rhodecode.lib.string_renderer.html')

    # include RhodeCode plugins
    includes = aslist(settings.get('rhodecode.includes', []))
    for inc in includes:
        config.include(inc)

    # custom not found view, if our pyramid app doesn't know how to handle
    # the request pass it to potential VCS handling app
    config.add_notfound_view(not_found_view)
    if not settings.get('debugtoolbar.enabled', False):
        # disabled debugtoolbar handle all exceptions via the error_handlers
        config.add_view(error_handler, context=Exception)

    # all errors including 403/404/50X
    config.add_view(error_handler, context=HTTPError)


def wrap_app_in_wsgi_middlewares(pyramid_app, config):
    """
    Apply outer WSGI middlewares around the application.
    """
    registry = config.registry
    settings = registry.settings

    # enable https redirects based on HTTP_X_URL_SCHEME set by proxy
    pyramid_app = HttpsFixup(pyramid_app, settings)

    pyramid_app, _ae_client = wrap_in_appenlight_if_enabled(
        pyramid_app, settings)
    registry.ae_client = _ae_client

    if settings['gzip_responses']:
        pyramid_app = make_gzip_middleware(
            pyramid_app, settings, compress_level=1)

    # this should be the outer most middleware in the wsgi stack since
    # middleware like Routes make database calls
    def pyramid_app_with_cleanup(environ, start_response):
        try:
            return pyramid_app(environ, start_response)
        finally:
            # Dispose current database session and rollback uncommitted
            # transactions.
            meta.Session.remove()

            # In a single threaded mode server, on non sqlite db we should have
            # '0 Current Checked out connections' at the end of a request,
            # if not, then something, somewhere is leaving a connection open
            pool = meta.Base.metadata.bind.engine.pool
            log.debug('sa pool status: %s', pool.status())
            log.debug('Request processing finalized')

    return pyramid_app_with_cleanup


def sanitize_settings_and_apply_defaults(global_config, settings):
    """
    Applies settings defaults and does all type conversion.
421 Applies settings defaults and does all type conversion.
419
422
420 We would move all settings parsing and preparation into this place, so that
423 We would move all settings parsing and preparation into this place, so that
421 we have only one place left which deals with this part. The remaining parts
424 we have only one place left which deals with this part. The remaining parts
422 of the application would start to rely fully on well prepared settings.
425 of the application would start to rely fully on well prepared settings.
423
426
424 This piece would later be split up per topic to avoid a big fat monster
427 This piece would later be split up per topic to avoid a big fat monster
425 function.
428 function.
426 """
429 """
427
430
428 settings.setdefault('rhodecode.edition', 'Community Edition')
431 settings.setdefault('rhodecode.edition', 'Community Edition')
429 settings.setdefault('rhodecode.edition_id', 'CE')
432 settings.setdefault('rhodecode.edition_id', 'CE')
430
433
431 if 'mako.default_filters' not in settings:
434 if 'mako.default_filters' not in settings:
432 # set custom default filters if we don't have it defined
435 # set custom default filters if we don't have it defined
433 settings['mako.imports'] = 'from rhodecode.lib.base import h_filter'
436 settings['mako.imports'] = 'from rhodecode.lib.base import h_filter'
434 settings['mako.default_filters'] = 'h_filter'
437 settings['mako.default_filters'] = 'h_filter'
435
438
436 if 'mako.directories' not in settings:
439 if 'mako.directories' not in settings:
437 mako_directories = settings.setdefault('mako.directories', [
440 mako_directories = settings.setdefault('mako.directories', [
438 # Base templates of the original application
441 # Base templates of the original application
439 'rhodecode:templates',
442 'rhodecode:templates',
440 ])
443 ])
441 log.debug(
444 log.debug(
442 "Using the following Mako template directories: %s",
445 "Using the following Mako template directories: %s",
443 mako_directories)
446 mako_directories)
444
447
445 # NOTE(marcink): fix redis requirement for schema of connection since 3.X
448 # NOTE(marcink): fix redis requirement for schema of connection since 3.X
446 if 'beaker.session.type' in settings and settings['beaker.session.type'] == 'ext:redis':
449 if 'beaker.session.type' in settings and settings['beaker.session.type'] == 'ext:redis':
447 raw_url = settings['beaker.session.url']
450 raw_url = settings['beaker.session.url']
448 if not raw_url.startswith(('redis://', 'rediss://', 'unix://')):
451 if not raw_url.startswith(('redis://', 'rediss://', 'unix://')):
449 settings['beaker.session.url'] = 'redis://' + raw_url
452 settings['beaker.session.url'] = 'redis://' + raw_url
450
453
451 # Default includes, possible to change as a user
454 # Default includes, possible to change as a user
452 pyramid_includes = settings.setdefault('pyramid.includes', [])
455 pyramid_includes = settings.setdefault('pyramid.includes', [])
453 log.debug(
456 log.debug(
454 "Using the following pyramid.includes: %s",
457 "Using the following pyramid.includes: %s",
455 pyramid_includes)
458 pyramid_includes)
456
459
457 # TODO: johbo: Re-think this, usually the call to config.include
460 # TODO: johbo: Re-think this, usually the call to config.include
458 # should allow to pass in a prefix.
461 # should allow to pass in a prefix.
459 settings.setdefault('rhodecode.api.url', '/_admin/api')
462 settings.setdefault('rhodecode.api.url', '/_admin/api')
460 settings.setdefault('__file__', global_config.get('__file__'))
463 settings.setdefault('__file__', global_config.get('__file__'))
461
464
462 # Sanitize generic settings.
465 # Sanitize generic settings.
463 _list_setting(settings, 'default_encoding', 'UTF-8')
466 _list_setting(settings, 'default_encoding', 'UTF-8')
464 _bool_setting(settings, 'is_test', 'false')
467 _bool_setting(settings, 'is_test', 'false')
465 _bool_setting(settings, 'gzip_responses', 'false')
468 _bool_setting(settings, 'gzip_responses', 'false')
466
469
467 # Call split out functions that sanitize settings for each topic.
470 # Call split out functions that sanitize settings for each topic.
468 _sanitize_appenlight_settings(settings)
471 _sanitize_appenlight_settings(settings)
469 _sanitize_vcs_settings(settings)
472 _sanitize_vcs_settings(settings)
470 _sanitize_cache_settings(settings)
473 _sanitize_cache_settings(settings)
471
474
472 # configure instance id
475 # configure instance id
473 config_utils.set_instance_id(settings)
476 config_utils.set_instance_id(settings)
474
477
475 return settings
478 return settings
476
479
477
480
478 def enable_debug():
481 def enable_debug():
479 """
482 """
480 Helper to enable debug on running instance
483 Helper to enable debug on running instance
481 :return:
484 :return:
482 """
485 """
483 import tempfile
486 import tempfile
484 import textwrap
487 import textwrap
485 import logging.config
488 import logging.config
486
489
487 ini_template = textwrap.dedent("""
490 ini_template = textwrap.dedent("""
488 #####################################
491 #####################################
489 ### DEBUG LOGGING CONFIGURATION ####
492 ### DEBUG LOGGING CONFIGURATION ####
490 #####################################
493 #####################################
491 [loggers]
494 [loggers]
492 keys = root, sqlalchemy, beaker, celery, rhodecode, ssh_wrapper
495 keys = root, sqlalchemy, beaker, celery, rhodecode, ssh_wrapper
493
496
494 [handlers]
497 [handlers]
495 keys = console, console_sql
498 keys = console, console_sql
496
499
497 [formatters]
500 [formatters]
498 keys = generic, color_formatter, color_formatter_sql
501 keys = generic, color_formatter, color_formatter_sql
499
502
500 #############
503 #############
501 ## LOGGERS ##
504 ## LOGGERS ##
502 #############
505 #############
503 [logger_root]
506 [logger_root]
504 level = NOTSET
507 level = NOTSET
505 handlers = console
508 handlers = console
506
509
507 [logger_sqlalchemy]
510 [logger_sqlalchemy]
508 level = INFO
511 level = INFO
509 handlers = console_sql
512 handlers = console_sql
510 qualname = sqlalchemy.engine
513 qualname = sqlalchemy.engine
511 propagate = 0
514 propagate = 0
512
515
513 [logger_beaker]
516 [logger_beaker]
514 level = DEBUG
517 level = DEBUG
515 handlers =
518 handlers =
516 qualname = beaker.container
519 qualname = beaker.container
517 propagate = 1
520 propagate = 1
518
521
519 [logger_rhodecode]
522 [logger_rhodecode]
520 level = DEBUG
523 level = DEBUG
521 handlers =
524 handlers =
522 qualname = rhodecode
525 qualname = rhodecode
523 propagate = 1
526 propagate = 1
524
527
525 [logger_ssh_wrapper]
528 [logger_ssh_wrapper]
526 level = DEBUG
529 level = DEBUG
527 handlers =
530 handlers =
528 qualname = ssh_wrapper
531 qualname = ssh_wrapper
529 propagate = 1
532 propagate = 1
530
533
531 [logger_celery]
534 [logger_celery]
532 level = DEBUG
535 level = DEBUG
533 handlers =
536 handlers =
534 qualname = celery
537 qualname = celery
535
538
536
539
537 ##############
540 ##############
538 ## HANDLERS ##
541 ## HANDLERS ##
539 ##############
542 ##############
540
543
541 [handler_console]
544 [handler_console]
542 class = StreamHandler
545 class = StreamHandler
543 args = (sys.stderr, )
546 args = (sys.stderr, )
544 level = DEBUG
547 level = DEBUG
545 formatter = color_formatter
548 formatter = color_formatter
546
549
547 [handler_console_sql]
550 [handler_console_sql]
548 # "level = DEBUG" logs SQL queries and results.
551 # "level = DEBUG" logs SQL queries and results.
549 # "level = INFO" logs SQL queries.
552 # "level = INFO" logs SQL queries.
550 # "level = WARN" logs neither. (Recommended for production systems.)
553 # "level = WARN" logs neither. (Recommended for production systems.)
551 class = StreamHandler
554 class = StreamHandler
552 args = (sys.stderr, )
555 args = (sys.stderr, )
553 level = WARN
556 level = WARN
554 formatter = color_formatter_sql
557 formatter = color_formatter_sql
555
558
556 ################
559 ################
557 ## FORMATTERS ##
560 ## FORMATTERS ##
558 ################
561 ################
559
562
560 [formatter_generic]
563 [formatter_generic]
561 class = rhodecode.lib.logging_formatter.ExceptionAwareFormatter
564 class = rhodecode.lib.logging_formatter.ExceptionAwareFormatter
562 format = %(asctime)s.%(msecs)03d [%(process)d] %(levelname)-5.5s [%(name)s] %(message)s | %(req_id)s
565 format = %(asctime)s.%(msecs)03d [%(process)d] %(levelname)-5.5s [%(name)s] %(message)s | %(req_id)s
563 datefmt = %Y-%m-%d %H:%M:%S
566 datefmt = %Y-%m-%d %H:%M:%S
564
567
565 [formatter_color_formatter]
568 [formatter_color_formatter]
566 class = rhodecode.lib.logging_formatter.ColorRequestTrackingFormatter
569 class = rhodecode.lib.logging_formatter.ColorRequestTrackingFormatter
567 format = %(asctime)s.%(msecs)03d [%(process)d] %(levelname)-5.5s [%(name)s] %(message)s | %(req_id)s
570 format = %(asctime)s.%(msecs)03d [%(process)d] %(levelname)-5.5s [%(name)s] %(message)s | %(req_id)s
568 datefmt = %Y-%m-%d %H:%M:%S
571 datefmt = %Y-%m-%d %H:%M:%S
569
572
570 [formatter_color_formatter_sql]
573 [formatter_color_formatter_sql]
571 class = rhodecode.lib.logging_formatter.ColorFormatterSql
574 class = rhodecode.lib.logging_formatter.ColorFormatterSql
572 format = %(asctime)s.%(msecs)03d [%(process)d] %(levelname)-5.5s [%(name)s] %(message)s
575 format = %(asctime)s.%(msecs)03d [%(process)d] %(levelname)-5.5s [%(name)s] %(message)s
573 datefmt = %Y-%m-%d %H:%M:%S
576 datefmt = %Y-%m-%d %H:%M:%S
574 """)
577 """)
575
578
576 with tempfile.NamedTemporaryFile(prefix='rc_debug_logging_', suffix='.ini',
579 with tempfile.NamedTemporaryFile(prefix='rc_debug_logging_', suffix='.ini',
577 delete=False) as f:
580 delete=False) as f:
578 log.info('Saved Temporary DEBUG config at %s', f.name)
581 log.info('Saved Temporary DEBUG config at %s', f.name)
579 f.write(ini_template)
582 f.write(ini_template)
580
583
581 logging.config.fileConfig(f.name)
584 logging.config.fileConfig(f.name)
582 log.debug('DEBUG MODE ON')
585 log.debug('DEBUG MODE ON')
583 os.remove(f.name)
586 os.remove(f.name)
584
587
585
588
586 def _sanitize_appenlight_settings(settings):
589 def _sanitize_appenlight_settings(settings):
587 _bool_setting(settings, 'appenlight', 'false')
590 _bool_setting(settings, 'appenlight', 'false')
588
591
589
592
590 def _sanitize_vcs_settings(settings):
593 def _sanitize_vcs_settings(settings):
591 """
594 """
592 Applies settings defaults and does type conversion for all VCS related
595 Applies settings defaults and does type conversion for all VCS related
593 settings.
596 settings.
594 """
597 """
595 _string_setting(settings, 'vcs.svn.compatible_version', '')
598 _string_setting(settings, 'vcs.svn.compatible_version', '')
596 _string_setting(settings, 'vcs.hooks.protocol', 'http')
599 _string_setting(settings, 'vcs.hooks.protocol', 'http')
597 _string_setting(settings, 'vcs.hooks.host', '127.0.0.1')
600 _string_setting(settings, 'vcs.hooks.host', '127.0.0.1')
598 _string_setting(settings, 'vcs.scm_app_implementation', 'http')
601 _string_setting(settings, 'vcs.scm_app_implementation', 'http')
599 _string_setting(settings, 'vcs.server', '')
602 _string_setting(settings, 'vcs.server', '')
600 _string_setting(settings, 'vcs.server.protocol', 'http')
603 _string_setting(settings, 'vcs.server.protocol', 'http')
601 _bool_setting(settings, 'startup.import_repos', 'false')
604 _bool_setting(settings, 'startup.import_repos', 'false')
602 _bool_setting(settings, 'vcs.hooks.direct_calls', 'false')
605 _bool_setting(settings, 'vcs.hooks.direct_calls', 'false')
603 _bool_setting(settings, 'vcs.server.enable', 'true')
606 _bool_setting(settings, 'vcs.server.enable', 'true')
604 _bool_setting(settings, 'vcs.start_server', 'false')
607 _bool_setting(settings, 'vcs.start_server', 'false')
605 _list_setting(settings, 'vcs.backends', 'hg, git, svn')
608 _list_setting(settings, 'vcs.backends', 'hg, git, svn')
606 _int_setting(settings, 'vcs.connection_timeout', 3600)
609 _int_setting(settings, 'vcs.connection_timeout', 3600)
607
610
608 # Support legacy values of vcs.scm_app_implementation. Legacy
611 # Support legacy values of vcs.scm_app_implementation. Legacy
609 # configurations may use 'rhodecode.lib.middleware.utils.scm_app_http', or
612 # configurations may use 'rhodecode.lib.middleware.utils.scm_app_http', or
610 # disabled since 4.13 'vcsserver.scm_app' which is now mapped to 'http'.
613 # disabled since 4.13 'vcsserver.scm_app' which is now mapped to 'http'.
611 scm_app_impl = settings['vcs.scm_app_implementation']
614 scm_app_impl = settings['vcs.scm_app_implementation']
612 if scm_app_impl in ['rhodecode.lib.middleware.utils.scm_app_http', 'vcsserver.scm_app']:
615 if scm_app_impl in ['rhodecode.lib.middleware.utils.scm_app_http', 'vcsserver.scm_app']:
613 settings['vcs.scm_app_implementation'] = 'http'
616 settings['vcs.scm_app_implementation'] = 'http'
614
617
615
618
616 def _sanitize_cache_settings(settings):
619 def _sanitize_cache_settings(settings):
617 temp_store = tempfile.gettempdir()
620 temp_store = tempfile.gettempdir()
618 default_cache_dir = os.path.join(temp_store, 'rc_cache')
621 default_cache_dir = os.path.join(temp_store, 'rc_cache')
619
622
620 # save default, cache dir, and use it for all backends later.
623 # save default, cache dir, and use it for all backends later.
621 default_cache_dir = _string_setting(
624 default_cache_dir = _string_setting(
622 settings,
625 settings,
623 'cache_dir',
626 'cache_dir',
624 default_cache_dir, lower=False, default_when_empty=True)
627 default_cache_dir, lower=False, default_when_empty=True)
625
628
626 # ensure we have our dir created
629 # ensure we have our dir created
627 if not os.path.isdir(default_cache_dir):
630 if not os.path.isdir(default_cache_dir):
628 os.makedirs(default_cache_dir, mode=0o755)
631 os.makedirs(default_cache_dir, mode=0o755)
629
632
630 # exception store cache
633 # exception store cache
631 _string_setting(
634 _string_setting(
632 settings,
635 settings,
633 'exception_tracker.store_path',
636 'exception_tracker.store_path',
634 temp_store, lower=False, default_when_empty=True)
637 temp_store, lower=False, default_when_empty=True)
635 _bool_setting(
638 _bool_setting(
636 settings,
639 settings,
637 'exception_tracker.send_email',
640 'exception_tracker.send_email',
638 'false')
641 'false')
639 _string_setting(
642 _string_setting(
640 settings,
643 settings,
641 'exception_tracker.email_prefix',
644 'exception_tracker.email_prefix',
642 '[RHODECODE ERROR]', lower=False, default_when_empty=True)
645 '[RHODECODE ERROR]', lower=False, default_when_empty=True)
643
646
644 # cache_perms
647 # cache_perms
645 _string_setting(
648 _string_setting(
646 settings,
649 settings,
647 'rc_cache.cache_perms.backend',
650 'rc_cache.cache_perms.backend',
648 'dogpile.cache.rc.file_namespace', lower=False)
651 'dogpile.cache.rc.file_namespace', lower=False)
649 _int_setting(
652 _int_setting(
650 settings,
653 settings,
651 'rc_cache.cache_perms.expiration_time',
654 'rc_cache.cache_perms.expiration_time',
652 60)
655 60)
653 _string_setting(
656 _string_setting(
654 settings,
657 settings,
655 'rc_cache.cache_perms.arguments.filename',
658 'rc_cache.cache_perms.arguments.filename',
656 os.path.join(default_cache_dir, 'rc_cache_1'), lower=False)
659 os.path.join(default_cache_dir, 'rc_cache_1'), lower=False)
657
660
658 # cache_repo
661 # cache_repo
659 _string_setting(
662 _string_setting(
660 settings,
663 settings,
661 'rc_cache.cache_repo.backend',
664 'rc_cache.cache_repo.backend',
662 'dogpile.cache.rc.file_namespace', lower=False)
665 'dogpile.cache.rc.file_namespace', lower=False)
663 _int_setting(
666 _int_setting(
664 settings,
667 settings,
665 'rc_cache.cache_repo.expiration_time',
668 'rc_cache.cache_repo.expiration_time',
666 60)
669 60)
667 _string_setting(
670 _string_setting(
668 settings,
671 settings,
669 'rc_cache.cache_repo.arguments.filename',
672 'rc_cache.cache_repo.arguments.filename',
670 os.path.join(default_cache_dir, 'rc_cache_2'), lower=False)
673 os.path.join(default_cache_dir, 'rc_cache_2'), lower=False)
671
674
672 # cache_license
675 # cache_license
673 _string_setting(
676 _string_setting(
674 settings,
677 settings,
675 'rc_cache.cache_license.backend',
678 'rc_cache.cache_license.backend',
676 'dogpile.cache.rc.file_namespace', lower=False)
679 'dogpile.cache.rc.file_namespace', lower=False)
677 _int_setting(
680 _int_setting(
678 settings,
681 settings,
679 'rc_cache.cache_license.expiration_time',
682 'rc_cache.cache_license.expiration_time',
680 5*60)
683 5*60)
681 _string_setting(
684 _string_setting(
682 settings,
685 settings,
683 'rc_cache.cache_license.arguments.filename',
686 'rc_cache.cache_license.arguments.filename',
684 os.path.join(default_cache_dir, 'rc_cache_3'), lower=False)
687 os.path.join(default_cache_dir, 'rc_cache_3'), lower=False)
685
688
686 # cache_repo_longterm memory, 96H
689 # cache_repo_longterm memory, 96H
687 _string_setting(
690 _string_setting(
688 settings,
691 settings,
689 'rc_cache.cache_repo_longterm.backend',
692 'rc_cache.cache_repo_longterm.backend',
690 'dogpile.cache.rc.memory_lru', lower=False)
693 'dogpile.cache.rc.memory_lru', lower=False)
691 _int_setting(
694 _int_setting(
692 settings,
695 settings,
693 'rc_cache.cache_repo_longterm.expiration_time',
696 'rc_cache.cache_repo_longterm.expiration_time',
694 345600)
697 345600)
695 _int_setting(
698 _int_setting(
696 settings,
699 settings,
697 'rc_cache.cache_repo_longterm.max_size',
700 'rc_cache.cache_repo_longterm.max_size',
698 10000)
701 10000)
699
702
700 # sql_cache_short
703 # sql_cache_short
701 _string_setting(
704 _string_setting(
702 settings,
705 settings,
703 'rc_cache.sql_cache_short.backend',
706 'rc_cache.sql_cache_short.backend',
704 'dogpile.cache.rc.memory_lru', lower=False)
707 'dogpile.cache.rc.memory_lru', lower=False)
705 _int_setting(
708 _int_setting(
706 settings,
709 settings,
707 'rc_cache.sql_cache_short.expiration_time',
710 'rc_cache.sql_cache_short.expiration_time',
708 30)
711 30)
709 _int_setting(
712 _int_setting(
710 settings,
713 settings,
711 'rc_cache.sql_cache_short.max_size',
714 'rc_cache.sql_cache_short.max_size',
712 10000)
715 10000)
713
716
714
717
715 def _int_setting(settings, name, default):
718 def _int_setting(settings, name, default):
716 settings[name] = int(settings.get(name, default))
719 settings[name] = int(settings.get(name, default))
717 return settings[name]
720 return settings[name]
718
721
719
722
720 def _bool_setting(settings, name, default):
723 def _bool_setting(settings, name, default):
721 input_val = settings.get(name, default)
724 input_val = settings.get(name, default)
722 if isinstance(input_val, unicode):
725 if isinstance(input_val, unicode):
723 input_val = input_val.encode('utf8')
726 input_val = input_val.encode('utf8')
724 settings[name] = asbool(input_val)
727 settings[name] = asbool(input_val)
725 return settings[name]
728 return settings[name]
726
729
727
730
728 def _list_setting(settings, name, default):
731 def _list_setting(settings, name, default):
729 raw_value = settings.get(name, default)
732 raw_value = settings.get(name, default)
730
733
731 old_separator = ','
734 old_separator = ','
732 if old_separator in raw_value:
735 if old_separator in raw_value:
733 # If we get a comma separated list, pass it to our own function.
736 # If we get a comma separated list, pass it to our own function.
734 settings[name] = rhodecode_aslist(raw_value, sep=old_separator)
737 settings[name] = rhodecode_aslist(raw_value, sep=old_separator)
735 else:
738 else:
736 # Otherwise we assume it uses pyramids space/newline separation.
739 # Otherwise we assume it uses pyramids space/newline separation.
737 settings[name] = aslist(raw_value)
740 settings[name] = aslist(raw_value)
738 return settings[name]
741 return settings[name]
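
The two list formats accepted by `_list_setting` above can be sketched in a standalone snippet; `parse_list_setting` is an illustrative stand-in for the real pyramid `aslist` / `rhodecode_aslist` helpers, not the actual implementation:

```python
def parse_list_setting(raw_value):
    """Parse a list-valued setting, mirroring _list_setting's branching."""
    if ',' in raw_value:
        # comma separated form, e.g. 'hg, git, svn'
        return [part.strip() for part in raw_value.split(',')]
    # otherwise pyramid-style whitespace/newline separation
    return raw_value.split()

print(parse_list_setting('hg, git, svn'))  # ['hg', 'git', 'svn']
print(parse_list_setting('hg git svn'))    # ['hg', 'git', 'svn']
```

Both spellings are accepted so older comma-separated ini files keep working alongside pyramid's native whitespace-separated style.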


def _string_setting(settings, name, default, lower=True, default_when_empty=False):
    value = settings.get(name, default)

    if default_when_empty and not value:
        # use default value when value is empty
        value = default

    if lower:
        value = value.lower()
    settings[name] = value
    return settings[name]


def _substitute_values(mapping, substitutions):
    result = {}

    try:
        for key, value in mapping.items():
            # initialize without substitution first
            result[key] = value

            # Note: Cannot use regular replacements, since they would clash
            # with the implementation of ConfigParser. Using "format" instead.
            try:
                result[key] = value.format(**substitutions)
            except KeyError as e:
                env_var = '{}'.format(e.args[0])

                msg = 'Failed to substitute: `{key}={{{var}}}` with environment entry. ' \
                      'Make sure your environment has {var} set, or remove this ' \
                      'variable from config file'.format(key=key, var=env_var)

                if env_var.startswith('ENV_'):
                    raise ValueError(msg)
                else:
                    log.warning(msg)

    except ValueError as e:
        log.warning('Failed to substitute ENV variable: %s', e)
        result = mapping

    return result
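
The substitution behavior of `_substitute_values` can be shown with a minimal standalone sketch: setting values may contain `{PLACEHOLDER}` markers that are filled via `str.format` from a substitutions dict (such as the process environment). The function name and the simplified error handling here are illustrative; the real helper also logs warnings and rejects missing `ENV_`-prefixed variables:

```python
def substitute_values(mapping, substitutions):
    """Fill {PLACEHOLDER} markers in setting values from `substitutions`."""
    result = {}
    for key, value in mapping.items():
        result[key] = value  # keep the raw value as a fallback
        try:
            result[key] = value.format(**substitutions)
        except KeyError:
            # unknown placeholder: leave the raw value in place
            pass
    return result

settings = {'db_url': 'postgres://{DB_HOST}/rc', 'plain': 'no-markers'}
env = {'DB_HOST': 'localhost:5432'}
print(substitute_values(settings, env))
# {'db_url': 'postgres://localhost:5432/rc', 'plain': 'no-markers'}
```

Using `format` rather than `%`-style interpolation avoids clashing with ConfigParser's own `%(...)s` substitution syntax, as the comment in the function notes.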
@@ -1,106 +1,106 @@
1 # -*- coding: utf-8 -*-
1 # -*- coding: utf-8 -*-
2
2
3 # Copyright (C) 2010-2020 RhodeCode GmbH
3 # Copyright (C) 2010-2020 RhodeCode GmbH
4 #
4 #
5 # This program is free software: you can redistribute it and/or modify
5 # This program is free software: you can redistribute it and/or modify
6 # it under the terms of the GNU Affero General Public License, version 3
6 # it under the terms of the GNU Affero General Public License, version 3
7 # (only), as published by the Free Software Foundation.
7 # (only), as published by the Free Software Foundation.
8 #
8 #
9 # This program is distributed in the hope that it will be useful,
9 # This program is distributed in the hope that it will be useful,
10 # but WITHOUT ANY WARRANTY; without even the implied warranty of
10 # but WITHOUT ANY WARRANTY; without even the implied warranty of
11 # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
11 # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
12 # GNU General Public License for more details.
12 # GNU General Public License for more details.
13 #
13 #
14 # You should have received a copy of the GNU Affero General Public License
14 # You should have received a copy of the GNU Affero General Public License
15 # along with this program. If not, see <http://www.gnu.org/licenses/>.
15 # along with this program. If not, see <http://www.gnu.org/licenses/>.
16 #
16 #
17 # This program is dual-licensed. If you wish to learn more about the
17 # This program is dual-licensed. If you wish to learn more about the
18 # RhodeCode Enterprise Edition, including its added features, Support services,
18 # RhodeCode Enterprise Edition, including its added features, Support services,
19 # and proprietary license terms, please see https://rhodecode.com/licenses/
19 # and proprietary license terms, please see https://rhodecode.com/licenses/
20
20
21 """
21 """
22 Single source for redirection links.
22 Single source for redirection links.
23
23
24 Goal of this module is to provide a single source of truth regarding external
24 Goal of this module is to provide a single source of truth regarding external
25 links. The data inside this module is used to configure the routing
25 links. The data inside this module is used to configure the routing
26 system of Enterprise and it is used also as a base to check if this data
26 system of Enterprise and it is used also as a base to check if this data
27 and our server configuration are in sync.
27 and our server configuration are in sync.
28
28
29 .. py:data:: link_config
29 .. py:data:: link_config
30
30
31 Contains the configuration for external links. Each item is supposed to be
31 Contains the configuration for external links. Each item is supposed to be
32 a `dict` like this example::
32 a `dict` like this example::
33
33
34 {"name": "url_name",
34 {"name": "url_name",
35 "target": "https://rhodecode.com/r1/enterprise/keyword/",
35 "target": "https://rhodecode.com/r1/enterprise/keyword/",
36 "external_target": "https://example.com/some-page.html",
36 "external_target": "https://example.com/some-page.html",
37 }
37 }
38
38
39 then you can retrieve the url by simply calling the URL function:
39 then you can retrieve the url by simply calling the URL function:
40
40
41 `h.route_path('url_name')`
41 `h.route_path('url_name')`
42
42
43 The redirection must be first implemented in our servers before
43 The redirection must be first implemented in our servers before
44 you can see it working.
44 you can see it working.
45 """
45 """
46 # pragma: no cover
46 # pragma: no cover
47 from __future__ import unicode_literals
47 from __future__ import unicode_literals
48
48
49 link_config = [
49 link_config = [
50 {
50 {
51 "name": "enterprise_docs",
51 "name": "enterprise_docs",
52 "target": "https://rhodecode.com/r1/enterprise/docs/",
52 "target": "https://rhodecode.com/r1/enterprise/docs/",
53 "external_target": "https://docs.rhodecode.com/RhodeCode-Enterprise/",
53 "external_target": "https://docs.rhodecode.com/RhodeCode-Enterprise/",
54 },
54 },
55 {
55 {
56 "name": "enterprise_log_file_locations",
56 "name": "enterprise_log_file_locations",
57 "target": "https://rhodecode.com/r1/enterprise/docs/admin-system-overview/",
57 "target": "https://rhodecode.com/r1/enterprise/docs/admin-system-overview/",
58 "external_target": "https://docs.rhodecode.com/RhodeCode-Enterprise/admin/system-overview.html#log-files",
58 "external_target": "https://docs.rhodecode.com/RhodeCode-Enterprise/admin/system-overview.html#log-files",
59 },
59 },
60 {
60 {
61 "name": "enterprise_issue_tracker_settings",
61 "name": "enterprise_issue_tracker_settings",
62 "target": "https://rhodecode.com/r1/enterprise/docs/issue-trackers-overview/",
62 "target": "https://rhodecode.com/r1/enterprise/docs/issue-trackers-overview/",
63 "external_target": "https://docs.rhodecode.com/RhodeCode-Enterprise/issue-trackers/issue-trackers.html",
63 "external_target": "https://docs.rhodecode.com/RhodeCode-Enterprise/issue-trackers/issue-trackers.html",
64 },
64 },
65 {
65 {
66 "name": "enterprise_svn_setup",
66 "name": "enterprise_svn_setup",
67 "target": "https://rhodecode.com/r1/enterprise/docs/svn-setup/",
67 "target": "https://rhodecode.com/r1/enterprise/docs/svn-setup/",
68 "external_target": "https://docs.rhodecode.com/RhodeCode-Enterprise/admin/svn-http.html",
68 "external_target": "https://docs.rhodecode.com/RhodeCode-Enterprise/admin/svn-http.html",
69 },
69 },
70 {
70 {
71 "name": "enterprise_license_convert_from_old",
        "name": "enterprise_license_convert_from_old",
        "target": "https://rhodecode.com/r1/enterprise/convert-license/",
        "external_target": "https://rhodecode.com/u/license-upgrade",
    },
    {
        "name": "rst_help",
        "target": "http://docutils.sourceforge.io/docs/user/rst/quickref.html",
        "external_target": "https://docutils.sourceforge.io/docs/user/rst/quickref.html",
    },
    {
        "name": "markdown_help",
        "target": "https://daringfireball.net/projects/markdown/syntax",
        "external_target": "https://daringfireball.net/projects/markdown/syntax",
    },
    {
        "name": "rhodecode_official",
        "target": "https://rhodecode.com",
        "external_target": "https://rhodecode.com/",
    },
    {
        "name": "rhodecode_support",
        "target": "https://rhodecode.com/help/",
        "external_target": "https://rhodecode.com/support",
    },
    {
        "name": "rhodecode_translations",
        "target": "https://rhodecode.com/translate/enterprise",
        "external_target": "https://explore.transifex.com/rhodecode/RhodeCode/",
    },

]


def connect_redirection_links(config):
    for link in link_config:
        config.add_route(link['name'], link['target'], static=True)
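The helper above registers each configured link as a static Pyramid route, so templates can generate these URLs by name without a matching view. A minimal sketch of what it does, using a stub in place of Pyramid's `Configurator` (the stub class and the single-entry sample config are illustrative, not part of the original module):

```python
# Stub standing in for pyramid.config.Configurator (illustrative only):
# it simply records the routes that connect_redirection_links would add.
class StubConfig(object):
    def __init__(self):
        self.routes = []

    def add_route(self, name, pattern, static=False):
        self.routes.append((name, pattern, static))


# A one-entry sample of the link_config structure used above.
link_config = [
    {"name": "rst_help",
     "target": "http://docutils.sourceforge.io/docs/user/rst/quickref.html",
     "external_target": "https://docutils.sourceforge.io/docs/user/rst/quickref.html"},
]


def connect_redirection_links(config):
    # static=True means the route exists for URL generation only;
    # requests are never matched against it.
    for link in link_config:
        config.add_route(link['name'], link['target'], static=True)


config = StubConfig()
connect_redirection_links(config)
```

With the real Pyramid `Configurator`, `config.add_route(..., static=True)` behaves the same way: the route participates in `route_url()` generation but not in request matching.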
@@ -1,136 +1,136 @@
##########################
To create a new language
##########################

Translations are available on transifex under::

    https://explore.transifex.com/rhodecode/RhodeCode/

Log into transifex and request a new language translation.

manual creation of new language
+++++++++++++++++++++++++++++++

**Step 1:** If you don't have a dev environment set up, download the sources of
RhodeCode. Run::

    python setup.py develop

Otherwise, update to the latest stable commit.


**Note:** The following commands are intended to be run in the main repo directory
on Linux; running them in nix-shell is fine.


**Step 2:** Make sure all translation strings are extracted by running::

    python setup.py extract_messages


**Step 3:** Create a new language by executing the following command::

    python setup.py init_catalog -l <new_language_code>

This creates a new language under the directory rhodecode/i18n/<new_language_code>


**Step 4:** Be sure to update the transifex mapping, located at rhodecode/.tx/config
The path to the new language should be identical to the others, using
the new language code instead.


**Step 5:** Verify the translation file and fix any errors.
This can be done by executing::

    msgfmt -f -c rhodecode/i18n/<new_language_code>/LC_MESSAGES/<updated_file.po>

Edit rhodecode/i18n/<new_language_code>/LC_MESSAGES/rhodecode.po for errors
with your favorite file editor (the errors will tell you what's missing; check
other rhodecode.po files in existing languages for clues).


**Step 6:** Finally, compile the translations::

    python setup.py compile_catalog -l <new_language_code>

**Note:** Make sure there is not a .mo file in the top-level folder!


##########################
To update translations
##########################

**Note:** This is a different process, not needed when you are adding a translation.

**Step 1:** Fetch the latest version of the strings for translation by running::

    python setup.py extract_messages


**Step 2:** Update the rhodecode.po file using::

    python setup.py update_catalog -l <new_language_code>


**Step 3:** Update the po file as outlined in step 5 for new translations (see above).


**Step 4:** Compile the translations as outlined in step 6 for new translations (see above).

**Note:** Make sure there is not a .mo file in the top-level folder!


###########################
Javascript translations
###########################

First find all translations used in JS by running the command::

    grep "_TM\[.*\]" -R . -oh | sort -u

Then compare the result against the file scripts/tasks/file_generation/js_i18n_data.py.
Add or remove strings in that file, remembering to surround them with _(...),
otherwise they won't get added to the catalog.

If the file changed, regenerate the catalog by following the
instructions above ('to update translations').

Once the new strings are translated and the catalogs compiled with msgfmt, you
can generate the Javascript translation files. To do so, just run the command::

    invoke -r scripts/ generate.js-i18n

This generates one JS file per detected language in the folder
rhodecode/public/js/rhodecode/i18n/{lang}.js

Finally, commit the changes.


########################
Testing translations
########################

Edit the test.ini file, setting the lang attribute to::

    lang=<new_language_code>

Run RhodeCode tests by executing::

    nosetests

###########################
Workflow for Transifex
###########################

#0 new language:
   edit .tx/config and add the language
#1 extract messages to generate an updated pot file
   python setup.py extract_messages
#2 push the source .pot file to Transifex
   tx push -s
#3 when translations are ok, pull the changes
   tx pull
#4 compile the languages
   python setup.py compile_catalog
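The msgfmt verification in step 5 can also be approximated from Python. The sketch below performs a much weaker sanity pass over a .po catalog, counting msgid/msgstr pairs only; the function name and sample catalog are illustrative, and msgfmt remains the authoritative check:

```python
import re


def count_po_entries(po_text):
    """Count msgid and msgstr lines in a .po catalog (a very rough check:
    a mismatch means the catalog is broken, a match proves nothing more)."""
    msgids = re.findall(r'^msgid ', po_text, flags=re.M)
    msgstrs = re.findall(r'^msgstr ', po_text, flags=re.M)
    return len(msgids), len(msgstrs)


# Illustrative two-entry catalog fragment.
sample = '''msgid "Repository"
msgstr "Repositorium"

msgid "Branch"
msgstr "Zweig"
'''

ids, strs = count_po_entries(sample)
assert ids == strs == 2  # every msgid has a matching msgstr
```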
@@ -1,390 +1,390 @@
import sys
import threading
import weakref
from base64 import b64encode
from logging import getLogger
from os import urandom

from redis import StrictRedis

__version__ = '3.7.0'

loggers = {
    k: getLogger("rhodecode." + ".".join((__name__, k)))
    for k in [
        "acquire",
        "refresh.thread.start",
        "refresh.thread.stop",
        "refresh.thread.exit",
        "refresh.start",
        "refresh.shutdown",
        "refresh.exit",
        "release",
    ]
}

PY3 = sys.version_info[0] == 3

if PY3:
    text_type = str
    binary_type = bytes
else:
    text_type = unicode  # noqa
    binary_type = str


# Check if the id matches. If not, return an error code.
UNLOCK_SCRIPT = b"""
if redis.call("get", KEYS[1]) ~= ARGV[1] then
    return 1
else
    redis.call("del", KEYS[2])
    redis.call("lpush", KEYS[2], 1)
    redis.call("pexpire", KEYS[2], ARGV[2])
    redis.call("del", KEYS[1])
    return 0
end
"""

# Covers both cases: when the key doesn't exist, and when it doesn't equal the lock's id
EXTEND_SCRIPT = b"""
if redis.call("get", KEYS[1]) ~= ARGV[1] then
    return 1
elseif redis.call("ttl", KEYS[1]) < 0 then
    return 2
else
    redis.call("expire", KEYS[1], ARGV[2])
    return 0
end
"""

RESET_SCRIPT = b"""
redis.call('del', KEYS[2])
redis.call('lpush', KEYS[2], 1)
redis.call('pexpire', KEYS[2], ARGV[2])
return redis.call('del', KEYS[1])
"""

RESET_ALL_SCRIPT = b"""
local locks = redis.call('keys', 'lock:*')
local signal
for _, lock in pairs(locks) do
    signal = 'lock-signal:' .. string.sub(lock, 6)
    redis.call('del', signal)
    redis.call('lpush', signal, 1)
    redis.call('expire', signal, 1)
    redis.call('del', lock)
end
return #locks
"""

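The UNLOCK_SCRIPT above deletes the lock key only when the caller's id matches the stored owner, then wakes one blocked waiter via the signal list. The same control flow can be sketched in plain Python, with a dict standing in for Redis (a simplified single-process model; the real script runs atomically inside the Redis server):

```python
# Simplified stand-in for UNLOCK_SCRIPT: delete the lock key only when
# the caller's id matches the stored owner, and signal one waiter.
# (A dict models Redis here; this sketch ignores the pexpire on the
# signal list, which the real script applies.)
def unlock(store, lock_key, signal_key, holder_id):
    if store.get(lock_key) != holder_id:
        return 1  # error: caller does not hold the lock (or it expired)
    store[signal_key] = [1]  # lpush: lets a blocked BLPOP waiter proceed
    del store[lock_key]
    return 0


store = {"lock:job": "abc"}
unlock(store, "lock:job", "lock-signal:job", "nope")  # returns 1: wrong owner, key kept
unlock(store, "lock:job", "lock-signal:job", "abc")   # returns 0: key removed, waiter signalled
```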
class AlreadyAcquired(RuntimeError):
    pass


class NotAcquired(RuntimeError):
    pass


class AlreadyStarted(RuntimeError):
    pass


class TimeoutNotUsable(RuntimeError):
    pass


class InvalidTimeout(RuntimeError):
    pass


class TimeoutTooLarge(RuntimeError):
    pass


class NotExpirable(RuntimeError):
    pass


class Lock(object):
    """
    A Lock context manager implemented via redis SETNX/BLPOP.
    """
    unlock_script = None
    extend_script = None
    reset_script = None
    reset_all_script = None

    def __init__(self, redis_client, name, expire=None, id=None, auto_renewal=False, strict=True, signal_expire=1000):
        """
        :param redis_client:
            An instance of :class:`~StrictRedis`.
        :param name:
            The name (redis key) the lock should have.
        :param expire:
            The lock expiry time in seconds. If left at the default (None)
            the lock will not expire.
        :param id:
            The ID (redis value) the lock should have. A random value is
            generated when left at the default.

            Note that if you specify this then the lock is marked as "held". Acquires
            won't be possible.
        :param auto_renewal:
            If set to ``True``, Lock will automatically renew the lock so that it
            doesn't expire for as long as the lock is held (acquire() called
            or running in a context manager).

            Implementation note: Renewal will happen using a daemon thread with
            an interval of ``expire*2/3``. If wishing to use a different renewal
            time, subclass Lock, call ``super().__init__()`` then set
            ``self._lock_renewal_interval`` to your desired interval.
        :param strict:
            If set ``True`` then the ``redis_client`` needs to be an instance of ``redis.StrictRedis``.
        :param signal_expire:
            Advanced option to override signal list expiration in milliseconds. Increase it for very slow clients. Default: ``1000``.
        """
        if strict and not isinstance(redis_client, StrictRedis):
            raise ValueError("redis_client must be instance of StrictRedis. "
                             "Use strict=False if you know what you're doing.")
        if auto_renewal and expire is None:
            raise ValueError("Expire may not be None when auto_renewal is set")

        self._client = redis_client

        if expire:
            expire = int(expire)
            if expire < 0:
                raise ValueError("A negative expire is not acceptable.")
        else:
            expire = None
        self._expire = expire

        self._signal_expire = signal_expire
        if id is None:
            self._id = b64encode(urandom(18)).decode('ascii')
        elif isinstance(id, binary_type):
            try:
                self._id = id.decode('ascii')
            except UnicodeDecodeError:
                self._id = b64encode(id).decode('ascii')
        elif isinstance(id, text_type):
            self._id = id
        else:
            raise TypeError("Incorrect type for `id`. Must be bytes/str not %s." % type(id))
        self._name = 'lock:' + name
        self._signal = 'lock-signal:' + name
        self._lock_renewal_interval = (float(expire) * 2 / 3
                                       if auto_renewal
                                       else None)
        self._lock_renewal_thread = None

        self.register_scripts(redis_client)

    @classmethod
    def register_scripts(cls, redis_client):
        global reset_all_script
        if reset_all_script is None:
            reset_all_script = redis_client.register_script(RESET_ALL_SCRIPT)
        cls.unlock_script = redis_client.register_script(UNLOCK_SCRIPT)
        cls.extend_script = redis_client.register_script(EXTEND_SCRIPT)
        cls.reset_script = redis_client.register_script(RESET_SCRIPT)
        cls.reset_all_script = redis_client.register_script(RESET_ALL_SCRIPT)

    @property
    def _held(self):
        return self.id == self.get_owner_id()

    def reset(self):
        """
        Forcibly deletes the lock. Use this with care.
        """
        self.reset_script(client=self._client, keys=(self._name, self._signal), args=(self.id, self._signal_expire))

    @property
    def id(self):
        return self._id

    def get_owner_id(self):
        owner_id = self._client.get(self._name)
        if isinstance(owner_id, binary_type):
            owner_id = owner_id.decode('ascii', 'replace')
        return owner_id

    def acquire(self, blocking=True, timeout=None):
        """
        :param blocking:
            Boolean value specifying whether the lock should be blocking or not.
        :param timeout:
            An integer value specifying the maximum number of seconds to block.
        """
        logger = loggers["acquire"]

        logger.debug("Getting blocking: %s acquire on %r ...", blocking, self._name)

        if self._held:
            owner_id = self.get_owner_id()
            raise AlreadyAcquired("Already acquired from this Lock instance. Lock id: {}".format(owner_id))

        if not blocking and timeout is not None:
            raise TimeoutNotUsable("Timeout cannot be used if blocking=False")

        if timeout:
            timeout = int(timeout)
            if timeout < 0:
                raise InvalidTimeout("Timeout (%d) cannot be less than 0" % timeout)

            if self._expire and not self._lock_renewal_interval and timeout > self._expire:
                raise TimeoutTooLarge("Timeout (%d) cannot be greater than expire (%d)" % (timeout, self._expire))

        busy = True
        blpop_timeout = timeout or self._expire or 0
        timed_out = False
        while busy:
            busy = not self._client.set(self._name, self._id, nx=True, ex=self._expire)
            if busy:
                if timed_out:
                    return False
                elif blocking:
                    timed_out = not self._client.blpop(self._signal, blpop_timeout) and timeout
                else:
                    logger.warning("Failed to get %r.", self._name)
                    return False

        logger.info("Got lock for %r.", self._name)
        if self._lock_renewal_interval is not None:
            self._start_lock_renewer()
        return True

    def extend(self, expire=None):
        """Extends expiration time of the lock.

        :param expire:
            New expiration time. If ``None``, the `expire` value provided
            at lock initialization is used.
        """
        if expire:
            expire = int(expire)
            if expire < 0:
                raise ValueError("A negative expire is not acceptable.")
        elif self._expire is not None:
            expire = self._expire
        else:
            raise TypeError(
                "To extend a lock 'expire' must be provided as an "
                "argument to extend() method or at initialization time."
            )

        error = self.extend_script(client=self._client, keys=(self._name, self._signal), args=(self._id, expire))
        if error == 1:
            raise NotAcquired("Lock %s is not acquired or it already expired." % self._name)
        elif error == 2:
            raise NotExpirable("Lock %s has no assigned expiration time" % self._name)
        elif error:
            raise RuntimeError("Unsupported error code %s from EXTEND script" % error)

    @staticmethod
    def _lock_renewer(lockref, interval, stop):
        """
        Renew the lock key in redis every `interval` seconds until the
        `stop` event is set (or the lock object is garbage collected).
        """
        while not stop.wait(timeout=interval):
            loggers["refresh.thread.start"].debug("Refreshing lock")
            lock = lockref()
            if lock is None:
                loggers["refresh.thread.stop"].debug(
                    "The lock no longer exists, stopping lock refreshing"
                )
                break
            lock.extend(expire=lock._expire)
            del lock
        loggers["refresh.thread.exit"].debug("Exit requested, stopping lock refreshing")

    def _start_lock_renewer(self):
        """
        Starts the lock refresher thread.
        """
        if self._lock_renewal_thread is not None:
            raise AlreadyStarted("Lock refresh thread already started")

        loggers["refresh.start"].debug(
            "Starting thread to refresh lock every %s seconds",
            self._lock_renewal_interval
        )
        self._lock_renewal_stop = threading.Event()
        self._lock_renewal_thread = threading.Thread(
            group=None,
            target=self._lock_renewer,
            kwargs={'lockref': weakref.ref(self),
                    'interval': self._lock_renewal_interval,
                    'stop': self._lock_renewal_stop}
        )
        self._lock_renewal_thread.setDaemon(True)
        self._lock_renewal_thread.start()

    def _stop_lock_renewer(self):
        """
        Stop the lock renewer.

        This signals the renewal thread and waits for its exit.
        """
        if self._lock_renewal_thread is None or not self._lock_renewal_thread.is_alive():
            return
        loggers["refresh.shutdown"].debug("Signalling the lock refresher to stop")
        self._lock_renewal_stop.set()
        self._lock_renewal_thread.join()
        self._lock_renewal_thread = None
        loggers["refresh.exit"].debug("Lock refresher has stopped")

    def __enter__(self):
        acquired = self.acquire(blocking=True)
        assert acquired, "Lock wasn't acquired, but blocking=True"
        return self

    def __exit__(self, exc_type=None, exc_value=None, traceback=None):
        self.release()

    def release(self):
        """Releases the lock that was acquired with the same object.

        .. note::

            If you want to release a lock that you acquired in a different place you have two choices:

            * Use ``Lock("name", id=id_from_other_place).release()``
            * Use ``Lock("name").reset()``
        """
        if self._lock_renewal_thread is not None:
            self._stop_lock_renewer()
        loggers["release"].debug("Releasing %r.", self._name)
        error = self.unlock_script(client=self._client, keys=(self._name, self._signal), args=(self._id, self._signal_expire))
        if error == 1:
            raise NotAcquired("Lock %s is not acquired or it already expired." % self._name)
        elif error:
            raise RuntimeError("Unsupported error code %s from UNLOCK script." % error)

    def locked(self):
        """
        Return true if the lock is acquired.

        Checks whether a lock with the same name exists. This method returns
        true even if the lock is held under a different id.
        """
        return self._client.exists(self._name) == 1


reset_all_script = None


def reset_all(redis_client):
    """
    Forcibly deletes all locks if they remain (e.g. after a crash). Use this with care.

    :param redis_client:
        An instance of :class:`~StrictRedis`.
    """
    Lock.register_scripts(redis_client)

    reset_all_script(client=redis_client)  # noqa
@@ -1,2148 +1,2149 @@
# -*- coding: utf-8 -*-

# Copyright (C) 2010-2020 RhodeCode GmbH
#
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU Affero General Public License, version 3
# (only), as published by the Free Software Foundation.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU Affero General Public License
# along with this program.  If not, see <http://www.gnu.org/licenses/>.
#
# This program is dual-licensed. If you wish to learn more about the
# RhodeCode Enterprise Edition, including its added features, Support services,
# and proprietary license terms, please see https://rhodecode.com/licenses/

"""
Helper functions

Consists of functions typically used within templates, but also
available to Controllers. This module is available to both as 'h'.
"""
import base64
import collections

import os
import random
import hashlib
import StringIO
import textwrap
import urllib
import math
import logging
import re
import time
import string
import regex
from collections import OrderedDict

import pygments
import itertools
import fnmatch
import bleach

from pyramid import compat
from datetime import datetime
from functools import partial
from pygments.formatters.html import HtmlFormatter
from pygments.lexers import (
    get_lexer_by_name, get_lexer_for_filename, get_lexer_for_mimetype)

from pyramid.threadlocal import get_current_request
from tempita import looper
from webhelpers2.html import literal, HTML, escape
from webhelpers2.html._autolink import _auto_link_urls
from webhelpers2.html.tools import (
    button_to, highlight, js_obfuscate, strip_links, strip_tags)

from webhelpers2.text import (
    chop_at, collapse, convert_accented_entities,
    convert_misc_entities, lchop, plural, rchop, remove_formatting,
    replace_whitespace, urlify, truncate, wrap_paragraphs)
from webhelpers2.date import time_ago_in_words

from webhelpers2.html.tags import (
    _input, NotGiven, _make_safe_id_component as safeid,
    form as insecure_form,
    auto_discovery_link, checkbox, end_form, file,
    hidden, image, javascript_link, link_to, link_to_if, link_to_unless, ol,
    select as raw_select, stylesheet_link, submit, text, password, textarea,
    ul, radio, Options)

from webhelpers2.number import format_byte_size

from rhodecode.lib.action_parser import action_parser
from rhodecode.lib.pagination import Page, RepoPage, SqlPage
from rhodecode.lib.ext_json import json
from rhodecode.lib.utils import repo_name_slug, get_custom_lexer
from rhodecode.lib.utils2 import (
    str2bool, safe_unicode, safe_str,
    get_commit_safe, datetime_to_time, time_to_datetime, time_to_utcdatetime,
    AttributeDict, safe_int, md5, md5_safe, get_host_info)
from rhodecode.lib.markup_renderer import MarkupRenderer, relative_links
from rhodecode.lib.vcs.exceptions import CommitDoesNotExistError
from rhodecode.lib.vcs.backends.base import BaseChangeset, EmptyCommit
from rhodecode.lib.vcs.conf.settings import ARCHIVE_SPECS
from rhodecode.lib.index.search_utils import get_matching_line_offsets
from rhodecode.config.conf import DATE_FORMAT, DATETIME_FORMAT
from rhodecode.model.changeset_status import ChangesetStatusModel
from rhodecode.model.db import Permission, User, Repository, UserApiKeys, FileStore
from rhodecode.model.repo_group import RepoGroupModel
from rhodecode.model.settings import IssueTrackerSettingsModel


log = logging.getLogger(__name__)


DEFAULT_USER = User.DEFAULT_USER
DEFAULT_USER_EMAIL = User.DEFAULT_USER_EMAIL


def asset(path, ver=None, **kwargs):
    """
    Helper to generate a static asset file path for rhodecode assets

    eg. h.asset('images/image.png', ver='3923')

    :param path: path of asset
    :param ver: optional version query param to append as ?ver=
    """
    request = get_current_request()
    query = {}
    query.update(kwargs)
    if ver:
        query = {'ver': ver}
    return request.static_path(
        'rhodecode:public/{}'.format(path), _query=query)


default_html_escape_table = {
    ord('&'): u'&amp;',
    ord('<'): u'&lt;',
    ord('>'): u'&gt;',
    ord('"'): u'&quot;',
    ord("'"): u'&#39;',
}


def html_escape(text, html_escape_table=default_html_escape_table):
    """Produce entities within text."""
    return text.translate(html_escape_table)

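The translate-table approach above carries over unchanged to Python 3, where `str.translate` also accepts a dict keyed by code points; a standalone sketch of the helper, with the table copied rather than imported:

```python
# Code-point -> entity table, mirroring default_html_escape_table above.
default_html_escape_table = {
    ord('&'): '&amp;',
    ord('<'): '&lt;',
    ord('>'): '&gt;',
    ord('"'): '&quot;',
    ord("'"): '&#39;',
}


def html_escape(text, html_escape_table=default_html_escape_table):
    """Replace HTML-special characters with entities in a single pass."""
    return text.translate(html_escape_table)


print(html_escape('<a title="O\'Reilly & Co">'))
# -> &lt;a title=&quot;O&#39;Reilly &amp; Co&quot;&gt;
```

A single `translate` pass avoids the chained-`replace` pitfall of re-escaping the `&` produced by an earlier substitution.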

def chop_at_smart(s, sub, inclusive=False, suffix_if_chopped=None):
    """
    Truncate string ``s`` at the first occurrence of ``sub``.

    If ``inclusive`` is true, truncate just after ``sub`` rather than at it.
    """
    suffix_if_chopped = suffix_if_chopped or ''
    pos = s.find(sub)
    if pos == -1:
        return s

    if inclusive:
        pos += len(sub)

    chopped = s[:pos]
    left = s[pos:].strip()

    if left and suffix_if_chopped:
        chopped += suffix_if_chopped

    return chopped


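The suffix rule is subtle: it is appended only when non-whitespace text was actually dropped. The same pure-string logic, reproduced so it runs standalone under Python 3:

```python
def chop_at_smart(s, sub, inclusive=False, suffix_if_chopped=None):
    """Truncate ``s`` at the first ``sub``; mark the cut only if text was dropped."""
    suffix_if_chopped = suffix_if_chopped or ''
    pos = s.find(sub)
    if pos == -1:
        return s

    if inclusive:
        # keep ``sub`` itself, cut just after it
        pos += len(sub)

    chopped = s[:pos]
    left = s[pos:].strip()

    # append the suffix only when something non-empty was removed
    if left and suffix_if_chopped:
        chopped += suffix_if_chopped

    return chopped


print(chop_at_smart('repo/path/file.py', '/', suffix_if_chopped='...'))  # repo...
print(chop_at_smart('repo/path', '/', inclusive=True))                   # repo/
print(chop_at_smart('no-separator', '/'))                                # no-separator
```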
def shorter(text, size=20, prefix=False):
    postfix = '...'
    if len(text) > size:
        if prefix:
            # shorten in front
            return postfix + text[-(size - len(postfix)):]
        else:
            return text[:size - len(postfix)] + postfix
    return text


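Note that the `'...'` counts toward `size`, so a shortened result is exactly `size` characters. A runnable Python 3 copy demonstrating both directions:

```python
def shorter(text, size=20, prefix=False):
    """Ellipsize ``text`` to at most ``size`` chars, cutting the back or the front."""
    postfix = '...'
    if len(text) > size:
        if prefix:
            # keep the tail, put the ellipsis in front
            return postfix + text[-(size - len(postfix)):]
        return text[:size - len(postfix)] + postfix
    return text


print(shorter('abcdefghijklmnop', size=10))               # abcdefg...
print(shorter('abcdefghijklmnop', size=10, prefix=True))  # ...jklmnop
```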
def reset(name, value=None, id=NotGiven, type="reset", **attrs):
    """
    Reset button
    """
    return _input(type, name, value, id, attrs)


def select(name, selected_values, options, id=NotGiven, **attrs):

    if isinstance(options, (list, tuple)):
        options_iter = options
        # Handle old value,label lists ... where value also can be value,label lists
        options = Options()
        for opt in options_iter:
            if isinstance(opt, tuple) and len(opt) == 2:
                value, label = opt
            elif isinstance(opt, basestring):
                value = label = opt
            else:
                raise ValueError('invalid select option type %r' % type(opt))

            if isinstance(value, (list, tuple)):
                option_group = options.add_optgroup(label)
                for opt2 in value:
                    if isinstance(opt2, tuple) and len(opt2) == 2:
                        group_value, group_label = opt2
                    elif isinstance(opt2, basestring):
                        group_value = group_label = opt2
                    else:
                        raise ValueError('invalid select option type %r' % type(opt2))

                    option_group.add_option(group_label, group_value)
            else:
                options.add_option(label, value)

    return raw_select(name, selected_values, options, id=id, **attrs)


def branding(name, length=40):
    return truncate(name, length, indicator="")


def FID(raw_id, path):
    """
    Creates a unique ID for a filenode based on a hash of its path and commit;
    it's safe to use in urls.

    :param raw_id:
    :param path:
    """

    return 'c-%s-%s' % (short_id(raw_id), md5_safe(path)[:12])


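The shape of these IDs can be sketched without the rhodecode imports. The `short_id` and `md5_safe` below are hypothetical stand-ins, assumed here to be a 12-character hash prefix and a plain md5 hexdigest; the real helpers live elsewhere in rhodecode:

```python
import hashlib


def short_id(raw_id):
    # assumption: the real helper keeps the first 12 chars of the commit hash
    return raw_id[:12]


def md5_safe(s):
    # assumption: the real helper is an md5 hexdigest of the encoded string
    return hashlib.md5(s.encode('utf-8')).hexdigest()


def FID(raw_id, path):
    # 'c-' + 12-char commit prefix + '-' + 12 chars of the path hash
    return 'c-%s-%s' % (short_id(raw_id), md5_safe(path)[:12])


print(FID('a' * 40, 'docs/index.rst'))
```

Hashing the path keeps the ID URL-safe regardless of what characters the file path contains, while the commit prefix keeps it unique per revision.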
class _GetError(object):
    """Get error from form_errors, and represent it as a span-wrapped error
    message

    :param field_name: field to fetch errors for
    :param form_errors: form errors dict
    """

    def __call__(self, field_name, form_errors):
        tmpl = """<span class="error_msg">%s</span>"""
        if form_errors and field_name in form_errors:
            return literal(tmpl % form_errors.get(field_name))


get_error = _GetError()


class _ToolTip(object):

    def __call__(self, tooltip_title, trim_at=50):
        """
        Special function just to wrap our text into nicely formatted,
        autowrapped text

        :param tooltip_title:
        """
        tooltip_title = escape(tooltip_title)
        tooltip_title = tooltip_title.replace('<', '&lt;').replace('>', '&gt;')
        return tooltip_title


tooltip = _ToolTip()

files_icon = u'<i class="file-breadcrumb-copy tooltip icon-clipboard clipboard-action" data-clipboard-text="{}" title="Copy file path"></i>'


def files_breadcrumbs(repo_name, repo_type, commit_id, file_path, landing_ref_name=None, at_ref=None,
                      limit_items=False, linkify_last_item=False, hide_last_item=False,
                      copy_path_icon=True):
    if isinstance(file_path, str):
        file_path = safe_unicode(file_path)

    if at_ref:
        route_qry = {'at': at_ref}
        default_landing_ref = at_ref or landing_ref_name or commit_id
    else:
        route_qry = None
        default_landing_ref = commit_id

    # first segment is a `HOME` link to repo files root location
    root_name = literal(u'<i class="icon-home"></i>')

    url_segments = [
        link_to(
            root_name,
            repo_files_by_ref_url(
                repo_name,
                repo_type,
                f_path=None,  # None here is a special case for SVN repos,
                              # that won't prefix with a ref
                ref_name=default_landing_ref,
                commit_id=commit_id,
                query=route_qry
            )
        )]

    path_segments = file_path.split('/')
    last_cnt = len(path_segments) - 1
    for cnt, segment in enumerate(path_segments):
        if not segment:
            continue
        segment_html = escape(segment)

        last_item = cnt == last_cnt

        if last_item and hide_last_item:
            # iterate over and hide last element
            continue

        if last_item and linkify_last_item is False:
            # plain version
            url_segments.append(segment_html)
        else:
            url_segments.append(
                link_to(
                    segment_html,
                    repo_files_by_ref_url(
                        repo_name,
                        repo_type,
                        f_path='/'.join(path_segments[:cnt + 1]),
                        ref_name=default_landing_ref,
                        commit_id=commit_id,
                        query=route_qry
                    ),
                ))

    limited_url_segments = url_segments[:1] + ['...'] + url_segments[-5:]
    if limit_items and len(limited_url_segments) < len(url_segments):
        url_segments = limited_url_segments

    full_path = file_path
    if copy_path_icon:
        icon = files_icon.format(escape(full_path))
    else:
        icon = ''

    if file_path == '':
        return root_name
    else:
        return literal(' / '.join(url_segments) + icon)


def files_url_data(request):
    import urllib
    matchdict = request.matchdict

    if 'f_path' not in matchdict:
        matchdict['f_path'] = ''
    else:
        matchdict['f_path'] = urllib.quote(safe_str(matchdict['f_path']))
    if 'commit_id' not in matchdict:
        matchdict['commit_id'] = 'tip'

    return json.dumps(matchdict)


def repo_files_by_ref_url(db_repo_name, db_repo_type, f_path, ref_name, commit_id, query=None):
    _is_svn = is_svn(db_repo_type)
    final_f_path = f_path

    if _is_svn:
        """
        For SVN the ref_name cannot be used as a commit_id, it needs to be prefixed with
        the actual commit_id followed by the ref_name. This should be done only in case
        this is an initial landing url, without additional paths.

        like: /1000/tags/1.0.0/?at=tags/1.0.0
        """

        if ref_name and ref_name != 'tip':
            # NOTE(marcink): for svn the ref_name is actually the stored path, so we prefix it
            # for SVN we only do this magic prefix if it's root, e.g. landing revision
            # of files link. If we are in the tree we don't need this since we traverse the url
            # that has everything stored
            if f_path in ['', '/']:
                final_f_path = '/'.join([ref_name, f_path])

        # SVN always needs a commit_id explicitly, without a named REF
        default_commit_id = commit_id
    else:
        """
        For git and mercurial we construct a new URL using the names instead of commit_id
        like: /master/some_path?at=master
        """
        # We currently do not support branches with slashes
        if '/' in ref_name:
            default_commit_id = commit_id
        else:
            default_commit_id = ref_name

    # sometimes we pass f_path as None, to indicate explicit no prefix,
    # we translate it to string to not have None
    final_f_path = final_f_path or ''

    files_url = route_path(
        'repo_files',
        repo_name=db_repo_name,
        commit_id=default_commit_id,
        f_path=final_f_path,
        _query=query
    )
    return files_url


def code_highlight(code, lexer, formatter, use_hl_filter=False):
    """
    Lex ``code`` with ``lexer`` and format it with the formatter ``formatter``.

    If ``outfile`` is given and a valid file object (an object
    with a ``write`` method), the result will be written to it, otherwise
    it is returned as a string.
    """
    if use_hl_filter:
        # add HL filter
        from rhodecode.lib.index import search_utils
        lexer.add_filter(search_utils.ElasticSearchHLFilter())
    return pygments.format(pygments.lex(code, lexer), formatter)


class CodeHtmlFormatter(HtmlFormatter):
    """
    My code Html Formatter for source codes
    """

    def wrap(self, source, outfile):
        return self._wrap_div(self._wrap_pre(self._wrap_code(source)))

    def _wrap_code(self, source):
        for cnt, it in enumerate(source):
            i, t = it
            t = '<div id="L%s">%s</div>' % (cnt + 1, t)
            yield i, t

    def _wrap_tablelinenos(self, inner):
        dummyoutfile = StringIO.StringIO()
        lncount = 0
        for t, line in inner:
            if t:
                lncount += 1
            dummyoutfile.write(line)

        fl = self.linenostart
        mw = len(str(lncount + fl - 1))
        sp = self.linenospecial
        st = self.linenostep
        la = self.lineanchors
        aln = self.anchorlinenos
        nocls = self.noclasses
        if sp:
            lines = []

            for i in range(fl, fl + lncount):
                if i % st == 0:
                    if i % sp == 0:
                        if aln:
                            lines.append('<a href="#%s%d" class="special">%*d</a>' %
                                         (la, i, mw, i))
                        else:
                            lines.append('<span class="special">%*d</span>' % (mw, i))
                    else:
                        if aln:
                            lines.append('<a href="#%s%d">%*d</a>' % (la, i, mw, i))
                        else:
                            lines.append('%*d' % (mw, i))
                else:
                    lines.append('')
            ls = '\n'.join(lines)
        else:
            lines = []
            for i in range(fl, fl + lncount):
                if i % st == 0:
                    if aln:
                        lines.append('<a href="#%s%d">%*d</a>' % (la, i, mw, i))
                    else:
                        lines.append('%*d' % (mw, i))
                else:
                    lines.append('')
            ls = '\n'.join(lines)

        # in case you wonder about the seemingly redundant <div> here: since the
        # content in the other cell also is wrapped in a div, some browsers in
        # some configurations seem to mess up the formatting...
        if nocls:
            yield 0, ('<table class="%stable">' % self.cssclass +
                      '<tr><td><div class="linenodiv" '
                      'style="background-color: #f0f0f0; padding-right: 10px">'
                      '<pre style="line-height: 125%">' +
                      ls + '</pre></div></td><td id="hlcode" class="code">')
        else:
            yield 0, ('<table class="%stable">' % self.cssclass +
                      '<tr><td class="linenos"><div class="linenodiv"><pre>' +
                      ls + '</pre></div></td><td id="hlcode" class="code">')
        yield 0, dummyoutfile.getvalue()
        yield 0, '</td></tr></table>'


491 class SearchContentCodeHtmlFormatter(CodeHtmlFormatter):
class SearchContentCodeHtmlFormatter(CodeHtmlFormatter):
    def __init__(self, **kw):
        # only show these line numbers if set
        self.only_lines = kw.pop('only_line_numbers', [])
        self.query_terms = kw.pop('query_terms', [])
        self.max_lines = kw.pop('max_lines', 5)
        self.line_context = kw.pop('line_context', 3)
        self.url = kw.pop('url', None)

        super(CodeHtmlFormatter, self).__init__(**kw)

    def _wrap_code(self, source):
        for cnt, it in enumerate(source):
            i, t = it
            t = '<pre>%s</pre>' % t
            yield i, t

    def _wrap_tablelinenos(self, inner):
        yield 0, '<table class="code-highlight %stable">' % self.cssclass

        last_shown_line_number = 0
        current_line_number = 1

        for t, line in inner:
            if not t:
                yield t, line
                continue

            if current_line_number in self.only_lines:
                if last_shown_line_number + 1 != current_line_number:
                    yield 0, '<tr>'
                    yield 0, '<td class="line">...</td>'
                    yield 0, '<td id="hlcode" class="code"></td>'
                    yield 0, '</tr>'

                yield 0, '<tr>'
                if self.url:
                    yield 0, '<td class="line"><a href="%s#L%i">%i</a></td>' % (
                        self.url, current_line_number, current_line_number)
                else:
                    yield 0, '<td class="line"><a href="">%i</a></td>' % (
                        current_line_number)
                yield 0, '<td id="hlcode" class="code">' + line + '</td>'
                yield 0, '</tr>'

                last_shown_line_number = current_line_number

            current_line_number += 1

        yield 0, '</table>'


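The interesting part of `_wrap_tablelinenos` is the gap bookkeeping: an ellipsis row is emitted before every non-contiguous run of shown lines, tracked via `last_shown_line_number`. A minimal Python 3 sketch of just that bookkeeping (the function name `rows_with_gaps` is hypothetical, not part of this module):

```python
def rows_with_gaps(only_lines, total_lines):
    """Yield ('gap',) before each non-contiguous run of shown lines and
    ('line', n) for every shown line -- the same bookkeeping
    _wrap_tablelinenos() does with last_shown_line_number."""
    last_shown = 0
    for n in range(1, total_lines + 1):
        if n in only_lines:
            if last_shown + 1 != n:
                yield ('gap',)
            yield ('line', n)
            last_shown = n

# lines 2-3 and 7 are shown; gaps appear before each run
rows = list(rows_with_gaps({2, 3, 7}, total_lines=10))
assert rows == [('gap',), ('line', 2), ('line', 3), ('gap',), ('line', 7)]
```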
def hsv_to_rgb(h, s, v):
    """ Convert hsv color values to rgb """

    if s == 0.0:
        return v, v, v
    i = int(h * 6.0)  # XXX assume int() truncates!
    f = (h * 6.0) - i
    p = v * (1.0 - s)
    q = v * (1.0 - s * f)
    t = v * (1.0 - s * (1.0 - f))
    i = i % 6
    if i == 0:
        return v, t, p
    if i == 1:
        return q, v, p
    if i == 2:
        return p, v, t
    if i == 3:
        return p, q, v
    if i == 4:
        return t, p, v
    if i == 5:
        return v, p, q


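The branch-per-sextant code above is the standard HSV-to-RGB conversion, so it can be cross-checked against the stdlib `colorsys` implementation. A Python 3 restatement (the table-lookup form is a rewrite for brevity, not this module's code):

```python
import colorsys

def hsv_to_rgb(h, s, v):
    """Convert HSV color values to RGB (all components in [0, 1])."""
    if s == 0.0:
        return v, v, v
    i = int(h * 6.0)  # sextant index; int() truncates toward zero
    f = (h * 6.0) - i
    p = v * (1.0 - s)
    q = v * (1.0 - s * f)
    t = v * (1.0 - s * (1.0 - f))
    i = i % 6
    # one (r, g, b) ordering per sextant, as in the if-chain above
    return [(v, t, p), (q, v, p), (p, v, t),
            (p, q, v), (t, p, v), (v, p, q)][i]

# agrees with the stdlib on sample points
for h, s, v in [(0.0, 0.5, 0.5), (0.25, 0.1, 0.95), (0.9, 1.0, 1.0)]:
    expected = colorsys.hsv_to_rgb(h, s, v)
    assert all(abs(a - b) < 1e-9 for a, b in zip(hsv_to_rgb(h, s, v), expected))
```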
def unique_color_generator(n=10000, saturation=0.10, lightness=0.95):
    """
    Generator for getting n of evenly distributed colors using
    hsv color and golden ratio. It always returns the same order of colors

    :param n: number of colors to generate
    :param saturation: saturation of returned colors
    :param lightness: lightness of returned colors
    :returns: RGB tuple
    """

    golden_ratio = 0.618033988749895
    h = 0.22717784590367374

    for _ in xrange(n):
        h += golden_ratio
        h %= 1
        HSV_tuple = [h, saturation, lightness]
        RGB_tuple = hsv_to_rgb(*HSV_tuple)
        yield map(lambda x: str(int(x * 256)), RGB_tuple)


def color_hasher(n=10000, saturation=0.10, lightness=0.95):
    """
    Returns a function which when called with an argument returns a unique
    color for that argument, e.g.

    :param n: number of colors to generate
    :param saturation: saturation of returned colors
    :param lightness: lightness of returned colors
    :returns: css RGB string

    >>> color_hash = color_hasher()
    >>> color_hash('hello')
    'rgb(34, 12, 59)'
    >>> color_hash('hello')
    'rgb(34, 12, 59)'
    >>> color_hash('other')
    'rgb(90, 224, 159)'
    """

    color_dict = {}
    cgenerator = unique_color_generator(
        saturation=saturation, lightness=lightness)

    def get_color_string(thing):
        if thing in color_dict:
            col = color_dict[thing]
        else:
            col = color_dict[thing] = cgenerator.next()
        return "rgb(%s)" % (', '.join(col))

    return get_color_string


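Since the module above is Python 2 (`xrange`, `cgenerator.next()`), here is a Python 3 sketch of the same idea: a golden-ratio walk around the hue circle feeding a memoizing hasher, so each key gets a stable, visually distinct pastel color. This is a restatement under those assumptions, not the module's own code:

```python
import colorsys

def color_hasher(saturation=0.10, lightness=0.95):
    """Return a function mapping any hashable key to a stable CSS rgb() string.

    Colors come from a golden-ratio walk around the HSV hue circle, so
    successive distinct keys receive well-separated hues.
    """
    golden_ratio = 0.618033988749895
    state = {'h': 0.22717784590367374}  # same starting hue as above
    color_dict = {}

    def next_color():
        state['h'] = (state['h'] + golden_ratio) % 1
        rgb = colorsys.hsv_to_rgb(state['h'], saturation, lightness)
        return ', '.join(str(int(x * 256)) for x in rgb)

    def get_color_string(thing):
        if thing not in color_dict:
            color_dict[thing] = next_color()  # memoize: same key, same color
        return 'rgb(%s)' % color_dict[thing]

    return get_color_string

color_hash = color_hasher()
assert color_hash('hello') == color_hash('hello')   # stable per key
assert color_hash('hello') != color_hash('other')   # distinct keys differ
```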
def get_lexer_safe(mimetype=None, filepath=None):
    """
    Tries to return a relevant pygments lexer using mimetype/filepath name,
    defaulting to plain text if none could be found
    """
    lexer = None
    try:
        if mimetype:
            lexer = get_lexer_for_mimetype(mimetype)
        if not lexer:
            lexer = get_lexer_for_filename(filepath)
    except pygments.util.ClassNotFound:
        pass

    if not lexer:
        lexer = get_lexer_by_name('text')

    return lexer


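`get_lexer_safe` cascades through pygments hints: try the mimetype, then the filename, then fall back to plain text. The same try-each-hint-then-default pattern can be sketched with only the stdlib `mimetypes` module (the helper name `guess_display_type` is hypothetical and not part of RhodeCode or pygments):

```python
import mimetypes

def guess_display_type(mimetype=None, filepath=None, default='text/plain'):
    """Resolve a content type from an explicit mimetype or a filename,
    falling back to plain text -- mirrors the cascade in get_lexer_safe()."""
    resolved = None
    if mimetype:
        resolved = mimetype
    if not resolved and filepath:
        # guess_type returns (type, encoding); encoding is unused here
        resolved, _encoding = mimetypes.guess_type(filepath)
    return resolved or default

assert guess_display_type(mimetype='text/x-python') == 'text/x-python'
assert guess_display_type(filepath='no_extension') == 'text/plain'
```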
def get_lexer_for_filenode(filenode):
    lexer = get_custom_lexer(filenode.extension) or filenode.lexer
    return lexer


def pygmentize(filenode, **kwargs):
    """
    pygmentize function using pygments

    :param filenode:
    """
    lexer = get_lexer_for_filenode(filenode)
    return literal(code_highlight(filenode.content, lexer,
                                  CodeHtmlFormatter(**kwargs)))


def is_following_repo(repo_name, user_id):
    from rhodecode.model.scm import ScmModel
    return ScmModel().is_following_repo(repo_name, user_id)


class _Message(object):
    """A message returned by ``Flash.pop_messages()``.

    Converting the message to a string returns the message text. Instances
    also have the following attributes:

    * ``message``: the message text.
    * ``category``: the category specified when the message was created.
    """

    def __init__(self, category, message, sub_data=None):
        self.category = category
        self.message = message
        self.sub_data = sub_data or {}

    def __str__(self):
        return self.message

    __unicode__ = __str__

    def __html__(self):
        return escape(safe_unicode(self.message))


class Flash(object):
    # List of allowed categories. If None, allow any category.
    categories = ["warning", "notice", "error", "success"]

    # Default category if none is specified.
    default_category = "notice"

    def __init__(self, session_key="flash", categories=None,
                 default_category=None):
        """
        Instantiate a ``Flash`` object.

        ``session_key`` is the key to save the messages under in the user's
        session.

        ``categories`` is an optional list which overrides the default list
        of categories.

        ``default_category`` overrides the default category used for messages
        when none is specified.
        """
        self.session_key = session_key
        if categories is not None:
            self.categories = categories
        if default_category is not None:
            self.default_category = default_category
        if self.categories and self.default_category not in self.categories:
            raise ValueError(
                "unrecognized default category %r" % (self.default_category,))

    def pop_messages(self, session=None, request=None):
        """
        Return all accumulated messages and delete them from the session.

        The return value is a list of ``Message`` objects.
        """
        messages = []

        if not session:
            if not request:
                request = get_current_request()
            session = request.session

        # Pop the 'old' pylons flash messages. They are tuples of the form
        # (category, message)
        for cat, msg in session.pop(self.session_key, []):
            messages.append(_Message(cat, msg))

        # Pop the 'new' pyramid flash messages for each category as list
        # of strings.
        for cat in self.categories:
            for msg in session.pop_flash(queue=cat):
                sub_data = {}
                if hasattr(msg, 'rsplit'):
                    flash_data = msg.rsplit('|DELIM|', 1)
                    org_message = flash_data[0]
                    if len(flash_data) > 1:
                        sub_data = json.loads(flash_data[1])
                else:
                    org_message = msg

                messages.append(_Message(cat, org_message, sub_data=sub_data))

        # Map messages from the default queue to the 'notice' category.
        for msg in session.pop_flash():
            messages.append(_Message('notice', msg))

        session.save()
        return messages

    def json_alerts(self, session=None, request=None):
        payloads = []
        messages = flash.pop_messages(session=session, request=request) or []
        for message in messages:
            payloads.append({
                'message': {
                    'message': u'{}'.format(message.message),
                    'level': message.category,
                    'force': True,
                    'subdata': message.sub_data
                }
            })
        return json.dumps(payloads)

    def __call__(self, message, category=None, ignore_duplicate=True,
                 session=None, request=None):

        if not session:
            if not request:
                request = get_current_request()
            session = request.session

        session.flash(
            message, queue=category, allow_duplicate=not ignore_duplicate)


flash = Flash()

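`pop_messages` recovers an optional JSON payload from each pyramid flash string by splitting on the `|DELIM|` marker. The round trip can be sketched with the stdlib alone (the `encode_flash`/`decode_flash` helpers are hypothetical; the session machinery is omitted):

```python
import json

DELIM = '|DELIM|'

def encode_flash(message, sub_data=None):
    """Append an optional JSON payload to a flash message, in the
    message|DELIM|json shape that pop_messages() expects."""
    if sub_data is None:
        return message
    return message + DELIM + json.dumps(sub_data)

def decode_flash(raw):
    """Split a stored flash string back into (message, sub_data),
    using the same rsplit-once logic as pop_messages()."""
    flash_data = raw.rsplit(DELIM, 1)
    message = flash_data[0]
    sub_data = json.loads(flash_data[1]) if len(flash_data) > 1 else {}
    return message, sub_data

raw = encode_flash('merge failed', {'reason': 'conflicts'})
assert decode_flash(raw) == ('merge failed', {'reason': 'conflicts'})
assert decode_flash('plain message') == ('plain message', {})
```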
#==============================================================================
# SCM FILTERS available via h.
#==============================================================================
from rhodecode.lib.vcs.utils import author_name, author_email
from rhodecode.lib.utils2 import age, age_from_seconds
from rhodecode.model.db import User, ChangesetStatus


email = author_email


def capitalize(raw_text):
    return raw_text.capitalize()


def short_id(long_id):
    return long_id[:12]


def hide_credentials(url):
    from rhodecode.lib.utils2 import credentials_filter
    return credentials_filter(url)


import pytz
import tzlocal
local_timezone = tzlocal.get_localzone()


def get_timezone(datetime_iso, time_is_local=False):
    tzinfo = '+00:00'

    # detect if we have a timezone info, otherwise, add it
    if time_is_local and isinstance(datetime_iso, datetime) and not datetime_iso.tzinfo:
        force_timezone = os.environ.get('RC_TIMEZONE', '')
        if force_timezone:
            force_timezone = pytz.timezone(force_timezone)
        timezone = force_timezone or local_timezone
        offset = timezone.localize(datetime_iso).strftime('%z')
        tzinfo = '{}:{}'.format(offset[:-2], offset[-2:])
    return tzinfo


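`get_timezone` turns the `strftime('%z')` offset (e.g. `+0200`) into the ISO form `+02:00` by slicing off the last two digits. The same slicing in isolation, using only stdlib `datetime` objects instead of pytz (the `format_offset` helper is hypothetical):

```python
from datetime import datetime, timezone, timedelta

def format_offset(dt):
    """Turn strftime('%z') output like '+0200' into ISO '+02:00',
    using the same slicing as get_timezone()."""
    offset = dt.strftime('%z')  # e.g. '+0200' for an aware datetime
    return '{}:{}'.format(offset[:-2], offset[-2:])

aware = datetime(2022, 9, 1, 12, 0, tzinfo=timezone(timedelta(hours=2)))
assert format_offset(aware) == '+02:00'
assert format_offset(datetime(2022, 9, 1, tzinfo=timezone.utc)) == '+00:00'
```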
def age_component(datetime_iso, value=None, time_is_local=False, tooltip=True):
    title = value or format_date(datetime_iso)
    tzinfo = get_timezone(datetime_iso, time_is_local=time_is_local)

    return literal(
        '<time class="timeago {cls}" title="{tt_title}" datetime="{dt}{tzinfo}">{title}</time>'.format(
            cls='tooltip' if tooltip else '',
            tt_title=('{title}{tzinfo}'.format(title=title, tzinfo=tzinfo)) if tooltip else '',
            title=title, dt=datetime_iso, tzinfo=tzinfo
        ))


def _shorten_commit_id(commit_id, commit_len=None):
    if commit_len is None:
        request = get_current_request()
        commit_len = request.call_context.visual.show_sha_length
    return commit_id[:commit_len]


def show_id(commit, show_idx=None, commit_len=None):
    """
    Configurable function that shows ID
    by default it's r123:fffeeefffeee

    :param commit: commit instance
    """
    if show_idx is None:
        request = get_current_request()
        show_idx = request.call_context.visual.show_revision_number

    raw_id = _shorten_commit_id(commit.raw_id, commit_len=commit_len)
    if show_idx:
        return 'r%s:%s' % (commit.idx, raw_id)
    else:
        return '%s' % (raw_id, )


def format_date(date):
    """
    use a standardized formatting for dates used in RhodeCode

    :param date: date/datetime object
    :return: formatted date
    """

    if date:
        _fmt = "%a, %d %b %Y %H:%M:%S"
        return safe_unicode(date.strftime(_fmt))

    return u""


class _RepoChecker(object):

    def __init__(self, backend_alias):
        self._backend_alias = backend_alias

    def __call__(self, repository):
        if hasattr(repository, 'alias'):
            _type = repository.alias
        elif hasattr(repository, 'repo_type'):
            _type = repository.repo_type
        else:
            _type = repository
        return _type == self._backend_alias


is_git = _RepoChecker('git')
is_hg = _RepoChecker('hg')
is_svn = _RepoChecker('svn')


def get_repo_type_by_name(repo_name):
    repo = Repository.get_by_repo_name(repo_name)
    if repo:
        return repo.repo_type


def is_svn_without_proxy(repository):
    if is_svn(repository):
        from rhodecode.model.settings import VcsSettingsModel
        conf = VcsSettingsModel().get_ui_settings_as_config_obj()
        return not str2bool(conf.get('vcs_svn_proxy', 'http_requests_enabled'))
    return False


def discover_user(author):
    """
    Tries to discover RhodeCode User based on the author string. Author string
    is typically `FirstName LastName <email@address.com>`
    """

    # if author is already an instance use it for extraction
    if isinstance(author, User):
        return author

    # valid email in the attribute passed, see if they're in the system
    _email = author_email(author)
    if _email != '':
        user = User.get_by_email(_email, case_insensitive=True, cache=True)
        if user is not None:
            return user

    # maybe it's a username? Try to extract it and fetch by username
    _author = author_name(author)
    user = User.get_by_username(_author, case_insensitive=True, cache=True)
    if user is not None:
        return user

    return None


def email_or_none(author):
    # extract email from the commit string
    _email = author_email(author)

    # If we have an email, use it, otherwise
    # see if it contains a username we can get an email from
    if _email != '':
        return _email
    else:
        user = User.get_by_username(
            author_name(author), case_insensitive=True, cache=True)

        if user is not None:
            return user.email

    # No valid email, not a valid user in the system, none!
    return None


def link_to_user(author, length=0, **kwargs):
    user = discover_user(author)
    # user can be None, but if we have it already it means we can re-use it
    # in the person() function, so we save 1 intensive-query
    if user:
        author = user

    display_person = person(author, 'username_or_name_or_email')
    if length:
        display_person = shorter(display_person, length)

    if user and user.username != user.DEFAULT_USER:
        return link_to(
            escape(display_person),
            route_path('user_profile', username=user.username),
            **kwargs)
    else:
        return escape(display_person)


def link_to_group(users_group_name, **kwargs):
    return link_to(
        escape(users_group_name),
        route_path('user_group_profile', user_group_name=users_group_name),
        **kwargs)


def person(author, show_attr="username_and_name"):
    user = discover_user(author)
    if user:
        return getattr(user, show_attr)
    else:
        _author = author_name(author)
        _email = email(author)
        return _author or _email


def author_string(email):
    if email:
        user = User.get_by_email(email, case_insensitive=True, cache=True)
        if user:
            if user.first_name or user.last_name:
                return '%s %s &lt;%s&gt;' % (
                    user.first_name, user.last_name, email)
            else:
                return email
        else:
            return email
    else:
        return None


def person_by_id(id_, show_attr="username_and_name"):
    # attr to return from fetched user
    person_getter = lambda usr: getattr(usr, show_attr)

    # maybe it's an ID ?
    if str(id_).isdigit() or isinstance(id_, int):
        id_ = int(id_)
        user = User.get(id_)
        if user is not None:
            return person_getter(user)
    return id_


def gravatar_with_user(request, author, show_disabled=False, tooltip=False):
    _render = request.get_partial_renderer('rhodecode:templates/base/base.mako')
    return _render('gravatar_with_user', author, show_disabled=show_disabled, tooltip=tooltip)


tags_paterns = OrderedDict((
    ('lang', (re.compile(r'\[(lang|language)\ \=\&gt;\ *([a-zA-Z\-\/\#\+\.]*)\]'),
              '<div class="metatag" tag="lang">\\2</div>')),

    ('see', (re.compile(r'\[see\ \=\&gt;\ *([a-zA-Z0-9\/\=\?\&amp;\ \:\/\.\-]*)\]'),
             '<div class="metatag" tag="see">see: \\1 </div>')),

    ('url', (re.compile(r'\[url\ \=\&gt;\ \[([a-zA-Z0-9\ \.\-\_]+)\]\((http://|https://|/)(.*?)\)\]'),
             '<div class="metatag" tag="url"> <a href="\\2\\3">\\1</a> </div>')),

    ('license', (re.compile(r'\[license\ \=\&gt;\ *([a-zA-Z0-9\/\=\?\&amp;\ \:\/\.\-]*)\]'),
                 '<div class="metatag" tag="license"><a href="http:\/\/www.opensource.org/licenses/\\1">\\1</a></div>')),

    ('ref', (re.compile(r'\[(requires|recommends|conflicts|base)\ \=\&gt;\ *([a-zA-Z0-9\-\/]*)\]'),
             '<div class="metatag" tag="ref \\1">\\1: <a href="/\\2">\\2</a></div>')),

    ('state', (re.compile(r'\[(stable|featured|stale|dead|dev|deprecated)\]'),
               '<div class="metatag" tag="state \\1">\\1</div>')),

    # label in grey
    ('label', (re.compile(r'\[([a-z]+)\]'),
               '<div class="metatag" tag="label">\\1</div>')),

    # generic catch all in grey
    ('generic', (re.compile(r'\[([a-zA-Z0-9\.\-\_]+)\]'),
                 '<div class="metatag" tag="generic">\\1</div>')),
))


def extract_metatags(value):
    """
    Extract supported meta-tags from given text value
    """
    tags = []
    if not value:
        return tags, ''

    for key, val in tags_paterns.items():
        pat, replace_html = val
        tags.extend([(key, x.group()) for x in pat.finditer(value)])
        value = pat.sub('', value)

    return tags, value


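`extract_metatags` applies each pattern in `tags_paterns` order, collecting the matches and stripping them from the text as it goes, so earlier (more specific) patterns win over the `label`/`generic` catch-alls. A runnable sketch with two simplified patterns (the simplified pattern table is an assumption for illustration; the real `tags_paterns` entries above also carry HTML replacement templates):

```python
import re
from collections import OrderedDict

# simplified versions of two patterns from tags_paterns; order matters,
# because 'state' must consume its tags before the 'label' catch-all runs
patterns = OrderedDict((
    ('state', re.compile(r'\[(stable|featured|stale|dead|dev|deprecated)\]')),
    ('label', re.compile(r'\[([a-z]+)\]')),
))

def extract_metatags(value):
    """Collect (kind, raw_tag) pairs and return them with the stripped
    text, applying each pattern in order as the function above does."""
    tags = []
    if not value:
        return tags, ''
    for key, pat in patterns.items():
        tags.extend((key, m.group()) for m in pat.finditer(value))
        value = pat.sub('', value)
    return tags, value

tags, rest = extract_metatags('my repo [stable] [tools]')
assert tags == [('state', '[stable]'), ('label', '[tools]')]
assert rest == 'my repo  '
```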
def style_metatag(tag_type, value):
    """
    Converts tags from value into their html equivalent
    """
    if not value:
        return ''

    html_value = value
    tag_data = tags_paterns.get(tag_type)
    if tag_data:
        pat, replace_html = tag_data
        # convert to plain `unicode` instead of a markup tag to be used in
        # regex expressions. safe_unicode doesn't work here
        html_value = pat.sub(replace_html, unicode(value))

    return html_value


def bool2icon(value, show_at_false=True):
    """
    Returns the boolean value of a given value, represented as an html
    element with classes that will represent icons

    :param value: given value to convert to html node
    """

    if value:  # does bool conversion
        return HTML.tag('i', class_="icon-true", title='True')
    else:  # not true as bool
        if show_at_false:
            return HTML.tag('i', class_="icon-false", title='False')
        return HTML.tag('i')


def b64(inp):
    return base64.b64encode(inp)

#==============================================================================
# PERMS
#==============================================================================
from rhodecode.lib.auth import (
    HasPermissionAny, HasPermissionAll,
    HasRepoPermissionAny, HasRepoPermissionAll, HasRepoGroupPermissionAll,
    HasRepoGroupPermissionAny, HasRepoPermissionAnyApi, get_csrf_token,
    csrf_token_key, AuthUser)


#==============================================================================
# GRAVATAR URL
#==============================================================================
class InitialsGravatar(object):
    def __init__(self, email_address, first_name, last_name, size=30,
                 background=None, text_color='#fff'):
        self.size = size
        self.first_name = first_name
        self.last_name = last_name
        self.email_address = email_address
        self.background = background or self.str2color(email_address)
        self.text_color = text_color

    def get_color_bank(self):
        """
        Returns a predefined list of colors that gravatars can use.
        Those are randomized distinct colors that guarantee readability and
        uniqueness.

        generated with: http://phrogz.net/css/distinct-colors.html
        """
        return [
            '#bf3030', '#a67f53', '#00ff00', '#5989b3', '#392040', '#d90000',
            '#402910', '#204020', '#79baf2', '#a700b3', '#bf6060', '#7f5320',
            '#008000', '#003059', '#ee00ff', '#ff0000', '#8c4b00', '#007300',
            '#005fb3', '#de73e6', '#ff4040', '#ffaa00', '#3df255', '#203140',
            '#47004d', '#591616', '#664400', '#59b365', '#0d2133', '#83008c',
            '#592d2d', '#bf9f60', '#73e682', '#1d3f73', '#73006b', '#402020',
            '#b2862d', '#397341', '#597db3', '#e600d6', '#a60000', '#736039',
            '#00b318', '#79aaf2', '#330d30', '#ff8080', '#403010', '#16591f',
            '#002459', '#8c4688', '#e50000', '#ffbf40', '#00732e', '#102340',
            '#bf60ac', '#8c4646', '#cc8800', '#00a642', '#1d3473', '#b32d98',
            '#660e00', '#ffd580', '#80ffb2', '#7391e6', '#733967', '#d97b6c',
            '#8c5e00', '#59b389', '#3967e6', '#590047', '#73281d', '#665200',
            '#00e67a', '#2d50b3', '#8c2377', '#734139', '#b2982d', '#16593a',
            '#001859', '#ff00aa', '#a65e53', '#ffcc00', '#0d3321', '#2d3959',
            '#731d56', '#401610', '#4c3d00', '#468c6c', '#002ca6', '#d936a3',
            '#d94c36', '#403920', '#36d9a3', '#0d1733', '#592d4a', '#993626',
            '#cca300', '#00734d', '#46598c', '#8c005e', '#7f1100', '#8c7000',
            '#00a66f', '#7382e6', '#b32d74', '#d9896c', '#ffe680', '#1d7362',
            '#364cd9', '#73003d', '#d93a00', '#998a4d', '#59b3a1', '#5965b3',
            '#e5007a', '#73341d', '#665f00', '#00b38f', '#0018b3', '#59163a',
            '#b2502d', '#bfb960', '#00ffcc', '#23318c', '#a6537f', '#734939',
            '#b2a700', '#104036', '#3d3df2', '#402031', '#e56739', '#736f39',
            '#79f2ea', '#000059', '#401029', '#4c1400', '#ffee00', '#005953',
            '#101040', '#990052', '#402820', '#403d10', '#00ffee', '#0000d9',
            '#ff80c4', '#a66953', '#eeff00', '#00ccbe', '#8080ff', '#e673a1',
            '#a62c00', '#474d00', '#1a3331', '#46468c', '#733950', '#662900',
            '#858c23', '#238c85', '#0f0073', '#b20047', '#d9986c', '#becc00',
            '#396f73', '#281d73', '#ff0066', '#ff6600', '#dee673', '#59adb3',
            '#6559b3', '#590024', '#b2622d', '#98b32d', '#36ced9', '#332d59',
            '#40001a', '#733f1d', '#526600', '#005359', '#242040', '#bf6079',
            '#735039', '#cef23d', '#007780', '#5630bf', '#66001b', '#b24700',
            '#acbf60', '#1d6273', '#25008c', '#731d34', '#a67453', '#50592d',
            '#00ccff', '#6600ff', '#ff0044', '#4c1f00', '#8a994d', '#79daf2',
            '#a173e6', '#d93662', '#402310', '#aaff00', '#2d98b3', '#8c40ff',
            '#592d39', '#ff8c40', '#354020', '#103640', '#1a0040', '#331a20',
            '#331400', '#334d00', '#1d5673', '#583973', '#7f0022', '#4c3626',
            '#88cc00', '#36a3d9', '#3d0073', '#d9364c', '#33241a', '#698c23',
            '#5995b3', '#300059', '#e57382', '#7f3300', '#366600', '#00aaff',
            '#3a1659', '#733941', '#663600', '#74b32d', '#003c59', '#7f53a6',
            '#73000f', '#ff8800', '#baf279', '#79caf2', '#291040', '#a6293a',
            '#b2742d', '#587339', '#0077b3', '#632699', '#400009', '#d9a66c',
            '#294010', '#2d4a59', '#aa00ff', '#4c131b', '#b25f00', '#5ce600',
            '#267399', '#a336d9', '#990014', '#664e33', '#86bf60', '#0088ff',
            '#7700b3', '#593a16', '#073300', '#1d4b73', '#ac60bf', '#e59539',
            '#4f8c46', '#368dd9', '#5c0073'
        ]

    def rgb_to_hex_color(self, rgb_tuple):
        """
        Converts a passed rgb_tuple to a hex color.

        :param rgb_tuple: tuple with 3 ints representing the rgb color space
        """
        return '#' + ("".join(map(chr, rgb_tuple)).encode('hex'))

    def email_to_int_list(self, email_str):
        """
        Get every byte of the hex digest value of email and turn it into an
        integer. It's always going to be between 0-255
        """
        digest = md5_safe(email_str.lower())
        return [int(digest[i * 2:i * 2 + 2], 16) for i in range(16)]

    def pick_color_bank_index(self, email_str, color_bank):
        return self.email_to_int_list(email_str)[0] % len(color_bank)

    def str2color(self, email_str):
        """
        Tries to map an email to a color using a stable algorithm

        :param email_str:
        """
        color_bank = self.get_color_bank()
        # pick position (modulo its length so we always find it in the
        # bank even if it's smaller than 256 values)
        pos = self.pick_color_bank_index(email_str, color_bank)
        return color_bank[pos]

    def normalize_email(self, email_address):
        import unicodedata
        # default host used to fill in the fake/missing email
        default_host = u'localhost'

        if not email_address:
            email_address = u'%s@%s' % (User.DEFAULT_USER, default_host)

        email_address = safe_unicode(email_address)

        if u'@' not in email_address:
            email_address = u'%s@%s' % (email_address, default_host)

        if email_address.endswith(u'@'):
            email_address = u'%s%s' % (email_address, default_host)

        email_address = unicodedata.normalize('NFKD', email_address)\
            .encode('ascii', 'ignore')
        return email_address

    def get_initials(self):
        """
        Returns 2 letter initials calculated based on the input.
        The algorithm picks the first given email address, takes the first
        letter of the part before @, and then the first letter of the server
        name. In case the part before @ is in a format of
        `somestring.somestring2` it replaces the server letter with the first
        letter of somestring2.

        In case the function was initialized with both first and last name,
        this overrides the extraction from email by the first letter of the
        first and last name. We add special logic to that functionality: in
        case the full name is compound, like Guido Von Rossum, we use the
        last part of the last name (Von Rossum), picking `R`.

        The function also normalizes non-ascii characters to their ascii
        representation, eg Ą => A
        """
        import unicodedata
        # replace non-ascii to ascii
        first_name = unicodedata.normalize(
            'NFKD', safe_unicode(self.first_name)).encode('ascii', 'ignore')
        last_name = unicodedata.normalize(
            'NFKD', safe_unicode(self.last_name)).encode('ascii', 'ignore')

        # do NFKD encoding, and also make sure email has proper format
        email_address = self.normalize_email(self.email_address)

        # first push the email initials
        prefix, server = email_address.split('@', 1)

        # check if prefix is maybe a 'first_name.last_name' syntax
        _dot_split = prefix.rsplit('.', 1)
        if len(_dot_split) == 2 and _dot_split[1]:
            initials = [_dot_split[0][0], _dot_split[1][0]]
        else:
            initials = [prefix[0], server[0]]

        # then try to replace either first_name or last_name
        fn_letter = (first_name or " ")[0].strip()
        ln_letter = (last_name.split(' ', 1)[-1] or " ")[0].strip()

        if fn_letter:
            initials[0] = fn_letter

        if ln_letter:
            initials[1] = ln_letter

        return ''.join(initials).upper()

    def get_img_data_by_type(self, font_family, img_type):
        default_user = """
        <svg xmlns="http://www.w3.org/2000/svg"
        version="1.1" x="0px" y="0px" width="{size}" height="{size}"
        viewBox="-15 -10 439.165 429.164"

        xml:space="preserve"
        style="background:{background};" >

        <path d="M204.583,216.671c50.664,0,91.74-48.075,
        91.74-107.378c0-82.237-41.074-107.377-91.74-107.377
        c-50.668,0-91.74,25.14-91.74,107.377C112.844,
        168.596,153.916,216.671,
        204.583,216.671z" fill="{text_color}"/>
        <path d="M407.164,374.717L360.88,
        270.454c-2.117-4.771-5.836-8.728-10.465-11.138l-71.83-37.392
        c-1.584-0.823-3.502-0.663-4.926,0.415c-20.316,
        15.366-44.203,23.488-69.076,23.488c-24.877,
        0-48.762-8.122-69.078-23.488
        c-1.428-1.078-3.346-1.238-4.93-0.415L58.75,
        259.316c-4.631,2.41-8.346,6.365-10.465,11.138L2.001,374.717
        c-3.191,7.188-2.537,15.412,1.75,22.005c4.285,
        6.592,11.537,10.526,19.4,10.526h362.861c7.863,0,15.117-3.936,
        19.402-10.527 C409.699,390.129,
        410.355,381.902,407.164,374.717z" fill="{text_color}"/>
        </svg>""".format(
            size=self.size,
            background='#979797',  # @grey4
            text_color=self.text_color,
            font_family=font_family)

        return {
            "default_user": default_user
        }[img_type]

    def get_img_data(self, svg_type=None):
        """
        generates the svg metadata for image
        """
        fonts = [
            '-apple-system',
            'BlinkMacSystemFont',
            'Segoe UI',
            'Roboto',
            'Oxygen-Sans',
            'Ubuntu',
            'Cantarell',
            'Helvetica Neue',
            'sans-serif'
        ]
        font_family = ','.join(fonts)
        if svg_type:
            return self.get_img_data_by_type(font_family, svg_type)

        initials = self.get_initials()
        img_data = """
        <svg xmlns="http://www.w3.org/2000/svg" pointer-events="none"
             width="{size}" height="{size}"
             style="width: 100%; height: 100%; background-color: {background}"
             viewBox="0 0 {size} {size}">
            <text text-anchor="middle" y="50%" x="50%" dy="0.35em"
                  pointer-events="auto" fill="{text_color}"
                  font-family="{font_family}"
                  style="font-weight: 400; font-size: {f_size}px;">{text}
            </text>
        </svg>""".format(
            size=self.size,
            f_size=self.size/2.05,  # scale the text inside the box nicely
            background=self.background,
            text_color=self.text_color,
            text=initials.upper(),
            font_family=font_family)

        return img_data

    def generate_svg(self, svg_type=None):
        img_data = self.get_img_data(svg_type)
        return "data:image/svg+xml;base64,%s" % base64.b64encode(img_data)


def initials_gravatar(request, email_address, first_name, last_name, size=30, store_on_disk=False):

    svg_type = None
    if email_address == User.DEFAULT_USER_EMAIL:
        svg_type = 'default_user'

    klass = InitialsGravatar(email_address, first_name, last_name, size)

    if store_on_disk:
        from rhodecode.apps.file_store import utils as store_utils
        from rhodecode.apps.file_store.exceptions import FileNotAllowedException, \
            FileOverSizeException
        from rhodecode.model.db import Session

        image_key = md5_safe(email_address.lower()
                             + first_name.lower() + last_name.lower())

        storage = store_utils.get_file_storage(request.registry.settings)
        filename = '{}.svg'.format(image_key)
        subdir = 'gravatars'
        # since final name has a counter, we apply the 0
        uid = storage.apply_counter(0, store_utils.uid_filename(filename, randomized=False))
        store_uid = os.path.join(subdir, uid)

        db_entry = FileStore.get_by_store_uid(store_uid)
        if db_entry:
            return request.route_path('download_file', fid=store_uid)

        img_data = klass.get_img_data(svg_type=svg_type)
        img_file = store_utils.bytes_to_file_obj(img_data)

        try:
            store_uid, metadata = storage.save_file(
                img_file, filename, directory=subdir,
                extensions=['.svg'], randomized_name=False)
        except (FileNotAllowedException, FileOverSizeException):
            raise

        try:
            entry = FileStore.create(
                file_uid=store_uid, filename=metadata["filename"],
                file_hash=metadata["sha256"], file_size=metadata["size"],
                file_display_name=filename,
                file_description=u'user gravatar `{}`'.format(safe_unicode(filename)),
                hidden=True, check_acl=False, user_id=1
            )
            Session().add(entry)
            Session().commit()
            log.debug('Stored upload in DB as %s', entry)
        except Exception:
            raise

        return request.route_path('download_file', fid=store_uid)

    else:
        return klass.generate_svg(svg_type=svg_type)


def gravatar_external(request, gravatar_url_tmpl, email_address, size=30):
    return safe_str(gravatar_url_tmpl)\
        .replace('{email}', email_address) \
        .replace('{md5email}', md5_safe(email_address.lower())) \
        .replace('{netloc}', request.host) \
        .replace('{scheme}', request.scheme) \
        .replace('{size}', safe_str(size))


def gravatar_url(email_address, size=30, request=None):
    request = request or get_current_request()
    _use_gravatar = request.call_context.visual.use_gravatar

    email_address = email_address or User.DEFAULT_USER_EMAIL
    if isinstance(email_address, unicode):
        # hashlib crashes on unicode items
        email_address = safe_str(email_address)

    # empty email or default user
    if not email_address or email_address == User.DEFAULT_USER_EMAIL:
        return initials_gravatar(request, User.DEFAULT_USER_EMAIL, '', '', size=size)

    if _use_gravatar:
        gravatar_url_tmpl = request.call_context.visual.gravatar_url \
            or User.DEFAULT_GRAVATAR_URL
        return gravatar_external(request, gravatar_url_tmpl, email_address, size=size)

    else:
        return initials_gravatar(request, email_address, '', '', size=size)


def breadcrumb_repo_link(repo):
    """
    Makes a breadcrumbs path link to a repo

    ex::
        group >> subgroup >> repo

    :param repo: a Repository instance
    """

    path = [
        link_to(group.name, route_path('repo_group_home', repo_group_name=group.group_name),
                title='last change:{}'.format(format_date(group.last_commit_change)))
        for group in repo.groups_with_parents
    ] + [
        link_to(repo.just_name, route_path('repo_summary', repo_name=repo.repo_name),
                title='last change:{}'.format(format_date(repo.last_commit_change)))
    ]

    return literal(' &raquo; '.join(path))


def breadcrumb_repo_group_link(repo_group):
    """
    Makes a breadcrumbs path link to a repo group

    ex::
        group >> subgroup

    :param repo_group: a Repository Group instance
    """

    path = [
        link_to(group.name,
                route_path('repo_group_home', repo_group_name=group.group_name),
                title='last change:{}'.format(format_date(group.last_commit_change)))
        for group in repo_group.parents
    ] + [
        link_to(repo_group.name,
                route_path('repo_group_home', repo_group_name=repo_group.group_name),
                title='last change:{}'.format(format_date(repo_group.last_commit_change)))
    ]

    return literal(' &raquo; '.join(path))


def format_byte_size_binary(file_size):
    """
    Formats file/folder sizes to standard.
    """
    if file_size is None:
        file_size = 0

    formatted_size = format_byte_size(file_size, binary=True)
    return formatted_size


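`format_byte_size_binary` above delegates the real work to `format_byte_size(..., binary=True)`. As a rough illustration of what binary (IEC) formatting involves, here is a minimal, dependency-free sketch; the unit ladder and the one-decimal output format are assumptions, not the real helper's exact output:

```python
# Hypothetical stand-in for format_byte_size(..., binary=True): repeatedly
# divide by 1024 and pick an IEC unit. Output format is illustrative only.
def format_byte_size_binary_sketch(file_size):
    size = float(file_size or 0)  # treat None as 0, like the helper above
    units = ['B', 'KiB', 'MiB', 'GiB', 'TiB']
    idx = 0
    while size >= 1024 and idx < len(units) - 1:
        size /= 1024.0
        idx += 1
    return '{:.1f} {}'.format(size, units[idx])
```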
def urlify_text(text_, safe=True, **href_attrs):
    """
    Extract urls from text and make html links out of them
    """

    url_pat = re.compile(r'''(http[s]?://(?:[a-zA-Z]|[0-9]|[$-_@#.&+]'''
                         r'''|[!*\(\),]|(?:%[0-9a-fA-F][0-9a-fA-F]))+)''')

    def url_func(match_obj):
        url_full = match_obj.groups()[0]
        a_options = dict(href_attrs)
        a_options['href'] = url_full
        a_text = url_full
        return HTML.tag("a", a_text, **a_options)

    _new_text = url_pat.sub(url_func, text_)

    if safe:
        return literal(_new_text)
    return _new_text


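The substitution in `urlify_text` can be exercised standalone. This sketch reuses the same URL regex with the stdlib `re` module, but builds the anchor tag by plain string interpolation instead of `HTML.tag`, so it does not escape attributes the way the real helper does:

```python
import re

# Same URL pattern as urlify_text above; the replacement is plain string
# interpolation for illustration (no HTML escaping).
URL_PAT = re.compile(r'(http[s]?://(?:[a-zA-Z]|[0-9]|[$-_@#.&+]'
                     r'|[!*\(\),]|(?:%[0-9a-fA-F][0-9a-fA-F]))+)')

def urlify_text_sketch(text):
    return URL_PAT.sub(
        lambda m: '<a href="{0}">{0}</a>'.format(m.group(1)), text)
```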
def urlify_commits(text_, repo_name):
    """
    Extract commit ids from text and make link from them

    :param text_:
    :param repo_name: repo name to build the URL with
    """

    url_pat = re.compile(r'(^|\s)([0-9a-fA-F]{12,40})($|\s)')

    def url_func(match_obj):
        commit_id = match_obj.groups()[1]
        pref = match_obj.groups()[0]
        suf = match_obj.groups()[2]

        tmpl = (
            '%(pref)s<a class="tooltip-hovercard %(cls)s" href="%(url)s" data-hovercard-alt="%(hovercard_alt)s" data-hovercard-url="%(hovercard_url)s">'
            '%(commit_id)s</a>%(suf)s'
        )
        return tmpl % {
            'pref': pref,
            'cls': 'revision-link',
            'url': route_url(
                'repo_commit', repo_name=repo_name, commit_id=commit_id),
            'commit_id': commit_id,
            'suf': suf,
            'hovercard_alt': 'Commit: {}'.format(commit_id),
            'hovercard_url': route_url(
                'hovercard_repo_commit', repo_name=repo_name, commit_id=commit_id)
        }

    new_text = url_pat.sub(url_func, text_)

    return new_text


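Likewise, the commit-id linking in `urlify_commits` boils down to a single regex substitution. In this standalone sketch the `/{repo}/changeset/{id}` URL layout is made up for illustration; the real helper builds URLs through `route_url('repo_commit', ...)`:

```python
import re

# 12-40 hex chars delimited by whitespace or string boundaries, as above.
COMMIT_PAT = re.compile(r'(^|\s)([0-9a-fA-F]{12,40})($|\s)')

def urlify_commits_sketch(text, repo_name):
    def repl(m):
        pre, commit_id, suf = m.groups()
        # URL layout below is hypothetical, not the real route_url() output
        return '{}<a href="/{}/changeset/{}">{}</a>{}'.format(
            pre, repo_name, commit_id, commit_id, suf)
    return COMMIT_PAT.sub(repl, text)
```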
def _process_url_func(match_obj, repo_name, uid, entry,
                      return_raw_data=False, link_format='html'):
    pref = ''
    if match_obj.group().startswith(' '):
        pref = ' '

    issue_id = ''.join(match_obj.groups())

    if link_format == 'html':
        tmpl = (
            '%(pref)s<a class="tooltip %(cls)s" href="%(url)s" title="%(title)s">'
            '%(issue-prefix)s%(id-repr)s'
            '</a>')
    elif link_format == 'html+hovercard':
        tmpl = (
            '%(pref)s<a class="tooltip-hovercard %(cls)s" href="%(url)s" data-hovercard-url="%(hovercard_url)s">'
            '%(issue-prefix)s%(id-repr)s'
            '</a>')
    elif link_format in ['rst', 'rst+hovercard']:
        tmpl = '`%(issue-prefix)s%(id-repr)s <%(url)s>`_'
    elif link_format in ['markdown', 'markdown+hovercard']:
        tmpl = '[%(pref)s%(issue-prefix)s%(id-repr)s](%(url)s)'
    else:
        raise ValueError('Bad link_format:{}'.format(link_format))

    (repo_name_cleaned,
     parent_group_name) = RepoGroupModel()._get_group_name_and_parent(repo_name)

    # variables replacement
    named_vars = {
        'id': issue_id,
        'repo': repo_name,
        'repo_name': repo_name_cleaned,
        'group_name': parent_group_name,
        # set dummy keys so we always have them
        'hostname': '',
        'netloc': '',
        'scheme': ''
    }

    request = get_current_request()
    if request:
        # exposes, hostname, netloc, scheme
        host_data = get_host_info(request)
        named_vars.update(host_data)

    # named regex variables
    named_vars.update(match_obj.groupdict())
    _url = string.Template(entry['url']).safe_substitute(**named_vars)
    desc = string.Template(escape(entry['desc'])).safe_substitute(**named_vars)
    hovercard_url = string.Template(entry.get('hovercard_url', '')).safe_substitute(**named_vars)

    def quote_cleaner(input_str):
        """Remove quotes as it's HTML"""
        return input_str.replace('"', '')

    data = {
        'pref': pref,
        'cls': quote_cleaner('issue-tracker-link'),
        'url': quote_cleaner(_url),
        'id-repr': issue_id,
        'issue-prefix': entry['pref'],
        'serv': entry['url'],
        'title': bleach.clean(desc, strip=True),
        'hovercard_url': hovercard_url
    }

    if return_raw_data:
        return {
            'id': issue_id,
            'url': _url
        }
    return tmpl % data


def get_active_pattern_entries(repo_name):
    repo = None
    if repo_name:
        # Retrieve the repo to avoid an invalid repo_name exploding inside
        # IssueTrackerSettingsModel, while still passing the invalid name further down
        repo = Repository.get_by_repo_name(repo_name, cache=True)

    settings_model = IssueTrackerSettingsModel(repo=repo)
    active_entries = settings_model.get_settings(cache=True)
    return active_entries


pr_pattern_re = regex.compile(r'(?:(?:^!)|(?: !))(\d+)')

allowed_link_formats = [
    'html', 'rst', 'markdown', 'html+hovercard', 'rst+hovercard', 'markdown+hovercard']


def process_patterns(text_string, repo_name, link_format='html', active_entries=None):

    if link_format not in allowed_link_formats:
        raise ValueError('Link format can be only one of:{} got {}'.format(
            allowed_link_formats, link_format))

    if active_entries is None:
        log.debug('Fetch active issue tracker patterns for repo: %s', repo_name)
        active_entries = get_active_pattern_entries(repo_name)

    issues_data = []
    errors = []
    new_text = text_string

    log.debug('Got %s pattern entries to process', len(active_entries))
    for uid, entry in active_entries.items():

        if not (entry['pat'] and entry['url']):
            log.debug('skipping due to missing data')
            continue

        log.debug('issue tracker entry: uid: `%s` PAT:%s URL:%s PREFIX:%s',
                  uid, entry['pat'], entry['url'], entry['pref'])

        if entry.get('pat_compiled'):
            pattern = entry['pat_compiled']
        else:
            try:
                pattern = regex.compile(r'%s' % entry['pat'])
            except regex.error as e:
                regex_err = ValueError('{}:{}'.format(entry['pat'], e))
                log.exception('issue tracker pattern: `%s` failed to compile', regex_err)
                errors.append(regex_err)
                continue

        data_func = partial(
            _process_url_func, repo_name=repo_name, entry=entry, uid=uid,
            return_raw_data=True)

        for match_obj in pattern.finditer(text_string):
            issues_data.append(data_func(match_obj))

        url_func = partial(
            _process_url_func, repo_name=repo_name, entry=entry, uid=uid,
            link_format=link_format)

        new_text = pattern.sub(url_func, new_text)
        log.debug('processed prefix:uid `%s`', uid)

    # finally use global replace, eg !123 -> pr-link, those will not catch
    # if already similar pattern exists
    server_url = '${scheme}://${netloc}'
    pr_entry = {
        'pref': '!',
        'url': server_url + '/_admin/pull-requests/${id}',
        'desc': 'Pull Request !${id}',
        'hovercard_url': server_url + '/_hovercard/pull_request/${id}'
    }
    pr_url_func = partial(
        _process_url_func, repo_name=repo_name, entry=pr_entry, uid=None,
        link_format=link_format+'+hovercard')
    new_text = pr_pattern_re.sub(pr_url_func, new_text)
    log.debug('processed !pr pattern')

    return new_text, issues_data, errors


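The core of `process_patterns` is the pairing of a per-entry regex with a `string.Template` URL whose `${...}` variables come from the match's named groups. Below is a self-contained sketch of that mechanism, using the stdlib `re` module instead of the `regex` module, with a made-up tracker entry:

```python
import re
import string

# One issue-tracker entry reduced to its essentials: a pattern plus a URL
# template filled from named groups (the entry values here are invented).
def apply_issue_pattern(text, pat, url_tmpl):
    pattern = re.compile(pat)

    def repl(match_obj):
        # ${...} placeholders resolve from the match's named groups
        url = string.Template(url_tmpl).safe_substitute(**match_obj.groupdict())
        return '<a href="{}">{}</a>'.format(url, match_obj.group(0))

    return pattern.sub(repl, text)
```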
def urlify_commit_message(commit_text, repository=None, active_pattern_entries=None,
                          issues_container_callback=None, error_container=None):
    """
    Parses given text message and makes proper links.
    issues are linked to given issue-server, and rest is a commit link
    """

    def escaper(_text):
        return _text.replace('<', '&lt;').replace('>', '&gt;')

    new_text = escaper(commit_text)

    # extract http/https links and make them real urls
    new_text = urlify_text(new_text, safe=False)

    # urlify commits - extract commit ids and make link out of them, if we have
    # the scope of repository present.
    if repository:
        new_text = urlify_commits(new_text, repository)

    # process issue tracker patterns
    new_text, issues, errors = process_patterns(
        new_text, repository or '', active_entries=active_pattern_entries)

    if issues_container_callback is not None:
        for issue in issues:
            issues_container_callback(issue)

    if error_container is not None:
        error_container.extend(errors)

    return literal(new_text)


def render_binary(repo_name, file_obj):
    """
    Choose how to render a binary file
    """

    # unicode
    filename = file_obj.name

    # images
    for ext in ['*.png', '*.jpeg', '*.jpg', '*.ico', '*.gif']:
        if fnmatch.fnmatch(filename, pat=ext):
            src = route_path(
                'repo_file_raw', repo_name=repo_name,
                commit_id=file_obj.commit.raw_id,
                f_path=file_obj.path)

            return literal(
                '<img class="rendered-binary" alt="rendered-image" src="{}">'.format(src))


def renderer_from_filename(filename, exclude=None):
    """
    choose a renderer based on filename, this works only for text based files
    """

    # ipython
    for ext in ['*.ipynb']:
        if fnmatch.fnmatch(filename, pat=ext):
            return 'jupyter'

    is_markup = MarkupRenderer.renderer_from_filename(filename, exclude=exclude)
    if is_markup:
        return is_markup
    return None


def render(source, renderer='rst', mentions=False, relative_urls=None,
           repo_name=None, active_pattern_entries=None, issues_container_callback=None):

    def maybe_convert_relative_links(html_source):
        if relative_urls:
            return relative_links(html_source, relative_urls)
        return html_source

    if renderer == 'plain':
        return literal(
            MarkupRenderer.plain(source, leading_newline=False))

    elif renderer == 'rst':
        if repo_name:
            # process patterns on comments if we pass in repo name
            source, issues, errors = process_patterns(
                source, repo_name, link_format='rst',
                active_entries=active_pattern_entries)
            if issues_container_callback is not None:
                for issue in issues:
                    issues_container_callback(issue)

        return literal(
            '<div class="rst-block">%s</div>' %
            maybe_convert_relative_links(
                MarkupRenderer.rst(source, mentions=mentions)))

    elif renderer == 'markdown':
        if repo_name:
            # process patterns on comments if we pass in repo name
            source, issues, errors = process_patterns(
                source, repo_name, link_format='markdown',
                active_entries=active_pattern_entries)
            if issues_container_callback is not None:
                for issue in issues:
                    issues_container_callback(issue)

        return literal(
            '<div class="markdown-block">%s</div>' %
            maybe_convert_relative_links(
                MarkupRenderer.markdown(source, flavored=True,
                                        mentions=mentions)))

    elif renderer == 'jupyter':
        return literal(
            '<div class="ipynb">%s</div>' %
            maybe_convert_relative_links(
                MarkupRenderer.jupyter(source)))

    # None means just show the file-source
    return None


def commit_status(repo, commit_id):
    return ChangesetStatusModel().get_status(repo, commit_id)


def commit_status_lbl(commit_status):
    return dict(ChangesetStatus.STATUSES).get(commit_status)


def commit_time(repo_name, commit_id):
    repo = Repository.get_by_repo_name(repo_name)
    commit = repo.get_commit(commit_id=commit_id)
    return commit.date


def get_permission_name(key):
    return dict(Permission.PERMS).get(key)


def journal_filter_help(request):
    _ = request.translate
    from rhodecode.lib.audit_logger import ACTIONS
    actions = '\n'.join(textwrap.wrap(', '.join(sorted(ACTIONS.keys())), 80))

    return _(
        'Example filter terms:\n' +
        ' repository:vcs\n' +
        ' username:marcin\n' +
        ' username:(NOT marcin)\n' +
        ' action:*push*\n' +
        ' ip:127.0.0.1\n' +
        ' date:20120101\n' +
        ' date:[20120101100000 TO 20120102]\n' +
        '\n' +
        'Actions: {actions}\n' +
        '\n' +
        'Generate wildcards using \'*\' character:\n' +
        ' "repository:vcs*" - search everything starting with \'vcs\'\n' +
        ' "repository:*vcs*" - search for repository containing \'vcs\'\n' +
        '\n' +
        'Optional AND / OR operators in queries\n' +
        ' "repository:vcs OR repository:test"\n' +
        ' "username:test AND repository:test*"\n'
    ).format(actions=actions)


def not_mapped_error(repo_name):
    from rhodecode.translation import _
    flash(_('%s repository is not mapped to db perhaps'
            ' it was created or renamed from the filesystem'
            ' please run the application again'
            ' in order to rescan repositories') % repo_name, category='error')


def ip_range(ip_addr):
    from rhodecode.model.db import UserIpMap
    s, e = UserIpMap._get_ip_range(ip_addr)
    return '%s - %s' % (s, e)


def form(url, method='post', needs_csrf_token=True, **attrs):
    """Wrapper around webhelpers.tags.form to prevent CSRF attacks."""
    if method.lower() != 'get' and needs_csrf_token:
        raise Exception(
            'Forms to POST/PUT/DELETE endpoints should have (in general) a ' +
            'CSRF token. If the endpoint does not require such token you can ' +
            'explicitly set the parameter needs_csrf_token to false.')

    return insecure_form(url, method=method, **attrs)


def secure_form(form_url, method="POST", multipart=False, **attrs):
    """Start a form tag that points the action to an url. This
    form tag will also include the hidden field containing
    the auth token.

    The url options should be given either as a string, or as a
    ``url()`` function. The method for the form defaults to POST.

    Options:

    ``multipart``
        If set to True, the enctype is set to "multipart/form-data".
    ``method``
        The method to use when submitting the form, usually either
        "GET" or "POST". If "PUT", "DELETE", or another verb is used, a
        hidden input with name _method is added to simulate the verb
        over POST.

    """

    if 'request' in attrs:
        session = attrs['request'].session
        del attrs['request']
    else:
        raise ValueError(
            'Calling this form requires request= to be passed as argument')

    _form = insecure_form(form_url, method, multipart, **attrs)
    token = literal(
        '<input type="hidden" name="{}" value="{}">'.format(
            csrf_token_key, get_csrf_token(session)))

    return literal("%s\n%s" % (_form, token))


def dropdownmenu(name, selected, options, enable_filter=False, **attrs):
    select_html = select(name, selected, options, **attrs)

    select2 = """
    <script>
    $(document).ready(function() {
        $('#%s').select2({
            containerCssClass: 'drop-menu %s',
            dropdownCssClass: 'drop-menu-dropdown',
            dropdownAutoWidth: true%s
        });
    });
    </script>
    """

    filter_option = """,
        minimumResultsForSearch: -1
    """
    input_id = attrs.get('id') or name
    extra_classes = ' '.join(attrs.pop('extra_classes', []))
    filter_enabled = "" if enable_filter else filter_option
    select_script = literal(select2 % (input_id, extra_classes, filter_enabled))

    return literal(select_html+select_script)


def get_visual_attr(tmpl_context_var, attr_name):
    """
    A safe way to get a variable from visual variable of template context

    :param tmpl_context_var: instance of tmpl_context, usually present as `c`
    :param attr_name: name of the attribute we fetch from the c.visual
    """
    visual = getattr(tmpl_context_var, 'visual', None)
    if not visual:
        return
    else:
        return getattr(visual, attr_name, None)


def get_last_path_part(file_node):
    if not file_node.path:
        return u'/'

    path = safe_unicode(file_node.path.split('/')[-1])
    return u'../' + path


2013 def route_url(*args, **kwargs):
2014 def route_url(*args, **kwargs):
2014 """
2015 """
2015 Wrapper around pyramids `route_url` (fully qualified url) function.
2016 Wrapper around pyramids `route_url` (fully qualified url) function.
2016 """
2017 """
2017 req = get_current_request()
2018 req = get_current_request()
2018 return req.route_url(*args, **kwargs)
2019 return req.route_url(*args, **kwargs)
2019
2020
2020
2021
2021 def route_path(*args, **kwargs):
2022 def route_path(*args, **kwargs):
2022 """
2023 """
2023 Wrapper around pyramids `route_path` function.
2024 Wrapper around pyramids `route_path` function.
2024 """
2025 """
2025 req = get_current_request()
2026 req = get_current_request()
2026 return req.route_path(*args, **kwargs)
2027 return req.route_path(*args, **kwargs)
2027
2028
2028
2029
2029 def route_path_or_none(*args, **kwargs):
2030 def route_path_or_none(*args, **kwargs):
2030 try:
2031 try:
2031 return route_path(*args, **kwargs)
2032 return route_path(*args, **kwargs)
2032 except KeyError:
2033 except KeyError:
2033 return None
2034 return None
2034
2035
2035
2036
2036 def current_route_path(request, **kw):
2037 def current_route_path(request, **kw):
2037 new_args = request.GET.mixed()
2038 new_args = request.GET.mixed()
2038 new_args.update(kw)
2039 new_args.update(kw)
2039 return request.current_route_path(_query=new_args)
2040 return request.current_route_path(_query=new_args)
2040
2041
2041
2042
2042 def curl_api_example(method, args):
2043 def curl_api_example(method, args):
2043 args_json = json.dumps(OrderedDict([
2044 args_json = json.dumps(OrderedDict([
2044 ('id', 1),
2045 ('id', 1),
2045 ('auth_token', 'SECRET'),
2046 ('auth_token', 'SECRET'),
2046 ('method', method),
2047 ('method', method),
2047 ('args', args)
2048 ('args', args)
2048 ]))
2049 ]))
2049
2050
2050 return "curl {api_url} -X POST -H 'content-type:text/plain' --data-binary '{args_json}'".format(
2051 return "curl {api_url} -X POST -H 'content-type:text/plain' --data-binary '{args_json}'".format(
2051 api_url=route_url('apiv2'),
2052 api_url=route_url('apiv2'),
2052 args_json=args_json
2053 args_json=args_json
2053 )
2054 )
2054
2055
2055
2056
2056 def api_call_example(method, args):
2057 def api_call_example(method, args):
2057 """
2058 """
2058 Generates an API call example via CURL
2059 Generates an API call example via CURL
2059 """
2060 """
2060 curl_call = curl_api_example(method, args)
2061 curl_call = curl_api_example(method, args)
2061
2062
2062 return literal(
2063 return literal(
2063 curl_call +
2064 curl_call +
2064 "<br/><br/>SECRET can be found in <a href=\"{token_url}\">auth-tokens</a> page, "
2065 "<br/><br/>SECRET can be found in <a href=\"{token_url}\">auth-tokens</a> page, "
2065 "and needs to be of `api calls` role."
2066 "and needs to be of `api calls` role."
2066 .format(token_url=route_url('my_account_auth_tokens')))
2067 .format(token_url=route_url('my_account_auth_tokens')))
2067
2068
2068
2069
2069 def notification_description(notification, request):
2070 def notification_description(notification, request):
2070 """
2071 """
2071 Generate notification human readable description based on notification type
2072 Generate notification human readable description based on notification type
2072 """
2073 """
2073 from rhodecode.model.notification import NotificationModel
2074 from rhodecode.model.notification import NotificationModel
2074 return NotificationModel().make_description(
2075 return NotificationModel().make_description(
2075 notification, translate=request.translate)
2076 notification, translate=request.translate)
2076
2077
2077
2078
2078 def go_import_header(request, db_repo=None):
2079 def go_import_header(request, db_repo=None):
2079 """
2080 """
2080 Creates a header for go-import functionality in Go Lang
2081 Creates a header for go-import functionality in Go Lang
2081 """
2082 """
2082
2083
2083 if not db_repo:
2084 if not db_repo:
2084 return
2085 return
2085 if 'go-get' not in request.GET:
2086 if 'go-get' not in request.GET:
2086 return
2087 return
2087
2088
2088 clone_url = db_repo.clone_url()
2089 clone_url = db_repo.clone_url()
2089 prefix = re.split(r'^https?:\/\/', clone_url)[-1]
2090 prefix = re.split(r'^https?:\/\/', clone_url)[-1]
2090 # we have a repo and go-get flag,
2091 # we have a repo and go-get flag,
2091 return literal('<meta name="go-import" content="{} {} {}">'.format(
2092 return literal('<meta name="go-import" content="{} {} {}">'.format(
2092 prefix, db_repo.repo_type, clone_url))
2093 prefix, db_repo.repo_type, clone_url))
2093
2094
2094
2095
2095 def reviewer_as_json(*args, **kwargs):
2096 def reviewer_as_json(*args, **kwargs):
2096 from rhodecode.apps.repository.utils import reviewer_as_json as _reviewer_as_json
2097 from rhodecode.apps.repository.utils import reviewer_as_json as _reviewer_as_json
2097 return _reviewer_as_json(*args, **kwargs)
2098 return _reviewer_as_json(*args, **kwargs)
2098
2099
2099
2100
2100 def get_repo_view_type(request):
2101 def get_repo_view_type(request):
2101 route_name = request.matched_route.name
2102 route_name = request.matched_route.name
2102 route_to_view_type = {
2103 route_to_view_type = {
2103 'repo_changelog': 'commits',
2104 'repo_changelog': 'commits',
2104 'repo_commits': 'commits',
2105 'repo_commits': 'commits',
2105 'repo_files': 'files',
2106 'repo_files': 'files',
2106 'repo_summary': 'summary',
2107 'repo_summary': 'summary',
2107 'repo_commit': 'commit'
2108 'repo_commit': 'commit'
2108 }
2109 }
2109
2110
2110 return route_to_view_type.get(route_name)
2111 return route_to_view_type.get(route_name)
2111
2112
2112
2113
2113 def is_active(menu_entry, selected):
2114 def is_active(menu_entry, selected):
2114 """
2115 """
2115 Returns active class for selecting menus in templates
2116 Returns active class for selecting menus in templates
2116 <li class=${h.is_active('settings', current_active)}></li>
2117 <li class=${h.is_active('settings', current_active)}></li>
2117 """
2118 """
2118 if not isinstance(menu_entry, list):
2119 if not isinstance(menu_entry, list):
2119 menu_entry = [menu_entry]
2120 menu_entry = [menu_entry]
2120
2121
2121 if selected in menu_entry:
2122 if selected in menu_entry:
2122 return "active"
2123 return "active"
2123
2124
2124
2125
2125 class IssuesRegistry(object):
2126 class IssuesRegistry(object):
2126 """
2127 """
2127 issue_registry = IssuesRegistry()
2128 issue_registry = IssuesRegistry()
2128 some_func(issues_callback=issues_registry(...))
2129 some_func(issues_callback=issues_registry(...))
2129 """
2130 """
2130
2131
2131 def __init__(self):
2132 def __init__(self):
2132 self.issues = []
2133 self.issues = []
2133 self.unique_issues = collections.defaultdict(lambda: [])
2134 self.unique_issues = collections.defaultdict(lambda: [])
2134
2135
2135 def __call__(self, commit_dict=None):
2136 def __call__(self, commit_dict=None):
2136 def callback(issue):
2137 def callback(issue):
2137 if commit_dict and issue:
2138 if commit_dict and issue:
2138 issue['commit'] = commit_dict
2139 issue['commit'] = commit_dict
2139 self.issues.append(issue)
2140 self.issues.append(issue)
2140 self.unique_issues[issue['id']].append(issue)
2141 self.unique_issues[issue['id']].append(issue)
2141 return callback
2142 return callback
2142
2143
2143 def get_issues(self):
2144 def get_issues(self):
2144 return self.issues
2145 return self.issues
2145
2146
2146 @property
2147 @property
2147 def issues_unique_count(self):
2148 def issues_unique_count(self):
2148 return len(set(i['id'] for i in self.issues))
2149 return len(set(i['id'] for i in self.issues))
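
The `curl_api_example` helper above serializes an `id`/`auth_token`/`method`/`args` mapping into a JSON-RPC style payload and wraps it in a `curl` command line. A standalone sketch of the same construction — with the `route_url('apiv2')` lookup replaced by an explicit `api_url` parameter, since `route_url` needs a live Pyramid request (the URL used here is an assumption for illustration only):

```python
import json
from collections import OrderedDict


def curl_api_example(method, args, api_url='https://rhodecode.example/_admin/api'):
    # same payload shape as the helper above; OrderedDict keeps the
    # id/auth_token/method/args key order stable in the JSON output
    args_json = json.dumps(OrderedDict([
        ('id', 1),
        ('auth_token', 'SECRET'),
        ('method', method),
        ('args', args),
    ]))
    return ("curl {api_url} -X POST -H 'content-type:text/plain' "
            "--data-binary '{args_json}'").format(api_url=api_url, args_json=args_json)


cmd = curl_api_example('get_repo', {'repoid': 'my-repo'})
```

The resulting command posts the whole JSON document as the request body, which is why the helper uses `--data-binary` rather than form-encoded `--data`.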
@@ -1,171 +1,186 @@
# -*- coding: utf-8 -*-

# Copyright (C) 2010-2020 RhodeCode GmbH
#
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU Affero General Public License, version 3
# (only), as published by the Free Software Foundation.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU Affero General Public License
# along with this program.  If not, see <http://www.gnu.org/licenses/>.
#
# This program is dual-licensed. If you wish to learn more about the
# RhodeCode Enterprise Edition, including its added features, Support services,
# and proprietary license terms, please see https://rhodecode.com/licenses/

import sys
import logging


BLACK, RED, GREEN, YELLOW, BLUE, MAGENTA, CYAN, WHITE = xrange(30, 38)

# Sequences
RESET_SEQ = "\033[0m"
COLOR_SEQ = "\033[0;%dm"
BOLD_SEQ = "\033[1m"

COLORS = {
    'CRITICAL': MAGENTA,
    'ERROR': RED,
    'WARNING': CYAN,
    'INFO': GREEN,
    'DEBUG': BLUE,
    'SQL': YELLOW
}


def one_space_trim(s):
    if s.find("  ") == -1:
        return s
    else:
        s = s.replace('  ', ' ')
        return one_space_trim(s)


def format_sql(sql):
    sql = sql.replace('\n', '')
    sql = one_space_trim(sql)
    sql = sql\
        .replace(',', ',\n\t')\
        .replace('SELECT', '\n\tSELECT \n\t')\
        .replace('UPDATE', '\n\tUPDATE \n\t')\
        .replace('DELETE', '\n\tDELETE \n\t')\
        .replace('FROM', '\n\tFROM')\
        .replace('ORDER BY', '\n\tORDER BY')\
        .replace('LIMIT', '\n\tLIMIT')\
        .replace('WHERE', '\n\tWHERE')\
        .replace('AND', '\n\tAND')\
        .replace('LEFT', '\n\tLEFT')\
        .replace('INNER', '\n\tINNER')\
        .replace('INSERT', '\n\tINSERT')\
        .replace('DELETE', '\n\tDELETE')
    return sql


class ExceptionAwareFormatter(logging.Formatter):
    """
    Extended logging formatter which prints out remote tracebacks.
    """

    def formatException(self, ei):
        ex_type, ex_value, ex_tb = ei

        local_tb = logging.Formatter.formatException(self, ei)
        if hasattr(ex_value, '_vcs_server_traceback'):

            def formatRemoteTraceback(remote_tb_lines):
                result = ["\n +--- This exception occurred remotely on VCSServer - Remote traceback:\n\n"]
                result.append(remote_tb_lines)
                result.append("\n +--- End of remote traceback\n")
                return result

            try:
                if ex_type is not None and ex_value is None and ex_tb is None:
                    # possible old (3.x) call syntax where caller is only
                    # providing exception object
                    if type(ex_type) is not type:
                        raise TypeError(
                            "invalid argument: ex_type should be an exception "
                            "type, or just supply no arguments at all")
                if ex_type is None and ex_tb is None:
                    ex_type, ex_value, ex_tb = sys.exc_info()

                remote_tb = getattr(ex_value, "_vcs_server_traceback", None)

                if remote_tb:
                    remote_tb = formatRemoteTraceback(remote_tb)
                    return local_tb + ''.join(remote_tb)
            finally:
                # clean up cycle to traceback, to allow proper GC
                del ex_type, ex_value, ex_tb

        return local_tb


class ColorFormatter(ExceptionAwareFormatter):

    def format(self, record):
        """
        Changes record's levelname to use with COLORS enum
        """

        levelname = record.levelname
        start = COLOR_SEQ % (COLORS[levelname])
        def_record = logging.Formatter.format(self, record)
        end = RESET_SEQ

        colored_record = ''.join([start, def_record, end])
        return colored_record


def _inject_req_id(record):
    from pyramid.threadlocal import get_current_request
    dummy = '00000000-0000-0000-0000-000000000000'
    req_id = None

    req = get_current_request()
    if req:
        req_id = getattr(req, 'req_id', None)

    req_id = 'req_id:%-36s' % (req_id or dummy)
    record.req_id = req_id


def _add_log_to_debug_bucket(formatted_record):
    from pyramid.threadlocal import get_current_request
    req = get_current_request()
    if req:
        req.req_id_bucket.append(formatted_record)


class RequestTrackingFormatter(ExceptionAwareFormatter):
    def format(self, record):
        _inject_req_id(record)
        def_record = logging.Formatter.format(self, record)
        _add_log_to_debug_bucket(def_record)
        return def_record


class ColorRequestTrackingFormatter(ColorFormatter):

    def format(self, record):
        """
        Changes record's levelname to use with COLORS enum
        """
        _inject_req_id(record)
        levelname = record.levelname
        start = COLOR_SEQ % (COLORS[levelname])
        def_record = logging.Formatter.format(self, record)
        end = RESET_SEQ

        colored_record = ''.join([start, def_record, end])
        _add_log_to_debug_bucket(def_record)
        return colored_record


class ColorFormatterSql(logging.Formatter):

    def format(self, record):
        """
        Changes record's levelname to use with COLORS enum
        """

        start = COLOR_SEQ % (COLORS['SQL'])
        def_record = format_sql(logging.Formatter.format(self, record))
        end = RESET_SEQ

        colored_record = ''.join([start, def_record, end])
        return colored_record


# marcink: needs to stay with this name for backward .ini compatibility
Pyro4AwareFormatter = ExceptionAwareFormatter
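
The color formatters above all follow the same pattern: pick an ANSI start sequence from `COLORS` by level name, format the record normally, then append the reset sequence. A minimal self-contained sketch of that pattern (Python 3, using an explicit two-entry color map instead of the module's `xrange(30, 38)` unpacking):

```python
import logging

# ANSI escape sequences, as defined at the top of the module
RESET_SEQ = "\033[0m"
COLOR_SEQ = "\033[0;%dm"
COLORS = {'ERROR': 31, 'INFO': 32}  # red, green


class DemoColorFormatter(logging.Formatter):
    # minimal stand-in for ColorFormatter: wrap the formatted record
    # in a start-color/reset pair keyed on the record's level name
    def format(self, record):
        start = COLOR_SEQ % COLORS[record.levelname]
        return ''.join([start, logging.Formatter.format(self, record), RESET_SEQ])


# build a record by hand so the demo needs no logger configuration
record = logging.LogRecord('demo', logging.ERROR, __file__, 1, 'boom', None, None)
colored = DemoColorFormatter('%(levelname)s %(message)s').format(record)
```

Because the color wrapping happens in `format()`, the same handler configuration works unchanged for the plain and colored variants, which is how the module keeps `RequestTrackingFormatter` and `ColorRequestTrackingFormatter` interchangeable in `.ini` logging config.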
@@ -1,363 +1,364 @@
# -*- coding: utf-8 -*-

# Copyright (C) 2015-2020 RhodeCode GmbH
#
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU Affero General Public License, version 3
# (only), as published by the Free Software Foundation.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU Affero General Public License
# along with this program.  If not, see <http://www.gnu.org/licenses/>.
#
# This program is dual-licensed. If you wish to learn more about the
# RhodeCode Enterprise Edition, including its added features, Support services,
# and proprietary license terms, please see https://rhodecode.com/licenses/

import time
import errno
import logging

import msgpack
import gevent
import redis

from dogpile.cache.api import CachedValue
from dogpile.cache.backends import memory as memory_backend
from dogpile.cache.backends import file as file_backend
from dogpile.cache.backends import redis as redis_backend
from dogpile.cache.backends.file import NO_VALUE, compat, FileLock
from dogpile.cache.util import memoized_property

from pyramid.settings import asbool

from rhodecode.lib.memory_lru_dict import LRUDict, LRUDictDebug
from rhodecode.lib.utils import safe_str


_default_max_size = 1024

log = logging.getLogger(__name__)


class LRUMemoryBackend(memory_backend.MemoryBackend):
    key_prefix = 'lru_mem_backend'
    pickle_values = False

    def __init__(self, arguments):
        max_size = arguments.pop('max_size', _default_max_size)

        LRUDictClass = LRUDict
        if arguments.pop('log_key_count', None):
            LRUDictClass = LRUDictDebug

        arguments['cache_dict'] = LRUDictClass(max_size)
        super(LRUMemoryBackend, self).__init__(arguments)

    def delete(self, key):
        try:
            del self._cache[key]
        except KeyError:
            # we don't care if key isn't there at deletion
            pass

    def delete_multi(self, keys):
        for key in keys:
            self.delete(key)


class PickleSerializer(object):

    def _dumps(self, value, safe=False):
        try:
            return compat.pickle.dumps(value)
        except Exception:
            if safe:
                return NO_VALUE
            else:
                raise

    def _loads(self, value, safe=True):
        try:
            return compat.pickle.loads(value)
        except Exception:
            if safe:
                return NO_VALUE
            else:
                raise


class MsgPackSerializer(object):

    def _dumps(self, value, safe=False):
        try:
            return msgpack.packb(value)
        except Exception:
            if safe:
                return NO_VALUE
            else:
                raise

    def _loads(self, value, safe=True):
        """
        pickle maintained the `CachedValue` wrapper of the tuple
        msgpack does not, so it must be added back in.
        """
        try:
            value = msgpack.unpackb(value, use_list=False)
            return CachedValue(*value)
        except Exception:
            if safe:
                return NO_VALUE
            else:
                raise


import fcntl
flock_org = fcntl.flock


class CustomLockFactory(FileLock):

    @memoized_property
    def _module(self):

        def gevent_flock(fd, operation):
            """
            Gevent compatible flock
            """
            # set non-blocking, this will cause an exception if we cannot acquire a lock
            operation |= fcntl.LOCK_NB
            start_lock_time = time.time()
            timeout = 60 * 15  # 15min
            while True:
                try:
                    flock_org(fd, operation)
                    # lock has been acquired
                    break
                except (OSError, IOError) as e:
                    # raise on other errors than Resource temporarily unavailable
                    if e.errno != errno.EAGAIN:
                        raise
                    elif (time.time() - start_lock_time) > timeout:
                        # waited too long on a lock, better fail than loop forever
                        log.error('Failed to acquire lock on `%s` after waiting %ss',
                                  self.filename, timeout)
                        raise
                    wait_timeout = 0.03
                    log.debug('Failed to acquire lock on `%s`, retry in %ss',
                              self.filename, wait_timeout)
                    gevent.sleep(wait_timeout)

        fcntl.flock = gevent_flock
        return fcntl


class FileNamespaceBackend(PickleSerializer, file_backend.DBMBackend):
    key_prefix = 'file_backend'

    def __init__(self, arguments):
        arguments['lock_factory'] = CustomLockFactory
        db_file = arguments.get('filename')

        log.debug('initializing %s DB in %s', self.__class__.__name__, db_file)
        try:
            super(FileNamespaceBackend, self).__init__(arguments)
        except Exception:
            log.exception('Failed to initialize db at: %s', db_file)
            raise

    def __repr__(self):
        return '{} `{}`'.format(self.__class__, self.filename)

    def list_keys(self, prefix=''):
        prefix = '{}:{}'.format(self.key_prefix, prefix)

        def cond(v):
            if not prefix:
                return True

            if v.startswith(prefix):
                return True
            return False

        with self._dbm_file(True) as dbm:
            try:
                return filter(cond, dbm.keys())
            except Exception:
                log.error('Failed to fetch DBM keys from DB: %s', self.get_store())
                raise

    def get_store(self):
        return self.filename

    def _dbm_get(self, key):
        with self._dbm_file(False) as dbm:
            if hasattr(dbm, 'get'):
                value = dbm.get(key, NO_VALUE)
            else:
                # gdbm objects lack a .get method
                try:
                    value = dbm[key]
                except KeyError:
                    value = NO_VALUE
            if value is not NO_VALUE:
                value = self._loads(value)
            return value

    def get(self, key):
        try:
            return self._dbm_get(key)
        except Exception:
            log.error('Failed to fetch DBM key %s from DB: %s', key, self.get_store())
            raise

    def set(self, key, value):
        with self._dbm_file(True) as dbm:
            dbm[key] = self._dumps(value)

    def set_multi(self, mapping):
        with self._dbm_file(True) as dbm:
            for key, value in mapping.items():
                dbm[key] = self._dumps(value)


class BaseRedisBackend(redis_backend.RedisBackend):
    key_prefix = ''

    def __init__(self, arguments):
        super(BaseRedisBackend, self).__init__(arguments)
        self._lock_timeout = self.lock_timeout
        self._lock_auto_renewal = asbool(arguments.pop("lock_auto_renewal", True))

        if self._lock_auto_renewal and not self._lock_timeout:
            # set default timeout for auto_renewal
            self._lock_timeout = 30

    def _create_client(self):
        args = {}

        if self.url is not None:
            args.update(url=self.url)

        else:
            args.update(
                host=self.host, password=self.password,
                port=self.port, db=self.db
            )

        connection_pool = redis.ConnectionPool(**args)

        return redis.StrictRedis(connection_pool=connection_pool)

    def list_keys(self, prefix=''):
        prefix = '{}:{}*'.format(self.key_prefix, prefix)
        return self.client.keys(prefix)

    def get_store(self):
        return self.client.connection_pool

    def get(self, key):
        value = self.client.get(key)
        if value is None:
            return NO_VALUE
        return self._loads(value)

    def get_multi(self, keys):
        if not keys:
            return []
        values = self.client.mget(keys)
        loads = self._loads
        return [
            loads(v) if v is not None else NO_VALUE
            for v in values]

    def set(self, key, value):
        if self.redis_expiration_time:
280 if self.redis_expiration_time:
280 self.client.setex(key, self.redis_expiration_time,
281 self.client.setex(key, self.redis_expiration_time,
281 self._dumps(value))
282 self._dumps(value))
282 else:
283 else:
283 self.client.set(key, self._dumps(value))
284 self.client.set(key, self._dumps(value))
284
285
285 def set_multi(self, mapping):
286 def set_multi(self, mapping):
286 dumps = self._dumps
287 dumps = self._dumps
287 mapping = dict(
288 mapping = dict(
288 (k, dumps(v))
289 (k, dumps(v))
289 for k, v in mapping.items()
290 for k, v in mapping.items()
290 )
291 )
291
292
292 if not self.redis_expiration_time:
293 if not self.redis_expiration_time:
293 self.client.mset(mapping)
294 self.client.mset(mapping)
294 else:
295 else:
295 pipe = self.client.pipeline()
296 pipe = self.client.pipeline()
296 for key, value in mapping.items():
297 for key, value in mapping.items():
297 pipe.setex(key, self.redis_expiration_time, value)
298 pipe.setex(key, self.redis_expiration_time, value)
298 pipe.execute()
299 pipe.execute()
299
300
300 def get_mutex(self, key):
301 def get_mutex(self, key):
301 if self.distributed_lock:
302 if self.distributed_lock:
302 lock_key = redis_backend.u('_lock_{0}').format(key)
303 lock_key = redis_backend.u('_lock_{0}').format(safe_str(key))
303 return get_mutex_lock(self.client, lock_key, self._lock_timeout,
304 return get_mutex_lock(self.client, lock_key, self._lock_timeout,
304 auto_renewal=self._lock_auto_renewal)
305 auto_renewal=self._lock_auto_renewal)
305 else:
306 else:
306 return None
307 return None
307
308
308
309
309 class RedisPickleBackend(PickleSerializer, BaseRedisBackend):
310 class RedisPickleBackend(PickleSerializer, BaseRedisBackend):
310 key_prefix = 'redis_pickle_backend'
311 key_prefix = 'redis_pickle_backend'
311 pass
312 pass
312
313
313
314
314 class RedisMsgPackBackend(MsgPackSerializer, BaseRedisBackend):
315 class RedisMsgPackBackend(MsgPackSerializer, BaseRedisBackend):
315 key_prefix = 'redis_msgpack_backend'
316 key_prefix = 'redis_msgpack_backend'
316 pass
317 pass
317
318
318
319
319 def get_mutex_lock(client, lock_key, lock_timeout, auto_renewal=False):
320 def get_mutex_lock(client, lock_key, lock_timeout, auto_renewal=False):
320 import redis_lock
321 import redis_lock
321
322
322 class _RedisLockWrapper(object):
323 class _RedisLockWrapper(object):
323 """LockWrapper for redis_lock"""
324 """LockWrapper for redis_lock"""
324
325
325 @classmethod
326 @classmethod
326 def get_lock(cls):
327 def get_lock(cls):
327 return redis_lock.Lock(
328 return redis_lock.Lock(
328 redis_client=client,
329 redis_client=client,
329 name=lock_key,
330 name=lock_key,
330 expire=lock_timeout,
331 expire=lock_timeout,
331 auto_renewal=auto_renewal,
332 auto_renewal=auto_renewal,
332 strict=True,
333 strict=True,
333 )
334 )
334
335
335 def __repr__(self):
336 def __repr__(self):
336 return "{}:{}".format(self.__class__.__name__, lock_key)
337 return "{}:{}".format(self.__class__.__name__, lock_key)
337
338
338 def __str__(self):
339 def __str__(self):
339 return "{}:{}".format(self.__class__.__name__, lock_key)
340 return "{}:{}".format(self.__class__.__name__, lock_key)
340
341
341 def __init__(self):
342 def __init__(self):
342 self.lock = self.get_lock()
343 self.lock = self.get_lock()
343 self.lock_key = lock_key
344 self.lock_key = lock_key
344
345
345 def acquire(self, wait=True):
346 def acquire(self, wait=True):
346 log.debug('Trying to acquire Redis lock for key %s', self.lock_key)
347 log.debug('Trying to acquire Redis lock for key %s', self.lock_key)
347 try:
348 try:
348 acquired = self.lock.acquire(wait)
349 acquired = self.lock.acquire(wait)
349 log.debug('Got lock for key %s, %s', self.lock_key, acquired)
350 log.debug('Got lock for key %s, %s', self.lock_key, acquired)
350 return acquired
351 return acquired
351 except redis_lock.AlreadyAcquired:
352 except redis_lock.AlreadyAcquired:
352 return False
353 return False
353 except redis_lock.AlreadyStarted:
354 except redis_lock.AlreadyStarted:
354 # refresh thread exists, but it also means we acquired the lock
355 # refresh thread exists, but it also means we acquired the lock
355 return True
356 return True
356
357
357 def release(self):
358 def release(self):
358 try:
359 try:
359 self.lock.release()
360 self.lock.release()
360 except redis_lock.NotAcquired:
361 except redis_lock.NotAcquired:
361 pass
362 pass
362
363
363 return _RedisLockWrapper()
364 return _RedisLockWrapper()
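dogpile.cache only requires that `get_mutex` return an object exposing `acquire(wait)` and `release()`. A minimal sketch of that contract, using a process-local `threading.Lock` as a stand-in for `redis_lock` (hypothetical class name, for illustration only, not the distributed implementation above):

```python
import threading


class LocalLockWrapper(object):
    """Illustrative stand-in for _RedisLockWrapper backed by a
    process-local threading.Lock instead of redis_lock."""

    def __init__(self, lock_key):
        self.lock_key = lock_key
        self.lock = threading.Lock()

    def acquire(self, wait=True):
        # blocking=wait mirrors redis_lock.Lock.acquire(wait)
        return self.lock.acquire(wait)

    def release(self):
        self.lock.release()


mutex = LocalLockWrapper('_lock_some_cache_key')
acquired = mutex.acquire(wait=True)
# a second, non-blocking attempt fails while the lock is held
second = mutex.acquire(wait=False)
mutex.release()
```

The real wrapper adds the redis_lock-specific concerns on top of this shape: lock expiry, optional auto-renewal, and swallowing `AlreadyAcquired`/`NotAcquired`.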
@@ -1,29 +1,38 b''
# -*- coding: utf-8 -*-

# Copyright (C) 2017-2020 RhodeCode GmbH
#
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU Affero General Public License, version 3
# (only), as published by the Free Software Foundation.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU Affero General Public License
# along with this program. If not, see <http://www.gnu.org/licenses/>.
#
# This program is dual-licensed. If you wish to learn more about the
# RhodeCode Enterprise Edition, including its added features, Support services,
# and proprietary license terms, please see https://rhodecode.com/licenses/

from uuid import uuid4
from pyramid.decorator import reify
from pyramid.request import Request as _Request


class Request(_Request):
    _req_id_bucket = list()

    @reify
    def req_id(self):
        return str(uuid4())

    @property
    def req_id_bucket(self):
        return self._req_id_bucket

    def req_id_records_init(self):
        self._req_id_bucket = list()
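Pyramid's `@reify` runs the wrapped method once and then caches the result on the instance, so every later access of `req_id` returns the same UUID for the lifetime of the request. A self-contained sketch of the same descriptor pattern (re-implementing a minimal `reify` locally so it runs without pyramid installed; `DemoRequest` is a hypothetical stand-in):

```python
from uuid import uuid4


class reify(object):
    """Minimal sketch of pyramid.decorator.reify: a non-data descriptor
    whose result, once computed, shadows the descriptor itself."""

    def __init__(self, wrapped):
        self.wrapped = wrapped

    def __get__(self, inst, objtype=None):
        if inst is None:
            return self
        val = self.wrapped(inst)
        # cache on the instance; later lookups bypass __get__ entirely
        setattr(inst, self.wrapped.__name__, val)
        return val


class DemoRequest(object):
    @reify
    def req_id(self):
        return str(uuid4())


req = DemoRequest()
first = req.req_id
second = req.req_id  # cached value; uuid4 is not called again
```

This is why `req_id` behaves like a stable per-request identifier rather than a fresh UUID on each access.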
@@ -1,1938 +1,1941 b''
# -*- coding: utf-8 -*-

# Copyright (C) 2014-2020 RhodeCode GmbH
#
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU Affero General Public License, version 3
# (only), as published by the Free Software Foundation.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU Affero General Public License
# along with this program. If not, see <http://www.gnu.org/licenses/>.
#
# This program is dual-licensed. If you wish to learn more about the
# RhodeCode Enterprise Edition, including its added features, Support services,
# and proprietary license terms, please see https://rhodecode.com/licenses/

"""
Base module for all VCS systems
"""
import os
import re
import time
import shutil
import datetime
import fnmatch
import itertools
import logging
import collections
import warnings

from zope.cachedescriptors.property import Lazy as LazyProperty

from pyramid import compat

import rhodecode
from rhodecode.translation import lazy_ugettext
from rhodecode.lib.utils2 import safe_str, safe_unicode, CachedProperty
from rhodecode.lib.vcs import connection
from rhodecode.lib.vcs.utils import author_name, author_email
from rhodecode.lib.vcs.conf import settings
from rhodecode.lib.vcs.exceptions import (
    CommitError, EmptyRepositoryError, NodeAlreadyAddedError,
    NodeAlreadyChangedError, NodeAlreadyExistsError, NodeAlreadyRemovedError,
    NodeDoesNotExistError, NodeNotChangedError, VCSError,
    ImproperArchiveTypeError, BranchDoesNotExistError, CommitDoesNotExistError,
    RepositoryError)


log = logging.getLogger(__name__)


FILEMODE_DEFAULT = 0o100644
FILEMODE_EXECUTABLE = 0o100755
EMPTY_COMMIT_ID = '0' * 40

_Reference = collections.namedtuple('Reference', ('type', 'name', 'commit_id'))


class Reference(_Reference):

    @property
    def branch(self):
        if self.type == 'branch':
            return self.name

    @property
    def bookmark(self):
        if self.type == 'book':
            return self.name

    @property
    def to_unicode(self):
        return reference_to_unicode(self)


def unicode_to_reference(raw):
    """
    Convert a unicode (or string) to a reference object.
    If unicode evaluates to False it returns None.
    """
    if raw:
        refs = raw.split(':')
        return Reference(*refs)
    else:
        return None


def reference_to_unicode(ref):
    """
    Convert a reference object to unicode.
    If reference is None it returns None.
    """
    if ref:
        return u':'.join(ref)
    else:
        return None

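The two helpers are symmetric: a reference serializes to `type:name:commit_id` and parses back losslessly. A standalone sketch of the round-trip (mirroring the namedtuple locally so it runs without rhodecode imports; the `branch`/`bookmark` convenience properties are omitted):

```python
import collections

# Local mirror of the module's Reference namedtuple, for illustration.
Reference = collections.namedtuple('Reference', ('type', 'name', 'commit_id'))


def unicode_to_reference(raw):
    # 'type:name:commit_id' -> Reference; falsy input -> None
    if raw:
        return Reference(*raw.split(':'))
    return None


def reference_to_unicode(ref):
    # Reference -> 'type:name:commit_id'; None -> None
    if ref:
        return u':'.join(ref)
    return None


raw = u'branch:default:' + u'0' * 40
ref = unicode_to_reference(raw)
round_tripped = reference_to_unicode(ref)
```

Note the sketch shares the real helpers' assumption that neither the type nor the name contains a `:`.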

class MergeFailureReason(object):
    """
    Enumeration with all the reasons why the server side merge could fail.

    DO NOT change the number of the reasons, as they may be stored in the
    database.

    Changing the name of a reason is acceptable and encouraged to deprecate old
    reasons.
    """

    # Everything went well.
    NONE = 0

    # An unexpected exception was raised. Check the logs for more details.
    UNKNOWN = 1

    # The merge was not successful, there are conflicts.
    MERGE_FAILED = 2

    # The merge succeeded but we could not push it to the target repository.
    PUSH_FAILED = 3

    # The specified target is not a head in the target repository.
    TARGET_IS_NOT_HEAD = 4

    # The source repository contains more branches than the target. Pushing
    # the merge will create additional branches in the target.
    HG_SOURCE_HAS_MORE_BRANCHES = 5

    # The target reference has multiple heads. That does not allow to correctly
    # identify the target location. This could only happen for mercurial
    # branches.
    HG_TARGET_HAS_MULTIPLE_HEADS = 6

    # The target repository is locked
    TARGET_IS_LOCKED = 7

    # Deprecated, use MISSING_TARGET_REF or MISSING_SOURCE_REF instead.
    # An involved commit could not be found.
    _DEPRECATED_MISSING_COMMIT = 8

    # The target repo reference is missing.
    MISSING_TARGET_REF = 9

    # The source repo reference is missing.
    MISSING_SOURCE_REF = 10

    # The merge was not successful, there are conflicts related to sub
    # repositories.
    SUBREPO_MERGE_FAILED = 11


class UpdateFailureReason(object):
    """
    Enumeration with all the reasons why the pull request update could fail.

    DO NOT change the number of the reasons, as they may be stored in the
    database.

    Changing the name of a reason is acceptable and encouraged to deprecate old
    reasons.
    """

    # Everything went well.
    NONE = 0

    # An unexpected exception was raised. Check the logs for more details.
    UNKNOWN = 1

    # The pull request is up to date.
    NO_CHANGE = 2

    # The pull request has a reference type that is not supported for update.
    WRONG_REF_TYPE = 3

    # Update failed because the target reference is missing.
    MISSING_TARGET_REF = 4

    # Update failed because the source reference is missing.
    MISSING_SOURCE_REF = 5


class MergeResponse(object):

    # uses .format(**metadata) for variables
    MERGE_STATUS_MESSAGES = {
        MergeFailureReason.NONE: lazy_ugettext(
            u'This pull request can be automatically merged.'),
        MergeFailureReason.UNKNOWN: lazy_ugettext(
            u'This pull request cannot be merged because of an unhandled exception. '
            u'{exception}'),
        MergeFailureReason.MERGE_FAILED: lazy_ugettext(
            u'This pull request cannot be merged because of merge conflicts. {unresolved_files}'),
        MergeFailureReason.PUSH_FAILED: lazy_ugettext(
            u'This pull request could not be merged because push to '
            u'target:`{target}@{merge_commit}` failed.'),
        MergeFailureReason.TARGET_IS_NOT_HEAD: lazy_ugettext(
            u'This pull request cannot be merged because the target '
            u'`{target_ref.name}` is not a head.'),
        MergeFailureReason.HG_SOURCE_HAS_MORE_BRANCHES: lazy_ugettext(
            u'This pull request cannot be merged because the source contains '
            u'more branches than the target.'),
        MergeFailureReason.HG_TARGET_HAS_MULTIPLE_HEADS: lazy_ugettext(
            u'This pull request cannot be merged because the target `{target_ref.name}` '
            u'has multiple heads: `{heads}`.'),
        MergeFailureReason.TARGET_IS_LOCKED: lazy_ugettext(
            u'This pull request cannot be merged because the target repository is '
            u'locked by {locked_by}.'),

        MergeFailureReason.MISSING_TARGET_REF: lazy_ugettext(
            u'This pull request cannot be merged because the target '
            u'reference `{target_ref.name}` is missing.'),
        MergeFailureReason.MISSING_SOURCE_REF: lazy_ugettext(
            u'This pull request cannot be merged because the source '
            u'reference `{source_ref.name}` is missing.'),
        MergeFailureReason.SUBREPO_MERGE_FAILED: lazy_ugettext(
            u'This pull request cannot be merged because of conflicts related '
            u'to sub repositories.'),

        # Deprecations
        MergeFailureReason._DEPRECATED_MISSING_COMMIT: lazy_ugettext(
            u'This pull request cannot be merged because the target or the '
            u'source reference is missing.'),

    }

    def __init__(self, possible, executed, merge_ref, failure_reason, metadata=None):
        self.possible = possible
        self.executed = executed
        self.merge_ref = merge_ref
        self.failure_reason = failure_reason
        self.metadata = metadata or {}

    def __repr__(self):
        return '<MergeResponse:{} {}>'.format(self.label, self.failure_reason)

    def __eq__(self, other):
        same_instance = isinstance(other, self.__class__)
        return same_instance \
            and self.possible == other.possible \
            and self.executed == other.executed \
            and self.failure_reason == other.failure_reason

    @property
    def label(self):
        label_dict = dict((v, k) for k, v in MergeFailureReason.__dict__.items() if
                          not k.startswith('_'))
        return label_dict.get(self.failure_reason)

    @property
    def merge_status_message(self):
        """
        Return a human friendly error message for the given merge status code.
        """
        msg = safe_unicode(self.MERGE_STATUS_MESSAGES[self.failure_reason])

        try:
            return msg.format(**self.metadata)
        except Exception:
            log.exception('Failed to format %s message', self)
            return msg

    def asdict(self):
        data = {}
        for k in ['possible', 'executed', 'merge_ref', 'failure_reason',
                  'merge_status_message']:
            data[k] = getattr(self, k)
        return data

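`MergeResponse.label` recovers the constant's name from its numeric value by inverting the class namespace at lookup time, skipping private and deprecated entries. A compact standalone sketch of that reverse mapping (constants abridged to a few values for illustration):

```python
class MergeFailureReason(object):
    # abridged mirror of the enumeration above
    NONE = 0
    UNKNOWN = 1
    MERGE_FAILED = 2
    _DEPRECATED_MISSING_COMMIT = 8


def label_for(failure_reason):
    # invert the class namespace: value -> public constant name;
    # names starting with '_' (dunders, deprecated entries) are skipped,
    # so deprecated codes resolve to None rather than a label
    label_dict = dict(
        (v, k) for k, v in MergeFailureReason.__dict__.items()
        if not k.startswith('_'))
    return label_dict.get(failure_reason)
```

Building the dict on each call keeps the lookup in sync with the class even if constants are added later; for a hot path one could cache it instead.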
273
273
274 class TargetRefMissing(ValueError):
274 class TargetRefMissing(ValueError):
275 pass
275 pass
276
276
277
277
278 class SourceRefMissing(ValueError):
278 class SourceRefMissing(ValueError):
279 pass
279 pass
280
280
281
281
282 class BaseRepository(object):
282 class BaseRepository(object):
283 """
283 """
284 Base Repository for final backends
284 Base Repository for final backends
285
285
286 .. attribute:: DEFAULT_BRANCH_NAME
286 .. attribute:: DEFAULT_BRANCH_NAME
287
287
288 name of default branch (i.e. "trunk" for svn, "master" for git etc.
288 name of default branch (i.e. "trunk" for svn, "master" for git etc.
289
289
290 .. attribute:: commit_ids
290 .. attribute:: commit_ids
291
291
292 list of all available commit ids, in ascending order
292 list of all available commit ids, in ascending order
293
293
294 .. attribute:: path
294 .. attribute:: path
295
295
296 absolute path to the repository
296 absolute path to the repository
297
297
298 .. attribute:: bookmarks
298 .. attribute:: bookmarks
299
299
300 Mapping from name to :term:`Commit ID` of the bookmark. Empty in case
300 Mapping from name to :term:`Commit ID` of the bookmark. Empty in case
301 there are no bookmarks or the backend implementation does not support
301 there are no bookmarks or the backend implementation does not support
302 bookmarks.
302 bookmarks.
303
303
304 .. attribute:: tags
304 .. attribute:: tags
305
305
306 Mapping from name to :term:`Commit ID` of the tag.
306 Mapping from name to :term:`Commit ID` of the tag.
307
307
308 """
308 """
309
309
310 DEFAULT_BRANCH_NAME = None
310 DEFAULT_BRANCH_NAME = None
311 DEFAULT_CONTACT = u"Unknown"
311 DEFAULT_CONTACT = u"Unknown"
312 DEFAULT_DESCRIPTION = u"unknown"
312 DEFAULT_DESCRIPTION = u"unknown"
313 EMPTY_COMMIT_ID = '0' * 40
313 EMPTY_COMMIT_ID = '0' * 40
314 COMMIT_ID_PAT = re.compile(r'[0-9a-fA-F]{40}')
314 COMMIT_ID_PAT = re.compile(r'[0-9a-fA-F]{40}')
315
315
316 path = None
316 path = None
317
317
318 _is_empty = None
318 _is_empty = None
319 _commit_ids = {}
319 _commit_ids = {}
320
320
321 def __init__(self, repo_path, config=None, create=False, **kwargs):
321 def __init__(self, repo_path, config=None, create=False, **kwargs):
322 """
322 """
323 Initializes repository. Raises RepositoryError if repository could
323 Initializes repository. Raises RepositoryError if repository could
324 not be find at the given ``repo_path`` or directory at ``repo_path``
324 not be find at the given ``repo_path`` or directory at ``repo_path``
325 exists and ``create`` is set to True.
325 exists and ``create`` is set to True.
326
326
327 :param repo_path: local path of the repository
327 :param repo_path: local path of the repository
328 :param config: repository configuration
328 :param config: repository configuration
329 :param create=False: if set to True, would try to create repository.
329 :param create=False: if set to True, would try to create repository.
330 :param src_url=None: if set, should be proper url from which repository
330 :param src_url=None: if set, should be proper url from which repository
331 would be cloned; requires ``create`` parameter to be set to True -
331 would be cloned; requires ``create`` parameter to be set to True -
332 raises RepositoryError if src_url is set and create evaluates to
332 raises RepositoryError if src_url is set and create evaluates to
333 False
333 False
334 """
334 """
335 raise NotImplementedError
335 raise NotImplementedError
336
336
337 def __repr__(self):
337 def __repr__(self):
338 return '<%s at %s>' % (self.__class__.__name__, self.path)
338 return '<%s at %s>' % (self.__class__.__name__, self.path)
339
339
340 def __len__(self):
340 def __len__(self):
341 return self.count()
341 return self.count()
342
342
343 def __eq__(self, other):
343 def __eq__(self, other):
344 same_instance = isinstance(other, self.__class__)
344 same_instance = isinstance(other, self.__class__)
345 return same_instance and other.path == self.path
345 return same_instance and other.path == self.path
346
346
347 def __ne__(self, other):
347 def __ne__(self, other):
348 return not self.__eq__(other)
348 return not self.__eq__(other)
349
349
350 def get_create_shadow_cache_pr_path(self, db_repo):
350 def get_create_shadow_cache_pr_path(self, db_repo):
351 path = db_repo.cached_diffs_dir
351 path = db_repo.cached_diffs_dir
352 if not os.path.exists(path):
352 if not os.path.exists(path):
353 os.makedirs(path, 0o755)
353 os.makedirs(path, 0o755)
354 return path
354 return path
355
355
356 @classmethod
356 @classmethod
357 def get_default_config(cls, default=None):
357 def get_default_config(cls, default=None):
358 config = Config()
358 config = Config()
359 if default and isinstance(default, list):
359 if default and isinstance(default, list):
360 for section, key, val in default:
360 for section, key, val in default:
361 config.set(section, key, val)
361 config.set(section, key, val)
362 return config
362 return config
363
363
    @LazyProperty
    def _remote(self):
        raise NotImplementedError

    def _heads(self, branch=None):
        return []

    @LazyProperty
    def EMPTY_COMMIT(self):
        return EmptyCommit(self.EMPTY_COMMIT_ID)

    @LazyProperty
    def alias(self):
        for k, v in settings.BACKENDS.items():
            if v.split('.')[-1] == str(self.__class__.__name__):
                return k

    @LazyProperty
    def name(self):
        return safe_unicode(os.path.basename(self.path))

    @LazyProperty
    def description(self):
        raise NotImplementedError

    def refs(self):
        """
        Returns a `dict` with branches, branches_closed, tags, and bookmarks
        for this repository.
        """
        return dict(
            branches=self.branches,
            branches_closed=self.branches_closed,
            tags=self.tags,
            bookmarks=self.bookmarks
        )

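The shape of the mapping returned by `refs()` above can be sketched with literal dicts in place of the backend-provided branch/tag mappings (the names and commit ids below are invented):

```python
# Each value maps a ref name to a 40-character commit id; empty dicts are
# valid for backends without closed branches or bookmarks.
branches = {'default': 'a' * 40}
branches_closed = {}
tags = {'tip': 'a' * 40, 'v1.0': 'b' * 40}
bookmarks = {}

refs = dict(
    branches=branches,
    branches_closed=branches_closed,
    tags=tags,
    bookmarks=bookmarks,
)
```
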
    @LazyProperty
    def branches(self):
        """
        A `dict` which maps branch names to commit ids.
        """
        raise NotImplementedError

    @LazyProperty
    def branches_closed(self):
        """
        A `dict` which maps closed branch names to commit ids.
        """
        raise NotImplementedError

    @LazyProperty
    def bookmarks(self):
        """
        A `dict` which maps bookmark names to commit ids.
        """
        raise NotImplementedError

    @LazyProperty
    def tags(self):
        """
        A `dict` which maps tag names to commit ids.
        """
        raise NotImplementedError

    @LazyProperty
    def size(self):
        """
        Returns the combined size in bytes of all repository files.
        """
        tip = self.get_commit()
        return tip.size

    def size_at_commit(self, commit_id):
        commit = self.get_commit(commit_id)
        return commit.size

    def _check_for_empty(self):
        no_commits = len(self._commit_ids) == 0
        if no_commits:
            # check on remote to be sure
            return self._remote.is_empty()
        else:
            return False

    def is_empty(self):
        if rhodecode.is_test:
            return self._check_for_empty()

        if self._is_empty is None:
            # cache the emptiness check for production, but not for tests
            self._is_empty = self._check_for_empty()

        return self._is_empty

    @staticmethod
    def check_url(url, config):
        """
        Check the given url and try to verify that it points to a valid
        repository.
        """
        raise NotImplementedError

    @staticmethod
    def is_valid_repository(path):
        """
        Check if the given `path` contains a valid repository of this backend.
        """
        raise NotImplementedError

    # ==========================================================================
    # COMMITS
    # ==========================================================================

    @CachedProperty
    def commit_ids(self):
        raise NotImplementedError

    def append_commit_id(self, commit_id):
        if commit_id not in self.commit_ids:
            self._rebuild_cache(self.commit_ids + [commit_id])

        # clear cache
        self._invalidate_prop_cache('commit_ids')
        self._is_empty = False

    def get_commit(self, commit_id=None, commit_idx=None, pre_load=None,
                   translate_tag=None, maybe_unreachable=False, reference_obj=None):
        """
        Returns an instance of the `BaseCommit` class. If `commit_id` and
        `commit_idx` are both None, the most recent commit is returned.

        :param pre_load: Optional. List of commit attributes to load.

        :raises ``EmptyRepositoryError``: if there are no commits
        """
        raise NotImplementedError

    def __iter__(self):
        for commit_id in self.commit_ids:
            yield self.get_commit(commit_id=commit_id)

    def get_commits(
            self, start_id=None, end_id=None, start_date=None, end_date=None,
            branch_name=None, show_hidden=False, pre_load=None, translate_tags=None):
        """
        Returns an iterator of `BaseCommit` objects from start to end.
        This should behave just like a list, i.e. the end is not inclusive.

        :param start_id: None or str, must be a valid commit id
        :param end_id: None or str, must be a valid commit id
        :param start_date:
        :param end_date:
        :param branch_name:
        :param show_hidden:
        :param pre_load:
        :param translate_tags:
        """
        raise NotImplementedError

    def __getitem__(self, key):
        """
        Allows index based access to the commit objects of this repository.
        """
        pre_load = ["author", "branch", "date", "message", "parents"]
        if isinstance(key, slice):
            return self._get_range(key, pre_load)
        return self.get_commit(commit_idx=key, pre_load=pre_load)

    def _get_range(self, slice_obj, pre_load):
        for commit_id in self.commit_ids.__getitem__(slice_obj):
            yield self.get_commit(commit_id=commit_id, pre_load=pre_load)

    def count(self):
        return len(self.commit_ids)

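The index/slice protocol implemented by `__getitem__` and `_get_range` above can be sketched with a self-contained stand-in: integer keys return a single commit, slices yield lazily. `FakeRepo` and `FakeCommit` below are invented for illustration, not RhodeCode classes.

```python
class FakeCommit(object):
    def __init__(self, commit_id):
        self.commit_id = commit_id


class FakeRepo(object):
    def __init__(self, commit_ids):
        self.commit_ids = commit_ids

    def get_commit(self, commit_id=None, commit_idx=None, pre_load=None):
        # resolve an index to a commit id, mirroring the real lookup
        if commit_id is None:
            commit_id = self.commit_ids[commit_idx]
        return FakeCommit(commit_id)

    def __getitem__(self, key):
        if isinstance(key, slice):
            return self._get_range(key, pre_load=None)
        return self.get_commit(commit_idx=key)

    def _get_range(self, slice_obj, pre_load):
        # generator: commits in a slice are materialized lazily
        for commit_id in self.commit_ids.__getitem__(slice_obj):
            yield self.get_commit(commit_id=commit_id, pre_load=pre_load)

    def count(self):
        return len(self.commit_ids)


repo = FakeRepo(['c1', 'c2', 'c3', 'c4'])
last_two = [c.commit_id for c in repo[-2:]]
```
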
    def tag(self, name, user, commit_id=None, message=None, date=None, **opts):
        """
        Creates and returns a tag for the given ``commit_id``.

        :param name: name for new tag
        :param user: full username, i.e.: "Joe Doe <joe.doe@example.com>"
        :param commit_id: commit id for which the new tag would be created
        :param message: message of the tag's commit
        :param date: date of tag's commit

        :raises TagAlreadyExistError: if tag with same name already exists
        """
        raise NotImplementedError

    def remove_tag(self, name, user, message=None, date=None):
        """
        Removes tag with the given ``name``.

        :param name: name of the tag to be removed
        :param user: full username, i.e.: "Joe Doe <joe.doe@example.com>"
        :param message: message of the tag's removal commit
        :param date: date of tag's removal commit

        :raises TagDoesNotExistError: if tag with given name does not exist
        """
        raise NotImplementedError

    def get_diff(
            self, commit1, commit2, path=None, ignore_whitespace=False,
            context=3, path1=None):
        """
        Returns a (git like) *diff*, as plain text. Shows changes introduced
        by `commit2` since `commit1`.

        :param commit1: Entry point from which diff is shown. Can be
            ``self.EMPTY_COMMIT`` - in this case, a patch showing all
            the changes since the empty state of the repository until `commit2`
        :param commit2: Until which commit changes should be shown.
        :param path: Can be set to a path of a file to create a diff of that
            file. If `path1` is also set, this value is only associated to
            `commit2`.
        :param ignore_whitespace: If set to ``True``, whitespace changes
            will not be shown. Defaults to ``False``.
        :param context: How many lines before/after changed lines should be
            shown. Defaults to ``3``.
        :param path1: Can be set to a path to associate with `commit1`. This
            parameter works only for backends which support diff generation for
            different paths. Other backends will raise a `ValueError` if `path1`
            is set and has a different value than `path`.
        """
        raise NotImplementedError

    def strip(self, commit_id, branch=None):
        """
        Strip the given commit_id from the repository.
        """
        raise NotImplementedError

    def get_common_ancestor(self, commit_id1, commit_id2, repo2):
        """
        Return the latest common ancestor commit, if one exists, of
        `commit_id1` from this repo vs `commit_id2` from `repo2`.

        :param commit_id1: Commit id from this repository to use as a
            target for the comparison.
        :param commit_id2: Source commit id to use for comparison.
        :param repo2: Source repository to use for comparison.
        """
        raise NotImplementedError

    def compare(self, commit_id1, commit_id2, repo2, merge, pre_load=None):
        """
        Compare this repository's revision `commit_id1` with `commit_id2`.

        Returns a tuple(commits, ancestor) that would be merged from
        `commit_id2`. When doing a normal compare (``merge=False``), ``None``
        will be returned as the ancestor.

        :param commit_id1: Commit id from this repository to use as a
            target for the comparison.
        :param commit_id2: Source commit id to use for comparison.
        :param repo2: Source repository to use for comparison.
        :param merge: If set to ``True`` will do a merge compare which also
            returns the common ancestor.
        :param pre_load: Optional. List of commit attributes to load.
        """
        raise NotImplementedError

    def merge(self, repo_id, workspace_id, target_ref, source_repo, source_ref,
              user_name='', user_email='', message='', dry_run=False,
              use_rebase=False, close_branch=False):
        """
        Merge the revisions specified in `source_ref` from `source_repo`
        onto the `target_ref` of this repository.

        `source_ref` and `target_ref` are named tuples with the following
        fields: `type`, `name` and `commit_id`.

        Returns a MergeResponse named tuple with the following fields:
        'possible', 'executed', 'source_commit', 'target_commit',
        'merge_commit'.

        :param repo_id: `repo_id` target repo id.
        :param workspace_id: `workspace_id` unique identifier.
        :param target_ref: `target_ref` points to the commit on top of which
            the `source_ref` should be merged.
        :param source_repo: The repository that contains the commits to be
            merged.
        :param source_ref: `source_ref` points to the topmost commit from
            the `source_repo` which should be merged.
        :param user_name: Merge commit `user_name`.
        :param user_email: Merge commit `user_email`.
        :param message: Merge commit `message`.
        :param dry_run: If `True` the merge will not take place.
        :param use_rebase: If `True` commits from the source will be rebased
            on top of the target instead of being merged.
        :param close_branch: If `True` the branch will be closed before
            merging it.
        """
        if dry_run:
            message = message or settings.MERGE_DRY_RUN_MESSAGE
            user_email = user_email or settings.MERGE_DRY_RUN_EMAIL
            user_name = user_name or settings.MERGE_DRY_RUN_USER
        else:
            if not user_name:
                raise ValueError('user_name cannot be empty')
            if not user_email:
                raise ValueError('user_email cannot be empty')
            if not message:
                raise ValueError('message cannot be empty')

        try:
            return self._merge_repo(
                repo_id, workspace_id, target_ref, source_repo,
                source_ref, message, user_name, user_email, dry_run=dry_run,
                use_rebase=use_rebase, close_branch=close_branch)
        except RepositoryError as exc:
            log.exception('Unexpected failure when running merge, dry-run=%s', dry_run)
            return MergeResponse(
                False, False, None, MergeFailureReason.UNKNOWN,
                metadata={'exception': str(exc)})

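The dry-run defaulting performed at the top of `merge()` above can be sketched in isolation: with `dry_run=True`, missing identity fields fall back to placeholder values; otherwise they are required. The `MERGE_DRY_RUN_*` constants below are invented placeholders for the real values in `settings`.

```python
# Placeholder stand-ins for settings.MERGE_DRY_RUN_* (invented values).
MERGE_DRY_RUN_MESSAGE = 'dry_run_merge_message'
MERGE_DRY_RUN_USER = 'Dry-Run User'
MERGE_DRY_RUN_EMAIL = 'dry-run@example.com'


def resolve_merge_identity(user_name='', user_email='', message='', dry_run=False):
    if dry_run:
        # a dry run may proceed without a real author identity
        message = message or MERGE_DRY_RUN_MESSAGE
        user_email = user_email or MERGE_DRY_RUN_EMAIL
        user_name = user_name or MERGE_DRY_RUN_USER
    else:
        # a real merge commit needs a complete identity and message
        if not user_name:
            raise ValueError('user_name cannot be empty')
        if not user_email:
            raise ValueError('user_email cannot be empty')
        if not message:
            raise ValueError('message cannot be empty')
    return user_name, user_email, message
```
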
    def _merge_repo(self, repo_id, workspace_id, target_ref,
                    source_repo, source_ref, merge_message,
                    merger_name, merger_email, dry_run=False,
                    use_rebase=False, close_branch=False):
        """Internal implementation of merge."""
        raise NotImplementedError

    def _maybe_prepare_merge_workspace(
            self, repo_id, workspace_id, target_ref, source_ref):
        """
        Create the merge workspace.

        :param workspace_id: `workspace_id` unique identifier.
        """
        raise NotImplementedError

    @classmethod
    def _get_legacy_shadow_repository_path(cls, repo_path, workspace_id):
        """
        Legacy version that was used before. We still need it for
        backward compatibility.
        """
        return os.path.join(
            os.path.dirname(repo_path),
            '.__shadow_%s_%s' % (os.path.basename(repo_path), workspace_id))

    @classmethod
    def _get_shadow_repository_path(cls, repo_path, repo_id, workspace_id):
        # The name of the shadow repository must start with '.', so it is
        # skipped by 'rhodecode.lib.utils.get_filesystem_repos'.
        legacy_repository_path = cls._get_legacy_shadow_repository_path(repo_path, workspace_id)
        if os.path.exists(legacy_repository_path):
            return legacy_repository_path
        else:
            return os.path.join(
                os.path.dirname(repo_path),
                '.__shadow_repo_%s_%s' % (repo_id, workspace_id))

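The two shadow-repository naming schemes resolved above can be sketched with plain `os.path` calls: the legacy `.__shadow_<name>_<ws>` form is preferred when it already exists on disk, otherwise the newer `.__shadow_repo_<id>_<ws>` form is used. The paths below are illustrative only.

```python
import os


def legacy_shadow_path(repo_path, workspace_id):
    # legacy scheme: derived from the repository's directory name
    return os.path.join(
        os.path.dirname(repo_path),
        '.__shadow_%s_%s' % (os.path.basename(repo_path), workspace_id))


def shadow_path(repo_path, repo_id, workspace_id):
    legacy = legacy_shadow_path(repo_path, workspace_id)
    if os.path.exists(legacy):
        # keep using an existing legacy workspace for backward compatibility
        return legacy
    # newer scheme: derived from the stable numeric repo id
    return os.path.join(
        os.path.dirname(repo_path),
        '.__shadow_repo_%s_%s' % (repo_id, workspace_id))
```
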
721 def cleanup_merge_workspace(self, repo_id, workspace_id):
721 def cleanup_merge_workspace(self, repo_id, workspace_id):
722 """
722 """
723 Remove merge workspace.
723 Remove merge workspace.
724
724
725 This function MUST not fail in case there is no workspace associated to
725 This function MUST not fail in case there is no workspace associated to
726 the given `workspace_id`.
726 the given `workspace_id`.
727
727
728 :param workspace_id: `workspace_id` unique identifier.
728 :param workspace_id: `workspace_id` unique identifier.
729 """
729 """
730 shadow_repository_path = self._get_shadow_repository_path(
730 shadow_repository_path = self._get_shadow_repository_path(
731 self.path, repo_id, workspace_id)
731 self.path, repo_id, workspace_id)
732 shadow_repository_path_del = '{}.{}.delete'.format(
732 shadow_repository_path_del = '{}.{}.delete'.format(
733 shadow_repository_path, time.time())
733 shadow_repository_path, time.time())
734
734
735 # move the shadow repo, so it never conflicts with the one used.
735 # move the shadow repo, so it never conflicts with the one used.
736 # we use this method because shutil.rmtree had some edge case problems
736 # we use this method because shutil.rmtree had some edge case problems
737 # removing symlinked repositories
737 # removing symlinked repositories
738 if not os.path.isdir(shadow_repository_path):
738 if not os.path.isdir(shadow_repository_path):
739 return
739 return
740
740
741 shutil.move(shadow_repository_path, shadow_repository_path_del)
741 shutil.move(shadow_repository_path, shadow_repository_path_del)
742 try:
742 try:
743 shutil.rmtree(shadow_repository_path_del, ignore_errors=False)
743 shutil.rmtree(shadow_repository_path_del, ignore_errors=False)
744 except Exception:
744 except Exception:
745 log.exception('Failed to gracefully remove shadow repo under %s',
745 log.exception('Failed to gracefully remove shadow repo under %s',
746 shadow_repository_path_del)
746 shadow_repository_path_del)
747 shutil.rmtree(shadow_repository_path_del, ignore_errors=True)
747 shutil.rmtree(shadow_repository_path_del, ignore_errors=True)
748
748
749 # ========== #
749 # ========== #
750 # COMMIT API #
750 # COMMIT API #
751 # ========== #
751 # ========== #
752
752
753 @LazyProperty
753 @LazyProperty
754 def in_memory_commit(self):
754 def in_memory_commit(self):
755 """
755 """
756 Returns :class:`InMemoryCommit` object for this repository.
756 Returns :class:`InMemoryCommit` object for this repository.
757 """
757 """
758 raise NotImplementedError
758 raise NotImplementedError
759
759
760 # ======================== #
760 # ======================== #
761 # UTILITIES FOR SUBCLASSES #
761 # UTILITIES FOR SUBCLASSES #
762 # ======================== #
762 # ======================== #
763
763
764 def _validate_diff_commits(self, commit1, commit2):
764 def _validate_diff_commits(self, commit1, commit2):
765 """
765 """
766 Validates that the given commits are related to this repository.
766 Validates that the given commits are related to this repository.
767
767
768 Intended as a utility for sub classes to have a consistent validation
768 Intended as a utility for sub classes to have a consistent validation
769 of input parameters in methods like :meth:`get_diff`.
769 of input parameters in methods like :meth:`get_diff`.
770 """
770 """
771 self._validate_commit(commit1)
771 self._validate_commit(commit1)
772 self._validate_commit(commit2)
772 self._validate_commit(commit2)
773 if (isinstance(commit1, EmptyCommit) and
773 if (isinstance(commit1, EmptyCommit) and
774 isinstance(commit2, EmptyCommit)):
774 isinstance(commit2, EmptyCommit)):
775 raise ValueError("Cannot compare two empty commits")
775 raise ValueError("Cannot compare two empty commits")
776
776
777 def _validate_commit(self, commit):
777 def _validate_commit(self, commit):
778 if not isinstance(commit, BaseCommit):
778 if not isinstance(commit, BaseCommit):
779 raise TypeError(
779 raise TypeError(
780 "%s is not of type BaseCommit" % repr(commit))
780 "%s is not of type BaseCommit" % repr(commit))
781 if commit.repository != self and not isinstance(commit, EmptyCommit):
781 if commit.repository != self and not isinstance(commit, EmptyCommit):
782 raise ValueError(
782 raise ValueError(
783 "Commit %s must be a valid commit from this repository %s, "
783 "Commit %s must be a valid commit from this repository %s, "
784 "related to this repository instead %s." %
784 "related to this repository instead %s." %
785 (commit, self, commit.repository))
785 (commit, self, commit.repository))
786
786
787 def _validate_commit_id(self, commit_id):
787 def _validate_commit_id(self, commit_id):
788 if not isinstance(commit_id, compat.string_types):
788 if not isinstance(commit_id, compat.string_types):
789 raise TypeError("commit_id must be a string value got {} instead".format(type(commit_id)))
789 raise TypeError("commit_id must be a string value got {} instead".format(type(commit_id)))
790
790
791 def _validate_commit_idx(self, commit_idx):
791 def _validate_commit_idx(self, commit_idx):
792 if not isinstance(commit_idx, (int, long)):
792 if not isinstance(commit_idx, (int, long)):
793 raise TypeError("commit_idx must be a numeric value")
793 raise TypeError("commit_idx must be a numeric value")
794
794
795 def _validate_branch_name(self, branch_name):
795 def _validate_branch_name(self, branch_name):
796 if branch_name and branch_name not in self.branches_all:
796 if branch_name and branch_name not in self.branches_all:
797 msg = ("Branch %s not found in %s" % (branch_name, self))
797 msg = ("Branch %s not found in %s" % (branch_name, self))
798 raise BranchDoesNotExistError(msg)
798 raise BranchDoesNotExistError(msg)
799
799
800 #
800 #
801 # Supporting deprecated API parts
801 # Supporting deprecated API parts
802 # TODO: johbo: consider to move this into a mixin
802 # TODO: johbo: consider to move this into a mixin
803 #
803 #
804
804
805 @property
805 @property
806 def EMPTY_CHANGESET(self):
806 def EMPTY_CHANGESET(self):
807 warnings.warn(
807 warnings.warn(
808 "Use EMPTY_COMMIT or EMPTY_COMMIT_ID instead", DeprecationWarning)
808 "Use EMPTY_COMMIT or EMPTY_COMMIT_ID instead", DeprecationWarning)
809 return self.EMPTY_COMMIT_ID
809 return self.EMPTY_COMMIT_ID
810
810
811 @property
811 @property
812 def revisions(self):
812 def revisions(self):
813 warnings.warn("Use commits attribute instead", DeprecationWarning)
813 warnings.warn("Use commits attribute instead", DeprecationWarning)
814 return self.commit_ids
814 return self.commit_ids
815
815
816 @revisions.setter
816 @revisions.setter
817 def revisions(self, value):
817 def revisions(self, value):
818 warnings.warn("Use commits attribute instead", DeprecationWarning)
        warnings.warn("Use commits attribute instead", DeprecationWarning)
        self.commit_ids = value

    def get_changeset(self, revision=None, pre_load=None):
        warnings.warn("Use get_commit instead", DeprecationWarning)
        commit_id = None
        commit_idx = None
        if isinstance(revision, compat.string_types):
            commit_id = revision
        else:
            commit_idx = revision
        return self.get_commit(
            commit_id=commit_id, commit_idx=commit_idx, pre_load=pre_load)

    def get_changesets(
            self, start=None, end=None, start_date=None, end_date=None,
            branch_name=None, pre_load=None):
        warnings.warn("Use get_commits instead", DeprecationWarning)
        start_id = self._revision_to_commit(start)
        end_id = self._revision_to_commit(end)
        return self.get_commits(
            start_id=start_id, end_id=end_id, start_date=start_date,
            end_date=end_date, branch_name=branch_name, pre_load=pre_load)

    def _revision_to_commit(self, revision):
        """
        Translates a revision to a commit_id.

        Helps to support the old changeset-based API, which allows using
        commit ids and commit indices interchangeably.
        """
        if revision is None:
            return revision

        if isinstance(revision, compat.string_types):
            commit_id = revision
        else:
            commit_id = self.commit_ids[revision]
        return commit_id

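    # Minimal standalone sketch (not RhodeCode code; _RevisionDemo is an
    # invented name) illustrating the _revision_to_commit contract above: a
    # string revision is treated as a commit id and returned unchanged, while
    # an integer revision is treated as an index into commit_ids.
    ```python
    class _RevisionDemo(object):
        def __init__(self, commit_ids):
            self.commit_ids = commit_ids

        def _revision_to_commit(self, revision):
            if revision is None:
                return revision
            if isinstance(revision, str):  # compat.string_types on Python 2
                return revision
            return self.commit_ids[revision]

    repo = _RevisionDemo(['abc123', 'def456'])
    assert repo._revision_to_commit(None) is None
    assert repo._revision_to_commit('def456') == 'def456'  # id passes through
    assert repo._revision_to_commit(1) == 'def456'         # index is resolved
    ```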
    @property
    def in_memory_changeset(self):
        warnings.warn("Use in_memory_commit instead", DeprecationWarning)
        return self.in_memory_commit

    def get_path_permissions(self, username):
        """
        Returns a path permission checker or None if not supported.

        :param username: session user name
        :return: an instance of BasePathPermissionChecker or None
        """
        return None

    def install_hooks(self, force=False):
        return self._remote.install_hooks(force)

    def get_hooks_info(self):
        return self._remote.get_hooks_info()

    def vcsserver_invalidate_cache(self, delete=False):
        return self._remote.vcsserver_invalidate_cache(delete)


class BaseCommit(object):
    """
    Each backend should implement its own commit representation.

    **Attributes**

    ``repository``
        repository object within which commit exists

    ``id``
        The commit id; may be ``raw_id`` or, e.g. for mercurial's tip,
        just ``tip``.

    ``raw_id``
        raw commit representation (i.e. full 40-character sha for the git
        backend)

    ``short_id``
        shortened (if applicable) version of ``raw_id``; a simple shortcut
        for ``raw_id[:12]`` for the git/mercurial backends, or the same as
        ``raw_id`` for subversion

    ``idx``
        commit index

    ``files``
        list of ``FileNode`` (``Node`` with NodeKind.FILE) objects

    ``dirs``
        list of ``DirNode`` (``Node`` with NodeKind.DIR) objects

    ``nodes``
        combined list of ``Node`` objects

    ``author``
        author of the commit, as unicode

    ``message``
        message of the commit, as unicode

    ``parents``
        list of parent commits

    """
    repository = None
    branch = None

    """
    Depending on the backend this should be set to the branch name of the
    commit. Backends not supporting branches on commits should leave this
    value as ``None``.
    """

    _ARCHIVE_PREFIX_TEMPLATE = b'{repo_name}-{short_id}'
    """
    This template is used to generate a default prefix for repository archives
    if no prefix has been specified.
    """

    def __str__(self):
        return '<%s at %s:%s>' % (
            self.__class__.__name__, self.idx, self.short_id)

    def __repr__(self):
        return self.__str__()

    def __unicode__(self):
        return u'%s:%s' % (self.idx, self.short_id)

    def __eq__(self, other):
        same_instance = isinstance(other, self.__class__)
        return same_instance and self.raw_id == other.raw_id

    def __json__(self):
        parents = []
        try:
            for parent in self.parents:
                parents.append({'raw_id': parent.raw_id})
        except NotImplementedError:
            # empty commit doesn't have parents implemented
            pass

        return {
            'short_id': self.short_id,
            'raw_id': self.raw_id,
            'revision': self.idx,
            'message': self.message,
            'date': self.date,
            'author': self.author,
            'parents': parents,
            'branch': self.branch
        }

    def __getstate__(self):
        d = self.__dict__.copy()
        d.pop('_remote', None)
        d.pop('repository', None)
        return d

    def serialize(self):
        return self.__json__()

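    # Hypothetical sketch (invented _CommitDemo class and sample values, not
    # RhodeCode code) of the serialize()/__json__() contract above: a commit
    # serializes to a plain dict so it can be fed to any JSON encoder.
    ```python
    import json

    class _CommitDemo(object):
        short_id = 'abc123def456'
        raw_id = 'abc123def456' + '0' * 28
        idx = 7
        message = u'fix bug'
        date = '2022-09-01T00:00:00'
        author = u'Jane Doe <jane@example.com>'
        branch = 'default'
        parents = []

        def __json__(self):
            return {
                'short_id': self.short_id,
                'raw_id': self.raw_id,
                'revision': self.idx,
                'message': self.message,
                'date': self.date,
                'author': self.author,
                'parents': [{'raw_id': p.raw_id} for p in self.parents],
                'branch': self.branch,
            }

        def serialize(self):
            return self.__json__()

    data = _CommitDemo().serialize()
    assert data['revision'] == 7
    assert json.loads(json.dumps(data)) == data  # round-trips as plain JSON
    ```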
    def _get_refs(self):
        return {
            'branches': [self.branch] if self.branch else [],
            'bookmarks': getattr(self, 'bookmarks', []),
            'tags': self.tags
        }

    @LazyProperty
    def last(self):
        """
        ``True`` if this is the last commit in the repository, ``False``
        otherwise; trying to access this attribute while there are no
        commits raises `EmptyRepositoryError`
        """
        if self.repository is None:
            raise CommitError("Cannot check if it's most recent commit")
        return self.raw_id == self.repository.commit_ids[-1]

    @LazyProperty
    def parents(self):
        """
        Returns list of parent commits.
        """
        raise NotImplementedError

    @LazyProperty
    def first_parent(self):
        """
        Returns the first parent commit, or an ``EmptyCommit`` if there is
        none.
        """
        return self.parents[0] if self.parents else EmptyCommit()

    @property
    def merge(self):
        """
        Returns ``True`` if the commit is a merge (has more than one parent).
        """
        return len(self.parents) > 1

    @LazyProperty
    def children(self):
        """
        Returns list of child commits.
        """
        raise NotImplementedError

    @LazyProperty
    def id(self):
        """
        Returns string identifying this commit.
        """
        raise NotImplementedError

    @LazyProperty
    def raw_id(self):
        """
        Returns raw string identifying this commit.
        """
        raise NotImplementedError

    @LazyProperty
    def short_id(self):
        """
        Returns shortened version of ``raw_id`` attribute, as string,
        identifying this commit, useful for presentation to users.
        """
        raise NotImplementedError

    @LazyProperty
    def idx(self):
        """
        Returns integer identifying this commit.
        """
        raise NotImplementedError

    @LazyProperty
    def committer(self):
        """
        Returns committer for this commit
        """
        raise NotImplementedError

    @LazyProperty
    def committer_name(self):
        """
        Returns committer name for this commit
        """
        return author_name(self.committer)

    @LazyProperty
    def committer_email(self):
        """
        Returns committer email address for this commit
        """
        return author_email(self.committer)

    @LazyProperty
    def author(self):
        """
        Returns author for this commit
        """
        raise NotImplementedError

    @LazyProperty
    def author_name(self):
        """
        Returns author name for this commit
        """
        return author_name(self.author)

    @LazyProperty
    def author_email(self):
        """
        Returns author email address for this commit
        """
        return author_email(self.author)

    def get_file_mode(self, path):
        """
        Returns stat mode of the file at `path`.
        """
        raise NotImplementedError

    def is_link(self, path):
        """
        Returns ``True`` if given `path` is a symlink
        """
        raise NotImplementedError

    def is_node_binary(self, path):
        """
        Returns ``True`` if the given path is a binary file
        """
        raise NotImplementedError

    def get_file_content(self, path):
        """
        Returns content of the file at the given `path`.
        """
        raise NotImplementedError

    def get_file_content_streamed(self, path):
        """
        Returns a streaming response from vcsserver with the file content
        """
        raise NotImplementedError

    def get_file_size(self, path):
        """
        Returns size of the file at the given `path`.
        """
        raise NotImplementedError

    def get_path_commit(self, path, pre_load=None):
        """
        Returns last commit of the file at the given `path`.

        :param pre_load: Optional. List of commit attributes to load.
        """
        commits = self.get_path_history(path, limit=1, pre_load=pre_load)
        if not commits:
            raise RepositoryError(
                'Failed to fetch history for path {}. '
                'Please check if such path exists in your repository'.format(
                    path))
        return commits[0]

    def get_path_history(self, path, limit=None, pre_load=None):
        """
        Returns history of file as reversed list of :class:`BaseCommit`
        objects for which file at given `path` has been modified.

        :param limit: Optional. Allows limiting the size of the returned
            history. This is intended as a hint to the underlying backend, so
            that it can apply optimizations depending on the limit.
        :param pre_load: Optional. List of commit attributes to load.
        """
        raise NotImplementedError

    def get_file_annotate(self, path, pre_load=None):
        """
        Returns a generator of four-element tuples with
        lineno, sha, commit lazy loader and line

        :param pre_load: Optional. List of commit attributes to load.
        """
        raise NotImplementedError

    def get_nodes(self, path):
        """
        Returns combined ``DirNode`` and ``FileNode`` objects list representing
        state of commit at the given ``path``.

        :raises ``CommitError``: if node at the given ``path`` is not an
            instance of ``DirNode``
        """
        raise NotImplementedError

    def get_node(self, path):
        """
        Returns ``Node`` object from the given ``path``.

        :raises ``NodeDoesNotExistError``: if there is no node at the given
            ``path``
        """
        raise NotImplementedError

    def get_largefile_node(self, path):
        """
        Returns the path to a largefile from Mercurial/Git-LFS storage,
        or None if it's not a largefile node.
        """
        return None

    def archive_repo(self, archive_dest_path, kind='tgz', subrepos=None,
                     archive_dir_name=None, write_metadata=False, mtime=None,
                     archive_at_path='/'):
        """
        Creates an archive containing the contents of the repository.

        :param archive_dest_path: path to the file which to create the archive.
        :param kind: one of following: ``"tbz2"``, ``"tgz"``, ``"zip"``.
        :param archive_dir_name: name of root directory in archive.
            Default is repository name and commit's short_id joined with dash:
            ``"{repo_name}-{short_id}"``.
        :param write_metadata: write a metadata file into archive.
        :param mtime: custom modification time for archive creation, defaults
            to time.time() if not given.
        :param archive_at_path: pack files at this path (default '/')

        :raise VCSError: If prefix has a problem.
        """
        allowed_kinds = [x[0] for x in settings.ARCHIVE_SPECS]
        if kind not in allowed_kinds:
            raise ImproperArchiveTypeError(
                'Archive kind (%s) not supported, use one of %s' %
                (kind, allowed_kinds))

        archive_dir_name = self._validate_archive_prefix(archive_dir_name)
        mtime = mtime if mtime is not None else time.mktime(self.date.timetuple())
        commit_id = self.raw_id

        return self.repository._remote.archive_repo(
            archive_dest_path, kind, mtime, archive_at_path,
            archive_dir_name, commit_id)

    def _validate_archive_prefix(self, archive_dir_name):
        if archive_dir_name is None:
            archive_dir_name = self._ARCHIVE_PREFIX_TEMPLATE.format(
                repo_name=safe_str(self.repository.name),
                short_id=self.short_id)
        elif not isinstance(archive_dir_name, str):
            raise ValueError("prefix not a str object: %s" % repr(archive_dir_name))
        elif archive_dir_name.startswith('/'):
            raise VCSError("Prefix cannot start with leading slash")
        elif archive_dir_name.strip() == '':
            raise VCSError("Prefix cannot be empty")
        return archive_dir_name

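    # Hypothetical standalone sketch (invented function, not RhodeCode code)
    # of the prefix-validation rules enforced by _validate_archive_prefix
    # above: a missing prefix gets a "{repo_name}-{short_id}" default, while a
    # non-string, leading-slash, or empty/whitespace prefix is rejected.
    ```python
    def validate_archive_prefix(prefix, repo_name='demo-repo', short_id='abc123'):
        if prefix is None:
            return '{repo_name}-{short_id}'.format(
                repo_name=repo_name, short_id=short_id)
        if not isinstance(prefix, str):
            raise ValueError("prefix not a str object: %r" % (prefix,))
        if prefix.startswith('/'):
            raise ValueError("Prefix cannot start with leading slash")
        if prefix.strip() == '':
            raise ValueError("Prefix cannot be empty")
        return prefix

    assert validate_archive_prefix(None) == 'demo-repo-abc123'
    assert validate_archive_prefix('release-1.0') == 'release-1.0'
    try:
        validate_archive_prefix('/bad')
    except ValueError:
        pass
    else:
        raise AssertionError('leading slash should be rejected')
    ```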
    @LazyProperty
    def root(self):
        """
        Returns ``RootNode`` object for this commit.
        """
        return self.get_node('')

    def next(self, branch=None):
        """
        Returns the next commit after the current one; if branch is given, it
        will return the next commit belonging to that branch

        :param branch: show commits within the given named branch
        """
        indexes = xrange(self.idx + 1, self.repository.count())
        return self._find_next(indexes, branch)

    def prev(self, branch=None):
        """
        Returns the previous commit before the current one; if branch is
        given, it will return the previous commit belonging to that branch

        :param branch: show commit within the given named branch
        """
        indexes = xrange(self.idx - 1, -1, -1)
        return self._find_next(indexes, branch)

    def _find_next(self, indexes, branch=None):
        if branch and self.branch != branch:
            raise VCSError('Branch option used on a commit not belonging '
                           'to that branch')

        for next_idx in indexes:
            commit = self.repository.get_commit(commit_idx=next_idx)
            if branch and branch != commit.branch:
                continue
            return commit
        raise CommitDoesNotExistError

    def diff(self, ignore_whitespace=True, context=3):
        """
        Returns a `Diff` object representing the change made by this commit.
        """
        parent = self.first_parent
        diff = self.repository.get_diff(
            parent, self,
            ignore_whitespace=ignore_whitespace,
            context=context)
        return diff

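    # Simplified sketch (invented names; drops the own-branch check, not
    # RhodeCode code) of the index scan behind next()/prev() above: walk
    # candidate indices in order and return the first commit matching the
    # requested branch.
    ```python
    def find_next(commits, indexes, branch=None):
        # commits: list of (commit_id, branch) tuples indexed by idx
        for idx in indexes:
            commit_id, commit_branch = commits[idx]
            if branch and branch != commit_branch:
                continue
            return commit_id
        raise LookupError('no such commit')

    history = [('c0', 'default'), ('c1', 'feature'), ('c2', 'default')]
    # next() from idx 0 scans forward...
    assert find_next(history, range(1, 3)) == 'c1'
    # ...and honors the branch filter by skipping non-matching commits.
    assert find_next(history, range(1, 3), branch='default') == 'c2'
    # prev() from idx 2 scans backwards over xrange(idx - 1, -1, -1).
    assert find_next(history, range(1, -1, -1), branch='feature') == 'c1'
    ```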
    @LazyProperty
    def added(self):
        """
        Returns list of added ``FileNode`` objects.
        """
        raise NotImplementedError

    @LazyProperty
    def changed(self):
        """
        Returns list of modified ``FileNode`` objects.
        """
        raise NotImplementedError

    @LazyProperty
    def removed(self):
        """
        Returns list of removed ``FileNode`` objects.
        """
        raise NotImplementedError

    @LazyProperty
    def size(self):
        """
        Returns total number of bytes from contents of all filenodes.
        """
        return sum(node.size for node in self.get_filenodes_generator())

    def walk(self, topurl=''):
        """
        Similar to the os.walk method. Instead of the filesystem, it walks
        through the commit starting at the given ``topurl``. Returns a
        generator of tuples (topnode, dirnodes, filenodes).
        """
        topnode = self.get_node(topurl)
        if not topnode.is_dir():
            return
        yield (topnode, topnode.dirs, topnode.files)
        for dirnode in topnode.dirs:
            for tup in self.walk(dirnode.path):
                yield tup

    def get_filenodes_generator(self):
        """
        Returns generator that yields *all* file nodes.
        """
        for topnode, dirs, files in self.walk():
            for node in files:
                yield node

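    # Self-contained sketch (invented _Node class, not RhodeCode code) of the
    # walk() recursion above: yield the current directory with its dirnodes
    # and filenodes, then recurse depth-first into each subdirectory.
    ```python
    class _Node(object):
        def __init__(self, path, dirs=(), files=()):
            self.path, self.dirs, self.files = path, list(dirs), list(files)

    def walk(node):
        # yield the directory itself, then recurse into its subdirectories
        yield (node, node.dirs, node.files)
        for dirnode in node.dirs:
            for tup in walk(dirnode):
                yield tup

    tree = _Node('', dirs=[_Node('docs', files=[_Node('docs/index.rst')])],
                 files=[_Node('README')])
    visited = [top.path for top, dirs, files in walk(tree)]
    assert visited == ['', 'docs']
    # get_filenodes_generator() is this flattening of the walk:
    all_files = [f.path for top, dirs, files in walk(tree) for f in files]
    assert all_files == ['README', 'docs/index.rst']
    ```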
    #
    # Utilities for sub classes to support consistent behavior
    #

    def no_node_at_path(self, path):
        return NodeDoesNotExistError(
            u"There is no file nor directory at the given path: "
            u"`%s` at commit %s" % (safe_unicode(path), self.short_id))

    def _fix_path(self, path):
        """
        Paths are stored without a trailing slash, so we need to get rid of it
        if needed.
        """
        return path.rstrip('/')

    #
    # Deprecated API based on changesets
    #

    @property
    def revision(self):
        warnings.warn("Use idx instead", DeprecationWarning)
        return self.idx

    @revision.setter
    def revision(self, value):
        warnings.warn("Use idx instead", DeprecationWarning)
        self.idx = value

    def get_file_changeset(self, path):
        warnings.warn("Use get_path_commit instead", DeprecationWarning)
        return self.get_path_commit(path)


1380 class BaseChangesetClass(type):
1383 class BaseChangesetClass(type):
1381
1384
1382 def __instancecheck__(self, instance):
1385 def __instancecheck__(self, instance):
1383 return isinstance(instance, BaseCommit)
1386 return isinstance(instance, BaseCommit)
1384
1387
1385
1388
1386 class BaseChangeset(BaseCommit):
1389 class BaseChangeset(BaseCommit):
1387
1390
1388 __metaclass__ = BaseChangesetClass
1391 __metaclass__ = BaseChangesetClass
1389
1392
1390 def __new__(cls, *args, **kwargs):
1393 def __new__(cls, *args, **kwargs):
1391 warnings.warn(
1394 warnings.warn(
1392 "Use BaseCommit instead of BaseChangeset", DeprecationWarning)
1395 "Use BaseCommit instead of BaseChangeset", DeprecationWarning)
1393 return super(BaseChangeset, cls).__new__(cls, *args, **kwargs)
1396 return super(BaseChangeset, cls).__new__(cls, *args, **kwargs)
1394
1397
1395
1398
class BaseInMemoryCommit(object):
    """
    Represents differences between repository's state (most recent head) and
    changes made *in place*.

    **Attributes**

    ``repository``
        repository object for this in-memory-commit

    ``added``
        list of ``FileNode`` objects marked as *added*

    ``changed``
        list of ``FileNode`` objects marked as *changed*

    ``removed``
        list of ``FileNode`` or ``RemovedFileNode`` objects marked to be
        *removed*

    ``parents``
        list of :class:`BaseCommit` instances representing parents of the
        in-memory commit. Should always be a 2-element sequence.

    """

    def __init__(self, repository):
        self.repository = repository
        self.added = []
        self.changed = []
        self.removed = []
        self.parents = []

    def add(self, *filenodes):
        """
        Marks given ``FileNode`` objects as *to be committed*.

        :raises ``NodeAlreadyExistsError``: if node with same path exists at
            latest commit
        :raises ``NodeAlreadyAddedError``: if node with same path is already
            marked as *added*
        """
        # Check if not already marked as *added* first
        for node in filenodes:
            if node.path in (n.path for n in self.added):
                raise NodeAlreadyAddedError(
                    "FileNode %s is already marked for addition"
                    % node.path)
        for node in filenodes:
            self.added.append(node)

    def change(self, *filenodes):
        """
        Marks given ``FileNode`` objects to be *changed* in next commit.

        :raises ``EmptyRepositoryError``: if there are no commits yet
        :raises ``NodeAlreadyChangedError``: if node with same path is already
            marked to be *changed*
        :raises ``NodeAlreadyRemovedError``: if node with same path is already
            marked to be *removed*
        :raises ``NodeDoesNotExistError``: if node doesn't exist in latest
            commit
        :raises ``NodeNotChangedError``: if node hasn't really been changed
        """
        for node in filenodes:
            if node.path in (n.path for n in self.removed):
                raise NodeAlreadyRemovedError(
                    "Node at %s is already marked as removed" % node.path)
        try:
            self.repository.get_commit()
        except EmptyRepositoryError:
            raise EmptyRepositoryError(
                "Nothing to change - try to *add* new nodes rather than "
                "changing them")
        for node in filenodes:
            if node.path in (n.path for n in self.changed):
                raise NodeAlreadyChangedError(
                    "Node at '%s' is already marked as changed" % node.path)
            self.changed.append(node)

    def remove(self, *filenodes):
        """
        Marks given ``FileNode`` (or ``RemovedFileNode``) objects to be
        *removed* in next commit.

        :raises ``NodeAlreadyRemovedError``: if node has already been marked
            to be *removed*
        :raises ``NodeAlreadyChangedError``: if node has already been marked
            to be *changed*
        """
        for node in filenodes:
            if node.path in (n.path for n in self.removed):
                raise NodeAlreadyRemovedError(
                    "Node is already marked for removal at %s" % node.path)
            if node.path in (n.path for n in self.changed):
                raise NodeAlreadyChangedError(
                    "Node is already marked to be changed at %s" % node.path)
            # We only mark node as *removed* - real removal is done by
            # commit method
            self.removed.append(node)

    def reset(self):
        """
        Resets this instance to initial state (cleans ``added``, ``changed``
        and ``removed`` lists).
        """
        self.added = []
        self.changed = []
        self.removed = []
        self.parents = []

    def get_ipaths(self):
        """
        Returns generator of paths from nodes marked as added, changed or
        removed.
        """
        for node in itertools.chain(self.added, self.changed, self.removed):
            yield node.path

    def get_paths(self):
        """
        Returns list of paths from nodes marked as added, changed or removed.
        """
        return list(self.get_ipaths())

    def check_integrity(self, parents=None):
        """
        Checks in-memory commit's integrity. Also, sets parents if not
        already set.

        :raises CommitError: if any error occurs (e.g.
            ``NodeDoesNotExistError``).
        """
        if not self.parents:
            parents = parents or []
            if len(parents) == 0:
                try:
                    parents = [self.repository.get_commit(), None]
                except EmptyRepositoryError:
                    parents = [None, None]
            elif len(parents) == 1:
                parents += [None]
            self.parents = parents

        # Local parents, only if not None
        parents = [p for p in self.parents if p]

        # Check nodes marked as added
        for p in parents:
            for node in self.added:
                try:
                    p.get_node(node.path)
                except NodeDoesNotExistError:
                    pass
                else:
                    raise NodeAlreadyExistsError(
                        "Node `%s` already exists at %s" % (node.path, p))

        # Check nodes marked as changed
        missing = set(self.changed)
        not_changed = set(self.changed)
        if self.changed and not parents:
            raise NodeDoesNotExistError(str(self.changed[0].path))
        for p in parents:
            for node in self.changed:
                try:
                    old = p.get_node(node.path)
                    missing.remove(node)
                    # if content actually changed, remove node from not_changed
                    if old.content != node.content:
                        not_changed.remove(node)
                except NodeDoesNotExistError:
                    pass
        if self.changed and missing:
            raise NodeDoesNotExistError(
                "Node `%s` marked as modified but missing in parents: %s"
                % (node.path, parents))

        if self.changed and not_changed:
            raise NodeNotChangedError(
                "Node `%s` wasn't actually changed (parents: %s)"
                % (not_changed.pop().path, parents))

        # Check nodes marked as removed
        if self.removed and not parents:
            raise NodeDoesNotExistError(
                "Cannot remove node at %s as there "
                "were no parents specified" % self.removed[0].path)
        really_removed = set()
        for p in parents:
            for node in self.removed:
                try:
                    p.get_node(node.path)
                    really_removed.add(node)
                except CommitError:
                    pass
        not_removed = set(self.removed) - really_removed
        if not_removed:
            # TODO: johbo: This code branch does not seem to be covered
            raise NodeDoesNotExistError(
                "Cannot remove node at %s from "
                "following parents: %s" % (not_removed, parents))

    def commit(self, message, author, parents=None, branch=None, date=None, **kwargs):
        """
        Performs in-memory commit (doesn't check workdir in any way) and
        returns newly created :class:`BaseCommit`. Updates repository's
        attribute `commits`.

        .. note::

            While overriding this method each backend should call
            ``self.check_integrity(parents)`` in the first place.

        :param message: message of the commit
        :param author: full username, e.g. "Joe Doe <joe.doe@example.com>"
        :param parents: single parent or sequence of parents from which commit
            would be derived
        :param date: ``datetime.datetime`` instance. Defaults to
            ``datetime.datetime.now()``.
        :param branch: branch name, as string. If none given, default
            backend's branch would be used.

        :raises ``CommitError``: if any error occurs while committing
        """
        raise NotImplementedError


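The bookkeeping that ``BaseInMemoryCommit`` performs before ``commit`` applies the changes can be sketched standalone. ``Node`` and ``MemoryCommitSketch`` below are hypothetical stand-ins, not the real ``FileNode``/``BaseInMemoryCommit`` classes:

```python
import itertools

# Hypothetical stand-in illustrating the add/remove bookkeeping of
# BaseInMemoryCommit; not the real class.
class Node(object):
    def __init__(self, path, content=''):
        self.path = path
        self.content = content

class MemoryCommitSketch(object):
    def __init__(self):
        self.added, self.changed, self.removed = [], [], []

    def add(self, *nodes):
        for node in nodes:
            # duplicate additions are rejected, mirroring NodeAlreadyAddedError
            if node.path in (n.path for n in self.added):
                raise ValueError('%s already marked for addition' % node.path)
        self.added.extend(nodes)

    def remove(self, *nodes):
        # marking only; real removal would happen in commit()
        self.removed.extend(nodes)

    def get_paths(self):
        # paths of everything marked, as in get_ipaths/get_paths
        return [n.path for n in
                itertools.chain(self.added, self.changed, self.removed)]

commit = MemoryCommitSketch()
commit.add(Node('README.rst', 'hello'))
commit.remove(Node('old.txt'))
print(commit.get_paths())  # ['README.rst', 'old.txt']
```

Marked nodes are only staged in memory; a backend's ``commit`` implementation is what turns the three lists into an actual commit.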
class BaseInMemoryChangesetClass(type):

    def __instancecheck__(self, instance):
        return isinstance(instance, BaseInMemoryCommit)


class BaseInMemoryChangeset(BaseInMemoryCommit):

    __metaclass__ = BaseInMemoryChangesetClass

    def __new__(cls, *args, **kwargs):
        warnings.warn(
            "Use BaseInMemoryCommit instead of BaseInMemoryChangeset",
            DeprecationWarning)
        return super(BaseInMemoryChangeset, cls).__new__(cls, *args, **kwargs)


class EmptyCommit(BaseCommit):
    """
    A dummy empty commit. It's possible to pass a hash when creating
    an EmptyCommit.
    """

    def __init__(
            self, commit_id=EMPTY_COMMIT_ID, repo=None, alias=None, idx=-1,
            message='', author='', date=None):
        self._empty_commit_id = commit_id
        # TODO: johbo: Solve idx parameter, default value does not make
        # too much sense
        self.idx = idx
        self.message = message
        self.author = author
        self.date = date or datetime.datetime.fromtimestamp(0)
        self.repository = repo
        self.alias = alias

    @LazyProperty
    def raw_id(self):
        """
        Returns raw string identifying this commit, useful for web
        representation.
        """

        return self._empty_commit_id

    @LazyProperty
    def branch(self):
        if self.alias:
            from rhodecode.lib.vcs.backends import get_backend
            return get_backend(self.alias).DEFAULT_BRANCH_NAME

    @LazyProperty
    def short_id(self):
        return self.raw_id[:12]

    @LazyProperty
    def id(self):
        return self.raw_id

    def get_path_commit(self, path):
        return self

    def get_file_content(self, path):
        return u''

    def get_file_content_streamed(self, path):
        yield self.get_file_content(path)

    def get_file_size(self, path):
        return 0


class EmptyChangesetClass(type):

    def __instancecheck__(self, instance):
        return isinstance(instance, EmptyCommit)


class EmptyChangeset(EmptyCommit):

    __metaclass__ = EmptyChangesetClass

    def __new__(cls, *args, **kwargs):
        warnings.warn(
            "Use EmptyCommit instead of EmptyChangeset", DeprecationWarning)
        return super(EmptyChangeset, cls).__new__(cls, *args, **kwargs)

    def __init__(self, cs=EMPTY_COMMIT_ID, repo=None, requested_revision=None,
                 alias=None, revision=-1, message='', author='', date=None):
        if requested_revision is not None:
            warnings.warn(
                "Parameter requested_revision not supported anymore",
                DeprecationWarning)
        super(EmptyChangeset, self).__init__(
            commit_id=cs, repo=repo, alias=alias, idx=revision,
            message=message, author=author, date=date)

    @property
    def revision(self):
        warnings.warn("Use idx instead", DeprecationWarning)
        return self.idx

    @revision.setter
    def revision(self, value):
        warnings.warn("Use idx instead", DeprecationWarning)
        self.idx = value


class EmptyRepository(BaseRepository):
    def __init__(self, repo_path=None, config=None, create=False, **kwargs):
        pass

    def get_diff(self, *args, **kwargs):
        from rhodecode.lib.vcs.backends.git.diff import GitDiff
        return GitDiff('')


class CollectionGenerator(object):

    def __init__(self, repo, commit_ids, collection_size=None, pre_load=None, translate_tag=None):
        self.repo = repo
        self.commit_ids = commit_ids
        # TODO: (oliver) this isn't currently hooked up
        self.collection_size = None
        self.pre_load = pre_load
        self.translate_tag = translate_tag

    def __len__(self):
        if self.collection_size is not None:
            return self.collection_size
        return self.commit_ids.__len__()

    def __iter__(self):
        for commit_id in self.commit_ids:
            # TODO: johbo: Mercurial passes in commit indices or commit ids
            yield self._commit_factory(commit_id)

    def _commit_factory(self, commit_id):
        """
        Allows backends to override the way commits are generated.
        """
        return self.repo.get_commit(
            commit_id=commit_id, pre_load=self.pre_load,
            translate_tag=self.translate_tag)

    def __getslice__(self, i, j):
        """
        Returns an iterator of sliced repository
        """
        commit_ids = self.commit_ids[i:j]
        return self.__class__(
            self.repo, commit_ids, pre_load=self.pre_load,
            translate_tag=self.translate_tag)

    def __repr__(self):
        return '<CollectionGenerator[len:%s]>' % (self.__len__())


class Config(object):
    """
    Represents the configuration for a repository.

    The API is inspired by :class:`ConfigParser.ConfigParser` from the
    standard library. It implements only the needed subset.
    """

    def __init__(self):
        self._values = {}

    def copy(self):
        clone = Config()
        for section, values in self._values.items():
            clone._values[section] = values.copy()
        return clone

    def __repr__(self):
        return '<Config(%s sections) at %s>' % (
            len(self._values), hex(id(self)))

    def items(self, section):
        return self._values.get(section, {}).iteritems()

    def get(self, section, option):
        return self._values.get(section, {}).get(option)

    def set(self, section, option, value):
        section_values = self._values.setdefault(section, {})
        section_values[option] = value

    def clear_section(self, section):
        self._values[section] = {}

    def serialize(self):
        """
        Creates a list of three-element tuples ``(section, key, value)``
        representing this config object.
        """
        items = []
        for section in self._values:
            for option, value in self._values[section].items():
                items.append(
                    (safe_str(section), safe_str(option), safe_str(value)))
        return items


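The section/option storage that ``Config`` wraps can be illustrated with a minimal stand-in; ``ConfigSketch`` is a hypothetical name, and it skips the ``safe_str`` coercion the real class applies in ``serialize``:

```python
# Minimal stand-in mirroring Config's section/option storage (illustrative
# sketch; the real class also funnels serialized values through safe_str).
class ConfigSketch(object):
    def __init__(self):
        self._values = {}

    def set(self, section, option, value):
        # setdefault creates the section dict on first use
        self._values.setdefault(section, {})[option] = value

    def get(self, section, option):
        # missing sections/options yield None instead of raising
        return self._values.get(section, {}).get(option)

    def serialize(self):
        # flatten into (section, option, value) tuples
        return [(s, o, v)
                for s, opts in self._values.items()
                for o, v in opts.items()]

cfg = ConfigSketch()
cfg.set('ui', 'username', 'joe')
print(cfg.get('ui', 'username'))  # joe
print(cfg.serialize())            # [('ui', 'username', 'joe')]
```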
class Diff(object):
    """
    Represents a diff result from a repository backend.

    Subclasses have to provide a backend specific value for
    :attr:`_header_re` and :attr:`_meta_re`.
    """
    _meta_re = None
    _header_re = None

    def __init__(self, raw_diff):
        self.raw = raw_diff

    def chunks(self):
        """
        Splits the diff into chunks of separate ``diff --git a/file b/file``
        chunks. To keep the split consistent we must prepend the raw diff
        with ``\n``, and we must be able to detect the last chunk, as it
        has a special rule as well.
        """

        diff_parts = ('\n' + self.raw).split('\ndiff --git')
        header = diff_parts[0]

        if self._meta_re:
            match = self._meta_re.match(header)

        chunks = diff_parts[1:]
        total_chunks = len(chunks)

        return (
            DiffChunk(chunk, self, cur_chunk == total_chunks)
            for cur_chunk, chunk in enumerate(chunks, start=1))


class DiffChunk(object):

    def __init__(self, chunk, diff, last_chunk):
        self._diff = diff

        # since we split by \ndiff --git that part is lost from original diff
        # we need to re-apply it at the end, EXCEPT ! if it's last chunk
        if not last_chunk:
            chunk += '\n'

        match = self._diff._header_re.match(chunk)
        self.header = match.groupdict()
        self.diff = chunk[match.end():]
        self.raw = chunk


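The per-file splitting that ``Diff.chunks`` relies on can be shown in isolation: prepending ``'\n'`` makes the first ``diff --git`` header split exactly like the later ones. The sample diff text below is made up for illustration:

```python
# Sketch of the chunk-splitting idea used by Diff.chunks; the raw diff
# content here is a fabricated two-file example.
raw = (
    'diff --git a/a.txt b/a.txt\n'
    'index 000..111 100644\n'
    '--- a/a.txt\n'
    '+++ b/a.txt\n'
    '@@ -0,0 +1 @@\n'
    '+hello\n'
    'diff --git a/b.txt b/b.txt\n'
    'new file mode 100644\n'
)
# prepend '\n' so the very first header matches the '\ndiff --git' separator
parts = ('\n' + raw).split('\ndiff --git')
header, chunks = parts[0], parts[1:]
print(len(chunks))                        # 2: one chunk per file in the diff
print(chunks[0].splitlines()[0].strip())  # a/a.txt b/a.txt
```

Each chunk then starts with the ``a/file b/file`` remainder of its header line, which is what ``DiffChunk`` parses with ``_header_re``.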
class BasePathPermissionChecker(object):

    @staticmethod
    def create_from_patterns(includes, excludes):
        if includes and '*' in includes and not excludes:
            return AllPathPermissionChecker()
        elif excludes and '*' in excludes:
            return NonePathPermissionChecker()
        else:
            return PatternPathPermissionChecker(includes, excludes)

    @property
    def has_full_access(self):
        raise NotImplementedError()

    def has_access(self, path):
        raise NotImplementedError()


class AllPathPermissionChecker(BasePathPermissionChecker):

    @property
    def has_full_access(self):
        return True

    def has_access(self, path):
        return True


class NonePathPermissionChecker(BasePathPermissionChecker):

    @property
    def has_full_access(self):
        return False

    def has_access(self, path):
        return False


class PatternPathPermissionChecker(BasePathPermissionChecker):

    def __init__(self, includes, excludes):
        self.includes = includes
        self.excludes = excludes
        self.includes_re = [] if not includes else [
            re.compile(fnmatch.translate(pattern)) for pattern in includes]
        self.excludes_re = [] if not excludes else [
            re.compile(fnmatch.translate(pattern)) for pattern in excludes]

    @property
    def has_full_access(self):
        return '*' in self.includes and not self.excludes

    def has_access(self, path):
        for regex in self.excludes_re:
            if regex.match(path):
                return False
        for regex in self.includes_re:
            if regex.match(path):
                return True
        return False
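The include/exclude matching used by ``PatternPathPermissionChecker`` can be exercised standalone; ``PathChecker`` below is a hypothetical stand-in, not the RhodeCode class itself:

```python
import fnmatch
import re

# Minimal stand-in for PatternPathPermissionChecker (illustrative sketch).
class PathChecker(object):
    def __init__(self, includes, excludes):
        # fnmatch.translate turns glob patterns into regex source strings;
        # note its '*' also matches path separators
        self.includes_re = [re.compile(fnmatch.translate(p))
                            for p in (includes or [])]
        self.excludes_re = [re.compile(fnmatch.translate(p))
                            for p in (excludes or [])]

    def has_access(self, path):
        # excludes win over includes; no match at all means no access
        for regex in self.excludes_re:
            if regex.match(path):
                return False
        for regex in self.includes_re:
            if regex.match(path):
                return True
        return False

checker = PathChecker(includes=['docs/*', '*.py'],
                      excludes=['docs/internal/*'])
print(checker.has_access('docs/index.rst'))       # True: matches docs/*
print(checker.has_access('setup.py'))             # True: matches *.py
print(checker.has_access('docs/internal/x.rst'))  # False: excluded
```

Exclude patterns are checked first, which is why an excluded path is rejected even when an include pattern would otherwise match it.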